Columns: title (string, 8-300 characters); abstract (string, 0-10k characters)
High pressure solid state chemistry of carbon dioxide.
A review of experimental and theoretical studies performed over the past three decades on high pressure chemistry of solid CO2, at 0-80 GPa and 40-3000 K, is presented. Emphasis is placed on the recently discovered non-molecular covalent crystalline phase V, and its glassy counterpart a-CO2, along with other molecular phases, whose interpretation is crucial for determining the reaction path to non-molecular CO2. The matter is still under debate, and many open issues are outlined, such as the true reaction mechanism for forming phase V. Finally, we propose arguments to stimulate possible future research in a more extended P-T range. This work is a tutorial review and should be of general interest to both the solid state chemistry and condensed matter physics communities.
Learning Effective Embeddings from Medical Notes
With the large amount of available data and the variety of features they offer, electronic health records (EHR) have attracted considerable interest in recent years and are starting to be widely used by the machine learning and bioinformatics communities. While typical numerical fields such as demographics, vitals, lab measurements, diagnoses and procedures are natural to use in machine learning models, there is no consensus yet on how to use the free-text clinical notes. We show how embeddings can be learned from patients' history of notes, at the word, note and patient level, using simple neural and sequence models. We show on various relevant evaluation tasks that these embeddings are easily transferable to smaller problems, where they enable accurate predictions using only clinical notes.
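As an illustration of the note- and patient-level aggregation described above, here is a minimal sketch assuming word vectors are trained with gensim's Word2Vec and that higher-level embeddings are plain averages; the paper's actual neural and sequence models are not reproduced, and the toy notes are hypothetical:

```python
import numpy as np
from gensim.models import Word2Vec

# Hypothetical corpus of tokenized clinical notes (one list of tokens per note).
notes = [["patient", "denies", "chest", "pain"],
         ["mild", "chest", "pain", "on", "exertion"]]

# Word-level embeddings learned from the notes themselves.
w2v = Word2Vec(sentences=notes, vector_size=50, min_count=1, epochs=20)

def note_embedding(tokens):
    """Note-level embedding as the mean of its word vectors (illustrative only)."""
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

def patient_embedding(patient_notes):
    """Patient-level embedding as the mean over that patient's note embeddings."""
    return np.mean([note_embedding(n) for n in patient_notes], axis=0)

print(patient_embedding(notes).shape)  # (50,)
```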
Activity-based costing / management and its implications for operations management
Activity-Based Costing/Management (ABC/M) is an information system developed in the 1980s to overcome some of the limitations of traditional cost accounting and to enhance its usefulness to strategic decision-making. In this paper, we show how an ABC/M system can serve as a useful information system to support effective operations decision-making processes. We propose a conceptual framework, the Operations Hexagon, to discuss the managerial implications of an ABC/M system for various operations management decisions related to product planning and design, quality management and control, inventory management, capacity management and work force management. By viewing an ABC/M system as an enabler of better operations decision-making, we demonstrate how such systems help an operations manager enhance the quality of the decision-making process.
Knowledge Discovery in Manufacturing Simulations
Discrete event simulation studies in a manufacturing context are a powerful instrument when modeling and evaluating processes of various industries. Usually simulation experts conduct simulation experiments for a predetermined system specification by manually varying parameters through educated assumptions and according to a prior defined goal. Moreover, simulation experts try to reduce complexity and number of simulation runs by excluding parameters that they consider as not influential regarding the simulation project scope. On the other hand, today's world of big data technology enables us to handle huge amounts of data. We therefore investigate the potential benefits of designing large scale experiments with a much broader coverage of possible system behavior. In this paper, we propose an approach for applying data mining methods on simulation data in combination with suitable visualization methods in order to uncover relationships in model behavior to discover knowledge that otherwise would have remained hidden. For a prototypical demonstration we used a clustering algorithm to divide large amounts of simulation output datasets into groups of similar performance values and depict those groups through visualizations to conduct a visual investigation process of the simulation data.
Creation of spiking neuron models applied in pattern recognition problems
Several spiking neuron models have been shown to solve different linear and non-linear pattern recognition problems. Indeed, a single spiking neuron can generate results comparable to those of a classical artificial neural network. However, depending on the classification problem, one spiking model may be more or less efficient than another. In this paper we propose a methodology to create spiking neuron models using Gene Expression Programming. The newly created models are applied to eight pattern recognition problems. The results obtained are compared with previous results generated using the Izhikevich spiking neuron model. This first effort will help us generate spiking neuron models that are adaptable to a specific pattern recognition problem.
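For reference, the Izhikevich model used as the comparison baseline above can be simulated in a few lines; the parameter set below is the standard regular-spiking one, and the input current and time step are illustrative rather than values from the paper:

```python
# Izhikevich neuron: v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u);
# if v >= 30 mV then v <- c and u <- u + d.
a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking parameter set
v, u = -65.0, b * -65.0
dt, I = 0.5, 10.0                    # time step in ms, constant input current
spike_times = []
for step in range(2000):             # 1 s of simulated time
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                    # spike detected: reset membrane and recovery
        spike_times.append(step * dt)
        v, u = c, u + d
print(f"{len(spike_times)} spikes in 1 s")
```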
Millimeter Wave MIMO With Lens Antenna Array: A New Path Division Multiplexing Paradigm
Millimeter wave (mmWave) communication is a promising technology for future wireless systems, while one practical challenge is to achieve its large-antenna gains with only limited radio frequency (RF) chains for cost-effective implementation. To this end, we study in this paper a new lens antenna array enabled mmWave multiple-input multiple-output (MIMO) communication system. We first show that the array response of lens antenna arrays follows a “sinc” function, where the antenna element with the peak response is determined by the angle of arrival (AoA)/departure (AoD) of the received/transmitted signal. By exploiting this unique property along with the multi-path sparsity of mmWave channels, we propose a novel low-cost and capacity-achieving spatial multiplexing scheme for both narrow-band and wide-band mmWave communications, termed path division multiplexing (PDM), where parallel data streams are transmitted over different propagation paths with simple per-path processing. We further propose a simple path grouping technique with group-based small-scale MIMO processing to effectively mitigate the inter-stream interference due to similar AoAs/AoDs. Numerical results are provided to compare the performance of the proposed mmWave lens MIMO against the conventional MIMO with uniform planar arrays (UPAs) and hybrid analog/digital processing. It is shown that the proposed design achieves significant throughput gains as well as complexity and cost reductions, thus leading to a promising new paradigm for mmWave MIMO communications.
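The "sinc"-type array response mentioned above can be written, in a simplified form that omits constants and normalization details not given in the abstract, as
\[
a_m(\phi) \;\propto\; \mathrm{sinc}\!\left(m - \tilde{D}\sin\phi\right),
\]
where $m$ indexes the antenna elements, $\phi$ is the AoA/AoD of a path, and $\tilde{D}$ denotes an assumed normalized lens aperture; the element whose index is closest to $\tilde{D}\sin\phi$ captures the peak response, which is what makes simple per-path processing possible.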
Evaluation of Land Administration Systems
Currently there are no internationally accepted methodologies to evaluate and compare the performance of land administration systems. This is partly because land administration systems are in constant reform, and probably more importantly, they represent societies' different perceptions of land. This paper describes the development of a framework to measure and compare the performance of land administration systems. The research is of particular relevance since it develops a management model which links the operational aspects of land administration with land policy.
Pediatric lipoblastoma in the head and neck: a systematic review of 48 reported cases.
BACKGROUND Lipoblastoma is an exceedingly rare cause of pediatric head and neck masses. There have been 47 cases previously reported in the English literature. We present an additional case and review of the available literature on this rare neoplasm. OBJECTIVE To review and assess the current published literature regarding the efficacy of preserving neurovascular structures in the surgical management of pediatric lipoblastoma. METHODS Literature analysis of case reports was performed. MEDLINE was searched for the terms "neonatal lipoblastoma", "lipoblastomatosis", and "benign lipoblastoma". Results in the English literature were mined for relevant clinical data when available. The citations of case reviews found were searched to find additional cases. RESULTS Including our new case, a total of 48 cases of head and neck lipoblastoma have been reported in the English literature within 23 manuscripts. Four manuscripts presented case series (Evidence Based Medicine Level 4) and 19 were case reports (Level 5). The median sample size was 1 (range 1-4). For those 14 articles (N=23 cases) reporting follow-up, the median follow-up duration was 22 months. The male to female ratio was 2.1:1, with an average age at presentation of 2.1 years (range: newborn to 12 years). Lesions ranged from 3 to 12 cm in longest diameter. Recurrence was seen in 27% of patients in which there was at least 1-year follow-up. The most common presenting symptoms were painless enlarging neck mass (53%, 17/32) and respiratory distress (12%, 4/32). An exact binomial sign test indicated that most authors recommend conservative complete excision with preservation of vital structures, with 10 of 11 authors giving a stance supporting conservative surgical resection, p=.012. CONCLUSIONS Our findings suggest that although total excision is ideal and curative, subtotal resection may be a viable treatment alternative for lipoblastoma of the head and neck. This tumor presents a clinical challenge and should be considered in infants presenting with a cervical mass. It is difficult to differentiate from the much more common lymphangioma on clinical and radiological examination. Additionally, the potential for rapid growth and adhesion to neurovascular tissue makes surgical resection arduous. Nonetheless, recurrence rates for head and neck lipoblastomas are similar to those rates observed elsewhere in the body.
Support for vehicle-to-everything services based on LTE
Vehicular applications and communications technologies are often referred to as vehicle-to-everything (V2X), which is classified into four different types: vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), and vehicle-to-pedestrian (V2P) [1]. V2X related research projects, field tests, and regulatory work have been promoted in different countries and regions. In May 2015, the Ministry of Industry and Information Technology (MIIT) of China outlined national strategies for intelligent connected vehicles under "Made in China 2025." The overall technology and key technologies for intelligent driver assistance and automatic driving are expected to be available in China in 2020 and 2025, respectively [2]. V2X solutions are the critical technologies to support the realization of such visions. Although IEEE 802.11p has been selected as the technology for V2X communications in some countries such as the United States and in Europe, the intrinsic characteristics of IEEE 802.11p limit its ability to support low latency with high reliability [3, 4]. Standardization of Long Term Evolution (LTE)-based V2X is being actively conducted by the Third Generation Partnership Project (3GPP) to provide a solution for V2X communications that benefits from the global deployment and fast commercialization of LTE systems. Because of the wide deployment of LTE networks, V2I and V2N services can be provided with high data rate, comprehensive quality of service (QoS) support, ubiquitous coverage, and high penetration rate [5]. Meanwhile, LTE can be extended to support V2V direct communications based on device-to-device (D2D) sidelink design to satisfy QoS requirements such as low latency, high reliability, and high speed in the case of high vehicle density [6].
Software and Hardware FPGA-Based Digital Watermarking and Steganography Approaches: Toward New Methodology for Evaluation and Benchmarking Using Multi-Criteria Decision-Making Techniques
Evaluating and benchmarking software and hardware field programmable gate array (FPGA)-based digital watermarking are considered challenging tasks because of multiple and conflicting evaluation criteria. A few evaluation and benchmarking techniques/frameworks have been implemented for digital watermarking or steganography; however, these approaches still present certain limitations. In particular, some attributes are fixed on account of other attributes, and well-known benchmarking approaches are limited to robust watermarking techniques. Thus, this study aims toward a new methodology for evaluation and benchmarking using multi-criteria analysis for software and hardware "FPGA"-based digital watermarking or steganography. To achieve this objective, two iterations are conducted. The first iteration consists of two stages: discussing software and hardware "FPGA"-based digital watermarking or steganography to create a dataset with various samples for benchmarking and discussing the evaluation method, and then discussing the test for software and hardware "FPGA"-based digital watermarking or steganography according to multi-criteria evaluation (i.e., complexity, payload and quality) to create a decision matrix. The second iteration applies different decision-making techniques (i.e., SAW, MEW, HAW, TOPSIS, WSM and WPM) to benchmark the results of the first iteration (i.e., software or hardware FPGA-based digital watermarking or steganography approaches). Then, the discussed mean, standard deviation and paired sample t-test results are used to measure the correlations among different techniques based on the ranking results. The discussion findings are described as follows: (1) the integration of developer and evaluator preferences into the evaluation and benchmarking for software and hardware FPGA-based digital watermarking or steganography, (2) the process of assigning weights and (3) visualizing large-scale data samples in either software or hardware FPGA-based digital watermarking or steganography algorithms.
Service employee behaviour: the role of compliance and risk taking
This paper looks at the decision-making process that determines the amount of effort frontline service employees will expend in delivering a service in a business-to-business context. Using theories in behavioural economics and interactional and social psychology, the paper develops and presents a model of employee decision-making. Managerial implications, which have the potential to enhance the marketing of business-to-business services and directions for future research in this area, are indicated.
Recent Trends in Performance Modeling of Big Data Systems
With the advent of big data through social media and continuous creation of digital footprints through various mobile devices, special-purpose programming models were developed that would make it easy to write programs to process such data. MapReduce and its Hadoop implementation is one of the most popular platforms for writing such programs. The MapReduce framework involves a "map" phase where various tasks work in parallel for intermediate processing of data and a "reduce" phase where again various tasks work in parallel to extract information from this processed data. Performance modeling of such systems will need different approaches than are used for traditional multi-threaded multi-core systems supporting Web applications, primarily because the dependencies and synchronization required between various tasks are not easily expressible using standard queuing network models. In this talk we will review work done by researchers to address this modeling problem. The work done encompasses first-principles calculations of execution completion time, queuing network models, and finally, simulation. We will review these efforts as well as highlight opportunities for further work in this area.
Treatment of osteoporosis after liver transplantation with ibandronate.
Osteoporosis is a major side-effect after liver transplantation (LTX). Therefore, the objective of the study was to evaluate the efficacy of ibandronate to reduce fractures after LTX. Seventy-four patients after LTX were included in the study, and measurements of bone mineral density (BMD) of the lumbar spine and proximal femur using dual energy X-ray absorptiometry (DEXA) were performed prior to and 3, 6, 12 and 24 months after surgery. The study group (IBA) consisted of 34 patients who received calcium (1 g/day), vitamin D3 (800-1000 IU/day) and ibandronate 2 mg every 3 months intravenously for 1 year. The control group (CON) consisted of 40 patients who received calcium and vitamin D3 at the same dosages. Prevalence of new fractures was predefined as the primary endpoint. Changes of BMD and biochemical markers of bone metabolism were also investigated. In all patients, we found a reduction of BMD in the first few months after LTX. In the lumbar spine and the proximal femur the maximum reduction occurred 3 and 6 months post-LTX. One and 2 years after transplantation, the group receiving ibandronate demonstrated a better recovery from loss of BMD and a significantly lower prevalence of fractures (IBA 2 vs. CON 10, P < 0.04, χ²). Ibandronate with calcium and vitamin D3 reduces the BMD loss after LTX and decreases the rate of bone fractures significantly.
Fake and Spam Messages: Detecting Misinformation During Natural Disasters on Social Media
During natural disasters or crises, users on social media tend to readily believe the content of postings related to the events and retweet them in the hope that they will reach many other users. Unfortunately, there are malicious users who understand this tendency and post misinformation such as spam and fake messages, expecting wider propagation. To address the problem, in this paper we conduct a case study of the 2013 Moore Tornado and Hurricane Sandy. Concretely, we (i) examine the behaviors of these malicious users, (ii) analyze properties of spam, fake and legitimate messages, (iii) propose flat and hierarchical classification approaches, and (iv) detect both fake and spam messages while also distinguishing between them. Our experimental results show that our proposed approaches identify spam and fake messages with 96.43% accuracy and 0.961 F-measure.
Informant discrepancies in clinical reports of youths and interviewers' impressions of the reliability of informants.
OBJECTIVE In this study the authors examined whether discrepancies between parent and youth reports of the youth's emotional and behavioral functioning are related to interviewers' reliability ratings of parents and youths. METHODS In a consecutive case series analysis of 328 parents and youths aged 11-17 years, parents and youths provided reports of youth emotional and behavioral functioning and participated in structured clinical interviews. At the conclusion of the interviews, interviewers rated the reliability of informants. Interviewers rated youths' clinical severity, and parents and youths provided information on youth demographics. Nominal logistic regressions tested patterns of discrepancies between parent and youth reports (i.e., which informant consistently reported greater degrees of youth emotional and behavioral concerns than the other) as predictors of interviewers' ratings of the reliability of parents and youths. All analyses controlled for variance explained by youth demographics and youth severity. RESULTS When parents reported greater degrees of youth emotional and behavioral concerns than youths self-reported, interviewers were likely to rate the youth as an unreliable informant, and were unlikely to rate the youth as an unreliable informant when parents reported fewer concerns than youths self-reported. However, interviewers' ratings of parents' reliability did not relate to the discrepancies between reports, regardless of which informant reported greater degrees of youth concerns. CONCLUSIONS Prior research indicates that informant discrepancies potentially reveal important information about youths' emotional and behavioral concerns, such as the settings in which youths express these concerns. Yet, when parents and youths disagree in their clinical reports of the youth's functioning, this relates to whether a clinical interviewer views the youth as a reliable informant of their own functioning. To increase the cost-effectiveness and clinical utility of multi-informant clinical evaluations, practitioners and researchers should anticipate informant discrepancies and predict what they may represent before conducting clinical evaluations.
ACRec: a co-authorship based random walk model for academic collaboration recommendation
Recent academic practice has shown that scientific research tends to be more prolific through collaboration and cooperation among researchers and research groups. On the other hand, discovering new collaborators who are well suited to conducting joint research work brings both difficulties and opportunities. One notable difficulty, as well as opportunity, is big scholarly data. In this paper, we address the demand for collaboration recommendation through co-authorship in an academic network. We propose a random walk model using three academic metrics as the basis for recommending new collaborations. Each metric is computed from mutual paper co-authoring information and serves to quantify link importance, so that a random walker is more likely to visit the valuable nodes. Our experiments on the DBLP dataset show that our approach can improve the precision, recall rate and coverage rate of recommendation compared with other state-of-the-art approaches.
Fiberoptic ductoscopy for breast cancer patients with nipple discharge
Background: Breast cancer and precancer are thought to originate in the lining of the milk duct, but until recently, we have not had direct access to this area other than in tissue removed blindly by core biopsy or fine-needle aspiration. Fiberoptic ductoscopy (FDS) is an emerging technique that allows direct visual access of the ductal system of the breast through nipple orifice cannulation and exploration. To date, this technique has been used only in pilot studies. Previously, we have demonstrated that fiberoptic ductoscopy in patients with and without nipple discharge is a safe and effective means of visualizing the intraductal lesion. When combined with cytology, it is a screening technique that has high predictive value. Methods: We applied ductoscopy to 415 women with nipple discharge with the specific intent of detecting those patients with nipple discharge who had intraductal carcinoma (DCIS) as the basis of their discharge. Results: In this cohort of patients, ductoscopy was successful in visualizing an intraductal lesion in 166 patients (40%). In these cases, ductal lavage following ductoscopy increased the yield of cytologically interpretable ductal epithelial cells 100-fold compared to discharge fluid alone. In the majority of these patients, FDS examination detected lesions that had the appearance of typical papillomas. However, in 10 patients, the intraductal lesion exhibited one of several atypical features, including bleeding, circumferential obstruction, and gross fungating projections. In eight of these patients, the subsequent histopathology turned out to be DCIS. In two of these eight patients, endoscopic biopsy revealed cytologically malignant cells; in two others, ductal lavage (washings) revealed cytologically malignant cells. In three additional patients, although FDS examination uncovered a typical papilloma that was not biopsied, ductal lavage (washings) revealed cytologically malignant cells. On surgical pathology review of the extirpated lesions, all 11 patients were subsequently shown to have DCIS. Of these 11 cases of DCIS that were initially detected with a combination of FDS and ductal lavage cytology, six were completely negative on mammogram and physical exam. Conclusion: Although nipple discharge is an unusual presentation for DCIS, in patients with nipple discharge, FDS with ductal lavage cytology is a useful technique for diagnosing DCIS prior to definitive surgery.
Unsupervised Natural Language Generation with Denoising Autoencoders
Generating text from structured data is important for various tasks such as question answering and dialog systems. We show that in at least one domain, without any supervision and only based on unlabeled text, we are able to build a Natural Language Generation (NLG) system with higher performance than supervised approaches. In our approach, we interpret the structured data as a corrupt representation of the desired output and use a denoising auto-encoder to reconstruct the sentence. We show how to introduce noise into training examples that do not contain structured data, and that the resulting denoising auto-encoder generalizes to generate correct sentences when given structured data.
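To make the corruption step concrete, here is a minimal sketch of one plausible noise process for creating (noisy input, clean target) training pairs for such a denoising auto-encoder; the paper's actual noise model is not reproduced, and the example sentence is hypothetical:

```python
import random

def corrupt(tokens, drop_prob=0.4, shuffle_window=3):
    """Produce a corrupted, structured-data-like input from a clean sentence."""
    kept = [t for t in tokens if random.random() > drop_prob]   # randomly drop tokens
    out = kept[:]
    for i in range(len(out)):                                   # mild local re-ordering
        j = min(len(out) - 1, i + random.randint(0, shuffle_window - 1))
        out[i], out[j] = out[j], out[i]
    return out

clean = "the restaurant serves cheap italian food in the city centre".split()
pair = (corrupt(clean), clean)   # (input to denoise, reconstruction target)
print(pair)
```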
Automatically Processing Tweets from Gang-Involved Youth: Towards Detecting Loss and Aggression
Violence is a serious problem for cities like Chicago and has been exacerbated by the use of social media by gang-involved youths for taunting rival gangs. We present a corpus of tweets from a young and powerful female gang member and her communicators, which we have annotated with discourse intention, using a deep read to understand how and what triggered conversations to escalate into aggression. We use this corpus to develop a part-of-speech tagger and phrase table for the variant of English that is used, as well as a classifier for identifying tweets that express grieving and aggression.
Data mining in manufacturing: a review based on the kind of knowledge
In modern manufacturing environments, vast amounts of data are collected in database management systems and data warehouses from all involved areas, such as product and process design, assembly, materials planning, quality control, scheduling, maintenance, fault detection and so on. Data mining has emerged as an important tool for knowledge acquisition in manufacturing databases. This paper reviews the literature dealing with knowledge discovery and data mining applications in the broad domain of manufacturing, with a special emphasis on the type of functions to be performed on data. The major data mining functions to be performed include characterization and description, association, classification, prediction, clustering and evolution analysis. The papers reviewed have therefore been categorized in these five categories. It has been shown that there is a rapid growth in the application of data mining in the context of manufacturing processes and enterprises in the last 3 years. This review reveals the progressive applications and existing gaps identified in the context of data mining in manufacturing. A novel text mining approach has also been applied to the abstracts and keywords of 150 identified publications to identify the research gaps and find the linkages between knowledge area, knowledge type and the data mining tools and techniques applied.
Bias in Wikipedia
While studies have shown that Wikipedia articles exhibit quality that is comparable to conventional encyclopedias, research still proves that Wikipedia, overall, is prone to many different types of Neutral Point of View (NPOV) violations that are explicitly or implicitly caused by bias from its editors. Related work focuses on political, cultural and gender bias. We are developing an approach for detecting both explicit and implicit bias in Wikipedia articles and observing its evolution over time. Our approach is based on different factors of bias, with the most important ones being language style, editors, and citations. In this paper we present the approach, methodology and a first analysis.
Ganoderma Lucidum (Reishi Mushroom) and cancer.
Having a long historical past in traditional Chinese medicine, Ganoderma Lucidum (G. Lucidum) is a type of mushroom believed to extend life and promote health. Due to its increasing consumption, it has been cultivated and marketed intensively since the 1970s. It is claimed to be effective in the prevention and treatment of many diseases and, in addition, to exert anticancer properties. Almost all the data on the benefits of G. Lucidum are based on laboratory and preclinical studies. The few clinical studies conducted are questionable. Nevertheless, when the findings obtained from laboratory studies are considered, it turns out that G. Lucidum is likely to have some benefits for cancer patients. What is important at this point is to determine the components that will provide these benefits, and to use them in drug development after testing their reliability. In conclusion, considering its potential side effects, the right approach would be to abstain from using and incentivizing this product until its benefits and harms are clearly established.
Non-line-of-sight Imaging with Partial Occluders and Surface Normals.
Non-line-of-sight (NLOS) imaging aims at recovering the shape and albedo of objects hidden from a camera or a light source. Using ultra-fast pulsed illumination and single photon detectors, the light transport in the scene is sampled for visible objects. The global illumination components of these time-resolved measurements contain sufficient information to estimate the shape of hidden objects. Using a novel formulation for NLOS light transport that models partial occlusions of hidden objects via visibility terms, we demonstrate higher-fidelity reconstructions than previous approaches to NLOS imaging.
Proximity-Fed Circularly Polarized Slotted Patch Antenna for RFID Handheld Reader
A novel, compact X-shaped slotted square-patch antenna is proposed for circularly polarized (CP) radiation. A cross-strip is embedded along the X-shaped slot for a novel proximity-fed technique to produce CP radiation in the UHF band. In the proposed design, two pairs of T-shaped slots are etched orthogonally on the square patch, connected to the center of the X-shaped slot for CP radiation and antenna size reduction. Proper adjustment of the length and coupling gap of the cross-strip will excite two orthogonal modes with 90° phase difference for good CP radiation. Simulated and measured results indicate that the proposed structure can achieve circular polarization. A measured impedance bandwidth (VSWR ≤ 2) of about 3.0% (909-937 MHz) and a 3-dB axial-ratio (AR) bandwidth of about 1.3% (917-929 MHz) were obtained.
Video Games: Good, Bad, or Other?
Empirical Study of Unsupervised Chinese Word Segmentation Methods for SMT on Large-scale Corpora
Unsupervised word segmentation (UWS) can provide domain-adaptive segmentation for statistical machine translation (SMT) without annotated data, and bilingual UWS can even optimize segmentation for alignment. Monolingual UWS approaches that explicitly model the probabilities of words through Dirichlet process (DP) models or Pitman-Yor process (PYP) models have achieved high accuracy, but their bilingual counterparts have only been carried out on small corpora such as the basic travel expression corpus (BTEC) due to the computational complexity. This paper proposes an efficient unified PYP-based monolingual and bilingual UWS method. Experimental results show that the proposed method is comparable to supervised segmenters on the in-domain NIST OpenMT corpus, and yields a 0.96 BLEU relative increase on the NTCIR PatentMT corpus, which is out-of-domain.
Real-time procedural generation of 'pseudo infinite' cities
We present an approach to procedural generation of 'pseudo infinite' virtual cities in real-time. The cities contain geometrically varied buildings that are generated as needed. The building generation parameters are created by a pseudo random number generator, seeded with an integer derived from the building's position. The varied building geometries are extruded from a set of floor plans. The floor plans for each building are created by combining randomly generated polygons in an iterative process. A display list caching and frustum filling approach manages the generation of buildings and the use of system resources. This approach has been implemented on commodity PC hardware, resulting in interactive frame rates.
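A minimal sketch of the position-seeded parameter generation described above; the parameter names, ranges and hashing scheme are illustrative assumptions, not the paper's actual values:

```python
import random

def building_params(x, z, block_size=40):
    """Derive deterministic building parameters from the building's grid position."""
    seed = hash((x // block_size, z // block_size)) & 0xFFFFFFFF
    rng = random.Random(seed)              # per-building generator, reproducible
    floors  = rng.randint(4, 40)
    sides   = rng.randint(4, 8)            # footprint polygon complexity
    setback = rng.uniform(0.0, 0.3)        # fraction of upper floors stepped back
    return {"floors": floors, "sides": sides, "setback": setback}

# The same position always yields the same building, so the city is 'pseudo infinite':
assert building_params(1200, -380) == building_params(1200, -380)
print(building_params(1200, -380))
```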
Organoboron compounds as Lewis acid receptors of fluoride ions in polymeric membranes.
Newly synthesized organoboron compounds - 4-octyloxyphenylboronic acid (OPBA) and pinacol ester of 2,4,6-trifluorophenylboronic acid (PE-PBA) - were applied as Lewis acid receptors of fluoride anions. Despite enhanced selectivity, the polymer membrane electrodes containing the lipophilic receptor OPBA exhibited non-Nernstian slopes of the responses toward fluoride ions in acidic conditions. Such behavior was explained by the lability of the B-O bond in the boronic acids, and the OH⁻/F⁻ exchange at higher fluoride content in the sample solution. In consequence, the stoichiometry of the OPBA-fluoride complexes in the membrane could vary during the calibration, changing the equilibrium concentration of the primary anion in the membrane and providing super-Nernstian responses. The proposed mechanism was supported by ¹⁹F NMR studies, which indicated that the fluoride complexation proceeds more effectively in acidic solution, leading mainly to PhBF₃⁻ species. Finally, the performances of the membranes based on the phenylboronic acid pinacol ester, with a more stable B-O bond, were tested. As expected, Nernstian fluoride responses were recorded for such membranes, with worsened fluoride selectivity.
Low-Cost Compact Circularly Polarized Directional Antenna for Universal UHF RFID Handheld Reader Applications
A low-cost compact circularly polarized (CP) directional antenna is proposed for universal ultra-high frequency (UHF) RF identification (RFID) handheld reader applications. The antenna consists of four sequentially rotated inverted-F antennas (IFAs) fed by a compact four-feed network with closed-form design formulas. A prototype is implemented with FR-4 substrates and a thick air layer for the reduction of cost. For return loss (RL) > 13 dB, 3-dB gain variation, and axial ratio (AR), the prototype achieves a measured bandwidth of 14.9% and exhibits stable symmetrical directional radiation patterns with wide half-power beamwidth. Compared with the reported UHF RFID directional reader antennas, the proposed structure not only exhibits the largest CP bandwidth for easily covering the entire UHF RFID band but also achieves a compact size of 95 × 100 × 13.6 mm³.
The economic impacts of Mergers and Acquisitions: modelling and application
This paper presents a dynamic Computable General Equilibrium (CGE) model for China by incorporating the scale economy and imperfect competition into the MCHUGE model. Based on the Cournot equation, we derive a link between the share weighted price-cost mark-up across all firms in industry and the absolute degree of concentration (CRn) index and Hirschman-Herfindahl Index (HHI). Using the extension model, this paper gives an empirical illustration to analyse the impacts of the Chinese steel industry's Mergers and Acquisitions (M&As) on China's macro economy and industries.
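The link between the share-weighted mark-up and concentration indices mentioned above can be illustrated with the textbook Cournot result (a standard derivation under homogeneous goods, shown here for context rather than as the paper's exact specification): firm $i$'s first-order condition gives
\[
\frac{p - c_i}{p} = \frac{s_i}{|\varepsilon|},
\qquad\text{so}\qquad
\sum_i s_i \,\frac{p - c_i}{p} = \frac{\sum_i s_i^2}{|\varepsilon|} = \frac{HHI}{|\varepsilon|},
\]
where $s_i$ is firm $i$'s market share, $c_i$ its marginal cost and $\varepsilon$ the price elasticity of market demand; the share-weighted price-cost mark-up is thus proportional to the Herfindahl-Hirschman Index.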
Experimental Design and Data Analysis for Biologists
An essential textbook for any student or researcher in biology needing to design experiments, sampling programs or analyze the resulting data. The text begins with a revision of estimation and hypothesis testing methods, covering both classical and Bayesian philosophies, before advancing to the analysis of linear and generalized linear models. Topics covered include linear and logistic regression, simple and complex ANOVA models (for factorial, nested, block, split-plot and repeated measures and covariance designs), and log-linear models. Multivariate techniques, including classification and ordination, are then introduced. Special emphasis is placed on checking assumptions, exploratory data analysis and presentation of results. The main analyses are illustrated with many examples from published papers and there is an extensive reference list to both the statistical and biological literature. The book is supported by a website that provides all data sets, questions for each chapter and links to software.
Deep Reinforcement Learning for Continuous Control
Reinforcement learning is a mathematical framework for agents to interact intelligently with their environment. In this field, real-world control problems are particularly challenging because of the noise and the high-dimensionality of input data (e.g., visual input). In the last few years, deep neural networks have been successfully used to extract meaning from such data. Building on these advances, deep reinforcement learning achieved stunning results in the field of artificial intelligence, being able to solve complex problems like Atari games [1] and Go [2]. However, in order to apply the same methods to real-world control problems, deep reinforcement learning has to be able to deal with continuous action spaces. In this thesis, Deep Deterministic Policy Gradients, a deep reinforcement learning method for continuous control, has been implemented, evaluated and put into context to serve as a basis for further research in the field.
RN to BSN Transition: A Concept Analysis.
Over 670,000 ADN- and diploma-prepared nurses will need to complete their BSN degrees to meet the Institute of Medicine's recommendation that at least 80% of registered nurses (RNs) be BSN-prepared by year 2020. Understanding motivators, barriers, and the transition experience for RNs to advance their degree will help educators and nurse leaders understand the importance of a partnership to educate and mentor RNs to pursue a BSN degree.
Comparative evaluation of soft-switching concepts for bi-directional buck+boost dc-dc converters
Soft-switching techniques are an enabling technology to further reduce the losses and the volume of automotive dc-dc converters, utilized to interconnect the high voltage battery or ultra-capacitor to the dc-link of a Hybrid Electrical Vehicle (HEV) or a Fuel Cell Vehicle (FCV). However, as the performance indices of a power electronics converter, such as efficiency and power density, are competing and moreover dependent on the underlying specifications and technology node, a comparison of different converter topologies naturally demands detailed analytical models. Therefore, to investigate the performance of the ARCP, CF-ZVS-M, SAZZ and ZCT-QZVT soft-switching converters, the paper discusses in detail the advantages and drawbacks of each concept, and the impact of the utilized semiconductor technology and silicon area on the converter efficiency. The proposed analytical models that correlate semiconductor, capacitor and inductor losses with the component volume furthermore allow for a comparison of power density and to find the η-ρ-Pareto-Front of the CF-ZVS-M converter.
Reuse in the wild: an empirical and ethnographic study of organizational content reuse
We present a large-scale study of content reuse networks in a large and highly hierarchical organization. In our study, we combine analysis of a collection of presentations produced by employees with interviews conducted throughout the organization and a survey to study presentation content reuse. Study results show a variety of information needs and behaviors related to content reuse as well as a need for a personalized and socially-integrated networking tool for enabling easy access to previously generated presentation material. In this paper we describe our findings and outline a set of requirements for an effective content reuse facility.
Multiview Clustering via Adaptively Weighted Procrustes
In this paper, we make a multiview extension of the spectral rotation technique raised in single-view spectral clustering research. Since spectral rotation is closely related to Procrustes Analysis for point matching, we point out that the classical Procrustes Average approach can be used for multiview clustering. Besides, we show that directly applying Procrustes Average (PA) to multiview tasks may not be optimal theoretically and empirically, since it does not take the clustering capacity differences of different views into consideration. We therefore propose an Adaptively Weighted Procrustes (AWP) approach to overcome this deficiency. Our new AWP weights views with their clustering capacities and forms a weighted Procrustes Average problem accordingly. The optimization algorithm for solving the new model is analyzed for computational complexity, and its convergence is guaranteed. Experiments on five real-world datasets demonstrate the effectiveness and efficiency of the new models.
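In simplified notation (an illustrative formulation rather than necessarily the authors' exact objective), with $Y_v$ the spectral embedding of view $v$, $R_v$ an orthonormal rotation and $F$ the shared discrete cluster indicator, the Procrustes Average problem is
\[
\min_{F,\{R_v\}} \sum_v \left\lVert Y_v R_v - F \right\rVert_F^2 ,
\]
and the weighted variant replaces the unweighted sum with view weights $w_v$ reflecting each view's clustering capacity:
\[
\min_{F,\{R_v\}} \sum_v w_v \left\lVert Y_v R_v - F \right\rVert_F^2 .
\]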
Congruent leadership: values in action.
AIM(S) To discuss the significance of an appropriate leadership theory in order to develop an understanding of clinical leadership. BACKGROUND Leadership theories developed from management and related paradigms, particularly transformational leadership, may be ineffective in supporting nurses to gain insights into clinical leadership or to develop and implement clinical leadership skills. Instead, congruent leadership theory, based on a match between the clinical leaders' actions and their values and beliefs about care and nursing, may offer a more firm theoretical foundation on which clinical nurses can build an understanding of and capacity to implement clinical leadership or become clinical leaders. EVALUATION The information used is drawn from the contemporary literature and a study conducted by the author. KEY ISSUE(S) Leadership can be better understood when an appropriate theoretical foundation is employed. CONCLUSIONS With regard to clinical leadership, congruent leadership is proposed as the most appropriate theory. IMPLICATIONS FOR NURSING MANAGEMENT It is important to recognize that leadership theories based on a management paradigm may not be appropriate for all clinical applications. Education should be aimed specifically at clinical leaders, recognizing that clinical leaders are followed not for their vision or creativity (even if they demonstrate these), but because they translate their values and beliefs about care into action.
Inactivation of antibiotic resistance genes in municipal wastewater by chlorination, ultraviolet, and ozonation disinfection.
This study investigated the inactivation of two antibiotic resistance genes (ARGs) - sul1 and tetG - and the integrase gene of class 1 integrons, intI1, by chlorination, ultraviolet (UV), and ozonation disinfection. Inactivation of sul1, tetG, and intI1 increased with the doses of the three disinfectants, and chlorine disinfection achieved more inactivation of ARGs and intI1 genes (chlorine dose of 160 mg/L with contact time of 120 min for 2.98-3.24 log reductions of ARGs) than UV irradiation (UV dose of 12,477 mJ/cm² for 2.48-2.74 log reductions of ARGs) and ozonation disinfection (ozonation dose of 177.6 mg/L for 1.68-2.55 log reductions of ARGs). The 16S rDNA was more efficiently removed than ARGs by ozone disinfection. The relative abundance of selected genes (normalized to 16S rDNA) increased during ozonation and at low doses of UV and chlorine disinfection. Inactivation of sul1 and tetG showed strong positive correlations with the inactivation of intI1 genes (for sul1, R² = 0.929 with p < 0.01; for tetG, R² = 0.885 with p < 0.01). Compared to other technologies (ultraviolet disinfection, ozonation disinfection, Fenton oxidation, and coagulation), chlorination is an alternative method to remove ARGs from wastewater effluents. At a chlorine dose of 40 mg/L with 60 min contact time, the inactivation efficiency of the selected genes could reach 1.65-2.28 log, and the cost was estimated at 0.041 yuan/m³.
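For reference, the "log reduction" figures quoted above follow the usual definition (a general formula rather than anything specific to this study):
\[
\text{log reduction} = \log_{10}\!\left(\frac{C_{\text{before}}}{C_{\text{after}}}\right),
\]
so a 3-log reduction corresponds to removing about 99.9% of the gene copies initially present, and the reported 2.98-3.24 log reductions correspond to roughly 99.9% removal.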
The effect of C-reactive protein reduction with a highly specific antisense oligonucleotide on atrial fibrillation assessed using beat-to-beat pacemaker Holter follow-up
C-reactive protein (CRP) is known to be strongly associated with atrial fibrillation (AF). However, it is not clear if CRP is a causal factor for AF. ISIS-CRPRx is a novel antisense oligonucleotide that reduces CRP production by specifically inhibiting mRNA translation. The effect of ISIS-CRPRx on AF was evaluated. A double-blind phase II trial of ISIS-CRPRx in patients with paroxysmal AF and DDDRP permanent pacemakers (PPMs) with advanced atrial and ventricular Holters allowing beat-to-beat arrhythmia follow-up. Twenty six patients were screened and seven patients dosed with ISIS-CRPRx. After 4 weeks of baseline assessment, patients were randomly assigned to two treatment periods of either placebo then ISIS-CRPRx or ISIS-CRPRx then placebo. All patients were followed up for 8 weeks after the active treatment period. There was a 63.7 % (95 % CI 38.4 to 78.6 %, p = 0.003) relative reduction in CRP on treatment with ISIS-CRPRx versus baseline. Sensitivity analyses demonstrated a consistent treatment effect. The primary end-point was change in AF burden assessed by PPM. There was no significant difference in AF burden on treatment with ISIS-CRPRx versus baseline (OR 1.6, 95 % CI −2.42 to 5.62, p = 0.37). ISIS CRPRx was safe and well tolerated and there were no serious adverse events. Treatment with ISIS-CRPRx did not reduce AF burden in patients with paroxysmal AF and PPMs, despite a large relative reduction in CRP. In this population, highly specific CRP reduction had no clinically discernable effect upon paroxysmal AF. However, average levels of CRP at baseline were relatively low, so it remains possible that AF patients with higher levels of CRP may benefit from CRP-directed therapy.
A Web-Based Intelligent Tutoring System for Computer Programming
Web Intelligence is a direction for scientific research that explores practical applications of Artificial Intelligence to the next generation of Web-empowered systems. In this paper, we present a Web-based intelligent tutoring system for computer programming. The decision making process conducted in our intelligent system is guided by Bayesian networks, which are a formal framework for uncertainty management in Artificial Intelligence based on probability theory. Whereas many tutoring systems are static HTML Web pages of a class textbook or lecture notes, our intelligent system can help a student navigate through the online course materials, recommend learning goals, and generate appropriate reading sequences.
Rule extraction from trained adaptive neural networks using artificial immune systems
Although artificial neural networks (ANNs) usually reach high classification accuracy, the obtained results may sometimes be incomprehensible. This fact causes a serious problem in data mining applications. To solve this problem, rules need to be derived from the trained ANN, and various methods have been developed to extract such rules. The activation function is critical, as the behavior and performance of an ANN model largely depend on it. So far there have been limited studies with emphasis on setting a few free parameters in the neuron activation function. ANNs with such activation functions seem to provide better fitting properties than classical architectures with fixed activation function neurons [Xu, S., & Zhang, M. (2005). Data mining - An adaptive neural network model for financial analysis. In Proceedings of the third international conference on information technology and applications]. In this study a new method that uses an artificial immune systems (AIS) algorithm is presented to extract rules from trained adaptive neural networks. Data from two real-world problems were investigated to determine the applicability of the proposed method. The data were obtained from the University of California at Irvine (UCI) machine learning repository. The datasets comprise breast cancer disease and ECG data. The proposed method achieved accuracy values of 94.59% and 92.31% for the ECG and breast cancer datasets, respectively. These results are among the best compared with the results obtained in related previous studies and reported on the UCI web site.
Feature Selection for Maximizing the Area Under the ROC Curve
Feature selection is an important pre-processing step for solving classification problems. A good feature selection method may not only improve the performance of the final classifier, but also reduce its computational complexity. Traditionally, feature selection methods were developed to maximize the classification accuracy of a classifier. Recently, both theoretical and experimental studies revealed that a classifier with the highest accuracy might not be ideal in real-world problems. Instead, the Area Under the ROC Curve (AUC) has been suggested as an alternative metric, and many existing learning algorithms have been modified to seek the classifier with maximum AUC. However, little work has been done to develop new feature selection methods suited to the requirement of AUC maximization. To fill this gap in the literature, we propose in this paper a novel algorithm, called the AUC and Rank Correlation coefficient Optimization (ARCO) algorithm. ARCO adopts the general framework of a well-known method, namely the minimal-redundancy-maximal-relevance (mRMR) criterion, but defines the terms "relevance" and "redundancy" in totally different ways. Such a modification looks trivial from the perspective of algorithmic design. Nevertheless, an experimental study on four gene expression data sets showed that feature subsets obtained by ARCO resulted in classifiers with significantly larger AUC than the feature subsets obtained by mRMR. Moreover, ARCO also outperformed the Feature Assessment by Sliding Thresholds algorithm, which was recently proposed for AUC maximization, and thus the efficacy of ARCO was validated.
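To illustrate the general idea, here is a minimal mRMR-style greedy selection sketch that uses per-feature AUC as relevance and rank correlation with already-selected features as redundancy; the simple difference criterion, the toy data and all parameter choices below are assumptions for illustration, not ARCO's exact definitions:

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

def auc_rankcorr_selection(X, y, k):
    """Greedily pick k features: high single-feature AUC, low rank correlation
    with features already selected (illustrative sketch only)."""
    n_features = X.shape[1]
    relevance = np.array([max(roc_auc_score(y, X[:, j]), roc_auc_score(y, -X[:, j]))
                          for j in range(n_features)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean([abs(spearmanr(X[:, j], X[:, s])[0]) for s in selected])
            score = relevance[j] - redundancy          # relevance minus redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

X = np.random.rand(200, 10)
y = (X[:, 0] + 0.3 * np.random.rand(200) > 0.8).astype(int)
print(auc_rankcorr_selection(X, y, k=3))
```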
Research on Chinese text classification based on Word2vec
The set of features selected by the traditional chi-square feature selection algorithm is not complete, which lowers the performance of the final text classification. This paper therefore proposes a method that utilizes word2vec-generated word vectors to improve chi-square feature selection. The algorithm applies the word vectors generated by word2vec to the traditional feature selection process and uses semantically similar words to supplement the feature set where appropriate. Ultimately, the feature set obtained by this method has better discriminatory power, because a feature word with strong discriminatory power tends to have semantically similar words that are also good at distinguishing categories. On this basis, multiple experiments have been carried out in this paper. The experimental results show that the performance of text classification increases after the extension of the feature words.
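A minimal sketch of this pipeline on toy data, combining scikit-learn's chi-square scoring with gensim's word2vec similarity to extend the selected feature set; the documents, cut-offs and number of added neighbours are hypothetical:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import chi2
from gensim.models import Word2Vec

docs   = ["股市 上涨 经济 复苏", "球队 夺冠 比赛 精彩"]   # toy pre-segmented documents
labels = [0, 1]

# 1) Traditional chi-square scoring on bag-of-words counts.
vec = CountVectorizer(token_pattern=r"(?u)\S+")
X = vec.fit_transform(docs)
scores, _ = chi2(X, labels)
vocab = np.array(vec.get_feature_names_out())
selected = set(vocab[np.argsort(scores)[::-1][:2]])        # keep top-scoring words

# 2) Supplement the feature set with semantically similar words from word2vec.
w2v = Word2Vec([d.split() for d in docs], vector_size=20, min_count=1, epochs=50)
for word in list(selected):
    if word in w2v.wv:
        selected.update(w for w, _ in w2v.wv.most_similar(word, topn=2))
print(selected)
```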
Cloud-based smart waste management for smart cities
With the ever increasing population, urbanization, migration issues, and changes in lifestyle, municipal solid waste generation levels are increasing significantly. Hence, waste management becomes a challenge faced not only by developing nations, but also by developed and advanced countries. Overall waste management involves three main types of entities: 1) users who generate waste, 2) waste collectors/city administration, and 3) stakeholders. Waste management directly affects lifestyle, healthcare, the environment, recycling and disposal, and several other industries. Current waste management practices are not sophisticated enough to achieve a robust and efficient waste management mechanism. It is very important to have a smart way of managing waste, so that not only is the waste status reported in time for collection, but all stakeholders are also made aware, in a timely fashion, of what type of waste, in what quantity, will arrive at what particular time. This will not only help in attracting and identifying stakeholders, but will also aid in creating more effective ways of recycling and minimizing waste, making overall waste management more efficient and environmentally friendly. Keeping all this in mind, we propose a cloud-based smart waste management mechanism in which the waste bins are equipped with sensors, capable of notifying their waste level status and uploading that status to the cloud. The stakeholders are able to access the desired data from the cloud. Moreover, for city administration and waste management, it will be possible to perform route optimization and select paths for waste collection according to the statuses of waste bins in a metropolis, helping with fuel and time efficiency.
Human Identification using Face and Voice Recognition
In this paper we propose a new technique for human identification using the fusion of face and speech, which can substantially improve the recognition rate compared to single-biometric identification for security system development. The proposed system uses principal component analysis (PCA) as the feature extraction technique, computing the eigenvectors and eigenvalues. These feature vectors are compared using a similarity measure, the Mahalanobis distance, for decision making. Mel-frequency cepstral coefficient (MFCC) feature extraction is used for speech recognition in our project. Cross-correlation coefficients are considered as primary features. A Hidden Markov Model (HMM) is used to calculate the likelihoods of the MFCC-extracted features to make the decision about the spoken words.
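As a rough illustration of the face branch only (PCA projection followed by Mahalanobis-distance matching), here is a runnable sketch on random data standing in for face images; the MFCC/HMM voice branch and the fusion step are not shown, and all sizes are arbitrary:

```python
import numpy as np
from scipy.spatial.distance import mahalanobis
from sklearn.decomposition import PCA

# Toy "face" data: rows are flattened gallery images, probe is a noisy copy of one of them.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(20, 64 * 64))
probe   = gallery[3] + 0.05 * rng.normal(size=64 * 64)

pca = PCA(n_components=10).fit(gallery)          # eigenvectors/eigenvalues of the face space
G = pca.transform(gallery)                       # gallery projected onto the principal components
q = pca.transform(probe.reshape(1, -1))[0]

VI = np.linalg.inv(np.cov(G, rowvar=False))      # inverse covariance for the Mahalanobis metric
dists = [mahalanobis(q, g, VI) for g in G]
print("best match:", int(np.argmin(dists)))      # expected: gallery index 3
```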
Collective Spammer Detection in Evolving Multi-Relational Social Networks
Detecting unsolicited content and the spammers who create it is a long-standing challenge that affects all of us on a daily basis. The recent growth of richly-structured social networks has provided new challenges and opportunities in the spam detection landscape. Motivated by the Tagged.com social network, we develop methods to identify spammers in evolving multi-relational social networks. We model a social network as a time-stamped multi-relational graph where vertices represent users, and edges represent different activities between them. To identify spammer accounts, our approach makes use of structural features, sequence modelling, and collective reasoning. We leverage relational sequence information using k-gram features and probabilistic modelling with a mixture of Markov models. Furthermore, in order to perform collective reasoning and improve the predictive power of a noisy abuse reporting system, we develop a statistical relational model using hinge-loss Markov random fields (HL-MRFs), a class of probabilistic graphical models which are highly scalable. We use Graphlab Create and Probabilistic Soft Logic (PSL) to prototype and experimentally evaluate our solutions on internet-scale data from Tagged.com. Our experiments demonstrate the effectiveness of our approach, and show that models which incorporate the multi-relational nature of the social network significantly gain predictive performance over those that do not.
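The k-gram features mentioned above can be illustrated with a short sketch; the activity names are hypothetical, and the mixture-of-Markov-models and HL-MRF stages are not shown:

```python
from collections import Counter

def kgram_features(actions, k=2):
    """Counts of length-k subsequences of a user's time-ordered activity sequence."""
    return Counter(tuple(actions[i:i + k]) for i in range(len(actions) - k + 1))

# Hypothetical per-user activity log (edge types in the multi-relational graph).
user_actions = ["message", "message", "friend_request", "report_abuse", "message"]
print(kgram_features(user_actions, k=2))
# e.g. Counter({('message', 'message'): 1, ('message', 'friend_request'): 1, ...})
```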
DrAcc: a DRAM based Accelerator for Accurate CNN Inference
Modern Convolutional Neural Networks (CNNs) are computation and memory intensive. Thus it is crucial to develop hardware accelerators to achieve high performance as well as power/energy-efficiency on resource limited embedded systems. DRAM-based CNN accelerators exhibit great potentials but face inference accuracy and area overhead challenges. In this paper, we propose DrAcc, a novel DRAM-based processing-in-memory CNN accelerator. DrAcc achieves high inference accuracy by implementing a ternary weight network using in-DRAM bit operation with simple enhancements. The data partition and mapping strategies can be flexibly configured for the best trade-off among performance, power and energy consumption, and DRAM data reuse factors. Our experimental results show that DrAcc achieves 84.8 FPS (frame per second) at 2W and 2.9× power efficiency improvement over the process-near-memory design.
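For context, a common ternary-weight quantization recipe looks like the sketch below; this is a generic scheme (threshold at a fraction of the mean absolute weight, with a per-layer scaling factor), not necessarily the exact one implemented in DrAcc:

```python
import numpy as np

def ternarize(w, threshold_ratio=0.7):
    """Quantize float weights to {-1, 0, +1} plus a per-layer scale alpha."""
    delta = threshold_ratio * np.mean(np.abs(w))            # values below delta become 0
    t = np.where(np.abs(w) > delta, np.sign(w), 0.0)
    alpha = np.abs(w[t != 0]).mean() if np.any(t) else 0.0  # scale for the kept weights
    return alpha, t

w = np.random.randn(3, 3)
alpha, t = ternarize(w)
print(alpha)
print(t)   # entries in {-1, 0, 1}: multiply-accumulate reduces to in-memory add/subtract
```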
Antibodies to citrullinated α-enolase peptide 1 and clinical and radiological outcomes in rheumatoid arthritis.
INTRODUCTION The anticyclic citrullinated peptide 2 (anti-CCP2) assay is a generic test for antibodies to citrullinated proteins, among which there is a subset of about 50% with antibodies to citrullinated enolase peptide 1 (CEP-1). The anti-CEP-1 positive subset is strongly associated with the HLA-DRB1 shared epitope and its interaction with smoking. OBJECTIVE To investigate whether anti-CEP-1 antibodies may be helpful in predicting outcome. METHODS Anti-CEP-1 and anti-CCP2 antibodies were measured in two prospective cohorts of patients (Karolinska n=272, Norfolk Arthritis Register (NOAR) n=408) with early rheumatoid arthritis (RA). Outcomes measured were C-reactive protein, erythrocyte sedimentation rate, visual analogue scales for pain and global assessment of disease activity, Health Assessment Questionnaire, physician's assessment, swollen and tender joint counts and radiological progression. RESULTS Anti-CCP2 antibodies were present in 57% and 50%, and anti-CEP-1 in 27% and 24% of the Karolinska and NOAR cohorts, respectively. Importantly, no statistically significant differences in clinical outcomes were demonstrated between the anti-CEP-1-/CCP2+ and the anti-CEP-1+/CCP2+ subsets in either cohort, or in radiological outcomes in the Karolinska cohort. CONCLUSION Although antibodies to specific citrullinated proteins may have distinct genetic and environmental risk factors, the similarity in clinical phenotype suggests that they share common pathways in the pathogenesis of joint disease in RA.
Graph-boosted convolutional neural networks for semantic segmentation
This paper investigates the problem of weakly-supervised semantic segmentation, where image-level labels are used as weak supervision. Inspired by the successful use of Convolutional Neural Networks (CNNs) for fully-supervised semantic segmentation, we choose to directly train the CNNs over the oversegmented regions of images for weakly-supervised semantic segmentation. Although there are a few studies on CNNs-based weakly-supervised semantic segmentation, they have rarely considered the noise issue, i.e., the initial weak labels (e.g., social tags) may be noisy. To cope with this issue, we thus propose graph-boosted CNNs (GB-CNNs) for weakly-supervised semantic segmentation. In our GB-CNNs, the graph-based model provides the initial supervision for training the CNNs, and then the outcomes of the CNNs are used to retrain the graph-based model. This training procedure is iteratively implemented to boost the results of semantic segmentation. Experimental results demonstrate that the proposed model outperforms the state-of-the-art weakly-supervised methods. More notably, the proposed model is shown to be more robust in the noisy setting for weakly-supervised semantic segmentation.
Designing a socially assistive robot for personalized number concepts learning in preschool children
Designing technological systems for personalized education is an iterative and interdisciplinary process that demands a deep understanding of the application domain, the limitations of current methods and technologies, and the computational methods and complexities behind user modeling and adaptation. We present our design process and the Socially Assistive Robot (SAR) tutoring system to support the efforts of educators in teaching number concepts to preschool children. We focus on the computational considerations of designing a SAR system for young children that may later be personalized along multiple dimensions. We conducted an initial data collection to validate that the system is at the proper challenge level for our target population, and discovered promising patterns in participants' learning styles, nonverbal behavior, and performance. We discuss our plans to leverage the data collected to learn and validate a computational, multidimensional model of number concepts learning.
Under which conditions does ICT have a positive effect on teaching and learning? A Call to Action
‘Under which conditions does ICT have a positive effect on teaching and learning?’ This was the leading question of the International EDUsummIT in The Hague, the Netherlands. The bases for the discussion were the scholarly findings of the International Handbook of Information Technology in Primary and Secondary Education, a synthesis of research in the field of information and communication technology (ICT) in education. Seventy international policymakers, researchers, and practitioners developed a Call to Action, which summarizes the main action points where policy, research, and leadership need to join forces in order to successfully implement ICT in educational practice. These main action points include a view on the role of ICT in 21st century learning; conditions for realizing the potential of multiple technologies to address individual needs of students; better understanding of the relationship between formal and informal learning; the implications of technology for student assessment; the need for models for leadership and teacher learning to successfully implement technology; the potential of ICT for digital equity; and the development of a list of essential conditions to ensure benefit from ICT investments. In this contribution, we present the Call to Action and synthesize the research on which the Call is based.
Predominant Leptospiral Serogroups Circulating among Humans, Livestock and Wildlife in Katavi-Rukwa Ecosystem, Tanzania
BACKGROUND Leptospirosis is a worldwide zoonotic disease and a serious, under-reported public health problem, particularly in rural areas of Tanzania. In the Katavi-Rukwa ecosystem, humans, livestock and wildlife live in close proximity, which exposes them to the risk of a number of zoonotic infectious diseases, including leptospirosis. METHODOLOGY/PRINCIPAL FINDINGS A cross-sectional epidemiological study was carried out in the Katavi region, South-west Tanzania, to determine the seroprevalence of Leptospira spp in humans, domestic ruminants and wildlife. Blood samples were collected from humans (n = 267), cattle (n = 1,103), goats (n = 248), buffaloes (n = 38), zebra (n = 2), lions (n = 2), rodents (n = 207) and shrews (n = 11). Decanted sera were tested using the Microscopic Agglutination Test (MAT) for antibodies against six live serogroups belonging to the Leptospira spp, with a cutoff point of ≥ 1:160. The prevalence of leptospiral antibodies was 29.96% in humans, 30.37% in cattle, 8.47% in goats, 28.95% in buffaloes, 20.29% in rodents and 9.09% in shrews. Additionally, one of the two lion samples was seropositive. A significant difference in prevalence (P < 0.05) was observed between cattle and goats. No significant difference in prevalence was observed with respect to age and sex in humans or any of the sampled animal species. The most prevalent serogroups with antibodies of Leptospira spp were Sejroe, Hebdomadis, Grippotyphosa, Icterohaemorrhagiae and Australis, which were detected in humans, cattle, goats and buffaloes; Sejroe and Grippotyphosa, which were detected in a lion; Australis, Icterohaemorrhagiae and Grippotyphosa, which were detected in rodents; and Australis, which was detected in shrews. Antibodies to serogroup Ballum were detected only in humans. CONCLUSIONS The results of this study demonstrate that leptospiral antibodies are widely prevalent in humans, livestock and wildlife from the Katavi-Rukwa ecosystem. The disease poses a serious economic and public health threat in the study area. This epidemiological study provides information on circulating serogroups, which will be essential in designing intervention measures to reduce the risk of disease transmission.
A framework for robust discovery of entity synonyms
Entity synonyms are critical for many applications like information retrieval and named entity recognition in documents. The current trend is to automatically discover entity synonyms using statistical techniques on web data. Prior techniques suffer from several limitations like click log sparsity and inability to distinguish between entities of different concept classes. In this paper, we propose a general framework for robustly discovering entity synonyms with two novel similarity functions that overcome the limitations of prior techniques. We develop efficient and scalable techniques leveraging the MapReduce framework to discover synonyms at large scale. To handle long entity names with extraneous tokens, we propose techniques to effectively map long entity names to short queries in the query log. Our experiments on real data from different entity domains demonstrate the superior quality of our synonyms as well as the efficiency of our algorithms. The entity synonyms produced by our system are in production in Bing Shopping and Video search, with experiments showing the significant improvement they bring to the search experience.
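One generic way to score synonym candidates, used here only as a stand-in for the paper's two novel similarity functions (which are not reproduced), is Jaccard overlap between the sets of query-log contexts in which each candidate string appears; the toy contexts below are hypothetical.

```python
def jaccard(a, b):
    """Jaccard similarity of two context sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical query-log contexts observed for two entity strings.
contexts = {
    "nyc":           {"hotels in", "weather", "flights to"},
    "new york city": {"hotels in", "weather", "population of"},
}
print(jaccard(contexts["nyc"], contexts["new york city"]))  # 0.5
```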
Particle Swarm Optimization
A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
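A minimal global-best particle swarm optimizer for continuous function minimization, in the spirit of the paradigm described above; the inertia weight and acceleration coefficients are typical textbook values, not parameters from the original paper.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over a box using a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))         # positions
    v = np.zeros_like(x)                                # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()              # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, f(g)

sphere = lambda z: float(np.sum(z ** 2))
print(pso(sphere, dim=5))   # should approach the origin
```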
Real-Time Illegal Parking Detection in Outdoor Environments Using 1-D Transformation
With decreasing costs of high-quality surveillance systems, human activity detection and tracking has become increasingly practical. Accordingly, automated systems have been designed for numerous detection tasks, but the task of detecting illegally parked vehicles has been left largely to the human operators of surveillance systems. We propose a methodology for detecting this event in real time by applying a novel image projection that reduces the dimensionality of the data and, thus, reduces the computational complexity of the segmentation and tracking processes. After event detection, we invert the transformation to recover the original appearance of the vehicle and to allow for further processing that may require 2-D data. We evaluate the performance of our algorithm using the i-LIDS vehicle detection challenge datasets as well as videos we have taken ourselves. These videos test the algorithm in a variety of outdoor conditions, including nighttime video and instances of sudden changes in weather.
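The dimensionality-reducing projection can be sketched as follows: pixels in the monitored region are accumulated along one direction so each frame collapses to a 1-D profile, on which background subtraction and tracking become cheap. The straight vertical lane and the naive background model below are simplifying assumptions; the paper's projection is tailored to the marked parking region.

```python
import numpy as np

def lane_profile(gray_frame, lane_cols):
    """Collapse the lane region of a grayscale frame to a 1-D intensity profile."""
    lane = gray_frame[:, lane_cols[0]:lane_cols[1]]
    return lane.mean(axis=1)                 # one value per image row

def foreground_1d(profile, bg_profile, thresh=15.0):
    """1-D background subtraction: rows whose profile deviates from the background."""
    return np.abs(profile - bg_profile) > thresh

frame = np.random.randint(0, 256, (240, 320)).astype(np.float32)   # dummy frame
bg = np.full(240, frame[:, 100:140].mean())                        # dummy background profile
mask = foreground_1d(lane_profile(frame, (100, 140)), bg)
print(mask.sum(), "rows flagged as foreground")
```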
Background Inpainting for Videos with Dynamic Objects and a Free-Moving Camera
We propose a method for removing marked dynamic objects from videos captured with a free-moving camera, so long as the objects occlude parts of the scene with a static background. Our approach takes as input a video, a mask marking the object to be removed, and a mask marking the dynamic objects to remain in the scene. To inpaint a frame, we align other candidate frames in which parts of the missing region are visible. Among these candidates, a single source is chosen to fill each pixel so that the final arrangement is color-consistent. Intensity differences between sources are smoothed using gradient domain fusion. Our frame alignment process assumes that the scene can be approximated using piecewise planar geometry: A set of homographies is estimated for each frame pair, and one each is selected for aligning pixels such that the color-discrepancy is minimized and the epipolar constraints are maintained. We provide experimental validation with several real-world video sequences to demonstrate that, unlike in previous work, inpainting videos shot with free-moving cameras does not necessarily require estimation of absolute camera positions and per-frame per-pixel depth maps.
How to draw a group?
A map is at the same time a group. To represent a map (that is, a graph drawn on the sphere or on another surface) we usually use a pair of permutations on the set of the 'ends' of edges. These permutations generate a group which we call a cartographic group. The main motivation for the study of the cartographic group is the so-called theory of “dessins d'enfants” of Grothendieck, which relates the theory of maps to Galois theory [24]. In the present paper we address the questions of identifying the cartographic group for a given map, and of constructing the maps with a given cartographic group.
Rectifying the Bound Document Image Captured by the Camera: A Model Based Approach
A model-based approach for rectifying the camera image of a bound document has been developed, in which the surface of the document is represented by a general cylindrical surface. The principle of using the model to unwrap the image is discussed. In practice, the skeleton of each horizontal text line is extracted to help estimate the parameters of the model and rectify the image. Only a little a priori knowledge is required to use the model, and no auxiliary device is necessary. Experimental results are given to demonstrate the feasibility and the stability of the method.
Feature integration analysis of bag-of-features model for image retrieval
One of the biggest challenges in content based image retrieval is to solve the problem of “semantic gaps” between low-level features and high-level semantic concepts. In this paper, we aim to investigate various combinations of mid-level features to build an effective image retrieval system based on the bag-of-features (BoF) model. Specifically, we study two ways of integrating descriptors: SIFT with LBP, and HOG with LBP. Based on the qualitative and quantitative evaluations on two benchmark datasets, we show that the integrations of these features yield complementary and substantial improvement on image retrieval even with noisy background and ambiguous objects. Two integration models are proposed: patch-based integration and image-based integration. By using a weighted K-means clustering algorithm, the image-based SIFT-LBP integration achieves the best performance on the given benchmark problems compared to the existing algorithms.
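A patch-based integration can be approximated by concatenating the two descriptors computed on the same patch before vocabulary learning. The sketch below uses random vectors in place of real SIFT/LBP descriptors and scikit-learn's standard (unweighted) KMeans rather than the weighted variant proposed in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-ins for per-patch descriptors extracted from an image collection.
sift = rng.random((500, 128))      # hypothetical SIFT descriptors
lbp = rng.random((500, 59))        # hypothetical uniform-LBP histograms
patches = np.hstack([sift, lbp])   # patch-level integration

vocab = KMeans(n_clusters=64, n_init=10, random_state=0).fit(patches)

def bof_histogram(desc, vocab):
    """Quantize an image's patch descriptors against the learned codebook."""
    words = vocab.predict(desc)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()

print(bof_histogram(patches[:80], vocab)[:10])   # partial histogram for one image
```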
High-Frequency Equity Index Futures Trading Using Recurrent Reinforcement Learning with Candlesticks
In 1997, Moody and Wu presented recurrent reinforcement learning (RRL) as a viable machine learning method within algorithmic trading. Subsequent research has shown a degree of controversy with regards to the benefits of incorporating technical indicators in the recurrent reinforcement learning framework. In 1991, Nison introduced Japanese candlesticks to the global research community as an alternative to employing traditional indicators within the technical analysis of financial time series. The literature accumulated over the past two and a half decades of research contains conflicting results with regards to the utility of using Japanese candlestick patterns to exploit inefficiencies in financial time series. In this paper, we combine features based on Japanese candlesticks with recurrent reinforcement learning to produce a high-frequency algorithmic trading system for the E-mini S&P 500 index futures market. Our empirical study shows a statistically significant increase in both return and Sharpe ratio compared to relevant benchmarks, suggesting the existence of exploitable spatio-temporal structure in Japanese candlestick patterns and the ability of recurrent reinforcement learning to detect and take advantage of this structure in a high-frequency equity index futures trading environment.
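Candlestick-derived inputs for such a system can be as simple as normalized body and shadow lengths per bar; the sketch below shows one plausible feature encoding, and both the encoding and the example bars are assumptions rather than the features actually used in the paper.

```python
import numpy as np

def candle_features(o, h, l, c):
    """Per-bar candlestick features from open/high/low/close arrays."""
    rng_ = np.maximum(h - l, 1e-9)                 # avoid division by zero
    body = (c - o) / rng_                          # signed body, roughly -1..1
    upper = (h - np.maximum(o, c)) / rng_          # upper shadow fraction
    lower = (np.minimum(o, c) - l) / rng_          # lower shadow fraction
    return np.column_stack([body, upper, lower])

# Hypothetical intraday E-mini bars.
o = np.array([100.0, 100.5, 100.2])
h = np.array([100.8, 100.9, 100.6])
l = np.array([ 99.9, 100.1, 100.0])
c = np.array([100.5, 100.2, 100.5])
print(candle_features(o, h, l, c))
```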
Polyglot Neural Language Models: A Case Study in Cross-Lingual Phonetic Representation Learning
We introduce polyglot language models, recurrent neural network models trained to predict symbol sequences in many different languages using shared representations of symbols and conditioning on typological information about the language to be predicted. We apply these to the problem of modeling phone sequences—a domain in which universal symbol inventories and cross-linguistically shared feature representations are a natural fit. Intrinsic evaluation on held-out perplexity, qualitative analysis of the learned representations, and extrinsic evaluation in two downstream applications that make use of phonetic features show (i) that polyglot models better generalize to held-out data than comparable monolingual models and (ii) that polyglot phonetic feature representations are of higher quality than those learned monolingually.
Bystander Responses to a Violent Incident in an Immersive Virtual Environment
Under what conditions will a bystander intervene to try to stop a violent attack by one person on another? It is generally believed that the greater the size of the crowd of bystanders, the less the chance that any of them will intervene. A complementary model is that social identity is critical as an explanatory variable. For example, when the bystander shares a common social identity with the victim the probability of intervention is enhanced, other things being equal. However, it is generally not possible to study such hypotheses experimentally for practical and ethical reasons. Here we show that an experiment that depicts a violent incident at life-size in immersive virtual reality lends support to the social identity explanation. Forty male supporters of Arsenal Football Club in England were recruited for a two-factor between-groups experiment: the victim was either an Arsenal supporter or not (in-group/out-group), and either looked towards the participant for help or not during the confrontation. The response variables were the numbers of verbal and physical interventions by the participant during the violent argument. The number of physical interventions had a significantly greater mean in the in-group condition compared to the out-group. The more that participants perceived that the victim was looking to them for help, the greater the number of interventions in the in-group but not in the out-group. These results are supported by standard statistical analysis of variance, with more detailed findings obtained by a symbolic regression procedure based on genetic programming. Verbal interventions made during the experience and analysis of post-experiment interview data suggest that in-group members were more prone to confrontational intervention compared to the out-group, who were more prone to make statements to try to defuse the situation.
Induction motor versus permanent magnet synchronous motor in motion control applications : a comparative study
High dynamic performance of an electric motor is a fundamental prerequisite in motion control applications, also known as servo drives. Recent developments in the field of microprocessors and power electronics have enabled ever faster movements with an electric motor. In such a dynamically demanding application, the dimensioning of the motor differs substantially from industrial motor design, where desirable characteristics are, for example, high efficiency, a high power factor, and a low price. In motion control, by contrast, characteristics such as high overloading capability, high-speed operation, high torque density and low inertia are required. The thesis investigates how the dimensioning of a high-performance servomotor differs from the dimensioning of industrial motors. The two most common servomotor types are examined: an induction motor and a permanent magnet synchronous motor. The suitability of these two motor types in dynamically demanding servo applications is assessed, and the design aspects that optimize the servo characteristics of the motors are analyzed. Operating characteristics of a high performance motor are studied, and some methods for improvement are suggested. The main focus is on the induction machine, which is frequently compared to the permanent magnet synchronous motor. A 4 kW prototype induction motor was designed and manufactured for the verification of the simulation results in laboratory conditions. A dynamic simulation model for estimating the thermal behaviour of the induction motor in servo applications was also constructed. The accuracy of the model was improved by coupling it with the electromagnetic motor model in order to take into account the variations in the motor's electromagnetic characteristics due to the temperature rise.
A pedagogical framework for mobile learning: Categorizing educational applications of mobile technologies into four types
Instructional designers and educators recognize the potential of mobile technologies as a learning tool for students and have incorporated them into the distance learning environment. However, little research has been done to categorize the numerous examples of mobile learning in the context of distance education, and few instructional design guidelines based on a solid theoretical framework for mobile learning exist. In this paper I compare mobile learning (m-learning) with electronic learning (e-learning) and ubiquitous learning (u-learning) and describe the technological attributes and pedagogical affordances of mobile learning presented in previous studies. I modify transactional distance (TD) theory and adopt it as a relevant theoretical framework for mobile learning in distance education. Furthermore, I attempt to position previous studies into four types of mobile learning: 1) high transactional distance socialized m-learning, 2) high transactional distance individualized m-learning, 3) low transactional distance socialized m-learning and 4) low transactional distance individualized m-learning. As a result, this paper can be used by instructional designers of open and distance learning to learn about the concepts of mobile learning and how mobile technologies can be incorporated into their teaching and learning more effectively.
Substitution of immediate-release valproic acid for divalproex sodium for adult psychiatric inpatients.
In a prospective open-label study, the substitution of immediate-release valproic acid for divalproex sodium was evaluated in the treatment of 47 adult psychiatric inpatients who had been stabilized on divalproex for at least one month. After two weeks, no significant change in Clinical Global Impressions scale (CGI) scores or in seizure frequency occurred, and serum valproate concentrations decreased by 14.4 percent (p=.001). One patient was restarted on divalproex because of gastrointestinal complaints. Among the 19 patients remaining hospitalized at six months, mean CGI scores did not significantly change. Costs were reduced 83 percent; annual savings per patient was approximately $905. These preliminary results suggest that many chronic psychiatric inpatients stabilized on divalproex may be safely switched to valproic acid.
Verifying android applications using Java PathFinder
Mobile application testing is a specialised and complex field. Due to mobile applications' event driven design and mobile runtime environment, there currently exist only a small number of tools to verify these applications. This paper describes the development of JPF-ANDROID, an Android application verification tool. JPF-ANDROID is built on Java Pathfinder, a Java model checking engine. JPF-ANDROID provides a simplified model of the Android framework on which an Android application can run. It then allows the user to script input events to drive the application flow. JPF-ANDROID provides a way to detect common property violations such as deadlocks and runtime exceptions in Android applications.
OMWEdit - The Integrated Open Multilingual Wordnet Editing System
Wordnets play a central role in many natural language processing tasks. This paper introduces a multilingual editing system for the Open Multilingual Wordnet (OMW: Bond and Foster, 2013). Wordnet development, like most lexicographic tasks, is slow and expensive. Moving away from the original Princeton Wordnet (Fellbaum, 1998) development workflow, wordnet creation and expansion has increasingly been shifting towards tasks facilitated by automated and/or interactive systems. In the particular case of human editing/expansion of wordnets, a few systems have been developed to aid the lexicographers' work. Unfortunately, most of these tools have either restricted licenses, or have been designed with a particular language in mind. We present a web-based system that is capable of multilingual browsing and editing for any of the hundreds of languages made available by the OMW. All tools and guidelines are freely available under an open license.
A 3.1 mW 8b 1.2 GS/s Single-Channel Asynchronous SAR ADC With Alternate Comparators for Enhanced Speed in 32 nm Digital SOI CMOS
An 8b 1.2 GS/s single-channel Successive Approximation Register (SAR) ADC is implemented in 32 nm CMOS, achieving 39.3 dB SNDR and a Figure-of-Merit (FoM) of 34 fJ per conversion step. High-speed operation is achieved by converting each sample with two alternate comparators clocked asynchronously and a redundant capacitive DAC with constant common mode to improve the accuracy of the comparator. A low-power, clocked capacitive reference buffer is used, and fractional reference voltages are provided to reduce the number of unit capacitors in the capacitive DAC (CDAC). The ADC stacks the CDAC with the reference capacitor to reduce the area and enhance the settling speed. Background calibration of comparator offset is implemented. The ADC consumes 3.1 mW from a 1 V supply and occupies 0.0015 mm².
Functional Status Assessment of Patients With COPD
Presently, there is no recommendation on how to assess the functional status of chronic obstructive pulmonary disease (COPD) patients. This study aimed to summarize and systematically evaluate these measures. Studies on measures of COPD patients' functional status published before the end of January 2015 were identified using search filters in PubMed and Web of Science, by screening reference lists of all included studies, and by cross-checking against relevant reviews. After title, abstract, and main text screening, the remaining studies were appraised using the Consensus-based Standards for the Selection of Health Measurement Instruments (COSMIN) 4-point checklist. All measures from these studies were rated according to best-evidence synthesis and the best-rated measures were selected. A total of 6447 records were found and 102 studies were reviewed, suggesting 44 performance-based measures and 14 patient-reported measures. The majority of the studies focused on internal consistency, reliability, and hypothesis testing, but only 21% of them employed good or excellent methodology. Their common weaknesses include lack of checks for unidimensionality, inadequate sample sizes, no prior hypotheses, and improper methods. On average, patient-reported measures perform better than performance-based measures. The best-rated patient-reported measures are the functional performance inventory (FPI), functional performance inventory short form (FPI-SF), living with COPD questionnaire (LCOPD), COPD activity rating scale (CARS), University of Cincinnati dyspnea questionnaire (UCDQ), shortness of breath with daily activities (SOBDA), and short-form pulmonary functional status scale (PFSS-11), and the best-rated performance-based measures are exercise testing: 6-minute walk test (6MWT), endurance treadmill test, and usual 4-meter gait speed (usual 4MGS). Further research is needed to evaluate the reliability and validity of performance-based measures since present studies failed to provide convincing evidence. FPI, FPI-SF, LCOPD, CARS, UCDQ, SOBDA, PFSS-11, 6MWT, endurance treadmill test, and usual 4MGS performed well and are preferable for assessing the functional status of COPD patients.
Fashioning the Face: Sensorimotor Simulation Contributes to Facial Expression Recognition
When we observe a facial expression of emotion, we often mimic it. This automatic mimicry reflects underlying sensorimotor simulation that supports accurate emotion recognition. Why this is so is becoming more obvious: emotions are patterns of expressive, behavioral, physiological, and subjective feeling responses. Activation of one component can therefore automatically activate other components. When people simulate a perceived facial expression, they partially activate the corresponding emotional state in themselves, which provides a basis for inferring the underlying emotion of the expresser. We integrate recent evidence in favor of a role for sensorimotor simulation in emotion recognition. We then connect this account to a domain-general understanding of how sensory information from multiple modalities is integrated to generate perceptual predictions in the brain.
Progressing the definition of "e-book"
Purpose – This paper aims to propose a definition for the concept “e-book” on the basis of an analysis of existing definitions. The e-book marketplace is growing rapidly and the potential impact of e-books on publishers, librarians and users is increasing in significance. Yet, there is agreement that despite a few widely accepted definitions there is no consensus on the definition of the term e-book, and, further, that consensus on the definition would be beneficial for both researchers and practitioners. Design/methodology/approach – This paper starts with a brief overview of the developments in e-books, covering technologies, marketplaces, and the attractions and challenges associated with e-books for users and libraries. It then reports on a content analysis of existing definitions of e-book. A collection of definitions was compiled through an exhaustive literature review. Content analysis was performed to identify the frequency of occurrence of key words and phrases across these definitions. Findings – There is a consensus that definitions of e-book should include reference to: the digital or electronic nature of e-books, analogy to the printed book, some indication of the content of e-books, and some allusion to e-book technologies. We propose a two-part definition that embraces these themes, but also reflects the in-use features of the e-book. Conclusions and recommendations make proposals for further discussion on the concept of e-book and, more widely, on the publication, acquisition and use of e-books. Originality/value – In the rapidly developing e-book marketplace it is essential to have agreement on the definition of e-book, and furthermore, such a definition needs to reflect both the persistent characteristics of e-books, and their dynamic and developing nature.
Resolvin E1 Reverses Experimental Periodontitis and Dysbiosis.
Periodontitis is a biofilm-induced inflammatory disease characterized by dysbiosis of the commensal periodontal microbiota. It is unclear how natural regulation of inflammation affects the periodontal biofilm. Promoters of active resolution of inflammation, including resolvin E1 (RvE1), effectively treat inflammatory periodontitis in animal models. The goals of this study were 1) to compare periodontal tissue gene expression in different clinical conditions, 2) to determine the impact of local inflammation on the composition of subgingival bacteria, and 3) to understand how inflammation impacts these changes. Two clinically relevant experiments were performed in rats: prevention and treatment of ligature-induced periodontitis with RvE1 topical treatment. The gingival transcriptome was evaluated by RNA sequencing of mRNA. The composition of the subgingival microbiota was characterized by 16S rDNA sequencing. Periodontitis was assessed by bone morphometric measurements and histomorphometry of block sections. H&E and tartrate-resistant acid phosphatase staining were used to characterize and quantify inflammatory changes. RvE1 treatment prevented bone loss in ligature-induced periodontitis. Osteoclast density and inflammatory cell infiltration in the RvE1 groups were lower than those in the placebo group. RvE1 treatment reduced expression of inflammation-related genes, returning the expression profile to one more similar to health. Treatment of established periodontitis with RvE1 reversed bone loss, reversed inflammatory gene expression, and reduced osteoclast density. Assessment of the rat subgingival microbiota after RvE1 treatment revealed marked changes in both prevention and treatment experiments. The data suggest that modulation of local inflammation has a major role in shaping the composition of the subgingival microbiota.
A multilevel inverter system for an induction motor with open-end windings
In this paper, a multilevel inverter system for an open-end winding induction motor drive is described. Multilevel inversion is achieved by feeding an open-end winding induction motor with two two-level inverters in cascade (equivalent to a three-level inverter) from one end and a single two-level inverter from the other end of the motor. The combined inverter system with open-end winding induction motor produces voltage space-vector locations identical to a six-level inverter. A total of 512 space-vector combinations are available in the proposed scheme, distributed over 91 space-vector locations. The proposed inverter drive scheme is capable of producing a multilevel pulsewidth-modulation (PWM) waveform for the phase voltage ranging from a two-level waveform to a six-level waveform depending on the modulation range. A space-vector PWM scheme for the proposed drive is implemented using a 1.5-kW induction motor with open-end winding structure.
A 300-mV 220-nW Event-Driven ADC With Real-Time QRS Detection for Wearable ECG Sensors
This paper presents an ultra-low-power event-driven analog-to-digital converter (ADC) with real-time QRS detection for wearable electrocardiogram (ECG) sensors in wireless body sensor network (WBSN) applications. Two QRS detection algorithms, pulse-triggered (PUT) and time-assisted PUT (t-PUT), are proposed based on the level-crossing events generated from the ADC. The PUT detector achieves 97.63% sensitivity and 97.33% positive prediction in simulation on the MIT-BIH Arrhythmia Database. The t-PUT improves the sensitivity and positive prediction to 97.76% and 98.59% respectively. Fabricated in 0.13 μm CMOS technology, the ADC with QRS detector consumes only 220 nW measured under 300 mV power supply, making it the first nanoWatt compact analog-to-information (A2I) converter with embedded QRS detector.
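The level-crossing events that drive the PUT detector can be emulated offline: a sample is emitted only when the signal crosses one of a set of uniformly spaced levels. The sketch below is a behavioural model in Python, not a description of the fabricated circuit; the level spacing and toy waveform are arbitrary assumptions.

```python
import numpy as np

def level_crossing_events(x, delta=0.05):
    """Return (index, direction) pairs where the signal crosses a uniform level grid."""
    events = []
    ref = np.floor(x[0] / delta) * delta          # nearest level below the start
    for i in range(1, len(x)):
        while x[i] >= ref + delta:                # upward crossing(s)
            ref += delta
            events.append((i, +1))
        while x[i] <= ref - delta:                # downward crossing(s)
            ref -= delta
            events.append((i, -1))
    return events

t = np.linspace(0, 1, 500)
ecg_like = 0.1 * np.sin(2 * np.pi * 5 * t) + (np.abs(t - 0.5) < 0.01) * 0.8  # toy beat
print(len(level_crossing_events(ecg_like)), "events generated")
```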
Lack of social support and incidence of coronary heart disease in middle-aged Swedish men.
Lack of social support has been found to predict all-cause mortality in population studies. It has often been assumed that the lack of social ties is associated with the general social conditions related to mortality and has little to do with specific disease etiology. So far, the association between lack of support and cardiovascular disease incidence has not been demonstrated. We have measured both emotional support from very close persons ("attachment") and the support provided by the extended network ("social integration"). These measures were applied along with standard measures of traditional risk factors to a random sample of 50-year-old men born in Gothenburg in 1933. All men (n = 736) were followed for 6 years and the incidence of myocardial infarction and death from coronary heart disease (CHD) was determined. Both "attachment" and "social integration" were lower in men who contracted CHD, with a significant effect for social integration (p = 0.04) and an almost significant effect for attachment (p = 0.07). When controlling for other risk factors in multiple logistic regression analyses, both factors remained significant predictors of new CHD events. Smoking and lack of social support were the two leading risk factors for CHD in these middle-aged men.
Children balance theories and evidence in exploration, explanation, and learning
We look at the effect of evidence and prior beliefs on exploration, explanation and learning. In Experiment 1, we tested children both with and without differential prior beliefs about balance relationships (Center Theorists, mean: 82 months; Mass Theorists, mean: 89 months; No Theory children, mean: 62 months). Center and Mass Theory children who observed identical evidence explored the block differently depending on their beliefs. When the block was balanced at its geometric center (belief-violating to a Mass Theorist, but belief-consistent to a Center Theorist), Mass Theory children explored the block more, and Center Theory children showed the standard novelty preference; when the block was balanced at the center of mass, the pattern of results reversed. The No Theory children showed a novelty preference regardless of evidence. In Experiments 2 and 3, we follow-up on these findings, showing that both Mass and Center Theorists selectively and differentially appeal to auxiliary variables (e.g., a magnet) to explain evidence only when their beliefs are violated. We also show that children use the data to revise their predictions in the absence of the explanatory auxiliary variable but not in its presence. Taken together, these results suggest that children's learning is at once conservative and flexible; children integrate evidence, prior beliefs, and competing causal hypotheses in their exploration, explanation, and learning.
Does Size Matter? Men's and Women's Views on Penis Size Across the Lifespan
The media equate a man’s penis size with his power and masculinity. Views about penis size were assessed in an Internet survey of 52,031 heterosexual men and women. Most men (66%) rated their penis as average, 22% as large, and 12% as small. Self-reported penis size was correlated positively with height and negatively with body fat level. Whereas 85% of women were satisfied with their partner’s penis size, only 55% of men were satisfied with their penis size, 45% wanted to be larger, and 0.2% wanted to be smaller. Satisfaction did not vary across age groups from 18 to 65. Men reporting a larger-than-average penis rated their appearance most favorably, suggesting a possible confidence effect of perceived large penis size.
Better Punctuation Prediction with Dynamic Conditional Random Fields
This paper focuses on the task of inserting punctuation symbols into transcribed conversational speech texts, without relying on prosodic cues. We investigate limitations associated with previous methods, and propose a novel approach based on dynamic conditional random fields. Different from previous work, our proposed approach is designed to jointly perform both sentence boundary and sentence type prediction, and punctuation prediction on speech utterances. We performed evaluations on a transcribed conversational speech domain consisting of both English and Chinese texts. Empirical results show that our method outperforms an approach based on linear-chain conditional random fields and other previous approaches.
Representation of three-dimensional space in the hippocampus of flying bats.
Many animals, on air, water, or land, navigate in three-dimensional (3D) environments, yet it remains unclear how brain circuits encode the animal's 3D position. We recorded single neurons in freely flying bats, using a wireless neural-telemetry system, and studied how hippocampal place cells encode 3D volumetric space during flight. Individual place cells were active in confined 3D volumes, and in >90% of the neurons, all three axes were encoded with similar resolution. The 3D place fields from different neurons spanned different locations and collectively represented uniformly the available space in the room. Theta rhythmicity was absent in the firing patterns of 3D place cells. These results suggest that the bat hippocampus represents 3D volumetric space by a uniform and nearly isotropic rate code.
Local and global analysis: complementary activities for increasing the effectiveness of requirements verification and validation
This paper presents a unique approach to connecting requirements engineering activities into a process framework that can be employed to obtain quality requirements with reduced expenditures of effort and cost. It is well understood that early detection and correction of errors offers the greatest potential for improving requirements quality and avoiding cost overruns in the development of software systems. To realize the maximum benefits of this ideology, we propose a two-phase model that is novel in that it introduces the concept of verification and validation (V&V) early in the requirements life cycle. In the first phase, we perform V&V immediately following the elicitation of requirements for each individually distinct function of the system. Because the first phase focuses on capturing smaller sets of related requirements iteratively, each corresponding V&V activity is better focused for detecting and correcting errors in each requirement set. In the second phase, a complementary verification activity is initiated; the corresponding focus is on the quality of linkages between requirements sets rather than on the requirements within the sets. Consequently, this approach reduces the effort in verification and enhances the focus on the verification task. The second phase also addresses the business concerns collectively, and thereby produces requirements that are not only quality adherent, but are also business compliant. Our approach, unlike other models, has a minimal time delay between the elicitation of requirements and the execution of the V&V activities. Because of this short time gap, the stakeholders have a clearer recollection of the requirements, their context and rationale; this enhances the feedback during the V&V activities. Furthermore, our model includes activities that closely align with the effective requirements engineering processes employed in the software industry. Thus, our approach facilitates a better understanding of the flow of requirements, and provides guidance for the implementation of the requirements engineering process.This paper describes a well-defined, two-phase requirements engineering approach that incorporates the principles of early V&V to provide the benefits of reduced costs and enhanced quality requirements.
Semantic distance in WordNet: An experimental, application-oriented evaluation of five measures
Five different proposed measures of similarity or semantic distance in WordNet were experimentally compared by examining their performance in a real-word spelling correction system. It was found that Jiang and Conrath’s measure gave the best results overall. That of Hirst and St-Onge seriously over-related, that of Resnik seriously under-related, and those of Lin and of Leacock and Chodorow fell in between.
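Several of these measures are available in NLTK, which makes a small comparison easy to reproduce (the `wordnet` and `wordnet_ic` corpora must be downloaded first); the word pair below is only an illustrative example, not one of the evaluation items.

```python
import nltk
from nltk.corpus import wordnet as wn, wordnet_ic

# nltk.download("wordnet"); nltk.download("wordnet_ic")   # one-time setup
ic = wordnet_ic.ic("ic-brown.dat")                         # Brown-corpus information content
car, bike = wn.synset("car.n.01"), wn.synset("bicycle.n.01")

print("Leacock-Chodorow:", car.lch_similarity(bike))
print("Resnik:          ", car.res_similarity(bike, ic))
print("Jiang-Conrath:   ", car.jcn_similarity(bike, ic))
print("Lin:             ", car.lin_similarity(bike, ic))
```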
A positive role for patched-smoothened signaling in promoting cell proliferation during normal head development in Drosophila.
The transmembrane receptor Patched regulates several developmental processes in both invertebrates and vertebrates. In vertebrates, Patched also acts as a tumor suppressor. The Patched pathway normally operates by negatively regulating Smoothened, a G-protein-coupled receptor; binding of Hedgehog ligand to Patched relieves this negative interaction and allows signaling by Smoothened. We show that Ptc regulates Drosophila head development by promoting cell proliferation in the eye-antennal disc. During head morphogenesis, Patched positively interacts with Smoothened, which leads to the activation of Activin type I receptor Baboon and stimulation of cell proliferation in the eye-antennal disc. Thus, loss of Ptc or Smoothened activity affects cell proliferation in the eye-antennal disc and results in adult head capsule defects. Similarly, reducing the dose of smoothened in a patched background enhances the head defects. Consistent with these results, gain-of-function Hedgehog interferes with the activation of Baboon by Patched and Smoothened, leading to a similar head capsule defect. Expression of an activated form of Baboon in the patched domain in a patched mutant background completely rescues the head defects. These results provide insight into head morphogenesis, a process we know very little about, and reveal an unexpected non-canonical positive signaling pathway in which Patched and Smoothened function to promote cell proliferation as opposed to repressing it.
Empirical studies of software engineering: a roadmap
In this article we summarize the strengths and weaknesses of empirical research in software engineering. We argue that in order to improve the current situation we must create better studies and draw more credible interpretations from them. We finally present a roadmap for this improvement, which includes a general structure for software empirical studies and concrete steps for achieving these goals: designing better studies, collecting data more effectively, and involving others in our empirical enterprises.
Ecological factors associated with dengue fever in a central highlands Province, Vietnam
BACKGROUND Dengue is a leading cause of severe illness and hospitalization in Vietnam. This study sought to elucidate the linkage between climate factors, mosquito indices and dengue incidence. METHODS Monthly data on dengue cases and mosquito larval indices were ascertained between 2004 and 2008 in the Dak Lak province (Vietnam). Temperature, sunshine, rainfall and humidity were also recorded as monthly averages. The association between these ecological factors and dengue was assessed by the Poisson regression model with adjustment for seasonality. RESULTS During the study period, 3,502 cases of dengue fever were reported. Approximately 72% of cases were reported from July to October. After adjusting for seasonality, the incidence of dengue fever was significantly associated with the following factors: higher household index (risk ratio [RR]: 1.66; 95% confidence interval [CI]: 1.62-1.70 per 5% increase), higher container index (RR: 1.78; 95% CI: 1.73-1.83 per 5% increase), and higher Breteau index (RR: 1.57; 95% CI: 1.53-1.60 per 5 unit increase). The risk of dengue was also associated with elevated temperature (RR: 1.39; 95% CI: 1.25-1.55 per 2 °C increase), higher humidity (RR: 1.59; 95% CI: 1.51-1.67 per 5% increase), and higher rainfall (RR: 1.13; 95% CI: 1.21-1.74 per 50 mm increase). The risk of dengue was inversely associated with duration of sunshine, the number of dengue cases being lower as the sunshine increases (RR: 0.76; 95% CI: 0.73-0.79 per 50 hours increase). CONCLUSIONS These data suggest that indices of mosquito and climate factors are main determinants of dengue fever in Vietnam. This finding suggests that the global climate change will likely increase the burden of dengue fever infection in Vietnam, and that intensified surveillance and control of mosquito during high temperature and rainfall seasons may be an important strategy for containing the burden of dengue fever.
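The core model is a Poisson regression of monthly case counts on climate and larval indices with month dummies absorbing seasonality. A minimal statsmodels sketch with synthetic data and simplified covariates (the real analysis used the full set of indices and exposure scaling) might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60                                              # 5 years of monthly data (synthetic)
df = pd.DataFrame({
    "cases": rng.poisson(50, n),
    "temperature": rng.normal(25, 2, n),
    "rainfall": rng.gamma(2, 50, n),
    "breteau": rng.normal(20, 5, n),
    "month": np.tile(np.arange(1, 13), n // 12),
})

# Month dummies adjust for seasonality; drop one level to avoid collinearity.
X = pd.get_dummies(df["month"].astype("category"), prefix="m", drop_first=True)
X = pd.concat([df[["temperature", "rainfall", "breteau"]], X], axis=1).astype(float)
X = sm.add_constant(X)

model = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(np.exp(model.params[["temperature", "rainfall", "breteau"]]))   # rate ratios
```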
A new stepwise adiabatic charging circuit with a smaller capacitance in a regenerator than a load capacitance
A stepwise-voltage-generation circuit was devised that is based on a capacitor bank and that dissipates no energy when a stepwise voltage is generated. The stepwise voltage is generated spontaneously, and depends neither on the initial voltages to the capacitors nor on the switching order. A new adiabatic-charging circuit based on this circuit was also devised that increases the voltage in a stepwise fashion. The total capacitance of the capacitors in the regenerator is much smaller than a load capacitance, which enables construction of a very small adiabatic regenerator. This regenerator cannot be made with a conventional circuit, which uses a tank capacitor that is much larger than a load capacitor for adiabatic charging.
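The benefit of stepwise charging can be checked numerically: charging a load capacitance C to V through a resistor in N equal voltage steps dissipates roughly CV²/(2N) instead of CV²/2, since each step loses only C(ΔV)²/2. This is a generic RC-energetics check, not a simulation of the specific regenerator circuit.

```python
C, V = 1e-9, 1.0          # 1 nF load charged to 1 V

def dissipated(n_steps):
    """Energy lost in the series resistance when charging in n equal voltage steps."""
    dv = V / n_steps
    return n_steps * 0.5 * C * dv ** 2     # (1/2) * C * dV^2 per step

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} steps: {dissipated(n) * 1e9:.4f} nJ")   # approaches CV^2 / (2N)
```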
GPU-based cone beam computed tomography
The use of cone beam computed tomography (CBCT) is growing in the clinical arena due to its ability to provide 3D information during interventions, its high diagnostic quality (sub-millimeter resolution), and its short scanning times (60 s). In many situations, the short scanning time of CBCT is followed by a time-consuming 3D reconstruction. The standard reconstruction algorithm for CBCT data is the filtered backprojection, which for a volume of size 256³ takes up to 25 min on a standard system. Recent developments in the area of Graphic Processing Units (GPUs) make it possible to have access to high-performance computing solutions at a low cost, allowing their use in many scientific problems. We have implemented an algorithm for 3D reconstruction of CBCT data using the Compute Unified Device Architecture (CUDA) provided by NVIDIA (NVIDIA Corporation, Santa Clara, California), which was executed on a NVIDIA GeForce GTX 280. Our implementation results in improved reconstruction times from minutes, and perhaps hours, to a matter of seconds, while also giving the clinician the ability to view 3D volumetric data at higher resolutions. We evaluated our implementation on ten clinical data sets and one phantom data set to observe if differences occur between CPU and GPU-based reconstructions. By using our approach, the computation time for 256³ is reduced from 25 min on the CPU to 3.2 s on the GPU. The GPU reconstruction time for 512³ volumes is 8.5 s.
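The principle being accelerated can be illustrated with a tiny CPU-side, 2-D parallel-beam filtered backprojection in NumPy; the real workload is 3-D cone-beam (FDK-style) reconstruction, and the paper's speedups come from mapping exactly this per-voxel accumulation onto CUDA threads. This is an illustrative simplification under those assumptions, not the clinical reconstruction code.

```python
import numpy as np

def fbp_2d(sinogram, angles_deg):
    """Toy parallel-beam filtered backprojection. sinogram: (n_angles, n_det)."""
    n_angles, n_det = sinogram.shape
    # Ram-Lak (ramp) filtering in the detector-frequency domain.
    freqs = np.fft.fftfreq(n_det)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * np.abs(freqs), axis=1))

    xs = np.arange(n_det) - n_det / 2
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for a, theta in zip(range(n_angles), np.deg2rad(angles_deg)):
        # Detector coordinate hit by each pixel for this view (nearest-neighbour).
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2
        idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
        recon += filtered[a, idx]            # the accumulation step parallelized on the GPU
    return recon * np.pi / n_angles

sino = np.random.rand(180, 64)               # dummy sinogram
print(fbp_2d(sino, np.arange(180)).shape)    # (64, 64)
```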
A framework for browser-based Multiplayer Online Games using WebGL and WebSocket
Compared to the traditional stand-alone game client, browser-based Multiplayer Online Game (MOG), which requires no explicit installation and is able to achieve cross-platform easily, is getting more and more popular. With the rapid development of HTML5 standard and other Web technologies, MOG systems based on WebGL and WebSocket seem to be very promising. We implemented such a framework for browser-based multiplayer online games and studied its performance and feasibility. Our analytical result shows that Three.js 3D Engine and jWebSocket based MOG can easily support the interaction of a small group of users.
Matching RDF Graphs
The Resource Description Framework (RDF) describes graphs of statements about resources. This paper explores the equality of two RDF graphs in light of the graph isomorphism literature. We consider anonymous resources as unlabelled vertices in a graph, and show that the standard graph isomorphism algorithms, developed in the 1970s, can be used effectively for comparing RDF graphs.
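Treating blank nodes as unlabelled vertices, the comparison can be delegated to a standard (VF2-based) isomorphism check; the sketch below uses networkx on small toy graphs and matches labels only for non-blank nodes, roughly in the spirit of the approach described rather than reproducing the paper's algorithm.

```python
import networkx as nx
from networkx.algorithms import isomorphism

def rdf_graph(triples):
    """Directed multigraph from (subject, predicate, object) triples."""
    g = nx.MultiDiGraph()
    for s, p, o in triples:
        g.add_edge(s, o, predicate=p)
    nx.set_node_attributes(g, {n: n for n in g.nodes}, "name")
    return g

def node_match(a, b):
    # Blank nodes ("_:" prefix) are anonymous; named nodes must match exactly.
    blank_a, blank_b = a["name"].startswith("_:"), b["name"].startswith("_:")
    return blank_a == blank_b and (blank_a or a["name"] == b["name"])

edge_match = isomorphism.categorical_multiedge_match("predicate", None)

t1 = [("_:x", "ex:knows", "ex:Alice"), ("_:x", "ex:name", "_:y")]
t2 = [("_:b", "ex:knows", "ex:Alice"), ("_:b", "ex:name", "_:c")]
print(nx.is_isomorphic(rdf_graph(t1), rdf_graph(t2),
                       node_match=node_match, edge_match=edge_match))   # True
```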
Joint Solution for Deeper Structure Beneath Volcanic Rocks Using Gravity, Magnetic & Seismic Data
In some areas, rift basins formed by repeated episodes of magma upwelling contain not only igneous rocks in the basement but also volcanic rocks within the sedimentary formations. As a result, seismic data quality is poor for the deeper formations beneath the volcanic rocks, making it difficult to identify the basement, to study basement undulation, and to evaluate source rocks beneath the volcanics. To improve the exploration effect, this paper presents a joint solution using gravity, magnetic and seismic data.
Increase in Daily Steps After an Exercise Specialist Led Lifestyle Intervention for Adults With Type 2 Diabetes In Primary Care: A Controlled Implementation Trial.
OBJECTIVE To determine the effectiveness of an exercise specialist led lifestyle program for adults with type 2 diabetes in primary care. METHODS Eligible participants from 4 primary care networks in Alberta, Canada were assigned to either a lifestyle program or a control group. The program targeted increased daily walking through individualized daily pedometer step goals for the first 3 months and brisk walking speed, along with substitution of low- for high-glycemic index foods, over the next 3 months. The outcomes were daily steps, diet, and clinical markers, and were compared using random effects models. RESULTS 198 participants were enrolled (102 in the intervention and 96 in the control). Of all participants, 51% were women; mean age was 59.5 (SD 8.3) years, A1c 6.8% (SD 1.1), BMI 33.6 kg/m² (SD 6.5), systolic BP 125.6 mmHg (SD 16.2), glycemic index 51.7 (4.6), and daily steps 5879 (SD 3130). Daily steps increased for the intervention compared with the control at 3 months (1292 [SD 2698] vs. 418 [SD 2458]) and 6 months (1481 [SD 2631] vs. 336 [SD 2712]; adjusted P = .002). No significant differences were observed for diet or clinical outcomes. CONCLUSIONS A 6-month lifestyle program delivered in primary care by an exercise specialist can be effective for increasing daily walking among adults with recently diagnosed type 2 diabetes. This short-term increase in daily steps requires longer follow-up to estimate the potential impact on health outcomes.
Repellent and Contact Toxicity of Alpinia officinarum Rhizome Extract against Lasioderma serricorne Adults
The repellent and contact toxicities of Alpinia officinarum rhizome extract against Lasioderma serricorne adults, and its ability to protect stored wheat flour from infestation by L. serricorne adults, were investigated. The A. officinarum extract exhibited strong repellent and contact toxicities against L. serricorne adults. The toxicities increased significantly with increasing treatment time and treatment dose. The mean percentage repellency value reached 91.3% (class V) at a dose of 0.20 μL/cm² after 48 h of exposure. The corrected mortality reached over 80.0% at a dose of 0.16 μL/cm² after 48 h of exposure. The A. officinarum extract significantly reduced the level of L. serricorne infestation in stored wheat flour. In particular, no insect infestation occurred in wheat flour packaged in kraft paper bags coated with the A. officinarum extract at doses above 0.05 μL/cm². The naturally occurring A. officinarum extract could be useful for the integrated management of L. serricorne.
A practice-based randomized controlled trial to improve medication adherence among Latinos with hypertension: study protocol for a randomized controlled trial
BACKGROUND Latinos experience disproportionately higher rates of uncontrolled hypertension as compared to Blacks and Whites. While poor adherence is a major contributor to disparities in blood pressure control, data in Latino patients are scant. More importantly, translation of interventions to improve medication adherence in community-based primary care practices, where the majority of Latino patients receive their care is non-existent. METHODS Using a randomized controlled design, this study evaluates the effectiveness of a culturally tailored, practice-based intervention compared to usual care on medication adherence, among 148 Latino patients with uncontrolled hypertension who are non-adherent to their antihypertensive medications. Bilingual medical assistants trained as Health Coaches deliver the intervention using an electronic medical record system-embedded adherence script. Patients randomized to the intervention group receive patient-centered counseling with a Health Coach to develop individualized self-monitoring strategies to overcome barriers and improve adherence behaviors. Health Coach sessions are held biweekly for the first 3 months (6 sessions total) and then monthly for the remaining 3 months (3 sessions total). Patients randomized to the usual care group receive standard hypertension treatment recommendations as determined by their primary care providers. The primary outcome is the rate of medication adherence at 6 months. The secondary outcome is reduction in systolic and diastolic blood pressure at 6 months. DISCUSSION If successful, findings from this study will provide salient information on the translation of culturally tailored, evidence-based interventions targeted at medication adherence and blood pressure control into practice-based settings for this high-risk population. TRIAL REGISTRATION NCT01643473 on 16 July 2012.
Stationary and mobile battery energy storage systems for smart grids
Renewable energy is a key technology in reducing global carbon dioxide emissions. Currently, penetration of intermittent renewable energies in most power grids is low, such that the impact of renewable energy's intermittency on grid stability is controllable. Utility scale energy storage systems can enhance stability of power grids with increasing share of intermittent renewable energies. With the grid communication network in smart grids, mobile battery systems in battery electric vehicles and plug-in hybrid electric vehicles can also be used for energy storage and ancillary services in smart grids. This paper will review the stationary and mobile battery systems for grid voltage and frequency stability control in smart grids with increasing shares of intermittent renewable energies. An optimization algorithm on vehicle-to-grid operation will also be presented.
Wideband Circularly Polarized Patch Antennas on Reactive Impedance Substrates
A reduced-size wideband single-feed circularly polarized patch antenna is introduced for telemetry applications in S-band around 2300 MHz. The proposed structure consists of a slot-loaded patch antenna printed over an optimized metamaterial-inspired reactive impedance substrate (RIS). We demonstrate, step by step, the main role of each antenna element by comparing numerically and experimentally the performance of various antenna configurations: antenna over a single- or dual-layer substrate, standard patch or slot-loaded patch, antenna with or without RIS. The final optimized structure exhibits an axial-ratio bandwidth of about 15% and an impedance bandwidth better than 11%, which is much wider than the conventional printed antenna on the same materials.
A Survey of Incentive Mechanisms for Participatory Sensing
Participatory sensing is now becoming more popular and has shown its great potential in various applications. It was originally proposed to recruit ordinary citizens to collect and share massive amounts of sensory data using their portable smart devices. By attracting participants and paying rewards as a return, incentive mechanisms play an important role to guarantee a stable scale of participants and to improve the accuracy/coverage/timeliness of the sensing results. Along this direction, a considerable amount of research activities have been conducted recently, ranging from experimental studies to theoretical solutions and practical applications, aiming at providing more comprehensive incentive procedures and/or protecting benefits of different system stakeholders. To this end, this paper surveys the literature over the period of 2004-2014 from the state of the art of theoretical frameworks, applications and system implementations, and experimental studies of the incentive strategies used in participatory sensing by providing up-to-date research in the literature. We also point out future directions of incentive strategies used in participatory sensing.
Post mastectomy adjuvant radiotherapy in breast cancer: a comparison of three hypofractionated protocols.
OBJECTIVES To compare three hypofractionated protocols for postmastectomy carcinoma of the breast in terms of local control, toxicity, and workload. METHODS A total of three hundred patients with breast cancer, stage T2-4, any N, were randomized into three arms after mastectomy. All patients were treated with four fields on Co-60, i.e. two tangential portals for the chest wall, one anterior supraclavicular and axillary field, and a posterior axillary boost. The three arms were: 2700 cGy in 5 fractions over one week (arm A), 3500 cGy in 10 fractions over 2 weeks (arm B), and 4000 cGy in 15 fractions over 3 weeks (arm C). Skin, cardiac, pulmonary, and haematological toxicities and lymphoedema were compared in addition to local control and workload. RESULTS Locoregional relapses occurred in 11%, 12%, and 10% of patients in arms A, B, and C, respectively; 26%, 24%, and 28% developed metastatic disease, and 17%, 18%, and 20% died in the three arms. G3 and G4 skin toxicities were 37%, 28%, and 14%; G2 and G3 lymphoedema was 21%, 22%, and 27%. Cardiac toxicity was 5%, 6%, and 5%, while pulmonary toxicity was 4%, 5%, and 5%, respectively. All differences except skin toxicity were not statistically significant. There were no cases of haematological depression or rib fractures. CONCLUSION All three short protocols were equally effective in locoregional disease control, and toxicity was comparable. They were helpful in reducing workload and can be safely recommended for routine clinical use.
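Hypofractionated schedules like these are often compared radiobiologically with the standard biologically effective dose formula, BED = n·d·(1 + d/(α/β)). The sketch below applies that formula to the three arms with an assumed α/β of 4 Gy; both the α/β value and the comparison itself are illustrative additions, not analyses reported in the trial.

```python
# Illustrative only: compare the three fractionation schedules with the standard
# biologically effective dose formula, BED = n * d * (1 + d / (alpha/beta)).
# The alpha/beta ratio is an assumed textbook-style value, not trial data.
def bed(n_fractions: int, dose_per_fraction_gy: float, alpha_beta_gy: float = 4.0) -> float:
    return n_fractions * dose_per_fraction_gy * (1.0 + dose_per_fraction_gy / alpha_beta_gy)

arms = {
    "A (27 Gy / 5 fx)":  (5, 27.0 / 5),
    "B (35 Gy / 10 fx)": (10, 3.5),
    "C (40 Gy / 15 fx)": (15, 40.0 / 15),
}
for name, (n, d) in arms.items():
    print(f"arm {name}: BED ~ {bed(n, d):.1f} Gy")
```

With this assumed α/β, the three arms land within a few Gy of each other, which is at least consistent with the trial's finding of comparable locoregional control.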
Fog detection system based on computer vision techniques
In this document, a real-time fog detection system for driving applications, using an on-board low-cost black-and-white camera, is presented. The system is based on two cues: an estimate of the visibility distance, calculated from the camera projection equations, and the blurring caused by fog. Because of the water particles suspended in the air, sky light is scattered toward the road zone, which is one of the darkest regions of the image; the apparent effect is that part of the sky seems to extend into the road. In foggy scenes, edge strength is also reduced in the upper part of the image. These two sources of information are combined to make the system more robust. The final purpose of this work is to develop an automatic vision-based diagnostic system that warns ADAS of possibly degraded operating conditions. Experimental results and conclusions are presented.
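To give a concrete flavor of the second cue (reduced edge strength in the upper part of the image under fog), the sketch below computes the mean gradient magnitude of the top region of a grayscale frame with OpenCV. The region split, the threshold, and the input file name are assumptions for illustration, not the calibration used in the paper.

```python
# Illustrative sketch of one fog cue: in foggy frames, edge (gradient) strength
# in the upper part of the image drops. The 0.4 region split, the 8.0 threshold,
# and the file name are assumed values, not the paper's calibration.
import cv2
import numpy as np

def upper_region_edge_strength(gray: np.ndarray, upper_fraction: float = 0.4) -> float:
    """Mean gradient magnitude over the top `upper_fraction` of a grayscale frame."""
    top = gray[: int(gray.shape[0] * upper_fraction), :]
    gx = cv2.Sobel(top, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(top, cv2.CV_32F, 0, 1, ksize=3)
    return float(np.mean(np.sqrt(gx * gx + gy * gy)))

frame = cv2.imread("road_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input frame
if frame is not None:
    score = upper_region_edge_strength(frame)
    print("possible fog" if score < 8.0 else "clear", f"(edge strength {score:.1f})")
```

A full system would combine this score with the visibility-distance estimate from the camera projection equations before raising a warning.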
Ensemble learning methods for pay-per-click campaign management
Tensile properties of commonly used prolapse meshes
To improve our understanding of the differences between commonly used synthetic prolapse meshes, we compared four newer-generation meshes to Gynecare PS™ using a tensile testing protocol. We hypothesize that the newer meshes have inferior biomechanical properties. Meshes were loaded to failure (n = 5 per group), generating load–elongation curves from which the stiffness, the load at failure, and the relative elongation were determined. Additional mesh samples (n = 3) underwent a cyclic loading protocol to measure permanent elongation in response to subfailure loading. With the exception of Popmesh, which displayed uniform stiffness, the other meshes were characterized by bilinear behavior. The newer meshes were 70–90% less stiff than Gynecare™ (p < 0.05) and deformed more readily in response to uniaxial and cyclic loading (p < 0.001). Relative to Gynecare™, the newer generation of prolapse meshes was significantly less stiff, with irreversible deformation at significantly lower loads.
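Stiffness from a load–elongation curve is typically reported as the slope of a chosen (locally linear) region, and the bilinear meshes described above would yield one slope per region. The snippet below is a small sketch of that calculation on a synthetic bilinear curve; the data and region boundaries are assumptions, not measurements from the study.

```python
# Sketch: estimate stiffness (N/mm) as the slope of a chosen region of a
# load-elongation curve via a straight-line fit. The synthetic bilinear curve
# below is illustrative only, not data from the study.
import numpy as np

elongation = np.linspace(0.0, 10.0, 50)                      # mm
load = np.where(elongation < 5.0,
                0.5 * elongation,                             # compliant "toe" region
                0.5 * 5.0 + 2.0 * (elongation - 5.0))         # stiffer linear region

def stiffness(elong: np.ndarray, force: np.ndarray, lo: float, hi: float) -> float:
    """Slope of a linear fit to the curve between elongations lo and hi."""
    mask = (elong >= lo) & (elong <= hi)
    slope, _ = np.polyfit(elong[mask], force[mask], 1)
    return slope

print("low-load stiffness :", round(stiffness(elongation, load, 0.0, 4.5), 2), "N/mm")
print("high-load stiffness:", round(stiffness(elongation, load, 5.5, 10.0), 2), "N/mm")
```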
Bronchial hyperactivity, sputum analysis and skin prick test to inhalant allergens in patients with symptomatic food hypersensitivity.
BACKGROUND Dyspnea may be a presenting symptom of type I food hypersensitivity; bronchial hyperactivity (BHR), without known asthma, can coexist in patients with food allergy. OBJECTIVE To evaluate airway involvement in young adult patients with food allergy and no asthma and to compare the findings with those of patients with food allergy and asthma, with food allergy and allergic rhinitis, with asthma and no food allergy, and with apparently healthy controls. METHODS The evaluation involved skin prick tests to foods (65 allergens) and inhalants (24 allergens), spirometry, methacholine inhalation challenge, and induced sputum for cell analysis. The five groups consisted of 18 patients with food allergy alone, 11 with food allergy and asthma, 13 with food allergy and allergic rhinitis, 10 with asthma alone, and 10 controls. RESULTS Patients with food allergy alone were mainly (86%) skin sensitive to pollens. Those with either asthma or allergic rhinitis were mainly (95%) sensitive to mites. BHR was detected in 40% of the patients with food allergy alone, 55% of the patients with allergic rhinitis, and 100% of the patients with asthma. Sputum from patients with asthma and from those with food allergy and asthma showed higher eosinophil counts than sputum from patients with food allergy and allergic rhinitis. Patients with food allergy and no asthma, regardless of BHR status, had mainly neutrophils in the sputum. CONCLUSIONS Patients with food allergy are highly likely to have concomitant asymptomatic BHR. Mite sensitivity in patients with food allergy predicts respiratory allergy (either asthma or allergic rhinitis). High eosinophil levels in the sputum of food allergy patients predict respiratory involvement.