title | abstract |
---|---|
A Review of Convolutional Neural Networks for Inverse Problems in Imaging | In this survey paper, we review recent uses of convolutional neural networks (CNNs) to solve inverse problems in imaging. It has recently become feasible to train deep CNNs on large databases of images, and they have shown outstanding performance on object classification and segmentation tasks. Motivated by these successes, researchers have begun to apply CNNs to the resolution of inverse problems such as denoising, deconvolution, super-resolution, and medical image reconstruction, and they have started to report improvements over state-of-the-art methods, including sparsity-based techniques such as compressed sensing. Here, we review the recent experimental work in these areas, with a focus on the critical design decisions: Where does the training data come from? What is the architecture of the CNN? And how is the learning problem formulated and solved? We also bring together a few key theoretical papers that offer perspective on why CNNs are appropriate for inverse problems and point to some next steps in the field. |
Pilot clinical study of Adacolumn cytapheresis in patients with systemic lupus erythematosus | The aim of this study is to investigate the clinical effects of cytapheresis using the Adacolumn system (selective removal of circulating monocytes and granulocytes by means of an extracorporeal type column) in patients with active systemic lupus erythematosus (SLE). An open uncontrolled multicenter pilot study was conducted in 18 SLE patients who showed a SLEDAI score of 8 or more under conventional medication. Patients with lupus nephritis (>class 1, WHO classification) were excluded. Extracorporeal cytapheresis with the Adacolumn system was administered once a week for five consecutive weeks. The efficacy of the treatment was evaluated using the SLEDAI for 10 weeks after the first cytapheresis session. The median SLEDAI decreased from 16 at baseline to six at week 11 (10 weeks after the first apheresis) (p<0.001). Significant improvements in the musculoskeletal and dermal systems were observed. Arthritis and alopecia were present in 14 and nine patients at baseline, respectively, and these numbers decreased to five and one patients by week 11. Three mild and one moderate adverse events out of the 42 reported events were judged ‘probably related’ to the treatment; no serious adverse events were reported. Selective removal of monocytes and granulocytes from the blood in an extracorporeal circulation system was associated with clinical improvement in this small series of patients with SLE. Since this approach seems not to have the disadvantages of pharmacological immunosuppression, further controlled studies of Adacolumn cytapheresis are warranted in SLE. |
Including Parent Training in the Early Childhood Special Education Curriculum for Children With Autism Spectrum Disorders | Parent training has been shown to be a very effective method for promoting generalization and maintenance of skills in children with autism. However, despite its well-established benefits, few public school programs include parent training as part of the early childhood special education (ECSE) curriculum. Barriers to the provision of parent training include the need for parent education models that can be easily implemented in ECSE programs and the need for preparation of special educators in parent education strategies. This article describes a parent training model for children with autism developed for use in ECSE programs. The implementation of the program, teacher preparation, and preliminary outcomes and challenges will be discussed. |
Learning real-time MRF inference for image denoising | Many computer vision problems can be formulated in a Bayesian framework with Markov Random Field (MRF) or Conditional Random Field (CRF) priors. Usually, the model assumes that a full Maximum A Posteriori (MAP) estimation will be performed for inference, which can be very slow in practice. In this paper, we argue that, through appropriate training, an MRF/CRF model can be trained to perform very well with a suboptimal inference algorithm. The model is trained together with a fast inference algorithm through an optimization of a loss function on a training set containing pairs of input images and desired outputs. A validation set can be used in this approach to estimate the generalization performance of the trained system. We apply the proposed method to an image denoising application, training a Fields of Experts MRF together with a 1-4 iteration gradient descent inference algorithm. Experimental validation on unseen data shows that the proposed training approach obtains an improved benchmark performance as well as a 1000-3000 times speedup compared to the Fields of Experts MRF trained with contrastive divergence. Using the new approach, image denoising can be performed in real time, at 8 fps on a single CPU for a 256 × 256 image sequence, with close to state-of-the-art accuracy. |
Nonverbal expression of psychological states in psychiatric patients | Nonverbal behavior, especially facial expression, appears as one of the most important means for communicating affective states. Studies on groups of psychiatric patients and control subjects are reported in which nonverbal behavior is analyzed from videotaped dialogues. Using a quantitative approach, results on facial behavior, speech, and gaze are described, which shed light on the expressive and communicative functions of nonverbal behavior. From longitudinal observations on depressed patients it emerged that individual-specific associations have to be taken into account for the relationship between expressive behavior and mood changes. The predominance of facial behavior in the speaker role of an individual found in patients and control groups points to the integrated communicative function of the verbal and nonverbal elements. However, recovered schizophrenic patients exhibited a dissociation of these elements. Implications for our understanding of nonverbal communications are discussed. |
Cultures and Selves: A Cycle of Mutual Constitution. | The study of culture and self casts psychology's understanding of the self, identity, or agency as central to the analysis and interpretation of behavior and demonstrates that cultures and selves define and build upon each other in an ongoing cycle of mutual constitution. In a selective review of theoretical and empirical work, we define self and what the self does, define culture and how it constitutes the self (and vice versa), define independence and interdependence and determine how they shape psychological functioning, and examine the continuing challenges and controversies in the study of culture and self. We propose that a self is the "me" at the center of experience-a continually developing sense of awareness and agency that guides actions and takes shape as the individual, both brain and body, becomes attuned to various environments. Selves incorporate the patterning of their various environments and thus confer particular and culture-specific form and function to the psychological processes they organize (e.g., attention, perception, cognition, emotion, motivation, interpersonal relationship, group). In turn, as selves engage with their sociocultural contexts, they reinforce and sometimes change the ideas, practices, and institutions of these environments. |
Birdbrains could teach basal ganglia research a new song | Recent advances in anatomical, physiological and histochemical characterization of avian basal ganglia neurons and circuitry have revealed remarkable similarities to mammalian basal ganglia. A modern revision of the avian anatomical nomenclature has now provided a common language for studying the function of the cortical-basal-ganglia-cortical loop, enabling neuroscientists to take advantage of the specialization of basal ganglia areas in various avian species. For instance, songbirds, which learn their vocal motor behavior using sensory feedback, have specialized a portion of their cortical-basal ganglia circuitry for song learning and production. This discrete circuit dedicated to a specific sensorimotor task could be especially tractable for elucidating the interwoven sensory, motor and reward signals carried by basal ganglia, and the function of these signals in task learning and execution. |
Reservoir Computing and Self-Organized Neural Hierarchies | There is a growing understanding that machine learning architectures have to be much bigger and more complex to approach any intelligent behavior. There is also a growing understanding that purely supervised learning is inadequate to train such systems. A recent paradigm of artificial recurrent neural network (RNN) training under the umbrella name Reservoir Computing (RC) demonstrated that training big recurrent networks (the reservoirs) differently than the supervised readouts from them is often better. It started with Echo State Networks (ESNs) and Liquid State Machines ten years ago, where the reservoir was generated randomly and only linear readouts from it were trained. Rather surprisingly, such simply and quickly trained ESNs outperformed classical fully trained RNNs in many tasks. While fully supervised training of RNNs is problematic, intuitively there should also be something better than a random network. In recent years RC has become a vibrant research field, extending the initial paradigm of a fixed random reservoir and trained output to different methods for training both the reservoir and the readout. In this thesis we overview existing and investigate new alternatives to the classical supervised training of RNNs and their hierarchies. First, we present a taxonomy and a systematic overview of the RNN training approaches under the RC umbrella. Second, we propose and investigate the use of two different neural network models for the reservoirs together with several unsupervised adaptation techniques, as well as unsupervisedly layer-wise trained deep hierarchies of such models. We rigorously and empirically test the proposed methods on two temporal pattern recognition datasets, comparing them to the classical reservoir computing state of the art. |
Neurocognition across the spectrum of mucopolysaccharidosis type I: Age, severity, and treatment. | OBJECTIVES
Precise characterization of cognitive outcomes and factors that contribute to cognitive variability will enable better understanding of disease progression and treatment effects in mucopolysaccharidosis type I (MPS I). We examined the effects on cognition of phenotype, genotype, age at evaluation and first treatment, and somatic disease burden.
METHODS
Sixty patients with severe MPS IH (Hurler syndrome) treated with hematopoietic cell transplant and 29 with attenuated MPS I treated with enzyme replacement therapy were studied with IQ measures, medical history, and genotypes. Sixty-seven patients had volumetric MRI. Subjects were grouped by age and phenotype, and their MRI measures were compared to those of 96 normal controls.
RESULTS
Prior to hematopoietic cell transplant (HCT), MPS IH patients were all cognitively average; post-transplant, 59% were below average but stable. Genotype and age at HCT were associated with cognitive ability. In attenuated MPS I, 40% were below average, with genotype and somatic disease burden predicting their cognitive ability. White matter volumes were associated with IQ for controls, but not for MPS I. Gray matter volumes were positively associated with IQ in controls and attenuated MPS I patients, but negatively associated in MPS IH.
CONCLUSIONS
Cognitive impairment, a major difficulty for many MPS I patients, is associated with genotype, age at treatment and somatic disease burden. IQ association with white matter differed from controls. Many attenuated MPS patients have significant physical and/or cognitive problems and receive insufficient support services. Results provide direction for future clinical trials and better disease management. |
Effects of human trampling on abundance and diversity of vascular plants, bryophytes and lichens in alpine heath vegetation, Northern Sweden | This study investigated the effects of human trampling on cover, diversity and species richness in an alpine heath ecosystem in northern Sweden. We tested the hypothesis that proximity to trails decreases plant cover, diversity and species richness of the canopy and the understory. We found a significant decrease in plant cover with proximity to the trail for the understory, but not for the canopy level, and significant decreases in the abundance of deciduous shrubs in the canopy layer and lichens in the understory. Proximity also had a significant negative impact on species richness of lichens. However, there were no significant changes in species richness, diversity or evenness of distribution in the canopy or understory with proximity to the trail. While not significant, liverworts, acrocarpous and pleurocarpous bryophytes tended to have contrasting abundance patterns with differing proximity to the trail, indicating that trampling may cause shifts in dominance hierarchies of different groups of bryophytes. Due to the decrease in understory cover, the abundance of litter, rock and soil increased with proximity to the trail. These results demonstrate that low-frequency human trampling in alpine heaths over long periods can have major negative impacts on lichen abundance and species richness. To our knowledge, this is the first study to demonstrate that trampling can decrease species richness of lichens. It emphasises the importance of including species-level data on non-vascular plants when conducting studies in alpine or tundra ecosystems, since they often make up the majority of species and play a significant role in ecosystem functioning and response in many of these extreme environments. |
Global EDF scheduling for parallel real-time tasks | As multicore processors become ever more prevalent, it is important for real-time programs to take advantage of intra-task parallelism in order to support computation-intensive applications with tight deadlines. In this paper, we consider the global earliest deadline first (GEDF) scheduling policy for task sets consisting of parallel tasks. Each task can be represented by a directed acyclic graph (DAG) where nodes represent computational work and edges represent dependences between nodes. In this model, we prove that GEDF provides a capacity augmentation bound of $$4-\frac{2}{m}$$ and a resource augmentation bound of $$2-\frac{1}{m}$$. The capacity augmentation bound acts as a linear-time schedulability test since it guarantees that any task set with total utilization of at most $$m/(4-\frac{2}{m})$$, where each task’s critical-path length is at most $$1/(4-\frac{2}{m})$$ of its deadline, is schedulable on $$m$$ cores under GEDF. In addition, we present a pseudo-polynomial time fixed-point schedulability test for GEDF; this test uses a carry-in work calculation based on the proof for the capacity bound. Finally, we present and evaluate a prototype platform—called PGEDF—for scheduling parallel tasks using global earliest deadline first (GEDF). PGEDF is built by combining the GNU OpenMP runtime system and the $$\text{LITMUS}^\text{RT}$$ operating system. This platform allows programmers to write parallel OpenMP tasks and specify real-time parameters such as deadlines for tasks. We perform two kinds of experiments to evaluate the performance of GEDF for parallel tasks. (1) We run numerical simulations for DAG tasks. (2) We execute randomly generated tasks using PGEDF. Both sets of experiments indicate that GEDF performs surprisingly well and outperforms an existing scheduling technique that involves task decomposition. (A minimal check of the capacity augmentation bound is sketched after the table.) |
Recognizing Products: A Per-exemplar Multi-label Image Classification Approach | Large-scale instance-level image retrieval aims at retrieving specific instances of objects or scenes. Simultaneously retrieving multiple objects in a test image adds to the difficulty of the problem, especially if the objects are visually similar. This paper presents an efficient approach for per-exemplar multi-label image classification, which targets the recognition and localization of products in retail store images. We achieve runtime efficiency through the use of discriminative random forests, deformable dense pixel matching and genetic algorithm optimization. Cross-dataset recognition is performed, where our training images are taken in ideal conditions with only a single training image per product label, while the evaluation set is taken using a mobile phone in real-life scenarios in completely different conditions. In addition, we provide a large novel dataset and labeling tools for product image search, to motivate further research efforts on multi-label retail product image classification. The proposed approach achieves promising results in terms of both accuracy and runtime efficiency on 680 annotated images of our dataset, and 885 test images of the GroZi-120 dataset. We make our dataset of 8350 different product images and the 680 test images from retail stores with complete annotations available to the wider community. |
Volume Transmission in Central Dopamine and Noradrenaline Neurons and Its Astroglial Targets | Already in the 1960s, the architecture and pharmacology of the brainstem dopamine (DA) and noradrenaline (NA) neurons, with their formation of vast numbers of DA and NA terminal plexa in the central nervous system (CNS), indicated that they may communicate not only via synaptic transmission. In the 1980s the theory of volume transmission (VT) was introduced as a major mode of communication alongside synaptic transmission in the CNS. VT is the extracellular and cerebrospinal fluid transmission of chemical signals such as transmitters and modulators, which move along energy gradients that make diffusion and flow of VT signals possible. VT interacts with synaptic transmission mainly through direct receptor–receptor interactions in synaptic and extrasynaptic heteroreceptor complexes and their signaling cascades. The DA and NA neurons are specialized for extrasynaptic VT at the soma-dendritic and terminal level. The catecholamines released target multiple DA and adrenergic receptor subtypes on nerve cells, astroglia and microglia, which are the major cell components of the trophic units building up the neural–glial networks of the CNS. DA and NA VT can modulate not only the strength of synaptic transmission but also the VT signaling of the astroglia and microglia, which is of high relevance for neuron–glia interactions. The catecholamine VT targeting astroglia can modulate the fundamental functions of astroglia observed in neuroenergetics, in the glymphatic system, in the central renin–angiotensin system and in the production of long-distance calcium waves. The astrocytic and microglial DA and adrenergic receptor subtypes mediating DA and NA VT can also be significant drug targets in neurological and psychiatric disease. |
A Motion Recognition Method Using Foot Pressure Sensors | This paper proposes a method for recognizing postures and gestures using foot pressure sensors, and we investigate which positions for pressure sensors on the soles are best for motion recognition. In experiments, the recognition accuracies of 22 kinds of daily postures and gestures were evaluated from foot-pressure sensor values. Furthermore, the optimum measurement points for high recognition accuracy were examined by evaluating combinations of two foot pressure measurement areas on a round-robin basis. As a result, when selecting the optimum two points for a user, the recognition accuracy was about 93.6% on average. Although individual differences were seen, the best combinations of areas for each subject were largely divided into two major patterns. When two points were chosen, combinations of the area near the thenar, which is located near the thumb ball, with the area near the heel or a point on the outside of the middle of the foot yielded high recognition accuracy. Of the best two points, one was commonly near the thenar across subjects. By taking data at three points and covering these two combinations, it will be possible to cope with individual differences. When the two best combinations averaged over all subjects were used, the data were classified with an accuracy of about 91.0% on average. On the basis of these results, two types of pressure-sensing shoes were developed. |
Understanding Cybercrime from Its Stakeholders' Perspectives: Part 2--Defenders and Victims | A comprehensive model and taxonomy of Cybercrime, including all of its stakeholders, would contribute to better cybersecurity. Part 1 of this two-part series, which appeared in the January/February 2015 issue of IEEE Security & Privacy, explored Cyberattackers and their motives in detail. Part 2 focuses on the other key stakeholders: defenders and victims of cybercrime. |
Market Deregulation and Optimal Monetary Policy in a Monetary Union | The wave of crises that began in 2008 reheated the debate on market deregulation as a tool to improve economic performance. This paper addresses the consequences of increased flexibility in goods and labor markets for the conduct of monetary policy in a monetary union. We model a two-country monetary union with endogenous product creation, labor market frictions, and price and wage rigidities. Regulation affects producer entry costs, employment protection, and unemployment benefits. We first characterize optimal monetary policy when regulation is high in both countries and show that the Ramsey allocation requires significant departures from price stability both in the long run and over the business cycle. Welfare gains from the Ramsey-optimal policy are sizable. Second, we show that the adjustment to market reform requires expansionary policy to reduce transition costs. Third, deregulation reduces static and dynamic inefficiencies, making price stability more desirable. International synchronization of reforms can eliminate policy tradeoffs generated by asymmetric deregulation. |
Lack of short-term effect of the thromboxane synthetase inhibitor UK-38,485 on airway reactivity to methacholine in asthmatic subjects. | Previous open studies have suggested that thromboxane receptor antagonists or synthesis inhibitors lower airway hyperresponsiveness in human subjects. This would indicate a role of thromboxane A2 in the development or maintenance of hyperresponsiveness in asthma. Ten nonsmoking asthmatics (aged 23-64 yrs, 9 male) were included in a randomized, double-blind, placebo-controlled, cross-over study of the effect of one week of treatment with a potent selective thromboxane synthetase inhibitor (UK-38,485, 600 mg daily) on airway responsiveness. The study was preceded by a two week run-in period, and two weeks were used for wash-out between the two trial periods. Adequacy of dosage and patient compliance was confirmed by a reduction in the ex vivo formation of thromboxane B2 (median concentration 3.22 micrograms.ml-1 after placebo, 0.10 microgram.ml-1 after UK-38,485, p < 0.05). The mean forced expiratory volume in one second (FEV1) after UK-38,485 was 2.55 l, compared to 2.56 l after treatment with placebo (p = 0.74). The geometric mean provocative dose of methacholine producing a 20% fall in FEV1 (PD20) before and after UK-38,485 was 23.9 and 32.2 micrograms, respectively, compared to 25.1 and 26.3 micrograms respectively, before and after placebo (p = 0.31). The results of this study suggest that thromboxane A2 does not play an important role in the maintenance of increased airway responsiveness in moderately severe asthmatics treated with low doses of inhaled steroids. |
Computer-Assisted Text Analysis for Social Science: Topic Models and Beyond | Topic models are a family of statistics-based algorithms to summarize, explore and index large collections of text documents. After a decade of research led by computer scientists, topic models have spread to social science as a new generation of data-driven social scientists have searched for tools to explore large collections of unstructured text. Recently, social scientists have contributed to the topic model literature with developments in causal inference and tools for handling the problem of multi-modality. In this paper, I provide a literature review on the evolution of topic modeling, including extensions for document covariates, methods for evaluation and interpretation, and advances in interactive visualizations, along with each aspect’s relevance and application for social science research. Keywords—computational social science, computer-assisted text analysis, visual analytics, structural topic model |
Increasing SCADA System Availability by Fault Tolerance Techniques | In SCADA systems, many RTUs (Remote Terminal Units) are used for field data collection and for sending data to the master node through the communication system. The master node presents the collected data and enables the operator to handle remote control activities. The RTU is a standalone data acquisition unit. The processor used in an RTU is vulnerable to random faults due to the harsh environment around RTUs. Faults may lead to the failure of the RTU unit, making it inaccessible for information acquisition. For long-running systems, fault tolerance has been a major concern and research problem for the last two decades, and as SCADA systems are used more widely, the problem becomes more severe. Efficient fault tolerance is needed to handle faults, such as RTU faults and message-passing layer faults, so that messages can pass through all layers of the SCADA communication system. SCADA can be viewed as one application of MPI, and several fault tolerance techniques described for MPI are utilized in applications such as SCADA. The goal of this paper is to present a study of the different fault tolerance techniques that can be used to optimize SCADA system availability by mitigating faults in RTU devices and communication systems. |
Carbapenem resistance, inappropriate empiric treatment and outcomes among patients hospitalized with Enterobacteriaceae urinary tract infection, pneumonia and sepsis | BACKGROUND
Drug resistance among gram-negative pathogens is a risk factor for inappropriate empiric treatment (IET), which in turn increases the risk for mortality. We explored the impact of carbapenem-resistant Enterobacteriaceae (CRE) on the risk of IET and of IET on outcomes in patients with Enterobacteriaceae infections.
METHODS
We conducted a retrospective cohort study in Premier Perspective database (2009-2013) of 175 US hospitals. We included all adult patients with community-onset culture-positive urinary tract infection (UTI), pneumonia, or sepsis as a principal diagnosis, or as a secondary diagnosis in the setting of respiratory failure, treated with antibiotics within 2 days of admission. We employed regression modeling to compute adjusted association of presence of CRE with risk of receiving IET, and of IET on hospital mortality, length of stay (LOS) and costs.
RESULTS
Among 40,137 patients presenting to the hospital with an Enterobacteriaceae UTI, pneumonia or sepsis, 1227 (3.1%) were CRE. In both groups, the majority of the cases were UTI (51.4% CRE and 54.3% non-CRE). Those with CRE were younger (66.6+/-15.3 vs. 69.1+/-15.9 years, p < 0.001), and more likely to be African-American (19.7% vs. 14.0%, p < 0.001) than those with non-CRE. Both chronic (Charlson score 2.0+/-2.0 vs. 1.9+/-2.1, p = 0.009) and acute (by day 2: ICU 56.3% vs. 30.4%, p < 0.001, and mechanical ventilation 35.8% vs. 11.7%, p < 0.001) illness burdens were higher among CRE than non-CRE subjects, respectively. CRE patients were 3× more likely to receive IET than non-CRE (46.5% vs. 11.8%, p < 0.001). In a regression model CRE was a strong predictor of receiving IET (adjusted relative risk ratio 3.95, 95% confidence interval 3.5 to 4.5, p < 0.001). In turn, IET was associated with an adjusted rise in mortality of 12% (95% confidence interval 3% to 23%), and an excess of 5.2 days (95% confidence interval 4.8, 5.6, p < 0.001) LOS and $10,312 (95% confidence interval $9497, $11,126, p < 0.001) in costs.
CONCLUSIONS
In this large US database, the prevalence of CRE among patients with Enterobacteriaceae UTI, pneumonia or sepsis was comparable to other national estimates. Infection with CRE was associated with a four-fold increased risk of receiving IET, which in turn increased mortality, LOS and costs. |
Planned caesarean section versus planned vaginal birth for breech presentation at term: a randomised multicentre trial | BACKGROUND
For 3-4% of pregnancies, the fetus will be in the breech presentation at term. For most of these women, the approach to delivery is controversial. We did a randomised trial to compare a policy of planned caesarean section with a policy of planned vaginal birth for selected breech-presentation pregnancies.
METHODS
At 121 centres in 26 countries, 2088 women with a singleton fetus in a frank or complete breech presentation were randomly assigned planned caesarean section or planned vaginal birth. Women having a vaginal breech delivery had an experienced clinician at the birth. Mothers and infants were followed-up to 6 weeks post partum. The primary outcomes were perinatal mortality, neonatal mortality, or serious neonatal morbidity; and maternal mortality or serious maternal morbidity. Analysis was by intention to treat.
FINDINGS
Data were received for 2083 women. Of the 1041 women assigned planned caesarean section, 941 (90.4%) were delivered by caesarean section. Of the 1042 women assigned planned vaginal birth, 591 (56.7%) delivered vaginally. Perinatal mortality, neonatal mortality, or serious neonatal morbidity was significantly lower for the planned caesarean section group than for the planned vaginal birth group (17 of 1039 [1.6%] vs 52 of 1039 [5.0%]; relative risk 0.33 [95% CI 0.19-0.56]; p<0.0001). There were no differences between groups in terms of maternal mortality or serious maternal morbidity (41 of 1041 [3.9%] vs 33 of 1042 [3.2%]; 1.24 [0.79-1.95]; p=0.35).
INTERPRETATION
Planned caesarean section is better than planned vaginal birth for the term fetus in the breech presentation; serious maternal complications are similar between the groups. |
Pushing the contextual envelope: developing and diffusing IS theory for health information systems research | The healthcare sector is a crucial and socially challenging component of modern economies. Information systems (IS) research could contribute to the effective development, application and use of information technologies to manage and coordinate health services. Healthcare also provides opportunities to develop or refine IS theory because of its unique institutional context. To profile IS research in health-related settings, we examine the publication of health information systems research (HISR) in 17 IS journals since 1985. Our analysis revealed a small but growing body of HISR literature. These publications are concentrated in "HISR-friendly journals" and employ a variety of strategies for balancing general IS theories and knowledge with attention to the institutional characteristics of healthcare. We consider the strengths and limitations of these strategies in advancing HISR within the IS field and for contributing to multidisciplinary HISR knowledge. |
DataHub: Collaborative Data Science & Dataset Version Management at Scale | Relational databases have limited support for data collaboration, where teams collaboratively curate and analyze large datasets. Inspired by software version control systems like git, we propose (a) a dataset version control system, giving users the ability to create, branch, merge, difference and search large, divergent collections of datasets, and (b) a platform, DATAHUB, that gives users the ability to perform collaborative data analysis building on this version control system. We outline the challenges in providing dataset version control at scale. |
WoLMIS: a labor market intelligence system for classifying web job vacancies | In the last decades, an increasing number of employers and job seekers have been relying on Web resources to get in touch and to find a job. If appropriately retrieved and analyzed, the huge number of job vacancies available today on on-line job portals can provide detailed and valuable information about the Web Labor Market dynamics and trends. In particular, this information can be useful to all actors, public and private, who play a role in the European Labor Market. This paper presents WoLMIS, a system aimed at collecting and automatically classifying multilingual Web job vacancies with respect to a standard taxonomy of occupations. The proposed system has been developed for the Cedefop European agency, which supports the development of European Vocational Education and Training (VET) policies and contributes to their implementation. In particular, WoLMIS allows analysts and Labor Market specialists to make sense of Labor Market dynamics and trends of several countries in Europe, by overcoming linguistic boundaries across national borders. A detailed experimental evaluation analysis is also provided for a set of about 2 million job vacancies, collected from a set of UK and Irish Web job sites from June to September 2015. |
Wrong, but useful: regional species distribution models may not be improved by range‐wide data under biased sampling | Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate national presence-only data are for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the lower the appropriateness of the Egyptian dataset to describe the species' niche. We inspected the effect of incorporating predictions from global models as an additional predictor ("prior") in regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed predictive performance improvements after correction for sampling bias at both scales. On average, predictions from global and regional models in Egypt only weakly concur. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance, whichever prior was used, making the contribution of priors less pronounced. Under biased and incomplete sampling, the use of global bat data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. However, we still believe in great potential for global model predictions to guide future surveys and improve regional sampling in data-poor regions. |
A Data-Efficient Framework for Training and Sim-to-Real Transfer of Navigation Policies | Learning effective visuomotor policies for robots purely from data is challenging, but also appealing since a learning-based system should not require manual tuning or calibration. In the case of a robot operating in a real environment the training process can be costly, time-consuming, and even dangerous since failures are common at the start of training. For this reason, it is desirable to be able to leverage simulation and off-policy data to the extent possible to train the robot. In this work, we introduce a robust framework that plans in simulation and transfers well to the real environment. Our model incorporates a gradient-descent based planning module, which, given the initial image and goal image, encodes the images to a lower dimensional latent state and plans a trajectory to reach the goal. The model, consisting of the encoder and planner modules, is trained through a meta-learning strategy in simulation first. We subsequently perform adversarial domain transfer on the encoder by using a bank of unlabelled but random images from the simulation and real environments to enable the encoder to map images from the real and simulated environments to a similarly distributed latent representation. By fine tuning the entire model (encoder + planner) with far fewer real world expert demonstrations, we show successful planning performances in different navigation tasks. |
Microservice-Based IoT for Smart Buildings | A large percentage of buildings, in domestic or special-purpose use, is expected to become increasingly "smarter" in the future, due to the immense benefits in terms of energy saving, safety, flexibility, and comfort that relevant new technologies offer. At the hardware, software, or platform level, however, no clearly dominant standards currently exist. Such standards would ideally fulfill a number of important desiderata, which are to be touched upon in this paper. Here, we present a prototype platform for supporting multiple concurrent applications for smart buildings, which utilizes an advanced sensor network as well as a distributed microservices architecture, centrally featuring the Jolie programming language. The architecture and benefits of our system are discussed, as well as a prototype containing a number of nodes and a user interface, deployed in a real-world academic building environment. Our results illustrate the promising nature of our approach, as well as open avenues for future work towards its wider and larger-scale applicability. |
Health promotion by social cognitive means. | This article examines health promotion and disease prevention from the perspective of social cognitive theory. This theory posits a multifaceted causal structure in which self-efficacy beliefs operate together with goals, outcome expectations, and perceived environmental impediments and facilitators in the regulation of human motivation, behavior, and well-being. Belief in one's efficacy to exercise control is a common pathway through which psychosocial influences affect health functioning. This core belief affects each of the basic processes of personal change--whether people even consider changing their health habits, whether they mobilize the motivation and perseverance needed to succeed should they do so, their ability to recover from setbacks and relapses, and how well they maintain the habit changes they have achieved. Human health is a social matter, not just an individual one. A comprehensive approach to health promotion also requires changing the practices of social systems that have widespread effects on human health. |
Millimeter-wave access and backhauling: the solution to the exponential data traffic increase in 5G mobile communications systems? | The exponential increase of mobile data traffic requires disrupting approaches for the realization of future 5G systems. In this article, we overview the technologies that will pave the way for a novel cellular architecture that integrates high-data-rate access and backhaul networks based on millimeter-wave frequencies (57-66, 71-76, and 81-86 GHz). We evaluate the feasibility of short- and medium-distance links at these frequencies and analyze the requirements from the transceiver architecture and technology, antennas, and modulation scheme points of view. Technical challenges are discussed, and design options highlighted; finally, a performance evaluation quantifies the benefits of millimeter- wave systems with respect to current cellular technologies. |
Patterns of Enterprise Application Architecture | `protected void doInsert(DomainObject subject, PreparedStatement insertStatement) throws SQLException;` `class PersonMapper... protected String insertStatement() { return "INSERT INTO people VALUES (?, ?, ?, ?)"; } protected void doInsert(DomainObject abstractSubject, PreparedStatement stmt) throws SQLException { Person subject = (Person) abstractSubject; stmt.setString(2, subject.getLastName()); stmt.setString(3, subject.getFirstName()); stmt.setInt(4, subject.getNumberOfDependents()); }` Example: Separating the Finders (Java). To allow domain objects to invoke finder behavior, I can use Separated Interface (476) to separate the finder interfaces from the mappers (Figure 10.5). I can put these finder interfaces in a separate package that's visible to the domain layer, or, as in this case, I can put them in the domain layer itself. Figure 10.5. Defining a finder interface in the domain package. |
Preschool children learn about causal structure from conditional interventions. | The conditional intervention principle is a formal principle that relates patterns of interventions and outcomes to causal structure. It is a central assumption of experimental design and the causal Bayes net formalism. Two studies suggest that preschoolers can use the conditional intervention principle to distinguish causal chains, common cause and interactive causal structures even in the absence of differential spatiotemporal cues and specific mechanism knowledge. Children were also able to use knowledge of causal structure to predict the patterns of evidence that would result from interventions. A third study suggests that children's spontaneous play can generate evidence that would support such accurate causal learning. |
Physiological and neurophysiological determinants of postcancer fatigue: design of a randomized controlled trial | Postcancer fatigue is a frequently occurring, severe, and invalidating problem, impairing quality of life. Although it is possible to effectively treat postcancer fatigue with cognitive behaviour therapy, the nature of the underlying (neuro)physiology of postcancer fatigue remains unclear. Physiological aspects of fatigue include peripheral fatigue, originating in muscle or the neuromuscular junction; central fatigue, originating in nerves, spinal cord, and brain; and physical deconditioning, resulting from a decreased cardiopulmonary function. Studies on physiological aspects of postcancer fatigue mainly concentrate on deconditioning. Peripheral and central fatigue and brain morphology and function have been studied for patients with fatigue in the context of chronic fatigue syndrome and neuromuscular diseases and show several characteristic differences from healthy controls. Fifty-seven severely fatigued and 21 non-fatigued cancer survivors will be recruited from the Radboud University Nijmegen Medical Centre. Participants should have completed treatment of a malignant, solid tumour at least one year earlier and should have no evidence of disease recurrence. Severely fatigued patients are randomly assigned to either the intervention condition (cognitive behaviour therapy) or the waiting list condition (start cognitive behaviour therapy after 6 months). All participants are assessed at baseline and the severely fatigued patients again after 6 months of follow-up (at the end of cognitive behaviour therapy or waiting list). Primary outcome measures are fatigue severity, central and peripheral fatigue, brain morphology and function, and physical condition and activity. This study will be the first randomized controlled trial that characterizes (neuro)physiological factors of fatigue in disease-free cancer survivors and evaluates to what extent these factors can be influenced by cognitive behaviour therapy. The results of this study are not only essential for a theoretical understanding of this invalidating condition, but also for providing an objective biological marker for fatigue that could support the diagnosis and follow-up of treatment. The study is registered at http://ClinicalTrials.gov (NCT01096641). |
Astrocytes from Familial and Sporadic ALS Patients are Toxic to Motor Neurons | Amyotrophic lateral sclerosis (ALS) is a fatal motor neuron disease, with astrocytes implicated as contributing substantially to motor neuron death in familial (F)ALS. However, the proposed role of astrocytes in the pathology of ALS derives in part from rodent models of FALS based upon dominant mutations within the superoxide dismutase 1 (SOD1) gene, which account for <2% of all ALS cases. Their role in sporadic (S)ALS, which affects >90% of ALS patients, remains to be established. Using astrocytes generated from postmortem tissue from both FALS and SALS patients, we show that astrocytes derived from both patient groups are similarly toxic to motor neurons. We also demonstrate that SOD1 is a viable target for SALS, as its knockdown significantly attenuates astrocyte-mediated toxicity toward motor neurons. Our data highlight astrocytes as a non–cell autonomous component in SALS and provide an in vitro model system to investigate common disease mechanisms and evaluate potential therapies for SALS and FALS. |
Low-Profile Integrated Microstrip Antenna for GPS-DSRC Application | This letter describes the concept, design, and measurement of a low-profile integrated microstrip antenna for dual-band applications. The antenna operates at both the GPS L1 frequency of 1.575 GHz with circular polarization and 5.88 GHz with a vertical linear polarization for dedicated short-range communication (DSRC) application. The antenna is low profile and meets stringent requirements on pattern/polarization performance in both bands. The design procedure is discussed, and full measured data are presented. |
Reduction in driveline infection rates: Results from the HeartMate II Multicenter Driveline Silicone Skin Interface (SSI) Registry. | BACKGROUND
During left ventricular assist device implantation, a surgical tunneling technique to keep the entire driveline (DL) velour portion in the subcutaneous tunnel, resulting in a silicone-skin interface (SSI) at the exit site, has been adopted by many centers. To assess long-term freedom from DL infection associated with this technique, a multicenter SSI registry was initiated. It was hypothesized that the modified tunneling technique is associated with at least 50% reduction in DL infection at 1 year post-implant compared with the velour-to-skin method used in the HeartMate II (HMII) Destination Therapy (DT) trial.
METHODS
SSI is a retrospective and prospective registry of patients who have received the HMII device. Results are reported from the retrospective cohort, which consists of 200 patients who were implanted during the period 2009-2012 with the SSI tunneling method and on HMII support for at least 10 months at the time of enrollment. The prevalence and incidence of DL infection after left ventricular assist device implantation in the SSI retrospective cohort were determined and compared with a control group of 201 patients also on HMII support for at least 10 months from the HMII DT clinical trial who were implanted during the period 2007-2009 using the traditional method in which a small section of the velour portion of the DL was externalized.
RESULTS
The 1-year and 2-year prevalence rates of DL infection were 9% and 19% in the SSI patient group compared with 23% and 35% in the control group (hazard ratio 0.49, 95% confidence interval 0.33-0.73, p < 0.001). The event rate per patient-year was 0.11 and 0.22 for the SSI and control groups, respectively (p < 0.001). Based on a multivariate analysis, age and DL exit side were the only independent variables associated with DL infection. Effects of management changes over the eras were not studied and could have contributed to the findings.
CONCLUSIONS
These results suggest that leaving the entire DL velour portion below the skin is associated with a 50% reduction in DL infection compared with results from the HMII DT trial. |
Single-chip microprocessor that communicates directly using light | Data transport across short electrical wires is limited by both bandwidth and power density, which creates a performance bottleneck for semiconductor microchips in modern computer systems—from mobile phones to large-scale data centres. These limitations can be overcome by using optical communications based on chip-scale electronic–photonic systems enabled by silicon-based nanophotonic devices [8]. However, combining electronics and photonics on the same chip has proved challenging, owing to microchip manufacturing conflicts between electronics and photonics. Consequently, current electronic–photonic chips are limited to niche manufacturing processes and include only a few optical devices alongside simple circuits. Here we report an electronic–photonic system on a single chip integrating over 70 million transistors and 850 photonic components that work together to provide logic, memory, and interconnect functions. This system is a realization of a microprocessor that uses on-chip photonic devices to directly communicate with other chips using light. To integrate electronics and photonics at the scale of a microprocessor chip, we adopt a ‘zero-change’ approach to the integration of photonics. Instead of developing a custom process to enable the fabrication of photonics, which would complicate or eliminate the possibility of integration with state-of-the-art transistors at large scale and at high yield, we design optical devices using a standard microelectronics foundry process that is used for modern microprocessors. This demonstration could represent the beginning of an era of chip-scale electronic–photonic systems with the potential to transform computing system architectures, enabling more powerful computers, from network infrastructure to data centres and supercomputers. |
An overview of text-independent speaker recognition: From features to supervectors | This paper gives an overview of automatic speaker recognition technology, with an emphasis on text-independent recognition. Speaker recognition has been studied actively for several decades. We give an overview of both the classical and the state-of-the-art methods. We start with the fundamentals of automatic speaker recognition, concerning feature extraction and speaker modeling. We elaborate on advanced computational techniques to address robustness and session variability. The recent progress from vectors towards supervectors opens up a new area of exploration and represents a technology trend. We also provide an overview of this recent development and discuss the evaluation methodology of speaker recognition systems. We conclude the paper with a discussion on future directions. |
Answering graph pattern query using incremental views | In recent years, modeling data as graphs has become evident and effective for processing in some prominent application areas such as social analytics, health care analytics, and scientific analytics. The key sources of massively scaled data are petascale simulations, experimental devices, the internet and scientific applications. Hence, there is a demand for adapting graph querying techniques to such large graph data. Graphs are pervasive in large-scale analytics and face new challenges such as data size, heterogeneity, uncertainty and data quality. Traditional graph pattern matching approaches are based on inherent isomorphism and simulation. In real-life applications, many of them fail to capture structural similarity, semantic similarity, or both. Moreover, in real-life applications, data graphs constantly undergo small updates. In response to these challenges, we propose a notion that revises traditional notions to characterize graph pattern matching using graph views. Based on this characterization, we outline an approach that efficiently solves the graph pattern query problem over both static and dynamic real-life data graphs. |
GEOCHEMICAL AND PETROLOGICAL CHARACTERISTICS OF DEH SIAHAN GRANITIC ROCKS, SOUTHWEST OF KERMAN, IRAN: DATA BEARING ON GENESIS | The Oligocene-Miocene granitic rocks of Deh Siahan, part of the central Iranian volcanic belt, are intruded into an Eocene volcano-sedimentary complex where their contact is marked by albite-epidote hornblende hornfels facies and granitic apophyses. The granitic rocks show enhanced LIL element abundances and low HFS/LIL ratios. Geochemical data, various trace element discriminant diagrams, enhanced Y/Nb and Ce/Nb ratios, and ocean ridge granite normalized multielement diagrams indicate that the Deh Siahan granitic rocks have characteristics of high-K, calc-alkaline, I-Cordilleran type granites of volcanic arc settings. In this respect, they may represent part of an Andean-type magmatic arc formed in response to subduction of Neotethys oceanic crust beneath Central Iran, unrelated to a rift setting. The partial melting of subducted oceanic crust led to the formation of a basic magma. Its emplacement under the mantle wedge provoked melting in the considerably metasomatized and enriched sub-continental lithosphere. This caused the generation of a siliceous magma whose low-pressure crystal fractionation eventually led to the formation of the Deh Siahan granitic rocks. |
Accuracy of screening methods for the diagnosis of breast disease. | Clinical examination, thermography, and 70-mm. mammography were performed in 891 patients: 414 presented to hospital with symptoms of breast disease and 477 were asymptomatic. Comparison of the diagnostic accuracy of these methods showed that neither thermography nor 70-mm. mammography has a useful place as an isolated screening procedure for breast cancer. In fact, we consider such a policy dangerous. |
The Global Street: Making the Political | This article explores key vectors in the uprisings of the MENA (Middle East and North Africa) region from an urban perspective. The aim is to open up a larger conceptual field to understand the complex interactions between power and powerlessness as they get shaped in urban space. I argue that the city makes visible the limits of superior military power and, most importantly, that cities enable powerlessness to become complex, not simply elementary. In this complexity lies the possibility of making history and remaking the political. The question of public space is central to giving the powerless rhetorical and operational openings. But that public space needs to be distinguished from the concept of public space in the European tradition. This leads me to the concept of The Global Street. |
Strategy-Proofness and Arrow’s Conditions: Existence and Correspondence Theorems for Voting Procedures and Social Welfare Functions* | Consider a committee which must select one alternative from a set of three or more alternatives. Committee members each cast a ballot which the voting procedure counts. The voting procedure is strategy-proof if it always induces every committee member to cast a ballot revealing his preference. I prove three theorems. First, every strategy-proof voting procedure is dictatorial. Second, this paper’s strategy-proofness condition for voting procedures corresponds to Arrow’s rationality, independence of irrelevant alternatives, nonnegative response, and citizens’ sovereignty conditions for social welfare functions. Third, Arrow’s general possibility theorem is proven in a new manner. |
Stable haptic interaction with virtual environments | Stable Haptic Interaction with Virtual Environments |
Efficient analysis of power consumption behaviour of embedded wireless IoT systems | From wearables to smart appliances, the Internet of Things (IoT) is developing at a rapid pace. The challenge is to find the best-fitting solution within a range of different technologies that all may appear appropriate at first sight to realize a specific embedded device. A single tool for measuring power consumption of various wireless technologies and low power modes helps to optimize the development process of modern IoT systems. In this paper, we present an accurate but still cost-effective measurement solution for tracking the highly dynamic power consumption of wireless embedded systems. We extended the conventional measurement of a single shunt resistor's voltage drop by using a dual shunt resistor stage with an automatic switch-over between two stages, which leads to a large dynamic measurement range from μA up to several hundred mA. To demonstrate the usability of our simple-to-use power measurement system, different use cases are presented. Using two independent current measurement channels allows us to evaluate the timing relation of proprietary RF communication. Furthermore, a forecast is given on the expected battery lifetime of a Wi-Fi-based data acquisition system using measurement results of the presented tool. |
Classification of seizure based on the time-frequency image of EEG signals using HHT and SVM | The detection of seizure activity in electroencephalogram (EEG) signals is crucial for the classification of epileptic seizures. Because epileptic seizures occur irregularly and unpredictably, automatic seizure detection in EEG recordings is highly desirable. In this work, we present a new technique for seizure classification of EEG signals using the Hilbert–Huang transform (HHT) and support vector machine (SVM). In our method, the HHT-based time-frequency representation (TFR) has been considered as a time-frequency image (TFI), the segmentation of the TFI has been implemented based on the frequency bands of the rhythms of EEG signals, and the histogram of each grayscale sub-image has been computed. Statistical features such as the mean, variance, skewness and kurtosis of pixel intensity in the histogram have been extracted. The SVM with radial basis function (RBF) kernel has been employed for classification of seizure and nonseizure EEG signals. The classification accuracy and receiver operating characteristic (ROC) curve have been used for evaluating the performance of the classifier. Experimental results show that the best average classification accuracy of this algorithm can reach 99.125% with the theta rhythm of EEG signals. |
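A hedged illustration of the feature-extraction-plus-SVM pipeline sketched in the abstract above, in Python. Since the Hilbert-Huang transform is not part of the standard scientific Python stack, an STFT spectrogram stands in for the time-frequency image, and the band edges, window sizes, and sampling rate are illustrative choices rather than the paper's settings.

    # Sketch only: STFT spectrogram stands in for the HHT-based TFI.
    import numpy as np
    from scipy.signal import spectrogram
    from scipy.stats import skew, kurtosis
    from sklearn.svm import SVC

    BANDS_HZ = [(0.5, 4), (4, 8), (8, 13), (13, 30)]  # delta, theta, alpha, beta

    def tfi_features(eeg, fs=173.61):
        """Histogram-style statistics of TFI intensity inside each EEG rhythm band."""
        f, t, S = spectrogram(eeg, fs=fs, nperseg=256, noverlap=128)
        feats = []
        for lo, hi in BANDS_HZ:
            sub = S[(f >= lo) & (f < hi)].ravel()        # pixels of one grayscale sub-image
            feats += [sub.mean(), sub.var(), skew(sub), kurtosis(sub)]
        return np.array(feats)

    def train_seizure_classifier(segments, labels, fs=173.61):
        X = np.vstack([tfi_features(s, fs) for s in segments])
        clf = SVC(kernel="rbf", C=10.0, gamma="scale")   # RBF-kernel SVM, as in the abstract
        clf.fit(X, labels)
        return clf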
Impacts of IoT and big data to automotive industry | With recent advancements in technologies such as embedded systems, wireless distributed sensors, lightweight materials, smart cognitive radio networks, cloud computing, higher-efficiency and ultra-low-emission internal combustion engines, intelligent converters, high-performance batteries and fuel cell technology, the production of smarter, safer, energy-efficient and zero-emission vehicles is possible in the near future. Apart from vehicle technologies, other factors such as road users, well-maintained road infrastructure, well-maintained vehicles, drivers' attitudes, and law enforcement are also important to consider, and they should work together so that the world's natural resources can be preserved, a cleaner environment maintained, and sustainable mobility produced. This paper will discuss the impacts of IoT and Big Data and the other emerging technologies mentioned above on the automotive industry. It will include discussion on education, economy, advanced technology, environment, safety and energy. |
Fused-layer CNN accelerators | Deep convolutional neural networks (CNNs) are rapidly becoming the dominant approach to computer vision and a major component of many other pervasive machine learning tasks, such as speech recognition, natural language processing, and fraud detection. As a result, accelerators for efficiently evaluating CNNs are rapidly growing in popularity. The conventional approach to designing such CNN accelerators is to focus on creating accelerators that iteratively process the CNN layers. However, by processing each layer to completion, the accelerator designs must use off-chip memory to store intermediate data between layers, because the intermediate data are too large to fit on chip. In this work, we observe that a previously unexplored dimension exists in the design space of CNN accelerators that focuses on the dataflow across convolutional layers. We find that we are able to fuse the processing of multiple CNN layers by modifying the order in which the input data are brought on chip, enabling caching of intermediate data between the evaluation of adjacent CNN layers. We demonstrate the effectiveness of our approach by constructing a fused-layer CNN accelerator for the first five convolutional layers of the VGGNet-E network and comparing it to the state-of-the-art accelerator implemented on a Xilinx Virtex-7 FPGA. We find that, by using 362KB of on-chip storage, our fused-layer accelerator minimizes off-chip feature map data transfer, reducing the total transfer by 95%, from 77MB down to 3.6MB per image. |
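The fusion idea above can be made concrete with a deliberately simplified one-dimensional Python sketch: two convolution layers are evaluated tile by tile so that only a small window of the intermediate feature map exists at any time (standing in for on-chip storage). Kernel sizes, tile size, and the valid-padding convention are illustrative assumptions, not the accelerator's configuration.

    # Simplified 1D illustration of layer fusion; not the accelerator's dataflow.
    import numpy as np

    def conv1d_valid(x, w):
        k = len(w)
        return np.array([np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)])

    def fused_two_layer(x, w1, w2, tile=8):
        k1, k2 = len(w1), len(w2)
        out_len = len(x) - k1 - k2 + 2
        out = np.empty(out_len)
        for start in range(0, out_len, tile):
            stop = min(start + tile, out_len)
            # Input window needed to produce out[start:stop] through both layers.
            x_win = x[start: stop + k2 - 1 + k1 - 1]
            mid = conv1d_valid(x_win, w1)          # only a small intermediate tile exists
            out[start:stop] = conv1d_valid(mid, w2)
        return out

    # Sanity check against the unfused, layer-by-layer evaluation.
    rng = np.random.default_rng(0)
    x, w1, w2 = rng.normal(size=64), rng.normal(size=3), rng.normal(size=5)
    assert np.allclose(fused_two_layer(x, w1, w2),
                       conv1d_valid(conv1d_valid(x, w1), w2))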
Autonomous driving: investigating the feasibility of car-driver handover assistance | Self-driving vehicles are able to drive on their own as long as the requirements of their autonomous systems are met. If the system reaches the boundary of its capabilities, the system has to de-escalate (e.g. emergency braking) or hand over control to the human driver. Accordingly, the design of a functional handover assistant requires that it enable drivers to both take over control and feel comfortable while doing so -- even when they were "out of the loop" with other tasks. We introduce a process to hand over control from a full self-driving system to manual driving, and propose a number of handover implementation strategies. Moreover, we designed and implemented a handover assistant based on users' preferences and conducted a user study with 30 participants, whose distraction was ensured by a realistic distractor task. Our evaluation shows that car-driver handovers prompted by multimodal (auditory and visual) warnings are a promising strategy to compensate for system boundaries of autonomous vehicles. The insights we gained from the take-over behavior of drivers led us to formulate recommendations for more realistic evaluation settings and the design of future handover assistants. |
Numerical differentiation with annihilators in noisy environment | Numerical differentiation in a noisy environment is revisited through an algebraic approach. For each given order, an explicit formula yielding a pointwise derivative estimate is derived, using elementary differential-algebraic operations. These expressions are composed of iterated integrals of the noisy observation signal. We show in particular that the introduction of delayed estimates affords significant improvement. An implementation in terms of a classical finite impulse response (FIR) digital filter is given. Several simulation results are presented. |
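To make the construction concrete, the following Python sketch implements one first-order member of this estimator family as a causal FIR filter; the weights are derived for a locally affine signal model on a sliding window of length T, and the window length and sampling rate are illustrative assumptions.

    # For y(t) ~ a + b*t on a window of length T, the weighted integral
    #   (6/T^3) * integral_0^T (T - 2*tau) * y(t - tau) dtau
    # annihilates the constant term and returns the slope b; integrating the
    # noisy signal averages out the noise. Discretized below as an FIR filter.
    import numpy as np
    from scipy.signal import lfilter

    def algebraic_diff_fir(y, fs, T=0.2):
        n = max(int(round(T * fs)), 2)
        tau = np.arange(n) / fs                       # delays of the FIR taps
        h = (6.0 / T**3) * (T - 2.0 * tau) / fs       # rectangle-rule quadrature weights
        return lfilter(h, [1.0], y)                   # estimate is delayed by roughly T/2

    # Example: differentiate a noisy 1 Hz sine.
    fs = 1000.0
    t = np.arange(0, 2, 1 / fs)
    y = np.sin(2 * np.pi * t) + 0.05 * np.random.randn(t.size)
    dy_hat = algebraic_diff_fir(y, fs)                # roughly 2*pi*cos(2*pi*(t - T/2))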
A novel sentiment analysis of social networks using supervised learning | Online microblog-based social networks have been used for expressing public opinions through short messages. Among popular microblogs, Twitter has attracted the attention of several researchers in areas such as predicting consumer brands, democratic electoral events, movie box office, popularity of celebrities, the stock market, etc. Sentiment analysis over a Twitter-based social network offers a fast and efficient way of monitoring public sentiment. This paper studies the sentiment prediction task over Twitter using machine-learning techniques, with consideration of Twitter-specific social network structure such as retweets. We also concentrate on finding both direct and extended terms related to the event and thereby understanding its effect. We employed supervised machine-learning techniques such as support vector machines (SVM), Naive Bayes, maximum entropy and artificial neural networks to classify the Twitter data using unigram, bigram and unigram + bigram (hybrid) feature extraction models for the case studies of the US Presidential Elections 2012 and the Karnataka State Assembly Elections (India) 2013. Further, we combined the results of sentiment analysis with the influence factor generated from the retweet count to improve the prediction accuracy of the task. Experimental results demonstrate that SVM outperforms all other classifiers with a maximum accuracy of 88% in predicting the outcome of the US Elections 2012, and 68% for the Indian State Assembly Elections 2013. |
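A hedged sketch of the unigram-plus-bigram supervised set-up (the SVM variant only) is shown below in Python with scikit-learn; the election corpora, the retweet-based influence factor, and the exact preprocessing are not reproduced, and the tweets and labels are placeholders.

    # Unigram + bigram bag-of-words features fed to a linear SVM; toy data only.
    from sklearn.pipeline import Pipeline
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import LinearSVC

    tweets = ["great debate performance", "worst policy speech ever"]   # placeholder tweets
    labels = [1, 0]                                                     # 1 = positive

    clf = Pipeline([
        ("bow", CountVectorizer(ngram_range=(1, 2), lowercase=True)),   # unigram + bigram
        ("svm", LinearSVC(C=1.0)),
    ])
    clf.fit(tweets, labels)
    print(clf.predict(["great speech"]))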
Classifying EEG signals preceding right hand, left hand, tongue, and right foot movements and motor imageries | OBJECTIVE
To use the neural signals preceding movement and motor imagery to predict which of the four movements/motor imageries is about to occur, and to assess this utility for brain-computer interface (BCI) applications.
METHODS
Eight naïve subjects performed or kinesthetically imagined four movements while electroencephalogram (EEG) was recorded from 29 channels over sensorimotor areas. The task was instructed with a specific stimulus (S1) and performed at a second stimulus (S2). A classifier was trained and tested offline at differentiating the EEG signals from movement/imagery preparation (the 1.5-s preceding movement/imagery execution).
RESULTS
Accuracy of movement/imagery preparation classification varied between subjects. The system preferentially selected event-related (de)synchronization (ERD/ERS) signals for classification, and high accuracies were associated with classifications that relied heavily on the ERD/ERS to discriminate movement/imagery planning.
CONCLUSIONS
The ERD/ERS preceding movement and motor imagery can be used to predict which of the four movements/imageries is about to occur. Prediction accuracy depends on this signal's accessibility.
SIGNIFICANCE
The ERD/ERS is the most specific pre-movement/imagery signal to the movement/imagery about to be performed. |
Bayesian Optimization with Empirical Constraints (PhD Proposal) | This work is motivated by the experimental design problem of optimizing the power output of nano-enhanced microbial fuel cells. Microbial fuel cells (MFCs) (Bond and Lovley, 2003; Fan et al., 2007; Park and Zeikus, 2003; Reguera, 2005) use micro-organisms to break down organic matter and generate electricity. For a particular MFC design, it is critical to optimize the biological energetics and the microbial/electrode interface of the system, which research has shown to depend strongly on the surface properties of the anodes (Park and Zeikus, 2003; Reguera, 2005). This motivates the design of nano-enhanced anodes, where nano-structures (e.g. carbon nano-wire) are grown on the anode surface to improve the MFC's power output. Unfortunately, there is little understanding of the interaction between various possible nano-enhancements and MFC capabilities for different micro-organisms. Thus, optimizing anode design for a particular application is largely guesswork. Our goal is to develop algorithms to aid this process. Bayesian optimization (Jones, 2001a; Brochu et al., 2009) has been widely used for experimental design problems where the goal is to optimize an unknown function f(·) that is costly to evaluate. In general, we are interested in finding the point x∗ ∈ X^d ⊂ R^d such that x∗ = argmax_{x ∈ X^d} f(x). |
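The following Python sketch shows the generic Bayesian optimization loop the proposal builds on, a Gaussian process surrogate combined with an expected-improvement acquisition; it is not the proposal's constrained algorithm, and the one-dimensional toy objective, bounds, and kernel choice are assumptions standing in for an expensive experiment such as an MFC power measurement.

    # Generic GP + expected-improvement loop; illustrative only.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def expected_improvement(mu, sigma, best):
        sigma = np.maximum(sigma, 1e-9)
        z = (mu - best) / sigma
        return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

    def bayes_opt(f, bounds, n_init=5, n_iter=20, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(bounds[0], bounds[1], size=(n_init, 1))
        y = np.array([f(x[0]) for x in X])
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        for _ in range(n_iter):
            gp.fit(X, y)
            cand = rng.uniform(bounds[0], bounds[1], size=(1000, 1))
            mu, sigma = gp.predict(cand, return_std=True)
            x_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
            X = np.vstack([X, x_next])              # run the "experiment" at x_next
            y = np.append(y, f(x_next[0]))
        return X[np.argmax(y)], y.max()

    x_best, y_best = bayes_opt(lambda x: -(x - 0.3) ** 2, bounds=(0.0, 1.0))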
Nagging: A scalable fault-tolerant paradigm for distributed search | This paper describes nagging, a technique for parallelizing search in a heterogeneous distributed computing environment. Nagging exploits the speedup anomaly often observed when parallelizing problems by playing multiple reformulations of the problem or portions of the problem against each other. Nagging is both fault tolerant and robust to long message latencies. In this paper, we show how nagging can be used to parallelize several different algorithms drawn from the artificial intelligence literature, and describe how nagging can be combined with partitioning, the more traditional search parallelization strategy. We present a theoretical analysis of the advantage of nagging with respect to partitioning, and give empirical results obtained on a cluster of 64 processors that demonstrate nagging’s effectiveness and scalability as applied to A* search, α-β minimax game tree search, and the Davis-Putnam algorithm. |
WebGazer: Scalable Webcam Eye Tracking Using User Interactions | We introduce WebGazer, an online eye tracker that uses common webcams already present in laptops and mobile devices to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. This approach aims to provide a natural experience to everyday users that is not restricted to laboratories and highly controlled user studies. WebGazer has two key components: a pupil detector that can be combined with any eye detection library, and a gaze estimator using regression analysis informed by user interactions. We perform a large remote online study and a small in-person study to evaluate WebGazer. The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user’s gaze. As part of this paper, we release the first eye tracking library that can be easily integrated in any website for real-time gaze interactions, usability studies, or web research. |
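A hedged sketch of the self-calibration idea follows: assume the user looks at the cursor at the moment of a click, then regress screen coordinates on eye-patch features collected at those moments. The feature extractor and the Python/scikit-learn setting are placeholders for illustration; WebGazer itself runs as JavaScript in the browser with its own pupil and eye detectors.

    # Interaction-driven gaze regression; eye_features is an assumed feature vector.
    import numpy as np
    from sklearn.linear_model import Ridge

    class GazeModel:
        def __init__(self):
            self.reg_x, self.reg_y = Ridge(alpha=1.0), Ridge(alpha=1.0)
            self.feats, self.clicks = [], []

        def add_interaction(self, eye_features, click_xy):
            # Called on every click/cursor event: (eye features, known gaze target).
            self.feats.append(eye_features)
            self.clicks.append(click_xy)
            F, C = np.array(self.feats), np.array(self.clicks)
            self.reg_x.fit(F, C[:, 0])
            self.reg_y.fit(F, C[:, 1])

        def predict(self, eye_features):
            f = np.atleast_2d(eye_features)
            return float(self.reg_x.predict(f)[0]), float(self.reg_y.predict(f)[0])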
Petroleum Ether Extract of Cissus Quadrangularis (Linn.) Enhances Bone Marrow Mesenchymal Stem Cell Proliferation and Facilitates Osteoblastogenesis | OBJECTIVE
To evaluate the effects of the petroleum ether extract of Cissus quadrangularis on the proliferation rate of bone marrow mesenchymal stem cells, the differentiation of marrow mesenchymal stem cells into osteoblasts (osteoblastogenesis) and extracellular matrix calcification. This study also aimed to determine the additive effect of osteogenic media and Cissus quadrangularis on proliferation, differentiation and calcification.
METHODS
MSCs were cultured in media with or without Cissus quadrangularis for 4 weeks and were then stained for alkaline phosphatase. Extracellular matrix calcification was confirmed by Von Kossa staining. Marrow mesenchymal stem cell cultures in control media and osteogenic media supplemented with Cissus quadrangularis extract (100, 200, 300 microg/mL) were also subjected to a cell proliferation assay (MTT).
RESULTS
Treatment with 100, 200 or 300 microg/mL petroleum ether extract of Cissus quadrangularis enhanced the differentiation of marrow mesenchymal stem cells into ALP-positive osteoblasts and increased extracellular matrix calcification. Treatment with 300 microg/mL petroleum ether extract of Cissus quadrangularis also enhanced the proliferation rate of the marrow mesenchymal stem cells. Cells grown in osteogenic media containing Cissus quadrangularis exhibited higher proliferation, differentiation and calcification rates than did control cells.
CONCLUSION
The results suggest that Cissus quadrangularis stimulates osteoblastogenesis and can be used as a preventive/alternative natural medicine for bone diseases such as osteoporosis. |
A stimulus-response analysis of anxiety and its role as a reinforcing agent. | Within recent decades an important change has taken place in the scientific view of anxiety (fear), 2 its genesis, and its psychological significance. Writing in 1890, William James (6) stoutly supported the then current supposition that anxiety was an instinctive (‘idiopathic’) reaction to certain objects or situations, which might or might not represent real danger. To the extent that the instinctively given, predetermined objects of anxiety were indeed dangerous, anxiety reactions had biological utility and could be accounted for as an evolutionary product of the struggle for existence. On the other hand, there were, James assumed, also anxiety reactions that were altogether senseless and which, conjecturally, came about through Nature’s imperfect wisdom. But in all cases, an anxiety reaction was regarded as phylogenetically fixed and unlearned. The fact that children may show no fear of a given type of object, e.g., live frogs, during the first year of life but may later manifest such a reaction, James attributed to the ‘ripening’ of the fear-of-live-frogs instinct; and the fact that such fears, once they have ‘ripened,’ may also disappear he explained on the assumption that all instincts, after putting in an appearance and, as it were, placing themselves at the individual’s disposal, tend to undergo a kind of obliviscence or decay unless taken advantage of and made ‘habitual.’ |
Algorithms for Vertex Partitioning Problems on Partial k-Trees | In this paper, we consider a large class of vertex partitioning problems and apply to them the theory of algorithm design for problems restricted to partial k-trees. We carefully describe the details of algorithms and analyze their complexity in an attempt to make the algorithms feasible as solutions for practical applications. We give a precise characterization of vertex partitioning problems, which include domination, coloring and packing problems, and their variants. Several new graph parameters are introduced as generalizations of classical parameters. This characterization provides a basis for a taxonomy of a large class of problems, facilitating their common algorithmic treatment and allowing their uniform complexity classification. We present a design methodology of practical solution algorithms for generally NP-hard problems when restricted to partial k-trees (graphs with treewidth bounded by k). This “practicality” accounts for dependency on the parameter k of the computational complexity of the resulting algorithms. By adapting the algorithm design methodology on partial k-trees to vertex partitioning problems, we obtain the first algorithms for these problems with reasonable time complexity as a function of treewidth. As an application of the methodology, we give the first polynomial-time algorithm on partial k-trees for computation of the Grundy number. |
A Two-Step Flow of Influence? Opinion-Leader Campaigns on Climate Change | In this article, we review concepts, measures, and strategies that can be applied to opinion-leader campaigns on climate change. These campaigns can be used to catalyze wider political engagement on the issue and to promote sustainable consumer choices and behaviors. From past research, we outline six relevant categories of self-designated opinion-leaders, detailing issues related to identification, recruitment, training, message development, and coordination. We additionally analyze as prominent initiatives Al Gore’s The Climate Project and his more recent We campaign, which combines the recruitment of digital opinion-leaders with traditional media strategies. In evaluating digital opinion-leader campaigns, we conclude that there are likely to be significant trade-offs in comparison to face-to-face initiatives. The challenge for both scholars and practitioners is to understand under what conditions are digital opinion-leaders effective and in which ways can online interactions strengthen or build on real-world connections. |
Sources and the Circulation of Renaissance Music | Contents: Introduction Part I Scribes and the Making of Manuscripts: Manuscript structure in the Dufay era, Charles Hamm Simon Mellet, scribe of Cambrai cathedral, Liane Curtis A contemporary perception of early 15th-century style: Bologna Q15 as a document of scribal editorial initiative, Margaret Bent The origins of the Chigi Codex: the date, provenance, and original ownership of Rome, Biblioteca Vaticana, Chigiana, C.VIII.234, Herbert Kellman Jean Michel, Maistre Jhan and a chorus of beasts: old light on some Ferrarese music manuscripts, Joshua Rifkin. Part II Sources, Politics and Transmission: European politics and the distribution of music in the early 15th century, Reinhard Strohm A gift of madrigals and chansons: the Winchester Part Books and the courtship of Elizabeth I by Erik XIV of Sweden, Kristine K. Forney Danish diplomacy and the dedication of Giardino novo II (1606) to King James I, Susan G. Lewis [Hammond]. Part III Sources and the Transmission of Repertory: The early Tudor court, the provinces and the Eton Choirbook, Magnus Williamson Antwerp's role in the reception and dissemination of the madrigal in the North, Kristine K. Forney. Part IV Patrons and Collectors: The purpose of the gift: for display or for performance?, Stanley Boorman Music in the library of Johannes Klein, Tom R. Ward Music for the nuns of Verona: a story about MS DCCLXI of the Biblioteca Capitolare in Verona, Howard Mayer Brown The salon as marketplace in the 1550s: patrons and collectors of Lasso's secular music, Donna G. Cardamone. Part V Music Printing: 1501-1528: The 500th anniversary of the first music printing: a history of patronage and taste in the early years, Stanley Boorman The printing contract for the Libro primo de musica de la salamandra (Rome 1526), Bonnie J. Blackburn. Part VI Printing and Printing Houses after 1528: The Libro Primo of Constanzo Festa, James Haar Twins, cousins, and heirs: relationships among editions of music printed in 16th-century Venice, Mary S. Lewis Thoughts on the popularity of printed music in 16th-century Italy, Stanley Boorman The Burning Salamander: assigning a printer to some 16th-century music prints, Jane A. Bernstein. Part VII The Financial Side of Music Printing: Financial arrangements and the role of printer and composer, Jane A. Bernstein The Venetian privilege and music-printing in the 16th century, Richard J. Agee Orlando di Lasso, composer and print entrepreneur, James Haar Music selling in late 16th-century Florence: the bookshop of Piero di Giuliano Morosi, Tim Carter Name index. |
Private Empirical Risk Minimization: Efficient Algorithms and Tight Error Bounds | Convex empirical risk minimization is a basic tool in machine learning and statistics. We provide new algorithms and matching lower bounds for differentially private convex empirical risk minimization assuming only that each data point's contribution to the loss function is Lipschitz and that the domain of optimization is bounded. We provide a separate set of algorithms and matching lower bounds for the setting in which the loss functions are known to also be strongly convex. Our algorithms run in polynomial time, and in some cases even match the optimal nonprivate running time (as measured by oracle complexity). We give separate algorithms (and lower bounds) for (ε, 0)- and (ε, δ)-differential privacy; perhaps surprisingly, the techniques used for designing optimal algorithms in the two cases are completely different. Our lower bounds apply even to very simple, smooth function families, such as linear and quadratic functions. This implies that algorithms from previous work can be used to obtain optimal error rates, under the additional assumption that the contribution of each data point to the loss function is smooth. We show that simple approaches to smoothing arbitrary loss functions (in order to apply previous techniques) do not yield optimal error rates. In particular, optimal algorithms were not previously known for problems such as training support vector machines and the high-dimensional median. |
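For concreteness, the sketch below shows one standard recipe for differentially private ERM, noisy projected gradient descent on a Lipschitz loss over a bounded domain; it is meant only to make the problem setting tangible, and neither the noise calibration (a simple, non-tight composition bound) nor the update rule corresponds to the paper's optimal algorithms.

    # Gradient-perturbation DP-ERM for logistic loss; illustrative, not the paper's method.
    import numpy as np

    def dp_erm_logistic(X, y, epsilon, delta, L=1.0, R=1.0, T=200, seed=0):
        """y in {-1, +1}; domain = Euclidean ball of radius R; per-example loss L-Lipschitz."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        # Gaussian-mechanism scale for T adaptive gradient releases (loose composition).
        sigma = L * np.sqrt(8 * T * np.log(1.25 / delta)) / (n * epsilon)
        w = np.zeros(d)
        for t in range(1, T + 1):
            margins = y * (X @ w)
            grad = -(X * (y * (1 / (1 + np.exp(margins))))[:, None]).mean(axis=0)
            w -= (R / (L * np.sqrt(t))) * (grad + sigma * rng.normal(size=d))
            if np.linalg.norm(w) > R:                  # project back onto the ball
                w *= R / np.linalg.norm(w)
        return w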
The paradoxical future of digital learning | What constitutes learning in the 21st century will be contested terrain as our society strives toward post-industrial forms of knowledge acquisition and production without having yet overcome the educational contradictions and failings of the industrial age. Educational reformers suggest that the advent of new technologies will radically transform what people learn, how they learn, and where they learn, yet studies of diverse learners’ use of new media cast doubt on the speed and extent of change. Drawing on recent empirical and theoretical work, this essay critically examines beliefs about the nature of digital learning and points to the role of social, cultural, and economic factors in shaping and constraining educational transformation in the digital era. |
Designing AI systems that obey our laws and values | Calling for research on automatic oversight for artificial intelligence systems. |
Towards Competency Question-Driven Ontology Authoring | Ontology authoring is a non-trivial task for authors who are not proficient in logic. It is difficult to either specify the requirements for an ontology, or test their satisfaction. In this paper, we propose a novel approach to address this problem by leveraging the ideas of competency questions and test-before software development. We first analyse real-world competency questions collected from two different domains. Analysis shows that many of them can be categorised into patterns that differ along a set of features. Then we employ the linguistic notion of presupposition to describe the ontology requirements implied by competency questions, and show that these requirements can be tested automatically. |
Evaluating Hive and Spark SQL with BigBench |
Tempo Induction Using Filterbank Analysis and Tonal Features | This paper presents an algorithm that extracts the tempo of a musical excerpt. The proposed system assumes a constant tempo and deals directly with the audio signal. A sliding window is applied to the signal and two feature classes are extracted. The first class is the log-energy of each band of a mel-scale triangular filterbank, a common feature vector used in various MIR applications. For the second class, a novel feature for the tempo induction task is presented; the strengths of the twelve western musical tones at all octaves are calculated for each audio frame, in a similar fashion to the Pitch Class Profile. The time-evolving feature vectors are convolved with a bank of resonators, each resonator corresponding to a target tempo. Then the results of each feature class are combined to give the final output. The algorithm was evaluated on the popular ISMIR 2004 Tempo Induction Evaluation Exchange Dataset. Results demonstrate that the superposition of the different types of features enhances the performance of the algorithm, which places it among the current state-of-the-art algorithms for the tempo induction task. |
Visualizing and Understanding Atari Agents | While deep reinforcement learning (deep RL) agents are effective at maximizing rewards, it is often unclear what strategies they use to do so. In this paper, we take a step toward explaining deep RL agents through a case study using Atari 2600 environments. In particular, we focus on using saliency maps to understand how an agent learns and executes a policy. We introduce a method for generating useful saliency maps and use it to show 1) what strong agents attend to, 2) whether agents are making decisions for the right or wrong reasons, and 3) how agents evolve during learning. We also test our method on non-expert human subjects and find that it improves their ability to reason about these agents. Overall, our results show that saliency information can provide significant insight into an RL agent’s decisions and learning behavior. |
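A hedged sketch of perturbation-based saliency for an RL policy is shown below: blur a local region of the input frame and score how much the policy output changes. The `policy` callable and the frame size are assumptions, and the masking details differ from the paper's exact perturbation.

    # Perturbation saliency for a policy mapping a frame to action logits (assumed callable).
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def saliency_map(frame, policy, stride=5, sigma=3.0):
        base = policy(frame)
        blurred = gaussian_filter(frame, sigma=sigma)
        sal = np.zeros(frame.shape)
        for i in range(0, frame.shape[0], stride):
            for j in range(0, frame.shape[1], stride):
                mask = np.zeros(frame.shape)
                mask[max(i - stride, 0): i + stride, max(j - stride, 0): j + stride] = 1.0
                perturbed = frame * (1 - mask) + blurred * mask   # locally blur the frame
                sal[i:i + stride, j:j + stride] = 0.5 * np.sum((policy(perturbed) - base) ** 2)
        return sal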
Postmarketing Drug Safety Surveillance Using Publicly Available Health-Consumer-Contributed Content in Social Media | Postmarketing drug safety surveillance is important because many potential adverse drug reactions cannot be identified in the premarketing review process. It is reported that about 5% of hospital admissions are attributed to adverse drug reactions and many deaths eventually result, which is a serious public health concern. Currently, drug safety detection relies heavily on voluntary reporting systems, electronic health records, or relevant databases. There is often a time delay before the reports are filed and only a small portion of adverse drug reactions experienced by health consumers are reported. Given the popularity of social media, many health social media sites are now available for health consumers to discuss any health-related issues, including adverse drug reactions they encounter. There is a large volume of health-consumer-contributed content available, but little effort has been made to harness this information for postmarketing drug safety surveillance to supplement the traditional approach. In this work, we propose the association rule mining approach to identify the association between a drug and an adverse drug reaction. We use the alerts posted by the Food and Drug Administration as the gold standard to evaluate the effectiveness of our approach. The result shows that the performance of harnessing health-related social media content to detect adverse drug reactions is good and promising. |
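The association-rule idea can be illustrated with a small Python sketch: each post is reduced to the set of drug and reaction terms it mentions (term extraction itself is out of scope here), and support, confidence, and lift score each candidate drug-to-reaction rule. The posts, vocabularies, and thresholds are placeholders.

    # Toy support/confidence/lift scoring of (drug -> reaction) candidate rules.
    from itertools import product

    posts = [                                  # placeholder "extracted term" sets
        {"drugA", "nausea"}, {"drugA", "nausea", "headache"},
        {"drugA"}, {"drugB", "headache"}, {"drugB"},
    ]
    drugs, reactions = {"drugA", "drugB"}, {"nausea", "headache"}
    n = len(posts)

    def count(*terms):
        return sum(1 for p in posts if set(terms) <= p)

    rules = []
    for d, r in product(drugs, reactions):
        supp = count(d, r) / n
        conf = count(d, r) / count(d) if count(d) else 0.0
        lift = conf / (count(r) / n) if count(r) else 0.0
        if supp >= 0.2 and conf >= 0.5:
            rules.append((d, r, supp, conf, lift))
    print(rules)   # e.g. drugA -> nausea with support 0.4 and confidence ~0.67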
Iron therapy in the pediatric hemodialysis population | Iron therapy maintains iron stores and optimizes the response to recombinant human erythropoietin (r-HuEPO) in patients with end-stage renal failure. Information is limited, however, regarding the preferential route of iron administration in pediatric patients receiving hemodialysis. Therefore, we prospectively randomized 35 iron-replete patients (aged >1 to <20 years) to receive up to 16 weeks of maintenance IV (n=17) or daily oral (n=18) iron. Eligible patients had received hemodialysis for >2 months, had a baseline transferrin saturation [TSAT] >20%, and were receiving maintenance r-HuEPO. Treatment arms were evenly distributed with respect to baseline demographic and clinical characteristics, with no statistically significant differences in baseline hemoglobin (Hb), hematocrit (Hct), reticulocyte Hb content (CHr), serum ferritin (SF), TSAT, or r-HuEPO dose. In the 35 patients, IV iron dextran and not oral iron was associated with a significant increase (138.5 to 259.1 ng/ml, P=0.003) in SF. A comparison of the change in SF between the IV iron group and the oral iron group was also significant (P=0.001). Whereas only IV iron was associated with a significant decrease in the dose of r-HuEPO (234.0 to 157.6 U/kg per week, P=0.046) and an increase of the CHr (29.2 to 30.1 pg, P=0.049), these changes were not significantly different from those experienced by patients in the oral iron group. In both groups, the Hct remained stable and in neither group was there a significant change in the TSAT. In summary, although both oral and IV iron maintained patients in an iron-replete state in this short-term study, only IV therapy allowed for a significant improvement in iron stores. |
Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression | We introduce a novel variational method that allows us to approximately integrate out kernel hyperparameters, such as length-scales, in Gaussian process regression. This approach consists of a novel variant of the variational framework that has been recently developed for the Gaussian process latent variable model, which additionally makes use of a standardised representation of the Gaussian process. We consider this technique for learning Mahalanobis distance metrics in a Gaussian process regression setting and provide experimental evaluations and comparisons with existing methods on datasets with high-dimensional inputs. |
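To fix ideas, the sketch below shows only the modelling object: a squared-exponential kernel with a Mahalanobis metric, k(x, x') = exp(-0.5 * ||W(x - x')||^2), plugged into standard GP regression formulas. The variational treatment of the hyperparameters described in the abstract is not reproduced, and W is a fixed illustrative matrix rather than a learned one.

    # Mahalanobis RBF kernel and a plain GP posterior mean; illustrative only.
    import numpy as np

    def mahalanobis_rbf(X1, X2, W):
        P1, P2 = X1 @ W.T, X2 @ W.T                       # project into the metric space
        d2 = ((P1[:, None, :] - P2[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2)

    def gp_posterior_mean(Xtrain, ytrain, Xtest, W, noise=1e-2):
        K = mahalanobis_rbf(Xtrain, Xtrain, W) + noise * np.eye(len(Xtrain))
        Ks = mahalanobis_rbf(Xtest, Xtrain, W)
        return Ks @ np.linalg.solve(K, ytrain)

    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 10))                         # high-dimensional inputs
    W = rng.normal(size=(2, 10)) * 0.3                    # rank-2 Mahalanobis map (fixed here)
    y = np.sin(X @ W.T).sum(axis=1) + 0.1 * rng.normal(size=50)
    print(gp_posterior_mean(X, y, X[:3], W))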
Non-volcanic deep low-frequency tremors accompanying slow slips in the southwest Japan subduction zone | Non-volcanic deep low-frequency tremors in southwest Japan exhibit a strong temporal and spatial correlation with slow slip detected by the dense seismic network. The tremor signal is characterized by a low-frequency vibration with a predominant frequency of 0.5–5 Hz without distinct P- or S-wave onset. The tremors are located using the coherent pattern of envelopes over many stations, and are estimated to occur near the transition zone on the plate boundary on the forearc side along the strike of the descending Philippine Sea plate. The belt-like distribution of tremors consists of many clusters. In western Shikoku, the major tremor activity has a recurrence interval of approximately six months, with each episode lasting over a week. The tremor source area migrates during each episode along the strike of the subducting plate with a migration velocity of about 10 km/day. Slow slip events occur contemporaneously with this tremor activity, with a coincident estimated source area that also migrates during each episode. The coupling of tremor and slow slip in western Shikoku is very similar to the episodic tremor and slip phenomenon reported for the Cascadia margin in northwest North America. The duration and recurrence interval of these episodes varies between tremor clusters even on the same subduction zone, attributable to regional differences in the frictional properties of the plate interface. |
Non-invasive determination of cardiac output: comparison of a novel CW Doppler ultrasonic technique and inert gas rebreathing. | BACKGROUND
Cardiac Output (CO) is an important parameter in the diagnosis and therapy of heart diseases. Inert gas rebreathing (IGR) and continuous wave Doppler ultrasound (CWD) are among the most promising newer techniques aiming at a non-invasive, point of care measurement of CO. A direct comparison of the two methods has not yet been carried out.
METHODS
63 consecutive patients were included in the study. CO was measured twice with both CWD and IGR to assess inter-method agreement and reproducibility. The statistical comparisons were performed as proposed by Bland and Altman.
RESULTS
There was a significant correlation between the CO measurements by both methods (r=0.53, p<0.001). Bland-Altman analysis showed a good agreement of measurements with a bias of 0.4+/-1.0 l/min (mean+/-standard deviation). Both methods showed a good reproducibility. CWD measurements were not possible in 14% of patients while IGR measurements were not possible in 5% of patients (p=0.13).
CONCLUSION
The determination of CO by IGR and CWD revealed a good agreement and reproducibility with a low rate of impossible measurements, suggesting that IGR and CWD can be used interchangeably in the clinical setting. |
A Wideband Frequency-Tunable Optoelectronic Oscillator Based on a Narrowband Phase-Shifted FBG and Wavelength Tuning of Laser | A wideband frequency-tunable optoelectronic oscillator (OEO) based on a narrowband phase-shifted fiber Bragg grating (PSFBG) and wavelength tuning of the laser is proposed and demonstrated. With assistance of an optical single-sideband modulator, the PSFBG with a narrow transmission peak acts like an equivalent high-Q microwave filter to select the oscillation frequency. Frequency tunability of the OEO can be simply realized by tuning the wavelength of a laser source. An X-band optoelectronic oscillating signal with frequency tunable at wide range from 8.4 to 11.8 GHz is generated and its performance is investigated in the experiment. |
Effect of psychopathy, abuse, and ethnicity on juvenile probation officers' decision-making and supervision strategies. | Probation officers exercise substantial discretion in their daily work with troubled and troubling juvenile offenders. In this experiment, we examine the effect of psychopathic features, child abuse, and ethnicity on 204 officers' expectancies of, recommendations for, and approach to supervising, juvenile offenders. The results indicate that officers (a) have decision-making and supervision approaches that are affected by a youth's psychopathic traits and history of child abuse, but not ethnicity; (b) view both abused youth and psychopathic youth as highly challenging cases on a path toward adult criminality; and (c) have greater hope and sympathy for abused youth than psychopathic youth. For abused youth, officers are likely to recommend psychological services and "go the extra mile" by providing greater support, referrals, and networking than is typical for their caseload. For psychopathic youth, officers expect poor treatment outcomes and are "extra strict," enforcing rules that typically are not enforced for others on their caseload. |
Gratifications and social network service usage: The mediating role of online experience | This paper incorporates dual theories from communication research (uses and gratifications) and psychology research (online flow) to examine consumer behavior in using social network services. Particularly, the study proposes that consumer online experience of interaction and arousal serves as the mediator between social motivations and use behaviors. The empirical results indicate that arousal fully mediates the relationship between social gratifications and problematic social network service use. Furthermore, both interaction and arousal are partial mediators between social gratifications and the intention to revisit social networking websites. |
Surgery cancelling at a teaching hospital: implications for cost management. | This study discusses the problem of surgery cancellation from an economic-financial perspective. It was carried out in the Surgical Center Unit of a teaching hospital with the objective of identifying and analyzing the direct costs (human resources, medications and materials) and the opportunity costs that result from the cancellation of elective surgeries. Data were collected during three consecutive months through institutional documents and a form elaborated by the researchers. Only 58 (23.3%) of the 249 cancelled scheduled surgeries represented costs for the institution. The total direct cost of the cancellations was R$ 1,713.66 (average cost per patient R$ 29.54), distributed as follows: expenses with consumption materials R$ 333.05; sterilization process R$ 201.22; medications R$ 149.77; and human resources R$ 1,029.62. The human resources costs represented the greatest percentage of the total cost (60.40%). It was observed that most of the cancellations could have been partially avoided. Management planning, redesigning work processes, training the staff and performing early clinical evaluation can be strategies to minimize this occurrence. |
Structure determination of ions formed by the decomposition of metastable ions. C8H9+ ions from p-bromoethylbenzene | Collisional activation spectra are useful for structure determination of ions produced by metastable decomposition, showing that only half of the C8H9+ ions from p-bromoethylbenzene have the methyltropylium structure, contrary to previous conclusions. |
Hyperlink Analysis: Techniques and Applications |
A survey of technical trend of ADAS and autonomous driving | For the past 10 years, Advanced Driving Assistance Systems (ADAS) have grown rapidly. Recently, not only luxury cars but also some entry-level cars have been equipped with ADAS applications, such as the Automated Emergency Braking System (AEBS). The European New Car Assessment Programme (EuroNCAP) announced the introduction of an AEBS test from 2014, which will accelerate the penetration of ADAS in Europe. The DARPA challenges, which started in 2004, also accelerated research on autonomous driving. Several OEMs and universities have demonstrated autonomous driving cars. This paper gives a brief survey of the technical trends of ADAS and autonomous driving, focusing on algorithms actually used in autonomous driving prototype cars. |
Analysis of Drain-Induced Barrier Rising in Short-Channel Negative-Capacitance FETs and Its Applications | We investigate the performance of hysteresis-free short-channel negative-capacitance FETs (NCFETs) by combining quantum-mechanical calculations with the Landau–Khalatnikov equation. When the subthreshold swing (SS) becomes smaller than 60 mV/dec, a negative value of drain-induced barrier lowering is obtained. This behavior, drain-induced barrier rising (DIBR), causes negative differential resistance in the output characteristics of the NCFETs. We also examine the performance of an inverter composed of hysteresis-free NCFETs to assess the effects of DIBR at the circuit level. Contrary to our expectation, although hysteresis-free NCFETs are used, hysteresis behavior is observed in the transfer properties of the inverter. Furthermore, it is expected that the NCFET inverter with hysteresis behavior can be used as a Schmitt trigger inverter. |
A Distributed Data-Gathering Protocol Using AUV in Underwater Sensor Networks | In this paper, we propose a distributed data-gathering scheme using an autonomous underwater vehicle (AUV) working as a mobile sink to gather data from a randomly distributed underwater sensor network where sensor nodes are clustered around several cluster headers. Unlike conventional data-gathering schemes where the AUV visits either every node or every cluster header, the proposed scheme allows the AUV to visit some selected nodes named path-nodes in a way that reduces the overall transmission power of the sensor nodes. Monte Carlo simulations are performed to investigate the performance of the proposed scheme compared with several preexisting techniques employing the AUV in terms of total amount of energy consumption, standard deviation of each node's energy consumption, latency to gather data at a sink, and controlling overhead. Simulation results show that the proposed scheme not only reduces the total energy consumption but also distributes the energy consumption more uniformly over the network, thereby increasing the lifetime of the network. |
The female rat reproductive cycle: a practical histological guide to staging. | During preclinical investigations into the safety of drugs and chemicals, many are found to interfere with reproductive function in the female rat. This interference is commonly expressed as a change in normal morphology of the reproductive tract or a disturbance in the duration of particular phases of the estrous cycle. Such alterations can be recognized only if the pathologist has knowledge of the continuously changing histological appearance of the various components of the reproductive tract during the cycle and can accurately and consistently ascribe an individual tract to a particular phase of the cycle. Unfortunately, although comprehensive reports illustrating the normal appearance of the tract during the rat estrous cycle have been available over many years, they are generally somewhat ambiguous about distinct criteria for defining the end of one stage and the beginning of another. This detail is absolutely essential to achieve a consistent approach to staging the cycle. For the toxicologic pathologist, this report illustrates a pragmatic and practical approach to staging the estrous cycle in the rat based on personal experience and a review of the literature from the last century. |
Sentiment analysis for Arabic language: A brief survey of approaches and techniques | With the emergence of Web 2.0 technology and the expansion of on-line social networks, current Internet users have the ability to add their reviews, ratings and opinions on social media and on commercial and news web sites. Sentiment analysis aims to classify these reviews in an automatic way. In the literature, there are numerous approaches proposed for automatic sentiment analysis in different language contexts. Each language has its own properties that make sentiment analysis more challenging. In this regard, this work presents a comprehensive survey of existing Arabic sentiment analysis studies, and covers the various approaches and techniques proposed in the literature. Moreover, we highlight the main difficulties and challenges of Arabic sentiment analysis, and the techniques proposed in the literature to overcome these barriers. |
A 3D printed soft gripper integrated with curvature sensor for studying soft grasping | The grasping dynamics between a soft gripper and a deformable object have not been investigated so far. To this end, a 3D printed soft robot gripper with modular design was proposed in this paper. The gripper consists of a rigid base and three modular soft fingers. A snap-lock mechanism was designed on each finger for easy attach-detach to the base. All components were 3D printed using the Objet260 Connex system. The soft finger is air-driven and the idea is based on the principle of the fluidic elastomer actuator. A curvature sensor was integrated inside each finger to measure the curvature during grasping. The fingers integrated with sensors were calibrated under different pneumatic pressure inputs. The relationships between pressure loading and bending angle, and between bending angle and sensor output (voltage), were derived. Experiments with the gripper grasping and lifting a paper container filled with peanuts were conducted and the results were presented and discussed. |
Progress in natural language understanding: an application to lunar geology | The advent of computer networks such as the ARPA net (see e.g., Ornstein et al.) has significantly increased the opportunity for access by a single researcher to a variety of different computer facilities and data bases, thus raising expectations of a day when it will be a common occurrence rather than an exception that a scientist will casually undertake to use a computer facility located 3000 miles away and whose languages, formats, and conventions are unknown to him. In this foreseeable future, learning and remembering the number of different languages and conventions that such a scientist would have to know will require significant effort---much greater than that now required to learn the conventions of his local computing center (where other users and knowledgeable assistance is readily available). The Lunar Sciences Natural Language Information System (which we will hereafter refer to as LUNAR) is a research prototype of a system to deal with this and other man-machine communication problems by adapting the machine to the conventions of ordinary natural English rather than requiring the man to adapt to the machine. |
ProM 6: The Process Mining Toolkit | Process mining has been around for a decade, and it has proven to be a very fertile and successful research field. Part of this success can be contributed to the ProM tool, which combines most of the existing process mining techniques as plug-ins in a single tool. ProM 6 removes many limitations that existed in the previous versions, in particular with respect to the tight integration between the tool and the |
Meta-cognitive extreme learning machine for regression problems | In this paper, we present an efficient fast learning algorithm for regression problems using the meta-cognitive extreme learning machine (McELM). The proposed algorithm has two components, namely the cognitive component and the meta-cognitive component. The cognitive component is an extreme learning machine (ELM), while the meta-cognitive component, which controls the cognitive component, employs a self-regulating learning mechanism to decide what to learn, when to learn and how to learn. The meta-cognitive component chooses a suitable learning strategy for each presented sample, namely delete the sample, reserve the sample, or update the network. The use of ELM improves the network speed and reduces computational cost. Unlike traditional ELM, the number of hidden layers is not fixed a priori in McELM; instead, the network is built during the learning phase. This algorithm is evaluated on a set of benchmark regression and approximation problems and also on a real-world wind force and moment coefficient prediction problem. Performance results in this study highlight that McELM can achieve better results compared with conventional ELM and support vector regression (SVR). |
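A hedged sketch of the building blocks follows: a basic ELM regressor (random hidden layer, least-squares output weights) together with a crude self-regulating rule in the spirit of the meta-cognitive component (skip samples the current model already predicts well, learn from the rest). The thresholds, hidden-layer size, and the sequential network growth of McELM are simplified away.

    # Basic ELM plus a toy sample-selection rule; not the McELM algorithm itself.
    import numpy as np

    class ELMRegressor:
        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

        def _h(self, X):
            return np.tanh(X @ self.W_in + self.b)

        def fit(self, X, y):
            self.W_in = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random, untrained
            self.b = self.rng.normal(size=self.n_hidden)
            self.beta = np.linalg.pinv(self._h(X)) @ y                     # Moore-Penrose solution
            return self

        def predict(self, X):
            return self._h(X) @ self.beta

    def self_regulated_subset(model, X, y, delete_tol=0.05):
        """Keep only samples whose current prediction error exceeds delete_tol."""
        err = np.abs(model.predict(X) - y)
        return X[err > delete_tol], y[err > delete_tol]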
A Multi-Representational and Multi-Layered Treebank for Hindi/Urdu | This paper describes the simultaneous development of dependency structure and phrase structure treebanks for Hindi and Urdu, as well as a PropBank. The dependency structure and the PropBank are manually annotated, and then the phrase structure treebank is produced automatically. To ensure successful conversion, the development of the guidelines for all three representations is carefully coordinated. |
Determination of band gap in polycrystalline Si/Ge thin film multilayers | Valence band (VB) photoemission, supported by ultraviolet–visible–near-infrared spectroscopy techniques, was used to determine the band gap values of polycrystalline Si and Ge single layers as well as of Si/Ge multilayer structures. The band gap values obtained from VB photoemission measurements for these structures were found to be much larger than those of the corresponding bulk materials and to match well with those determined from standard optical absorption measurements. In each case, the VB offset values were obtained by considering the corresponding VB maximum as a reference. The increase in band gap in the case of thin single layers of Si and Ge with respect to the bulk was interpreted in terms of the quantum confinement effect, while in the case of the multilayer sample, various factors such as (i) intermixing leading to the formation of a SiGe alloy, (ii) roughness at the interface, (iii) particle size, and (iv) strain seem to play an important role in the observed change in band gap. |
Saliency-based Sequential Image Attention with Multiset Prediction | Humans process visual scenes selectively and sequentially using attention. Central to models of human visual attention is the saliency map. We propose a hierarchical visual architecture that operates on a saliency map and uses a novel attention mechanism to sequentially focus on salient regions and take additional glimpses within those regions. The architecture is motivated by human visual attention, and is used for multi-label image classification on a novel multiset task, demonstrating that it achieves high precision and recall while localizing objects with its attention. Unlike conventional multi-label image classification models, the model supports multiset prediction due to a reinforcement-learning based training process that allows for arbitrary label permutation and multiple instances per label. |
What is a sports injury? | Current sports injury reporting systems lack a common conceptual basis. We propose a conceptual foundation as a basis for the recording of health problems associated with participation in sports, based on the notion of impairment used by the World Health Organization. We provide definitions of sports impairment concepts to represent the perspectives of health services, the participants in sports and physical exercise themselves, and sports institutions. For each perspective, the duration of the causative event is used as the norm for separating concepts into those denoting impairment conditions sustained instantly and those developing gradually over time. Regarding sports impairment sustained in isolated events, 'sports injury' denotes the loss of bodily function or structure that is the object of observations in clinical examinations; 'sports trauma' is defined as an immediate sensation of pain, discomfort or loss of functioning that is the object of athlete self-evaluations; and 'sports incapacity' is the sidelining of an athlete because of a health evaluation made by a legitimate sports authority that is the object of time loss observations. Correspondingly, sports impairment caused by excessive bouts of physical exercise is denoted as 'sports disease' (overuse syndrome) when observed by health service professionals during clinical examinations, 'sports illness' when observed by the athlete in self-evaluations, and 'sports sickness' when recorded as time loss from sports participation by a sports body representative. We propose a concerted development effort in this area that takes advantage of concurrent ontology management resources and involves the international sporting community in building terminology systems that have broad relevance. |
Depth Super-Resolution on RGB-D Video Sequences With Large Displacement 3D Motion | To enhance the resolution and accuracy of depth data, some video-based depth super-resolution methods have been proposed that utilize neighboring depth images in the temporal domain. They often consist of two main stages: motion compensation of temporally neighboring depth images and fusion of compensated depth images. However, large displacement 3D motion often leads to compensation error, and the compensation error is further introduced into the fusion. A video-based depth super-resolution method with novel motion compensation and fusion approaches is proposed in this paper. We claim that the 3D nearest neighboring field (NNF) is a better choice than using positions with true motion displacement for depth enhancements. To handle large displacement 3D motion, the compensation stage utilizes the 3D NNF instead of the true motion used in previous methods. Next, the fusion approach is modeled as a regression problem to predict the super-resolution result efficiently for each depth image by using its compensated depth images. A new deep convolutional neural network architecture is designed for fusion, which is able to employ a large amount of video data for learning the complicated regression function. We comprehensively evaluate our method on various RGB-D video sequences to show its superior performance. |
Wii-based compared to standard of care balance and mobility rehabilitation for two individuals post-stroke | Great interest and some hype have accompanied the introduction of Nintendo Wii-based rehabilitation. The purpose of these cases is to describe a Wii-based balance and mobility program and compare it to a standard of care balance and mobility program for two individuals in the chronic phase post-stroke. Both individuals with left cerebrovascular accidents received four weeks (12 one hour sessions) of either a Nintendo Wii and Wii Fit program or standard of care balance and mobility program. Gait speed, walking endurance (six minute walk test), balance (Dynamic Gait Index) balance confidence (Activity Balance Confidence Questionnaire) and dual tasks mobility tests (Timed-Up and Go) were measured prior to training, upon training completion and at three months post-training. Both individuals demonstrated improvements in most outcomes measured. The percent increases were generally greater for the person in the Wii-based program. Retention of improvements, however, was greater for the individual who received the standard of care. Enthusiasm for new therapies needs to be tempered with evidence of efficacy with particular attention to retention of gains. |
The Epidemiology of Newly Diagnosed Chronic Liver Disease in Gastroenterology Practices in the United States: Results From Population-Based Surveillance | OBJECTIVES: Chronic liver disease (CLD) is an important cause of morbidity and mortality, but the epidemiology is not well described. We conducted prospective population-based surveillance to estimate newly diagnosed CLD incidence, characterize etiology distribution, and determine disease stage.METHODS: We identified cases of CLD newly diagnosed during 1999–2001 among adult county residents seen in any gastroenterology practice in New Haven County, Connecticut; Multnomah County, Oregon; and Northern California Kaiser Permanente Medical Care Program (KPMCP, Oakland, California [total population 1.48 million]). We defined CLD as abnormal liver tests of at least 6 months' duration or pathologic, clinical, or radiologic evidence of CLD. Consenting patients were interviewed, a blood specimen obtained, and the medical record reviewed.RESULTS: We identified 2,353 patients with newly diagnosed CLD (63.9 cases/100,000 population), including 1,225 hepatitis C patients (33.2 cases/100,000). Men aged 45–54 yr had the highest hepatitis C incidence rate (111.3/100,000). Among 1,040 enrolled patients, the median age was 48 yr (range 19–86 yr). Hepatitis C, either alone (442 [42%]) or in combination with alcohol-related liver disease (ALD) (228 [22%]), accounted for two-thirds of the cases. Other etiologies included nonalcoholic fatty liver disease (NAFLD, 95 [9%]), ALD (82 [8%]), and hepatitis B (36 [3%]). Other identified etiologies each accounted for <3% of the cases. A total of 184 patients (18%) presented with cirrhosis, including 44% of patients with ALD.CONCLUSIONS: Extrapolating from this population-based surveillance network to the adult U.S. population, approximately 150,000 patients with CLD were diagnosed in gastroenterology practices each year during 1999–2001. Most patients had hepatitis C; heavy alcohol consumption among these patients was common. Almost 20% of patients, an estimated 30,000 per year, had cirrhosis at presentation. These results provide population-level baseline data to evaluate trends in identification of patients with CLD in gastroenterology practices. |
A 12-bit, 45-MS/s, 3-mW Redundant Successive-Approximation-Register Analog-to-Digital Converter With Digital Calibration | This paper presents a sub-radix-2 redundant architecture to improve the performance of switched-capacitor successive-approximation-register (SAR) analog-to-digital converters (ADCs). The redundancy not only guarantees digitally correctable static nonlinearities of the converter, it also offers means to combat dynamic errors in the conversion process, and thus, accelerating the speed of the SAR architecture. A perturbation-based digital calibration technique is also described that closely couples with the architecture choice to accomplish simultaneous identification of multiple capacitor mismatch errors of the ADC, enabling the downsizing of all sampling capacitors to save power and silicon area. A 12-bit prototype measured a Nyquist 70.1-dB signal-to-noise-plus-distortion ratio (SNDR) and a Nyquist 90.3-dB spurious free dynamic range (SFDR) at 22.5 MS/s, while dissipating 3.0-mW power from a 1.2-V supply and occupying 0.06-mm2 silicon area in a 0.13-μm CMOS process. The figure of merit (FoM) of this ADC is 51.3 fJ/step measured at 22.5 MS/s and 36.7 fJ/step at 45 MS/s. |
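The role of sub-radix-2 redundancy can be illustrated with a behavioral Python sketch: because the weights shrink by a radix below 2, the weights still to come always sum to more than the current one, so a comparator decision corrupted within that margin (for instance by incomplete DAC settling) can be walked back by later steps and corrected purely in the digital reconstruction. The radix, resolution, and injected error below are illustrative numbers, not this chip's parameters.

    # Behavioral comparison of a binary and a sub-radix-2 SAR search under one faulty decision.
    import numpy as np

    def sar_search(vin, weights, err_step=None, err=0.0):
        """Bipolar successive-approximation search; the final DAC level is the digital estimate."""
        dac = 0.0
        for i, w in enumerate(weights):
            e = err if i == err_step else 0.0      # comparator/settling error at one step
            dac += w if vin > dac + e else -w
        return dac

    nsteps = 14
    ladders = {
        "binary (radix 2)": [2.0 ** (nsteps - 1 - k) for k in range(nsteps)],
        "redundant (radix 1.86)": [1.86 ** (nsteps - 1 - k) for k in range(nsteps)],
    }
    rng = np.random.default_rng(0)
    vins = rng.uniform(-6000, 6000, size=5000)
    for name, w in ladders.items():
        worst = max(abs(sar_search(v, w, err_step=3, err=50.0) - v) for v in vins)
        print(name, "worst-case conversion error:", round(worst, 2))
    # The redundant ladder keeps the worst-case error near one (sub-)LSB, while the binary
    # ladder's error grows to roughly the size of the injected disturbance whenever the
    # wrong decision actually occurs.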
Re: CAPTCHAs-Understanding CAPTCHA-Solving Services in an Economic Context | Reverse Turing tests, or CAPTCHAs, have become a ubiquitous defense used to protect open Web resources from being exploited at scale. An effective CAPTCHA resists existing mechanistic software solving, yet can be solved with high probability by a human being. In response, a robust solving ecosystem has emerged, reselling both automated solving technology and real-time human labor to bypass these protections. Thus, CAPTCHAs can increasingly be understood and evaluated in purely economic terms: the market price of a solution versus the monetizable value of the asset being protected. We examine the market side of this question in depth, analyzing the behavior and dynamics of CAPTCHA-solving service providers, their price performance, and the underlying labor markets driving this economy. |
Towards the creation of decellularized organ constructs using irreversible electroporation and active mechanical perfusion | BACKGROUND
Despite advances in transplant surgery and general medicine, the number of patients awaiting transplant organs continues to grow, while the supply of organs does not. This work outlines a method of organ decellularization using non-thermal irreversible electroporation (N-TIRE), which, in combination with reseeding, may help supplement the supply of organs for transplant.
METHODS
In our study, brief but intense electric pulses were applied to porcine livers under active low-temperature cardio-emulation perfusion. Histological analysis and lesion measurements were used to determine the effects of the pulses in decellularizing the livers, as a first step towards the development of extracellular scaffolds that may be used with stem cell reseeding. A dynamic conductivity numerical model was developed to simulate the treatment parameters used and to determine an irreversible electroporation threshold.
RESULTS
Ninety-nine individual 1000-V/cm, 100-μs square pulses delivered at repetition rates between 0.25 and 4 Hz were found to produce a lesion within 24 hours post-treatment. The livers maintained intact bile ducts and vascular structures while demonstrating hepatocytic cord disruption and cell delamination from cord basal laminae after 24 hours of perfusion. A numerical model found an electric field threshold of 423 V/cm under the specific experimental conditions used, which may be applied in the future to plan treatments for the decellularization of entire organs. Analysis of the pulse repetition rate shows that the largest treated area and the lowest interstitial density score were achieved at a pulse frequency of 1 Hz. After 24 hours of perfusion, a maximum density score reduction of 58.5 percent had been achieved.
CONCLUSIONS
This method represents a first effort toward creating N-TIRE-decellularized tissue scaffolds that could be used for organ transplantation. In addition, it provides a versatile platform to study the effects of pulse parameters such as pulse length, repetition rate, and field strength on whole-organ structures. |
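The abstract does not specify the form of the dynamic conductivity model, so the following is only an illustrative sketch of the kind of field-dependent conductivity relation and thresholding step commonly used in irreversible electroporation simulations. Every parameter value except the 423 V/cm threshold reported above is a hypothetical placeholder, and the function names are invented for this sketch.

```python
import numpy as np

# Illustrative sketch only: a sigmoidal dynamic-conductivity relation of the kind
# commonly used in irreversible electroporation (IRE) models. Parameter values
# are hypothetical placeholders, not those of the cited study.

SIGMA_0 = 0.10      # baseline tissue conductivity, S/m (assumed)
SIGMA_MAX = 0.40    # fully electroporated conductivity, S/m (assumed)
E_MID = 500.0       # field at the sigmoid midpoint, V/cm (assumed)
E_WIDTH = 100.0     # transition width of the sigmoid, V/cm (assumed)
E_IRE = 423.0       # IRE threshold reported in the abstract, V/cm

def dynamic_conductivity(e_field_v_per_cm: np.ndarray) -> np.ndarray:
    """Field-dependent conductivity rising smoothly from SIGMA_0 to SIGMA_MAX."""
    s = 1.0 / (1.0 + np.exp(-(e_field_v_per_cm - E_MID) / E_WIDTH))
    return SIGMA_0 + (SIGMA_MAX - SIGMA_0) * s

def treated_fraction(e_field_v_per_cm: np.ndarray) -> float:
    """Fraction of the sampled field values at or above the IRE threshold."""
    return float(np.mean(e_field_v_per_cm >= E_IRE))

if __name__ == "__main__":
    # Toy 1-D field distribution standing in for a solved field problem.
    field = np.linspace(0.0, 1500.0, 301)  # V/cm
    print(dynamic_conductivity(np.array([200.0, 423.0, 1000.0])))
    print(f"treated fraction: {treated_fraction(field):.2f}")
```

In a full treatment-planning model, the electric field would come from repeatedly re-solving the field equations with this conductivity until the field and conductivity maps converge; the sketch shows only the constitutive relation and the threshold test.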
Prevention of Generalized Anxiety Disorder Using a Web Intervention, iChill: Randomized Controlled Trial | BACKGROUND
Generalized Anxiety Disorder (GAD) is a high-prevalence, chronic disorder. Web-based interventions are acceptable, engaging, and can be delivered at scale. Few randomized controlled trials have evaluated the effectiveness of prevention programs for anxiety or the factors that improve their effectiveness and engagement.
OBJECTIVE
The intent of the study was to evaluate the effectiveness of a Web-based program in preventing GAD symptoms in young adults, and to determine the role of telephone and email reminders.
METHODS
A 5-arm randomized controlled trial with 558 Internet users in the community, recruited via the Australian Electoral Roll, was conducted with 6- and 12-month follow-up. Five interventions were offered over a 10-week period. Group 1 (Active website) received a combined intervention of psychoeducation, Internet-delivered Cognitive Behavioral Therapy (ICBT) for anxiety, physical activity promotion, and relaxation. Group 2 (Active website with telephone) received the identical Web program plus weekly telephone reminder calls. Group 3 (Active website with email) received the identical Web program plus weekly email reminders. Group 4 (Control) received a placebo website. Group 5 (Control with telephone) received the placebo website plus telephone calls. The main outcome measure was severity of anxiety symptoms as measured by the 7-item GAD scale (GAD-7) at post-test and at 6 and 12 months. Secondary measures were GAD caseness, measured by the Mini International Neuropsychiatric Interview (MINI) at 6 months, the Centre for Epidemiologic Studies Depression scale (CES-D), the Anxiety Sensitivity Index (ASI), the Penn State Worry Questionnaire (PSWQ), and Days out of Role.
RESULTS
GAD-7 symptoms decreased over the post-test, 6-month, and 12-month follow-ups. There were no significant differences between Group 4 (Control) and Groups 1 (Active website), 2 (Active website with telephone), 3 (Active website with email), or 5 (Control with telephone) at any follow-up. A total of 16 cases of GAD were identified at 6 months, comprising 6.7% (11/165) from the Active groups (1, 2, 3) and 4.5% (5/110) from the Control groups (4, 5), a difference that was not significant. CES-D, ASI, and PSWQ scores were significantly lower for the active website with email reminders at post-test, relative to the control website condition.
CONCLUSIONS
Indicated prevention of GAD was not effective in reducing anxiety levels as measured by the GAD-7. There were, however, significant secondary effects on anxiety sensitivity, worry, and depression. Challenges for indicated prevention trials are discussed.
TRIAL REGISTRATION
International Standard Randomized Controlled Trial Number (ISRCTN): 76298775; http://www.controlled-trials.com/ISRCTN76298775 (Archived by WebCite at http://www.webcitation.org/6S9aB5MAq). |
A Market in Your Social Network: The Effects of Extrinsic Rewards on Friendsourcing and Relationships | Friendsourcing consists of broadcasting questions and help requests to friends on social networking sites. Despite its potential value, friendsourcing requests often fall on deaf ears. One way to improve response rates and to motivate friends to undertake more effortful tasks may be to offer extrinsic rewards, such as money or a gift, for responding to friendsourcing requests. However, past research suggests that such extrinsic rewards can have unintended consequences, including undermining intrinsic motivation and undercutting the relationship between people. To explore the effects of extrinsic rewards on friends' response rates and perceived relationships, we conducted an experiment on a new friendsourcing platform, Mobilyzr. Results indicate that large extrinsic rewards increase friends' response rates without reducing the relationship strength between friends. Additionally, extrinsic rewards allow requesters to explain away the failure of friendsourcing requests and thus preserve their perceptions of relationship ties with friends. |
Relative effectiveness of methods of breast self-examination | This study investigated the effectiveness of different methods of breast self-examination (BSE) on coverage of breast area and lump detection, using a factorial design that paired three search patterns (concentric circle, radial spoke, vertical strip) with two finger palpation techniques (small circular movements, sliding movements). Ninety-seven female undergraduates were randomly assigned to one of six BSE training conditions, which were identical except for the search pattern and finger palpation technique explained by the instructor. Following the 20-minute, small-group training, subjects' coverage of the breast area was assessed by scoring their BSE performance on a breast board. Lump detection was determined by the number of lumps correctly identified in silicone breast models. Results indicated that the vertical strip pattern was associated with significantly greater coverage of the breast area. There were no significant differences in lump detection; however, the sliding finger palpation technique resulted in significantly more false identifications of lumps. |
Hybrid excited claw pole electric machine | The paper presents the concept of a claw pole generator with hybrid excitation, together with simulation and experimental results. Hybrid excitation is provided by a conventional coil located between the two parts of the claw-shaped rotor and by additional permanent magnets placed on the claw poles. First a simulation model and then an experimental model were developed on the basis of a mass-produced vehicle alternator. Experimental results have shown that, at a suitable rotational speed, the generator can self-excite without any additional source of electrical power. |