title | abstract
---|---
A Joint Human/Machine Process for Coding Events and Conflict Drivers | Constructing datasets to analyse the progression of conflicts has been a longstanding objective of peace and conflict studies research. In essence, the problem is to reliably extract relevant text snippets and code (annotate) them using an ontology that is meaningful to social scientists. Such an ontology usually characterizes types of violent events (killing, bombing, etc.), the underlying drivers of conflict, or both; the drivers are themselves hierarchically structured, for example into security, governance and economics, subdivided into conflict-specific indicators. Numerous coding approaches have been proposed in the social science literature, ranging from fully automated “machine” coding to human coding. Machine coding is highly error prone, especially for labelling complex drivers, and suffers from extraction of duplicated events; human coding is expensive and suffers from inconsistency between annotators; thus hybrid approaches are required. In this paper, we analyse experimentally how human input can most effectively be used in a hybrid system to complement machine coding. Using two newly created real-world datasets, we show that machine learning methods improve on rule-based automated coding for filtering large volumes of input, while human verification of relevant/irrelevant text leads to improved performance of machine learning for predicting multiple labels in the ontology. |
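The two-stage hybrid design described in this abstract, a machine relevance filter followed by multi-label prediction over the ontology, can be sketched roughly as follows. The snippets, driver labels, and scikit-learn models below are illustrative assumptions, not the paper's actual datasets or classifiers.

```python
# Hypothetical two-stage coding pipeline: (1) an ML filter separates
# conflict-relevant from irrelevant snippets; (2) a multi-label classifier
# codes human-verified relevant snippets with ontology drivers.
# All text and label names are toy examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

snippets = [
    "militants bombed a checkpoint near the border",
    "clashes over grazing land killed three herders",
    "the ministry announced a new road project",
    "parliament debated the annual budget",
]
relevant = [1, 1, 0, 0]  # stage 1 target: conflict-relevant or not

vec = TfidfVectorizer()
X = vec.fit_transform(snippets)
stage1 = LogisticRegression().fit(X, relevant)  # relevance filter

# Stage 2: multi-label driver coding on the human-verified relevant snippets.
drivers = [["security"], ["security", "economics"]]
mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(drivers)  # binary indicator matrix, one column per driver
X_rel = vec.transform(snippets[:2])
stage2 = OneVsRestClassifier(LogisticRegression()).fit(X_rel, Y)

preds = stage2.predict(vec.transform(["soldiers shelled the village market"]))
print(preds.shape)  # one row, one column per driver label
```

In practice the human verification step would sit between the two stages, correcting the filter's relevant/irrelevant decisions before the multi-label model is retrained.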
Dyadic Memory Networks for Aspect-based Sentiment Analysis | This paper proposes Dyadic Memory Networks (DyMemNN), a novel extension of end-to-end memory networks (memNN) for aspect-based sentiment analysis (ABSA). Originally designed for question answering tasks, memNN operates via a memory selection operation in which relevant memory pieces are adaptively selected based on the input query. In ABSA, the query is analogous to the aspect and the memory to the document: each word in the document is compared with the aspect vector. In standard memory networks, simple dot products or feedforward neural networks are used to model the relationship between aspect and words, which lack representation-learning capability. Our dyadic memory networks ameliorate this weakness by enabling rich dyadic interactions between aspect and word embeddings, integrating either parameterized neural tensor compositions or holographic compositions into the memory selection operation. To this end, we propose two variations of our dyadic memory networks, namely the Tensor DyMemNN and the Holo DyMemNN. Overall, our two models are end-to-end neural architectures that enable rich dyadic interaction between aspect and document, which intuitively leads to better performance. Via extensive experiments, we show that our proposed models achieve state-of-the-art performance and outperform many neural architectures across six benchmark datasets. |
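The memory-selection step that the dyadic variants enrich can be sketched numerically. The bilinear tensor scorer below is a stand-in for the paper's parameterized neural tensor composition; all dimensions and weights are illustrative, not trained parameters.

```python
# Sketch of memory selection over word embeddings given an aspect embedding.
# Baseline memNN uses a dot product; the dyadic variant replaces it with a
# richer bilinear-tensor interaction (an assumption standing in for the
# paper's neural tensor composition).
import numpy as np

rng = np.random.default_rng(0)
d, n_words, k = 4, 5, 3          # embedding dim, words in document, tensor slices
aspect = rng.normal(size=d)
words = rng.normal(size=(n_words, d))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Baseline: dot-product matching, no learned aspect-word interaction.
dot_attn = softmax(words @ aspect)

# Dyadic variant: bilinear tensor scores s_i = sum_k u_k * (w_i^T T_k a).
T = rng.normal(size=(k, d, d))
u = rng.normal(size=k)
tensor_scores = np.array([u @ np.array([w @ Tk @ aspect for Tk in T]) for w in words])
tensor_attn = softmax(tensor_scores)

memory_out = tensor_attn @ words   # attended memory summary fed downstream
print(memory_out.shape)
```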
The vitamin E isoforms α-tocopherol and γ-tocopherol have opposite associations with spirometric parameters: the CARDIA study | BACKGROUND
Clinical studies of the associations of vitamin E with lung function have reported conflicting results. However, these reports primarily examine the α-tocopherol isoform of vitamin E and have not included the isoform γ-tocopherol, which we recently demonstrated in vitro opposes the function of α-tocopherol. We previously demonstrated, in vitro and in animal studies, that the vitamin E isoform α-tocopherol protects against, whereas the isoform γ-tocopherol promotes, lung inflammation and airway hyperresponsiveness.
METHODS
To translate these findings to humans, we conducted an analysis of 4526 adults in the Coronary Artery Risk Development in Young Adults (CARDIA) multi-center cohort with available spirometry and tocopherol data in blacks and whites. Spirometry was obtained at years 0, 5, 10, and 20, and serum tocopherol was measured at years 0, 7, and 15 of CARDIA.
RESULTS
In cross-sectional regression analysis at year 0, higher γ-tocopherol was associated with lower FEV1 (p = 0.03 in blacks and p = 0.01 in all participants) and FVC (p = 0.01 in blacks, p = 0.05 in whites, and p = 0.005 in all participants), whereas higher α-tocopherol was associated with higher FVC (p = 0.04 in blacks and whites and p = 0.01 in all participants). In the lowest quartile of α-tocopherol, higher γ-tocopherol was associated with a lower FEV1 (p = 0.05 in blacks and p = 0.02 in all participants). In contrast, in the lowest quartile of γ-tocopherol, higher α-tocopherol was associated with a higher FEV1 (p = 0.03) in blacks. Serum γ-tocopherol >10 μM was associated with a 175-545 ml lower FEV1 and FVC at ages 21-55 years.
CONCLUSION
Increasing serum concentrations of γ-tocopherol were associated with lower FEV1 or FVC, whereas increasing serum concentrations of α-tocopherol were associated with higher FEV1 or FVC. Based on the prevalence of serum γ-tocopherol >10 μM in adults in CARDIA and the adult U.S. population in the 2011 census, we expect that the lower FEV1 and FVC at these concentrations of serum γ-tocopherol occur in up to 4.5 million adults in the population. |
Chronic boron exposure and human semen parameters. | Boron found as borates in soil, food, and water has important industrial and medical applications. A panel reviewing NTP reproductive toxicants identified boric acid as high priority for occupational studies to determine safe versus adverse reproductive effects. To address this, we collected boron exposure/dose measures in workplace inhalable dust, dietary food/fluids, blood, semen, and urine from boron workers and two comparison worker groups (n=192) over three months and determined correlations between boron and semen parameters (total sperm count, sperm concentration, motility, morphology, DNA breakage, apoptosis and aneuploidy). Blood boron averaged 499.2 ppb for boron workers, 96.1 and 47.9 ppb for workers from high and low environmental boron areas (p<0.0001). Boron concentrated in seminal fluid. No significant correlations were found between blood or urine boron and adverse semen parameters. Exposures did not reach those causing adverse effects published in animal toxicology work but exceeded those previously published for boron occupational groups. |
Restrictive dermopathy—a lethal congenital laminopathy. Case report and review of the literature | Restrictive dermopathy (RD) is a rare, fatal, and genetically heterogeneous laminopathy with a predominant autosomal recessive heredity pattern. The phenotype can be caused by mutations in either LMNA (primary laminopathy) or ZMPSTE24 (secondary laminopathy) genes but mostly by homozygous or compound heterozygous ZMPSTE24 mutations. Clinicopathologic findings are unique, allowing a specific diagnosis in most cases. We describe a premature newborn girl of non-consanguineous parents who presented a rigid, translucent and tightly adherent skin, dysmorphic facies, multiple joint contractures and radiological abnormalities. The overall clinical, radiological, histological, and ultrastructural features were typical of restrictive dermopathy. Molecular genetic analysis revealed a homozygous ZMPSTE24 mutation (c.1085_1086insT). Parents and sister were heterozygous asymptomatic carriers. We conclude that RD is a relatively easy and consistent clinical and pathological diagnosis. Despite recent advances in our understanding of RD, the pathogenetic mechanisms of the disease are not entirely clarified. Recognition of RD and molecular genetic diagnosis are important to define the prognosis of an affected child and for recommending genetic counseling to affected families. However, the outcome for a live born patient in the neonatal period is always fatal. |
Cognitive Strategies for the Visual Search of Hierarchical Computer Displays | This article investigates the cognitive strategies that people use to search computer displays. Several different visual layouts are examined: unlabeled layouts that contain multiple groups of items but no group headings, labeled layouts in which items are grouped and each group has a useful heading, and a target-only layout that contains just one item. A number of plausible strategies were proposed for each layout. Each strategy was programmed into the EPIC cognitive architecture, producing models that simulate the human visual-perceptual, oculomotor, and cognitive processing required for the task. The models generate search time predictions. For unlabeled layouts, the mean layout search times are predicted by a purely random search strategy, and the more detailed positional search times are predicted by a noisy systematic strategy. The labeled layout search times are predicted by a hierarchical strategy in which first the group labels are systematically searched, and then the contents of the target group. The target-only layout search times are predicted by a strategy in which the eyes move directly to the sudden appearance of the target. The models demonstrate that human visual search performance can be explained largely in terms of the cognitive strategy that is used to coordinate the relevant perceptual and motor processes; that a clear and useful visual hierarchy triggers a fundamentally different visual search strategy and effectively gives the user greater control over the visual navigation; and that cognitive strategies will be an important component of a predictive visual search tool. The models provide insights pertaining to the visual-perceptual and oculomotor processes involved in visual search and contribute to the science base needed for predictive interface analysis. |
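The qualitative gap between a purely random and a systematic search strategy can be illustrated with a small Monte Carlo sketch. The layout size and trial count are arbitrary and the strategies are idealized one-item-per-fixation caricatures, not the EPIC models from the article.

```python
# Monte Carlo sketch: expected fixations to find one target among n items
# under a purely random strategy (resampling with replacement) versus a
# noiseless systematic scan. n and trial count are illustrative.
import random

def random_search(n, rng):
    # purely random strategy: fixate random items until the target is hit
    fixations = 0
    while True:
        fixations += 1
        if rng.randrange(n) == 0:   # item 0 plays the role of the target
            return fixations

def systematic_search(n, rng):
    # systematic strategy: scan items in order; target position is uniform
    return rng.randrange(n) + 1

rng = random.Random(42)
n, trials = 24, 20000
rand_mean = sum(random_search(n, rng) for _ in range(trials)) / trials
sys_mean = sum(systematic_search(n, rng) for _ in range(trials)) / trials
print(rand_mean, sys_mean)  # random averages ~n fixations, systematic ~(n+1)/2
```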
Multi-Path Feedback Recurrent Neural Network for Scene Parsing | In this paper, we consider the scene parsing problem and propose a novel Multi-Path Feedback recurrent neural network (MPF-RNN) for parsing scene images. MPF-RNN can enhance the capability of RNNs in modeling long-range context information at multiple levels and better distinguish pixels that are easy to confuse. Different from feedforward CNNs and RNNs with only a single feedback, MPF-RNN propagates the contextual features learned at the top layer through weighted recurrent connections to multiple bottom layers to help them learn better features with such “hindsight”. To better train MPF-RNN, we propose a new strategy that considers accumulative loss at multiple recurrent steps to improve performance of the MPF-RNN on parsing small objects. With these two novel components, MPF-RNN has achieved significant improvement over strong baselines (VGG16 and Res101) on five challenging scene parsing benchmarks, including traditional SiftFlow, Barcelona, CamVid, and Stanford Background, as well as the recently released large-scale ADE20K. |
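The two components named in the abstract, multi-path feedback and accumulative loss over recurrent steps, can be sketched with a toy numeric model: top-layer context is fed back through weighted connections into several lower layers, and the training loss is summed across recurrent steps. All shapes, weights, and the tiny two-layer network are illustrative assumptions, not the paper's architecture.

```python
# Toy sketch of multi-path feedback with an accumulative loss. The top-layer
# output "top" is fed back into both bottom layers (F1, F2) at every
# recurrent step, and the per-step losses are accumulated.
import numpy as np

rng = np.random.default_rng(1)
d = 8
x = rng.normal(size=d)                    # fixed input features
W1, W2, W3 = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
F1, F2 = (rng.normal(size=(d, d)) * 0.1 for _ in range(2))  # feedback paths
target = rng.normal(size=d)

def forward(x, top):
    # each bottom layer also receives the fed-back top-layer context
    h1 = np.tanh(W1 @ x + F1 @ top)
    h2 = np.tanh(W2 @ h1 + F2 @ top)
    return W3 @ h2

top = np.zeros(d)                         # no context at step 0
total_loss = 0.0
for step in range(3):                     # recurrent steps
    top = forward(x, top)
    total_loss += np.mean((top - target) ** 2)   # accumulative loss
print(total_loss)
```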
Deep Learning for IoT Big Data and Streaming Analytics: A Survey | In the era of the Internet of Things (IoT), an enormous number of sensing devices collect and/or generate various sensory data over time for a wide range of fields and applications. Based on the nature of the application, these devices will result in big or fast/real-time data streams. Applying analytics over such data streams to discover new information, predict future insights, and make control decisions is a crucial process that makes IoT a worthy paradigm for businesses and a quality-of-life-improving technology. In this paper, we provide a thorough overview of using a class of advanced machine learning techniques, namely deep learning (DL), to facilitate analytics and learning in the IoT domain. We start by articulating IoT data characteristics and identifying two major treatments for IoT data from a machine learning perspective, namely IoT big data analytics and IoT streaming data analytics. We also discuss why DL is a promising approach to achieve the desired analytics in these types of data and applications. The potential of using emerging DL techniques for IoT data analytics is then discussed, and its promises and challenges are introduced. We present a comprehensive background on different DL architectures and algorithms. We also analyze and summarize major reported research attempts that leveraged DL in the IoT domain. The smart IoT devices that have incorporated DL in their intelligence background are also discussed. DL implementation approaches on the fog and cloud centers in support of IoT applications are also surveyed. Finally, we shed light on some challenges and potential directions for future research. At the end of each section, we highlight the lessons learned based on our experiments and review of the recent literature. |
Flavourings significantly affect inhalation toxicity of aerosol generated from electronic nicotine delivery systems (ENDS). | BACKGROUND
E-cigarettes or electronic nicotine delivery systems (ENDS) are designed to deliver nicotine-containing aerosol via inhalation. Little is known about the health effects of flavoured ENDS aerosol when inhaled.
METHODS
Aerosol from ENDS was generated using a smoking machine. Various types of ENDS devices or a tank system prefilled with liquids of different flavours, nicotine carrier, variable nicotine concentrations and with modified battery output voltage were tested. A convenience sample of commercial fluids with flavour names of tobacco, piña colada, menthol, coffee and strawberry were used. Flavouring chemicals were identified using gas chromatography/mass spectrometry. H292 human bronchial epithelial cells were directly exposed to 55 puffs of freshly generated ENDS aerosol, tobacco smoke or air (controls) using an air-liquid interface system and the Health Canada intense smoking protocol. The following in vitro toxicological effects were assessed: (1) cell viability, (2) metabolic activity and (3) release of inflammatory mediators (cytokines).
RESULTS
Exposure to ENDS aerosol resulted in decreased metabolic activity and cell viability and increased release of interleukin (IL)-1β, IL-6, IL-10, CXCL1, CXCL2 and CXCL10 compared to air controls. Cell viability and metabolic activity were more adversely affected by conventional cigarettes than most tested ENDS products. Product type, battery output voltage and flavours significantly affected toxicity of ENDS aerosol, with a strawberry-flavoured product being the most cytotoxic.
CONCLUSIONS
Our data suggest that characteristics of ENDS products, including flavours, may induce inhalation toxicity. Therefore, ENDS users should use the products with caution until more comprehensive studies are performed. |
A Woman in the World of Engineering | This paper discusses the career of one woman in the world of engineering as a way of pointing out the problems that women encounter in engineering. The problems include hiring, salaries, promotion, management, travel, and attitudes of coworkers. The paper includes suggestions for engineering professors to help prepare women students for engineering. |
Outcomes With Edoxaban Versus Warfarin in Patients With Previous Cerebrovascular Events: Findings From ENGAGE AF-TIMI 48 (Effective Anticoagulation With Factor Xa Next Generation in Atrial Fibrillation-Thrombolysis in Myocardial Infarction 48). | BACKGROUND AND PURPOSE
Patients with atrial fibrillation and previous ischemic stroke (IS)/transient ischemic attack (TIA) are at high risk of recurrent cerebrovascular events despite anticoagulation. In this prespecified subgroup analysis, we compared warfarin with edoxaban in patients with versus without previous IS/TIA.
METHODS
ENGAGE AF-TIMI 48 (Effective Anticoagulation With Factor Xa Next Generation in Atrial Fibrillation-Thrombolysis in Myocardial Infarction 48) was a double-blind trial of 21 105 patients with atrial fibrillation randomized to warfarin (international normalized ratio, 2.0-3.0; median time-in-therapeutic range, 68.4%) versus once-daily edoxaban (higher-dose edoxaban regimen [HDER], 60/30 mg; lower-dose edoxaban regimen, 30/15 mg) with 2.8-year median follow-up. Primary end points included all stroke/systemic embolic events (efficacy) and major bleeding (safety). Because only HDER is approved, we focused on the comparison of HDER versus warfarin.
RESULTS
Of 5973 (28.3%) patients with previous IS/TIA, 67% had CHADS2 (congestive heart failure, hypertension, age, diabetes, prior stroke/transient ischemic attack) >3 and 36% were ≥75 years. Compared with 15 132 without previous IS/TIA, patients with previous IS/TIA were at higher risk of both thromboembolism and bleeding (stroke/systemic embolic events 2.83% versus 1.42% per year; P<0.001; major bleeding 3.03% versus 2.64% per year; P<0.001; intracranial hemorrhage, 0.70% versus 0.40% per year; P<0.001). Among patients with previous IS/TIA, annualized intracranial hemorrhage rates were lower with HDER than with warfarin (0.62% versus 1.09%; absolute risk difference, 47 [8-85] per 10 000 patient-years; hazard ratio, 0.57; 95% confidence interval, 0.36-0.92; P=0.02). No treatment subgroup interactions were found for primary efficacy (P=0.86) or for intracranial hemorrhage (P=0.28).
CONCLUSIONS
Patients with atrial fibrillation with previous IS/TIA are at high risk of recurrent thromboembolism and bleeding. HDER is at least as effective as, and safer than, warfarin, regardless of the presence or absence of previous IS or TIA.
CLINICAL TRIAL REGISTRATION
URL: http://www.clinicaltrials.gov. Unique identifier: NCT00781391. |
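The reported absolute risk difference for intracranial hemorrhage follows directly from the annualized rates given in the abstract (1.09% with warfarin vs 0.62% with HDER), as a quick arithmetic check:

```python
# Arithmetic check: annualized ICH rates of 1.09% (warfarin) vs 0.62% (HDER)
# correspond to the reported 47 fewer events per 10 000 patient-years.
warfarin_rate = 1.09 / 100      # events per patient-year
hder_rate = 0.62 / 100
ard_per_10k = (warfarin_rate - hder_rate) * 10_000
print(round(ard_per_10k))  # → 47
```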
Media-portrayed idealized images, body shame, and appearance anxiety. | OBJECTIVE
This study was designed to determine the effects of media-portrayed idealized images on young women's body shame and appearance anxiety, and to establish whether the effects depend on advertisement type and on participant self-objectification.
METHOD
Participants were 39 female university students. Twenty-four magazine advertisements comprised 12 body-related and 12 non-body-related products, half of each with, and half without, idealized images. Preexposure and postexposure body shame and appearance anxiety measures were recorded.
RESULTS
Appearance anxiety increased after viewing advertisements featuring idealized images. There was also a significant interaction between self-objectification level and idealized body (presence vs. absence). No differences emerged for body-related compared with non-body-related product advertisements. The only result for body shame was a main effect for time. Participants' body shame increased after exposure to idealized images, irrespective of advertisement type.
DISCUSSION
Although our findings reveal that media-portrayed idealized images detrimentally affect the body image of young women, they highlight the individual differences in vulnerability and the different effects for different components of body image. These results are discussed in terms of their implications for the prevention and early intervention of body image and dieting-related disorders. |
Blind Source Separation in nonminimum-phase systems based on filter decomposition | This paper focuses on the causality problem in the task of Blind Source Separation (BSS) of speech signals in nonminimum-phase mixing channels. We propose a new algorithm for solving this problem using a filter decomposition approach. Our proposed algorithm uses an integrated cost function in which the independence criterion is defined in the frequency domain. The parameters of the demixing system are derived in the time domain, so the algorithm has the benefits of both time- and frequency-domain approaches. Compared to previous work in this framework, our proposed algorithm extends the filter decomposition idea from multi-channel blind deconvolution to the problem of blind source separation of speech signals. The proposed method is capable of dealing with both minimum-phase and nonminimum-phase mixing situations. Simulation results show considerable improvement in separating speech signals, especially when the mixing system is nonminimum-phase. |
Hyporesponsiveness to PegIFNα2B plus ribavirin in patients with hepatitis C-related advanced fibrosis. | BACKGROUND & AIMS
The success of pegylated-interferon (PegIFN)/ribavirin (Rbv) therapy of chronic hepatitis C is compromised by liver fibrosis. Whether fibrosis equally affects the two PegIFNα-based therapies is unknown. We aimed to assess the response to the two PegIFN regimens in patients with different degrees of liver fibrosis.
METHODS
A sub-analysis of the MIST study: 431 consecutive naïve patients randomly assigned, based on HCV genotype, to receive either (A) PegIFNα2a 180 μg/wk plus daily Rbv 800-1200 mg or (B) PegIFNα2b 1.5 μg/kg/week plus daily Rbv 800-1200 mg, were stratified according to Ishak staging (S) into mild (S0-S2) or moderate (S3, S4) fibrosis and cirrhosis (S5, S6).
RESULTS
In A the sustained virological response (SVR) rates were not significantly influenced by fibrosis stage (71% in S0-S2, 66% in S3, S4, 53% in S5, S6, p=0.12), compared to B where the SVR rates differed according to fibrosis stage (65%, 46%, and 38%, p=0.004, respectively). This was even more so in HCV-1/4 patients treated with PegIFNα2b, where the SVR rates were twice as high in S0-S2 vs. S≥3 (44% vs. 22%, p=0.02), while in A the SVR rates were similar between the two fibrosis subgroups (S0-S2: 47% vs. S≥3: 48%, p=0.8). By logistic regression analysis, genotype 1/4 and lack of rapid virological response were independent predictors of treatment failure in both treatment groups, while S≥3 fibrosis was associated with PegIFNα2b treatment failure only (OR 2.83, 95% CI 1.4-5.68, p=0.004).
CONCLUSIONS
Liver fibrosis was an independent moderator of treatment outcome in patients receiving PegIFNα2b, but not in those receiving PegIFNα2a. |
Graphical presentation of diagnostic information | BACKGROUND
Graphical displays of results allow researchers to summarise and communicate the key findings of their study. Diagnostic information should be presented in an easily interpretable way, which conveys both test characteristics (diagnostic accuracy) and the potential for use in clinical practice (predictive value).
METHODS
We discuss the types of graphical display commonly encountered in primary diagnostic accuracy studies and systematic reviews of such studies, and systematically review the use of graphical displays in recent diagnostic primary studies and systematic reviews.
RESULTS
We identified 57 primary studies and 49 systematic reviews. Fifty-six percent of primary studies and 53% of systematic reviews used graphical displays to present results. Dot plots or box-and-whisker plots were the most commonly used graphs in primary studies and were included in 22 (39%) studies. ROC plots were the most common type of plot included in systematic reviews and were included in 22 (45%) reviews. One primary study and five systematic reviews included a probability-modifying plot.
CONCLUSION
Graphical displays are currently underused in primary diagnostic accuracy studies and systematic reviews of such studies. Diagnostic accuracy studies need to include multiple types of graphics, both to provide a detailed overview of the results (diagnostic accuracy) and to communicate information that can be used to inform clinical practice (predictive value). Work is required to improve graphical displays, to better communicate the utility of a test in clinical practice and the implications of test results for individual patients. |
PixelNet: Representation of the pixels, by the pixels, and for the pixels | We explore design principles for general pixel-level prediction problems, from low-level edge detection to mid-level surface normal estimation to high-level semantic segmentation. Convolutional predictors, such as the fully-convolutional network (FCN), have achieved remarkable success by exploiting the spatial redundancy of neighboring pixels through convolutional processing. Though computationally efficient, we point out that such approaches are not statistically efficient during learning, precisely because spatial redundancy limits the information learned from neighboring pixels. We demonstrate that stratified sampling of pixels allows one to (1) add diversity during batch updates, speeding up learning; (2) explore complex nonlinear predictors, improving accuracy; and (3) efficiently train state-of-the-art models tabula rasa (i.e., “from scratch”) for diverse pixel-labeling tasks. Our single architecture produces state-of-the-art results for semantic segmentation on the PASCAL-Context dataset, surface normal estimation on the NYUDv2 depth dataset, and edge detection on BSDS. |
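The stratified pixel sampling idea can be sketched in a few lines: instead of using every pixel of every image, draw the same small number of pixel locations from each image in the batch, which increases diversity per SGD update. Shapes and the per-image sample count are illustrative, not the paper's settings.

```python
# Sketch of stratified per-image pixel sampling for a training batch.
# Each of the `batch` images contributes exactly k sampled pixels.
import numpy as np

rng = np.random.default_rng(0)
batch, H, W, C = 4, 32, 32, 3
images = rng.normal(size=(batch, H, W, C))   # stand-in feature maps
k = 100                                      # pixels sampled per image

rows = rng.integers(0, H, size=(batch, k))
cols = rng.integers(0, W, size=(batch, k))
idx = np.arange(batch)[:, None]              # broadcasts against (batch, k)
sampled = images[idx, rows, cols]            # (batch, k, C) sampled pixels
print(sampled.shape)
```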
Image Restoration by Iterative Denoising and Backward Projections | Inverse problems appear in many applications, such as image deblurring and inpainting. The common approach to address them is to design a specific algorithm for each problem. The Plug-and-Play (P&P) framework, which has been recently introduced, allows solving general inverse problems by leveraging the impressive capabilities of existing denoising algorithms. While this fresh strategy has found many applications, a burdensome parameter tuning is often required in order to obtain high-quality results. In this paper, we propose an alternative method for solving inverse problems using off-the-shelf denoisers, which requires less parameter tuning. First, we transform a typical cost function, composed of fidelity and prior terms, into a closely related, novel optimization problem. Then, we propose an efficient minimization scheme with a P&P property, i.e., the prior term is handled solely by a denoising operation. Finally, we present an automatic tuning mechanism to set the method’s parameters. We provide a theoretical analysis of the method and empirically demonstrate its competitiveness with task-specific techniques and the P&P approach for image inpainting and deblurring. |
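The alternation the title describes, a denoising step handling the prior and a projection step re-imposing the observations, can be sketched on a toy inpainting problem. The 3x3 box blur below is a crude stand-in for a real off-the-shelf denoiser, and the fixed iteration count ignores the paper's automatic parameter tuning; everything here is illustrative.

```python
# Toy plug-and-play-style inpainting: alternate a "denoiser" (box blur) with
# a backward projection that re-imposes the observed pixels.
import numpy as np

rng = np.random.default_rng(0)
truth = np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))  # smooth image
mask = rng.random(truth.shape) < 0.5       # ~half the pixels observed
y = np.where(mask, truth, 0.0)             # missing pixels zeroed out

def box_blur(x):
    # 3x3 box filter with edge padding, standing in for a real denoiser
    p = np.pad(x, 1, mode="edge")
    return sum(p[i:i + 32, j:j + 32] for i in range(3) for j in range(3)) / 9.0

x = y.copy()
for _ in range(50):
    x = box_blur(x)        # prior step: denoise
    x[mask] = truth[mask]  # data step: project back onto the observations

err = np.abs(x - truth)[~mask].mean()
print(err)  # small residual on the missing pixels
```

The appeal of the framework is visible even in this caricature: the data term never changes, so swapping in a stronger denoiser upgrades the prior without redesigning the algorithm.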
5G fronthaul-latency and jitter studies of CPRI over ethernet | Common Public Radio Interface (CPRI) is a successful industry cooperation defining the publicly available specification for the key internal interface of radio base stations between the radio equipment control (REC) and the radio equipment (RE) in the fronthaul of mobile networks. However, CPRI is expensive to deploy, consumes large bandwidth, and currently is statically configured. On the other hand, an Ethernet-based mobile fronthaul will be cost-efficient and more easily reconfigurable. Encapsulating CPRI over Ethernet (CoE) is an attractive solution, but stringent CPRI requirements such as delay and jitter are major challenges that need to be met to make CoE a reality. This study investigates whether CoE can meet delay and jitter requirements by performing FPGA-based Verilog experiments and simulations. Verilog experiments show that CoE encapsulation with a fixed Ethernet frame size requires tens of microseconds. Numerical experiments show that the proposed scheduling policy for CoE flows on Ethernet can reduce jitter when redundant Ethernet capacity is provided. The reduction in jitter can be as large as 1 μs, hence making Ethernet-based mobile fronthaul a credible technology. |
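Why encapsulation takes on the order of tens of microseconds can be seen with a back-of-envelope sketch: a fixed-size Ethernet payload must first fill up from the constant-rate CPRI stream. The line rate (CPRI option 1) and the 1500-byte payload are assumptions for illustration, not the paper's FPGA measurements.

```python
# Back-of-envelope CoE encapsulation delay: time to fill one full-size
# Ethernet payload from a constant-rate CPRI stream. Both numbers are
# illustrative assumptions.
frame_payload_bits = 1500 * 8          # one full-size Ethernet payload
cpri_line_rate_bps = 614.4e6           # CPRI option 1 line rate (assumed)
fill_time_us = frame_payload_bits / cpri_line_rate_bps * 1e6
print(round(fill_time_us, 2), "us")    # ~19.5 us, i.e. tens of microseconds
```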
Frontal fibrosing alopecia: a clinical review of 36 patients. | BACKGROUND
Frontal fibrosing alopecia (FFA) is a primary lymphocytic cicatricial alopecia with a distinctive clinical pattern of progressive frontotemporal hairline recession. Currently, there are no evidence-based studies to guide treatment for patients with FFA; thus, treatment options vary among clinicians.
OBJECTIVES
We report clinical findings and treatment outcomes of 36 patients with FFA, the largest cohort to date. Further, we report the first evidence-based study of the efficacy of hydroxychloroquine in FFA using a quantitative clinical score, the Lichen Planopilaris Activity Index (LPPAI).
METHODS
A retrospective case note review was performed of 36 adult patients with FFA. Data were collected on demographics and clinical findings. Treatment responses to hydroxychloroquine, doxycycline and mycophenolate mofetil were assessed using the LPPAI. Adverse events were monitored.
RESULTS
Most patients in our cohort were female (97%), white (92%) and postmenopausal (83%). Apart from hairline recession, 75% also reported eyebrow loss. Scalp pruritus (67%) and perifollicular erythema (86%) were the most common presenting symptom and sign, respectively. A statistically significant reduction in signs and symptoms in subjects treated with hydroxychloroquine (P < 0·05) was found at both 6- and 12-month follow-up.
CONCLUSIONS
In FFA, hairline recession, scalp pruritus, perifollicular erythema and eyebrow loss are common at presentation. Despite the limitations of a retrospective review, our data reveal that hydroxychloroquine is significantly effective in reducing signs and symptoms of FFA after both 6 and 12 months of treatment. However, the lack of a significant reduction in signs and symptoms between 6 and 12 months indicates that the maximal benefits of hydroxychloroquine are evident within the first 6 months of use. |
Deep3: Leveraging three levels of parallelism for efficient Deep Learning | This paper proposes Deep3, an automated platform-aware Deep Learning (DL) framework that brings orders of magnitude performance improvement to DL training and execution. Deep3 is the first to simultaneously leverage three levels of parallelism for performing DL: data, network, and hardware. It uses platform profiling to abstract physical characterizations of the target platform. The core of Deep3 is a new extensible methodology that enables incorporation of platform characteristics into the higher-level data and neural network transformation. We provide accompanying libraries to ensure automated customization and adaptation to different datasets and platforms. Proof-of-concept evaluations demonstrate 10-100-fold physical performance improvement compared to state-of-the-art DL frameworks, e.g., TensorFlow. |
Weak units and homotopy 3-types | We show that every braided monoidal category arises as End(I) for a weak unit I in an otherwise completely strict monoidal 2-category. This implies a version of Simpson’s weak-unit conjecture in dimension 3, namely that one-object 3-groupoids that are strict in all respects, except that the object has only weak identity arrows, can model all connected, simply connected homotopy 3-types. The proof has a clear intuitive content and relies on a geometrical argument with string diagrams and configuration spaces. 0. Introduction The subtleties and challenges of higher category theory start with the observation (in fact, not a trivial result) that not every weak 3-category is equivalent to a strict 3-category. The topological counterpart of this is that not every homotopy 3-type can be realised by a strict 3-groupoid. The discrepancy between the strict and weak worlds can be pinpointed to the case of connected, simply connected 3-types, where it can be observed rather explicitly: such 3-types correspond to braided monoidal categories (in fact braided categorical groups), while connected, simply connected strict 3-categories are essentially commutative monoidal categories — the braiding is forced to collapse, as a consequence of the Eckmann-Hilton argument. In precise terms, strict n-groupoids can realise only homotopy n-types with trivial Whitehead brackets. This collapse can be circumvented by weakening the structures. The notion of tricategory of Gordon, Power, and Street [1] is meant to be the weakest possible definition of 3-category. They show that a tricategory with only one object is equivalent to a Gray monoid, and in particular, a tricategory with one object and one arrow is equivalent to a braided monoidal category. Furthermore, every braided monoidal category arises in this way.
The most general result relating higher categories to homotopy types is Tamsamani’s theorem [9] on weak n-groupoids. |
An up to 16-year prospective study of 304 porcelain veneers. | PURPOSE
This study aimed to prospectively analyze the outcomes of 304 feldspathic porcelain veneers, prepared by the same operator in 100 patients, that were in situ for up to 16 years.
MATERIALS AND METHODS
A total of 304 porcelain veneers on incisors, canines, and premolars in 100 patients completed by one prosthodontist between 1988 and 2003 were sequentially included. Preparations were designed with chamfer margins, incisal reduction, and palatal overlap. At least 80% of each preparation was in enamel. Feldspathic porcelain veneers from refractory dies were etched (hydrofluoric acid), silanated, and cemented (Vision 2, Mirage Dental Systems). Outcomes were expressed as percentages (success, survival, unknown, dead, repair, failure). The results were statistically analyzed using the chi-square test and Kaplan-Meier survival estimation. Statistical significance was set at P < .05.
RESULTS
The cumulative survival for veneers was 96% +/- 1% at 5 to 6 years, 93% +/- 2% at 10 to 11 years, 91% +/- 3% at 12 to 13 years, and 73% +/- 16% at 15 to 16 years. The marked drop in survival between 13 and 16 years was the result of the death of 1 patient and the low number of veneers in that period. The cumulative survival was greater when different statistical methods were employed. Sixteen veneers in 14 patients failed. Failed veneers were associated with esthetics (31%), mechanical complications (31%), periodontal support (12.5%), loss of retention >2 (12.5%), caries (6%), and tooth fracture (6%). Statistically significantly fewer veneers survived as the time in situ increased.
CONCLUSIONS
Feldspathic porcelain veneers, when bonded to enamel substrate, offer a predictable long-term restoration with a low failure rate. The statistical methods used to calculate the cumulative survival can markedly affect the apparent outcome and thus should be clearly defined in outcome studies. |
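The survival figures above come from Kaplan-Meier estimation, and the abstract stresses that the choice of statistical method can markedly change the apparent outcome. As a hedged illustration of how censoring enters that calculation, here is a minimal product-limit estimator; the follow-up times and failure/censoring flags are invented for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  -- follow-up time of each unit (hypothetical)
    events -- 1 if the unit failed at that time, 0 if censored
    Returns (time, cumulative survival) at each distinct failure time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Observations tied at time t: failures reduce the survival
        # estimate; censored units only leave the risk set.
        deaths = sum(e for tt, e in data if tt == t)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= ties
        i += ties
    return curve

# Four hypothetical restorations: failures at years 1, 2 and 3, one censored at year 2.
print(kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1]))
```

The censored observation at year 2 shrinks the risk set without counting as a failure, which is exactly why estimates become unstable when few units remain at risk in the late follow-up period, as the abstract notes for years 13-16.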
DAVID Bioinformatics Resources: expanded annotation database and novel algorithms to better extract biology from large gene lists | All tools in the DAVID Bioinformatics Resources aim to provide functional interpretation of large lists of genes derived from genomic studies. The newly updated DAVID Bioinformatics Resources consists of the DAVID Knowledgebase and five integrated, web-based functional annotation tool suites: the DAVID Gene Functional Classification Tool, the DAVID Functional Annotation Tool, the DAVID Gene ID Conversion Tool, the DAVID Gene Name Viewer and the DAVID NIAID Pathogen Genome Browser. The expanded DAVID Knowledgebase now integrates almost all major and well-known public bioinformatics resources centralized by the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of diverse gene/protein identifiers and annotation terms from a variety of public bioinformatics databases. For any uploaded gene list, the DAVID Resources now provides not only the typical gene-term enrichment analysis, but also new tools and functions that allow users to condense large gene lists into gene functional groups, convert between gene/protein identifiers, visualize many-genes-to-many-terms relationships, cluster redundant and heterogeneous terms into groups, search for interesting and related genes or terms, dynamically view genes from their lists on bio-pathways and more. With DAVID (http://david.niaid.nih.gov), investigators gain more power to interpret the biological mechanisms associated with large gene lists. |
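The gene-term enrichment analysis mentioned above is, at its core, an over-representation test. As a hedged sketch (DAVID itself uses a modified Fisher exact statistic, the EASE score; shown here is the plain one-sided hypergeometric tail such tests are built on), with all counts hypothetical:

```python
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance that a random
    n-gene list contains at least k genes carrying a term that annotates
    K of the N background genes."""
    upper = min(K, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, upper + 1)) / comb(N, n)

# Hypothetical toy background: 10 genes, 5 annotated; a 4-gene list hits all 4.
print(enrichment_p(10, 5, 4, 4))  # ≈ 0.0238
```

A small p-value indicates the term appears in the uploaded list more often than chance alone would explain, which is the signal the annotation tools rank terms by.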
Using depth in visual simultaneous localisation and mapping | We present a method of utilizing depth information as provided by RGBD sensors for robust real-time visual simultaneous localisation and mapping (SLAM) by augmenting monocular visual SLAM to take depth data into account. This is implemented based on the freely available software “Parallel Tracking and Mapping” by Georg Klein. Our modifications allow PTAM to be used as a 6D visual SLAM system even without any additional information about odometry or from an inertial measurement unit. |
Presentation attack detection algorithm for face and iris biometrics | Biometric systems are vulnerable to diverse attacks, which challenges the reliability of adopting these systems in real-life scenarios. In this work, we propose a novel solution to detect a presentation attack based on exploring both statistical and Cepstral features. The proposed Presentation Attack Detection (PAD) algorithm extracts statistical features that capture micro-texture variation using Binarized Statistical Image Features (BSIF) and Cepstral features that reflect micro changes in frequency using 2D Cepstrum analysis. We then fuse these features to form a single feature vector before deciding whether a capture attempt is a normal presentation or an artefact presentation using a linear Support Vector Machine (SVM). Extensive experiments carried out on a publicly available face and iris spoof database show the efficacy of the proposed PAD algorithm with an Average Classification Error Rate (ACER) of 10.21% on face and 0% on iris biometrics. |
Flow-Based Detection of DNS Tunnels | DNS tunnels allow circumventing access and security policies in firewalled networks. Such a security breach can be misused for activities like free web browsing, but also for command & control traffic or cyber espionage, thus motivating the search for effective automated DNS tunnel detection techniques. In this paper we develop such a technique, based on the monitoring and analysis of network flows. Our methodology combines flow information with statistical methods for anomaly detection. The contribution of our paper is twofold. Firstly, based on flow-derived variables that we identified as indicative of DNS tunnelling activities, we identify and evaluate a set of non-parametric statistical tests that are particularly useful in this context. Secondly, the efficacy of the resulting tests is demonstrated by extensive validation experiments in an operational environment, covering many different usage scenarios. |
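The non-parametric tests above operate on flow-derived variables. As an illustrative sketch (not the paper's specific tests), a two-sample Kolmogorov-Smirnov statistic comparing the empirical distributions of a flow feature could look like this; the feature choice (bytes per DNS flow) and the sample values are hypothetical:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the two empirical distribution functions."""
    def ecdf(sample, x):
        # Fraction of observations no greater than x.
        return sum(1 for v in sample if v <= x) / len(sample)
    points = sorted(set(sample_a) | set(sample_b))
    return max(abs(ecdf(sample_a, x) - ecdf(sample_b, x)) for x in points)

# Hypothetical per-flow byte counts: benign lookups vs. suspected tunnel flows.
benign = [60, 75, 80, 90]
tunnel = [400, 520, 610, 700]
print(ks_statistic(benign, tunnel))  # 1.0: the two distributions do not overlap
```

A statistic near 1 flags flows whose feature distribution departs sharply from the benign baseline, which is the anomaly-detection signal such tests provide.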
Reduction of wound angiogenesis in patients treated with BMS-275291, a broad spectrum matrix metalloproteinase inhibitor. | PURPOSE
The purpose of this study was to evaluate the feasibility of incorporating a novel wound angiogenesis assay into a Phase I study of BMS-275291, a broad-spectrum matrix metalloproteinase inhibitor, and to determine whether the wound angiogenesis assay was able to detect the inhibition of angiogenesis in patients treated with BMS-275291.
EXPERIMENTAL DESIGN
Before treatment began, a 4-mm skin biopsy was performed. The wound was imaged for 14 days. Treatment was started on day 0, and a separate 4-mm biopsy was performed 14 days later. The second wound was also imaged for 14 days. Wound angiogenesis was scored by two independent observers who were blinded to treatment status.
RESULTS
The median times in days (95% confidence interval) to reach the target average vascular score (AVS) of 1.5 and 2.0 based on the data of Observer 1 were 3.7 (2.2-6.9) and 8.0 (5.0-10.0) pretreatment whereas on-treatment the values were 4.9 (3.7-8.0) and 9.3 (7.0-11.5), respectively. The delay in the median time to reach an AVS of 1.5 was 1.2 days or a 32% reduction when comparing pretreatment with on-treatment (P = 0.06). For the target AVS of 2.0 the delay in the median time pretreatment versus on-treatment was 1.3 days or a 16% reduction (P = 0.04).
CONCLUSIONS
The wound angiogenesis assay used in this study was practical, well tolerated, and reproducible. Delays in wound angiogenesis because of BMS-275291 were detectable with this assay. This technique warrants additional investigation in clinical trials of other antiangiogenic agents. |
SIFT Flow: Dense Correspondence across Scenes and Its Applications | While image alignment has been studied in different areas of computer vision for decades, aligning images depicting different scenes remains a challenging problem. Analogous to optical flow, where an image is aligned to its temporally adjacent frame, we propose SIFT flow, a method to align an image to its nearest neighbors in a large image corpus containing a variety of scenes. The SIFT flow algorithm consists of matching densely sampled, pixelwise SIFT features between two images while preserving spatial discontinuities. The SIFT features allow robust matching across different scene/object appearances, whereas the discontinuity-preserving spatial model allows matching of objects located at different parts of the scene. Experiments show that the proposed approach robustly aligns complex scene pairs containing significant spatial differences. Based on SIFT flow, we propose an alignment-based large database framework for image analysis and synthesis, where image information is transferred from the nearest neighbors to a query image according to the dense scene correspondence. This framework is demonstrated through concrete applications such as motion field prediction from a single image, motion synthesis via object transfer, satellite image registration, and face recognition. |
Automated Mass Detection in Mammograms Using Cascaded Deep Learning and Random Forests | Mass detection from mammograms plays a crucial role as a preprocessing stage for mass segmentation and classification. The detection of masses from mammograms is considered to be a challenging problem due to their large variation in shape, size, boundary and texture and also because of their low signal-to-noise ratio compared to the surrounding breast tissue. In this paper, we present a novel approach for detecting masses in mammograms using a cascade of deep learning and random forest classifiers. The first stage classifier consists of a multi-scale deep belief network that selects suspicious regions to be further processed by a two-level cascade of deep convolutional neural networks. The regions that survive this deep learning analysis are then processed by a two-level cascade of random forest classifiers that use morphological and texture features extracted from regions selected along the cascade. Finally, regions that survive the cascade of random forest classifiers are combined using connected component analysis to produce state-of-the-art results. We also show that the proposed cascade of deep learning and random forest classifiers is effective in the reduction of false positive regions, while maintaining a high true positive detection rate. We tested our mass detection system on two publicly available datasets: DDSM-BCRP and INbreast. The final mass detection produced by our approach achieves the best results on these publicly available datasets with a true positive rate of 0.96 ± 0.03 at 1.2 false positives per image on INbreast and a true positive rate of 0.75 at 4.8 false positives per image on DDSM-BCRP. |
Using Link Features for Entity Clustering in Knowledge Graphs | Knowledge graphs holistically integrate information about entities from multiple sources. A key step in the construction and maintenance of knowledge graphs is the clustering of equivalent entities from different sources. Previous approaches for such an entity clustering suffer from several problems, e.g., the creation of overlapping clusters or the inclusion of several entities from the same source within clusters. We therefore propose a new entity clustering algorithm CLIP that can be applied both to create entity clusters and to repair entity clusters determined with another clustering scheme. In contrast to previous approaches, CLIP not only uses the similarity between entities for clustering but also further features of entity links such as the so-called link strength. To achieve a good scalability we provide a parallel implementation of CLIP based on Apache Flink. Our evaluation for different datasets shows that the new approach can achieve substantially higher cluster quality than previous approaches. |
An optimal randomized incremental gradient method | In this paper, we consider a class of finite-sum convex optimization problems whose objective function is given by the summation of m (≥ 1) smooth components together with some other relatively simple terms. We first introduce a deterministic primal-dual gradient (PDG) method that can achieve the optimal blackbox iteration complexity for solving these composite optimization problems using a primal-dual termination criterion. Our major contribution is to develop a randomized primal-dual gradient (RPDG) method, which needs to compute the gradient of only one randomly selected smooth component at each iteration, but can possibly achieve better complexity than PDG in terms of the total number of gradient evaluations. More specifically, we show that the total number of gradient evaluations performed by RPDG can be O(√m) times smaller, both in expectation and with high probability, than those performed by deterministic optimal first-order methods under favorable situations. We also show that the complexity of the RPDG method is not improvable by developing a new lower complexity bound for a general class of randomized methods for solving large-scale finite-sum convex optimization problems. Moreover, through the development of PDG and RPDG, we introduce a novel gametheoretic interpretation for these optimal methods for convex optimization. |
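The defining trick of the RPDG method above is evaluating the gradient of only one randomly selected smooth component per iteration. A toy randomized incremental gradient sketch makes this concrete; it omits the primal-dual machinery of the actual method, and the stepsize rule and data are hypothetical:

```python
import random

def randomized_incremental_gradient(a, steps=20000, seed=0):
    """Minimise f(x) = (1/m) * sum_i 0.5 * (x - a_i)**2 by evaluating the
    gradient (x - a_i) of a single randomly chosen component per step.
    With the averaging stepsize 1/(t+1), the iterate equals the running
    mean of the sampled points, which converges to the minimiser mean(a)."""
    rng = random.Random(seed)
    x = 0.0
    for t in range(steps):
        a_i = rng.choice(a)        # one component gradient per iteration
        x -= (x - a_i) / (t + 1)
    return x

print(randomized_incremental_gradient([1.0, 2.0, 3.0, 6.0]))  # ≈ 3.0, the mean
```

Each step costs one component-gradient evaluation instead of m, which is the source of the O(√m) savings the abstract attributes to randomization.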
Bimodal Distribution and Co-Bursting in Review Spam Detection | Online reviews play a crucial role in helping consumers evaluate and compare products and services. This critical importance of reviews also incentivizes fraudsters (or spammers) to write fake or spam reviews to secretly promote or demote some target products and services. Existing approaches to detecting spam reviews and reviewers employed review contents, reviewer behaviors, star rating patterns, and reviewer-product networks for detection. In this research, we further discovered that reviewers’ posting rates (number of reviews written in a period of time) also follow an interesting distribution pattern, which has not been reported before. That is, their posting rates are bimodal. Multiple spammers also tend to collectively and actively post reviews to the same set of products within a short time frame, which we call co-bursting. Furthermore, we found some other interesting patterns in individual reviewers’ temporal dynamics and their co-bursting behaviors with other reviewers. Inspired by these findings, we first propose a two-mode Labeled Hidden Markov Model to model spamming using only individual reviewers’ review posting times. We then extend it to the Coupled Hidden Markov Model to capture both reviewer posting behaviors and co-bursting signals. Our experiments show that the proposed model significantly outperforms state-of-the-art baselines in identifying individual spammers. Furthermore, we propose a co-bursting network based on co-bursting relations, which helps detect groups of spammers more effectively than existing approaches. |
Comparing Convolutional Neural Networks to Traditional Models for Slot Filling | We address relation classification in the context of slot filling, the task of finding and evaluating fillers like “Steve Jobs” for the slot X in “X founded Apple”. We propose a convolutional neural network which splits the input sentence into three parts according to the relation arguments and compare it to state-of-the-art and traditional approaches of relation classification. Finally, we combine different methods and show that the combination is better than individual approaches. We also analyze the effect of genre differences on performance. |
A robust 2D point-sequence curve offset algorithm with multiple islands for contour-parallel tool path | An offset algorithm is important to the contour-parallel tool path generation process. Usually, it is necessary to offset with islands. In this paper a new offset algorithm for a 2D point-sequence curve (PS-curve) with multiple islands is presented. The algorithm consists of three sub-processes: islands bridging, raw offset curve generation and global invalid loop removal. The input of the algorithm is a set of PS-curves, in which one is the outer profile and the others are islands. The bridging process bridges all the islands to the outer profile with the Delaunay triangulation method, forming a single linked PS-curve. Since local problems are caused by intersections of adjacent bisectors, the concept of a stuck circle is proposed. Based on the stuck circle, local problems are fixed by updating the original profile with the proposed basic rule and append rule, so that a raw offset curve can be generated. The last process first reports all the self-intersections on the raw offset PS-curve, and then a procedure called tree analysis puts all the self-intersections into a tree. All the points between the nodes in even depth and their immediate children are collected using the collecting rule. The collected points form the valid loops, which are the output of the proposed algorithm. Each sub-process can be completed in near linear time, so the whole algorithm has near linear time complexity, as the examples tested in the paper demonstrate. © 2012 Elsevier Ltd. All rights reserved. |
The Online Parent Information and Support project, meeting parents' information and support needs for home-based management of childhood chronic kidney disease: research protocol. | AIM
This article is a report of a protocol for studying the development and evaluation of an online parent information and support package for home-based care of children with chronic kidney disease stages 3-5. The study is funded by a National Institute of Health Research, Research for Patient Benefit grant awarded in December 2010. Approval to undertake the study was obtained from the Department of Health National Research Ethics Service in June 2011.
BACKGROUND
Children with chronic kidney disease require skilled, home-based care by parents, supported by professionals. Parents have identified a need for continuously available online resources to supplement professional support, and structured resources tailored to parents' needs are highlighted by policy makers as key to optimizing care; yet, online resource provision is patchy with little evidence base.
METHODS
Using mixed methods, we will (i) conduct parent/child/young person/professional/patient and parent volunteer focus groups to explore views on existing resources, (ii) collaboratively define gaps in provision, identify desirable components and develop/test resources, and (iii) conduct a feasibility randomized controlled trial of usual professional support versus usual support supplemented by the package. Eighty parents of children with chronic kidney disease will be randomized. Primary outcomes will assess parents' self-efficacy and views of resources, using standardized measures at entry and 24 weeks, and semi-structured interviews at 24 weeks. We will finalize trial components for a later definitive trial.
DISCUSSION
By working collaboratively, we will derive a detailed insight into parents' information and support needs and experiences of using the package, and should see improved parental self-efficacy. |
A Neural Approach to Automated Essay Scoring | Traditional automated essay scoring systems rely on carefully designed features to evaluate and score essays. The performance of such systems is tightly bound to the quality of the underlying features. However, it is laborious to manually design the most informative features for such a system. In this paper, we develop an approach based on recurrent neural networks to learn the relation between an essay and its assigned score, without any feature engineering. We explore several neural network models for the task of automated essay scoring and perform analysis to gain insights into the models. The results show that our best system, which is based on long short-term memory networks, outperforms a strong baseline by 5.6% in terms of quadratic weighted Kappa, without requiring any feature engineering. |
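Quadratic weighted kappa, the evaluation metric cited above, can be computed directly from two rating sequences. A minimal sketch follows; the rating range and the example ratings are hypothetical:

```python
def quadratic_weighted_kappa(rater_a, rater_b, min_rating, max_rating):
    """Agreement between two integer rating sequences, penalising
    disagreements by the squared distance between the ratings."""
    n = max_rating - min_rating + 1
    m = len(rater_a)
    # Observed joint distribution of (a, b) rating pairs.
    obs = [[0.0] * n for _ in range(n)]
    for a, b in zip(rater_a, rater_b):
        obs[a - min_rating][b - min_rating] += 1.0 / m
    # Marginals give the expected distribution under independence.
    pa = [sum(row) for row in obs]
    pb = [sum(obs[i][j] for i in range(n)) for j in range(n)]
    w = lambda i, j: (i - j) ** 2 / (n - 1) ** 2
    observed = sum(w(i, j) * obs[i][j] for i in range(n) for j in range(n))
    expected = sum(w(i, j) * pa[i] * pb[j] for i in range(n) for j in range(n))
    return 1.0 - observed / expected

# Hypothetical machine scores vs. human scores on a 1-4 scale.
print(quadratic_weighted_kappa([1, 2, 4, 4], [1, 3, 4, 4], 1, 4))
```

Perfect agreement yields 1, chance-level agreement yields 0, and systematic disagreement goes negative; the quadratic weights make near-misses much cheaper than large scoring errors.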
Ontology-based Chatbot for an E-commerce Website | A working model of an ontology-based chatbot is proposed that handles queries from users of an e-commerce website. It is mainly concerned with giving the user full control over the search results on the website. This chatbot helps the user by mapping relationships among the various entities the user requires, providing detailed and accurate information and thereby overcoming the drawbacks of traditional chatbots. The ontology template is developed using Protégé, which stores the knowledge acquired from the website APIs, while the dialog manager is handled using Wit.ai. The integration of the dialog manager and the ontology template is managed through Python. The related response to the query is formatted and returned to the user in the dialog manager. |
The link prediction problem for social networks | Given a snapshot of a social network, can we infer which new interactions among its members are likely to occur in the near future? We formalize this question as the link prediction problem, and develop approaches to link prediction based on measures of the "proximity" of nodes in a network. Experiments on large co-authorship networks suggest that information about future interactions can be extracted from network topology alone, and that fairly subtle measures for detecting node proximity can outperform more direct measures. |
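Typical proximity measures for link prediction, such as common neighbours and Adamic-Adar (one of the "fairly subtle" measures of this kind), can be sketched over an adjacency-set representation; the toy graph below is hypothetical:

```python
from math import log

def common_neighbors(graph, x, y):
    """Nodes adjacent to both x and y in an adjacency-set graph."""
    return graph[x] & graph[y]

def adamic_adar(graph, x, y):
    """Adamic-Adar score: common neighbours weighted by rarity; a shared
    low-degree neighbour is stronger evidence than a shared hub.
    (Every common neighbour has degree >= 2, so log(degree) > 0.)"""
    return sum(1.0 / log(len(graph[z])) for z in graph[x] & graph[y])

# Hypothetical co-authorship snapshot.
g = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"b", "c"},
}
print(common_neighbors(g, "a", "d"))  # {'b', 'c'} (set order may vary)
print(adamic_adar(g, "a", "d"))
```

Ranking all non-adjacent pairs by such a score and predicting the top pairs as future links is the basic experimental protocol for this problem.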
Wikipedia-based Semantic Interpretation for Natural Language Processing | Adequate representation of natural language semantics requires access to vast amounts of common sense and domain-specific world knowledge. Prior work in the field was based on purely statistical techniques that did not make use of background knowledge, on limited lexicographic knowledge bases such as WordNet, or on huge manual efforts such as the CYC project. Here we propose a novel method, called Explicit Semantic Analysis (ESA), for fine-grained semantic interpretation of unrestricted natural language texts. Our method represents meaning in a high-dimensional space of concepts derived from Wikipedia, the largest encyclopedia in existence. We explicitly represent the meaning of any text in terms of Wikipedia-based concepts. We evaluate the effectiveness of our method on text categorization and on computing the degree of semantic relatedness between fragments of natural language text. Using ESA results in significant improvements over the previous state of the art in both tasks. Importantly, due to the use of natural concepts, the ESA model is easy to explain to human users. |
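ESA's interpretation vectors and their comparison by cosine similarity can be sketched as follows; the mini word-to-concept index stands in for the Wikipedia-derived one, and its concepts and weights are invented for illustration:

```python
from math import sqrt

# Hypothetical stand-in for the Wikipedia-derived inverted index:
# word -> {concept: TF-IDF weight of the word in that concept's article}.
CONCEPT_INDEX = {
    "piano":    {"Music": 0.9, "Musical instrument": 0.7},
    "guitar":   {"Music": 0.8, "Musical instrument": 0.9},
    "equation": {"Mathematics": 0.9},
}

def esa_vector(text):
    """Interpretation vector of a text: the weighted sum of the concept
    vectors of its words."""
    vec = {}
    for word in text.lower().split():
        for concept, weight in CONCEPT_INDEX.get(word, {}).items():
            vec[concept] = vec.get(concept, 0.0) + weight
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse concept vectors."""
    dot = sum(w * v.get(c, 0.0) for c, w in u.items())
    norm = lambda d: sqrt(sum(w * w for w in d.values()))
    return dot / (norm(u) * norm(v)) if u and v else 0.0

print(cosine(esa_vector("piano"), esa_vector("guitar")))    # high relatedness
print(cosine(esa_vector("piano"), esa_vector("equation")))  # 0.0, no shared concepts
```

Because each dimension is a named Wikipedia concept rather than a latent factor, the resulting similarity judgments are directly inspectable, which is the explainability property the abstract highlights.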
Evolving Deep Neural Networks | The success of deep learning depends on finding an architecture to fit the task. As deep learning has scaled up to more challenging tasks, the architectures have become difficult to design by hand. This paper proposes an automated method, CoDeepNEAT, for optimizing deep learning architectures through evolution. By extending existing neuroevolution methods to topology, components, and hyperparameters, this method achieves results comparable to the best human designs in standard benchmarks in object recognition and language modeling. It also supports building a real-world application of automated image captioning on a magazine website. Given the anticipated increases in available computing power, evolution of deep networks is a promising approach to constructing deep learning applications in the future. |
Association between PTPN22/CTLA-4 Gene Polymorphism and Allergic Rhinitis with Asthma in Children. | Allergic rhinitis (AR) is an IgE-mediated upper airway disease, and its impact on asthma has been widely recognized. Protein tyrosine phosphatase non-receptor 22 (PTPN22) gene and cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4) gene polymorphisms have been reported to be associated with several immune-related diseases. Here we investigated the effect of these two genes' polymorphisms on the risk of AR and asthma in Chinese Han children. A total of 106 AR patients, 112 AR with asthma patients, and 109 healthy children were enrolled in the study. The SNPs of PTPN22 (rs2488457, rs1310182, rs3789604) and CTLA-4 (rs3087243, rs11571302, rs11571315, rs231725, rs335219727, and rs4553808) were genotyped using a PCR-restriction fragment length polymorphism assay. For PTPN22, an increased prevalence of the CC genotype and C allele of rs1310182 was identified in the AR group. For CTLA-4, the AA genotype and A allele of rs3087243 and rs231725 were increased in the AR with asthma group, while in the AR group the AA genotype and A allele of rs231725 were markedly decreased. This study reveals a significant association between SNPs in the PTPN22 and CTLA-4 genes and AR with asthma in Chinese Han children, which might be susceptibility factors for AR and asthma. |
Automatic recognition of serial numbers in bank notes | This paper presents a new topic of automatic recognition of bank note serial numbers, which will not only facilitate the prevention of forgery crimes, but also have a positive impact on the economy. Among all the different currencies, we focus on the study of RMB (renminbi bank note, the paper currency used in China) serial numbers. For evaluation, a new database NUST-RMB2013 has been collected from scanned RMB images, which contains the serial numbers of 35 categories with 17,262 training samples and 7000 testing samples in total. We comprehensively implement and compare two classic and one newly emerged feature extraction methods (namely the gradient direction feature, Gabor feature, and CNN trainable feature), four different types of well-known classifiers (SVM, LDF, MQDF, and CNN), and five multiple classifier combination strategies (including a specially designed novel cascade method). To further improve the recognition accuracy, enhancements with three different kinds of distortions have been tested. Since high reliability is more important than accuracy in financial applications, we introduce three rejection schemes: first rank measurement (FRM), first two ranks measurement (FTRM) and linear discriminant analysis based measurement (LDAM). All the classifiers and classifier combination schemes are combined with different rejection criteria. A novel cascade rejection measurement achieves 100% reliability with a lower rejection rate than existing methods. Experimental results show that MQDF reaches an accuracy of 99.59% using the gradient direction feature trained with gray level normalized data; the cascade classifier combination achieves the best performance of 99.67%. The distortions have proved very helpful because the performance of CNNs improves by at least 0.5% when training with transformed samples. 
With the cascade rejection method, 100% reliability has been obtained by rejecting 1.01% of test samples. © 2014 Elsevier Ltd. All rights reserved. |
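The first-rank and first-two-ranks rejection measurements named above can be sketched as simple threshold rules over classifier scores; the thresholds and score values here are hypothetical:

```python
def frm(scores, threshold):
    """First rank measurement: accept the top class only if its score
    clears the threshold; otherwise reject (return None)."""
    label, top = max(scores.items(), key=lambda kv: kv[1])
    return label if top >= threshold else None

def ftrm(scores, threshold):
    """First two ranks measurement: accept only if the margin between the
    two best scores is large enough; otherwise reject."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (best, s1), (_, s2) = ranked[0], ranked[1]
    return best if s1 - s2 >= threshold else None

print(frm({"A": 0.97, "B": 0.03}, 0.9))   # 'A': confident prediction accepted
print(ftrm({"A": 0.60, "B": 0.40}, 0.5))  # None: margin 0.2 is too small
```

Raising the threshold trades a higher rejection rate for higher reliability on the accepted samples, which is the trade-off the reported 100% reliability at 1.01% rejection quantifies.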
The Bimini Ghost Maps of William P. Cumming | Five manuscript maps of the early sixteenth century have been noted in the literature of the history of cartography for the place-name Bimini where Florida is today. An examination of the sources for these references reveals there are only two such maps. The other maps — bibliographical ghost maps — arose because of reliance upon secondary sources. |
Attribute-Based Access Control | Attribute-based access control (ABAC) is a flexible approach that can implement AC policies limited only by the computational language and the richness of the available attributes, making it ideal for many distributed or rapidly changing environments. |
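A minimal sketch of the ABAC evaluation model described above: policies as conjunctions of predicates over the combined subject, resource, action and environment attributes. The attribute names and the policy itself are invented for illustration:

```python
def abac_permit(subject, resource, action, environment, policies):
    """Permit iff every condition of at least one policy holds over the
    combined attribute assignment."""
    attrs = {**subject, **resource, **environment, "action": action}
    return any(all(cond(attrs) for cond in policy) for policy in policies)

# Hypothetical policy: engineers may read documents of their own project.
policies = [[
    lambda a: a["role"] == "engineer",
    lambda a: a["action"] == "read",
    lambda a: a["project"] == a["doc_project"],
]]

print(abac_permit({"role": "engineer", "project": "apollo"},
                  {"doc_project": "apollo"}, "read", {}, policies))  # True
print(abac_permit({"role": "engineer", "project": "apollo"},
                  {"doc_project": "gemini"}, "read", {}, policies))  # False
```

Because decisions compare attributes rather than enumerate identities, new users and resources are covered as soon as their attributes are assigned, which is the flexibility the abstract attributes to ABAC in distributed or rapidly changing environments.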
Staffing, skill mix and the model of care. | AIMS AND OBJECTIVES
The study aimed to explore whether nurse staffing, experience and skill mix influenced the model of nursing care in medical-surgical wards.
BACKGROUND
Methods of allocating nurses to patients are typically divided into four types: primary nursing, patient allocation, task assignment and team nursing. Research findings are varied in regard to the relationship between these models of care and outcomes such as satisfaction and quality. Skill mix has been associated with various models, with implications for collegial support, teamwork and patient outcomes.
DESIGN
Secondary analysis of data collected on 80 randomly selected medical-surgical wards in 19 public hospitals in New South Wales, Australia during 2004-2005.
METHODS
Nurses (n = 2278, 80.9% response rate) were surveyed using The Nursing Care Delivery System and the Nursing Work Index-Revised. Staffing and skill mix data were obtained from the ward roster, and other data from the patient record. Models of care were examined in relation to these practice environment and organisational variables.
RESULTS
The models of nursing care most frequently reported by nurses in medical-surgical wards in this study were patient allocation (91%) and team nursing (80%). Primary nursing and task-based models were unlikely to be practised. Skill mix, nurse experience, nursing workload and factors in the ward environment significantly influenced the model of care in use. Wards with a higher ratio of degree-qualified, experienced registered nurses working on their 'usual' ward were more likely to practise patient allocation, while wards with greater variability in staffing levels and skill mix were more likely to practise team nursing.
CONCLUSIONS
Models of care are not prescriptive but vary according to ward circumstances and staffing levels, based on complex clinical decision-making skills.
RELEVANCE TO CLINICAL PRACTICE
Variability in the models of care reported by ward nurses indicates that nurses adapt the model of nursing care on a daily or shift basis, according to patients' needs, skill mix and individual ward environments. |
Knowledge, Motivation, and Adaptive Behavior: A Framework for Improving Selling Effectiveness |
Hereditary medullary thyroid cancer in Slovenia – genotype-phenotype correlations | BACKGROUND: Medullary thyroid cancer (MTC) is a rare endocrine tumor that may be sporadic or inherited in settings of MEN2A, MEN2B and FMTC. Germline point mutations in the RET proto-oncogene are responsible for tumor occurrence, inheritance and great clinical variability. The aim of this study was to correlate the genotype and phenotype of patients with hereditary MTC (age at diagnosis, sex, TNM classification and clinical features). PATIENTS: Between 1997 and 2003 genetic testing was performed in 69 out of 98 patients with "sporadic" MTC. Carriage of mutation was found in 14 (20.2%) patients (index patients) and in 16 out of 31 (51.6%) of their relatives. One patient with MEN2B and codon 918 mutation was excluded from further analysis. METHODS: Genomic DNA was isolated from peripheral blood leukocytes. Exons 10, 11, 13, 14, 15 and 16 of the RET proto-oncogene were amplified in polymerase chain reactions. 
Point mutations of the RET gene were detected with single-strand conformation analysis and DNA sequencing. Detected mutations were confirmed with restriction enzyme analysis. RESULTS: Codon 634 mutations were detected in 15 patients (50%; aged 18–76 years; 6 families), codon 618 in nine patients (30%; aged 12–65 years; 4 families) and codon 790 in five patients (16.6%; aged 16–74 years; 3 families). The median age at diagnosis was 31 ± 17.3, 33 ± 15.9 and 36 ± 23.8 years for patients with codon 618, 634 and 790 mutations, respectively. Selected by sex, females with codon 618 and 634 mutations versus codon 790 mutations had a median age at diagnosis of 34.5 ± 15.6 years versus 43.5 ± 22.9 years, whereas the inverse was observed in males (26.5 ± 18.0 versus 16 years). The male/female ratio was 1:2 for patients with codon 618 and 634 mutations and 1:4 for patients with codon 790 mutations. Some of the data suggested a correlation between specific genotypes, tumor size, stage of MTC and age at diagnosis. Pheochromocytoma (12 out of 15 patients) and primary hyperparathyroidism (6 out of 15 patients) were diagnosed solely in patients with codon 634 mutations. One patient with FMTC and Hirschsprung disease was found in a family with codon 618 mutations. CONCLUSION: Correlations between tumor size, stage of MTC at diagnosis in view of patient's age, and specific genotype were indicated in our limited series and were more evident in female patients with codon 790 mutations. Later onset and a probably less aggressive course of MTC in these patients than in patients with other mutations should be considered in planning prophylactic thyroid surgery. MEN2A syndrome was related solely to codon 634 mutations. |
Itolizumab in combination with methotrexate modulates active rheumatoid arthritis: safety and efficacy from a phase 2, randomized, open-label, parallel-group, dose-ranging study | The objective of this study was to assess the safety and efficacy of itolizumab with methotrexate in active rheumatoid arthritis (RA) patients who had inadequate response to methotrexate. In this open-label, phase 2 study, 70 patients fulfilling American College of Rheumatology (ACR) criteria and negative for latent tuberculosis were randomized to four arms: 0.2, 0.4, or 0.8 mg/kg itolizumab weekly combined with oral methotrexate, and methotrexate alone (2:2:2:1). Patients were treated for 12 weeks, followed by 12 weeks of methotrexate alone during follow-up. Twelve weeks of itolizumab therapy was well tolerated. Forty-four patients reported adverse events (AEs); except for six severe AEs, all others were mild or moderate. Infusion-related reactions mainly occurred after the first infusion, and none were reported after the 11th infusion. No serum anti-itolizumab antibodies were detected. In the full analysis set, all itolizumab doses showed evidence of efficacy. At 12 weeks, 50 % of the patients achieved ACR20, and 58.3 % moderate or good 28-joint count Disease Activity Score (DAS-28) response; at week 24, these responses were seen in 22 and 31 patients. Significant improvements were seen in Short Form-36 Health Survey and Health Assessment Questionnaire Disability Index scores. Overall, itolizumab in combination with methotrexate was well tolerated and efficacious in RA for 12 weeks, with efficacy persisting for the entire 24-week evaluation period. (Clinical Trial Registry of India, http://ctri.nic.in/Clinicaltrials/login.php , CTRI/2008/091/000295). |
Optimum BMI Cut Points to Screen Asian Americans for Type 2 Diabetes | OBJECTIVE
Asian Americans manifest type 2 diabetes at low BMI levels but may not undergo diagnostic testing for diabetes if the currently recommended BMI screening cut point of ≥25 kg/m² is followed. We aimed to ascertain an appropriate lower BMI cut point among Asian-American adults without a prior diabetes diagnosis.
RESEARCH DESIGN AND METHODS
We consolidated data from 1,663 participants, ages ≥45 years, without a prior diabetes diagnosis, from population- and community-based studies, including the Mediators of Atherosclerosis in South Asians Living in America study, the North Kohala Study, the Seattle Japanese American Community Diabetes Study, and the University of California San Diego Filipino Health Study. Clinical measures included a 2-h 75-g oral glucose tolerance test, BMI, and glycosylated hemoglobin (HbA1c).
RESULTS
Mean age was 59.7 years, mean BMI was 25.4 kg/m², 58% were women, and type 2 diabetes prevalence (American Diabetes Association 2010 criteria) was 16.9%. At BMI ≥25 kg/m², sensitivity (63.7%), specificity (52.8%), and Youden index (0.16) values were low; limiting screening to BMI ≥25 kg/m² would miss 36% of Asian Americans with type 2 diabetes. For screening purposes, higher sensitivity is desirable to minimize missing cases, especially if the diagnostic test is relatively simple and inexpensive. At BMI ≥23 kg/m², sensitivity (84.7%) was high in the total sample and by sex and Asian-American subgroup and would miss only ∼15% of Asian Americans with diabetes.
CONCLUSIONS
The BMI cut point for identifying Asian Americans who should be screened for undiagnosed type 2 diabetes should be <25 kg/m², and ≥23 kg/m² may be the most practical. |
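The cut-point comparison above reduces to arithmetic on a 2×2 screening table. A minimal sketch, using illustrative counts chosen only to approximate the reported ≥25 kg/m² figures (they are not the study's actual data):

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and Youden index from 2x2 screening counts."""
    sensitivity = tp / (tp + fn)   # fraction of true cases detected
    specificity = tn / (tn + fp)   # fraction of non-cases correctly excluded
    youden = sensitivity + specificity - 1
    return sensitivity, specificity, youden

# Illustrative counts only (not the study's data): compare two BMI cut points.
for cut, (tp, fn, tn, fp) in {">=25": (179, 102, 730, 652),
                              ">=23": (238, 43, 520, 862)}.items():
    sens, spec, j = screening_metrics(tp, fn, tn, fp)
    print(f"BMI {cut}: sensitivity={sens:.2f}, specificity={spec:.2f}, Youden J={j:.2f}")
```

Youden's J (sensitivity + specificity - 1) summarizes how much better than chance a cut point performs; the abstract's trade-off is that lowering the cut point buys sensitivity at the cost of specificity.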
Standard operation procedures for conducting the on-the-road driving test, and measurement of the standard deviation of lateral position (SDLP) | This review discusses the methodology of the standardized on-the-road driving test and standard operation procedures to conduct the test and analyze the data. The on-the-road driving test has proven to be a sensitive and reliable method to examine driving ability after administration of central nervous system (CNS) drugs. The test is performed on a public highway in normal traffic. Subjects are instructed to drive with a steady lateral position and constant speed. Its primary parameter, the standard deviation of lateral position (SDLP), i.e., an index of 'weaving', is a stable measure of driving performance with high test-retest reliability. SDLP differences from placebo are dose-dependent, and do not depend on the subject's baseline driving skills (placebo SDLP). It is important that standard operation procedures are applied to conduct the test and analyze the data in order to allow comparisons between studies from different sites. |
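SDLP itself is just the standard deviation of the recorded lateral-position samples. A stdlib-only sketch with hypothetical traces (the real test records lateral position continuously over a long standardized highway drive; the numbers below are made up):

```python
import statistics

def sdlp(lateral_positions_cm):
    """Standard deviation of lateral position (SDLP), the 'weaving' index,
    computed over lateral-position samples recorded during the driving test."""
    return statistics.pstdev(lateral_positions_cm)

# Hypothetical lateral-position traces (cm from lane centre), not real test data.
placebo = [0.0, 5.0, -4.0, 3.0, -2.0, 4.0, -3.0, 1.0]
drug    = [0.0, 12.0, -10.0, 8.0, -7.0, 11.0, -9.0, 3.0]
print(f"placebo SDLP = {sdlp(placebo):.1f} cm, drug SDLP = {sdlp(drug):.1f} cm")
```

A CNS-drug effect shows up as an SDLP increase over the subject's own placebo value, which is why the measure is robust to baseline driving-skill differences.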
Building and Employing Cross-Reality | Considers present and future practical applications of cross-reality. From tools to build new 3D virtual worlds to the products of those tools, cross-reality is becoming a staple of our everyday reality. Practical applications of cross-reality include the ability to virtually visit a factory to manage and maintain resources from the comfort of your laptop or desktop PC, as well as sentient visors that augment reality with additional information so that users can make more informed choices. Tools and projects considered are: Project Wonderland for multiuser mixed reality; ClearWorlds: mixed-reality presence through virtual clearboards; VICI (Visualization of Immersive and Contextual Information) for ubiquitous augmented reality based on a tangible user interface; Mirror World Chocolate Factory; and sentient visors for browsing the world. |
Effect of a low-fat fish oil diet on proinflammatory eicosanoids and cell-cycle progression score in men undergoing radical prostatectomy. | We previously reported that a 4- to 6-week low-fat fish oil (LFFO) diet did not affect serum insulin-like growth factor (IGF)-1 levels (primary outcome) but resulted in lower omega-6 to omega-3 fatty acid ratios in prostate tissue and lower prostate cancer proliferation (Ki67) as compared with a Western diet. In this post hoc analysis, the effect of the LFFO intervention on serum pro-inflammatory eicosanoids, leukotriene B4 (LTB4) and 15-S-hydroxyeicosatetraenoic acid [15(S)-HETE], and the cell-cycle progression (CCP) score were investigated. Serum fatty acids and eicosanoids were measured by gas chromatography and ELISA. CCP score was determined by quantitative real-time reverse transcriptase PCR (RT-PCR). Associations between serum eicosanoids, Ki67, and CCP score were evaluated using partial correlation analyses. BLT1 (LTB4 receptor) expression was determined in prostate cancer cell lines and prostatectomy specimens. Serum omega-6 fatty acids and 15(S)-HETE levels were significantly reduced, and serum omega-3 levels were increased in the LFFO group relative to the Western diet group, whereas there was no change in LTB4 levels. The CCP score was significantly lower in the LFFO compared with the Western diet group. The 15(S)-HETE change correlated with tissue Ki67 (r = 0.48; P < 0.01) but not with CCP score. The LTB4 change correlated with the CCP score (r = 0.4; P = 0.02) but not with Ki67. The LTB4 receptor BLT1 was detected in prostate cancer cell lines and human prostate cancer specimens. In conclusion, an LFFO diet resulted in decreased 15(S)-HETE levels and lower CCP score relative to a Western diet. Further studies are warranted to determine whether the LFFO diet antiproliferative effects are mediated through the LTB4/BLT1 and 15(S)-HETE pathways. |
Affective and Content Analysis of Online Depression Communities | A large number of people use online communities to discuss mental health issues, offering opportunities for new understanding of these communities. This paper studies the characteristics of online depression communities (CLINICAL) in comparison with other online communities (CONTROL). We use machine learning and statistical methods to discriminate online messages between depression and control communities using mood, psycholinguistic processes and content topics extracted from the posts generated by members of these communities. All aspects, including mood, written content and writing style, are found to differ significantly between the two types of communities. Sentiment analysis shows that the clinical group has lower valence than the control group. For language styles and topics, statistical tests reject the hypothesis of equality of psycholinguistic processes and topics between the two groups. We show good predictive validity in depression classification using topics and psycholinguistic cues as features. Clear discrimination between writing styles and contents, with good predictive power, is an important step in understanding social media and its use in mental health. |
Human fall detection via shape analysis on Riemannian manifolds with applications to elderly care | This paper addresses issues in fall detection from videos. The focus is on the analysis of human shapes which deform drastically in camera views while a person falls onto the ground. A novel approach is proposed that performs fall detection from an arbitrary view angle, via shape analysis on a unified Riemannian manifold for different camera views. The main novelties of this paper include: (a) representing dynamic shapes as points moving on a unit n-sphere, one of the simplest Riemannian manifolds; (b) characterizing the deformation of shapes by computing velocity statistics of their corresponding manifold points, based on geodesic distances on the manifold. Experiments have been conducted on two publicly available video datasets for fall detection. Test, evaluations and comparisons with 6 existing methods show the effectiveness of our proposed method. |
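The manifold machinery in this abstract is concrete for the unit n-sphere: geodesic distance is the angle between points, and the velocity statistics are geodesic steps divided by the frame interval. A toy sketch in which the shape vectors, frame rate, and shape-to-vector mapping are all assumptions for illustration, not the paper's pipeline:

```python
import math

def normalize(v):
    """Project a shape feature vector onto the unit sphere."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def geodesic_distance(p, q):
    """Great-circle (geodesic) distance between two points on the unit n-sphere."""
    dot = sum(a * b for a, b in zip(p, q))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp for numerical safety

# Two toy 'shape' points on the sphere; the velocity of the manifold trajectory
# between consecutive frames is geodesic distance over the frame interval.
s1 = normalize([1.0, 0.2, 0.1])
s2 = normalize([0.2, 1.0, 0.3])
dt = 1 / 25  # assumed 25 fps
velocity = geodesic_distance(s1, s2) / dt
print(f"geodesic distance = {geodesic_distance(s1, s2):.3f} rad, velocity = {velocity:.1f} rad/s")
```

A fall produces an abrupt shape deformation, so the geodesic-velocity statistics spike, which is what the detector thresholds on.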
Financial Liberalization and Stability of the Financial System in Emerging Markets: The Institutional Dimension of Financial Crises | Emerging economies, which have implemented a process of financial liberalization since the end of the 1980s, are at the same time confronted with banking crises. These crises highlight the role played by the institutional framework in the process of financial liberalization. The objective of this paper is to move beyond the usual too-much/too-little-market alternative and explain that the success of any liberalization process relies on the complementarity between market and intermediation. The point is that the solution to financial instability is to be found within the institutional dynamics in which emerging economies may benefit from intermediation in order to strengthen the market process. |
Security and Privacy in Cloud Computing: A Survey | Cloud computing can be defined as the management and provision of different resources, such as software, applications and information, as services over the cloud (Internet) on demand. Cloud computing is based on the assumption that information can be quickly and easily accessed via the net. With its ability to provide dynamically scalable access for users and to share resources over the Internet, cloud computing has recently emerged as a promising hosting platform that makes intelligent use of a collection of services, applications, information and infrastructure comprised of pools of computing, network, information and storage resources. Cloud computing is a multi-tenant resource-sharing platform, which allows different service providers to deliver software as services in an economical way. Cloud computing is the latest technology revolution in the usage and management of IT resources and services, driven largely by marketing and service offerings from the largest IT vendors, including Google [26], IBM [19], Microsoft, and HP, along with Amazon [17, 20, 24] and VMWare. However, along with these advantages, storing large amounts of data, including critical information, on the cloud attracts highly skilled hackers; security is therefore considered one of the top issues in cloud computing. In this paper, we first explain the security model of cloud computing, and then analyze the feasibility, threats, and security of cloud computing in terms of the extensive existing methods to control them, along with their pros and cons. After that, related open research problems and challenges are explored to promote the development of cloud computing. Keywords: Cloud Computing; Fault Tolerance; Security and Privacy; Services |
Information Security Policy Compliance in Higher Education: A Neo-Institutional Perspective | External pressures can be a powerful force driving institutions of higher education to attain information security policy compliance. Drawing on Neo-Institutional Theory (NIT), this study examined how three external expectations (regulative, normative, and cognitive) impel higher education institutions in the United States to reach information security policy compliance. The research findings suggest that regulatory and social normative pressures, but not cognitive pressure, have significant effects on information security policy compliance in higher education. Based on these results, this study unfolds both the practical and research implications. |
Solving the Permutation Problem of Frequency-Domain BSS when Spatial Aliasing Occurs with Wide Sensor Spacing | This paper describes a method for solving the permutation problem of frequency-domain blind source separation (BSS). The method analyzes the mixing system information estimated with independent component analysis (ICA). When we use widely spaced sensors or increase the sampling rate, spatial aliasing may occur for high frequencies due to the possibility of multiple cycles in the sensor spacing. In such cases, the estimated information would imply multiple possibilities for a source location. This causes some difficulty when analyzing the information. We propose a new method designed to overcome this difficulty. This method first estimates the model parameters for the mixing system at low frequencies where spatial aliasing does not occur, and then refines the estimations by using data at all frequencies. This refinement leads to precise parameter estimation and therefore precise permutation alignment. Experimental results show the effectiveness of the new method |
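The spatial-aliasing condition the paper works around can be stated numerically: the inter-sensor phase difference is unambiguous only below f_max = c / (2d) for sensor spacing d and propagation speed c, so wide spacing shrinks the alias-free band. A small sketch with illustrative spacings:

```python
def max_alias_free_frequency(sensor_spacing_m, sound_speed_m_s=343.0):
    """Highest frequency at which the inter-sensor phase difference is
    unambiguous (no spatial aliasing): f_max = c / (2 d)."""
    return sound_speed_m_s / (2.0 * sensor_spacing_m)

# With widely spaced sensors the alias-free band shrinks, which is why the
# mixing-model parameters are estimated at low frequencies first, then refined.
for d in (0.04, 0.20, 1.0):  # illustrative spacings in metres
    print(f"spacing {d*100:.0f} cm -> alias-free below {max_alias_free_frequency(d):.0f} Hz")
```

At 4 cm spacing the whole speech band is alias-free, while at 1 m only frequencies below about 170 Hz are unambiguous, matching the paper's motivation for the low-frequency-first estimation strategy.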
More Dengue, More Questions | Dengue is epidemic or endemic in virtually every country in the tropics; it is even cited in the Guinness World Records, 2002, as the world's most important viral hemorrhagic fever and the most geographically widespread of the arthropodborne viruses. As illustrated in this issue of Emerging Infectious Diseases, dengue epidemics are expanding rapidly, as is the literature on the subject. That dengue was transmitted in the United States for nearly 1 year in 2001 (1) should serve as a wake-up call to Americans, most of whom are ignorant of the threat of this disease. Both the major dengue vectors, Aedes aegypti and Ae. albopictus, are widely distributed in the continental United States. This emerging disease continues to baffle and challenge epidemiologists and clinicians. Despite endemicity of 3 or more different dengue viruses, why does severe dengue occur in some populations and not in others? Why are children principally affected in some areas and adults in others? How can severe dengue reliably be recognized early enough to permit appropriate therapy to be applied? Recent studies point in the direction of answers to these questions. During an infection with any of the 4 dengue viruses, the principal threat to human health resides in the ability of the infecting virus to produce an acute febrile syndrome characterized by clinically significant vascular permeability, dengue hemorrhagic fever (DHF). However, because at onset vascular permeability exhibits only subtle changes, how can a diagnosis be made early enough to begin life-saving intravenous treatment? In persons with light skin color, the standard sphygmomanometer cuff tourniquet test has been widely used to screen children in outpatient settings; a positive test result is an early warning of incipient DHF. Because of genetic diversity among humans, the tourniquet test as a screening tool requires widespread evaluation and validation.
In a prospective study of 1,136 Vietnamese children with serologically confirmed overt dengue infections, the tourniquet test had a sensitivity of 41.6%, a specificity of 94.4%, a positive predictive value of 98.3%, and a negative predictive value of 17.3% (2). A positive result should prompt close observation, but a negative result does not exclude an ongoing dengue infection. A more robust screening test could result from the studies of Wills et al., who measured the size and charge characteristics of proteins leaking through the endothelial sieve in DHF patients (3). Such changes are caused, presumably, by a cytokine cascade, as yet … |
Learning Visual Models from Shape Contours Using Multiscale Convex/Concave Structure Matching | A novel approach is proposed for learning a visual model from real shape samples of the same class. The visual model means a prototype shape that contains the salient shape features of the class. Conventional methods cannot form visual models because they are based on the generalization of structural descriptions with symbolic representations. Our approach can directly acquire a visual model by generalizing the multiscale convex/concave structure of a class of shapes; that is, the approach is based on the concept that shape generalization is shape simplification wherein perceptually relevant features are retained. The simplification does not mean the approximation of shapes but rather the extraction of the optimum-scale convex/concave structure common to shape samples of the class. The common structure is obtained by applying our multiscale convex/concave structure-matching method to all shape pairs among given shape samples of the class and by integrating the matching results. The matching method, which is newly proposed in this paper, is applicable to heavily deformed shapes and is effectively implemented with dynamic programming techniques. Our approach can acquire a visual model from a few samples without any a priori knowledge of the class. The obtained model is very useful for shape recognition. Therefore, a flexible shape recognition system having a learning function can be constructed. The results of applying the proposed method to synthetic and real shapes show that the method is useful for learning and shape recognition. Index Terms: Dynamic programming, multiscale convex/concave structure, shape generalization, shape learning, shape matching, visual model generation. |
Implementation of RSA 2048-bit and AES 256-bit with digital signature for secure electronic health record application | This research addresses the implementation of encryption and digital signature techniques for electronic health records to prevent cybercrime problems such as theft, modification and unauthorized access. In this research, the RSA 2048-bit algorithm, AES 256-bit and SHA-256 are implemented in Java. The Secure Electronic Health Record (SEHR) application design is intended to combine the services of confidentiality, integrity, authentication, and non-repudiation. Cryptography is used to protect the file records and electronic documents containing detailed information on the medical past, present and future forecasts that have been given only to the patients. The documents are encrypted using an encryption algorithm based on the NIST Standard. The application has two schemes, namely a protection scheme and a verification scheme. This research uses black-box and white-box testing to test the software input, output, and code without testing the process and design that occur in the system. We demonstrate the implementation of cryptography in SEHR, and show that the encryption and digital signature implemented here can prevent record theft, as confirmed by the tests. |
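The paper's scheme combines RSA-2048 signatures, AES-256 encryption, and SHA-256 digests in Java; those asymmetric primitives need a crypto library, so the stdlib-only Python sketch below illustrates just the protect/verify integrity step, with HMAC-SHA-256 standing in for the RSA signature (an assumption for illustration, not the paper's exact construction):

```python
import hashlib
import hmac
import os

def protect(record: bytes, key: bytes) -> bytes:
    """Integrity tag for a health record. HMAC-SHA-256 stands in here for the
    RSA-2048 signature used in the paper (stdlib-only sketch)."""
    return hmac.new(key, record, hashlib.sha256).digest()

def verify(record: bytes, key: bytes, tag: bytes) -> bool:
    """Constant-time check that the record matches its tag."""
    return hmac.compare_digest(protect(record, key), tag)

key = os.urandom(32)                       # 256-bit key, mirroring AES-256 key size
record = b"patient: Jane Doe; dx: ..."     # toy record, not real data
tag = protect(record, key)
assert verify(record, key, tag)            # untouched record verifies
assert not verify(record + b"x", key, tag) # any modification is detected
```

This captures the verification scheme's guarantee (any tampering with the stored record invalidates the tag); a real deployment would sign the SHA-256 digest with the RSA private key so that verification needs only the public key.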
Phase I trial of nelarabine in indolent leukemias. | PURPOSE
To test whether nelarabine is an effective agent for indolent leukemias and to evaluate whether there is a relationship between cellular pharmacokinetics of the analog triphosphate and clinical responses.
PATIENTS AND METHODS
Thirty-five patients with relapsed/refractory leukemias (n = 24, B-cell chronic lymphocytic leukemia and n = 11, T-cell prolymphocytic leukemia) were entered onto three different protocols. For schedule A, patient received nelarabine daily for 5 days, whereas for schedule B, nelarabine was administered on days 1, 3, and 5. Schedule C was similar to schedule B except that fludarabine was also infused. Plasma and cellular pharmacokinetics were studied during the first cycle.
RESULTS
Responses were achieved in 20%, 15%, and 63% of patients receiving schedule A, B, and C, respectively. Histologic category, number of prior therapies, and fludarabine refractoriness did not influence the response rate. The most common nonhematologic toxicity was peripheral neuropathy. Grade 4 neutropenia and thrombocytopenia complicated 23% and 26% of courses respectively, and were significantly more frequent among patients with pre-existing marrow failure. Pharmacokinetics of plasma nelarabine and arabinosylguanine (ara-G) and of cellular ara-G triphosphate (ara-GTP) were similar in the two groups of diagnoses, and the elimination of ara-GTP from leukemia cells was slow (median, > 24 hours). The median peak intracellular concentrations of ara-GTP were significantly different (P = .0003) between responders (440 micromol/L; range, 35 to 1,438 micromol/L; n = 10) and nonresponders (50 micromol/L; range, 22 to 178 micromol/L; n = 15).
CONCLUSION
Nelarabine is an effective regimen against indolent leukemias, and combining it with fludarabine was most promising. Determination of tumor cell ara-GTP levels may provide a predictive test for response to nelarabine. |
Perfusion-weighted MRI to evaluate cerebral autoregulation in aneurysmal subarachnoid haemorrhage | The aim of this study was to evaluate autoregulatory mechanisms in different vascular territories within the first week after aneurysmal subarachnoid haemorrhage (SAH) by perfusion-weighted magnetic resonance imaging (PW-MRI). For this purpose, regional cerebral blood flow and volume (rCBF and rCBV) were measured in relation to different degrees of angiographically visible cerebral vasospasm (CVS). In 51 SAH patients, PW-MRI and digital subtraction angiography were performed about 5 days after onset of SAH. Regional CBF and rCBV were analysed in the territories of the anterior cerebral artery (ACA), the middle cerebral artery (MCA) and the basal ganglia of each hemisphere in relationship to the degree of CVS in the particular territory. Correlations between rCBF, rCBV and CVS were analysed. CVS was found in 22 out of 51 patients in at least one territory. In all territories, rCBV decreased with increasing degree of CVS, correlated with a decrease of rCBF. In the ACA territories, SAH patients with severe CVS had significantly lower rCBF compared to healthy subjects and to SAH patients without CVS. In the basal ganglia, rCBF and rCBV of the control group were significantly higher compared to the patients without and with moderate vasospasms. PW-MRI showed simultaneous decrease of rCBF and rCBV in patients with SAH. The fact that rCBV did not increase in territories with CVS to maintain rCBF reveals dysfunctional vascular autoregulation. Vasospasms in the microvasculature are most evident in the basal ganglia, showing decreased rCBV and rCBF even in SAH patients without CVS. |
Evaluating different query reformulation techniques for the Geographical Information Retrieval task considering geospatial entities as textual terms | Geographic Information Retrieval (GIR) is an active and growing research area that focuses on the retrieval of textual documents according to a geographical criterion of relevance. However, since a GIR system can be treated as a traditional Information Retrieval (IR) system, it is important to find effective methods for query reformulation, so as to improve the quality and recall of the search results. In this paper, we propose different Natural Language Processing (NLP) techniques of query reformulation based on the modification and/or expansion of both the thematic and the geospatial parts usually recognized in a geographical query. We have evaluated each of the proposed reformulations using GeoCLEF as an evaluation framework for GIR systems. The results obtained show that all proposed query reformulations retrieved relevant documents that were not retrieved using the original query. |
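A query reformulation of the kind evaluated here splits a geographic query into thematic and geospatial parts and expands each. A toy sketch in which the synonym table and gazetteer entries are invented placeholders for the paper's NLP resources:

```python
# Toy lookup tables standing in for the paper's NLP resources (illustrative only).
THEMATIC_SYNONYMS = {"flood": ["flooding", "inundation"]}
GEO_EXPANSIONS = {"bavaria": ["munich", "nuremberg"]}  # hypothetical gazetteer entries

def reformulate(query: str) -> str:
    """Expand both the thematic and the geospatial part of a geographic query."""
    terms = []
    for t in query.lower().split():
        terms.append(t)                              # keep the original term
        terms.extend(THEMATIC_SYNONYMS.get(t, []))   # thematic expansion
        terms.extend(GEO_EXPANSIONS.get(t, []))      # geospatial expansion
    return " ".join(terms)

print(reformulate("flood bavaria"))
# -> "flood flooding inundation bavaria munich nuremberg"
```

Because the expanded query still contains every original term, it can only add matching documents, which is consistent with the recall gains the paper reports.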
Machine learning: Trends, perspectives, and prospects | Machine learning addresses the question of how to build computers that improve automatically through experience. It is one of today’s most rapidly growing technical fields, lying at the intersection of computer science and statistics, and at the core of artificial intelligence and data science. Recent progress in machine learning has been driven both by the development of new learning algorithms and theory and by the ongoing explosion in the availability of online data and low-cost computation. The adoption of data-intensive machine-learning methods can be found throughout science, technology and commerce, leading to more evidence-based decision-making across many walks of life, including health care, manufacturing, education, financial modeling, policing, and marketing. |
Diabetes Research in Iran: a Scientometric Analysis of Publications Output | INTRODUCTION
In developing countries, chronic diseases such as diabetes mellitus have replaced infectious diseases as the main causes of morbidity and mortality. The International Diabetes Federation (IDF) recently estimated that 382 million people have diabetes globally, with more than 34.6 million in the Middle East Region, a number expected to increase to 67.9 million by 2035. The aim of this study was to analyze Iran's research performance on diabetes in national and international context.
METHODS
This scientometric analysis is based on Iranian publication data in diabetes research retrieved from the Scopus citation database up to the end of 2014. The search string was built from the keyword "diabetes" in the title, abstract and keywords fields, combined with Iran in the affiliation field.
RESULTS
Iran's cumulative publication output in diabetes research consisted of 4425 papers from 1968 to 2014, with an average of 96.2 papers per year and an annual average growth rate of 25.5%. Iran ranked 25th among the top 25 countries, with 4425 papers and a global share of 0.72%. Iranian publications averaged 6.19 citations per paper, and the average citation per paper in diabetes research increased from 1.63 during 1968-1999 to 10.42 in 2014.
CONCLUSIONS
Although the diabetic population of Iran is increasing, the country's diabetes research output remains modest. The International Diabetes Federation has suggested increased funding for diabetes research in Iran for cost-effective diabetes prevention and treatment. In addition to the universal and comprehensive services for diabetes care and treatment provided by the Iranian health care system, Iranian policy makers should invest more in diabetes research. |
Genome editing with engineered zinc finger nucleases | Reverse genetics in model organisms such as Drosophila melanogaster, Arabidopsis thaliana, zebrafish and rats, efficient genome engineering in human embryonic stem and induced pluripotent stem cells, targeted integration in crop plants, and HIV resistance in immune cells — this broad range of outcomes has resulted from the application of the same core technology: targeted genome cleavage by engineered, sequence-specific zinc finger nucleases followed by gene modification during subsequent repair. Such 'genome editing' is now established in human cells and a number of model organisms, thus opening the door to a range of new experimental and therapeutic possibilities. |
Plant Disease Detection using Image Processing - A Review | This paper surveys the classification of plant leaf diseases using image processing. Digital image processing has three basic steps: image processing, analysis and understanding. Image processing covers the preprocessing of the plant leaf image, including segmentation, color extraction, disease-specific data extraction and filtering. Image analysis generally deals with the classification of diseases. Plant leaves can be classified based on their morphological features with the help of various classification techniques such as PCA, SVM, and neural networks. These classifiers draw on various properties of the leaf, such as color, intensity and dimensions. Backpropagation (BP) is the most commonly used neural network training method; it provides many learning, training and transfer functions with which various BP networks can be constructed, with characteristic features serving as the performance parameters for image recognition. BP networks show very good results in the classification of grape leaf diseases. This paper provides an overview of different image processing techniques, along with BP networks, used in leaf disease classification. |
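The color and intensity features mentioned above can be illustrated without any imaging library: per-channel means plus overall intensity from raw RGB pixels, the kind of descriptor that would then feed an SVM or BP network. A toy sketch in which the patch and feature choice are purely illustrative:

```python
def color_features(pixels):
    """Per-channel RGB means plus overall intensity from a flat pixel list,
    a minimal color/intensity descriptor for leaf-disease classification."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]       # R, G, B means
    intensity = sum(sum(p) / 3 for p in pixels) / n                 # mean brightness
    return means + [intensity]

# A tiny synthetic 'leaf patch': mostly green pixels with one brown (lesion-like) pixel.
patch = [(40, 160, 50)] * 3 + [(120, 80, 40)]
feats = color_features(patch)
print([round(f, 1) for f in feats])  # -> [60.0, 140.0, 47.5, 82.5]
```

A diseased patch shifts the red mean up and the green mean down relative to healthy tissue, which is the signal such features hand to the downstream classifier.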
A patient decision aid for antidepressant use in pregnancy: study protocol for a randomized controlled trial | BACKGROUND
Many women with depression experience significant difficulty making a decision about whether or not to use antidepressant medication in pregnancy. Patient decision aids (PDAs) are tools that assist patients in making complex health decisions. PDAs can reduce decision-making difficulty and lead to better treatment outcomes. We describe the methods for a pilot randomized controlled trial of an interactive web-based PDA for women who are having difficulty deciding about antidepressant drug use in pregnancy.
METHODS/DESIGN
This is a pilot randomized controlled trial that aims to assess the feasibility of a larger, multi-center efficacy study. The PDA aims to help a woman: (1) understand why an antidepressant is being recommended, (2) be knowledgeable about potential benefits and risks of treatment and non-treatment with antidepressants, and (3) be clear about which benefits and risks are most important to her, with the goal of improving confidence in her decision-making. We include women aged 18 years or older who are: (1) planning a pregnancy or are pregnant (gestational age less than 30 weeks), (2) diagnosed with major depressive disorder, (3) deciding whether or not to use a selective serotonin reuptake inhibitor (SSRI) or serotonin norepinephrine reuptake inhibitor (SNRI) antidepressant in pregnancy, and (4) having at least moderate decision-making difficulty as per a Decisional Conflict Scale (DCS) Score ≥25. Participants are randomized to receive the PDA or an informational resource sheet via a secure website, and have access to the stated allocation until their final study follow-up. The primary outcomes of the pilot study are feasibility of recruitment and retention, acceptability of the intervention, and adherence to the trial protocol. The primary efficacy outcome is DCS score at 4 weeks post randomization, with secondary outcomes including depressive and anxiety symptoms.
DISCUSSION
Our PDA represents a key opportunity to optimize the decision-making process for women around antidepressants in pregnancy, leading to effective decision-making and optimizing improved maternal and child outcomes related to depression in pregnancy. The electronic nature of the PDA will facilitate keeping it up-to-date, and allow for widespread dissemination after efficacy is demonstrated.
TRIAL REGISTRATION
This trial is registered on ClinicalTrials.Gov under the identifier NCT02308592 (first registered: 2 December 2014). |
Risk management by a neoliberal state : construction of new knowledge through lifelong learning in Japan | This article examines the current developments in Japan's lifelong learning policy and practices. I argue that promoting lifelong learning is an action that manages the risks of governance for the ... |
Offense-related characteristics and psychosexual development of juvenile sex offenders | OBJECTIVE
This article reports on offense-related characteristics and the psychosexual development of subgroups of juvenile sex offenders, as measured by the Global Assessment Instrument for Juvenile Sex Offenders (GAIJSO). The predictive validity of these characteristics for persistent (sexual) offensive behavior in subgroups of juvenile sex offenders was also investigated.
METHODS
One hundred seventy-four juvenile sex offenders (mean age 14.9 years, SD 1.4) referred by the police to the Dutch Child Protection Board were examined. Offense-related characteristics were assessed by means of the GAIJSO and the BARO (a global assessment tool for juvenile delinquents), and the criminal careers of the subjects were ascertained from official judicial records.
RESULTS
A serious need for comprehensive diagnostics was found in the domains of sexual offense and psychosexual development in juvenile sex offenders, especially in the group of child molesters. These youngsters displayed more internalizing and (psychosexual) developmental problems, and their sexual offenses were more alarming compared with the other juvenile sex offender subgroups. Although one third of the juveniles had already committed one or more sex offenses prior to the index offense, at follow-up (mean follow-up period: 36 months, SD 18 months) almost no sexual recidivism was found (0.6% of the entire sample). However, a substantial proportion of the entire sample of juvenile sex offenders showed non-sexual (55.6%) and violent (32.1%) recidivism. Several predictors of a history of multiple sex offending and of non-sexual recidivism were identified.
CONCLUSION
This study revealed numerous problems in juvenile sex offenders. Assessment using the GAIJSO is helpful for identifying indicators that warrant extensive diagnostic assessment. A longer follow-up period is necessary to investigate the predictive validity for sexual reoffending. |
The odyssey of the Cache Creek terrane, Canadian Cordillera: Implications for accretionary orogens, tectonic setting of Panthalassa, the Pacific superswell, and break-up of Pangea | The Cache Creek terrane (CCT) of the Canadian Cordillera consists of accreted seamounts that originated adjacent to the Tethys Ocean in the Permian. We utilize Potential Translation Path plots to place quantitative constraints on the location of the CCT seamounts through time, including limiting the regions within which accretion events occurred. We assume a starting point for the CCT seamounts in the easternmost Tethys at 280 Ma. Using reasonable translation rates (11 cm/a), accretion to the Stikinia–Quesnellia oceanic arc, which occurred at about 230 Ma, took place in western Panthalassa, consistent with the mixed Tethyan fauna of the arc. Subsequent collision with a continental terrane, which occurred at about 180 Ma, took place in central Panthalassa, >4000 km west of North America, yielding a composite ribbon continent. Westward subduction of oceanic lithosphere continuous with the North American continent from 180 to 150 Ma facilitated docking of the ribbon continent with the North American plate. The paleogeographic constraints provided by the CCT indicate that much of the Canadian Cordilleran accretionary orogen is exotic. The accreting crustal block, a composite ribbon continent, grew through repeated collisional events within Panthalassa prior to docking with the North American plate. CCT's odyssey requires the presence of subduction zones within Panthalassa and indicates that the tectonic setting of the Panthalassa superocean differed substantially from the current Pacific basin, with its central spreading ridge and marginal outward-dipping subduction zones. A substantial volume of oceanic lithosphere was subducted during CCT's transit of Panthalassa.
Blanketing of the core by these cold oceanic slabs enhanced heat transfer out of the core into the lowermost mantle, and may have been responsible for the Cretaceous Normal Superchron, the coeval Pacific-centred mid-Cretaceous superplume event, and its lingering progeny, the Pacific Superswell. Far-field tensile stress attributable to the pull of the slab subducting beneath the ribbon continent from 180 to 150 Ma instigated the opening of the Atlantic, initiating the dispersal phase of the supercontinent cycle by breaking apart Pangea. Docking of the ribbon continent with the North American plate at 150 Ma terminated the slab-pull-induced stress, resulting in a drastic reduction in the rate of spreading within the growing Atlantic Ocean. |
Treatment-related parameters predicting efficacy of Lym-1 radioimmunotherapy in patients with B-lymphocytic malignancies. | This study was designed to evaluate dosimetric, pharmacokinetic, and other treatment-related parameters as predictors of outcome in patients with advanced B-lymphocytic malignancies. Fifty-seven patients were treated with radiolabeled Lym-1 antibody in early phase trials between 1985 and 1994. Logistic regression and proportional hazards models were used to evaluate treatment parameters for their ability to predict outcome, taking into account patient risk group based on Karnofsky performance status and serum lactic dehydrogenase. The occurrence of a partial or complete response (31 of 57 patients) and development of human antimouse antibody (HAMA) predicted improved survival using a time-dependent proportional hazards model. The final multivariate model for survival with parameters significant at P ≤ 0.05 included overall response and pretreatment risk group. Although some of the dosimetric and pharmacokinetic parameters were predictive in univariate analyses, only longer half-time of radionuclide in the blood showed any indication of improved prediction beyond that provided by the lactic dehydrogenase/Karnofsky performance status-based risk groups. Splenic volume, splenectomy, and malignant tissue Lym-1 reactivity were not contributory. In this patient group, the effect of radiolabeled Lym-1 treatment as indicated by measurable tumor response was associated with improved survival. Development of HAMA was also associated with improved survival, indicating that concern about HAMA should not preclude exploration of radioimmunotherapy. Although dosimetry has a role in determining safety based on dose to normal organs, when adjusted for baseline clinical features, dosimetric and pharmacokinetic parameters showed limited ability to improve outcome prediction. |
Four Quadrant Operation of BLDC Motor in MATLAB/SIMULINK | Brushless Direct Current (BLDC) motors are one of the motor types most rapidly gaining popularity. BLDC motors are used in industries such as appliances, automotive, aerospace, medical, industrial automation equipment and instrumentation. BLDC motors with matching servo amplifiers offer more features than other motion control systems, such as a broader speed range, operation in special environments and mechanical advantages. Brushless DC motors and servo amplifiers therefore present a complete line of compatible, state-of-the-art brushless system components. In this paper, the modeling of a brushless DC motor drive system, along with a control system for speed and current, is presented using MATLAB/SIMULINK. To evaluate the model, various simulation case studies are carried out. The test results show that the model's performance is satisfactory. |
The Ideological Effects of Framing Threat on Immigration and Civil Liberties | Assuming that migration threat is multi-dimensional, this article seeks to investigate how various types of threats associated with immigration affect attitudes towards immigration and civil liberties. Through experimentation, the study unpacks the ‘securitization of migration’ discourse by disaggregating the nature of immigration threat, and its impact on policy positions and ideological patterns at the individual level. Based on framing and attitudinal analysis, we argue that physical security in distinction from cultural insecurity is enough to generate important ideological variations stemming from strategic input (such as framing and issue-linkage). We expect then that as immigration shifts from a cultural to a physical threat, immigration issues may become more politically salient but less politicized and subject to consensus. Interestingly, however, the findings reveal that the effects of threat framing are not ubiquitous, and may be conditional upon ideology. Liberals were much more susceptible to the frames than were conservatives. Potential explanations for the ideological effects of framing, as well as their implications, are explored. |
Multi-view Sparse Co-clustering via Proximal Alternating Linearized Minimization |
Forwarding metamorphosis: fast programmable match-action processing in hardware for SDN | In Software Defined Networking (SDN) the control plane is physically separate from the forwarding plane. Control software programs the forwarding plane (e.g., switches and routers) using an open interface, such as OpenFlow. This paper aims to overcome two limitations in current switching chips and the OpenFlow protocol: i) current hardware switches are quite rigid, allowing ``Match-Action'' processing on only a fixed set of fields, and ii) the OpenFlow specification only defines a limited repertoire of packet processing actions. We propose the RMT (reconfigurable match tables) model, a new RISC-inspired pipelined architecture for switching chips, and we identify the essential minimal set of action primitives to specify how headers are processed in hardware. RMT allows the forwarding plane to be changed in the field without modifying hardware. As in OpenFlow, the programmer can specify multiple match tables of arbitrary width and depth, subject only to an overall resource limit, with each table configurable for matching on arbitrary fields. However, RMT allows the programmer to modify all header fields much more comprehensively than in OpenFlow. Our paper describes the design of a 64-port, 10 Gb/s switch chip implementing the RMT model. Our concrete design demonstrates, contrary to concerns within the community, that flexible OpenFlow hardware switch implementations are feasible at almost no additional cost or power. |
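The match-table abstraction can be illustrated with a toy software pipeline stage in which the fields matched and the action applied are configuration data rather than fixed in hardware. This is a conceptual sketch of the idea only, not the RMT chip design or the OpenFlow API:

```python
# Toy reconfigurable match-action stage (conceptual sketch only):
# the matched fields and the rule table are data, not hardware-fixed.
def make_stage(match_fields, rules, default):
    def stage(packet):
        key = tuple(packet[f] for f in match_fields)
        return rules.get(key, default)(packet)
    return stage

def drop(packet):
    return None  # discard the packet

def set_out_port(port):
    def action(packet):
        return dict(packet, out_port=port)  # rewrite a header field
    return action

# "Reconfigured in the field": this stage matches on an arbitrary field pair
stage = make_stage(
    match_fields=("dst_ip", "proto"),
    rules={("10.0.0.2", "tcp"): set_out_port(3)},
    default=drop,
)

pkt = {"dst_ip": "10.0.0.2", "proto": "tcp"}
print(stage(pkt))  # {'dst_ip': '10.0.0.2', 'proto': 'tcp', 'out_port': 3}
```

A hardware pipeline would of course implement the table with TCAM/SRAM and fixed-latency action units; the sketch only shows the programming model.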
Exact Synthesis and Implementation of New High-Order Wideband Marchand Baluns | New ultra-wideband high-order Marchand baluns with one microstrip unbalanced port and two microstrip balanced ports are proposed in this paper. The proposed Marchand baluns are synthesized based on an -plane high-pass prototype using the Richards' transformation. The responses of the synthesized high-order Marchand baluns are exactly predicted at all real frequencies. All circuit elements are commensurate, which means the electrical lengths of all transmission line elements are a quarter-wavelength long at the center frequency. Two fifth-order Marchand baluns with reflection coefficients of 20.53 and 21.71 dB corresponding to 131% and 152% bandwidth, respectively, are synthesized and realized using the combinations of microstrip lines, slotlines, and coplanar striplines. Simulated and measured results are presented. |
Radiostereometric analysis study of tantalum compared with titanium acetabular cups and highly cross-linked compared with conventional liners in young patients undergoing total hip replacement. | BACKGROUND
Radiostereometric analysis provides highly precise measurements of component micromotion relative to the bone that is otherwise undetectable by routine radiographs. This study compared, at a minimum of five years following surgery, the micromotion of tantalum and titanium acetabular cups and femoral head penetration in highly cross-linked polyethylene liners and conventional (ultra-high molecular weight polyethylene) liners in active patients who had undergone total hip replacement.
METHODS
This institutional review board-approved prospective, randomized, blinded study involved forty-six patients. Patients were randomized into one of four cohorts according to both acetabular cup and polyethylene liner. Patients received either a cementless cup with a titanium mesh surface or a tantalum trabecular surface and either a highly cross-linked polyethylene liner or an ultra-high molecular weight polyethylene liner. Radiostereometric analysis examinations and Short Form-36 Physical Component Summary, Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), University of California Los Angeles (UCLA) activity, and Harris hip scores were obtained preoperatively, postoperatively, at six months, and annually thereafter.
RESULTS
All patients had significant improvement (p < 0.05) in Short Form-36 Physical Component Summary, WOMAC, UCLA activity, and Harris hip scores postoperatively. On radiostereometric analysis examination, highly cross-linked polyethylene liners showed significantly less median femoral head penetration at five years (p < 0.05). Steady-state wear rates from one year to five years were 0.04 mm per year for ultra-high molecular weight polyethylene liners and 0.004 mm per year for highly cross-linked polyethylene liners. At the five-year follow-up, the median migration (and standard error) was 0.05 ± 0.20 mm proximally for titanium cups and 0.21 ± 0.05 mm for tantalum cups.
CONCLUSIONS
In this young population who had undergone total hip replacement, radiostereometric analysis showed significantly less femoral head penetration in the highly cross-linked polyethylene liners compared with that in the conventional ultra-high molecular weight polyethylene liners. Penetration rates were one order of magnitude less in highly cross-linked polyethylene liners compared with ultra-high molecular weight polyethylene liners. There was no significant difference in proximal migration between the tantalum and titanium acetabular cups through the five-year follow-up (p > 0.19). |
Diagnosis and management of Castleman disease. | BACKGROUND
Castleman disease is an uncommon lymphoproliferative disorder characterized as either unicentric or multicentric. Unicentric Castleman disease (UCD) is localized and carries an excellent prognosis, whereas multicentric Castleman disease (MCD) is a systemic disease occurring most commonly in the setting of HIV infection and is associated with human herpesvirus 8. MCD has been associated with considerable morbidity and mortality, and the therapeutic landscape for its management continues to evolve.
METHODS
The available medical literature on UCD and MCD was reviewed. The clinical presentation and pathological diagnosis of Castleman disease was reviewed, along with associated disorders such as certain malignancies and autoimmune complications.
RESULTS
Surgical resection remains the standard therapy for UCD, while systemic therapies are required for the management of MCD. Rituximab monotherapy is the mainstay of therapy; however, novel therapies targeting interleukin 6 may represent a treatment option in the near future. Antiviral strategies as well as single-agent and combination chemotherapy with glucocorticoids are established systemic therapies. The management of Castleman disease also requires careful attention to potential concomitant infections, malignancies, and associated syndromes.
CONCLUSIONS
UCD and MCD constitute uncommon but well-defined clinicopathologic entities. Although UCD is typically well controlled with local therapy, MCD continues to pose formidable challenges in management. We address historical chemotherapy-based approaches to this disease as well as recently developed targeted therapies, including rituximab and siltuximab, that have improved the outcome for newly diagnosed patients. Ongoing research into the management of MCD is needed. |
Death in the Thought of Lucius Annaeus Seneca [A morte no pensamento de Lúcio Aneu Sêneca] | Reflections on death, in its natural form or through suicide, in the works of Lucius Annaeus Seneca, a philosopher of the 1st century AD, are provided. The Roman philosopher reflects on death as part and parcel of the formation of the ideal wise man. Since Seneca considers death one of the duties of being, the ideal wise man must be aware of this condition and should distance himself from the fear that the thought of it awakens in him. Such conscious awareness leads to suicide if necessary, so that the dignity of the human being is preserved. Philosophy enables the human being to distance himself from the fear and anguish provoked by death. |
Random-Forests-Based Network Intrusion Detection Systems | Prevention of security breaches completely using the existing security technologies is unrealistic. As a result, intrusion detection is an important component in network security. However, many current intrusion detection systems (IDSs) are rule-based systems, which have limitations in detecting novel intrusions. Moreover, encoding rules is time-consuming and highly depends on the knowledge of known intrusions. Therefore, we propose new systematic frameworks that apply a data mining algorithm called random forests in misuse, anomaly, and hybrid-network-based IDSs. In misuse detection, patterns of intrusions are built automatically by the random forests algorithm over training data. After that, intrusions are detected by matching network activities against the patterns. In anomaly detection, novel intrusions are detected by the outlier detection mechanism of the random forests algorithm. After building the patterns of network services by the random forests algorithm, outliers related to the patterns are determined by the outlier detection algorithm. The hybrid detection system improves the detection performance by combining the advantages of the misuse and anomaly detection. We evaluate our approaches over the Knowledge Discovery and Data Mining 1999 (KDD'99) dataset. The experimental results demonstrate that the performance provided by the proposed misuse approach is better than the best KDD'99 result; compared to other reported unsupervised anomaly detection approaches, our anomaly detection approach achieves a higher detection rate when the false positive rate is low; and the presented hybrid system can improve the overall performance of the aforementioned IDSs. |
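As a rough illustration of the misuse-detection stage described in the abstract, the sketch below trains scikit-learn's RandomForestClassifier on labelled toy connection records and matches new activity against the learned patterns. The feature names and data are invented for illustration and are not drawn from the KDD'99 dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy "connection records": [duration, bytes_sent, failed_logins]
normal = rng.normal([10.0, 500.0, 0.0], [3.0, 100.0, 0.3], size=(200, 3))
attack = rng.normal([1.0, 50.0, 5.0], [0.5, 20.0, 1.0], size=(200, 3))

X = np.vstack([normal, attack])
y = np.array([0] * 200 + [1] * 200)  # 0 = normal, 1 = intrusion

# Misuse detection: learn patterns of known intrusions from labelled data
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Match new activity against the learned patterns
probe = np.array([[1.2, 60.0, 4.0]])  # resembles the attack profile
print(clf.predict(probe))  # flagged as intrusion (class 1)
```

The anomaly-detection stage of the paper additionally uses the random forest's own proximity-based outlier mechanism, which this toy supervised example does not show.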
Effects of Oxytocin on Attention to Emotional Faces in Healthy Volunteers and Highly Socially Anxious Males | BACKGROUND
Evidence suggests that individuals with social anxiety demonstrate vigilance to social threat, whilst the peptide hormone oxytocin is widely accepted as supporting affiliative behaviour in humans.
METHODS
This study investigated whether oxytocin can affect attentional bias in social anxiety. In a double-blind, randomized, placebo-controlled, within-group study design, 26 healthy and 16 highly socially anxious (HSA) male volunteers (within the HSA group, 10 were diagnosed with generalized social anxiety disorder) were administered 24 IU of oxytocin or placebo to investigate attentional processing in social anxiety. Attentional bias was assessed using the dot-probe paradigm with angry, fearful, happy and neutral face stimuli.
RESULTS
In the baseline placebo condition, the HSA group showed greater attentional bias for emotional faces than healthy individuals. Oxytocin reduced the difference between HSA and non-socially anxious individuals in attentional bias for emotional faces. Moreover, it appeared to normalize attentional bias in HSA individuals to levels seen in the healthy population in the baseline condition. The biological mechanisms by which oxytocin may be exerting these effects are discussed.
CONCLUSIONS
These results, coupled with previous research, could indicate a potential therapeutic use of this hormone in treatment for social anxiety. |
Using the WordNet Ontology for Interpreting Medical Records | As hospitals throughout Europe strive to exploit the advantages of IT and network technologies, electronic medical record systems are starting to replace paper-based archives. This paper suggests and describes an add-on service to electronic medical record systems that helps patients gain insight into their diagnoses and medical records. The add-on service is based on annotating polysemous and foreign terms with WordNet synsets. By exploiting the way relationships between synsets are structured and described in WordNet, it is shown how patients can interactively generalize and understand their personal records. |
Using Online Conversations to Study Word-of-Mouth Communication | Managers are very interested in word-of-mouth communication because they believe that a product's success is related to the word of mouth that it generates. However, there are at least three significant challenges associated with measuring word of mouth. First, how does one gather the data? Because the information is exchanged in private conversations, direct observation traditionally has been difficult. Second, what aspect of these conversations should one measure? The third challenge comes from the fact that word of mouth is not exogenous. While the mapping from word of mouth to future sales is of great interest to the firm, we must also recognize that word of mouth is an outcome of past sales. Our primary objective is to address these challenges. As a context for our study, we have chosen new television (TV) shows during the 1999–2000 seasons. Our source of word-of-mouth conversations is Usenet, a collection of thousands of newsgroups with diverse topics. We find that online conversations may offer an easy and cost-effective opportunity to measure word of mouth. We show that a measure of the dispersion of conversations across communities has explanatory power in a dynamic model of TV ratings. |
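One natural way to quantify the dispersion of conversations across communities is an entropy over the share of a show's posts contributed by each newsgroup. The entropy form below is a common choice for such a measure and is shown purely as an illustrative sketch, not as the paper's exact specification:

```python
import math

def dispersion(posts_per_group):
    """Entropy of the distribution of a show's posts across newsgroups.
    Higher entropy = conversation spread across many communities."""
    total = sum(posts_per_group)
    shares = [c / total for c in posts_per_group if c > 0]
    return -sum(p * math.log(p) for p in shares)

# Show A: chatter concentrated in one group; Show B: spread evenly
print(dispersion([98, 1, 1]))    # low dispersion
print(dispersion([33, 33, 34]))  # high dispersion, near log(3) ≈ 1.10
```

Two shows with identical post volume can thus differ sharply in dispersion, which is exactly the kind of variation the ratings model can exploit.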
Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks | Traditional approaches to the task of ACE event extraction primarily rely on elaborately designed features and complicated natural language processing (NLP) tools. These traditional approaches lack generalization, take a large amount of human effort and are prone to error propagation and data sparsity problems. This paper proposes a novel event-extraction method, which aims to automatically extract lexical-level and sentence-level features without using complicated NLP tools. We introduce a word-representation model to capture meaningful semantic regularities for words and adopt a framework based on a convolutional neural network (CNN) to capture sentence-level clues. However, CNN can only capture the most important information in a sentence and may miss valuable facts when considering multiple-event sentences. We propose a dynamic multi-pooling convolutional neural network (DMCNN), which uses a dynamic multi-pooling layer according to event triggers and arguments, to reserve more crucial information. The experimental results show that our approach significantly outperforms other state-of-the-art methods. |
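The dynamic multi-pooling step can be sketched as follows: rather than taking a single max over the whole sentence, the convolutional feature map is split at the trigger (and, in the full model, argument-candidate) positions and max-pooled per segment. This toy NumPy function splits at a single trigger index only:

```python
import numpy as np

def dynamic_multi_pooling(feature_map: np.ndarray, trigger_idx: int) -> np.ndarray:
    """Max-pool each segment of a (seq_len, n_filters) feature map,
    split at the trigger position, and concatenate the results."""
    left = feature_map[: trigger_idx + 1]   # tokens up to and including the trigger
    right = feature_map[trigger_idx + 1 :]  # tokens after the trigger
    pooled = [seg.max(axis=0) for seg in (left, right) if len(seg) > 0]
    return np.concatenate(pooled)

# 5 tokens, 2 convolution filters; trigger at position 2
fmap = np.array([[0.1, 0.9],
                 [0.4, 0.2],
                 [0.8, 0.1],
                 [0.3, 0.7],
                 [0.5, 0.6]])
print(dynamic_multi_pooling(fmap, 2))  # [0.8 0.9 0.5 0.7]
```

Because each segment keeps its own maximum, information from a second event mentioned after the trigger (here, the [0.5, 0.7] part) survives pooling instead of being masked by the global max.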
Local optima smoothing for global optimization | It is widely believed that in order to solve large scale global optimization problems an appropriate mixture of local approximation and global exploration is necessary. Local approximation, if first order information on the objective function is available, is efficiently performed by means of local optimization methods. Unfortunately, global exploration, in the absence of some kind of global information on the problem, is a "blind" procedure, aimed at placing observations as evenly as possible in the search domain. Often this procedure reduces to uniform random sampling (as in Multistart algorithms, or in techniques based on clustering). In this paper we propose a new framework for global exploration which tries to guide random exploration towards the region of attraction of low-level local optima. The main idea originated from the use of smoothing techniques (based on Gaussian convolutions): the possibility of applying a smoothing transformation not to the objective function but to the result of local searches seems never to have been explored before. Although an exact smoothing of the results of local searches is impossible to implement, in this paper we propose a computational approximation scheme which has proven to be very efficient and (perhaps more importantly) extremely robust in solving large scale global optimization problems with huge numbers of local optima. |
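A minimal one-dimensional sketch of the idea, with a deterministic step-halving descent standing in for the local optimizer and a Gaussian kernel smoothing the values returned by local searches. The test function, search parameters and kernel width are all illustrative choices, not taken from the paper:

```python
import math

def f(x):
    # A multimodal test objective with several local minima
    return math.sin(3 * x) + 0.1 * x * x

def local_search(x, step=0.5, tol=1e-6):
    """Simple derivative-free descent: move while improving, else shrink step."""
    while step > tol:
        for x_new in (x - step, x + step):
            if f(x_new) < f(x):
                x = x_new
                break
        else:
            step /= 2
    return x

# Run local searches from a coarse grid of starting points
starts = [-4 + 0.5 * i for i in range(17)]
optima = [(s, f(local_search(s))) for s in starts]

def smoothed(x, sigma=1.0):
    """Gaussian-weighted average of local-search outcomes near x:
    a smoothing applied to the *results of local searches*, not to f."""
    w = [math.exp(-((x - s) ** 2) / (2 * sigma ** 2)) for s, _ in optima]
    return sum(wi * v for wi, (_, v) in zip(w, optima)) / sum(w)

# The smoothed landscape points exploration toward basins of good local optima
best_region = min(starts, key=smoothed)
print(best_region, f(local_search(best_region)))
```

The smoothed surface has far fewer wiggles than f itself, so picking new starting points by its value steers the multistart procedure toward regions of attraction of deep minima.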
A Data-Centric Approach for Modeling and Estimating Efficiency of Dataflows for Accelerator Design | The mechanisms used by DNN accelerators to leverage data reuse and perform data staging are known as dataflow, and they directly impact the performance and energy efficiency of DNN accelerator designs. Co-optimizing the accelerator microarchitecture and its internal dataflow is crucial for accelerator designers, but there is a severe lack of tools and methodologies to help them explore the co-optimization design space. In this work, we first introduce a set of data-centric directives to concisely specify DNN dataflows in a compiler-friendly form. Next, we present an analytical model, MAESTRO, that estimates various cost-benefit tradeoffs of a dataflow including execution time and energy efficiency for a DNN model and hardware configuration. Finally, we demonstrate the use of MAESTRO to drive a hardware design space exploration (DSE) engine. The DSE engine searched 480M designs and identified 2.5M valid designs at an average rate of 0.17M designs per second, and also identified throughput- and energy-optimized designs among this set. |
Current neuroimaging techniques in Alzheimer's disease and applications in animal models. | With Alzheimer's disease (AD) quickly becoming the most costly disease to society, and with no disease-modifying treatment currently, prevention and early detection have become key points in AD research. Important features within this research focus on understanding disease pathology, as well as finding biomarkers that can act as early indicators and trackers of disease progression or potential treatment. With the advances in neuroimaging technology and the development of new imaging techniques, the search for cheap, noninvasive, sensitive biomarkers becomes more accessible. Modern neuroimaging techniques are able to cover most aspects of disease pathology, including visualization of senile plaques and neurofibrillary tangles, cortical atrophy, neuronal loss, vascular damage, and changes in brain biochemistry. These methods can provide complementary information, resulting in an overall picture of AD. Additionally, applying neuroimaging to animal models of AD could bring about greater understanding in disease etiology and experimental treatments whilst remaining in vivo. In this review, we present the current neuroimaging techniques used in AD research in both their human and animal applications, and discuss how this fits in to the overall goal of understanding AD. |
Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta-analysis of randomised trials | Methods We searched Medline from Jan 1, 2009, to Nov 19, 2013, limiting searches to phase 3, randomised trials of patients with atrial fibrillation who were randomised to receive new oral anticoagulants or warfarin, and trials in which both efficacy and safety outcomes were reported. We did a prespecified meta-analysis of all 71 683 participants included in the RE-LY, ROCKET AF, ARISTOTLE, and ENGAGE AF–TIMI 48 trials. The main outcomes were stroke and systemic embolic events, ischaemic stroke, haemorrhagic stroke, all-cause mortality, myocardial infarction, major bleeding, intracranial haemorrhage, and gastrointestinal bleeding. We calculated relative risks (RRs) and 95% CIs for each outcome. We did subgroup analyses to assess whether differences in patient and trial characteristics affected outcomes. We used a random-effects model to compare pooled outcomes and tested for heterogeneity. |
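Pooling relative risks under a random-effects model, as described, can be sketched with the DerSimonian-Laird estimator. The per-trial event counts below are invented for illustration and are not the data of the four trials:

```python
import math

# Hypothetical (events, total) for treatment and control arms of 4 trials
trials = [((150, 9000), (200, 9000)),
          ((180, 7000), (210, 7000)),
          ((120, 9100), (160, 9100)),
          ((140, 10500), (190, 10500))]

log_rr, var = [], []
for (e_t, n_t), (e_c, n_c) in trials:
    rr = (e_t / n_t) / (e_c / n_c)
    log_rr.append(math.log(rr))
    # Standard large-sample variance of the log relative risk
    var.append(1 / e_t - 1 / n_t + 1 / e_c - 1 / n_c)

# Fixed-effect (inverse-variance) weights and Cochran's Q heterogeneity statistic
w = [1 / v for v in var]
pooled_fe = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
q = sum(wi * (y - pooled_fe) ** 2 for wi, y in zip(w, log_rr))

# DerSimonian-Laird between-trial variance tau^2 (truncated at zero)
k = len(trials)
tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))

# Random-effects pooled RR and 95% CI
w_re = [1 / (v + tau2) for v in var]
pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"Pooled RR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f}-{math.exp(pooled + 1.96 * se):.2f})")
```

With tau² added to every trial's variance, heterogeneous trials are down-weighted less aggressively than under a fixed-effect analysis, which is why the random-effects CI is typically wider.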
10 Communicative Language Teaching in the twenty-first century: the 'Principled Communicative Approach' | Earl Stevick has always been interested in improving language teaching methodology, and he has never been afraid of innovation. His seminal work, Teaching Languages: A Way and Ways (Stevick 1980), introduced many of us to Counselling-Learning and Suggestopedia for the first time, and in Memory, Meaning and Method: A View of Language Teaching (Stevick 1996) he discussed a wide range of theoretical and practical considerations to help us better understand the intricate cognitive and interpersonal processes whereby a language is acquired and then used for meaningful communication. The proposal in this chapter to revitalize Communicative Language Teaching (CLT) in the light of contemporary scholarly advances is fully within the spirit of Earl's approach. By the turn of the new millennium, CLT had become a real buzzword in language teaching methodology, but the extent to which the term covers a well-defined and uniform teaching method is highly questionable. In fact, since the genesis of CLT in the early 1970s, its proponents have developed a very wide range of variants that were only loosely related to each other (for overviews, see Savignon 2005; Spada 2007). In this chapter I first look at the core characteristics of CLT to explore the roots of the diverse interpretations and then argue that in order for CLT to fulfil all the expectations attached to it in the twenty-first century, the method needs to be revised according to the latest findings of psycholinguistic research. I will conclude the chapter by outlining the main principles of a proposed revised approach that I have termed the 'Principled Communicative Approach' (PCA). |
Mobile Scrum | In this paper we discuss whether mobile applications can support software developers in improving their efficiency and communication. We propose the idea of Mobile Scrum, a native, integrated mobile application that supports Scrum teams in their activities and fits into their environment. Team members can use it anytime and anywhere to communicate and share knowledge within the team. Mobile Scrum provides a lightweight interface that is easier and more intuitive to use than existing web-based and desktop applications. With guides and templates, it helps prevent typical problems when applying Scrum. Furthermore, it improves the consistency of Scrum artifacts and enables easy access to important information. |
An induction furnace employing a half-bridge series resonant inverter | In this paper, an induction furnace employing a half-bridge series resonant inverter built around IGBTs as its switching devices, suitable for melting 500 grams of brass, is presented. A melting time of 10 minutes was achieved at a power level of approximately 4 kW. The operating frequency is automatically tracked to maintain a small constant lagging phase angle, using a dual phase-locked loop, while the brass is melted. The coil voltage is controlled to protect the resonant capacitors. The experimental results are presented. |
Automatic Seed Classification by Shape and Color Features using Machine Vision Technology | : In this paper the proposed system uses content based image retrieval (CBIR) technique for identification of seed e.g. wheat, rice, gram etc. on the basis of their features. CBIR is a technique to identify or recognize the image on the basis of features present in image. Basically features are classified in to four categories 1.color 2.Shape 3. texture 4. size .In this system we are extracting color, shape feature extraction. After that classifying images in to categories using neural network according to the weights and image displayed from the category for which neural network shows maximum weight. category1 belongs to wheat and category2 belongs to gram. Experiment was conducted on 200 images of wheat and gram by using Euclidean distance(ED) and artificial neural network techniques. From 200 images 150 are used for training purpose and 50 images are used for testing purpose. The precision rate of the system by using ED is 84.4 percent By using Artificial neural network precision rate is 95 percent. |
Trait Anxiety Modulates the Neural Efficiency of Inhibitory Control | An impairment of attentional control in the face of threat-related distracters is well established for high-anxious individuals. Beyond that, it has been hypothesized that high trait anxiety more generally impairs the neural efficiency of cognitive processes requiring attentional control—even in the absence of threat-related stimuli. Here, we use fMRI to show that trait anxiety indeed modulates brain activation and functional connectivities between task-relevant brain regions in an affectively neutral Stroop task. In high-anxious individuals, dorsolateral pFC showed stronger task-related activation and reduced coupling with posterior lateral frontal regions, dorsal ACC, and a word-sensitive area in the left fusiform gyrus. These results support the assumption that a general (i.e., not threat-specific) impairment of attentional control leads to reduced neural processing efficiency in anxious individuals. The increased dorsolateral pFC activation is interpreted as an attempt to compensate for suboptimal connectivity within the cortical network subserving task performance. |
Enhancing K-12 education with alice programming adventures | This paper describes the integration of the Alice 3D virtual worlds environment into many disciplines in elementary school, middle school and high school. We have developed a wide range of Alice instructional materials, including tutorials for both computer science concepts and animation concepts. To encourage the building of more complicated worlds, we have developed template Alice classes and worlds. With our materials, teachers and students are exposed to computing concepts while using Alice to create projects, stories, games and quizzes. These materials were successfully used in the summers of 2008 and 2009 to train and work with over 130 teachers. |
The incredible queen of green: Nutritive value and therapeutic potential of Moringa oleifera Lam. | Moringa oleifera Lam. (synonym: Moringa pterygosperma Gaertn.) (M. oleifera), known in 82 countries by 210 different names, is well known as the miracle tree. It is one of the most extensively cultivated and highly valued members of Moringaceae, a monogeneric family comprising thirteen perennial angiosperm shrubs and trees[1-3]. The Moringa tree is endemic to the Himalayan foothills of Pakistan, Afghanistan, Bangladesh and India, and is cultivated throughout the tropics. It is recognized by a mixture of vernacular names; among them, drumstick tree, horseradish tree, ben oil tree and malunggay are the most commonly reported in the history of this plant[4]. In Pakistan, Sohanjna is the vernacular name of M. oleifera[5,6]. It yields low-quality timber, as it is a softwood tree, but it has been believed for centuries that this plant possesses a number of industrial, traditional and medicinal benefits[7]. Fertilizer (seed cake), green manure (leaves), blue dye (wood), fencing (living trees), domestic cleaning agent (crushed leaves), alley cropping, animal feed (leaves and seed cake), medicine (all plant parts), foliar nutrient (juice expressed from the leaves), gum (tree trunks), biogas (leaves), biopesticide, ornamental plantings, water purifier (powdered seeds) and honey (flower nectar) are different uses of this plant reported in the literature[2,6,8-20]. M. oleifera is a good source of amino acids and contains a number of important minerals, β-carotene, various phenolics and vitamins[21,22]. M. oleifera is also an important vegetable food article of trade, particularly in Pakistan, Hawaii, the Philippines, Africa and India, and has received considerable attention as a source of natural nutrition[1,23]. 
In South Asia, various plant parts, including leaves, bark, root, gum, flowers, pods, seeds and seed oil, are used for a variety of infectious and inflammatory disorders along with hepatorenal, gastrointestinal, hematological and cardiovascular diseases[22,24-26]. Various therapeutic potentials are also credited to different parts of the plant. |
Acute changes in electromechanical parameters during different pacing configurations using a quadripolar left ventricular lead | Quadripolar left ventricular (LV) leads allow for several pacing configurations in candidates for cardiac resynchronization therapy (CRT). Whether different pacing configurations may affect LV dyssynchrony and systolic function is not completely known. We aimed to evaluate the acute effects of different pacing vectors on LV electromechanical parameters in patients implanted with a quadripolar LV lead. In this two-centre study, within 1 month of implantation 21 CRT patients (65 ± 8 years, 76 % men, 38 % ischemic) receiving a quadripolar LV lead (Quartet 1458Q, St Jude Medical) underwent LV capture threshold assessment, intracardiac electrogram optimization, and two-dimensional echocardiography during four pacing configurations: D1-P4, P4-RV coil, D1-RV coil, and P4-M2. LV dyssynchrony and contractile function were expressed by septal-to-lateral delay and global longitudinal strain (GLS). LV capture threshold varied between the configurations (P < 0.001), showing higher values in the configurations P4-RV coil and P4-M2. Septal-to-lateral delay decreased in the configurations D1-P4 and D1-RV coil (P = 0.003 and P = 0.033 vs. spontaneous rhythm, respectively). GLS improved significantly vs. spontaneous rhythm only in the configuration D1-P4 (from −8.6 ± 3.5 to −11.0 ± 3.2 %, P = 0.001). Accordingly, an increase in stroke volume and a decrease in mitral regurgitation were observed in the configuration D1-P4 (P ≤ 0.001 vs. spontaneous rhythm). In CRT patients receiving a quadripolar LV lead, significant variations in electromechanical parameters were observed by changing pacing vector. Individually targeting the optimal pacing site may enhance the acute haemodynamic response to CRT. |
Comparison of optometry vs digital photography screening for diabetic retinopathy in a single district | Purpose To compare (a) the clinical effectiveness and (b) cost effectiveness of the two models in screening for diabetic retinopathy. Methods (a) Retrospective analysis of referral diagnoses of each screening model in their first respective years of operation and an audit of screen positive patients and a sample of screen negatives referred to the hospital eye service from both screening programmes. (b) Cost effectiveness study. Participants (1) A total of 1643 patients screened in the community and in digital photography clinics; (2) 109 consecutive patients referred to the Diabetic Eye Clinic through the two existing models of diabetic retinopathy screening; (3) 55 screen negative patients from the optometry model; (4) 68 screen negative patients audited from the digital photography model. Results The compliance rate was 45% for optometry (O) vs 50% for the digital imaging system (I). Background retinopathy was recorded at screening in 22% (O) vs 17% (I) (P=0.03) and maculopathy in 3.8% (O) vs 1.7% (I) (P=0.02). Hospital referral rates were 3.8% (O) vs 4.2% (I). Sensitivity (75% for optometry, 80% for digital photography) and specificity (98% for optometry and digital photography) were similar in both models. The cost of screening each patient was £23.99 (O) vs £29.29 (I). The cost effectiveness was £832 (O) vs £853 (I) in the first year. Conclusion The imaging system was not always able to detect early retinopathy and maculopathy; it was equally specific in identifying sight-threatening disease. Cost effectiveness was poor in both models in their first operational year, largely as a result of poor compliance rates in the newly introduced screening programme. Cost effectiveness of the imaging model should further improve with falling costs of imaging systems. Until then, it is essential to continue any existing well-coordinated optometry model. |
Lateral crural turn-in flap in functional rhinoplasty. | OBJECTIVE
To use the trimmed cartilage as a support material for both internal and external valves.
METHODS
The lateral crural turn-in flap (LCTF) technique consists of cephalic trimming of the lateral crura and turning the trimmed strip into a pocket created under the remaining lateral crus. Twenty-four patients with lateral crura wider than 12 mm in whom this technique was applied took part in this study. The trimmed cartilage was used to reshape and/or support the lateral crus and the internal valve by keeping the scroll intact. The support and suspension of the lateral crural "sandwich" helped not only to prevent stenosis of the internal valve angle but also to widen it in some cases.
RESULTS
The LCTF has been used in 24 patients to reshape and/or add structure to the lateral crus with great success. The internal valve was also kept open by keeping the scroll area intact, especially in 1 patient with concave lateral crura in whom this technique helped to widen the internal valve angle.
CONCLUSIONS
This study shows that the LCTF can be used to reshape and add structure to the lateral crus and to suspend the internal valve. Although it is a powerful technique by itself in functional rhinoplasty, it should be combined with other methods, such as spreader flaps/grafts or alar battens, to obtain the maximum functional result. |