Multimodal nonlinear imaging of the human cornea.
PURPOSE To evaluate the potential of third-harmonic generation (THG) microscopy combined with second-harmonic generation (SHG) and two-photon excited fluorescence (2PEF) microscopies for visualizing the microstructure of the human cornea and trabecular meshwork based on their intrinsic nonlinear properties. METHODS Fresh human corneal buttons and corneoscleral discs from an eye bank were observed under a multiphoton microscope incorporating a titanium-sapphire laser and an optical parametric oscillator for the excitation, and equipped with detection channels in the forward and backward directions. RESULTS Original contrast mechanisms of THG signals in cornea with physiological relevance were elucidated. THG microscopy with circular incident polarization detected microscopic anisotropy and revealed the stacking and distribution of stromal collagen lamellae. THG imaging with linear incident polarization also revealed cellular and anchoring structures with micrometer resolution. In edematous tissue, a strong THG signal around cells indicated the local presence of water. Additionally, SHG signals reflected the distribution of fibrillar collagen, and 2PEF imaging revealed the elastic component of the trabecular meshwork and the fluorescence of metabolically active cells. CONCLUSIONS The combined imaging modalities of THG, SHG, and 2PEF provide key information about the physiological state and microstructure of the anterior segment over its entire thickness with remarkable contrast and specificity. This imaging method should prove particularly useful for assessing glaucoma and corneal physiopathologies.
Efficient Wireless Electric Power Transmission Using Magnetic Resonance Coupling
IJSER © 2014 http://www.ijser.org Abstract— This paper presents how electric power can be transferred wirelessly using magnetic resonance coupling. It also reports the results of effective wireless electric power transmission, obtained by coupling two coils through a magnetic field oscillating at a specific resonance frequency. The paper further suggests several ways to improve wireless power transmission efficiency, and concludes with a range of practical applications in which the results of this project can be used.
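The coupling described above is efficient only when both coils are tuned to the same resonant frequency. As a minimal illustration (not from the paper; component values here are hypothetical), the resonant frequency of an LC tank follows the standard formula:

```python
import math

def resonant_frequency(L, C):
    """Resonant frequency f = 1 / (2*pi*sqrt(L*C)) of an LC tank.

    Both transmitter and receiver coils must be tuned to (approximately)
    this frequency for efficient magnetic resonance coupling.
    L is inductance in henries, C is capacitance in farads.
    """
    return 1.0 / (2 * math.pi * math.sqrt(L * C))

# Example with illustrative (hypothetical) component values:
# a 24 uH coil with a 100 pF capacitor resonates in the low-MHz range.
f = resonant_frequency(24e-6, 100e-12)
```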
Measuring Temporal and Contextual Proximity: Big Text-Data Analytics in Concept Maps
Despite being important, time and context have yet to be formally incorporated into the process of visually representing the temporal and contextual proximity between keywords in a concept map. In response to the context and time challenges, this study improves automated conventional concept mapping by measuring the temporal and contextual distance between pairs of co-occurring concepts. After generating a conventional concept map, it is temporally and contextually augmented in this work by applying an unsupervised temporal trend detection algorithm and a novel measure of contextual proximity. This proposed approach is demonstrated and validated without loss of generality for a spectrum of information technologies, showing that the resulting assessments of temporal and contextual proximity are highly correlated with subjective assessments of experts. The contribution of this work is emphasized and magnified against the current growing attention to big data analytics in general and to big text-data analytics in particular.
Comparing community structure identification
We compare recent approaches to community structure identification in terms of sensitivity and computational cost. The recently proposed modularity measure is revisited, and the performance of the methods as applied to ad hoc networks with known community structure is compared. We find that the most accurate methods tend to be more computationally expensive, and that both aspects need to be considered when choosing a method for practical purposes. The work is intended as an introduction as well as a proposal for a standard benchmark test of community detection methods.
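The modularity measure revisited in this abstract can be sketched compactly. Below is a minimal pure-Python computation of the Newman–Girvan modularity Q = Σ_c (e_c/m − a_c²), where e_c is the number of edges inside community c, m the total edge count, and a_c the fraction of edge ends attached to c; the graph and partition are illustrative, not from the paper:

```python
def modularity(edges, communities):
    """Newman-Girvan modularity Q = sum_c (e_c/m - a_c^2).

    edges: list of undirected (u, v) pairs; communities: list of node sets.
    """
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # community label per node
    label = {node: c for c, nodes in enumerate(communities) for node in nodes}
    e_in = [0] * len(communities)   # edges fully inside community c
    a = [0.0] * len(communities)    # fraction of edge ends attached to c
    for u, v in edges:
        if label[u] == label[v]:
            e_in[label[u]] += 1
    for node, k in degree.items():
        a[label[node]] += k / (2 * m)
    return sum(e_in[c] / m - a[c] ** 2 for c in range(len(communities)))

# Two triangles joined by a single bridge edge: the "natural" split into
# the two triangles yields Q = 5/14, approximately 0.357.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
q = modularity(edges, [{0, 1, 2}, {3, 4, 5}])
```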
Time in the mind: Using space to think about time
How do we construct abstract ideas like justice, mathematics, or time-travel? In this paper we investigate whether mental representations that result from physical experience underlie people's more abstract mental representations, using the domains of space and time as a testbed. People often talk about time using spatial language (e.g., a long vacation, a short concert). Do people also think about time using spatial representations, even when they are not using language? Results of six psychophysical experiments revealed that people are unable to ignore irrelevant spatial information when making judgments about duration, but not the converse. This pattern, which is predicted by the asymmetry between space and time in linguistic metaphors, was demonstrated here in tasks that do not involve any linguistic stimuli or responses. These findings provide evidence that the metaphorical relationship between space and time observed in language also exists in our more basic representations of distance and duration. Results suggest that our mental representations of things we can never see or touch may be built, in part, out of representations of physical experiences in perception and motor action.
A mineralogical perspective on the apatite in bone
A crystalline solid that is a special form of the mineral apatite dominates the composite material bone. A mineral represents the intimate linkage of a three-dimensional atomic structure with a chemical composition, each of which can vary slightly, but not independently. The specific compositional–structural linkage of a mineral influences its chemical and physical properties, such as solubility, density, hardness, and growth morphology. In this paper, we show how a mineralogic approach to bone can provide insights into the resorption–precipitation processes of bone development, the exceedingly small size of bone crystallites, and the body’s ability to (bio)chemically control the properties of bone. We also discuss why the apatite phase in bone should not be called hydroxylapatite, as well as the limitations to the use of the stoichiometric mineral hydroxylapatite as a mineral model for the inorganic phase in bone. This mineralogic approach can aid in the development of functionally specific biomaterials. © 2005 Elsevier B.V. All rights reserved.
Advantages and pitfalls of 18F-fluoro-2-deoxy-D-glucose positron emission tomography in detecting locally residual or recurrent nasopharyngeal carcinoma: comparison with magnetic resonance imaging
This prospective study was designed to elucidate the advantages and pitfalls of 18F-FDG PET in detecting locally residual/recurrent nasopharyngeal carcinoma (NPC) in comparison with MRI. We recruited NPC patients from two ongoing prospective trials. One is being performed to evaluate suspected local recurrence (group A) and the other to assess local treatment response 3 months after therapy (group B). Both groups received 18F-FDG PET and head and neck MRI. The gold standard was histopathology or clinical/imaging follow-up. An optimal cut-off standardised uptake value (SUV) was retrospectively determined. From January 2002 to August 2004, 146 patients were eligible. Thirty-four were from group A and 112 from group B. In all, 26 had locally recurrent/residual tumours. Differences in detection rate between 18F-FDG PET and MRI were not statistically significant in either group. However, 18F-FDG PET showed significantly higher specificity than MRI in detecting residual tumours among patients with initial T4 disease (p=0.04). In contrast, the specificity of 18F-FDG PET for patients with an initial T1–2 tumour treated with intracavitary brachytherapy (ICBT) was significantly lower than that for patients not treated by ICBT (72.2% vs 98.1%, p=0.003). At an SUV cut-off of 4.2, PET showed an equal and a higher accuracy compared with MRI in groups A and B, respectively. 18F-FDG PET is superior to MRI in identifying locally residual NPC among patients with initial T4 disease but demonstrates limitations in assessing treatment response in patients with initial T1–2 disease after ICBT. A cut-off SUV is a useful index for aiding in the visual detection of locally residual/recurrent NPC.
On Asymptotic Estimates in Switching and Automata Theory
Formulas and algorithms have recently been given for calculating the number of symmetry types of functions, networks and automata under various transformation groups. In almost all cases, the computations involved are quite difficult and require the use of a digital computer. In this paper, asymptotic estimates are given for these numbers, which are trivial to compute and which are very accurate in most cases even for small values of the parameters.
Carbon-based electrode materials for application in the electrochemical water splitting process
The synthesis and characterization of N-HTC materials for application in water electrolysis are the main focus of this work. N-incorporation is of fundamental importance throughout this work since, first of all, N is an n-type dopant, acting as an electron donor. The technique of N-doping tunes the physicochemical material properties. It is cheap and non-toxic and offers the advantage of tailoring the desired electrical, mechanical, optical, magnetic, structural, morphological, or chemical properties. N-doping may improve conductivity, surface wettability, and catalytic or storage characteristics. In addition, N-doping creates active sites and thus might enhance electrocatalytic activity as well as long-term electrochemical stability. A hydrothermal synthetic route was applied based on the precursors glucose and urotropine. During hydrothermal synthesis, urotropine together with glucose undergoes a plethora of complex reaction pathways. Ammonia in particular, the decomposition product of urotropine, contributes to a wide range of reaction mechanisms. The molar ratio of glucose to urotropine was modified in order to achieve a steady increase of the N-content. Urotropine proved to be a highly effective N-precursor: by raising the mass fraction of urotropine relative to glucose, a maximal N content of 19 wt% can be achieved. The pH plays a major role during synthesis and has a strong impact on the mechanisms of the corresponding reaction steps. To date, no systematic pH investigations of HTC chars exist in which the molar ratios of the applied precursors are modified; in this work, such systematic pH studies have been performed. By varying the molar ratio of urotropine to glucose, the pH can be tuned owing to the decomposition of the N-precursor urotropine into ammonia and formaldehyde: the pH becomes more alkaline as the molar ratio of urotropine to glucose is increased.
Conversely, when the molar ratio of glucose to urotropine is raised, or when glucose alone is used as starting material, the pH tends to be more acidic. In both cases, ≤ C6 products are obtained. At pH 7 the degradation via HMF is suppressed, and reducing effects yield deprotonations or benzilic acid-type rearrangements. Decarbonylations and decarboxylations occur only to a lesser extent; instead, fragmentations are the predominant reactions, encompassing β-elimination, retro-aldol fragmentation, and hydrolytic β-dicarbonyl fragmentation. Throughout this work, both reaction pathways have been examined in order to analyze the degree of internal condensation of the polymeric carbon network and the preferential structural moieties or functional groups that have been formed. These features are the main reason for the high diversity of organic compounds, functional groups, and structural motifs in the hydrochar. A large variety of characterization techniques has been employed: this is the first comprehensive analytical study of a series of as-synthesized N-containing hydrothermal carbons (N-HTCs), comprising analytical data from UV/VIS, HPLC, optical microscopy, SEM/(HR)TEM, BET, elemental analysis, FTIR, electronic structure calculations, TG-MS, zeta potential, acid-base titration, Raman, XRD, EELS, and solid-state NMR. The combination of these characterization techniques yielded insights into the complex molecular structure. In N-free HTCs, solely O-functional groups exist, for the most part furan-based functions with aliphatic compounds acting as linkers. In contrast, N-HTCs contain a large number of both O- and N-functional groups. Only upon adding higher amounts of urotropine are more temperature-stable N-based structural elements such as pyrrole and pyridine formed. The N-HTC disc electrodes were subjected to electrocatalytic investigations with regard to the water splitting process.
Electrochemical characterization studies include cyclic voltammetry (CV), modular potentiometry (MP, stationary polarization), chronopotentiometry (CP), and chronoamperometry (CA) to investigate electron transfer processes, electrochemical activity, and stability. The pellets exhibit remarkable mechanical stability; this is the first approach in which no binder was needed to stabilize the material for electrochemical applications. Disc electrodes are tested at high anodic potentials under oxygen evolution reaction (OER) conditions in alkaline media according to a standardization protocol. N-free disc electrodes are most susceptible to carbon corrosion, whereas the incorporation of N improves the material properties: the higher the N content, the higher the electrochemical activity and stability. In addition, the electrolyte undergoes a color change from colorless to dark brown; the brown substance was isolated and is presumed to be a polymeric, humic acid-like organic compound.
Automatic Image Mosaic System Using Image Feature Detection and Taylor Series
Image mosaicing has been attracting considerable attention in the fields of computer vision and photogrammetry. Unlike previous methods that require a tripod, we have developed a system that can handle images taken with a hand-held camera and still accurately construct a panoramic image. This paper describes the implementation of an automatic image mosaic system that uses image feature detection: feature points are extracted and matched automatically from the pixel values of two consecutive images. Feature-based image mosaicing requires the integration of corner detection, corner matching, motion parameter estimation, and image stitching. We used a Taylor series for parameter and perspective transform estimation.
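The perspective transform at the heart of such a mosaicing pipeline maps matched feature points between images. As a minimal sketch (this is the standard homography application step, not the paper's Taylor-series estimator), applying a 3×3 perspective matrix to a 2D point looks like:

```python
def warp_point(H, x, y):
    """Apply a 3x3 perspective (homography) matrix H to the 2D point (x, y).

    The point is lifted to homogeneous coordinates, multiplied by H,
    and projected back by dividing by the third coordinate w.
    """
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w

# A pure translation (shift right by 3, down by -2) as a homography:
H = [[1, 0, 3],
     [0, 1, -2],
     [0, 0, 1]]
```

In a full mosaic system, H would be estimated from matched corner pairs and then used to warp one image into the coordinate frame of the other before stitching.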
Risk factors for cervical cancer in criminal justice settings.
BACKGROUND Women in criminal justice settings have an increased prevalence of cervical cancer compared with the general population. However, little is known about abnormal cervical cancer screening results among women in jail and community-based criminal justice settings. Thus, the aims of this study were to compare the prevalence of self-reported abnormal Papanicolou (Pap) test results in women in jail and under community criminal justice supervision and to examine factors associated with abnormal Pap tests in these criminal justice settings. METHODS We analyzed data from two cross-sectional surveys of women in jails and community corrections in two Southern cities (n=380) about their history of abnormal Pap tests and risk factors for cervical cancer. Univariate analyses (analysis of variance [ANOVA] and chi-square) and a binary logistic regression analysis were conducted to test associations between a history of abnormal Pap testing and factors known to be associated with cervical cancer. RESULTS Nearly half of the women surveyed (n=163, 43%) reported ever having an abnormal Pap test. There was a high prevalence of risk factors for cervical cancer among women with and without an abnormal Pap test. After controlling for age and race, there were significant associations between an abnormal Pap test and inconsistent use of barrier protection (odds ratio [OR] 2.01, 95% confidence interval [CI] 1.18-3.43), having a history of gynecologic infections (OR 1.68, 95% CI 1.05-2.67), and having a history of sexually transmitted diseases (OR 1.92, 95% CI 1.17-3.15). CONCLUSIONS Women in jail and under community justice supervision reported a high prevalence of risk factors for cervical cancer. Because of their high prevalence of abnormal Pap testing, women in criminal justice settings may be appropriate targets for improved cervical cancer screening, prevention with human papillomavirus (HPV) vaccination, risk reduction education, and treatment.
Biphasic culture strategy based on hyperosmotic pressure for improved humanized antibody production in Chinese hamster ovary cell culture
Hyperosmotic pressure increased specific antibody productivity (q Ab) of recombinant Chinese hamster ovary (rCHO) cells (SH2-0.32) and it depressed cell growth. Thus, the use of hyperosmolar medium did not increase the maximum antibody concentration substantially. To overcome this drawback, the feasibility of biphasic culture strategy was investigated. In the biphasic culture, cells were first cultivated in the standard medium with physiological osmolality (294 mOsm/kg) for cell growth. When cells reached the late exponential growth phase, the spent standard medium was replaced with the fresh hyperosmolar medium (522 mOsm/kg) for antibody production. The q Ab in growth phase with the standard medium was 2.1 μg per 106 cells/d, whereas the q Ab in antibody production phase with the hyperosmolar medium was 11.1 μg per 106 cells/d. Northern blot analysis showed a positive relationship between the relative contents of intracellular immunoglobulin messenger ribonucleic acid and q Ab. Because of the enhanced q Ab and the increased cell concentration in biphasic culture, the maximum antibody concentration obtained in biphasic culture with 522 mOsm/kg medium exchange was 161% higher than that obtained in batch culture with the standard medium. Taken together, the simple biphasic culture strategy based on hyperosmotic culture is effective in improving antibody production of rCHO cells.
Real-time 3D scene reconstruction with dynamically moving object using a single depth camera
Online 3D reconstruction of real-world scenes has been attracting increasing interest from both academia and industry, especially as consumer-level depth cameras become widely available. Most recent online reconstruction systems take live depth data from a moving Kinect camera and incrementally fuse it into a single high-quality 3D model in real time. Although most real-world scenes have a static environment, everyday objects in a scene often move dynamically, and they are non-trivial to reconstruct, especially when the camera is also moving. To solve this problem, we propose a real-time approach based on a single depth camera for simultaneous reconstruction of a dynamic object and the static environment, and provide solutions for its key issues. In particular, we first introduce a robust optimization scheme that takes advantage of raycasted maps to segment the moving object and the background from the live depth map. The corresponding depth data are then fused into separate volumes. These volumes are raycasted to extract views of the implicit surface, which serve as a consistent reference frame for the next iteration of segmentation and tracking. To handle fast motion of the dynamic object and the handheld camera in the fusion stage, we propose a sequential 6D pose prediction method that greatly increases registration robustness and avoids the registration failures that occur in conventional methods. Experimental results show that our approach can reconstruct both the moving object and the static environment with rich detail, and outperforms conventional methods in multiple aspects.
Accurate multi-view reconstruction using robust binocular stereo and surface meshing
This paper presents a new algorithm for multi-view reconstruction that demonstrates both accuracy and efficiency. Our method is based on robust binocular stereo matching, followed by adaptive point-based filtering of the merged point clouds, and efficient, high-quality mesh generation. All aspects of our method are designed to be highly scalable with the number of views. Our technique produces the most accurate results among current algorithms for a sparse number of viewpoints according to the Middlebury datasets. Additionally, we prove to be the most efficient method among non-GPU algorithms for the same datasets. Finally, our scaled-window matching technique also excels at reconstructing deformable objects with high-curvature surfaces, which we demonstrate with a number of examples.
The wheelchair circuit: reliability of a test to assess mobility in persons with spinal cord injuries.
OBJECTIVE To assess the reliability of a 9-task wheelchair circuit. DESIGN Three test trials per subject were conducted by 2 raters. Inter- and intrarater reliability were examined. SETTING Eight rehabilitation centers in the Netherlands. PARTICIPANTS Convenience sample of 27 patients (age ≥18 y) with spinal cord injury (SCI), all of whom were in the final stage of their inpatient rehabilitation. INTERVENTION A wheelchair circuit was developed to assess mobility in subjects with SCI. The circuit consisted of 9 tasks: figure-of-8 shape, doorstep crossing, mounting a platform, sprint, walking, driving up treadmill slopes of 3% and 6%, wheelchair driving, and transfer. MAIN OUTCOME MEASURES Task feasibility, task performance time, and peak heart rates. RESULTS The number of tasks that subjects could perform varied from 3 to 9. Feasibility intrarater reliability was .98, and the interrater reliability intraclass correlation coefficient (ICC) was .97. Performance time ICCs ranged from .70 to .99 (mean, .88) for intrarater reliability and from .76 to .98 (mean, .92) for interrater reliability. Heart rate ICCs ranged from .64 to .96 (mean, .81) for intrarater reliability and from .82 to .99 (mean, .89) for interrater reliability. CONCLUSIONS The reliability of the wheelchair circuit was good. More research is needed to assess test validity and responsiveness.
Resilient individuals use positive emotions to bounce back from negative emotional experiences.
Theory indicates that resilient individuals "bounce back" from stressful experiences quickly and effectively. Few studies, however, have provided empirical evidence for this theory. The broaden-and-build theory of positive emotions (B. L. Fredrickson, 1998, 2001) is used as a framework for understanding psychological resilience. The authors used a multimethod approach in 3 studies to predict that resilient people use positive emotions to rebound from, and find positive meaning in, stressful encounters. Mediational analyses revealed that the experience of positive emotions contributed, in part, to participants' abilities to achieve efficient emotion regulation, demonstrated by accelerated cardiovascular recovery from negative emotional arousal (Studies 1 and 2) and by finding positive meaning in negative circumstances (Study 3). Implications for research on resilience and positive emotions are discussed.
Cardiovascular imaging with computed tomography: responsible steps to balancing diagnostic yield and radiation exposure.
Cardiovascular computed tomography (CT) is at the center of the risk-benefit debate about ionizing radiation exposure to the public from medical procedures. Although the risk has been sensationalized, the cardiovascular CT community has responded to the scrutiny by increasing efforts to ensure the responsible use of this young technology. Efforts to date have primarily included the development of appropriateness criteria and the implementation of dose-lowering techniques. Still needed is the development of standards that incorporate radiation exposure optimization into scan protocol selection. Such standards must consider applied radiation in the context of the clinical indication as well as the characteristics of the patient and provide guidance with regard to specific parameter settings. This editorial viewpoint demonstrates the need for comprehensive, individualized review of the clinical scenario before performing a cardiovascular CT, as well as the need for standards. If cardiovascular CT is the appropriate test and scan parameters are optimized with respect to radiation exposure, benefit should necessarily outweigh potential risk. However, efforts to promote responsible cardiovascular CT imaging must continue to ensure this is true for every patient.
Location of banking automatic teller machines based on convolution
In this paper, the problem of determining the optimum number and locations of banking automatic teller machines (ATMs) is considered. The objective is to minimize the total number of ATMs needed to cover all customer demands within a given geographical area. First, a mathematical model of this optimization problem is formulated. A novel heuristic algorithm with unique features is then developed to solve this problem efficiently. Finally, simulation results show the effectiveness of this algorithm in solving the ATM placement problem. © 2009 Elsevier Ltd. All rights reserved.
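The covering objective described above is an instance of minimum set cover. As an illustrative sketch only (a generic greedy covering heuristic, not the paper's novel algorithm), one can repeatedly place an ATM at the candidate site that covers the most still-uncovered demand points:

```python
def greedy_atm_placement(demands, candidates, radius):
    """Greedy covering heuristic: place ATMs one at a time, each time at
    the candidate site covering the most still-uncovered demand points
    within Euclidean distance `radius`.

    demands, candidates: lists of (x, y) tuples. Returns (placed_sites,
    set of demand indices that no candidate can cover).
    """
    def covers(site, d):
        return (site[0] - d[0]) ** 2 + (site[1] - d[1]) ** 2 <= radius ** 2

    uncovered = set(range(len(demands)))
    placed = []
    while uncovered:
        best = max(candidates,
                   key=lambda s: sum(covers(s, demands[i]) for i in uncovered))
        newly = {i for i in uncovered if covers(best, demands[i])}
        if not newly:
            break  # remaining demands are unreachable from every candidate
        placed.append(best)
        uncovered -= newly
    return placed, uncovered

# Three demand points, two candidate sites, coverage radius 2:
placed, leftover = greedy_atm_placement(
    [(0, 0), (1, 0), (10, 0)], [(0, 0), (10, 0)], 2)
```

The greedy rule gives a logarithmic approximation guarantee for set cover; the paper's heuristic presumably improves on such baselines for the ATM-specific structure.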
Drug therapy for resistant hypertension: simplifying the approach.
Despite the availability of many effective antihypertensive drugs, the drug therapy for resistant hypertension remains a prominent problem. Reviews offer only the general recommendations of increasing dosage and adding drugs, offering clinicians little guidance with respect to the specifics of selecting medications and dosages. A simplified decision tree for drug selection that would be effective in most cases is needed. This review proposes such an approach. The approach is mechanism-based, targeting treatment at three hypertensive mechanisms: (1) sodium/volume, (2) the renin-angiotensin system (RAS), and (3) the sympathetic nervous system (SNS). It assumes baseline treatment with a 2-drug combination directed at sodium/volume and the RAS and recommends proceeding with one or both of just two treatment options: (1) strengthening the diuretic regimen, possibly with the addition of spironolactone, and/or (2) adding agents directed at the SNS, usually a β-blocker or combination of an α- and a β-blocker. The review calls for greater research and clinical attention directed to: (1) assessment of clinical clues that can help direct treatment toward either sodium/volume or the SNS, (2) increased recognition of the role of neurogenic (SNS-mediated) hypertension in resistant hypertension, (3) increased recognition of the effective but underutilized combination of α- + β-blockade, and (4) drug pharmacokinetics and dosing.
Statistical Significance Tests for Machine Translation Evaluation
If two translation systems differ in performance on a test set, can we trust that this indicates a difference in true system quality? To answer this question, we describe bootstrap resampling methods to compute statistical significance of test results, and validate them on the concrete example of the BLEU score. Even for small test sets of only 300 sentences, our methods may give us assurances that test result differences are real.
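The paired bootstrap idea can be sketched in a few lines. This is a simplified illustration, not the paper's exact procedure: it assumes precomputed per-sentence quality scores and compares their resampled means, whereas BLEU is properly recomputed at the corpus level on each resampled test set:

```python
import random

def paired_bootstrap(scores_a, scores_b, num_samples=1000, seed=0):
    """Paired bootstrap resampling over a test set.

    scores_a / scores_b: per-sentence quality scores of systems A and B
    on the same sentences. Resample the test set with replacement
    num_samples times and return the fraction of resamples on which
    system A's mean score strictly exceeds system B's. A fraction near
    1.0 (e.g. >= 0.95) suggests the observed difference is real.
    """
    rng = random.Random(seed)
    n = len(scores_a)
    wins = 0
    for _ in range(num_samples):
        idx = [rng.randrange(n) for _ in range(n)]  # resampled test set
        mean_a = sum(scores_a[i] for i in idx) / n
        mean_b = sum(scores_b[i] for i in idx) / n
        if mean_a > mean_b:
            wins += 1
    return wins / num_samples
```

With a 300-sentence test set, each bootstrap sample draws 300 sentence indices with replacement, mirroring the small-test-set regime the abstract highlights.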
AR-RRNS: Configurable reliable distributed data storage systems for Internet of Things to ensure security
The benefits of the Internet of Things and cloud–fog–edge computing come with risks to confidentiality, integrity, and availability, associated with loss of information, prolonged denial of access, information leakage, conspiracy, and technical failures. In this article, we propose a configurable, reliable, and confidential distributed data storage scheme with the ability to process encrypted data and to control the results of computations. Our system utilizes a Redundant Residue Number System (RRNS) with a new method of error correction codes and secret sharing schemes. We introduce the concept of an approximate value of the rank of a number (AR), which allows us to reduce the computational complexity of decoding from RNS to binary representation as well as the size of the coefficients. Based on the properties of the approximate value and the arithmetic properties of RNS, we introduce the AR-RRNS method for error detection, error correction, and control of computational results. We provide a theoretical basis for configuring the probability of information loss, data redundancy, and the speed of encoding and decoding to cope with different objective preferences, workloads, and storage properties. Theoretical analysis shows that, by appropriate selection of RRNS parameters, the proposed scheme allows not only increased safety and reliability and a reduced data storage overhead, but also processing of encrypted data. © 2017 Elsevier B.V. All rights reserved.
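The RRNS machinery underlying the scheme can be illustrated compactly. The sketch below uses exact Chinese Remainder Theorem reconstruction rather than the paper's approximate-rank (AR) decoding, and the moduli are illustrative: a value is stored as residues modulo pairwise-coprime moduli, redundant residues are added, and a corrupted residue is detected because the reconstructed value falls outside the legitimate range:

```python
from functools import reduce

def to_rns(x, moduli):
    """Encode integer x as its residues modulo each (pairwise coprime) modulus."""
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    """Exact Chinese Remainder Theorem reconstruction (requires Python 3.8+
    for the modular-inverse form of pow)."""
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)
    return x % M

def detect_error(residues, moduli, k):
    """RRNS error detection: only the first k moduli carry information, the
    rest are redundant. A legitimate value lies below prod(moduli[:k]);
    a single corrupted residue pushes the CRT reconstruction outside that
    range, so True means corruption was detected."""
    legit_range = reduce(lambda a, b: a * b, moduli[:k])
    return from_rns(residues, moduli) >= legit_range

# Illustrative parameters: 3 information moduli, 2 redundant moduli.
moduli = [3, 5, 7, 11, 13]
shares = to_rns(42, moduli)
```

In the distributed-storage setting, each residue would be stored on a different node, so the redundant residues simultaneously provide fault tolerance and a form of secret sharing.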
Interactive Facial Feature Localization
We address the problem of interactive facial feature localization from a single image. Our goal is to obtain an accurate segmentation of facial features on high-resolution images under a variety of pose, expression, and lighting conditions. Although there has been significant work in facial feature localization, we are addressing a new application area, namely to facilitate intelligent high-quality editing of portraits, that brings requirements not met by existing methods. We propose an improvement to the Active Shape Model that allows for greater independence among the facial components and improves on the appearance fitting step by introducing a Viterbi optimization process that operates along the facial contours. Despite the improvements, we do not expect perfect results in all cases. We therefore introduce an interaction model whereby a user can efficiently guide the algorithm towards a precise solution. We introduce the Helen Facial Feature Dataset consisting of annotated portrait images gathered from Flickr that are more diverse and challenging than currently existing datasets. We present experiments that compare our automatic method to published results, and also a quantitative evaluation of the effectiveness of our interactive method.
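The Viterbi optimization mentioned above selects, for each point along a facial contour, one of several candidate positions so that appearance cost plus smoothness cost is minimized. A generic sketch of that dynamic program follows; the cost tables here are hypothetical stand-ins, not the paper's appearance model:

```python
def viterbi_contour(unary, pairwise):
    """Minimal Viterbi DP over a chain of contour points.

    unary[t][s]: appearance cost of candidate position s at contour point t.
    pairwise(p, s): smoothness cost of moving from candidate p to candidate s
    between consecutive contour points. Returns (best path, total cost).
    """
    n_states = len(unary[0])
    cost = list(unary[0])
    back = []
    for t in range(1, len(unary)):
        new_cost, ptr = [], []
        for s in range(n_states):
            p = min(range(n_states), key=lambda q: cost[q] + pairwise(q, s))
            ptr.append(p)
            new_cost.append(cost[p] + pairwise(p, s) + unary[t][s])
        cost = new_cost
        back.append(ptr)
    # backtrack from the cheapest final state
    s = min(range(n_states), key=lambda q: cost[q])
    total = cost[s]
    path = [s]
    for ptr in reversed(back):
        s = ptr[s]
        path.append(s)
    return path[::-1], total

# Toy example: 3 contour points, 2 candidate positions each, unit cost
# for switching candidate between neighbors.
unary = [[0, 10], [10, 0], [0, 10]]
path, total = viterbi_contour(unary, lambda p, s: 0 if p == s else 1)
```

In the interactive setting of the paper, user corrections could be folded in by pinning the unary cost of the user-chosen candidate to zero and all alternatives to a large value.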
Opinion Mining: Aspect Level Sentiment Analysis using SentiWordNet and Amazon Web Services
In today's connected world, users can purchase items at any time. In online shopping sites, customers can locate a product of interest by visiting a vendor's site directly, or by searching across different vendors using a shopping search engine, which shows the availability and pricing of similar products at alternative e-retailers. The rapid growth of the audience of online shopping sites has turned these resources into a new source of the public's mood and opinion about particular products. Tracking the public's responses through reviews and feedback on online shopping sites has attracted a great deal of interest in the research community. Researchers note that millions of public opinions cannot be processed manually, which points to the need for automated methods for intelligent text analysis that can process large amounts of data in a short time and interpret customers' feedback. This interpretation of feedback is the most valuable and complicated element of the automated processing. These methods provide the opportunity to perform large-scale research and to observe online shopping sites in real time. The main focus of this paper is to determine the aspect terms present in each sentence.
Infected joint replacements in HIV-positive patients with haemophilia.
Joint replacement in HIV-positive patients remains uncommon, with most experience gained in patients with haemophilia. We analysed retrospectively the outcome of 102 replacement arthroplasties in 73 HIV-positive patients from eight specialist haemophilia centres. Of these, 91 were primary procedures. The mean age of the patients at surgery was 39 years, and the median follow-up was for five years. The overall rate of deep sepsis was 18.7% for primary procedures and 36.3% for revisions. This is a much higher rate of infection than that seen in normal populations. A total of 44% of infections resolved fully after medical and/or surgical treatment. The benefits of arthroplasty in haemophilic patients are well established but the rates of complications are high. As this large study has demonstrated, high rates of infection occur, but survivorship analysis strongly suggests that most patients already diagnosed with HIV infection at the time of surgery should derive many years of symptomatic relief after a successful joint replacement. Careful counselling and education of both patients and healthcare workers before operation are therefore essential.
Sex dimorphism of the brain in male-to-female transsexuals.
Gender dysphoria is suggested to be a consequence of sex-atypical cerebral differentiation. We tested this hypothesis in a magnetic resonance study of voxel-based morphometry and structural volumetry in 48 heterosexual men (HeM) and women (HeW) and 24 gynephilic male-to-female transsexuals (MtF-TR). Specific interest was paid to gray matter (GM) and white matter (WM) fraction, hemispheric asymmetry, and volumes of the hippocampus, thalamus, caudate, and putamen. Like HeM, MtF-TR displayed larger GM volumes than HeW in the cerebellum and lingual gyrus and smaller GM and WM volumes in the precentral gyrus. Both male groups had smaller hippocampal volumes than HeW. As in HeM, but not HeW, the right cerebral hemisphere and thalamus volume was larger than the left in MtF-TR. None of these measures differed between HeM and MtF-TR. MtF-TR also displayed singular features and differed from both control groups by having reduced thalamus and putamen volumes and elevated GM volumes in the right insular and inferior frontal cortex and an area covering the right angular gyrus. The present data do not support the notion that brains of MtF-TR are feminized. The observed changes in MtF-TR bring attention to the networks involved in the processing of body perception.
Moving objects in space: exploiting proprioception in virtual-environment interaction
Manipulation in immersive virtual environments is difficult partly because users must do without the haptic contact with real objects they rely on in the real world to orient themselves and their manipulanda. To compensate for this lack, we propose exploiting the one real object every user has in a virtual environment: his body. We present a unified framework for virtual-environment interaction based on proprioception, a person's sense of the position and orientation of his body and limbs. We describe three forms of body-relative interaction:
• Direct manipulation—ways to use body sense to help control manipulation
• Physical mnemonics—ways to store/recall information relative to the body
• Gestural actions—ways to use body-relative actions to issue commands
Automatic scaling is a way to bring objects instantly within reach so that users can manipulate them using proprioceptive cues. Several novel virtual interaction techniques based upon automatic scaling and our proposed framework of proprioception allow a user to interact with a virtual world intuitively, efficiently, precisely, and lazily. We report the results of both informal user trials and formal user studies of the usability of the body-relative interaction techniques presented.
Self-enhancing effects of exposure to thin-body images.
OBJECTIVE This study examines the effect of thin-body media images on mood, self-esteem, and self-image ratings of restrained and unrestrained eaters. A secondary purpose was to examine whether these effects were influenced by exposure duration. METHOD Under the guise of a perception study, participants were exposed to thin-body or control advertisements (e.g., perfume bottles) for either 7 or 150 ms and then completed a questionnaire packet. RESULTS Restrained eaters reported more favorable self-image and social self-esteem (but not appearance self-esteem) scores after exposure to thin-body images than after exposure to control advertisements. The self-image and social self-esteem scores of unrestrained eaters were unaffected by advertisement type, but their appearance self-esteem scores were lower after exposure to thin-body advertisements. No differences were found for mood ratings and total self-esteem. DISCUSSION We discuss restraint status as a moderator of the effects of thin-body images on women's body image.
RecResNet: A Recurrent Residual CNN Architecture for Disparity Map Enhancement
We present a neural network architecture applied to the problem of refining a dense disparity map generated by a stereo algorithm to which we have no access. Our approach is able to learn which disparity values should be modified and how, from a training set of images, estimated disparity maps and the corresponding ground truth. Its only input at test time is a disparity map and the reference image. Two design characteristics are critical for the success of our network: (i) it is formulated as a recurrent neural network, and (ii) it estimates the output refined disparity map as a combination of residuals computed at multiple scales, that is, at different up-sampling and down-sampling rates. The first property allows the network, which we named RecResNet, to progressively improve the disparity map, while the second property allows the corrections to come from different scales of analysis, addressing different types of errors in the current disparity map. We present competitive quantitative and qualitative results on the KITTI 2012 and 2015 benchmarks that surpass the accuracy of previous disparity refinement methods.
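The recurrent multi-scale residual scheme can be illustrated with a toy numpy loop. The hand-written `toy_residual` function below merely stands in for the learned CNN predictor, so this is a sketch of the update structure, not of RecResNet itself.

```python
import numpy as np

def downsample(x, f):
    """Average-pool x by factor f (assumes divisible dimensions)."""
    h, w = x.shape
    return x.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample(x, f):
    """Nearest-neighbour upsample by factor f."""
    return np.repeat(np.repeat(x, f, axis=0), f, axis=1)

def toy_residual(disp, ref, scale):
    # Placeholder for the learned residual predictor: nudge the disparity
    # toward the reference image's values (illustrative only).
    return 0.5 * (downsample(ref, scale) - downsample(disp, scale))

def refine(disp, ref, scales=(1, 2, 4), iters=3):
    """Recurrent refinement: each iteration adds residuals from several scales."""
    for _ in range(iters):
        corr = np.zeros_like(disp)
        for s in scales:
            corr += upsample(toy_residual(disp, ref, s), s)
        disp = disp + corr / len(scales)
    return disp

ref = np.full((8, 8), 10.0)   # toy "reference image"
disp = np.zeros((8, 8))       # toy initial disparity
out = refine(disp, ref)
print(out[0, 0])              # moves toward 10 over the iterations
```

The two properties the paper highlights are both visible here: the outer loop is the recurrence, and the inner sum combines residuals estimated at several down-/up-sampling rates.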
Forms of forward quadrupedal locomotion. II. A comparison of posture, hindlimb kinematics, and motor patterns for upslope and level walking.
To gain insight into the neural mechanisms controlling different forms of quadrupedal walking of normal cats, data on postural orientation, hindlimb kinematics, and motor patterns of selected hindlimb muscles were assessed for four grades of upslope walking, from 25 to 100% (45 degrees incline), and compared with similar data for level treadmill walking (0.6 m/s). Kinematic data for the hip, knee, ankle, and metatarsophalangeal joints were obtained from digitizing ciné film that was synchronized with electromyographic (EMG) records from 13 different hindlimb muscles. Cycle periods, the structure of the step cycle, and paw-contact sequences were similar at all grades and typical of lateral-sequence walking. Also, a few half-bound and transverse gallop steps were assessed from trials at the 100% grade; these steps had shorter cycle periods than the walking steps and less of the cycle (68 vs. 56%) was devoted to stance. Each cat assumed a crouched posture at the steeper grades of upslope walking and stride length decreased, whereas the overall position of the stride shifted caudally with respect to the hip joint. At the steeper grades, the range and duration of swing-related flexion increased at all joints, the stance-phase yield was absent at the knee and ankle joints, and the range of stance-phase extension at knee and ankle joints increased. Patterns of muscle activity for upslope and level walking were similar with some notable exceptions. At the steeper grades, the EMG activity of muscles with swing-related activity, such as the digit flexor muscle, the flexor digitorum longus (FDL), and the knee flexor muscle, the semitendinosus (ST), was prolonged and continued well into midswing. 
The EMG activity of stance-related muscles also increased in amplitude with grade, and three muscles not active during the stance phase of level walking had stance activity that increased in amplitude and duration at the steepest grades; these muscles were the ST, FDL, and extensor digitorum brevis. Overall the changes in posture, hindlimb kinematics, and the activity patterns of hindlimb muscles during upslope walking reflected the need to continually move the body mass forward and upward during stance and to ensure that the paw cleared the inclined slope during swing. The implications of these changes for the neural control of walking and expected changes in hindlimb kinetics for slope walking are discussed.
A long-term study of children with autism playing with a robotic pet: taking inspirations from non-directive play therapy to encourage children's proactivity and initiative-taking
This paper presents a novel methodological approach of how to design, conduct and analyse robot-assisted play. This approach is inspired by non-directive play therapy. The experimenter participates in the experiments, but the child remains the main leader for play. Besides, beyond inspiration from non-directive play therapy, this approach enables the experimenter to regulate the interaction under specific conditions in order to guide the child or ask her questions about reasoning or affect related to the robot. This approach has been tested in a long-term study with six children with autism in a school setting. An autonomous robot with zoomorphic, dog-like appearance was used in the studies. The children's progress was analyzed according to three dimensions, namely, Play, Reasoning and Affect. Results from the case-study evaluations have shown the capability of the method to meet each child's needs and abilities. Children who mainly played solitarily progressively experienced basic imitation games with the experimenter. Children who proactively played socially progressively experienced higher levels of play and constructed more reasoning related to the robot. They also expressed some interest in the robot, including, on occasion, affect.
Randomized, double-blind trial of simultaneous right and left atrial epicardial pacing for prevention of post-open heart surgery atrial fibrillation.
BACKGROUND The purpose of this study was to assess simultaneous right and left atrial pacing as prophylaxis for postoperative atrial fibrillation. METHODS AND RESULTS In a double-blind, randomized fashion, 118 patients who underwent open heart surgery were assigned to right atrial pacing at 45 bpm (RA-AAI; n=39), right atrial triggered pacing at a rate of ≥85 bpm (RA-AAT; n=38), or simultaneous right and left atrial triggered pacing at a rate of ≥85 bpm (Bi-AAT; n=41). Holter monitoring was performed for 4.8±1.4 days after surgery to assess for episodes of atrial fibrillation lasting >5 minutes. The prevalence of postoperative atrial fibrillation was significantly less in the patients randomized to biatrial AAT pacing when compared with the other 2 pacing regimens (P=0.02). An episode of atrial fibrillation occurred in 4 (10%) of 41 patients in the Bi-AAT group compared with 11 (28%) of 39 patients in the RA-AAI group (P=0.03 versus Bi-AAT) and 12 (32%) of 38 patients in the RA-AAT group (P=0.01 versus Bi-AAT). There was no difference in the occurrence of atrial fibrillation between the right atrial AAI and AAT groups (P=0.8). There was no significant difference among the 3 groups with regard to the number of postoperative hospital days (7.3±4.2 days), morbidity (5.1%), or mortality rate (2.5%). CONCLUSIONS Simultaneous right and left atrial triggered pacing is well tolerated and significantly reduces the prevalence of post-open heart surgery atrial fibrillation.
Trustworthiness Attributes and Metrics for Engineering Trusted Internet-Based Software Systems
Trustworthiness of Internet-based software systems, apps, services and platform is a key success factor for their use and acceptance by organizations and end-users. The notion of trustworthiness, though, is subject to individual interpretation and preference, e.g., organizations require confidence about how their business critical data is handled whereas end-users may be more concerned about usability. As one main contribution, we present an extensive list of software quality attributes that contribute to trustworthiness. Those software quality attributes have been identified by a systematic review of the research literature and by analyzing two real-world use cases. As a second contribution, we sketch an approach for systematically deriving metrics to measure the trustworthiness of software system. Our work thereby contributes to better understanding which software quality attributes should be considered and assured when engineering trustworthy Internet-based software systems.
Neutron star population dynamics
We study the field millisecond pulsar (MSP) population to infer its intrinsic distribution in spin period and luminosity and to determine its spatial distribution within the Galaxy. Our likelihood analysis on data from extant surveys (22 pulsars with periods less than 20 ms) accounts for the following important selection effects: (1) the survey sensitivity as a function of direction, spin period, and sky coverage; (2) interstellar scintillation, which modulates the pulsed flux and causes a net increase in search volume of ≈30%; and (3) errors in the pulsar distance scale. Adopting power-law models (with cutoffs) for the intrinsic distributions, the analysis yields a minimum-period cutoff ≳0.65 ms (99% confidence), a period distribution proportional to P^{−2.0±0.33}, and a pseudoluminosity distribution proportional to L_p^{−2.0±0.2} (where L_p is the product of the flux density and the square of the distance), for L_p ≳ 1.1 mJy kpc². We find that the column density of MSPs (uncorrected for beaming effects) is ≈50^{+30}_{−20} kpc^{−2} in the vicinity of the solar system. For a Gaussian model, the z scale height is […]^{+0.16}_{−0.12} kpc, corresponding to a local number density of […]^{+17}_{−11} kpc^{−3}. (For an exponential model, the scale height becomes […]^{+0.19}_{−0.13} kpc, and the number density […]^{+25}_{−16} kpc^{−3}.) Estimates of the total number of MSPs in the disk of the Galaxy and of the associated birthrate are given. The contribution of a diffuse halo-like component (tracing the Galactic spheroid, the halo, or the globular cluster density profile) to the local number density of MSPs is limited to ≲1% of the midplane value. We consider a kinematic model for the MSP spatial distribution in which objects in the disk are kicked once at birth and then orbit in a smooth Galactic potential, becoming dynamically well mixed. The analysis yields a column density of […]^{+27}_{−17} kpc^{−2} (comparable to the above), a birth z kick velocity of […]^{+17}_{−11} km s^{−1}, and a three-dimensional velocity dispersion of ≈84 km s^{−1}.
MSP velocities are smaller than those of young, long-period pulsars by about a factor of 5. The kinematic properties of the MSP population are discussed, including expected transverse motions, the occurrence of asymmetric drift, the shape of the velocity ellipsoid, and the z scale height at birth. If MSPs are long-lived, then a significant contribution to observed MSP z velocities is the result of diffusive processes that increase the scale height of old stellar populations; our best estimate of the one-dimensional velocity kick that is unique to MSP evolution is ≈40 km s^{−1} if such diffusion is taken into account. The scale heights of millisecond pulsars and low-mass X-ray binaries are consistent, suggesting a common origin and that the primary channel for forming both classes of objects imparts only low velocities. Binaries involving a common-envelope phase and a neutron-star-forming supernova explosion can yield such objects, even with explosion asymmetries like those needed to provide the velocity distribution of isolated, non-spun-up radio pulsars. Future searches for MSPs may be optimized using the model results. As an example, we give the expected number of detectable MSPs per beam area and the volumes of the Galaxy sampled per beam area for a hypothetical Green Bank Telescope all-sky survey. Estimates for the volume that must be surveyed to find a pulsar faster than 1.5 ms are given. We also briefly discuss how selection effects associated with fast binaries influence our results. Subject headings: pulsars: general – stars: kinematics – stars: neutron – stars: rotation – stars: statistics
Leaky CPW-Based Slot Antenna Arrays for Millimeter-Wave Applications
A uniplanar leaky-wave antenna (LWA) suitable for operation at millimeter-wave frequencies is introduced. Both unidirectional and bidirectional versions of the antenna are presented. The proposed structure consists of a coplanar waveguide fed linear array of closely spaced capacitive transverse slots. This configuration results in a fast-wave structure in which the n = 0 spatial harmonic radiates in the forward direction. Since the distance between adjacent elements of the array is small, the slot array essentially becomes a uniform LWA. A comprehensive transmission line model is developed based upon the theory of truncated periodic transmission lines to explain the operation of the antenna and provide a tool for its design. Measured and simulated radiation patterns, directivity, gain, and an associated loss budget are presented for a 32-element antenna operating at 30 GHz. The uniplanar nature of the structure makes the antenna appropriate for integration of shunt variable capacitors, such as diode or micro-electromechanical system varactors, for fixed-frequency beam steering at low bias voltages.
Towards the Implementation of IoT for Environmental Condition Monitoring in Homes
In this paper, we report an effective implementation of the Internet of Things for monitoring regular domestic conditions by means of a low-cost ubiquitous sensing system. We describe the integrated network architecture and the interconnecting mechanisms for reliable measurement of parameters by smart sensors and transmission of data via the internet. The longitudinal learning system was able to provide a self-control mechanism for better operation of the devices in the monitoring stage. The framework of the monitoring system is based on a combination of pervasive distributed sensing units, an information system for data aggregation, and reasoning and context awareness. Results are encouraging, as the reliability of sensing-information transmission through the proposed integrated network architecture is 97%. The prototype was tested to generate real-time graphical information rather than in a test-bed scenario.
Control of 24-hour intragastric acidity with morning dosing of immediate-release and delayed-release proton pump inhibitors in patients with GERD.
GOALS To compare the effects of immediate-release omeprazole and 2 different delayed-release proton pump inhibitors on 24-hour intragastric acidity in gastroesophageal reflux disease patients. BACKGROUND Because of its unique pharmacokinetic properties, immediate-release omeprazole does not need to be dosed before a meal to control intragastric acidity. Previous studies showed effectiveness of immediate-release omeprazole in controlling nocturnal intragastric acidity with bedtime dosing. This is the first study to compare the effects of prebreakfast dosing of immediate-release omeprazole and delayed-release lansoprazole and pantoprazole on 24-hour intragastric acidity. AIM To compare the effects of prebreakfast dosing of immediate-release omeprazole 40 mg capsules, lansoprazole 30 mg capsules, and pantoprazole 40 mg tablets on 24-hour intragastric acidity. METHODS Fifty-five patients with gastroesophageal reflux disease received 7 consecutive once-daily morning doses of each drug in this open-label, randomized, 3-period crossover study. On day 7, intragastric pH was recorded for 24 hours. RESULTS After 7 days, the percentage of time with intragastric pH >4 over 24 hours was 59.7% (14.3 hours) with immediate-release omeprazole, 48.8% (11.7 hours) with lansoprazole (P=0.005), and 41.8% (10.0 hours) with pantoprazole (P<0.001). Median intragastric pH was significantly higher with immediate-release omeprazole than with lansoprazole (P=0.003) or pantoprazole (P<0.001). All drugs were well tolerated. CONCLUSIONS When dosed in the morning, immediate-release omeprazole provided significantly better control of 24-hour intragastric acidity than lansoprazole and pantoprazole.
Beyond 5G With UAVs: Foundations of a 3D Wireless Cellular Network
In this paper, a novel concept of three-dimensional (3D) cellular networks, that integrate drone base stations (drone-BS) and cellular-connected drone users (drone-UEs), is introduced. For this new 3D cellular architecture, a novel framework for network planning for drone-BSs and latency-minimal cell association for drone-UEs is proposed. For network planning, a tractable method for drone-BSs’ deployment based on the notion of truncated octahedron shapes is proposed, which ensures full coverage for a given space with a minimum number of drone-BSs. In addition, to characterize frequency planning in such 3D wireless networks, an analytical expression for the feasible integer frequency reuse factors is derived. Subsequently, an optimal 3D cell association scheme is developed for which the drone-UEs’ latency, considering transmission, computation, and backhaul delays, is minimized. To this end, first, the spatial distribution of the drone-UEs is estimated using a kernel density estimation method, and the parameters of the estimator are obtained using a cross-validation method. Then, according to the spatial distribution of drone-UEs and the locations of drone-BSs, the latency-minimal 3D cell association for drone-UEs is derived by exploiting tools from optimal transport theory. The simulation results show that the proposed approach reduces the latency of drone-UEs compared with the classical cell association approach that uses a signal-to-interference-plus-noise ratio (SINR) criterion. In particular, the proposed approach yields a reduction of up to 46% in the average latency compared with the SINR-based association. The results also show that the proposed latency-optimal cell association improves the spectral efficiency of a 3D wireless cellular network of drones.
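The cross-validated kernel density estimation step described above can be sketched in plain numpy. The toy 3D user positions, the Gaussian kernel, the leave-one-out criterion, and the bandwidth grid are all assumptions for illustration, not values or choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 3D drone-user positions (two clusters); real inputs would be measurements.
users = np.vstack([rng.normal(0.0, 1.0, (150, 3)), rng.normal(6.0, 1.0, (150, 3))])

def kde(points, query, h):
    """Gaussian kernel density estimate in 3D with bandwidth h."""
    d2 = ((query[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    norm = (2 * np.pi * h * h) ** 1.5
    return np.exp(-d2 / (2 * h * h)).mean(axis=1) / norm

def loo_log_likelihood(points, h):
    """Leave-one-out cross-validation score used to select the bandwidth."""
    n = len(points)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    k = np.exp(-d2 / (2 * h * h))
    np.fill_diagonal(k, 0.0)                      # exclude each point itself
    dens = k.sum(axis=1) / ((n - 1) * (2 * np.pi * h * h) ** 1.5)
    return np.log(dens + 1e-300).sum()

bandwidths = np.linspace(0.2, 2.0, 10)
best_h = max(bandwidths, key=lambda h: loo_log_likelihood(users, h))

# Density at candidate sites: high near the user clusters, low elsewhere.
q = np.array([[0.0, 0.0, 0.0], [20.0, 20.0, 20.0]])
print(best_h, kde(users, q, best_h))
```

The resulting density estimate is the spatial distribution that the paper's cell-association step would then feed into its optimal-transport formulation.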
Business process verification - finally a reality!
Purpose – The purpose of this paper is to demonstrate that process verification has matured to a level where it can be used in practice. This paper reports on new verification techniques that can be used to assess the correctness of real-life models. Design/methodology/approach – The proposed approach relies on using formal methods to determine the correctness of business processes with cancellation and OR-joins. The paper also demonstrates how reduction rules can be used to improve efficiency. These techniques are presented in the context of the workflow language YAWL (Yet Another Workflow Language), which provides direct support for the 20 most frequently used patterns found today (including cancellation and OR-joins). But the results also apply to other languages with these features (e.g. BPMN, EPCs, UML activity diagrams, etc.). An editor has been developed that provides diagnostic information based on the techniques presented in this paper. Findings – The paper proposes four properties for business processes with cancellation and OR-joins, namely: soundness, weak soundness, irreducible cancellation regions and immutable OR-joins, and develops new techniques to verify these properties. Reduction rules have been used as a means of improving the efficiency of the algorithm. The paper demonstrates the feasibility of this verification approach using a realistic and complex business process, the visa application process for general skilled migration to Australia, modelled as a YAWL workflow with cancellation regions and OR-joins. Originality/value – Business processes sometimes require complex execution interdependencies to complete properly. For instance, it is possible that certain activities need to be cancelled midway through the process. Some parallel activities may require complex “wait and see” style synchronisation depending on a given context.
These types of business processes can be found in various domains, such as application integration, B2B commerce, web service composition and workflow systems. Even though cancellation and sophisticated join structures are present in many business processes, existing verification techniques are unable to deal with such processes. Hence, this paper plays an important role in making process verification a reality.
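To make the notion of verification concrete, a minimal reachability-based soundness check on a toy workflow net can be sketched as follows. The net is hypothetical, assumes a safe net (set-valued markings), and omits the cancellation regions and OR-joins that the paper's YAWL-specific techniques handle.

```python
from collections import deque

# Toy workflow net: each transition maps input places to output places.
# A hypothetical XOR split/join from place i (start) to place o (end).
transitions = {
    "start": ({"i"}, {"p1"}),
    "taskA": ({"p1"}, {"p2"}),
    "taskB": ({"p1"}, {"p3"}),
    "joinA": ({"p2"}, {"o"}),
    "joinB": ({"p3"}, {"o"}),
}

def reachable(marking):
    """All markings reachable from `marking` (markings are frozensets of places)."""
    seen, queue = {marking}, deque([marking])
    while queue:
        m = queue.popleft()
        for pre, post in transitions.values():
            if pre <= m:
                nxt = frozenset((m - pre) | post)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

def sound():
    """Core of classical soundness: every marking reachable from the initial
    marking {i} can still reach the final marking {o} (option to complete)."""
    final = frozenset({"o"})
    return all(final in reachable(m) for m in reachable(frozenset({"i"})))

print(sound())
```

On this net the check succeeds; introducing a transition that consumes p2 without producing o would create a deadlock that the same reachability sweep detects.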
A scaffolding approach to coreference resolution integrating statistical and rule-based models
We describe a scaffolding approach to the task of coreference resolution that incrementally combines statistical classifiers, each designed for a particular mention type, with rule-based models (for sub-tasks well-matched to determinism). We motivate our design by an oracle-based analysis of errors in a rule-based coreference resolution system, showing that rule-based approaches are poorly suited to tasks that require a large lexical feature space, such as resolving pronominal and common-noun mentions. Our approach combines many advantages: it incrementally builds clusters integrating joint information about entities, uses rules for deterministic phenomena, and integrates rich lexical, syntactic, and semantic features with random forest classifiers well-suited to modeling the complex feature interactions that are known to characterize the coreference task. We demonstrate that all these decisions are important. The resulting system achieves 63.2 F1 on the CoNLL-2012 shared task dataset, outperforming the rule-based starting point by over 7 F1 points. Similarly, our system outperforms an equivalent sieve-based approach that relies on logistic regression classifiers instead of random forests by over 4 F1 points. Lastly, we show that by changing the coreference resolution system from relying on constituent-based syntax to using dependency syntax, which can be generated in linear time, we achieve a runtime speedup of 550% without considerable loss of accuracy.
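The scaffolding idea above (deterministic rules where they work, statistical scoring elsewhere) can be caricatured in a few lines. The mentions, the head-match rule, and the hand-set scores below are entirely hypothetical, and a toy score lookup stands in for the paper's random forest classifiers over lexical features.

```python
# Hypothetical mentions: (id, text, type); chosen only to illustrate the flow.
mentions = [
    (0, "Barack Obama", "proper"),
    (1, "Obama", "proper"),
    (2, "the president", "common"),
    (3, "he", "pronoun"),
]

def rule_head_match(m1, m2):
    """Deterministic sieve: link proper mentions sharing a head word."""
    return m1[2] == "proper" and m2[2] == "proper" and \
        m1[1].split()[-1] == m2[1].split()[-1]

def statistical_score(m1, m2):
    """Stand-in for the learned classifier (random forests in the paper):
    a toy lookup over hand-set pair scores, for illustration only."""
    weights = {("the president", "Obama"): 0.9, ("he", "the president"): 0.8}
    return weights.get((m2[1], m1[1]), 0.1)

clusters = {m[0]: {m[0]} for m in mentions}

def merge(a, b):
    union = clusters[a] | clusters[b]
    for i in union:
        clusters[i] = union

# Pass over mention pairs: rules handle deterministic phenomena first,
# the statistical scorer handles common nouns and pronouns.
for i, m2 in enumerate(mentions):
    for m1 in mentions[:i]:
        if rule_head_match(m1, m2) or statistical_score(m1, m2) > 0.5:
            merge(m1[0], m2[0])

print(sorted(clusters[3]))   # the pronoun joins the entity cluster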
Repressed induction of interferon-related microRNAs miR-146a and miR-155 in peripheral blood mononuclear cells infected with HCV genotype 4
MicroRNAs regulate the expression of many genes and subsequently control various cellular processes, such as the immune response to viral infections mediated by type I interferon (IFN). In this study, the expression pattern of two interferon-related microRNAs, miR-146a and miR-155, was examined in healthy and HCV-genotype-4-infected peripheral blood mononuclear cells (PBMCs) using qRT-PCR. In contrast to other viral infections, the expression pattern was similar in both healthy and infected PBMCs. This could be attributed to attenuation of IFN pathway by HCV, which was assessed by investigating the expression of MxA, an interferon-stimulated gene, that showed lower expression in HCV-infected PBMCs. To determine the site of interference of HCV in the IFN pathway, expression of both microRNAs was examined following stimulation of PBMCs with IFN-α2a, an activator of the JAK/STAT pathway as well as with imiquimod, a toll-like receptor-7 (TLR-7) agonist that promotes interferon release. IFN stimulation induced the expression of miR-146a and miR-155 in HCV-infected and healthy PBMCs. Stimulation with imiquimod led to a down-regulation of both microRNAs in infected PBMCs, while it increased their expression in healthy PBMCs, indicating that HCV might interfere with miR-146a and miR-155 expression at sites upstream of interferon release, specifically in the TLR-7 pathway. The pattern of expression of both miR-146a and miR-155 was very similar with a strong positive correlation, but showed no correlation to the patients' clinical or histopathological parameters or response to treatment. In conclusion, HCV infection might repress the induction of miR-146a and miR-155 by interfering with TLR-7 signaling.
Building equitable computer science classrooms: elements of a teaching approach
This paper offers a framework for equitable instruction that emerged while designing a computer science course for students entering the sixth grade. Leveraging research from a range of fields, including sociology, mathematics education, and the learning sciences, we argue that in addition to material resources, such as rich course content and quality instruction, equity also depends on students' access to non-material resources, such as productive domain identities and peer relationships. We illustrate each dimension of the framework by describing in detail a core set of pedagogical practices implemented during a summer course.
Structuration Theory and Information Systems Research
Research on the social and organizational aspects of information systems often lacks an adequate theoretical and methodological basis. In this paper, we propose that structuration theory provides a broad conception of social action and human society which can be used as the basis for empiricallyorientated theory and research. A critique is given of some published work, which makes use of the theory, on the introduction of new technology and group decision support systems. A categorization of the use of structuration theory in IS research is proposed and new application areas are identified and discussed, including design and development, strategy formation, user resistance to implementation, and the informating aspects of information systems.
Estimating mono- and bi-phasic regression parameters using a mixture piecewise linear Bayesian hierarchical model
The dynamics of tumor burden, secreted proteins or other biomarkers over time, is often used to evaluate the effectiveness of therapy and to predict outcomes for patients. Many methods have been proposed to investigate longitudinal trends to better characterize patients and to understand disease progression. However, most approaches assume a homogeneous patient population and a uniform response trajectory over time and across patients. Here, we present a mixture piecewise linear Bayesian hierarchical model, which takes into account both population heterogeneity and nonlinear relationships between biomarkers and time. Simulation results show that our method was able to classify subjects according to their patterns of treatment response with greater than 80% accuracy in the three scenarios tested. We then applied our model to a large randomized controlled phase III clinical trial of multiple myeloma patients. Analysis results suggest that the longitudinal tumor burden trajectories in multiple myeloma patients are heterogeneous and nonlinear, even among patients assigned to the same treatment cohort. In addition, between cohorts, there are distinct differences in terms of the regression parameters and the distributions among categories in the mixture. Those results imply that longitudinal data from clinical trials may harbor unobserved subgroups and nonlinear relationships; accounting for both may be important for analyzing longitudinal data.
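As a rough frequentist analogue of the model described above, the mono- versus bi-phasic distinction can be illustrated by comparing a single line against a one-breakpoint piecewise fit via BIC. The paper's actual model is a Bayesian hierarchical mixture; the data, the breakpoint grid search, and the BIC stand-in below are all assumptions for illustration.

```python
import numpy as np

def fit_linear(t, y):
    """Ordinary least squares line; returns residual sum of squares and #params."""
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return ((y - X @ beta) ** 2).sum(), 2

def fit_biphasic(t, y):
    """Piecewise linear with one breakpoint, chosen by grid search over t."""
    best = (np.inf, 4)
    for tb in t[2:-2]:
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - tb, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        best = min(best, (((y - X @ beta) ** 2).sum(), 4))
    return best

def classify(t, y):
    """BIC comparison as a crude stand-in for the mixture assignment."""
    n = len(t)
    bics = []
    for rss, k in (fit_linear(t, y), fit_biphasic(t, y)):
        bics.append(n * np.log(rss / n + 1e-12) + k * np.log(n))
    return "mono-phasic" if bics[0] <= bics[1] else "bi-phasic"

t = np.linspace(0, 10, 40)
mono = 5.0 - 0.3 * t                                   # steady decline
bi = np.where(t < 4, 5.0 - 1.0 * t, 1.0 + 0.2 * (t - 4))  # decline then regrowth
rng = np.random.default_rng(1)
print(classify(t, mono + rng.normal(0, 0.05, 40)),
      classify(t, bi + rng.normal(0, 0.05, 40)))
```

The bi-phasic shape here mimics a tumor-burden trajectory that responds to treatment and then relapses, the kind of heterogeneity the mixture model is designed to capture.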
Question-answering in an industrial setting
This work discusses a mix of challenges arising from Watson Discovery Advisor (WDA), an industrial strength descendant of the Watson Jeopardy! Question Answering system currently used in production in industry settings. Typical challenges include generation of appropriate training questions, adaptation to new industry domains, and iterative improvement of the system through manual error analyses.
Contact sexual offending by men with online sexual offenses.
There is much concern about the likelihood that online sexual offenders (particularly online child pornography offenders) have either committed or will commit offline sexual offenses involving contact with a victim. This study addresses this question in two meta-analyses: the first examined the contact sexual offense histories of online offenders, whereas the second examined the recidivism rates from follow-up studies of online offenders. The first meta-analysis found that approximately 1 in 8 online offenders (12%) have an officially known contact sexual offense history at the time of their index offense (k = 21, N = 4,464). Approximately one in two (55%) online offenders admitted to a contact sexual offense in the six studies that had self-report data (N = 523). The second meta-analysis revealed that 4.6% of online offenders committed a new sexual offense of some kind during a 1.5- to 6-year follow-up (k = 9, N = 2,630); 2.0% committed a contact sexual offense and 3.4% committed a new child pornography offense. The results of these two quantitative reviews suggest that there may be a distinct subgroup of online-only offenders who pose relatively low risk of committing contact sexual offenses in the future.
Research Priorities for Robust and Beneficial Artificial Intelligence
Artificial intelligence (AI) research has explored a variety of problems and approaches since its inception, but for the last 20 years or so has been focused on the problems surrounding the construction of intelligent agents — systems that perceive and act in some environment. In this context, the criterion for intelligence is related to statistical and economic notions of rationality — colloquially, the ability to make good decisions, plans, or inferences. The adoption of probabilistic representations and statistical learning methods has led to a large degree of integration and crossfertilization between AI, machine learning, statistics, control theory, neuroscience, and other fields. The establishment of shared theoretical frameworks, combined with the availability of data and processing power, has yielded remarkable successes in various component tasks such as speech recognition, image classification, autonomous vehicles, machine translation, legged locomotion, and question-answering systems.
Matrix Factorization Techniques for Recommender Systems
As the Netflix Prize competition has demonstrated, matrix factorization models are superior to classic nearest neighbor techniques for producing product recommendations, allowing the incorporation of additional information such as implicit feedback, temporal effects, and confidence levels.
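The latent-factor idea behind matrix factorization can be sketched in a few lines: learn user and item factor vectors by stochastic gradient descent on the observed ratings, so that their dot products approximate known entries and generalize to missing ones. The sketch below is a toy illustration, not the Netflix Prize setup; the ratings matrix, the hyperparameters, and the convention that zero means "missing" are all assumptions made for the example.

```python
import numpy as np

def factorize(R, k=2, steps=2000, lr=0.01, reg=0.02, seed=0):
    """Approximate a ratings matrix R as P @ Q.T via SGD on observed entries.
    Zero entries are treated as missing (an illustrative convention)."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = rng.normal(scale=0.1, size=(n_users, k))  # user factors
    Q = rng.normal(scale=0.1, size=(n_items, k))  # item factors
    observed = [(u, i) for u in range(n_users)
                for i in range(n_items) if R[u, i] > 0]
    for _ in range(steps):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]
            # gradient step with L2 regularization on both factor vectors
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
P, Q = factorize(R)
pred = P @ Q.T  # predictions for every (user, item) pair, observed or not
```

The regularization term (`reg`) is what keeps the low-rank fit from simply memorizing the observed entries, which is the key to generalizing to the missing ones.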
Cyber-physical attacks in power networks: Models, fundamental limitations and monitor design
Future power networks will be characterized by safe and reliable functionality against physical and cyber attacks. This paper proposes a unified framework and advanced monitoring procedures to detect and identify malfunctions of network components or corruption of measurements caused by an omniscient adversary. We model a power system under cyber-physical attack as a linear time-invariant descriptor system with unknown inputs. Our attack model generalizes the prototypical stealth, (dynamic) false-data injection and replay attacks. We characterize the fundamental limitations of both static and dynamic procedures for attack detection and identification. Additionally, we design provably correct (dynamic) detection and identification procedures based on tools from geometric control theory. Finally, we illustrate the effectiveness of our method through a comparison with existing (static) detection algorithms, and through a numerical study.
Life-History Evolution: At the Origins of Metamorphosis
Metamorphosis is a widespread life history strategy of animals but apart from some model organisms it is poorly characterized. A recent study of moon jellies highlights the similarities and differences between the various types of metamorphosis and illuminates its molecular determinants.
Clone-resistant vehicular RKE by deploying SUC
Many automotive Remote Keyless Entry (RKE) systems have been successfully attacked in recent years, and the security of RKE systems remains a sensitive and crucial issue for the vehicular industry. The problem stems from a fundamental fabrication weakness: the manufacturer can clone his own fabricated keys and create identical units. This should not be possible. The observation that "the only safe secret is the one which nobody knows" leads to proposing the fabrication of a physical "Secret Unknown Cipher" (SUC) to serve as a clone-resistant security anchor (module) for the RKE system. Each fabricated key, in the ultimate case, is then equivalent to an individual unclonable DNA-like biological entity. We present a protocol exhibiting mutual authentication properties to manage a complete RKE system. It is based on a digital hardware token incorporating an SUC module designed to fit the vehicular industrial mass-production environment at reasonable cost.
Attention to Head Locations for Crowd Counting
Occlusions, complex backgrounds, scale variations and non-uniform distributions present great challenges for crowd counting in practical applications. In this paper, we propose a novel method using an attention model to exploit head locations, which are the most important cue for crowd counting. The attention model estimates a probability map in which high probabilities indicate locations where heads are likely to be present. The estimated probability map is used to suppress non-head regions in feature maps from several multi-scale feature extraction branches of a convolutional neural network for crowd density estimation, which makes our method robust to complex backgrounds, scale variations and non-uniform distributions. In addition, we introduce a relative deviation loss to complement the commonly used training loss, the Euclidean distance, and improve the accuracy of sparse crowd density estimation. Experiments on the ShanghaiTech, UCF CC 50 and WorldExpo’10 datasets demonstrate the effectiveness of our method.
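The two ideas described here, masking feature maps with a head-probability map and a relative deviation loss for sparse scenes, can be sketched in isolation. The elementwise formulation, array shapes, and the `eps` smoothing term below are illustrative assumptions; the paper's actual model is a multi-branch CNN trained end to end.

```python
import numpy as np

def apply_head_attention(feature_maps, prob_map):
    """Suppress non-head regions: scale every channel of a (C, H, W)
    feature tensor by the (H, W) head-probability map in [0, 1]."""
    assert prob_map.min() >= 0.0 and prob_map.max() <= 1.0
    return feature_maps * prob_map[None, :, :]  # broadcast over channels

def relative_deviation_loss(pred_count, true_count, eps=1.0):
    """Relative counting error; unlike an absolute (Euclidean) loss, the
    same miscount is penalized more heavily when the true count is small."""
    return abs(pred_count - true_count) / (true_count + eps)
```

Because the Euclidean loss alone is dominated by dense images, adding a relative term like this is one plausible way to keep the model accurate on sparse crowds.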
Scenario and countermeasure for replay attack using join request messages in LoRaWAN
LPWAN (Low Power Wide Area Network) technologies have been attracting continuous attention in the IoT (Internet of Things). LoRaWAN is an LPWAN technology on the market featuring low power consumption, low transceiver chip cost and wide coverage area. In LoRaWAN, end devices must perform a join procedure to participate in the network. Attackers can exploit the join procedure because it has security vulnerabilities. A replay attack is one method of exploiting this vulnerability in the join procedure. In this paper, we propose an attack scenario and a countermeasure against replay attacks that may occur in the join request transfer process.
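A common countermeasure against join-request replay on the network-server side is to remember the DevNonce values already seen for each device and reject any join request that repeats one. The sketch below is a minimal illustration of that idea only; the class and method names are hypothetical and not taken from the LoRaWAN specification, and real servers also verify the message integrity code before accepting a request.

```python
class JoinServer:
    """Minimal sketch of a DevNonce-based replay countermeasure: track the
    nonces already used per device and reject any reuse."""

    def __init__(self):
        self.seen = {}  # dev_eui -> set of DevNonces observed so far

    def handle_join_request(self, dev_eui, dev_nonce):
        used = self.seen.setdefault(dev_eui, set())
        if dev_nonce in used:
            # a captured join request is being replayed
            return "rejected: replayed DevNonce"
        used.add(dev_nonce)
        return "accepted"
```

The state must persist across device sessions, since a replayed join request is only detectable if the server remembers nonces from earlier join procedures.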
A Co-CRISPR Strategy for Efficient Genome Editing in Caenorhabditis elegans
Genome editing based on CRISPR (clustered regularly interspaced short palindromic repeats)-associated nuclease (Cas9) has been successfully applied in dozens of diverse plant and animal species, including the nematode Caenorhabditis elegans. The rapid life cycle and easy access to the ovary by micro-injection make C. elegans an ideal organism both for applying CRISPR-Cas9 genome editing technology and for optimizing genome-editing protocols. Here we report efficient and straightforward CRISPR-Cas9 genome-editing methods for C. elegans, including a Co-CRISPR strategy that facilitates detection of genome-editing events. We describe methods for detecting homologous recombination (HR) events, including direct screening methods as well as new selection/counterselection strategies. Our findings reveal a surprisingly high frequency of HR-mediated gene conversion, making it possible to rapidly and precisely edit the C. elegans genome both with and without the use of co-inserted marker genes.
The Generality of Interview-Informed Functional Analyses: Systematic Replications in School and Home.
Behavioral interventions preceded by a functional analysis have been proven efficacious in treating severe problem behavior associated with autism. There is, however, a lack of research showing socially validated outcomes when assessment and treatment procedures are conducted by ecologically relevant individuals in typical settings. In this study, interview-informed functional analyses and skill-based treatments (Hanley et al. in J Appl Behav Anal 47:16-36, 2014) were applied by a teacher and home-based provider in the classroom and home of two children with autism. The function-based treatments resulted in socially validated reductions in severe problem behavior (self-injury, aggression, property destruction). Furthermore, skills that were lacking at baseline (functional communication, denial and delay tolerance, and compliance with adult instructions) occurred with regularity following intervention. The generality and costs of the process are discussed.
HSA-RNN: Hierarchical Structure-Adaptive RNN for Video Summarization
Although video summarization has achieved great success in recent years, few approaches have considered the influence of video structure on the summarization results. Video data follow a hierarchical structure, i.e., a video is composed of shots, and a shot is composed of several frames. Generally, shots provide the activity-level information for people to understand the video content. However, few existing summarization approaches pay attention to the shot segmentation procedure; they generate shots by trivial strategies, such as fixed-length segmentation, which may destroy the underlying hierarchical structure of video data and further reduce the quality of generated summaries. To address this problem, we propose a structure-adaptive video summarization approach that integrates shot segmentation and video summarization into a Hierarchical Structure-Adaptive RNN, denoted HSA-RNN. We evaluate the proposed approach on four popular datasets, i.e., SumMe, TVsum, CoSum and VTW. The experimental results demonstrate the effectiveness of HSA-RNN in the video summarization task.
Opinion Dynamics and the Evolution of Social Power in Influence Networks
This paper studies the evolution of self-appraisal, social power and interpersonal influences for a group of individuals who discuss and form opinions about a sequence of issues. Our empirical model combines the averaging rule by DeGroot to describe opinion formation processes and the reflected appraisal mechanism by Friedkin to describe the dynamics of individuals’ self-appraisal and social power. Given a set of relative interpersonal weights, the DeGroot-Friedkin model predicts the evolution of the influence network governing the opinion formation process. We provide a rigorous mathematical formulation of the influence network dynamics, characterize its equilibria and establish its convergence properties for all possible structures of the relative interpersonal weights and corresponding eigenvector centrality scores. The model predicts that the social power ranking among individuals is asymptotically equal to their centrality ranking, that social power tends to accumulate at the top of the hierarchy, and that an autocratic (resp. democratic) power structure arises when the centrality scores are maximally non-uniform (resp. uniform).
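The DeGroot averaging rule underlying the model is easy to simulate: opinions evolve as x(t+1) = W x(t) for a row-stochastic influence matrix W, and (for a strongly connected, aperiodic network) converge to a consensus weighted by the eigenvector centrality of W. The matrix and initial opinions below are illustrative examples, not data from the paper.

```python
import numpy as np

def degroot(W, x0, steps=100):
    """Iterate the DeGroot opinion update x(t+1) = W x(t)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = W @ x
    return x

# Row-stochastic influence matrix: entry W[i, j] is the weight
# individual i places on individual j's opinion.
W = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
x = degroot(W, [1.0, 0.0, 0.5])  # opinions converge to a common value
```

The consensus value equals the dot product of the initial opinions with the dominant left eigenvector of W, which is exactly the eigenvector-centrality weighting that the DeGroot-Friedkin analysis builds on.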
Face anti-spoofing based on color texture analysis
Research on face spoofing detection has mainly been focused on analyzing the luminance of face images, hence discarding the chrominance information, which can be useful for discriminating fake faces from genuine ones. In this work, we propose a new face anti-spoofing method based on color texture analysis. We analyze the joint color-texture information from the luminance and the chrominance channels using a color local binary pattern descriptor. More specifically, the feature histograms are extracted from each image band separately. Extensive experiments on two benchmark datasets, namely the CASIA face anti-spoofing and Replay-Attack databases, showed excellent results compared to the state-of-the-art. Most importantly, our inter-database evaluation shows that the proposed approach has very promising generalization capabilities.
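The per-channel descriptor idea can be sketched with a basic 8-neighbor LBP code computed on each band and the channel histograms concatenated. This is a simplified illustration only: uniform patterns, multi-scale operators, the color-space conversion, and the classifier are all omitted, and the input is assumed to already be a (H, W, 3) array in a luminance-chrominance space such as YCbCr.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbor LBP code for each interior pixel: each neighbor
    that is >= the center pixel contributes one bit to an 8-bit code."""
    c = gray[1:-1, 1:-1]
    neighbors = [gray[0:-2, 0:-2], gray[0:-2, 1:-1], gray[0:-2, 2:],
                 gray[1:-1, 2:],   gray[2:, 2:],     gray[2:, 1:-1],
                 gray[2:, 0:-2],   gray[1:-1, 0:-2]]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbors):
        code |= (n >= c).astype(np.uint8) << bit
    return code

def color_lbp_histogram(img):
    """Concatenate the 256-bin LBP histograms of all image bands
    (luminance plus the two chrominance channels)."""
    hists = [np.bincount(lbp_image(img[:, :, ch]).ravel(), minlength=256)
             for ch in range(img.shape[2])]
    return np.concatenate(hists)
```

The point of the concatenation is that recapture artifacts often distort color texture differently in the chrominance bands than in luminance, so keeping the per-band histograms separate preserves that signal.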
How emotion shapes behavior: feedback, anticipation, and reflection, rather than direct causation.
Fear causes fleeing and thereby saves lives: this exemplifies a popular and common sense but increasingly untenable view that the direct causation of behavior is the primary function of emotion. Instead, the authors develop a theory of emotion as a feedback system whose influence on behavior is typically indirect. By providing feedback and stimulating retrospective appraisal of actions, conscious emotional states can promote learning and alter guidelines for future behavior. Behavior may also be chosen to pursue (or avoid) anticipated emotional outcomes. Rapid, automatic affective responses, in contrast to the full-blown conscious emotions, may inform cognition and behavioral choice and thereby help guide current behavior. The automatic affective responses may also remind the person of past emotional outcomes and provide useful guides as to what emotional outcomes may be anticipated in the present. To justify replacing the direct causation model with the feedback model, the authors review a large body of empirical findings.
New Robotics: Design Principles for Intelligent Systems
New robotics is an approach to robotics that, in contrast to traditional robotics, employs ideas and principles from biology. While in the traditional approach there are generally accepted methods (e.g., from control theory), designing agents in the new robotics approach is still largely considered an art. In recent years, we have been developing a set of heuristics, or design principles, that on the one hand capture theoretical insights about intelligent (adaptive) behavior, and on the other provide guidance in actually designing and building systems. In this article we provide an overview of all the principles but focus on the principles of ecological balance, which concerns the relation between environment, morphology, materials, and control, and sensory-motor coordination, which concerns self-generated sensory stimulation as the agent interacts with the environment and which is a key to the development of high-level intelligence. As we argue, artificial evolution together with morphogenesis is not only nice to have but is in fact a necessary tool for designing embodied agents.
The Tactile Internet: Applications and Challenges
Wireless communications today enables us to connect devices and people for an unprecedented exchange of multimedia and data content. The data rates of wireless communications continue to increase, mainly driven by innovation in electronics. Once the latency of communication systems becomes low enough to enable a round-trip delay from terminals through the network and back of approximately 1 ms, an overlooked breakthrough, human tactile-to-visual feedback control, will change how humans communicate around the world. Using these controls, wireless communications can be the platform for enabling the control and direction of real and virtual objects in many situations of our lives. Almost no area of the economy will be left untouched, as this new technology will change health care, mobility, education, manufacturing, smart grids, and much more. The Tactile Internet will become a driver for economic growth and innovation and will help bring a new level of sophistication to societies.
Development of fast dissolving oral film containing dexamethasone as an antiemetic medication: clinical usefulness.
We developed a fast dissolving oral film containing 4 mg dexamethasone and examined the clinical effect of the film as an antiemetic in a randomized controlled crossover study in breast cancer patients receiving combination chemotherapy with anthracycline and cyclophosphamide, a highly emetogenic regimen. The film was prepared as reported previously using microcrystalline cellulose, polyethylene glycol, hypromellose, polysorbate 80 and 5% low-substituted hydroxypropylcellulose as base materials. The uniformity of the film was shown by a relative standard deviation of 2.7% and an acceptance value of 5.9% according to the Japanese Pharmacopoeia. Patients were administered 8 mg dexamethasone as an oral film or tablet on days 2-4 after chemotherapy in addition to the standard antiemetic medication. The rates of complete protection from vomiting during the acute and delayed phases did not differ between the film-treated and tablet-treated groups. The time course of complete protection from nausea or vomiting during 0-120 h was also similar between the two groups. Patients' impressions of oral acceptability, with respect to taste and ease of taking, were significantly better for the film than for the tablet. Therefore, the present fast dissolving oral film containing dexamethasone seems potentially useful as an antiemetic agent in patients receiving highly emetogenic chemotherapy.
Cracking Bank PINs by Playing Mastermind
The bank director was quite upset to notice Joe, the system administrator, spending his spare time playing Mastermind, an old, useless game of the '70s. He had fought the instinct to tell him how to better spend his life, limiting himself to looking at him in disgust long enough to be certain of being noticed. No wonder, then, that the next day the director fell back in his chair, astonished, while reading in the newspaper about a huge digital fraud on the ATMs of his bank, with millions of euros stolen by a team of hackers all around the world. The article mentioned how the hackers had 'played with the bank computers just like playing Mastermind', managing to disclose thousands of user PINs during the one-hour lunch break. At that precise moment, a second before fainting, he understood the subtle smile on Joe's face the day before, while training at his favorite game, Mastermind.
Short-term effects of cervical kinesio taping on pain and cervical range of motion in patients with acute whiplash injury: a randomized clinical trial.
DESIGN Randomized clinical trial. OBJECTIVES To determine the short-term effects of Kinesio Taping, applied to the cervical spine, on neck pain and cervical range of motion in individuals with acute whiplash-associated disorders (WADs). BACKGROUND Researchers have begun to investigate the effects of Kinesio Taping on different musculoskeletal conditions (eg, shoulder and trunk pain). Considering the demonstrated short-term effectiveness of Kinesio Tape for the management of shoulder pain, it is suggested that Kinesio Tape may also be beneficial in reducing pain associated with WAD. METHODS AND MEASURES Forty-one patients (21 females) were randomly assigned to 1 of 2 groups: the experimental group received Kinesio Taping to the cervical spine (applied with tension) and the placebo group received a sham Kinesio Taping application (applied without tension). Both neck pain (11-point numerical pain rating scale) and cervical range-of-motion data were collected at baseline, immediately after the Kinesio Tape application, and at a 24-hour follow-up by an assessor blinded to the treatment allocation of the patients. Mixed-model analyses of variance (ANOVAs) were used to examine the effects of the treatment on each outcome variable, with group as the between-subjects variable and time as the within-subjects variable. The primary analysis was the group-by-time interaction. RESULTS The group-by-time interaction for the 2-by-3 mixed-model ANOVA was statistically significant for pain as the dependent variable (F = 64.8; P<.001), indicating that patients receiving Kinesio Taping experienced a greater decrease in pain immediately postapplication and at the 24-hour follow-up (both, P<.001). The group-by-time interaction was also significant for all directions of cervical range of motion: flexion (F = 50.8; P<.001), extension (F = 50.7; P<.001), right (F = 39.5; P<.001) and left (F = 3.8, P<.05) lateral flexion, and right (F = 33.9, P<.001) and left (F = 39.5, P<.001) rotation. 
Patients in the experimental group obtained a greater improvement in range of motion than those in the control group (all, P<.001). CONCLUSIONS Patients with acute WAD receiving an application of Kinesio Taping, applied with proper tension, exhibited statistically significant improvements immediately following application of the Kinesio Tape and at a 24-hour follow-up. However, the improvements in pain and cervical range of motion were small and may not be clinically meaningful. Future studies should investigate if Kinesio Taping provides enhanced outcomes when added to physical therapy interventions with proven efficacy or when applied over a longer period. LEVEL OF EVIDENCE Therapy, level 1b. J Orthop Sports Phys Ther 2009;39(7):515-521, Epub 24 February 2009. doi:10.2519/jospt.2009.3072.
Examining a hate speech corpus for hate speech detection and popularity prediction
As research on hate speech becomes more and more relevant every day, most of it is still focused on hate speech detection. By attempting to replicate a hate speech detection experiment performed on an existing Twitter corpus annotated for hate speech, we highlight some issues that arise from doing research in the field of hate speech, which is essentially still in its infancy. We take a critical look at the training corpus in order to understand its biases, while also using it to venture beyond hate speech detection and investigate whether it can be used to shed light on other facets of research, such as popularity of hate tweets.
Noticing Future Me: Reducing Egocentrism Through Mental Imagery.
People drastically overestimate how often others attend to them or notice their unusual features, a phenomenon termed the spotlight effect. Despite the prevalence of this egocentric bias, little is known about how to reduce the tendency to see oneself as the object of others' attention. Here, we tested the hypothesis that a basic property of mental imagery (the visual perspective from which an event is viewed) may alleviate a future-oriented variant of the spotlight effect. The results of three experiments supported this prediction. Experiment 1 revealed a reduction in egocentric spotlighting when participants imagined an event in the far compared with near future. Experiments 2 and 3 demonstrated reduced spotlighting and feelings of embarrassment when participants viewed an impending event from a third-person (vs. first-person) vantage point. Simple changes in one's visual perspective may be sufficient to diminish the illusion of personal salience.
Validation of a BMI cut-off point to predict an adverse cardiometabolic profile with adiposity measurements by dual-energy X-ray absorptiometry in Guatemalan children.
OBJECTIVE To identify a body fat percentage (%BF) threshold related to an adverse cardiometabolic profile and its surrogate BMI cut-off point. DESIGN Cross-sectional study. SETTING Two public schools in poor urban areas on the outskirts of Guatemala City. SUBJECTS A convenience sample of ninety-three healthy, prepubertal, Ladino children (aged 7-12 years). RESULTS Spearman correlations of cardiometabolic parameters were higher with %BF than with BMI-for-age Z-score. BMI-for-age Z-score and %BF were highly correlated (r=0·84). The %BF threshold that maximized sensitivity and specificity for predicting an adverse cardiometabolic profile (elevated homeostasis model assessment-insulin resistance index and/or total cholesterol:HDL-cholesterol ratio) according to receiver operating characteristic curve analysis was 36 %. The BMI-for-age Z-score cut-off point that maximized the prediction of BF ≥ 36 % by the same procedure was 1·5. The area under the curve (AUC) for %BF and for BMI data showed excellent accuracy to predict an adverse cardiometabolic profile (AUC 0·93 (sd 0·04)) and excess adiposity (AUC 0·95 (sd 0·02)). CONCLUSIONS Since BMI standards have limitations in screening for adiposity, specific cut-off points based on ethnic-/sex- and age-specific %BF thresholds are needed to better predict an adverse cardiometabolic profile.
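The threshold search described here (maximizing sensitivity and specificity by ROC curve analysis) is commonly implemented with the Youden index, J = sensitivity + specificity - 1, evaluated at every candidate cutoff. The sketch below illustrates that procedure under the assumption that higher measurement values indicate the adverse condition; the sample values in the test are made up for illustration and are not the study's data.

```python
import numpy as np

def youden_cutoff(values, labels):
    """Return the threshold on `values` that maximizes the Youden index
    J = sensitivity + specificity - 1, where `labels` marks the
    positive (adverse) cases and the rule is `value >= threshold`."""
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    best_t, best_j = None, -np.inf
    for t in np.unique(values):
        pred = values >= t
        sens = (pred & labels).sum() / labels.sum()
        spec = (~pred & ~labels).sum() / (~labels).sum()
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t
```

Applying the same search twice, first to %BF against the cardiometabolic outcome and then to BMI-for-age Z-score against the resulting %BF threshold, mirrors the two-stage derivation the study describes.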
Multi-Parametric Toolbox 3.0
The Multi-Parametric Toolbox is a collection of algorithms for modeling, control, analysis, and deployment of constrained optimal controllers developed under Matlab. It features a powerful geometric library that extends the application of the toolbox beyond optimal control to various problems arising in computational geometry. The new version 3.0 is a complete rewrite of the original toolbox with a more flexible structure that offers faster integration of new algorithms. The numerical side of the toolbox has been improved by adding interfaces to state-of-the-art solvers and by incorporating a new parametric solver that relies on solving linear-complementarity problems. The toolbox provides algorithms for design and implementation of real-time model predictive controllers that have been extensively tested.
Spoken Language Understanding for a Nutrition Dialogue System
Food logging is recommended by dieticians for prevention and treatment of obesity, but currently available mobile applications for diet tracking are often too difficult and time-consuming for patients to use regularly. For this reason, we propose a novel approach to food journaling that uses speech and language understanding technology in order to enable efficient self-assessment of energy and nutrient consumption. This paper presents ongoing language understanding experiments conducted as part of a larger effort to create a nutrition dialogue system that automatically extracts food concepts from a user's spoken meal description. We first summarize the data collection and annotation of food descriptions performed via Amazon Mechanical Turk (AMT), for both a written corpus and spoken data from an in-domain speech recognizer. We show that the addition of word vector features improves conditional random field (CRF) performance for semantic tagging of food concepts, achieving an average F1 test score of 92.4 on written data; we also demonstrate that a convolutional neural network (CNN) with no hand-crafted features outperforms the best CRF on spoken data, achieving an F1 test score of 91.3. We illustrate two methods for associating foods with properties: segmenting meal descriptions with a CRF, and a complementary method that directly predicts associations with a feed-forward neural network. Finally, we conduct an end-to-end system evaluation through an AMT user study with worker ratings of 83% semantic tagging accuracy.
Diagnosing Lassa virus infection by tracking the antiviral response
Background Lassa fever is an acute viral hemorrhagic fever caused by the Lassa virus. It infects 300,000 to 500,000 West Africans every year, with a mortality rate among hospitalized patients of 15%. It is difficult to diagnose because its early symptoms (fever, sore throat, general malaise) often go unnoticed, or are confused with those of the common flu, malaria, or other febrile diseases. For this reason, there is considerable interest in developing tests that can detect the presence of the virus at the earliest stages of infection, when treatment is most effective. Methods such as the enzyme-linked immunosorbent assay (ELISA) or the
Waste Bin Monitoring System Using Integrated Technologies
Nowadays, a number of techniques are being developed and purposefully used for the effective management of garbage and solid waste. Zigbee and the Global System for Mobile Communication (GSM) are among the latest trends and form one of the best combinations for this purpose; hence, both technologies are used together in this project. Briefly, sensors are placed in common garbage bins located in public places. When the garbage reaches the level of a sensor, an indication is given to the ARM 7 controller, which then informs the driver of the garbage collection truck which bin is completely filled and needs urgent attention. The ARM 7 gives this indication by sending an SMS using GSM technology.
Biometric template selection and update: a case study in fingerprints
A biometric authentication system operates by acquiring biometric data from a user and comparing it against the template data stored in a database in order to identify a person or to verify a claimed identity. Most systems store multiple templates per user in order to account for variations observed in a person’s biometric data. In this paper we propose two methods to perform automatic template selection where the goal is to select prototype fingerprint templates for a finger from a given set of fingerprint impressions. The first method, called DEND, employs a clustering strategy to choose a template set that best represents the intra-class variations, while the second method, called MDIST, selects templates that exhibit maximum similarity with the rest of the impressions. Matching results on a database of 50 different fingers, with 200 impressions per finger, indicate that a systematic template selection procedure as presented here results in better performance than random template selection. The proposed methods have also been utilized to perform automatic template update. Experimental results underscore the importance of these techniques. © 2003 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
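The MDIST criterion can be sketched directly: given a symmetric matrix of pairwise match scores among one finger's impressions, select the impressions with the highest average similarity to all the others. The score matrix below is an illustrative stand-in for real matcher output, not data from the paper's experiments.

```python
import numpy as np

def mdist_select(similarity, n_templates=2):
    """MDIST sketch: rank impressions by their average similarity to all
    other impressions of the same finger and keep the top few as templates."""
    S = np.asarray(similarity, dtype=float)
    n = S.shape[0]
    avg = (S.sum(axis=1) - np.diag(S)) / (n - 1)  # exclude self-similarity
    return np.argsort(avg)[::-1][:n_templates]

# Illustrative symmetric score matrix for 4 impressions of one finger
S = np.array([[1.0, 0.9, 0.8, 0.9],
              [0.9, 1.0, 0.5, 0.4],
              [0.8, 0.5, 1.0, 0.3],
              [0.9, 0.4, 0.3, 1.0]])
templates = mdist_select(S, n_templates=2)
```

Ranking by average similarity favors the most "central" impressions, which is what makes MDIST-selected templates robust matches for future queries of the same finger.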
Contour Boxplots: A Method for Characterizing Uncertainty in Feature Sets from Simulation Ensembles
Ensembles of numerical simulations are used in a variety of applications, such as meteorology or computational solid mechanics, in order to quantify the uncertainty or possible error in a model or simulation. Deriving robust statistics and visualizing the variability of an ensemble is a challenging task and is usually accomplished through direct visualization of ensemble members or by providing aggregate representations such as an average or pointwise probabilities. In many cases, the interesting quantities in a simulation are not dense fields, but are sets of features that are often represented as thresholds on physical or derived quantities. In this paper, we introduce a generalization of boxplots, called contour boxplots, for visualization and exploration of ensembles of contours or level sets of functions. Conventional boxplots have been widely used as an exploratory or communicative tool for data analysis, and they typically show the median, mean, confidence intervals, and outliers of a population. The proposed contour boxplots are a generalization of functional boxplots, which build on the notion of data depth. Data depth approximates the extent to which a particular sample is centrally located within its density function. This produces a center-outward ordering that gives rise to the statistical quantities that are essential to boxplots. Here we present a generalization of functional data depth to contours and demonstrate methods for displaying the resulting boxplots for two-dimensional simulation data in weather forecasting and computational fluid dynamics.
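The data-depth ordering that contour boxplots build on can be illustrated with a simple band depth for 1-D functions: a curve is deeper the more often it lies entirely inside the envelope of a pair of other ensemble members. This is the j = 2 band depth for functions only, as a sketch; the paper's generalization to contours and level sets is more involved.

```python
import numpy as np
from itertools import combinations

def band_depth(curves):
    """Band depth (j = 2): for each curve, the fraction of curve pairs
    whose pointwise [min, max] envelope fully contains it. Deeper curves
    are more central, giving the center-outward ordering boxplots need."""
    curves = np.asarray(curves, dtype=float)
    n = len(curves)
    pairs = list(combinations(range(n), 2))
    depth = np.zeros(n)
    for i, f in enumerate(curves):
        count = 0
        for a, b in pairs:
            lo = np.minimum(curves[a], curves[b])
            hi = np.maximum(curves[a], curves[b])
            if np.all((f >= lo) & (f <= hi)):
                count += 1
        depth[i] = count / len(pairs)
    return depth
```

Sorting the ensemble by this depth yields the median (deepest member), the central 50% band, and the outliers, exactly the statistical quantities a boxplot displays.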
Higher-Order Concurrent Linear Logic Programming
We propose a typed, higher-order, concurrent linear logic programming language called higher-order ACL, which uniformly integrates a variety of mechanisms for concurrent computation based on asynchronous message passing. Higher-order ACL is based on a proof search paradigm according to the principle "proofs as computations, formulas as processes" in linear logic. In higher-order ACL, processes as well as functions and other values can be communicated via messages, which provides high modularity of concurrent programs. Higher-order ACL can be viewed as an asynchronous counterpart of Milner's higher-order polyadic π-calculus. Moreover, higher-order ACL is equipped with an elegant ML-style type system that ensures (1) well-typed programs can never cause type mismatch errors, and (2) there is a type inference algorithm which computes a most general typing for an untyped term. We also demonstrate the power of higher-order ACL by showing several examples of "higher-order concurrent programming."
Ampicillin plus ceftriaxone is as effective as ampicillin plus gentamicin for treating enterococcus faecalis infective endocarditis.
BACKGROUND The aim of this study was to compare the effectiveness of the ampicillin plus ceftriaxone (AC) and ampicillin plus gentamicin (AG) combinations for treating Enterococcus faecalis infective endocarditis (EFIE). METHODS An observational, nonrandomized, comparative multicenter cohort study was conducted at 17 Spanish and 1 Italian hospitals. Consecutive adult patients diagnosed of EFIE were included. Outcome measurements were death during treatment and at 3 months of follow-up, adverse events requiring treatment withdrawal, treatment failure requiring a change of antimicrobials, and relapse. RESULTS A larger percentage of AC-treated patients (n = 159) had previous chronic renal failure than AG-treated patients (n = 87) (33% vs 16%, P = .004), and AC patients had a higher incidence of cancer (18% vs 7%, P = .015), transplantation (6% vs 0%, P = .040), and healthcare-acquired infection (59% vs 40%, P = .006). Between AC and AG-treated EFIE patients, there were no differences in mortality while on antimicrobial treatment (22% vs 21%, P = .81) or at 3-month follow-up (8% vs 7%, P = .72), in treatment failure requiring a change in antimicrobials (1% vs 2%, P = .54), or in relapses (3% vs 4%, P = .67). However, interruption of antibiotic treatment due to adverse events was much more frequent in AG-treated patients than in those receiving AC (25% vs 1%, P < .001), mainly due to new renal failure (≥25% increase in baseline creatinine concentration; 23% vs 0%, P < .001). CONCLUSIONS AC appears as effective as AG for treating EFIE patients and can be used with virtually no risk of renal failure and regardless of the high-level aminoglycoside resistance status of E. faecalis.
Ontology-based cloud platform for human-driven applications
The number of systems and applications where large groups of people are included in the information-processing "loop" is growing. A common problem with such systems is that each of them requires a large number of contributors, and collecting this number may take significant time and effort. This paper aims at the development of an ontology-driven cloud platform that would support the deployment of various human-based applications and therefore allow the existing crowd to be reused. Three features distinguish the proposed platform from similar developments: ontologies, digital contracts, and resource-monitoring facilities. Ontological mechanisms (the ability to precisely define semantics and use inference to find related terms) are used to find and allocate the human resources required by software services. Digital contracts are used to achieve the predictability required by cloud users (application developers). Finally, explicit mechanisms for resource monitoring are essential, as human resources are always limited, and the developers of applications deployed on the platform should be aware of the particular limitations.
Around-body interaction: sensing & interaction techniques for proprioception-enhanced input with mobile devices
The space around the body provides a large interaction volume that can allow for big interactions on small mobile devices. However, interaction techniques making use of this opportunity are underexplored, primarily focusing on distributing information in the space around the body. We demonstrate three types of around-body interaction, including canvas, modal, and context-aware interactions, in six demonstration applications. We also present a sensing solution using standard smartphone hardware: a phone's front camera, accelerometer, and inertial measurement unit. Our solution allows a person to interact with a mobile device by holding and positioning it both within the normal field of view and in the vicinity around the body. By leveraging a user's proprioceptive sense, around-body interaction opens a new input channel that enhances conventional interaction on a mobile device without requiring additional hardware.
MetaFac: community discovery via relational hypergraph factorization
This paper aims at discovering community structure in rich media social networks, through analysis of time-varying, multi-relational data. Community structure represents the latent social context of user actions. It has important applications in information tasks such as search and recommendation. Social media has several unique challenges. (a) In social media, the context of user actions is constantly changing and co-evolving; hence the social context contains time-evolving multi-dimensional relations. (b) The social context is determined by the available system features and is unique in each social media website. In this paper we propose MetaFac (MetaGraph Factorization), a framework that extracts community structures from various social contexts and interactions. Our work has three key contributions: (1) metagraph, a novel relational hypergraph representation for modeling multi-relational and multi-dimensional social data; (2) an efficient factorization method for community extraction on a given metagraph; (3) an on-line method to handle time-varying relations through incremental metagraph factorization. Extensive experiments on real-world social data collected from the Digg social media website suggest that our technique is scalable and is able to extract meaningful communities based on the social media contexts. We illustrate the usefulness of our framework through prediction tasks. We outperform baseline methods (including aspect model and tensor analysis) by an order of magnitude.
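MetaFac factorizes a multi-relational hypergraph; as a heavily simplified stand-in for one facet of that idea (not the paper's algorithm), the sketch below extracts two latent "community" factors from a single nonnegative interaction matrix using multiplicative-update NMF:

```python
import numpy as np

def nmf(V, k, iters=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF: V (m x n, nonnegative) ~= W @ H.
    A single-matrix simplification of relational factorization; the
    columns of W can be read as soft community memberships."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        # Lee-Seung updates; the Frobenius objective is non-increasing
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# two planted "communities" in a toy user-item interaction matrix
V = np.zeros((6, 6))
V[:3, :3] = 1.0
V[3:, 3:] = 1.0
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H)
assert err < 1.0  # the rank-2 block structure is largely recovered
```

The real MetaFac couples several such factorizations across relations and time; this fragment only shows the basic "interactions to latent factors" step.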
Novel terephthaloyl thiourea cross-linked chitosan hydrogels as antibacterial and antifungal agents.
Four novel terephthaloyl thiourea chitosan (TTU-chitosan) hydrogels were synthesized via a cross-linking reaction of chitosan with different concentrations of terephthaloyl diisothiocyanate. Their structures were investigated by elemental analyses, FTIR, SEM, and X-ray diffraction. The antimicrobial activities of the hydrogels against three species of bacteria (Bacillus subtilis, Staphylococcus aureus, and Escherichia coli) and three crop-threatening pathogenic fungi (Aspergillus fumigatus, Geotrichum candidum, and Candida albicans) were much higher than those of the parent chitosan. The hydrogels were more potent against Gram-positive bacteria than Gram-negative bacteria. Increasing the degree of cross-linking in the hydrogels resulted in stronger antimicrobial activity.
Results from a Study for Teaching Human Body Systems to Primary School Students Using Tablets.
The paper presents the results from a study which examined whether tablets together with a mobile application with augmented reality features can help students to better understand the functions of the respiratory and the circulatory system. The target group was 75 sixth-grade primary school students, divided into three groups. The first group was taught conventionally; students studied using a printed handbook. In the second, a constructivist teaching model was used, but the instruction was not technologically enhanced. The third group of students used tablets and an application, and the teaching was based on a slightly modified version of Bybee's 5Es model. All three groups of students worked in pairs, they were taught the same learning material, and the teacher acted as a facilitator of the process. Data were collected by means of a questionnaire and evaluation sheets. Results indicated that students in the third group outperformed students in the other two groups. The results can be attributed to students' enjoyment, motivation, and positive attitude towards the use of tablets as well as to the teaching method. The study's implications are also discussed.
Metabolomics in nutritional epidemiology: identifying metabolites associated with diet and quantifying their potential to uncover diet-disease relations in populations.
BACKGROUND Metabolomics is an emerging field with the potential to advance nutritional epidemiology; however, it has not yet been applied to large cohort studies. OBJECTIVES Our first aim was to identify metabolites that are biomarkers of usual dietary intake. Second, among serum metabolites correlated with diet, we evaluated metabolite reproducibility and required sample sizes to determine the potential for metabolomics in epidemiologic studies. DESIGN Baseline serum from 502 participants in the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial was analyzed by using ultra-high-performance liquid-phase chromatography with tandem mass spectrometry and gas chromatography-mass spectrometry. Usual intakes of 36 dietary groups were estimated by using a food-frequency questionnaire. Dietary biomarkers were identified by using partial Pearson's correlations with Bonferroni correction for multiple comparisons. Intraclass correlation coefficients (ICCs) between samples collected 1 y apart in a subset of 30 individuals were calculated to evaluate intraindividual metabolite variability. RESULTS We detected 412 known metabolites. Citrus, green vegetables, red meat, shellfish, fish, peanuts, rice, butter, coffee, beer, liquor, total alcohol, and multivitamins were each correlated with at least one metabolite (P < 1.093 × 10(-6); r = -0.312 to 0.398); in total, 39 dietary biomarkers were identified. Some correlations (citrus intake with stachydrine) replicated previous studies; others, such as peanuts and tryptophan betaine, were novel findings. Other strong associations included coffee (with trigonelline-N-methylnicotinate and quinate) and alcohol (with ethyl glucuronide). Intraindividual variability in metabolite levels (1-y ICCs) ranged from 0.27 to 0.89. 
Large, but attainable, sample sizes are required to detect associations between metabolites and disease in epidemiologic studies, further emphasizing the usefulness of metabolomics in nutritional epidemiology. CONCLUSIONS We identified dietary biomarkers by using metabolomics in an epidemiologic data set. Given the strength of the associations observed, we expect that some of these metabolites will be validated in future studies and later used as biomarkers in large cohorts to study diet-disease associations. The PLCO trial was registered at clinicaltrials.gov as NCT00002540.
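The intraindividual reproducibility reported above is measured with intraclass correlation coefficients; a minimal one-way random-effects ICC(1,1) computation (a generic textbook estimator, not necessarily the paper's exact one) can be sketched as:

```python
import numpy as np

def icc_oneway(measurements):
    """One-way random-effects ICC(1,1) for repeated measurements
    (rows = subjects, columns = repeats), the kind of statistic used
    to gauge intraindividual metabolite stability across visits."""
    m = np.asarray(measurements, dtype=float)
    n, k = m.shape
    grand = m.mean()
    # between-subject and within-subject mean squares from one-way ANOVA
    msb = k * ((m.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((m - m.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# perfectly reproducible repeats give ICC = 1
perfect = icc_oneway([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
assert abs(perfect - 1.0) < 1e-12
```

An ICC near 1 (as for some metabolites in the 0.27 to 0.89 range above) means a single serum sample is a good proxy for a subject's usual level, which is what makes one-time cohort measurements viable.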
Relative Camera Pose Estimation Using Convolutional Neural Networks
This paper presents a convolutional neural network based approach for estimating the relative pose between two cameras. The proposed network takes RGB images from both cameras as input and directly produces the relative rotation and translation as output. The system is trained in an end-to-end manner utilising transfer learning from a large scale classification dataset. The introduced approach is compared with widely used local feature based methods (SURF, ORB) and the results indicate a clear improvement over the baseline. In addition, a variant of the proposed architecture containing a spatial pyramid pooling (SPP) layer is evaluated and shown to further improve the performance.
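Relative-pose estimates such as these are commonly scored by the geodesic angle between the estimated and ground-truth rotations; a small sketch of that standard metric (an assumption about the evaluation protocol, not code from the paper):

```python
import numpy as np

def rotation_angle_error_deg(R_est, R_gt):
    """Geodesic angle in degrees between two 3x3 rotation matrices,
    a standard error metric for relative camera pose."""
    cos = (np.trace(R_est.T @ R_gt) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def rot_z(deg):
    """Rotation about the z-axis, for constructing test poses."""
    t = np.radians(deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# an estimate that is 5 degrees off about the z-axis scores a 5-degree error
err = rotation_angle_error_deg(rot_z(35.0), rot_z(30.0))
assert np.isclose(err, 5.0, atol=1e-6)
```

The clipping guards against arccos domain errors from floating-point noise when the two rotations are nearly identical.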
A layered software architecture for quantum computing design tools
Compilers and computer-aided design tools are essential for fine-grained control of nanoscale quantum-mechanical systems. A proposed four-phase design flow assists with computations by transforming a quantum algorithm from a high-level language program into precisely scheduled physical actions.
Quantizing Convolutional Neural Networks for Low-Power High-Throughput Inference Engines
Deep learning as a means to inferencing has proliferated thanks to its versatility and ability to approach or exceed human-level accuracy. These computational models have seemingly insatiable appetites for computational resources not only while training, but also when deployed at scales ranging from data centers all the way down to embedded devices. As such, increasing consideration is being made to maximize the computational efficiency given limited hardware and energy resources and, as a result, inferencing with reduced precision has emerged as a viable alternative to the IEEE 754 Standard for Floating-Point Arithmetic. We propose a quantization scheme that allows inferencing to be carried out using arithmetic that is fundamentally more efficient when compared to even half-precision floating-point. Our quantization procedure is significant in that we determine our quantization scheme parameters by calibrating against its reference floating-point model using a single inference batch rather than (re)training and achieve end-to-end post quantization accuracies comparable to the reference model.
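The calibration idea can be sketched as symmetric, scale-only int8 quantization fitted on a single batch (a simplified illustration of post-training calibration, not the authors' exact scheme):

```python
import numpy as np

def calibrate_scale(calib_batch, num_bits=8):
    """Pick a symmetric quantization scale from one calibration batch,
    instead of (re)training the model."""
    qmax = 2 ** (num_bits - 1) - 1           # 127 for int8
    max_abs = np.max(np.abs(calib_batch))
    return max_abs / qmax if max_abs > 0 else 1.0

def quantize(x, scale, num_bits=8):
    qmax = 2 ** (num_bits - 1) - 1
    return np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
batch = rng.normal(0.0, 1.0, size=(32, 64)).astype(np.float32)  # stand-in activations
scale = calibrate_scale(batch)
restored = dequantize(quantize(batch, scale), scale)
max_err = float(np.max(np.abs(batch - restored)))
# with the scale fitted to this batch, rounding error is at most scale / 2
assert max_err <= scale / 2 + 1e-6
```

Downstream inference then runs on the int8 values; the scale is the only floating-point parameter the quantized tensor carries.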
Explaining the relation between birth order and intelligence.
Negative associations between birth order and intelligence level have been found in numerous studies. The explanation for this relation is not clear, and several hypotheses have been suggested. One family of hypotheses suggests that the relation is due to more-favorable family interaction and stimulation of low-birth-order children, whereas others claim that the effect is caused by prenatal gestational factors. We show that intelligence quotient (IQ) score levels among nearly 250,000 military conscripts were dependent on social rank in the family and not on birth order as such, providing support for a family interaction explanation.
Doped aromatic derivatives wide-gap crystalline semiconductor structured layers for electronic application
We present some investigations of the electrical conductivity of two benzene-substituted derivatives (m-DNB, benzil), emphasizing the correlation between the molecular structure, purity, structural defects, and the particularities of the conduction mechanisms. The influence of inorganic/organic doping on the I–V plots of silicon/wide-gap organic semiconductor/silicon heterostructures has been analysed. The most significant increase in conductance has been obtained for organic crystalline films of benzil doped with 3 wt.% m-DNB and m-DNB films doped with 1 wt.% oxine or 10 wt.% resorcinol. The influence of silicon wafer properties such as resistivity, conduction type, and surface processing on the carrier transport properties in these structures has been studied. We observed an increase in the conductance of the organic films of m-DNB doped with oxine or resorcinol in heterostructures realized with chemically polished, "n"-type single-crystal silicon wafers, compared to lapped ones. A special type of conduction mechanism given by a Poole–Frenkel dependence was evidenced in resorcinol-doped m-DNB in the low-voltage range.
The Online Discovery Problem and Its Application to Lifelong Reinforcement Learning
Transferring knowledge across a sequence of related tasks is an important challenge in reinforcement learning. Despite much encouraging empirical evidence that shows benefits of transfer, there has been very little theoretical analysis. In this paper, we study a class of lifelong reinforcement-learning problems: the agent solves a sequence of tasks modeled as finite Markov decision processes (MDPs), each of which is from a finite set of MDPs with the same state/action spaces and different transition/reward functions. Inspired by the need for cross-task exploration in lifelong learning, we formulate a novel online discovery problem and give an optimal learning algorithm to solve it. These results allow us to develop a new lifelong reinforcement-learning algorithm, whose overall sample complexity in a sequence of tasks is much smaller than that of single-task learning, with high probability, even if the sequence of tasks is generated by an adversary. Benefits of the algorithm are demonstrated in a simulated problem.
Gaussian belief propagation solver for systems of linear equations
The canonical problem of solving a system of linear equations arises in numerous contexts in information theory, communication theory, and related fields. In this contribution, we develop a solution based upon Gaussian belief propagation (GaBP) that does not involve direct matrix inversion. The iterative nature of our approach allows for a distributed message-passing implementation of the solution algorithm. We also address some properties of the GaBP solver, including convergence, exactness, its max-product version and relation to classical solution methods. The application example of decorrelation in CDMA is used to demonstrate the faster convergence rate of the proposed solver in comparison to conventional linear-algebraic iterative solution methods.
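The message-passing solver can be sketched as follows (a dense, serial-sweep rendering of the GaBP update rules for symmetric systems; variable names and the convergence caveat are ours):

```python
import numpy as np

def gabp_solve(A, b, iters=50):
    """Solve A x = b by Gaussian belief propagation with no direct
    matrix inversion; converges e.g. for symmetric diagonally dominant A."""
    n = len(b)
    P = np.diag(A).astype(float)        # node precisions  P_ii = A_ii
    mu = b / P                          # node means       mu_ii = b_i / A_ii
    Pm = np.zeros((n, n))               # message precisions, Pm[i, j] = P(i -> j)
    Mm = np.zeros((n, n))               # message means,      Mm[i, j] = mu(i -> j)
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if i == j or A[i, j] == 0.0:
                    continue
                # combine all messages arriving at i except the one from j
                P_excl = P[i] + Pm[:, i].sum() - Pm[j, i]
                mu_excl = (P[i] * mu[i] + Pm[:, i] @ Mm[:, i]
                           - Pm[j, i] * Mm[j, i]) / P_excl
                Pm[i, j] = -A[i, j] ** 2 / P_excl
                Mm[i, j] = P_excl * mu_excl / A[i, j]
    # the converged marginal means are the solution entries
    P_final = P + Pm.sum(axis=0)
    return (P * mu + (Pm * Mm).sum(axis=0)) / P_final

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = gabp_solve(A, b)
assert np.allclose(A @ x, b, atol=1e-6)
```

Because each message depends only on a node's neighbors in the graph induced by the nonzeros of A, the same updates distribute naturally across processors, which is the implementation property the abstract highlights.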
A method of CAD/CAE data integration based on XML
In spite of the widespread use of CAD systems for design and CAE systems for analysis, the two processes are not well integrated, because CAD and CAE models inherently use different types of geometric models and there currently exists no generic, unified model that allows both design and analysis information to be specified and shared. XML has become the de facto standard for data representation and exchange on the World Wide Web. This paper proposes a data integration method based on the XML technique in order to resolve the problem of data transmission between CAD and CAE. Designers parametrically model the bridges through a 3D CAD platform. CAE analysts conduct explicit dynamic FEA (Finite Element Analysis) on the designed bridge structure. CAD and CAE functions are accomplished through a C/S architecture. An XML and Web Service based DAC (Design-Analysis Connection) is developed to maintain consistency between the CAD model and the FEA model. The design is then displayed to the customers through a B/S mechanism, which provides a convenient method for the customers to participate in the design process. Since all the operations are conducted through the internet/intranet, customers, designers, and analysts are able to participate in the design process at different geographical locations. According to the interface procedure for model transformation compiled in this paper, the finite element model was successfully transformed from the CAD system to the CAE system.
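The XML-based exchange can be sketched with Python's standard library; the element names below are illustrative placeholders, not a published CAD/CAE schema:

```python
import xml.etree.ElementTree as ET

def cad_model_to_xml(name, params):
    """Serialize a parametric CAD model into a neutral XML document
    that a CAE tool can consume (illustrative element names)."""
    root = ET.Element("model", name=name)
    for key, value in params.items():
        p = ET.SubElement(root, "parameter", name=key)
        p.text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_fea_input(xml_text):
    """Parse the XML back into the parameter dictionary an FEA
    preprocessor would read, closing the CAD-to-CAE round trip."""
    root = ET.fromstring(xml_text)
    params = {p.get("name"): float(p.text) for p in root.findall("parameter")}
    return root.get("name"), params

xml_text = cad_model_to_xml("bridge_girder", {"span_m": 32.0, "depth_m": 1.8})
name, params = xml_to_fea_input(xml_text)
assert name == "bridge_girder"
assert params["span_m"] == 32.0
```

Keeping both directions of the conversion against one schema is what lets the DAC layer detect and repair divergence between the design model and the analysis model.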
Students' attitudes toward playing games and using games in education: Comparing Scotland and the Netherlands
Games-based learning has captured the interest of educationalists and industrialists who seek to exploit the characteristics of computer games as they are perceived by some to be a potentially effective approach for teaching and learning. Despite this interest in using games-based learning there is a dearth of empirical evidence supporting the validity of the approach covering the wider context of gaming and education. This study presents a large scale gaming survey, involving 887 students from 13 different Higher Education (HE) institutes in Scotland and the Netherlands, which examines students’ characteristics related to their gaming preferences, game playing habits, and their perceptions and thoughts on the use of games in education. It presents a comparison of three separate groups of students: a group in regular education in a Scottish university, a group in regular education in universities in the Netherlands and a distance learning group from a university in the Netherlands. This study addresses an overall research question of: Can computer games be used for educational purposes at HE level in regular and distance education in different countries? The study then addresses four sub-research questions related to the overall research question: What are the different game playing habits of the three groups? What are the different motivations for playing games across the three groups? What are the different reasons for using games in HE across the three groups? What are the different attitudes towards games across the three groups? To our knowledge this is the first in-depth cross-national survey on gaming and education. We found that a large number of participants believed that computer games could be used at HE level for educational purposes and that further research in the area of game playing habits, motivations for playing computer games and motivations for playing computer games in education are worthy of extensive further investigation. 
We also found a clear distinction between the views of students in regular education and those in distance education. Regular education students in both countries rated all motivations for playing computer games as significantly more important than distance education students. Also the results suggest that Scottish students aim to enhance their social experience with regards to competition and cooperation, while Dutch students aim to enhance their leisurely experience with regards to leisure, feeling good, preventing boredom and excitement.
Better clinical results after closed- compared to open-wedge high tibial osteotomy in patients with medial knee osteoarthritis and varus leg alignment
Studies comparing mid- or long-term outcomes of open- and closed-wedge high tibial osteotomy are limited. Here, the midterm survival rate and clinical and radiographic outcomes were compared for these two techniques. The study hypothesis, based on short-term follow-up, was that after midterm follow-up, the two techniques would not differ. A prospective follow-up study was conducted for a previously reported randomized controlled trial of an original 50 patients (25 open-wedge osteotomy and 25 closed-wedge osteotomy) with medial knee osteoarthritis and a varus leg alignment. We analyzed patients without knee arthroplasty (mean age 48.7 years, SD 8.0) for clinical and radiographic follow-up. Five patients in each group had undergone conversion to a total knee arthroplasty or unicompartmental knee arthroplasty, leaving 19 patients for analysis in each group. At 7.9 years of follow-up (range 7-9 years), survival did not differ significantly between groups (open-wedge group 81.3 % [95 % confidence interval (CI) 75.2-100], closed-wedge group 82.0 % [95 % CI 66.7-100]). At final follow-up, total Dutch Western Ontario and McMaster Universities Arthritis (WOMAC), Knee Society Score, and visual analog scale (VAS) pain did not differ between groups. However, the results were significantly better in the closed-wedge group for VAS satisfaction and WOMAC pain and stiffness compared to the open-wedge group. Radiographic evaluation did not differ between groups for any outcome at final follow-up. After a mean follow-up of 7.9 years, patients undergoing a closed-wedge osteotomy had favorable clinical results compared to those who underwent an open-wedge osteotomy. Level of evidence: II.
Conceptual Foundations of the Balanced Scorecard
David Norton and I introduced the Balanced Scorecard in a 1992 Harvard Business Review article (Kaplan & Norton, 1992). The article was based on a multi-company research project to study performance measurement in companies whose intangible assets played a central role in value creation (Nolan Norton Institute, 1991). Norton and I believed that if companies were to improve the management of their intangible assets, they had to integrate the measurement of intangible assets into their management systems. After publication of the 1992 HBR article, several companies quickly adopted the Balanced Scorecard, giving us deeper and broader insights into its power and potential. During the next 15 years, as it was adopted by thousands of private, public, and nonprofit enterprises around the world, we extended and broadened the concept into a management tool for describing, communicating, and implementing strategy. This paper describes the roots and motivation for the original Balanced Scorecard article as well as the subsequent innovations that connected it to a larger management literature.
The Microsoft Indoor Localization Competition: Experiences and Lessons Learned
We present the results, experiences, and lessons learned from comparing a diverse set of indoor location technologies during the Microsoft Indoor Localization Competition. Over the last four years (2014-2017), more than 100 teams from academia and industry deployed their indoor location solutions in realistic, unfamiliar environments, allowing us to directly compare their accuracies and overhead. In this article, we provide an analysis of this four-year-long evaluation study's results and discuss the current state of the art in indoor localization.
Types of study in medical research: part 3 of a series on evaluation of scientific publications.
BACKGROUND The choice of study type is an important aspect of the design of medical studies. The study design and consequent study type are major determinants of a study's scientific quality and clinical value. METHODS This article describes the structured classification of studies into two types, primary and secondary, as well as a further subclassification of studies of primary type. This is done on the basis of a selective literature search concerning study types in medical research, in addition to the authors' own experience. RESULTS Three main areas of medical research can be distinguished by study type: basic (experimental), clinical, and epidemiological research. Furthermore, clinical and epidemiological studies can be further subclassified as either interventional or noninterventional. CONCLUSIONS The study type that can best answer the particular research question at hand must be determined not only on a purely scientific basis, but also in view of the available financial resources, staffing, and practical feasibility (organization, medical prerequisites, number of patients, etc.).
A Mathematical Approach to Physical Realism
I propose to ask mathematics itself for the possible behaviour of nature, with the focus on starting with a most simple realistic model, employing a philosophy of investigation rather than invention when looking for a unified theory of physics. Doing a ‘mathematical experiment’ of putting a least set of conditions on a general time-dependent manifold results in mathematics itself inducing a not too complex 4-dimensional object similar to our physical spacetime, with candidates for gravitational and electromagnetic fields emerging on the tangent bundle. This suggests that the same physics might govern spacetime not only on a macroscopic scale, but also on the microscopic scale of elementary particles, with possible junctions to quantum mechanics.
Effects of functional electrical stimulation on the joints of adolescents with spinal cord injury
Nineteen adolescent subjects with complete spinal cord injuries resulting in paraplegia or tetraplegia participated in a functional electrical stimulation (FES) program consisting of computerized, controlled exercise and/or weight bearing. The effects of stimulated exercise and standing/walking on the lower extremity joints were prospectively studied. Plain radiographs and MRIs were obtained prior to and following completion of the exercise and standing and walking stages. In addition, the joints of five subjects were studied with synovial biopsies, arthroscopy, and the analysis of serum and synovial fluid for a 550 000 dalton cartilage matrix glycoprotein (CMGP). Pre-exercise joint abnormalities secondary to the spinal cord injury improved following the stimulation program. None of the subjects developed Charcot joint changes. Upon standing with FES, one subject with poor hip coverage prior to participation developed hip subluxation which required surgical repair. No other detrimental clinical effects occurred in the lower extremity joints of subjects participating in an FES program over a 1-year period.
A nonlocking end screw can decrease fracture risk caused by locked plating in the osteoporotic diaphysis.
BACKGROUND Locking plates transmit load through fixed-angle locking screws instead of relying on plate-to-bone compression. Therefore, locking screws may induce higher stress at the screw-bone interface than that seen with conventional nonlocked plating. This study investigated whether locked plating in osteoporotic diaphyseal bone causes a greater periprosthetic fracture risk than conventional plating because of stress concentrations at the plate end. It further investigated the effect of replacing the locked end screw with a conventional screw on the strength of the fixation construct. METHODS Three different bridge-plate constructs were applied to a validated surrogate of the osteoporotic femoral diaphysis. Constructs were tested dynamically to failure in bending, torsion, and axial loading to determine failure loads and failure modes. A locked plating construct was compared with a nonlocked conventional plating construct. Subsequently, the outermost locking screw in locked plating constructs was replaced with a conventional screw to reduce stress concentrations at the plate end. RESULTS Compared with the conventional plating construct, the locked plating construct was 22% weaker in bending (p = 0.013), comparably strong in torsion (p = 0.05), and 15% stronger in axial compression (p = 0.017). Substituting the locked end screw with a conventional screw increased the construct strength by 40% in bending (p = 0.001) but had no significant effect on construct strength under torsion (p = 0.22) and compressive loading (p = 0.53) compared with the locked plating construct. Under bending, all constructs failed by periprosthetic fracture. CONCLUSIONS Under bending loads, the focused load transfer of locking plates through fixed-angle screws can increase the periprosthetic fracture risk in the osteoporotic diaphysis compared with conventional plates. 
Replacing the outermost locking screw with a conventional screw reduced the stress concentration at the plate end and significantly increased the bending strength of the plating construct compared with an all-locked construct (p = 0.001).
Tocilizumab, a humanized anti-interleukin-6 receptor antibody, for treatment of rheumatoid arthritis
Interleukin (IL)-6 has a variety of biological functions. For example, it stimulates the production of acute-phase reactants (C-reactive protein and serum amyloid A) and hepcidin which interferes with iron recycling and absorption, causing iron-deficient anemia, and augments expression of vascular endothelial growth factor and receptor activator of nuclear factor-κB ligand in synovial cells, leading to neovascularization and osteoclast formation. IL-6 also acts on lymphocytes, not only on B cells to stimulate autoantibody production, but also on naïve T helper cells to promote Th17 cell differentiation. Thus, an imbalance between T cell subsets possibly contributes to development of rheumatoid arthritis. Several clinical studies have demonstrated that a humanized anti-IL-6 receptor antibody, tocilizumab, improves clinical symptoms in rheumatoid arthritis. Tocilizumab prevented radiographic progression of joint destruction by inhibiting cartilage/bone resorption. Tocilizumab also improved hematological abnormalities, including hypergammaglobulinemia, high levels of autoantibodies, and elevation of erythrocyte sedimentation rate and acute-phase proteins. Importantly, tocilizumab improved quality of life by reducing systemic symptoms, including fatigue, anemia, anorexia, and fever. These findings have confirmed that hyperproduction of IL-6 is responsible for the above clinical symptoms, including joint destruction. Many patients treated with tocilizumab achieved clinical remission associated with decreased serum IL-6, suggesting that IL-6 enhances autoimmunity. Tocilizumab is a new therapeutic option for rheumatoid arthritis.
Input-to-Output Gate to Improve RNN Language Models
This paper proposes a reinforcing method that refines the output layers of existing Recurrent Neural Network (RNN) language models. We refer to our proposed method as Input-to-Output Gate (IOG). IOG has an extremely simple structure, and thus, can be easily combined with any RNN language model. Our experiments on the Penn Treebank and WikiText-2 datasets demonstrate that IOG consistently boosts the performance of several different types of current topline RNN language models.
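The gating idea can be sketched in a few lines: a sigmoid gate computed from the current input word's embedding rescales the base model's output logits elementwise (a simplified illustration with made-up dimensions and random weights, not the paper's trained parameters):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
vocab, emb_dim = 10, 8                       # illustrative sizes
E = rng.normal(size=(vocab, emb_dim))        # input word embeddings
W_g = 0.1 * rng.normal(size=(vocab, emb_dim))  # gate projection (hypothetical)
b_g = np.zeros(vocab)

def iog_refine(base_logits, input_word_id):
    """One gate value per vocabulary word, computed from the current
    input word's embedding, rescales the base model's output logits."""
    gate = sigmoid(W_g @ E[input_word_id] + b_g)   # values in (0, 1)
    return gate * base_logits

base_logits = rng.normal(size=vocab)         # stand-in for any RNN LM's output
refined = iog_refine(base_logits, input_word_id=3)
probs = np.exp(refined) / np.exp(refined).sum()
assert probs.shape == (vocab,)
assert np.isclose(probs.sum(), 1.0)
```

Because the gate depends only on the input embedding and a small projection, it adds few parameters and wraps around the existing model's output layer without touching its recurrent core, which is why it composes with any RNN language model.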
Behavioral evaluation of consciousness in severe brain damage.
This paper reviews the current state of bedside behavioral assessment in brain-damaged patients with impaired consciousness (coma, vegetative state, minimally conscious state). As misdiagnosis in this field is unfortunately very frequent, we first discuss a number of fundamental principles of clinical evaluation that should guide the assessment of consciousness in brain-damaged patients in order to avoid confusion between vegetative state and minimally conscious state. The role of standardized behavioral assessment tools is particularly stressed. The second part of this paper reviews existing behavioral assessment techniques of consciousness, showing that there are actually a large number of these scales. After a discussion of the most widely used scale, the Glasgow Coma Scale, we present several new promising tools that show higher sensitivity and reliability for detecting subtle signs of recovery of consciousness in the post-acute setting.
Does silent myocardial infarction add prognostic value in ST-elevation myocardial infarction patients without a history of prior myocardial infarction? Insights from the Assessment of Pexelizumab in Acute Myocardial Infarction (APEX-AMI) Trial.
BACKGROUND ST-elevation myocardial infarction (STEMI) patients with a prior MI history have worse outcomes. The prognostic significance of silent MI (pathologic Q waves outside the ST-elevation territory) in STEMI is unclear. METHODS A total of 5,733 STEMI patients from 296 clinical centers in 17 countries were classified as (1) silent MI (baseline Q waves outside the infarct-related artery territory and no history of prior MI), (2) history of prior MI (HxMI), or (3) no prior MI. RESULTS Of 5,733 STEMI patients, 419 (7.3%) had silent MI, 693 (12.1%) had HxMI, and 4,621 (80.6%) had no prior MI. Ninety-day death and death/congestive heart failure/shock were higher in patients with HxMI (8.4% and 15.3%, respectively) and silent MI (6.7% and 13.9%, respectively) compared with patients with no prior MI (4.0% and 9.1%, respectively) (P ≤ .001 for all). After baseline adjustment, patients with HxMI were at increased risk for 90-day death (adjusted hazard ratio [HR] 1.62, 95% CI 1.18-2.21), whereas both those with HxMI and those with silent MI had increased risk of 90-day death/congestive heart failure/shock compared with those with no prior MI (adjusted HR 1.54, 95% CI 1.23-1.93 and adjusted HR 1.46, 95% CI 1.10-1.93, respectively). CONCLUSIONS Seven percent of STEMI patients had a silent MI. They represent a novel subgroup at increased risk comparable to those with known prior MI. Hence, in future studies, acquiring baseline Q wave data outside the distribution of acute injury should broaden the prognostic insights from STEMI patients with a prior MI.