Operationalizing frailty among older residents of assisted living facilities
BACKGROUND Frailty in later life is viewed as a state of heightened vulnerability to poor outcomes. The utility of frailty as a measure of vulnerability in the assisted living (AL) population remains unexplored. We examined the feasibility and predictive accuracy of two different interpretations of the Cardiovascular Health Study (CHS) frailty criteria in a population-based sample of AL residents. METHODS CHS frailty criteria were operationalized using two different approaches in 928 AL residents from the Alberta Continuing Care Epidemiological Studies (ACCES). Risks of one-year mortality and hospitalization were estimated for those categorized as frail or pre-frail (compared with non-frail). The prognostic significance of individual criteria was explored, and the area under the ROC curve (AUC) was calculated for select models to assess the utility of frailty in predicting one-year outcomes. RESULTS Regarding feasibility, complete CHS criteria could not be assessed for 40% of the initial 1,067 residents. Consideration of supplementary items for select criteria reduced this to 12%. Using absolute (CHS-specified) cut-points, 48% of residents were categorized as frail and were at greater risk for death (adjusted risk ratio [RR] 1.75, 95% CI 1.08-2.83) and hospitalization (adjusted RR 1.54, 95% CI 1.20-1.96). Pre-frail residents defined by absolute cut-points (48.6%) showed no increased risk for mortality or hospitalization compared with non-frail residents. Using relative cut-points (derived from the AL sample), 19% were defined as frail and 55% as pre-frail, and the associated risks for mortality and hospitalization varied by sex. Frail (but not pre-frail) women were more likely to die (RR 1.58, 95% CI 1.02-2.44) and be hospitalized (RR 1.53, 95% CI 1.25-1.87). Frail and pre-frail men showed an increased mortality risk (RR 3.21, 95% CI 1.71-6.00 and RR 2.61, 95% CI 1.40-4.85, respectively), while only pre-frail men had an increased risk of hospitalization (RR 1.58, 95% CI 1.15-2.17). Although incorporating either frailty measure improved the performance of predictive models, the best AUCs were 0.702 for mortality and 0.633 for hospitalization. CONCLUSIONS Application of the CHS criteria for frailty was problematic and only marginally improved the prediction of select adverse outcomes in AL residents. Development and validation of alternative approaches for detecting frailty in this population, including consideration of female/male differences, are warranted.
BLASX: A High Performance Level-3 BLAS Library for Heterogeneous Multi-GPU Computing
Basic Linear Algebra Subprograms (BLAS) are a set of low-level linear algebra kernels widely adopted by applications in deep learning and scientific computing. The massive and economical computing power brought forth by emerging GPU architectures drives interest in implementing the compute-intensive level-3 BLAS on multi-GPU systems. In this paper, we investigate existing multi-GPU level-3 BLAS implementations and show that 1) issues such as improper load balancing, inefficient communication, and insufficient GPU stream-level concurrency and data caching impede current implementations from fully harnessing heterogeneous computing resources; and 2) inter-GPU Peer-to-Peer (P2P) communication remains unexplored. We then present BLASX: a highly optimized multi-GPU level-3 BLAS. We adopt the concept of algorithms-by-tiles, treating a matrix tile as the basic data unit and an operation on tiles as the basic task. Tasks are scheduled by a dynamic asynchronous runtime that is cache- and locality-aware. The communication cost under BLASX becomes trivial as it fully overlaps communication and computation across multiple streams during asynchronous task progression. BLASX also takes the current tile cache scheme one step further with an innovative two-level hierarchical tile cache that takes advantage of inter-GPU P2P communication. As a result, BLASX exhibits linear speedup under multi-GPU configurations, and extensive benchmarks demonstrate that it consistently outperforms leading industrial and academic implementations such as cuBLAS-XT, SuperMatrix, and MAGMA.
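A minimal sketch of the algorithms-by-tiles decomposition the abstract describes (hypothetical Python standing in for BLASX's GPU runtime; the task loop and dispatch are illustrative, not the library's code): a GEMM is split into one task per output tile, and a runtime would dispatch each task to the device that owns that tile.

import numpy as np

def tiled_gemm_tasks(M, N, tile):
    """Enumerate the output tiles of C = A @ B; each is one schedulable task."""
    for i in range(0, M, tile):
        for j in range(0, N, tile):
            yield i, j

def run_task(A, B, C, i, j, tile):
    # In a BLASX-style runtime this body would be a cuBLAS GEMM issued on
    # whichever GPU "owns" tile C[i:i+tile, j:j+tile]; keeping the k-loop
    # inside the task lets the output tile stay resident on one device.
    K = A.shape[1]
    for k in range(0, K, tile):
        C[i:i+tile, j:j+tile] += A[i:i+tile, k:k+tile] @ B[k:k+tile, j:j+tile]

n, tile = 1024, 256
A, B = np.random.rand(n, n), np.random.rand(n, n)
C = np.zeros((n, n))
for i, j in tiled_gemm_tasks(n, n, tile):   # a runtime would dispatch these
    run_task(A, B, C, i, j, tile)
assert np.allclose(C, A @ B)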
Fixed Point and Bregman Iterative Methods for Matrix Rank Minimization
The linearly constrained matrix rank minimization problem is widely applicable in many fields such as control, signal processing and system identification. The linearly constrained nuclear norm minimization is a convex relaxation of this problem. Although it can be cast as a semidefinite programming problem, the nuclear norm minimization problem is expensive to solve when the matrices are large. In this paper, we propose fixed point and Bregman iterative algorithms for solving the nuclear norm minimization problem and prove convergence of the first of these algorithms. By using a homotopy approach together with an approximate singular value decomposition procedure, we get a very fast, robust and powerful algorithm that can solve very large matrix rank minimization problems. Our numerical results on randomly generated and real matrix completion problems demonstrate that this algorithm is much faster and provides much better recoverability than semidefinite programming solvers such as SDPT3.
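The fixed point iteration alternates a gradient step on the data-fit term with singular value shrinkage, the prox operator of the nuclear norm. A minimal sketch for matrix completion, where the linear map samples observed entries (parameter values are illustrative, not the paper's):

import numpy as np

def svt(Y, thresh):
    """Singular value shrinkage: the prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - thresh, 0.0)) @ Vt

def fixed_point_completion(M_obs, mask, mu=0.05, tau=1.0, iters=300):
    """X <- shrink(X - tau * A*(A(X) - b), tau*mu) for the sampling operator."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        grad = mask * (X - M_obs)        # gradient of 0.5*||P(X) - P(M)||^2
        X = svt(X - tau * grad, tau * mu)
    return X

# Recover a rank-2 matrix from ~60% of its entries; the relative error
# should drop well below 1 (illustrative sizes, no continuation scheme).
rng = np.random.default_rng(0)
M = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 40))
mask = rng.random((40, 40)) < 0.6
X = fixed_point_completion(M * mask, mask)
print(np.linalg.norm(X - M) / np.linalg.norm(M))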
Long-term clinical and angiographic follow-up after coronary stent placement in native coronary arteries.
BACKGROUND Although coronary stents have been proved effective in reducing clinical cardiac events for up to 3 to 5 years, longer-term clinical and angiographic outcomes have not yet been fully clarified. METHODS AND RESULTS To evaluate longer-term (7 to 11 years) outcome, clinical and angiographic follow-up information was analyzed in 405 patients with successful stenting in native coronary arteries. Primary or secondary stabilization, which was defined as freedom from death, coronary artery bypass grafting, and target lesion-percutaneous coronary intervention (TL-PCI) during the 14 months after the initial procedure or after the last TL-PCI, was achieved in 373 patients (92%) overall. Only 7 patients (1.7%) underwent TL-PCI more than twice. After the initial 14-month period, freedom from TL-PCI reached a plateau at 84.9% to 80.7% over 1 to 8 years. However, quantitative angiographic analysis in 179 lesions revealed a triphasic luminal response characterized by an early restenosis phase until 6 months, an intermediate-term regression phase from 6 months to 3 years, and a late renarrowing phase beyond 4 years. Minimal luminal diameter in 131 patients with complete serial data was 2.62+/-0.4 mm immediately after stenting, 2.0+/-0.49 mm at 6 months, 2.19+/-0.49 mm at 3 years, and 1.85+/-0.56 mm beyond 4 years (P<0.0001). CONCLUSIONS The efficacy and safety of coronary stenting seemed to be clinically sustained at 7 to 11 years of follow-up. However, late luminal renarrowing beyond 4 years was common, which demonstrates the need for further follow-up.
Study protocol of effectiveness of a biopsychosocial multidisciplinary intervention in the evolution of non-specific sub-acute low back pain in the working population: cluster randomised trial
BACKGROUND Non-specific low back pain is a common cause for consultation with the general practitioner, generating increased health and social costs. This study will analyse the effectiveness of a multidisciplinary intervention to reduce disability, severity of pain, anxiety and depression, to improve quality of life and to reduce the incidence of chronic low back pain in the working population with non-specific low back pain, compared to usual clinical care. METHODS/DESIGN A cluster randomised clinical trial will be conducted in 38 Primary Health Care Centres located in Barcelona, Spain and its surrounding areas. The centres are randomly allocated to the multidisciplinary intervention or to usual clinical care. Patients between 18 and 65 years old (n = 932; 466 per arm) and with a diagnosis of non-specific sub-acute low back pain are included. Patients in the intervention group receive the recommendations of clinical practice guidelines, in addition to a biopsychosocial multidisciplinary intervention consisting of group educational sessions lasting a total of 10 hours. The main outcome is the change in score on the Roland Morris disability questionnaire at three months after onset of pain. Other outcomes are severity of pain, quality of life, duration of the current non-specific low back pain episode, work sick leave and its duration, and the Fear Avoidance Beliefs and Goldberg Questionnaires. Outcomes will be assessed at baseline, 3, 6 and 12 months. Analysis will be by intention to treat. The intervention effect will be assessed through the standard error of measurement and the effect size. Responsiveness of each scale will be evaluated by the standardised response mean and the receiver-operating characteristic method. Recovery according to the patient will be used as an external criterion. A multilevel regression will be performed on repeated measures. The time the current episode of low back pain takes to subside will be analysed by Cox regression. DISCUSSION We hope to provide evidence of the effectiveness of the proposed biopsychosocial multidisciplinary intervention in avoiding the chronification of low back pain, and in reducing the duration of non-specific low back pain episodes. If the intervention is effective, it could be applied in Primary Health Care Centres. TRIAL REGISTRATION ISRCTN21392091.
Surgical technique: development of an anatomic medial knee reconstruction.
BACKGROUND The main static stabilizers of the medial knee are the superficial medial collateral and posterior oblique ligaments. A number of reconstructive techniques have been advocated, including the one we describe here. However, whether these reconstructions restore function and stability is unclear. DESCRIPTION OF TECHNIQUE This anatomic reconstruction technique consisted of reconstruction of the proximal and distal divisions of the superficial medial collateral ligament and the posterior oblique ligament using two separate grafts. PATIENTS AND METHODS We prospectively followed all 28 patients (19 males, 9 females) who had this new reconstruction between 2007 and 2009. The average age was 32.4 years (range, 16-56 years). There were eight acute and 20 chronic injuries. All patients presented with side-to-side instability during activities of daily living and other higher-level activities. Minimum followup was 6 months (average, 1.5 years; range, 0.5-3 years). No patients were lost to followup. RESULTS Preoperative International Knee Documentation Committee subjective outcome scores averaged 43.5 (range, 14-66) and final postoperative values averaged 76.2 (range, 54-88). Preoperative valgus stress radiographs averaged 6.2 mm of medial compartment gapping compared with the contralateral normal knee, whereas postoperative stress radiographs averaged 1.3 mm. CONCLUSIONS Early observations suggest this anatomic reconstruction technique improves overall patient function and restores valgus stability.
Partitioning the net effect of host diversity on an emerging amphibian pathogen.
The 'dilution effect' (DE) hypothesis predicts that diverse host communities will show reduced disease. The underlying causes of pathogen dilution are complex, because they involve non-additive (driven by host interactions and differential habitat use) and additive (controlled by host species composition) mechanisms. Here, we used measures of complementarity and selection traditionally employed in the field of biodiversity-ecosystem function (BEF) to quantify the net effect of host diversity on disease dynamics of the amphibian-killing fungus Batrachochytrium dendrobatidis (Bd). Complementarity occurs when average infection load in diverse host assemblages departs from that of each component species in uniform populations. Selection measures the disproportionate impact of a particular species in diverse assemblages compared with its performance in uniform populations, and therefore has strong additive and non-additive properties. We experimentally infected tropical amphibian species of varying life histories, in single- and multi-host treatments, and measured individual Bd infection loads. Host diversity reduced Bd infection in amphibians through a mechanism analogous to complementarity (sensu BEF), potentially by reducing shared habitat use and transmission among hosts. Additionally, the selection component indicated that one particular terrestrial species showed reduced infection loads in diverse assemblages at the expense of neighbouring aquatic hosts becoming heavily infected. By partitioning components of diversity, our findings underscore the importance of additive and non-additive mechanisms underlying the DE.
Tenofovir-related nephrotoxicity in human immunodeficiency virus-infected patients: three cases of renal failure, Fanconi syndrome, and nephrogenic diabetes insipidus.
We report 3 cases of renal toxicity associated with use of the antiviral agent tenofovir. Renal failure, proximal tubular dysfunction, and nephrogenic diabetes insipidus were observed, and, in 2 cases, renal biopsy revealed severe tubular necrosis with characteristic nuclear changes. Patients receiving tenofovir must be monitored closely for early signs of tubulopathy (glycosuria, acidosis, mild increase in the plasma creatinine level, and proteinuria).
Linking proactive personality to moral imagination: Moral identity as a moderator
In this study I explored the moderating effect of moral identity on the relationship between proactive personality and moral imagination. Data were collected at 2 time points separated by 1 month from 378 participants. The data for proactive personality, moral personality, and moral identity were collected at Time 1, and moral imagination was measured at Time 2. Results showed that proactive personality was positively associated with moral imagination for people with high levels of moral identity but was negatively associated with moral imagination for people with low levels of moral identity.
Facial expression recognition from video sequences: temporal and static modeling
The most expressive way humans display emotions is through facial expressions. In this work we report on several advances we have made in building a system for classification of facial expressions from continuous video input. We introduce and test different Bayesian network classifiers for classifying expressions from video, focusing on changes in distribution assumptions and feature dependency structures. In particular, we use Naive Bayes classifiers and change the distribution from Gaussian to Cauchy, and we use Gaussian Tree-Augmented Naive Bayes (TAN) classifiers to learn the dependencies among different facial motion features. We also introduce facial expression recognition from live video input using temporal cues. We exploit existing methods and propose a new architecture of hidden Markov models (HMMs) for automatically segmenting and recognizing human facial expressions from video sequences. The architecture performs both segmentation and recognition of the facial expressions automatically using a multi-level architecture composed of an HMM layer and a Markov model layer. We explore both person-dependent and person-independent recognition of expressions and compare the different methods.
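To illustrate the distribution swap the abstract mentions, a minimal Naive Bayes sketch where the per-feature class-conditional density can be Gaussian or Cauchy (hypothetical feature values and classes, not the paper's model):

import numpy as np
from scipy.stats import norm, cauchy

def nb_classify(x, params, dist=norm):
    """Naive Bayes: sum per-feature log densities per class, pick the max.
    params[c] is a list of (loc, scale) pairs, one per feature."""
    scores = {c: sum(dist.logpdf(xi, loc, scale)
                     for xi, (loc, scale) in zip(x, feats))
              for c, feats in params.items()}
    return max(scores, key=scores.get)

# Two expression classes over two motion features (illustrative numbers).
params = {"smile": [(0.8, 0.2), (0.1, 0.2)],
          "frown": [(-0.6, 0.2), (0.4, 0.2)]}
print(nb_classify([0.7, 0.0], params, dist=cauchy))   # -> "smile"

The heavier tails of the Cauchy density make the classifier less sensitive to outlying motion measurements, which is the motivation the paper gives for testing it against the Gaussian.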
A taxonomy and catalog of runtime software-fault monitoring tools
A goal of runtime software-fault monitoring is to observe software behavior to determine whether it complies with its intended behavior. Monitoring allows one to analyze and recover from detected faults, providing additional defense against catastrophic failure. Although runtime monitoring has been in use for over 30 years, there is renewed interest in its application to fault detection and recovery, largely because of the increasing complexity and ubiquitous nature of software systems. We present a taxonomy that developers and researchers can use to analyze and differentiate recent developments in runtime software-fault monitoring approaches. The taxonomy categorizes the various runtime monitoring research by classifying the elements that are considered essential for building a monitoring system, i.e., the specification language used to define properties, the monitoring mechanism that oversees the program's execution, and the event handler that captures and communicates monitoring results. After describing the taxonomy, the paper presents the classification of the software-fault monitoring systems described in the literature.
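As a toy illustration of the three elements the taxonomy names working together (hypothetical Python, not drawn from any of the surveyed systems):

def spec_no_negative_balance(state):
    """Specification: the property the monitored program must maintain."""
    return state["balance"] >= 0

def event_handler(prop, state):
    """Event handler: reports the fault (and could trigger recovery)."""
    print(f"violation of {prop} in state {state}")

def monitored(update, spec, handler):
    """Monitoring mechanism: re-checks the specification after each update."""
    def wrapper(state, *args):
        update(state, *args)
        if not spec(state):
            handler(spec.__name__, state)
    return wrapper

def withdraw(state, amount):
    state["balance"] -= amount

withdraw = monitored(withdraw, spec_no_negative_balance, event_handler)
account = {"balance": 10}
withdraw(account, 25)   # prints a violation: balance is now -15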
Road safety legislation in the Americas.
Legislating five of the main risk factors for road traffic injuries (RTIs), as much as enforcing the law, is essential in forging an integral culture of road safety. Analysis of the level of progression in law enforcement allows for an evaluation of the state of world regions. A secondary analysis of the 2009 Global status report on road safety: time for action survey was undertaken to evaluate legislation on five risk factors (speed management, drinking and driving, motorcycle helmet use, seatbelt use, and use of child restraints) in the Americas. Laws were classified depending on their level of progression: the existence of legislation, whether the legislation is adequate, a level of law enforcement > 6 (on a scale of 0-10), and whether the law is considered comprehensive. A descriptive analysis was performed. All of the countries have national or subnational legislation for at least one of the five risk factors. However, only 63% have laws covering all five risk factors studied, and none of them has comprehensive laws for all five. Seatbelt use appears to be the most extensively enforced legislation, while speeding laws appear to be the least enforced. There are positive efforts that should be recognized in the region. However, countries in the region stand at different stages of progression. Law enforcement remains the main issue to be tackled. Laws should be based on evidence about what is already known to be effective.
Spectral efficient protocols for half-duplex fading relay channels
We study two-hop communication protocols in which one or several relay terminals assist in the communication between two or more terminals. All terminals operate in half-duplex mode; hence, the transmission of one information symbol from the source terminal to the destination terminal occupies two channel uses. This leads to a loss in spectral efficiency due to the pre-log factor of one-half in the corresponding capacity expressions. We propose two new half-duplex relaying protocols that avoid the pre-log factor of one-half. First, we consider a relaying protocol in which a bidirectional connection between two terminals is established via one amplify-and-forward (AF) or decode-and-forward (DF) relay (two-way relaying). We also extend this protocol to a multi-user scenario, where multiple terminals communicate with multiple partner terminals via several orthogonalize-and-forward (OF) relay terminals, i.e., the relays orthogonalize the different two-way transmissions by a distributed zero-forcing algorithm. Second, we propose a relaying protocol in which two relays, either AF or DF, alternately forward messages from a source terminal to a destination terminal (two-path relaying). It is shown that both protocols recover a significant portion of the half-duplex loss.
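To make the pre-log loss concrete (a schematic comparison under idealized assumptions, not the paper's exact rate expressions): a classical one-way two-hop protocol spends two channel uses per symbol, so the rate carries the factor one-half,

\[ R_{\text{one-way}} = \tfrac{1}{2}\log_2\!\left(1+\mathrm{SNR}\right), \]

whereas two-way relaying reuses the same two phases for both directions, so the sum rate

\[ R_{\text{sum}} = \tfrac{1}{2}\log_2\!\left(1+\mathrm{SNR}_{A\to B}\right) + \tfrac{1}{2}\log_2\!\left(1+\mathrm{SNR}_{B\to A}\right) \]

restores the total pre-log factor to one across the two directions.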
Microsatellite instability induced mutations in DNA repair genes CtIP and MRE11 confer hypersensitivity to poly (ADP-ribose) polymerase inhibitors in myeloid malignancies.
Inactivation of the DNA mismatch repair pathway manifests as microsatellite instability, an accumulation of mutations that drives carcinogenesis. Here, we determined whether microsatellite instability in acute myeloid leukemia and myelodysplastic syndrome correlated with chromosomal instability and poly (ADP-ribose) polymerase (PARP) inhibitor sensitivity through disruption of DNA repair function. Acute myeloid leukemia cell lines (n=12) and primary cell samples (n=18), and bone marrow mononuclear cells from high-risk myelodysplastic syndrome patients (n=63), were profiled for microsatellite instability using fluorescent fragment polymerase chain reaction. PARP inhibitor sensitivity was assessed using cell survival, annexin V staining and cell cycle analysis. Homologous recombination was studied using immunocytochemical analysis. SNP karyotyping was used to study chromosomal instability. RNA silencing, Western blotting and gene expression analysis were used to study the functional consequences of mutations. Acute myeloid leukemia cell lines (4 of 12, 33%) and primary samples (2 of 18, 11%) exhibited microsatellite instability with mono-allelic mutations in CtIP and MRE11. These changes were associated with reduced expression of the mismatch repair pathway components MSH2, MSH6 and MLH1. Both microsatellite instability-positive primary acute myeloid leukemia samples and cell lines demonstrated a downregulation of homologous recombination DNA repair, conferring marked sensitivity to PARP inhibitors. Similarly, bone marrow mononuclear cells from 11 of 56 (20%) patients with de novo high-risk myelodysplastic syndrome exhibited microsatellite instability. Significantly, all 11 patients with microsatellite instability had cytogenetic abnormalities, with 4 of them (36%) possessing a mono-allelic microsatellite mutation in CtIP. Furthermore, a 50% reduction in CtIP expression by RNA silencing also down-regulated homologous recombination DNA repair responses, conferring PARP inhibitor sensitivity, whilst CtIP differentially regulated the expression of the homologous recombination-modulating RecQ helicases WRN and BLM. In conclusion, microsatellite instability-dependent mutations in the DNA repair genes CtIP and MRE11 are detected in myeloid malignancies, conferring hypersensitivity to PARP inhibitors. Microsatellite instability is significantly correlated with chromosomal instability in myeloid malignancies.
Observations regarding influenza A virus shedding in a swine breeding farm after mass vaccination
An outbreak of respiratory disease in suckling piglets started in December 2010 in a 1200-sow farrow-to-wean facility. Swine influenza virus H1N2 was isolated from nasal swabs of affected piglets and determined to be the cause of the respiratory disease. After 2 months of continuous respiratory disease in the suckling-piglet and nursery populations, a change in the influenza vaccination strategy was adopted. Administration of swine influenza autogenous vaccine at 85 to 91 days of gestation was discontinued, and mass vaccination of the breeding herd was performed with two doses of a commercial multivalent vaccine. Prevalence of virus shedding was monitored by real-time reverse transcriptase polymerase chain reaction assay in nasal swabs and oral fluids from sows and suckling piglets before and after mass vaccination. After vaccination, there was a significant increase (P < .001) in hemagglutination inhibition serum-antibody titers in breeding females. Prevalence of shedding in sows and suckling piglets decreased through the 13 weeks of monitoring until no influenza-positive samples were detected in suckling and recently weaned pigs. This case report provides insights into a potential control strategy for swine influenza in breeding herds through mass vaccination.
Interpreter Exploitation
As remote exploits further dwindle and perimeter defenses become the standard, remote client-side attacks are becoming the standard vector for attackers. Modern operating systems have quelled the explosion of client-side vulnerabilities using mitigation techniques such as data execution prevention (DEP) and address space layout randomization (ASLR). This work illustrates two novel techniques to bypass these mitigations. The two techniques leverage the attack surface exposed by the script interpreters commonly accessible within the browser. The first technique, pointer inference, is used to find the memory address of a string of shellcode within the Adobe Flash Player's ActionScript interpreter despite ASLR. The second technique, JIT spraying, is used to write shellcode to executable memory, bypassing DEP protections, by leveraging predictable behaviors of the ActionScript JIT compiler. Previous attacks are examined and future research directions are discussed.
Alobar holoprosencephaly associated with cebocephaly and craniosynostosis.
Cebocephaly is a very rare congenital midline facial anomaly characterized by a blind-ended single nostril and ocular hypotelorism, and it is usually combined with alobar holoprosencephaly. We report here a case of alobar holoprosencephaly with cebocephaly and craniosynostosis. Chromosomal analysis revealed a normal karyotype. The facial dysmorphism was characterized by a single nostril, hypotelorism, absence of the philtrum, and a small head circumference. The failure of cleavage of the prosencephalon and the fusion of all cranial sutures except the sagittal suture were documented by computed tomography (CT) and magnetic resonance imaging (MRI). Early detection by prenatal ultrasound examination is important because of the poor prognosis of alobar holoprosencephaly.
Habitat heterogeneity and niche structure of trees in two tropical rain forests
Dispersal-assembly theories of species coexistence posit that environmental factors play no role in explaining community diversity and structure. Dispersal-assembly theories shed light on some aspects of community structure such as species-area and species-abundance relationships. However, species' environmental associations also affect these measures of community structure. Measurements of species' niche breadth and overlap address this influence. Using a new continuous measure of niche and a dispersal-assembly null model that maintains species' niche breadth and aggregation, we tested two hypotheses assessing the effects of habitat heterogeneity on the ability of dispersal-assembly theories to explain community niche structure. We found that in both homogeneous and heterogeneous environments dispersal-assembly theories cannot fully explain observed niche structure. The performance of the dispersal-assembly null models was particularly poor in heterogeneous environments. These results indicate that non-dispersal-based mechanisms are in part responsible for observed community structure, and measures of community structure that include species' environmental associations should be used to test theories of species diversity.
Information propagation in the Bitcoin network
Bitcoin is a digital currency that, unlike traditional currencies, does not rely on a centralized authority. Instead, Bitcoin relies on a network of volunteers that collectively implement a replicated ledger and verify transactions. In this paper we analyze how Bitcoin uses a multi-hop broadcast to propagate transactions and blocks through the network to update the ledger replicas. We then use the gathered information to verify the conjecture that propagation delay in the network is the primary cause of blockchain forks. Blockchain forks should be avoided as they are symptomatic of inconsistencies among the replicas in the network. We then show what can be achieved by pushing the current protocol to its limit with unilateral changes to the client's behavior.
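A back-of-the-envelope sketch of why propagation delay produces forks (my simplified model, not the paper's measurements): blocks are found roughly as a Poisson process with a 600-second mean interval, so the chance that a competing block is found while the last one is still propagating grows with the delay.

import math

def fork_probability(delay_s, block_interval_s=600.0):
    """P(another block is found during the propagation window), modeling
    block discovery as a Poisson process with rate 1/interval."""
    lam = 1.0 / block_interval_s
    return 1.0 - math.exp(-lam * delay_s)

# With a network-wide delay on the order of 10 s (illustrative; the paper
# measures the real delay distribution), roughly 1-2% of blocks race a
# competitor:
print(f"{fork_probability(10.0):.3%}")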
An assessment of national surveillance systems for malaria elimination in the Asia Pacific
Heads of Government from Asia and the Pacific have committed to a malaria-free region by 2030. In 2015, the total number of confirmed cases reported to the World Health Organization by 22 Asia Pacific countries was 2,461,025. However, this was likely a gross underestimate due in part to incidence data not being available from the wide variety of known sources. There is a recognized need for an accurate picture of malaria over time and space to support the goal of elimination. A survey was conducted to gain a deeper understanding of the collection of malaria incidence data for surveillance by National Malaria Control Programmes in 22 countries identified by the Asia Pacific Leaders Malaria Alliance. In 2015–2016, a short questionnaire on malaria surveillance was distributed to the National Malaria Control Programmes (NMCPs) of 22 countries in the Asia Pacific. It collected country-specific information about the extent of inclusion of the range of possible sources of malaria incidence data and the role of the private sector in malaria treatment. The findings were used to produce recommendations for the regional heads of government on improving malaria surveillance to inform regional efforts towards malaria elimination. A survey response was received from all 22 target countries. Most of the malaria incidence data collected by NMCPs originated from government health facilities, while many did not collect comprehensive data from mobile and migrant populations, the private sector or the military. Ten of 20 countries included all data from village health workers, and five included some. Other sources of data included by some countries were plantations, police and other security forces, sentinel surveillance sites, research or academic institutions, private laboratories and other government ministries. Malaria was treated in private health facilities in 19/21 countries, while anti-malarials were available in private pharmacies in 16/21 and private shops in 6/21. Most countries use primarily paper-based reporting. Most of the malaria incidence data collected in the Asia Pacific come from government health facilities, while data from a wide variety of other known sources are often not included in national surveillance databases. In particular, there needs to be a concerted regional effort to support the inclusion of data on mobile and migrant populations and the private sector. There should also be an emphasis on electronic reporting and data harmonization across organizations. This will provide a more accurate and up-to-date picture of the true burden and distribution of malaria and will be of great assistance in helping realize the goal of malaria elimination in the Asia Pacific by 2030.
A Compact Koch Fractal UWB MIMO Antenna With WLAN Band-Rejection
In this letter, a compact octagonal-shaped fractal ultrawideband multiple-input-multiple-output (MIMO) antenna is presented, and its characteristics are investigated. In order to achieve the desired miniaturization and wideband phenomena, the self-similar and space-filling properties of Koch fractal geometry are used in the antenna design. These fractal monopoles are placed orthogonal to each other for good isolation. Moreover, grounded stubs are used in the geometry to provide further improvement in the isolation. The band-rejection phenomenon in the wireless local area network (WLAN) band is achieved by etching a C-shaped slot from the monopole of the antenna. The proposed antenna has compact dimensions of 45 mm × 45 mm and exhibits a quasi-omnidirectional radiation pattern. In addition, it shows an impedance bandwidth (S11 < -10 dB) from 2 to 10.6 GHz with isolation better than 17 dB over the entire ultrawideband range. Diversity performance is also evaluated in terms of envelope correlation coefficient and capacity loss. The measured results show good agreement with the simulated ones.
Primary essential cutis verticis gyrata - Case report
Cutis verticis gyrata is characterized by excessive formation of scalp skin. It may be primary (essential and nonessential) or secondary. The primary essential form presents only as folded skin on the scalp, mimicking cerebral gyri, without associated comorbidities. We report a rare case of a 28-year-old male patient with primary essential cutis verticis gyrata.
Design of Permanent Magnet-Assisted Synchronous Reluctance Motors Made Easy
Electric motor design is a multi-variable problem that involves the geometric dimensions of the stator and rotor. Presenting a unique solution for a family of optimization criteria has always been a challenge for motor designers. Several numerical tools such as finite element methods (FEM) have been developed to perform precise analysis and predict the outcome of a design. However, limits in parametric analysis as well as the mathematical and computational burden of numerical tools usually prevent the designer from obtaining a unique solution to the design problem. These limits and the demand for optimized solutions motivate the designer to use analytical models in order to perform a comprehensive parametric design. An accurate analytical model is crucial for this purpose. In this paper, an analytical model for a permanent magnet-assisted synchronous reluctance motor (PMa-SynRM) with four flux barriers and one cutout per pole is developed. Flux densities are found in the air gap, in the cutouts, and in the flux barriers; thus, the back-EMF developed by the permanent magnets is derived. Equations for the d-axis and q-axis inductances are also obtained. Electromagnetic torque is finally derived using the co-energy method. The developed analytical model highlights the contributions of the reluctance variation and the permanent magnets to the developed torque. Simulation results are obtained using both Matlab and Ansoft/Maxwell packages. These outcomes are supported by experimental results obtained from a laboratory test bed.
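For reference, the standard dq-frame torque expression that separates the two contributions the model highlights (textbook form, with the magnet flux linkage \(\lambda_{pm}\) placed on the d-axis; PMa-SynRM conventions sometimes put it on the q-axis, which swaps the current roles):

\[ T_e = \tfrac{3}{2}\,p\,\left(\lambda_d i_q - \lambda_q i_d\right) = \tfrac{3}{2}\,p\,\left[(L_d - L_q)\,i_d i_q + \lambda_{pm}\, i_q\right], \]

where \(p\) is the number of pole pairs: the first term is the reluctance torque from the inductance difference, and the second is the magnet torque.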
CONTROL OF UNICYCLE TYPE ROBOTS Tracking , Path Following and Point Stabilization
This paper considers the motion control problem of unicycle-type mobile robots. We present the mathematical model of the mobile robots, taking their dynamics explicitly into account, and formulate the respective motion control strategies of tracking and path following. Two types of controllers presented in the literature are reviewed, and their performance is illustrated through computer simulations. The problem of regulation to a point is also addressed.
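The kinematic core of all three tasks is the standard unicycle model (the paper layers the robot dynamics on top of this): with position \((x, y)\), heading \(\theta\), forward velocity \(v\), and angular velocity \(\omega\),

\[ \dot{x} = v\cos\theta, \qquad \dot{y} = v\sin\theta, \qquad \dot{\theta} = \omega. \]

Tracking and path-following controllers choose \(v\) and \(\omega\) to drive a pose error to zero; point stabilization is qualitatively harder because this model fails Brockett's necessary condition, so no smooth time-invariant state feedback can stabilize a point.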
A component map tuning method for performance prediction and diagnostics of gas turbine compressors
In this paper, a novel compressor map tuning method is developed with the primary objective of improving the accuracy and fidelity of gas turbine engine models for performance prediction and diagnostics. A new compressor map fitting and modelling method is introduced to simultaneously determine the best elliptical curves for a set of compressor map data. The coefficients that determine the shape of the compressor map curves are analyzed and tuned through a multi-objective optimization scheme in order to simultaneously match multiple sets of engine performance measurements. The component map tuning method, developed in the object-oriented Matlab/Simulink environment, is implemented in a dynamic gas turbine engine model and tested in off-design steady-state, transient, and degraded operating conditions. The results demonstrate the capabilities of the proposed method in refining existing engine performance models across different modes of gas turbine operation. In addition, the excellent agreement between the injected and the predicted degradation of the engine model demonstrates the potential of the proposed methodology for gas turbine diagnostics. The proposed method can be integrated with performance-based tools for improved condition monitoring and diagnostics of gas turbine power plants.
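A minimal sketch of the map-fitting idea, simplified to one speed line and one common generalized-ellipse parameterization, (m/a)^c + (pr/b)^c = 1 (the specific curve family, data, and coefficients here are illustrative assumptions, not the paper's):

import numpy as np
from scipy.optimize import least_squares

# Illustrative speed-line data: normalized corrected mass flow vs pressure ratio.
m = np.array([0.60, 0.70, 0.80, 0.88, 0.94])
pr = np.array([1.95, 1.90, 1.80, 1.60, 1.25])

def residuals(theta):
    a, b, c = theta
    # Generalized ellipse through the speed line: (m/a)^c + (pr/b)^c = 1
    return (m / a) ** c + (pr / b) ** c - 1.0

fit = least_squares(residuals, x0=[1.0, 2.0, 2.0],
                    bounds=([1e-3] * 3, [10.0] * 3))
a, b, c = fit.x
print(f"a={a:.3f} b={b:.3f} c={c:.3f}")

In the paper's scheme, shape coefficients like these become the tuning variables that a multi-objective optimizer adjusts until the engine model matches measured performance.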
A Context-aware Attention Network for Interactive Question Answering
Neural network based sequence-to-sequence models in an encoder-decoder framework have been successfully applied to solve Question Answering (QA) problems, predicting answers from statements and questions. However, almost all previous models have failed to consider detailed context information and unknown states under which systems do not have enough information to answer given questions. These scenarios with incomplete or ambiguous information are very common in the setting of Interactive Question Answering (IQA). To address this challenge, we develop a novel model employing context-dependent word-level attention for more accurate statement representations and question-guided sentence-level attention for better context modeling. We also generate unique IQA datasets to test our model, which will be made publicly available. Employing these attention mechanisms, our model accurately understands when it can output an answer or when it requires generating a supplementary question for additional input, depending on the context. When available, the user's feedback is encoded and directly applied to update sentence-level attention to infer an answer. Extensive experiments on QA and IQA datasets quantitatively demonstrate the effectiveness of our model, with significant improvement over state-of-the-art conventional QA models.
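A minimal sketch of the attention pattern underlying both levels (hypothetical shapes, not the paper's architecture): score item vectors against a context vector, softmax the scores, and take a weighted sum. At the word level the items are word embeddings; at the sentence level they are sentence representations scored against a question-derived context.

import numpy as np

def attend(H, c):
    """H: (n, d) item vectors; c: (d,) context (e.g. question) vector.
    Returns attention weights and the weighted-sum representation."""
    scores = H @ c                       # relevance of each item to c
    w = np.exp(scores - scores.max())
    w /= w.sum()                         # softmax over items
    return w, w @ H                      # (n,), (d,)

H = np.random.randn(5, 8)   # five words, 8-dim embeddings
q = np.random.randn(8)      # question context
weights, representation = attend(H, q)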
Implementation and Optimization of the Accelerator Based on FPGA Hardware for LSTM Network
Today, artificial neural networks (ANNs) are important machine learning methods widely used in a variety of applications. As an emerging branch of ANNs, recurrent neural networks (RNNs) are often used for sequence-related applications, and Long Short-Term Memory (LSTM) is an improved RNN that contains complex computational logic. To achieve high accuracy, researchers tend to build large-scale LSTM networks that are both time-consuming and power-consuming. Thus the acceleration of LSTM networks and low power and energy consumption have become hot issues in today's research. In this paper, we present a hardware accelerator for the LSTM neural network layer based on the FPGA Zedboard and use pipeline methods to parallelize the forward computing process. To optimize our implementation, we also use multiple methods including tiled matrix-vector multiplication, a binary adder tree, and overlap of computation and data access. Through these acceleration and optimization methods, our accelerator is power-efficient and achieves better performance than an ARM Cortex-A9 processor and an Intel Core i5 processor.
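A minimal sketch of the tiled matrix-vector multiplication named above (Python standing in for the hardware dataflow; the tile size is illustrative): processing one weight block at a time is what lets an accelerator fetch the next block while multiplying the current one, i.e., overlap computation and data access.

import numpy as np

def tiled_matvec(W, x, tile=64):
    """y = W @ x computed one (tile x tile) weight block at a time,
    mimicking an accelerator that buffers a block, multiplies, accumulates."""
    n_out, n_in = W.shape
    y = np.zeros(n_out)
    for r in range(0, n_out, tile):
        for c in range(0, n_in, tile):
            # On the FPGA this block would be streamed in while the
            # previous block is being multiplied.
            y[r:r+tile] += W[r:r+tile, c:c+tile] @ x[c:c+tile]
    return y

W, x = np.random.randn(256, 128), np.random.randn(128)
assert np.allclose(tiled_matvec(W, x), W @ x)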
Supporting Undergraduate Computer Architecture Students Using a Visual MIPS64 CPU Simulator
The topics of computer architecture are typically taught using an assembly dialect as an example. The most commonly used textbooks in this field use the MIPS64 Instruction Set Architecture (ISA) to help students learn the fundamentals of computer architecture because of its orthogonality and its suitability for real-world applications. This paper shows how to use the EduMIPS64 visual CPU simulator as a supporting tool for teaching the standard topics covered by an undergraduate course in computer architecture. The proposed approach is first compared to other similar works in the field; then, after a short description of the simulator, the paper focuses on how it can be used for teaching specific topics in an undergraduate computer architecture course. This discussion is followed by a quantitative assessment of the suitability of the simulator by means of a survey compiled by the students themselves; the results show that EduMIPS64 is suitable for the purpose for which it was built, that is, supporting the learning process of computer architecture topics.
Cloaked Facebook pages: Exploring fake Islamist propaganda in social media
This research analyses cloaked Facebook pages that are created to spread political propaganda by cloaking a user profile and imitating the identity of a political opponent in order to spark hateful and aggressive reactions. This inquiry is pursued through a multi-sited ethnographic case study of Danish Facebook pages disguised as radical Islamist pages, which provoked racist and anti-Muslim reactions as well as negative sentiments toward refugees and immigrants in Denmark in general. Drawing on Jessie Daniels' critical insights into cloaked websites, this research furthermore analyses the epistemological, methodological, and conceptual challenges of online propaganda. It enhances our understanding of disinformation and propaganda in an increasingly interactive social media environment and contributes to a critical inquiry into social media and subversive politics.
Laparoscopic versus open total mesorectal excision with anal sphincter preservation for low rectal cancer
The laparoscopic approach has been applied to colorectal surgery for many years; however, there are only a few reports on laparoscopic low and ultralow anterior resection with construction of coloanal anastomosis. This study compares open versus laparoscopic low and ultralow anterior resections, assesses the feasibility and efficacy of the laparoscopic approach to total mesorectal excision (TME) with anal sphincter preservation (ASP), and analyzes the short-term results of patients with low rectal cancer. We analyzed our experience via a prospective, randomized controlled trial. From June 2001 to September 2002, 171 patients with low rectal cancer underwent TME with ASP, 82 by the laparoscopic procedure and 89 by the open technique. The lowest margin of tumors was below the peritoneal reflection and 1.5–8 cm above the dentate line (1.5–4.9 cm in 104 cases and 5–8 cm in 67 cases). The grouping was randomized. Results of operation, postoperative recovery, and short-term oncological follow-up were compared between 82 laparoscopic procedures and 89 controls who underwent open surgery during the same period. In the laparoscopic group, 30 patients in whom low anterior resection was performed had the anastomosis below the peritoneal reflection and more than 2 cm above the dentate line, 27 patients in whom ultralow anterior resection was performed had an anastomotic height within 2 cm of the dentate line, and 25 patients in whom coloanal anastomosis was performed had the anastomosis at or below the dentate line. In the open group, the numbers were 35, 27, and 27, respectively. There was no statistical difference in operation time, administration of parenteral analgesics, start of food intake, and mortality rate between the two groups. However, blood loss was less, bowel function recovered earlier, and hospitalization time was shorter in the laparoscopic group. Totally laparoscopic TME with ASP is feasible, and it is a minimally invasive technique with the benefits of much less blood loss during operation, earlier return of bowel function, and shorter hospitalization.
High-Resolution Angle Estimation for an Automotive FMCW Radar Sensor
This paper introduces the application of high-resolution angle estimation algorithms for a 77-GHz automotive long-range radar sensor. High-resolution direction-of-arrival (DOA) estimation is important for future safety systems. Given the FMCW principle, the major challenges discussed in this paper are the small number of snapshots, the correlation of the signals, and antenna mismatches. Simulation results allow analysis of these effects and help in designing the sensor. Road traffic measurements show superior DOA resolution and the feasibility of high-resolution angle estimation.
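A minimal sketch of one standard high-resolution DOA estimator (MUSIC) for a uniform linear array, in a generic textbook form under ideal assumptions rather than the sensor's algorithm; the small-snapshot and correlated-signal issues the paper discusses are exactly what degrades this estimator in practice.

import numpy as np

def music_spectrum(X, n_src, d=0.5, angles=np.linspace(-90, 90, 361)):
    """X: (n_antennas, n_snapshots) array data; d: spacing in wavelengths."""
    n = X.shape[0]
    R = X @ X.conj().T / X.shape[1]             # sample covariance
    _, eigvec = np.linalg.eigh(R)
    En = eigvec[:, : n - n_src]                 # noise subspace
    p = np.zeros(len(angles))
    for i, th in enumerate(np.deg2rad(angles)):
        a = np.exp(2j * np.pi * d * np.arange(n) * np.sin(th))  # steering vector
        p[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)   # pseudospectrum
    return angles, p

# Two sources at -10 and 25 degrees, 8 elements, 100 snapshots.
n, snaps = 8, 100
doas = np.deg2rad([-10, 25])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(n), np.sin(doas)))
S = np.random.randn(2, snaps) + 1j * np.random.randn(2, snaps)
X = A @ S + 0.1 * (np.random.randn(n, snaps) + 1j * np.random.randn(n, snaps))
ang, p = music_spectrum(X, n_src=2)
print(ang[np.argmax(p)])   # pseudospectrum peaks near -10 and 25 degrees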
Real-time user-guided image colorization with learned deep priors
We propose a deep learning approach for user-guided image colorization. The system directly maps a grayscale image, along with sparse, local user "hints", to an output colorization with a Convolutional Neural Network (CNN). Rather than using hand-defined rules, the network propagates user edits by fusing low-level cues with high-level semantic information learned from large-scale data. We train on a million images, with simulated user inputs. To guide the user towards efficient input selection, the system recommends likely colors based on the input image and current user inputs. The colorization is performed in a single feed-forward pass, enabling real-time use. Even with randomly simulated user inputs, we show that the proposed system helps novice users quickly create realistic colorizations, and offers large improvements in colorization quality with just a minute of use. In addition, we demonstrate that the framework can incorporate other user "hints" as to the desired colorization, showing an application to color histogram transfer.
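A minimal sketch of the input/output interface such a system implies (hypothetical PyTorch module with illustrative layer sizes, not the paper's network): the model consumes the L lightness channel plus a sparse ab hint map and a binary mask marking where hints exist, and predicts the ab chrominance channels.

import torch
import torch.nn as nn

class HintColorizer(nn.Module):
    """Grayscale L + sparse ab hints + hint mask -> predicted ab channels."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + 2 + 1, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 2, 3, padding=1), nn.Tanh(),   # ab in [-1, 1]
        )

    def forward(self, L, ab_hints, mask):
        return self.net(torch.cat([L, ab_hints, mask], dim=1))

model = HintColorizer()
L = torch.rand(1, 1, 64, 64)        # lightness channel
hints = torch.zeros(1, 2, 64, 64)   # sparse user hints (mostly empty)
mask = torch.zeros(1, 1, 64, 64)
hints[0, :, 20, 20] = torch.tensor([0.3, -0.2])   # one user click
mask[0, 0, 20, 20] = 1.0
ab = model(L, hints, mask)          # (1, 2, 64, 64) predicted chrominance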
Practical condition synchronization for transactional memory
Few transactional memory implementations allow for condition synchronization among transactions. The problems are many, most notably the lack of consensus about a single appropriate linguistic construct, and the lack of mechanisms that are compatible with hardware transactional memory. In this paper, we introduce a broadly useful mechanism for supporting condition synchronization among transactions. Our mechanism supports a number of linguistic constructs for coordinating transactions, and does so without introducing overhead on in-flight hardware transactions. Experiments show that our mechanisms work well, and that the diversity of linguistic constructs allows programmers to choose the technique that is best suited to a particular application.
DynamicFusion: Reconstruction and tracking of non-rigid scenes in real-time
We present the first dense SLAM system capable of reconstructing non-rigidly deforming scenes in real-time, by fusing together RGBD scans captured from commodity sensors. Our DynamicFusion approach reconstructs scene geometry whilst simultaneously estimating a dense volumetric 6D motion field that warps the estimated geometry into a live frame. Like KinectFusion, our system produces increasingly denoised, detailed, and complete reconstructions as more measurements are fused, and displays the updated model in real time. Because we do not require a template or other prior scene model, the approach is applicable to a wide range of moving objects and scenes.
Effects of users' envy and shame on social comparison that occurs on social network services
In the context of the social network service environment, we explore how discrete emotions—envy and shame, in particular—may mediate the effects of social comparison on behavioral intention and psychological responses. Based on the survey responses of 446 university students, the results suggest that social comparison to media figures correlates with a range of emotional responses as well as with behavioral intention and psychological responses. Envy maintained a significantly greater association with switch intention as a behavioral intention compared to shame. Conversely, shame was significantly related to burnout as a psychological response. Further, mediational analyses were consistent with the argument that envy and shame mediate the relationship between social comparison and the outcome variables. This research helps to illuminate the driving mechanism for the emotional effects that social comparison on social network services can elicit from a user, predicts the nature of the associated behavioral and psychological outcomes, and has implications for an enhanced understanding of the way in which the unique social network service communication environment may stimulate this process.
The Effect of Substrate-Enriched and Substrate-Impoverished Housing Environments on the Diversity of Behaviour in Pigs
In intensive farming situations, growing animals are housed in relatively barren environments. The lack of opportunity to perform substrate-interactive and manipulative behaviour patterns may affect the expression and organization of these behaviours. However, making direct comparisons of the behaviour expressed in environments of differing physical complexity is difficult. In this experiment a relative diversity index was used to compare the behavioural repertoires of pigs housed in two different environments for a period of five months. One group of pigs (substrate-enriched) had straw, forest bark and branches added to the standard pens, and the other group (substrate-impoverished) did not. The pigs were individually housed, and their behaviour was focal-sampled in these pens on one day each month. It was shown that the relative diversity of manipulative behaviour shown by the pigs in the substrate-impoverished environment was lower than that of the pigs in the substrate-enriched environment (p < 0.05). The relative diversity of the whole behavioural repertoire shown by the pigs in the substrate-impoverished environment also tended to be lower than that in the substrate-enriched environment (p = 0.06). It is concluded that this may be due to a difference between the two groups in motivation to interact with and manipulate objects, or a function of the manipulable quality of the substrates available to them. Alternatively, exposure to substrate-impoverished environments may interfere with the ability to express manipulative behaviour. Both situations pose a threat to the welfare of growing pigs resident in barren environments.
Byzantine Generals in Action: Implementing Fail-Stop Processors
A fail-stop processor halts instead of performing an erroneous state transformation that might be visible to other processors, can detect whether another fail-stop processor has halted (due to a failure), and has a predefined portion of its storage that will remain unaffected by failures and accessible to any other fail-stop processor. Fail-stop processors can simplify the construction of fault-tolerant computing systems. In this paper, the problem of approximating fail-stop processors is discussed. Use of fail-stop processors is compared with the state machine approach, another general paradigm for constructing fault-tolerant systems.
Chloramphenicol Selection of IS10 Transposition in the cat Promoter Region of Widely Used Cloning Vectors
The widely used pSU8 family of cloning vectors is based on a p15A replicon and a chloramphenicol acetyltransferase (cat) gene conferring chloramphenicol resistance. We frequently observed an increase in the size of plasmids derived from these vectors. Analysis of the bigger molecular species shows that they have an IS10 copy inserted at a specific site between the promoter and the cat open reading frame. Promoter activity from both ends of IS10 has been reported, suggesting that the insertion events could lead to higher CAT production. Insertions were observed in certain constructions containing inserts that could lead to plasmid instability. To test the possibility that IS10 insertions were selected as a response to chloramphenicol selection, we have grown these constructs in the presence of different amounts of antibiotic and we observed that insertions arise promptly under higher chloramphenicol selective pressure. IS10 is present in many E. coli laboratory strains, so the possibility of insertion in constructions involving cat-containing vectors should be taken into account. Using lower chloramphenicol concentrations could solve this problem.
Web-Scale Bayesian Click-Through rate Prediction for Sponsored Search Advertising in Microsoft's Bing Search Engine
We describe a new Bayesian click-through rate (CTR) prediction algorithm used for Sponsored Search in Microsoft's Bing search engine. The algorithm is based on a probit regression model that maps discrete or real-valued input features to probabilities. It maintains Gaussian beliefs over the weights of the model and performs Gaussian online updates derived from approximate message passing. Scalability of the algorithm is ensured through a principled weight pruning procedure and an approximate parallel implementation. We discuss the challenges arising from evaluating and tuning the predictor as part of the complex system of sponsored search, where the predictions made by the algorithm determine future training sample composition. Finally, we show experimental results from the production system and compare to a calibrated Naïve Bayes algorithm.
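A minimal sketch of this style of Bayesian online probit update (a generic reconstruction of the adPredictor-style equations under simplifying assumptions such as binary one-hot features, not Bing's production code): each weight keeps a Gaussian belief, and each click or non-click observation shifts the means and shrinks the variances of the active features.

import math

def phi(t):   # standard normal pdf
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def Phi(t):   # standard normal cdf
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

def update(mu, s2, active, y, beta=1.0):
    """mu, s2: per-weight Gaussian beliefs; active: indices of the binary
    features present in this impression; y: +1 click, -1 no click."""
    Sigma2 = beta ** 2 + sum(s2[i] for i in active)   # total predictive variance
    t = y * sum(mu[i] for i in active) / math.sqrt(Sigma2)
    v = phi(t) / Phi(t)            # mean correction factor
    w = v * (v + t)                # variance correction factor, in (0, 1)
    for i in active:
        mu[i] += y * (s2[i] / math.sqrt(Sigma2)) * v
        s2[i] *= 1 - (s2[i] / Sigma2) * w
    return mu, s2

mu, s2 = [0.0] * 5, [1.0] * 5
update(mu, s2, active=[0, 3], y=+1)   # one clicked impression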
A Fuzzy Logic Approach for Opinion Mining on Large Scale Twitter Data
Recently, some efforts have been made to mine social media for the analysis of public sentiment. A literature review of early works on social media analytics, especially opinion mining, shows that in real-world social media environments the data are commonly unstructured and do not directly carry enough information to fully represent a selected target. Most of these works were unable to accurately extract clear indications of general public opinion from ambiguous social media data. They also lacked the capacity to summarize multiple characteristics from the scattered mass of social data into useful models, and they lacked efficient mechanisms for managing big data. Motivated by these research problems, this paper proposes a novel matrix-based fuzzy algorithm, called the FMM system, to mine the defined multi-layered Twitter data. In sets of comparative experiments on Twitter data, the proposed FMM system achieved excellent performance, with both fast processing speeds and high predictive accuracy.
Where Are the Blobs: Counting by Localization with Point Supervision
Object counting is an important task in computer vision due to its growing demand in applications such as surveillance, traffic monitoring, and counting everyday objects. State-of-the-art methods use regression-based optimization where they explicitly learn to count the objects of interest. These often perform better than detection-based methods that need to learn the more difficult task of predicting the location, size, and shape of each object. However, we propose a detection-based method that does not need to estimate the size and shape of the objects and that outperforms regression-based methods. Our contributions are three-fold: (1) we propose a novel loss function that encourages the network to output a single blob per object instance using point-level annotations only; (2) we design two methods for splitting large predicted blobs between object instances; and (3) we show that our method achieves new state-of-the-art results on several challenging datasets including the Pascal VOC and the Penguins datasets. Our method even outperforms those that use stronger supervision such as depth features, multi-point annotations, and bounding-box labels.
Druid: A Real-time Analytical Data Store
Druid is an open source data store designed for real-time exploratory analytics on large data sets. The system combines a column-oriented storage layout, a distributed, shared-nothing architecture, and an advanced indexing structure to allow for the arbitrary exploration of billion-row tables with sub-second latencies. In this paper, we describe Druid’s architecture, and detail how it supports fast aggregations, flexible filters, and low latency data ingestion.
Authorship Attribution Using Word Sequences
Authorship attribution is the task of identifying the author of a given text. The main concern of this task is to define an appropriate characterization of documents that captures the writing style of authors. This paper proposes a new method for authorship attribution based on the idea that a proper identification of authors must consider both stylistic and topic features of texts. This method characterizes documents by a set of word sequences that combine functional and content words. The experimental results on poem classification demonstrate that this method outperforms most current state-of-the-art approaches, and that it is appropriate for handling the attribution of short documents.
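A minimal sketch of the kind of word-sequence feature the method describes (illustrative function-word list and filtering rule, not the paper's exact procedure): extract word n-grams that mix function words with content words and use their counts as features.

from collections import Counter

FUNCTION_WORDS = {"the", "of", "and", "to", "in", "a", "is", "that", "it", "for"}

def word_sequences(text, n=3):
    """Word n-grams containing at least one function word, so features
    capture style (function words) anchored to topic (content words)."""
    tokens = text.lower().split()
    grams = zip(*(tokens[i:] for i in range(n)))
    return Counter(g for g in grams if FUNCTION_WORDS & set(g))

doc = "the sea is calm tonight and the tide is full"
print(word_sequences(doc).most_common(3))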
EasyLocal++: an object-oriented framework for the flexible design of local-search algorithms
We propose EASYLOCAL++, an object-oriented framework for the design and analysis of local search algorithms. The abstract classes that compose the framework specify and implement the invariant part of the algorithm and are meant to be specialized by concrete classes that supply the problem-dependent part. The framework provides the full control structures of the algorithms, and the user has only to write the problem-specific code. Furthermore, the framework comes with tools that simplify the analysis of the algorithms.
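The split between the invariant control structure and the user-supplied, problem-dependent pieces can be sketched as follows (a Python analogue of the object-oriented design, not EASYLOCAL++'s actual C++ API):

import random

class LocalSearch:
    """Invariant part: the control loop, written once by the framework."""
    def __init__(self, cost, neighbor, max_iters=10_000):
        self.cost, self.neighbor, self.max_iters = cost, neighbor, max_iters

    def run(self, state):
        best, best_cost = state, self.cost(state)
        for _ in range(self.max_iters):
            cand = self.neighbor(best)
            c = self.cost(cand)
            if c < best_cost:              # simple hill climbing; tabu or
                best, best_cost = cand, c  # simulated annealing would plug in here
        return best, best_cost

# Problem-dependent part, supplied by the user: minimize a 1-D function.
search = LocalSearch(cost=lambda x: (x - 3) ** 2,
                     neighbor=lambda x: x + random.uniform(-0.5, 0.5))
print(search.run(state=0.0))   # converges near (3.0, 0.0)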
Silk Server - Adding missing Links while consuming Linked Data
The Web of Linked Data is built upon the idea that data items on the Web are connected by RDF links. Sadly, the reality on the Web shows that Linked Data sources set some RDF links pointing at data items in related data sources, but they clearly do not set RDF links to all data sources that provide related data. In this paper, we present Silk Server, an identity resolution component, which can be used within Linked Data application architectures to augment Web data with additional RDF links. Silk Server is designed to be used with an incoming stream of RDF instances, produced for example by a Linked Data crawler. Silk Server matches the RDF descriptions of incoming instances against a local set of known instances and discovers missing links between them. Based on this assessment, an application can store data about newly discovered instances in its repository or fuse data that is already known about an entity with additional data about the entity from the Web. Afterwards, we report on the results of an experiment in which Silk Server was used to generate RDF links between authors and publications from the Semantic Web Dog Food Corpus and a stream of FOAF profiles that were crawled from the Web.
Printable hydraulics: A method for fabricating robots by 3D co-printing solids and liquids
This paper introduces a novel technique for fabricating functional robots using 3D printers. Simultaneously depositing photopolymers and a non-curing liquid allows complex, pre-filled fluidic channels to be fabricated. This new printing capability enables complex hydraulically actuated robots and robotic components to be automatically built, with no assembly required. The technique is showcased by printing linear bellows actuators, gear pumps, soft grippers and a hexapod robot, using a commercially-available 3D printer. We detail the steps required to modify the printer and describe the design constraints imposed by this new fabrication approach.
Ontology-Based Approach to Social Data Sentiment Analysis: Detection of Adolescent Depression Signals
BACKGROUND Social networking services (SNSs) contain abundant information about the feelings, thoughts, interests, and patterns of behavior of adolescents that can be obtained by analyzing SNS postings. An ontology that expresses the shared concepts and their relationships in a specific field could be used as a semantic framework for social media data analytics. OBJECTIVE The aim of this study was to refine an adolescent depression ontology and terminology as a framework for analyzing social media data and to evaluate description logics between classes and the applicability of this ontology to sentiment analysis. METHODS The domain and scope of the ontology were defined using competency questions. The concepts constituting the ontology and terminology were collected from clinical practice guidelines, the literature, and social media postings on adolescent depression. Class concepts, their hierarchy, and the relationships among class concepts were defined. An internal structure of the ontology was designed using the entity-attribute-value (EAV) triplet data model, and superclasses of the ontology were aligned with the upper ontology. Description logics between classes were evaluated by mapping concepts extracted from the answers to frequently asked questions (FAQs) onto the ontology concepts derived from description logic queries. The applicability of the ontology was validated by examining the representability of 1358 sentiment phrases using the ontology EAV model and conducting sentiment analyses of social media data using ontology class concepts. RESULTS We developed an adolescent depression ontology that comprised 443 classes and 60 relationships among the classes; the terminology comprised 1682 synonyms of the 443 classes. In the description logics test, no error in relationships between classes was found, and about 89% (55/62) of the concepts cited in the answers to FAQs mapped onto the ontology class. Regarding applicability, the EAV triplet models of the ontology class represented about 91.4% of the sentiment phrases included in the sentiment dictionary. In the sentiment analyses, "academic stresses" and "suicide" contributed negatively to the sentiment of adolescent depression. CONCLUSIONS The ontology and terminology developed in this study provide a semantic foundation for analyzing social media data on adolescent depression. To be useful in social media data analysis, the ontology, especially the terminology, needs to be updated constantly to reflect rapidly changing terms used by adolescents in social media postings. In addition, more attributes and value sets reflecting depression-related sentiments should be added to the ontology.
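The EAV triplet representation of sentiment phrases can be sketched as follows; the class, attribute, and value names are illustrative assumptions, not the ontology's actual vocabulary:

```python
# Entity-attribute-value (EAV) triplets for a social media phrase.
from typing import NamedTuple

class EAV(NamedTuple):
    entity: str      # ontology class, e.g. a stressor
    attribute: str   # property of that class
    value: str       # value expressing the sentiment

phrase = "so stressed about exams I can't sleep"
triplets = [
    EAV("AcademicStress", "severity", "high"),
    EAV("Sleep", "status", "disturbed"),
]
print(triplets)
```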
Index switching causes “spreading-of-signal” among multiplexed samples in Illumina HiSeq 4000 DNA sequencing
Illumina-based next generation sequencing (NGS) has accelerated biomedical discovery through its ability to generate thousands of gigabases of sequencing output per run at a fraction of the time and cost of conventional technologies. The process typically involves four basic steps: library preparation, cluster generation, sequencing, and data analysis. In 2015, a new chemistry of cluster generation was introduced in the newer Illumina machines (HiSeq 3000/4000/X Ten) called exclusion amplification (ExAmp), which was a fundamental shift from the earlier method of random cluster generation by bridge amplification on a non-patterned flow cell.
Biomechanical and viscoelastic properties of skin, SMAS, and composite flaps as they pertain to rhytidectomy.
Previous studies have focused on biomechanical and viscoelastic properties of the superficial musculoaponeurotic system (SMAS) flap and the skin flap lifted in traditional rhytidectomy procedures. The authors compared these two layers with the composite rhytidectomy flap to explain their clinical observations that the composite dissection allows greater tension and lateral pull to be placed on the facial and cervical flaps, with less long-term stress-relaxation and tissue creep. Eight fresh cadavers were dissected by elevating flaps on one side of the face and neck as skin and SMAS flaps and on the other side as a standard composite rhytidectomy flap. The tissue samples were tested for breaking strength, tissue tearing force, stress-relaxation, and tissue creep. For breaking strength, uniform samples were pulled at a rate of 1 inch per minute, and the stress required to rupture the tissues was measured. Tissue tearing force was measured by attaching a 3-0 suture to the tissues and pulling at the same rate as that used for breaking strength. The force required to tear the suture out of the tissues was then measured. Stress-relaxation was assessed by tensing the uniformly sized strips of tissue to 80 percent of their breaking strength, and the amount of tissue relaxation was measured at 1-minute intervals for a total of 5 minutes. This measurement is expressed as the percentage of tissue relaxation per minute. Tissue creep was assessed by using a 3-0 suture and calibrated pressure gauge attached to the facial flaps. The constant tension applied to the flaps was 80 percent of the tissue tearing force. The distance crept was measured in millimeters after 2 and 3 minutes of constant tension. Breaking strength measurements demonstrated significantly greater breaking strength of skin and composite flaps as compared with SMAS flaps (p < 0.05). No significant difference was noted between skin and composite flaps. However, tissue tearing force demonstrated that the composite flaps were able to withstand a significantly greater force as compared with both skin and SMAS flaps (p < 0.05). Stress-relaxation analysis revealed the skin flaps to have the highest degree of stress-relaxation over each of five 1-minute intervals. In contrast, the SMAS and composite flaps demonstrated a significantly lower degree of stress-relaxation over the five 1-minute intervals (p < 0.05). There was no difference noted between the SMAS flaps and composite flaps with regard to stress-relaxation. Tissue creep correlated with the stress-relaxation data. The skin flaps demonstrated the greatest degree of tissue creep, which was significantly greater than that noted for the SMAS flaps or composite flaps (p < 0.05). Comparison of facial flaps with cervical flaps revealed that cervical skin, SMAS, and composite flaps tolerated significantly greater tissue tearing forces and demonstrated significantly greater tissue creep as compared with facial skin, SMAS, and composite flaps (p < 0.05). These biomechanical studies on facial and cervical rhytidectomy flaps indicate that the skin and composite flaps are substantially stronger than the SMAS flap, allowing significantly greater tension to be applied for repositioning of the flap and surrounding subcutaneous tissues. The authors confirmed that the SMAS layer exhibits significantly less stress-relaxation and creep as compared with the skin flap, a property that has led aesthetic surgeons to incorporate the SMAS into the face lift procedure. 
On the basis of the authors' findings in this study, it seems that the composite flap, although composed of both the skin and SMAS, acquires the viscoelastic properties of the SMAS layer, demonstrating significantly less stress-relaxation and tissue creep as compared with the skin flap. This finding may play a role in maintaining long-term results after rhytidectomy. In addition, it is noteworthy that the cervical flaps, despite their increased strength, demonstrate significantly greater tissue creep as compared with facial flaps, suggesting earlier relaxation of the neck as compared with the face after rhytidectomy.
The theme of solitude in the narrative of Soledad Puértolas
This text studies the novels and short stories of post-Franco Spanish writer, Soledad Puértolas, examining the dominant and unifying theme of solitude and loneliness. Literal and visual correspondences are established with the "realistic" paintings of Edward Hopper and other contemporary artists. Puértolas's fiction exposes the social and moral ills of her country and of all men confronting the solitude of their lives at the end of the twentieth century; indifference and the lack of communication are constant themes, conveyed with a style that is often lyrical.
Oral acyclovir prophylaxis against herpes simplex virus in non-Hodgkin lymphoma and acute lymphoblastic leukaemia patients receiving remission induction chemotherapy. A randomised double blind, placebo controlled trial.
Forty-one patients receiving remission induction chemotherapy with vincristine, adriamycin and prednisolone (VAP) for high grade lymphoma or acute lymphoblastic leukaemia were entered into a double blind, placebo controlled trial of oral acyclovir prophylaxis against herpes simplex virus (HSV) infection. The dose of acyclovir was 200 mg four times daily for the duration of chemotherapy (six weeks). Of the 40 evaluable patients, 20 were randomised to each arm. Prophylactic oral acyclovir significantly reduced the incidence of clinical HSV infection from 60% on placebo to 5% on acyclovir (P less than 0.001), and the incidence of viral isolates from 70% on placebo to 5% on acyclovir (P less than 0.001).
MRTouch: Adding Touch Input to Head-Mounted Mixed Reality
We present MRTouch, a novel multitouch input solution for head-mounted mixed reality systems. Our system enables users to reach out and directly manipulate virtual interfaces affixed to surfaces in their environment, as though they were touchscreens. Touch input offers precise, tactile and comfortable user input, and naturally complements existing popular modalities, such as voice and hand gesture. Our research prototype combines both depth and infrared camera streams together with real-time detection and tracking of surface planes to enable robust finger-tracking even when both the hand and head are in motion. Our technique is implemented on a commercial Microsoft HoloLens without requiring any additional hardware or any user or environmental calibration. Through our performance evaluation, we demonstrate high input accuracy with an average positional error of 5.4 mm and 95% button size of 16 mm, across 17 participants, 2 surface orientations and 4 surface materials. Finally, we demonstrate the potential of our technique to enable on-world touch interactions through 5 example applications.
Feral Cats and Biodiversity Conservation: The Urgent Prioritization of Island Management
Islands harbor a disproportionate amount of Earth's biodiversity and are characterized by the presence of a great number of endemic plant and animal species (MacArthur and Wilson 1967, Carlquist 1974, Myers et al. 2000, Kier et al. 2009). Invasive predator species, particularly mammals, are one of the primary extinction drivers on islands (Groombridge and Jenkins 2000, Courchamp et al. 2003, Blackburn et al. 2004). Reviews of the impact of mongooses (Herpestes spp.; Hays and Conant 2007), rats (Rattus spp.; Towns et al. 2006, Jones et al. 2008), and feral cats (Felis silvestris catus; Medina et al. 2011) on islands all note significant impacts on native mammals, birds, and reptiles. Since domestication from the African wildcat (Felis silvestris lybica) some 9000 years ago (Driscoll et al. 2007), the domestic cat (figure 1) has established feral populations on many of the world's islands, even in the most remote oceanic archipelagoes (Ebenhard 1988, Courchamp et al. 2003, Hilton and Cuthbert 2010). Feral cats are usually a superpredator in the trophic network of islands (Fitzgerald 1988, Courchamp et al. 1999). This generalist and opportunistic predator has a strong and direct effect on a great variety of native prey, including birds, mammals, reptiles, and invertebrates (for a review, see Bonnaud et al. 2011 and the references therein). Native mammalian carnivores are usually rare on islands because of their low dispersal ability over sea (except bats). Because island vertebrates are often not adapted to coexist with mammalian carnivores (Stone et al. 1994), introduced mammals on islands can have severe impacts on native populations. Introduced mammals (rodents and lagomorphs) are often the most common prey on the islands where feral cats are present; however, when they are available, other native vertebrates (mostly birds and reptiles) are important components of feral cats' diet on islands (Bonnaud et al. 2011). The presence of alternative, abundant, year-round prey can facilitate the survival of and sustain large feral cat populations that can have an exacerbated impact on native species through a superpredator effect (Courchamp et al. 2000). Therefore, even if native species are a lesser component of feral cat diet on islands, presumably because of the lower relative or absolute abundance of native species, and when introduced rodents and lagomorphs are present, feral cats still represent a threat to native island species (Nogales et al. 2004). A meta-analysis of 72 diet studies (based on scat, gut, and stomach contents) revealed that at least 248 species were preyed on by feral cats on 40 worldwide islands (27 mammals, 113 birds, 34 reptiles, 3 amphibians, 2 fish, and 69 invertebrates; for more detail, see Bonnaud et al. 2011). Impacts of feral cats on endangered species have primarily been inferred from dietary studies (Fitzgerald 1988, Fitzgerald and Turner 2000).
Eagle's syndrome - A case report and review of the literature.
Eagle's syndrome (ES) occurs when an elongated styloid process or calcified stylohyoid ligament causes recurrent throat pain or foreign body sensation, dysphagia, or facial pain. Additional symptoms may include neck or throat pain with radiation to the ipsilateral ear. The symptoms related to this condition can be confused with those attributed to a wide variety of facial neuralgias. ES can be diagnosed radiologically and by physical examination. The treatment of ES is primarily surgical. The styloid process can be shortened through an intraoral or external approach. In this paper a case of ES exhibiting unilateral symptoms with bilateral elongation of the styloid process is reported and the literature is reviewed.
Study of correlation between wall shear stress and elasticity in atherosclerotic carotid arteries
OBJECTIVE This paper presents the use of the texture matching method to measure the carotid artery elasticity values of experimental and control rabbits. Experimental rabbits, shown by pathological histology to be in the fatty-streak and fibrous-plaque stages of carotid atherosclerosis, are compared with the control group. METHODS We used an ultrasound linear array probe to scan the rabbit carotid arteries. This allows us to obtain the wall shear stress (WSS) and the elasticity values in the atherosclerotic arteries. Using statistical analysis, we are able to clarify whether the texture matching method can diagnose atherosclerosis at the early stage. We also analyze the rabbit carotid artery elasticity and WSS values to determine whether the two are correlated. Combining the texture matching method with quantitative WSS analysis in the future could enable better prediction of the occurrence and development of atherosclerosis using noninvasive medical imaging techniques. RESULTS This study has confirmed that from the 2nd to the 10th week, with the development of atherosclerosis, the reduction in arterial WSS is negatively correlated with the increase in artery wall elasticity values, which means that as the arterial WSS decreases the arterial wall becomes less elastic. Correlating shear stress with atherosclerosis clarifies that WSS can be used as one of the effective parameters for early diagnosis of atherosclerosis. CONCLUSION In summary, we have found that the elasticity value can reflect the degree of atherosclerosis more objectively. Therefore, using noninvasive imaging, quantitative analysis of shear stress combined with the texture matching method can assist in the early diagnosis of atherosclerosis.
A model for high school computer science education: the four key elements that make it!
This paper presents a model program for high school computer science education. It is based on an analysis of the structure of the Israeli high school computer science curriculum considered to be one of the leading curricula worldwide. The model consists of four key elements as well as interconnections between these elements. It is proposed that such a model be considered and/or adapted when a country wishes to implement a nation-wide program for high school computer science education.
Language, mind and brain
Language serves as a cornerstone of human cognition. However, our knowledge about its neural basis is still a matter of debate, partly because ‘language’ is often ill-defined. Rather than equating language with ‘speech’ or ‘communication’, we propose that language is best described as a biologically determined computational cognitive mechanism that yields an unbounded array of hierarchically structured expressions. The results of recent brain imaging studies are consistent with this view of language as an autonomous cognitive mechanism, leading to a view of its neural organization, whereby language involves dynamic interactions of syntactic and semantic aspects represented in neural networks that connect the inferior frontal and superior temporal cortices functionally and structurally. Friederici et al. outline a view of the neural organization of language that is compatible with a description of language as a biologically determined computational mechanism that yields an infinite number of hierarchically structured expressions.
Lack of efficacy of the selective iNOS inhibitor GW274150 in prophylaxis of migraine headache.
INTRODUCTION This study investigated the efficacy and tolerability of the highly selective iNOS inhibitor GW274150 in prophylaxis of migraine headache. SUBJECTS AND METHODS The study was conducted in two parts, each comprising a 4-week baseline period, a 12-week, double-blind, parallel-group treatment period, and a 4-week follow-up period. The study had an adaptive design in that findings of Part 1 of the study were used to inform the conduct of Part 2. Following an interim analysis at the end of Part 1, the trial could be stopped for futility or continued in Part 2 to study the full-dose response or to increase sample size in case initial assumptions had been violated. The primary end-point in both parts of the study was the probability of the occurrence of a migraine headache day during the baseline period and the treatment period. RESULTS In Part 1, adult male and female patients with migraine received GW274150 60 mg (n = 37), 120 mg (n = 37), or placebo (n = 38) once daily for 12 weeks. In Part 2, female patients with migraine received GW274150 60 mg (n= 160) or placebo (n = 154) once daily for 12 weeks. GW274150 was no more effective than placebo for the primary efficacy end-point or any secondary efficacy end-point in Part 1 or Part 2. GW274150 was generally well tolerated. CONCLUSIONS GW274150 at doses predicted to inhibit iNOS >80% did not differ from placebo in the prophylaxis of migraine. The results do not support a role of iNOS inhibition in migraine prevention.
How To Rigorously Develop Process Theory Using Case Research
Dynamic phenomena are key concerns of IS researchers. However, the methodological approaches usually selected to investigate IS phenomena often rely on variance theory. Underlying factor models represent a rather static approach to the phenomenon by focusing on independent and dependent variables and explaining the degree of the relationships between them. Process theory has been suggested to overcome this problem. Process theory provides a complementary, dynamic perspective on IS phenomena by explaining how independent and dependent variables are linked in terms of event sequences. Although applying both approaches provides more complete pictures of IS phenomena, a lack of research methods focusing on process theories may hinder this goal. This article seeks to help close this gap by examining how case research can be applied to develop process theory. We analyzed IS case research as well as process research literature and consolidated inputs from both sources toward a single methodology. The results highlight that the development of process theory benefits from a consistent methodology and quality measures that have been suggested in general case research. However, we also found that each step requires specific consideration of process theory characteristics in order to develop rigorous process theory.
Radiofrequency ablation and vertebral augmentation for palliation of painful spinal metastases
Radiofrequency ablation (RFA) and vertebral augmentation is an emerging combination therapy for painful osseous metastases that cannot be or are incompletely palliated with radiation therapy. Herein, we report our experience performing RFA and vertebral augmentation of spinal metastases for pain palliation. Institutional review board approval was obtained to retrospectively review our tumor ablation database for all patients who underwent RFA of osseous metastases between April 2012 and July 2014. Patient demographics, lesion characteristics, concurrent palliative therapies, and complications were recorded. Pre- and post-procedure mean worst pain scores 1 and 4 weeks after treatment were measured using the Numeric Rating Scale (10-point scale) and compared. During the study period, 72 RFA treatments of 110 spinal metastases were performed. Eighty one percent (89/110) of metastases involved the posterior vertebral body and 45 % (49/110) involved the pedicles. Vertebral augmentation was performed after 95 % (105/110) of ablations. Mean and median pre-procedure pain scores were 8.0 ± 1.9 and 8.0, respectively. Patients reported clinically significant decreased pain scores at both 1-week (mean, 3.9 ± 3.0; median, 3.25; P < 0.0001) and 4-week (mean, 2.9 ± 3.0; median, 2.75; P < 0.0001) follow-up. No major complications occurred related to RFA and there were no instances of symptomatic cement extravasation. Combination RFA and vertebral augmentation is a safe and effective therapy for palliation of painful spinal metastases, including tumor involving the posterior vertebral body and/or pedicles.
Extension of Virtual-Signal-Injection-Based MTPA Control for Interior Permanent-Magnet Synchronous Machine Drives Into the Field-Weakening Region
This paper presents a field-weakening control scheme to expand the speed operating region of the recently reported virtual signal injection control (VSIC) method for interior permanent-magnet synchronous machine (IPMSM) drives. Because of voltage saturation, the VSIC for IPMSM drives is not effective in the field-weakening region. A new control scheme is developed to guarantee that the torque can be controlled with minimum current amplitude. The proposed method realizes fast dynamic response and efficient operation of IPMSM drives in both constant torque and field-weakening regions by controlling the d-axis current through virtual signal injection and detection of the voltage saturation. The proposed method can track maximum torque per ampere (MTPA) points in the constant torque region and voltage-constrained MTPA points in the field-weakening region accurately without prior knowledge of accurate machine parameters. The proposed control method is demonstrated by both simulations and experiments under various operating conditions on a prototype IPMSM drive system.
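For reference, the standard (textbook) IPMSM torque expression and the MTPA condition it induces are shown below; these are the usual forms rather than equations taken from the paper:

```latex
% Standard IPMSM torque (p = pole pairs, \psi_m = magnet flux,
% L_d, L_q = d- and q-axis inductances):
T_e = \frac{3}{2}\, p \left[ \psi_m i_q + (L_d - L_q)\, i_d i_q \right]
% Maximizing T_e at fixed current magnitude I_s (i_d^2 + i_q^2 = I_s^2)
% yields the MTPA d-axis current (for L_q > L_d):
i_d = \frac{\psi_m}{2(L_q - L_d)} - \sqrt{\frac{\psi_m^2}{4(L_q - L_d)^2} + i_q^2}
```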
Postfatigue potentiation of the paralyzed soleus muscle: evidence for adaptation with long-term electrical stimulation training.
Understanding the torque output behavior of paralyzed muscle has important implications for the use of functional neuromuscular electrical stimulation systems. Postfatigue potentiation is an augmentation of peak muscle torque during repetitive activation after a fatigue protocol. The purposes of this study were 1) to quantify postfatigue potentiation in the acutely and chronically paralyzed soleus and 2) to determine the effect of long-term soleus electrical stimulation training on the potentiation characteristics of recently paralyzed soleus muscle. Five subjects with chronic paralysis (>2 yr) demonstrated significant postfatigue potentiation during a repetitive soleus activation protocol that induced low-frequency fatigue. Ten subjects with acute paralysis (<6 mo) demonstrated no torque potentiation in response to repetitive stimulation. Seven of these acute subjects completed 2 yr of home-based isometric soleus electrical stimulation training of one limb (compliance = 83%; 8,300 contractions/wk). With the early implementation of electrically stimulated training, potentiation characteristics of trained soleus muscles were preserved as in the acute postinjury state. In contrast, untrained limbs showed marked postfatigue potentiation at 2 yr after spinal cord injury (SCI). A single acute SCI subject who was followed longitudinally developed potentiation characteristics very similar to the untrained limbs of the training subjects. The results of the present investigation support that postfatigue potentiation is a characteristic of fast-fatigable muscle and can be prevented by timely neuromuscular electrical stimulation training. Potentiation is an important consideration in the design of functional electrical stimulation control systems for people with SCI.
Fabrication of fine grained molybdenum by fast resistance sintering under ultra-high pressure
A novel fast resistance sintering method has been developed to fabricate pure Mo bulk material with fine grain size and high sinter density. A pure Mo compact with more than 98.5% theoretical density and grain size less than 2 μm has been successfully fabricated without any sintering additive or grain growth inhibitor under an ultra-high pressure of 9000 MPa and a sintering temperature of 1300 °C for only 60 s. The relationships between microstructure, basic mechanical properties and processing parameters, including pressure, input power and sintering time, were investigated.
Syntactic parsing of chat language in contact center conversation corpus
Chat language is often referred to as computer-mediated communication (CMC). Most of the previous studies on chat language have been dedicated to collecting "chat room" data, as it is the kind of data that is most accessible on the Web. This kind of data falls under the informal register, whereas in this paper we are interested in understanding the mechanisms of a more formal kind of CMC: dialog chat in contact centers. The particularities of this type of dialog and the type of language used by customers and agents are the focus of this paper, towards understanding this new kind of CMC data. The challenges in processing chat data come from the fact that Natural Language Processing tools such as syntactic parsers and part-of-speech taggers are typically trained on mismatched conditions; we describe in this study the impact of such a mismatch on a syntactic parsing task.
The experiences of parents with infants in Neonatal Intensive Care Unit
BACKGROUND In recent years significant medical science advances have been made in the field of midwifery and infant care. Premature, low-birth-weight and ill infants are admitted to the technologically advanced NICU for care and they often require long-term stays. This study addresses parental experiences with infant care in the NICU, explores their concerns regarding nursing support for parents and offers nurses' perspectives on performing their duties. MATERIALS AND METHODS A qualitative inductive content analysis method was applied in 2011 that included a purposely selected group of parents, nurses and physicians from the neonatal unit at the Medical Science University of Isfahan. Participants were surveyed and interviewed according to the institutional ethics committee approval and signed informed consents. RESULTS The content analysis identified two main categories: 1) the definition of stress, which consisted of misgivings, nervous pressure, imbalance, separation and 2) the parents' reaction to stress, which revealed emotional, psychotic and behavioral reactions as subcategories. DISCUSSION The medical team's awareness of NICU parent experiences is essential to the quality of care. Recognizing the type of parents' reaction to the whole process by the healthcare team seems essential to the optimum outcome.
Modelling of a special class of spherical parallel manipulators with Euler parameters
A method of workspace modelling for spherical parallel manipulators (SPMs) of symmetrical architecture is developed by virtue of Euler parameters in the paper. The adoption of Euler parameters in the expression of spatial rotations of SPMs helps not only to eliminate the possible singularity in the rotation matrix, but also to formulate all equations in polynomials, which are more easily manipulated. Moreover, a homogeneous workspace can be obtained with Euler parameters for the SPMs, which facilitates the evaluation of dexterity. In this work, the problem of workspace modelling and analysis is formulated in terms of Euler parameters. An equation dealing with boundary surfaces is derived and branches of boundary surface are identified. Evaluation of dexterity is explored to quantitatively describe the capability of a manipulator to attain orientations. The singularity identification is also addressed. Examples are included to demonstrate the application of the proposed method.
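For reference, the standard singularity-free rotation matrix expressed in Euler parameters (a unit quaternion) is polynomial in the parameters, which is the property the method exploits:

```latex
% Rotation matrix in Euler parameters (e_0, e_1, e_2, e_3)
% with e_0^2 + e_1^2 + e_2^2 + e_3^2 = 1:
R = \begin{bmatrix}
e_0^2 + e_1^2 - e_2^2 - e_3^2 & 2(e_1 e_2 - e_0 e_3) & 2(e_1 e_3 + e_0 e_2) \\
2(e_1 e_2 + e_0 e_3) & e_0^2 - e_1^2 + e_2^2 - e_3^2 & 2(e_2 e_3 - e_0 e_1) \\
2(e_1 e_3 - e_0 e_2) & 2(e_2 e_3 + e_0 e_1) & e_0^2 - e_1^2 - e_2^2 + e_3^2
\end{bmatrix}
```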
Web Pontoon: a method for reflective web applications
Network-based provisioning of custom-made and adaptive services offers unlimited opportunities for service development. Examples include ICT-based information, assistance, coordination, and remote monitoring services for senior citizens. Addressing diversity and unpredictable changeability requirements of such service platforms entails novel design solutions. I present Web Pontoon, a method tailored specifically for handling these requirements by a combination of web content management, client-side end-user programming, closed-loop management of object lifecycles, and domain-driven design. Opportunities for massive deployment of relevant applications are being studied.
Effectiveness of Blog Advertising: Impact of Communicator Expertise, Advertising Intent, and Product Involvement
Blog advertising, which refers to the paid sponsorship of bloggers to review, promote, or sell products in their blog writing, is becoming prevalent. This paper investigates the impact of three critical factors on blog advertising effectiveness: communicator expertise, advertising intent, and product involvement. An experiment with a 2×2×2 factorial design was used to test their interaction effects on advertising effectiveness. The results indicate that, for low-involvement products, there is better advertising effectiveness when low-expertise communicators are explicit about the advertising intent or when high-expertise communicators are implicit about the advertising intent. But for high-involvement products, the results show that when low-expertise communicators are explicit about the advertising intent, the outcome is lower advertising effectiveness. For such products, advertising effectiveness does not differ when high-expertise communicators are implicit or explicit about the advertising intent. Based on these results, some implications for further research and practice are given.
Temperature sensitivity of soil carbon decomposition and feedbacks to climate change
Significantly more carbon is stored in the world's soils—including peatlands, wetlands and permafrost—than is present in the atmosphere. Disagreement exists, however, regarding the effects of climate change on global soil carbon stocks. If carbon stored belowground is transferred to the atmosphere by a warming-induced acceleration of its decomposition, a positive feedback to climate change would occur. Conversely, if increases of plant-derived carbon inputs to soils exceed increases in decomposition, the feedback would be negative. Despite much research, a consensus has not yet emerged on the temperature sensitivity of soil carbon decomposition. Unravelling the feedback effect is particularly difficult, because the diverse soil organic compounds exhibit a wide range of kinetic properties, which determine the intrinsic temperature sensitivity of their decomposition. Moreover, several environmental constraints obscure the intrinsic temperature sensitivity of substrate decomposition, causing lower observed ‘apparent’ temperature sensitivity, and these constraints may, themselves, be sensitive to climate.
A novel LLC resonant controller with best-in-class transient performance and low standby power consumption
This paper introduces a novel LLC resonant controller with better light-load efficiency and best-in-class transient performance. A new control algorithm, hybrid hysteretic control, is proposed that combines the benefits of charge control and frequency control. It maintains the good transient performance of charge control, but avoids the related stability issues by adding slope compensation. The slope compensation also helps to sense the resonant capacitor voltage using only a lossless capacitor divider. Burst mode operation is developed to improve the light-load efficiency. By turning on the power devices for only a short period during light load, it helps to reduce the equivalent switching frequency and achieve high efficiency. The effectiveness of the proposed controller has been validated in a 12 V/120 W half-bridge LLC converter, as well as a commercial power supply unit.
A novel clustering oriented closeness measure based on neighborhood chain
Closeness measures are crucial to clustering methods. In most traditional clustering methods, the closeness between data points or clusters is measured by geometric distances alone. These metrics quantify the closeness only based on the concerned data points' positions in the feature space, and they might cause problems when dealing with clustering tasks with arbitrary cluster shapes and different cluster scales (varying cluster densities). In this paper, a novel Closeness Measure between data points based on Neighborhood Chain (CMNC) is proposed. Instead of using geometric distances alone, CMNC measures the closeness between data points by quantifying the difficulty for one data point to reach another through a chain of neighbors. Experimental results show that by substituting the geometric-distance-based closeness measures with CMNC, modified versions of the traditional clustering algorithms (e.g. k-means, single-link and CURE) perform much better than their original versions, especially when dealing with clustering tasks with clusters having arbitrary shapes and different scales.
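A simplified illustration of the neighborhood-chain idea (not the paper's exact CMNC definition): closeness can be probed by how many hops are needed to reach one point from another through successive k-nearest neighbors, so points in the same dense cluster are close even when geometrically far apart:

```python
# BFS over a k-nearest-neighbor graph as a toy chain-of-neighbors closeness.
import numpy as np
from collections import deque

def knn_hops(X: np.ndarray, i: int, j: int, k: int = 2) -> float:
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    nbrs = np.argsort(d, axis=1)[:, 1:k + 1]          # k nearest neighbors of each point
    seen, queue = {i}, deque([(i, 0)])
    while queue:                                       # BFS over the kNN graph
        node, hops = queue.popleft()
        if node == j:
            return hops
        for m in nbrs[node]:
            if m not in seen:
                seen.add(m)
                queue.append((int(m), hops + 1))
    return float("inf")                                # unreachable: a different cluster

X = np.array([[0, 0], [0.1, 0], [0.2, 0], [5, 5], [5.1, 5]])
print(knn_hops(X, 0, 2), knn_hops(X, 0, 3))           # few hops vs. unreachable
```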
A Typology of Technology-Enhanced Tourism Experiences
Experiences constitute the essence of the tourism industry. While literature has recognised the recent impact of technology on experiences, its empirical exploration remains scarce. This study addresses the gap by empirically exploring five leading industry cases to generate a holistic understanding of technology enhanced tourism experiences. The main contribution of this paper lies in the development of a nine-field experience typology matrix based on the increasing intensity of cocreation and technology implementation. The final contribution of this study is the development of an experience hierarchy and discussing its relevance for experience enhancement in tourism research and practice.
Lifelong Generative Modeling
Lifelong learning is the problem of learning multiple consecutive tasks in an online manner and is essential towards the development of intelligent machines that can adapt to their surroundings. In this work we focus on learning a lifelong approach to generative modeling whereby we continuously incorporate newly observed distributions into our model representation. We utilize two models, aptly named the student and the teacher, in order to aggregate information about all past distributions without the preservation of any of the past data or previous models. The teacher is utilized as a form of compressed memory in order to allow for the student model to learn over the past as well as present data. We demonstrate why a naive approach to lifelong generative modeling fails and introduce a regularizer with which we demonstrate learning across a long range of distributions.
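A conceptual sketch of the student-teacher arrangement (the model, loss choices, and names below are assumptions, not the paper's exact objective): the frozen teacher acts as compressed memory replaying past distributions, and the student fits new data plus a distillation term on the replayed samples:

```python
# Student-teacher distillation for lifelong generative modeling, sketched
# with plain reconstruction losses for brevity.
import torch
import torch.nn.functional as F

def student_loss(student, teacher, new_batch, replay_batch, lam=1.0):
    recon = F.mse_loss(student(new_batch), new_batch)    # current-task term
    with torch.no_grad():
        target = teacher(replay_batch)                   # compressed memory of the past
    distill = F.mse_loss(student(replay_batch), target)  # stay close to past behavior
    return recon + lam * distill

# Toy usage with linear autoencoders standing in for the generative models.
student = torch.nn.Sequential(torch.nn.Linear(4, 2), torch.nn.Linear(2, 4))
teacher = torch.nn.Sequential(torch.nn.Linear(4, 2), torch.nn.Linear(2, 4))
x_new, x_replay = torch.randn(8, 4), torch.randn(8, 4)
print(student_loss(student, teacher, x_new, x_replay).item())
```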
Asset Pricing When 'This Time is Different'
Recent evidence suggests that younger people update beliefs in response to aggregate shocks more than older people. We embed this generational learning bias in an equilibrium model in which agents have recursive preferences and are uncertain about exogenous aggregate dynamics. The departure from rational expectations is statistically modest, but generates high average risk premiums varying at generational frequencies, a positive relation between past returns and agents’ future return forecasts, and substantial and persistent over- and undervaluation. Consistent with the model, the price-dividend ratio is empirically more sensitive to macroeconomic shocks when the fraction of young in the population is higher.
Distinctive patterns on CT angiography characterize acute internal carotid artery occlusion subtypes
Noninvasive computed tomography angiography (CTA) is widely used in acute ischemic stroke, even for diagnosing various internal carotid artery (ICA) occlusion sites, which often need cerebral digital subtraction angiography (DSA) confirmation. We evaluated whether clinical outcomes vary depending on the DSA-based occlusion sites and explored correlating features on baseline CTA that predict DSA-based occlusion site. We analyzed consecutive patients with acute ICA occlusion who underwent DSA and CTA. Occlusion site was classified into cervical, cavernous, petrous, and carotid terminus segments by DSA confirmation. Clinical and radiological features associated with poor outcome at 3 months (3-6 of modified Rankin scale) were analyzed. Baseline CTA findings were categorized according to carotid occlusive shape (stump, spearhead, and streak), presence of cervical calcification, Willisian occlusive patterns (T-type, L-type, and I-type), and status of leptomeningeal collaterals (LMC). We identified 49 patients with occlusions in the cervical (n = 17), cavernous (n = 22), and carotid terminus (n = 10) portions: initial NIH Stroke Scale (11.4 ± 4.2 vs 16.1 ± 3.7 vs 18.2 ± 5.1; P < 0.001), stroke volume (27.9 ± 29.6 vs 127.4 ± 112.6 vs 260.3 ± 151.8 mL; P < 0.001), and poor outcome (23.5 vs 77.3 vs 90.0%; P < 0.001). Cervical portion occlusion was characterized as rounded stump (82.4%) with calcification (52.9%) and fair LMC (94.1%); cavernous as spearhead occlusion (68.2%) with fair LMC (86.3%) and no calcification (95.5%); and terminus as streak-like occlusive pattern (60.0%) with poor LMC (60.0%), and no calcification (100%) on CTA. Our study indicates that acute ICA occlusion can be subtyped into cervical, cavernous, and terminus. Distinctive findings on initial CTA can help differentiate ICA-occlusion subtypes with specific characteristics.
The natural antioxidant alpha-lipoic acid induces p27(Kip1)-dependent cell cycle arrest and apoptosis in MCF-7 human breast cancer cells.
Unlike normal cells, tumor cells survive in a specific redox environment where the elevated reactive oxygen species contribute to enhance cell proliferation and to suppress apoptosis. Alpha-lipoic acid, a naturally occurring reactive oxygen species scavenger, has been shown to possess anticancer activity, due to its ability to suppress proliferation and to induce apoptosis in different cancer cell lines. Since at the moment little information is available regarding the potential effects of alpha-lipoic acid on breast cancer, in the present study we addressed the question whether alpha-lipoic acid induces cell cycle arrest and apoptosis in the human breast cancer cell line MCF-7. Moreover, we investigated some molecular mechanisms which mediate alpha-lipoic acid actions, focusing on the role of the PI3-K/Akt signalling pathway. We observed that alpha-lipoic acid is able to scavenge reactive oxygen species in MCF-7 cells and that the reduction of reactive oxygen species is followed by cell growth arrest in the G1 phase of the cell cycle, via the specific inhibition of Akt pathway and the up-regulation of the cyclin-dependent kinase inhibitor p27(kip1), and by apoptosis, via changes of the ratio of the apoptotic-related protein Bax/Bcl-2. Thus, the anti-tumor activity of alpha-lipoic acid observed in MCF-7 cells further stresses the role of redox state in regulating cancer initiation and progression.
Simple Open Stance Classification for Rumour Analysis
Stance classification determines the attitude, or stance, in a (typically short) text. The task has powerful applications, such as the detection of fake news or the automatic extraction of attitudes toward entities or events in the media. This paper describes a surprisingly simple and efficient classification approach to open stance classification in Twitter, for rumour and veracity classification. The approach profits from a novel set of automatically identifiable problem-specific features, which significantly boost classifier accuracy and achieve above state-of-the-art results on recent benchmark datasets. This calls into question the value of using complex sophisticated models for stance classification without first doing informed feature extraction.
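A hedged sketch of feature-based stance classification of this general kind; the cue features below are generic illustrations, not the paper's exact feature set:

```python
# Hand-crafted cue features plus a linear classifier for tweet stance.
import re
from sklearn.linear_model import LogisticRegression

def features(tweet: str):
    return [
        tweet.count("?"),                                                # querying cue
        tweet.count("!"),
        1 if re.search(r"\b(fake|false|hoax)\b", tweet.lower()) else 0,  # denial cue
        1 if "http" in tweet else 0,                                     # evidence link
        len(tweet.split()),
    ]

X = [features(t) for t in ["This is a hoax!", "Confirmed, see http://t.co/x", "Really??"]]
y = ["deny", "support", "query"]
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([features("Is this actually true?")]))
```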
M-SAC-VLADNet: A Multi-Path Deep Feature Coding Model for Visual Classification
Vector of locally aggregated descriptor (VLAD) coding has become an efficient feature coding model for retrieval and classification. In some recent works, the VLAD coding method is extended to a deep feature coding model which is called NetVLAD. NetVLAD improves significantly over the original VLAD method. Although the NetVLAD model has shown its potential for retrieval and classification, the discriminative ability is not fully researched. In this paper, we propose a new end-to-end feature coding network which is more discriminative than the NetVLAD model. First, we propose a sparsely-adaptive and covariance VLAD model. Next, we derive the back propagation models of all the proposed layers and extend the proposed feature coding model to an end-to-end neural network. Finally, we construct a multi-path feature coding network which aggregates multiple newly-designed feature coding networks for visual classification. Some experimental results show that our feature coding network is very effective for visual classification.
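For reference, classical (non-deep) VLAD coding, on which the deep and covariance extensions build, can be written in a few lines of numpy:

```python
# Classical VLAD: assign each descriptor to its nearest codeword, aggregate
# residuals per codeword, then power- and L2-normalize the concatenation.
import numpy as np

def vlad_encode(X: np.ndarray, C: np.ndarray) -> np.ndarray:
    """X: (n, d) local descriptors; C: (K, d) codebook centers."""
    assign = np.argmin(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
    V = np.zeros_like(C)
    for k in range(len(C)):
        if np.any(assign == k):
            V[k] = (X[assign == k] - C[k]).sum(axis=0)    # aggregate residuals
    v = V.ravel()
    v = np.sign(v) * np.sqrt(np.abs(v))                    # power normalization
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

rng = np.random.default_rng(0)
print(vlad_encode(rng.normal(size=(100, 8)), rng.normal(size=(16, 8))).shape)  # (128,)
```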
Training Support Vector Machines: an Application to Face Detection
We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Function classifiers. The decision surfaces are found by solving a linearly constrained quadratic programming problem. This optimization problem is challenging because the quadratic form is completely dense and the memory requirements grow with the square of the number of data points. We present a decomposition algorithm that guarantees global optimality, and can be used to train SVMs over very large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions which are used both to generate improved iterative values, and also to establish the stopping criteria for the algorithm. We present experimental results of our implementation of SVM, and demonstrate the feasibility of our approach on a face detection problem that involves a data set of 50,000 data points.
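The dense quadratic program being decomposed is the standard soft-margin SVM dual (shown in the usual notation, not copied from the paper):

```latex
% Soft-margin SVM dual over \ell training points (x_i, y_i), kernel K,
% regularization parameter C:
\max_{\alpha} \;\; \sum_{i=1}^{\ell} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{\ell} \sum_{j=1}^{\ell}
    \alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
\quad \text{s.t.} \quad
0 \le \alpha_i \le C, \qquad \sum_{i=1}^{\ell} \alpha_i y_i = 0 .
```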
District mental healthcare plans for five low- and middle-income countries: commonalities, variations and evidence gaps
BACKGROUND Little is known about the service and system interventions required for successful integration of mental healthcare into primary care across diverse low- and middle-income countries (LMIC). AIMS To examine the commonalities, variations and evidence gaps in district-level mental healthcare plans (MHCPs) developed in Ethiopia, India, Nepal, Uganda and South Africa for the PRogramme for Improving Mental health carE (PRIME). METHOD A comparative analysis of MHCP components and human resource requirements. RESULTS A core set of MHCP goals was seen across all countries. The MHCP components to achieve those goals varied, with most similarity in countries within the same resource bracket (low income v. middle income). Human resources for advanced psychosocial interventions were only available in the existing health service in the best-resourced PRIME country. CONCLUSIONS Application of a standardised methodological approach to MHCP across five LMIC allowed identification of core and site-specific interventions needed for implementation.
Physical activity and bronchial hyperresponsiveness: European Community Respiratory Health Survey II.
BACKGROUND Identification of the risk factors for bronchial hyperresponsiveness (BHR) would increase the understanding of the causes of asthma. The relationship between physical activity and BHR in men and women aged 28.0-56.5 years randomly selected from 24 centres in 11 countries participating in the European Community Respiratory Health Survey II was investigated. METHODS 5158 subjects answered questionnaires about physical activity and performed BHR tests. Participants were asked about the frequency and duration of usual weekly exercise resulting in breathlessness or sweating. BHR was defined as a decrease in forced expiratory volume in 1 s of at least 20% of its post-saline value for a maximum methacholine dose of 2 mg. RESULTS Both frequency and duration of physical activity were inversely related to BHR. The prevalence of BHR in subjects exercising ≤1, 2-3 and ≥4 times a week was 14.5%, 11.6% and 10.9%, respectively (p<0.001). The corresponding odds ratios were 1.00, 0.78 (95% CI 0.62 to 0.99) and 0.69 (95% CI 0.50 to 0.94) after controlling for potential confounding factors. The frequency of BHR in subjects exercising <1 h, 1-3 h and ≥4 h a week was 15.9%, 10.9% and 10.7%, respectively (p<0.001). The corresponding adjusted odds ratios were 1.00, 0.70 (95% CI 0.57 to 0.87) and 0.67 (95% CI 0.50 to 0.90). Physical activity was associated with BHR in all studied subgroups. CONCLUSIONS These results suggest that BHR is strongly and independently associated with decreased physical activity. Further studies are needed to determine the mechanisms underlying this association.
The Internet of Things - A survey of topics and trends
The Internet of Things is a paradigm where everyday objects can be equipped with identifying, sensing, networking and processing capabilities that will allow them to communicate with one another and with other devices and services over the Internet to accomplish some objective. Ultimately, IoT devices will be ubiquitous, context-aware and will enable ambient intelligence. This article reports on the current state of research on the Internet of Things by examining the literature, identifying current trends, describing challenges that threaten IoT diffusion, presenting open research questions and future directions and compiling a comprehensive reference list to assist researchers.
Mechanical Coupling of 2D Resonator Arrays for MEMS Filter Applications
This paper presents a study of mechanical coupling in 2D resonator arrays for filter applications. A robust coupling design for 2D array filters, comprised of weak coupling in one dimension and strong coupling in the second, is demonstrated experimentally and compared with weakly coupled and electrically summed 2D resonator array filters. Effects of inherent disorder in resonator arrays due to fabrication variations are minimized in this mechanical coupling scheme, averaging over resonator mismatch to form a smooth pass-band. The strongly-coupled 2D filter improves insertion loss and ripple without degradation in filter shape factor or stop-band rejection relative to its 1D counterpart.
Brain tumor MRI image classification with feature selection and extraction using linear discriminant analysis
Feature extraction is a method of capturing the visual content of an image. Feature extraction is the process of representing a raw image in reduced form to facilitate decision making such as pattern classification. We have tried to address the problem of classifying MRI brain images by creating a robust and more accurate classifier which can act as an expert assistant to medical practitioners. The objective of this paper is to present a novel method of feature selection and extraction. This approach combines intensity, texture, and shape based features and classifies the tumor as white matter, gray matter, CSF, abnormal or normal area. The experiment is performed on 140 tumor-containing brain MR images from the Internet Brain Segmentation Repository. The proposed technique has been carried out over a larger database compared with any previous work and is more robust and effective. PCA and Linear Discriminant Analysis (LDA) were applied on the training sets. The Support Vector Machine (SVM) classifier served as a comparison of nonlinear techniques vs linear ones. PCA and LDA methods are used to reduce the number of features used. The feature selection using the proposed technique is more beneficial as it analyses the data according to the grouping class variable and gives a reduced feature set with high classification accuracy.
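A hedged sklearn sketch of the pipeline the abstract describes (reduce features with PCA or LDA, then classify with an SVM); the data shapes and parameters are illustrative assumptions:

```python
# PCA- and LDA-reduced SVM pipelines on toy feature vectors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(140, 64))            # e.g. 64 intensity/texture/shape features
y = rng.integers(0, 5, size=140)          # 5 tissue classes, as in the abstract

pca_svm = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
lda_svm = make_pipeline(LinearDiscriminantAnalysis(n_components=4), SVC(kernel="linear"))
for model in (pca_svm, lda_svm):
    print(model.fit(X, y).score(X, y))    # training accuracy on toy data
```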
Active Inference and Learning in the Cerebellum
This letter offers a computational account of Pavlovian conditioning in the cerebellum based on active inference and predictive coding. Using eyeblink conditioning as a canonical paradigm, we formulate a minimal generative model that can account for spontaneous blinking, startle responses, and (delay or trace) conditioning. We then establish the face validity of the model using simulated responses to unconditioned and conditioned stimuli to reproduce the sorts of behavior that are observed empirically. The scheme’s anatomical validity is then addressed by associating variables in the predictive coding scheme with nuclei and neuronal populations to match the (extrinsic and intrinsic) connectivity of the cerebellar (eyeblink conditioning) system. Finally, we try to establish predictive validity by reproducing selective failures of delay conditioning, trace conditioning, and extinction using (simulated and reversible) focal lesions. Although rather metaphorical, the ensuing scheme can account for a remarkable range of anatomical and neurophysiological aspects of cerebellar circuitry—and the specificity of lesion-deficit mappings that have been established experimentally. From a computational perspective, this work shows how conditioning or learning can be formulated in terms of minimizing variational free energy (or maximizing Bayesian model evidence) using exactly the same principles that underlie predictive coding in perception.
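The quantity minimized throughout is variational free energy; in its standard form (not specific to this letter), for observations o, hidden states s, and approximate posterior q:

```latex
F = \mathbb{E}_{q(s)}\!\left[ \ln q(s) - \ln p(o, s) \right]
  = D_{\mathrm{KL}}\!\left[ q(s) \,\|\, p(s \mid o) \right] - \ln p(o)
% Minimizing F simultaneously improves the posterior approximation and
% maximizes (a lower bound on) the log model evidence \ln p(o).
```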
The Missing Piece in Complex Analytics: Low Latency, Scalable Model Management and Serving with Velox
To enable complex data-intensive applications such as personalized recommendations, targeted advertising, and intelligent services, the data management community has focused heavily on the design of systems to train complex models on large datasets. Unfortunately, the design of these systems largely ignores a critical component of the overall analytics process: the serving and management of models at scale. In this work, we present Velox, a new component of the Berkeley Data Analytics Stack. Velox is a data management system for facilitating the next steps in real-world, large-scale analytics pipelines: online model management, maintenance, and serving. Velox provides end-user applications and services with a low-latency, intuitive interface to models, transforming the raw statistical models currently trained using existing offline large-scale compute frameworks into full-blown, end-to-end data products capable of targeting advertisements, recommending products, and personalizing web content. To provide up-to-date results for these complex models, Velox also facilitates lightweight online model maintenance and selection (i.e., dynamic weighting). In this paper, we describe the challenges and architectural considerations required to achieve this functionality, including the abilities to span online and offline systems, to adaptively adjust model materialization strategies, and to exploit inherent statistical properties such as model error tolerance, all while operating at “Big Data” scale.
Intravenous administration of auto serum-expanded autologous mesenchymal stem cells in stroke.
Transplantation of human mesenchymal stem cells has been shown to reduce infarct size and improve functional outcome in animal models of stroke. Here, we report a study designed to assess feasibility and safety of transplantation of autologous human mesenchymal stem cells expanded in autologous human serum in stroke patients. We report an unblinded study on 12 patients with ischaemic grey matter, white matter and mixed lesions, in contrast to a prior study on autologous mesenchymal stem cells expanded in foetal calf serum that focused on grey matter lesions. Cells cultured in human serum expanded more rapidly than in foetal calf serum, reducing cell preparation time and risk of transmissible disorders such as bovine spongiform encephalomyelitis. Autologous mesenchymal stem cells were delivered intravenously 36-133 days post-stroke. All patients had magnetic resonance angiography to identify vascular lesions, and magnetic resonance imaging prior to cell infusion and at intervals up to 1 year after. Magnetic resonance perfusion-imaging and 3D-tractography were carried out in some patients. Neurological status was scored using the National Institutes of Health Stroke Scale and modified Rankin scores. We did not observe any central nervous system tumours, abnormal cell growths or neurological deterioration, and there was no evidence for venous thromboembolism, systemic malignancy or systemic infection in any of the patients following stem cell infusion. The median daily rate of National Institutes of Health Stroke Scale change was 0.36 during the first week post-infusion, compared with a median daily rate of change of 0.04 from the first day of testing to immediately before infusion. Daily rates of change in National Institutes of Health Stroke Scale scores during longer post-infusion intervals that more closely matched the interval between initial scoring and cell infusion also showed an increase following cell infusion. Mean lesion volume as assessed by magnetic resonance imaging was reduced by >20% at 1 week post-cell infusion. While we would emphasize that the current study was unblinded, did not assess overall function or relative functional importance of different types of deficits, and does not exclude placebo effects or a contribution of recovery as a result of the natural history of stroke, our observations provide evidence supporting the feasibility and safety of delivery of a relatively large dose of autologous mesenchymal human stem cells, cultured in autologous human serum, into human subjects with stroke and support the need for additional blinded, placebo-controlled studies on autologous mesenchymal human stem cell infusion in stroke.
Holographic Photolysis for Multiple Cell Stimulation in Mouse Hippocampal Slices
BACKGROUND Advanced light microscopy offers sensitive and non-invasive means to image neural activity and to control signaling with photolysable molecules and, recently, light-gated channels. These approaches require precise and yet flexible light excitation patterns. For synchronous stimulation of subsets of cells, they also require large excitation areas with millisecond and micrometric resolution. We have recently developed a new method for such optical control using phase holographic modulation of optical wave-fronts, which minimizes power loss, enables rapid switching between excitation patterns, and allows true 3D sculpting of the excitation volumes. In previous studies we used holographic photolysis to control glutamate uncaging on single neuronal cells. Here, we extend the use of holographic photolysis to the excitation of multiple neurons and of glial cells. METHODS/PRINCIPAL FINDINGS The system combines a liquid crystal device for holographic patterned photostimulation, high-resolution optical imaging (HiLo microscopy) to define the stimulated regions, and a conventional Ca(2+) imaging system to detect neural activity. By means of electrophysiological recordings and calcium imaging in acute hippocampal slices, we show that the use of excitation patterns precisely tailored to the shape of multiple neuronal somata represents a very efficient way of simultaneously exciting a group of neurons. In addition, we demonstrate that fast shaped illumination patterns also induce reliable responses in single glial cells. CONCLUSIONS/SIGNIFICANCE We show that the main advantage of holographic illumination is that it allows for efficient excitation of multiple cells with a spatiotemporal resolution unachievable with other existing approaches. Although this paper focuses on the photoactivation of caged molecules, our approach should also prove very efficient for other probes, such as light-gated channels, genetically encoded photoactivatable proteins, photoactivatable fluorescent proteins, and voltage-sensitive dyes.
Variations of the Similarity Function of TextRank for Automated Summarization
This article presents new alternatives to the similarity function of the TextRank algorithm for automated summarization of texts. We describe the generalities of the algorithm and the different functions we propose. Some of these variants achieve a significant improvement using the same metrics and dataset as the original publication.
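As a sketch of what varying the similarity function means in practice, the snippet below implements TextRank's ranking by power iteration with a pluggable similarity: the original log-normalized word-overlap measure and, as one illustrative variant, cosine similarity over term-frequency vectors. This is a generic reconstruction, not the authors' code, and the variants actually proposed in the paper may differ.

```python
import math
import numpy as np

def overlap_similarity(s1, s2):
    # Original TextRank similarity: shared words, normalized by the
    # log of the two sentence lengths to avoid favoring long sentences.
    w1, w2 = set(s1), set(s2)
    if len(w1) < 2 or len(w2) < 2:
        return 0.0
    return len(w1 & w2) / (math.log(len(w1)) + math.log(len(w2)))

def cosine_similarity(s1, s2):
    # One illustrative variant: cosine over term-frequency vectors.
    vocab = sorted(set(s1) | set(s2))
    v1 = np.array([s1.count(w) for w in vocab], dtype=float)
    v2 = np.array([s2.count(w) for w in vocab], dtype=float)
    denom = np.linalg.norm(v1) * np.linalg.norm(v2)
    return float(v1 @ v2 / denom) if denom else 0.0

def textrank(sentences, sim, d=0.85, iters=50):
    # Rank tokenized sentences by power iteration over the weighted
    # similarity graph (the PageRank recurrence TextRank is built on).
    n = len(sentences)
    W = np.array([[0.0 if i == j else sim(sentences[i], sentences[j])
                   for j in range(n)] for i in range(n)])
    sums = W.sum(axis=1, keepdims=True)
    P = np.divide(W, sums, out=np.zeros_like(W), where=sums > 0)
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * (P.T @ r)
    return r  # higher score = more central sentence
```

Calling textrank(tokenized_sentences, cosine_similarity) instead of textrank(tokenized_sentences, overlap_similarity) is the only change needed to swap variants, which is what makes the similarity function a natural axis of experimentation.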
IEEE Standard 1500 Compliance Verification for Embedded Cores
Core-based design and reuse are the two key elements of efficient system-on-chip (SoC) development. Unfortunately, they also introduce new challenges in SoC testing, such as core test reuse and the need for a common test infrastructure working with cores originating from different vendors. The IEEE 1500 Standard for Embedded Core Testing addresses these issues by proposing a flexible hardware test wrapper architecture for embedded cores, together with a core test language (CTL) used to describe the implemented wrapper functionalities. Several intellectual property providers have already announced IEEE Standard 1500 compliance in both existing and future design blocks. In this paper, we address the problem of guaranteeing the compliance of a wrapper architecture and its CTL description with the IEEE Standard 1500. This step is mandatory to fully trust the wrapper functionalities in applying test sequences to the core. We present a systematic methodology for building a verification framework for IEEE Standard 1500 compliant cores, allowing core providers and/or integrators to verify the compliance of their products (sold or purchased) with the standard.
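As a toy illustration of one layer of such checking, the sketch below performs a purely structural check that a parsed wrapper description exposes the components and Wrapper Serial Port terminals the standard makes mandatory; real compliance verification, as the paper discusses, must also verify CTL semantics and the wrapper's protocol behavior, which a checklist like this cannot capture.

```python
# Mandatory wrapper components and Wrapper Serial Port (WSP) terminals
# of IEEE Std 1500; a structural checklist only, covering a small
# slice of what full compliance verification entails.
MANDATORY_BLOCKS = {"WIR", "WBY", "WBR"}  # instruction reg., bypass, boundary reg.
MANDATORY_WSP = {"WRCK", "WRSTN", "WSI", "WSO",
                 "SelectWIR", "CaptureWR", "ShiftWR", "UpdateWR"}

def check_wrapper_structure(blocks, ports):
    """Return (compliant, missing) for a parsed wrapper description,
    given the set of instantiated blocks and declared port names."""
    missing = sorted((MANDATORY_BLOCKS - set(blocks)) |
                     (MANDATORY_WSP - set(ports)))
    return (not missing, missing)

ok, missing = check_wrapper_structure(
    blocks={"WIR", "WBY"}, ports={"WRCK", "WRSTN", "WSI", "WSO"})
print(ok, missing)  # False ['CaptureWR', 'SelectWIR', 'ShiftWR', 'UpdateWR', 'WBR']
```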
Bipartite graph reinforcement model for web image annotation
Automatic image annotation is an effective way of managing and retrieving the abundant images on the Internet. In this paper, a bipartite graph reinforcement model (BGRM) is proposed for web image annotation. Given a web image, a set of candidate annotations is extracted from its surrounding text and other textual information in the hosting web page. As this set is often incomplete, it is extended to include more potentially relevant annotations by searching and mining a large-scale image database. All candidates are modeled as a bipartite graph. A reinforcement algorithm is then performed on the bipartite graph to re-rank the candidates, and only those with the highest ranking scores are retained as the final annotations. Experimental results on real web images demonstrate the effectiveness of the proposed model.
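The reinforcement step lends itself to a compact sketch. Below, the two candidate sets are mutually reinforced through a row-normalized edge-weight matrix and blended with the initial relevance scores; this is a generic mutual-reinforcement iteration in the spirit of the abstract, not the exact BGRM update rule.

```python
import numpy as np

def _row_norm(M):
    # Normalize each row to sum to 1, leaving all-zero rows untouched.
    s = M.sum(axis=1, keepdims=True)
    return np.divide(M, s, out=np.zeros_like(M), where=s > 0)

def bipartite_rerank(init_a, init_b, W, alpha=0.6, iters=30):
    """Mutually reinforce scores across a bipartite graph.
    init_a, init_b: initial relevance scores of the two candidate sets
    W: edge weights, shape (len(init_a), len(init_b))
    alpha: how strongly to anchor on the initial scores."""
    Wa, Wb = _row_norm(W), _row_norm(W.T)
    a, b = init_a.astype(float), init_b.astype(float)
    for _ in range(iters):
        a = alpha * init_a + (1 - alpha) * (Wa @ b)   # B reinforces A
        b = alpha * init_b + (1 - alpha) * (Wb @ a)   # A reinforces B
        a, b = a / a.sum(), b / b.sum()               # keep scores comparable
    return a, b  # re-ranked scores; keep only the top-ranked as annotations
```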
Crowdsourcing of Pollution Data using Smartphones
In this paper we present our research into participatory-sensing-based solutions for the collection of data on urban pollution and nuisance. For the past two years we have been involved in the NoiseTube project, which explores a crowdsourcing approach to measuring and mapping urban noise pollution using smartphones. By involving the general public and using off-the-shelf smartphones as noise sensors, we seek to provide a low-cost way for citizens to measure their personal exposure to noise in their everyday environment and to participate in the creation of collective noise maps by sharing their geo-localized and annotated measurements with the community. We believe our work represents an interesting example of the novel mobile crowdsourcing applications enabled by ubiquitous computing systems. Furthermore, we believe the NoiseTube system, and the currently ongoing validation experiments, provide an illustrative context for some of the open challenges faced by creators of ubiquitous crowdsourcing applications and services in general, and we take the opportunity to present the insights we have gained into some of these challenges.
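For a sense of the signal processing involved in turning a phone into a noise sensor, the sketch below computes the equivalent continuous sound level from microphone samples. It assumes the samples have already been converted to sound pressure in pascals, and the per-device offset stands in for the phone-specific calibration such an app must perform; both are assumptions for illustration, not NoiseTube's implementation.

```python
import numpy as np

def leq_db(pressure_samples, calibration_offset_db=0.0, p_ref=2e-5):
    """Equivalent continuous sound level,
    L_eq = 10*log10(mean(p^2)/p_ref^2), with p_ref = 20 micropascals
    (the standard reference pressure). A geo-localized measurement
    would pair this value with a GPS fix and a timestamp."""
    mean_sq = np.mean(np.square(pressure_samples))
    return 10.0 * np.log10(mean_sq / p_ref**2) + calibration_offset_db
```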
The Emerging Role of Electronic Marketplaces on the Internet
Markets play a central role in the economy, facilitating the exchange of information, goods, services and payments. In the process, they create economic value for buyers, sellers, market intermediaries and for society at large. Recent years have seen a dramatic increase in the role of information technology in markets, both in traditional markets and in the emergence of electronic marketplaces, such as the multitude of Internet-based online auctions.
Bullying and victimization of primary school children in England and Germany: prevalence and school factors.
Differences between countries in the definitions and methodologies used to assess bullying in primary school children have precluded direct comparisons of prevalence rates and of school factors related to bullying. A total of 2377 children in England (6-year-olds/Year 2: 1072; 8-year-olds/Year 4: 1305) and 1538 in Germany (8-year-olds/Year 2) were questioned individually using an identical standard interview. In both countries the types of bullying used to victimize others were similar: boys were most often perpetrators, most bullies were also victims (bully/victims), most bullying occurred in playgrounds and the classroom, and socioeconomic status (SES) and ethnicity showed only weak associations with bullying behaviour. Major differences were found in victimization rates, with 24% of English pupils being victimized every week compared with only 8% in Germany. In contrast, fewer boys in England engaged in bullying every week (2.5-4.5%) than German boys (7.5%), while no differences were found between girls. In England, children in smaller classes were more often victimized. Directions for further study of the group of bully/victims, of schooling differences between England and Germany, and of the implications for the prevention of bullying are discussed.
Model for End-Stage Liver Disease (MELD) is better than the Child-Pugh score for predicting in-hospital mortality related to esophageal variceal bleeding.
AIM The Child-Pugh and MELD scores are good methods for predicting mortality in patients with chronic liver disease. We investigated their performance as predictors of failure to control bleeding, in-hospital overall mortality, and death related to esophageal variceal bleeding episodes. METHODS From a previously collected database, 212 cirrhotic patients with variceal bleeding admitted to our hospital were studied. The predictive capabilities of the Child-Pugh and MELD scores were compared using c statistics. RESULTS The Child-Pugh and MELD scores showed marginal capability for predicting failure to control bleeding (area under the receiver operating characteristic curve (AUROC) values were < 0.70 for both). The AUROC values of the Child-Pugh and MELD scores for predicting in-hospital overall mortality were similar: 0.809 (95% CI 0.710-0.907) and 0.88 (95% CI 0.77-0.99), respectively, with no significant difference between them (p > 0.05). The AUROC value of MELD for predicting mortality related to variceal bleeding was higher than that of the Child-Pugh score: 0.905 (95% CI 0.801-1.00) vs 0.794 (95% CI 0.676-0.913), respectively (p < 0.05). CONCLUSIONS Neither MELD nor the Child-Pugh score was effective for predicting failure to control bleeding. The Child-Pugh and MELD scores had similar capability for predicting in-hospital overall mortality. Nevertheless, MELD was significantly better than the Child-Pugh score for predicting in-hospital mortality related to variceal bleeding.
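For readers who want to reproduce this kind of comparison on their own data, the following is a generic sketch, not the study's statistical procedure: it computes the c-statistic (AUROC) of two risk scores for a binary outcome and bootstraps a confidence interval for their difference. All variable names are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def compare_auroc(y, score_a, score_b, n_boot=2000, seed=0):
    """Compare the discrimination (c-statistics) of two risk scores
    for a binary outcome y, with a bootstrap 95% CI for the
    difference AUROC(a) - AUROC(b)."""
    rng = np.random.default_rng(seed)
    y, a, b = map(np.asarray, (y, score_a, score_b))
    diffs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))      # resample with replacement
        if len(np.unique(y[idx])) < 2:
            continue                               # need both outcomes present
        diffs.append(roc_auc_score(y[idx], a[idx]) -
                     roc_auc_score(y[idx], b[idx]))
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return roc_auc_score(y, a), roc_auc_score(y, b), (lo, hi)
```

A difference interval that excludes zero corresponds to the kind of significant AUROC gap the abstract reports for bleeding-related mortality.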
Research challenges in legal-rule and QoS-aware cloud service brokerage
The ICT industry, and specifically critical sectors such as healthcare, transportation, energy and government, mandate the compliance of ICT systems and services with legislation and regulation, as well as with standards. In the era of cloud computing, this compliance management issue is exacerbated by the distributed nature of the system and by the limited control that customers have over the services. Today, the cloud industry is aware of this problem (as evidenced by the compliance programs of many cloud service providers), and the research community is addressing the many facets of the legal-rule compliance checking and quality assurance problem. Cloud service brokerage plays an important role in legislation compliance and QoS management of cloud services. In this paper we discuss our experience in designing a legal-rule and QoS-aware cloud service broker, and we explore related research issues. Specifically, we provide three main contributions to the literature: first, we describe the detailed design architecture of the legal-rule and QoS-aware broker. Second, we discuss our design choices, which rely on state-of-the-art solutions available in the literature, covering four main research areas: cloud broker service deployment, seamless cloud service migration, cloud service monitoring, and legal-rule compliance checking. Finally, from the literature review in these research areas, we identify and discuss open research challenges.
Association of CCR5 human haplogroup E with rapid HIV type 1 disease progression.
The combination of unique single nucleotide polymorphisms in the CCR5 regulatory region and in the CCR2 and CCR5 coding regions defines nine CCR5 human haplogroups (HH): HHA-HHE, HHF*1, HHF*2, HHG*1, and HHG*2. Here we examined the distribution of CCR5 HH and their association with HIV infection and disease progression in 36 HIV-seronegative and 76 HIV-seropositive whites from North America and Spain [28 rapid progressors (RP) and 48 slow progressors (SP)]. Although analyses revealed that HHE frequencies were similar between the HIV-seronegative and HIV-seropositive groups (25.0% vs. 32.2%, p > 0.05), the HHE frequency in RP was significantly higher than that in SP (48.2% vs. 22.9%, p = 0.002). Survival analysis also showed that HHE heterozygosity and homozygosity were associated with an accelerated CD4 cell count decline to less than 200 cells/microL (adjusted RH 2.44, p = 0.045; adjusted RH 3.12, p = 0.037, respectively). These data provide further evidence that CCR5 human haplogroups influence HIV-1 disease progression in HIV-infected persons.
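The reported adjusted relative hazards come from a survival analysis of time to CD4 decline. As a hedged sketch of how such an analysis can be run in Python, the snippet below fits a Cox proportional hazards model with the lifelines library; the tiny data frame and all column names are fabricated placeholders, not the study's data or exact model.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Fabricated placeholder cohort: time (years) until CD4 count drops
# below 200 cells/microL, an event indicator, copies of the HHE
# haplogroup carried (0/1/2), and one adjustment covariate.
df = pd.DataFrame({
    "years_to_decline": [2.1, 5.4, 3.3, 7.9, 1.2, 6.0, 4.4, 8.5],
    "declined":         [1,   0,   1,   0,   1,   1,   1,   0],
    "hhe_copies":       [2,   0,   1,   0,   2,   1,   1,   0],
    "age":              [31,  45,  28,  39,  52,  33,  41,  36],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_decline", event_col="declined")
cph.print_summary()  # exp(coef) of hhe_copies ~ adjusted relative hazard (RH)
```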
Millimeter wave cellular channel models for system evaluation
The huge amount of (potentially) available spectrum makes millimeter wave (mmWave) a promising candidate for fifth-generation cellular networks. Unfortunately, differences in the propagation environment as a function of frequency make it difficult to compare systems operating at mmWave and microwave frequencies. This paper presents a simple channel model for evaluating system-level performance in mmWave cellular networks. The model uses insights from measurement results showing that mmWave is sensitive to blockages, revealing very different path loss characteristics between line-of-sight (LOS) and non-line-of-sight (NLOS) links. The conventional path loss model with a single log-distance path loss function and a shadowing term is replaced with a stochastic path loss model with a distance-dependent LOS probability and two different path loss functions to account for LOS and NLOS links. The proposed model is used to compare microwave and mmWave networks in simulations. It is observed that mmWave networks can provide comparable coverage probability with a dense deployment, leading to much higher data rates thanks to the large bandwidth available in the mmWave spectrum.
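The proposed path loss structure is simple enough to state directly in code. The sketch below draws a link's LOS/NLOS state from a distance-dependent LOS probability and applies the corresponding log-distance path loss with lognormal shadowing; the numeric constants are illustrative 28 GHz-style values, not the paper's fitted parameters.

```python
import numpy as np

def path_loss_db(d, rng, beta=1 / 141.4,
                 c_los=61.4, n_los=2.0, sigma_los=5.8,
                 c_nlos=72.0, n_nlos=2.92, sigma_nlos=8.7):
    """Stochastic mmWave path loss for a link of length d meters:
    the link is LOS with probability exp(-beta*d); LOS and NLOS use
    different path loss exponents and shadowing standard deviations."""
    if rng.random() < np.exp(-beta * d):   # distance-dependent LOS probability
        return c_los + 10 * n_los * np.log10(d) + rng.normal(0, sigma_los)
    return c_nlos + 10 * n_nlos * np.log10(d) + rng.normal(0, sigma_nlos)

# Example: sample losses over a range of link distances.
rng = np.random.default_rng(1)
losses = [path_loss_db(d, rng) for d in np.linspace(10, 200, 5)]
```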
Probabilistic inference as a model of planned behavior
The problem of planning and goal-directed behavior has been addressed in computer science for many years, typically based on classical concepts like Bellman's optimality principle, dynamic programming, or Reinforcement Learning methods. But is this the only way to address the problem? Recently there has been growing interest in using probabilistic inference methods for decision making and planning. A promising aspect of such approaches is that they naturally extend to distributed state representations and efficiently cope with uncertainty. In sensor processing, inference methods typically compute a posterior over states conditioned on observations; applied in the context of action selection, they compute a posterior over actions conditioned on goals. In this paper we first introduce the idea of using inference for reasoning about actions on an intuitive level, drawing connections to the idea of internal simulation. We then survey previous work, including our own, using this new approach to address (partially observable) Markov Decision Processes and stochastic optimal control problems.
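The "posterior over actions conditioned on goals" idea can be made concrete in a few lines. The sketch below computes p(a | s0, goal) proportional to p(s_T = goal | s0, a) p(a) for a small MDP given as per-action transition matrices, assuming a uniform policy after the first action; this is a toy illustration of the idea, not the inference machinery the paper surveys.

```python
import numpy as np

def action_posterior(T, s0, goal, horizon, prior=None):
    """Planning as inference in a toy MDP.
    T: array of shape (n_actions, n_states, n_states); T[a][s, s'] is
       the probability of moving s -> s' under action a.
    Returns p(a | s0, goal): the posterior over the first action given
    that the state at time `horizon` equals the goal state."""
    n_actions = T.shape[0]
    if prior is None:
        prior = np.full(n_actions, 1.0 / n_actions)
    T_avg = T.mean(axis=0)              # uniform policy after the first step
    lik = np.empty(n_actions)
    for a in range(n_actions):
        dist = T[a][s0].copy()          # state distribution after action a
        for _ in range(horizon - 1):
            dist = dist @ T_avg         # propagate the remaining steps
        lik[a] = dist[goal]             # p(s_T = goal | s0, a)
    post = lik * prior                  # Bayes rule; assumes goal is reachable
    return post / post.sum()
```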
Three theorems regarding testing graph properties
Property testing is a relaxation of decision problems in which it is required to distinguish YES-instances (i.e., objects having a predetermined property) from instances that are far from any YES-instance. We present three theorems regarding testing graph properties in the adjacency matrix representation. More specifically, these theorems relate to the project of characterizing graph properties according to the complexity of testing them (in the adjacency matrix representation). The first theorem is that there exist monotone graph properties in NP for which testing is very hard (i.e., requires examining a constant fraction of the entries in the matrix). The second theorem is that every graph property that can be tested with a number of queries that is independent of the size of the graph can be so tested by uniformly selecting a set of vertices and accepting iff the induced subgraph has some fixed graph property (which is not necessarily the same as the one being tested). The third theorem refers to the framework of graph partition problems and is a characterization of the subclass of properties that can be tested using a one-sided error tester making a number of queries that is independent of the size of the graph.
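For readers unfamiliar with the model, the standard definitions underlying all three theorems can be stated compactly; the following formalizes distance and testers in the adjacency matrix (dense graph) model.

```latex
% Distance of an N-vertex graph G from a property \Pi, measured over
% the N^2 entries of the adjacency matrix A_G:
\[
  \mathrm{dist}(G,\Pi) \;=\; \min_{G' \in \Pi}
    \frac{\bigl|\{(u,v) : A_G(u,v) \neq A_{G'}(u,v)\}\bigr|}{N^2}.
\]
% An \varepsilon-tester queries entries of A_G and must satisfy
\[
  G \in \Pi \;\Rightarrow\; \Pr[\text{accept}] \ge \tfrac{2}{3},
  \qquad
  \mathrm{dist}(G,\Pi) > \varepsilon \;\Rightarrow\; \Pr[\text{reject}] \ge \tfrac{2}{3};
\]
% the tester has one-sided error if it accepts every G in \Pi
% with probability 1.
```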