title (string, 8-300 chars) | abstract (string, 0-10k chars) |
---|---|
Perceived risk as a moderator of the effectiveness of framed HIV-test promotion messages among women: a randomized controlled trial. | OBJECTIVE
Researchers argue that gain-framed messages should be more effective for prevention behaviors, while loss frames should be more effective for detection behaviors (Rothman & Salovey, 1997). Evidence for this taxonomy has been mixed. This study examines whether the effects of gain- and loss-framed messages on HIV-testing intentions are moderated by perceived risk of a positive result.
METHOD
This experiment was conducted online and utilized a single-factor (frame: gain/loss) between-subjects design, with a separate HIV-test promotion control and a no-message control, to examine whether perceived risk of a positive test result moderates the effects of framed messages on intentions to seek an HIV test in the next 3 months. The sample (N = 1052; M age = 22, SD = 2.22), recruited through Survey Sampling International, included 51% Black women and 49% White women.
RESULTS
HIV-test promotion messages were more effective than no message, but there were no other main effects for condition. Results also demonstrated a significant interaction between message frame and perceived risk, which was mediated through elaborative processing of the message. The interaction demonstrated an advantage for the loss-framed message among women with some perceived risk and an advantage for the gain-framed message among women with low perceived risk.
CONCLUSION
Results imply that the prevention/detection function of the behavior may be an inadequate distinction in the consideration of the effectiveness of framed messages promoting HIV testing. Rather, this study demonstrates that risk perceptions are an important moderator of framing effects. |
Translation and validation of the Chinese version of the Current Opioid Misuse Measure (COMM) for patients with chronic pain in Mainland China | BACKGROUND
Management of prescription opioid misuse and abuse among patients with chronic pain has become increasingly important worldwide, yet little literature concerning prescription opioids is available for mainland China so far.
METHODS
The Current Opioid Misuse Measure (COMM) was translated into Chinese following Brislin's model of cross-cultural translation and was completed by a convenience sample of 180 patients with chronic pain recruited from two major hospitals in Jinan, Shandong province. Data were analyzed using internal consistency, test-retest reliability, exploratory factor analysis and confirmatory factor analysis.
RESULTS
The internal consistency coefficient for the total score of the COMM was 0.85 and the item-total correlations of all items were above 0.20. In addition, test-retest reliability was satisfactory, with an ICC of 0.91 (95% CI = 0.65-0.98). Four principal components were extracted, accounting for 65.30% of the variance, and the factor loadings of all 17 items were above 0.40.
CONCLUSIONS
The Chinese version of COMM showed satisfactory reliability and validity, and could be used as a screening tool to evaluate and monitor current aberrant drug-related behavior among Chinese patients with chronic pain. |
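For readers who want to reproduce the internal-consistency figure reported above, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix. It is a generic illustration of the standard formula, not code from the study; the 180 x 17 matrix is randomly generated purely for demonstration.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                          # number of items (17 for the COMM)
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed total score
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: 180 respondents, 17 items scored 0-4 (matching the COMM layout).
rng = np.random.default_rng(0)
demo = rng.integers(0, 5, size=(180, 17)).astype(float)
print(round(cronbach_alpha(demo), 2))
```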
Feasibility of neural networks in modelling radio propagation for field strength prediction | A typical back-propagation neural network (BPN) model is developed for modelling radio propagation for field strength prediction, based on measurements of propagation loss (in decibels) together with terrain information taken in an urban area (the Athens region) in the 900 MHz band. The feasibility of the BPN model is checked against the performance of a conventional semiempirical reference model. The performance of both models is quantified by statistical methods. The evaluation is done by comparing their prediction error statistics (mean absolute error, standard deviation, and root mean square error) and by comparing their percentage accuracy and the correlation of predicted values relative to the measured data. |
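To make the modelling setup above concrete, here is a minimal one-hidden-layer back-propagation network fitted to synthetic (distance, terrain feature) to path-loss pairs and evaluated with the same error statistics the abstract mentions. The feature choice, network size, and toy propagation law are assumptions for illustration only, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the measurement set: log-distance and a terrain feature -> path loss (dB).
X = rng.uniform([0.1, 0.0], [2.0, 1.0], size=(500, 2))
y = 120.0 + 35.0 * X[:, :1] - 10.0 * X[:, 1:] + rng.normal(0, 3, (500, 1))

# Standardise inputs and target so plain gradient descent behaves well.
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()

# One hidden layer of 8 tanh units with a linear output (a typical small BPN).
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    H = np.tanh(Xs @ W1 + b1)                    # forward pass
    err = (H @ W2 + b2) - ys                     # prediction error
    gW2 = H.T @ err / len(Xs); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)             # back-propagated error
    gW1 = Xs.T @ dH / len(Xs); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = (np.tanh(Xs @ W1 + b1) @ W2 + b2) * y.std() + y.mean()
res = pred - y
print("mean abs err %.2f dB, std %.2f dB, rms %.2f dB"
      % (np.abs(res).mean(), res.std(), np.sqrt((res ** 2).mean())))
```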
Application of inverse kinematics for skeleton manipulation in real-time | The usual way of animating characters is to use motion-captured data: acquired bone orientations are blended together according to user input in real time. Although this widely used method gives nice results, practical experience shows how important it is to have a system for interactive direct manipulation of the character's skeleton in order to satisfy various tasks in Cartesian space. For this purpose, various methods for solving the inverse kinematics problem are used. This paper presents three such methods: an algebraic method based on limb positioning, an iterative optimization method based on the Jacobian pseudo-inverse, and the heuristic iterative CCD method. The paper describes them all in detail and discusses the practical scope of their use in real-time applications. |
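As an illustration of the heuristic CCD method listed above, the sketch below solves a planar (2D) inverse kinematics problem for a three-bone chain. A production skeleton would need 3D joint rotations, joint limits, and damping, all omitted here.

```python
import numpy as np

def fk(angles, lengths):
    """Forward kinematics: joint positions of a planar chain rooted at the origin."""
    pts = [np.zeros(2)]
    total = 0.0
    for a, l in zip(angles, lengths):
        total += a
        pts.append(pts[-1] + l * np.array([np.cos(total), np.sin(total)]))
    return pts

def ccd(angles, lengths, target, iters=50, tol=1e-4):
    """Cyclic Coordinate Descent: rotate each joint so the end effector moves toward the target."""
    angles = list(angles)
    for _ in range(iters):
        for i in reversed(range(len(angles))):           # sweep from the last joint back to the root
            pts = fk(angles, lengths)
            to_end = pts[-1] - pts[i]
            to_tgt = target - pts[i]
            # signed angle that aligns the end-effector direction with the target direction
            delta = np.arctan2(to_tgt[1], to_tgt[0]) - np.arctan2(to_end[1], to_end[0])
            angles[i] += delta
        if np.linalg.norm(fk(angles, lengths)[-1] - target) < tol:
            break
    return angles

# Hypothetical three-bone "arm" reaching for a point in its workspace.
lengths = [1.0, 1.0, 0.5]
sol = ccd([0.3, 0.3, 0.3], lengths, np.array([1.2, 1.4]))
print(np.round(fk(sol, lengths)[-1], 3))   # should land close to (1.2, 1.4)
```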
Quantum Annealing and Analog Quantum Computation | We review here the recent success of quantum annealing, i.e., optimization of the cost or energy functions of complex systems utilizing quantum fluctuations. The concept is introduced in successive steps: the mapping of such computationally hard problems to classical spin glass problems, the quantum spin glass problems that arise with the introduction of quantum fluctuations, and the annealing behavior of these systems as the fluctuations are slowly reduced to zero. This provides a general framework for realizing analog quantum computation. |
Cloud Data Security while using Third Party Auditor | The cloud is a platform where users not only store their data but also use the software and services provided by the Cloud Service Provider (CSP). The services provided by the cloud are very economical: the user pays only for what he uses. It is a platform where data owners remotely store their data in the cloud to enjoy high-quality applications and services. The user can access the data, use the data and store the data. In the corporate world there are large numbers of clients accessing and modifying their data. In the cloud, application software and services are moved to centralized large data centers, and the management of this data and these services may not be trustworthy. To manage this data we use a third party auditor (TPA). The TPA checks the reliability of the data, but this increases the data integrity risk for the data owner, since the TPA can not only read the data but also modify it; therefore a mechanism should be provided that solves this problem. We first examine the problem and then present a new potential security scheme to solve it. Our algorithm encrypts the content of the file at the user level, which assures the data owner and the client that their data are intact. It also preserves the data dynamics and consistency for n clients and the server. Keywords: Third Party Auditor, Integrity, Cloud Service Provider, Cloud Computing. |
Impact of Anxiety and Depression Symptoms on Scholar Performance in High School and University Students | Emotional processes are important for survival. The Darwinian adaptive concept of stress refers to natural selection, since evolved individuals have acquired effective strategies to adapt to the environment and to unavoidable changes. If demands are abrupt and intense, there may be insufficient time for a successful response. Usually, stress produces a cognitive or perceptual evaluation (emotional memory) that motivates one to make a plan, take a decision, and perform an action to face the demand successfully. Among the several kinds of stress are psychosocial and emotional stresses with cultural, social, and political influences. Cultural changes have modified the way in which individuals socially interact. Deficits in family relationships and social isolation alter the physical and mental health of young students, reducing their capacity to face stressors in school. Adolescence is characterized by significant physiological, anatomical, and psychological changes in boys and girls, who become vulnerable to psychiatric disorders. In particular, for young adult students, anxiety and depression symptoms may interfere with their academic performance. In this chapter, we review approaches to the study of anxiety and depression symptoms in relation to academic performance in adolescent and graduate students. Results from published studies in academic journals are reviewed to discuss the importance of collecting information about academic performance, which in many cases leads to the discovery of commonly underdiagnosed psychiatric disorders in adolescents, namely anxiety and depression. With the reviewed evidence of how anxiety and depression in young adult students may alter their main activity in life (studying and academic performance), we discuss data to show a way in which professionals involved in schools could support students and establish a routine of intervention in any case. |
Incremental Semantic Construction in a Dialogue System | This paper describes recent work on the DynDial project towards incremental semantic interpretation in dialogue. We outline our domain-general grammar-based approach, using a variant of Dynamic Syntax integrated with Type Theory with Records and Davidsonian event-based semantics. We describe a Java-based implementation of the parser, used within the Jindigo framework to produce an incremental dialogue system capable of handling inherently incremental phenomena such as split utterances, adjuncts, and mid-sentence clarification requests or backchannels. |
Coherent updating on finite spaces | We compare the different notions of conditional coherence within the behavioural theory of imprecise probabilities when all the spaces are finite. We show that the differences between the notions are due to conditioning on sets of (lower, and in some cases upper) probability zero. Next, we characterise the range of coherent extensions, proving that the greatest coherent extensions can always be calculated using the notion of regular extension. |
Cascaded Pyramid Network for Multi-person Pose Estimation | The topic of multi-person pose estimation has seen large improvements recently, especially with the development of convolutional neural networks. However, there still exist many challenging cases, such as occluded keypoints, invisible keypoints and complex backgrounds, which are not yet well addressed. In this paper, we present a novel network structure called Cascaded Pyramid Network (CPN), which aims to relieve the problem posed by these "hard" keypoints. More specifically, our algorithm includes two stages: GlobalNet and RefineNet. GlobalNet is a feature pyramid network which can successfully localize "simple" keypoints like eyes and hands but may fail to precisely recognize occluded or invisible keypoints. Our RefineNet explicitly handles the "hard" keypoints by integrating all levels of feature representations from the GlobalNet together with an online hard keypoint mining loss. In general, to address the multi-person pose estimation problem, a top-down pipeline is adopted to first generate a set of human bounding boxes with a detector, followed by our CPN for keypoint localization in each human bounding box. Based on the proposed algorithm, we achieve state-of-the-art results on the COCO keypoint benchmark, with an average precision of 73.0 on the COCO test-dev dataset and 72.1 on the COCO test-challenge dataset, a 19% relative improvement compared with the 60.5 from the COCO 2016 keypoint challenge. The code and the person detection results used will be made publicly available for further research. |
Human mimetic musculoskeletal humanoid Kengoro toward real world physically interactive actions | We have been developing human mimetic musculoskeletal humanoids from the viewpoint of a human-inspired design approach. Kengoro is our latest musculoskeletal humanoid, designed to achieve physically interactive actions in the real world. This study presents the design concept, body characteristics, and motion achievements of Kengoro. In the design process of Kengoro, we adopted the novel idea of multifunctional skeletal structures to achieve both humanoid performance and humanlike proportions. We adopted sensor-driver integrated muscle modules for improved muscle control. In order to demonstrate the effectiveness of these body structures, we conducted several preliminary movements with Kengoro. |
Automatic Segmentation of MR Brain Images With a Convolutional Neural Network | Automatic segmentation in MR brain images is important for quantitative analysis in large-scale studies with images acquired at all ages. This paper presents a method for the automatic segmentation of MR brain images into a number of tissue classes using a convolutional neural network. To ensure that the method obtains accurate segmentation details as well as spatial consistency, the network uses multiple patch sizes and multiple convolution kernel sizes to acquire multi-scale information about each voxel. The method is not dependent on explicit features, but learns to recognise the information that is important for the classification based on training data. The method requires a single anatomical MR image only. The segmentation method is applied to five different data sets: coronal T2-weighted images of preterm infants acquired at 30 weeks postmenstrual age (PMA) and 40 weeks PMA, axial T2-weighted images of preterm infants acquired at 40 weeks PMA, axial T1-weighted images of ageing adults acquired at an average age of 70 years, and T1-weighted images of young adults acquired at an average age of 23 years. The method obtained the following average Dice coefficients over all segmented tissue classes for each data set, respectively: 0.87, 0.82, 0.84, 0.86, and 0.91. The results demonstrate that the method obtains accurate segmentations in all five sets, and hence demonstrates its robustness to differences in age and acquisition protocol. |
The cost-effectiveness of aripiprazole as adjunctive therapy in major depressive disorder: Thai economic model. | BACKGROUND
Aripiprazole is the first atypical antipsychotic approved for adjunctive treatment to antidepressant therapy in patients with major depressive disorder (MDD). The current study aims to present an economic model and cost-effectiveness estimates for aripiprazole compared with placebo as adjunctive therapy to antidepressant treatment in patients with MDD who showed an incomplete response to a prospective 8-week trial of antidepressant therapy.
MATERIAL AND METHOD
An economic model of MDD treatment was developed to estimate the clinical and economic outcomes in Thai patients. Efficacy data were derived from a pooled analysis of two studies. A cost-effectiveness analysis was constructed to simulate the impact of treatment outcomes and costs over a 6-week time horizon. The primary outcome of the model was remission of symptoms. Quality-adjusted life-years (QALYs) were the secondary outcome. The event probabilities were used to derive the transitional probabilities used in the model and to calculate the weighted cost of each treatment outcome. Only direct costs were considered. One-way sensitivity analysis was performed to test the sensitivity of the model outputs.
RESULTS
Treatment with aripiprazole had a total cost per remission of 30,970 Baht, while treatment with placebo had a total cost per remission of 28,409 Baht. Placebo had a lower total cost per QALY than aripiprazole (35,511 Baht vs. 38,713 Baht). The incremental cost-effectiveness ratio (ICER) of augmentation with aripiprazole compared with placebo was 2,561 Baht per remission gained and 3,201 Baht per QALY gained. Aripiprazole dominated placebo if the transitional probability of remission changed to a value greater than 0.348 from the base-case value of 0.257. Aripiprazole was more cost-effective than placebo as adjunctive therapy if the unit cost of aripiprazole was discounted by more than 48.9%.
CONCLUSIONS
Adjunctive aripiprazole is not more cost-effective than adjunctive placebo in Thai patients with MDD who showed an inadequate response to at least one prospective antidepressant therapy. Remission rates and unit cost are the key parameters influencing the cost-effectiveness of aripiprazole. |
Is Early Vision Optimized for Extracting Higher-order Dependencies? | Linear implementations of the efficient coding hypothesis, such as independent component analysis (ICA) and sparse coding models, have provided functional explanations for properties of simple cells in V1 [1, 2]. These models, however, ignore the non-linear behavior of neurons and fail to match individual and population properties of neural receptive fields in subtle but important ways. Hierarchical models, including Gaussian Scale Mixtures [3, 4] and other generative statistical models [5, 6], can capture higher-order regularities in natural images and explain nonlinear aspects of neural processing such as normalization and context effects [6, 7]. Previously, it had been assumed that the lower-level representation is independent of the hierarchy, and it had been held fixed when training these models. Here we examine the optimal lower-level representations derived in the context of a hierarchical model and find that the resulting representations are strikingly different from those based on linear models. Unlike the basis functions and filters learned by ICA or sparse coding, these functions individually more closely resemble simple cell receptive fields and collectively span a broad range of spatial scales. Our work unifies several related approaches and observations about natural image structure and suggests that hierarchical models might yield better representations of image structure throughout the hierarchy. |
Evolutionary Psychology and the Emotions | Evolutionary psychology is an approach to the psychological sciences in which principles and results drawn from evolutionary biology, cognitive science, anthropology, and neuroscience are integrated with the rest of psychology in order to map human nature. By human nature, evolutionary psychologists mean the evolved, reliably developing, species-typical computational and neural architecture of the human mind and brain. According to this view, the functional components that comprise this architecture were designed by natural selection to solve adaptive problems faced by our hunter-gatherer ancestors, and to regulate behavior so that these adaptive problems were successfully addressed (for discussion, see Cosmides & Tooby, 1987, Tooby & Cosmides, 1992). Evolutionary psychology is not a specific subfield of psychology, such as the study of vision, reasoning, or social behavior. It is a way of thinking about psychology that can be applied to any topic within it including the emotions. |
A Closed-Form Solution to Natural Image Matting | Interactive digital matting, the process of extracting a foreground object from an image based on limited user input, is an important task in image and video editing. From a computer vision perspective, this task is extremely challenging because it is massively ill-posed - at each pixel we must estimate the foreground and the background colors, as well as the foreground opacity ("alpha matte") from a single color measurement. Current approaches either restrict the estimation to a small part of the image, estimating foreground and background colors based on nearby pixels where they are known, or perform iterative nonlinear estimation by alternating foreground and background color estimation with alpha estimation. In this paper, we present a closed-form solution to natural image matting. We derive a cost function from local smoothness assumptions on foreground and background colors and show that in the resulting expression, it is possible to analytically eliminate the foreground and background colors to obtain a quadratic cost function in alpha. This allows us to find the globally optimal alpha matte by solving a sparse linear system of equations. Furthermore, the closed-form formula allows us to predict the properties of the solution by analyzing the eigenvectors of a sparse matrix, closely related to matrices used in spectral image segmentation algorithms. We show that high-quality mattes for natural images may be obtained from a small amount of user input. |
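For reference, the quadratic cost and sparse linear system referred to above are usually written as follows (a sketch of the standard closed-form matting formulation: w_k is a small image window, mu_k and Sigma_k are the mean and covariance of the colours I_i within it, epsilon is a regularizer, and D_S, b_S encode the user scribbles):

```latex
J(\alpha) = \alpha^{\mathsf T} L \alpha, \qquad
L_{ij} = \sum_{k \mid (i,j) \in w_k}
  \Bigl( \delta_{ij} - \tfrac{1}{|w_k|}\bigl( 1 + (I_i - \mu_k)^{\mathsf T}
  (\Sigma_k + \tfrac{\epsilon}{|w_k|} I_3)^{-1} (I_j - \mu_k) \bigr) \Bigr),
\qquad
(L + \lambda D_S)\,\alpha = \lambda\, b_S .
```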
Agent-based modelling of customer behaviour in the telecoms and media markets | Agent-based modelling is a bottom-up approach to understanding systems which provides a powerful tool for analysing complex, non-linear markets. The method involves creating artificial agents designed to mimic the attributes and behaviours of their real-world counterparts. The system’s macro-observable properties emerge as a consequence of these attributes and behaviours and the interactions between them. The simulation output may be potentially used for explanatory, exploratory and predictive purposes. The aim of this paper is to introduce the reader to some of the basic concepts and methods behind agent-based modelling and to present some recent business applications of these tools, including work in the telecoms and media markets. |
Mining group stock portfolio by using grouping genetic algorithms | In this paper, a grouping genetic algorithm based approach is proposed for dividing stocks into groups and mining a set of stock portfolios, namely a group stock portfolio. Each chromosome consists of three parts. The grouping and stock parts are used to indicate how to divide stocks into groups. The stock portfolio part is used to represent the purchased stocks and their purchased units. The fitness of each chromosome is evaluated by the group balance and the portfolio satisfaction. The group balance is utilized to make the groups represented by the chromosome have as similar a number of stocks as possible. The portfolio satisfaction is used to evaluate the goodness of profits and the satisfaction of the investor's requests for all possible portfolio combinations that can be generated from a chromosome. Experiments on real data were also made to show the effectiveness of the proposed approach. |
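A minimal sketch of how the three-part chromosome and its fitness might be encoded is given below. The balance and satisfaction formulas, and their combination by a product, are illustrative assumptions rather than the definitions used in the paper.

```python
import random
from dataclasses import dataclass

@dataclass
class Chromosome:
    groups: list      # grouping + stock parts: which stock indices fall in which group
    portfolio: dict   # stock portfolio part: stock index -> purchased units

def group_balance(groups):
    # Illustrative: reward groups whose sizes are as similar as possible.
    sizes = [len(g) for g in groups]
    return 1.0 / (1.0 + max(sizes) - min(sizes))

def portfolio_satisfaction(portfolio, expected_return, budget):
    # Illustrative: expected profit of the purchased units, zero if the budget request is violated.
    units_bought = sum(portfolio.values())
    if units_bought > budget:
        return 0.0
    return sum(units * expected_return[s] for s, units in portfolio.items())

def fitness(ch, expected_return, budget):
    # The paper combines group balance and portfolio satisfaction; a product is used here only for illustration.
    return group_balance(ch.groups) * portfolio_satisfaction(ch.portfolio, expected_return, budget)

random.seed(0)
stocks = list(range(9))
random.shuffle(stocks)
ch = Chromosome(groups=[stocks[0:3], stocks[3:6], stocks[6:9]],
                portfolio={stocks[0]: 2, stocks[3]: 1, stocks[6]: 3})
returns = {s: random.uniform(-0.05, 0.10) for s in range(9)}
print(round(fitness(ch, returns, budget=10), 4))
```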
An embodied cognitive science? | The last ten years have seen an increasing interest, within cognitive science, in issues concerning the physical body, the local environment, and the complex interplay between neural systems and the wider world in which they function. Yet many unanswered questions remain, and the shape of a genuinely physically embodied, environmentally embedded science of the mind is still unclear. In this article I will raise a number of critical questions concerning the nature and scope of this approach, drawing a distinction between two kinds of appeal to embodiment: (1) 'Simple' cases, in which bodily and environmental properties merely constrain accounts that retain the focus on inner organization and processing; and (2) More radical appeals, in which attention to bodily and environmental features is meant to transform both the subject matter and the theoretical framework of cognitive science. |
ACE: automated CTF estimation. | We present a completely automated algorithm for estimating the parameters of the contrast transfer function (CTF) of a transmission electron microscope. The primary contribution of this paper is the determination of the astigmatism prior to the estimation of the CTF parameters. The CTF parameter estimation is then reduced to a 1D problem using elliptical averaging. We have also implemented an automated method to calculate lower and upper cutoff frequencies to eliminate regions of the power spectrum which perturb the estimation of the CTF parameters. The algorithm comprises three optimization subproblems, two of which are proven to be convex. Results of the CTF estimation method are presented for images of carbon support films as well as for images of single particles embedded in ice and suspended over holes in the support film. A MATLAB implementation of the algorithm, called ACE, is freely available. |
Combining heterogeneous service technologies for building an Internet of Things middleware | Radio-frequency identification (RFID) is a technology that allows ordinary objects to be uniquely identified by "smart tags" which are also capable of storing small quantities of data. The term Internet of Things originated from a vision strongly coupled with supply-chain concerns and RFID-tagged objects. However, the idea of such an Internet of Things has evolved in a wider sense, referring now to a ubiquitous object society combining RFID, sensor networks and pervasive computing technologies. This scenario involves different requirements such as heterogeneity and dynamicity of objects, sensors, applications and protocols, as well as the need to allow the dynamic evolution of such applications. These issues seem to be more easily addressed if the principles of service-oriented computing (SOC), like loose coupling and heterogeneity, are used for constructing such architectures and applications. In this paper we underline the benefits SOC can offer in constructing a middleware for the Internet of Things. These concepts have been applied in a service-oriented middleware that tries to leverage the existing Internet of Things architectural concepts by using SOC principles in order to bring more flexibility and dynamicity. We describe the approaches used in that middleware and the lessons learned from that experience. This middleware was initially tested on an application for tracking and monitoring supply-chain objects, and later extended to target wider application domains that are also described in this paper. The project described here has become part of the OW2 AspireRFID open-source project. |
The PMHT for Passive Radar in a DAB/DVB Network | Passive Bistatic Radar (PBR), also known as Passive Coherent Location (PCL), uses illuminators of opportunity. Passive radar using signals in a single frequency network modulated according to the Digital Audio/Video Broadcasting (DAB/DVB) standards using orthogonal frequency division multiplexing (OFDM) has recently been of increasing interest. There has been considerable research to develop tracking systems addressing its inherent difficulties [3]-[7], [10]-[13]: the poor quality, or absence, of angular information, and the lack of a transmitter label on top of the usual target/measurement association concerns. First, there are algorithms using the Multi-Hypothesis Tracker (MHT) [12], [13] that address the complexity arising from association ambiguities between measurements, targets and illuminators by initially forming two-dimensional (measurement-target) hypotheses in the range/Doppler domain. Tracking is then performed directly on target parameters by the MHT without considering the association between measurements and illuminators: the range/Doppler MHT extracts measurements and removes false alarms. De-ghosting is then performed by evaluating the likelihood probabilities of the possible data associations. When a Cartesian track is confirmed, the remaining tracks from other possible associations are declared false and tracking starts in the Cartesian domain. This MHT approach is good but not without issues. One is the appropriate motion model in range and Doppler space: although the target dynamics in the Cartesian domain may be known, the trajectories are not easily described in the space of target parameters, because they depend on the illuminator/receiver/target geometry and there is association ambiguity among measurements, illuminators, and targets. That is another concern: the illuminator association is never explicitly addressed. Track maintenance algorithms that operate directly in Cartesian coordinates have also been explored [4], [5], one using a modified Joint Probabilistic Data Association (JPDA) filter and another a particle filter. For the former, in order to address the large number of three-list hypotheses, a "super-target" idea was proposed; the particle filters work under the PMHT measurement model, in which each measurement's assignment is independent of the others. These methods have also been examined downstream from an initiation approach (the PMHTI method, suggested in [6]) that initiates tracks in Cartesian coordinates. In fact, the PMHT seems to be an effective and natural way to accommodate the data association with the extra list (transmitters), and in this paper we present it: it is really very simple. This tracker, combined with the initiation algorithm (the modified PMHTI method in [6]), shows excellent performance in comparison with the JPDA filter and the particle filter. |
Frequency-dependent resistive and capacitive components in RF MOSFETs | It is found from measured high-frequency (HF) S-parameter data that the extracted effective gate sheet resistance (R_gsh), effective gate unit-area capacitance (C_gg,unit), and transconductance (G_m) in radio-frequency (RF) MOSFETs show strong frequency dependence when the device operates at frequencies higher than some critical frequency. As frequency increases, R_gsh increases but C_gg,unit and G_m decrease. This behavior is different from what is observed at low or medium frequencies, at which these components are constant over a frequency range. The phenomenon has been observed in MOSFETs with L_f longer than 0.35 μm at frequencies higher than 1 GHz, and becomes more serious as L_f becomes longer and the frequency higher. This behavior can be explained by a MOSFET model considering the non-quasi-static (NQS) effect. Simulation results show that an RF model based on BSIM3v3 with the NQS effect describes well the behavior of both the real and imaginary parts of Y_21 of a device with strong NQS effect, even though its fit to Y_11 needs to be improved further. |
Embodied Energy: a Case for Wood Construction | The purpose of our article is to evaluate wood as a construction material in terms of the energy required for its construction and operation, compared to other types of construction materials. First, the role of construction and material manufacturing is evaluated within the full life cycle energy and CO2 emissions of a building, concluding that the issue of embodied energy justifies the use of less energy intensive materials. Then the article reviews the literature dealing with the energy requirements of wood based construction, in order to establish whether the use of this natural, low density construction material is more energy efficient than using brick, reinforced concrete and steel structures. According to our analysis, the vast majority of the studies found that the embodied energy is significantly lower in wood based construction when compared to inorganic materials. According to several authors, wood construction could save much energy and significantly reduce the emissions related to the building sector on the national level. Carbon sequestration, and the related mitigation of the global climate change effect, can be significant if the share of durable wooden buildings can be increased in the market, using sustainably produced raw materials that are handled responsibly at the end of their lifetime. Some conflicting studies make important points concerning the heat storage, recycling and on-site labour demands related to these structures. These sources contribute to a deeper understanding of the issue, but do not alter the basic conclusions concerning the benefits of wood based construction. Some important aspects of wood extraction, manufacturing and construction that can help minimising the embodied energy of wood based structures are also discussed in the study. |
When a city tells a story: urban topic analysis | This paper explores the use of textual and event-based citizen-generated data from services such as Twitter and Foursquare to study urban dynamics. It applies a probabilistic topic model to obtain a decomposition of the stream of digital traces into a set of urban topics related to various activities of the citizens in the course of a week. Due to the combined use of implicit textual and movement data, we obtain semantically rich modalities of the urban dynamics and overcome the drawbacks of several previous attempts. Other important advantages of our method include its flexibility and robustness with respect to the varying quality and volume of the incoming data. We describe an implementation architecture of the system, the main outputs of the analysis, and the derived exploratory visualisations. Finally, we discuss the implications of our methodology for enriching location-based services with real-time context. |
Reliability and Validity of a Submaximal Warm-up Test for Monitoring Training Status in Professional Soccer Players. | Rabbani, A, Kargarfard, M, and Twist, C. Reliability and validity of a submaximal warm-up test for monitoring training status in professional soccer players. J Strength Cond Res 32(2): 326-333, 2018-Two studies were conducted to assess the reliability and validity of a submaximal warm-up test (SWT) in professional soccer players. For the reliability study, 12 male players performed an SWT over 3 trials, with 1 week between trials. For the validity study, 14 players of the same team performed an SWT and a 30-15 intermittent fitness test (30-15IFT) 7 days apart. Week-to-week reliability in selected heart rate (HR) responses (exercise heart rate [HRex], heart rate recovery [HRR] expressed as the number of beats recovered within 1 minute [HRR60s], and HRR expressed as the mean HR during 1 minute [HRpost1]) was determined using the intraclass correlation coefficient (ICC) and typical error of measurement expressed as coefficient of variation (CV). The relationships between HR measures derived from the SWT and the maximal speed reached at the 30-15IFT (VIFT) were used to assess validity. The range for ICC and CV values was 0.83-0.95 and 1.4-7.0% in all HR measures, respectively, with the HRex as the most reliable HR measure of the SWT. Inverse large (r = -0.50 and 90% confidence limits [CLs] [-0.78 to -0.06]) and very large (r = -0.76 and CL, -0.90 to -0.45) relationships were observed between HRex and HRpost1 with VIFT in relative (expressed as the % of maximal HR) measures, respectively. The SWT is a reliable and valid submaximal test to monitor high-intensity intermittent running fitness in professional soccer players. In addition, the test's short duration (5 minutes) and simplicity mean that it can be used regularly to assess training status in high-level soccer players. |
Bridge reliability experience in Switzerland | |
Scalable agent alignment via reward modeling: a research direction | One obstacle to applying reinforcement learning algorithms to real-world problems is the lack of suitable reward functions. Designing such reward functions is difficult in part because the user only has an implicit understanding of the task objective. This gives rise to the agent alignment problem: how do we create agents that behave in accordance with the user’s intentions? We outline a high-level research direction to solve the agent alignment problem centered around reward modeling: learning a reward function from interaction with the user and optimizing the learned reward function with reinforcement learning. We discuss the key challenges we expect to face when scaling reward modeling to complex and general domains, concrete approaches to mitigate these challenges, and ways to establish trust in the resulting agents. |
Magnetic resonance imaging with ferumoxil, a negative superparamagnetic oral contrast agent, in the evaluation of ulcerative colitis | OBJECTIVE:The introduction of new oral contrast agents that enhance image quality has increased the importance of magnetic resonance imaging (MRI) in the management of ulcerative colitis. The aim of our study was to investigate the usefulness of a new negative superparamagnetic oral contrast (ferumoxil) alone or in association with gadolinium i.v. in the assessment of the disease.METHODS:Twenty-eight patients with clinically active ulcerative colitis and 10 control subjects entered the study. In each patient a clinical, endoscopic, histological, and MRI evaluation was performed. In particular, in 14 patients affected by ulcerative colitis (group A) and in five controls, magnetic resonance images were acquired 1 h after the oral administration of 900 ml of ferumoxil, while the remaining 14 patients (group B) and five controls were submitted to double-contrast MRI (ferumoxil and gadolinium). In both groups, wall thickness, length of affected bowel segments, and, in group B, also percent contrast enhancement were calculated.RESULTS:The comparison of endoscopic and MRI extent of disease was statistically significant. Wall thickness and, in group B, also percent contrast enhancement were significantly correlated with clinical and endoscopic activities. In each group wall thickness was significantly different in the activity phases of the disease.CONCLUSIONS:MRI with negative superparamagnetic oral contrast is comparable to endoscopy in the assessment of ulcerative colitis. The double-contrast imaging does not provide more information than single oral contrast, so we concluded that the latter is preferable in the follow-up of the disease and in patients unable or with a poor compliance to undergo endoscopy. |
particle in gravitational field of a rotating body | The effective Lagrangian describing gravitational source spin-particle spin interactions is given. Cosmological and astrophysical consequences of such interactions are examined. Although stronger than expected, the spin-spin interactions do not change any cosmological effect observed so far. They are important for background primordial neutrinos. |
Racism and health among urban Aboriginal young people | BACKGROUND
Racism has been identified as an important determinant of health but few studies have explored associations between racism and health outcomes for Australian Aboriginal young people in urban areas.
METHODS
Cross sectional data from participants aged 12-26 years in Wave 1 of the Victorian Aboriginal Health Service's Young People's Project were included in hierarchical logistic regression models. Overall mental health, depression and general health were all considered as outcomes with self-reported racism as the exposure, adjusting for a range of relevant confounders.
RESULTS
Racism was reported by a high proportion (52.3%) of participants in this study. Self-reported racism was significantly associated with poor overall mental health (OR 2.67, 95% CI 1.25-5.70, p = 0.01) and poor general health (OR 2.17, 95% CI 1.03-4.57, p = 0.04), and marginally associated with increased depression (OR 2.0; 95% CI 0.97-4.09, p = 0.06) in the multivariate models. Number of worries and number of friends were both found to be effect modifiers for the association between self-reported racism and overall mental health. Getting angry at racist remarks was found to mediate the relationship between self-reported racism and general health.
CONCLUSIONS
This study highlights the need to acknowledge and address racism as an important determinant of health and wellbeing for Aboriginal young people in urban areas of Australia. |
Feasibility study of S-1 and intraperitoneal docetaxel combination chemotherapy for gastric cancer with peritoneal dissemination. | BACKGROUND
Gastric cancer with cancer cells on peritoneal cytology has a very poor prognosis because of coexisting peritoneal metastasis. Here we performed a dose-escalation study of intraperitoneal docetaxel (DTX) combined with S-1 to determine the maximum-tolerated dose (MTD) and recommended dose (RD) in gastric cancer with peritoneal dissemination.
PATIENTS AND METHODS
Twelve gastric cancer patients with positive cytology were enrolled in this study. Peritoneal lavage specimens were obtained under local anesthesia or staging laparoscopy before treatment and the combination chemotherapy was applied in patients with positive cytology. DTX was administered on day 1 intraperitoneally with initial dose of 40 mg/m(2), stepped up to 50 or 60 mg/m(2). S-1 was administered at a fixed dose of 80 mg/m(2)/day on days 1-14, followed by 7 days of rest. After two cycles of the combination chemotherapy, staging laparoscopy was performed to evaluate the effect of the chemotherapy. Simultaneous gastrectomy was performed in cases without peritoneal deposits at staging laparoscopy.
RESULTS
The MTD of intraperitoneal DTX was not determined and the RD was defined as 60 mg/m(2) because dose-limiting toxicity occurred in only one patient at level II (DTX: 50 mg/m(2)). Out of twelve patients given the combination chemotherapy, nine had cytologically negative peritoneal lavage and had no peritoneal metastases at surgery after chemotherapy.
CONCLUSION
The combined chemotherapy of S-1 plus intraperitoneal DTX was revealed to be safe and may be effective for gastric cancer with peritoneal dissemination. |
Good Question! Statistical Ranking for Question Generation | We address the challenge of automatically generating questions from reading materials for educational practice and assessment. Our approach is to overgenerate questions, then rank them. We use manually written rules to perform a sequence of general purpose syntactic transformations (e.g., subject-auxiliary inversion) to turn declarative sentences into questions. These questions are then ranked by a logistic regression model trained on a small, tailored dataset consisting of labeled output from our system. Experimental results show that ranking nearly doubles the percentage of questions rated as acceptable by annotators, from 27% of all questions to 52% of the top ranked 20% of questions. |
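The overgenerate-and-rank step described above can be illustrated with a tiny logistic-regression ranker. The features (word n-grams) and the toy labelled questions below are placeholders; the actual system uses a richer, hand-designed feature set and its own labelled output.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labelled system output: candidate questions with 1 = rated acceptable, 0 = not.
candidates = [
    ("What did the treaty establish?", 1),
    ("What did establish the treaty in?", 0),
    ("When was the treaty signed?", 1),
    ("Was signed when the treaty?", 0),
]
texts, labels = zip(*candidates)

# Simple surface features stand in for the tailored feature set a real ranker would use.
vec = CountVectorizer(ngram_range=(1, 2))
X = vec.fit_transform(texts)
ranker = LogisticRegression().fit(X, list(labels))

# Rank newly over-generated questions by the model's acceptability probability.
new_qs = ["What was established by the treaty?", "Treaty the what established?"]
scores = ranker.predict_proba(vec.transform(new_qs))[:, 1]
for q, s in sorted(zip(new_qs, scores), key=lambda t: -t[1]):
    print(f"{s:.2f}  {q}")
```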
Conformal hypofractionated and accelerated radiotherapy with cytoprotection (HypoARC) for high risk prostatic carcinoma: rationale, technique and early experience. | Recent radiobiological analysis of radiotherapy results for prostate cancer revealed that prostate carcinoma behaves as a late-responding tissue, with an alpha/beta ratio lower than 2 Gy. These findings suggest that hypofractionation may be more effective. Reduction of the overall treatment time could further increase response by abrogating the effect of rapid tumor repopulation. In the present study we report a conformal technique applied (to pelvis and prostate) for the treatment of high-risk prostate cancer, using hypofractionated and accelerated radiotherapy (3.4 Gy x 15 consecutive fractions) supported with high-dose daily amifostine (1000 mg subcutaneously) to protect normal tissues against early and late effects. The biological dose delivered to the prostate cancer by this HypoARC (hypofractionated accelerated radiotherapy with cytoprotection) technique is estimated to be 71.4 Gy (alpha/beta = 1.5 Gy). The time-adjusted biological dose is estimated at 77-94 Gy. Amifostine tolerance was excellent. All seven patients recruited up to now have completed their treatment with grade 0-1 cystitis or diarrhoea (5/7 grade 0). The study is ongoing to assess the efficacy and late effects of HypoARC. |
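The 71.4 Gy figure quoted above follows from the standard linear-quadratic conversion to an equivalent dose in 2 Gy fractions (EQD2), with n = 15 fractions of d = 3.4 Gy and alpha/beta = 1.5 Gy:

```latex
\mathrm{EQD2} \;=\; n\,d\,\frac{d + \alpha/\beta}{2 + \alpha/\beta}
             \;=\; 15 \times 3.4 \times \frac{3.4 + 1.5}{2 + 1.5}
             \;\approx\; 71.4\ \mathrm{Gy}.
```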
Corpus callosum abnormalities, intellectual disability, speech impairment, and autism in patients with haploinsufficiency of ARID1B | Corpus callosum abnormalities are common brain malformations with a wide clinical spectrum ranging from severe intellectual disability to normal cognitive function. The etiology is expected to be genetic in as much as 30-50% of the cases, but the underlying genetic cause remains unknown in the majority of cases. By next-generation mate-pair sequencing we mapped the chromosomal breakpoints of a patient with a de novo balanced translocation, t(1;6)(p31;q25), agenesis of the corpus callosum (CC), intellectual disability, severe speech impairment, and autism. The chromosome 6 breakpoint truncated ARID1B, which was also truncated in a recently published translocation patient with a similar phenotype. Quantitative polymerase chain reaction (Q-PCR) data showed that a primer set proximal to the translocation showed increased expression of ARID1B, whereas primer sets spanning or distal to the translocation showed decreased expression in the patient relative to a non-related control set. Phenotype-genotype comparison of the translocation patient with seven unpublished patients with various sized deletions encompassing ARID1B confirms that haploinsufficiency of ARID1B is associated with CC abnormalities, intellectual disability, severe speech impairment, and autism. Our findings emphasize that ARID1B is important in human brain development and function in general, and in the development of the CC and in speech development in particular. |
CLIP-Q: Deep Network Compression Learning by In-parallel Pruning-Quantization | Deep neural networks enable state-of-the-art accuracy on visual recognition tasks such as image classification and object detection. However, modern deep networks contain millions of learned weights; a more efficient utilization of computation resources would assist in a variety of deployment scenarios, from embedded platforms with resource constraints to computing clusters running ensembles of networks. In this paper, we combine network pruning and weight quantization in a single learning framework that performs pruning and quantization jointly, and in parallel with fine-tuning. This allows us to take advantage of the complementary nature of pruning and quantization and to recover from premature pruning errors, which is not possible with current two-stage approaches. Our proposed CLIP-Q method (Compression Learning by In-Parallel Pruning-Quantization) compresses AlexNet by 51-fold, GoogLeNet by 10-fold, and ResNet-50 by 15-fold, while preserving the uncompressed network accuracies on ImageNet. |
Exclusive Lasso for Multi-task Feature Selection | We propose a novel group regularization which we call exclusive lasso. Unlike the group lasso regularizer that assumes covarying variables in groups, the proposed exclusive lasso regularizer models the scenario when variables in the same group compete with each other. Analysis is presented to illustrate the properties of the proposed regularizer. We present a framework of kernel based multi-task feature selection algorithm based on the proposed exclusive lasso regularizer. An efficient algorithm is derived to solve the related optimization problem. Experiments with document categorization show that our approach outperforms state-of-the-art algorithms for multi-task feature selection. |
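For reference, the exclusive lasso regularizer discussed above is commonly written as the sum over groups of the squared within-group l1 norm, which induces competition among the variables inside each group (shown here as the usual formulation, not copied from the paper):

```latex
\Omega_{\mathrm{exclusive}}(w) \;=\; \sum_{g=1}^{G} \Bigl( \sum_{j \in g} |w_j| \Bigr)^{2},
\qquad
\min_{w}\; \mathcal{L}(w) \;+\; \lambda\, \Omega_{\mathrm{exclusive}}(w).
```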
Reading and controlling human brain activation using real-time functional magnetic resonance imaging | Understanding how the brain's functioning mediates mental experience, and controlling the brain's processing to alter cognition or disease, are central projects of cognitive and neural science. The advent of real-time functional magnetic resonance imaging (rtfMRI) now makes it possible to observe the biology of one's own brain while thinking, feeling and acting. Recent evidence suggests that people can learn to control brain activation in localized regions, with corresponding changes in their mental operations, by observing information from their brain while inside an MRI scanner. For example, subjects can learn to deliberately control activation in brain regions involved in pain processing, with corresponding changes in experienced pain. This may provide a novel, non-invasive means of observing and controlling brain function, potentially altering cognitive processes or disease. |
ANN-Based Forecasting of Foreign Currency Exchange Rates | In this paper, we have investigated artificial neural network-based prediction modeling of foreign currency rates using three learning algorithms, namely, Standard Backpropagation (SBP), Scaled Conjugate Gradient (SCG) and Backpropagation with Bayesian Regularization (BPR). The models were trained from historical data using five technical indicators to predict six currency rates against the Australian dollar. The forecasting performance of the models was evaluated using a number of widely used statistical metrics and compared. Results show that significantly close prediction can be made using simple technical indicators without extensive knowledge of market data. Among the three models, the SCG based model outperforms the other models when measured on two commonly used metrics and attains comparable results with the BPR based model on the other three metrics. The effect of network architecture on the performance of the forecasting model is also presented. A future research direction outlining further improvement of the model is discussed. Keywords: Neural network, ARIMA, financial forecasting, foreign exchange |
On compressing social networks | Motivated by structural properties of the Web graph that support efficient data structures for in-memory adjacency queries, we study the extent to which a large network can be compressed. Boldi and Vigna (WWW 2004) showed that Web graphs can be compressed down to three bits of storage per edge; we study the compressibility of social networks where again adjacency queries are a fundamental primitive. To this end, we propose simple combinatorial formulations that encapsulate efficient compressibility of graphs. We show that some of the problems are NP-hard yet admit effective heuristics, some of which can exploit properties of social networks such as link reciprocity. Our extensive experiments show that social networks and the Web graph exhibit vastly different compressibility characteristics. |
Thermodynamics of spatiotemporal chaos: An experimental approach. | We report experimental evidence that, beyond the transition to spatiotemporal chaos, Rayleigh-Benard convection displays thermodynamic properties in Fourier space. It is shown that a generalized temperature may be defined using the natural fluctuations of the system, and that a suitable free energy accounts for the experimental results. |
Multinational Banks and the Global Financial Crisis. Weathering the Perfect Storm | We use data on the 48 largest multinational banking groups to compare the lending of their 199 foreign subsidiaries during the Great Recession with lending by a benchmark group of 202 domestic banks. Contrary to earlier, more contained crises, parent banks were not a significant source of strength to their subsidiaries during the 2008-09 crisis. As a result, multinational bank subsidiaries had to slow down credit growth about twice as fast as domestic banks. This was in particular the case for subsidiaries of banking groups that relied more on wholesale market funding. Domestic banks were better equipped to continue lending because of their greater use of deposits, a relatively stable funding source during the crisis. We conclude that while multinational banks may contribute to financial stability during local crisis episodes, they also increase the risk of ‘importing’ instability from abroad. |
PI3 kinase/Akt/HIF-1α pathway is associated with hypoxia-induced epithelial–mesenchymal transition in fibroblast-like synoviocytes of rheumatoid arthritis | Migration and invasion of fibroblast-like synoviocytes (FLSs) are critical in the pathogenesis of rheumatoid arthritis (RA). Hypoxic conditions are present in RA joints, and hypoxia has been extensively studied in angiogenesis and inflammation. However, its effect on the migration and invasion of RA-FLSs remains unknown. In this study, we observed that RA-FLSs exposed to hypoxic conditions experienced epithelial–mesenchymal transition (EMT), with increased cell migration and invasion. We demonstrated that hypoxia-induced EMT was accompanied by increased hypoxia-inducible factor (HIF)-1α expression and activation of Akt. After knockdown or inhibition of HIF-1α in hypoxia by small interfering RNA or genistein (Gen) treatment, the EMT transformation and invasion ability of FLSs were regained. HIF-1α could be blocked by phosphatidylinositol 3-kinase (PI3K) inhibitor, LY294002, indicating that HIF-1α activation was regulated by the PI3K/Akt pathway. Administration of LY294002 (20 mg/kg, intra-peritoneally) twice weekly and Gen (25 mg/kg, by gavage) daily for 3 weeks from day 20 after primary immunization in a collagen-induced arthritis rat model, markedly alleviated the clinical signs, radiology progression, synovial hyperplasia, and inflammatory cells infiltration of joints. Thus, results of this study suggest that activation of the PI3K/Akt/HIF-1α pathway plays a pivotal role in mediating hypoxia-induced EMT transformation and invasion of RA-FLSs under hypoxia. |
Theory and Application of Magnetic Flux Leakage Pipeline Detection | Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetic sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted. |
Predictive reward signal of dopamine neurons. | The effects of lesions, receptor blocking, electrical self-stimulation, and drugs of abuse suggest that midbrain dopamine systems are involved in processing reward information and learning approach behavior. Most dopamine neurons show phasic activations after primary liquid and food rewards and conditioned, reward-predicting visual and auditory stimuli. They show biphasic, activation-depression responses after stimuli that resemble reward-predicting stimuli or are novel or particularly salient. However, only few phasic activations follow aversive stimuli. Thus dopamine neurons label environmental stimuli with appetitive value, predict and detect rewards and signal alerting and motivating events. By failing to discriminate between different rewards, dopamine neurons appear to emit an alerting message about the surprising presence or absence of rewards. All responses to rewards and reward-predicting stimuli depend on event predictability. Dopamine neurons are activated by rewarding events that are better than predicted, remain uninfluenced by events that are as good as predicted, and are depressed by events that are worse than predicted. By signaling rewards according to a prediction error, dopamine responses have the formal characteristics of a teaching signal postulated by reinforcement learning theories. Dopamine responses transfer during learning from primary rewards to reward-predicting stimuli. This may contribute to neuronal mechanisms underlying the retrograde action of rewards, one of the main puzzles in reinforcement learning. The impulse response releases a short pulse of dopamine onto many dendrites, thus broadcasting a rather global reinforcement signal to postsynaptic neurons. This signal may improve approach behavior by providing advance reward information before the behavior occurs, and may contribute to learning by modifying synaptic transmission. The dopamine reward signal is supplemented by activity in neurons in striatum, frontal cortex, and amygdala, which process specific reward information but do not emit a global reward prediction error signal. A cooperation between the different reward signals may assure the use of specific rewards for selectively reinforcing behaviors. Among the other projection systems, noradrenaline neurons predominantly serve attentional mechanisms and nucleus basalis neurons code rewards heterogeneously. Cerebellar climbing fibers signal errors in motor performance or errors in the prediction of aversive events to cerebellar Purkinje cells. Most deficits following dopamine-depleting lesions are not easily explained by a defective reward signal but may reflect the absence of a general enabling function of tonic levels of extracellular dopamine. Thus dopamine systems may have two functions, the phasic transmission of reward information and the tonic enabling of postsynaptic neurons. |
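The "teaching signal postulated by reinforcement learning theories" mentioned above is usually identified with the temporal-difference prediction error (given here in its standard textbook form, not as a quantity measured in the review):

```latex
\delta_t \;=\; r_t \;+\; \gamma\, V(s_{t+1}) \;-\; V(s_t),
```

which is positive when the outcome is better than predicted, zero when it matches the prediction, and negative when it is worse, mirroring the activation, no-response, and depression pattern described for dopamine neurons.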
Effect of adjuvant chemotherapy with fluorouracil plus folinic acid or gemcitabine vs observation on survival in patients with resected periampullary adenocarcinoma: the ESPAC-3 periampullary cancer randomized trial. | CONTEXT
Patients with periampullary adenocarcinomas undergo the same resectional surgery as that of patients with pancreatic ductal adenocarcinoma. Although adjuvant chemotherapy has been shown to have a survival benefit for pancreatic cancer, there have been no randomized trials for periampullary adenocarcinomas.
OBJECTIVE
To determine whether adjuvant chemotherapy (fluorouracil or gemcitabine) provides improved overall survival following resection.
DESIGN, SETTING, AND PATIENTS
The European Study Group for Pancreatic Cancer (ESPAC)-3 periampullary trial, an open-label, phase 3, randomized controlled trial (July 2000-May 2008) in 100 centers in Europe, Australia, Japan, and Canada. Of the 428 patients included in the primary analysis, 297 had ampullary, 96 had bile duct, and 35 had other cancers.
INTERVENTIONS
One hundred forty-four patients were assigned to the observation group, 143 patients to receive 20 mg/m2 of folinic acid via intravenous bolus injection followed by 425 mg/m2 of fluorouracil via intravenous bolus injection administered on days 1 to 5 every 28 days, and 141 patients to receive 1000 mg/m2 of gemcitabine by intravenous infusion once a week for 3 of every 4 weeks, for 6 months.
MAIN OUTCOME MEASURES
The primary outcome measure was overall survival with chemotherapy vs no chemotherapy; secondary measures were chemotherapy type, toxic effects, progression-free survival, and quality of life.
RESULTS
Eighty-eight patients (61%) in the observation group, 83 (58%) in the fluorouracil plus folinic acid group, and 73 (52%) in the gemcitabine group died. In the observation group, the median survival was 35.2 months (95% CI, 27.2-43.0 months) and was 43.1 months (95% CI, 34.0-56.0 months) in the 2 chemotherapy groups (hazard ratio, 0.86; 95% CI, 0.66-1.11; χ2 = 1.33; P = .25). After adjusting for independent prognostic variables of age, bile duct cancer, poor tumor differentiation, and positive lymph nodes and after conducting multiple regression analysis, the hazard ratio for chemotherapy compared with observation was 0.75 (95% CI, 0.57-0.98; Wald χ2 = 4.53, P = .03).
CONCLUSIONS
Among patients with resected periampullary adenocarcinoma, adjuvant chemotherapy, compared with observation, was not associated with a significant survival benefit in the primary analysis; however, multivariable analysis adjusting for prognostic variables demonstrated a statistically significant survival benefit associated with adjuvant chemotherapy.
TRIAL REGISTRATION
clinicaltrials.gov Identifier: NCT00058201. |
Using Linear Algebra for Intelligent Information Retrieval | Currently, most approaches to retrieving textual materials from scientific databases depend on a lexical match between words in users' requests and those in or assigned to documents in a database. Because of the tremendous diversity in the words people use to describe the same document, lexical methods are necessarily incomplete and imprecise. Using the singular value decomposition (SVD), one can take advantage of the implicit higher-order structure in the association of terms with documents by determining the SVD of large sparse term-by-document matrices. Terms and documents represented by a subset of the largest singular vectors are then matched against user queries. We call this retrieval method Latent Semantic Indexing (LSI), because the subspace represents important associative relationships between terms and documents that are not evident in individual documents. LSI is a completely automatic yet intelligent indexing method, widely applicable, and a promising way to improve users' access to many kinds of textual materials, or to documents and services for which textual descriptions are available. A survey of the computational requirements for managing LSI-encoded databases as well as current and future applications of LSI is presented. |
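A minimal sketch of the LSI retrieval scheme described in this abstract, assuming a small dense term-by-document count matrix and plain NumPy; production systems work with large sparse matrices and partial/truncated SVD routines, and the choice of k and the cosine ranking below are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def lsi_build(term_doc, k):
    """Truncated SVD of a term-by-document matrix; returns the rank-k factors."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]      # term vectors, singular values, doc vectors

def lsi_query(query_vec, U_k, s_k, Vt_k):
    """Fold a query (term-count vector) into the latent space and rank documents
    by cosine similarity."""
    q_hat = query_vec @ U_k / s_k            # query in the k-dimensional latent space
    docs = Vt_k.T                            # each row: one document in latent space
    sims = docs @ q_hat / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q_hat) + 1e-12)
    return np.argsort(-sims)                 # document indices, best match first

# toy example: 5 terms x 4 documents
A = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 2, 0, 1],
              [0, 0, 3, 1],
              [1, 0, 0, 2]], dtype=float)
U_k, s_k, Vt_k = lsi_build(A, k=2)
print(lsi_query(np.array([1.0, 1.0, 0.0, 0.0, 0.0]), U_k, s_k, Vt_k))
```

Documents and the folded-in query live in the same low-dimensional latent space, which is what lets LSI match a query to documents that share no literal terms with it.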
Memory and communication support in dementia: research-based strategies for caregivers. | BACKGROUND
Difficulties with memory and communication are prominent and distressing features of dementia which impact on the person with dementia and contribute to caregiver stress and burden. There is a need to provide caregivers with strategies to support and maximize memory and communication abilities in people with dementia. In this project, a team of clinicians, researchers and educators in neuropsychology, psychogeriatrics, nursing and speech pathology translated research-based knowledge from these fields into a program of practical strategies for everyday use by family and professional caregivers.
METHODS
From the available research evidence, the project team identified compensatory or facilitative strategies to assist with common areas of difficulty, and structured these under the mnemonics RECAPS (for memory) and MESSAGE (for communication). This information was adapted for presentation in a DVD-based education program in accordance with known characteristics of effective caregiver education.
RESULTS
The resultant DVD comprises (1) information on the nature and importance of memory and communication in everyday life; (2) explanations of common patterns of difficulty and preserved ability in memory and communication across the stages of dementia; (3) acted vignettes demonstrating the strategies, based on authentic samples of speech in dementia; and (4) scenarios to prompt the viewer to consider the benefits of using the strategies.
CONCLUSION
Using a knowledge-translation framework, information and strategies can be provided to family and professional caregivers to help them optimize residual memory and communication in people with dementia. Future development of the materials, incorporating consumer feedback, will focus on methods for enabling wider dissemination. |
ICP and the Squid web cache | We describe the structure and functionality of the Internet Cache Protocol (ICP) and its implementation in the Squid Web Caching software. ICP is a lightweight message format used for communication among Web caches. Caches exchange ICP queries and replies to gather information to use in selecting the most appropriate location from which to retrieve an object. We present background on the history of ICP, and discuss issues in ICP deployment, efficiency, security, and interaction with other aspects of Web traffic behavior. We catalog successes, failures, and lessons learned from using ICP to deploy a global Web cache hierarchy. |
Thermal management consumption and its effect on remaining range estimation of electric vehicles | This paper investigates the significance of thermal management energy consumption for the range of electric vehicles. Estimating the remaining range requires combining consumption models of powertrain components such as the electric machine, power electronics and high-voltage battery, which take into account trip-based information containing a predicted speed profile and elevation data, with a thermal management system model that can incorporate the predicted thermal losses of the powertrain. The thermal management system refers to the sum of all components within the refrigeration cycle of the vehicle as well as the associated control strategy. Based on the results of a real-world drive study in an electric vehicle, it is demonstrated that the range uncertainties introduced by the consumption of the thermal management are significant and should be considered when developing predictive range estimation algorithms. These uncertainties are further investigated by an energy-flow-based thermal system model. In order to minimize range uncertainties, a system architecture for the prediction of thermal management consumption is proposed. |
Adversarial Attacks Beyond the Image Space | Generating adversarial examples is an intriguing problem and an important way of understanding the working mechanism of deep neural networks. Most existing approaches generated perturbations in the image space, i.e., each pixel can be modified independently. However, in this paper we pay special attention to the subset of adversarial examples that are physically authentic – those corresponding to actual changes in 3D physical properties (like surface normals, illumination condition, etc.). These adversaries arguably pose a more serious concern, as they demonstrate the possibility of causing neural network failure by small perturbations of real-world 3D objects and scenes. In the contexts of object classification and visual question answering, we augment state-of-the-art deep neural networks that receive 2D input images with a rendering module (either differentiable or not) in front, so that a 3D scene (in the physical space) is rendered into a 2D image (in the image space), and then mapped to a prediction (in the output space). The adversarial perturbations can now go beyond the image space, and have clear meanings in the 3D physical world. Through extensive experiments, we found that a vast majority of image-space adversaries cannot be explained by adjusting parameters in the physical space, i.e., they are usually physically inauthentic. But it is still possible to successfully attack beyond the image space on the physical space (such that authenticity is enforced), though this is more difficult than image-space attacks, reflected in lower success rates and heavier perturbations required. |
Feasibility of a novel extraperitoneal two-port laparoendoscopic approach for radical prostatectomy: an initial study. | The aim of this study was to describe the surgical technique and to report the early outcomes of an original extraperitoneal two-port laparoendoscopic approach for radical prostatectomy. A total of 22 consecutive patients diagnosed with early-stage prostate cancer (cT1c, cT2N0) were operated on and included in this analysis. A multichannel port with three 5-mm trocars, providing easier instrument handling, was inserted extraperitoneally through a 2.5-cm lower umbilical "U" incision. An additional 12-mm port was inserted into the left fossa to allow an adequate working angle to facilitate the most critical steps of the surgical procedures. The operation was successfully completed in all patients; one patient required an additional 5-mm port to control bleeding. The median operation time was 259 min (range 207-453 min), and the fluid loss, including urine and blood, was 946 mL (range 257-1821 mL). The median Foley catheter indwelling period was 6 days (range 3-11 days) after surgery. No intraoperative complications occurred. Judging from this initial trial, this procedure can be safely carried out if the surgeon is familiar with conventional five-port laparoscopic radical prostatectomy. |
Implicit feedback for inferring user preference: a bibliography | Relevance feedback has a history in information retrieval that dates back well over thirty years (c.f. [SL96]). Relevance feedback is typically used for query expansion during short-term modeling of a user’s immediate information need and for user profiling during long-term modeling of a user’s persistent interests and preferences. Traditional relevance feedback methods require that users explicitly give feedback by, for example, specifying keywords, selecting and marking documents, or answering questions about their interests. Such relevance feedback methods force users to engage in additional activities beyond their normal searching behavior. Since the cost to the user is high and the benefits are not always apparent, it can be difficult to collect the necessary data and the effectiveness of explicit techniques can be limited. |
An Entropy-Based Subspace Clustering Algorithm for Categorical Data | Interest in attribute weighting for soft subspace clustering has been increasing in recent years. However, most of the proposed approaches are designed to deal only with numeric data. In this paper, our focus is on soft subspace clustering for categorical data. In soft subspace clustering, the attribute weighting approach plays a crucial role. For this reason, we propose an entropy-based approach for measuring the relevance of each categorical attribute in each cluster. In addition, we propose EBK-modes (entropy-based k-modes), an extension of the basic k-modes algorithm that uses our approach for attribute weighting. We performed experiments on five real-world datasets, comparing the performance of our algorithm with four state-of-the-art algorithms, using three well-known evaluation metrics: accuracy, f-measure and adjusted Rand index. According to the experiments, EBK-modes outperforms the algorithms considered in the evaluation with respect to these metrics. |
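The abstract does not give the EBK-modes weighting formula, so the following is only an illustrative sketch of the general entropy-based idea under assumed details: within each cluster, every categorical attribute is weighted by a decreasing function of the entropy of its value distribution, so attributes that are nearly constant inside the cluster receive larger weights. The exponential form and the gamma parameter are assumptions, not the paper's definition.

```python
import numpy as np
from collections import Counter

def attribute_entropies(cluster_rows):
    """cluster_rows: list of tuples of categorical values (one tuple per object).
    Returns the Shannon entropy of each attribute's value distribution in the cluster."""
    n_attrs = len(cluster_rows[0])
    ents = []
    for j in range(n_attrs):
        counts = Counter(row[j] for row in cluster_rows)
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        ents.append(-(p * np.log(p)).sum())
    return np.array(ents)

def entropy_weights(cluster_rows, gamma=1.0):
    """Illustrative weighting (assumed form): low-entropy attributes get high weight;
    weights are normalised to sum to 1."""
    e = attribute_entropies(cluster_rows)
    w = np.exp(-gamma * e)
    return w / w.sum()

rows = [("red", "small", "A"), ("red", "large", "A"), ("red", "small", "B")]
print(entropy_weights(rows))   # attribute 0 is constant in this cluster -> largest weight
```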
PAGE: A Partition Aware Engine for Parallel Graph Computation | Graph partition quality affects the overall performance of parallel graph computation systems. The quality of a graph partition is measured by the balance factor and the edge cut ratio. A balanced graph partition with a small edge cut ratio is generally preferred since it reduces the expensive network communication cost. However, according to an empirical study on Giraph, performance over a well-partitioned graph can be as much as two times worse than with simple random partitions. This is because these systems optimize only for simple partition strategies and cannot efficiently handle the increased workload of local message processing when a high-quality graph partition is used. In this paper, we propose a novel partition-aware graph computation engine named PAGE, which is equipped with a new message processor and a dynamic concurrency control model. The new message processor concurrently processes local and remote messages in a unified way. The dynamic model adaptively adjusts the concurrency of the processor based on online statistics. The experimental evaluation demonstrates the superiority of PAGE on graph partitions of various qualities. |
Effects of enalapril in systolic heart failure patients with and without chronic kidney disease: insights from the SOLVD Treatment trial. | BACKGROUND
Angiotensin-converting enzyme inhibitors improve outcomes in systolic heart failure (SHF). However, doubts linger about their effect in SHF patients with chronic kidney disease (CKD).
METHODS
In the Studies of Left Ventricular Dysfunction (SOLVD) Treatment trial, 2569 ambulatory chronic HF patients with left ventricular ejection fraction ≤ 35% and serum creatinine level ≤ 2.5 mg/dl were randomized to receive either placebo (n=1284) or enalapril (n=1285). Of the 2502 patients with baseline serum creatinine data, 1036 had CKD (estimated glomerular filtration rate <60 ml/min/1.73 m²).
RESULTS
Overall, during 35 months of median follow-up, all-cause mortality occurred in 40% (502/1252) and 35% (440/1250) of placebo and enalapril patients, respectively (hazard ratio {HR}, 0.84; 95% confidence interval {CI}, 0.74-0.95; p=0.007). All-cause mortality occurred in 45% and 42% of patients with CKD (HR, 0.88; 95% CI, 0.73-1.06; p=0.164), and in 36% and 31% of non-CKD patients (HR, 0.82; 95% CI, 0.69-0.98; p=0.028) in the placebo and enalapril groups, respectively (p for interaction=0.615). Enalapril reduced cardiovascular hospitalization in those with CKD (HR, 0.77; 95% CI, 0.66-0.90; p<0.001) and without CKD (HR, 0.80; 95% CI, 0.70-0.91; p<0.001). Among patients in the enalapril group, serum creatinine elevation was significantly higher in those without CKD (0.09 versus 0.04 mg/dl in CKD; p=0.003) during the first year of follow-up, but there were no differences in the changes in systolic blood pressure (mean drop, 7 mm Hg, both) or serum potassium (mean increase, 0. /L, both).
CONCLUSIONS
Enalapril reduces mortality and hospitalization in SHF patients without significant heterogeneity between those with and without CKD. |
Interatomic distances in molecules of gaseous alkali metal halides | Summary: One of the possible methods of comparative calculation was used for calculating the interatomic distances d in gaseous alkali halides. The average deviation of the calculated values of d from the experimental values was 0.0012 Å (for 15 substances). |
Zeolitic imidazolate framework membrane with molecular sieving properties by microwave-assisted solvothermal synthesis. | A zeolitic imidazolate framework (ZIF-8), a member of the metal-organic framework family, has been crystallized as a thin porous layer on an asymmetric ceramic support. Hydrogen can be separated from other gases by molecular sieving. |
On inverse kinematics with fractional calculus | In this paper an application of fractional calculus to path control is studied. The integer-order derivative and integral are replaced with their fractional-order counterparts in order to solve the inverse kinematics problem. The proposed algorithm is a modification of an existing one. In order to maintain accuracy and to lower the memory requirements, a history limit and a varying order of derivation are proposed. The impact of these parameters on the generated path is studied. The obtained results show that using fractional calculus may improve well-known methods of solving the inverse kinematics problem. |
Relationships of peripheral IGF-1, VEGF and BDNF levels to exercise-related changes in memory, hippocampal perfusion and volumes in older adults | Animal models point towards a key role of brain-derived neurotrophic factor (BDNF), insulin-like growth factor-I (IGF-I) and vascular endothelial growth factor (VEGF) in mediating exercise-induced structural and functional changes in the hippocampus. Recently, also platelet derived growth factor-C (PDGF-C) has been shown to promote blood vessel growth and neuronal survival. Moreover, reductions of these neurotrophic and angiogenic factors in old age have been related to hippocampal atrophy, decreased vascularization and cognitive decline. In a 3-month aerobic exercise study, forty healthy older humans (60 to 77years) were pseudo-randomly assigned to either an aerobic exercise group (indoor treadmill, n=21) or to a control group (indoor progressive-muscle relaxation/stretching, n=19). As reported recently, we found evidence for fitness-related perfusion changes of the aged human hippocampus that were closely linked to changes in episodic memory function. Here, we test whether peripheral levels of BDNF, IGF-I, VEGF or PDGF-C are related to changes in hippocampal blood flow, volume and memory performance. Growth factor levels were not significantly affected by exercise, and their changes were not related to changes in fitness or perfusion. However, changes in IGF-I levels were positively correlated with hippocampal volume changes (derived by manual volumetry and voxel-based morphometry) and late verbal recall performance, a relationship that seemed to be independent of fitness, perfusion or their changes over time. These preliminary findings link IGF-I levels to hippocampal volume changes and putatively hippocampus-dependent memory changes that seem to occur over time independently of exercise. We discuss methodological shortcomings of our study and potential differences in the temporal dynamics of how IGF-1, VEGF and BDNF may be affected by exercise and to what extent these differences may have led to the negative findings reported here. |
Filling Holes in Complex Surfaces using Volumetric Diffusion | We address the problem of building watertight 3D models from surfaces that contain holes − for example, sets of range scans that observe most but not all of a surface. We specifically address situations in which the holes are too geometrically and topologically complex to fill using triangulation algorithms. Our solution begins by constructing a signed distance function, the zero set of which defines the surface. Initially, this function is defined only in the vicinity of observed surfaces. We then apply a diffusion process to extend this function through the volume until its zero set bridges whatever holes may be present. If additional information is available, such as known-empty regions of space inferred from the lines of sight to a 3D scanner, it can be incorporated into the diffusion process. Our algorithm is simple to implement, is guaranteed to produce manifold noninterpenetrating surfaces, and is efficient to run on large datasets because computation is limited to areas near holes. |
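A toy sketch of the diffusion step on a regular voxel grid, under assumptions that simplify the paper's method: observed voxels hold signed-distance values and are clamped back after every iteration, unobserved voxels are repeatedly replaced by the average of their six face neighbours, and the known-empty-space constraints mentioned in the abstract are omitted.

```python
import numpy as np
from scipy import ndimage

def diffuse_sdf(sdf, observed, iters=200):
    """sdf: 3D array of signed distances (valid only where observed is True).
    observed: boolean mask of voxels near measured surface.
    Returns a field whose zero set bridges holes (illustrative only)."""
    field = np.where(observed, sdf, 0.0)
    # averaging kernel over the six face neighbours
    kernel = np.zeros((3, 3, 3))
    for dx, dy, dz in [(0, 1, 1), (2, 1, 1), (1, 0, 1), (1, 2, 1), (1, 1, 0), (1, 1, 2)]:
        kernel[dx, dy, dz] = 1.0 / 6.0
    for _ in range(iters):
        blurred = ndimage.convolve(field, kernel, mode="nearest")
        field = np.where(observed, sdf, blurred)   # keep observed values fixed
    return field
```

The watertight surface would then be extracted from the zero set of the diffused field, for example with a marching-cubes routine.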
Image-Grounded Conversations: Multimodal Context for Natural Question and Response Generation | [Example table from the paper, flattened in this extract; each column pairs a visual context (image, not reproduced here) with a textual context, a question, and a response.] Example 1 - Textual context: "Oh my gosh, i’m so buying this shirt." Question: "Where did you see this for sale?" Response: "Midwest sports." Example 2 - Textual context: "I found a cawaii bird." Question: "Are you going to collect some feathers?" Response: "There are so many crows here I’d be surprised if I never found one." Example 3 - Textual context: "Stocking up!!" Question: "Ayee! what the prices looking like?" Response: "Only like 10-20% off..I think I’m gonna wait a little longer." Example 4 - Textual context: "Only reason I come to carnival." Question: "Oh my God. How the hell do you even eat that?" Response: "They are the greatest things ever chan. I could eat 5!" |
Neural Networks for Continuous Online Learning and Control | This paper proposes a new hybrid neural network (NN) model that employs a multistage online learning process to solve the distributed control problem with an infinite horizon. Various techniques such as reinforcement learning and evolutionary algorithm are used to design the multistage online learning process. For this paper, the infinite horizon distributed control problem is implemented in the form of real-time distributed traffic signal control for intersections in a large-scale traffic network. The hybrid neural network model is used to design each of the local traffic signal controllers at the respective intersections. As the state of the traffic network changes due to random fluctuation of traffic volumes, the NN-based local controllers will need to adapt to the changing dynamics in order to provide effective traffic signal control and to prevent the traffic network from becoming overcongested. Such a problem is especially challenging if the local controllers are used for an infinite horizon problem where online learning has to take place continuously once the controllers are implemented into the traffic network. A comprehensive simulation model of a section of the Central Business District (CBD) of Singapore has been developed using PARAMICS microscopic simulation program. As the complexity of the simulation increases, results show that the hybrid NN model provides significant improvement in traffic conditions when evaluated against an existing traffic signal control algorithm as well as a new, continuously updated simultaneous perturbation stochastic approximation-based neural network (SPSA-NN). Using the hybrid NN model, the total mean delay of each vehicle has been reduced by 78% and the total mean stoppage time of each vehicle has been reduced by 84% compared to the existing traffic signal control algorithm. This shows the efficacy of the hybrid NN model in solving large-scale traffic signal control problem in a distributed manner. Also, it indicates the possibility of using the hybrid NN model for other applications that are similar in nature as the infinite horizon distributed control problem |
The kaleidoscope of effective gamification: deconstructing gamification in business applications | Developers of gamified business applications face the challenge of creating motivating gameplay strategies and creative design techniques to deliver subject matter not typically associated with games in a playful way. We currently have limited models that frame what makes gamification effective (i.e., engaging people with a business application). Thus, we propose a design-centric model and analysis tool for gamification: The kaleidoscope of effective gamification. We take a look at current models of game design, self-determination theory and the principles of systems design to deconstruct the gamification layer in the design of these applications. Based on the layers of our model, we provide design guidelines for effective gamification of business applications. |
The Evaluation of Question Answering Systems: Lessons Learned from the TREC QA Track | The TREC question answering (QA) track was the first large-scale evaluation of open-domain question answering systems. In addition to successfully fostering research on the QA task, the track has also been used to investigate appropriate evaluation methodologies for question answering systems. This paper gives a brief history of the TREC QA track, motivating the decisions made in its implementation and summarizing the results. The lessons learned from the track will be used to evolve new QA evaluations for both the track and the ARDA AQUAINT program. |
Analysis of surface leaching processes in vitrified high-level nuclear wastes using in-situ raman imaging and atomistic modeling. 1997 annual progress report | The objectives of this report are: (1) To investigate the development of Raman spectroscopy for remote, in-situ, real-time measurement of the processes underlying chemical corrosion of glasses; (2) To conduct Raman spectroscopy measurements and quantum mechanical modelling studies of the transition states, corrosion products, and transition state energies for the hydrate species of higher valence and multivalent ions formed in the reconstructed glass surface; (3) To use these results to model long-term corrosion behavior of complex borosilicate wasteform glasses; (4) To apply the Raman spectroscopy and modelling methods developed here for the remote analysis of leaching processes in waste glasses containing radioactive components, and for imaging of variations in leaching behavior due to composition inhomogeneities in large-scale waste glass products. Results of first-year research: During the first year, the authors primarily addressed Objective (1), which is to develop a methodology for the remote monitoring of leaching processes in glasses by Raman spectroscopy. The authors assembled a micro and macro Raman system for examining surface structure in glass samples, in-situ within the leaching vessel. The Raman spectrometer was prepared for imaging by installing a CCD detector which gives 2-dimensional information. The latter can be used to obtain spectrographic data in one dimension and to scan variations in materials behavior across the other dimension. By scanning the sample in a perpendicular direction, it is possible to conduct 2-dimensional spectral analysis of the sample surface. The plan is to select one or several identifying Raman lines that follow the leaching process. After that, it will be possible to introduce a slit configuration and use both dimensions of the CCD detector for scanning the sample surface while examining only the selected spectral feature. Glass samples consisting of alkali silicates were first examined. The samples were melted in a conventional electric furnace in 100 g batches and poured into brass molds. The samples were cut and polished under water-free conditions and then immersed under water in petri dishes. By leaving the upper surface uncovered, the authors focused the objective lens of a microscope on the submerged glass surface. Light from an Argon laser operating at 514.5 nm with approximately 100 mW of power was directed to the glass surface at a large angle of incidence (55°). Raman spectra were collected in a 0.6 m spectrometer with a 1,200 line/mm grating using a supernotch filter to block the laser line. Typical data are shown in the attached figure. Samples of the following compositions were tested: 20% Na2O-80% SiO2 and 30% Li2O-70% SiO2. The attached figure reports the results from the Li2O-SiO2 glass. The results show a significant difference between the unleached and water-exposed samples. This is evident in the relative amplitudes of the Raman peaks at 436 cm^-1 and 582 cm^-1. These two peaks are associated with vibrations of the alkali-coordinated non-bridging oxygen. In fact, the ratio of the 582 peak to the 436 peak gives a measure of the alkali content of the material.
By examining the sample surface only, and conducting the tests while the sample is submerged, the authors observed a decrease in the 582 cm^-1 peak and an increase in the 436 cm^-1 peak with increased exposure of the sample to the aqueous environment. This is shown in the attached figure and is a clear indication of the progressive dealkalization of the glass surface by water. The peaks can be fitted to Gaussian shapes and the integrated intensity will yield a direct measure of the alkali content of the glass surface. In addition, the angle of incidence of the laser beam can be adjusted to provide more penetration into the sample and yield composition profiles. |
Internal Nasal Valve Incompetence Is Effectively Treated Using Batten Graft Functional Rhinoplasty | Introduction. Internal nasal valve incompetence (INVI) has been treated with various surgical methods. Large, single surgeon case series are lacking, meaning that the evidence supporting a particular technique has been deficient. We present a case series using alar batten grafts to reconstruct the internal nasal valve, all performed by the senior author. Methods. Over a 7-year period, 107 patients with nasal obstruction caused by INVI underwent alar batten grafting. Preoperative assessment included the use of nasal strips to evaluate symptom improvement. Visual analogue scale (VAS) assessment of nasal blockage (NB) and quality of life (QOL) both pre- and postoperatively were performed and analysed with the Wilcoxon signed rank test. Results. Sixty-seven patients responded to both pre- and postoperative questionnaires. Ninety-one percent reported an improvement in NB and 88% an improvement in QOL. The greatest improvement was seen at 6 months (median VAS 15 mm and 88 mm resp., with a P value of <0.05 for both). Nasal strips were used preoperatively and are a useful tool in predicting patient operative success in both NB and QOL (odds ratio 2.15 and 2.58, resp.). Conclusions. Alar batten graft insertion as a single technique is a valid technique in treating INVI and produces good outcomes. |
Estimating Trends in Alligator Populations from Nightlight Survey Data | Nightlight surveys are commonly used to evaluate status and trends of crocodilian populations, but imperfect detection caused by survey- and location-specific factors makes it difficult to draw population inferences accurately from uncorrected data. We used a two-stage hierarchical model comprising population abundance and detection probability to examine recent abundance trends of American alligators (Alligator mississippiensis) in subareas of Everglades wetlands in Florida using nightlight survey data. During 2001–2008, there were declining trends in abundance of small and/or medium sized animals in a majority of subareas, whereas abundance of large sized animals had either demonstrated an increased or unclear trend. For small and large sized class animals, estimated detection probability declined as water depth increased. Detection probability of small animals was much lower than for larger size classes. The declining trend of smaller alligators may reflect a natural population response to the fluctuating environment of Everglades wetlands under modified hydrology. It may have negative implications for the future of alligator populations in this region, particularly if habitat conditions do not favor recruitment of offspring in the near term. Our study provides a foundation to improve inferences made from nightlight surveys of other crocodilian populations. |
Towards Model-based Acceptance Testing for Scrum | In agile processes like Scrum, strong customer involvement demands techniques that facilitate requirements analysis and acceptance testing. Additionally, test automation is crucial, as incremental development and continuous integration require high testing effort. To cope with these challenges, we propose a model-based technique for documenting customer requirements in the form of test models. These can be used by the developers as a requirements specification and by the testers for acceptance testing. The modeling languages we use are lightweight and easy to learn. From the test models, we generate test scripts for FitNesse or Selenium, which are well-established test automation tools in the agile community. |
Effective Focused Crawling Based on Content and Link Structure Analysis | A focused crawler traverses the web selecting pages relevant to a predefined topic and neglecting those outside its concern. While crawling, it is difficult to deal with irrelevant pages and to predict which links lead to quality pages. In this paper, a technique for effective focused crawling is implemented to improve the quality of web navigation. To check the similarity of web pages with respect to the topic keywords, a similarity function is used, and the priorities of extracted out-links are also calculated based on metadata and on the resultant pages generated by the focused crawler. The proposed work also uses a method for traversing the irrelevant pages that are met during crawling in order to improve the coverage of a specific topic. Keywords: focused crawler, metadata, weight table, World-Wide Web, search engine, link ranking. |
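A minimal sketch of a priority-driven focused crawler in the spirit of the abstract, with assumed details: a simple cosine similarity over topic keywords stands in for the paper's similarity function, link priority is an assumed blend of parent-page relevance and anchor-text relevance, and fetching/HTML parsing are passed in as stub functions.

```python
import heapq
import math
from collections import Counter

def cosine_sim(text, topic_keywords):
    """Cosine similarity between a bag of words and a set of topic keywords."""
    words = Counter(text.lower().split())
    dot = sum(words[k] for k in topic_keywords)
    norm = math.sqrt(sum(v * v for v in words.values())) * math.sqrt(len(topic_keywords))
    return dot / norm if norm else 0.0

def focused_crawl(seed_urls, topic_keywords, fetch, extract_links, budget=100, threshold=0.1):
    """fetch(url) -> page text; extract_links(url, text) -> [(link, anchor_text), ...]."""
    frontier = [(-1.0, u) for u in seed_urls]       # max-heap via negated priority
    heapq.heapify(frontier)
    seen, relevant = set(seed_urls), []
    while frontier and budget > 0:
        _, url = heapq.heappop(frontier)
        budget -= 1
        text = fetch(url)
        score = cosine_sim(text, topic_keywords)
        if score >= threshold:
            relevant.append((url, score))
        for link, anchor in extract_links(url, text):
            if link not in seen:
                seen.add(link)
                # assumed priority: blend of parent relevance and anchor-text relevance
                prio = 0.5 * score + 0.5 * cosine_sim(anchor, topic_keywords)
                heapq.heappush(frontier, (-prio, link))
    return relevant
```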
A Clinical Study of 35 Cases of Pincer Nails | BACKGROUND
Pincer nail is a nail deformity characterized by transverse overcurvature of the nail plate. Pincer nail can affect a patient's quality of life due to its chronic, recurrent course; however, there have been no clinical studies on the pincer nail condition in Korean patients.
OBJECTIVE
The purpose of this study was to characterize the clinical findings and treatment of pincer nail. In addition, possible etiological factors were considered, and treatment efficacy was evaluated.
METHODS
The medical records and clinical photographs of 35 patients (12 males, 23 females) who were diagnosed with pincer nail between August 1, 2005 and July 31, 2009 were studied.
RESULTS
Patient age ranged from 10 to 77 (52.09±17.26) years, and there was a predominance of females (23 of 35 patients, F:M=2:1). The mean duration of the disorder was 7.45 years (range 0.25~40); 85% had had pincer nail for at least 1 year. In addition, 40% had a history of previous treatment and recurrence. The common type of pincer nail was seen in 82.8% of patients. The most commonly involved nails were both great toenails. Among the 35 patients, nail grinding was started in 30, and 25 showed clinical improvement with nail grinding. The width index increased and the height index decreased after treatment. The mean follow-up period was 8.42 months (range 1~27), and 7 patients showed recurrence after a mean of 8.8 months (range 2~20). Five patients were treated with nail extraction with matricectomy, and their symptoms resolved immediately. The mean follow-up period was 7.6 months (range 0~19), and recurrence was not observed. Onychomycosis was also present in 37.1% of patients, and itraconazole pulse therapy for 3 months was added.
CONCLUSION
The results of this study demonstrate the clinical features of pincer nail in Korean patients. The findings show that the common type of pincer nail was most common, and nail grinding as a conservative treatment greatly improved pincer nails despite a risk of recurrence. When onychomycosis was also present, oral antifungal therapy added to nail grinding resulted in a more rapid change in nail thickness and clinical improvement. |
LibGuides: Political Science Subject Guide: Public opinion/social research/demographics | Resources for research and further study for Castleton political science faculty and students |
Realistic animation of rigid bodies | The theoretical background and implementation for a computer animation system to model a general class of three dimensional dynamic processes for arbitrary rigid bodies is presented. The simulation of the dynamic interaction among rigid bodies takes into account various physical characteristics such as elasticity, friction, mass, and moment of inertia to produce rolling and sliding contacts. If a set of bodies is statically unstable, the system dynamically drives it toward a stable configuration while obeying the geometric constraints of the system including general non-holonomic constraints. The system also provides a physical environment with which objects animated using more traditional techniques can interact. The degree of interaction is easily controlled by the animator. A computationally efficient method to merge kinematics and dynamics for articulated rigid bodies to produce realistic motion is presented. |
Chinese herbal medicine for infertility with anovulation: a systematic review. | The aim of this systematic review is to assess the effectiveness and safety of Chinese herbal medicine (CHM) in the treatment of anovulation and infertility in women. Eight (8) databases were searched extensively. The Chinese electronic databases included VIP Information, CMCC, and CNKI. The English electronic databases included AMED, CINAHL, Cochrane Library, Embase, and MEDLINE(®). Randomized controlled trials using CHM as the intervention were included in the study selection. The quality of the studies was assessed by the Jadad scale and the criteria of the Cochrane reviewers' handbook. The efficacy of CHM treatment for infertility with anovulation was evaluated by meta-analysis. There were 692 articles retrieved according to the search strategy, and 1659 participants were involved in the 15 studies that satisfied the selection criteria. All the included trials were done in China. Meta-analysis indicated that CHM significantly increased the pregnancy rate (odds ratio [OR] 3.12, 95% confidence interval [CI] 2.50-3.88) and reduced the miscarriage rate (OR 0.2, 95% CI 0.10-0.41) compared to clomiphene. In addition, CHM also increased the ovulation rate (OR 1.55, 95% CI 1.06-2.25) and improved the cervical mucus score (OR 3.82, 95% CI 1.78-8.21) compared to clomiphene, while there were no significant differences between CHM and clomiphene combined with other medicine. CHM is effective in treating infertility with anovulation. Also, no significant adverse effects were identified for the use of CHM in the studies included in this review. However, owing to the low quality of the studies investigated, more randomized controlled trials are needed before an evidence-based recommendation regarding the effectiveness and safety of CHM in the management of infertility with anovulation can be provided. |
Dynamic, dyadic, intersubjective systems: An evolving paradigm for psychoanalysis. | Dynamic systems theory is a source of powerful new metaphors for psychoanalysis. Phenomena such as conflict, transference, resistance, and the unconscious itself are grasped from this perspective as dynamically emergent properties of self-organizing, nonlinear, dyadic, intersubjective systems. The conception of development as evolving and dissolving attractor states of intersubjective systems richly illuminates the processes of pattern formation and change in psychoanalysis. Effective interpretations are seen as perturbations of the therapeutic system that permit new organizing principles to come into being. |
Parallel processing flow models on desktop hardware | Numerical solution of any large, three-dimensional fluid flow problem is a computationally intensive task that typically requires a supercomputer to achieve reasonable execution times. This paper describes an alternative approach, a technique for mapping three-dimensional fluid flow models to low-cost desktop hardware. The approach is shown to deliver exceptional performance. |
Information Theoretic Model Validation for Spectral Clustering | Model validation constitutes a fundamental step in data clustering. The central question is: Which cluster model and how many clusters are most appropriate for a certain application? In this study, we introduce a method for the validation of spectral clustering based upon approximation set coding. In particular, we compare correlation and pairwise clustering to analyze the correlations of temporal gene expression profiles. To evaluate and select clustering models, we calculate their reliable informativeness. Experimental results in the context of gene expression analysis show that pairwise clustering yields superior amounts of reliable information. The analysis results are consistent with the Bayesian Information Criterion (BIC), and exhibit higher generality than BIC. |
Principles of survey research: part 3: constructing a survey instrument | In this article, we discuss how to construct a questionnaire. We point out the need to use any previous research results to reduce the overheads of survey construction. We identify a number of issues to consider when selecting questions, constructing questions, deciding on the type of question and finalizing the format of the questionnaire. |
Paving green passage for emergency vehicle in heavy traffic: Real-time motion planning under the connected and automated vehicles environment | This paper describes a real-time multi-vehicle motion planning (MVMP) algorithm for the emergency vehicle clearance task. To address the inherent limitations of human drivers in perception, communication, and cooperation, we require that the emergency vehicle and the surrounding normal vehicles are connected and automated vehicles (CAVs). The concerned MVMP task is to find cooperative trajectories such that the emergency vehicle can efficiently pass through the normal vehicles ahead. We use an optimal-control based formulation to describe the MVMP problem, which is centralized, straightforward, and complete. For the online solutions, the centralized MVMP formulation is converted into a multi-period and multi-stage version. Concretely, each period consists of two stages: the emergency vehicle and several normal CAVs ahead try to form a regularized platoon via acceleration or deceleration (stage 1); when a regularized platoon is formed, these vehicles act cooperatively to make way for the emergency vehicle until the emergency vehicle becomes the leader in this local platoon (stage 2). When one period finishes, the subsequent period begins immediately. This sequential process continues until the emergency vehicle finally passes through all the normal CAVs. The subproblem at stage 1 is extremely easy because nearly all the challenging nonlinearity gathers only in stage 2; typical solutions to the subproblem at stage 2 can be prepared offline, and then implemented online directly. Through this, our proposed MVMP algorithm avoids heavy online computations and thus runs in real time. |
Is autism curable? | Autism spectrum disorder (ASD) is a heterogeneous neurodevelopmental disorder of multifactorial origin. Today, ASD is generally not curable, although it is treatable to a varying degree to prevent worse outcomes. Some reports indicate the possibility of major improvements or even recovery in ASD. However, these studies are based on scientific shortcomings, and the lack of a clear definition of 'cure' in ASD further compromises interpretation of research findings. The development of animal models and decreasing costs of genome sequencing provide new options for treatment research and individualized medicine in ASD. This article briefly reviews several issues related to the question whether there is recovery from ASD, starting with a short overview of the presumed aetiologies. |
Coupling-Feed Circularly Polarized RFID Tag Antenna Mountable on Metallic Surface | A square-patch passive RFID tag antenna designed for the UHF band is presented in this article. To achieve compact size and circular-polarization (CP) radiation, the square patch is embedded with a cross slot, while an L-shaped open-end microstrip line linked to a tag chip and terminated by a shorting pin is capacitively coupled to the patch. By selecting an appropriate length for the microstrip line and its coupling distance from the radiating element, easy control of the input impedance of the proposed tag antenna is achieved, which leads to excellent impedance matching. The measured 10-dB return-loss bandwidth of the tag antenna is 50 MHz (904-954 MHz), while its 3-dB axial-ratio bandwidth is 6 MHz (922-928 MHz). Further experiments show that the tag antenna can provide a better reading range when mounted on a metallic surface. |
Nachdiplom Lecture: Statistics Meets Optimization Some Comments on Relative Costs | Directly solving the ordinary least squares problem will (in general) require O(nd²) operations. From Table 5.1, the Gaussian sketch does not actually improve upon this scaling for unconstrained problems: when m ≳ d (as is needed in the unconstrained case), computing the sketch SA requires O(nd²) operations as well. If we compute sketches using the JLT, then this cost is reduced to O(nd log(d)), so that we do see some significant savings relative to OLS. There are other strategies, of course. In a statistical setting, in which the rows of (A, y) correspond to distinct samples, it is natural to consider a method based on sample splitting. That is, suppose that we do the following: |
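A small numerical illustration of the comparison above for the unconstrained case: solve the full least-squares problem, then the sketched problem min_x ||S(Ax - y)||^2 with an m x n Gaussian sketch S where m is only of order d. The dimensions and noise level below are arbitrary choices; replacing the Gaussian S with a fast Johnson-Lindenstrauss transform is what would bring the sketching cost down from O(nd^2) toward O(nd log d).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 20000, 50, 400                      # m only needs to be of order d

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
y = A @ x_true + 0.1 * rng.standard_normal(n)

# full ordinary least squares
x_ols, *_ = np.linalg.lstsq(A, y, rcond=None)

# Gaussian sketch: S has i.i.d. N(0, 1/m) entries, then solve the sketched problem
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sk, *_ = np.linalg.lstsq(S @ A, S @ y, rcond=None)

print(np.linalg.norm(x_ols - x_true), np.linalg.norm(x_sk - x_true))
```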
Comparison of two different methods for urethral lengthening in female to male (metoidioplasty) surgery. | INTRODUCTION
Metoidioplasty is one of the variants of phalloplasty in female transsexuals. Urethral lengthening is the most difficult part of this surgery and poses many challenges.
AIM
We evaluated 207 patients who underwent metoidioplasty, aiming to compare two different surgical techniques of urethral lengthening, postoperative results, and complications.
METHODS
The study encompassed a total of 207 patients, aged from 18 to 62 years, who underwent single stage metoidioplasty between September 2002 and July 2011. The procedure included lengthening and straightening of the clitoris, urethral reconstruction, and scrotoplasty with implantation of testicular prostheses. Buccal mucosa graft was used in all cases for dorsal urethral plate formation and joined with one of the two different flaps: I-longitudinal dorsal clitoral skin flap (49 patients) and II-labia minora flap (158 patients).
MAIN OUTCOME MEASUREMENT
Results were analyzed using the Z-test to evaluate the statistical difference between the two approaches. A postoperative questionnaire was also used, which included questions on the functioning and esthetic appearance of the participating subjects as well as overall satisfaction.
RESULTS
The median follow-up was 39 months (range, 12-116 months). The total length of the reconstructed urethra was measured during surgery in both groups. It ranged from 9.1 to 12.3 cm (median 9.5 cm) in group I and from 9.4 to 14.2 cm (median 10.8 cm) in group II. Voiding while standing was significantly better in group II (93%) than in group I (87.82%) (P < 0.05). Urethral fistula occurred in 16 patients across both groups (7.72%). There was a statistically significant difference between the groups, with a lower incidence in group II (5.69%) vs. group I (14.30%) (P < 0.05). Overall satisfaction was noted in 193 patients.
CONCLUSION
Comparison of the two methods for urethral lengthening confirmed combined buccal mucosa graft and labia minora flap as a method of choice for urethroplasty in metoidioplasty, minimizing postoperative complications. |
Mining and summarizing customer reviews | Merchants selling products on the Web often ask their customers to review the products that they have purchased and the associated services. As e-commerce is becoming more and more popular, the number of customer reviews that a product receives grows rapidly. For a popular product, the number of reviews can be in hundreds or even thousands. This makes it difficult for a potential customer to read them to make an informed decision on whether to purchase the product. It also makes it difficult for the manufacturer of the product to keep track and to manage customer opinions. For the manufacturer, there are additional difficulties because many merchant sites may sell the same product and the manufacturer normally produces many kinds of products. In this research, we aim to mine and to summarize all the customer reviews of a product. This summarization task is different from traditional text summarization because we only mine the features of the product on which the customers have expressed their opinions and whether the opinions are positive or negative. We do not summarize the reviews by selecting a subset or rewrite some of the original sentences from the reviews to capture the main points as in the classic text summarization. Our task is performed in three steps: (1) mining product features that have been commented on by customers; (2) identifying opinion sentences in each review and deciding whether each opinion sentence is positive or negative; (3) summarizing the results. This paper proposes several novel techniques to perform these tasks. Our experimental results using reviews of a number of products sold online demonstrate the effectiveness of the techniques. |
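A heavily simplified sketch of the three-step pipeline described above, with assumed shortcuts: frequent non-stopword unigrams stand in for the mined noun-phrase features, tiny seed lexicons stand in for the opinion-word strategy, and the summary is a per-feature count of positive and negative opinion sentences.

```python
from collections import Counter, defaultdict

POS = {"great", "good", "excellent", "amazing", "love"}      # toy seed lexicons (assumed)
NEG = {"bad", "poor", "terrible", "broken", "hate"}
STOP = {"the", "is", "a", "an", "and", "but", "it"}

def summarize(reviews, min_support=2):
    # step 1: candidate product features = frequent non-stopwords
    # (the paper mines frequent noun phrases instead)
    words = Counter(w for r in reviews for w in r.lower().replace(".", " ").split())
    features = {w for w, c in words.items()
                if c >= min_support and w.isalpha() and w not in STOP}
    # steps 2-3: orientation of each opinion sentence, grouped per feature
    summary = defaultdict(lambda: {"positive": 0, "negative": 0})
    for review in reviews:
        for sent in review.split("."):
            toks = set(sent.lower().split())
            score = len(toks & POS) - len(toks & NEG)
            for f in toks & features:
                if score > 0:
                    summary[f]["positive"] += 1
                elif score < 0:
                    summary[f]["negative"] += 1
    return dict(summary)

print(summarize(["The battery life is great. The screen is poor.",
                 "Battery is amazing but screen is broken."]))
```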
Between MDPs and Semi-MDPs: A Framework for Temporal Abstraction in Reinforcement Learning | Learning, planning, and representing knowledge at multiple levels of temporal abstraction are key, longstanding challenges for AI. In this paper we consider how these challenges can be addressed within the mathematical framework of reinforcement learning and Markov decision processes (MDPs). We extend the usual notion of action in this framework to include options, that is, closed-loop policies for taking action over a period of time. Examples of options include picking up an object, going to lunch, and traveling to a distant city, as well as primitive actions such as muscle twitches and joint torques. Overall, we show that options enable temporally abstract knowledge and action to be included in the reinforcement learning framework in a natural and general way. In particular, we show that options may be used interchangeably with primitive actions in planning methods such as dynamic programming and in learning methods such as Q-learning. Formally, a set of options defined over an MDP constitutes a semi-Markov decision process (SMDP), and the theory of SMDPs provides the foundation for the theory of options. However, the most interesting issues concern the interplay between the underlying MDP and the SMDP and are thus beyond SMDP theory. We present results for three such cases: (1) we show that the results of planning with options can be used during execution to interrupt options and thereby perform even better than planned, (2) we introduce new intra-option methods that are able to learn about an option from fragments of its execution, and (3) we propose a notion of subgoal that can be used to improve the options themselves. All of these results have precursors in the existing literature; the contribution of this paper is to establish them in a simpler and more general setting with fewer changes to the existing reinforcement learning framework. In particular, we show that these results can be obtained without committing to (or ruling out) any particular approach to state abstraction, hierarchy, function approximation, or the macroutility problem. |
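A minimal sketch of SMDP Q-learning over options, under assumptions: a generic environment interface (reset() returning a state, step(action) returning state, reward and a done flag), options given as policy/termination/initiation-set triples, and at least one applicable option in every state (for example, primitive actions wrapped as one-step options). The update is the standard Q(s,o) <- Q(s,o) + alpha * (R + gamma^k * max_o' Q(s',o') - Q(s,o)), where R is the discounted reward accumulated over the k steps the option ran.

```python
import random
from collections import defaultdict

class Option:
    def __init__(self, policy, beta, init_set):
        self.policy = policy        # policy(state) -> primitive action
        self.beta = beta            # beta(state) -> termination probability
        self.init_set = init_set    # set of states where the option may start

def smdp_q_learning(env, options, episodes=500, alpha=0.1, gamma=0.99, eps=0.1):
    Q = defaultdict(float)
    for _ in range(episodes):
        s, done = env.reset(), False
        while not done:
            avail = [o for o in options if s in o.init_set]
            o = (random.choice(avail) if random.random() < eps
                 else max(avail, key=lambda opt: Q[(s, opt)]))
            r_total, discount, k, s_o = 0.0, 1.0, 0, s
            while True:                      # execute the option until it terminates
                s_next, r, done = env.step(o.policy(s_o))
                r_total += discount * r
                discount *= gamma
                k += 1
                s_o = s_next
                if done or random.random() < o.beta(s_o):
                    break
            best_next = 0.0 if done else max(
                Q[(s_o, o2)] for o2 in options if s_o in o2.init_set)
            # SMDP Q-learning update over the k-step transition
            Q[(s, o)] += alpha * (r_total + (gamma ** k) * best_next - Q[(s, o)])
            s = s_o
    return Q
```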
A machine learning approach for filtering Monte Carlo noise | The most successful approaches for filtering Monte Carlo noise use feature-based filters (e.g., cross-bilateral and cross non-local means filters) that exploit additional scene features such as world positions and shading normals. However, their main challenge is finding the optimal weights for each feature in the filter to reduce noise but preserve scene detail. In this paper, we observe there is a complex relationship between the noisy scene data and the ideal filter parameters, and propose to learn this relationship using a nonlinear regression model. To do this, we use a multilayer perceptron neural network and combine it with a matching filter during both training and testing. To use our framework, we first train it in an offline process on a set of noisy images of scenes with a variety of distributed effects. Then at run-time, the trained network can be used to drive the filter parameters for new scenes to produce filtered images that approximate the ground truth. We demonstrate that our trained network can generate filtered images in only a few seconds that are superior to previous approaches on a wide range of distributed effects such as depth of field, motion blur, area lighting, glossy reflections, and global illumination. |
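An assumed, toy-scale sketch of the regression idea: per-pixel feature statistics (for example, variances of colour, shading normal and depth) are mapped by a small multilayer perceptron to filter bandwidths. Here scikit-learn's MLPRegressor stands in for the paper's network, the training targets are synthetic, and the cross-bilateral filter that would consume the predicted parameters is not shown.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# synthetic training data: 6 per-pixel feature statistics -> 3 filter bandwidths
X = rng.random((5000, 6))                 # e.g. variances of colour, normal, depth, ...
y = np.stack([0.5 + X[:, 0],              # stand-in "optimal" bandwidths
              1.0 - 0.5 * X[:, 1],
              0.2 + 0.3 * X[:, 2]], axis=1)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
net.fit(X, y)

# at run time: predict bandwidths for each pixel's feature vector, then
# hand them to the cross-bilateral filter (not shown here)
pixel_features = rng.random((1, 6))
print(net.predict(pixel_features))
```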
Stimulating personality: ethical criteria for deep brain stimulation in psychiatric patients and for enhancement purposes. | Within the recent development of brain-machine-interfaces deep brain stimulation (DBS) has become one of the most promising approaches for neuromodulation. After its introduction more than 20 years ago, it has in clinical routine become a successful tool for treating neurological disorders like Parkinson's disease, essential tremor and dystonia. Recent evidence also demonstrates efficacy in improving emotional and cognitive processing in obsessive-compulsive disorder and major depression, thus allowing new treatment options for treatment refractory psychiatric diseases, and even indicating future potential to enhance functioning in healthy subjects. We demonstrate here that DBS is neither intrinsically unethical for psychiatric indications nor for enhancement purposes. To gain normative orientation, the concept of "personality" is not useful--even if a naturalistic notion is employed. As an alternative, the common and widely accepted bioethical criteria of beneficence, non-maleficence, and autonomy allow a clinically applicable, highly differentiated context- and case-sensitive approach. Based on these criteria, an ethical analysis of empirical evidence from both DBS in movement disorders and DBS in psychiatric disease reveals that wide-spread use of DBS for psychiatric indications is currently not legitimated and that the basis for enhancement purposes is even more questionable. Nevertheless, both applications might serve as ethically legitimate, promising purposes in the future. |
Central versus lower body obesity distribution and the association with lower limb physical function and disability. | OBJECTIVE
To determine whether fat distribution in obese adults is significantly associated with decreased function and increased disability.
DESIGN
Cross-sectional epidemiologic analysis.
SETTING
Multicenter, community-based study.
PARTICIPANTS
Multicenter Osteoarthritis Study participants included adults ages 50-79 years at high risk of developing or already possessing knee osteoarthritis. A total of 549 men and 892 women from the Multicenter Osteoarthritis Study who had a body mass index ≥ 30 kg/m² and who underwent dual energy x-ray absorptiometry (DEXA) scans were included in these analyses. Exclusion criteria included bilateral knee replacements, cancer, or other rheumatologic disease.
METHODS
Body fat distribution was determined using baseline DEXA scan data. A ratio of abdominal fat in grams compared with lower limb fat in grams (trunk:lower limb fat ratio) was calculated. Participants were divided into quartiles of trunk:lower limb fat ratio, with highest and lowest quartiles representing central and lower body obesity, respectively. Backward elimination linear regression models stratified by gender were used to analyze statistical differences in function and disability between central and lower body obesity groups.
MAIN OUTCOME MEASURES
Lower limb physical function measures included 20-meter walk time, chair stand time, and peak knee flexion and extension strength. Disability was assessed using the Late Life Function and Disability Index.
RESULTS
Trunk:lower limb fat ratio was not significantly associated with physical function or disability in women or men (P value .167-.972). Total percent body fat (standardized β = -0.1533 and -0.1970 in men and women, respectively) was a better predictor of disability when compared with trunk:lower limb fat ratio (standardized β = 0.0309 and 0.0072).
CONCLUSIONS
Although fat distribution patterns may affect clinical outcomes in other areas, lower limb physical function and disability do not appear to be significantly influenced by the distribution of fat in obese older adults with, or at risk for, knee osteoarthritis. These data do not support differential treatment of functional limitations based on fat distribution. |
Customer Satisfaction and Service Quality to Develop Trust and Positive Word of Mouth in Vocational Education | This study aims to analyze the influence of customer satisfaction and service quality on the development of trust and positive word of mouth, particularly in vocational education. Students in vocational education, as in higher education, are considered internal customers; their satisfaction and their perceptions of service quality therefore have the potential to bridge external satisfaction and customer loyalty. According to previous literature, internal customer satisfaction can be measured through service product, service delivery, and service environment, all of which are significantly related to trust, and trust in turn is significantly related to word of mouth. Meanwhile, communication, productivity, and responsiveness are significant in measuring perceptions of service quality. This study is expected to enrich managerial implications for measuring internal customer satisfaction and service quality with respect to trust and word of mouth in vocational education, so that vocational institutions can continue striving to provide educational services in accordance with customer expectations. |
Multi-plate reconstruction for severe bicondylar tibial plateau fractures of young adults | This study evaluated clinical outcomes and complications following multi-plate reconstruction for severe bicondylar tibial plateau fractures in young adults. Between September 2007 and February 2012, 26 patients with severe bicondylar tibial plateau fractures met the inclusion criteria; they were treated with a multi-plate technique through combined approaches. Patients received an average follow-up of 40.8 (range, 18–64) months, which included anteroposterior and lateral imaging, assessment of postoperative complications, and evaluation of range of motion and stability of the knee. The Rasmussen score was applied for functional and radiological evaluation. Three to five plates were used for reconstruction. No intra-operative complications occurred. Postoperative complications included hardware prominence in four patients and superficial dehiscence of the anterolateral incision in three cases, one of which developed into a deep infection. There was no neurovascular damage and no implant breakage or loosening. Hardware was removed partly or totally in 16 cases. The average Rasmussen score at final follow-up was 27.2 (range, 21–30) points for functional evaluation and 16.4 (range, 14–18) for radiological evaluation. Multi-plate reconstruction is a valid and safe method for treating severe bicondylar tibial plateau fractures in young adults. |
Validation of a preparation for decision making scale. | OBJECTIVE
The Preparation for Decision Making (PrepDM) scale was developed to evaluate decision processes relating to the preparation of patients for decision making and dialoguing with their practitioners. The objective of this study was to evaluate the scale's psychometric properties.
METHODS
From July 2005 to March 2006, after viewing a decision aid prescribed during routine clinical care, patients completed a questionnaire including: demographic information, treatment intention, decisional conflict, decision aid acceptability, and the PrepDM scale.
RESULTS
Four hundred orthopaedic patients completed the questionnaire. The PrepDM scale showed significant correlations with the informed (r=-0.21, p<0.01) and support (r=-0.13, p=0.01) subscales of the Decisional Conflict Scale (DCS), and it discriminated significantly between patients who did and did not find the decision aid helpful (p<0.0001). Alpha coefficients for internal consistency ranged from 0.92 to 0.96. Principal components analysis indicated that the scale is strongly unidimensional, and Item Response Theory analyses demonstrated that all ten scale items function well.
CONCLUSION
The psychometric properties of the PrepDM scale are very good.
PRACTICE IMPLICATIONS
The scale could allow more comprehensive evaluation of interventions designed to prepare patients for shared-decision making encounters regarding complex health care decisions. |
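For readers unfamiliar with the internal-consistency statistics reported above, here is a minimal illustrative sketch (not the study's analysis code) of computing Cronbach's alpha and corrected item-total correlations, assuming responses are stored as a respondents-by-items DataFrame.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def item_total_correlations(items: pd.DataFrame) -> pd.Series:
    """Corrected item-total correlations: each item vs. the sum of the other items."""
    return pd.Series(
        {col: items[col].corr(items.drop(columns=col).sum(axis=1)) for col in items.columns},
        name="item_total_r",
    )
```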
Seeking the Strongest Rigid Detector | The current state-of-the-art solutions for object detection describe each class by a set of models trained on discovered sub-classes (so-called "components"), with each model itself composed of collections of interrelated parts (deformable models). These detectors build upon the now classic histogram of oriented gradients (HOG) + linear SVM combination. In this paper we revisit some of the core assumptions in HOG+SVM and show that by properly designing the feature pooling, feature selection, preprocessing, and training methods, it is possible to reach top quality, at least for pedestrian detection, using a single rigid component. We provide experiments over a large design space that give insights into the design of classifiers, as well as relevant information for practitioners. Our best detector is fully feed-forward, has a single unified architecture, uses only histograms of oriented gradients and colour information in monocular static images, and improves over 23 other methods on the INRIA, ETH, and Caltech-USA datasets, reducing the average miss rate over HOG+SVM by more than 30%. |
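The baseline revisited here is the classic HOG + linear SVM detector with a single rigid template. Below is a minimal illustrative sketch of that baseline (not the paper's improved detector); the crop size, HOG parameters, and function names are assumptions chosen for the example.

```python
import numpy as np
from skimage.feature import hog
from skimage.io import imread
from skimage.transform import resize
from sklearn.svm import LinearSVC

def hog_descriptor(path, size=(128, 64)):
    """HOG descriptor of one fixed-size crop (the single rigid template)."""
    img = resize(imread(path, as_gray=True), size)
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

def train_rigid_detector(pos_paths, neg_paths, C=0.01):
    """Train one linear SVM on pedestrian (positive) vs. background (negative) crops."""
    X = np.array([hog_descriptor(p) for p in pos_paths + neg_paths])
    y = np.array([1] * len(pos_paths) + [0] * len(neg_paths))
    return LinearSVC(C=C).fit(X, y)
```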
The psychophysiology of generalized anxiety disorder: 2. Effects of applied relaxation. | Muscle relaxation therapy assumes that generalized anxiety disorder (GAD) patients lack the ability to relax but can learn this in therapy. We tested this by randomizing 49 GAD patients to 12 weeks of Applied Relaxation (AR) or waiting. Before, during, and after treatment participants underwent relaxation tests. Before treatment, GAD patients were more worried than healthy controls (n=21) and had higher heart rates and lower end-tidal pCO2, but not higher muscle tension (A. Conrad, L. Isaac, & W.T. Roth, 2008). AR resulted in greater symptomatic improvement than waiting. However, 28% of the AR group dropped out of treatment and some patients relapsed at the 6-week follow-up. There was little evidence that AR participants learned to relax in therapy or that a reduction in anxiety was associated with a decrease in activation. We conclude that the clinical effects of AR in improving GAD symptoms are moderate at most and cannot be attributed to reducing muscle tension or autonomic activation. |
Para un glosario de Ricardo Piglia | This essay articulates the notions of utopia (after Nietzsche) and experience (after Benjamin) in order to analyze the narrative strategies of the Argentinean writer Ricardo Piglia. By taking critique itself as a form of autobiographical fiction, the author of Artificial Respiration and Assumed Name offers a new perspective on the dialectic between critique and fiction, in which the reading of texts by others would be no more than the critic's own experience attempting to allegorize itself in those texts. Such an experiencing of the present is invariably worked by Piglia from the point of view of utopia, that is, of politics. |
Review essay: Genre, Activity, and Expertise | Genre Knowledge in Disciplinary Communication: Cognition/Culture/Power. Carol Berkenkotter and Thomas N. Huckin. Hillsdale, NJ: Lawrence Erlbaum, 1995. Academic Literacy and the Nature of Expertise: Reading, Writing, and Knowing in Academic Philosophy. Cheryl Geisler. Hillsdale, NJ: Lawrence Erlbaum, 1994. Constructing Experience. Charles Bazerman. Carbondale, IL: Southern Illinois University Press, 1994. |
Meaning in suffering. | Any book you read, however you come to its sentences, will surely do you some good. Here, though, we want to recommend one book in particular that you need to read: this is what we mean by the meaning of suffering. We will show you the reasons why you should read this book, a precious work written by an experienced author. |
Nuclear Envelope Protein SUN2 Promotes Cyclophilin-A-Dependent Steps of HIV Replication | During the early phase of replication, HIV reverse transcribes its RNA and crosses the nuclear envelope while escaping host antiviral defenses. The host factor Cyclophilin A (CypA) is essential for these steps and binds the HIV capsid; however, the mechanism underlying this effect remains elusive. Here, we identify related capsid mutants in HIV-1, HIV-2, and SIVmac that are restricted by CypA. This antiviral restriction of mutated viruses is conserved across species and prevents nuclear import of the viral cDNA. Importantly, the inner nuclear envelope protein SUN2 is required for the antiviral activity of CypA. We show that wild-type HIV exploits SUN2 in primary CD4+ T cells as an essential host factor that is required for the positive effects of CypA on reverse transcription and infection. Altogether, these results establish essential CypA-dependent functions of SUN2 in HIV infection at the nuclear envelope. |
Dentigerous cyst associated with an ectopic third molar in the maxillary sinus: A case report and review of literature | Dentigerous cysts are the most common type of developmental odontogenic cyst, arising from the crowns of impacted, embedded, or unerupted teeth. They constitute about 20% of all epithelium-lined cysts of the jaws. The teeth most often involved are the mandibular third molars and maxillary canines. About 70% of dentigerous cysts occur in the mandible and 30% in the maxilla. Dentigerous cysts associated with ectopic teeth within the maxillary sinus are fairly rare, and only 20 cases have been reported in Medline since 1980. In the present paper, we report an additional case of a dentigerous cyst associated with an ectopic third molar in the right maxillary sinus. The pathogenesis of the ectopic tooth, the role of advanced imaging, the differential diagnosis, and management are also discussed. |
Discovering Overlapping Groups in Social Media | The increasing popularity of social media is shortening the distance between people. Social activities, e.g., tagging in Flickr, bookmarking in Delicious, twittering in Twitter, etc., are reshaping people's social lives and redefining their social roles. People with shared interests tend to form groups in social media, and users within the same community are likely to exhibit similar social behavior (e.g., watching the same movies, holding similar political viewpoints), which in turn reinforces the community structure. The multiple interactions in social activities entail that community structures are often overlapping, i.e., one person is involved in several communities. We propose a novel co-clustering framework, which takes advantage of networking information between users and tags in social media, to discover these overlapping communities. In our method, users are connected via tags and tags are connected via users. This explicit representation of users and tags is useful for understanding group evolution by looking at who is interested in what. The efficacy of our method is supported by empirical evaluation on both synthetic and online social networking data. |
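As a rough illustration of working with a user-tag matrix, the sketch below applies scikit-learn's SpectralCoclustering to a synthetic user-by-tag count matrix. This stand-in produces disjoint biclusters, whereas the framework described above additionally discovers overlapping communities; all data and parameters here are synthetic assumptions, not the authors' method.

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

# Hypothetical user-by-tag count matrix: entry (u, t) = how often user u used tag t
rng = np.random.default_rng(0)
user_tag = rng.poisson(0.3, size=(200, 50)).astype(float) + 1e-6  # jitter avoids empty rows/cols

# Co-cluster users and tags jointly into 5 groups
model = SpectralCoclustering(n_clusters=5, random_state=0).fit(user_tag)
user_groups = model.row_labels_     # one community label per user
tag_groups = model.column_labels_   # one topic label per tag
print(np.bincount(user_groups), np.bincount(tag_groups))
```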
An iterative model-based approach to cochannel speech separation | Cochannel speech separation aims to separate two speech signals from a single mixture. In a supervised scenario, the identities of two speakers are given, and current methods use pre-trained speaker models for separation. One issue in model-based methods is the mismatch between training and test signal levels. We propose an iterative algorithm to adapt speaker models to match the signal levels in testing. Our algorithm first obtains initial estimates of source signals using unadapted speaker models and then detects the input signal-to-noise ratio (SNR) of the mixture. The input SNR is then used to adapt the speaker models for more accurate estimation. The two steps iterate until convergence. Compared to search-based SNR detection methods, our method is not limited to given SNR levels. Evaluations demonstrate that the iterative procedure converges quickly in a considerable range of SNRs and improves separation results significantly. Comparisons show that the proposed system performs significantly better than related model-based systems. |
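The following structural sketch illustrates the kind of iteration described above: separate with the current speaker models, estimate the input SNR from the source estimates, re-scale (adapt) the models, and repeat until the SNR estimate stabilizes. The separation and adaptation steps are passed in as callables because they depend on the pre-trained speaker models; all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_snr_db(s1, s2, eps=1e-12):
    """Input SNR implied by the current source estimates (in dB)."""
    return 10.0 * np.log10((np.sum(s1 ** 2) + eps) / (np.sum(s2 ** 2) + eps))

def iterative_separation(mixture, model1, model2, separate, adapt,
                         max_iter=10, tol_db=0.1):
    """Alternate separation and model re-scaling until the detected SNR converges.

    separate(mixture, m1, m2) -> (s1, s2): source estimates given speaker models.
    adapt(model, gain_db)     -> model:    speaker model re-scaled to a signal level.
    """
    snr_db = 0.0  # start from unadapted (equal-level) models
    for _ in range(max_iter):
        m1, m2 = adapt(model1, snr_db / 2), adapt(model2, -snr_db / 2)
        s1, s2 = separate(mixture, m1, m2)
        new_snr_db = estimate_snr_db(s1, s2)
        if abs(new_snr_db - snr_db) < tol_db:
            snr_db = new_snr_db
            break
        snr_db = new_snr_db
    return s1, s2, snr_db
```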