title | abstract |
---|---|
Decreasing adrenergic or sympathetic hyperactivity after severe traumatic brain injury using propranolol and clonidine (DASH After TBI Study): study protocol for a randomized controlled trial | BACKGROUND
Severe TBI, defined as a Glasgow Coma Scale score ≤ 8, increases intracranial pressure and activates the sympathetic nervous system. Sympathetic hyperactivity after TBI manifests as catecholamine excess, hypertension, abnormal heart rate variability, and agitation, and is associated with poor neuropsychological outcome. Propranolol and clonidine are centrally acting drugs that may decrease sympathetic outflow, brain edema, and agitation. However, no prospective randomized evidence is available demonstrating the feasibility, outcome benefits, and safety of adrenergic blockade after TBI.
METHODS/DESIGN
The DASH after TBI study is an actively accruing, single-center, randomized, double-blinded, placebo-controlled, two-arm trial in which one group receives the centrally acting sympatholytic drugs propranolol (1 mg intravenously every 6 h for 7 days) and clonidine (0.1 mg per tube every 12 h for 7 days), and the other group receives double placebo, started within 48 h of severe TBI. The study uses a weighted adaptive minimization randomization with categories of age and Marshall head CT classification. Feasibility will be assessed by the ability to provide a neuroradiology read for randomization, by treatment contamination, and by treatment compliance. The primary endpoint is reduction in the plasma norepinephrine level as measured on day 8. Secondary endpoints include comprehensive plasma and urine catecholamine levels, heart rate variability, arrhythmia occurrence, infections, agitation measures using the Richmond Agitation-Sedation Scale and Agitated Behavior Scale, medication use (anti-hypertensive, sedative, analgesic, and antipsychotic), coma-free days, ventilator-free days, length of stay, and mortality. Neuropsychological outcomes will be measured at hospital discharge and at 3 and 12 months. The domains tested will include global executive function, memory, processing speed, visual-spatial ability, and behavior. Other assessments include the Extended Glasgow Outcome Scale and the Quality of Life after Brain Injury scale. Safety parameters evaluated will include cardiac complications.
DISCUSSION
The DASH After TBI Study is the first randomized, double-blinded, placebo-controlled trial powered to determine feasibility and investigate safety and outcomes associated with adrenergic blockade in patients with severe TBI. If the study results in positive trends, this could provide pilot evidence for a larger multicenter randomized clinical trial. If there is no effect of therapy, this trial would still provide a robust prospective description of sympathetic hyperactivity after TBI.
TRIAL REGISTRATION
ClinicalTrials.gov NCT01322048. |
Semisupervised Dimensionality Reduction and Classification Through Virtual Label Regression | Semisupervised dimensionality reduction has attracted much attention because it not only utilizes labeled and unlabeled data simultaneously, but also handles out-of-sample data well. This paper proposes an effective approach to semisupervised dimensionality reduction through label propagation and label regression. Unlike previous efforts, the new approach propagates label information from labeled to unlabeled data with a well-designed random-walk mechanism, in which outliers are effectively detected and the resulting virtual labels of unlabeled data are encoded in a weighted regression model. These virtual labels are then regressed with a linear model to compute the projection matrix for dimensionality reduction. By this means, when the manifold or clustering assumption of the data is satisfied, the labels of labeled data can be correctly propagated to the unlabeled data; thus, the proposed approach utilizes the labeled and unlabeled data more effectively than previous work. Experiments are carried out on several databases, and the advantage of the new approach is well demonstrated. |
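The abstract describes a two-stage pipeline: propagate soft ("virtual") labels from labeled to unlabeled points over a similarity graph, then fit a linear projection by regressing those virtual labels. The paper's exact random-walk construction and outlier weighting are not reproduced here; the sketch below uses an assumed RBF affinity, a generic normalized-graph propagation, and a plain ridge regularizer purely for illustration.

```python
import numpy as np

def rbf_affinity(X, sigma=1.0):
    # Pairwise RBF affinities used as graph / random-walk weights (assumed kernel).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def propagate_labels(X, y, labeled_idx, n_classes, alpha=0.99, n_iter=50):
    # Generic normalized label propagation; stands in for the paper's random-walk scheme.
    W = rbf_affinity(X)
    d = W.sum(1)
    S = W / np.sqrt(np.outer(d, d))            # symmetrically normalized weights
    Y = np.zeros((X.shape[0], n_classes))
    Y[labeled_idx, y[labeled_idx]] = 1.0        # clamp the known labels
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y
    return F                                    # soft "virtual labels" for all points

def fit_projection(X, F, reg=1e-2):
    # Ridge-regress virtual labels onto features to obtain the linear projection.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ F)

# toy usage
X = np.random.randn(100, 10)
y = np.random.randint(0, 3, 100)
F = propagate_labels(X, y, labeled_idx=np.arange(10), n_classes=3)
Z = X @ fit_projection(X, F)                    # low-dimensional embedding
```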
International Law, International Relations and Compliance | Commitments are a persistent feature of international affairs. Disagreement over the effect of international commitments and the causes of compliance with them is equally persistent. Yet in the last decade the long-standing divide between those who believed that international rules per se shaped state behavior and those who saw such rules as epiphenomenal or insignificant has given way to a more nuanced and complex debate. Regime theory, originally focused on the creation and persistence of regimes, increasingly emphasizes variations in regimes and in their impact on behavior. The legal quality of regime rules is one important source of regime variation. At the same time, the proliferation and evolution of international legal agreements, organizations and judicial bodies in the wake of the Cold War has provided the empirical predicate and a policy imperative for heightened attention to the role of international law. Across many issue-areas, the use of law to structure world politics seems to be increasing. This phenomenon of legalization raises several questions. What factors explain the choice to create and use international law? If law is a tool or method to organize interaction, how does it work? Does the use of international law make a difference to how states or domestic actors behave? These questions are increasingly of interest to IR theorists and policy-makers alike. The core issue is the impact of law and legalization on state behavior, often understood in terms of compliance. While the distinction should not be overstated, legal rules and institutions presume compliance in a way that non-legal rules and institutions do not. Law and compliance are conceptually linked because law explicitly aims to produce compliance with its rules: legal rules set the standard by which compliance is gauged. Explanations of why and when states comply with international law can help account for the turn to law as a positive phenomenon, but they also provide critical policy guidance for the design of new institutions and agreements. This chapter surveys the study of compliance in both the international relations (IR) and international law (IL) literature. In many ways, the compliance literature is a microcosm of developments in both fields, and particularly of the rapprochement between them. For IR scholars interested in reviving the study of international law in their discipline, it was a natural step to focus first on questions of whether, when and how law 'mattered' to state behavior. For international lawyers eager to use IR theory to … |
Common carotid artery intima-media thickness: the Cardiovascular Risk Factor Multiple Evaluation in Latin America (CARMELA) study results. | BACKGROUND
Measurement of far wall common carotid artery intima-media thickness (CCAIMT) has emerged as a predictor of incident cardiovascular events. The Cardiovascular Risk Factor Multiple Evaluation in Latin America (CARMELA) study was the first large-scale population-based assessment of both CCAIMT and cardiovascular risk factor prevalence in 7 Latin American cities; the relationship between CCAIMT and cardiovascular risk markers was assessed in these urban Latin American centers.
METHODS
CARMELA was a cross-sectional, population-based, observational study using stratified, multistage sampling. The participants completed a questionnaire, were evaluated in a clinical visit and underwent carotid ultrasonography. Clinical measurements were obtained by health personnel trained, certified and supervised by CARMELA investigators. Mannheim intima-media thickness consensus guidelines were followed for measurement of CCAIMT.
RESULTS
In all cities and for both sexes, CCAIMT increased with higher age. CCAIMT was greater in the presence of cardiovascular risk factors than in their absence. In all cities, there was a statistically significant linear trend between increasing CCAIMT and a growing number of cardiovascular risk factors (p < 0.001). After adjustment for age and sex, metabolic syndrome was strongly associated with increased CCAIMT (p < 0.001 in all cities), as were hypercholesterolemia, obesity and diabetes (p < 0.001 in most cities). By multivariate analysis, hypertension was independently associated with an increase in CCAIMT in all cities (p < 0.01).
CONCLUSIONS
CARMELA was the first large-scale population study to provide normal CCAIMT values according to age and sex in urban Latin American populations and to show CCAIMT increases in the presence of cardiovascular risk factors and metabolic syndrome. |
Melanoma Skin Cancer Detection by Segmentation and Feature Extraction using combination of OTSU and STOLZ Algorithm Technique | Skin cancer exists in different forms, such as melanoma and basal and squamous cell carcinoma, among which melanoma is the most dangerous and unpredictable. In this paper, we implement an image processing technique in MATLAB for the detection of melanoma skin cancer. The input to the system is a skin lesion image, which first undergoes pre-processing steps such as conversion of the RGB image to grayscale and noise removal. Otsu thresholding is then used to segment the image, followed by extraction of the features Asymmetry, Border Irregularity, Color and Diameter (ABCD), from which the Total Dermatoscopy Score (TDS) is calculated. The TDS determines the presence of melanoma skin cancer by classifying the lesion as benign, suspicious or highly suspicious. |
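The abstract does not spell out the TDS formula; the commonly published Stolz ABCD weighting and cut-offs are sketched below, with D taken as the diameter-style score this paper uses. The weights and thresholds are the textbook values, not values taken from this paper.

```python
def total_dermatoscopy_score(asymmetry, border, colors, diameter):
    # Textbook ABCD-rule weights: A in 0-2, B in 0-8, C in 1-6, D scored 1-5.
    tds = 1.3 * asymmetry + 0.1 * border + 0.5 * colors + 0.5 * diameter
    if tds < 4.75:
        label = "benign"
    elif tds <= 5.45:
        label = "suspicious"
    else:
        label = "highly suspicious"
    return tds, label

print(total_dermatoscopy_score(asymmetry=2, border=6, colors=4, diameter=4))
# -> (7.2, 'highly suspicious')
```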
Understanding women's attitudes towards wife beating in Zimbabwe. | OBJECTIVE
To investigate the factors associated with attitudes towards wife beating among women in partnerships in Zimbabwe in order to assist public health practitioners in preventing intimate partner violence (IPV).
METHODS
A nationally representative survey of 5907 women of reproductive age (15-49 years) was conducted in Zimbabwe. Women were asked about their attitudes towards wife beating in five situations. The survey included sociodemographic characteristics, partnership characteristics, and household decision-making.
FINDINGS
Over half of all women in Zimbabwe (53%) believed that wife beating was justified in at least one of the five situations. Respondents were most likely to find wife beating justified if a wife argued with her spouse (36%), neglected her children (33%), or went out without telling her spouse (30%). Among women in partnerships (n=3077), younger age, living in rural areas, lower household wealth, schooling at a lower level than secondary, and lower occupational status were associated with women reporting that wife beating is justified. Women who reported that they make household decisions jointly with their partners were less likely to say that wife beating is justified.
CONCLUSIONS
Zimbabwe has a long way to go in preventing IPV, particularly because the younger generation of women is significantly more likely to believe that wife beating is justified compared with older women. Given the current social and political climate in Zimbabwe, finding means to negotiate rather than settle conflict through violence is essential from the household level to the national level. |
Non-invasive measurement of blood flow using magnetic disturbance method | Current laser Doppler methods of blood flow sensing require optical contact with the skin, tend to be bulky, and have performance that is susceptible to body fluids (e.g. blood, perspiration) and environmental contaminants (e.g. mud, water). This paper proposes a novel method for noninvasive acquisition of blood flow by measuring the magnetic disturbance created by blood flowing through a localized magnetic field. The proposed system employs a GMR-based magnetic sensor and a magnet of 3 mm radius, placed on a major blood vessel. The magnetic field generated by the magnet acts both as the biasing field for the sensor and as the uniform magnetic flux disturbed by the blood flow. As such, the system is compact, operates at room temperature and is able to sense through clothing. The signals acquired from the magnetic and optical methods are compared using the post-occlusive reactive hyperaemia test, where measurement results on 6 different healthy subjects are found to have an error of less than 5%, showing the successful use of the magnetic method to measure blood flow. |
Fast Parallel Sorting Algorithms on GPUs | This paper presents a comparative analysis of three widely used parallel sorting algorithms: odd-even sort, rank sort and bitonic sort, in terms of sorting rate, sorting time and speed-up on CPU and different GPU architectures. Alongside, we have implemented a novel parallel algorithm, the min-max butterfly network, for finding the minimum and maximum in large data sets. All algorithms have been implemented using the data-parallel model, for achieving high performance, as available on multi-core GPUs using the OpenCL specification. Our results show a minimum speed-up of 19x for bitonic sort against the odd-even sorting technique for small queue sizes on CPU, and a maximum speed-up of 2300x for very large queue sizes on the Nvidia Quadro 6000 GPU architecture. Our implementation of full-butterfly network sorting results in relatively better performance than all three sorting techniques: bitonic, odd-even and rank sort. For the min-max butterfly network, our findings report a high speed-up on the Nvidia Quadro 6000 GPU for data set sizes reaching 2^24 with much lower sorting time. |
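As a reference for the compare-exchange pattern that OpenCL kernels parallelize, a serial Python version of the bitonic network is sketched below; this is the standard algorithm rather than the paper's kernel code, and the input length must be a power of two.

```python
def bitonic_sort(a):
    # Serial reference of the bitonic sorting network; each inner pass of
    # independent compare-exchanges is what a GPU kernel runs across work-items.
    n = len(a)
    assert n & (n - 1) == 0, "length must be a power of two"
    k = 2
    while k <= n:              # size of the bitonic sequences being merged
        j = k // 2
        while j > 0:           # compare-exchange stride within this merge step
            for i in range(n):
                partner = i ^ j
                if partner > i:
                    ascending = (i & k) == 0
                    if (a[i] > a[partner]) == ascending:
                        a[i], a[partner] = a[partner], a[i]
            j //= 2
        k *= 2
    return a

print(bitonic_sort([7, 3, 9, 1, 5, 2, 8, 6]))   # [1, 2, 3, 5, 6, 7, 8, 9]
```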
Data Mining Approaches for Intrusion Detection | In this paper we discuss our research in developing general and systematic methods for intrusion detection. The key ideas are to use data mining techniques to discover consistent and useful patterns of system features that describe program and user behavior, and use the set of relevant system features to compute (inductively learned) classifiers that can recognize anomalies and known intrusions. Using experiments on the sendmail system call data and the network tcpdump data, we demonstrate that we can construct concise and accurate classifiers to detect anomalies. We provide an overview on two general data mining algorithms that we have implemented: the association rules algorithm and the frequent episodes algorithm. These algorithms can be used to compute the intra- and inter-audit record patterns, which are essential in describing program or user behavior. The discovered patterns can guide the audit data gathering process and facilitate feature selection. To meet the challenges of both efficient learning (mining) and real-time detection, we propose an agent-based architecture for intrusion detection systems where the learning agents continuously compute and provide the updated (detection) models to the detection agents. |
Labeled Faces in the Wild: A Survey | In 2007, Labeled Faces in the Wild was released in an effort to spur research in face recognition, specifically for the problem of face verification with unconstrained images. Since that time, more than 50 papers have been published that improve upon this benchmark in some respect. A remarkably wide variety of innovative methods have been developed to overcome the challenges presented in this database. As performance on some aspects of the benchmark approaches 100% accuracy, it seems appropriate to review this progress, derive what general principles we can from these works, and identify key future challenges in face recognition. In this survey, we review the contributions to LFW for which the authors have provided results to the curators (results found on the LFW results web page). We also review the cross cutting topic of alignment and how it is used in various methods. We end with a brief discussion of recent databases designed to challenge the next generation of face recognition algorithms. |
Advances in Intelligent and Soft Computing: Potential Application of Service Science in Engineering | This paper discusses the potential of the emerging field of service science for engineering applications. First, the definition and classification of service for the engineering discipline are detailed and elaborated. Based on that, the paper focuses on the potential application of service science in the construction industry, namely Building Information Modeling (BIM). An elaborated discussion of the service value of BIM leads to suggestions for further research in specific areas, namely the interaction between experts from the world's major BIM players in order to improve the implementation of BIM. |
Maternal outcomes at 2 years after planned cesarean section versus planned vaginal birth for breech presentation at term: the international randomized Term Breech Trial. | OBJECTIVE
This study was undertaken to compare maternal outcomes at 2 years postpartum after planned cesarean section and planned vaginal birth for the singleton fetus in breech presentation at term.
STUDY DESIGN
In selected centers in the Term Breech Trial, mothers completed a structured questionnaire at 2 or more years postpartum to determine their health in the previous 3 to 6 months.
RESULTS
A total of 917 of 1159 (79.1%) mothers from 85 centers completed a follow-up questionnaire at 2 years postpartum. There were no differences between groups in breast feeding, relationship with child or partner, pain, subsequent pregnancy, incontinence, depression, urinary, menstrual or sexual problems, fatigue, or distressing memories of the birth experience. Planned cesarean section was associated with a higher risk of constipation (P = .02).
CONCLUSION
Maternal outcomes at 2 years postpartum are similar after planned cesarean section and planned vaginal birth for the singleton breech fetus at term. |
An approach for automatic sleep apnea detection based on entropy of multi-band EEG signal | Sleep apnea is a very common sleep disorder affecting a large number of people all over the world. Electroencephalography (EEG) signal analysis is an important process that enables neurologists and sleep specialists to diagnose and monitor sleep apnea events. In view of exploiting the variation in random characteristics of multi-band EEG data between apnea and non-apnea events, in this paper, an entropy-based feature extraction scheme is proposed. It is shown that the proposed feature set, extracted from five different band-limited EEG signals, offers satisfactory feature quality in terms of standard performance criteria, such as the geometric separability index. For the purpose of classification, a K-nearest neighbor (KNN) classifier is used. The proposed method is tested on several subjects taken from the publicly available PhysioNet database. It is found that the proposed method offers superior classification performance with lower feature dimension in comparison to existing methods, in terms of sensitivity, specificity and accuracy. |
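A rough sketch of the kind of pipeline the abstract describes: band-limit each EEG epoch into five sub-bands, compute one entropy value per band, and classify the five-dimensional feature vectors with KNN. The band edges, sampling rate, and histogram-based Shannon entropy below are assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.neighbors import KNeighborsClassifier

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}          # assumed band edges (Hz)

def band_entropy_features(epoch, fs=100, n_bins=32):
    # One Shannon-entropy value per band-limited version of the EEG epoch.
    feats = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        x = filtfilt(b, a, epoch)
        hist, _ = np.histogram(x, bins=n_bins)
        p = hist / hist.sum()
        p = p[p > 0]
        feats.append(-(p * np.log2(p)).sum())
    return np.array(feats)

# toy usage on random epochs (replace with apnea / non-apnea EEG epochs)
rng = np.random.default_rng(0)
X = np.array([band_entropy_features(rng.standard_normal(3000)) for _ in range(40)])
y = rng.integers(0, 2, 40)
clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.score(X, y))
```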
Salience Estimation via Variational Auto-Encoders for Multi-Document Summarization | We propose a new unsupervised sentence salience framework for Multi-Document Summarization (MDS), which can be divided into two components: latent semantic modeling and salience estimation. For latent semantic modeling, a neural generative model called Variational Auto-Encoders (VAEs) is employed to describe the observed sentences and the corresponding latent semantic representations. Neural variational inference is used for the posterior inference of the latent variables. For salience estimation, we propose an unsupervised data reconstruction framework, which jointly considers the reconstruction for latent semantic space and observed term vector space. Therefore, we can capture the salience of sentences from these two different and complementary vector spaces. Thereafter, the VAEs-based latent semantic model is integrated into the sentence salience estimation component in a unified fashion, and the whole framework can be trained jointly by back-propagation via multi-task learning. Experimental results on the benchmark datasets DUC and TAC show that our framework achieves better performance than the state-of-the-art models. |
Detection and Recognition of Traffic Signs from Road Scene Images | Automatic detection and recognition of road signs is an important component of automated driver assistance systems, contributing to the safety of drivers, pedestrians and vehicles. Despite significant research, the problem of detecting and recognizing road signs remains challenging due to varying lighting conditions, complex backgrounds and different viewing angles. We present an effective and efficient method for detection and recognition of traffic signs from images. Detection is carried out by performing color-based segmentation followed by application of the Hough transform to find circles, triangles or rectangles. Recognition is carried out using three state-of-the-art feature matching techniques: SIFT, SURF and BRISK. The proposed system, evaluated on a custom-developed dataset, reported promising detection and recognition results. A comparative analysis of the three descriptors reveals that while SIFT achieves the best recognition rates, BRISK is the most efficient of the three descriptors in terms of computation time. |
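A minimal OpenCV sketch of the two stages for circular red signs: HSV color segmentation followed by a Hough circle transform for detection, and BRISK keypoint matching for recognition. All thresholds, color ranges, and Hough parameters are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def detect_red_circles(bgr):
    # Rough red-color segmentation in HSV, then Hough transform for circular signs.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 100, 80), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 100, 80), (180, 255, 255))
    mask = cv2.medianBlur(mask, 5)
    circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30, minRadius=10, maxRadius=120)
    return [] if circles is None else np.round(circles[0]).astype(int)

def match_sign(candidate_gray, template_gray):
    # BRISK keypoint matching between a detected candidate and a reference sign.
    brisk = cv2.BRISK_create()
    kp1, des1 = brisk.detectAndCompute(candidate_gray, None)
    kp2, des2 = brisk.detectAndCompute(template_gray, None)
    if des1 is None or des2 is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return len(matcher.match(des1, des2))   # more matches => more likely this sign
```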
Ultralow-power electronics for biomedical applications. | The electronics of a general biomedical device consist of energy delivery, analog-to-digital conversion, signal processing, and communication subsystems. Each of these blocks must be designed for minimum energy consumption. Specific design techniques, such as aggressive voltage scaling, dynamic power-performance management, and energy-efficient signaling, must be employed to adhere to the stringent energy constraint. The constraint itself is set by the energy source, so energy harvesting holds tremendous promise toward enabling sophisticated systems without straining user lifestyle. Further, once harvested, efficient delivery of the low-energy levels, as well as robust operation in the aggressive low-power modes, requires careful understanding and treatment of the specific design limitations that dominate this realm. We outline the performance and power constraints of biomedical devices, and present circuit techniques to achieve complete systems operating down to power levels of microwatts. In all cases, approaches that leverage advanced technology trends are emphasized. |
The Axiom Scheme of Acyclic Comprehension | A “new” criterion for set existence is presented, namely, that a set {x | φ} should exist if the multigraph whose nodes are variables in φ and whose edges are occurrences of atomic formulas in φ is acyclic. Formulas with acyclic graphs are stratified in the sense of New Foundations, so consistency of the set theory with weak extensionality and acyclic comprehension follows from the consistency of Jensen’s system NFU. It is much less obvious, but turns out to be the case, that this theory is equivalent to NFU: it appears at first blush that it ought to be weaker. This paper verifies that acyclic comprehension and stratified comprehension are equivalent, by verifying that each axiom in a finite axiomatization of stratified comprehension follows from acyclic comprehension. Keywords: New Foundations, NFU, stratification, acyclic. AMS Subject Classification Code: 03E70. The first author, who is a neurologist with an amateur interest in mathematical logic, proposed the criterion of acyclic comprehension for existence of sets (originally under another name) as an approach to the historical paradoxes of set theory, and communicated this to the second author. The second author noted that stratified comprehension implies acyclic comprehension, so the scheme of acyclic comprehension is certainly consistent relative to quite weak accepted theories (as the scheme of stratified comprehension is a subtheory of NFU, which was shown to be consistent by R. B. Jensen in [6]). The second author also conjectured that the scheme was very weak (meaning not equivalent to full stratified comprehension). The first author realized that one could attack this problem by attempting to prove all propositions in a finite axiomatization of stratified comprehension (that stratified comprehension is finitely axiomatizable was originally shown in [4], though the axiomatization given there is very unpleasant to work with). Undaunted by the skepticism of the second author, he proceeded to prove that each of the axioms of the finite axiomatization used in the second author’s [5] (adapted to the Wiener ordered pair of [9]) follows from acyclic comprehension. |
Mechanisms and quantification of adsorption of three anti-inflammatory pharmaceuticals onto goethite with/without surface-bound organic acids. | Non-steroidal anti-inflammatory drugs (NSAIDs) are now often detected in surface water and groundwater. In this study, the effects of environmental factors, i.e., solution pH, ionic strength, temperature and surface-bound organic acids, on the bonding of three typical NSAIDs (ketoprofen, naproxen and diclofenac) onto goethite were systematically investigated. Column chromatography, batch experiments, attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopy and surface complexation modeling were used to probe the adsorption mechanisms. Bonding of the three NSAIDs onto goethite was fully reversible, ionic strength-dependent and endothermic (adsorption enthalpy 2.86-9.75 kJ/mol). This evidence supports an H-bonding mechanism, which was further explained by ATR-FTIR observations and a triple planes model. Surface-bound organic acids (phthalic acid, trimellitic acid and pyromellitic acid) bound by inner-sphere complexation with goethite were difficult to desorb. Surface-bound phthalic acid increased the uptake of NSAIDs, but surface-bound trimellitic acid and pyromellitic acid reduced their adsorption. The reason is that adsorbed phthalic acid results in a more hydrophobic surface, while adsorbed trimellitic acid and pyromellitic acid increase the surface negative charge and polarity. Finally, adsorption of NSAIDs onto goethite with/without surface-bound organic acids was well described by a free energy model, in which the contributions of interactions (e.g., H-bonding and van der Waals) were evaluated. |
Handover Control in Wireless Systems via Asynchronous Multiuser Deep Reinforcement Learning | In this paper, we propose a two-layer framework to learn optimal handover (HO) controllers in possibly large-scale wireless systems supporting mobile Internet-of-Things users or traditional cellular users, where the user mobility patterns may be heterogeneous. In particular, our proposed framework first partitions the user equipments (UEs) with different mobility patterns into clusters, where the mobility patterns are similar within the same cluster. Then, within each cluster, an asynchronous multiuser deep reinforcement learning (RL) scheme is developed to control the HO processes across the UEs in each cluster, with the goal of lowering the HO rate while ensuring a certain system throughput. In this scheme, we use a deep neural network (DNN) as the HO controller, learned by each UE via RL in a collaborative fashion. Moreover, we use supervised learning to initialize the DNN controller before the execution of RL, to exploit what is already known from traditional HO schemes and to mitigate the negative effects of random exploration at the initial stage. Furthermore, we show that the adopted global-parameter-based asynchronous framework enables us to train faster with more UEs, which nicely addresses the scalability issue of supporting large systems. Finally, simulation results demonstrate that the proposed framework can achieve better performance than state-of-the-art online schemes in terms of HO rates. |
Android anti-forensics through a local paradigm | Mobile devices are among the most disruptive technologies of recent years, gaining ever more diffusion and success in the daily life of a wide range of user categories. Unfortunately, while the number of mobile devices implicated in criminal activities is relevant and growing, the capability to perform forensic analysis of such devices is limited by both technological and methodological problems. In this paper, we focus on anti-forensic techniques applied to mobile devices, presenting some fully automated instances of such techniques for Android devices. Furthermore, we tested the effectiveness of such techniques against both cursory examination of the device and some acquisition tools. |
A randomized clinical trial of a brief hypnosis intervention to control side effects in breast surgery patients. | BACKGROUND
Breast cancer surgery is associated with side effects, including postsurgical pain, nausea, and fatigue. We carried out a randomized clinical trial to test the hypotheses that a brief presurgery hypnosis intervention would decrease intraoperative anesthesia and analgesic use and side effects associated with breast cancer surgery and that it would be cost effective.
METHODS
We randomly assigned 200 patients who were scheduled to undergo excisional breast biopsy or lumpectomy (mean age 48.5 years) to a 15-minute presurgery hypnosis session conducted by a psychologist or nondirective empathic listening (attention control). Patients were not blinded to group assignment. Intraoperative anesthesia use (i.e., of the analgesics lidocaine and fentanyl and the sedatives propofol and midazolam) was assessed. Patient-reported pain and other side effects as measured on a visual analog scale (0-100) were assessed at discharge, as was use of analgesics in the recovery room. Institutional costs and time in the operating room were assessed via chart review.
RESULTS
Patients in the hypnosis group required less propofol (means = 64.01 versus 96.64 microg; difference = 32.63; 95% confidence interval [CI] = 3.95 to 61.30) and lidocaine (means = 24.23 versus 31.09 mL; difference = 6.86; 95% CI = 3.05 to 10.68) than patients in the control group. Patients in the hypnosis group also reported less pain intensity (means = 22.43 versus 47.83; difference = 25.40; 95% CI = 17.56 to 33.25), pain unpleasantness (means = 21.19 versus 39.05; difference = 17.86; 95% CI = 9.92 to 25.80), nausea (means = 6.57 versus 25.49; difference = 18.92; 95% CI = 12.98 to 24.87), fatigue (means = 29.47 versus 54.20; difference = 24.73; 95% CI = 16.64 to 32.83), discomfort (means = 23.01 versus 43.20; difference = 20.19; 95% CI = 12.36 to 28.02), and emotional upset (means = 8.67 versus 33.46; difference = 24.79; 95% CI = 18.56 to 31.03). No statistically significant differences were seen in the use of fentanyl, midazolam, or recovery room analgesics. Institutional costs for surgical breast cancer procedures were $8561 per patient at Mount Sinai School of Medicine. Patients in the hypnosis group cost the institution $772.71 less per patient than those in the control group (95% CI = 75.10 to 1469.89), mainly due to reduced surgical time.
CONCLUSIONS
Hypnosis was superior to attention control regarding propofol and lidocaine use; pain, nausea, fatigue, discomfort, and emotional upset at discharge; and institutional cost. Overall, the present data support the use of hypnosis with breast cancer surgery patients. |
DLint: dynamically checking bad coding practices in JavaScript | JavaScript has become one of the most popular programming languages, yet it is known for its suboptimal design. To effectively use JavaScript despite its design flaws, developers try to follow informal code quality rules that help avoid correctness, maintainability, performance, and security problems. Lightweight static analyses, implemented in "lint-like" tools, are widely used to find violations of these rules, but are of limited use because of the language's dynamic nature. This paper presents DLint, a dynamic analysis approach to check code quality rules in JavaScript. DLint consists of a generic framework and an extensible set of checkers that each addresses a particular rule. We formally describe and implement 28 checkers that address problems missed by state-of-the-art static approaches. Applying the approach in a comprehensive empirical study on over 200 popular web sites shows that static and dynamic checking complement each other. On average per web site, DLint detects 49 problems that are missed statically, including visible bugs on the web sites of IKEA, Hilton, eBay, and CNBC. |
Joint Task Offloading and Resource Allocation for Multi-Server Mobile-Edge Computing Networks | Mobile-edge computing (MEC) is an emerging paradigm that provides a capillary distribution of cloud computing capabilities to the edge of the wireless access network, enabling rich services and applications in close proximity to the end users. In this paper, an MEC-enabled multi-cell wireless network is considered where each base station (BS) is equipped with an MEC server that assists mobile users in executing computation-intensive tasks via task offloading. The problem of joint task offloading and resource allocation is studied in order to maximize the users' task offloading gains, measured by a weighted sum of reductions in task completion time and energy consumption. The considered problem is formulated as a mixed integer nonlinear program (MINLP) that jointly optimizes the task offloading decision, the uplink transmission power of mobile users, and the computing resource allocation at the MEC servers. Due to the combinatorial nature of this problem, solving for the optimal solution is difficult and impractical for a large-scale network. To overcome this drawback, we propose to decompose the original problem into a resource allocation (RA) problem with fixed task offloading decisions and a task offloading (TO) problem that optimizes the optimal-value function corresponding to the RA problem. We address the RA problem using convex and quasi-convex optimization techniques, and propose a novel heuristic algorithm for the TO problem that achieves a suboptimal solution in polynomial time. Simulation results show that our algorithm performs close to the optimal solution and significantly improves the users' offloading utility over traditional approaches. |
Automatic Manga Colorization with Hint | |
Real-Time Anomaly Detection for Streaming Analytics | Much of the world's data is streaming, time-series data, where anomalies give significant information in critical situations. Yet detecting anomalies in streaming data is a difficult task, requiring detectors to process data in real time and learn while simultaneously making predictions. We present a novel anomaly detection technique based on an online sequence memory algorithm called Hierarchical Temporal Memory (HTM). We show results from a live application that detects anomalies in financial metrics in real time. We also test the algorithm on NAB, a published benchmark for real-time anomaly detection, where our algorithm achieves best-in-class results. |
Decoding fast-paced error-related potentials in monitoring protocols | Error-related EEG potentials (ErrP) can be used for brain-machine interfacing (BMI). Decoding of these signals, which indicate a subject's perception of erroneous system decisions or actions, can be used to correct these actions or to improve the overall interfacing system. Multiple studies have shown the feasibility of decoding these potentials in single trials using different types of experimental protocols and feedback modalities. However, previously reported approaches are limited by the use of long inter-stimulus intervals (ISI > 2 s). In this work we assess whether it is possible to overcome this limitation. Our results show that it is possible to decode error-related potentials elicited by stimuli presented with ISIs lower than 1 s without a decrease in performance. Furthermore, the increase in presentation rate did not increase the subjects' workload. This suggests that the presentation rate for ErrP-based BMI protocols using serial monitoring paradigms can be substantially increased with respect to previous works. |
Open-TEE -- An Open Virtual Trusted Execution Environment | Hardware-based Trusted Execution Environments (TEEs) are widely deployed in mobile devices. Yet their use has been limited primarily to applications developed by the device vendors. Recent standardization of TEE interfaces by GlobalPlatform (GP) promises to partially address this problem by enabling GP-compliant trusted applications to run on TEEs from different vendors. Nevertheless, ordinary developers wishing to develop trusted applications face significant challenges. Access to hardware TEE interfaces is difficult to obtain without support from vendors. Tools and software needed to develop and debug trusted applications may be expensive or non-existent. In this paper, we describe Open-TEE, a virtual, hardware-independent TEE implemented in software. Open-TEE conforms to GP specifications. It allows developers to develop and debug trusted applications with the same tools they use for developing software in general. Once a trusted application is fully debugged, it can be compiled for any actual hardware TEE. Through performance measurements and a user study we demonstrate that Open-TEE is efficient and easy to use. We have made Open-TEE freely available as open source. |
Path integral guided policy search | We present a policy search method for learning complex feedback control policies that map from high-dimensional sensory inputs to motor torques, for manipulation tasks with discontinuous contact dynamics. We build on a prior technique called guided policy search (GPS), which iteratively optimizes a set of local policies for specific instances of a task, and uses these to train a complex, high-dimensional global policy that generalizes across task instances. We extend GPS in the following ways: (1) we propose the use of a model-free local optimizer based on path integral stochastic optimal control (PI2), which enables us to learn local policies for tasks with highly discontinuous contact dynamics; and (2) we enable GPS to train on a new set of task instances in every iteration by using on-policy sampling: this increases the diversity of the instances that the policy is trained on, and is crucial for achieving good generalization. We show that these contributions enable us to learn deep neural network policies that can directly perform torque control from visual input. We validate the method on a challenging door opening task and a pick-and-place task, and we demonstrate that our approach substantially outperforms the prior LQR-based local policy optimizer on these tasks. Furthermore, we show that on-policy sampling significantly increases the generalization ability of these policies. |
Recognizing human activities from accelerometer and physiological sensors | Recently, interest in services for the ubiquitous environment has increased. These kinds of services focus on the context of the user's activities, location or environment. There have been many studies on recognizing these contexts using various sensory resources. To recognize human activity, many of them used an accelerometer, which shows good accuracy in recognizing the user's movement activities, but they did not recognize stable activities, which can be classified by the user's emotion and inferred from physiological sensors. In this paper, we exploit multiple sensor signals to recognize the user's activity. As the Armband includes an accelerometer and physiological sensors, we used them with a fuzzy Bayesian network for the continuous sensor data. The fuzzy membership function uses three stages differentiated by the distribution of each sensor's data. Experiments on activity recognition accuracy were conducted with combinations of accelerometer and physiological signals. As a result, the total accuracy is 74.4% for activities including dynamic and stable activities when using the physiological signals and one 2-axis accelerometer. When we use only the physiological signals the accuracy is 60.9%, and when we use only the 2-axis accelerometer the accuracy is 44.2%. We show that using physiological signals with an accelerometer is more effective in recognizing activities. |
Resampling algorithms and architectures for distributed particle filters | In this paper, we propose novel resampling algorithms with architectures for efficient distributed implementation of particle filters. The proposed algorithms improve the scalability of the filter architectures affected by the resampling process. Problems in the particle filter implementation due to resampling are described, and appropriate modifications of the resampling algorithms are proposed so that distributed implementations are developed and studied. Distributed resampling algorithms with proportional allocation (RPA) and nonproportional allocation (RNA) of particles are considered. The components of the filter architectures are the processing elements (PEs), a central unit (CU), and an interconnection network. One of the main advantages of the new resampling algorithms is that communication through the interconnection network is reduced and made deterministic, which results in simpler network structure and increased sampling frequency. Particle filter performances are estimated for the bearings-only tracking applications. In the architectural part of the analysis, the area and speed of the particle filter implementation are estimated for a different number of particles and a different level of parallelism with field programmable gate array (FPGA) implementation. In this paper, only sampling importance resampling (SIR) particle filters are considered, but the analysis can be extended to any particle filters with resampling. |
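The paper's RPA/RNA schemes redistribute the resampling step across processing elements; the centralized step they start from is ordinary resampling inside an SIR filter. A minimal sketch of systematic resampling, one of the standard variants (the paper does not mandate this particular one), is:

```python
import numpy as np

def systematic_resample(weights, rng=None):
    # Systematic resampling: one uniform draw, then deterministic stratified positions.
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights / weights.sum())
    return np.searchsorted(cumulative, positions)   # indices of particles to replicate

# toy usage: 8 particles, most weight on particle 2
w = np.array([0.02, 0.03, 0.60, 0.05, 0.10, 0.05, 0.10, 0.05])
print(systematic_resample(w))
```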
The intrinsic diversity of creativity research: Interview with Prof. Todd Lubart | Professor Todd Lubart’s past and present work on creativity is a perfect example of how dynamic and multi-faceted this area of psychology really is. Creative phenomena draw on cognitive, personality, emotional, motivational and social processes at once and creativity studies can be found at the intersection between different psychological fields: cognitive, social and personality, organisational, developmental, educational, individual differences and so on. In this interview Professor Lubart discusses his creativity research projects and how they evolved in a constant dialogue between personal interests and opportunities for research and collaboration. Creativity is portrayed as a heterogeneous domain where the most interesting breakthroughs happen ‘at the borders’. Here, those who make an impact are the ones ready to take risks and exploit the domain’s intrinsic diversity and the possibilities for creative thinking associated with it. |
What Drives Corporate Social Performance? International Evidence from Social, Environmental and Governance Scores | We investigate the institutional drivers of Corporate Social Performance (CSP) by focusing on its three fundamental components: social, environmental and governance performance. Using a large cross-section of firms from 42 countries over 7 years, we are able to explain 41, 46 and 63% of the variation in social performance, environmental performance, and corporate governance respectively, with observable firm, industry and institutional factors. More specifically, we hypothesize that country institutions have a profound influence on CSP. We find that political institutions, followed by legal and labor market institutions, are the most important country determinants of social and environmental performance. In contrast, legal institutions, followed by political institutions, are the most important country determinants of governance. Capital market institutions appear to be less important drivers of CSP. Our results provide insights on the demand and supply forces that determine CSP internationally. |
ICIET ’ 14 Image Steganography Method Using Integer Wavelet Transform | Digital steganography is the art and science of writing hidden messages in such a way that, apart from the sender and intended recipient, no one suspects the existence of the message, a form of security through obscurity. The two main schemes used for image steganography are spatial domain embedding and transform domain embedding. A wavelet transform converts the original (cover) image from the spatial domain to the frequency domain. In the proposed work, an integer wavelet transform is performed on a gray-level cover image, and the message bitstream is embedded into the LSBs of the integer wavelet coefficients of the image. The main purpose of the proposed work is to improve embedding capacity and reduce the distortion introduced into the stego image. The refinement of the algorithm plays an important role in accomplishing higher embedding capacity and a low distortion rate. The experimental results show that assessment metrics such as PSNR improve substantially, and that the algorithm has high capacity and good invisibility. |
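The paper's full 2-D integer wavelet transform and its refinement steps are not specified in the abstract; the sketch below only illustrates the core idea on one dimension, using the integer Haar (lifting / S-transform) decomposition and LSB embedding in the detail coefficients. The cover row, message bits, and one-level decomposition are all illustrative.

```python
import numpy as np

def haar_int_forward(x):
    # One-level integer (lifting) Haar transform on a 1-D signal of even length.
    even, odd = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    detail = odd - even                      # high-pass (integer)
    approx = even + (detail >> 1)            # low-pass (integer, rounded)
    return approx, detail

def haar_int_inverse(approx, detail):
    even = approx - (detail >> 1)
    odd = detail + even
    out = np.empty(2 * len(approx), dtype=np.int64)
    out[0::2], out[1::2] = even, odd
    return out

def embed_bits(detail, bits):
    # Hide message bits in the LSBs of the detail (high-frequency) coefficients.
    d = detail.copy()
    d[:len(bits)] = (d[:len(bits)] & ~1) | np.asarray(bits)
    return d

# toy usage: embed 4 bits into one row of a "cover image"
row = np.array([52, 55, 61, 59, 79, 61, 76, 61])
a, d = haar_int_forward(row)
stego_row = haar_int_inverse(a, embed_bits(d, [1, 0, 1, 1]))
recovered = haar_int_forward(stego_row)[1][:4] & 1
print(stego_row, recovered)
```

Because the lifting steps are exactly invertible over the integers, the embedded bits survive the inverse and forward transforms unchanged, which is what makes transform-domain LSB embedding lossless in this sketch.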
Signal detection effects on deep neural networks utilizing raw IQ for modulation classification | Recently, automatic modulation classification techniques using convolutional neural networks on raw IQ samples have been investigated and show promise when compared to more traditional likelihood-based or feature-based techniques. While likelihood-based and feature-based techniques are effective, making classification decisions directly on the raw IQ samples allows for reduced system complexity and removes the need for expertly crafted transformations and feature extractions. In practice, RF environments are typically very dense, and a receiver must first detect and isolate each signal of interest before classification can be performed. The errors introduced by this detection and isolation process will affect the accuracy of convolutional neural networks making automatic modulation classification decisions using only raw IQ samples. To quantify this impact, a representative convolutional neural network designed to distinguish between 8 modulation classes (2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, 16QAM, and 64QAM), over a generalized parameter set, is analyzed. The classification accuracy of this neural network is investigated as a function of errors in carrier frequency estimation and errors in sample rate estimation. The importance of defining upper limits on frequency and sample rate estimation errors in a detector is highlighted, and the negative effects of over-estimating or under-estimating these limits is explored. |
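The two receiver-side impairments the study sweeps, residual carrier frequency offset and sample-rate estimation error, can be injected into a block of complex baseband IQ samples before it is fed to a classifier. A small numpy sketch follows; the offset and error values are arbitrary examples, not the parameter grid used in the paper.

```python
import numpy as np

def apply_rx_impairments(iq, cfo_norm=0.0, rate_err=0.0):
    # cfo_norm: residual carrier offset in cycles/sample; rate_err: fractional
    # sample-rate mismatch (e.g. 0.005 = 0.5%). Both model detector estimation errors.
    n = len(iq)
    iq = iq * np.exp(2j * np.pi * cfo_norm * np.arange(n))      # residual carrier
    t_wrong = np.arange(n) * (1.0 + rate_err)                   # resample at the wrong rate
    t_wrong = t_wrong[t_wrong <= n - 1]
    return (np.interp(t_wrong, np.arange(n), iq.real)
            + 1j * np.interp(t_wrong, np.arange(n), iq.imag))

# toy usage: QPSK symbols with a 1e-3 cycles/sample CFO and 0.5% rate error
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, (256, 2)) * 2 - 1
symbols = (bits[:, 0] + 1j * bits[:, 1]) / np.sqrt(2)
impaired = apply_rx_impairments(symbols, cfo_norm=1e-3, rate_err=0.005)
```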
Discriminative Multi-View Interactive Image Re-Ranking | Given unreliable visual patterns and insufficient query information, content-based image retrieval is often suboptimal and requires image re-ranking using auxiliary information. In this paper, we propose discriminative multi-view interactive image re-ranking (DMINTIR), which integrates user relevance feedback capturing users' intentions and multiple features that sufficiently describe the images. In DMINTIR, heterogeneous property features are incorporated in the multi-view learning scheme to exploit their complementarities. In addition, a discriminatively learned weight vector is obtained to reassign updated scores and target images for re-ranking. Compared with other multi-view learning techniques, our scheme not only generates a compact representation in the latent space from the redundant multi-view features but also maximally preserves the discriminative information in feature encoding by the large-margin principle. Furthermore, the generalization error bound of the proposed algorithm is theoretically analyzed and shown to be improved by the interactions between the latent space and discriminant function learning. Experimental results on two benchmark data sets demonstrate that our approach boosts baseline retrieval quality and is competitive with other state-of-the-art re-ranking strategies. |
An optimal technology mapping algorithm for delay optimization in lookup-table based FPGA designs | In this paper we present a polynomial time technology mapping algorithm, called Flow-Map, that optimally solves the LUT-based FPGA technology mapping problem for depth minimization for general Boolean networks. This theoretical breakthrough makes a sharp contrast with the fact that conventional technology mapping problem in library-based designs is NP-hard. A key step in Flow-Map is to compute a minimum height K-feasible cut in a network, solved by network flow computation. Our algorithm also effectively minimizes the number of LUTs by maximizing the volume of each cut and by several postprocessing operations. We tested the Flow-Map algorithm on a set of benchmarks and achieved reductions on both the network depth and the number of LUTs in mapping solutions as compared with previous algorithms. |
TLR9-adjuvanted pneumococcal conjugate vaccine induces antibody-independent memory responses in HIV-infected adults | HIV patients have an excess of pneumococcal infections. We immunized 40 HIV patients twice with pneumococcal conjugate vaccine (Prevnar, Pfizer) +/- a TLR9 agonist (CPG 7909). Peripheral blood mononuclear cells were stimulated with pneumococcal polysaccharides and cytokine concentrations were measured. The CPG 7909 adjuvant group had significantly higher relative cytokine responses than the placebo group for IL-1β, IL-2R, IL-6, IFN-γ and MIP-β, which did not correlate with IgG antibody responses. These findings suggest that CPG 7909 as an adjuvant to pneumococcal conjugate vaccine induces cellular memory to pneumococcal polysaccharides in HIV patients, independently of the humoral response. |
Real-Time or Near Real-Time Persisting Daily Healthcare Data Into HDFS and ElasticSearch Index Inside a Big Data Platform | Mayo Clinic (MC) healthcare generates a large number of HL7 V2 messages—0.7–1.1 million on weekends and 1.7–2.2 million on business days at present. With multiple RDBMS-based systems, such a large volume of HL7 messages still cannot be stored, analyzed, and retrieved in real time or near real time for enterprise-level clinical and nonclinical usage. To determine whether Big Data technology coupled with ElasticSearch can satisfy MC's daily healthcare needs for HL7 message processing, a Big Data platform was developed containing two identical Hadoop clusters (TDH 1.3.2)—each containing an ElasticSearch cluster and instances of a Storm topology, MayoTopology, for processing HL7 messages from MC ESB queues into an ElasticSearch index and HDFS. The implemented Big Data platform can process 62 ± 4 million HL7 messages per day, while the ElasticSearch index provides ultrafast free-text searching at a speed of roughly 0.2 s per query on an index containing a dataset of 25 million HL7-derived JSON documents. The results suggest that the implemented Big Data platform exceeds MC's enterprise-level patient-care needs. |
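The storage side of such a pipeline boils down to indexing each HL7-derived JSON document into Elasticsearch and then issuing free-text queries against it. A minimal sketch with the elasticsearch-py 8.x client is below; the index name, field names, and document content are all made up for illustration, and the Storm/HDFS ingestion path is not shown.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# a hypothetical HL7-derived JSON document (field names are invented)
doc = {
    "message_type": "ADT^A01",
    "patient_id": "12345",
    "event_time": "2015-06-01T08:30:00",
    "raw_text": "MSH|^~\\&|ADT1|MCM|LABADT|MCM|...",
}
es.index(index="hl7-messages", document=doc)

# free-text search over the indexed messages
resp = es.search(index="hl7-messages", query={"match": {"raw_text": "ADT1"}})
print(resp["hits"]["total"])
```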
Separating Style and Content with Bilinear Models | Perceptual systems routinely separate content from style, classifying familiar words spoken in an unfamiliar accent, identifying a font or handwriting style across letters, or recognizing a familiar face or object seen under unfamiliar viewing conditions. Yet a general and tractable computational model of this ability to untangle the underlying factors of perceptual observations remains elusive (Hofstadter, 1985). Existing factor models (Mardia, Kent, & Bibby, 1979; Hinton & Zemel, 1994; Ghahramani, 1995; Bell & Sejnowski, 1995; Hinton, Dayan, Frey, & Neal, 1995; Dayan, Hinton, Neal, & Zemel, 1995; Hinton & Ghahramani, 1997) are either insufficiently rich to capture the complex interactions of perceptually meaningful factors such as phoneme and speaker accent or letter and font, or do not allow efficient learning algorithms. We present a general framework for learning to solve two-factor tasks using bilinear models, which provide sufficiently expressive representations of factor interactions but can nonetheless be fit to data using efficient algorithms based on the singular value decomposition and expectation-maximization. We report promising results on three different tasks in three different perceptual domains: spoken vowel classification with a benchmark multi-speaker database, extrapolation of fonts to unseen letters, and translation of faces to novel illuminants. |
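The closed-form fit of the asymmetric bilinear model via the SVD can be sketched in a few lines: stack the mean observation vectors into a (styles × dimension) by (content) matrix, truncate its SVD, and read off per-style basis matrices and per-content codes. The dimensions and toy data below are illustrative, and the symmetric model and EM-based variants discussed in the paper are not shown.

```python
import numpy as np

def fit_asymmetric_bilinear(Y, n_styles, J):
    # Y: (n_styles * dim, n_content) matrix of mean observation vectors,
    # stacked style-by-style. Closed-form fit via truncated SVD:
    #   Y ~ [A_1; ...; A_S] @ B  with A_s (dim x J) per style, B (J x n_content).
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    A = (U[:, :J] * s[:J]).reshape(n_styles, -1, J)   # per-style basis matrices
    B = Vt[:J, :]                                     # per-content codes
    return A, B

def render(A, B, style, content):
    # Reconstruct an observation for a given (style, content) pair.
    return A[style] @ B[:, content]

# toy usage: 3 "styles", 4 "content" classes, 5-dimensional observations
rng = np.random.default_rng(0)
Y = rng.standard_normal((3 * 5, 4))
A, B = fit_asymmetric_bilinear(Y, n_styles=3, J=2)
print(render(A, B, style=1, content=2).shape)   # (5,)
```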
The international epidemiology of child sexual abuse. | Surveys of child sexual abuse in large nonclinical populations of adults have been conducted in at least 19 countries in addition to the United States and Canada, including 10 national probability samples. All studies have found rates in line with comparable North American research, ranging from 7% to 36% for women and 3% to 29% for men. Most studies found females to be abused at 1 1/2 to 3 times the rate for males. Few comparisons among countries are possible because of methodological and definitional differences. However, they clearly confirm sexual abuse to be an international problem. |
The Exo-S probe class starshade mission | Exo-S is a direct imaging space-based mission to discover and characterize exoplanets. With its modest size, Exo-S bridges the gap between census missions like Kepler and a future space-based flagship direct imaging exoplanet mission. With the ability to reach down to Earth-size planets in the habitable zones of nearly two dozen nearby stars, Exo-S is a powerful first step in the search for and identification of Earth-like planets. Compelling science can be returned at the same time as the technological and scientific framework is developed for a larger flagship mission. The Exo-S Science and Technology Definition Team studied two viable starshade-telescope missions for exoplanet direct imaging, targeted to the $1B cost guideline. The first Exo-S mission concept is a starshade and telescope system dedicated to each other for the sole purpose of direct imaging for exoplanets (The “Starshade Dedicated Mission”). The starshade and commercial, 1.1-m diameter telescope co-launch, sharing the same low-cost launch vehicle, conserving cost. The Dedicated mission orbits in a heliocentric, Earth leading, Earth-drift away orbit. The telescope has a conventional instrument package that includes the planet camera, a basic spectrometer, and a guide camera. The second Exo-S mission concept is a starshade that launches separately to rendezvous with an existing on-orbit space telescope (the “Starshade Rendezvous Mission”). The existing telescope adopted for the study is the WFIRST-AFTA (Wide-Field Infrared Survey Telescope Astrophysics Focused Telescope Asset). The WFIRST-AFTA 2.4-m telescope is assumed to have previously launched to a Halo orbit about the Earth-Sun L2 point, away from the gravity gradient of Earth orbit which is unsuitable for formation flying of the starshade and telescope. The impact on WFIRST-AFTA for starshade readiness is minimized; the existing coronagraph instrument performs as the starshade science instrument, while formation guidance is handled by the existing coronagraph focal planes with minimal modification and an added transceiver. |
Generating Informative and Diverse Conversational Responses via Adversarial Information Maximization | Responses generated by neural conversational models tend to lack informativeness and diversity. We present a novel adversarial learning method, called Adversarial Information Maximization (AIM) model, to address these two related but distinct problems. To foster response diversity, we leverage adversarial training that allows distributional matching of synthetic and real responses. To improve informativeness, we explicitly optimize a variational lower bound on pairwise mutual information between query and response. Empirical results from automatic and human evaluations demonstrate that our methods significantly boost informativeness and diversity. |
A reusable BIST with software assisted repair technology for improved memory and IO debug, validation and test time | As silicon integration complexity increases with 3D stacking and Through-Silicon-Via (TSV), so does the occurrence of memory and IO defects and the associated test and validation time. This ultimately leads to an overall cost increase. On a 14nm Intel SOC, a reusable BIST engine called the Converged-Pattern-Generator-Checker (CPGC) is architected to detect memory and IO defects, and is combined with software assisted repair technology to automatically repair memory cell defects on 3D stacked Wide-IO DRAM. Additionally, we present the CPGC gate count, power, simulation, and silicon results. The reusable CPGC IP is designed to connect to a standard IP interface, which enables a quick turn-key SOC development cycle. Silicon results show CPGC can speed up validation by 5x, improve test time from minutes down to seconds, and decrease debug time by 5x, including root-causing boot failures of the memory interface. CPGC is also used in memory training and initialization, which makes it a critical part of the Intel SOC.
Relationships between Oil Price and Stock Market: An Empirical Analysis from Istanbul Stock Exchange (ISE) | The present study examined long-term relationships and short-term dynamics between the National 100, National 50, and National 30 indices of the Istanbul Stock Exchange (ISE) and the international Brent oil price by using various econometric techniques. The study, in which the relationship of each of the three indices with the oil price is examined separately, covers the period between 04.01.2000 and 04.01.2010 and uses data for 2437 trading days. The Johansen cointegration test indicated a cointegrated relationship between each index and the oil price; in other words, there was a long-term relationship between each of the three indices and the oil price. Granger causality analysis showed a one-way causal relationship running from each of the stock exchange indices to the oil price, whereas the oil price did not Granger-cause any of the three indices.
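This style of analysis can be reproduced with standard econometrics tooling. The sketch below is a minimal illustration using statsmodels on synthetic stand-in series (not the study's ISE or Brent data); the lag choice and deterministic-term setting are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Illustrative stand-in series: a random-walk "oil" price and an index that is
# cointegrated with it by construction. These are not the paper's data.
rng = np.random.default_rng(1)
n = 2437
oil = np.cumsum(rng.normal(size=n))
index = 0.6 * oil + np.cumsum(rng.normal(size=n))
df = pd.DataFrame({"index": index, "oil": oil})

# Johansen cointegration test (constant term, one lagged difference).
joh = coint_johansen(df, det_order=0, k_ar_diff=1)
print("trace statistics:", joh.lr1)      # compare against joh.cvt critical values

# Granger causality: does the stock index help predict the oil price?
# The test checks whether the second column Granger-causes the first.
gc_results = grangercausalitytests(df[["oil", "index"]].values, maxlag=5)
```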
Optimizing Cu‐Cr Coatings for Environmental Protection of Copper Alloys | Cu-Cr barrier coatings, with suitably matched thermal and physical properties, are being developed to protect Cu-alloy liners of rocket engine thrusters via the formation of Cr2O3 barriers. The challenge is to maximize Cr2O3 protection with minimal Cr addition, since high Cr content degrades the coating conductivity and ductility. It is shown that, by using an efficient coating technique to produce a refined coating microstructure, adequate protection is achieved even while Cr content is reduced by nearly half from the current state-of-the-art levels.
Comparison of Image Steganography Techniques | Steganography is an important area of research in recent years involving a number of applications. It is the science of embedding information (the payload) into a cover, viz. text, video, or image, without causing statistically significant modification to the cover image. Modern secure image steganography presents the challenging task of transferring the embedded information to the destination without being detected. This paper deals with hiding text in an image file using Least Significant Bit (LSB) based steganography, Discrete Cosine Transform (DCT) based steganography, and Discrete Wavelet Transform (DWT) based steganography. The LSB algorithm is implemented in the spatial domain, in which the payload bits are embedded into the least significant bits of the cover image to derive the stego-image, whereas the DCT and DWT algorithms are implemented in the frequency domain, in which the cover image is transformed from the spatial domain to the frequency domain and the payload bits are embedded into its frequency components. The performance of these three techniques is compared on the basis of the parameters MSE, PSNR, NC, processing time, capacity, and robustness.
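As a concrete illustration of the spatial-domain branch, the following minimal sketch embeds and recovers a text payload via LSB substitution and computes the MSE distortion mentioned among the evaluation parameters. The cover image and message are made up for the example, and no claim is made that this matches the paper's exact implementation.

```python
import numpy as np

# Minimal LSB embedding sketch: payload bits overwrite the least significant bit
# of each pixel. Cover image and message are invented for illustration.
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
message = "hidden"
bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))

stego = cover.copy().ravel()
stego[: bits.size] = (stego[: bits.size] & 0xFE) | bits   # replace LSBs with payload
stego = stego.reshape(cover.shape)

# Extraction: read the LSBs back and repack them into bytes.
recovered_bits = stego.ravel()[: bits.size] & 1
recovered = np.packbits(recovered_bits).tobytes().decode()
print(recovered)   # -> "hidden"

# A simple distortion measure comparable to the paper's MSE/PSNR evaluation:
mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
print("MSE:", mse)
```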
Towards a Modular Recommender System for Research Papers written in Albanian | In recent years there has been an increase in scientific paper publications in Albania and its neighboring countries that have large communities of Albanian-speaking researchers. Many of these papers are written in Albanian. Finding papers related to a researcher's work is very time consuming, because there is no concrete system that facilitates this process. In this paper we present the design of a modular intelligent search system for articles written in Albanian. Its main part is the recommender module, which facilitates searching by providing articles relevant to a given one. We used a cosine similarity based heuristic that differentiates the importance of term frequencies based on their location in the article. We did not notice big differences in the recommendation results when using different combinations of the importance factors of the keywords, title, abstract, and body, and we got similar results when using only the title and abstract in comparison with the other combinations. Because we obtained fairly good results in this initial approach, we believe that similar recommender systems for documents written in Albanian can also be built in contexts not related to scientific publishing. Keywords—recommender system; Albanian; information retrieval; intelligent search; digital library
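A field-weighted cosine heuristic of this kind can be sketched in a few lines. The snippet below is an illustrative approximation with invented articles and weights; the paper's actual importance factors and corpus are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy articles and assumed importance factors for title/abstract/body fields.
articles = [
    {"title": "bujqesia ne Shqiperi", "abstract": "studim mbi bujqesine", "body": "..."},
    {"title": "sisteme rekomandimi",  "abstract": "kerkimi i artikujve",  "body": "..."},
]
weights = {"title": 3.0, "abstract": 2.0, "body": 1.0}

def weighted_text(article):
    # Crude weighting: repeat each field proportionally to its importance factor.
    return " ".join(
        " ".join([article[field]] * int(w)) for field, w in weights.items()
    )

corpus = [weighted_text(a) for a in articles]
tfidf = TfidfVectorizer().fit_transform(corpus)
print(cosine_similarity(tfidf[0], tfidf[1]))   # similarity of article 0 to article 1
```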
A survey of collaborative filtering based social recommender systems | Recommendation plays an increasingly important role in our daily lives. Recommender systems automatically suggest to a user items that might be of interest to her. Recent studies demonstrate that information from social networks can be exploited to improve the accuracy of recommendations. In this paper, we present a survey of collaborative filtering (CF) based social recommender systems. We provide a brief overview of the task of recommender systems and traditional approaches that do not use social network information. We then present how social network information can be adopted by recommender systems as additional input for improved accuracy. We classify CF-based social recommender systems into two categories: matrix factorization based social recommendation approaches and neighborhood based social recommendation approaches. For each category, we survey and compare several representative approaches.
Identification and adaptive neural network control of a DC motor system with dead-zone characteristics. | In this paper, an adaptive control approach based on neural networks is presented to control a DC motor system with dead-zone characteristics (DZC), where two neural networks are proposed to formulate the traditional identification and control approaches. First, a Wiener-type neural network (WNN) is proposed to identify the motor DZC; it formulates the Wiener model as a linear dynamic block in cascade with a nonlinear static gain. Second, a feedforward neural network is proposed to formulate the traditional PID controller, termed the PID-type neural network (PIDNN), which is then used to control and compensate for the DZC. In this way, the DC motor system with DZC is identified by the WNN identifier, which provides model information to the PIDNN controller in order to make it adaptive. Back-propagation algorithms are used to train both neural networks. Stability and convergence analyses are also conducted using the Lyapunov theorem. Finally, experiments on the DC motor system demonstrated accurate identification and good compensation for the dead-zone, with improved control performance over conventional PID control.
A printed LPDA with UWB capability | This work deals with the design of a wideband microstrip log-periodic array operating between 4 and 18 GHz (thus covering the C, X, and Ku bands). Few studies have been proposed to date, and they are significantly less performant and usually quite complicated. Our solution is remarkably simple and shows both SWR and gain better than similar structures proposed in the literature. The same antenna can also be used as a UWB antenna. The design has been developed using CST MICROWAVE STUDIO 2009, a general-purpose and specialist tool for the 3D electromagnetic simulation of microwave high-frequency components.
Effect of a school-based sun-protection intervention on the development of melanocytic nevi in children. | "Kidskin" was a 5-year (1995-1999), school-based intervention trial among first-grade children in Perth, Western Australia. It aimed to assess whether a sun-protection intervention could protect against nevus development on the trunk, face, and arms. Included were a control group, a "moderate intervention" group, and a "high intervention" group. Control schools taught the standard health curriculum, while intervention schools received a specially designed sun-protection curriculum over 4 years. The high intervention group also received program materials over summer vacations when sun exposure was likely to be highest and were offered low-cost sun-protective swimwear. After adjustment for baseline nevus counts and potential confounding, nevus counts on all body sites were slightly lower in both intervention groups relative to the control group at follow-up, although the differences were not statistically significant and the high intervention was no more protective. Children in the moderate and high intervention groups, respectively, had fewer nevi on the back (6%, 95% confidence interval (CI): 0, 12; 4%, 95% CI: -3, 11), chest (boys) (5%, 95% CI: -4, 13; 3%, 95% CI: -8, 14), face (11%, 95% CI: 0, 21; 9%, 95% CI: -6, 21), and arms (8%, 95% CI: -1, 17; 3%, 95% CI: -10, 14). |
A prospective randomized pilot study on intermittent post-dialysis dosing of cinacalcet | Treatment of secondary hyperparathyroidism (SHPT) is important in the management of patients with end-stage renal disease on hemodialysis (HD). The calcimimetic agent cinacalcet provides an option for control of SHPT in patients who fail traditional therapy, but it may not yield optimal results in non-compliant patients. To enhance compliance, we evaluated the effectiveness of post-dialysis dosing of cinacalcet (group AD) as compared with daily home administration (group D) in a prospective randomized trial of HD patients with refractory SHPT. After a 2-week run-in phase, patients were randomly assigned to two treatment groups. In group AD (N = 12), patients were administered cinacalcet on the day of dialysis (3 times/week) by dialysis staff, while in control group D (N = 11), cinacalcet was prescribed daily to be taken by patients at home. Intact parathyroid hormone (i-PTH), serum calcium, phosphorus, and alkaline phosphatase were followed for 16 weeks and compared to baseline in both groups. Data were analyzed using between-groups linear regression for repeated measures. No significant decline in i-PTH occurred in group AD at 16 weeks, as compared to a significant drop in group D (p = 0.006). However, subgroup analysis showed effectiveness of post-dialysis dosing in patients with less severe SHPT (p = 0.04). Although daily dosing overall was more effective for treatment of SHPT, dialysis dosing was effective in patients with less severe SHPT. This warrants a larger study considering the limitations of this pilot trial. In the meantime, dialysis dosing can be considered in non-compliant patients with less severe SHPT.
The epidemiology of ocular trauma in rural Nepal. | AIMS
To estimate the incidence of ocular injury in rural Nepal and identify details about these injuries that predict poor visual outcome.
METHODS
Reports of ocular trauma were collected from 1995 through 2000 from patients presenting to the only eye care clinic in Sarlahi district, Nepal. Patients were given a standard free eye examination and interviewed about the context of their injury. Follow up examination was performed 2-4 months after the initial injury.
RESULTS
525 cases of incident ocular injury were reported, with a mean age of 28 years. Using census data, the incidence was 0.65 per 1000 males per year, and 0.38 per 1000 females per year. The most common types of injury were lacerating and blunt, with the majority occurring at home or in the fields. Upon presentation to the clinic, 26.4% of patients had a best corrected visual acuity worse than 20/60 in the injured eye, while 9.6% had visual acuity worse than 20/400. 82% were examined at follow up: 11.2% of patients had visual acuity worse than 20/60 and 4.6% had vision worse than 20/400. A poor visual outcome was associated with increased age, care sought at a site other than the eye clinic, and severe injury. 3% of patients were referred for further care at an eye hospital at the initial visit; 7% had sought additional care in the interim between visits, with this subset representing a more severe spectrum of injuries.
CONCLUSIONS
The detrimental effects of delayed care or care outside of the specialty eye clinic may reflect geographic or economic barriers to care. For optimal visual outcomes, patients who are injured in a rural setting should recognise the injury and seek early care at a specialty eye care facility. Findings from our study suggest that trained non-ophthalmologists may be able to clinically manage many eye injuries encountered in a rural setting in the "developing" world, reducing the demand for acute services of ophthalmologists in remote locations of this highly agricultural country. |
Influence of U.S. Cryptologic Organizations on the digital computer industry | The modern electronic digital computers, beginning with the successors to ENIAC, all benefited from U.S. government financial support in their infancy. Of all involvements by government agencies in early computer developments, perhaps least known was the major role played by the National Security Agency and its predecessor organizations. This story is told for the first time. The impact was felt not only in machines built for that agency, but several commercial models reflected some of the features originated by and for NSA. Also, fundamental engineering and materials investigations were supported, beginning in 1957, in a massive 5-year research program. |
Action of d-propranolol in manic psychoses | Six manic patients were treated with high doses of d-propranolol or d- and dl-propranolol in a double-blind, placebo controlled study. The following variables were measured: propranolol dosage, propranolol serum concentration, pulse frequency, blood pressure, and psychotic behavior. In all cases an improvement was noticed. High dosages were necessary to obtain sufficient effect. The antimanic property of d-propranolol was approximately 50% smaller than the antimanic property of dl-propranolol. We conclude that at least some part of the antimanic action of beta-blockers is independent from the beta-blocking property.
The effect of a 12 week walking intervention on markers of insulin resistance and systemic inflammation. | OBJECTIVES
The purpose of the present study was to determine whether a community-based walking intervention, using pedometers, is effective in reducing systemic inflammatory markers.
METHODS
Participants (mean (SD) age 49 (8.9) years) were recruited in Glasgow, United Kingdom, from August to December 2006 and were randomly assigned to a control group (n=24; 6 males; no change in walking) or an intervention group (n=24; 5 males; gradually increasing walking by 3000 steps/day on 5 days of the week). Blood samples were collected at baseline and after 12 weeks, and analysed for glucose, insulin, high sensitivity C-reactive protein (hsCRP), interleukin-6 (IL-6), soluble IL-6 receptor (sIL-6R), tumour necrosis factor-alpha (TNF-alpha) and soluble TNF receptors I and II (sTNFRI and sTNFRII).
RESULTS
In the control group baseline step counts were 6356 (2953) steps/day and did not change (P>0.05) after 12 weeks, 6709 (2918) steps/day. The intervention group increased (P<0.001) step count from 6682 (3761) steps/day at baseline to 10182 (4081) steps/day at 12 weeks. Over the 12 week period there was no change in any other variables measured, in either control or intervention group.
CONCLUSION
We conclude that the current community-based intervention did not affect systemic markers of inflammation or insulin sensitivity. |
The economic impact of severe asthma to low-income families. | BACKGROUND
To estimate the direct and indirect costs of severe asthma and the economic impact of its management to low-income families in Salvador, Brazil.
METHODS
One hundred and ninety-seven patients with severe asthma and referred to a state-funded asthma center providing free treatment were evaluated. At registration, they were asked about family cost-events in the previous year and had a baseline assessment of lung function, symptoms and quality of life. During the subsequent year, they were reassessed prospectively.
RESULTS
One hundred-eighty patients concluded a 12-month follow-up. Eighty-four percent were female patients, and the median family income was US$ 2955/year. Forty-seven percent of family members had lost their jobs because of asthma. Total cost of asthma management took 29% of family income. After proper treatment, asthma control scores improved by 50% and quality of life by 74%. The income of the families increased by US$ 711/year, as their members went back to work. The total cost of asthma to the families was reduced by a median US$ 789/family/year. Consequently, an annual surplus of US$ 1500/family became available.
CONCLUSIONS
Family costs of severe asthma consumed over one-fourth of the family income of the underprivileged population in a middle-income country. Adequate management brings major economic benefit to individuals and families. |
What factors influence the catalytic activity of iron-salan complexes for aerobic oxidative coupling of 2-naphthols? | A few Fe-salan dimer complexes serve as catalysts for aerobic oxidative coupling (AOC) of 2-naphthols, but some others do not. X-Ray and cyclic voltammetry studies of various Fe-salan complexes revealed that the absence or the presence of double hydrogen bonding in Fe-salan dimers, the oxidation potential of monomeric Fe-salan species and the location of the resulting radical cation are critical factors for the catalytic activity of iron-salan complexes for the AOC. |
Towards value-based pricing — An integrative framework for decision making | Despite a recent surge of interest, the subject of pricing in general and value-based pricing in particular has received little academic investigation. Yet, pricing has a huge impact on financial results, both in absolute terms and relative to other instruments of the marketing mix. The objective of this paper is to present a comprehensive framework for pricing decisions which considers all relevant dimensions and elements for profitable and sustainable pricing decisions. The theoretical framework is useful for guiding new product pricing decisions as well as for implementing price-repositioning strategies for existing products. The practical application of this framework is illustrated by a case study involving the pricing decision for a major product launch at a global chemical company.
NVIDIA Tesla: A Unified Graphics and Computing Architecture | To enable flexible, programmable graphics and high-performance computing, NVIDIA has developed the Tesla scalable unified graphics and parallel computing architecture. Its scalable parallel array of processors is massively multithreaded and programmable in C or via graphics APIs. |
Studies on Ontology-based irrigation management information systems modeling | Research and development of irrigation management information systems is an important measure for making irrigation management more modern and standardized. The difficulty of building information systems has increased along with the continuous development of information technology and the growing complexity of such systems, which places higher demands on sharing and reuse. Ontology-based information systems modeling can eliminate semantic differences and enable knowledge sharing and interoperability between different systems. In this paper, we briefly introduce several common models used in information systems modeling; we then introduce ontology and summarize the ontology-based information systems modeling process; finally, we give a preliminary discussion of the application of ontology-based information systems modeling to irrigation management information systems.
Graphene oxide--MnO2 nanocomposites for supercapacitors. | A composite of graphene oxide supported by needle-like MnO(2) nanocrystals (GO-MnO(2) nanocomposites) has been fabricated through a simple soft chemical route in a water-isopropyl alcohol system. The formation mechanism of these intriguing nanocomposites investigated by transmission electron microscopy and Raman and ultraviolet-visible absorption spectroscopy is proposed as intercalation and adsorption of manganese ions onto the GO sheets, followed by the nucleation and growth of the crystal species in a double solvent system via dissolution-crystallization and oriented attachment mechanisms, which in turn results in the exfoliation of GO sheets. Interestingly, it was found that the electrochemical performance of as-prepared nanocomposites could be enhanced by the chemical interaction between GO and MnO(2). This method provides a facile and straightforward approach to deposit MnO(2) nanoparticles onto the graphene oxide sheets (single layer of graphite oxide) and may be readily extended to the preparation of other classes of hybrids based on GO sheets for technological applications. |
On the Robustness of a Neural Network | With the development of neural networks based machine learning and their usage in mission critical applications, voices are rising against the black box aspect of neural networks as it becomes crucial to understand their limits and capabilities. With the rise of neuromorphic hardware, it is even more critical to understand how a neural network, as a distributed system, tolerates the failures of its computing nodes, neurons, and its communication channels, synapses. Experimentally assessing the robustness of neural networks involves the quixotic venture of testing all the possible failures, on all the possible inputs, which ultimately runs into a combinatorial explosion for the former and the impossibility of gathering all possible inputs for the latter. In this paper, we prove an upper bound on the expected error of the output when a subset of neurons crashes. This bound involves dependencies on the network parameters that can be seen as being too pessimistic in the average case. It involves a polynomial dependency on the Lipschitz coefficient of the neurons' activation function, and an exponential dependency on the depth of the layer where a failure occurs. We back up our theoretical results with experiments illustrating the extent to which our prediction matches the dependencies between the network parameters and robustness. Our results show that the robustness of neural networks to the average crash can be estimated without the need either to test the network on all failure configurations or to access the training set used to train the network, both of which are practically impossible requirements.
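The kind of empirical check described above can be mocked up quickly. The sketch below is a toy, assumed setup (a random two-layer network, crashes modeled as neurons emitting zero) for estimating the expected output error under a fixed number of simultaneous crashes; it is not the paper's experimental code.

```python
import numpy as np

# Estimate the expected output deviation when 2 hidden neurons crash (emit 0).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)

def forward(x, crashed=()):
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer (Lipschitz constant 1)
    h[list(crashed)] = 0.0             # crashed neurons output zero
    return h @ W2 + b2

x = rng.normal(size=64)
baseline = forward(x)
errors = []
for _ in range(1000):
    crashed = rng.choice(16, size=2, replace=False)   # 2 simultaneous crashes
    errors.append(np.linalg.norm(forward(x, crashed) - baseline))
print("empirical expected output error:", np.mean(errors))
```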
Diagnostic accuracy of laser Doppler flowmetry versus strain gauge plethysmography for segmental pressure measurement. | OBJECTIVE
To assess the diagnostic accuracy of laser Doppler flowmetry (LDF) with mercury-in-silastic strain gauge plethysmography (SGP) as a reference test for measuring the toe and ankle pressures in patients with known or suspected peripheral arterial disease (PAD).
METHODS
This was a prospective, randomized, blinded diagnostic accuracy study. Toe and ankle pressures were measured using both methods in 200 consecutive patients, who were recruited at our vascular laboratory over a period of 30 working days. Classification of PAD and critical limb ischemia (CLI) was made in accordance with TASC-II criteria.
RESULTS
The LDF method demonstrated 5.8 mm Hg higher mean toe pressures than the SGP method for the right limb and 7.0 mm Hg for the left limb (both P < .001). There were no significant differences in the mean ankle pressures (both P > .129). The limits of agreement for the differences (SGP - LDF) were -31.7 to 20.2 mm Hg for right toe pressures, -28.0 to 14.0 mm Hg for left toe pressures, -25.5 to 22.8 mm Hg for right ankle pressures, and -26.9 to 24.6 mm Hg for left ankle pressures. A correlation analysis of the absolute pressures using the two methods showed an intraclass correlation coefficient of 0.902 (95% confidence interval [CI], 0.835-0.938) for right toe pressures, 0.919 (95% CI, 0.782-0.960) for the left toe pressures, 0.953 (95% CI, 0.937-0.965) for right ankle pressures, and 0.952 (95% CI, 0.936-0.964) for left ankle pressures. Cohen's Kappa showed an agreement in the diagnostic classification of κ = 0.775 (95% CI, 0.631-0.919) for PAD and κ = 0.780 (95% CI, 0.624-0.936) for CLI.
CONCLUSIONS
LDF showed a good correlation with SGP over a wide range of toe and ankle pressures, as well as substantial agreement for the diagnostic classification of PAD including CLI. |
NormAD - Normalized Approximate Descent based supervised learning rule for spiking neurons | NormAD is a novel supervised learning algorithm to train spiking neurons to produce a desired spike train in response to a given input. It is shown that NormAD provides faster convergence than state-of-the-art supervised learning algorithms for spiking neurons, with the gain in convergence rate often exceeding a factor of 10. The algorithm leverages the fact that a leaky integrate-and-fire neuron can be described as a non-linear spatio-temporal filter, allowing us to treat supervised learning as a mathematically tractable optimization problem with a cost function in terms of the membrane potential rather than the spike arrival time. A variant of stochastic gradient descent along with normalization has been used to derive the synaptic weight update rule. NormAD uses leaky integration of the input to determine the synaptic weight change. Since leaky integration is fundamental to all integrate-and-fire models of spiking neurons, we claim universal applicability of the learning rule to other models, such as the adaptive exponential integrate-and-fire neuron model, by demonstrating equally good performance in training with our algorithm.
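To make the shape of such an update concrete, here is a rough, simplified sketch in the spirit of the description above: the leaky integral of each input drives the weight change at instants where the desired and observed spike trains disagree, and the update direction is normalized. The spike trains, constants, and discretization are all assumptions; this is not the paper's exact rule.

```python
import numpy as np

# Simplified NormAD-style weight update over one trial (illustrative only).
rng = np.random.default_rng(0)
dt, tau = 1e-4, 10e-3                      # time step and leak time constant (s)
steps, n_syn = 5000, 100

inputs = rng.random((steps, n_syn)) < 0.002     # stand-in presynaptic spike trains
desired = rng.random(steps) < 0.0005            # desired output spike train
observed = rng.random(steps) < 0.0005           # stand-in for the neuron's actual output

d_hat = np.zeros(n_syn)                    # leaky integral of the inputs
grad = np.zeros(n_syn)
for t in range(steps):
    d_hat += dt * (-d_hat / tau) + inputs[t]
    err = float(desired[t]) - float(observed[t])   # +1 missed spike, -1 spurious spike
    norm = np.linalg.norm(d_hat)
    if err != 0.0 and norm > 0.0:
        grad += err * d_hat / norm         # normalized approximate descent direction

learning_rate = 1e-2
delta_w = learning_rate * grad             # synaptic weight update for this trial
print("weight update norm:", np.linalg.norm(delta_w))
```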
Automatic TFT-LCD mura defect inspection using discrete cosine transform-based background filtering and ‘ just noticeable difference ’ quantification strategies | An innovative mura defect detection methodology for a thin-film transistor liquid crystal display (TFT-LCD) is developed for automatic inspection of mura defects using the discrete cosine transform (DCT) principle and background image filtering strategy. Efficient and accurate surface defect detection on flat panel display (FPD) panels has never been so important in achieving a high yield rate of FPD manufacturing. Detecting blob-mura defects in an LCD panel can be difficult due to non-uniform brightness background and slightly different brightness levels between the defect region and the background. To overcome this problem, a DCT-based background reconstruction algorithm was developed to establish the background image separated from the defects. The significant level of mura defects can be rationally quantified using the just noticeable difference (JND) definition. Actual performance of the developed method was evaluated on industrial LCD panels containing natural mura defects. Results of experimental tests verified that the proposed algorithm has a superior capability for detecting mura defects efficiently and accurately. |
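The background-reconstruction idea can be illustrated with a compact, hypothetical example: low-frequency DCT coefficients approximate the slowly varying panel background, and subtracting that estimate exposes faint blob-mura. The synthetic image and frequency cutoff below are assumptions, not the paper's tuned parameters or JND quantification.

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(a):  return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')
def idct2(a): return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

# Synthetic panel image: smooth non-uniform background plus a faint blob-mura.
h, w = 128, 128
yy, xx = np.mgrid[0:h, 0:w]
background = 100 + 20 * np.sin(xx / 40.0)
defect = np.zeros((h, w)); defect[60:70, 60:80] = 3
image = background + defect

# Keep only low-frequency DCT coefficients to estimate the background.
coeffs = dct2(image)
cutoff = 8
mask = np.zeros_like(coeffs); mask[:cutoff, :cutoff] = 1
background_est = idct2(coeffs * mask)

residual = image - background_est          # the mura region stands out here
print("mean residual inside defect region:", residual[60:70, 60:80].mean())
```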
The Studierstube Augmented Reality Project | Our starting point for developing the Studierstube system was the belief that augmented reality, the less obtrusive cousin of virtual reality, has a better chance of becoming a viable user interface for applications requiring manipulation of complex three-dimensional information as a daily routine. In essence, we are searching for a 3-D user interface metaphor as powerful as the desktop metaphor for 2-D. At the heart of the Studierstube system, collaborative augmented reality is used to embed computer-generated images into the real work environment. In the first part of this paper, we review the user interface of the initial Studierstube system, in particular the implementation of collaborative augmented reality, and the Personal Interaction Panel, a two-handed interface for interaction with the system. In the second part, an extended Studierstube system based on a heterogeneous distributed architecture is presented. This system allows the user to combine multiple approaches (augmented reality, projection displays, and ubiquitous computing) in the interface as needed. The environment is controlled by the Personal Interaction Panel, a two-handed, pen-and-pad interface that has versatile uses for interacting with the virtual environment. Studierstube also borrows elements from the desktop, such as multitasking and multi-windowing. The resulting software architecture is a user interface management system for complex augmented reality applications. The presentation is complemented by selected application examples.
Treating to target in psoriatic arthritis. | PURPOSE OF REVIEW
Psoriatic arthritis (PsA) is an inflammatory arthritis causing significant joint damage and impaired quality of life. A treat to target approach has revolutionized the care of patients with rheumatoid arthritis over the last decade. There is now increasing interest in a similar approach in PsA, as it seems that ongoing joint inflammation predicts subsequent damage and loss of function.
RECENT FINDINGS
A 2011 European League Against Rheumatism review highlighted a lack of evidence for treat-to-target in PsA. However, with the development of the minimal disease activity criteria, a target is available, and preliminary results from the first randomized treat-to-target study (Tight Control of PsA Study) using these criteria have shown significant benefit in joint and skin disease activity and patient-reported outcomes.
SUMMARY
Early evidence has shown the potential benefit of a treat-to-target approach in PsA and further research is needed to optimize treatment pathways for all subtypes of the disease. |
Skill Squatting Attacks on Amazon Alexa | The proliferation of the Internet of Things has increased reliance on voice-controlled devices to perform everyday tasks. Although these devices rely on accurate speech recognition for correct functionality, many users experience frequent misinterpretations in normal use. In this work, we conduct an empirical analysis of interpretation errors made by Amazon Alexa, the speech-recognition engine that powers the Amazon Echo family of devices. We leverage a dataset of 11,460 speech samples containing English words spoken by American speakers and identify where Alexa misinterprets the audio inputs, how often, and why. We find that certain misinterpretations appear consistently in repeated trials and are systematic. Next, we present and validate a new attack, called skill squatting. In skill squatting, an attacker leverages systematic errors to route a user to a malicious application without their knowledge. In a variant of the attack we call spear skill squatting, we further demonstrate that this attack can be targeted at specific demographic groups. We conclude with a discussion of the security implications of speech interpretation errors, countermeasures, and future work.
Acupuncture for Spasticity after Stroke: A Systematic Review and Meta-Analysis of Randomized Controlled Trials | The aim of this systematic review was to determine how effective acupuncture or electroacupuncture (acupuncture with electrical stimulation) is in treating poststroke patients with spasticity. We searched publications in Medline, EMBASE, and the Cochrane Library in English, 19 accredited journals in Korean, and the China Integrated Knowledge Resources Database in Chinese through to July 30, 2013. We included randomized controlled trials (RCTs) with no language restrictions that compared the effects of acupuncture or electroacupuncture with usual care or placebo acupuncture. The two investigators assessed the risk of bias and statistical analyses were performed. Three RCTs in English, 1 in Korean, and 1 in Chinese were included. Assessments were performed primarily with the Modified Ashworth Scale (MAS). Meta-analysis showed that acupuncture or electroacupuncture significantly decreased spasticity after stroke. A subgroup analysis showed that acupuncture significantly decreased wrist, knee, and elbow spasticity in poststroke patients. Heterogeneity could be explained by the differences in control, acupoints, and the duration after stroke occurrence. In conclusion, acupuncture could be effective in decreasing spasticity after stroke, but long-term studies are needed to determine the longevity of treatment effects. |
The utility of low frequency heart rate variability as an index of sympathetic cardiac tone: a review with emphasis on a reanalysis of previous studies. | This article evaluates the suitability of low frequency (LF) heart rate variability (HRV) as an index of sympathetic cardiac control and the LF/high frequency (HF) ratio as an index of autonomic balance. It includes a comprehensive literature review and a reanalysis of some previous studies on autonomic cardiovascular regulation. The following sources of evidence are addressed: effects of manipulations affecting sympathetic and vagal activity on HRV, predictions of group differences in cardiac autonomic regulation from HRV, relationships between HRV and other cardiac parameters, and the theoretical and mathematical bases of the concept of autonomic balance. Available data challenge the interpretation of the LF and LF/HF ratio as indices of sympathetic cardiac control and autonomic balance, respectively, and suggest that the HRV power spectrum, including its LF component, is mainly determined by the parasympathetic system. |
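For readers unfamiliar with how LF and HF power are obtained in practice, the sketch below shows a standard (not paper-specific) pipeline: resample the RR-interval tachogram evenly, estimate its power spectrum with Welch's method, and integrate the conventional 0.04-0.15 Hz and 0.15-0.40 Hz bands. The synthetic RR series and resampling rate are assumptions.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(300)      # synthetic RR intervals in seconds
t = np.cumsum(rr)

fs = 4.0                                        # even resampling frequency in Hz
t_even = np.arange(t[0], t[-1], 1.0 / fs)
rr_even = interp1d(t, rr, kind='cubic')(t_even)

freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
df_hz = freqs[1] - freqs[0]
lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum() * df_hz   # LF band power
hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum() * df_hz   # HF band power
print("LF:", lf, "HF:", hf, "LF/HF ratio:", lf / hf)
```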
Using wavelet transforms for ECG characterization. An on-line digital signal processing system | The rapid and objective measurement of timing intervals of the electrocardiogram (ECG) by automated systems is superior to the subjective assessment of ECG morphology. The timing interval measurements are usually made from the onset to the termination of any component of the ECG, after accurate detection of the QRS complex. This article describes a real-time system that uses wavelet transforms to overcome the limitations of other methods of detecting the QRS complex and the onsets and offsets of P- and T-waves. Wavelet transformation is briefly discussed, and detection methods and hardware and software aspects of the system are presented, as well as experimental results.
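A minimal offline sketch of wavelet-based QRS detection is shown below using PyWavelets: the ECG is decomposed, only the detail scales where QRS energy concentrates are kept, and peaks are picked in the squared reconstruction. The synthetic ECG, wavelet choice, retained scales, and thresholds are illustrative assumptions rather than the article's real-time design.

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

# Crude synthetic ECG: one smoothed spike per second at 360 Hz, plus noise.
fs = 360
t = np.arange(0, 10, 1.0 / fs)
ecg = np.zeros_like(t)
ecg[(np.arange(t.size) % fs) == 0] = 1.0
ecg = np.convolve(ecg, np.hanning(25), mode="same") + 0.05 * np.random.randn(t.size)

# Keep only detail levels 3-4 (roughly the QRS band at this sampling rate).
coeffs = pywt.wavedec(ecg, "db4", level=5)
kept = [np.zeros_like(c) for c in coeffs]
kept[2], kept[3] = coeffs[2], coeffs[3]
qrs_band = pywt.waverec(kept, "db4")[: ecg.size]

peaks, _ = find_peaks(qrs_band ** 2,
                      height=0.1 * np.max(qrs_band ** 2),
                      distance=int(0.3 * fs))
print("detected beats:", peaks.size)
```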
BotGAD: detecting botnets by capturing group activities in network traffic | Recent malicious attempts are intended to obtain financial benefits using a botnet, which has become one of the major Internet security problems. Botnets can cause severe Internet threats such as DDoS attacks, identity theft, spamming, and click fraud. In this paper, we define group activity as an inherent property of botnets. Based on the group activity model and metric, we develop a botnet detection mechanism, called BotGAD (Botnet Group Activity Detector). BotGAD can detect unknown botnets in large-scale networks in real time. Botnets frequently use DNS to rally infected hosts, launch attacks, and update their codes. We implemented BotGAD using DNS traffic and showed its effectiveness through experiments on real-life network traces. BotGAD captured 20 unknown and 10 known botnets from two days of campus network traces.
The Danish National Penile Cancer Quality database | AIM OF DATABASE
The Danish National Penile Cancer Quality database (DaPeCa-data) aims to improve the quality of cancer care and monitor the diagnosis, staging, and treatment of all incident penile cancer cases in Denmark. The aim is to assure referral practice, guideline adherence, and treatment and development of the database in order to enhance research opportunities and increase knowledge and survival outcomes of penile cancer.
STUDY POPULATION
The DaPeCa-data registers all patients with newly diagnosed invasive squamous cell carcinoma of the penis in Denmark since June 2011.
MAIN VARIABLES
Data are systematically registered at the time of diagnosis by a combination of automated data-linkage to the central registries as well as online registration by treating clinicians. The main variables registered relate to disease prognosis and treatment morbidity and include the presence of risk factors (phimosis, lichen sclerosus, and human papillomavirus), date of diagnosis, date of treatment decision, date of beginning of treatment, type of treatment, treating hospital, type and time of complications, date of recurrence, date of death, and cause of death.
DESCRIPTIVE DATA
Registration of these variables correlated to the unique Danish ten-digit civil registration number enables characterization of the cohort, individual patients, and patient groups with respect to age; 1-, 3-, and 5-year disease-specific and overall survival; recurrence patterns; and morbidity profile related to treatment modality. As of August 2015, more than 200 patients are registered with ∼65 new entries per year.
CONCLUSION
The DaPeCa-data has potential to provide meaningful, timely, and clinically relevant quality data for quality maintenance, development, and research purposes. |
Fuzzy Logic Applications to Power Electronics and Drives-An Overview | Applications of fuzzy logic (FL) to power electronics and drives are on the rise. The paper discusses some representative applications of FL in the area, preceded by an interpretative review of fuzzy logic controller (FLC) theory. A discussion on design and implementation aspects is presented, that also considers the interaction of neural networks and fuzzy logic techniques. Finally, strengths and limitations of FLC are considered, including possible applications in the area. |
Machine learning for attack vector identification in malicious source code | As computers and information technologies become ubiquitous throughout society, the security of our networks and information technologies is a growing concern. As a result, many researchers have become interested in the security domain. Among them, there is growing interest in observing hacker communities for early detection of developing security threats and trends. Research in this area has often reported hackers openly sharing cybercriminal assets and knowledge with one another. In particular, the sharing of raw malware source code files has been documented in past work. Unfortunately, malware code documentation appears often times to be missing, incomplete, or written in a language foreign to researchers. Thus, analysis of such source files embedded within hacker communities has been limited. Here we utilize a subset of popular machine learning methodologies for the automated analysis of malware source code files. Specifically, we explore genetic algorithms to resolve questions related to feature selection within the context of malware analysis. Next, we utilize two common classification algorithms to test selected features for identification of malware attack vectors. Results suggest promising direction in utilizing such techniques to help with the automated analysis of malware source code. |
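To make the GA-plus-classifier pipeline concrete, the toy sketch below evolves a binary feature mask and scores it by cross-validated accuracy of a simple classifier. It uses a synthetic dataset in place of features extracted from malware source files, and the GA operators are deliberately minimal assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for malware source-code features and attack-vector labels.
X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    # Fitness = cross-validated accuracy using only the selected features.
    if mask.sum() == 0:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]))            # random initial population
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                 # keep the fittest half
    children = parents.copy()
    cut = X.shape[1] // 2
    children[:, cut:] = parents[::-1, cut:]                 # one-point crossover
    mutate = rng.random(children.shape) < 0.05
    children[mutate] = 1 - children[mutate]                 # bit-flip mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "cv accuracy:", fitness(best))
```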
Air interface design and ray tracing study for 5G millimeter wave communications | To meet the explosive growth in traffic during the next twenty years, 5G systems using local area networks need to be developed. These systems will comprise small cells and will use extreme cell densification. The use of millimeter wave (Mmwave) frequencies, in particular from 20 GHz to 90 GHz, will revolutionize wireless communications given the extreme amount of available bandwidth. However, the different propagation conditions and hardware constraints of Mmwave (e.g., the use of RF beamforming with very large arrays) require reconsidering the modulation methods for Mmwave compared to those used below 6 GHz. In this paper we present ray-tracing results, which, along with recent propagation measurements at Mmwave, all point to the fact that Mmwave frequencies are very appropriate for next-generation, 5G, local area wireless communication systems. Next, we propose null cyclic prefix single carrier as the best candidate for Mmwave communications. Finally, system-level simulation results show that with the right access point deployment, peak rates of over 15 Gbps are possible at Mmwave along with a cell edge experience in excess of 400 Mbps.
The Protein Folding Problem and Tertiary Structure Prediction |
Incidence of Kawasaki disease in northern European countries. | BACKGROUND
The aim of the present study was to compare the epidemiologic features of Kawasaki disease (KD) in three northern European countries and Japan.
METHODS
Data were obtained from discharge databases for hospitals in Finland, Norway, and Sweden for 1999-2009 and from nationwide epidemiologic surveys in Japan for 1998-2008. Annual incidence for each country was calculated using regional census data.
RESULTS
During the 11 year period, 1390 KD patients were recorded in the registries of the three northern European countries. Average annual incidence rates per 100,000 children aged <5 years were: Finland, 11.4; Norway, 5.4; and Sweden, 7.4. Overall, 86.4% of Japanese KD patients were aged <5 years compared to only 67.8% in the four northern European countries (P < 0.001).
CONCLUSIONS
The incidence of KD in northern Europe was constant over the study period and much lower than in Japan. There was a significant age difference between northern European and Japanese KD patients that remains unexplained. |
In Barrett's esophagus patients and Barrett's cell lines, ursodeoxycholic acid increases antioxidant expression and prevents DNA damage by bile acids. | Hydrophobic bile acids like deoxycholic acid (DCA), which cause oxidative DNA damage and activate NF-κB in Barrett's metaplasia, might contribute to carcinogenesis in Barrett's esophagus. We have explored mechanisms whereby ursodeoxycholic acid (UDCA, a hydrophilic bile acid) protects against DCA-induced injury in vivo in patients and in vitro using nonneoplastic, telomerase-immortalized Barrett's cell lines. We took biopsies of Barrett's esophagus from 21 patients before and after esophageal perfusion with DCA (250 μM) at baseline and after 8 wk of oral UDCA treatment. DNA damage was assessed by phospho-H2AX expression, neutral CometAssay, and phospho-H2AX nuclear foci formation. Quantitative PCR was performed for antioxidants including catalase and GPX1. Nrf2, catalase, and GPX1 were knocked down with siRNAs. Reporter assays were performed using a plasmid construct containing antioxidant responsive element. In patients, baseline esophageal perfusion with DCA significantly increased phospho-H2AX and phospho-p65 in Barrett's metaplasia. Oral UDCA increased GPX1 and catalase levels in Barrett's metaplasia and prevented DCA perfusion from inducing DNA damage and NF-κB activation. In cells, DCA-induced DNA damage and NF-κB activation was prevented by 24-h pretreatment with UDCA, but not by mixing UDCA with DCA. UDCA activated Nrf2 signaling to increase GPX1 and catalase expression, and protective effects of UDCA pretreatment were blocked by siRNA knockdown of these antioxidants. UDCA increases expression of antioxidants that prevent toxic bile acids from causing DNA damage and NF-κB activation in Barrett's metaplasia. Elucidation of this molecular pathway for UDCA protection provides rationale for clinical trials on UDCA for chemoprevention in Barrett's esophagus. |
Generalizing and Improving Bilingual Word Embedding Mappings with a Multi-Step Framework of Linear Transformations | Using a dictionary to map independently trained word embeddings to a shared space has been shown to be an effective approach to learn bilingual word embeddings. In this work, we propose a multi-step framework of linear transformations that generalizes a substantial body of previous work. The core step of the framework is an orthogonal transformation, and existing methods can be explained in terms of the additional normalization, whitening, re-weighting, de-whitening and dimensionality reduction steps. This allows us to gain new insights into the behavior of existing methods, including the effectiveness of inverse regression, and design a novel variant that obtains the best published results in zero-shot bilingual lexicon extraction. The corresponding software is released as an open source project.
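The orthogonal core step admits a closed-form solution. The sketch below illustrates it on synthetic embeddings: given dictionary-aligned source and target matrices, the orthogonal map minimizing the mapping error comes from an SVD (the orthogonal Procrustes solution). The surrounding normalization, whitening, and re-weighting steps of the framework are omitted, and the data are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_pairs = 300, 5000
X = rng.normal(size=(n_pairs, d))                  # source-language embeddings (dictionary rows)
true_rot, _ = np.linalg.qr(rng.normal(size=(d, d)))
Z = X @ true_rot + 0.01 * rng.normal(size=(n_pairs, d))   # noisy "target" embeddings

# Orthogonal Procrustes: W = U V^T from the SVD of X^T Z minimizes ||XW - Z||_F
# subject to W being orthogonal.
U, _, Vt = np.linalg.svd(X.T @ Z)
W = U @ Vt

print("fit error:", np.linalg.norm(X @ W - Z) / np.linalg.norm(Z))
# Zero-shot lookup would map a new source vector with W and take its nearest
# neighbor among target-language embeddings.
```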
The Lord's Dominion: The History of Canadian Methodism. | The history of religion in Canada is a peculiar field. According to the Social Sciences and Humanities Research Council, it does not even exist -- applicants for funding are forced to mark a box next to intellectual history in the history section, or move to the religious studies section for "church history." And yet, in less than a decade the McGill-Queen's Studies in the History of Religion has produced close to 30 volumes. Religious history is a burgeoning area of research, but it has faced different obstacles than many of the other "limited identities" that have recently expanded. Unlike women's history, or working-class history, religious history has had to come to terms with the long tradition of church history that preceded it, a legacy rich in its depth but laden with stereotypical assumptions. The prevailing view of this "church" inheritance is apparent in the comment by Doug Owram: "All the major Protestant denominations have had histories written about them and some are even good."(f.1) Religious history in Canada is marked by a distinct lack of uniformity in methodology and theory. No field is flawless in this respect, but religious history is less than distinguished in inaugurating new theoretical approaches, and Canadian religious history is known more by what it has appropriated from other fields than for what it has contributed. The multi-faceted nature of religion transects many different areas, but apart from the often vague claim to study "religion for its own sake," the history of religion has had difficulty defining itself structurally and methodologically. As Mark McGowan noted, the study of religion has taken the first step "out of the cloister" to engage broader issues of the role of religion in society, but English-Canadian historians of religion have been particularly reluctant to embrace new methods and approaches. McGowan was primarily concerned with developing a closer alliance with social scientific methodology, but the same argument could be made in relation to post-structuralism and the literary criticism of cultural studies.(f.2) Instead of coming to terms with the conceptual difficulties of studying "religion" in history, the historiography of Protestantism in Canada has been distracted by an overriding preoccupation with what has been labelled as the "secularization thesis." The origins of this debate are usually traced to Richard Allen's The Social Passion: Religion and Social Reform in Canada, 1914-28 (1971) which linked the "social gospel" movement with the rise of democratic socialism. The current debate over secularization was crystallized by Ramsay Cook's The Regenerators: Social Criticism in Late Victorian English Canada (1985) and David B. Marshall's Secularizing the Faith: Canadian Protestant Clergy and the Crisis of Belief, 1850-1940 (1992). Despite important differences in subject and approach between Marshall and Cook, both interpreted secularization as a process that was internal to religion, as well as external. The accommodation of religion in the face of modern intellectual and cultural challenges hastened its demise instead of bolstering its status. What originally appeared to be a promising and fruitful debate that would excite interest in the emerging field has unfortunately degenerated into a polarized and acrimonious battle that has obscured more than it has revealed. The shadow of secularization, the legacy of assumptions regarding "church" history and the uncertainty about which direction the second step "out of the cloister" should take are all elements that are clearly evident in this collection of five recent offerings on the history of Protestantism in Canada. The range and diversity of these works are encouraging signs but it is also apparent that the confident predictions about the development of religious history have not yet lived up to expectations. There is an increasing concern with recovering "religious experience" and "popular piety," but this transition has been accomplished with very little examination of the basic theoretical assumptions underlying its approach and methodology. …
Reassembleable Disassembling | Reverse engineering has many important applications in computer security, one of which is retrofitting software for safety and security hardening when source code is not available. By surveying available commercial and academic reverse engineering tools, we surprisingly found that no existing tool is able to disassemble executable binaries into assembly code that can be correctly assembled back in a fully automated manner, even for simple programs. In many cases, the resulting disassembled code is far from a state that an assembler accepts, which is hard to fix even with manual effort. This has become a severe obstacle. People have tried to overcome it by patching or duplicating new code sections for retrofitting of executables, which is not only inefficient but also cumbersome and restrictive in the retrofitting techniques that can be applied. In this paper, we present UROBOROS, a tool that can disassemble executables to the extent that the generated code can be assembled back to working binaries without manual effort. By empirically studying 244 binaries, we summarize a set of rules that can make the disassembled code relocatable, which is the key to reassembleable disassembling. With UROBOROS, the disassembly-reassembly process can be repeated thousands of times. We have implemented a prototype of UROBOROS and tested it over the whole set of GNU Coreutils, SPEC2006, and a set of other real-world application and server programs. The experimental results show that our tool is effective with a very modest cost.
Future of Cannabis and Cannabinoids in Therapeutics | This study reviews human clinical experience to date with several synthetic cannabinoids, including nabilone, levonantradol, ajulemic acid (CT3), dexanabinol (HU-211), HU-308, and SR141716 (Rimonabant®). Additionally, the concept of "clinical endogenous cannabinoid deficiency" is explored as a possible factor in migraine, idiopathic bowel disease, fibromyalgia and other clinical pain states. The concept of analgesic synergy of cannabinoids and opioids is addressed. A cannabinoid-mediated improvement in night vision at the retinal level is discussed, as well as its potential application to treatment of retinitis pigmentosa and other conditions. Additionally noted is the role of cannabinoid treatment in neuroprotection and its application to closed head injury, cerebrovascular accidents, and CNS degenerative diseases including Alzheimer, Huntington, Parkinson diseases and ALS. Excellent clinical results employing cannabis based medicine extracts (CBME) in spasticity and spasms of MS suggests extension of such treatment to other spasmodic and dystonic conditions. Finally, controversial areas of cannabinoid treatment in obstetrics, gynecology and pediatrics are addressed along with a rationale for such interventions.
Marine pharmacology in 2005-2006: antitumour and cytotoxic compounds. | During 2005 and 2006, marine pharmacology research directed towards the discovery and development of novel antitumour agents was reported in 171 peer-reviewed articles. The purpose of this article is to present a structured review of the antitumour and cytotoxic properties of 136 marine natural products, many of which are novel compounds that belong to diverse structural classes, including polyketides, terpenes, steroids and peptides. The organisms yielding these bioactive marine compounds included invertebrate animals, algae, fungi and bacteria. Antitumour pharmacological studies were conducted with 42 structurally defined marine natural products in a number of experimental and clinical models which further defined their mechanisms of action. Particularly potent in vitro cytotoxicity data generated with murine and human tumour cell lines were reported for 94 novel marine chemicals with as yet undetermined mechanisms of action. Noteworthy is the fact that marine anticancer research was sustained by a global collaborative effort, involving researchers from Australia, Belgium, Benin, Brazil, Canada, China, Egypt, France, Germany, India, Indonesia, Italy, Japan, Mexico, the Netherlands, New Zealand, Panama, the Philippines, Slovenia, South Korea, Spain, Sweden, Taiwan, Thailand, United Kingdom (UK) and the United States of America (USA). Finally, this 2005-2006 overview of the marine pharmacology literature highlights the fact that the discovery of novel marine antitumour agents continued at the same active pace as during 1998-2004. |
Spotting Symbols in Line Drawing Images Using Graph Representations | Many methods of graphics recognition have been developed throughout the years for the recognition of pre-segmented graphics symbols, but very few techniques have achieved the objective of symbol spotting and recognition together in a generic case. To go one step forward towards this objective, this paper presents an original solution for symbol spotting using a graph representation of graphical documents. The proposed strategy has two main steps. In the first step, a graph-based representation of a document image is generated, which includes the selection of description primitives (nodes of the graph) and the organisation of these features (edges). In the second step, the graph is used to spot interesting parts of the image that potentially correspond to symbols. The sub-graphs associated with the selected zones are then submitted to a graph matching algorithm in order to take the final decision and to recognize the class of the symbol. The experimental results obtained on different types of documents demonstrate that the system can handle different types of images without any modification.
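As a rough illustration of the two-step idea — build a graph of description primitives, then match candidate zones against a symbol model with graph matching — the sketch below uses networkx subgraph isomorphism; the node and edge attributes are invented for the example and are not the descriptors used in the paper:

```python
import networkx as nx
from networkx.algorithms import isomorphism

# Document graph: nodes are graphical primitives (segments, arcs, ...),
# edges encode the spatial organisation between them. Attributes are invented.
doc = nx.Graph()
doc.add_nodes_from([(1, {"kind": "segment"}), (2, {"kind": "arc"}),
                    (3, {"kind": "segment"}), (4, {"kind": "segment"})])
doc.add_edges_from([(1, 2), (2, 3), (3, 4)])

# Symbol model graph to spot inside the document graph.
symbol = nx.Graph()
symbol.add_nodes_from([("a", {"kind": "segment"}), ("b", {"kind": "arc"})])
symbol.add_edge("a", "b")

# Subgraph matching: candidate zones of the document graph are compared
# against the symbol model; node labels must agree on the primitive kind.
matcher = isomorphism.GraphMatcher(
    doc, symbol, node_match=lambda n1, n2: n1["kind"] == n2["kind"])
matches = list(matcher.subgraph_isomorphisms_iter())
print(f"{len(matches)} candidate occurrence(s) of the symbol found")
```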
Defect-Based Testing | What is a good test case? One that reveals potential defects with good cost-effectiveness. We provide a generic model of faults and failures, formalize it, and present its various methodological usages for test case generation. |
Mental health literacy measures evaluating knowledge, attitudes and help-seeking: a scoping review. | BACKGROUND
Mental health literacy has received increasing attention as a useful strategy to promote early identification of mental disorders, reduce stigma and enhance help-seeking behaviors. However, despite the abundance of research on mental health literacy interventions, evaluations of currently available mental health literacy measures and their psychometric properties are lacking. We conducted a scoping review to bridge this gap.
METHODS
We searched PubMed, PsycINFO, Embase, CINAHL, Cochrane Library, and ERIC for relevant studies. We focused only on quantitative studies and English-language publications; however, we did not limit study participants, locations, or publication dates. We did not check the grey literature (non-peer-reviewed publications or documents of any type) and may therefore have missed some eligible measures.
RESULTS
We located 401 studies that include 69 knowledge measures (14 validated), 111 stigma measures (65 validated), and 35 help-seeking related measures (10 validated). Knowledge measures mainly investigated the ability to identify mental illness and factual knowledge of mental disorders such as terminology, etiology, diagnosis, prognosis, and consequences. Stigma measures included those focused on stigma against mental illness or the mentally ill; self-stigma; experienced stigma; and stigma against mental health treatment and help-seeking. Help-seeking measures included those of help-seeking attitudes, intentions to seek help, and actual help-seeking behaviors.
CONCLUSIONS
Our review provides a compendium of available mental health literacy measures to facilitate applying existing measures or developing new measures. It also provides a solid database for future research on systematically assessing the quality of the included measures. |
Pathway selection mechanism of a screw drive in-pipe robot in T-branches | Pipelines are important infrastructure in today's society. To avoid leakages in these pipelines, efficient robotic pipe inspection is required, and to date, various types of in-pipe robots have been developed. Some of them can select their path at a T-branch, but at the expense of additional actuators. However, fewer actuators are better in terms of size reduction, energy conservation, production cost, and maintenance. To reduce the number of actuators, a screw drive mechanism with only one actuator has been developed for propelling an in-pipe robot through straight pipes and elbow pipes. Based on this screw drive mechanism, in this paper we develop a novel robot that uses only two motors and can select pathways. The robot has three locomotion modes: screw driving, steering, and rolling. These modes enable the robot to navigate not only straight pipes but also elbow pipes and T-branches. We performed experiments to verify the validity of the proposed mechanism.
Divide and conquer: neuroevolution for multiclass classification | Neuroevolution is a powerful and general technique for evolving the structure and weights of artificial neural networks. Though neuroevolutionary approaches such as NeuroEvolution of Augmenting Topologies (NEAT) have been successfully applied to various problems including classification, regression, and reinforcement learning problems, little work has explored application of these techniques to larger-scale multiclass classification problems. In this paper, NEAT is evaluated in several multiclass classification problems, and then extended via two ensemble approaches: One-vs-All and One-vs-One. These approaches decompose multiclass classification problems into a set of binary classification problems, in which each binary problem is solved by an instance of NEAT. These ensemble models exhibit reduced variance and increasingly superior accuracy as the number of classes increases. Additionally, higher accuracy is achieved early in training, even when artificially constrained for the sake of fair comparison with standard NEAT. However, because the approach can be trivially distributed, it can be applied quickly at large scale to solve real problems. In fact, these approaches are incorporated into Darwin™, an enterprise automatic machine learning solution that also incorporates various other algorithmic enhancements to NEAT. The resulting complete system has proven robust to a wide variety of client datasets.
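The One-vs-All decomposition used here is independent of the underlying binary learner. A minimal sketch of the idea, with scikit-learn's logistic regression standing in for a NEAT instance (the actual system evolves one network per binary problem):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

class OneVsAllEnsemble:
    """Decompose a k-class problem into k binary problems, one learner each.
    Each learner here is a logistic regression standing in for a NEAT run."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = {}
        for c in self.classes_:
            m = LogisticRegression(max_iter=1000)
            m.fit(X, (y == c).astype(int))   # class c vs. the rest
            self.models_[c] = m
        return self

    def predict(self, X):
        # Pick the class whose binary model is most confident.
        scores = np.column_stack(
            [self.models_[c].predict_proba(X)[:, 1] for c in self.classes_])
        return self.classes_[scores.argmax(axis=1)]

# Tiny usage example on random data.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = rng.integers(0, 4, size=300)
print(OneVsAllEnsemble().fit(X, y).predict(X[:5]))
```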
Higher Lipoprotein (a) Levels Are Associated with Better Pulmonary Function in Community-Dwelling Older People – Data from the Berlin Aging Study II | Reduced pulmonary function and elevated serum cholesterol levels are recognized risk factors for cardiovascular disease. Currently, there is some controversy concerning the relationships between cholesterol, LDL-cholesterol, HDL-cholesterol, serum triglycerides and lung function. However, most previous studies compared patients suffering from chronic obstructive pulmonary disease (COPD) with healthy controls, and only a small number examined this relationship in population-based cohorts. Moreover, lipoprotein(a) [Lp(a)], another lipid parameter independently associated with cardiovascular disease, appears not to have been addressed at all in studies of lung function at the population level. Here, we determined relationships between lung function and several lipid parameters including Lp(a) in 606 older community-dwelling participants (55.1% women, 68±4 years old) from the Berlin Aging Study II (BASE-II). We found a significantly lower forced expiratory volume in 1 second (FEV1) in men with low Lp(a) concentrations (t-test). This finding was further substantiated by linear regression models adjusting for known covariates, showing that these associations are statistically significant in both men and women. According to the highest adjusted model, men and women with Lp(a) levels below the 20th percentile had 217.3 ml and 124.2 ml less FEV1 and 239.0 ml and 135.2 ml less FVC, respectively, compared to participants with higher Lp(a) levels. The adjusted models also suggest that the known strong correlation between pro-inflammatory parameters and lung function has only a marginal impact on the Lp(a)-pulmonary function association. Our results do not support the hypothesis that higher Lp(a) levels are responsible for the increased CVD risk in people with reduced lung function, at least not in the group of community-dwelling older people studied here.
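The adjusted comparison described above boils down to a linear model of lung function on a low-Lp(a) indicator plus covariates. A hedged sketch with statsmodels on synthetic data; the variable names and covariates are illustrative, not the BASE-II model specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; the numbers carry no scientific meaning.
rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "fev1_ml": rng.normal(2800, 400, n),
    "low_lpa": rng.integers(0, 2, n),      # 1 = Lp(a) below the 20th percentile
    "age": rng.normal(68, 4, n),
    "height_cm": rng.normal(170, 9, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})

# FEV1 regressed on the low-Lp(a) indicator, adjusted for covariates.
model = smf.ols("fev1_ml ~ low_lpa + age + height_cm + male + smoker",
                data=df).fit()
print(model.params["low_lpa"], model.pvalues["low_lpa"])
```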
60-GHz single-chip integrated antenna and Low Noise Amplifier in 65-nm CMOS SOI technology for short-range wireless Gbits/s applications | The single-chip integration of an antenna and a Low Noise Amplifier (LNA) for 60 GHz short-range wireless transceivers is presented in this work. A 65 nm CMOS Silicon-on-Insulator (SOI) technology has been selected as the target; thanks to its high-resistivity substrate, losses are drastically reduced compared with bulk silicon technology, and more energy is delivered to the on-chip antenna for radiation. Two different LNA architectures are proposed. First, a three-stage LNA with conventional 50 Ohm input matching achieves a power gain of 23 dB, a noise figure (NF) of 4.04 dB and a power consumption of 35 mW. By relaxing the impedance matching specification, enabled by the on-chip co-design of amplifier and antenna, a new LNA with only two amplification stages has been designed. The two-stage LNA achieves performance similar to the three-stage one (gain > 22 dB, NF < 5 dB) with a power consumption reduced by 25%. A dipole antenna with a coplanar strip feed has also been designed, matching the LNA input impedance and achieving an antenna gain of 3.22 dB at 60 GHz with limited on-chip area occupation.
Examining the Effects of Deregulation on Retail Electricity Prices | A primary aim of deregulation is to reduce the customer cost of electricity. In this paper, we examine the degree of success in reaching that goal using a variety of methods. We examine rates for each of four customer classes; for regulated, deregulated and publicly owned utilities; and for three definitions of deregulation. We control for a variety of factors which may independently affect differences in electricity price: climate, fuel costs, and electricity generation by energy source. Taken as a whole, the results from our analysis do not support a conclusion that deregulation has led to lower electricity rates.
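The comparison described — rates by regulatory status, controlling for fuel costs, climate, and generation mix — can be framed as a regression of retail price on a deregulation indicator plus controls. A sketch with statsmodels on synthetic data; all variable names and values are placeholders, not the paper's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic utility-level data for illustration only.
rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "price_cents_kwh": rng.normal(10, 2, n),
    "deregulated": rng.integers(0, 2, n),
    "public_utility": rng.integers(0, 2, n),
    "fuel_cost_index": rng.normal(100, 15, n),
    "cooling_degree_days": rng.normal(1200, 300, n),
    "share_hydro": rng.uniform(0, 0.5, n),
})

# Retail price regressed on deregulation status with controls.
fit = smf.ols("price_cents_kwh ~ deregulated + public_utility"
              " + fuel_cost_index + cooling_degree_days + share_hydro",
              data=df).fit()
print(fit.params)
```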
Creating Capsule Wardrobes from Fashion Images | We propose to automatically create capsule wardrobes. Given an inventory of candidate garments and accessories, the algorithm must assemble a minimal set of items that provides maximal mix-and-match outfits. We pose the task as a subset selection problem. To permit efficient subset selection over the space of all outfit combinations, we develop submodular objective functions capturing the key ingredients of visual compatibility, versatility, and user-specific preference. Since adding garments to a capsule only expands its possible outfits, we devise an iterative approach to allow near-optimal submodular function maximization. Finally, we present an unsupervised approach to learn visual compatibility from "in the wild" full body outfit photos; the compatibility metric translates well to cleaner catalog photos and improves over existing methods. Our results on thousands of pieces from popular fashion websites show that automatic capsule creation has potential to mimic skilled fashionistas in assembling flexible wardrobes, while being significantly more scalable. |
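Because adding a garment can only expand the set of possible outfits, the objective is monotone, and greedy selection is a natural near-optimal strategy for submodular maximization. A toy sketch of greedy capsule building over a "distinct outfits" objective; the scoring function is a placeholder for the paper's learned compatibility and versatility terms:

```python
def outfit_count(capsule, layers):
    """Toy objective: number of mix-and-match outfits a capsule supports,
    taking one item from each layer (tops, bottoms, shoes, ...)."""
    per_layer = [sum(1 for item in capsule if item in layer) for layer in layers]
    if not all(per_layer):
        return 0
    count = 1
    for k in per_layer:
        count *= k
    return count

def greedy_capsule(candidates, layers, budget):
    """Greedily add the item with the largest marginal gain in outfits."""
    capsule = []
    for _ in range(budget):
        gain, best = max(
            (outfit_count(capsule + [c], layers) - outfit_count(capsule, layers), c)
            for c in candidates if c not in capsule)
        capsule.append(best)
    return capsule

tops, bottoms, shoes = {"t1", "t2", "t3"}, {"b1", "b2"}, {"s1", "s2"}
items = sorted(tops | bottoms | shoes)
print(greedy_capsule(items, [tops, bottoms, shoes], budget=5))
```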
StackGhost: Hardware Facilitated Stack Protection | Conventional security exploits have relied on overwriting the saved return pointer on the stack to hijack the path of execution. Under Sun Microsystems' SPARC processor architecture, we were able to implement a kernel modification to transparently and automatically guard applications' return pointers. Our implementation, called StackGhost, under OpenBSD 2.8 acts as a ghost in the machine. StackGhost advances exploit prevention in that it protects every application run on the system without their knowledge and without requiring source or binary modification. We document several of the methods devised to preserve the sanctity of the system and explore the performance ramifications of StackGhost.
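Outside the kernel, the guarding idea can be pictured as XOR-encoding the saved return address with a per-process secret when it is written and decoding it on return, so a raw overwrite no longer decodes to the address the attacker intended. A conceptual Python sketch, not the OpenBSD/SPARC implementation:

```python
import secrets

COOKIE = secrets.randbits(64) | 1   # per-process secret; `| 1` keeps it nonzero

def protect(return_addr: int) -> int:
    """Encode a saved return address before writing it to the stack."""
    return return_addr ^ COOKIE

def restore(stored: int) -> int:
    """Decode the stored value when the function returns."""
    return stored ^ COOKIE

legit = 0x00401234
assert restore(protect(legit)) == legit        # a normal call/return still works

overwritten = 0xDEADBEEF                       # attacker writes a raw address
assert restore(overwritten) != overwritten     # it no longer decodes as intended
```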
It's More than Just Sharing Game Play Videos! Understanding User Motives in Mobile Game Social Media | As mobile gaming has become increasingly popular in recent years, new forms of mobile game social media, such as GameDuck, which shares mobile gameplay videos, have emerged. In this work, we set out to understand the user motives of GameDuck by leveraging the Uses and Gratifications Theory. We first explore the major motive themes from users' responses (n=138) and generate motivation survey items. We then identify the key motivators by conducting exploratory factor analysis of the survey results (n=354). Finally, we discuss how this new social media relates to existing systems such as Twitch.
Very Small Ultra-Wide-Band MMIC Magic T and Applications to Combiners and Dividers | An FET-sized 1-18 GHz monolithic active magic T (1 W hybrid) is proposed. It unifies two different dividers, electrically isolated from each other, in a novel GaAs FET electrode configuration, viz. the LUFET concept. Its characteristics and experimental results are presented. Applications of the magic T to miniature wide-band RF signal processing...
花岗岩成因及构造环境认识 Study on Granites’ Origination and Tectonic Environment | On Earth, granite occurs almost entirely on the continents; it is one of the most widely distributed continental rocks and a principal compositional marker distinguishing continental crust from oceanic crust. By reviewing the long-standing research of geologists on granite, this paper summarizes current understanding of granite, including the classification of granite genesis, in which each genetic type corresponds to a different geological evolutionary history. Crustal anatexis (deep melting of the crust) is recognized as the basis for studying granite genesis, and the tectonic setting and the source rock are the main factors considered in studying granite petrogenesis. Finally, the tectonic background of granite formation is discussed in depth.
Millimeter-wave beamforming as an enabling technology for 5G cellular communications: theoretical feasibility and prototype results | The ever-growing explosion of traffic in mobile communications has recently drawn increased attention to the large amount of underutilized spectrum in the millimeter-wave frequency bands as a potentially viable solution for achieving tens to hundreds of times more capacity compared to current 4G cellular networks. Historically, mmWave bands were ruled out for cellular usage mainly due to concerns regarding short-range and non-line-of-sight coverage issues. In this article, we present recent results from channel measurement campaigns and the development of advanced algorithms and a prototype, which clearly demonstrate that the mmWave band may indeed be a worthy candidate for next generation (5G) cellular systems. The results of channel measurements carried out in both the United States and Korea are summarized along with the actual free space propagation measurements in an anechoic chamber. Then a novel hybrid beamforming scheme and its link- and system-level simulation results are presented. Finally, recent results from our mmWave prototyping efforts along with indoor and outdoor test results are described to assert the feasibility of mmWave bands for cellular usage.
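As a toy picture of the beam-steering step underlying such prototypes, the sketch below computes phase-only weights for a uniform linear array and checks where the resulting beam points; this is a textbook illustration, not the hybrid beamforming scheme developed in the paper:

```python
import numpy as np

def steering_vector(n_elems, spacing_wl, angle_deg):
    """Array response of a uniform linear array (element spacing in wavelengths)."""
    k = np.arange(n_elems)
    phase = 2 * np.pi * spacing_wl * k * np.sin(np.radians(angle_deg))
    return np.exp(1j * phase) / np.sqrt(n_elems)

n, d = 16, 0.5                          # 16 elements, half-wavelength spacing
w = steering_vector(n, d, 30.0)         # phase-only weights aimed at 30 degrees

# Normalized beam pattern: response of the weighted array toward each angle.
angles = np.linspace(-90, 90, 361)
pattern = np.array([np.abs(np.vdot(w, steering_vector(n, d, a))) for a in angles])
print(f"beam peaks at {angles[pattern.argmax()]:.1f} degrees")
```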
Automated Floating-Point Precision Analysis | As scientific computation continues to scale, it is crucial to use floating-point arithmetic processors as efficiently as possible. Lower precision allows streaming architectures to perform more operations per second and can reduce memory bandwidth pressure on all architectures. However, using a precision that is too low for a given algorithm and data set will result in inaccurate results. Thus, developers must balance speed and accuracy when choosing the floating-point precision of their subroutines and data structures. I am investigating techniques to help developers learn about the runtime floating-point behavior of their programs, and to help them make decisions concerning the choice of precision in implementation. I propose to develop methods that will generate floating-point precision configurations, automatically testing and validating them using binary instrumentation. The goal is ultimately to make a recommendation to the developer regarding which parts of the program can be reduced to single-precision. The central thesis is that automated analysis techniques can make recommendations regarding the precision levels that each part of a computer program must use to maintain overall accuracy, with the goal of improving performance on scientific codes. |
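The trade-off under analysis can be seen in miniature by running the same computation in single and double precision and comparing the error — the kind of observation the proposed tool would automate across a whole program via binary instrumentation. A small numpy illustration:

```python
import numpy as np

# The same summation carried out in double and single precision.
x64 = np.full(10_000_000, 0.1, dtype=np.float64)
x32 = x64.astype(np.float32)

s64 = x64.sum()
s32 = x32.sum(dtype=np.float32)   # keep the accumulator in single precision

exact = 1_000_000.0
print(f"float64 error: {abs(s64 - exact):.3e}")
print(f"float32 error: {abs(float(s32) - exact):.3e}")
# If the float32 error is acceptable for the use case, this part of the
# computation is a candidate for demotion to single precision.
```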