title | abstract
---|---
A Novel Discriminative Framework for Sentence-Level Discourse Analysis | We propose a complete probabilistic discriminative framework for performing sentence-level discourse analysis. Our framework comprises a discourse segmenter, based on a binary classifier, and a discourse parser, which applies an optimal CKY-like parsing algorithm to probabilities inferred from a Dynamic Conditional Random Field. We show on two corpora that our approach outperforms the state-of-the-art, often by a wide margin. |
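As a generic illustration of the CKY-style search such a parser performs (a sketch, not the authors' implementation; the grammar encoding, rule names, and probabilities are assumptions of this example), a minimal probabilistic CKY over binary rules:

```python
import math

def cky_best_parse(tokens, lexicon, rules):
    """Probabilistic CKY: best log-probability per nonterminal over the full span.

    lexicon: {(nonterminal, token): prob} for terminal rules
    rules:   {(parent, left, right): prob} for binary rules
    """
    n = len(tokens)
    # chart[i][j] maps nonterminal -> best log-prob covering tokens[i:j]
    chart = [[{} for _ in range(n + 1)] for _ in range(n + 1)]
    for i, tok in enumerate(tokens):
        for (nt, word), p in lexicon.items():
            if word == tok:
                cell = chart[i][i + 1]
                cell[nt] = max(cell.get(nt, -math.inf), math.log(p))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # split point
                for (parent, left, right), p in rules.items():
                    if left in chart[i][k] and right in chart[k][j]:
                        score = math.log(p) + chart[i][k][left] + chart[k][j][right]
                        if score > chart[i][j].get(parent, -math.inf):
                            chart[i][j][parent] = score
    return chart[0][n]
```

In the paper's setting, the rule probabilities would come from the Dynamic CRF rather than a fixed grammar; the chart recursion itself is unchanged.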
MOSAIC Model for Sensorimotor Learning and Control | Humans demonstrate a remarkable ability to generate accurate and appropriate motor behavior under many different and often uncertain environmental conditions. We previously proposed a new modular architecture, the modular selection and identification for control (MOSAIC) model, for motor learning and control based on multiple pairs of forward (predictor) and inverse (controller) models. The architecture simultaneously learns the multiple inverse models necessary for control as well as how to select the set of inverse models appropriate for a given environment. It combines both feedforward and feedback sensorimotor information so that the controllers can be selected both prior to movement and subsequently during movement. This article extends and evaluates the MOSAIC architecture in the following respects. First, learning in the architecture was implemented with both the original gradient-descent method and the expectation-maximization (EM) algorithm; unlike gradient descent, the newly derived EM algorithm is robust to the initial starting conditions and learning parameters. Second, simulations of an object manipulation task demonstrate that the architecture can learn to manipulate multiple objects and switch between them appropriately. Moreover, after learning, the model shows generalization to novel objects whose dynamics lie within the polyhedra of already learned dynamics. Finally, when each of the dynamics is associated with a particular object shape, the model is able to select the appropriate controller before movement execution. When presented with a novel shape-dynamic pairing, inappropriate activation of modules is observed, followed by on-line correction. |
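The selection step described above hinges on a soft responsibility signal computed from each forward model's prediction error. A minimal sketch of such a softmax-style gating rule (simplified for illustration; the Gaussian error model and the `sigma` parameter are assumptions of this example, not the paper's exact formulation):

```python
import math

def responsibilities(pred_errors, sigma=1.0):
    """Soft responsibilities for competing forward models.

    Each module's likelihood falls off with its prediction error under an
    assumed Gaussian noise model; normalizing yields weights that sum to 1
    and favor the module that best predicts the current dynamics.
    """
    likes = [math.exp(-e * e / (2.0 * sigma * sigma)) for e in pred_errors]
    z = sum(likes)
    return [l / z for l in likes]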
An ultra low power 1V, 220nW temperature sensor for passive wireless applications | This work presents a low power temperature sensor that is suitable for passive wireless systems. The test chip is fabricated in a 0.18 μm CMOS technology and the total area is 0.05 mm². With a temperature inaccuracy of −1.6 °C/+3 °C from 0 °C to 100 °C, the temperature sensor consumes only 220 nW at 1 V under room temperature. The data conversion rate is 100 samples/s with an output resolution of 0.3 °C, which is sufficient for most sensor applications. |
Electrochemical deposition of barium titanate thin films on TiN/Si substrates | Barium titanate (BaTiO3) films were synthesized on TiN-coated Si substrates by electrochemical anodic oxidation in mixed alkaline electrolytes of 0.5 M Ba(CH3COO)2 and 2 M NaOH. A potentiostatic mode was used to grow the films at voltages ranging from 2 to 60 V at 70 °C for only 1 min. X-ray diffraction results show that cubic BaTiO3 films were successfully prepared on TiN-coated substrates at reaction voltages above 2 V. Field-emission scanning electron microscopy revealed that the obtained BaTiO3 films possessed a uniformly distributed spherical-particulate surface morphology with a nanolayered feature. The thickness of BaTiO3 could reach about 4 μm after electrochemical oxidation at a voltage of 60 V. The growth rate of BaTiO3 films synthesized by electrochemical oxidation was much faster than that of films prepared by the previously reported hydrothermal and hydrothermal–galvanic couple methods, where only a single layer was produced. Potentiodynamic polarization reveals that the corrosion resistance of TiN-coated substrates was greatly improved by electrochemical deposition of the dense and thick BaTiO3 films over the substrates. |
Credit scoring using data mining techniques with particular reference to Sudanese banks | One of the key success factors of lending organizations in general, and banks in particular, is the assessment of borrower credit worthiness in advance during the credit evaluation process. Credit scoring models have been applied by many researchers to improve the process of assessing credit worthiness by differentiating between prospective loans on the basis of the likelihood of repayment. Thus, credit scoring is a very typical Data Mining (DM) classification problem. Many traditional statistical and modern computational intelligence techniques have been presented in the literature to tackle this problem. The main objective of this paper is to describe an experiment in building suitable Credit Scoring Models (CSMs) for the Sudanese banks. Two commonly discussed data mining classification techniques are chosen in this paper, namely Decision Trees (DT) and Artificial Neural Networks (ANN). In addition, Genetic Algorithms (GA) and Principal Component Analysis (PCA) are applied as feature selection techniques. In addition to a Sudanese credit dataset, the German credit dataset is also used to evaluate these techniques. The results reveal that ANN models outperform DT models in most cases. Using GA for feature selection is more effective than using PCA. The highest accuracies on the German dataset (80.67%) and the Sudanese dataset (69.74%) are achieved by a hybrid GA-ANN model. Although DT and its hybrid models (PCA-DT, GA-DT) are outperformed by ANN and its hybrid models (PCA-ANN, GA-ANN) in most cases, they produce interpretable loan-granting decisions. |
High-Temperature Oxidation of Eutectic Alloy Nb-Si, Doped with Yttrium | Nb-Si composites with a high melting point and low density are of interest as candidate materials for use in aircraft engines. However, one major limitation in the application of Nb-Si composites is their poor corrosion resistance at elevated temperatures. The corrosion resistance of Nb-Si composites can be improved by alloying elements such as yttrium. This paper studies the effect of yttrium on the oxidation of the eutectic alloy Nb-Si at 25–1000 °C in air by thermogravimetry, X-ray diffraction analysis, and electron-probe microanalysis. The oxidation products are oxides of silicon and niobium, which form at temperatures above 600 °C. Positive effects of yttrium on the corrosion resistance of the Nb-Si eutectic alloy up to 700 °C are shown. A yttrium content above 3% in the samples increases the oxidation rate of the alloys and reduces their heat resistance when heated in air. |
Learning to Rank for Consumer Health Search: A Semantic Approach | For many internet users, searching for health advice online is the first step in seeking treatment. We present a Learning to Rank system that uses a novel set of syntactic and semantic features to improve consumer health search. Our approach was evaluated on the 2016 CLEF eHealth dataset, outperforming the best method by 26.6% in NDCG@10. |
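The NDCG@10 metric reported above can be computed as follows; this sketch uses the linear-gain formulation of DCG (some evaluations use a 2^rel − 1 gain instead):

```python
import math

def dcg_at_k(relevances, k=10):
    # DCG with log2 discount: rel at rank r contributes rel / log2(r + 1), ranks 1-indexed
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k=10):
    """NDCG@k: DCG of the ranking as returned, normalized by the ideal ordering."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0
```

A system-level NDCG@10 score, as in the reported 26.6% improvement, would be this value averaged over all evaluation queries.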
A randomized controlled trial to evaluate the effectiveness of a board game on patients' knowledge uptake of HIV and sexually transmitted diseases at the Infectious Diseases Institute, Kampala, Uganda. | BACKGROUND
As the number of HIV infections continues to rise, the search for effective health education strategies must intensify. A new educational board game was developed to increase the attention of people living with HIV to, and their knowledge of, information about HIV and sexually transmitted infections (STIs). The objective of this study was to assess the effect of this educational board game on the uptake of knowledge.
METHODS
A randomized controlled trial in which patients attending the Infectious Diseases Clinic, Kampala, Uganda were randomized either to play the board game (intervention arm) or to attend a health talk (standard of care arm). Participants' knowledge was assessed before and after the education sessions through a questionnaire.
RESULTS
One hundred eighty HIV-positive participants were enrolled, 90 for each study arm. The pretest scores were similar for each arm. There was a statistically significant increase in uptake of knowledge of HIV and STIs in both study arms. Compared with patients in the standard of care arm, participants randomized to the intervention arm had higher uptake of knowledge (4.7 points, 95% confidence interval: 3.9 to 5.4) than the controls (1.5 points, 95% confidence interval: 0.9 to 2.1) with a difference in knowledge uptake between arms of 3.2 points (P < 0.001). Additionally, both participants and facilitators preferred the board game to the health talk as education method.
CONCLUSIONS
The educational game resulted in significantly higher uptake of knowledge of HIV and STIs. Further evaluation of the impact of this educational game on behavioral change in the short and long term is warranted. |
A cooperative approach for handshake detection based on body sensor networks | The handshake gesture is an important part of the social etiquette in many cultures. It lies at the core of many human interactions, in either formal or informal settings: exchanging greetings, offering congratulations, and finalizing a deal are all activities that typically either start or finish with a handshake. The automated detection of a handshake can enable a wide range of pervasive computing scenarios; in particular, different types of information can be exchanged and processed among the handshaking persons, depending on the physical/logical contexts where they are located and on their mutual acquaintance. This paper proposes a novel handshake detection system based on body sensor networks consisting of a resource-constrained wrist-wearable sensor node and a more capable base station. The system uses an effective collaboration technique among the body sensor networks of the handshaking persons, which minimizes errors associated with the application of classification algorithms and improves the overall accuracy in terms of the number of false positives and false negatives. |
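The cooperative cross-confirmation idea — declaring a handshake only when both parties' wrist nodes report a shake-like event close in time — can be illustrated with a toy matching function (the actual system applies classification algorithms to sensor data; the timestamps and window value here are invented for the example):

```python
def confirm_handshakes(events_a, events_b, window=0.5):
    """Cross-confirm candidate shake events from two wrist-worn nodes.

    events_a, events_b: detection timestamps (seconds) reported by each node.
    A pair is confirmed only when both nodes fired within `window` seconds,
    suppressing false positives from one-sided arm movements.
    """
    confirmed = []
    for ta in events_a:
        for tb in events_b:
            if abs(ta - tb) <= window:
                confirmed.append((ta, tb))
    return confirmed
```

In the described architecture this matching would run at the base station, which receives candidate events from both resource-constrained wrist nodes.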
Isometric Exercise for the Cervical Extensors Can Help Restore Physiological Lordosis and Reduce Neck Pain: A Randomized Controlled Trial. | OBJECTIVE
The aim of this study was to investigate whether isometric neck extension exercise restores physiological cervical lordosis and reduces pain.
DESIGN
Sixty-five patients with loss of cervical lordosis were randomly assigned to exercise (27 women, 7 men; mean age, 32.82 ± 8.83 yrs) and control (26 women, 5 men; mean age, 33.48 ± 9.67 yrs) groups. Both groups received nonsteroidal anti-inflammatory drugs for 10 days. The exercise group received additional therapy as a home exercise program, which consisted of isometric neck extension for 3 mos. Neck pain severity and cervical lordosis were measured at baseline and at 3 mos after baseline.
RESULTS
Compared with baseline levels, cervical lordosis angle was significantly improved in the exercise group (P < 0.001) but not in the control group (P = 0.371) at the end of 3 mos. Moreover, the exercise group was significantly superior to the control group considering the number of patients in whom cervical lordosis angle returned to physiological conditions (85.2% vs. 22.5%; P < 0.001). At the end of 3 mos, pain intensity was significantly reduced in both groups compared with baseline levels (for all, P < 0.001). Nevertheless, considering the change from baseline to month 3, the reduction in pain was about twice in the exercise group compared with the control group (P < 0.001).
CONCLUSIONS
Isometric neck extension exercise improves cervical lordosis and pain. |
A Study on Consumers’ Attitude towards Online Shopping in China | Online shopping provides a good example of the business revolution. In China, e-commerce is currently experiencing a period of rapid development; the large number of Internet users provides a good foundation for the expansion of the online shopping market. In this study, perceived usability, perceived security, perceived privacy, perceived after-sales service, perceived marketing mix, and perceived reputation were used for analysis. The research was conducted using a primary data source, and the survey method was employed. It found that there were relationships between perceived usability, perceived security, perceived privacy, perceived after-sales service, perceived marketing mix, perceived reputation, and consumers’ attitude towards adopting online shopping in China. However, only marketing mix and reputation were found to significantly influence consumers’ attitude towards adopting online shopping. The findings help in understanding consumers’ online purchase behaviour. |
Mining idioms from source code | We present the first method for automatically mining code idioms from a corpus of previously written, idiomatic software projects. We take the view that a code idiom is a syntactic fragment that recurs across projects and has a single semantic purpose. Idioms may have metavariables, such as the body of a for loop. Modern IDEs commonly provide facilities for manually defining idioms and inserting them on demand, but this does not help programmers to write idiomatic code in languages or using libraries with which they are unfamiliar. We present Haggis, a system for mining code idioms that builds on recent advanced techniques from statistical natural language processing, namely, nonparametric Bayesian probabilistic tree substitution grammars. We apply Haggis to several of the most popular open source projects from GitHub. We present a wide range of evidence that the resulting idioms are semantically meaningful, demonstrating that they do indeed recur across software projects and that they occur more frequently in illustrative code examples collected from a Q&A site. Manual examination of the most common idioms indicates that they describe important program concepts, including object creation, exception handling, and resource management. |
A Toolbox of Potato Genetic and Genomic Resources | Access to genetic and genomic resources can greatly facilitate biological understanding of plant species leading to improved crop varieties. While model plant species such as Arabidopsis have had nearly two decades of genetic and genomic resource development, many major crop species have seen limited development of these resources due to the large, complex nature of their genomes. Cultivated potato is among the ranks of crop species that, despite substantial worldwide acreage, have seen limited genetic and genomic tool development. As technologies advance, this paradigm is shifting and a number of tools are being developed for important crop species such as potato. This review article highlights numerous tools that have been developed for the potato community with a specific focus on the reference de novo genome assembly and annotation, genetic markers, transcriptomics resources, and newly emerging resources that extend beyond a single reference individual. |
Space Vector PWM for PMSM simulation using Matlab Simulink | A Space Vector PWM (SVPWM) model is often built from high-level functions and verified against the output of the inverter or a model of the electrical motor with the best possible accuracy. However, SVPWM implementation on digital hardware such as Field Programmable Gate Arrays (FPGA) and Application-Specific Integrated Circuits (ASIC) is constrained by the limited resources and computation accuracy of such hardware compared to the mathematical model. This paper proposes a method that utilizes Matlab Simulink and the Fixed-Point Toolbox to construct a hardware-amenable SVPWM model. Using the proposed model, it is possible to estimate the digital hardware resources used and to analyze the accuracy of the system before the actual design process takes place. The model has been simulated and verified against signal switching patterns and output signals from the model of the electrical motor. Based on functional comparisons, the outputs of the SVPWM model were found to be almost identical to the digital hardware implementation. |
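As context for such a model, the per-period dwell times of the two active vectors in standard SVPWM can be sketched in floating point (the Fixed-Point Toolbox model described above would quantize these computations; the function and parameter names are assumptions of this sketch):

```python
import math

def svpwm_times(v_ref, v_dc, theta, t_s):
    """Dwell times for one switching period in standard SVPWM.

    theta: reference-vector angle within the current 60-degree sector (radians).
    Returns (t1, t2, t0): times on the two adjacent active vectors and the
    remaining zero-vector time, which pads the period to t_s.
    """
    m = math.sqrt(3.0) * v_ref / v_dc          # modulation index
    t1 = t_s * m * math.sin(math.pi / 3.0 - theta)
    t2 = t_s * m * math.sin(theta)
    t0 = t_s - t1 - t2                         # zero-vector (null) time
    return t1, t2, t0
```

A fixed-point implementation replaces the sine evaluations with lookup tables and scales all quantities to the chosen word length, which is exactly where resource and accuracy trade-offs of the FPGA/ASIC target appear.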
Daily life activity tracking application for smart homes using android smartphone | A smart home can support independent, healthy living for elderly people. Advances in phone technology and a new style of computing paradigm (i.e., cloud computing) permit real-time acquisition, processing, and tracking of activities in a smart home. In this paper, we develop an Android smartphone application that assists elderly people in living independently in their own homes. It reduces health expenditures and the burden on health care professionals in care facility units. We treat the smart home as an intelligent agent that perceives the environment and processes the sensory data in the cloud. The smartphone application communicates with the cloud through web services and assists the elderly person in completing their daily life activities. It supports caregivers by tracking elderly persons in their own homes, helping to avoid certain accidents. Furthermore, it also helps family members to track these activities when they are away from home. |
The global prevalence of dementia: A systematic review and metaanalysis | BACKGROUND
The evidence base on the prevalence of dementia is expanding rapidly, particularly in countries with low and middle incomes. A reappraisal of global prevalence and numbers is due, given the significant implications for social and public policy and planning.
METHODS
In this study we provide a systematic review of the global literature on the prevalence of dementia (1980-2009) and metaanalysis to estimate the prevalence and numbers of those affected, aged ≥60 years in 21 Global Burden of Disease regions.
RESULTS
Age-standardized prevalence for those aged ≥60 years varied in a narrow band, 5%-7% in most world regions, with a higher prevalence in Latin America (8.5%), and a distinctively lower prevalence in the four sub-Saharan African regions (2%-4%). It was estimated that 35.6 million people lived with dementia worldwide in 2010, with numbers expected to almost double every 20 years, to 65.7 million in 2030 and 115.4 million in 2050. In 2010, 58% of all people with dementia lived in countries with low or middle incomes, with this proportion anticipated to rise to 63% in 2030 and 71% in 2050.
CONCLUSION
The detailed estimates in this study constitute the best current basis for policymaking, planning, and allocation of health and welfare resources in dementia care. The age-specific prevalence of dementia varies little between world regions, and may converge further. Future projections of numbers of people with dementia may be modified substantially by preventive interventions (lowering incidence), improvements in treatment and care (prolonging survival), and disease-modifying interventions (preventing or slowing progression). All countries need to commission nationally representative surveys that are repeated regularly to monitor trends. |
Multi-style paper pop-up designs from 3D models | Paper pop-ups are interesting three-dimensional books that fascinate people of all ages. The design and construction of these pop-up books however are done manually and require a lot of time and effort. This has led to computer-assisted or automated tools for designing paper pop-ups. This paper proposes an approach for automatically converting a 3D model into a multi-style paper pop-up. Previous automated approaches have only focused on single-style pop-ups, where each is made of a single type of pop-up mechanism. In our work, we combine multiple styles in a pop-up, which is more representative of actual artists’ creations. Our method abstracts a 3D model using suitable primitive shapes that both facilitate the formation of the considered pop-up mechanisms and closely approximate the input model. Each shape is then abstracted using a set of 2D patches that combine to form a valid pop-up. We define geometric conditions that ensure the validity of the combined pop-up structures. In addition, our method also employs an image-based approach for producing the patches to preserve the textures, finer details and important contours of the input model. Finally, our system produces a printable design layout and decides an assembly order for the construction instructions. The feasibility of our results is verified by constructing the actual paper pop-ups from the designs generated by our system. |
Voice over IP performance monitoring | We describe a method for monitoring Voice over IP (VoIP) applications based upon a reduction of the ITU-T's E-Model to transport-level, measurable quantities. In the process, 1) we identify the relevant transport-level quantities, 2) we discuss the tradeoffs between placing the monitors within the VoIP gateways versus placement of the monitors within the transport path, and 3) we identify several areas where further work and consensus within the industry are required. We discover that the relevant transport-level quantities are the delay, network packet loss, and the decoder's de-jitter buffer packet loss. We find that an in-path monitor requires the definition of a reference de-jitter buffer implementation to estimate voice quality based upon observed transport measurements. Finally, we suggest that more studies are required which evaluate the quality of various VoIP codecs in the presence of representative packet loss patterns. |
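For context, the E-Model summarizes impairments in a single R factor, which ITU-T G.107 maps to an estimated mean opinion score (MOS); a sketch of that standard mapping (the published G.107 conversion, not the authors' transport-level monitor):

```python
def r_to_mos(r):
    """Map an E-Model R factor to an estimated MOS (ITU-T G.107 mapping).

    R is clamped to [0, 100]; in between, MOS follows the standard cubic:
    MOS = 1 + 0.035*R + R*(R - 60)*(100 - R)*7e-6.
    """
    if r <= 0.0:
        return 1.0
    if r >= 100.0:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6
```

A transport-level monitor of the kind described would estimate R from measured delay and loss impairments and then apply this conversion to report call quality.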
Automatic Driving on Ill-defined Roads: An Adaptive, Shape-constrained, Color-based Method | Autonomous following of ill-defined roads is an important part of visual navigation systems. This paper presents an adaptive method that uses a statistical model of the colour of the road surface within a trapezoidal shape that approximately corresponds to the projection of the road on the image plane. The method does not perform an explicit segmentation of the images but instead expands the shape sideways until the match between shape and road worsens, simultaneously computing the colour statistics. Results show that the method is capable of reactively following roads, at driving speeds typical of the robots used, in a variety of situations while coping with variable conditions of the road such as surface type, puddles and shadows. We extensively evaluate the proposed method using a large number of datasets with ground truth (that will be made public on our server once the paper is published). We moreover evaluate many colour spaces in the context of road following and find that the colour spaces that separate luminance from colour information perform best, especially if the luminance information is discarded. |
Suicide by the intraoral blast of firecrackers — experimental simulation using a skull simulant model | Suicides committed by intraorally placed firecrackers are rare events. Given the recent use of more powerful components such as flash powder, some firecrackers may cause massive life-threatening injuries in cases of such misuse. Innocuous black powder firecrackers are subject to national explosives legislation and only have the potential to cause harmless injuries restricted to the soft tissue. We here report two cases of suicide committed by an intraoral placement of firecrackers, resulting in similar patterns of skull injury. As it was initially unknown whether black powder firecrackers can potentially cause serious skull injury, we compared the destructive potential of black powder and flash powder firecrackers in a standardized skull simulant model (Synbone, Malans, Switzerland). This was the first experiment to date simulating the impacts resulting from an intraoral burst in a skull simulant model. The intraoral burst of a “D-Böller” (an example of one of the most powerful black powder firecrackers in Germany) did not lead to any injuries of the osseous skull. In contrast, the “La Bomba” (an example of the weakest known flash powder firecrackers) caused complex fractures of both the viscero- and neurocranium. The results obtained from this experimental study indicate that black powder firecrackers are less likely to cause severe injuries as a consequence of intraoral explosions, whereas flash powder-based crackers may lead to massive life-threatening craniofacial destruction and potentially death. |
Modeling and characterization of GPS spoofing | The Global Positioning System (GPS) has grown into a ubiquitous utility that provides positioning, navigation, and timing (PNT) services. As an essential element of the global information infrastructure, the cyber security of GPS faces serious challenges. Some mission-critical systems even rely on GPS as a security measure. However, civilian GPS itself has no protection against malicious acts such as spoofing. GPS spoofing breaches authentication by forging satellite signals to mislead users with wrong location/timing data, threatening homeland security. In order to make civilian GPS secure and resilient for diverse applications, we must understand the nature of attacks. This paper proposes a novel attack model of GPS spoofing built with an event-driven simulation package. Simulation supplements physical experiments, limiting incidental harm and making surreptitious scenarios easier to comprehend. We also provide a taxonomy of GPS spoofing through characterization. The work accelerates the development of defense technology against GPS-based attacks. |
A click chemistry strategy for visualization of plant cell wall lignification. | Bioorthogonal click chemistry was employed to visualize the plant cell wall lignification process in vivo. This approach uses chemical reporter-tagged monolignol mimics that can be metabolically incorporated into lignins and subsequently derivatized via copper-assisted or copper-free click reactions. |
Mathematical features of Whitehead’s point-free geometry | This paper is devoted to some mathematical considerations on the geometrical ideas contained in PNK, CN and, subsequently, in PR. Mainly, we emphasize that these ideas give very promising suggestions for a modern point-free foundation of geometry. 1. Introduction. Recently, research in point-free geometry has received increasing interest in different areas; as examples, we can quote computability theory, lattice theory, and computer science. The basic ideas of point-free geometry were first formulated by A. N. Whitehead in PNK and CN, where the extension relation between events is proposed as a primitive. The points, the lines and all the "abstract" geometrical entities are defined by suitable abstraction processes. As a matter of fact, as observed in Casati and Varzi 1997, the approach proposed in these books is a basis for a "mereology" (i.e. an investigation of the part-whole relation) rather than for a point-free geometry. Indeed, the inclusion relation is set-theoretical and not topological in nature, and this generates several difficulties. As an example, the definition of point is unsatisfactory (see Section 6). So, it is not surprising that, some years after the publication of PNK and CN, Whitehead in PR proposed a different approach in which the primitive notion is that of a connection relation. This idea was suggested in de Laguna 1922. The aim of this paper is not to give a precise account of the geometrical ideas contained in these books but only to emphasize their mathematical potentialities. So, we translate Whitehead's analysis into suitable first-order theories and examine these theories from a logical point of view. Also, we argue that multi-valued logic is a promising tool to reformulate the approach in PNK and CN. |
Evaluating rotational kinematics of the knee in ACL reconstructed patients using 3.0 Tesla magnetic resonance imaging. | INTRODUCTION
Injury to the anterior cruciate ligament (ACL) is common. While prior studies have shown that surgical reconstruction of the ACL can restore anterior-posterior kinematics, ACL-injured and reconstructed knees have been shown to have significant differences in tibial rotation when compared to uninjured knees. Our laboratory has developed an MR compatible rotational loading device to objectively quantify rotational stability of the knee following ACL injuries and reconstructions. Previous work from our group demonstrated a significant increase in total tibial rotation following ACL injuries. The current study is a prospective study on the same cohort of patients who have now undergone ACL reconstruction. We hypothesize that ACL reconstructed knees will have less tibial rotation relative to the pre-operative ACL deficient condition. We also hypothesize that ACL reconstructed knees will have greater rotational laxity when compared to healthy contralateral knees.
METHODS
Patients. Six of the ACL injured patients from our initial study who had subsequently undergone ACL reconstruction were evaluated 8.1 ± 2.9 months after surgery. All patients underwent single-bundle ACL reconstruction using anteromedial portal drilling of the femoral tunnel with identical post-operative regimens. Magnetic Resonance (MR) Imaging. Patients were placed in a supine position in the MR scanner on a custom-built loading device. Once secured in the scanner bore, an internal/external torque was applied to the foot. The tibiae were semi-automatically segmented with in-house software. Tibial rotation comparisons were made within subjects (i.e. side-to-side comparison between reconstructed and contralateral knees) and differences were explored using paired sample t-tests with significance set at p=0.05.
RESULTS
Regarding tibial rotation, in the ACL deficient state, these patients experienced an average of 5.9 ± 4.1° difference in tibial rotation between their ACL deficient and contralateral knees. However, there was a -0.2 ± 6.1° difference in tibial rotation of the ACL reconstructed knee when compared to the contralateral uninjured knee. Regarding tibial translation, ACL deficient patients showed a difference of 0.75 ± 1.4 mm of anterior tibial translation between injured and healthy knees. After ACL reconstruction, there was a 0.2 ± 1.1 mm difference in coupled anterior tibial translation of the ACL reconstructed knee compared to the contralateral knee. No significant differences in contact area between the two time points could be discerned.
DISCUSSION
The objective of our study was to assess the rotational laxity present in ACL reconstructed knees using a previously validated MRI-compatible rotational loading device. Our study demonstrated that ACL reconstruction can restore rotational laxity under load. This may speak to the benefit of an anteromedial drilling technique, which allows for a more horizontal and anatomically appropriate graft position. |
Clinical evaluation of BCR-ABL peptide immunisation in chronic myeloid leukaemia: results of the EPIC study | Peptides from the e14a2 BCR-ABL junction can elicit T-cell responses in vitro. Here, 19 imatinib-treated CML patients in first chronic phase were vaccinated with BCR-ABL peptides spanning the e14a2 fusion junction, some of which were linked to the pan-DR epitope PADRE to augment CD4+ T cell help. Six vaccinations were given over 9 weeks, together with sargramostim. All patients developed mild local reactions. T cell responses to PADRE were seen in all patients. Fourteen of 19 patients developed T cell responses to BCR-ABL peptides. The development of an anti-BCR-ABL T cell response correlated with a subsequent fall in BCR-ABL transcripts. No molecular benefit was seen in the 5 patients not in major cytogenetic response (MCR) at baseline. However, of the 14 patients in MCR at baseline, 13 developed at least a 1-log fall in BCR-ABL transcripts, though this occurred several months after completing vaccination, consistent with an effect at the level of primitive CML stem cells. Vaccination may improve the fall in BCR-ABL transcripts in patients who have received imatinib for more than 12 months. BCR-ABL peptide vaccination may improve control of CML, especially in patients responding well to imatinib. Randomised trials are required to address this further. |
Accelerating vector graphics rendering using the graphics hardware pipeline | We describe our successful initiative to accelerate Adobe Illustrator with the graphics hardware pipeline of modern GPUs. Relying on OpenGL 4.4 plus recent OpenGL extensions for advanced blend modes and first-class GPU-accelerated path rendering, we accelerate the Adobe Graphics Model (AGM) layer responsible for rendering sophisticated Illustrator scenes. Illustrator documents render in either an RGB or CMYK color mode. While GPUs are designed and optimized for RGB rendering, we orchestrate OpenGL rendering of vector content in the proper CMYK color space and accommodate the 5+ color components required. We support both non-isolated and isolated transparency groups, knockout, patterns, and arbitrary path clipping. We harness GPU tessellation to shade paths smoothly with gradient meshes. We do all this and render complex Illustrator scenes 2 to 6x faster than CPU rendering at Full HD resolutions; and 5 to 16x faster at Ultra HD resolutions. |
Design and implementation of a smartphone-based portable ultrasound pulsed-wave doppler device for blood flow measurement | Blood flow measurement using Doppler ultrasound has become a useful tool for diagnosing cardiovascular diseases and as a physiological monitor. Recently, pocket-sized ultrasound scanners have been introduced for portable diagnosis. The present paper reports the implementation of a portable ultrasound pulsed-wave (PW) Doppler flowmeter using a smartphone. A 10-MHz ultrasonic surface transducer was designed for the dynamic monitoring of blood flow velocity. The directional baseband Doppler shift signals were obtained using a portable analog circuit system. After hardware processing, the Doppler signals were fed directly to a smartphone for Doppler spectrogram analysis and display in real time. To the best of our knowledge, this is the first report of the use of this system for medical ultrasound Doppler signal processing. A Couette flow phantom, consisting of two parallel disks with a 2-mm gap, was used to evaluate and calibrate the device. Doppler spectrograms of porcine blood flow were measured using this stand-alone portable device under the pulsatile condition. Subsequently, in vivo portable system verification was performed by measuring the arterial blood flow of a rat and comparing the results with the measurement from a commercial ultrasound duplex scanner. All of the results demonstrated the potential for using a smartphone as a novel embedded system for portable medical ultrasound applications. |
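The blood flow velocity that such a PW Doppler device reports follows from the classic Doppler relation v = c·fd / (2·f0·cos θ). A sketch in which only the 10-MHz carrier comes from the text; the insonation angle and speed of sound are illustrative assumptions:

```python
import math

def doppler_velocity_m_s(f_shift_hz, f0_hz=10e6, c_m_s=1540.0, angle_deg=60.0):
    """Flow velocity from a measured Doppler shift.
    f0_hz matches the 10-MHz transducer in the text; the angle and
    sound speed are assumed values, not taken from the paper."""
    return c_m_s * f_shift_hz / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))
```

At a 1-kHz shift these parameters give roughly 0.15 m/s, the order of magnitude expected for the spectrogram's velocity axis.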
Pre-Transplant Cardiovascular Risk Factors Affect Kidney Allograft Survival: A Multi-Center Study in Korea | BACKGROUND
Pre-transplant cardiovascular (CV) risk factors affect the development of CV events even after successful kidney transplantation (KT). However, the impact of pre-transplant CV risk factors on allograft failure (GF) has not been reported.
METHODS AND FINDINGS
We analyzed the graft outcomes of 2,902 KT recipients who were enrolled in a multi-center cohort from 1997 to 2012. We calculated the pre-transplant CV risk scores based on the Framingham risk model using age, gender, total cholesterol level, smoking status, and history of hypertension. Vascular disease (a composite of ischemic heart disease, peripheral vascular disease, and cerebrovascular disease) was noted in 6.5% of the patients. During the median follow-up of 6.4 years, 286 (9.9%) patients developed GF. In the multivariable-adjusted Cox proportional hazard model, pre-transplant vascular disease was associated with an increased risk of GF (HR 2.51; 95% CI 1.66-3.80). The HR for GF (comparing the highest with the lowest tertile of pre-transplant CV risk scores) was 1.65 (95% CI 1.22-2.23). In the competing risk model, both pre-transplant vascular disease and the CV risk score were independent risk factors for GF. Moreover, adding the CV risk score, pre-transplant vascular disease, or both improved the prediction of GF compared to the traditional GF risk factors.
CONCLUSIONS
In conclusion, both vascular disease and pre-transplant CV risk score were independently associated with GF in this multi-center study. Pre-transplant CV risk assessments could be useful in predicting GF in KT recipients. |
Scalable sentiment classification for Big Data analysis using Naïve Bayes Classifier | A typical method to obtain valuable information is to extract the sentiment or opinion from a message. Machine learning technologies are widely used in sentiment classification because of their ability to “learn” from the training dataset to predict or support decision making with relatively high accuracy. However, when the dataset is large, some algorithms might not scale up well. In this paper, we aim to evaluate the scalability of Naïve Bayes classifier (NBC) in large datasets. Instead of using a standard library (e.g., Mahout), we implemented NBC to achieve fine-grain control of the analysis procedure. A Big Data analysis system was also designed for this study. The result is encouraging in that the accuracy of NBC improves and approaches 82% as the dataset size increases. We have demonstrated that NBC is able to scale up to analyze the sentiment of millions of movie reviews with increasing throughput. |
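As a hedged illustration of the classifier being evaluated (not the paper's implementation), a minimal multinomial Naïve Bayes with add-one smoothing fits in a few lines:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label) pairs. Returns log-priors and
    per-class log-likelihoods with Laplace (add-one) smoothing."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    log_prior = {c: math.log(n / len(docs)) for c, n in label_counts.items()}
    log_like = {}
    for c in label_counts:
        total = sum(word_counts[c].values()) + len(vocab)
        log_like[c] = {w: math.log((word_counts[c][w] + 1) / total) for w in vocab}
    return log_prior, log_like

def predict(tokens, log_prior, log_like):
    """Pick the class with the highest posterior score;
    words unseen in training are simply ignored."""
    scores = {c: log_prior[c] + sum(log_like[c].get(w, 0.0) for w in tokens)
              for c in log_prior}
    return max(scores, key=scores.get)
```

Because each class score is a sum of per-word log-probabilities, both counting and scoring parallelize naturally over document shards, which is what makes NBC attractive at the scales discussed in the abstract.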
Exploring Entity-centric Networks in Entangled News Streams | The increasing number of news outlets and the frequency of the news cycle have made it all but impossible to obtain the full picture from online news. Consolidating news from different sources has thus become a necessity in online news processing. Despite the amount of research that has been devoted to different aspects of new event detection and tracking in news streams, solid solutions for such entangled streams of full news articles are still lacking. Many existing works focus on streams of microblogs since the analysis of news articles raises the additional problem of summarizing or extracting the relevant sections of articles. For the consolidation of identified news snippets, schemes along numerous different dimensions have been proposed, including publication time, temporal expressions, geo-spatial references, named entities, and topics. The granularity of aggregated news snippets then includes such diverse aspects as events, incidents, threads, or topics for various subdivisions of news articles. To support this variety of granularity levels, we propose a comprehensive network model for the representation of multiple entangled streams of news documents. Unlike previous methods, the model is geared towards entity-centric explorations and enables the consolidation of news along all dimensions, including the context of entity mentions. Since the model also serves as a reverse index, it supports explorations along the dimensions of sentences or documents for an encompassing view on news events. We evaluate the performance of our model on a large collection of entangled news streams from major news outlets of English speaking countries and a ground truth that we generate from event summaries in the Wikipedia Current Events portal. |
Rectified Linear Units Improve Restricted Boltzmann Machines | Restricted Boltzmann machines were developed using binary stochastic hidden units. These can be generalized by replacing each binary unit by an infinite number of copies that all have the same weights but have progressively more negative biases. The learning and inference rules for these “Stepped Sigmoid Units” are unchanged. They can be approximated efficiently by noisy, rectified linear units. Compared with binary units, these units learn features that are better for object recognition on the NORB dataset and face verification on the Labeled Faces in the Wild dataset. Unlike binary units, rectified linear units preserve information about relative intensities as information travels through multiple layers of feature detectors. |
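The construction in the abstract can be checked numerically: a finite sum of sigmoid copies with biases shifted by -i + 0.5 closely tracks the smooth rectifier log(1 + e^x), which the noisy rectified linear unit max(0, x + noise) then approximates cheaply. A small sketch (the number of copies is an illustrative choice):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def stepped_sigmoid_sum(x, n_copies=50):
    """Sum of binary-unit copies with progressively more negative
    biases -- the 'stepped sigmoid unit' described in the text."""
    return sum(sigmoid(x - i + 0.5) for i in range(1, n_copies + 1))

def softplus(x):
    """Smooth rectifier log(1 + e^x) that the infinite sum approximates;
    max(0, x) plus noise is the cheap rectified-linear stand-in."""
    return math.log1p(math.exp(x))
```

For moderate x the truncated sum and the softplus agree to within a few thousandths, and both vanish for strongly negative inputs, which is why the rectified linear approximation preserves relative intensity information.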
ABCC3 Genetic Variants are Associated with Postoperative Morphine-induced Respiratory Depression and Morphine Pharmacokinetics in Children | Respiratory depression (RD) is a serious side effect of morphine and detrimental to effective analgesia. We reported that variants of the ATP binding cassette gene ABCC3 (facilitates hepatic morphine metabolite efflux) affect morphine metabolite clearance. In this study of 316 children undergoing tonsillectomy, we found significant association between ABCC3 variants and RD leading to prolonged postoperative care unit stay (prolonged RD). Allele A at rs4148412 and allele G at rs729923 caused a 2.36 (95% CI=1.28–4.37, P=0.0061) and 3.7 (95% CI 1.47–9.09, P=0.0050) times increase in odds of prolonged RD, respectively. These clinical associations were supported by increased formation clearance of morphine glucuronides in children with rs4148412 AA and rs4973665 CC genotypes in this cohort, as well as an independent spine surgical cohort of 67 adolescents. This is the first study to report association of ABCC3 variants with opioid-related RD, and morphine metabolite formation (in two independent surgical cohorts). |
Long-term platinum retention after treatment with cisplatin and oxaliplatin | BACKGROUND
The aim of this study was to evaluate long-term platinum retention in patients treated with cisplatin and oxaliplatin.
METHODS
45 patients, treated 8-75 months before participating in this study, were included. Platinum levels in plasma and plasma ultrafiltrate (pUF) were determined. In addition, the reactivity of platinum species in pUF was evaluated. Relationships between platinum retention and possible determinants were evaluated.
RESULTS
Platinum plasma concentrations ranged from 142 to 2.99 x 10(3) ng/L. Up to 24% of plasma platinum was recovered in pUF. No platinum-DNA adducts in peripheral blood mononuclear cells (PBMCs) could be detected. Ex vivo incubation of DNA with pUF of patients revealed that up to 10% of the reactivity of the platinum species was retained. Protein binding proceeded during sample storage. Sodium thiosulfate (STS) appeared to release platinum from the plasma proteins. Platinum levels were related to time, dose, STS co-administration, and glomerular filtration rate (GFR).
CONCLUSION
Our data suggest that plasma platinum levels are related to time, age, dose, GFR, and STS use. Platinum in plasma probably represents platinum eliminated from regenerating tissue. Platinum species in pUF were partly present in a reactive form. The effects of this reactivity on the long-term consequences of Pt-containing chemotherapy, however, remain to be established. |
A Data-Driven Framework for Identifying High School Students at Risk of Not Graduating on Time [ Extended | Some students, for a variety of factors, struggle to complete high school on time. To address this problem, school districts across the U.S. use intervention programs to help struggling students get back on track academically. Yet in order to best apply those programs, schools need to identify off-track students as early as possible and enroll them in the most appropriate intervention. Unfortunately, identifying and prioritizing students in need of intervention remains a challenging task. This paper describes work that builds on current systems by using advanced data science methods to produce an extensible and scalable predictive framework for providing partner U.S. public school districts with individual early warning indicator systems. Our framework employs machine learning techniques to identify struggling students and describe features that are useful for this task, evaluating these techniques using metrics important to school administrators. By doing so, our framework, developed with the common need of several school districts in mind, provides a common set of tools for identifying struggling students and the factors associated with their struggles. Further, by integrating data from disparate districts into a common system, our framework enables cross-district analyses to investigate common early warning indicators not just within a single school or district, but across the U.S. and beyond. |
An efficient DHT-based elastic SDN controller | Elasticity in distributed SDN Controllers [1], [2], [3] improves fault tolerance, power efficiency, cost efficiency and scalability. We find that the Elasticon [1] algorithm has high time complexity since it balances the load equally on all the controllers for every event. In this paper, we propose an efficient algorithm for elastic controllers using Chord [4], a Distributed Hash Table (DHT). In our scheme, we do not balance the load on the controllers until a controller is overloaded. The switch ID space forms the ring in the DHT. The controller is assigned an ID based on the range of switch IDs it is responsible for which changes as the switches migrate from an overloaded controller to its adjacent controllers in the ring. If the average load on the controllers falls below a given threshold, we consolidate the switch allocation to controllers so that some of the controllers can be powered off. We compared the performance of Elasticon and the proposed algorithm in terms of the number of migrations needed, the average number of controllers that are powered on, the time taken to run the algorithm and the standard deviation of the load on the controllers. We find that the number of controllers is similar for our algorithm and Elasticon, the migrations are fewer in our algorithm while the time taken by our algorithm is orders of magnitude less than Elasticon. |
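A toy sketch of the migration step described above (not the authors' implementation): controllers own contiguous arcs of the switch-ID ring, and an overloaded controller sheds its boundary switches to the next controller on the ring, so only that controller and its neighbour are touched.

```python
def rebalance(assignment, threshold):
    """assignment: one list of switch IDs per controller, ordered around
    the ring. A controller over the load threshold migrates its
    highest-ID (boundary) switches to its successor on the ring.
    Single pass; assumes the total load fits within threshold * n."""
    n = len(assignment)
    for i in range(n):
        while len(assignment[i]) > threshold:
            # migrate the boundary switch to the adjacent controller
            assignment[(i + 1) % n].insert(0, assignment[i].pop())
    return assignment
```

Touching only ring neighbours is what lets this scheme avoid ElastiCon-style global rebalancing on every event, at the cost of tolerating some load imbalance below the threshold.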
Octonions, E6, and Particle Physics | In 1934, Jordan et al. gave a necessary algebraic condition, the Jordan identity, for a sensible theory of quantum mechanics. All but one of the algebras that satisfy this condition can be described by Hermitian matrices over the complexes or quaternions. The remaining, exceptional Jordan algebra can be described by 3 × 3 Hermitian matrices over the octonions. We first review properties of the octonions and the exceptional Jordan algebra, including our previous work on the octonionic Jordan eigenvalue problem. We then examine a particular real, noncompact form of the Lie group E6, which preserves determinants in the exceptional Jordan algebra. Finally, we describe a possible symmetry-breaking scenario within E6: first choose one of the octonionic directions to be special, then choose one of the 2 × 2 submatrices inside the 3 × 3 matrices to be special. Making only these two choices, we are able to describe many properties of leptons in a natural way. We further speculate on the ways in which quarks might be similarly encoded. |
Age Estimation by Multi-scale Convolutional Network | For the last five years, biologically inspired features (BIF) have held the state-of-the-art results for human age estimation from face images. Recently, researchers have mainly put their focus on the regression step after feature extraction, such as support vector regression (SVR), partial least squares (PLS), canonical correlation analysis (CCA) and so on. In this paper, we apply a convolutional neural network (CNN) to the age estimation problem, which leads to a fully learned end-to-end system that can estimate age from image pixels directly. Compared with BIF, the proposed method has a deeper structure and its parameters are learned instead of hand-crafted. The multi-scale analysis strategy is also carried over from traditional methods to the CNN, which improves the performance significantly. Furthermore, we train an efficient network in a multi-task way which can perform age estimation, gender classification and ethnicity classification well simultaneously. The experiments on MORPH Album 2 illustrate the superiority of the proposed multi-scale CNN over other state-of-the-art methods. |
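The multi-scale idea, analyzing the same face at several resolutions and concatenating the responses, can be sketched without any deep learning framework. This stdlib-only toy uses average pooling at several window sizes in place of the paper's learned convolutional filters:

```python
def avg_pool(img, k):
    """Average-pool a 2D list with non-overlapping k x k windows,
    dropping any ragged border."""
    h, w = len(img), len(img[0])
    return [[sum(img[r][c] for r in range(i, i + k) for c in range(j, j + k)) / (k * k)
             for j in range(0, w - w % k, k)]
            for i in range(0, h - h % k, k)]

def multiscale_features(img, scales=(1, 2, 4)):
    """Concatenate flattened responses from several pooling scales,
    mimicking a multi-scale feature vector."""
    feats = []
    for k in scales:
        feats.extend(v for row in avg_pool(img, k) for v in row)
    return feats
```

In the paper the analogous multi-scale responses feed shared layers with separate task heads; here the sketch only shows how coarse and fine views of the same input combine into one feature vector.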
The scientific basis for high-intensity interval training: optimising training programmes and maximising performance in highly trained endurance athletes. | While the physiological adaptations that occur following endurance training in previously sedentary and recreationally active individuals are relatively well understood, the adaptations to training in already highly trained endurance athletes remain unclear. While significant improvements in endurance performance and corresponding physiological markers are evident following submaximal endurance training in sedentary and recreationally active groups, an additional increase in submaximal training (i.e. volume) in highly trained individuals does not appear to further enhance either endurance performance or associated physiological variables [e.g. peak oxygen uptake (VO2peak), oxidative enzyme activity]. It seems that, for athletes who are already trained, improvements in endurance performance can be achieved only through high-intensity interval training (HIT). The limited research which has examined changes in muscle enzyme activity in highly trained athletes, following HIT, has revealed no change in oxidative or glycolytic enzyme activity, despite significant improvements in endurance performance (p < 0.05). Instead, an increase in skeletal muscle buffering capacity may be one mechanism responsible for an improvement in endurance performance. Changes in plasma volume, stroke volume, as well as muscle cation pumps, myoglobin, capillary density and fibre type characteristics have yet to be investigated in response to HIT with the highly trained athlete. Information relating to HIT programme optimisation in endurance athletes is also very sparse. 
Preliminary work using the velocity at which VO2max is achieved (V(max)) as the interval intensity, and fractions (50 to 75%) of the time to exhaustion at V(max) (T(max)) as the interval duration has been successful in eliciting improvements in performance in long-distance runners. However, V(max) and T(max) have not been used with cyclists. Instead, HIT programme optimisation research in cyclists has revealed that repeated supramaximal sprinting may be equally effective as more traditional HIT programmes for eliciting improvements in endurance performance. Further examination of the biochemical and physiological adaptations which accompany different HIT programmes, as well as investigation into the optimal HIT programme for eliciting performance enhancements in highly trained athletes is required. |
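The interval prescription described for runners is simple arithmetic: work bouts at V(max) lasting a chosen fraction of T(max). A sketch in which the 50 to 75% range comes from the text but the example T(max) value is made up:

```python
def hit_prescription(t_max_s, fractions=(0.5, 0.6, 0.75)):
    """Interval work durations (seconds) at v_max, as fractions of
    T_max (time to exhaustion at v_max)."""
    return {f: round(f * t_max_s, 1) for f in fractions}
```

For an athlete with a hypothetical T(max) of 240 s, this yields 120 to 180 s work bouts at V(max).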
The presence of Aβ seeds, and not age per se, is critical to the initiation of Aβ deposition in the brain | The deposition of the β-amyloid (Aβ) peptide in senile plaques and cerebral Aβ-amyloid angiopathy can be seeded in β-amyloid precursor protein (APP)-transgenic mice by the intracerebral infusion of brain extracts containing aggregated Aβ. Previous studies of seeded β-amyloid induction have used relatively short incubation periods to dissociate seeded β-amyloid induction from endogenous β-amyloid deposition of the host, thus precluding the analysis of the impact of age and extended incubation periods on the instigation and spread of Aβ lesions in brain. In the present study using R1.40 APP-transgenic mice (which do not develop endogenous Aβ deposition up to 15 months of age) we show that: (1) seeding at 9 months of age does not induce more Aβ deposition than seeding at 3 months of age, provided that the incubation period (6 months) is the same; and (2) very long-term (12 months) incubation after a focal application of the seed results in the emergence of Aβ deposits throughout the forebrain. These findings indicate that the presence of Aβ seeds, and not the age of the host per se, is critical to the initiation of Aβ aggregation in the brain, and that Aβ deposition, actuated in one brain area, eventually spreads throughout the brain. |
Evidence-based lean logic profiles for conceptual data modelling languages | Multiple logic-based reconstructions of conceptual data modelling languages such as EER, UML Class Diagrams, and ORM exist. They mainly cover various fragments of the languages, and none is formalised such that the logic applies simultaneously to all three modelling language families as a unifying mechanism. This hampers interchangeability, interoperability, and tooling support. In addition, due to the lack of a systematic design process for the logic used in the formalisation, hidden choices permeate the formalisations and have rendered them incompatible. We aim to address these problems, first, by structuring the logic design process in a methodological way. We generalise and extend the DSL design process to apply to logic language design more generally and, in particular, incorporate an ontological analysis of language features in the process. Second, availing of this extended process, of evidence gathered on language feature usage, and of computational complexity insights from Description Logics (DL), we specify logic profiles taking into account the ontological commitments embedded in the languages. The profiles characterise the minimum logic structure needed to handle the semantics of conceptual models, enabling the development of interoperability tools. No known DL language matches exactly the features of those profiles, and the common core is small (in the tractable ALNI). Although hardly any inconsistencies can be derived with the profiles, this is promising for scalable runtime use of conceptual data models. |
Models of Cannabis Taxonomy, Cultural Bias, and Conflicts between Scientific and Vernacular Names | Debates over Cannabis sativa L. and C. indica Lam. center on their taxonomic circumscription and rank. This perennial puzzle has been compounded by the viral spread of a vernacular nomenclature, “Sativa” and “Indica,” which does not correlate with C. sativa and C. indica. Ambiguities also envelop the epithets of wild-type Cannabis: the spontanea versus ruderalis debate (i.e., vernacular “Ruderalis”), as well as another pair of Cannabis epithets, afghanica and kafirstanica. To trace the rise of vernacular nomenclature, we begin with the protologues (original descriptions, synonymies, type specimens) of C. sativa and C. indica. Biogeographical evidence (obtained from the literature and herbarium specimens) suggests 18th–19th century botanists were biased in their assignment of these taxa to field specimens. This skewed the perception of Cannabis biodiversity and distribution. The development of vernacular “Sativa,” “Indica,” and “Ruderalis” was abetted by twentieth century botanists, who ignored original protologues and harbored their own cultural biases. Predominant taxonomic models by Vavilov, Small, Schultes, de Meijer, and Hillig are compared and critiqued. Small’s model adheres closest to protologue data (with C. indica treated as a subspecies). “Sativa” and “Indica” are subpopulations of C. sativa subsp. indica; “Ruderalis” represents a protean assortment of plants, including C. sativa subsp. sativa and recent hybrids. |
Pogamut 3 Can Assist Developers in Building AI (Not Only) for Their Videogame Agents | Many research projects oriented towards control mechanisms of virtual agents in videogames have emerged in recent years. However, this boost has not been accompanied by the emergence of toolkits supporting the development of such projects, slowing down progress in the field. Here, we present Pogamut 3, an open source platform for rapid development of behaviour for virtual agents embodied in the 3D environment of the Unreal Tournament 2004 videogame. Pogamut 3 is designed to support research as well as educational projects. The paper also briefly touches on extensions of Pogamut 3: the ACT-R integration, the integration of the emotional model ALMA, support for control of avatars at the level of gestures, and a toolkit for developing educational scenarios concerning orientation in urban areas. These extensions make Pogamut 3 applicable beyond the domain of computer games. |
A highly immunogenic trivalent T cell receptor peptide vaccine for multiple sclerosis. | BACKGROUND
T cell receptor (TCR) peptide vaccination is a novel approach to treating multiple sclerosis (MS). The low immunogenicity of previous vaccines has hindered the development of TCR peptide vaccination for MS.
OBJECTIVE
To compare the immunogenicity of intramuscular injections of TCR BV5S2, BV6S5 and BV13S1 CDR2 peptides in incomplete Freund's adjuvant (IFA) with intradermal injections of the same peptides without IFA.
METHODS
MS subjects were randomized to receive TCR peptides/IFA, TCR peptides/saline or IFA alone. Subjects were on study for 24 weeks.
RESULTS
The TCR peptides/IFA vaccine induced vigorous T cell responses in 100% of subjects completing the 24-week study (9/9) compared with only 20% (2/10) of those receiving the TCR peptides/saline vaccine (P =0.001). IFA alone induced a weak response in only one of five subjects. Aside from injection site reactions, there were no significant adverse events attributable to the treatment.
CONCLUSIONS
The trivalent TCR peptide in IFA vaccine represents a significant improvement in immunogenicity over previous TCR peptide vaccines and warrants investigation of its ability to treat MS. |
Direct implantation of rapamycin-eluting stents with bioresorbable drug carrier technology utilising the Svelte coronary stent-on-a-wire: the DIRECT II study. | AIMS
Our aim was to demonstrate the safety and efficacy of the Svelte sirolimus-eluting coronary stent-on-a-wire Integrated Delivery System (IDS) with bioresorbable drug coating compared to the Resolute Integrity zotarolimus-eluting stent with durable polymer in patients with de novo coronary artery lesions.
METHODS AND RESULTS
Direct stenting, particularly in conjunction with transradial intervention (TRI), has been associated with reduced bleeding complications, procedure time, radiation exposure and contrast administration compared to conventional stenting with wiring and predilatation. The low-profile Svelte IDS is designed to facilitate TRI and direct stenting, reducing the number of procedural steps, time and cost associated with coronary stenting. DIRECT II was a prospective, multicentre trial which enrolled 159 patients to establish non-inferiority of the Svelte IDS versus Resolute Integrity using a 2:1 randomisation. The primary endpoint was angiographic in-stent late lumen loss (LLL) at six months. Target vessel failure (TVF), as well as secondary clinical endpoints, will be assessed annually up to five years. At six months, in-stent LLL was 0.09±0.31 mm in the Svelte IDS group compared to 0.13±0.27 mm in the Resolute Integrity group (p<0.001 for non-inferiority). TVF at one year was similar across the Svelte IDS and Resolute Integrity groups (6.5% vs. 9.8%, respectively).
CONCLUSIONS
DIRECT II demonstrated the non-inferiority of the Svelte IDS to Resolute Integrity with respect to in-stent LLL at six months. Clinical outcomes at one year were comparable between the two groups. |
Evaluation of the effect of obesity on voriconazole serum concentrations. | OBJECTIVES
Voriconazole is a second-generation triazole antifungal, approved by the FDA in 2002. Despite a decade of experience with voriconazole, there are limited published data analysing serum concentrations and toxicity in obese patients. Therefore, we evaluated voriconazole trough serum concentrations in obese and normal-weight patients in a retrospective cohort study.
METHODS
Voriconazole serum trough concentrations and toxicities were compared for obese (body mass index >35 kg/m(2)) versus normal-weight (body mass index 18.5-24.9 kg/m(2)) patients receiving 4 mg/kg voriconazole every 12 h.
RESULTS
The obese group (n = 21) had significantly higher mean serum voriconazole trough concentrations than the normal-weight group (n = 66) (6.2 and 3.5 mg/L, respectively, P < 0.0001). Patients in the obese group also had higher rates of supratherapeutic voriconazole levels (>5.5 mg/L) than patients in the normal-weight group (67% versus 17%, respectively, P < 0.0001). However, hepatotoxicity and neurotoxicity rates did not differ between groups. The secondary endpoint compared mean serum voriconazole concentrations in the obese population when dosed at 4 mg/kg based on ideal body weight, adjusted body weight and actual body weight, which were statistically significantly different at 3.95, 3.3 and 6.2 mg/L, respectively (P = 0.0009). Therapeutic voriconazole concentrations (2.0-5.5 mg/L) occurred in 29% of obese patients when dosed on actual body weight, and 45% and 80% of patients when dosed on ideal body weight and adjusted body weight, respectively.
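The dosing-weight comparison in the results can be made concrete. The abstract does not state which formulas were used; the Devine ideal-body-weight formula and a 0.4 correction factor for adjusted body weight are common clinical conventions and are assumptions here:

```python
def ideal_body_weight_kg(height_cm, male=True):
    """Devine formula (an assumption -- the study does not state
    which IBW formula was used)."""
    inches_over_5ft = max(0.0, height_cm / 2.54 - 60)
    base = 50.0 if male else 45.5
    return base + 2.3 * inches_over_5ft

def adjusted_body_weight_kg(actual_kg, ibw_kg, factor=0.4):
    """ABW = IBW + factor * (actual - IBW); 0.4 is a common convention."""
    return ibw_kg + factor * (actual_kg - ibw_kg)

def voriconazole_dose_mg(weight_kg, mg_per_kg=4.0):
    """Per-dose amount at the study's 4 mg/kg every 12 h."""
    return mg_per_kg * weight_kg
```

For a hypothetical 178-cm, 120-kg man, dosing on adjusted rather than actual body weight cuts the per-dose amount from 480 mg to roughly 368 mg, illustrating why the adjusted-weight arm produced fewer supratherapeutic troughs.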
CONCLUSIONS
Our results suggest a strong association between supratherapeutic concentrations and morbidly obese patients when dosed at 4 mg/kg actual body weight. Dosing voriconazole based on an ideal body weight or adjusted body weight may be appropriate for morbidly obese patients. |
The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables | The reparameterization trick enables the optimization of large scale stochastic computation graphs via gradient descent. The essence of the trick is to refactor each stochastic node into a differentiable function of its parameters and a random variable with fixed distribution. After refactoring, the gradients of the loss propagated by the chain rule through the graph are low variance unbiased estimators of the gradients of the expected loss. While many continuous random variables have such reparameterizations, discrete random variables lack continuous reparameterizations due to the discontinuous nature of discrete states. In this work we introduce concrete random variables – continuous relaxations of discrete random variables. The concrete distribution is a new family of distributions with closed form densities and a simple reparameterization. Whenever a discrete stochastic node of a computation graph can be refactored into a one-hot bit representation that is treated continuously, concrete stochastic nodes can be used with automatic differentiation to produce low-variance biased gradients of objectives (including objectives that depend on the log-likelihood of latent stochastic nodes) on the corresponding discrete graph. We demonstrate their effectiveness on density estimation and structured prediction tasks using neural networks. |
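A Concrete (Gumbel-Softmax) sample is just a softmax over Gumbel-perturbed logits divided by a temperature; a stdlib-only sketch:

```python
import math
import random

def sample_concrete(logits, temperature):
    """Draw a relaxed one-hot sample from the Concrete distribution:
    softmax((logits + Gumbel noise) / temperature)."""
    gumbels = [-math.log(-math.log(random.random())) for _ in logits]
    scores = [(l + g) / temperature for l, g in zip(logits, gumbels)]
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]
```

As the temperature approaches zero the relaxed sample approaches a one-hot vector, recovering the discrete variable; at higher temperatures the sample is smoother, trading bias for lower-variance reparameterized gradients.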
Oscillator-based assistance of cyclical movements: model-based and model-free approaches | In this article, we propose a new method for providing assistance during cyclical movements. This method is trajectory-free, in the sense that it provides user assistance irrespective of the performed movement, and requires no other sensing than the assisting robot’s own encoders. The approach is based on adaptive oscillators, i.e., mathematical tools that are capable of learning the high level features (frequency, envelope, etc.) of a periodic input signal. Here we present two experiments that we recently conducted to validate our approach: a simple sinusoidal movement of the elbow, that we designed as a proof-of-concept, and a walking experiment. In both cases, we collected evidence illustrating that our approach indeed assisted healthy subjects during movement execution. Owing to the intrinsic periodicity of daily life movements involving the lower-limbs, we postulate that our approach holds promise for the design of innovative rehabilitation and assistance protocols for the lower-limb, requiring little to no user-specific calibration. |
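A minimal sketch of the adaptive-oscillator idea, using Hopf-oscillator equations with Hebbian frequency adaptation in the style of Righetti et al.; all parameter values are illustrative, not taken from the paper:

```python
import math

def adapt_frequency(omega0, target_omega, dt=1e-3, t_end=300.0,
                    gamma=8.0, mu=1.0, eps=0.9):
    """Euler integration of an adaptive Hopf oscillator driven by
    F(t) = sin(target_omega * t). The oscillator's intrinsic frequency
    omega drifts toward the input frequency, which is how the assistive
    controller locks onto a cyclical movement without a reference
    trajectory. Parameter values are illustrative assumptions."""
    x, y, omega = 1.0, 0.0, omega0
    t = 0.0
    while t < t_end:
        F = math.sin(target_omega * t)
        r2 = x * x + y * y
        r = math.sqrt(r2)
        dx = gamma * (mu - r2) * x - omega * y + eps * F
        dy = gamma * (mu - r2) * y + omega * x
        domega = -eps * F * y / max(r, 1e-9)
        x += dt * dx
        y += dt * dy
        omega += dt * domega
        t += dt
    return omega
```

Once the oscillator has locked on, its learned frequency and phase provide the high-level movement features the assistance law needs, using only the robot's own encoder signal as input.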
STREAM: A First Programming Process | Programming is recognized as one of seven grand challenges in computing education. Decades of research have shown that the major problems novices experience are composition-based---they may know what the individual programming language constructs are, but they do not know how to put them together. Despite this fact, textbooks, educational practice, and programming education research hardly address the issue of teaching the skills needed for systematic development of programs.
We provide a conceptual framework for incremental program development, called Stepwise Improvement, which unifies best practice in modern software development such as test-driven development and refactoring with the prevailing perspective of programming methodology, stepwise refinement. The conceptual framework enables well-defined characterizations of incremental program development.
We utilize the conceptual framework to derive a programming process, STREAM, designed specifically for novices. STREAM is a carefully down-scaled version of a full and rich agile software engineering process, particularly suited for novices learning object-oriented programming. In using it, we hope to achieve two things: to help novice programmers learn faster and better, and to lay the foundation for a more thorough treatment of more advanced aspects of software engineering. In this article, two examples demonstrate the application of STREAM.
The STREAM process has been taught in the introductory programming courses at our universities for the past three years and the results are very encouraging. We report on a small, preliminary study evaluating the learning outcome of teaching STREAM. The study indicates a positive effect on the development of students’ process competences. |
Unsupervised Morpheme Segmentation and Morphology Induction from Text Corpora Using Morfessor 1.0 | In this work, we describe the first public version of the Morfessor software, a program that takes as input a corpus of unannotated text and produces a segmentation of the word forms observed in the text. The segmentation obtained often resembles a linguistic morpheme segmentation. Morfessor is not language-dependent. The number of segments per word is not restricted to two or three as in some other existing morphology learning models. The current version of the software essentially implements two morpheme segmentation models presented earlier by us (Creutz and Lagus, 2002; Creutz, 2003). The document contains user instructions, as well as the mathematical formulation of the model and a description of the search algorithm used. Additionally, a few experiments on Finnish and English text corpora are reported in order to give users some ideas of how to apply the program to their own data sets and how to evaluate the results. |
Critical Rationalism And Macrosociology of Globalisation | Part I: A Critical Rationalist Approach to Globalisation. Chapter 1: Principal Argument and Thesis Structure. |
Minimizing RF Performance Spikes in a Cryogenic Orthomode Transducer (OMT) | The turnstile junction exhibits very low cross-polarization leakage and is suitable for low-noise millimeter-wave receivers. For use in a cryogenic receiver, it is best if the orthomode transducer (OMT) is implemented in waveguide, contains no additional assembly features, and may be directly machined. However, machined OMTs are prone to sharp signal drop-outs that are costly to overall performance since they show up directly as spikes in receiver noise. We explore the various factors contributing to this degradation and discuss how the current design mitigates each cause. Final performance is demonstrated at cryogenic temperatures. |
Fashion Shopping in Multichannel Retail: The Role of Technology in Enhancing the Customer Experience | The difficulty of translating the in-store experience to the online environment is one of the main reasons why the fashion industry has been slower than other sectors to adopt e-commerce. Recently, however, new information technologies (ITs) have enabled consumers to evaluate fashion online, creating an interactive and exciting shopping experience. As a result, clothing has become the fastest-growing online category of goods bought in the United Kingdom. This trend could have serious consequences for brick-and-mortar stores. The aim of this quantitative research is to gain a better understanding of multichannel fashion-shopping experiences, focusing on the role of IT and the crossover effects between channels. In particular, I explore the influence of the level of online experience on the perceptions and motivations of fashion consumers when they buy across multiple channels. The theoretical framework of hedonic and utilitarian shopping values is applied to measure consumers’ shopping experiences and shopping motivations to buy in different channels. The results from a quantitative survey of 439 consumers in the United Kingdom suggest the need to redefine the in-store shopping experience, promoting the use of technology as a way to create an engaging and integrated experience among channels. Retailers must think of all channels holistically, boosting interactive and new technologies for the Internet and taking advantage of all touchpoints with the consumer, including mobile devices and social networks. |
Automatic virtual machine configuration for database workloads | Virtual machine monitors are becoming popular tools for the deployment of database management systems and other enterprise software. In this article, we consider a common resource consolidation scenario in which several database management system instances, each running in a separate virtual machine, are sharing a common pool of physical computing resources. We address the problem of optimizing the performance of these database management systems by controlling the configurations of the virtual machines in which they run. These virtual machine configurations determine how the shared physical resources will be allocated to the different database system instances. We introduce a virtualization design advisor that uses information about the anticipated workloads of each of the database systems to recommend workload-specific configurations offline. Furthermore, runtime information collected after the deployment of the recommended configurations can be used to refine the recommendation and to handle changes in the workload. To estimate the effect of a particular resource allocation on workload performance, we use the query optimizer in a new what-if mode. We have implemented our approach using both PostgreSQL and DB2, and we have experimentally evaluated its effectiveness using DSS and OLTP workloads. |
A method for learning matching errors for stereo computation | This paper describes a novel learning-based approach for improving the performance of stereo computation. It is based on the observation that whether the image matching scores lead to true or erroneous depth values is dependent on the original stereo images and the underlying scene structure. This function is learned from training data and is integrated into a depth estimation algorithm using the MAP-MRF framework. Because the resultant likelihood function is dependent on the states of a large neighboring region around each pixel, we propose to solve the high-order MRF inference problem using the simulated annealing algorithm combined with a Metropolis-Hastings sampler. A segmentation-based approach is proposed to accelerate the computational speed and improve the performance. Preliminary experimental results show that the learning process captures common errors in SSD matching including the fattening effect, the aperture effect, and mismatches in occluded or low texture regions. It is also demonstrated that the proposed approach significantly improves the accuracy of the depth computation. |
A phase III study of the safety and efficacy of viramidine versus ribavirin in treatment-naïve patients with chronic hepatitis C: ViSER1 results. |
Pegylated interferon (peg-IFN) and ribavirin (RBV) are effective in eradicating the hepatitis C virus in more than half of patients. However, anemia arising from RBV-induced hemolysis can prompt dose reductions and lower sustained virologic response (SVR) rates. In early clinical trials, Viramidine (VRD, renamed taribavirin), an RBV prodrug, was associated with less anemia, and VRD given at 600 mg twice daily (BID) appeared to provide the best safety with comparable efficacy to RBV. The phase III Viramidine's Safety and Efficacy versus Ribavirin 1 (ViSER1) study randomized 972 treatment-naïve patients with chronic hepatitis C to fixed-dose VRD (600 mg BID) or weight-based RBV (1000 or 1200 mg/day), each given with peg-IFN alfa-2b at 1.5 μg/kg/week. The primary efficacy endpoint was SVR rate, and the primary safety endpoint was hemoglobin (Hb) event rate (percent of patients with Hb < 10 g/dL or at least a 2.5-g/dL decrease from baseline). SVR rates were 37.7% with VRD (244/647) and 52.3% with RBV (170/325). Thus, the ViSER1 study failed to demonstrate the primary noninferiority efficacy endpoint. Significantly fewer patients had Hb events with VRD (353/647; 54.6%) compared to those with RBV (272/325; 83.7%) (P < 0.001), and significantly fewer developed anemia (Hb < 10 g/dL) with VRD (34/647; 5.3%) compared to those with RBV (76/325; 23.5%) (P < 0.001).
CONCLUSION
Fixed doses of VRD failed to demonstrate noninferiority to RBV in producing SVR rates. The incidence of anemia was significantly lower, by approximately fourfold, with VRD than with RBV. These results suggest that fixed-dose VRD given at 600 mg BID is insufficient to treat patients with chronic hepatitis C; a weight-based dosing trial of viramidine is currently under way. |
Hydraulics and life history of tropical dry forest tree species: coordination of species' drought and shade tolerance. | Plant hydraulic architecture has been studied extensively, yet we know little about how hydraulic properties relate to species' life history strategies, such as drought and shade tolerance. The prevailing theories seem contradictory. We measured the sapwood (K_s) and leaf (K_l) hydraulic conductivities of 40 coexisting tree species in a Bolivian dry forest, and examined associations with functional stem and leaf traits and indices of species' drought (dry-season leaf water potential) and shade (juvenile crown exposure) tolerance. Hydraulic properties varied across species and between life-history groups (pioneers vs shade-tolerant, and deciduous vs evergreen species). In addition to the expected negative correlation of K_l with drought tolerance, we found a strong, negative correlation between K_l and species' shade tolerance. Across species, K_s and K_l were negatively correlated with wood density and positively with maximum vessel length. Consequently, drought and shade tolerance scaled similarly with hydraulic properties, wood density and leaf dry matter content. We found that deciduous species also had traits conferring efficient water transport relative to evergreen species. Hydraulic properties varied across species, corresponding to the classical trade-off between hydraulic efficiency and safety, which for these dry forest trees resulted in coordinated drought and shade tolerance across species rather than the frequently hypothesized trade-off. |
How to Recommend?: User Trust Factors in Movie Recommender Systems | How much trust a user places in a recommender is crucial to the uptake of the recommendations. Although prior work established various factors that build and sustain user trust, their comparative impact has not been studied in depth. This paper presents the results of a crowdsourced study examining the impact of various recommendation interfaces and content selection strategies on user trust. It evaluates the subjective ranking of nine key factors of trust grouped into three dimensions and examines the differences observed with respect to users' personality traits. |
Predicting the Success of Online Petitions Leveraging Multidimensional Time-Series | Applying classical time-series analysis techniques to online content is challenging, as web data tends to have data quality issues and is often incomplete, noisy, or poorly aligned. In this paper, we tackle the problem of predicting the evolution of a time series of user activity on the web in a manner that is both accurate and interpretable, using related time series to produce a more accurate prediction. We test our methods in the context of predicting signatures for online petitions using data from thousands of petitions posted on The Petition Site—one of the largest platforms of its kind. We observe that the success of these petitions is driven by a number of factors, including promotion through social media channels and on the front page of the petitions platform. We propose an interpretable model that incorporates seasonality, aging effects, self-excitation, and external effects. The interpretability of the model is important for understanding the elements that drive the activity of online content. We show through an extensive empirical evaluation that our model is significantly better at predicting the outcome of a petition than state-of-the-art techniques. |
PuDianNao: A Polyvalent Machine Learning Accelerator | Machine Learning (ML) techniques are pervasive tools in various emerging commercial applications, but have to be accommodated by powerful computer systems to process very large data. Although general-purpose CPUs and GPUs have provided straightforward solutions, their energy efficiency is limited by their excessive support for flexibility. Hardware accelerators may achieve better energy efficiency, but each accelerator often accommodates only a single ML technique (family). According to the famous No-Free-Lunch theorem in the ML domain, however, an ML technique that performs well on one dataset may perform poorly on another, which implies that such an accelerator may sometimes lead to poor learning accuracy. Even setting aside learning accuracy, such an accelerator can still become inapplicable simply because the concrete ML task is altered, or the user chooses another ML technique.
In this study, we present an ML accelerator called PuDianNao, which accommodates seven representative ML techniques: k-means, k-nearest neighbors, naive Bayes, support vector machine, linear regression, classification tree, and deep neural network. Benefiting from our thorough analysis of the computational primitives and locality properties of different ML techniques, PuDianNao can perform up to 1056 GOP/s (e.g., additions and multiplications) in an area of 3.51 mm^2, while consuming only 596 mW. Compared with the NVIDIA K20M GPU (28nm process), PuDianNao (65nm process) is 1.20x faster and reduces energy consumption by 128.41x. |
Nanosecond Pulse Generator with Variable Pulse Duration for the Study of Pulse Induced Biological Effects | We have developed a Blumlein line pulse generator which utilizes the superposition of electrical pulses launched from two individually switched pulse forming lines. By using a fast power MOSFET as a switch on each end of the Blumlein line, we were able to generate pulses with amplitudes of 1 kV across a 100-Ω load. Pulse duration and polarity can be controlled by the temporal delay in the triggering of the two switches. In addition, the use of identical switches allows us to overcome pulse distortions arising from the use of non-ideal switches in the traditional Blumlein configuration. With this pulse generator, pulses with durations between 8 and 300 ns were applied to Jurkat cells (a leukemia cell line) to investigate the pulse dependent increase in calcium levels. The development of the calcium levels in individual cells was studied by spinning-disc confocal fluorescent microscopy with the calcium indicator, fluo-4. With this fast imaging system, fluorescence changes, representing calcium mobilization, could be resolved with an exposure of 5 ms every 18 ms. For a 60-ns pulse duration, each rise in intracellular calcium was greater as the electric field strength was increased from 25 kV/cm to 100 kV/cm. Only for the highest electric field strength is the response dependent on the presence of extracellular calcium. The results complement ion-exchange mechanisms previously observed during the charging of cellular membranes, which were suggested by observations of membrane potential changes during exposure. |
Access Control Models for Virtual Object Communication in Cloud-Enabled IoT | The Internet of Things (IoT) is the latest evolution of the Internet, encompassing an enormous number of connected physical "things." The access-control oriented (ACO) architecture was recently proposed for cloud-enabled IoT, with virtual objects (VOs) and cloud services in the middle layers. A central aspect of ACO is to control communication among VOs. This paper develops operational and administrative access control models for this purpose, assuming topic-based publish-subscribe interaction among VOs. Operational models are developed using (i) access control lists for topics and capabilities for virtual objects and (ii) attribute-based access control, and it is argued that role-based access control is not suitable for this purpose. Administrative models for these two operational models are developed using (i) access control lists, (ii) role-based access control, and (iii) attribute-based access control. A use case illustrates the details of these access control models for VO communication, and their differences. An assessment of these models with respect to security and privacy preserving objectives of IoT is also provided. |
Code Improvements for Model Elimination Based Reasoning Systems | We have been investigating ways in which the performance of model elimination based systems can be improved, and in this paper we present some of our results. Firstly, we have investigated code improvements based on local and global analysis of the internal knowledge base used by the theorem prover. Secondly, we have looked into the use of n lists to represent ancestor goal information, to see if this gives a performance boost over the traditional two-list approach. This n-list representation might be thought of as a simple hash table. Thirdly, we conducted initial investigations into the effect of rule body literal ordering on performance. The results for the code improvements show them to be worthwhile, producing gains in some example problems. Using the n-list representation gave mixed results: for some examples it improved execution speed, in others it degraded it. A rule body literal ordering that placed instantiated goals (including hypotheses) early in the bodies of rules showed an improvement in execution time. |
SHIHbot: A Facebook chatbot for Sexual Health Information on HIV/AIDS | We present the implementation of an autonomous chatbot, SHIHbot, deployed on Facebook, which answers a wide variety of sexual health questions on HIV/AIDS. The chatbot's response database is compiled from professional medical and public health resources in order to provide reliable information to users. The system's backend is NPCEditor, a response selection platform trained on linked questions and answers; to our knowledge this is the first retrieval-based chatbot deployed on a large public social network. |
Genome sequence and comparative analysis of the solvent-producing bacterium Clostridium acetobutylicum. | The genome sequence of the solvent-producing bacterium Clostridium acetobutylicum ATCC 824 has been determined by the shotgun approach. The genome consists of a 3.94-Mb chromosome and a 192-kb megaplasmid that contains the majority of genes responsible for solvent production. Comparison of C. acetobutylicum to Bacillus subtilis reveals significant local conservation of gene order, which has not been seen in comparisons of other genomes with similar, or, in some cases closer, phylogenetic proximity. This conservation allows the prediction of many previously undetected operons in both bacteria. However, the C. acetobutylicum genome also contains a significant number of predicted operons that are shared with distantly related bacteria and archaea but not with B. subtilis. Phylogenetic analysis is compatible with the dissemination of such operons by horizontal transfer. The enzymes of the solventogenesis pathway and of the cellulosome of C. acetobutylicum comprise a new set of metabolic capacities not previously represented in the collection of complete genomes. These enzymes show a complex pattern of evolutionary affinities, emphasizing the role of lateral gene exchange in the evolution of the unique metabolic profile of the bacterium. Many of the sporulation genes identified in B. subtilis are missing in C. acetobutylicum, which suggests major differences in the sporulation process. Thus, comparative analysis reveals both significant conservation of the genome organization and pronounced differences in many systems that reflect unique adaptive strategies of the two gram-positive bacteria. |
The influence of sleep quality, sleep duration and sleepiness on school performance in children and adolescents: A meta-analytic review. | Insufficient sleep, poor sleep quality and sleepiness are common problems in children and adolescents, and are related to learning, memory and school performance. The associations between sleep quality (k=16 studies, N=13,631), sleep duration (k=17 studies, N=15,199), sleepiness (k=17, N=19,530) and school performance were examined in three separate meta-analyses including influential factors (e.g., gender, age, parameter assessment) as moderators. All three sleep variables were significantly but modestly related to school performance. Sleepiness showed the strongest relation to school performance (r=-0.133), followed by sleep quality (r=0.096) and sleep duration (r=0.069). Effect sizes were larger for studies including younger participants, which can be explained by dramatic prefrontal cortex changes during (early) adolescence. Concerning the relationship between sleep duration and school performance, age effects were even larger in studies that included more boys than in studies that included more girls, demonstrating the importance of differential pubertal development of boys and girls. Longitudinal and experimental studies are recommended in order to gain more insight into the different relationships and to develop programs that can improve school performance by changing individuals' sleep patterns. |
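The abstract does not detail its pooling procedure, but the standard fixed-effect approach for combining correlations across studies — Fisher's r-to-z transform with inverse-variance weights — can be sketched as follows (an illustration of the general technique, not necessarily the authors' exact method):

```python
import math

def pooled_correlation(studies):
    """Fixed-effect meta-analytic pooling of correlations via Fisher's z.
    `studies` is a list of (r, n) pairs; each study is weighted by n - 3,
    the inverse of the sampling variance of its z-transformed correlation."""
    num = den = 0.0
    for r, n in studies:
        z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher r-to-z transform
        w = n - 3
        num += w * z
        den += w
    z_bar = num / den
    return math.tanh(z_bar)  # back-transform the pooled z to a correlation
```

With identical per-study correlations the pooled estimate simply reproduces that value; otherwise larger studies pull the estimate toward their own r.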
Another glimpse over the salting-out assisted liquid-liquid extraction in acetonitrile/water mixtures. | The use of the salting-out effect in analytical chemistry is very diverse and can be applied to increase the volatility of the analytes in headspace extractions, to cause the precipitation of proteins in biological samples, or to improve the recoveries in liquid-liquid extractions. In the latter, the salting-out process can be used to create a phase separation between water-miscible organic solvents and water. Salting-out assisted liquid-liquid extraction (SALLE) is an advantageous sample preparation technique aimed at HPLC-UV analysis when developing analytical methodologies. In fact, some new extraction methodologies like QuEChERS include the SALLE concept. This manuscript discusses another point of view on SALLE, with particular emphasis on acetonitrile-water mixtures for HPLC-UV analysis; the influence of the salting-out agents, their concentration, and the water-acetonitrile volume ratios were the studied parameters. α-dicarbonyl compounds and beer were used as test analytes and test samples, respectively. The influence of the studied parameters was characterized by the obtained phase separation volume ratio and the fraction of α-dicarbonyls extracted to the acetonitrile phase. Results allowed the salts to be grouped into three classes according to the phase separation and their extractability: (1) chlorides and acetates, (2) carbonates and sulfates, and (3) magnesium sulfate; of all tested salts, sodium chloride had the highest influence on the fraction of α-dicarbonyls extracted. |
An efficient reconciliation algorithm for social networks | People today typically use multiple online social networks (Facebook, Twitter, Google+, LinkedIn, etc.). Each online network represents a subset of their “real” ego-networks. An interesting and challenging problem is to reconcile these online networks, that is, to identify all the accounts belonging to the same individual. Besides providing a richer understanding of social dynamics, the problem has a number of practical applications. At first sight, this problem appears algorithmically challenging. Fortunately, a small fraction of individuals explicitly link their accounts across multiple networks; our work leverages these connections to identify a very large fraction of the network. Our main contributions are to mathematically formalize the problem for the first time, and to design a simple, local, and efficient parallel algorithm to solve it. We are able to prove strong theoretical guarantees on the algorithm’s performance on well-established network models (Random Graphs, Preferential Attachment). We also experimentally confirm the effectiveness of the algorithm on synthetic and real social network data sets. |
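The abstract's idea of growing a matching outward from explicitly linked accounts can be illustrated with a simplified percolation-style sketch: starting from known seed pairs, repeatedly link the unmatched pair with the most already-matched common neighbors. The thresholds, round count, and greedy acceptance rule below are my own illustrative choices, not the algorithm analyzed in the paper:

```python
from collections import defaultdict

def reconcile(edges1, edges2, seeds, min_evidence=2, rounds=5):
    """Greedy seed-and-expand reconciliation sketch between two networks.
    `seeds` is a list of known (network1_id, network2_id) account pairs."""
    adj1, adj2 = defaultdict(set), defaultdict(set)
    for a, b in edges1:
        adj1[a].add(b); adj1[b].add(a)
    for a, b in edges2:
        adj2[a].add(b); adj2[b].add(a)
    matched = dict(seeds)                       # network1 id -> network2 id
    matched_rev = {v: k for k, v in matched.items()}
    for _ in range(rounds):
        # score candidate pairs by their number of already-matched neighbors
        scores = defaultdict(int)
        for u1, u2 in matched.items():
            for v1 in adj1[u1]:
                if v1 in matched:
                    continue
                for v2 in adj2[u2]:
                    if v2 in matched_rev:
                        continue
                    scores[(v1, v2)] += 1
        if not scores:
            break
        # greedily accept the best-supported pairs above the threshold
        for (v1, v2), s in sorted(scores.items(), key=lambda kv: -kv[1]):
            if s < min_evidence:
                break
            if v1 in matched or v2 in matched_rev:
                continue
            matched[v1] = v2
            matched_rev[v2] = v1
    return matched
```

Each newly accepted pair provides fresh evidence for its neighbors in the next round, which is what lets a small seed set percolate across a large fraction of the network.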
Intelligent book positioning for library using RFID and book spine matching | In a library, the management of books is complicated and time-consuming. The location of a book can be altered by librarians, students, teachers, and anyone else in the library, so locating a particular book is not an easy task in a large library. Indoor positioning is an important technology for supporting storage management and customer service. RFID provides a good wireless platform to facilitate indoor positioning. However, due to the small width of each book spine, positioning based on RFID alone is not enough to locate books in a library. In this work, we combine image matching with an L-GEM based RBFNN to enhance the accuracy and robustness of the book locating system. We apply this new method in a library environment to locate specific books. Experimental results show that the proposed method is highly accurate and robust to white noise in RFID signals. |
Light it up: using paper circuitry to enhance low-fidelity paper prototypes for children | Paper prototyping is an important tool for designing and testing early technologies during development. However, children have different relationships with technology and thus one cannot expect children to assess paper prototypes with the same mental model as adults. In this paper, we examine the effect of incorporating paper circuitry into low-fidelity paper prototypes, in order to add a level of interactivity that is not present in traditional paper prototypes. We conducted a study with 20 children ages 3 to 10 years old where participants used a cardboard prototype of a voice-controlled rocket on a pretend play mission to Mars. Children chose between buttons that lit up when pressed using paper circuitry, and buttons that did not light up, and explained their selections to the researchers. Our results show that children indeed preferred buttons augmented with paper circuitry, demonstrating more attention for and increased believability in the function of these buttons as well as the overall system. These findings show how designers can use paper circuitry to more effectively engage children while play-testing their paper prototypes. |
Fast matting using large kernel matting Laplacian matrices | Image matting is of great importance in both computer vision and graphics applications. Most existing state-of-the-art techniques rely on large sparse matrices such as the matting Laplacian [12]. However, solving these linear systems is often time-consuming, which is undesirable for user interaction. In this paper, we propose a fast method for high quality matting. We first derive an efficient algorithm to solve a large kernel matting Laplacian. A large kernel propagates information more quickly and may improve the matte quality. To further reduce running time, we also use adaptive kernel sizes via a KD-tree trimap segmentation technique. A variety of experiments show that our algorithm provides high quality results and is 5 to 20 times faster than previous methods. |
Comparison of Lexical Approach and Machine Learning Methods for Sentiment Analysis | The machine learning approach is represented by two methods: the maximum entropy method and the support vector machine. Text representation for the maximum entropy method includes information about the proportion of positive and negative words and collocations, the number of question and exclamation marks, emoticons, and obscene language. For the support vector machine, binary vectors with cosine normalization are built from the texts. |
Detecting Controversies in Online News Media | This paper sets out to detect controversial news reports using online discussions as a source of information. We define controversy as a public discussion that divides society and demonstrate that a content and stylometric analysis of these debates yields useful signals for extracting disputed news items. Moreover, we argue that a debate-based approach could produce more generic models, since the discussion architectures we exploit to measure controversy occur on many different platforms. |
The Search for Quasi-Periodicity in Islamic 5-fold Ornament | The Penrose tilings are remarkable in that they are non-periodic (have no translational symmetry) but are clearly organised. Their structure, called quasi-periodicity, can be described in several ways, including via self-similar subdivision, tiles with matching rules, and projection of a slice of a cubic lattice in R^5. The tilings are also unusual for their many centres of local 5-fold and 10-fold rotational symmetry, features shared by some Islamic geometric patterns. This resemblance has prompted comparison, and has led some to see precursors of the Penrose tilings and even evidence of quasi-periodicity in traditional Islamic designs. Bonner [2] identified three styles of self-similarity; Makovicky [20] was inspired to develop new variants of the Penrose tiles and later, with colleagues [24], overlaid Penrose-type tilings on traditional Moorish designs; more recently, Lu and Steinhardt [17] observed the use of subdivision in traditional Islamic design systems and overlaid Penrose kites and darts on Iranian designs. The latter article received widespread exposure in the world’s press, although some of the coverage overstated and misrepresented the actual findings. The desire to search for examples of quasi-periodicity in traditional Islamic patterns is understandable, but we must take care not to project modern motivations and abstractions into the past. An intuitive knowledge of group theory is sometimes attributed to any culture that has produced repeating patterns displaying a wide range of symmetry types, even though they had no abstract notion of a group. There are two fallacies to avoid: • abstraction: P knew about X and X is an example of Y, therefore P knew Y. • deduction: P knew X and X implies Y, therefore P knew Y. |
Reservoir Computing in Forecasting Financial Markets | The ability of the echo state network to learn chaotic time series makes it an interesting tool for financial forecasting, where data is very nonlinear and complex. In this study I initially examine the Mackey-Glass system to determine how different global parameters can optimize training in an echo state network. In order to simultaneously optimize multiple parameters I conduct a grid search to explore the mean squared error surface plot. In the grid search I find that error is relatively stable over certain ranges of the leaking rate and spectral radius. However, the ranges over which the Mackey-Glass system minimizes error do not correspond with an error surface plot minimum for financial data, as a result of intrinsic qualities such as step size and timescale of dynamics in the data. The study of chaos in financial time series data leads me to find alternate understandings of the distribution of the relative stock price over time. I find the Lorentzian distribution and the Voigt profile are good models for explaining the thick tails that characterize large returns and losses, which are not explained in the common Gaussian model. These distributions act as an untrained random model to benchmark the predictions of the echo state network trained on the historical price changes in the S&P 500. The global reservoir parameters, optimized in a grid search given financial input data, do not lead to significant predictive abilities. Studies of committees of multiple reservoirs are shown to give similar forecasts to single reservoirs. Compared to a benchmark random sample from the defined distribution of previous input, the echo state network is not able to make significantly better forecasts, suggesting the necessity of more sophisticated statistical techniques and the need to better understand chaotic dynamics in finance. |
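A minimal echo state network of the kind this abstract describes — a random fixed reservoir with a leaky-integrator update and a ridge-trained linear readout — can be sketched as follows; all hyperparameter values are illustrative defaults, not the values tuned in the study:

```python
import numpy as np

def esn_forecast(series, n_reservoir=100, spectral_radius=0.9,
                 leak=0.3, ridge=1e-6, seed=0):
    """One-step-ahead forecast with a minimal echo state network.
    The reservoir weights are random and fixed; only the linear
    readout is trained, by ridge regression."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-0.5, 0.5, size=n_reservoir)
    w = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
    w *= spectral_radius / max(abs(np.linalg.eigvals(w)))  # rescale spectral radius
    x = np.zeros(n_reservoir)
    states = []
    for u in series:
        # leaky-integrator reservoir update driven by the scalar input u
        x = (1 - leak) * x + leak * np.tanh(w_in * u + w @ x)
        states.append(x.copy())
    S = np.array(states[:-1])          # state after inputs u_0..u_{T-2}
    y = np.asarray(series)[1:]         # targets: the next value of the series
    # ridge-regression readout: (S'S + ridge I) w_out = S'y
    w_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_reservoir), S.T @ y)
    return states[-1] @ w_out          # forecast for the step after the series ends
```

The leaking rate and spectral radius set here are exactly the two global parameters the abstract's grid search sweeps over.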
A CMOS beta multiplier voltage reference with improved temperature performance and silicon tunability | A new implementation has been proposed for the beta multiplier voltage reference to improve its performance with regard to process variations. The scope for silicon tunability of the proposed circuit is also discussed. The circuit was implemented in a 0.18 μm process and was found to have a temperature sensitivity of less than 500 ppm/°C in the virgin die without trimming. |
Adsorption of methylene blue from industrial effluent using poly (vinyl alcohol) | The application of poly(vinyl alcohol) (PVA) in an unmodified form as an adsorbent for the removal of methylene blue (MB) dye from wastewater at 303–333 K was investigated. The effects of operational parameters such as adsorbent dosage, initial dye concentration, and pH were determined to find the optimum conditions for maximum dye removal. The results showed that dye removal efficiency increased as the amount of adsorbent increased. The lowest initial dye concentration and a pH of 4.0 gave maximum adsorption of MB onto PVA. The adsorption data were found to fit well the Langmuir, Freundlich, and Temkin adsorption isotherm models, with correlation coefficients (R > 0.95) in the concentration range of MB studied. Adsorption kinetics studies revealed that the pseudo-second-order model provided the best fit to the experimental data compared with the pseudo-first-order model. Thermodynamic parameters revealed that the adsorption process was non-spontaneous and exothermic, with an orderly adherence of the dye molecules on the adsorbent surface. |
48V Power Assist Recuperation System (PARS) with a permanent magnet motor, inverter and DC-DC converter | The present study proposes a novel Power Assist Recuperation System consisting of a motor, an inverter, a DC-DC converter, and a 48V Li-ion battery for fuel saving in conventional internal combustion engine and hybrid vehicles. As a total solution for the next-generation 48V PARS, key components such as the interior permanent magnet synchronous motor (IPMSM), the MOSFET power switching inverter, and the bi-directional DC-DC converter, with cost competitiveness from in-house power ICs and power modules, have been optimized around a 48V Li-ion battery. A deep flux-weakening control has been evaluated for the IPMSM's high-speed operation. The reduction in CO2 emissions and fuel consumption achievable with the proposed 48V PARS was estimated with the Autonomie software, using the efficiency maps of both motoring and generating modes. The present 48V PARS gives considerably enhanced performance, reducing fuel consumption by as much as 17% for a commercial 1.8L engine. |
Are Autonomous Mobile Robots Able to Take Over Construction? A Review | Received: May 17, 2015. Received in revised form: October 15, 2015. Accepted: October 25, 2015. Although construction has been known as a highly complex application field for autonomous robotic systems, recent advances in this field offer great hope for using robotic capabilities to develop automated construction. Today, space research agencies seek to build infrastructures without human intervention, and construction companies look to robots with the potential to improve construction quality, efficiency, and safety, not to mention flexibility in architectural design. However, unlike production robots used, for instance, in the automotive industry, autonomous robots should be designed with special consideration for challenges such as the complexity of the cluttered and dynamic working space, human-robot interaction, and inaccuracy in positioning due to the nature of mobile systems and the lack of affordable and precise self-positioning solutions. This paper briefly reviews state-of-the-art research into automated construction by autonomous mobile robots. We address and classify the relevant studies in terms of applications, materials, and robotic systems. We also identify ongoing challenges and discuss future robotic requirements for automated construction. |
Understanding Android Obfuscation Techniques: A Large-Scale Investigation in the Wild | Program code is a precious asset to its owner. Due to the easy-to-reverse nature of Java, code protection for Android apps is of particular importance. To this end, code obfuscation is widely utilized by both legitimate app developers and malware authors; it complicates the representation of source code or machine code in order to hinder manual investigation and code analysis. Despite many previous studies focusing on obfuscation techniques, our knowledge of how obfuscation is applied by real-world developers is still limited. In this paper, we seek to better understand Android obfuscation and depict a holistic view of the usage of obfuscation through a large-scale investigation in the wild. In particular, we focus on four popular obfuscation approaches: identifier renaming, string encryption, Java reflection, and packing. To obtain meaningful statistical results, we designed efficient and lightweight detection models for each obfuscation technique and applied them to our massive APK datasets (collected from Google Play, multiple third-party markets, and malware databases). We have learned several interesting facts from the results. For example, malware authors use string encryption more frequently, and more apps on third-party markets than on Google Play are packed. We are also interested in the explanation of each finding; we therefore carry out in-depth code analysis on a sample of Android apps. We believe our study will help developers select the most suitable obfuscation approach, and in the meantime help researchers improve code analysis systems in the right direction. |
The Fall of the Roman Republic | Dramatic artist, natural scientist and philosopher, Plutarch is widely regarded as the most significant historian of his era, writing sharp and succinct accounts of the greatest politicians and statesmen of the classical period. Taken from the "Lives", a series of biographies spanning the Graeco-Roman age, this collection illuminates the twilight of the old Roman Republic from 157–43 BC. Whether describing the would-be dictators Marius and Sulla, the battle between Crassus and Spartacus, the death of the political idealist Cato, Julius Caesar's harrowing triumph in Gaul or the eloquent oratory of Cicero, all offer a fascinating insight into an empire wracked by political divisions. Deeply influential on Shakespeare and many other later writers, they continue to fascinate today with their exploration of corruption, decadence and the struggle for ultimate power. |
Forecasting Residential Real Estate Price Changes from Online Search Activity | Abstract: The intention of buying a home is revealed by many potential home buyers when they turn to the internet to search for their future residence. Therefore, the aggregated amount of today’s real estate related searches is likely to provide information about the future demand for housing and possibly predict future housing price trends. This paper examines the extent to which future cross-sectional differences in home price changes are predicted by online search intensity in prior periods. Our findings are economically meaningful and suggest that abnormal search intensity for real estate in a particular city can help predict the city’s future abnormal housing price change. These findings hold even after we control for momentum in house prices. On average, cities associated with abnormally high real estate search intensity consistently outperform cities with abnormally low real estate search volume by as much as 8.5% over a two-year period. This outperformance appears to exhibit eventual reversal and is particularly noticeable for cities with lower land supply elasticity. Moreover, the results show that home prices are more sensitive to an “uptick” rather than a “downtick” in search intensity – consistent with the upward “stickiness” characteristic of home prices. |
BCD (Bipolar-CMOS-DMOS) technology trends for power management IC | This paper reviews the technology trends of BCD (Bipolar-CMOS-DMOS) technology in terms of voltage capability, switching speed of the power transistor, and high integration of logic CMOS for SoC (System-on-Chip) solutions requiring high-voltage devices. Recent trends such as modularity of the process, power metal routing, and high-density NVM (Non-Volatile Memory) are also discussed. |
SUMMAC: a text summarization evaluation | The TIPSTER Text Summarization Evaluation (SUMMAC) has developed several new extrinsic and intrinsic methods for evaluating summaries. It has established definitively that automatic text summarization is very effective in relevance assessment tasks on news articles. Summaries as short as 17% of full text length sped up decision-making by almost a factor of 2, with no statistically significant degradation in accuracy. Analysis of feedback forms filled in after each decision indicated that the intelligibility of present-day machine-generated summaries is high. Systems that performed most accurately in the production of indicative and informative topic-related summaries used term frequency and co-occurrence statistics, and vocabulary overlap comparisons between text passages. However, in the absence of a topic, these statistical methods do not appear to provide any additional leverage: in the case of generic summaries, the systems were indistinguishable in accuracy. The paper discusses some of the tradeoffs and challenges faced by the evaluation, and also lists some of the lessons learned, impacts, and possible future directions. The evaluation methods used in the SUMMAC evaluation are of interest both to summarization evaluation and to the evaluation of other 'output-related' NLP technologies, where there may be many potentially acceptable outputs, with no automatic way to compare them. |
Template Attacks | We present template attacks, the strongest form of side channel attack possible in an information theoretic sense. These attacks can break implementations and countermeasures whose security is dependent on the assumption that an adversary cannot obtain more than one or a limited number of side channel samples. They require that an adversary has access to an identical experimental device that he can program to his choosing. The success of these attacks in such constraining situations is due to the manner in which noise within each sample is handled. In contrast to previous approaches, which viewed noise as a hindrance that had to be reduced or eliminated, our approach focuses on precisely modeling noise, and using this to fully extract information present in a single sample. We describe in detail how an implementation of RC4, not amenable to techniques such as SPA and DPA, can easily be broken using template attacks with a single sample. Other applications include attacks on certain DES implementations which use DPA-resistant hardware and certain SSL accelerators which can be attacked by monitoring electromagnetic emanations from an RSA operation, even from distances of fifteen feet. |
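The noise-modeling idea can be sketched with a toy profiling-and-attack simulation. Everything here is synthetic and illustrative (four hypothetical key classes, five leakage points per trace, Gaussian noise); it is not the paper's RC4 attack, but it shows the template mechanics: estimate a mean and covariance per class on a profiling device, then classify a single trace by Gaussian log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(1)
N_CLASSES, DIM = 4, 5          # toy key classes, leakage points per trace (assumptions)

# Synthetic device: class-dependent mean leakage plus correlated Gaussian noise.
means = rng.normal(0.0, 1.0, (N_CLASSES, DIM))
A = rng.normal(0.0, 0.15, (DIM, DIM))
cov = A @ A.T + 0.05 * np.eye(DIM)             # shared noise covariance

def capture(cls, n):
    """Simulate n side channel traces for a given key class."""
    return rng.multivariate_normal(means[cls], cov, n)

# Profiling phase: build a (mean, covariance) template per class
# using traces from an identical device the adversary controls.
templates = [(tr.mean(axis=0), np.cov(tr, rowvar=False))
             for tr in (capture(c, 500) for c in range(N_CLASSES))]

def classify(trace):
    """Return the class whose Gaussian template best explains the trace."""
    scores = []
    for mu, sigma in templates:
        diff = trace - mu
        _, logdet = np.linalg.slogdet(sigma)
        scores.append(-0.5 * (diff @ np.linalg.inv(sigma) @ diff + logdet))
    return int(np.argmax(scores))

# Attack phase: a single trace per attempt, as in the paper's setting.
trials = [(c, classify(capture(c, 1)[0])) for c in range(N_CLASSES) for _ in range(50)]
accuracy = sum(c == guess for c, guess in trials) / len(trials)
```

Because the noise is modeled rather than averaged away, one trace per attempt already yields high accuracy in this toy setup.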
A systematic and comprehensive investigation of methods to build and evaluate fault prediction models | This paper describes a study performed in an industrial setting that attempts to build predictive models to identify parts of a Java system with a high fault probability. The system under consideration is constantly evolving, as several releases a year are shipped to customers. Developers usually have limited resources for their testing and would like to devote extra resources to faulty system parts. The main research focus of this paper is to systematically assess three aspects of how to build and evaluate fault-proneness models in the context of this large Java legacy system development project: (1) compare many data mining and machine learning techniques to build fault-proneness models, (2) assess the impact of using different metric sets such as source code structural measures and change/fault history (process measures), and (3) compare several alternative ways of assessing the performance of the models, in terms of (i) confusion matrix criteria such as accuracy and precision/recall, (ii) ranking ability, using the receiver operating characteristic area (ROC), and (iii) our proposed cost-effectiveness measure (CE). The results of the study indicate that the choice of fault-proneness modeling technique has limited impact on the resulting classification accuracy or cost-effectiveness. There are, however, large differences between the individual metric sets in terms of cost-effectiveness, and although the process measures are among the most expensive ones to collect, including them as candidate measures significantly improves the prediction models compared with models that only include structural measures and/or their deltas between releases, both in terms of ROC area and in terms of CE. Further, we observe that what is considered the best model is highly dependent on the criteria that are used to evaluate and compare the models.
Moreover, the regular confusion matrix criteria, although popular, are not clearly related to the problem at hand, namely the cost-effectiveness of using fault-proneness prediction models to focus verification efforts so as to deliver software with fewer faults at lower cost. © 2009 Elsevier Inc. All rights reserved. |
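The evaluation criteria contrasted in the study, thresholded confusion matrix measures versus threshold-free ranking ability, can be illustrated with a small self-contained sketch (toy labels and scores, not the paper's data; the paper's CE measure is project-specific and omitted here):

```python
import numpy as np

def confusion(y_true, y_pred):
    """Return (TP, FP, FN, TN) for binary fault predictions."""
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    return tp, fp, fn, tn

def roc_auc(y_true, scores):
    """Rank-based AUC: probability a faulty module outranks a fault-free one."""
    pos, neg = scores[y_true == 1], scores[y_true == 0]
    wins = sum(float(p > n) + 0.5 * float(p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y = np.array([1, 1, 1, 0, 0, 0, 0, 0])                  # 1 = module contained a fault
s = np.array([0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.2, 0.1])  # predicted fault-proneness
pred = (s >= 0.5).astype(int)                           # threshold for confusion matrix

tp, fp, fn, tn = confusion(y, pred)
precision, recall = tp / (tp + fp), tp / (tp + fn)
auc = roc_auc(y, s)                                     # threshold-free ranking criterion
```

Note how the thresholded precision/recall (2/3 each here) and the AUC (14/15) can tell different stories about the same model, which is the paper's point about criterion choice.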
Measured Gene-Environment Interactions in Psychopathology: Concepts, Research Strategies, and Implications for Research, Intervention, and Public Understanding of Genetics. | There is much curiosity about interactions between genes and environmental risk factors for psychopathology, but this interest is accompanied by uncertainty. This article aims to address this uncertainty. First, we explain what is and is not meant by gene-environment interaction. Second, we discuss reasons why such interactions were thought to be rare in psychopathology, and argue instead that they ought to be common. Third, we summarize emerging evidence about gene-environment interactions in mental disorders. Fourth, we argue that research on gene-environment interactions should be hypothesis driven, and we put forward strategies to guide future studies. Fifth, we describe potential benefits of studying measured gene-environment interactions for basic neuroscience, gene hunting, intervention, and public understanding of genetics. We suggest that information about nurture might be harnessed to make new discoveries about the nature of psychopathology. |
String cosmology versus standard and inflationary cosmology | This paper presents a review of the basic, model-independent differences between the pre-big-bang scenario, arising naturally in a string cosmology context, and the standard inflationary scenario. We use an unconventional approach in which the introduction of technical details is avoided as much as possible, trying to focus the reader's attention on the main conceptual aspects of both scenarios. The aim of the paper is not to conclude in favour of one scenario or the other, but to raise questions that are left to the reader's meditation. Warning: the paper does not contain equations, and is not intended as a complete review of all aspects of string cosmology. |
Domain Theory in Logical Form | The mathematical framework of Stone duality is used to synthesise a number of hitherto separate developments in theoretical computer science: domain theory, the mathematical theory of computation introduced by Scott as a foundation for denotational semantics; the theory of concurrency and system behaviour developed by Milner, Hennessy et al. based on operational semantics; and logics of programs. |
Building Successful Knowledge Management Projects | It is widely acknowledged that developed economies have gradually been transformed over the past fifty years. Scholars and observers from disciplines as disparate as sociology, economics, and management science generally agree that knowledge has been at the center of this change. 1 Knowledge could be defined as information that has been combined with experience, context, interpretation, and reflection. Given the value of this asset to organizations, it is not surprising that greater attention is being paid to the subject of knowledge: what it is, how it differs from the related concepts of information and data, and how to begin to create, transfer, and use it more effectively. The subject of knowledge management, in particular, has had a recent flowering. 2 |
Design of Circular Polarization Antenna With Harmonic Suppression for Rectenna Application | A microstrip antenna design that effectively attains circular polarization (CP) and harmonic suppression is proposed. By modifying the size and position of two peripheral cuts, two orthogonal modes that have equal amplitude and are 90° out of phase are simultaneously excited. The four right-angle slits embedded in the antenna can block the second- and third-order harmonic signals. This property is especially suitable for nonlinear circuit applications, such as the active integrated antenna (AIA) and the rectifying antenna (rectenna). The adopted CP antenna, built on a low-cost FR4 substrate, has a measured bandwidth of 137 MHz (10-dB return loss) and a 30-MHz CP bandwidth (3-dB axial ratio). A rectenna that comprises the proposed antenna and a rectifying circuit is built to verify the characteristics of the proposed antenna. The measurements show that the proposed antenna performs well, with steady efficiency. |
Biological and docking studies of topoisomerase IV inhibition by thiosemicarbazides. | 4-Benzoyl-1-(4-methyl-imidazol-5-yl)-carbonylthiosemicarbazide (1) was synthesized, and its antibacterial and type IIA topoisomerase (DNA gyrase and topoisomerase IV) activity evaluated. (1) was found to have high therapeutic potential against opportunistic Gram-positive bacteria, and inhibitory activity against topoisomerase IV (IC(50)=90 μM) but not against DNA gyrase. An increase in activity against topoisomerase IV (IC(50)=14 μM) was observed when the imidazole moiety of (1) was replaced with the indole group in 4-benzoyl-1-(indol-2-yl)-carbonylthiosemicarbazide (2). However, (2) showed only weak antibacterial activity. Although the results of the bacterial type IIA topoisomerases inhibition study did not parallel antibacterial activities, our observations strongly imply that a 4-benzoylthiosemicarbazide scaffold can be developed into an efficient Gram-positive antibacterial targeting topoisomerase IV. The difference in activity against type IIA topoisomerases between (1) and (2) was further investigated by docking studies, which suggested that these compounds target the ATP binding pocket. |
Psychosocial health among young victims and offenders of direct and indirect bullying. | OBJECTIVE
To assess the association between bullying (both directly and indirectly) and indicators of psychosocial health for boys and girls separately.
STUDY DESIGN
A school-based questionnaire survey of bullying, depression, suicidal ideation, and delinquent behavior.
SETTING
Primary schools in Amsterdam, The Netherlands.
PARTICIPANTS
A total of 4811 children aged 9 to 13.
RESULTS
Depression and suicidal ideation are common outcomes of being bullied in both boys and girls. These associations are stronger for indirect than direct bullying. After correction, direct bullying had a significant effect on depression and suicidal ideation in girls, but not in boys. Boy and girl offenders of bullying far more often reported delinquent behavior. Bullying others directly is a much greater risk factor for delinquent behavior than bullying others indirectly. This was true for both boys and girls. Boy and girl offenders of bullying also more often reported depressive symptoms and suicidal ideation. However, after correction for both sexes only a significant association still existed between bullying others directly and suicidal ideation.
CONCLUSIONS
The association between bullying and psychosocial health differs notably between girls and boys as well as between direct and indirect forms of bullying. Interventions to stop bullying must pay attention to these differences to enhance effectiveness. |
Long-term therapeutic outcome of ophthalmic complications following endoscopic sinus surgery | Ophthalmic complications associated with endoscopic sinus surgery (ESS) are quite rare. There is a paucity of reliable data and limited experience concerning the clinical findings and treatment of these injuries. Our study characterizes the types of orbital injury following ESS, in particular extraocular muscle injury, and evaluates the long-term therapeutic outcomes as compiled from a relatively large sample of Chinese patients. A series of 27 patients (21 males and 6 females; mean age = 42.6 years, range: 10-60 years) was retrospectively reviewed. The mean duration of orbital complication was 6.6 months (range: 1 day to 24 months). The right eye was affected in 19 patients and the left in 8 patients. All patients had various extraocular muscle dysfunctions, including contusion, oculomotor nerve damage, muscle entrapment, muscle transection, and muscle destruction. All patients subjected to strabismus surgery showed an obvious reduction in deviation. Three patients achieved orthophoria without any surgery during the period of observation. All patients displayed mild to complicated orbital hemorrhage that often disappeared within 2 weeks. Optic nerve injury occurred in 29.6% of patients, and vision damage in these patients was often irreversible. All patients with ophthalmic complications after ESS had strabismus and extraocular muscle dysfunction. The timing and type of strabismus surgery performed depended on the severity and number of muscles involved as well as the type of injury. This surgery is less effective in cases of restriction due to adhesion and/or entrapment as compared to patients with other types of strabismus. Orbital hemorrhages usually resolved spontaneously, but optic nerve injury was mostly irreversible. |
The Adaptation of the “Attitudes toward Research (ATR)” Scale into Turkish | The aim of this study is to adapt the Scale of Attitudes toward Research (ATR), which was developed by Papanastasiou (2005), to Turkish culture. In order to determine whether the Turkish translation of the scale was appropriate in terms of language, the English and Turkish forms of the scale were administered, two weeks apart, to 25 third-year students in the English language teaching department who participated voluntarily. The scale, whose language validity was thus established, was subjected to Confirmatory Factor Analysis (CFA) with the data obtained from 391 people, and item-total correlation and item discrimination were examined. According to the CFA and item analysis results, the scale has maintained its original form in Turkish culture. For the total scale, the Cronbach's alpha coefficient was found to be .92. |
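For readers unfamiliar with the reliability coefficient reported above, Cronbach's alpha can be computed directly from an item-score matrix. A minimal sketch with made-up data (not the study's 391 responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Sanity check: five perfectly consistent (identical) items give alpha = 1.
perfect = np.tile(np.array([[1.0], [2.0], [3.0], [4.0]]), (1, 5))
```

The formula is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score); a value of .92, as reported for the Turkish ATR, indicates high internal consistency.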
ContainerLeaks: Emerging Security Threats of Information Leakages in Container Clouds | Container technology provides a lightweight operating system level virtual hosting environment. Its emergence profoundly changes the development and deployment paradigms of multi-tier distributed applications. However, due to the incomplete implementation of system resource isolation mechanisms in the Linux kernel, some security concerns still exist for multiple containers sharing an operating system kernel on a multi-tenancy container cloud service. In this paper, we first present the information leakage channels we discovered that are accessible within the containers. Such channels expose a spectrum of system-wide host information to the containers without proper resource partitioning. By exploiting such leaked host information, it becomes much easier for malicious adversaries (acting as tenants in the container clouds) to launch advanced attacks that might impact the reliability of cloud services. Additionally, we discuss the root causes of the containers' information leakages and propose a two-stage defense approach. As demonstrated in the evaluation, our solution is effective and incurs trivial performance overhead. |
From Retinex to Automatic Color Equalization: issues in developing a new algorithm for unsupervised color equalization | Abstract. We present a comparison between two color equalization algorithms: Retinex, the famous model due to Land and McCann, and Automatic Color Equalization (ACE), a new algorithm recently presented by the authors. These two algorithms share a common approach to color equalization, but different computational models. We introduce the two models focusing on differences and common points. An analysis of their computational characteristics illustrates the way the Retinex approach has influenced ACE structure, and which aspects of the first algorithm have been modified in the second one and how. Their interesting equalization properties, like lightness and color constancy, image dynamic stretching, global and local filtering, and data driven dequantization, are qualitatively and quantitatively presented and compared, together with their ability to mimic the human visual system. © 2004 SPIE and IS&T. [DOI: 10.1117/1.1635366] |
How to avoid discontinuation of antihypertensive treatment. The experience in São Paulo, Brazil | OBJECTIVES
To evaluate the importance of providing guidelines to patients via active telephone calls for blood pressure control and for preventing the discontinuation of treatment among hypertensive patients.
INTRODUCTION
Many reasons exist for non-adherence to medical regimens, and one of the strategies employed to improve treatment compliance is the use of active telephone calls.
METHODS
Hypertensive patients (n=354) who could receive telephone calls to remind them of their medical appointments and receive instruction about hypertension were distributed into two groups: a) "uncomplicated" - hypertensive patients with no other concurrent diseases and b) "complicated" - severe hypertensive patients (mean diastolic ≥ 110 mmHg with or without medication) or patients with comorbidities. All patients, except those excluded (n=44), were open-block randomized to follow two treatment regimens ("traditional" or "current") and to receive or not receive telephone calls ("phone calls" and "no phone calls" groups, respectively).
RESULTS
Significantly fewer patients in the "phone calls" group discontinued treatment compared to those in the "no phone calls" group (4 vs. 30; p<0.0094). There was no difference in the percentage of patients with controlled blood pressure in the "phone calls" group and "no phone calls" group or in the "traditional" and "current" groups. The percentage of patients with controlled blood pressure (<140/90 mmHg) was increased at the end of the treatment (74%), reaching 80% in the "uncomplicated" group and 67% in the "complicated" group (p<0.000001).
CONCLUSION
Guidance to patients via active telephone calls is an efficient strategy for preventing the discontinuation of antihypertensive treatment. |
A Generalized Path Integral Control Approach to Reinforcement Learning | With the goal to generate more scalable algorithms with higher efficiency and fewer open parameters, reinforcement learning (RL) has recently moved towards combining classical techniques from optimal control and dynamic programming with modern learning techniques from statistical estimation theory. In this vein, this paper suggests to use the framework of stochastic optimal control with path integrals to derive a novel approach to RL with parameterized policies. While solidly grounded in value function estimation and optimal control based on the stochastic Hamilton-Jacobi-Bellman (HJB) equations, policy improvements can be transformed into an approximation problem of a path integral which has no open algorithmic parameters other than the exploration noise. The resulting algorithm can be conceived of as model-based, semi-model-based, or even model-free, depending on how the learning problem is structured. The update equations have no danger of numerical instabilities as neither matrix inversions nor gradient learning rates are required. Our new algorithm demonstrates interesting similarities with previous RL research in the framework of probability matching and provides intuition why the slightly heuristically motivated probability matching approach can actually perform well. Empirical evaluations demonstrate significant performance improvements over gradient-based policy learning and scalability to high-dimensional control problems. Finally, a learning experiment on a simulated 12 degree-of-freedom robot dog illustrates the functionality of our algorithm in a complex robot learning scenario. We believe that Policy Improvement with Path Integrals (PI²) offers currently one of the most efficient, numerically robust, and easy to implement algorithms for RL based on trajectory roll-outs. |
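The gist of the update, cost-weighted averaging of exploration noise with no gradients and no matrix inversions, can be sketched on a toy quadratic cost. This is an illustrative simplification of the PI² idea (a single parameter vector, normalized costs, and a softmax temperature `lam` that are this sketch's assumptions), not the paper's full trajectory-based derivation:

```python
import numpy as np

rng = np.random.default_rng(0)

def pi2_step(theta, cost, n_rollouts=64, noise_std=0.3, lam=0.1):
    """One PI^2-style update: probability-weighted averaging of noisy roll-outs."""
    eps = rng.normal(0.0, noise_std, (n_rollouts, theta.size))  # exploration noise
    costs = np.array([cost(theta + e) for e in eps])
    costs = (costs - costs.min()) / max(np.ptp(costs), 1e-12)   # normalize for stability
    w = np.exp(-costs / lam)
    w /= w.sum()                           # softmax over roll-outs: low cost, high weight
    return theta + w @ eps                 # no gradient learning rate, no matrix inversion

cost = lambda th: float(np.sum(th ** 2))   # stand-in for a trajectory cost
theta = np.full(5, 2.0)
for _ in range(100):
    theta = pi2_step(theta, cost)
```

Each iteration moves the parameters toward the exploration samples that incurred low cost; on this toy problem the cost drops from its starting value toward the exploration-noise floor.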
Active cyber defense with denial and deception: A cyber-wargame experiment | In January 2012, MITRE performed a real-time, red team/blue team cyber-wargame experiment. This presented the opportunity to blend cyber-warfare with traditional mission planning and execution, including denial and deception tradecraft. The cyber-wargame was designed to test a dynamic network defense cyber-security platform being researched in The MITRE Corporation’s Innovation Program called Blackjack, and to investigate the utility of using denial and deception to enhance the defense of information in command and control systems. The Blackjack tool failed to deny the adversary access to real information on the command and control mission system. The adversary had compromised a number of credentials without the computer network defenders’ knowledge, and thereby observed both the real command and control mission system and the fake command and control mission system. However, traditional denial and deception techniques were effective in denying the adversary access to real information on the real command and control mission system, and instead provided the adversary with access to false information on a fake command and control mission system. © 2013 Elsevier Ltd. All rights reserved. |
The Proactive Security Toolkit and Applications | Existing security mechanisms focus on prevention of penetrations, detection of a penetration, and (manual) recovery tools. Indeed, attackers focus their penetration efforts on breaking into critical modules and on avoiding detection of the attack. As a result, security tools and procedures may cause the attackers to lose control over a specific module (computer, account), since the attacker would rather lose control than risk detection of the attack. While controlling the module, the attacker may learn critical secret information or modify the module in ways that make it much easier for the attacker to regain control over that module later. Recent results in cryptography give some hope of improving this situation; they show that many fundamental security tasks can be achieved with proactive security. Proactive security does not assume that any module is completely secure against penetration. Instead, we assume that at any given time period (day, week, ...), a sufficient number of the modules in the system are secure (not penetrated). The results obtained so far include some of the most important cryptographic primitives, such as signatures, secret sharing, and secure communication. However, there was no usable implementation, and several critical issues (for actual use) were not addressed.
In this work we report on a practical toolkit implementing the key proactive security mechanisms. The toolkit provides secure interfaces to make it easy for applications to recover from penetrations. The toolkit also addresses other critical implementation issues, such as the initialization of the proactive secure system.
We describe the toolkit and discuss some of the potential applications. Some applications require minimal enhancements to the existing implementations - e.g. for secure logging (especially for intrusion detection), secure end-to-end communication, and timestamping. Other applications require more significant enhancements, mainly distribution over multiple servers; examples are certification authorities, key recovery, and secure file systems or archives. |