title | abstract |
---|---|
Accurate Resource Prediction for Hybrid IaaS Clouds Using Workload-Tailored Elastic Compute Units | Cloud computing's pay-per-use model greatly reduces upfront cost and also enables on-demand scalability as service demand grows or shrinks. Hybrid clouds are an attractive option in terms of cost benefit; however, without proper elastic resource management, computational resources could be over-provisioned or under-provisioned, resulting in wasted money or failure to satisfy service demand. In this paper, to accomplish accurate performance prediction and cost-optimal resource management for hybrid clouds, we introduce Workload-tailored Elastic Compute Units (WECU) as a measure of computing resources analogous to Amazon EC2's ECUs, but customized for a specific workload. We present a dynamic programming-based scheduling algorithm to select a combination of private and public resources which satisfies a desired throughput. Using a loosely-coupled benchmark, we confirmed WECUs have 24% better runtime prediction ability than ECUs on average. Moreover, simulation results with a real workload distribution of web service requests show that our WECU-based algorithm reduces costs by 8-31% compared to a fixed provisioning approach. |
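To make the scheduling idea concrete, here is a minimal Python sketch of the kind of dynamic program the abstract describes: choose a mix of private and public instances whose combined workload-tailored rating meets a throughput target at minimum cost. The instance names, ratings, prices, and the bounded-private-capacity handling are illustrative assumptions, not the authors' algorithm.

```python
# Minimal DP sketch: pick a min-cost mix of instance types whose combined
# throughput (in workload-tailored units) meets a target. Unbounded-knapsack
# style; instance data below is illustrative, not from the paper.
instance_types = [
    {"name": "private-node", "wecu": 4, "cost": 0.00},  # sunk-cost private capacity
    {"name": "public-small", "wecu": 2, "cost": 0.10},
    {"name": "public-large", "wecu": 9, "cost": 0.40},
]
PRIVATE_LIMIT = 3  # finite private capacity, handled approximately below

def min_cost_mix(target_wecu):
    INF = float("inf")
    # best[t] = (cost, counts) of a cheapest mix reaching throughput >= t
    best = [(INF, None)] * (target_wecu + 1)
    best[0] = (0.0, [0] * len(instance_types))
    for t in range(1, target_wecu + 1):
        for i, it in enumerate(instance_types):
            prev = max(0, t - it["wecu"])
            cost, counts = best[prev]
            if counts is None:
                continue
            if i == 0 and counts[0] >= PRIVATE_LIMIT:
                continue  # private cloud exhausted on this path
            if cost + it["cost"] < best[t][0]:
                new_counts = counts[:]
                new_counts[i] += 1
                best[t] = (cost + it["cost"], new_counts)
    return best[target_wecu]

cost, counts = min_cost_mix(30)
print(f"cost=${cost:.2f}/h", dict(zip([it["name"] for it in instance_types], counts)))
```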
Oxidized cholesterol in the diet accelerates the development of aortic atherosclerosis in cholesterol-fed rabbits. | Oxidized lipoproteins may play a role in atherosclerosis. Recently, we have demonstrated that the levels of oxidized fatty acids in the circulation correlate directly with the quantity of oxidized fatty acids in the diet and that dietary oxidized fatty acids accelerate atherosclerosis in rabbits. The present study tests the hypothesis that oxidized cholesterol in the diet accelerates the development of atherosclerosis. Rabbits were fed a diet containing 0.33% nonoxidized cholesterol (control diet) or the same diet containing 0.33% cholesterol of which 5% was oxidized (oxidized diet). Serum cholesterol levels increased to a similar extent in both groups, with the majority of cholesterol in the beta-VLDL fraction. Moreover, in the serum beta-VLDL fraction and liver, there was a significant increase in the oxidized cholesterol levels. Most importantly, feeding a diet enriched in oxidized cholesterol resulted in a 100% increase in fatty streak lesions in the aorta. Western diets contain high concentrations of oxidized cholesterol products, and our results suggest that these foods may be a risk factor for atherosclerosis. |
Consumer Search Behavior in Online Shopping Environments | This paper explores search behavior of online shoppers. Information economics literature suggests that search cost in electronic markets has essentially been reduced to zero as consumers are able to use powerful search tools free of charge to easily find and compare product and shopping information on the Internet. In the present research, however, we present a research model proposing that users need to spend time and effort when completing search tasks resulting in significant search cost and a trade-off between search cost and search performance. Preliminary findings from an Internet experiment indicate that search task complexity, search engine capability, search strategy and user experience are important factors determining search cost and performance. |
A TEST OF THE CRITICAL PERIOD HYPOTHESIS FOR LANGUAGE LEARNING | A critical period for language learning is often defined as a sharp decline in learning outcomes with age. This study examines the relevance of the critical period for English speaking proficiency among immigrants in the US. It uses microdata from the 2000 US Census, a model of language acquisition, and a flexible specification of an estimating equation based on 64 age-at-migration dichotomous variables. Self-reported English speaking proficiency among immigrants declines more-or-less monotonically with age at migration, and this relationship is not characterized by any sharp decline or discontinuity that might be considered consistent with a “critical” period. The findings are robust across the various immigrant samples, and between the genders. |
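The "flexible specification" amounts to regressing proficiency on a full set of age-at-migration indicators and inspecting the coefficient profile for a sharp break. A minimal sketch on synthetic data (not Census microdata) illustrates the idea:

```python
# Sketch of a flexible dummy-variable specification: regress proficiency on
# one indicator per age at migration and eyeball the coefficient profile for
# a sharp drop. Data below is synthetic, not Census microdata.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
age_at_migration = rng.integers(0, 64, size=n)
# synthetic truth: smooth monotone decline, no discontinuity
proficiency = 4.0 - 0.03 * age_at_migration + rng.normal(0, 0.5, size=n)

X = np.zeros((n, 64))
X[np.arange(n), age_at_migration] = 1.0           # 64 dichotomous variables
beta, *_ = np.linalg.lstsq(X, proficiency, rcond=None)

jumps = np.diff(beta)
print("largest single-year drop:", jumps.min())   # no sharp break in this toy data
```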
Single Image Dehazing Using Color Attenuation Prior | In this paper, we propose a simple but powerful prior, color attenuation prior, for haze removal from a single input hazy image. By creating a linear model for modelling the scene depth of the hazy image under this novel prior and learning the parameters of the model by using a supervised learning method, the depth information can be well recovered. With the depth map of the hazy image, we can easily remove haze from a single image. Experimental results show that the proposed approach is highly efficient and it outperforms state-of-the-art haze removal algorithms in terms of the dehazing effect as well. |
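The prior reduces to a linear depth model plus the standard atmospheric scattering model. The sketch below, with placeholder coefficients rather than the paper's learned parameters, shows how a depth map would drive haze removal:

```python
# Sketch of the color attenuation prior: depth is modeled as a linear function
# of brightness (v) and saturation (s); haze is then removed with the standard
# atmospheric scattering model. Coefficients here are illustrative placeholders,
# not the learned values from the paper.
import numpy as np

def dehaze(img, theta=(0.12, 0.96, -0.78), beta=1.0, t_min=0.1):
    img = img.astype(np.float64) / 255.0            # HxWx3 RGB in [0, 1]
    v = img.max(axis=2)                             # HSV value channel
    s = np.where(v > 0, (v - img.min(axis=2)) / np.maximum(v, 1e-6), 0.0)
    depth = theta[0] + theta[1] * v + theta[2] * s  # linear depth model
    # atmospheric light: mean color of the most distant (deepest) pixels
    idx = np.unravel_index(np.argsort(depth, axis=None)[-10:], depth.shape)
    A = img[idx].mean(axis=0)
    t = np.clip(np.exp(-beta * depth), t_min, 1.0)  # transmission map
    J = (img - A) / t[..., None] + A                # scene radiance recovery
    return np.clip(J, 0, 1)
```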
Long range communications in urban and rural environments | Low-Power Wide Area Networks (LPWANs) are prominent technologies in the field of the Internet of Things (IoT). Due to their long range capabilities and low energy consumption, LPWANs are ideal technologies for sending small amounts of data occasionally. With their unique characteristics, LPWANs can be used in many applications and in different environments such as urban, rural and even indoor. The work in this paper presents a study on the LPWAN LoRa technology, by testing and evaluating its range, signal quality properties and its performance in delivering data. Three distinct scenarios are proposed and tested, one rural and two urban scenarios in Portugal. The maximum communication range achieved was 5660 meters for the rural scenario and around 2000 meters for the urban ones, and we concluded that there are three essential scenario characteristics that influence the achieved results: distance, elevation difference and obstacles in the signal path. |
Freestyle-Like V-Y Flaps of the Eyebrow: A New Outlook and Indication of an Historical Technique | The eyebrow region is of utmost importance for facial movement, symmetry, and the overall cosmetic appearance of the face. Trauma or tumor resection often leaves scars that may dislocate the eyebrow, producing an alteration both in the static symmetry of the face and in its dynamic expressivity. The authors present a technique for repairing eyebrow defects using advancement of the remaining eyebrow by means of a "freestyle-like" V-Y flap. In the past two years, a total of eight consecutive patients underwent excision of skin lesions in the superciliary region and immediate reconstruction with this technique. On histology, six patients were affected by basal cell carcinoma, one by squamous cell carcinoma, and one by a congenital intradermal melanocytic nevus. The pedicle of the flap included perforators from the supratrochlear, supraorbital, or superficial temporal artery. Advancement of the entire aesthetic subunit that includes the eyebrow using a V-Y perforator flap was performed successfully in all cases, achieving full, tension-free closure of defects up to 3.0 cm. "Freestyle-like" V-Y flaps should be considered as a first-line choice for partial defects of the eyebrow. The greater mobility compared with random subcutaneous flaps allows reconstruction of large defects, providing an excellent cosmetic result. |
Environmental Impacts of Distributed Manufacturing from 3-D Printing of Polymer Components and Products | Although additive layer manufacturing is well established for rapid prototyping, the low throughput and historic costs have prevented mass-scale adoption. The recent development of the RepRap, an open-source self-replicating rapid prototyper, has made low-cost 3-D printers readily available to the public at reasonable prices (<$1,000). The RepRap (Prusa Mendel variant) currently prints 3-D objects in a 200×200×140 mm build envelope from acrylonitrile butadiene styrene (ABS) and polylactic acid (PLA). ABS and PLA are both thermoplastics that can be injection-molded, each with their own benefits, as ABS is rigid and durable, while PLA is plant-based and can be recycled and composted. The melting temperatures of ABS and PLA enable use in low-cost 3-D printers, as these temperatures are low enough for melt extrusion in the home, while high enough for prints to retain their shape at average use temperatures. Using 3-D printers to manufacture provides the ability to both change the fill composition by printing voids and fabricate shapes that are impossible to make using traditional methods like injection molding. This allows more complicated shapes to be created while using less material, which could reduce environmental impact. As open-source 3-D printers continue to evolve and improve in both cost and performance, the potential for economically viable distributed manufacturing of products increases. Thus, products and components could be customized and printed on-site by individual consumers as needed, reversing the historical trend toward centrally mass-manufactured and shipped products. Distributed manufacturing reduces the embodied transportation energy of conventional centralized manufacturing and distribution, but questions remain concerning the potential for increases in the overall embodied energy of manufacturing due to the reduction in scale. In order to quantify the environmental impact of distributed manufacturing using 3-D printers, a life cycle analysis was performed on a plastic juicer. The energy consumed and emissions produced from conventional large-scale production overseas are compared to experimental measurements on a RepRap producing identical products with ABS and PLA. The results of this LCA are discussed in relation to the environmental impact of distributed manufacturing with 3-D printers and polymer selection for 3-D printing to reduce this impact. The results of this study show that distributed manufacturing uses less energy than conventional manufacturing due to the RepRap's unique ability to reduce fill composition. Distributed manufacturing also has lower emissions than conventional manufacturing when using PLA, and when using ABS with solar photovoltaic power. The results of this study indicate that open-source additive layer distributed manufacturing is both technically viable and beneficial from an ecological perspective. Mater. Res. Soc. Symp. Proc. Vol. 1492 © 2013 Materials Research Society, DOI: 10.1557/opl.2013.319 |
Automatic Inference of Search Patterns for Taint-Style Vulnerabilities | Taint-style vulnerabilities are a persistent problem in software development, as the recently discovered "Heartbleed" vulnerability strikingly illustrates. In this class of vulnerabilities, attacker-controlled data is passed unsanitized from an input source to a sensitive sink. While simple instances of this vulnerability class can be detected automatically, more subtle defects involving data flow across several functions or project-specific APIs are mainly discovered by manual auditing. Different techniques have been proposed to accelerate this process by searching for typical patterns of vulnerable code. However, all of these approaches require a security expert to manually model and specify appropriate patterns in practice. In this paper, we propose a method for automatically inferring search patterns for taint-style vulnerabilities in C code. Given a security-sensitive sink, such as a memory function, our method automatically identifies corresponding source-sink systems and constructs patterns that model the data flow and sanitization in these systems. The inferred patterns are expressed as traversals in a code property graph and enable efficiently searching for unsanitized data flows -- across several functions as well as with project-specific APIs. We demonstrate the efficacy of this approach in different experiments with 5 open-source projects. The inferred search patterns reduce the amount of code to inspect for finding known vulnerabilities by 94.9% and also enable us to uncover 8 previously unknown vulnerabilities. |
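As a toy illustration of the underlying search, the following snippet finds unsanitized source-to-sink paths on a hand-built data-flow graph; real code property graphs and the inferred traversals are far richer, and the node names here are invented. It uses networkx only to make the traversal concrete:

```python
# Toy illustration of the search-pattern idea: on a small data-flow graph,
# report paths from an attacker-controlled source to a sensitive sink that
# do not pass through a sanitizer node.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("recv_buf", "parse_len"), ("parse_len", "memcpy_size"),        # unsanitized flow
    ("recv_buf", "check_bounds"), ("check_bounds", "memcpy_size"),  # sanitized flow
])
SANITIZERS = {"check_bounds"}

for path in nx.all_simple_paths(g, "recv_buf", "memcpy_size"):
    if not SANITIZERS & set(path[1:-1]):
        print("unsanitized source->sink flow:", " -> ".join(path))
```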
Standard Function Blocks for Flexible IED in IEC 61850-Based Substation Automation | Flexible intelligent electronic devices (IEDs) are highly desirable to support free allocation of function to IED by means of software reconfiguration without any change of hardware. The application of generic hardware platforms and component-based software technology seems to be a good solution. Due to the advent of IEC 61850, generic hardware platforms with a standard communication interface can be used to implement different kinds of functions with high flexibility. The remaining challenge is the unified function model that specifies various software components with appropriate granularity and provides a framework to integrate them efficiently. This paper proposes the function-block (FB)-based function model for flexible IEDs. The standard FBs are established by combining the IEC 61850 model and the IEC 61499 model. The design of a simplified distance protection IED using standard FBs is described and investigated. The testing results of the prototype system in MATLAB/Simulink demonstrate the feasibility and flexibility of FB-based IEDs. |
Improved OCR based automatic vehicle number plate recognition using features trained neural network | Research and development of algorithms in intelligent transportation has attracted increasing attention in recent years. An automated, fast, accurate and robust vehicle plate recognition system has become a necessity for traffic control and law enforcement of traffic regulations, and the solution is ANPR. This paper focuses on an improved technique for OCR-based license plate recognition using a neural network trained on a dataset of object features. A blended algorithm for license plate recognition is proposed and compared with existing methods for improved accuracy. The whole system can be categorized under three major modules, namely License Plate Localization, Plate Character Segmentation, and Plate Character Recognition. The system was simulated on 300 national and international motor vehicle LP images, and the results obtained satisfy the main requirements. |
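The three-module pipeline can be sketched as follows; the OpenCV thresholds, the aspect-ratio heuristic, and the placeholder recognizer are illustrative assumptions rather than the paper's trained system:

```python
# Skeleton of the three modules named in the abstract: localization,
# character segmentation, recognition. Thresholds and heuristics are
# illustrative, not the paper's.
import cv2

def localize_plate(bgr):
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        x, y, w, h = cv2.boundingRect(c)
        if 2.0 < w / max(h, 1) < 6.0:             # plate-like aspect ratio
            return gray[y:y + h, x:x + w]
    return None

def segment_characters(plate):
    _, binary = cv2.threshold(plate, 0, 255, cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = sorted(cv2.boundingRect(c) for c in contours)   # left-to-right by x
    return [binary[y:y + h, x:x + w] for x, y, w, h in boxes if h > plate.shape[0] // 3]

def recognize(chars, classify):
    # classify: a trained recognizer mapping a 16x16 glyph to a character
    # (the paper trains a neural network on object features)
    return "".join(classify(cv2.resize(c, (16, 16))) for c in chars)
```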
The MOS 36-Item Short-Form Health Survey (SF-36): II. Psychometric and clinical tests of validity in measuring physical and mental health constructs. | Cross-sectional data from the Medical Outcomes Study (MOS) were analyzed to test the validity of the MOS 36-Item Short-Form Health Survey (SF-36) scales as measures of physical and mental health constructs. Results from traditional psychometric and clinical tests of validity were compared. Principal components analysis was used to test for hypothesized physical and mental health dimensions. For purposes of clinical tests of validity, clinical criteria defined mutually exclusive adult patient groups differing in severity of medical and psychiatric conditions. Scales shown in the components analysis to primarily measure physical health (physical functioning and role limitations-physical) best distinguished groups differing in severity of chronic medical condition and had the most pure physical health interpretation. Scales shown to primarily measure mental health (mental health and role limitations-emotional) best distinguished groups differing in the presence and severity of psychiatric disorders and had the most pure mental health interpretation. The social functioning, vitality, and general health perceptions scales measured both physical and mental health components and, thus, had the most complex interpretation. These results are useful in establishing guidelines for the interpretation of each scale and in documenting the size of differences between clinical groups that should be considered very large. |
Paradox of richness: a cognitive model of media choice | Researchers have long studied the effects of social presence and media richness on media choice and the effects of media use. This focus on social presence and social psychological theories has led to valuable research on communication. However, little research (either empirical or theoretical) has been done to understand the ways in which media choices influence the cognitive processes that underlie communication. In this paper, we present a cognitive-based view of media choice and media use, based on dual process theories of cognition, which argue that in order for individuals to systematically process messages, they must be motivated to process the message and have the ability to process it. We argue that the use of rich media high in social presence induces increased motivation but decreases the ability to process information, while the use of lean media low in social presence induces decreased motivation but increases the ability to process information. The paradox of richness lies in its duality of impact: from a cognitive perspective, rich media high in social presence simultaneously acts to both improve and impair performance. |
Prospective Randomized Long-Term Study on the Efficacy and Safety of UV-Free Blue Light for Treating Mild Psoriasis Vulgaris. | BACKGROUND
Blue light irradiation reduces the proliferation of keratinocytes and modulates T-cell immune response in vitro and has been shown to reduce the severity of psoriasis vulgaris (Pv) in two clinical trials.
OBJECTIVE
Evaluation of safety and efficacy of long-term UV-free blue light treatment at home for mild Pv.
METHODS
Forty-seven patients with mild Pv were randomized for receiving high-intensity blue light treatment (HI: 453 nm LED, 200 mW/cm(2), n = 24) and low-intensity treatment (LI: 453 nm LED, 100 mW/cm(2), n = 23) of one Pv plaque for 12 weeks. A contralateral control plaque remained untreated.
RESULTS
Patient compliance and satisfaction were high. The primary endpoint, change from baseline (CfB) of the Local Psoriasis Severity Index, revealed a significant improvement of the target compared to the control plaques (ΔCfB for the HI group: -0.92 ± 1.10, p = 0.0005; for the LI group: -0.74 ± 1.18, p = 0.0064).
CONCLUSION
UV-free blue light home treatment is safe and improves Pv plaques. |
Android based malware detection using a multifeature collaborative decision fusion approach | Smart mobile device usage has expanded at a very high rate all over the world. Since mobile devices nowadays are used for a wide variety of application areas like personal communication, data storage and entertainment, security threats emerge comparable to those to which a conventional PC is exposed. Mobile malware has been growing in scale and complexity as smartphone usage continues to rise. Android has surpassed other mobile platforms as the most popular, whilst also witnessing a dramatic increase in malware targeting the platform. In this work, we consider Android based malware for analysis, and a scalable detection mechanism is designed using multifeature collaborative decision fusion (MCDF). Different features of a malicious file, like permission-based features and API-call-based features, are considered in order to provide better detection by training an ensemble of classifiers and combining their decisions using a collaborative approach based on probability theory. The performance of the proposed model is evaluated on a collection of Android based malware comprising different malware families, and the results show that our approach gives better performance than state-of-the-art ensemble schemes. |
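A minimal sketch of probability-based decision fusion over two feature views (permissions and API calls) is shown below; the classifier choice, synthetic data, and product-rule combiner are simplifications of the paper's MCDF scheme:

```python
# Sketch of decision fusion over two feature views: train one classifier per
# view, then combine posteriors with the product rule. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 400
X_perm = rng.random((n, 20))                         # permission-based features
X_api = rng.random((n, 30))                          # API-call-based features
y = (X_perm[:, 0] + X_api[:, 0] > 1.0).astype(int)   # synthetic labels

clf_perm = RandomForestClassifier(random_state=0).fit(X_perm[:300], y[:300])
clf_api = RandomForestClassifier(random_state=0).fit(X_api[:300], y[:300])

# product rule: multiply per-class posteriors, take argmax
p = clf_perm.predict_proba(X_perm[300:]) * clf_api.predict_proba(X_api[300:])
fused = p.argmax(axis=1)
print("fused accuracy:", (fused == y[300:]).mean())
```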
Predicting the Next Location: A Recurrent Model with Spatial and Temporal Contexts | Spatial and temporal contextual information plays a key role in analyzing user behaviors and is helpful for predicting where a user will go next. With the growing ability to collect information, more and more temporal and spatial contextual information is collected in systems, and the location prediction problem becomes crucial and feasible. Some works have been proposed to address this problem, but they all have their limitations. Factorizing Personalized Markov Chain (FPMC) is constructed based on a strong independence assumption among different factors, which limits its performance. Tensor Factorization (TF) faces the cold-start problem in predicting future actions. The Recurrent Neural Network (RNN) model shows promising performance compared with FPMC and TF, but all these methods have problems in modeling continuous time intervals and geographical distances. In this paper, we extend RNN and propose a novel method called Spatial Temporal Recurrent Neural Networks (ST-RNN). ST-RNN can model local temporal and spatial contexts in each layer with time-specific transition matrices for different time intervals and distance-specific transition matrices for different geographical distances. Experimental results show that the proposed ST-RNN model yields significant improvements over the competitive compared methods on two typical datasets, i.e., the Global Terrorism Database (GTD) and the Gowalla dataset. |
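The core mechanism — transition matrices selected by discretized time interval and geographic distance — can be sketched as a single recurrent step; dimensions, bucket edges, and the one-matrix-per-bucket simplification are illustrative (the paper interpolates between bucket matrices):

```python
# Sketch of an ST-RNN-style step: the input and recurrent matrices are chosen
# by bucketed time interval and distance. Sizes and buckets are illustrative.
import numpy as np

H, D = 16, 8                                 # hidden size, item-embedding size
T_BUCKETS, S_BUCKETS = 4, 4
rng = np.random.default_rng(0)
W_time = rng.normal(0, 0.1, (T_BUCKETS, H, D))   # time-specific input matrices
W_dist = rng.normal(0, 0.1, (S_BUCKETS, H, H))   # distance-specific recurrent matrices

def bucket(x, edges):
    return int(np.searchsorted(edges, x))

def st_rnn_step(h_prev, x_item, dt_hours, dd_km):
    tb = bucket(dt_hours, [1, 6, 24])        # <1h, 1-6h, 6-24h, >24h
    sb = bucket(dd_km, [1, 5, 20])           # <1km, 1-5km, 5-20km, >20km
    return np.tanh(W_time[tb] @ x_item + W_dist[sb] @ h_prev)

h = np.zeros(H)
for item, dt, dd in [(rng.normal(size=D), 2.0, 3.5), (rng.normal(size=D), 30.0, 12.0)]:
    h = st_rnn_step(h, item, dt, dd)
print("hidden state after 2 check-ins:", h[:4])
```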
HARRISON: A Benchmark on HAshtag Recommendation for Real-world Images in Social Networks | Simple, short, and compact hashtags cover a wide range of information on social networks. Although many works in the field of natural language processing (NLP) have demonstrated the importance of hashtag recommendation, hashtag recommendation for images has barely been studied. In this paper, we introduce the HARRISON dataset, a benchmark on hashtag recommendation for real world images in social networks. The HARRISON dataset is a realistic dataset, composed of 57,383 photos from Instagram and an average of 4.5 associated hashtags for each photo. To evaluate our dataset, we design a baseline framework consisting of visual feature extractor based on convolutional neural network (CNN) and multi-label classifier based on neural network. Based on this framework, two single feature-based models, object-based and scene-based model, and an integrated model of them are evaluated on the HARRISON dataset. Our dataset shows that hashtag recommendation task requires a wide and contextual understanding of the situation conveyed in the image. As far as we know, this work is the first vision-only attempt at hashtag recommendation for real world images in social networks. We expect this benchmark to accelerate the advancement of hashtag recommendation. |
FDTD simulation of TE and TM plane waves at nonzero incidence in arbitrary Layered media | Plane wave scattering is an important class of electromagnetic problems that is surprisingly difficult to model with the two-dimensional finite-difference time-domain (FDTD) method if the direction of propagation is not parallel to one of the grid axes. In particular, infinite plane wave interaction with dispersive half-spaces or layers must include careful modeling of the incident field. By using the plane wave solutions of Maxwell's equations to eliminate the transverse field dependence, a modified set of curl equations is derived which can model a "slice" of an oblique plane wave along grid axes. The resulting equations may be used as edge conditions on an FDTD grid. These edge conditions represent the only known way to accurately propagate plane wave pulses into a frequency dependent medium. An examination of grid dispersion between the plane wave and the modeled slice reveals good agreement. Application to arbitrary dispersive media is straightforward for the transverse magnetic (TM) case, but requires the use of an auxiliary equation for the transverse electric case, which increases complexity. In the latter case, a simplified approach, based on formulating the dual of the TM equations, is shown to be quite effective. The strength of the developed approach is illustrated with a comparison with the conventional simulation based on an analytic incident wave specification with half-space, single frequency reflection and transmission for the edges. Finally, an example of a possible biomedical application is given and the implementation of the method in the perfectly matched layer region is discussed. |
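For orientation, the standard 1D Yee update that the paper's modified curl equations generalize looks like the sketch below; the oblique-incidence and dispersive-media machinery that is the paper's actual contribution is omitted:

```python
# Standard 1D FDTD (Yee) update for reference; the paper modifies the curl
# equations to handle oblique plane-wave incidence, which this sketch omits.
import numpy as np

c0, dz = 3e8, 1e-3
dt = dz / (2 * c0)                     # Courant-stable time step
N, steps = 400, 800
Ex, Hy = np.zeros(N), np.zeros(N)

for n in range(steps):
    Hy[:-1] += (dt / (4e-7 * np.pi * dz)) * (Ex[1:] - Ex[:-1])  # H from curl E
    Ex[1:] += (dt / (8.854e-12 * dz)) * (Hy[1:] - Hy[:-1])      # E from curl H
    Ex[50] += np.exp(-((n - 60) / 15.0) ** 2)                   # soft Gaussian source
```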
Psychopharmacological treatment of social phobia: a double-blind placebo-controlled study with fluvoxamine | Previous studies have shown selective and non-selective monoamine oxidase inhibitors (MAOIs) to be effective in the treatment of social phobia. In this study we investigated the efficacy of selective serotonin reuptake inhibitors (SSRIs) in social phobia. Thirty patients with social phobia (DSM-III-R) were treated with the SSRI fluvoxamine (150 mg daily) using a 12-week double-blind placebo-controlled design. A substantial improvement was observed in seven (46%) patients on fluvoxamine and in one (7%) on placebo. Statistically significant effects were seen on measures of social anxiety and general (or anticipatory) anxiety in patients treated with fluvoxamine compared with placebo. The level of phobic avoidance also decreased, but the difference at endpoint between fluvoxamine and placebo failed to reach statistical significance. It is concluded that treatment with the SSRI fluvoxamine has beneficial effects in patients suffering from social phobia, suggesting that serotonergic mechanisms might be implicated in social anxiety. |
Flexible stabilization of the distal tibiofibular syndesmosis: clinical and biomechanical considerations: a review of the literature | Syndesmotic rupture is present in 10 % of ankle fractures and must be recognized and treated to prevent late complications. The method of fixation is classically rigid fixation with one or two screws. Knowledge of the biomechanics of the syndesmosis has led to the development of new dynamic implants to restore physiologic motion during walking. One of these implants is the suture-button system. The purpose of this paper is to review the orthopaedic trauma literature, both biomechanical and clinical, to present the current state of knowledge on the suture-button fixation and to put emphasis on the advantages and disadvantages of this technique. Two investigators searched the databases of Pubmed/Medline, Cochrane Clinical Trial Register and Embase independently. The search interval was from January 1980 to March 2011. The search keys comprised terms to identify articles on biomechanical and clinical issues of flexible fixation of syndesmotic ruptures. Ninety-nine publications met the search criteria. After filtering using the exclusion criteria, 11 articles (five biomechanical and six clinical) were available for review. The biomechanical studies involved 90 cadaveric ankles. The suture-button demonstrated good resistance to axial and rotational loads (equivalent to screws) and resistance to failure. Physiologic motion of the syndesmosis was restored in all directions. The clinical studies (149 ankles) demonstrated good functional results using the AOFAS score, indicating faster rehabilitation with flexible fixation than with screws. There were few complications. Preliminary results from the current literature support the use of suture-button fixation for syndesmotic ruptures. This method seems secure and safe. As there is no strong evidence for its use, prospective randomized controlled trials to compare the suture-button to the screw fixation for ankle syndesmotic ruptures are required. |
Planning Paths for Package Delivery in Heterogeneous Multirobot Teams | This paper addresses the task scheduling and path planning problem for a team of cooperating vehicles performing autonomous deliveries in urban environments. The cooperating team comprises two vehicles with complementary capabilities, a truck restricted to travel along a street network, and a quadrotor micro-aerial vehicle of capacity one that can be deployed from the truck to perform deliveries. The problem is formulated as an optimal path planning problem on a graph and the goal is to find the shortest cooperative route enabling the quadrotor to deliver items at all requested locations. The problem is shown to be NP-hard. A solution is then proposed using a novel reduction to the Generalized Traveling Salesman Problem, for which well-established heuristic solvers exist. The heterogeneous delivery problem contains as a special case the problem of scheduling deliveries from multiple static warehouses. We propose two additional algorithms, based on enumeration and a reduction to the traveling salesman problem, for this special case. Simulation results compare the performance of the presented algorithms and demonstrate examples of delivery route computations over real urban street maps. |
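The multiple-static-warehouse special case decomposes neatly because a capacity-one quadrotor makes one round trip per delivery; a toy sketch with invented coordinates makes this concrete (the truck variant requires the GTSP reduction instead):

```python
# Toy version of the multiple-static-warehouse special case: with capacity
# one, each delivery is a round trip, so it is optimally served from its
# nearest warehouse. Coordinates are illustrative.
import math

warehouses = [(0, 0), (10, 0)]
deliveries = [(2, 3), (7, 1), (9, 4), (4, -2)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

total = sum(2 * min(dist(w, d) for w in warehouses) for d in deliveries)
print(f"total flight distance: {total:.2f}")
```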
1. Lexicalized meaning and the internal temporal structure of events | Most current studies of aspect assume the existence of the four Vendler classes: states, activities, achievements and accomplishments. Despite the fact that other classifications have been offered (for example, those in Mourelatos 1978, Bach 1981, and Carlson 1981), none has achieved the status of the Vendler classification. Often, linguists take these classes to be a linguistic fact, and then attempt to come up with theories which explain their existence and their properties, usually by offering basic elements of meaning and modes of composition that together produce just these four aspectual classes. |
A new image thresholding method based on Gaussian mixture model | Abstract: In this paper, an efficient approach to search for the global threshold of image using Gaussian mixture model is proposed. Firstly, a gray-level histogram of an image is represented as a function of the frequencies of gray-level. Then,to fit the Gaussian mixtures to the histogram of image, the Expectation Maximization (EM) algorithm is developed to estimate the number of Gaussian mixture of such histograms and their corresponding parameterization. Finally, the optimal threshold which is the average of these Gaussian mixture means is chosen. And the experimental results show that the new algorithm performs better. In this paper, an efficient approach to search for the global threshold of image using Gaussian mixture model is proposed. Firstly, a gray-level histogram of an image is represented as a function of the frequencies of gray-level. Then,to fit the Gaussian mixtures to the histogram of image, the Expectation Maximization (EM) algorithm is developed to estimate the number of Gaussian mixture of such histograms and their corresponding parameterization. Finally, the optimal threshold which is the average of these Gaussian mixture means is chosen. And the experimental results show that the new algorithm performs better. |
6-axis force-torque sensor chip composed of 16 piezoresistive beams | This paper reports the design, a part of the fabrication process, especially ion doping method by oblique ion implantation, and experimental results of a 6-axis force-torque sensor chip composed of 16 piezoresistive beams. The sensor chip's area, 2mm square in size, is one-third of that of the minimum 6-axis sensor chip ever reported. It will enhance the mounting density of sensor array and also reduce the cost in case of volume production. These sensor chips were fabricated by MNOIC, 8-inch MEMS foundry in Japan, and their characteristic variations are enough small to make practical use of them. |
Approximate dissections | A geometric dissection is a set of pieces which can be assembled in different ways to form distinct shapes. Dissections are used as recreational puzzles because it is striking when a single set of pieces can construct highly different forms. Existing techniques for creating dissections find pieces that reconstruct two input shapes exactly. Unfortunately, these methods only support simple, abstract shapes because an excessive number of pieces may be needed to reconstruct more complex, naturalistic shapes. We introduce a dissection design technique that supports such shapes by requiring that the pieces reconstruct the shapes only approximately. We find that, in most cases, a small number of pieces suffices to tightly approximate the input shapes. We frame the search for a viable dissection as a combinatorial optimization problem, where the goal is to search for the best approximation to the input shapes using a given number of pieces. We find a lower bound on the tightness of the approximation for a partial dissection solution, which allows us to prune the search space and makes the problem tractable. We demonstrate our approach on several challenging examples, showing that it can create dissections between shapes of significantly greater complexity than those supported by previous techniques. |
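The search strategy is a branch-and-bound over partial dissections; the generic skeleton below shows the pruning logic, while the dissection-specific moves and the paper's lower bound are abstracted behind callbacks:

```python
# Generic skeleton of the pruned combinatorial search the abstract describes:
# expand partial solutions best-first, and discard any branch whose lower
# bound on approximation error already exceeds the best complete solution.
import heapq
import itertools

def branch_and_bound(root, expand, lower_bound, error, is_complete):
    best, best_err = None, float("inf")
    tie = itertools.count()                 # avoids comparing nodes on heap ties
    frontier = [(lower_bound(root), next(tie), root)]
    while frontier:
        lb, _, node = heapq.heappop(frontier)
        if lb >= best_err:
            continue                        # prune: bound cannot beat incumbent
        if is_complete(node):
            err = error(node)
            if err < best_err:
                best, best_err = node, err
            continue
        for child in expand(node):
            clb = lower_bound(child)
            if clb < best_err:
                heapq.heappush(frontier, (clb, next(tie), child))
    return best, best_err
```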
Mechanisms and control of pathologic bone loss in periodontitis. | The periodontal diseases range from the relatively benign form of periodontal disease known as gingivitis, to chronic and aggressive forms of periodontal disease, all of which not only threaten the dentition but may also be a threat to general health (72). All forms of inflammatory periodontal disease are associated with chronic inflammation, resulting in destruction of the periodontal ligament and bone. If left untreated, significant tissue damage occurs; the affected teeth can become loose and may be lost if the disease continues to be active. The periodontal diseases are very prevalent, with up to 90% of the adult population suffering from gingivitis, 60% having chronic periodontitis and 5–15% having aggressive periodontitis (72). Histologically and biochemically the periodontal lesions of patients with chronic and aggressive periodontitis appear to be similar. While there may be some differences in the cellular infiltrate between these two diseases (discussed elsewhere in this edition of Periodontology 2000), the molecular mediators and pathologic processes are generally the same. The only differences between chronic periodontitis and aggressive periodontitis with regard to tissue destruction appear to be perhaps the magnitude, sequelae and control of the response. Otherwise the mechanisms are remarkably similar. While a great deal of focus has been on managing the inflammation in the gingival tissues, advances in our understanding of bone metabolism are opening up new avenues of understanding regarding the pathologic bone loss in periodontitis. This knowledge, together with the development of novel drugs that can inhibit bone loss/destruction, provides us with opportunities to target not only soft tissue inflammation but also the destructive bone loss seen in periodontitis. This review aims to demonstrate that, in the future, periodontal treatments will not only target the inflammatory response but will also utilize adjunct agents to prevent alveolar bone loss associated with progressing periodontitis. |
Research on the Construction and Filter Method of Stop-word List in Text Preprocessing | In text preprocessing for text mining, a stop-word list is constructed to filter the segmentation results of text documents so that the dimensionality of the text feature space can be substantially reduced. This paper summarizes the definition, extraction principles, and methods of stop-words, and constructs a customized Chinese-English stop-word list from classical stop-word lists, accounting for differences in the domains of text documents. Three different filter algorithms were designed and implemented in the stop-word filtering process, and their efficiency was compared. The experiments indicated that the hash-filter method was the fastest. |
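The reported speed advantage of the hash filter follows from O(1) hashed lookups versus O(n) list scans; a small Python timing sketch with toy word lists reproduces the comparison:

```python
# Reproduce the core comparison: filtering tokens against a stop-word list
# stored as a Python list (linear scan) versus a set (hash lookup).
import timeit

stopwords_list = [f"w{i}" for i in range(2000)]
stopwords_set = set(stopwords_list)
tokens = ["w1500", "alpha", "w3", "beta"] * 2500

filter_list = lambda: [t for t in tokens if t not in stopwords_list]
filter_hash = lambda: [t for t in tokens if t not in stopwords_set]

print("list scan:", timeit.timeit(filter_list, number=3))
print("hash set :", timeit.timeit(filter_hash, number=3))
```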
Analog Layout Generator for CMOS Circuits | In this paper, we present a new layout level automation tool for analog CMOS circuits, namely, analog layout generator (ALG). ALG is capable of generating individual or matched components as well as placement and routing. ALG takes performance considerations into account, optimizing the layout in each step. A distinguishing feature of the tool is that it provides a spectrum of generation possibilities ranging from full custom to automatic generation. ALG is not only designed to work as a standalone tool but is also implemented to be the final step of an analog automation flow. The flow supports circuit level specification in addition to layout level user specifications, so that it can be integrated into an analog automation system. Another feature of ALG is its interaction with a layout adviser tool, namely, YASA. YASA performs sensitivity simulations using a SPICE-like simulator, providing sensitivities of performance parameters with respect to circuit parameters. |
Less can be more: Targeted embolization of aneurysms associated with arteriovenous malformations unsuitable for surgical resection. | INTRODUCTION
To mitigate risks of hemorrhage, high-risk features of brain arteriovenous malformations (BAVMs) can be targeted to reduce the risk of rupture. Previous investigation has examined embolization of a pedicle supplying a high-risk feature; this study examines embolization targeted specifically at aneurysms associated with BAVMs.
MATERIALS AND METHODS
Patients with BAVMs treated at two high-volume neurointerventional services were retrospectively reviewed. Patients treated with intention to occlude only the associated aneurysm itself were analyzed. Demographic and lesion characteristics were identified, as were technical and clinical outcomes. Adverse events were defined as hemorrhage, new seizure, and death.
RESULTS
Thirty-two patients met inclusion criteria out of 1103 patients treated during the study period. Twenty-seven (84.4%) BAVMs were acutely ruptured, all with the aneurysm identified as the hemorrhage source. Twenty-four (75.0%) lesions involved eloquent territory. There were equal numbers of feeding artery and nidus aneurysms. Follow-up data were available for a total of 101.3 patient-years for a mean follow-up time of 2.9 years. One patient died; the remaining 31 patients had improved functional status at last contact. Annualized rate of hemorrhage after treatment was 1.0%; rate of adverse events after treatment was 3.0%. Excluding time after confirmed occlusion following radiosurgery, annualized rates were 1.4% and 4.8%, respectively.
CONCLUSION
In inoperable BAVMs, targeted embolization of associated aneurysms can be performed safely and effectively. This should be considered in high-risk lesions prior to radiosurgery or in cases when no other treatment options are available. Such intervention warrants further investigation. |
Developing Targeted Health Service Interventions Using the PRECEDE-PROCEED Model: Two Australian Case Studies | Aims and Objectives. This paper provides an overview of the applicability of the PRECEDE-PROCEED Model to the development of targeted nursing led chronic illness interventions. Background. Changing health care practice is a complex and dynamic process that requires consideration of social, political, economic, and organisational factors. An understanding of the characteristics of the target population, health professionals, and organizations plus identification of the determinants for change are also required. Synthesizing this data to guide the development of an effective intervention is a challenging process. The PRECEDE-PROCEED Model has been used in global health care settings to guide the identification, planning, implementation, and evaluation of various health improvement initiatives. Design. Using a reflective case study approach, this paper examines the applicability of the PRECEDE-PROCEED Model to the development of targeted chronic care improvement interventions for two distinct Australian populations: a rapidly expanding and aging rural population with unmet palliative care needs and a disadvantaged urban community at higher risk of cardiovascular disease. Results. The PRECEDE-PROCEED Model approach demonstrated utility across diverse health settings in a systematic planning process. In environments characterized by increasing health care needs, limited resources, and growing community expectations, adopting planning tools such as PRECEDE-PROCEED Model at a local level can facilitate the development of the most effective interventions. Relevance to Clinical Practice. The PRECEDE-PROCEED Model is a strong theoretical model that guides the development of realistic nursing led interventions with the best chance of being successful in existing health care environments. |
Levator trauma is associated with pelvic organ prolapse. | OBJECTIVE
To estimate the risk of prolapse associated with levator avulsion injury among a urogynaecological clinic population.
DESIGN
Retrospective observational study.
SETTING
Tertiary urogynaecological unit.
SAMPLE
A total of 934 women seen for interview, examination using the pelvic organ prolapse quantification (POP-Q) staging system and imaging of the levator ani muscle by four-dimensional translabial ultrasound.
METHODS
Retrospective review of charts and stored imaging data.
MAIN OUTCOME MEASURES
Pelvic organ prolapse stage II and higher and presence of defects of the levator ani muscle.
RESULTS
After exclusion of 137 women with a history of anti-incontinence or prolapse surgery, and a further exclusion of 16 women in whom either examination or imaging was impossible, we compared prolapse and imaging data in 781 women. Mean age was 53 years (range 15-89 years), and median parity was 2 (range 0-12). Women reported stress incontinence (76%), urge incontinence (69%), frequency (47%), nocturia (49%) and symptoms of prolapse (38%). Significant prolapse (stage II or higher) was diagnosed in 415 (53%) women, and 181 (23%) women were found to have levator avulsion defects. Prolapse was seen in 150/181 (83%) women with avulsion and in 265/600 (44%) women without avulsion, giving a relative risk (RR) of 1.9 (95% CI 1.7-2.1). The association was strongest for cystocele (RR 2.3, 95% CI 2.0-2.7) and uterine prolapse (RR 4.0, 95% CI 2.5-6.5).
CONCLUSIONS
Women with levator avulsion defects were about twice as likely to show pelvic organ prolapse of stage II or higher than those without. This effect is mainly due to an increased risk of cystocele and uterine prolapse. |
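The reported effect size can be reproduced from the counts given in the abstract with a standard log-based confidence interval:

```python
# Reproduce the reported relative risk from the abstract's counts:
# 150/181 with avulsion vs 265/600 without, with a log-based 95% CI.
import math

a, n1 = 150, 181          # prolapse / total, avulsion group
b, n2 = 265, 600          # prolapse / total, no-avulsion group

rr = (a / n1) / (b / n2)
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")   # RR = 1.9 (1.7-2.1)
```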
Does supporting multiple student strategies lead to greater learning and motivation? Investigating a source of complexity in the architecture of intelligent tutoring systems | Intelligent tutoring systems (ITS) support students in learning a complex problem-solving skill. One feature that makes an ITS architecturally complex, and hard to build, is support for strategy freedom, that is, the ability to let students pursue multiple solution strategies within a given problem. But does greater freedom mean that students learn more robustly? We developed three versions of the same ITS for solving linear algebraic equations that differed only in the amount of freedom given to students. One condition required students to strictly adhere to a standard strategy; the other two allowed minor and major variations, respectively. We conducted a study in two US middle schools with 57 students in grades 7 and 8. Overall, students’ algebra skills improved. Contrary to our hypotheses, the amount of freedom offered by the system did not affect students’ learning outcomes, nor did it affect their intrinsic motivation. Students tended to use only the standard strategy and its minor variations. Thus, the study suggests that in the early stages of problem-solving practice within a complex domain, an ITS should allow at least a small amount of freedom, validating, albeit to a limited degree, one source of complexity in ITS architectures. To help students develop strategic flexibility, a desirable outcome in many domains, more is needed than letting students choose their own solution strategy within a given problem. |
Early Usability Evaluation in Model Driven Architecture Environments | Due to the increasing interest in the model driven architecture (MDA) paradigm, conceptual models have become the backbone of the software development process. Although some methods exist to develop a user interface according to an MDA-compliant process, none of them explicitly connects usability to their process activities. In this paper, we present a framework which incorporates usability as part of an MDA development process. In particular, a usability model for early evaluation is proposed. Using this model, the usability of a software system is evaluated and improved at the platform-independent model (PIM) level. It focuses on the correspondences between the abstract user interface elements and the final user interface elements in a specific platform (CM). This framework has been successfully applied to an industrial MDA tool. |
A Tale of Many Cities: Universal Patterns in Human Urban Mobility | The advent of geographic online social networks such as Foursquare, where users voluntarily signal their current location, opens the door to powerful studies on human movement. In particular the fine granularity of the location data, with GPS accuracy down to 10 meters, and the worldwide scale of Foursquare adoption are unprecedented. In this paper we study urban mobility patterns of people in several metropolitan cities around the globe by analyzing a large set of Foursquare users. Surprisingly, while there are variations in human movement in different cities, our analysis shows that those are predominantly due to different distributions of places across different urban environments. Moreover, a universal law for human mobility is identified, which isolates as a key component the rank-distance, factoring in the number of places between origin and destination, rather than pure physical distance, as considered in some previous works. Building on our findings, we also show how a rank-based movement model accurately captures real human movements in different cities. |
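Rank-distance counts the places nearer to the origin than the destination, and move probability decays as a power of that rank; a short sketch with random stand-in venues makes the definition concrete (the decay exponent is illustrative):

```python
# Sketch of rank-distance: the "distance" from origin u to destination v is
# the number of venues closer to u than v is, and the probability of moving
# falls off as a power of that rank. Venue coordinates are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
places = rng.random((500, 2))                     # stand-in venue coordinates

def rank_distance(u, v):
    d = np.linalg.norm(places - places[u], axis=1)
    return int((d < d[v]).sum())                  # venues nearer to u than v is

ranks = np.array([max(rank_distance(0, v), 1) for v in range(1, 500)])
alpha = 1.0                                       # decay exponent, illustrative
prob = ranks.astype(float) ** -alpha              # P(0 -> v) ~ rank^(-alpha)
prob /= prob.sum()
print("most likely next venues:", np.argsort(prob)[-5:][::-1] + 1)
```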
Building Rome on a Cloudless Day (ECCV 2010) | This paper introduces an approach for dense 3D reconstruction from unregistered Internet-scale photo collections with about 3 million images within the span of a day on a single PC (“cloudless”). Our method advances image clustering, stereo, stereo fusion and structure from motion to achieve high computational performance. We leverage geometric and appearance constraints to obtain a highly parallel implementation on modern graphics processors and multi-core architectures. This leads to two orders of magnitude higher performance on an order of magnitude larger dataset than competing state-of-the-art approaches. |
Development of vehicle driver drowsiness detection system using electrooculogram (EOG) | Driver drowsiness is one of the major causes of road accidents. Various driver drowsiness detection systems have been designed to detect and warn the driver of impending drowsiness. Most available prototypes and ongoing research have focused on video-based eye tracking systems, which demand high computing power due to real-time video processing. In our research, the use of the electrooculogram (EOG) as an alternative to video-based systems in detecting eye activities caused by drowsiness is evaluated. The EOG, which is the electrical signal generated by eye movements, is acquired by a mobile biosignal acquisition module and processed offline on a personal computer. Digital signal differentiation and simple information fusion techniques are used to detect signs of drowsiness in the EOG signal. The EOG signal is found to be a promising drowsiness indicator, with a detection rate of more than 80%. Based on the tested offline processing techniques, an online fatigue monitoring system prototype based on a Personal Digital Assistant (PDA) has been designed to detect driver dozing off through the EOG signal. |
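The detection idea — differentiate the EOG and threshold the slope of slow, large eyelid movements — can be sketched as follows; the sampling rate, thresholds, smoothing window, and synthetic signal are illustrative assumptions:

```python
# Sketch of the offline detection idea: smooth the EOG, differentiate it, and
# flag large slow eyelid movements by thresholding the slope.
import numpy as np

fs = 250                                          # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eog = 20 * np.exp(-((t - 4.0) / 0.4) ** 2)        # one slow "drowsy" eyelid movement
eog += rng.normal(0, 0.3, t.size)                 # measurement noise

smooth = np.convolve(eog, np.ones(25) / 25, mode="same")  # 0.1 s moving average
velocity = np.diff(smooth) * fs                   # digital differentiation
mask = np.abs(velocity) > 20                      # slope threshold, illustrative
onsets = np.flatnonzero(np.diff(mask.astype(int)) == 1)
# the single slow blink yields two slope lobes, hence two flagged onsets
print(f"detected {onsets.size} slope event(s) near t =", t[onsets].round(2))
```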
Almotriptan in the acute treatment of migraine in patients 11–17 years old: an open-label pilot study of efficacy and safety | The objective was to investigate the safety and efficacy of almotriptan in patients aged 11–17 years old with acute migraine. Fifteen patients aged 11–17 with a history of migraine with or without aura were treated with almotriptan. Reduction in headache severity, disability and adverse effects were studied. Almotriptan in doses ranging from 6.25 to 12.5 mg was well tolerated. There were virtually no adverse effects except for one case of transient mild stiffness. Of the 15 patients, only 2 showed no efficacy (with no adverse effects). In the other 13 patients, almotriptan was not only effective, but again no significant adverse effects were reported. Almotriptan is probably safe and effective in patients aged 11–17. This small open-label pilot study should support the feasibility of a large randomised controlled study to demonstrate tolerability and efficacy of almotriptan in children and adolescents with episodic migraine. |
"Multivariate Approximation: Theory and Applications" | This introductory paper describes the main topics of this special issue, dedicated to Leonardo Traversoni, known at the international level as the promoter of the conference series ''Multivariate Approximation: Theory and Applications'', to celebrate his 60th birthday. |
Prostate Cancer Screening - A Perspective on the Current State of the Evidence. | After a quarter century of extensive screening for prostate cancer with prostate-specific antigen (PSA) in the United States, and after the completion of two major trials examining the effects of such screening, the medical community is still divided with regard to its effectiveness and its benefits-to-harms ratio. Here, we review the current status of PSA screening and examine emerging trends. In 2012, after publication of the findings from the major randomized trials of PSA-based screening for prostate cancer — the European Randomized Study of Screening for Prostate Cancer (ERSPC) and the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO) — the U.S. Preventive Services Task Force (USPSTF) recommended against PSA-based screening for prostate cancer (a recommendation that is currently undergoing routine review and updating).1 The 2012 statement applied to men in the general U.S. population (excluding specific high-risk groups, such as men with known BRCA mutations). Over the next several years, other organizations and professional societies in North America issued guidelines either recommending against PSA-based screening in average-risk men or recommending some form of shared decision making about screening2-7 (Table 1). With respect to shared decision making, for example, the American College of Physicians recommended discussing the benefits and harms of screening and ordering screening only when the patient expresses a clear preference for it. The above entities cited as the benefits of screening a reduction in prostate-specific mortality of approximately 1 death per 1000 men screened (the USPSTF cited a range of 0 to 1 per 1000). This rate comes from the ERSPC; specifically, a difference in prostate cancer–specific mortality of 1.3 deaths per 1000 men over 13 years of follow-up and a mean of approximately two PSA screens among men in the screening group.8 In contrast, the PLCO did not show a reduction in prostate cancer–specific mortality; in a recent update, the risk ratio was 1.04 for the screened group versus the control group after a |
Impact of a nurse telephone intervention among high-cardiovascular-risk, health fair participants. | BACKGROUND AND OBJECTIVES
Cardiovascular disease (CVD) is the leading cause of death in the United States, yet most individuals remain unaware of their risk. Current health fair models assess individual risk factors but miss the opportunity to assess, counsel, and follow-up with participants regarding global CVD risk. Objectives of this nurse telephone intervention were to (1) describe high-CVD-risk participants' healthcare-seeking behavior after the health fair and following a nurse telephone intervention and (2) describe CVD risk-reducing therapies provided to high-risk participants after the health fair and following a nurse telephone intervention.
SUBJECTS AND METHODS
Five hundred twenty-nine of 4,489 health fair participants who completed an interactive Framingham risk assessment in 2006 were identified with high CVD risk. These participants received a nurse telephone intervention approximately 1 month after the health fair, during which the risk message was reinforced, principles of motivational interviewing were applied, and follow-up care was assessed. We evaluated the proportion of high-CVD-risk participants who obtained healthcare before and after intervention, and we compared the care received before and after intervention.
RESULTS AND CONCLUSION
Among 447 contacted high-CVD-risk participants, 59% (n = 262) saw a healthcare provider, and 86% of those discussed CVD risk at their healthcare visit. A greater proportion of participants were started on a cardioprotective drug (41% vs 20%; P < .01), and more participants discussed "heart health" (96% vs 75%; P < .001) after receiving the nurse telephone intervention. Our findings suggest that a nurse intervention may improve individuals' CVD risk awareness as well as activate providers to implement CVD risk reduction strategies. |
DeepRank: A New Deep Architecture for Relevance Ranking in Information Retrieval | This paper concerns a deep learning approach to relevance ranking in information retrieval (IR). Existing deep IR models such as DSSM and CDSSM directly apply neural networks to generate ranking scores, without explicit understanding of the relevance. According to the human judgement process, a relevance label is generated by the following three steps: 1) relevant locations are detected; 2) local relevances are determined; 3) local relevances are aggregated to output the relevance label. In this paper we propose a new deep learning architecture, namely DeepRank, to simulate the above human judgment process. Firstly, a detection strategy is designed to extract the relevant contexts. Then, a measure network is applied to determine the local relevances by utilizing a convolutional neural network (CNN) or two-dimensional gated recurrent units (2D-GRU). Finally, an aggregation network with sequential integration and a term gating mechanism is used to produce a global relevance score. DeepRank well captures important IR characteristics, including exact/semantic matching signals, proximity heuristics, query term importance, and diverse relevance requirements. Experiments on both the benchmark LETOR dataset and large-scale clickthrough data show that DeepRank can significantly outperform learning-to-rank methods and existing deep learning methods. |
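Below is a heavily simplified rendition of the three-step pipeline, with random embeddings, a cosine-style similarity in place of the learned CNN/2D-GRU measure network, and plain summation in place of DeepRank's gated aggregation:

```python
# Simplified detect / measure / aggregate relevance scorer: find query-term
# positions, score a window around each with embedding similarity, and sum.
import numpy as np

rng = np.random.default_rng(0)
vocab = {w: i for i, w in enumerate("the cat sat on a mat dog ran".split())}
emb = rng.normal(0, 1, (len(vocab), 8))          # random stand-in embeddings

def score(query, doc, win=2):
    q_vecs = emb[[vocab[w] for w in query]]
    local_scores = []
    for pos, w in enumerate(doc):                # step 1: detect query terms
        if w in query:
            ctx = doc[max(0, pos - win): pos + win + 1]
            c_vecs = emb[[vocab[c] for c in ctx]]
            sim = q_vecs @ c_vecs.T              # step 2: local relevance
            local_scores.append(sim.max())
    return float(np.sum(local_scores))           # step 3: aggregate (sum here)

doc = "the cat sat on a mat".split()
print(score(["cat", "mat"], doc))
```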
Arsenic and Other Metal Contamination of Groundwater in the Mekong River Delta, Vietnam | High levels of arsenic (As) contamination are found in the groundwater of Vietnam. To determine the distribution of arsenic and other metal contamination in the groundwater of the Mekong River Delta, we examined the contamination status of As and other metals in two regions, Tien Giang Province and Dong Thap Province. The concentration of total As in the groundwater, which is used for the drinking water supply, ranged from 0.9μg/l to 321μg/l, and 27% of the shallow-well water samples exceeded the World Health Organization (WHO) provisional guideline of 10μg/l. Also, 91% and 27% of shallow-well water samples had higher concentrations of manganese (Mn) and barium (Ba) than those stated in the WHO drinking water guidelines, respectively. On the other hand, such contamination was not found in the deepwell water samples examined. These results suggest that pollution by As, Mn, and Ba is widely distributed in the shallow aquifer of the Mekong River Delta and thus the health of the people consuming shallow-well water in both provinces might be at considerable risk. |
Physicochemical and biological characterization of SB2, a biosimilar of Remicade® (infliximab) | A biosimilar is a biological medicinal product that contains a version of the active substance of an already authorized original biological medicinal product. Biosimilarity to the reference product (RP) in terms of quality characteristics, such as physicochemical and biological properties, safety, and efficacy, based on a comprehensive comparability exercise needs to be established. SB2 (Flixabi® and Renflexis®) is a biosimilar to Remicade® (infliximab). The development of SB2 was performed in accordance with relevant guidelines of the International Conference on Harmonisation, the European Medicines Agency, and the United States Food and Drug Administration. To determine whether critical quality attributes meet quality standards, an extensive characterization test was performed with more than 80 lots of EU- and US-sourced RP. The physicochemical characterization study results revealed that SB2 was similar to the RP. Although a few differences in physicochemical attributes were observed, the evidence from the related literature, structure-activity relationship studies, and comparative biological assays showed that these differences were unlikely to be clinically meaningful. The biological characterization results showed that SB2 was similar to the RP in terms of tumor necrosis factor-α (TNF-α) binding and TNF-α neutralization activities as a main mode of action. SB2 was also similar in Fc-related biological activities including antibody-dependent cell-mediated cytotoxicity, complement-dependent cytotoxicity, neonatal Fc receptor binding, C1q binding, and Fc gamma receptor binding activities. These analytical findings support that SB2 is similar to the RP and also provide confidence of biosimilarity in terms of clinical safety and efficacy. |
Using information to optimize medical outcomes. | An important health care–related decision of the Obama administration is to reduce waste and harm by modernizing health care information technology systems. This decision would be met with more enthusiasm if the design of the final system were driven by the goal of improving the efficacy and efficiency of medical processes through a strategy of collecting meaningful data. Before spending billions creating and implementing such systems, careful consideration must be given to how this information will be integrated into medical decision making. Some of the current health care information technology systems seem to have been planned using a “ready, fire, aim” approach, with little or no concern for how the data will be used. Ideally, health care information technology systems should collect and use information to improve the probability that patients will receive optimal care. Optimized decision making is the essence of evidence-based medicine. However, collecting, organizing, and storing information is only the first step in this process. The systems must also be designed to facilitate data analysis. Information by itself is useless. Deming, a pioneer in data-driven optimization programs, acknowledged that many systems are drowning in information. Most physicians will agree. Deming argued that what is needed is knowledge. Knowledge is generated by analyzing information and finding the relevant patterns within the data set. These patterns are typically obscured by random noise but, once revealed, provide an understanding or mental model of the underlying processes. These models allow observations of past events to be transformed into predictions of future ones. Although all attempts to predict the future are imperfect, predictions are useful because a series of failed predictions signals that mental models could be inaccurate. Once recognized, such errors prompt improvement in predictive models and can optimize future performance.
Maximum power point tracking of photovoltaic water pumping system using fuzzy logic controller | Diode-junction photovoltaic (PV) generators exhibit nonlinear V-I characteristics, and the maximum power extractable varies with the intensity of solar radiation, temperature, and load conditions. A maximum power point tracking (MPPT) controller is therefore usually employed in PV-generator applications to automatically extract maximum power irrespective of the instantaneous conditions of the PV system. This paper presents a fuzzy logic control (FLC) scheme for extracting the maximum power from a stand-alone PV generator for use in a water pumping system. The PV-generator system comprises a solar panel, DC-DC buck chopper, fuzzy MPP tracker, and permanent-magnet DC motor driving a centrifugal pump. The fuzzy controller generates a control signal for the pulse-width-modulation generator, which in turn adjusts the duty ratio of the buck chopper to match the load impedance to the PV generator, and consequently maximizes the motor speed and the water discharge rate of the coupled centrifugal pump. The control method has been modelled in Matlab/Simulink, and simulation results are presented to confirm its significantly improved power extraction performance under different sunlight conditions when compared with a directly-connected PV-generator-energized pumping system.
Degradation of polyaromatic hydrocarbons employing biosurfactant-producing Bacillus pumilus KS2 | An efficient hydrocarbon-degrading native bacterial strain, Bacillus pumilus KS2 (identified by partial 16S rDNA gene sequencing), was isolated from crude oil-contaminated soil collected from the oil fields of Lakowa, Sivasagar district of Assam, India. Experiments were conducted under laboratory conditions to determine the efficiency of this biosurfactant-producing strain in degrading polycyclic aromatic hydrocarbons (PAHs). Quantification of the capacity of the biosurfactant to reduce the surface tension (ST) of the culture medium was used as a measure of biosurfactant production. In terms of total petroleum hydrocarbon (TPH) degradation, strain KS2 was able to degrade 80.44% of the TPH within 4 weeks of incubation. It also demonstrated efficient degradation of PAHs, completely degrading nine of the 16 major PAHs present in the crude oil sample. Strain KS2 also produced a biosurfactant which, based on biochemical and FTIR analyses, was glycolipid in nature. To our knowledge, this is the first report showing the potential of a native strain from the North-East region of India for efficient degradation of TPH and PAHs and, consequently, for the remediation of hydrocarbons from contaminated sites.
Fluorescent pigments in corals are photoprotective | All reef-forming corals depend on the photosynthesis performed by their algal symbionts, and such corals are therefore restricted to the photic zone. The intensity of light in this zone declines over several orders of magnitude—from high and damaging levels at the surface to extreme shade conditions at the lower limit. The ability of corals to tolerate this range implies effective mechanisms for light acclimation and adaptation. Here we show that the fluorescent pigments (FPs) of corals provide a photobiological system for regulating the light environment of coral host tissue. Previous studies have suggested that under low light, FPs may enhance light availability. We now report that in excessive sunlight FPs are photoprotective; they achieve this by dissipating excess energy at wavelengths of low photosynthetic activity, as well as by reflecting visible and infrared light via FP-containing chromatophores. We also show that FPs enhance the resistance of corals to mass bleaching during periods of heat stress, which has implications for the effect of environmental stress on the diversity of reef-building corals: enhanced survival of a broad range of corals allows the maintenance of habitat diversity.
Multi-label Cross-Modal Retrieval | In this work, we address the problem of cross-modal retrieval in the presence of multi-label annotations. In particular, we introduce multi-label Canonical Correlation Analysis (ml-CCA), an extension of CCA, for learning shared subspaces that take into account high-level semantic information in the form of multi-label annotations. Unlike CCA, ml-CCA does not rely on explicit pairing between modalities; instead, it uses the multi-label information to establish correspondences. This results in a discriminative subspace which is better suited for cross-modal retrieval tasks. We also present Fast ml-CCA, a computationally efficient version of ml-CCA, which is able to handle large-scale datasets. We show the efficacy of our approach by conducting extensive cross-modal retrieval experiments on three standard benchmark datasets. The results show that the proposed approach achieves state-of-the-art retrieval performance on all three datasets.
Promoting apoptosis as a strategy for cancer drug discovery | Apoptosis is deregulated in many cancers, making it difficult to kill tumours. Drugs that restore the normal apoptotic pathways have the potential for effectively treating cancers that depend on aberrations of the apoptotic pathway to stay alive. Apoptosis targets that are currently being explored for cancer drug discovery include the tumour-necrosis factor (TNF)-related apoptosis-inducing ligand (TRAIL) receptors, the BCL2 family of anti-apoptotic proteins, inhibitor of apoptosis (IAP) proteins and MDM2. |
Advances in the Inspection of Unpiggable Pipelines | The field of in-pipe robotics covers a vast and varied number of approaches to the inspection of pipelines, with robots specialising in pipes ranging anywhere from 10 mm to 1200 mm in diameter. Many of these systems focus on overcoming in-pipe obstacles such as T-sections and elbows; as a result, important aspects of exploration, notably shape adaptability, are treated as sub-systems. One of the most prevalent methods of hybridised locomotion today is wall-pressing, which generates traction using the encompassing pipe walls. A review of wall-pressing systems has been performed, covering the different approaches taken since their introduction. The advantages and disadvantages of these systems are discussed, as well as their effectiveness in the inspection of networks with highly varying pipe diameters. When compared to unconventional in-pipe robotic techniques, traditional full-bore wall-pressing robots were found to be at a disadvantage.
Detecting Epileptic Seizures from EEG Data using Neural Networks | We explore the use of neural networks trained with dropout in predicting epileptic seizures from electroencephalographic data (scalp EEG). The input to the neural network is a 126-dimensional feature vector containing 9 features for each of the 14 EEG channels, obtained over 1-second, non-overlapping windows. The models in our experiments achieved high sensitivity and specificity on patient records not used in the training process. This is demonstrated using leave-one-out cross-validation across patient records, where we hold out one patient's record as the test set and use all other patients' records for training, repeating this procedure for all patients in the database.
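As a rough illustration of the leave-one-patient-out protocol described above, here is a hedged scikit-learn sketch. X, y, and patient_ids are synthetic placeholders, and because scikit-learn's MLP has no dropout, L2 regularization stands in for it here; the paper itself trains its networks with dropout.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import recall_score

X = np.random.rand(1000, 126)          # 9 features x 14 channels per 1-s window
y = np.random.randint(0, 2, 1000)      # 1 = seizure window, 0 = non-seizure
patient_ids = np.random.randint(0, 10, 1000)

logo = LeaveOneGroupOut()
for train_idx, test_idx in logo.split(X, y, groups=patient_ids):
    clf = MLPClassifier(hidden_layer_sizes=(64,), alpha=1e-3, max_iter=500)
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    # Sensitivity and specificity on the held-out patient, as reported above.
    sens = recall_score(y[test_idx], pred, pos_label=1, zero_division=0)
    spec = recall_score(y[test_idx], pred, pos_label=0, zero_division=0)
    print(f"held-out patient {patient_ids[test_idx][0]}: "
          f"sensitivity={sens:.2f} specificity={spec:.2f}")
```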
Whole arm planning for a soft and highly compliant 2D robotic manipulator | Soft continuum manipulators have the advantage of being more compliant and having more degrees of freedom than rigid redundant manipulators. This attribute should allow soft manipulators to autonomously execute highly dexterous tasks. However, current approaches to motion planning, inverse kinematics, and even design limit the capacity of soft manipulators to take full advantage of their inherent compliance. We provide a computational approach to whole arm planning for a soft planar manipulator that advances the arm's end effector pose in task space while simultaneously considering the arm's entire envelope in proximity to a confined environment. The algorithm solves a series of constrained optimization problems to determine locally optimal inverse kinematics. Due to inherent limitations in modeling the kinematics of a highly compliant soft robot and the local optimality of the planner's solutions, we also rely on the increased softness of our newly designed manipulator to accomplish the whole arm task, namely the arm's ability to harmlessly collide with the environment. We detail the design and fabrication of the new modular manipulator as well as the planner's central algorithm. We experimentally validate our approach by showing that the robotic system is capable of autonomously advancing the soft arm through a pipe-like environment in order to reach distinct goal states. |
Thematic Analysis and Visualization of Textual Corpus | The semantic analysis of documents is a domain of intense research at present. Work in this domain can take several directions and touch several levels of granularity. In the present work we are interested specifically in the thematic analysis of textual documents. In our approach, we propose to study the variation of theme relevance within a text in order to identify the major theme and all the minor themes evoked in the text. At the second level of analysis, this allows us to identify relations of thematic association in a textual corpus. Through the identification and analysis of these association relations, we propose to generate thematic paths allowing users, within the framework of an information search system, to explore the corpus according to their themes of interest and to discover new knowledge by navigating the thematic association relations.
Detecting Insider Threats Using RADISH: A System for Real-Time Anomaly Detection in Heterogeneous Data Streams | We present a scalable system for high-throughput real-time analysis of heterogeneous data streams. Our architecture enables incremental development of models for predictive analytics and anomaly detection as data arrives into the system. In contrast with batch data-processing systems, such as Hadoop, that can have high latency, our architecture allows for ingest and analysis of data on the fly, thereby detecting and responding to anomalous behavior in near real time. This timeliness is important for applications such as insider threat, financial fraud, and network intrusions. We demonstrate an application of this system to the problem of detecting insider threats, namely, the misuse of an organization's resources by users of the system and present results of our experiments on a publicly available insider threat dataset. |
A Simple and Fast Incremental Randomized Algorithm for Computing Trapezoidal Decompositions and for Triangulating Polygons | This paper presents a very simple incremental randomized algorithm for computing the trapezoidal decomposition induced by a set S of n line segments in the plane. If S is given as a simple polygonal chain the expected running time of the algorithm is O(n log* n). This leads to a simple algorithm of the same complexity for triangulating polygons. More generally, if S is presented as a plane graph with k connected components, then the expected running time of the algorithm is O(n log* n + k log n). As a by-product our algorithm creates a search structure of expected linear size that allows point location queries in the resulting trapezoidation in logarithmic expected time. The analysis of the expected performance is elementary and straightforward. All expectations are with respect to ‘coinflips’ generated by the algorithm and are not based on assumptions about the geometric distribution of the input.
Advantages of Geometric Algebra Over Complex Numbers in the Analysis of Networks With Nonsinusoidal Sources and Linear Loads | An alternative circuit analysis technique is used to study networks with nonsinusoidal sources and linear loads. In contrast to the technique developed by Steinmetz, this method is supported by geometric algebra instead of the algebra of complex numbers, uses multivectors in place of phasors, and is performed in the GN domain instead of the frequency domain. The advantages of this method over the present technique include: determining the flow of current and power quantities in the circuit, validating the results using the principle of conservation of energy, discerning and revealing other forms of reactive power generation, and the ability to design compensators with great flexibility. The power equation is composed of the active power and the CN-power representing the nonactive power. All the CN-power terms are sorted into reactive power terms due to phase shift, reactive power terms due to harmonic interactions, and degrading power terms which determine the new quantity called degrading power. This decomposition shows that estimating these quantities is intricate. It also displays the power equation's functionality for power factor improvement. The geometric addition of power quantities is not pre-established but results from applying the established norm, and yields the new quantity called net apparent power.
Research Guides: Economics: Citing & Writing | Information on library resources, services and more to get you started on researching economics topics |
Ultra-Compact and Robust Physically Unclonable Function Based on Voltage-Compensated Proportional-to-Absolute-Temperature Voltage Generators | This paper presents a technique for designing an ultra-compact and robust physically unclonable function (PUF) for security-oriented applications. The circuits are based on pairs of analog circuits whose output voltage is supply-voltage compensated and proportional to absolute temperature (PTAT). The difference between the two outputs of a PTAT pair is digitized to produce a 1 b output which is sensitive mostly to random transistor threshold voltage variations. Fabricated in a 65 nm process, the proposed 256 b PUF array takes an area of 3.07 μm²/bit and consumes 0.548 pJ/bit at a throughput of 10 Mb/s, while showing desirable robustness against temperature and supply voltage variations, with 3.5% and 1.004% bit-instability across 0 to 80°C and 0.6 to 1.2 V, respectively. The unpredictability and uniqueness of the 256 b PUF output are verified by the NIST randomness test and the Hamming distance between keys. Compared with the state of the art, the proposed design has an 8.3× smaller area/bit or 2× better robustness against noise and environmental variations.
Online Learning of Relaxed CCG Grammars for Parsing to Logical Form | We consider the problem of learning to parse sentences to lambda-calculus representations of their underlying semantics and present an algorithm that learns a weighted combinatory categorial grammar (CCG). A key idea is to introduce non-standard CCG combinators that relax certain parts of the grammar—for example allowing flexible word order, or insertion of lexical items— with learned costs. We also present a new, online algorithm for inducing a weighted CCG. Results for the approach on ATIS data show 86% F-measure in recovering fully correct semantic analyses and 95.9% F-measure by a partial-match criterion, a more than 5% improvement over the 90.3% partial-match figure reported by He and Young (2006). |
Brain mechanisms in religion and spirituality: An integrative predictive processing framework | We present the theory of predictive processing as a unifying framework to account for the neurocognitive basis of religion and spirituality. Our model is substantiated by discussing four different brain mechanisms that play a key role in religion and spirituality: temporal brain areas are associated with religious visions and ecstatic experiences; multisensory brain areas and the default mode network are involved in self-transcendent experiences; the Theory-of-Mind network is associated with prayer experiences and over-attribution of intentionality; top-down mechanisms instantiated in the anterior cingulate cortex and the medial prefrontal cortex could be involved in acquiring and maintaining intuitive supernatural beliefs. We compare the predictive processing model with two-systems accounts of religion and spirituality, highlighting the central role of prediction error monitoring. We conclude by presenting novel predictions for future research and by discussing the philosophical and theological implications of neuroscientific research on religion and spirituality.
Dynamic Heterogeneous Learning Games for Opportunistic Access in LTE-Based Macro/Femtocell Deployments | Interference is one of the most limiting factors when trying to achieve high spectral efficiency in the deployment of heterogeneous networks (HNs). In this paper, the HN is modeled as a layer of closed-access LTE femtocells (FCs) overlaid upon an LTE radio access network. Within the context of dynamic learning games, this work proposes a novel heterogeneous multiobjective fully distributed strategy based on a reinforcement learning (RL) model (CODIPAS-HRL) for FC self-configuration/optimization. The self-organization capability enables the FCs to autonomously and opportunistically sense the radio environment using different learning strategies and tune their parameters accordingly, in order to operate under restrictions of avoiding interference to both network tiers and satisfy certain quality-of-service requirements. The proposed model reduces the learning cost associated with each learning strategy. We also study the convergence behavior under different learning rates and derive a new accuracy metric in order to provide comparisons between the different learning strategies. The simulation results show the convergence of the learning model to a solution concept based on satisfaction equilibrium, under the uncertainty of the HN environment. We show that intra/inter-tier interference can be significantly reduced, thus resulting in higher cell throughputs. |
SCL: Simplifying Distributed SDN Control Planes | We consider the following question: what consistency model is appropriate for coordinating the actions of a replicated set of SDN controllers? We first argue that the conventional requirement of strong consistency, typically achieved through the use of Paxos or other consensus algorithms, is conceptually unnecessary to handle unplanned network updates. We present an alternate approach, based on the weaker notion of eventual correctness, and describe the design of a simple coordination layer (SCL) that can seamlessly turn a set of single-image SDN controllers (that obey certain properties) into a distributed SDN system that achieves this goal (whereas traditional consensus mechanisms do not). We then show through analysis and simulation that our approach provides faster responses to network events. While our primary focus is on handling unplanned network updates, our coordination layer also handles policy updates and other situations where consistency is warranted. Thus, contrary to the prevailing wisdom, we argue that distributed SDN control planes need only be slightly more complicated than single-image controllers. |
Image Steganography using LSB and LSB+Huffman Code | Steganography has been an important area of research in recent years, with a wide range of applications. It is the science of embedding information into cover media such as text, images, audio, and video. This paper applies two techniques for steganography (text into image): Least Significant Bit (LSB) and Least Significant Bit with Huffman code (LSB+HUFF). Both methods use zigzag scanning to increase security, and the results are compared using Peak Signal-to-Noise Ratio (PSNR). All images used in this study are grayscale; what is implemented on grayscale images can equally be applied to color images.
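To make the zigzag LSB idea concrete, here is an illustrative Python sketch (not the authors' code): message bits overwrite pixel LSBs along an anti-diagonal zigzag order, which scatters the payload relative to plain raster-order LSB. The LSB+HUFF variant would additionally Huffman-compress the message bits before this same embedding step.

```python
import numpy as np

def zigzag_indices(rows, cols):
    order = []
    for s in range(rows + cols - 1):              # walk the anti-diagonals
        diag = [(i, s - i) for i in range(rows) if 0 <= s - i < cols]
        order.extend(diag if s % 2 == 0 else diag[::-1])
    return order

def embed(img, message):
    bits = [int(b) for byte in message.encode() for b in f"{byte:08b}"]
    stego = img.copy()
    for bit, (r, c) in zip(bits, zigzag_indices(*img.shape)):
        stego[r, c] = (stego[r, c] & 0xFE) | bit  # overwrite the LSB only
    return stego

def extract(stego, n_chars):
    bits = [stego[r, c] & 1 for (r, c) in zigzag_indices(*stego.shape)]
    bits = bits[:n_chars * 8]
    return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8)).decode()

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # toy grayscale cover
stego = embed(cover, "hello")
assert extract(stego, 5) == "hello"
```

Because only the least significant bit of each visited pixel changes, the per-pixel distortion is at most 1, which is why such schemes report high PSNR.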
Green Queue: Customized Large-Scale Clock Frequency Scaling | We examine the scalability of a set of techniques related to Dynamic Voltage-Frequency Scaling (DVFS) on HPC systems to reduce the energy consumption of scientific applications through an application-aware analysis and runtime framework, Green Queue. Green Queue supports making CPU clock frequency changes in response to intra-node and inter-node observations about application behavior. Our intra-node approach reduces CPU clock frequencies, and therefore power consumption, while CPUs lack computational work due to inefficient data movement. Our inter-node approach reduces clock frequencies for MPI ranks that lack computational work. We investigate these techniques on a set of large scientific applications on 1024 cores of Gordon, an Intel Sandy Bridge-based supercomputer at the San Diego Supercomputer Center. Our optimal intra-node technique showed an average measured energy savings of 10.6% and a maximum of 21.0% over regular application runs. Our optimal inter-node technique showed an average of 17.4% and a maximum of 31.7% energy savings.
Statistical Language Models for Information Retrieval | The past decade has seen a steady growth of interest in statistical language models for information retrieval, and much research work has been conducted on this subject. This book by ChengXiang Zhai summarizes most of this research. It opens with an introduction covering the basic concepts of information retrieval and statistical language models, presenting the intuitions behind these concepts. This introduction is then followed by a chapter providing an overview of:
Theory and Applications of Natural Language Processing | One of the more novel approaches to collaboratively creating language resources in recent years is to use online games to collect and validate data. The most significant challenges collaborative systems face are how to train users with the necessary expertise and how to encourage participation on a scale required to produce high quality data comparable with data produced by “traditional” experts. In this chapter we provide a brief overview of collaborative creation and the different approaches that have been used to create language resources, before analysing games used for this purpose. We discuss some key issues in using a gaming approach, including task design, player motivation and data quality, and compare the costs of each approach in terms of development, distribution and ongoing administration. In conclusion, we summarise the benefits and limitations of using a gaming approach to resource creation and suggest key considerations for evaluating its utility in different research scenarios. |
Searching Solitaire in Real Time | This article presents a new real-time heuristic search method for planning problems with distinct stages. Our multistage nested rollout algorithm allows the user to apply separate heuristics at each stage of the search process and to tune the search magnitude for each stage. We propose a search-tree compression that reveals a new state representation for the games of Klondike Solitaire and Thoughtful Solitaire, a version of Klondike Solitaire in which the location of all cards is known. Moreover, we present a Thoughtful Solitaire solver based on these methods that can resolve over 80% of Thoughtful Solitaire games in less than 4 seconds. Finally, we demonstrate empirically that no less than 82% and no more than 91.44% of Klondike Solitaire games have winning solutions, leaving less than 10% of games unresolved.
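A generic sketch of the nested rollout idea behind such a multistage algorithm follows; the State interface (is_terminal, moves, apply, score) is hypothetical. A level-0 search follows the heuristic greedily, while a level-n search evaluates each legal move with a level-(n-1) rollout and commits to the best. The multistage variant described above additionally lets each stage of the game use its own heuristic and nesting depth.

```python
def greedy_rollout(state, heuristic):
    # Level 0: follow the heuristic to the end of the game.
    while not state.is_terminal():
        state = state.apply(max(state.moves(), key=heuristic))
    return state.score()

def nested_rollout(state, level, heuristic):
    if level == 0:
        return greedy_rollout(state, heuristic)
    best_seen = float("-inf")
    while not state.is_terminal():
        # Evaluate each move by a lower-level rollout from the resulting state.
        scored = [(nested_rollout(state.apply(m), level - 1, heuristic), m)
                  for m in state.moves()]
        value, move = max(scored, key=lambda vm: vm[0])
        best_seen = max(best_seen, value)
        state = state.apply(move)   # commit to the best-scoring move
    return max(best_seen, state.score())
```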
Altered metabolite levels in cancer: implications for tumour biology and cancer therapy | Altered cell metabolism is a characteristic feature of many cancers. Aside from well-described changes in nutrient consumption and waste excretion, altered cancer cell metabolism also results in changes to intracellular metabolite concentrations. Increased levels of metabolites that result directly from genetic mutations and cancer-associated modifications in protein expression can promote cancer initiation and progression. Changes in the levels of specific metabolites, such as 2-hydroxyglutarate, fumarate, succinate, aspartate and reactive oxygen species, can result in altered cell signalling, enzyme activity and/or metabolic flux. In this Review, we discuss the mechanisms that lead to changes in metabolite concentrations in cancer cells, the consequences of these changes for the cells and how they might be exploited to improve cancer therapy. |
Can Programming Be Liberated, Period? | We have come a long way since programming had to be done by tediously listing machine-level instructions that prescribed how a specific computer was to modify and move bits and words in its memory. The author describes his dream of freeing ourselves from the straitjackets of programming, making the process of getting computers to do what we want intuitive, natural, and also fun. He recommends harnessing the great power of computing to transform a natural and almost playful means of programming so that it becomes fully operational and machine-doable. Once liberated, programmers will probably have new kinds of work to do, possibly including the need to set up specialized features of the new sophisticated computational tools that would be running in the background.
Detection and Tracking of Moving Objects Using 2.5D Motion Grids | Autonomous vehicles require a reliable perception of their environment to operate in real-world conditions. Awareness of moving objects is one of the key components of environment perception. This paper proposes a method for the detection and tracking of moving objects (DATMO) in dynamic environments surrounding a moving road vehicle equipped with a Velodyne laser scanner and a GPS/IMU localization system. First, at every time step, a local 2.5D grid is built using the latest sets of sensor measurements. Over time, the generated grids, combined with localization data, are integrated into an environment model called a local 2.5D map. In every frame, a 2.5D grid is compared with an updated 2.5D map to compute a 2.5D motion grid. A mechanism based on spatial properties is presented to suppress false detections that are due to small localization errors. Next, the 2.5D motion grid is post-processed to provide an object-level representation of the scene. The detected moving objects are tracked over time by applying data association and Kalman filtering. Experiments conducted on different sequences from the KITTI dataset showed promising results, demonstrating the applicability of the proposed method.
MODELING WATER QUALITY USING TERRA/MODIS 500 M SATELLITE IMAGES | A study was conducted in Hong Kong with the aim of deriving algorithms for the retrieval of turbidity, chlorophyll-a and suspended solids concentrations from Terra/MODIS 500 m level 1B reflectance data. Rigorous atmospheric correction using a radiative transfer model, coupled with inputs of AERONET data and MODIS atmospheric products, was carried out to derive the water-leaving reflectances. The in-situ measurements were compared with coincident water-leaving reflectances using different sets of empirical algorithms. Due to the high variation of water quality in Hong Kong, the in-situ data were divided into three groups based on the spatial locations of the measurements: east zone (similar to case I water), west zone (similar to case II water) and harbour zone (in between case I and case II water). Significant correlations were observed between water-leaving reflectances and in-situ marine data in the three zones, while only moderate correlations were found for the entire dataset. These results suggest that a single empirical algorithm may not be applicable to areas with high variation in water quality, and that MODIS 500 m data are feasible for water quality retrieval at a local scale.
Outcomes of lower extremity bypass performed for acute limb ischemia. | OBJECTIVE
Acute limb ischemia remains one of the most challenging emergencies in vascular surgery. Historically, outcomes following interventions for acute limb ischemia have been associated with high rates of morbidity and mortality. The purpose of this study was to determine contemporary outcomes following lower extremity bypass performed for acute limb ischemia.
METHODS
All patients undergoing infrainguinal lower extremity bypass between 2003 and 2011 within hospitals comprising the Vascular Study Group of New England were identified. Patients were stratified according to whether or not the indication for lower extremity bypass was acute limb ischemia. Primary end points included bypass graft occlusion, major amputation, and mortality at 1 year postoperatively as determined by Kaplan-Meier life table analysis. Multivariable Cox proportional hazards models were constructed to evaluate independent predictors of mortality and major amputation at 1 year.
RESULTS
Of 5712 lower extremity bypass procedures, 323 (5.7%) were performed for acute limb ischemia. Patients undergoing lower extremity bypass for acute limb ischemia were similar in age (66 vs 67; P = .084) and sex (68% male vs 69% male; P = .617) compared with chronic ischemia patients, but were less likely to be on aspirin (63% vs 75%; P < .0001) or a statin (55% vs 68%; P < .0001). Patients with acute limb ischemia were more likely to be current smokers (49% vs 39%; P < .0001), to have had a prior ipsilateral bypass (33% vs 24%; P = .004) or a prior ipsilateral percutaneous intervention (41% vs 29%; P = .001). Bypasses performed for acute limb ischemia were longer in duration (270 vs 244 minutes; P = .007), had greater blood loss (363 vs 272 mL; P < .0001), and more commonly utilized prosthetic conduits (41% vs 33%; P = .003). Acute limb ischemia patients experienced increased in-hospital major adverse events (20% vs 12%; P < .0001) including myocardial infarction, congestive heart failure exacerbation, deterioration in renal function, and respiratory complications. Patients who underwent lower extremity bypass for acute limb ischemia had no difference in rates of graft occlusion (18.1% vs 18.5%; P = .77), but did have significantly higher rates of limb loss (22.4% vs 9.7%; P < .0001) and mortality (20.9% vs 13.1%; P < .0001) at 1 year. On multivariable analysis, acute limb ischemia was an independent predictor of both major amputation (hazard ratio, 2.16; confidence interval, 1.38-3.40; P = .001) and mortality (hazard ratio, 1.41; confidence interval, 1.09-1.83; P = .009) at 1 year.
CONCLUSIONS
Patients who present with acute limb ischemia represent a less medically optimized subgroup within the population of patients undergoing lower extremity bypass. These patients may be expected to have more complex operations followed by increased rates of perioperative adverse events. Additionally, despite equivalent graft patency rates, patients undergoing lower extremity bypass for acute ischemia have significantly higher rates of major amputation and mortality at 1 year. |
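For readers unfamiliar with the multivariable Cox proportional hazards modeling used in the study above, a minimal sketch with the lifelines library looks as follows. The column names and toy data are placeholders, not the study's registry data, and a small penalizer is added only to keep the toy fit stable.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_event": [0.5, 1.0, 0.8, 1.0, 0.3, 1.0],  # follow-up time
    "amputation":     [1,   0,   1,   0,   1,   0  ],  # event indicator
    "acute_ischemia": [1,   0,   1,   0,   1,   1  ],  # covariate of interest
    "age":            [66,  67,  70,  60,  72,  65 ],
    "current_smoker": [1,   0,   1,   1,   0,   0  ],
})

cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="years_to_event", event_col="amputation")
cph.print_summary()  # hazard ratios with confidence intervals, as reported above
```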
Squidy: a zoomable design environment for natural user interfaces | We introduce the interaction library Squidy, which eases the design of natural user interfaces by unifying relevant frameworks and toolkits in a common library. Squidy provides a central design environment based on high-level visual data flow programming combined with zoomable user interface concepts. The user interface offers a simple visual language and a collection of ready-to-use devices, filters and interaction techniques. The concept of semantic zooming enables nevertheless access to more advanced functionality on demand. Thus, users are able to adjust the complexity of the user interface to their current need and knowledge. |
LCL Filter Design and Performance Analysis for Grid-Interconnected Systems | The use of power converters is very important in maximizing the power transfer from renewable energy sources such as wind, solar, or even a hydrogen-based fuel cell to the utility grid. An LCL filter is often used to interconnect an inverter to the utility grid in order to filter the harmonics produced by the inverter. Although there is an extensive amount of literature available describing LCL filters, there has been a gap in providing a systematic design methodology. Furthermore, there has been a lack of a state-space mathematical modeling approach that considers practical cases of delta- and wye-connected capacitors showing their effects on possible grounding alternatives. This paper describes a design methodology of an LCL filter for grid-interconnected inverters along with a comprehensive study of how to mitigate harmonics. The procedures and techniques described in this paper may be used in small-scale renewable energy conversion systems and may also be retrofitted for medium- and large-scale grid-connected systems.
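The abstract does not reproduce its design equations, but a central constraint in any such LCL design procedure is the filter's resonance frequency. The standard textbook relation, with generic symbol names assumed here, is:

```latex
% L_i: inverter-side inductance, L_g: grid-side inductance, C_f: filter capacitance
f_{\mathrm{res}} = \frac{1}{2\pi}\sqrt{\frac{L_i + L_g}{L_i\, L_g\, C_f}}
```

Design procedures commonly place this resonance well above the grid frequency and below half the switching frequency (a frequent rule of thumb is 10 f_grid < f_res < f_sw/2) so that it is excited neither by low-order grid harmonics nor by switching harmonics.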
Learning to Control a Low-Cost Manipulator using Data-Efficient Reinforcement Learning | Over the last years, there has been substantial progress in robust manipulation in unstructured environments. The long-term goal of our work is to get away from precise, but very expensive robotic systems and to develop affordable, potentially imprecise, self-adaptive manipulator systems that can interactively perform tasks such as playing with children. In this paper, we demonstrate how a low-cost off-the-shelf robotic system can learn closed-loop policies for a stacking task in only a handful of trials—from scratch. Our manipulator is inaccurate and provides no pose feedback. For learning a controller in the work space of a Kinect-style depth camera, we use a model-based reinforcement learning technique. Our learning method is data efficient, reduces model bias, and deals with several noise sources in a principled way during long-term planning. We present a way of incorporating state-space constraints into the learning process and analyze the learning gain by exploiting the sequential structure of the stacking task. |
Performance of a Distribution Intelligent Universal Transformer under Source and Load Disturbances | A bench model of the new-generation intelligent universal transformer (IUT) has recently been developed for distribution applications. The distribution IUT employs high-voltage semiconductor device technologies along with multilevel converter circuits for medium-voltage grid connection. This paper briefly describes the basic operation of the IUT and its experimental setup. Performance under source and load disturbances is characterized with extensive tests using a voltage sag generator and various linear and nonlinear loads. Experimental results demonstrate that the IUT's input and output each avoid direct impact from disturbances on the opposite side. The output voltage is well regulated when a voltage sag is applied to the input, and the input voltage and current remain cleanly sinusoidal at unity power factor when the output feeds a nonlinear load. Under load transients, the input and output voltages remain well regulated. These key features show that the power quality performance of the IUT is far superior to that of conventional copper-and-iron-based transformers.
Delving Deeper into Convolutional Networks for Learning Video Representations | We propose an approach to learn spatio-temporal features in videos from intermediate visual representations we call “percepts” using Gated-Recurrent-Unit Recurrent Networks (GRUs). Our method relies on percepts that are extracted from all levels of a deep convolutional network trained on the large ImageNet dataset. While high-level percepts contain highly discriminative information, they tend to have a low spatial resolution. Low-level percepts, on the other hand, preserve a higher spatial resolution from which we can model finer motion patterns. Using low-level percepts, however, can lead to high-dimensional video representations. To mitigate this effect and control the number of parameters, we introduce a variant of the GRU model that leverages convolution operations to enforce sparse connectivity of the model units and share parameters across the input spatial locations. We empirically validate our approach on both Human Action Recognition and Video Captioning tasks. In particular, we achieve results equivalent to the state of the art on the YouTube2Text dataset using a simpler caption-decoder model and without extra 3D CNN features.
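A minimal sketch of the convolutional GRU variant described above follows; shapes and names are illustrative, not the paper's code. The GRU's dense transforms are replaced by convolutions, so hidden units are sparsely connected and parameters are shared across spatial locations of each percept map.

```python
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        p = k // 2
        self.zr = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, k, padding=p)   # update/reset gates
        self.h_tilde = nn.Conv2d(in_ch + hid_ch, hid_ch, k, padding=p)  # candidate state

    def forward(self, x, h):
        z, r = torch.sigmoid(self.zr(torch.cat([x, h], dim=1))).chunk(2, dim=1)
        h_new = torch.tanh(self.h_tilde(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_new   # convex combination, as in a standard GRU

# One step over a batch of 32x32 percept maps: 64 input / 32 hidden channels.
cell = ConvGRUCell(64, 32)
x = torch.randn(8, 64, 32, 32)
h = torch.zeros(8, 32, 32, 32)
h = cell(x, h)
```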
Clustering of the self-organizing map | The self-organizing map (SOM) is an excellent tool in the exploratory phase of data mining. It projects the input space onto prototypes of a low-dimensional regular grid that can be effectively utilized to visualize and explore properties of the data. When the number of SOM units is large, similar units need to be grouped, i.e., clustered, to facilitate quantitative analysis of the map and the data. In this paper, different approaches to clustering of the SOM are considered. In particular, the use of hierarchical agglomerative clustering and partitive clustering using k-means are investigated. The two-stage procedure--first using the SOM to produce the prototypes, which are then clustered in the second stage--is found to perform well when compared with direct clustering of the data, and to reduce the computation time.
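A sketch of the two-stage procedure described above, using the MiniSom library for stage one and k-means for stage two; the hyperparameters and placeholder data are illustrative. (The paper also examines hierarchical agglomerative clustering of the prototypes.)

```python
import numpy as np
from minisom import MiniSom
from sklearn.cluster import KMeans

data = np.random.rand(5000, 10)                 # placeholder dataset

# Stage 1: train a SOM; its codebook vectors act as prototypes of the data.
som = MiniSom(15, 15, input_len=10, sigma=1.5, learning_rate=0.5)
som.train_random(data, num_iteration=10000)
prototypes = som.get_weights().reshape(-1, 10)  # 225 prototypes

# Stage 2: cluster the (few) prototypes instead of the (many) raw samples,
# which is what cuts the computation time.
km = KMeans(n_clusters=5, n_init=10).fit(prototypes)

# Assign each sample to the cluster of its best-matching prototype.
bmu = np.argmin(((data[:, None, :] - prototypes[None]) ** 2).sum(-1), axis=1)
labels = km.labels_[bmu]
```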
Is the cerebellum a smith predictor? | The motor system may use internal predictive models of the motor apparatus to achieve better control than would be possible by negative feedback. Several theories have proposed that the cerebellum may form these predictive representations. In this article, we review these theories and try to unify them by reference to an engineering control model known as a Smith Predictor. We suggest that the cerebellum forms two types of internal model. One model is a forward predictive model of the motor apparatus (e.g., limb and muscle), providing a rapid prediction of the sensory consequences of each movement. The second model is of the time delays in the control loop (due to receptor and effector delays, axonal conductances, and cognitive processing delays). This model delays a copy of the rapid prediction so that it can be compared in temporal register with actual sensory feedback from the movement. The result of this comparison is used both to correct for errors in performance and as a training signal to learn the first model. We discuss evidence that the cerebellum could form both of these models and suggest that the cerebellum may hold at least two separate Smith Predictors. One, in the lateral cerebellum, would predict the movement outcome in visual, egocentric, or peripersonal coordinates. Another, in the intermediate cerebellum, would predict the consequences in motor coordinates. Generalization of the Smith Predictor theory is discussed in light of cerebellar involvement in nonmotor control systems, including autonomic functions and cognition. |
Wavelet Trees: From Theory to Practice | The wavelet tree data structure is a space-efficient technique for rank and select queries that generalizes from binary characters to an arbitrary multicharacter alphabet. It has become a key tool in modern full-text indexing and data compression because of its capabilities in compressing, indexing, and searching. We present a comparative study of its practical performance regarding a wide range of options on the dimensions of different coding schemes and tree shapes. Our results are both theoretical and experimental: (1) We show that the run-length δ coding size of wavelet trees achieves the 0-order empirical entropy size of the original string with leading constant 1, when the string's 0-order empirical entropy is asymptotically less than the logarithm of the alphabet size. This result complements the previous works that are dedicated to analyzing run-length γ-encoded wavelet trees. It also reveals the scenarios in which run-length δ encoding becomes practical. (2) We introduce a full generic package of wavelet trees for a wide range of options on the dimensions of coding schemes and tree shapes. Our experimental study reveals the practical performance of the various modifications.
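To make the rank mechanism concrete, here is a toy pointer-based wavelet tree using plain 0/1 lists; real implementations use succinct, possibly run-length-coded bitvectors with O(1) rank, whereas the sum() calls below are O(n) and purely illustrative.

```python
class WaveletTree:
    def __init__(self, s, alphabet=None):
        self.alphabet = sorted(set(s)) if alphabet is None else alphabet
        if len(self.alphabet) > 1:
            mid = len(self.alphabet) // 2
            left_set = set(self.alphabet[:mid])
            # Route each symbol left (0) or right (1) by alphabet half.
            self.bits = [0 if c in left_set else 1 for c in s]
            self.left = WaveletTree([c for c in s if c in left_set],
                                    self.alphabet[:mid])
            self.right = WaveletTree([c for c in s if c not in left_set],
                                     self.alphabet[mid:])

    def rank(self, c, i):
        """Number of occurrences of symbol c in s[:i]."""
        if len(self.alphabet) == 1:
            return i
        mid = len(self.alphabet) // 2
        ones = sum(self.bits[:i])                 # toy rank over the bitmap
        if c in self.alphabet[:mid]:
            return self.left.rank(c, i - ones)    # zeros before position i
        return self.right.rank(c, ones)           # ones before position i

wt = WaveletTree("abracadabra")
assert wt.rank("a", 11) == 5 and wt.rank("b", 5) == 1
```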
Phase I study utilizing a novel antigen-presenting cell-targeted vaccine with Toll-like receptor stimulation to induce immunity to self-antigens in cancer patients. | PURPOSE
The use of tumor-derived proteins as cancer vaccines is complicated by tolerance to these self-antigens. Tolerance may be broken by immunization with activated, autologous, ex vivo generated and antigen-loaded, antigen-presenting cells (APC); however, targeting tumor antigen directly to APC in vivo would be a less complicated strategy. We wished to test whether targeted delivery of an otherwise poorly immunogenic, soluble antigen to APC through their mannose receptors (MR) would induce clinically relevant immunity.
EXPERIMENTAL DESIGN
Two phase I studies were conducted with CDX-1307, a vaccine composed of human chorionic gonadotropin beta-chain (hCG-β) fused to an MR-specific monoclonal antibody, administered either locally (intradermally) or systemically (intravenously) in patients with advanced epithelial malignancies. An initial dose escalation of single-agent CDX-1307 was followed by additional cohorts of CDX-1307 combined with granulocyte-macrophage colony-stimulating factor (GM-CSF) and the Toll-like receptor (TLR) 3 agonist polyinosinic-polycytidylic acid (poly-ICLC) and TLR7/8 agonist resiquimod to activate the APC.
RESULTS
CDX-1307 induced consistent humoral and T-cell responses to hCG-β when coadministered with TLR agonists. Greater immune responses and clinical benefit, including the longest duration of stable disease, were observed with immunization combined with local TLR agonists. Immune responses were induced equally efficiently in patients with elevated and nonelevated levels of serum hCG-β. Antibodies within the serum of vaccinated participants had tumor suppressive function in vitro. Toxicity consisted chiefly of mild injection site reactions.
CONCLUSIONS
APC targeting and activation induce adaptive immunity against poorly immunogenic self-antigens which has implications for enhancing the efficacy of cancer immunotherapy. |
SVM-Based Comments Classification and Mining of Virtual Community: The Case of Sentiment Classification of Hotel Reviews | Comments in virtual communities are very important to the operation and development of an enterprise. By analyzing the sentiment of these comments, we can mine users' semantic orientation and their preferences and dislikes, which is helpful both for enterprises seeking to improve their products and services and for individuals wishing to use this information rationally. This paper takes hotel reviews from a virtual community as an example, uses support vector machines (SVM) and sentiment classification techniques to build a small domain-specific sentiment corpus, and proposes a sentiment classification model based on this corpus to achieve feature dimension reduction. We validate the proposed method on a hotel review corpus; the experimental results show that the accuracy of the SVM algorithm is enhanced by adding a user sentiment corpus. The method can be further extended to comments from other virtual communities.
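A minimal sketch of an SVM sentiment classifier for hotel reviews in the spirit of the approach above; TF-IDF features stand in for the paper's corpus-based sentiment features, and the tiny labeled dataset is a placeholder.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

reviews = ["great room and friendly staff", "dirty bathroom, rude staff",
           "loved the breakfast", "never coming back, awful service"]
labels = [1, 0, 1, 0]   # 1 = positive, 0 = negative

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(reviews, labels)
print(clf.predict(["the staff was friendly and the room was great"]))
```

The paper's refinement is to augment such a baseline with a small domain-specific sentiment corpus, which it reports improves accuracy.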
Psychosocial Experiences Associated With Confirmed and Self-Identified Dyslexia: A Participant-Driven Concept Map of Adult Perspectives | Concept mapping (a mixed qualitative–quantitative methodology) was used to describe and understand the psychosocial experiences of adults with confirmed and self-identified dyslexia. Using innovative processes of art and photography, Phase 1 of the study included 15 adults who participated in focus groups and in-depth interviews and were asked to elucidate their experiences with dyslexia. On index cards, 75 statements and experiences with dyslexia were recorded. The second phase of the study included 39 participants who sorted these statements into self-defined categories and rated each statement to reflect their personal experiences to produce a visual representation, or concept map, of their experience. The final concept map generated nine distinct cluster themes: Organization Skills for Success; Finding Success; A Good Support System Makes the Difference; On Being Overwhelmed; Emotional Downside; Why Can’t They See It?; Pain, Hurt, and Embarrassment From Past to Present; Fear of Disclosure; and Moving Forward. Implications of these findings are discussed.
Novel metabolic risk factors for incident heart failure and their relationship with obesity: the MESA (Multi-Ethnic Study of Atherosclerosis) study. | OBJECTIVES
The objectives of this study were to determine the associations of the metabolic syndrome, inflammatory markers, and insulin resistance with incident congestive heart failure (CHF), beyond established risk factors, and to examine whether these risk factors may provide the link between obesity and CHF.
BACKGROUND
Recently, increasing interest has emerged on the potential role of novel risk factors such as systemic inflammation, insulin resistance, and albuminuria in the pathophysiology of CHF and their relationship with obesity.
METHODS
The MESA (Multi-Ethnic Study of Atherosclerosis) study is a community-based multicenter cohort study of 6,814 participants (age 45 to 84 years, 3,601 women) of 4 ethnicities: Caucasians, African Americans, Hispanics, and Chinese Americans. Participants were recruited between 2000 and 2002 from 6 U.S. communities. Median follow-up time was 4 years. Participants with history of symptomatic cardiovascular disease were excluded. Cox proportional hazards models were used to analyze the associations of the metabolic syndrome, inflammatory markers, insulin resistance, and albuminuria with incident CHF, independent of established risk factors (age, gender, hypertension, diabetes mellitus, left ventricular hypertrophy, obesity, serum total cholesterol, and smoking), an interim myocardial infarction, and baseline magnetic resonance imaging parameters of left ventricular structure and function.
RESULTS
A total of 79 participants developed CHF during follow-up, and 26 participants (32.9%) had a myocardial infarction prior to CHF; 65% of the cases had CHF with preserved function (left ventricular ejection fraction ≥40%). In multivariable analyses, serum interleukin-6 (hazard ratio [HR] for 1 standard deviation 1.50, 95% confidence interval [CI] 1.10 to 2.03) or C-reactive protein (HR for 1 standard deviation 1.38; 95% CI 1.01 to 1.86) and macroalbuminuria (HR 4.31, 95% CI 1.58 to 11.76) were predictors of CHF, independent of obesity and the other established risk factors. Although obesity was significantly associated with incident CHF, this association was no longer significant after adding inflammatory markers (interleukin-6 or C-reactive protein) to the model.
CONCLUSIONS
Inflammatory markers and albuminuria are independent predictors of CHF. The association of obesity and CHF may be related to pathophysiologic pathways associated with inflammation. |
Light Field Photography with a Hand-held Plenoptic Camera | This paper presents a camera that samples the 4D light field on its sensor in a single photographic exposure. This is achieved by inserting a microlens array between the sensor and main lens, creating a plenoptic camera. Each microlens measures not just the total amount of light deposited at that location, but how much light arrives along each ray. By re-sorting the measured rays of light to where they would have terminated in slightly different, synthetic cameras, we can compute sharp photographs focused at different depths. We show that a linear increase in the resolution of images under each microlens results in a linear increase in the sharpness of the refocused photographs. This property allows us to extend the depth of field of the camera without reducing the aperture, enabling shorter exposures and lower image noise. Especially in the macrophotography regime, we demonstrate that we can also compute synthetic photographs from a range of different viewpoints. These capabilities argue for a different strategy in designing photographic imaging systems. To the photographer, the plenoptic camera operates exactly like an ordinary hand-held camera. We have used our prototype to take hundreds of light field photographs, and we present examples of portraits, high-speed action and macro close-ups.
Crystalloid fluid therapy | This article is one of ten reviews selected from the Annual Update in Intensive Care and Emergency Medicine 2016. Other selected articles can be found online at http://www.biomedcentral.com/collections/annualupdate2016. Further information about the Annual Update in Intensive Care and Emergency Medicine is available from http://www.springer.com/series/8901.
Changing Ethical Attitudes: The Case of the Enron and ImClone Scandals | Objective. We analyze the process of changing ethical attitudes over time by focusing on a specific set of “natural experiments” that occurred over an 18-month period, namely, the accounting scandals involving Enron/Arthur Andersen and the insider-trading allegations related to ImClone. Methods. Given the amount of media attention devoted to these ethical scandals, we test whether respondents in a cross-sectional sample taken over 18 months become less accepting of ethically charged vignettes dealing with “accounting tricks” and “insider trading” over time. Results. We find a significant and gradual decline in the acceptance of the vignettes over the 18-month period. Conclusions. Findings presented here may provide valuable insight into potential triggers of changing ethical attitudes. An intriguing implication of these results is that recent highly publicized ethical breaches may not be only a symptom, but also a cause, of changing attitudes.
Moo: A Batteryless Computational RFID and Sensing Platform | The UMass Moo is a passively powered computational RFID that harvests RFID reader energy from the UHF band, communicates with an RFID reader, and processes data from its onboard sensors. Its function can be extended via its general-purpose I/Os, serial buses, and 12-bit ADC/DAC ports. Based on the Intel DL WISP (revision 4.1), the Moo provides an RFID-scale, reprogrammable, batteryless sensing platform. This report compares the Moo to its ancestor, documents our design decisions, and details the Moo's compatibility with other devices. It is meant to be a companion document for the open-source release of code and specifications for the Moo (revision 1.x). We made an initial batch of Moo 1.1 hardware available to other researchers in June 2011.
Puberty-related influences on brain development | Puberty is a time of striking changes in cognition and behavior. To indirectly assess the effects of puberty-related influences on the neuroanatomy underlying these behavioral changes, we review and synthesize neuroimaging data from typically developing children and adolescents and from those with anomalous hormone or sex chromosome profiles. The trajectories (size by age) of brain morphometry differ between boys and girls, with girls generally reaching peak gray matter thickness 1-2 years earlier than boys. Both boys and girls with congenital adrenal hyperplasia (CAH; characterized by high levels of intrauterine testosterone) have smaller amygdala volumes, but the brain morphometry of girls with CAH did not otherwise differ significantly from controls. Subjects with XXY have gray matter reductions in the insula, temporal gyri, amygdala, hippocampus, and cingulate, areas consistent with the language-based learning difficulties common in this group.
The MOM Project: delivering maternal health services among internally displaced populations in eastern Burma. | Alternative strategies to increase access to reproductive health services among internally displaced populations are urgently needed. In eastern Burma, continuing conflict and lack of functioning health systems render the emphasis on facility-based delivery with skilled attendants unfeasible. Along the Thailand-Burma border, local organisations have implemented an innovative pilot, the Mobile Obstetric Maternal Health Workers (MOM) Project, establishing a three-tiered collaborative network of community-based reproductive health workers. Health workers from local organisations received practical training in basic emergency obstetric care plus blood transfusion, antenatal care and family planning at a central facility. After returning to their target communities inside Burma, these first-tier maternal health workers trained a second tier of local health workers and a third tier of traditional birth attendants (TBAs) to provide a limited subset of these interventions, depending on their level of training. In this ongoing project, close communication between health workers and TBAs promotes acceptance and coverage of maternity services throughout the community. We describe the rationale, design and implementation of the project and a parallel monitoring plan for evaluation of the project. This innovative obstetric health care delivery strategy may serve as a model for the delivery of other essential health services in this population and for increasing access to care in other conflict settings. |
Oncoplastic mammoplasty as a strategy for reducing reconstructive complications associated with postmastectomy radiation therapy. | Given the high complication rates in patients who require radiation therapy (XRT) after mastectomy and immediate reconstruction, and the low local recurrence rates following neo-adjuvant chemotherapy and breast conservation therapy, we sought to determine if using neo-adjuvant chemotherapy and oncoplastic mammoplasty as an alternative to mastectomy and immediate reconstruction is an effective strategy for reducing complication rates in the setting of XRT. A prospectively maintained data base was queried for patients who received neo-adjuvant chemotherapy and XRT between 2001 and 2010 and underwent either oncoplastic mammoplasty or mastectomy with immediate reconstruction. Rates of postoperative complications between groups were compared using Fisher's exact test. Outcomes from 37 patients who underwent oncoplastic mammoplasty were compared to 64 patients who underwent mastectomy with immediate reconstruction. Mean follow-up was 33 months (range 4-116 months). Rates of postoperative complications, including unplanned operative intervention for a reconstructive complication (2.7% versus 37.5%, p < 0.001), skin flap necrosis (10.8% versus 29.7%, p = 0.05), and infection (16.2% versus 35.9, p = 0.04) were significantly higher in the mastectomy group. Overall, 45.3% of patients who underwent mastectomy developed at least one breast complication, compared to 18.9% of patients who underwent oncoplastic mammoplasty (p = 0.01). If XRT is indicated after mastectomy, attempts should be made to achieve breast conservation through the use of neo-adjuvant therapy and oncoplastic surgery in order to optimize surgical outcomes. Breast conservation with oncoplastic reconstruction does not compromise oncologic outcome, but significantly reduces complications compared to postmastectomy reconstruction followed by XRT. |
Auto-Detect: Data-Driven Error Detection in Tables | Given a single column of values, existing approaches typically employ regex-like rules to detect errors by finding anomalous values inconsistent with others. Such techniques make local decisions based only on values in the given input column, without considering a more global notion of compatibility that can be inferred from large corpora of clean tables. We propose Auto-Detect, a statistics-based technique that leverages co-occurrence statistics from large corpora for error detection, which is a significant departure from existing rule-based methods. Our approach can automatically detect incompatible values by leveraging an ensemble of judiciously selected generalization languages, each of which uses different generalizations and is sensitive to different types of errors. Errors detected this way are grounded in global statistics, making the approach robust and well aligned with human intuition about errors. We test Auto-Detect on a large set of public Wikipedia tables, as well as proprietary enterprise Excel files. While both of these test sets are supposed to be of high quality, Auto-Detect surfaces tens of thousands of errors in both cases, which manual verification shows to be of high precision (over 0.98). Our labeled benchmark set on Wikipedia tables is released for future research.
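To make the idea concrete, here is a minimal sketch of generalization-language-based detection: each language rewrites values into patterns at a different granularity, and a value is flagged only if its pattern is rare under every language. The two toy languages, the support threshold, and the within-column counts (standing in for the paper's corpus-wide statistics) are all illustrative assumptions:

```python
# Minimal sketch of ensemble pattern-based incompatibility detection.
# The paper's corpus-driven co-occurrence statistics are replaced here by
# simple within-column pattern counts for illustration.
import re
from collections import Counter

def generalize_coarse(v: str) -> str:
    # Letter runs -> "L", then digit runs -> "D" (this order avoids
    # remapping the freshly inserted "D" symbols).
    return re.sub(r"\d+", "D", re.sub(r"[A-Za-z]+", "L", v))

def generalize_fine(v: str) -> str:
    # Per-character variant that preserves length information.
    return re.sub(r"\d", "D", re.sub(r"[A-Za-z]", "L", v))

def flag_outliers(column, min_support=0.3):
    """Flag values whose pattern is rare under *every* generalization."""
    flagged = set(column)
    for gen in (generalize_coarse, generalize_fine):
        counts = Counter(gen(v) for v in column)
        n = len(column)
        # A value survives if at least one language finds its pattern common.
        flagged &= {v for v in column if counts[gen(v)] / n < min_support}
    return flagged

dates = ["2018-01-02", "2018-01-03", "2018-1-03", "Jan 4, 2018"]
print(flag_outliers(dates))  # -> {'Jan 4, 2018'}: rare under both languages
```

Note that "2018-1-03" is rare under the fine language but common under the coarse one, so it is not flagged; arbitrating this kind of disagreement is what the ensemble is for.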
A Deep Learning Model for Epigenomic Studies | Epigenetics is the study of heritable changes in gene expression that do not involve changes to the underlying DNA sequence, i.e. a change in phenotype not caused by a change in genotype. At least three main factors are responsible for epigenetic change, including DNA methylation, histone modification and non-coding RNA, each sharing the property of affecting the dynamics of the chromatin structure by acting on nucleosome positions. A nucleosome is a DNA-histone complex around which approximately 150 base pairs of double-stranded DNA are wrapped. The role of nucleosomes is to pack the DNA into the nucleus of eukaryotic cells to form the chromatin. Nucleosome positioning plays an important role in gene regulation, and several studies show that distinct DNA sequence features are associated with nucleosome presence. Starting from this observation, the identification of nucleosomes on a genomic scale has been successfully performed by combining DNA sequence feature representations with classical supervised classification methods such as Support Vector Machines and logistic regression. Given the successful application of deep neural networks to several challenging classification problems, in this paper we study how deep learning networks can help in the identification of nucleosomes.
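Since the abstract does not specify the paper's network, the following is only a minimal sketch, under assumed architecture and hyperparameters, of how a 1-D convolutional network can classify one-hot-encoded DNA windows as nucleosome versus linker:

```python
# Minimal sketch (not the paper's architecture) of a 1-D CNN classifying
# one-hot-encoded DNA as nucleosomal vs. linker. Requires PyTorch.
import torch
import torch.nn as nn

def one_hot(seq: str) -> torch.Tensor:
    """Encode an ACGT string as a (4, len) float tensor."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    t = torch.zeros(4, len(seq))
    for i, base in enumerate(seq):
        t[idx[base], i] = 1.0
    return t

class NucleosomeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(4, 32, kernel_size=9, padding=4),  # learned motif detectors
            nn.ReLU(),
            nn.MaxPool1d(3),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),  # global pooling -> length-independent
            nn.Flatten(),
            nn.Linear(64, 1),         # single logit: nucleosome vs. linker
        )

    def forward(self, x):
        return self.net(x)

model = NucleosomeCNN()
seq = ("ACGT" * 37)[:147]          # toy 147-bp window (~150 bp per nucleosome)
logit = model(one_hot(seq)[None])  # add batch dimension
print(torch.sigmoid(logit))        # untrained, so ~0.5
```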
Low-Quality Product Review Detection in Opinion Summarization | Product reviews posted at online shopping sites vary greatly in quality. This paper addresses the problem of detecting low-quality product reviews. Three types of biases in the existing evaluation standard of product reviews are discovered. To assess the quality of product reviews, a set of specifications for judging the quality of reviews is first defined. A classification-based approach is proposed to detect the low-quality reviews. We apply the proposed approach to enhance opinion summarization in a two-stage framework. Experimental results show that the proposed approach effectively (1) discriminates low-quality reviews from high-quality ones and (2) enhances the task of opinion summarization by detecting and filtering low-quality reviews.
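A minimal sketch of the classification-plus-filtering pipeline follows; the paper's specification-based quality features are not reproduced here, so plain TF-IDF text features and toy labels stand in as assumptions:

```python
# Sketch of a classification-based low-quality review filter feeding an
# opinion-summarization stage. TF-IDF features and the toy labels are
# illustrative stand-ins for the paper's specification-based features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "Battery lasts two full days of heavy use; camera struggles in low light.",
    "good",
    "Sound is balanced overall, though the bass overwhelms vocals at volume.",
    "buy from my shop cheap!!!",
]
labels = [1, 0, 1, 0]  # 1 = high quality, 0 = low quality (toy labels)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(reviews, labels)

# Stage 1: filter out predicted low-quality reviews.
kept = [r for r in reviews if clf.predict([r])[0] == 1]
# Stage 2 (not shown): summarize opinions over `kept` only.
print(kept)
```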
Towards an ISO Standard for Dialogue Act Annotation | This paper describes an ISO project developing an international standard for annotating dialogue with semantic information, in particular concerning the communicative functions of the utterances, the kind of content they address, and the dependency relations to what was said and done earlier in the dialogue. The project, registered as ISO 24617-2 "Semantic annotation framework, Part 2: Dialogue acts", is currently at the DIS (Draft International Standard) stage.
Mitigating Docker Security Issues | It is very easy to run applications in Docker. Docker offers an ecosystem for packaging, distributing and managing applications within containers. However, the Docker platform is not yet mature: at present, Docker is less secure than virtual machines (VMs) and most other cloud technologies. The key reason for Docker's inadequate security is the containers' sharing of the Linux kernel, which can lead to privilege escalation. This research outlines major security vulnerabilities in Docker and countermeasures to neutralize such attacks. Security attacks come in several varieties, including insider and outsider attacks; this research outlines both types and their mitigation strategies, since taking some precautionary measures can avert major disasters. It also presents Docker secure deployment guidelines, which suggest different configurations for deploying Docker containers in a more secure way.
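As a concrete illustration of the deployment-guideline theme, here is a sketch of a hardened `docker run` invocation using standard Docker CLI flags (dropped capabilities, no privilege escalation, read-only filesystem, resource limits); the image, user ID and limit values are illustrative assumptions:

```python
# Launching a container with common hardening flags from the Docker CLI.
# Image name, user ID and limit values are illustrative choices.
import subprocess

cmd = [
    "docker", "run", "--rm",
    "--user", "1000:1000",                       # do not run as root in-container
    "--cap-drop", "ALL",                         # drop all Linux capabilities
    "--security-opt", "no-new-privileges:true",  # block privilege escalation
    "--read-only",                               # immutable root filesystem
    "--memory", "256m", "--pids-limit", "100",   # resource limits
    "--network", "none",                         # no network unless required
    "alpine:3.19", "echo", "hardened container",
]
subprocess.run(cmd, check=True)
```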
Pancreatic exocrine dysfunction: common in type 3c diabetes, but don't forget types 1 and 2 diabetes mellitus | It is well understood that the pancreas has two distinct roles: the endocrine and exocrine functions, which are functionally and anatomically closely related. As specialists in diabetes care, we are adept at managing pancreatic endocrine failure and its associated complications. However, there is frequent overlap, and many patients with diabetes also suffer from exocrine insufficiency. Here we outline the different causes of exocrine failure, and in particular that associated with type 1 and type 2 diabetes, and how this differs from diabetes that is caused by pancreatic exocrine disease: type 3c diabetes.
Effects of theophylline on plasma levels of interleukin-4, cyclic nucleotides and pulmonary functions in patients with chronic obstructive pulmonary disease | In order to measure the plasma levels of interleukin-4 (IL-4), cyclic adenosine monophosphate (cAMP) and cyclic guanosine monophosphate (cGMP) in patients with chronic obstructive pulmonary disease (COPD) and observe the effects of oral theophylline on them, we divided 28 COPD patients into a COPD experimental group and a COPD control group. Plasma levels of IL-4, cAMP and cGMP as well as parameters of pulmonary function tests were measured in these two groups before and after 2 weeks of treatment with oral theophylline (Protheo, 400 mg, qd) or placebo. Plasma levels of IL-4 and cGMP were significantly elevated in patients with COPD as compared with normal controls (P < 0.05), while cAMP and the cAMP/cGMP ratio were significantly lower than those in controls (P < 0.01). The plasma level of IL-4 was inversely correlated with forced expiratory volume in the first second (FEV1) and with maximum expiratory flow rate at 50% of forced vital capacity (V50) (both r = −0.46, P < 0.05), while it was directly correlated with the scores of the clinical manifestations (r = 0.57, P < 0.05) in COPD patients. Two weeks after treatment with theophylline, IL-4 and cGMP in the COPD experimental group were decreased significantly while cAMP and cAMP/cGMP increased significantly (P < 0.05). The change in IL-4 was inversely correlated with the changes in FEV1 and V50 (r = −0.53 and −0.54, respectively, P < 0.05). Two weeks after placebo treatment, the COPD control group did not show such changes. We conclude that IL-4 might play a role in the pathogenesis of the airway inflammation and airflow limitation in COPD patients, and that the mechanism of theophylline's therapeutic effect of attenuating airflow limitation may partially depend on its anti-inflammatory effects on the airways, which, in turn, are dependent on its inhibitory effects on some inflammatory mediators such as IL-4.
Identification of a DNA-binding site for the transcription factor Haa1, required for Saccharomyces cerevisiae response to acetic acid stress | The transcription factor Haa1 is the main player in reprogramming yeast genomic expression in response to acetic acid stress. Mapping of the promoter region of one of the Haa1-activated genes, TPO3, allowed the identification of an acetic acid responsive element (ACRE) to which Haa1 binds in vivo. The in silico analysis of the promoter regions of the genes of the Haa1-regulon led to the identification of an Haa1-responsive element (HRE) 5'-GNN(G/C)(A/C)(A/G)G(A/G/C)G-3'. Using surface plasmon resonance experiments and electrophoretic mobility shift assays it is demonstrated that Haa1 interacts with high affinity (K(D) of 2 nM) with the HRE motif present in the ACRE region of TPO3 promoter. No significant interaction was found between Haa1 and HRE motifs having adenine nucleotides at positions 6 and 8 (K(D) of 396 and 6780 nM, respectively) suggesting that Haa1p does not recognize these motifs in vivo. A lower affinity of Haa1 toward HRE motifs having mutations in the guanine nucleotides at position 7 and 9 (K(D) of 21 and 119 nM, respectively) was also observed. Altogether, the results obtained indicate that the minimal functional binding site of Haa1 is 5'-(G/C)(A/C)GG(G/C)G-3'. The Haa1-dependent transcriptional regulatory network active in yeast response to acetic acid stress is proposed. |
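The degenerate motifs reported above translate directly into pattern matching. A small sketch follows; the promoter sequence is invented for illustration, while the motif definitions come straight from the abstract:

```python
# Scanning a (hypothetical) promoter sequence for the reported Haa1 motifs.
import re

# Full HRE: 5'-GNN(G/C)(A/C)(A/G)G(A/G/C)G-3'
HRE = re.compile(r"G[ACGT]{2}[GC][AC][AG]G[AGC]G")
# Minimal functional binding site: 5'-(G/C)(A/C)GG(G/C)G-3'
MINIMAL = re.compile(r"[GC][AC]GG[GC]G")

promoter = "TTACGGAAGCGGAGATCAGGGGTTA"  # invented sequence containing both motifs

for name, pattern in [("HRE", HRE), ("minimal site", MINIMAL)]:
    for m in pattern.finditer(promoter):
        print(f"{name} at position {m.start()}: {m.group()}")
# -> HRE at position 5: GAAGCGGAG
# -> minimal site at position 16: CAGGGG
```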
From Movement to Thought: Executive Function, Embodied Cognition, and the Cerebellum | This paper posits that the brain evolved for the control of action rather than for the development of cognition per se. We note that the terms commonly used to describe brain–behavior relationships define, and in many ways limit, how we conceptualize and investigate them and may therefore constrain the questions we ask and the utility of the “answers” we generate. Many constructs are so nonspecific and over-inclusive as to be scientifically meaningless. “Executive function” is one such term in common usage. As the construct is increasingly focal in neuroscience research, defining it clearly is critical. We propose a definition that places executive function within a model of continuous sensorimotor interaction with the environment. We posit that control of behavior is the essence of “executive function,” and we explore the evolutionary advantage conferred by being able to anticipate and control behavior with both implicit and explicit mechanisms. We focus on the cerebellum's critical role in these control processes. We then hypothesize about the ways in which procedural (skill) learning contributes to the acquisition of declarative (semantic) knowledge. We hypothesize how these systems might interact in the process of grounding knowledge in sensorimotor anticipation, thereby directly linking movement to thought and “embodied cognition.” We close with a discussion of ways in which the cerebellum instructs frontal systems how to think ahead by providing anticipatory control mechanisms, and we briefly review this model's potential applications. |