title | abstract
---|---
Relativistic theory of tidal Love numbers | In Newtonian gravitational theory, a tidal Love number relates the mass multipole moment created by tidal forces on a spherical body to the applied tidal field. The Love number is dimensionless, and it encodes information about the body’s internal structure. We present a relativistic theory of Love numbers, which applies to compact bodies with strong internal gravities; the theory extends and completes a recent work by Flanagan and Hinderer, which revealed that the tidal Love number of a neutron star can be measured by Earth-based gravitational-wave detectors. We consider a spherical body deformed by an external tidal field, and provide precise and meaningful definitions for electric-type and magnetic-type Love numbers; and these are computed for polytropic equations of state. The theory applies to black holes as well, and we find that the relativistic Love numbers of a nonrotating black hole are all zero. |
Relaxation of an induced defense after exclusion of herbivores: spines on Acacia drepanolobium | Descriptive and experimental evidence suggests that spine length is an inducible defense, with longer spines being produced by branches experiencing greater levels of herbivory. Here we present results from a replicated, controlled herbivore exclusion experiment in which cattle, wildlife (large mammalian herbivores), and megaherbivores (elephants and giraffes) were independently manipulated. Experimental wildlife barriers virtually eliminated herbivory on Acacia drepanolobium branches at all heights. Megaherbivore barriers reduced herbivory on branches more than 1.75 m from the ground by up to 80%, and reduced herbivory on lower branches by 40%. These patterns of herbivory were matched by patterns of relaxation of spine length that occurred in response to the treatments. After 22 months of herbivore exclusion, the lengths of newly produced spines were 19% shorter on branches protected from large mammal herbivory than on trees in control plots. On low branches, there was a steady increase in spine length from total exclusion plots (shortest spines) to plots with wildlife to plots with both megaherbivores and wildlife (longest spines). On higher branches, new spines were shorter in total exclusion plots and wildlife plots than in plots in which megaherbivores were allowed. This is the first replicated, controlled experimental demonstration that browsing by free-ranging herbivores is associated with greater spine lengths. Examination of trees incidentally protected from herbivory for several years suggests that reduction in spine length in the experimental plots will eventually exceed 70%. Initially slow relaxation of spine length may represent a cautious adaptive strategy in an environment where a given branch is likely to escape herbivory in a given growth season, even when herbivores are present. |
Hidden images | A hidden image is a form of artistic expression in which one or more secondary objects (or scenes) are hidden within a primary image. Features of the primary image, especially its edges and texture, are used to portray a secondary object. People can recognize both the primary and secondary intent in such pictures, although the time taken to do so depends on the prior experience of the viewer and the strength of the clues. Here, we present a system for creating such images. It relies on the ability of human perception to recognize an object, e.g. a human face, from incomplete edge information within its interior, rather than its outline. Our system detects edges of the object to be hidden, and then finds a place where it can be embedded within the scene, together with a suitable transformation for doing so, by optimizing an energy based on edge differences. Embedding is performed using a modified Poisson blending approach, which strengthens matched edges of the host image using edges of the object being embedded. We show various hidden images generated by our system. |
An admittance control scheme for haptic interfaces based on cable-driven parallel mechanisms | This paper presents a cable-driven parallel mechanism as a haptic interface and its underlying control method. This human-sized, three-degree-of-freedom mechanism has a tetrahedral architecture, four cables and evolves in three-dimensional space. A brief review of the kinematics of the mechanism is presented. Also, an admittance control law coupled with a closed-loop velocity controller is proposed. The control method is then refined by introducing adaptations for smooth surfaces and sharp edges. This control method is then validated by experimental results. Furthermore, the geometry of the mechanism is identified by a method that does not require any other sensor than the motor encoders. |
Improving Ship Stability in Automated Stowage Planning for Large Containerships | Stowage planning for container vessels concerns the core competence of the shipping lines. As such, automated stowage planning has attracted much research in the past two decades, but with few documented successes. In an ongoing project, we are developing a prototype stowage planning system aimed at large containerships. The system consists of three modules: the stowage plan generator, the stability adjustment module, and the optimization engine. This paper mainly focuses on the stability adjustment module. Its objective is to check the global ship stability of the stowage plan produced by the stowage plan generator and to resolve stability issues by applying a heuristic algorithm that searches for alternative feasible locations for containers that violate some of the stability criteria. We demonstrate that the proposed procedure is capable of solving the stability problems for a large containership with more than 5000 TEUs. Keywords— Automation, Stowage Planning, Local Search, Heuristic algorithm, Stability Optimization |
Position statement on concurrent objects for massively parallel architectures | This discussion reviews some features of objects and actors and attempts to answer remarks made by Carl Hewitt [Hewitt, 1988] concerning the superiority of actors over objects. Our interest in this discussion stems from research on a multicomputer being designed at Mississippi State University called the Mapped Array Differential Equation Machine (MADEM). The architecture for this floating-point-intensive scientific problem solver supports a fine-grained reactive message-passing programming model. Message delivery and process scheduling are performed by a node's I/O processor, which operates concurrently with a high-performance numeric processor. In choosing our model for concurrent computation we were influenced by research at Caltech and by the work of William Dally [Dally, 1986]. |
Experiment on the Automatic Detection of Function Clones in a Software System Using Metrics | This paper presents a technique to automatically identify duplicate and near-duplicate functions in a large software system. The identification technique is based on metrics extracted from the source code using the tool Datrix. This clone identification technique uses 21 function metrics grouped into four points of comparison. Each point of comparison is used to compare functions and determine their cloning level. An ordinal scale of eight cloning levels is defined. The levels range from an exact copy to distinct functions. The metrics, the thresholds and the process used are fully described. The results of applying the clone detection technique to two telecommunication monitoring systems totaling one million lines of source code are provided as examples. The information provided by this study is useful in monitoring the maintainability of large software systems. |
Gut microorganisms as promising targets for the management of type 2 diabetes | Each human intestine harbours not only hundreds of trillions of bacteria but also bacteriophage particles, viruses, fungi and archaea, which constitute a complex and dynamic ecosystem referred to as the gut microbiota. An increasing number of data obtained during the last 10 years have indicated changes in gut bacterial composition or function in type 2 diabetic patients. Analysis of this ‘dysbiosis’ enables the detection of alterations in specific bacteria, clusters of bacteria or bacterial functions associated with the occurrence or evolution of type 2 diabetes; these bacteria are predominantly involved in the control of inflammation and energy homeostasis. Our review focuses on two key questions: does gut dysbiosis truly play a role in the occurrence of type 2 diabetes, and will recent discoveries linking the gut microbiota to host health be helpful for the development of novel therapeutic approaches for type 2 diabetes? Here we review how pharmacological, surgical and nutritional interventions for type 2 diabetic patients may impact the gut microbiota. Experimental studies in animals are identifying which bacterial metabolites and components act on host immune homeostasis and glucose metabolism, primarily by targeting intestinal cells involved in endocrine and gut barrier functions. We discuss novel approaches (e.g. probiotics, prebiotics and faecal transfer) and the need for research and adequate intervention studies to evaluate the feasibility and relevance of these new therapies for the management of type 2 diabetes. |
A Survey of Current YouTube Video Characteristics | Given the impact of YouTube on Internet services and social networks, a healthy quantity of research has been conducted over the past few years. The majority of studies on traffic capture and evaluation were carried out prior to Google's acquisition of YouTube in 2007. Since then, there have been some changes made to the user policy and service infrastructure, including limits placed on video duration, file size, and resolution. This article depicts the latest YouTube traffic profiles and delivers updated and valuable information for future researchers. To obtain a detailed understanding of YouTube video characteristics, a customized Web spider was employed to crawl over a million YouTube videos. The study demonstrates consistency with previous research for major video streams while revealing that new categories of features have emerged within the YouTube service provision. Compared with traditional video repositories, YouTube exhibits many unique characteristics that could introduce novel challenges and opportunities for optimizing the performance of short video-sharing services. |
Deep Architectures for Face Attributes | We train a deep convolutional neural network to perform identity classification using a new dataset of public figures annotated with age, gender, ethnicity and emotion labels, and then fine-tune it for attribute classification. An optimal sharing pattern of computational resources within this network is determined by experiment, requiring only 1 GFLOP to produce all predictions. Rather than fine-tune by relearning weights in one additional layer after the penultimate layer of the identity network, we try several different depths for each attribute. We find that prediction of age and emotion is improved by fine-tuning from earlier layers onward, presumably because deeper layers are progressively invariant to non-identity-related changes in the input. |
Smallest control invariant set and error boundaries of FCS-MPC for PMSM | This paper studies the steady-state behavior of Finite Control Set (FCS) Model Predictive Control (MPC) for permanent-magnet synchronous machines (PMSM). The control actuation is formulated as a trajectory tracking problem in the αβ stator flux space. The concepts of set stability and the Control Invariant Set (CIS) are used to determine the smallest possible current ripple, i.e. flux ripple. The best achievable steady-state region of the control error is proposed and validated, revealing the maximum switching performance of FCS-MPC. The upper bound of the control error is obtained by constructing a CIS with respect to any reference defined on the equilibrium operating points. The proposed analysis has direct applications, since it can be used to predict and optimize the current ripple in real time. Another application is varying the sampling frequency to meet maximum-ripple requirements at minimum switching losses. |
Evaluating Visual Aesthetics in Photographic Portraiture | We propose and demonstrate a strategy to quantify aesthetic quality in photographs. Our approach is to develop a small set of classification features by tuning general compositional principles to a targeted image domain where saliency can be better understood. We demonstrate this strategy with photographic portraits of individuals, but it can be extended to other domains. Our technique leverages a refined method of using templates as spatial composition feature look-up tables. Compared to the traditional approach using a large set of global and local features extracted with little salient knowledge, classifiers using features extracted with our approach are better predictors of human aesthetic judgments. |
Harmonic and reactive power compensation with shunt active power filter under non-ideal mains voltage | This paper presents a new control algorithm for an active power filter (APF) to compensate the harmonic and reactive power of a 3-phase thyristor bridge rectifier under non-ideal mains voltage scenarios. The reference currents of the APF are computed from the sensed load current, dc bus voltage and source voltages. The APF driving signals are then produced from these signals via a hysteresis band current controller. The Matlab/Simulink power system toolbox is used to simulate the proposed system. The proposed method's performance is compared with the conventional instantaneous power theory, and its increased performance is demonstrated. |
Evaluation of a Practice-Based Intervention to Improve the Management of Pediatric Asthma | Pediatric asthma remains a significant burden upon patients, families, and the healthcare system. Despite the availability of evidence-based best practice asthma management guidelines for over a decade, published studies suggest that many primary care physicians do not follow them. This article describes the Provider Quality Improvement (PQI) intervention with six diverse community-based practices. A pediatrician and a nurse practitioner conducted the yearlong intervention, which was part of a larger CDC-funded project, using problem-based learning within an academic detailing model. Process and outcome assessments included (1) pre- and post-intervention chart reviews to assess eight indicators of quality care, (2) post-intervention staff questionnaires to assess contact with the intervention team and awareness of practice changes, and (3) individual semi-structured interviews with physician and nurse champions in five of the six practices. The chart review indicated that all six practices met predefined performance improvement criteria for at least four of eight indicators of quality care, with two practices meeting improvement criteria for all eight indicators. The response rate for the staff questionnaires was high (72%) and generally consistent across practices, demonstrating high staff awareness of the intervention team, the practice “asthma champions,” and changes in practice patterns. In the semi-structured interviews, several respondents attributed the intervention’s acceptability and success to the expertise of the PQI team and expressed the belief that sustaining changes would be critically dependent on continued contact with the team. Despite significant limitations, this study demonstrated that interventions that are responsive to individual practice cultures can successfully change practice patterns. |
Perspectives and limits for cement kilns as a destination for RDF. | RDF, the high calorific value fraction of MSW obtained by conventional separation systems, can be employed in technological plants (mainly cement kilns) in order to obtain useful energy recovery. It is interesting and important to evaluate this possibility within the general framework of waste-to-energy solutions. The solution must be assessed on the basis of different aspects, namely: technological features and clinker characteristics; local atmospheric pollution; the effects of RDF used in cement kilns on the generation of greenhouse gases; the economics of conventional solid fuel substitution; and planning perspectives, from the point of view of the destination of RDF and optimal cement kiln policy. The different experiences with this issue throughout Europe are reviewed, and some applications within Italy have also been considered. The main findings of the study are that the use of RDF in cement kilns instead of coal or coke offers environmental benefits in terms of greenhouse gases, while the formation of conventional gaseous pollutants is not a critical aspect. Indeed, the generation of nitrogen oxides can probably be lower because of lower flame temperatures or lower air excess. The presence of chlorinated micro-pollutants is not influenced by the presence of RDF in the fuel, whereas, depending on the quality of the RDF, some problems could arise compared to the substituted fuel as far as heavy metals are concerned, chiefly the more volatile ones. |
Quantifying the Bullwhip Effect in a Simple Supply Chain: The Impact of Forecasting, Lead Times, and Information | Frank Chen • Zvi Drezner • Jennifer K. Ryan • David Simchi-Levi Decision Sciences Department, National University of Singapore, 119260 Singapore Department of MS & IS, California State University, Fullerton, California 92834 School of Industrial Engineering, Purdue University, West Lafayette, Indiana 47907 Department of IE & MS, Northwestern University, Evanston, Illinois 60208 [email protected] • [email protected] • [email protected] • [email protected] |
Computerized planning for multiprobe cryosurgery using a force-field analogy. | Cryosurgery is the destruction of undesired biological tissues by freezing. For internal organs, multiple cryoprobes are inserted into the tissue with the goal of maximizing cryoinjury within a predefined target region, while minimizing cryoinjury to the surrounding tissues. The objective of this study is to develop a computerized planning tool to determine the best locations to insert the cryoprobes, based on bioheat transfer simulations. This tool is general and suitable for all available cooling techniques and hardware. The planning procedure employs a novel iterative optimization technique based on a force-field analogy. In each iteration, a single transient bioheat transfer simulation of the cryoprocedure is computed. At the end of the simulation, regions of tissue that would have undesired temperatures apply "forces" to the cryoprobes directly moving them to better locations. This method is more efficient than traditional numerical optimization techniques, because it requires significantly fewer bioheat transfer simulations for each iteration of planning. For demonstration purposes, 2D examples on cross sections typical of prostate cryosurgery are given. |
FluxQuery: An Execution Framework for Highly Interactive Query Workloads | Modern computing devices and user interfaces have necessitated highly interactive querying. Some of these interfaces issue a large number of dynamically changing and continuous queries to the backend. In others, users expect to inspect results during the query formulation process, in order to guide or help them towards specifying a full-fledged query. Thus, users end up issuing a fast-changing workload to the underlying database. In such situations, the user's query intent can be thought of as being in flux. In this paper, we show that the traditional query execution engines are not well-suited for this new class of highly interactive workloads. We propose a novel model to interpret the variability of likely queries in a workload. We implemented a cyclic scan-based approach to process queries from such workloads in an efficient and practical manner while reducing the overall system load. We evaluate and compare our methods with traditional systems and demonstrate the scalability of our approach, enabling thousands of queries to run simultaneously within interactive response times given low memory and CPU requirements. |
Online Word-of-Mouth (or Mouse): An Exploration of Its Antecedents and Consequences | This study developed an integrated model to explore the antecedents and consequences of online word-of-mouth in the context of music-related communication. Based on survey data from college students, online word-of-mouth was measured with two components: online opinion leadership and online opinion seeking. The results identified innovativeness, Internet usage, and Internet social connection as significant predictors of online word-of-mouth, and online forwarding and online chatting as behavioral consequences of online word-of-mouth. Contrary to the original hypothesis, music involvement was found not to be significantly related to online word-of-mouth. Theoretical implications of the findings and future research directions are discussed. |
ALGORITHMIC TRADING WITH MARKOV CHAINS | An order book consists of a list of all buy and sell offers, represented by price and quantity, available to a market agent. The order book changes rapidly, within fractions of a second, due to new orders being entered into the book. The volume at a certain price level may increase due to limit orders, i.e. orders to buy or sell placed at the end of the queue, or decrease because of market orders or cancellations. In this paper a high-dimensional Markov chain is used to represent the state and evolution of the entire order book. The design and evaluation of optimal algorithmic strategies for buying and selling is studied within the theory of Markov decision processes. General conditions are provided that guarantee the existence of optimal strategies. Moreover, a value-iteration algorithm is presented that enables finding optimal strategies numerically. As an illustration a simple version of the Markov chain model is calibrated to high-frequency observations of the order book in a foreign exchange market. In this model, using an optimally designed strategy for buying one unit provides a significant improvement, in terms of the expected buy price, over a naive buy-one-unit strategy. |
Small-group problem-based learning as a complex adaptive system | Small-group problem-based learning (PBL) is widely embraced as a method of study in health professions schools and at many different levels of education. Complexity science provides a different lens with which to view and understand the application of this method. It presents new concepts and vocabulary that may be unfamiliar to practitioners of small-group PBL and other educational methods. This article looks at small-group PBL from the perspective of complex adaptive systems (CAS). It begins with a brief review of the current understanding and practice of PBL. Next, some of the characteristics of CAS are reviewed using examples from small-group PBL to illustrate how these characteristics are expressed in that context. The principles and the educational theory in which small-group PBL is embedded are related to CAS. Implications for health professions education are discussed. © 2006 Elsevier Ltd. All rights reserved. |
LCL DC/DC Converter for DC Grids | This paper proposes an LCL dc/dc converter concept which is capable of achieving very high stepping ratios with megawatt-level power transfers. The converter can find potential application in connecting high-power dc sources to high-voltage dc transmission, including future dc transmission networks. This converter is based on two ac/dc insulated-gate bipolar transistor-based converters and an internal passive LCL circuit without internal ac transformers. The LCL circuit is designed to enable voltage stepping without any reactive power circulation and with potentially soft-switching operation which minimizes the switching losses. The designed converter has the ability to achieve current regulation even under extreme external dc faults and, therefore, the converter can operate through dc faults. The switch utilization is better than in similar topologies and losses are reasonably low. A dual-active-bridge transformer-based converter design is presented for comparison with the proposed LCL converter. A detailed PSCAD model confirms the conclusions using a 100-MW 20-kV/300-kV test system. An LCL 200-W 20-V/100-V dc/dc prototype converter is built to validate the proposed topology. |
Fine-Grained Product Class Recognition for Assisted Shopping | Assistive solutions for a better shopping experience can improve the quality of life of people, in particular also of visually impaired shoppers. We present a system that visually recognizes the fine-grained product classes of items on a shopping list, in shelves images taken with a smartphone in a grocery store. Our system consists of three components: (a) We automatically recognize useful text on product packaging, e.g., product name and brand, and build a mapping of words to product classes based on the large-scale GroceryProducts dataset. When the user populates the shopping list, we automatically infer the product class of each entered word. (b) We perform fine-grained product class recognition when the user is facing a shelf. We discover discriminative patches on product packaging to differentiate between visually similar product classes and to increase the robustness against continuous changes in product design. (c) We continuously improve the recognition accuracy through active learning. Our experiments show the robustness of the proposed method against cross-domain challenges, and the scalability to an increasing number of products with minimal re-training. |
Type Inclusion Constraints and Type Inference | We present a general algorithm for solving systems of inclusion constraints over type expressions. The constraint language includes function types, constructor types, and liberal intersection and union types. We illustrate the application of our constraint solving algorithm with a type inference system for the lambda calculus with constants. In this system, every pure lambda term has a (computable) type and every term typable in the Hindley/Milner system has all of its Hindley/Milner types. Thus, the inference system is an extension of the Hindley/Milner system that can type a very large set of lambda terms. |
WIDEBAND GYSEL POWER DIVIDER WITH ARBITRARY POWER DIVISION BASED ON PATCH TYPE STRUCTURE | A novel Gysel power divider based on a patch type structure is presented in this paper. The proposed power divider possesses broad bandwidth, small physical occupation and arbitrary power division. More than 30% bandwidth enhancement is achieved based on the −15 dB input return loss criterion, while 55% size reduction is realized compared with the conventional Gysel power divider. Moreover, flat power division is obtained in the design without using additional transmission line sections. Based on the novel structure, a design procedure for power dividers with unequal power division ratios is provided without using narrow microstrip lines. To verify the design approach, the proposed power dividers with equal and unequal (2:1 and 4:1) power divisions at the centre frequency of 1.5 GHz are fabricated and measured. The results demonstrate that the design fulfils our goals. |
On the Origins of Adaptive Behavioral Complexity: Developmental Channeling of Structural Trade-offs | 1. Developmental Perspective on the Evolution of Behavioral Strategies: Approach; 2. Evidence of Ontogenetic Behavioral Linkages and Dependencies; 2.1 Early Developmental Origins of Behavioral Variation; 2.2 Trade-offs in Neural Processes and Personality; 2.3 Maintenance of Variation in Behavioral Expression Along Trade-off Axes; 3. Developmental Origins of Behavioral Variation; 3.1 Design Principles of the Brain and Mechanisms Underlying Neural Trade-offs; 3.2 Developmental Channeling: Mechanism for Separating Individuals Along Trade-off Axes; 4. Applying the Concept of Developmental Channeling: Dispersal Strategies as an Example; 4.1 Evolution of Dispersal Strategies; 4.2 Maternally Induced Dispersal Behavior; 5. Conclusion and Future Directions; Acknowledgments; References |
An electronic ballast to control an induction lamp | The use of electromagnetic induction lamps without electrodes has increased because of their long life and energy efficiency. The ignition and luminosity of the lamp are controlled by an electronic ballast. Beyond that, the electronic ballast also provides power factor correction, minimizing the lamp's impact on the quality of service of the electrical network. The electronic ballast includes several blocks, namely a bridge rectifier, a power factor correcting circuit (PFC), an asymmetric half-bridge inverter with a resonant filter on the inverter output, and a circuit to control the conduction time of the ballast transistors. Index Terms – SEPIC, PFC, electrodeless lamp, resonant filter. |
Governance on the Drug Supply Chain via Gcoin Blockchain | As a trust machine, blockchain was recently introduced to the public to provide an immutable, consensus-based and transparent system in the Fintech field. However, there are ongoing efforts to apply blockchain to other fields where trust and value are essential. In this paper, we suggest the Gcoin blockchain as the base of the data flow of drugs to create transparent drug transaction data. Additionally, the regulation model of the drug supply chain could be altered from the inspection-and-examination-only model to the surveillance-net model, and every unit involved in the drug supply chain would be able to participate simultaneously to prevent counterfeit drugs and to protect public health, including patients. |
Psychopathic predators? Getting specific about the relation between psychopathy and violence. | OBJECTIVE
The psychopathy checklist-revised (PCL-R; Hare, 1991, 2003) is often used to assess risk of violence, perhaps based on the assumption that it captures emotionally detached individuals who are driven to prey upon others. This study is designed to assess the relation between (a) core interpersonal and affective traits of psychopathy and impulsive antisociality on the one hand and (b) the risk of future violence and patterns of motivation for past violence on the other.
METHOD
A research team reliably assessed a sample of 158 male offenders for psychopathy, using both the interview-based PCL-R and the self-report psychopathic personality inventory (PPI; Lilienfeld & Andrews, 1996). Then, a second independent research team assessed offenders' lifetime patterns of violence and their motivation. After these baseline assessments, offenders were followed in prison or the community for up to 1 year to assess their involvement in 3 different forms of violence. Baseline and follow-up assessments included both interviews and reviews of official records.
RESULTS
First, the PPI manifested incremental validity in predicting future violence over the PCL-R (but not vice versa), and most of its predictive power derived solely from impulsive antisociality. Second, impulsive antisociality, not the interpersonal and affective traits specific to psychopathy, was uniquely associated with instrumental lifetime patterns of past violence. The latter psychopathic traits are narrowly associated with deficits in motivation for violence (e.g., lack of fear or lack of provocation).
CONCLUSIONS
These findings and their consistency with some past research led us to advise against making broad generalizations about the relation between psychopathy and violence. |
Experience with EMERALD to Date | After summarizing the EMERALD architecture and the evolutionary process from which EMERALD has emerged, this paper focuses on our experience to date in designing, implementing, and applying EMERALD to various types of anomalies and misuse. The discussion addresses the fundamental importance of good software engineering practice and the importance of the system architecture in attaining detectability, interoperability, general applicability, and future evolvability. It also considers the importance of correlation among distributed and hierarchical instances of EMERALD, and the need for additional detection and analysis components. |
On the Direction of Arrival (DoA) Estimation for a Switched-Beam Antenna System Using Neural Networks | A generic direction of arrival (DoA) estimation methodology is presented that is based on neural networks (NNs) and designed for a switched-beam system (SBS). The method incorporates the benefits of NNs and SBSs to achieve DoA estimation in a less complex and expensive way compared to the corresponding widely known super-resolution algorithms. The proposed technique is developed step by step and thoroughly studied and explained, especially in terms of the beam pattern structure and the neuro-computational procedures. Emphasis is given to direct sequence code division multiple access (DS-CDMA) applications, and particularly the Universal Mobile Telecommunication System (UMTS). Extensive simulations are carried out for each step of the method, demonstrating its performance. It is shown that a properly trained NN can accurately find the signal of interest (SoI) angle of arrival in the presence of a varying number of mobile users and a varying SoI-to-interference ratio. The proposed NN-SBS DoA estimation method can be applied to current cellular communications base stations, promoting the wider use of smart antenna beamforming. |
An ontology-based approach to Chinese semantic advertising | In the web advertising domain, contextual advertising and sponsored search are two of the main advertising channels used to display related advertisements on web pages. A major challenge for contextual advertising is to match advertisements and web pages based on their semantics. When a web page and its semantically related advertisements contain many different words, the performance of the traditional methods can be very poor. In particular, there are few studies presented for Chinese contextual advertising that are based on semantics. To address these issues, we propose an ontology-based approach to Chinese semantic advertising. We utilize an ontology called the Taobao Ontology and populate it by automatically adding related phrases as instances. The ontology is used to match web pages and advertisements on a conceptual level. Based on the Taobao Ontology, the proposed method exploits seven distance functions to measure the similarities between concepts and web pages or advertisements. Then, the similarities between web pages and advertisements are calculated by considering the ontology-based similarities as well as term-based similarities. The empirical experiments indicate that our method is able to match Chinese web pages and advertisements with a relatively high accuracy. Among the seven distance functions, Cosine distance and Tanimoto distance show the best performance in terms of precision, recall, and F-measure. In addition, our method outperforms two contextual advertising methods, i.e., the impedance coupling method and the SVM-based method. |
Human Gait Recognition and Classification Using Time Series Shapelets | Human gait is a basic activity of daily life. Because gait can be perceived from a distance, it can be used for human identification, with applications in fields such as medicine and surveillance. Gait recognition means identifying a person by his or her gait. A method is proposed for gait recognition using time series shapelets. First, a gait video is preprocessed to extract silhouette images. From these silhouette images, features such as joint angle and swing distance are extracted and represented as time series data. From this time series data, shapelets are extracted. Shapelets are subsequences of time series data that can discriminate between classes and are maximally representative of a class. These time series shapelets can be used to identify humans by their gait. Shapelets can also be used for classification: after extracting the shapelets, prediction is done using a decision tree, which can classify normal and abnormal human gait. |
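The shapelet idea above can be made concrete with a toy sketch: the distance from a series to a shapelet is its best sliding-window match, and a brute-force search keeps the candidate subsequence that best separates two classes. This is an illustration of the concept only, not the authors' pipeline (practical shapelet discovery uses information gain and pruning), and the toy data are invented:

```python
def shapelet_distance(series, shapelet):
    """Distance from a time series to a shapelet: the minimum Euclidean
    distance over all sliding windows of the shapelet's length."""
    m = len(shapelet)
    best = float('inf')
    for start in range(len(series) - m + 1):
        d = sum((series[start + i] - shapelet[i]) ** 2 for i in range(m)) ** 0.5
        best = min(best, d)
    return best

def best_shapelet(pos, neg, length):
    """Brute-force search: try every subsequence of the positive series and
    keep the one whose distance gap between the two classes is largest."""
    best, best_gap = None, -float('inf')
    for series in pos:
        for start in range(len(series) - length + 1):
            cand = series[start:start + length]
            d_pos = max(shapelet_distance(s, cand) for s in pos)  # worst match in class
            d_neg = min(shapelet_distance(s, cand) for s in neg)  # best match out of class
            gap = d_neg - d_pos
            if gap > best_gap:
                best, best_gap = cand, gap
    return best, best_gap
```

A discriminative shapelet is one that matches every positive series closely while staying far from every negative series, which is exactly what the gap measures.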
Multi-armed Bandit Algorithms and Empirical Evaluation | The multi-armed bandit problem for a gambler is to decide which arm of a K-slot machine to pull to maximize his total reward in a series of trials. Many real-world learning and optimization problems can be modeled in this way. Several strategies or algorithms have been proposed as a solution to this problem in the last two decades, but, to our knowledge, there has been no common evaluation of these algorithms. This paper provides a preliminary empirical evaluation of several multi-armed bandit algorithms. It also describes and analyzes a new algorithm, Poker (Price Of Knowledge and Estimated Reward), whose performance compares favorably to that of other existing algorithms in several experiments. One remarkable outcome of our experiments is that the most naive approach, the ε-greedy strategy, often proves hard to beat. |
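The ε-greedy strategy singled out above is simple enough to sketch in a few lines: with probability ε pull a random arm, otherwise pull the arm with the highest estimated mean reward. The arm probabilities, trial count, and seed below are arbitrary illustrative choices, not values from the paper:

```python
import random

def epsilon_greedy(true_means, epsilon=0.1, trials=10000, seed=0):
    """Simulate the epsilon-greedy strategy on a Bernoulli K-armed bandit."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k          # pulls per arm
    estimates = [0.0] * k     # running mean reward per arm
    total_reward = 0
    for _ in range(trials):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                           # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])  # exploit
        reward = 1 if rng.random() < true_means[arm] else 0
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # incremental mean
        total_reward += reward
    return total_reward, counts

reward, counts = epsilon_greedy([0.2, 0.5, 0.8])
```

After enough trials the best arm (here the one with success probability 0.8) accumulates the vast majority of the pulls, which is why this naive baseline is hard to beat.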
An Interactive 3D Holographic Pyramid for Museum Exhibition | In this paper, an interactive holographic system, realized with the aim of creating, exchanging, discussing and disseminating cultural heritage information, is presented. By using low-cost and off-the-shelf devices, the system provides the visitors with a 'floating' computer generated representation of a virtual cultural artefact that, unlike the real one, can be examined in detail through a touchless natural interface. The proposed system is realized in such a way that it can be easily placed in a cultural exhibition without requiring any structural intervention. As such, it could represent a useful instrument complementary to a museum visit thanks to its capacity both to convey different types of digital cultural information and especially to allow the visitor to become an active actor, able to enjoy different perspectives and all the details of the artefact sharing her/his experience with other visitors. The paper describes the system modules and the hardware design to physically realize the pyramid, and details the user interface composed of two main actions designed to obtain a simple exploration of a virtual cultural heritage artefact. |
Detection of malingering in assessment of adult ADHD. | Two assessment measures for ADHD, the ADHD Behavior Checklist and the Integrated Visual and Auditory Continuous Performance Test (IVA CPT), were compared using undergraduates (n=44) randomly assigned to a control or a simulated-malingerer condition and undergraduates with a valid diagnosis of ADHD (n=16). It was predicted that malingerers would successfully fake ADHD on the rating scale but not on the CPT, on which they would overcompensate, scoring lower than all other groups. Analyses indicated that the ADHD Behavior Checklist was successfully faked for childhood and current symptoms. The IVA CPT could not be faked on 81% of its scales. The CPT's impairment index results revealed: sensitivity 94%, specificity 91%, PPP 88%, NPP 95%. Results provide support for the inclusion of a CPT in the assessment of adult ADHD. |
Intra- and Inter-Brain Synchronization during Musical Improvisation on the Guitar | Humans interact with the environment through sensory and motor acts. Some of these interactions require synchronization among two or more individuals. Multiple-trial designs, which we have used in past work to study interbrain synchronization in the course of joint action, constrain the range of observable interactions. To overcome the limitations of multiple-trial designs, we conducted single-trial analyses of electroencephalography (EEG) signals recorded from eight pairs of guitarists engaged in musical improvisation. We identified hyper-brain networks based on a complex interplay of different frequencies. The intra-brain connections primarily involved higher frequencies (e.g., beta), whereas inter-brain connections primarily operated at lower frequencies (e.g., delta and theta). The topology of hyper-brain networks was frequency-dependent, with a tendency to become more regular at higher frequencies. We also found hyper-brain modules that included nodes (i.e., EEG electrodes) from both brains. Some of the observed network properties were related to musical roles during improvisation. Our findings replicate and extend earlier work and point to mechanisms that enable individuals to engage in temporally coordinated joint action. |
Watch Their Moves: Applying Probabilistic Multiple Object Tracking to Autonomous Robot Soccer | In many autonomous robot applications robots must be capable of estimating the positions and motions of moving objects in their environments. In this paper, we apply probabilistic multiple object tracking to estimating the positions of opponent players in autonomous robot soccer. We extend an existing tracking algorithm to handle multiple mobile sensors with uncertain positions, discuss the specification of probabilistic models needed by the algorithm, and describe the required vision-interpretation algorithms. The multiple object tracking has been successfully applied throughout the RoboCup 2001 world championship. |
What Mathematics Is The Most Fundamental | Standard mathematics involves such notions as infinitely small/large, continuity and standard division. This mathematics is usually treated as fundamental while finite mathematics is treated as inferior. Standard mathematics has foundational problems (as follows, for example, from Gödel's incompleteness theorems) but it is usually believed that this is less important than the fact that it describes many experimental data with high accuracy. We argue that the situation is the opposite: standard mathematics is only a degenerate case of the finite one in the formal limit when the characteristic of the ring or field used in finite mathematics goes to infinity. Therefore foundational problems in standard mathematics are not fundamental. |
Early alliance, alliance ruptures, and symptom change in a nonrandomized trial of cognitive therapy for avoidant and obsessive-compulsive personality disorders. | Participants were 30 adult outpatients diagnosed with avoidant personality disorder or obsessive-compulsive personality disorder who enrolled in an open trial of cognitive therapy for personality disorders. Treatment consisted of up to 52 weekly sessions. Symptom evaluations were conducted at intake, at Sessions 17 and 34, and at the last session. Alliance variables were patients' first alliance rating and "rupture-repair" episodes, which are disruptions in the therapeutic relationship that can provide corrective experiences and facilitate change. Stronger early alliances and rupture-repair episodes predicted more improvement in symptoms of personality disorder and depression. This work points to potentially important areas to target in treatment development for these personality disorders. |
The red one!: On learning to refer to things based on discriminative properties | As a first step towards agents learning to communicate about their visual environment, we propose a system that, given visual representations of a referent (CAT) and a context (SOFA), identifies their discriminative attributes, i.e., properties that distinguish them (has_tail). Moreover, although supervision is only provided in terms of discriminativeness of attributes for pairs, the model learns to assign plausible attributes to specific objects (SOFA-has_cushion). Finally, we present a preliminary experiment confirming the referential success of the predicted discriminative attributes. |
Ontop: Answering SPARQL queries over relational databases | We present Ontop, an open-source Ontology-Based Data Access (OBDA) system that allows for querying relational data sources through a conceptual representation of the domain of interest, provided in terms of an ontology, to which the data sources are mapped. Key features of Ontop are its solid theoretical foundations, a virtual approach to OBDA, which avoids materializing triples and is implemented through the query rewriting technique, extensive optimizations exploiting all elements of the OBDA architecture, its compliance to all relevant W3C recommendations (including SPARQL queries, R2RML mappings, and OWL 2 QL and RDFS ontologies), and its support for all major relational databases. |
Code and data files for "Owning Capital or being Shareholders: an equivalence result with Incomplete Markets" | Many recent papers in macroeconomics have studied the implications of models with household heterogeneity and incomplete financial markets under the assumption that households own the stock of physical capital and undertake the intertemporal investment decisions. In these models, production exhibits constant returns to scale, households maximize expected discounted utility, and firms rent capital and labor from households to maximize period-by-period profits. This paper considers the case in which infinitely lived firms, rather than households, make the intertemporal investment decisions. Under this assumption, it shows that there exists an objective function for firms that results in the same equilibrium allocation as in the standard setting with one-period-lived firms. The objective requires that firms maximize their asset value, which is defined as the discounted value of future cash flows using present value processes that do not allow for arbitrage opportunities. |
Name disambiguation in author citations using a K-way spectral clustering method | An author may have multiple names and multiple authors may share the same name simply due to name abbreviations, identical names, or name misspellings in publications or bibliographies. This can produce name ambiguity which can affect the performance of document retrieval, web search, and database integration, and may cause improper attribution of credit. Proposed here is an unsupervised learning approach using K-way spectral clustering that disambiguates authors in citations. The approach utilizes three types of citation attributes: co-author names, paper titles, and publication venue titles. The approach is illustrated with 16 name datasets with citations collected from the DBLP database bibliography and author home pages and shows that name disambiguation can be achieved using these citation attributes. |
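The clustering machinery behind this approach can be sketched generically. Below is a minimal K-way spectral clustering routine over a precomputed similarity matrix (in the paper the similarities would be derived from co-author, title, and venue attributes); the normalization and the deterministic farthest-point k-means initialization are common textbook choices, not details taken from the paper:

```python
import numpy as np

def spectral_cluster(S, k):
    """K-way spectral clustering on a symmetric similarity matrix S (n x n):
    embed each item with the top-k eigenvectors of the normalized affinity
    matrix, then run a few rounds of plain k-means on the embedded rows."""
    d = S.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    A = d_inv_sqrt[:, None] * S * d_inv_sqrt[None, :]   # D^{-1/2} S D^{-1/2}
    vals, vecs = np.linalg.eigh(A)                      # eigenvalues ascending
    U = vecs[:, -k:]                                    # top-k eigenvectors
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    # deterministic farthest-point initialization for k-means
    centers = [U[0]]
    for _ in range(1, k):
        dists = np.min([((U - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(U[int(np.argmax(dists))])
    centers = np.array(centers)
    for _ in range(20):                                 # plain k-means iterations
        labels = np.argmin(((U[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels
```

Given a block-structured similarity matrix (high similarity within each author's citations, low across authors), the top-k eigenvectors map each citation to one of k well-separated points, so k-means recovers the author groups.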
Learning Concepts by Arranging Appropriate Training Order | Machine learning has been proven useful for solving the bottlenecks in building expert systems. Noise in the training instances will, however, confuse a learning mechanism. Two main steps are adopted here to solve this problem. The first step is to appropriately arrange the training order of the instances. It is well known from psychology that different orders of presentation of the same set of training instances to a human may cause different learning results. This idea is used here for machine learning, and an order arrangement scheme is proposed. The second step is to modify a conventional noise-free learning algorithm, thus making it suitable for noisy environments. The generalized version space learning algorithm is then adopted to process the training instances for deriving good concepts. Finally, experiments on the Iris Flower problem show that the new scheme can produce a good training order, allowing the generalized version space algorithm to achieve a satisfactory learning result. |
Feasibility and efficacy of simultaneous integrated boost intensity-modulated radiation therapy in patients with limited-disease small cell lung cancer | PURPOSE
To evaluate the feasibility and efficacy of simultaneous integrated boost intensity-modulated radiation therapy (SIB-IMRT) in patients with limited-disease small-cell lung cancer (LD-SCLC).
METHODS
Patients with LD-SCLC were treated with SIB-IMRT within 1 week after completion of 2 cycles of induction chemotherapy. Then 2-4 cycles of adjuvant chemotherapy were administered within 1 week after SIB-IMRT. Irradiation was delivered in an accelerated hyperfractionated schedule, with prescribed doses of 57 Gy at 1.9 Gy twice daily to the gross tumor volume (GTV), 51 Gy at 1.7 Gy twice daily to the clinical tumor volume (CTV), and 45 Gy at 1.5 Gy twice daily to the planning target volume (PTV). The chemotherapy regimen consisted of platinum plus etoposide. Prophylactic cranial irradiation (25 Gy in 10 fractions) was administered to patients who achieved complete response (CR) or near complete response (nCR). The primary endpoint of this study was the frequency of grade 3 or higher acute non-hematologic treatment-related toxicities. Secondary endpoints included objective response, overall survival (OS), progression-free survival (PFS), and locoregional recurrence-free survival (LRFS).
RESULTS
A cohort of 35 patients was enrolled in the study; the biological equivalent dose (BED) of the GTV in the SIB-IMRT plan was 59.16 Gy. Grade 1, 2, and 3 esophagitis was observed in 11 (31%), 12 (34%), and 6 (17%) patients, respectively; grade 1 and 2 pneumonitis was observed in 8 (23%) and 4 (11%) patients, respectively. The median OS and PFS of the whole group were 37.7 months and 29.3 months, respectively. The 1- and 2-year OS rates were 94.1% and 68.5%, respectively. The 1- and 2-year PFS rates were 76.8% and 40.7%, respectively. The 1- and 2-year LRFS rates were 87.7% and 73.8%, respectively.
CONCLUSIONS
SIB-IMRT was feasible and well-tolerated in patients with LD-SCLC, and worth further evaluating in a large prospective clinical trial. |
Automated characterization of blood vessels as arteries and veins in retinal images | In recent years researchers have found that alterations in the arterial or venular tree of the retinal vasculature are associated with several public health problems such as diabetic retinopathy, which is also a leading cause of blindness in the world. A prerequisite for automated assessment of subtle changes in arteries and veins is to accurately separate these vessels from each other. This is a difficult task due to the high similarity between arteries and veins, in addition to variations of color and non-uniform illumination within and between retinal images. In this paper a novel structural and automated method is presented for artery/vein classification of blood vessels in retinal images. The proposed method consists of three main steps. In the first step, several image enhancement techniques are employed to improve the images. Then a specific feature extraction process is applied to separate major arteries from veins; vessels are divided into smaller segments, and feature extraction and vessel classification are applied to each small vessel segment instead of each vessel point. Finally, a post-processing step is added to improve the results obtained from the previous step using structural characteristics of the retinal vascular network: vessel features at intersection and bifurcation points are processed to detect arterial and venular subtrees, and vessel labels are then revised by propagating the dominant label through each identified connected tree of arteries or veins. Evaluation of the proposed approach against two different datasets of retinal images, including the DRIVE database, demonstrates the good performance and robustness of the method. The proposed method may be used for determination of the arteriolar-to-venular diameter ratio in retinal images. It also potentially allows for further investigation of the labels of thinner arteries and veins, which might be found by tracing them back to the major vessels. |
RECENT DEVELOPMENTS IN PAPER CURRENCY RECOGNITION SYSTEM | Currency denomination recognition is one of the active research topics at present, and this wide interest is particularly due to its various potential applications. Monetary transactions are an integral part of our day-to-day activities. However, blind people particularly suffer in monetary transactions: they are not able to effectively distinguish between various denominations and are often deceived by other people. A reliable currency recognition system could also be used in any sector where monetary transactions are of concern. Thus, there is a pressing need to design a system that correctly recognizes paper currency notes. Currency denomination detection is a vast area of research, and significant progress has been achieved over the years. This paper presents an extensive survey of recent developments in the identification of currency denominations. A number of techniques applied by various researchers are discussed briefly in order to assess the state of the art. |
A Comparison Among ARIMA, BP-NN, and MOGA-NN for Software Clone Evolution Prediction | Software evolution continues throughout the life cycle of the software. During the evolution of software system, it has been observed that the developers have a tendency to copy the modules completely or partially and modify them. This practice gives rise to identical or very similar code fragments called software clones. This paper examines the evolution of clone components by using advanced time series analysis. In the first phase, software clone components are extracted from the source repository of the software application by using the abstract syntax tree approach. Then, the evolution of software clone components is analyzed. In this paper, three models, Autoregressive Integrated Moving Average, back propagation neural network, and multi-objective genetic algorithm-based neural network, have been compared for the prediction of the evolution of software clone components. Evaluation is performed on the large open-source software application, ArgoUML. The ability to predict the clones helps the software developer to reduce the effort during software maintenance activities. |
A review of recent advances in risk analysis for wildfire management | Risk analysis evolved out of the need to make decisions concerning highly stochastic events, and is well suited to analyse the timing, location and potential effects of wildfires. Over the past 10 years, the application of risk analysis to wildland fire management has seen steady growth with new risk-based analytical tools that support a wide range of fire and fuels management planning scales, from individual incidents to national, strategic interagency programs. After a brief review of the three components of fire risk (likelihood, intensity and effects), this paper reviews recent advances in quantifying and integrating these individual components of fire risk. We also review recent advances in addressing temporal dynamics of fire risk and spatial optimisation of fuels management activities. Risk analysis approaches have become increasingly quantitative and sophisticated but remain quite disparate. We suggest several necessary and fruitful directions for future research and development in wildfire risk analysis. Additional keywords: burn probability, fire likelihood, hazard, risk assessment, risk science. Received 11 August 2011, accepted 29 June 2012, published online 14 September 2012 |
Particle Swarm Optimization: Technique, System and Challenges | Particle Swarm Optimization (PSO) is a biologically inspired computational search and optimization method developed in 1995 by Eberhart and Kennedy, based on the social behaviors of bird flocking and fish schooling. A number of basic variations have been developed to improve the speed of convergence and the quality of the solutions found by PSO. The basic PSO, however, is most appropriate for static, simple optimization problems, and modified versions of PSO have been developed to address this limitation. This paper observes and reviews 46 related studies in the period between 2002 and 2010, focusing on the functions of PSO, the advantages and disadvantages of PSO, the basic variants of PSO, modifications of PSO, and applications that have been implemented using PSO. This survey shows which modifications or variants of PSO have not yet been attempted and which ones may be developed next. |
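The basic (global-best) PSO loop surveyed above is compact enough to show in full. This is a generic sketch of the standard velocity update with inertia weight w and acceleration coefficients c1 and c2; the parameter values and the sphere test function are illustrative defaults, not choices taken from the survey:

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Basic gbest PSO: each particle is pulled toward its personal best
    and the swarm's global best, with inertia damping the velocity."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# minimize the sphere function sum(x_d^2), optimum at the origin
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

The modified variants discussed in the survey typically change this loop, for example by shrinking w over time, constraining velocities, or restricting the "social pull" to a local neighborhood instead of the global best.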
A Meaning-based English Math Word Problem Solver with Understanding, Reasoning and Explanation | This paper presents a meaning-based statistical math word problem (MWP) solver with understanding, reasoning and explanation. It comprises a web user interface and pipelined modules for analysing the text, transforming both body and question parts into their logic forms, and then performing inference on them. The associated context of each quantity is represented with proposed role-tags (e.g., nsubj, verb, etc.), which provides the flexibility for annotating the extracted math quantity with its associated syntactic and semantic information (which specifies the physical meaning of that quantity). Those role-tags are then used to identify the desired operands and filter out irrelevant quantities (so that the answer can be obtained precisely). Since the physical meaning of each quantity is explicitly represented with those role-tags and used in the inference process, the proposed approach could explain how the answer is obtained in a human comprehensible way. |
Single Image Haze Removal Using Dark Channel Prior | In this paper, we propose a simple but effective image prior, the dark channel prior, to remove haze from a single input image. The dark channel prior is a kind of statistics of outdoor haze-free images. It is based on a key observation: most local patches in outdoor haze-free images contain some pixels whose intensity is very low in at least one color channel. Using this prior with the haze imaging model, we can directly estimate the thickness of the haze and recover a high-quality haze-free image. Results on a variety of hazy images demonstrate the power of the proposed prior. Moreover, a high-quality depth map can also be obtained as a byproduct of haze removal. |
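The prior just described can be sketched directly: the dark channel is the per-pixel minimum over color channels followed by a minimum filter over a local patch, and the haze transmission is then estimated as t = 1 - omega * dark(I / A). The patch size and omega = 0.95 below follow common practice, A is a user-supplied atmospheric light estimate, and the naive double loop stands in for an efficient minimum filter:

```python
import numpy as np

def dark_channel(image, patch=15):
    """Dark channel of an H x W x 3 float image in [0, 1]: per-pixel minimum
    over the color channels, then a minimum filter over a local patch."""
    min_rgb = image.min(axis=2)
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode='edge')
    h, w = min_rgb.shape
    dark = np.empty_like(min_rgb)
    for i in range(h):            # naive patch minimum; real code uses a min-filter
        for j in range(w):
            dark[i, j] = padded[i:i + patch, j:j + patch].min()
    return dark

def estimate_transmission(image, atmosphere, omega=0.95, patch=15):
    """t(x) = 1 - omega * dark_channel(I / A): thicker haze -> lower transmission."""
    return 1.0 - omega * dark_channel(image / atmosphere, patch)
```

On a haze-free outdoor image the dark channel is near zero almost everywhere (so t is near 1), while a uniformly bright, washed-out region drives the dark channel up and the estimated transmission down, which is exactly what the haze imaging model predicts.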
Circulating adiponectin and resistin levels in relation to metabolic factors, inflammatory markers, and vascular reactivity in diabetic patients and subjects at risk for diabetes. | OBJECTIVE
Adiponectin and resistin, two recently discovered adipocyte-secreted hormones, may link obesity with insulin resistance and/or metabolic and cardiovascular risk factors. We performed a cross-sectional study to investigate the association of adiponectin and resistin with inflammatory markers, hyperlipidemia, and vascular reactivity and an interventional study to investigate whether atorvastatin mediates its beneficial effects by altering adiponectin or resistin levels.
RESEARCH DESIGN AND METHODS
Associations among vascular reactivity, inflammatory markers, resistin, and adiponectin were assessed cross-sectionally using fasting blood samples obtained from 77 subjects who had diabetes or were at high risk to develop diabetes. The effect of atorvastatin on adiponectin and resistin levels was investigated in a 12-week-long randomized, double-blind, placebo-controlled study.
RESULTS
In the cross-sectional study, we confirm prior positive correlations of adiponectin with HDL and negative correlations with BMI, triglycerides, C-reactive protein (CRP), and plasminogen activator inhibitor (PAI)-1 and report a negative correlation with tissue plasminogen activator. The positive association with HDL and the negative association with PAI-1 remained significant after adjusting for sex and BMI. We also confirm prior findings of a negative correlation of resistin with HDL and report for the first time a positive correlation with CRP. All of these associations remained significant after adjusting for sex and BMI. No associations of adiponectin or resistin with any aspects of vascular reactivity were detected. In the interventional study, atorvastatin decreased lipid and CRP levels, but adiponectin and resistin were not specifically altered.
CONCLUSIONS
We conclude that adiponectin is significantly associated with inflammatory markers, in part, through an underlying association with obesity, whereas resistin's associations with inflammatory markers appear to be independent of BMI. Lipid profile and inflammatory marker changes produced by atorvastatin cannot be attributed to changes of either adiponectin or resistin. |
Hybrid Stabilization of Thoracic Spine Fractures with Sublaminar Bands and Transpedicular Screws: Description of a Surgical Alternative and Review of the Literature | Stabilization of unstable thoracic fractures with transpedicular screws is widely accepted. However, placement of transpedicular screws can cause complications, particularly in the thoracic spine with physiologically small pedicles. Hybrid stabilization, a combination of sublaminar bands and pedicle screws, might reduce the rate of misplaced screws and can be helpful in special anatomic circumstances, such as preexisting scoliosis and osteoporosis. We report about two patients suffering from unstable thoracic fractures, of T5 in one case and T3, T4, and T5 in the other case, with preexisting scoliosis and extremely small pedicles. Additionally, one patient had osteoporosis. Patients received hybrid stabilization with pedicle screws adjacent to the fractured vertebral bodies and sublaminar bands at the level above and below the pedicle screws. No complications occurred. Follow-up was 12 months with clinically uneventful postoperative courses. No signs of implant failure or loss of reduction could be detected. In patients with very small thoracic pedicles, scoliosis, and/or osteoporosis, hybrid stabilization with sublaminar bands and pedicle screws can be a viable alternative to long pedicle screw constructs. |
Shrinking microwave filters | An adaptive predistortion technique has been presented and verified through the design and fabrication of practical filters in both the C and Ku bands. The method allows the realization of microwave filters at a lower cost, lighter mass, smaller volume, and better performance with minimum insertion loss penalties. The concept of lossy filters has been presented from a practical perspective. A simple lossy synthesis technique using any synthesized lossless (nontransversal) filter was shown, which can be used with hyperbolic rotations for loss distribution. Moreover, the limitation on the minimum Q of lossy resonators has been studied using a one-pole filter as a fundamental building block. Lossy four-pole Chebyshev and quasi-elliptic synthesis examples were presented. A four-pole Chebyshev lossy filter in the Ku band has been synthesized, modeled, and fabricated successfully using mixed combline and microstrip technologies. The design has the advantage of having all input-output paths going through more than one resonator, which minimizes unwanted source-to-load coupling, especially at high frequencies. The lossy approach is still at its early stages of development and needs more research and development effort to become as mature as the predistorted filters. |
SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming | Support vector machine (SVM) soft margin classifiers are important learning algorithms for classification problems. They can be stated as convex optimization problems and are suitable for a large data setting. Linear programming SVM classifiers are especially efficient for very large size samples. But little is known about their convergence, compared with the well-understood quadratic programming SVM classifier. In this article, we point out the difficulty and provide an error analysis. Our analysis shows that the convergence behavior of the linear programming SVM is almost the same as that of the quadratic programming SVM. This is implemented by setting a stepping-stone between the linear programming SVM and the classical 1-norm soft margin classifier. An upper bound for the misclassification error is presented for general probability distributions. Explicit learning rates are derived for deterministic and weakly separable distributions, and for distributions satisfying some Tsybakov noise condition. |
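The article above compares the LP and QP formulations analytically; as a concrete reminder of what a soft margin classifier optimizes, here is a minimal sketch that minimizes the standard (QP-style) regularized hinge loss by Pegasos-style subgradient descent. This illustrates the objective only, not the linear programming solver analyzed in the article, and all data and parameters are invented:

```python
import random

def train_soft_margin_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Subgradient descent on the soft-margin objective
    lam/2 * ||w||^2 + mean(max(0, 1 - y * (w.x + b))),
    the unconstrained equivalent of the 2-norm (QP) formulation.
    Labels y must be +1 or -1."""
    rng = random.Random(seed)
    n, dim = len(X), len(X[0])
    w, b = [0.0] * dim, 0.0
    t = 0
    for _ in range(epochs):
        idx = list(range(n))
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = y[i] * (sum(w[d] * X[i][d] for d in range(dim)) + b)
            for d in range(dim):
                w[d] -= eta * lam * w[d]          # regularization shrinkage
                if margin < 1:
                    w[d] += eta * y[i] * X[i][d]  # hinge-loss subgradient
            if margin < 1:
                b += eta * y[i]
    return w, b

# linearly separable toy data
X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]]
y = [1, 1, -1, -1]
w, b = train_soft_margin_svm(X, y)
```

The 1-norm (LP) variant studied in the article replaces the ||w||^2 regularizer with the sum of absolute coefficient values, which turns the training problem into a linear program; the point of the article is that both variants enjoy essentially the same convergence behavior.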
Leucine and Calcium Regulate Fat Metabolism and Energy Partitioning in Murine Adipocytes and Muscle Cells | Dietary calcium modulation of adiposity is mediated, in part, by suppression of calcitriol, while the additional effect of dairy products is mediated by additional components; these include the high concentration of leucine, a key factor in the regulation of muscle protein turnover. We investigated the effect of leucine, calcitriol and calcium on energy metabolism in murine adipocytes and muscle cells and on energy partitioning between adipocytes and skeletal muscle. Leucine induced a marked increase in fatty acid oxidation in C2C12 muscle cells (P < 0.001) and decreased FAS expression by 66% (P < 0.001) in 3T3-L1 adipocytes. Calcitriol decreased muscle cell fatty acid oxidation by 37% (P < 0.001) and increased adipocyte FAS gene expression threefold (P < 0.05); these effects were partially reversed by either leucine or calcium channel antagonism with nifedipine. Co-culture of muscle cells with adipocytes or incubation with 48-h adipocyte-conditioned medium decreased muscle fatty acid oxidation by 62% (P < 0.001), but treating adipocytes with leucine and/or nifedipine attenuated this effect. Leucine, nifedipine and calcitriol also modulated adiponectin production and thereby exerted additional indirect effects on fatty acid oxidation in C2C12 myotubes. Adiponectin increased IL-15 and IL-6 release by myotubes and partially reversed the inhibitory effects of calcitriol. Comparable effects of leucine, calcitriol and adiponectin were found in myotubes treated with conditioned medium derived from adipocytes or co-cultured with adipocytes. These data suggest that leucine and nifedipine promote energy partitioning from adipocytes to muscle cells, resulting in decreased energy storage in adipocytes and increased fatty acid utilization in muscle.
Targeted Storyfying: Creating Stories About Particular Events | The present paper proposes a computational model of the task of building a story from a set of events that have been observed in the world. For the purposes of the paper, a story is considered to be a particular type of sequential discourse that includes a beginning, a complication and a resolution, concerns a character that can be clearly identified as a protagonist, and ends with a certain sense of closure. Starting from prior approaches to this task, the paper addresses the problem of how to target particular events to act as the core of the desired story. Two different heuristics – imaginative interpretation and imaginative enrichment – are proposed, one favouring faithful rendering of the observed events and the other favouring strongly cohesive plots. The heuristics are tested on a simple case study based on finding interesting plots to tell inspired by the movements of pieces in a chess game.
Learning Sensor-Specific Spatial-Spectral Features of Hyperspectral Images via Convolutional Neural Networks | The convolutional neural network (CNN) is well known for its capability of feature learning and has made revolutionary achievements in many applications, such as scene recognition and target detection. In this paper, its capability of feature learning in hyperspectral images is explored by constructing a five-layer CNN for classification (C-CNN). The proposed C-CNN incorporates recent advances in the deep learning area, such as batch normalization, dropout, and the parametric rectified linear unit (PReLU) activation function. In addition, both spatial context and spectral information are elegantly integrated into the C-CNN such that spatial-spectral features are learned for hyperspectral images. A companion feature-learning CNN (FL-CNN) is constructed by extracting the fully connected feature layers of this C-CNN. Both supervised and unsupervised modes are designed for the proposed FL-CNN to learn sensor-specific spatial-spectral features. Extensive experimental results on four benchmark data sets from two well-known hyperspectral sensors, namely the airborne visible/infrared imaging spectrometer (AVIRIS) and reflective optics system imaging spectrometer (ROSIS) sensors, demonstrate that our proposed C-CNN outperforms the state-of-the-art CNN-based classification methods, and that its corresponding FL-CNN is very effective at extracting sensor-specific spatial-spectral features for hyperspectral applications under both supervised and unsupervised modes.
Longitudinal changes in asthma control with omalizumab: 2-year interim data from the EXCELS Study. | BACKGROUND
Asthma guidelines emphasize the importance of achieving and maintaining asthma control; however, many patients with moderate to severe asthma fail to achieve adequate control.
OBJECTIVE
This 2-year interim analysis evaluated the longitudinal effects of omalizumab on asthma control in patients treated in real-world clinical practice settings.
METHODS
EXCELS is an ongoing observational cohort study of approximately 5000 omalizumab-treated and 2500 non-omalizumab-treated patients aged ≥12 years with moderate to severe asthma. Asthma control was measured using the Asthma Control Test (ACT) every 6 months.
RESULTS
Subgroups of the omalizumab cohort included those who initiated omalizumab at baseline (new starts, n = 549) and those treated with omalizumab >7 days before baseline (established users, n = 4421). For reference, data are also presented for patients who were not receiving omalizumab prior to or at the time of enrolment (non-omalizumab, n = 2867). Over half of the new starts (54%) achieved improvement in ACT consistent with the minimally important difference (MID, defined as ≥3-point improvement) by Month 6 and this proportion increased throughout the follow-up period, reaching 62% at Month 24. Similar results were observed in patients stratified by moderate and severe asthma. Established users of omalizumab maintained asthma control throughout the observation period.
CONCLUSION
Over a 2-year period, patients initiating omalizumab therapy experienced clinically relevant improvements in asthma control, which were maintained during 2 years of longitudinal follow-up. Established users of omalizumab maintained asthma control over the 2-year period with a small improvement similar to that seen in non-omalizumab users. |
Differences in perceptions of communication quality between a Twitterbot and human agent for information seeking and learning | Twitter’s design allows the implementation of automated programs that can submit tweets, interact with others, and generate content based on algorithms. Scholars and end-users alike refer to these programs as “Twitterbots.” This two-part study explores the differences in perceptions of communication quality between a human agent and a Twitterbot in the areas of cognitive elaboration, information seeking, and learning outcomes. In accordance with the Computers Are Social Actors (CASA) framework (Reeves & Nass, 1996), results suggest that participants learned equally well from a Twitterbot and a human agent. Results are discussed in light of CASA, as well as implications and directions for future studies.
Gender gaps across the earnings distribution for full-time employees in Britain: Allowing for sample selection | This paper investigates gender differences across the log wage distributions of British employees working full-time in 2005. The raw gender wage gap shows a tendency to increase across the distribution, indicating a glass ceiling effect. A strong relationship between high-skilled, white-collar occupations and carrying out managerial duties and the glass ceiling effect is indicated in the data. After allowing for positive selection into full-time employment by British women, a substantially larger gender earnings gap is found: the selection-corrected gender wage gap is close to twice the raw gap across most of the earnings distribution. This selection-corrected gap is found to be predominantly related to women receiving lower rewards for their characteristics than men. Indeed, the results suggest the gender earnings gap would all but disappear across the earnings distribution if women working full-time received the same returns to their characteristics as men working full-time in Britain do.
Video segmentation by tracing discontinuities in a trajectory embedding | Our goal is to segment a video sequence into moving objects and the world scene. In recent work, spectral embedding of point trajectories based on 2D motion cues accumulated over their lifespans has been shown to outperform factorization and per-frame segmentation methods for video segmentation. The scale and kinematic nature of the moving objects and the background scene determine how close or far apart trajectories are placed in the spectral embedding. Such density variations may confuse clustering algorithms, causing over-fragmentation of object interiors. Therefore, instead of clustering in the spectral embedding, we propose detecting discontinuities of embedding density between spatially neighboring trajectories. Detected discontinuities are strong indicators of object boundaries and thus valuable for video segmentation. We propose a novel embedding discretization process that recovers from over-fragmentation by merging clusters according to discontinuity evidence along inter-cluster boundaries. For segmenting articulated objects, we combine motion grouping cues with a center-surround saliency operation, resulting in “context-aware”, spatially coherent saliency maps. Figure-ground segmentation obtained from saliency thresholding provides object connectedness constraints that alter motion-based trajectory affinities, by keeping articulated parts together and separating objects that are disconnected in time. Finally, we introduce Gabriel graphs as effective per-frame superpixel maps for converting trajectory clustering to dense image segmentation. Gabriel edges bridge large contour gaps via geometric reasoning without over-segmenting coherent image regions. We present experimental results of our method that outperform the state-of-the-art on challenging motion segmentation datasets.
Epidemiological African day for evaluation of patients at risk of venous thrombosis in acute hospital care settings | INTRODUCTION
This study aimed to identify patients at risk for venous thromboembolism (VTE) among all patients hospitalised, and to determine the proportion of at-risk hospital patients who received effective types of VTE prophylaxis in sub-Saharan Africa (SSA).
METHODS
A multinational, observational, cross-sectional survey was carried out on 1,583 at-risk patients throughout five SSA countries.
RESULTS
The prevalence of VTE risk was 50.4% overall, 62.3% in medical and 43.8% in surgical patients. The proportion of at-risk patients receiving prophylaxis was 51.5% overall, 36.2% in medical and 64% in surgical patients. Low-molecular-weight heparin was the most frequently used prophylactic method (40.2% overall; 23.1% in medical and 49.9% in surgical patients).
DISCUSSION
This study showed a high prevalence of VTE risk among hospitalised patients and that less than half of all at-risk patients received an American College of Chest Physicians-recommended method of prophylaxis.
CONCLUSION
Recommended VTE prophylaxis is underused in SSA. |
A Data Exfiltration and Remote Exploitation Attack on Consumer 3D Printers | With the increased popularity of 3D printers in homes and industry sectors, such as biomedical and manufacturing, the potential for cybersecurity risks must be carefully considered. Risks may arise from factors such as printer manufacturers not having the requisite levels of security awareness, and not fully understanding the need for security measures to protect intellectual property and other sensitive data that are stored, accessed, and transmitted from such devices. This paper examines the security features of two different models of MakerBot Industries' consumer-oriented 3D printers and proposes an attack technique that is able not only to exfiltrate sensitive data, but also to allow remote manipulation of these devices. The attack steps are discretely modeled using a threat model to enable formal representation of the attack. Specifically, we found that the printers stored the previously printed and currently printing objects on an unauthenticated web server. We also ascertain that the transport layer security implementation on these devices was flawed, which severely affected the security of these devices and allowed for remote exploitation. Countermeasures to the attack that are implementable by both the manufacturer and the user of the printer are presented.
On GitHub's Programming Languages | GitHub is the most widely used social, distributed version control system. It has around 10 million registered users and hosts over 16 million public repositories. Its user base is also very active, as GitHub ranks in the top 100 Alexa most popular websites. In this study, we collect GitHub’s state in its entirety. Doing so allows us to study new aspects of the ecosystem. Although GitHub is home to millions of users and repositories, the analysis of users’ activity time-series reveals that only around 10% of them can be considered active. The collected dataset allows us to investigate the popularity of programming languages and the existence of patterns in the relations between users, repositories, and programming languages. By applying a k-means clustering method to the users-repositories commits matrix, we find that two clear clusters of programming languages separate from the rest. One cluster forms for “web programming” languages (JavaScript, Ruby, PHP, CSS), and a second for “system-oriented programming” languages (C, C++, Python). Further classification allows us to build a phylogenetic tree of the use of programming languages in GitHub. Additionally, we study the main and auxiliary programming languages of the top 1000 repositories in more detail. We provide a ranking of these auxiliary programming languages using various metrics, such as percentage of lines of code and PageRank.
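The clustering step described in this abstract can be sketched with a plain k-means over per-language usage profiles. The profiles and the three repository-group axes below are invented for illustration; they are not taken from the GitHub dataset, and the paper's actual matrix is users by repositories.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on row vectors (e.g. per-language commit profiles)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        labels = [min(range(k),
                      key=lambda c: sum((p - q) ** 2
                                        for p, q in zip(pt, centers[c])))
                  for pt in points]
        # Recompute each center as the mean of its assigned points.
        new_centers = []
        for c in range(k):
            members = [pt for pt, lab in zip(points, labels) if lab == c]
            if members:
                new_centers.append(tuple(sum(col) / len(members)
                                         for col in zip(*members)))
            else:
                new_centers.append(centers[c])  # keep an empty cluster's center
        if new_centers == centers:
            break
        centers = new_centers
    return labels

# Hypothetical per-language commit profiles over three repository groups:
profiles = {
    "JavaScript": (9.0, 8.0, 0.5),
    "Ruby":       (8.5, 9.0, 1.0),
    "C":          (0.5, 1.0, 9.0),
    "C++":        (1.0, 0.5, 8.5),
}
labels = kmeans(list(profiles.values()), k=2)
```

On this toy data the web languages (JavaScript, Ruby) and the system languages (C, C++) fall into separate clusters regardless of initialization, mirroring the two-cluster structure the paper reports.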
You Better Eat to Survive! Exploring Edible Interactions in a Virtual Reality Game | "You Better Eat to Survive!" is a two-player virtual reality game that involves eating real food to survive and ultimately escape from a virtual island. Eating is sensed by capturing chewing sounds via a low-cost microphone solution. Unlike most VR games, which stimulate mostly our visual and auditory senses, "You Better Eat to Survive!" makes a novel contribution by integrating the gustatory sense not just as an additional game input, but as an integral element of the game experience: we use the fact that with head-mounted displays, players cannot see what they are eating and have to entrust a second player outside the VR experience with providing them sufficient food by feeding them. With "You Better Eat to Survive!", we aim to demonstrate that eating can be an intriguing interaction technique to enrich virtual reality experiences while offering the complementary benefits of social interactions around food.
DeepID3: Face Recognition with Very Deep Neural Networks | The state-of-the-art of face recognition has been significantly advanced by the emergence of deep learning. Very deep neural networks recently achieved great success on general object recognition because of their superb learning capacity. This motivates us to investigate their effectiveness on face recognition. This paper proposes two very deep neural network architectures, referred to as DeepID3, for face recognition. These two architectures are rebuilt from the stacked convolution and inception layers proposed in VGG net [10] and GoogLeNet [16] to make them suitable for face recognition. Joint face identification-verification supervisory signals are added to both intermediate and final feature extraction layers during training. An ensemble of the proposed two architectures achieves 99.53% LFW face verification accuracy and 96.0% LFW rank-1 face identification accuracy. A further discussion of the LFW face verification result is given at the end.
Deep Learning for Generic Object Detection: A Survey | Generic object detection, aiming at locating object instances from a large number of predefined categories in natural images, is one of the most fundamental and challenging problems in computer vision. Deep learning techniques have emerged in recent years as powerful methods for learning feature representations directly from data, and have led to remarkable breakthroughs in the field of generic object detection. Given this time of rapid evolution, the goal of this paper is to provide a comprehensive survey of the recent achievements in this field brought about by deep learning techniques. More than 250 key contributions are included in this survey, covering many aspects of generic object detection research: leading detection frameworks and fundamental subproblems including object feature representation, object proposal generation, context information modeling and training strategies; and evaluation issues, specifically benchmark datasets, evaluation metrics, and state-of-the-art performance. We finish by identifying promising directions for future research.
‘Cycle Thieves, We Are Watching You’: Impact of a Simple Signage Intervention against Bicycle Theft | BACKGROUND
Bicycle theft is a serious problem in many countries, and there is a lack of evidence concerning effective prevention strategies. Displaying images of 'watching eyes' has been shown to make people behave in more socially desirable ways in a number of settings, but it is not yet clear if this effect can be exploited for purposes of crime prevention. We report the results of a simple intervention on a university campus where signs featuring watching eyes and a related verbal message were displayed above bicycle racks.
METHODOLOGY AND PRINCIPAL FINDINGS
We installed durable signs at three locations which had experienced high levels of bicycle theft, and used the rest of the university campus as a control location. Reported thefts were monitored for 12 months before and after the intervention. Bicycle thefts decreased by 62% at the experimental locations, but increased by 65% in the control locations, suggesting that the signs were effective, but displaced offending to locations with no signs. The Odds Ratio for the effect of the intervention was 4.28 (95% confidence interval 2.04-8.98), a large effect compared to other place-based crime prevention interventions.
CONCLUSIONS AND SIGNIFICANCE
The effectiveness of this extremely cheap and simple intervention suggests that there can be considerable crime-reduction benefits to engaging the psychology of surveillance, even in the absence of surveillance itself. Simple interventions for high-crime locations based on this principle should be considered as an adjunct to other measures, although a possible negative consequence is displacement of offending. |
Wideband Radar for Ballistic Missile Defense and Range-Doppler Imaging of Satellites | Lincoln Laboratory led the nation in the development of high-power wideband radar with a unique capability for resolving target scattering centers and producing three-dimensional images of individual targets. The Laboratory fielded the first wideband radar, called ALCOR, in 1970 at Kwajalein Atoll. Since 1970 the Laboratory has developed and fielded several other wideband radars for use in ballistic-missile-defense research and space-object identification. In parallel with these radar systems, the Laboratory has developed high-capacity, high-speed signal and data processing techniques and algorithms that permit generation of target images and derivation of other target features in near real time. It has also pioneered new ways to realize improved resolution and scatterer-feature identification in wideband radars by the development and application of advanced signal processing techniques. Through the analysis of dynamic target images and other wideband observables, we can acquire knowledge of target form, structure, materials, motion, mass distribution, identifying features, and function. Such capability is of great benefit in ballistic missile decoy discrimination and in space-object identification.
Wire Bonding to Advanced Copper-Low-K Integrated Circuits, the Metal/Dielectric Stacks, and Materials Considerations | There are three areas to consider when designing/implementing wire bonding to advanced ULSI damascene-copper chips having copper metallization and low dielectric-constant polymers embedded beneath them (Cu/LoK). These are: 1) the copper-pad top-surface oxidation inhibitor coating: metal, organic, or inorganic (current work involves evaluating the metal and inorganic options); 2) the low dielectric-constant materials available; 3) the under-pad metal/polymer stacks and support structures necessary for bondability and reliability. There are also various polymer/metallurgical interactions, resulting in long-term packaged-device reliability problems, that can occur as the result of the wire bonding process over low-modulus LoK materials with barriers. These include cracked diffusion barriers, copper diffusion into the LoK polymers, cracking/spalling/crazing of the LoK materials, and bond pad indentation (“cupping”). Low-K polymer materials, with high expansion coefficients and low thermal conductivities, can also increase the stress and further extend any existing damage to barriers. Many of the above problems have previously been encountered when bonding to pads over polymers (MCM-D, polymer buildup layers on PCBs, PBGAs, flex circuits, etc.), and they share some of the same solutions. Well-designed LoK and under-pad structures should have no negative effect on bonding parameters and be invisible to the bonding process.
Variable flux permanent magnet synchronous machine (VF-PMSM) design to meet electric vehicle traction requirements with reduced losses | Variable flux permanent magnet synchronous machines (VF-PMSMs) in which the magnetization state (MS) of low coercive force (low-Hc) permanent magnets can be actively controlled to reduce losses in applications that require wide-speed operation have been proposed recently. While prior focus has been on achieving MS manipulation without over-sizing the inverter and obtaining higher torque capability, this paper extends the design objectives to include the power requirements of an electric vehicle traction motor over its entire speed range. Finite element methods are used to study the effect of combinations of low-Hc and high-Hc permanent magnets arranged in either series or parallel on the performance of VF-PMSMs. It is shown that while both configurations help improve the torque density, only the series configuration can help improve the high speed power capability. Experimental results showing the variable MS property, torque-speed capability and loss reduction capability of a series magnet configuration VF-PMSM test machine are presented. |
A comparison of the standard and the computerized versions of the Well-being Questionnaire (WBQ) and the Diabetes Treatment Satisfaction Questionnaire (DTSQ) | In the present study, the equivalence of paper-and-pencil versus computer assessment of two self-administered questionnaires was investigated by means of a randomized cross-over design. To this end, 105 outpatients with diabetes were invited to participate; 76 patients completed both the computer and the paper-and-pencil version of the Well-being Questionnaire (WBQ) and the Diabetes Treatment Satisfaction Questionnaire (DTSQ) in a randomized order, with a mean interval of 7 days. The scales showed high test-retest correlations, and the means, dispersions, kurtosis and skewness were found to be approximately the same in both versions. In both modes of assessment, the depression and the energy scale proved to be sensitive to carry-over effects, resulting in better well-being scores at the second measurement. Almost all subjects reported that completing a questionnaire on a personal computer was easy. It is concluded that the paper-and-pencil and the computerized versions of the WBQ and DTSQ can be considered equivalent. Therefore, the norms and cut-off scores obtained from paper-and-pencil assessments can be used in computerized versions of the WBQ and DTSQ and vice versa.
Robust Higher Order Potentials for Enforcing Label Consistency | This paper proposes a novel framework for labelling problems which is able to combine multiple segmentations in a principled manner. Our method is based on higher order conditional random fields and uses potentials defined on sets of pixels (image segments) generated using unsupervised segmentation algorithms. These potentials enforce label consistency in image regions and can be seen as a generalization of the commonly used pairwise contrast-sensitive smoothness potentials. The higher order potential functions used in our framework take the form of the Robust P^n model and are more general than the P^n Potts model recently proposed by Kohli et al. We prove that the optimal swap and expansion moves for energy functions composed of these potentials can be computed by solving an st-mincut problem. This enables the use of powerful graph cut based move making algorithms for performing inference in the framework. We test our method on the problem of multi-class object segmentation by augmenting the conventional CRF used for object segmentation with higher order potentials defined on image regions. Experiments on challenging data sets show that integration of higher order potentials quantitatively and qualitatively improves results, leading to much better definition of object boundaries. We believe that this method can be used to yield similar improvements for many other labelling problems.
Look Who Are Disguising Profits: An Application to Chinese Industrial Firms | This paper develops a fairly general empirical procedure to trace out the extent of profit disguising and examine the motives behind it. Applying the methodology to the National Bureau of Statistics of China (NBS) database, which covers more than 20,000 large- and medium-sized industrial firms in China for 1995-2002, we find (i) there is a profit-disguising propensity order by ownership in China (from weakest to strongest): foreign-invested firms < Hong Kong or Taiwan firms < state-owned enterprises < mixed firms < collective firms < private firms. Specifically, we find that, based on a conservative estimation, the private firms in China on average disguise 18.5% more profits than the state-owned enterprises and 37.4% more profits than foreign firms; (ii) firms with tighter financing constraints reveal a stronger tendency to disguise profits; and (iii) smaller firms tend to disguise more profits. These results suggest that tax evasion and the incentive to overcome financing constraints, together with distorted corporate behavior caused by insecure property rights and weak institutions, account for Chinese firms’ profit disguising. We also find that Chinese firms’ profit disguising lies principally in revenue rather than cost.
Payment card fraud detection using neural network committee and clustering | The task of payment card fraud detection using account information is considered. We apply two approaches to organizing the interaction of neural networks: a neural network committee and a clustering approach. Finally, the two methods are compared.
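As a minimal illustration of the committee idea mentioned above, several independently trained classifiers can be combined by majority vote. The threshold-rule "members" and the (amount, hour) feature layout below are invented stand-ins for trained neural networks, not the paper's actual models.

```python
def committee_predict(members, x):
    """Majority vote of a committee of binary classifiers.

    `members` is a list of callables, each mapping a feature vector to
    0 (legitimate) or 1 (fraud). Ties are resolved toward 1 (fraud),
    so the committee errs on the side of flagging a transaction.
    """
    votes = sum(member(x) for member in members)
    return 1 if votes * 2 >= len(members) else 0

# Stand-in "networks": simple threshold rules on a transaction's
# (amount, hour-of-day) features; purely illustrative.
members = [
    lambda x: 1 if x[0] > 1000 else 0,   # large amount
    lambda x: 1 if x[1] < 5 else 0,      # unusual hour
    lambda x: 1 if x[0] > 5000 else 0,   # very large amount
]
flag = committee_predict(members, (2000, 3))  # two of three members vote fraud
```

In a committee of real networks, the members would differ by initialization, architecture, or training subset, so that their errors are partly independent and the vote averages them out.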
Power System Load Shedding: Key Issues and New Perspectives | Optimal load shedding (LS) design as an emergency plan is one of the main control challenges posed by emerging new uncertainties and numerous distributed generators, including renewable energy sources, in a modern power system. This paper presents an overview of the key issues and new challenges in optimal LS synthesis concerning the integration of wind turbine units into power systems. Following a brief survey of the existing LS methods, the impact of power fluctuations produced by wind power on system frequency and voltage performance is presented. Most LS schemes proposed so far use a voltage or frequency parameter via under-frequency or under-voltage LS schemes. Here, the necessity of considering both voltage and frequency indices to achieve a more effective and comprehensive LS strategy is emphasized. It is then clarified that this problem becomes more pronounced in the presence of wind turbines. Keywords— Load shedding, emergency control, voltage, frequency, wind turbine.
Privacy Tipping Points in Smartphones Privacy Preferences | The aim of this research was to understand what affects people's privacy preferences in smartphone apps. We ran a four-week study in the wild with 34 participants. Participants were asked to answer questions, which were used to gather their personal context and to measure their privacy preferences by varying app name and purpose of data collection. Our results show that participants shared the most when no information about data access or purpose was given, and shared the least when both of these details were specified. When just one of either purpose or the requesting app was shown, participants shared less when just the purpose was specified than when just the app name was given. We found that the purpose for data access was the predominant factor affecting users' choices. In our study, the purpose condition varied from not specified, to vague, to very specific. Participants were more willing to disclose data when no purpose was specified. When a vague purpose was shown, participants became more privacy-aware and were less willing to disclose their information. When specific purposes were shown, participants were more willing to disclose when the purpose for requesting the information appeared to be beneficial to them, and shared the least when the purpose for data access was solely beneficial to developers.
Diagnosing Faults in Electrical Power Systems of Spacecraft and Aircraft | Electrical power systems play a critical role in spacecraft and aircraft. This paper discusses our development of a diagnostic capability for an electrical power system testbed, ADAPT, using probabilistic techniques. In the context of ADAPT, we present two challenges, regarding modelling and real-time performance, often encountered in real-world diagnostic applications. To meet the modelling challenge, we discuss our novel high-level specification language which supports auto-generation of Bayesian networks. To meet the real-time challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuits typically have small footprints and are optimized for the real-time avionics systems found in spacecraft and aircraft. Using our approach, we present how Bayesian networks with over 400 nodes are auto-generated and then compiled into arithmetic circuits. Using real-world data from ADAPT as well as simulated data, we obtain average inference times smaller than one millisecond when computing diagnostic queries using arithmetic circuits that model our real-world electrical power system.
Predicting Sufficient Annotation Strength for Interactive Foreground Segmentation | The mode of manual annotation used in an interactive segmentation algorithm affects both its accuracy and ease of use. For example, bounding boxes are fast to supply, yet may be too coarse to get good results on difficult images; freehand outlines are slower to supply and more specific, yet they may be overkill for simple images. Whereas existing methods assume a fixed form of input no matter the image, we propose to predict the tradeoff between accuracy and effort. Our approach learns whether a graph cuts segmentation will succeed if initialized with a given annotation mode, based on the image's visual separability and foreground uncertainty. Using these predictions, we optimize the mode of input requested on new images a user wants segmented. Whether given a single image that should be segmented as quickly as possible, or a batch of images that must be segmented within a specified time budget, we show how to select the easiest modality that will be sufficiently strong to yield high-quality segmentations. Extensive results with real users and three datasets demonstrate the impact.
Topic Detection by Clustering Keywords | We consider topic detection without any prior knowledge of category structure or possible categories. Keywords are extracted and clustered based on different similarity measures using the induced k-bisecting clustering algorithm. Evaluation on Wikipedia articles shows that clusters of keywords correlate strongly with the Wikipedia categories of the articles. In addition, we find that a distance measure based on the Jensen-Shannon divergence of probability distributions outperforms the cosine similarity. In particular, a newly proposed term distribution taking co-occurrence of terms into account gives best results. |
The H2 robotic exoskeleton for gait rehabilitation after stroke: early findings from a clinical study | Stroke significantly affects thousands of individuals annually, leading to considerable physical impairment and functional disability. Gait is one of the most important activities of daily living affected in stroke survivors. Recent technological developments in powered robotics exoskeletons can create powerful adjunctive tools for rehabilitation and potentially accelerate functional recovery. Here, we present the development and evaluation of a novel lower limb robotic exoskeleton, namely H2 (Technaid S.L., Spain), for gait rehabilitation in stroke survivors. H2 has six actuated joints and is designed to allow intensive overground gait training. An assistive gait control algorithm was developed to create a force field along a desired trajectory, only applying torque when patients deviate from the prescribed movement pattern. The device was evaluated in 3 hemiparetic stroke patients across 4 weeks of training per individual (approximately 12 sessions). The study was approved by the Institutional Review Board at the University of Houston. The main objective of this initial pre-clinical study was to evaluate the safety and usability of the exoskeleton. A Likert scale was used to measure patients' perception of the ease of use of the device. Three stroke patients completed the study. The training was well tolerated and no adverse events occurred. Early findings demonstrate that H2 appears to be safe and easy to use in the participants of this study. The overground training environment employed as a means to enhance active patient engagement proved to be challenging and exciting for patients. These results are promising and encourage future rehabilitation training with a larger cohort of patients. The developed exoskeleton enables longitudinal overground training of walking in hemiparetic patients after stroke. 
The system is robust and safe when applied to assist a stroke patient performing an overground walking task. Such device opens the opportunity to study means to optimize a rehabilitation treatment that can be customized for individuals. Trial registration: This study was registered at ClinicalTrials.gov ( https://clinicaltrials.gov/show/NCT02114450 ). |
Comorbidity of auditory processing, language, and reading disorders. | PURPOSE
The authors assessed comorbidity of auditory processing disorder (APD), language impairment (LI), and reading disorder (RD) in school-age children.
METHOD
Children (N = 68) with suspected APD and nonverbal IQ standard scores of 80 or more were assessed using auditory, language, reading, attention, and memory measures. Auditory processing tests included the Frequency Pattern Test (FPT; F. E. Musiek, 1994; D. Noffsinger, R. H. Wilson, & F. E. Musiek, 1994); the Dichotic Digit Test Version 2 (DDT; F. E. Musiek, 1983); the Random Gap Detection Test (R. W. Keith, 2000); the 500-Hz tone Masking Level Difference (V. Aithal, A. Yonovitz, & S. Aithal, 2006); and a monaural low-redundancy speech test (compressed and reverberant words; A. Boothroyd & S. Nittrouer, 1988). The Clinical Evaluation of Language Fundamentals, Fourth Edition (E. Semel, E. Wiig, & W. Secord, 2003) was used to assess language abilities (including auditory memory). Reading accuracy and fluency and phonological awareness abilities were assessed using the Wheldall Assessment of Reading Passages (A. Madelaine & K. Wheldall, 2002) and the Queensland University Inventory of Literacy (B. Dodd, A. Holm, M. Orelemans, & M. McCormick, 1996). Attention was measured using the Integrated Visual and Auditory Continuous Performance Test (J. A. Sandford & A. Turner, 1995).
RESULTS
Of the children, 72% had APD on the basis of these test results. Most of these children (25%) had difficulty with the FPT bilaterally. A further 22% had difficulty with the FPT bilaterally and had right ear deficits for the DDT. About half of the children (47%) had problems in all 3 areas (APD, LI, and RD); these children had the poorest FPT scores. More children had APD-RD or APD-LI than had APD, RD, or LI alone. There were modest correlations between FPT scores and attention and memory, and between DDT scores and memory.
CONCLUSIONS
LI and RD commonly co-occur with APD. Attention and memory are linked to performance on some auditory processing tasks but only explain a small amount of the variance in scores. Comprehensive assessment across a range of areas is required to characterize the difficulties experienced by children with APD. |
A reliable gyroscope-based gait-phase detection sensor embedded in a shoe insole | This paper presents results of patient experiments using a new gait-phase detection sensor (GPDS) together with a programmable functional electrical stimulation (FES) system for subjects with a dropped-foot walking dysfunction. The GPDS (sensors and processing unit) is entirely embedded in a shoe insole and detects in real time four phases (events) during the gait cycle: stance, heel off, swing, and heel strike. The instrumented GPDS insole consists of a miniature gyroscope that measures the angular velocity of the foot and three force sensitive resistors that measure the force load on the shoe insole at the heel and the metatarsal bones. The extracted gait-phase signal is transmitted from the embedded microcontroller to the electrical stimulator and used in a finite state control scheme to time the electrical stimulation sequences. The electrical stimulations induce muscle contractions in the paralyzed muscles leading to a more physiological motion of the affected leg. The experimental results of the quantitative motion analysis during walking of the affected and nonaffected sides showed that the use of the combined insole and FES system led to a significant improvement in the gait-kinematics of the affected leg. This combined sensor and stimulation system has the potential to serve as a walking aid for rehabilitation training or permanent use in a wide range of gait disabilities after brain stroke, spinal-cord injury, or neurological diseases. |
From cyber-physical systems to Industry 4.0: make future manufacturing become possible | Recently, 'Industry 4.0' has become a driving force in the manufacturing field. From CPS to Industry 4.0, these developments are leading a new round of industrial revolution. To meet the fourth industrial revolution, the USA, European countries, Japan, South Korea and China have each presented their own versions of so-called 'Industry 4.0' strategies; these differ in content, but their common goal is to achieve manufacturing structural adjustment, restructuring and upgrading, to seize a place in future manufacturing, and to provide users with better service. Furthermore, we summarise five major trends of future manufacturing. We then introduce some enabling technologies for Industry 4.0; among them, CPS plays a key role in the Industry 4.0 factory. Based on the customer-to-business (C2B) mode, we regard everything as a service and propose a structure for Industry 4.0. Finally, taking the automotive industry as an example, we give a brief introduction to an Industry 4.0 case. [Received 15 December 2015; Revised 30 May 2016; Accepted 30 May 2016] |
Facial part displacement effect on template-based gender and ethnicity classification | Visual information such as gender, age and ethnicity plays a critical role in human identification. Most gender and ethnicity recognition studies use the full face, assuming equal discriminant capability for all face parts. In this paper, we improve gender and ethnicity recognition by applying an optimum decision-making rule to the confidence levels of face regions automatically separated using a modified Golden ratio mask. Faces are preprocessed with multiple-base-point photometric normalization to prevent displacement of facial parts within the mask caused by differences in facial-part distances across individuals. An SVM classifier is applied to the Gabor features extracted from each patch to obtain its confidence level. The final classification results are obtained by combining the per-patch decisions with the optimum decision-making rule. Finally, using the most accurate normalization approach for each patch, we achieve 94% and 98% accuracy for gender and ethnicity, respectively, on a dataset composed of FERET and PEAL frontal face images. |
Social interaction facilitated by a minimally-designed robot: Findings from longitudinal therapeutic practices for autistic children | This paper reports our longitudinal observation of unconstrained child-robot interaction at a daycare center for autistic children. We used a simple robot, Keepon, that is only capable of expressing its attention (directing its gaze) and emotions (pleasure and excitement). While controlled by a remote experimenter, Keepon interacted with the children through its simple appearance and actions. With a sense of curiosity and security, the children spontaneously approached Keepon and engaged in dyadic interaction with it, which then extended to triadic interactions in which they shared with adult caregivers the pleasure and surprise they found in Keepon. The three-year-long observation suggests that autistic children possess the motivation to share mental states with others, which is contrary to the commonly held position that this motivation is impaired in autism. Autistic children, however, instead have difficulty sifting out socially meaningful information (e.g., attention and emotions) from the flood of perceptual information. Keepon, with its minimal expressiveness, directly conveyed socially meaningful information to the children, which activated their intact motivation to share interests and feelings with others. We conclude that social filtering is one of the prerequisites for interpersonal communication and that robots like Keepon can facilitate social filtering and its development in autistic children. |
On the Systematic Analysis of Natural Language Requirements with CIRCE | This paper presents Circe, an environment for the analysis of natural language requirements. Circe is first presented in terms of its architecture, based on a transformational paradigm. Details are then given for the various transformation steps, including (i) a novel technique for parsing natural language requirements, and (ii) an expert system based on modular agents, embodying intensional knowledge about software systems in general. The result of all the transformations is a set of models for the requirements document, for the system described by the requirements, and for the requirements writing process. These models can be inspected, measured, and validated against a given set of criteria. Some of the features of the environment are shown by means of an example. Various stages of requirements analysis are covered, from initial sketches to pseudo-code and UML models. |
Polymeric nanoparticle-encapsulated curcumin ("nanocurcumin"): a novel strategy for human cancer therapy | BACKGROUND
Curcumin, a yellow polyphenol extracted from the rhizome of turmeric (Curcuma longa), has potent anti-cancer properties as demonstrated in a plethora of human cancer cell line and animal carcinogenesis models. Nevertheless, widespread clinical application of this relatively efficacious agent in cancer and other diseases has been limited due to poor aqueous solubility, and consequently, minimal systemic bioavailability. Nanoparticle-based drug delivery approaches have the potential for rendering hydrophobic agents like curcumin dispersible in aqueous media, thus circumventing the pitfalls of poor solubility.
RESULTS
We have synthesized polymeric nanoparticle encapsulated formulation of curcumin - nanocurcumin - utilizing the micellar aggregates of cross-linked and random copolymers of N-isopropylacrylamide (NIPAAM), with N-vinyl-2-pyrrolidone (VP) and poly(ethyleneglycol)monoacrylate (PEG-A). Physico-chemical characterization of the polymeric nanoparticles by dynamic laser light scattering and transmission electron microscopy confirms a narrow size distribution in the 50 nm range. Nanocurcumin, unlike free curcumin, is readily dispersed in aqueous media. Nanocurcumin demonstrates comparable in vitro therapeutic efficacy to free curcumin against a panel of human pancreatic cancer cell lines, as assessed by cell viability and clonogenicity assays in soft agar. Further, nanocurcumin's mechanisms of action on pancreatic cancer cells mirror that of free curcumin, including induction of cellular apoptosis, blockade of nuclear factor kappa B (NFkappaB) activation, and downregulation of steady state levels of multiple pro-inflammatory cytokines (IL-6, IL-8, and TNFalpha).
CONCLUSION
Nanocurcumin provides an opportunity to expand the clinical repertoire of this efficacious agent by enabling ready aqueous dispersion. Future studies utilizing nanocurcumin are warranted in pre-clinical in vivo models of cancer and other diseases that might benefit from the effects of curcumin. |
Behaviour does not fully explain the high risk of chronic liver disease in less educated men in Hungary. | BACKGROUND
Hungary has among the highest mortality rates from chronic liver disease (CLD) and cirrhosis in Europe. Usually, conventional behavioural factors are hypothesized as the cause of the high risk of CLD.
METHODS
A case-control study was performed with 287 cases and 892 controls to study the relationship between socio-economic and behavioural factors and the risk of CLD. Liver disease was verified by physical examination and blood tests. Blood samples were collected for detecting hepatitis B, C and E virus infection. Information on exposure factors was recorded by the participating physicians and by self-administered questionnaire. Simple regression analysis was used to study the relationship between CLD/cirrhosis and potential risk factors such as alcohol intake (amount and source), problem drinking, cigarette smoking, physical activity, viral hepatitis infections, and socio-economic factors (education, financial and marital status). Multiple regression analysis was used to identify whether the effect of socio-economic factors is fully mediated by health behaviour (smoking, alcohol consumption, physical activity).
RESULTS
The univariate analysis showed that heavy alcohol consumption, problem drinking, former and heavy cigarette smoking, single, separated or divorced marital status, bad or very bad perceived financial status and lower education significantly increased the risk of CLD/cirrhosis. The effect of marital status and of education did not change after adjustment for behavioural factors, but the effect of perceived financial status disappeared.
CONCLUSIONS
The effect of low socio-economic status on the risk of CLD/cirrhosis is only partially explained by conventional behavioural risk factors in Hungary. |
Soft robot actuators using energy-efficient valves controlled by electropermanent magnets | This paper presents the design, fabrication, and evaluation of a novel type of valve that uses an electropermanent magnet [1]. This valve is then used to build actuators for a soft robot. The developed EPM valves require only a brief (5 ms) pulse of current to turn flow on or off for an indefinite period of time. EPM valves are characterized and demonstrated to be well suited for the control of elastomer fluidic actuators. The valves drive the pressurization and depressurization of fluidic channels within soft actuators. Furthermore, the forward locomotion of a soft, multi-actuator rolling robot is driven by EPM valves. The small size and energy efficiency of EPM valves may make them valuable in soft mobile robot applications. |
Local Learning Regularized Nonnegative Matrix Factorization | Sequence alignment is an important problem in computational biology. We compare two different approaches to the problem of optimally aligning two or more character strings: bounded dynamic programming (BDP), and divide-and-conquer frontier search (DCFS). The approaches are compared in terms of time and space requirements in 2 through 5 dimensions with sequences of varying similarity and length. While BDP performs better in two and three dimensions, it consumes more time and memory than DCFS for higher-dimensional problems. |
Culture in business process management: a literature review | Purpose – Business process management (BPM) is a management approach that developed with a strong focus on the adoption of information technology (IT). However, there is a growing awareness that BPM requires a holistic organizational perspective especially since culture is often considered a key element in BPM practice. Therefore, the purpose of this paper is to provide an overview of existing research on culture in BPM. Design/methodology/approach – This literature review builds on major sources of the BPM community including the BPM Journal, the BPM Conference and central journal/conference databases. Forward and backward searches additionally deepen the analysis. Based on the results, a model of culture’s role in BPM is developed. Findings – The results of the literature review provide evidence that culture is still a widely under-researched topic in BPM. Furthermore, a framework on culture’s role in BPM is developed and areas for future research are revealed. Research limitations/implications – The analysis focuses on the concepts of BPM and culture. Thus, results do not include findings regarding related concepts such as business process reengineering or change management. Practical implications – The framework provides an orientation for managerial practice. It helps identify dimensions of possible conflicts based on cultural aspects. It thus aims at raising awareness regarding potentially neglected cultural factors. Originality/value – Although culture has been recognized in both theory and practice as an important aspect of BPM, researchers have not systematically engaged with the specifics of the culture phenomenon in BPM. This literature review provides a frame of reference that serves as a basis for future research regarding culture’s role in BPM. |
MMAC: a mobility-adaptive, collision-free MAC protocol for wireless sensor networks | Mobility in wireless sensor networks poses unique challenges to the medium access control (MAC) protocol design. Previous MAC protocols for sensor networks assume static sensor nodes and focus on energy-efficiency. In this paper, we present a mobility-adaptive, collision-free medium access control protocol (MMAC) for mobile sensor networks. MMAC caters for both weak mobility (e.g., topology changes, node joins, and node failures) and strong mobility (e.g., concurrent node joins and failures, and physical mobility of nodes). MMAC is a scheduling-based protocol and thus it guarantees collision avoidance. MMAC allows nodes the transmission rights at particular time-slots based on the traffic information and mobility pattern of the nodes. Simulation results indicate that the performance of MMAC is equivalent to that of TRAMA in static sensor network environments. In sensor networks with mobile nodes or high network dynamics, MMAC outperforms existing MAC protocols, like TRAMA and S-MAC, in terms of energy-efficiency, delay, and packet delivery. |
Multi-pedestrian detection in crowded scenes: A global view | Recent state-of-the-art algorithms have achieved good performance on normal pedestrian detection tasks. However, pedestrian detection in crowded scenes is still challenging due to the significant appearance variation caused by heavy occlusions and complex spatial interactions. In this paper we propose a unified probabilistic framework to globally describe multiple pedestrians in crowded scenes in terms of appearance and spatial interaction. We utilize a mixture model, where every pedestrian is assumed to belong to a specific subclass and is described by the corresponding sub-model. Scores of pedestrian parts are used to represent appearance, and a quadratic kernel is used to represent relative spatial interaction. For efficient inference, multi-pedestrian detection is modeled as a MAP problem and a greedy algorithm is used to obtain an approximation. For discriminative parameter learning, we formulate it as a learning-to-rank problem and propose Latent Rank SVM for learning from weakly labeled data. Experiments on various databases validate the effectiveness of the proposed approach. |
A practical time-delay estimator for localizing speech sources with a microphone array | A frequency-domain-based delay estimator is described, designed specifically for speech signals in a microphone-array environment. It is shown to be capable of obtaining precision delay estimates over a wide range of SNR conditions and is simple enough computationally to make it practical for real-time systems. A location algorithm based upon the delay estimator is then developed. With this algorithm it is possible to localize talker positions to a region only a few centimeters in diameter (not very different from the size of the source), and to track a moving source. Experimental results using data from a real 16-element array are presented to indicate the true performance of the algorithms. |
Single-trial dynamics of motor cortex and their applications to brain-machine interfaces | Increasing evidence suggests that neural population responses have their own internal drive, or dynamics, that describe how the neural population evolves through time. An important prediction of neural dynamical models is that previously observed neural activity is informative of noisy yet-to-be-observed activity on single-trials, and may thus have a denoising effect. To investigate this prediction, we built and characterized dynamical models of single-trial motor cortical activity. We find these models capture salient dynamical features of the neural population and are informative of future neural activity on single trials. To assess how neural dynamics may beneficially denoise single-trial neural activity, we incorporate neural dynamics into a brain-machine interface (BMI). In online experiments, we find that a neural dynamical BMI achieves substantially higher performance than its non-dynamical counterpart. These results provide evidence that neural dynamics beneficially inform the temporal evolution of neural activity on single trials and may directly impact the performance of BMIs. |
Comparing Architectures of Mobile Applications | This article describes various advantages and disadvantages of SMS, WAP, J2ME and Windows CE technologies in designing mobile applications. In defining the architecture of any software application it is important to get the best trade-off between platform's possibilities and design requirements. Achieving optimum software design is even more important with mobile applications where all computer resources are limited. Therefore, it is important to have a comparative analysis of all relevant contemporary approaches in designing mobile applications. As always, the choice between these technologies is determined by application requirements and system capabilities. |