title | abstract |
---|---|
Modeling and Tracking the Driving Environment With a Particle-Based Occupancy Grid | Modeling and tracking the driving environment is a complex problem due to the heterogeneous nature of the real world. In many situations, modeling the obstacles and the driving surfaces can be achieved by the use of geometrical objects, and tracking becomes the problem of estimating the parameters of these objects. In the more complex cases, the scene can be modeled and tracked as an occupancy grid. This paper presents a novel occupancy grid tracking solution based on particles for tracking the dynamic driving environment. The particles will have a dual nature: they will denote hypotheses, as in particle filtering algorithms, but they will also be the building blocks of our modeled world. The particles have position and speed, and they can migrate in the grid from cell to cell, depending on their motion model and motion parameters, but they will also be created and destroyed using the weighting-resampling mechanism that is specific to particle filtering algorithms. The tracking algorithm will be centered on particles, instead of cells. An obstacle grid derived from processing a stereovision-generated elevation map is used as measurement information, and the measurement model takes into account the uncertainties of the stereo reconstruction. The resulting system is a flexible real-time tracking solution for dynamic unstructured driving environments. |
Collateral, Type of Lender and Relationship Banking as Determinants of Credit Risk | This paper analyses the determinants of the probability of default (PD) of bank loans. We focus the discussion on the role of a limited set of variables (collateral, type of lender and bank-borrower relationship) while controlling for the other explanatory variables. The study uses information on the more than three million loans entered into by Spanish credit institutions over a complete business cycle (1988 to 2000) collected by the Bank of Spain’s Credit Register (Central de Información de Riesgos). We find that collateralised loans have a higher PD, loans granted by savings banks are riskier and, finally, that a close bank-borrower relationship increases the willingness to take more risk. JEL: G21 |
A Questionnaire Approach Based on the Technology Acceptance Model for Mobile Tracking of Patient Progress Applications | Healthcare professionals spend much of their time moving between patients and offices, while the supporting technology stays stationary. Mobile applications have therefore been adapted for the healthcare industry. In spite of the advancement and variety of available mobile applications, there is a clear need to investigate the current acceptance of mobile health applications tailored towards tracking patients' condition and sharing and accessing patient information. Consequently, in this study a Technology Acceptance Model (TAM) questionnaire was designed to investigate user acceptance of mobile technology applications within the healthcare industry. The purpose of this study is to design a quantitative approach based on the TAM questionnaire as its primary research methodology, and to use it to evaluate the mobile patient-tracking model. The constructs used for evaluation are: Perceived Usefulness, Perceived Ease of Use, User Satisfaction and Attributes of Usability. All these constructs are modified to suit the context of the study. Moreover, this study outlines the details of each construct and its relevance to the research issue. The outcome of the study is a series of approaches that will be applied to check the suitability of a mobile patient-progress tracking application for the healthcare industry, and how well it achieves the aims and objectives of its design. |
A PIN diode controlled variable attenuator using a 0-dB branch-line coupler | We describe a simple PIN diode controlled variable attenuator that employs a 0-dB branch line directional coupler. The response of the attenuator was measured between 1.3 GHz and 2.6 GHz. At the center frequency, the attenuation monotonically varied from 0.7 dB to 23 dB with the control voltage, and the distributed branch-line coupler structure resulted in low input reflection. Our attenuator is easier to design, smaller in area than a double hybrid coupled attenuator, and has comparable or better reflection and attenuation performance characteristics. |
Dynamic Bernoulli Embeddings for Language Evolution | Word embeddings are a powerful approach for unsupervised analysis of language. Recently, Rudolph et al. (2016) developed exponential family embeddings, which cast word embeddings in a probabilistic framework. Here, we develop dynamic embeddings, building on exponential family embeddings to capture how the meanings of words change over time. We use dynamic embeddings to analyze three large collections of historical texts: the U.S. Senate speeches from 1858 to 2009, the history of computer science ACM abstracts from 1951 to 2014, and machine learning papers on the Arxiv from 2007 to 2015. We find dynamic embeddings provide better fits than classical embeddings and capture interesting patterns about how language changes. |
Pericytes regulate the blood–brain barrier | The blood–brain barrier (BBB) consists of specific physical barriers, enzymes and transporters, which together maintain the necessary extracellular environment of the central nervous system (CNS). The main physical barrier is found in the CNS endothelial cell, and depends on continuous complexes of tight junctions combined with reduced vesicular transport. Other possible constituents of the BBB include extracellular matrix, astrocytes and pericytes, but the relative contribution of these different components to the BBB remains largely unknown. Here we demonstrate a direct role of pericytes at the BBB in vivo. Using a set of adult viable pericyte-deficient mouse mutants we show that pericyte deficiency increases the permeability of the BBB to water and a range of low-molecular-mass and high-molecular-mass tracers. The increased permeability occurs by endothelial transcytosis, a process that is rapidly arrested by the drug imatinib. Furthermore, we show that pericytes function at the BBB in at least two ways: by regulating BBB-specific gene expression patterns in endothelial cells, and by inducing polarization of astrocyte end-feet surrounding CNS blood vessels. Our results indicate a novel and critical role for pericytes in the integration of endothelial and astrocyte functions at the neurovascular unit, and in the regulation of the BBB. |
Response of arthropod species richness and functional groups to urban habitat structure and management | Urban areas are a particular landscape matrix characterized by a fine-grained spatial arrangement of very diverse habitats (the urban mosaic). We investigated arthropods to analyse biodiversity-habitat associations along five environmental gradients (age, impervious area, management, configuration, composition) in three Swiss cities (96 study sites). We considered total species richness and species richness within different functional groups (zoophagous, phytophagous, pollinator, low-mobility, and high-mobility species). Information-theoretic model selection procedures were applied and predictions were calculated based on weighted models. Urban areas yielded on average 284 arthropod species (range: 169–361), with species richness correlating mostly with heterogeneity indices (configuration and composition). Species richness also increased with the age of the urban settlement, while larger proportions of impervious area and intensified habitat management were negatively correlated with it. Functional groups showed contrasting, specific responses to environmental variables. Overall, we found surprisingly little variation in species richness along the gradients, which is possibly due to the fine-grained spatial interlinkage of good (heterogeneous) and bad (sealed) habitats. The highly fragmented nature of urban areas may not represent a major obstacle for the arthropods currently existing in cities, because they have probably been selected for tolerance to fragmentation and for high colonisation potential. Given that built areas are becoming denser, increasing the spatial heterogeneity of urban green space offers potential for counteracting the detrimental effects of densification upon urban biodiversity. By quantifying the expected effects along environmental gradients, this study provides guidance for managers to set priorities when enhancing urban arthropod species richness. |
Planar High-Gain Circularly Polarized Element Antenna for Array Applications | This paper proposes a planar high-gain circularly polarized element antenna that can be used for array applications. The substrate-integrated waveguide (SIW) cavity is designed to support dual-resonant modes, the TE120-like and TE210-like modes, for the implementation of circularly polarized radiation performance. Four rectangular radiation slots and a large perturbation, introduced by inserting a metallic via-brick into the SIW cavity, are deployed to achieve a high simulated gain of 10.28 dBi at 6.65 GHz for the proposed element. As a demonstration of an array application, a 4 × 4 array antenna is designed using the proposed element with a compact beam-forming network. Experiments are carried out to verify the designed prototypes. The measured peak gains of the designed element and array antennas are 9.6 and 20.1 dBi, respectively, which include the loss from SMA connectors. Good agreement between simulated and measured results is obtained. |
Proximity-Coupled Microstrip Antenna for Bluetooth, WiMAX, and WLAN Applications | This letter presents a novel proximity-coupled multiband microstrip patch antenna. The proposed antenna simultaneously operates at Bluetooth (2.4-2.485 GHz), WiMAX (3.3-3.7 GHz), and WLAN (5.15-5.35 and 5.725-5.85 GHz) bands with consistent radiation pattern and uniform gain. The antenna is a two-layer design. A V-shaped patch with a rectangular strip on the top layer is fed by a buried rectangular strip in the middle layer by proximity coupling. Furthermore, resonating parasitic structures in the bottom plane are used to improve the antenna characteristics. The fabricated antenna has a small size of 24 × 27 × 1.6 mm³ on an FR4 substrate. |
Prenatal Dexamethasone and Exogenous Surfactant Therapy: Surface Activity and Surfactant Components in Airway Specimens | To explain some of the effects of prenatal glucocorticoid treatment on lung function, surfactant parameters in the airway specimens of ventilator-dependent preterm infants were analyzed. In this double-blind study, the mothers of these infants had received dexamethasone (DEX) or placebo prenatally. Human surfactant was given for the treatment of moderate to severe respiratory distress syndrome. Seventy-six preterm infants with a mean gestational age of 29 wk and a mean birth weight of 1137 g were studied. The concentrations of surfactant components in epithelial lining fluid (ELF) were analyzed, and the surface activity was measured using a pulsating bubble method. Prenatal DEX treatment increased the responsiveness to exogenous surfactant and decreased the severity of respiratory failure during the first day of life. The treatment had no effect on the concentrations of surfactant phospholipids, which were generally high. Prenatal DEX treatment increased the association between phospholipid concentration in ELF and the degree of respiratory failure. Prenatal DEX improved the surface activity of surfactant isolated from airway specimens and tended to increase the ratio of surfactant protein A to phosphatidylcholine among recipients of exogenous surfactant. A subgroup of infants, offspring of mothers with severe hypertension, had an abnormally low concentration of surfactant protein A and a poor outcome, despite prenatal DEX treatment or surfactant substitution. Prenatal DEX decreased the concentration of nonsedimentable proteins in ELF and decreased the inhibition of surface activity by these proteins. Our results indicate that improved surfactant function during the first day of life explains some of the beneficial pulmonary effects of prenatal glucocorticoid treatment in preterm infants who are ventilator-dependent. |
Model-based influences on humans’ choices and striatal prediction errors | The mesostriatal dopamine system is prominently implicated in model-free reinforcement learning, with fMRI BOLD signals in ventral striatum notably covarying with model-free prediction errors. However, latent learning and devaluation studies show that behavior also shows hallmarks of model-based planning, and the interaction between model-based and model-free values, prediction errors, and preferences is underexplored. We designed a multistep decision task in which model-based and model-free influences on human choice behavior could be distinguished. By showing that choices reflected both influences we could then test the purity of the ventral striatal BOLD signal as a model-free report. Contrary to expectations, the signal reflected both model-free and model-based predictions in proportions matching those that best explained choice behavior. These results challenge the notion of a separate model-free learner and suggest a more integrated computational architecture for high-level human decision-making. |
Arc-fault unwanted tripping survey with UL 1699B-listed products | Since adoption of the 2011 National Electrical Code®, many photovoltaic (PV) direct current (DC) arc-fault circuit interrupters (AFCIs) and arc-fault detectors (AFDs) have been introduced into the PV market. To meet the Code requirements, these products must be listed to Underwriters Laboratories (UL) 1699B Outline of Investigation. The UL 1699B test sequence was designed to ensure basic arc-fault detection capabilities with resistance to unwanted tripping; however, field experiences with AFCI/AFD devices have shown mixed results. In this investigation, independent laboratory tests were performed with UL-listed, UL-recognized, and prototype AFCI/AFDs to reveal any limitations with state-of-the-art arc-fault detection products. By running AFCIs and stand-alone AFDs through realistic tests beyond the UL 1699B requirements, many products were found to be sensitive to unwanted tripping or were ineffective at detecting harmful arc-fault events. Based on these findings, additional experiments are encouraged for inclusion in the AFCI/AFD design process and the certification standard to improve products entering the market. |
Algorithms for Deterministic Incremental Dependency Parsing | Parsing algorithms that process the input from left to right and construct a single derivation have often been considered inadequate for natural language parsing because of the massive ambiguity typically found in natural language grammars. Nevertheless, it has been shown that such algorithms, combined with treebank-induced classifiers, can be used to build highly accurate disambiguating parsers, in particular for dependency-based syntactic representations. In this article, we first present a general framework for describing and analyzing algorithms for deterministic incremental dependency parsing, formalized as transition systems. We then describe and analyze two families of such algorithms: stack-based and list-based algorithms. In the former family, which is restricted to projective dependency structures, we describe an arc-eager and an arc-standard variant; in the latter family, we present a projective and a non-projective variant. For each of the four algorithms, we give proofs of correctness and complexity. In addition, we perform an experimental evaluation of all algorithms in combination with SVM classifiers for predicting the next parsing action, using data from thirteen languages. We show that all four algorithms give competitive accuracy, although the non-projective list-based algorithm generally outperforms the projective algorithms for languages with a non-negligible proportion of non-projective constructions. However, the projective algorithms often produce comparable results when combined with the technique known as pseudo-projective parsing. The linear time complexity of the stack-based algorithms gives them an advantage with respect to efficiency both in learning and in parsing, but the projective list-based algorithm turns out to be equally efficient in practice. 
Moreover, when the projective algorithms are used to implement pseudo-projective parsing, they sometimes become less efficient in parsing (but not in learning) than the non-projective list-based algorithm. Although most of the algorithms have been partially described in the literature before, this is the first comprehensive analysis and evaluation of the algorithms within a unified framework. |
Wide-band microstrip antenna with an H-shaped coupling aperture | Theoretical and experimental results of a wide-band planar antenna are presented. This antenna can achieve a wide bandwidth, low cross-polarization levels, and low backward radiation levels. For wide bandwidth and easy integration with active circuits, it uses aperture-coupled stacked square patches. The coupling aperture is an H-shaped aperture. Based on the finite-difference time-domain method, a parametric study of the input impedance of the antenna is presented, and the effects of each parameter on the antenna impedance are illustrated. One antenna is also designed, fabricated, and measured. The measured return loss exhibits an impedance bandwidth of 21.7%. The cross-polarization levels in both the E- and H-planes are better than 23 dB. The front-to-back ratio of the antenna radiation pattern is better than 22 dB. Both theoretical and experimental results of parameters and radiation patterns are presented and discussed. |
Consumers’ Perception of Online Shopping | E-commerce is emerging strongly, given that organized retail is still not ubiquitous across the length and breadth of the country, with large retail chains making up less than 10% of the market. E-commerce is helping people in smaller towns in India access quality products and services similar to those available to people in the larger cities. It is forecast that close to 60% of online shoppers will come from beyond the top eight large cities by the end of this year. Increasing internet penetration has helped to expand the potential customer pool: internet penetration is only about 10% (about 121 million users), as against about 81% in the US and 36% in China, but this number continues to rise at a consistent pace because of falling prices for broadband connections. The first World Wide Web server and browser, created by Tim Berners-Lee in 1990, opened for commercial use in 1991. Subsequent technological innovations emerged in 1994: online banking, the opening of an online pizza shop by Pizza Hut, Netscape's SSL v2 encryption standard for secure data transfer, and Intershop's first online shopping system. Soon after, Amazon.com launched its online shopping site in 1995, and eBay was introduced in 1996. In the past decade, there has been a dramatic change in the way consumers shop. Although consumers continue to purchase from physical stores, many find it very convenient to shop online, since it frees the customer from personally visiting the store. Internet shopping has its own advantages: it reduces the effort of travelling to a physical store, decisions can be made from home at ease, and various choices and prices can easily be compared with competitors' products to arrive at a decision. This study highlights students' attitudes towards online shopping and their product preferences in online shopping. This enables e-retailers to better support their online customers by developing suitable marketing strategies to attract and convert potential customers into active customers, encouraging them in an efficient way to make a purchase decision. |
How Much is 131 Million Dollars? Putting Numbers in Perspective with Compositional Descriptions | How much is 131 million US dollars? To help readers put such numbers in context, we propose a new task of automatically generating short descriptions known as perspectives, e.g. “$131 million is about the cost to employ everyone in Texas over a lunch period”. First, we collect a dataset of numeric mentions in news articles, where each mention is labeled with a set of rated perspectives. We then propose a system to generate these descriptions consisting of two steps: formula construction and description generation. In construction, we compose formulas from numeric facts in a knowledge base and rank the resulting formulas based on familiarity, numeric proximity and semantic compatibility. In generation, we convert a formula into natural language using a sequence-to-sequence recurrent neural network. Our system obtains a 15.2% F1 improvement over a non-compositional baseline at formula construction and a 12.5 BLEU point improvement over a baseline at description generation. |
Effect of crop residue harvest on long-term crop yield, soil erosion and nutrient balance: trade-offs for a sustainable bioenergy feedstock | As the role of renewable fuels increases in the US energy portfolio, bioenergy has garnered much attention as a low-carbon alternative to petroleum, particularly in the transportation sector. In recent years, there has been a dramatic expansion of biofuel production: in the USA, ethanol production increased by over a factor of six in the last decade (1998–2008) to 9 billion gallons per year (34 GL/year) [101]. The Energy Independence and Security Act (EISA), passed by the US Congress in 2007, mandates a further expansion to 36 billion gallons of biofuels per year (136 GL/year) by 2022. Of this, 21 billion gallons per year (79 GL/year) would be from so-called advanced biofuels – ethanol derived from sources other than corn starch, such as cellulosic ethanol. In addition to ethanol, biomass could also be combusted or cofired in power plants to produce electricity, displacing coal, or serve as chemical feedstock to replace natural gas. Alternatively, sequestered residue (e.g., in deep water) could present an option for reducing atmospheric concentrations of CO2 [1]. Meeting the growing demand and legislated targets for increased production of bioenergy will require a dramatic increase in biomass feedstock supply, and some have expressed concern over the potential negative impacts this would have. For example, the expansion of agriculture into natural areas could result in a loss of biodiversity [2,3] and may incur an insurmountable carbon debt from emissions from land conversion [4]. Intensive farming of land under the Conservation Reserve Program (CRP) or marginal lands could lead to deer [5] and bird [6] habitat destruction, increase water consumption [7], exacerbate soil loss [8] and increase nutrient run-off and eutrophication of riparian and aquatic systems [9]. Furthermore, economic analyses have suggested that competition for crops could increase food prices [10,11]. Some of these drawbacks can be mitigated by taking advantage of agricultural residues. Agricultural residues are already produced as a coproduct of food and fiber production and thus have a large potential as a bioenergy feedstock, requiring no new land or technology to produce. For these reasons, they are likely to be a much cheaper and more immediately available source of biomass feedstocks than crops such as switchgrass (Panicum virgatum) or hybrid poplar (Populus spp.). However, it is unclear what effect different rates of residue biomass removal will have on soil erosion and crop yields. If large-scale exploitation of crop residues … |
The ball-pivoting algorithm for surface reconstruction | The Ball-Pivoting Algorithm (BPA) computes a triangle mesh interpolating a given point cloud. Typically, the points are surface samples acquired with multiple range scans of an object. The principle of the BPA is very simple: Three points form a triangle if a ball of a user-specified radius ρ touches them without containing any other point. Starting with a seed triangle, the ball pivots around an edge (i.e., it revolves around the edge while keeping in contact with the edge's endpoints) until it touches another point, forming another triangle. The process continues until all reachable edges have been tried, and then starts from another seed triangle, until all points have been considered. The process can then be repeated with a ball of larger radius to handle uneven sampling densities. We applied the BPA to datasets of millions of points representing actual scans of complex 3D objects. The relatively small amount of memory required by the BPA, its time efficiency, and the quality of the results obtained compare favorably with existing techniques. |
Towards a design theory for reducing aggression in psychiatric facilities | |
Tamper Resistance — a Cautionary Note | An increasing number of systems, from pay-TV to electronic purses, rely on the tamper resistance of smartcards and other security processors. We describe a number of attacks on such systems — some old, some new and some that are simply little known outside the chip testing community. We conclude that trusting tamper resistance is problematic; smartcards are broken routinely, and even a device that was described by a government signals agency as ‘the most secure processor generally available’ turns out to be vulnerable. Designers of secure systems should consider the consequences with care. |
Radio Frequency Identification Sensors | Radio Frequency Identification (RFID) Systems are used when non-line-of-sight operation is required to identify an object and store its information. Some of the applications of this system are toll collection and inventory control. The following paper presents the ongoing research for the Auburn University Detection and Food Safety (AUDFS) project to use RFID technology combined with sensors for detection of pathogens in food. AUDFS aims to integrate the breakthroughs in the detection of food borne illnesses with advances in wireless and biosensor technologies. It explains the basic building blocks and operation of an RFID system. It also illustrates the data transfer process taking place in the system. |
The US Trade and Budget Deficits in Global Perspective: An Essay in Geopolitical-Economy | The disturbances in the global economy induced by the US trade and budget deficits are widely recognised by economists, even if the precise causes and effects of these deficits remain a topic of debate. What is less widely understood is that the deficits are pregnant with geographical implications for the world economy and that the production of the deficits can be explained in part by the changing and asymmetrical location of the USA in international economic and geopolitical relations. In this paper, both of these points are expanded. In section 2, competing accounts of the production of the US trade and budget deficits are considered. Particular attention is paid to ‘Reaganomics’ as an economic and geopolitical discourse. In section 3, the implications of the twin deficits for the US space-economy are considered. The regional impacts of increased military (and other) spending under Reagan are noted, as are the likely effects of the alienation of US real assets and of the buildup of debt-servicing oblig... |
Toward a New Generation of Virtual Humans for Interactive Experiences | and a civilian car. A seriously injured child lies on the ground, surrounded by a distraught woman and a medic from your team. You ask what happened and your sergeant reacts defensively. He casts an angry glance at the mother and says, “They rammed into us, sir. They just shot out from the side street and our driver couldn’t see them.” Before you can think, an urgent radio call breaks in: “This is Eagle1-6. Where are you guys? Things are heating up. We need you here now!” From the side street, a CNN camera team appears. What do you do now, lieutenant? Interactive virtual worlds provide a powerful medium for entertainment and experiential learning. Army lieutenants can gain valuable experience in decision-making in scenarios like the example above. Others can use the same technology for entertaining role-playing even if they never have to face such situations in real life. Similarly, students can learn about, say, ancient Greece by walking through its virtual streets, visiting its buildings, and interacting with its people. Scientists and science fiction fans alike can experience life in a colony on Mars long before the required infrastructure is in place. The range of worlds that people can explore and experience with virtual-world technology is unlimited, ranging from factual to fantasy and set in the past, present, or future. Our goal is to enrich such worlds with virtual humans—autonomous agents that support face-to-face interaction with people in these environments in a variety of roles, such as the sergeant, medic, or even the distraught mother in the example above. Existing virtual worlds, such as military simulations and computer games, often incorporate virtual humans with varying degrees of intelligence. However, these characters’ ability to interact with human users is usually very limited: Typically, users can shoot at them and they can shoot back. Those characters that support more collegial interactions, such as in children’s educational software, are usually very scripted and offer human users no ability to carry on a dialogue. In contrast, we envision virtual humans that cohabit virtual worlds with people and support face-to-face dialogues situated in those worlds, serving as guides, mentors, and teammates. Although our goals are ambitious, we argue here that many key building blocks are already in place. Early work on embodied conversational agents [1] and animated pedagogical agents [2] has laid the groundwork for face-to-face dialogues with users. Our prior work on Steve [3,4] (see Figure 1) is particularly relevant. Steve is unique among interactive animated agents because it can collaborate with people in 3D virtual worlds as an instructor or teammate. Our goal with Steve is to integrate the latest advances from separate research communities into a single agent architecture. While we continue to add more sophistication, we have already implemented a core set of capabilities and applied it to the Army peacekeeping example described earlier. |
Learning Graph Representations with Recurrent Neural Network Autoencoders | Representing and comparing graphs is a central problem in many fields. We present an approach to learn representations of graphs using recurrent neural network autoencoders. Recurrent neural networks require sequential data, so we begin with several methods to generate sequences from graphs, including random walks, breadth-first search, and shortest paths. We train long short-term memory (LSTM) autoencoders to embed these graph sequences into a continuous vector space. We then represent a graph by averaging its graph sequence representations. The graph representations are then used for graph classification and comparison tasks. We demonstrate the effectiveness of our approach by showing improvements over the existing state-of-the-art on several graph classification tasks, including both labeled and unlabeled graphs. |
Effects of Music on Physiological Arousal: Explorations into Tempo and Genre | Two experiments explore the validity of conceptualizing musical beats as auditory structural features and the potential for increases in tempo to lead to greater sympathetic arousal, measured using skin conductance. In the first experiment, fast- and slow-paced rock and classical music excerpts were compared to silence. As expected, skin conductance response (SCR) frequency was greater during music processing than during silence. Skin conductance level (SCL) data showed that fast-paced music elicits greater activation than slow-paced music. Genre significantly interacted with tempo in SCR frequency, with faster tempo increasing activation for classical music while decreasing it for rock music. A second experiment was conducted to explore the possibility that the presumed familiarity of the genre led to this interaction. Although further evidence was found for conceptualizing musical beat onsets as auditory structure, the familiarity explanation was not supported. Effects of Music Genre and Tempo on Physiological Arousal Music communicates many different types of messages through the combination of sound and lyric (Sellnow & Sellnow, 2001). For example, music can be used to exchange political information (e.g., Frith, 1981; Stewart, Smith, & Denton, 1989). Music can also establish and portray a self- or group-image (Arnett, 1991, 1992; Dehyle, 1998; Kendall & Carterette, 1990; Dillman Carpentier, Knobloch & Zillmann, 2003; Manuel, 1991; McLeod, 1999; see also Hansen & Hansen, 2000). Pertinent to this investigation, music can communicate emotional information (e.g., Juslin & Sloboda, 2001). In short, music is a form of “interhuman communication in which humanly organized, non-verbal sound is perceived as vehiculating primarily affective (emotional) and/or gestural (corporeal) patterns of cognition” (Tagg, 2002, p. 5). 
This idea of music as communication reaches the likes of audio production students, who are taught the concept of musical underscoring, or adding music to “enhance information or emotional content” in a wide variety of ways from establishing a specific locale to intensifying action (Alten, 2005, p. 360). In this realm, music becomes a key instrument in augmenting or punctuating a given message. Given the importance of arousal and/or activation in most theories of persuasion and information processing, an understanding of how music can be harnessed to instill arousal is arguably of benefit to media producers looking to utilize every possible tool when creating messages, whether the messages are commercial appeals, promotional announcements or disease-prevention messages. It is with the motivation of harnessing the psychological response to music for practical application that two experiments were conducted to test whether message creators can rely on musical tempo as a way to increase sympathetic nervous system activation in a manner similar to other structural features of media (i.e., cuts, edits, sound effects, voice changes). Before explaining the original work, a brief description of the current state of the literature on music and emotion is offered. Different Approaches in Music Psychology Although there is little doubt that music ‘vehiculates’ emotion, several debates exist within the music psychology literature about exactly how that process is best conceptualized and empirically approached (e.g., Bever, 1988; Gaver & Mandler, 1987; Juslin & Sloboda, 2001; Lundin, 1985; Sloboda, 1991). The primary conceptual issue revolves around two different schools of thought (Scherer & Zentner, 2001). The first, the cognitivist approach, describes emotional response to music as resulting from the listener’s cognitive recognition of cues within the composition itself. 
Emotivists, on the other hand, eliminate the cognitive calculus required by cue recognition in the score, instead describing emotional response to music as a feeling of emotion. Although both approaches acknowledge a cultural or social influence in how the music is interpreted (e.g., Krumhansl, 1997; Peretz, 2001), the conceptual chasm between emotion as being either expressed or elicited by a piece of music is wide indeed. A second issue in the area of music psychology concerns a difference in the empirical approach present among emotion scholars writ large. Some focus their explorations on specific, discrete affective states (i.e., joy, fear, disgust, etc.), often labeled as the experience of basic emotions (Ortony et al., 1988; Thayer, 1989; Zajonc, 1980). Communication scholars such as Nabi (1999, 2003) and Newhagen (1998) have also found it fruitful to explore unique affective states resulting from mediated messages, driven by the understanding that “each emotion expresses a different relational meaning that motivates the use of mental and/or physical resources in ways consistent with the emotion’s action tendency” (Nabi, 2003, p. 226; also see Wirth & Schramm, 2005 for review). This approach is also well represented by studies exploring human reactions to music (see Juslin & Laukka, 2003 for review). Other emotion scholars design studies where the focus is placed not on the discrete identifier assigned to a certain feeling-state by a listener, but rather the extent to which different feeling-states share common factors or dimensions. The two most commonly studied dimensions are valence—a term given to the relative positive/negative hedonic value, and arousal—the intensity or level to which that hedonic value is experienced. 
The centrality of these two dimensions in the published literature is due to the consistency with which they account for the largest amount of predictive variance across a wide variety of dependent variables (Osgood, Suci & Tannenbaum, 1957; Bradley, 1994; Reisenzein, 1994). This dimensional approach to emotional experience is well-represented by articles in the communication literature exploring the combined impact of valence and arousal on memory (Lang, Bolls, Potter & Kawahara, 1999; Sundar & Kalyanaraman, 2004), liking (Yoon, Bolls, & Lang, 1998), and persuasive appeal (Yoon et al., 1998; Potter, LaTour, Braun-LaTour & Reichert, 2006). When surveying the music psychology literature for studies utilizing the dimensional emotions approach, however, results show that the impact of music on hedonic valence is difficult to consistently predict—arguably due to contextual, experiential or mood-state influences of the listener combined with interpretational differences of the song composers and performers (Bigand, Filipic, & Lalitte, 2005; Cantor & Zillmann, 1973; Gabrielsson & Lindström, 2001; Kendall & Carterette, 1990; Leman, 2003; Lundin, 1985). On the other hand, the measured effects of music on the arousal dimension, while not uniform, are more consistent across studies (see Scherer & Zentner, 2001). For example, numerous experiments have noted the relaxation potential of music—either using compositions pre-tested as relaxing or self-identified by research participants as such. In Bartlett’s (1996) review of music studies using physiological measures, a majority of studies measuring muscle tension found relaxing music to reduce it. Interestingly, slightly more than half of the studies that measured skin temperature found relaxing music to increase it. Pelletier (2004) went beyond reviewing studies individually, conducting a statistical meta-analysis of 22 experiments. 
Conclusions showed that music alone, as well as used in tandem with relaxation techniques, significantly decreased perceived arousal and physiological activation. However, the amount of decrease significantly varied by age, stressor, musical preference, and previous music experience of the participant. These caveats provide possible explanations for the few inconsistent findings across individual studies that show either little or no effects of relaxing music (e.g., Davis-Rollans & Cunningham, 1987; Robb, Nichols, Rutan, & Bishop, et al., 1995; Strauser, 1997; see Standley, 1991 for review) or that show listening to relaxing music yields higher perceived arousal compared to the absence of music (Davis & Thaut, 1989). Burns, Labbé, Williams, and McCall (1999) relied on both self-report and physiological responses to the musical selections to explore music’s ability to generate states of relaxation. The researchers used a predetermined classical excerpt, a predetermined rock excerpt, an excerpt from a “relaxing” selection chosen by each participant, and a condition of sitting in silence. Burns et al. (1999) found that, within groups, both finger temperature and skin conductance decreased over time. Across emotional conditions, self-reported relaxation was lowest for rock listeners and highest for participants in the self-selection and silence conditions. However, no significant between-group physiological differences were found. Rickard (2004) also combined self-reports of emotional impact, enjoyment, and familiarity with psychophysiological measures in evaluating arousal effects of music. Psychophysiological measures included skin conductance responses, chills, skin temperature, and muscle tension. Stimuli included relaxing music, music predetermined to be arousing but not emotionally powerful, self-selected emotionally-powerful music, and an emotionally-powerful film scene. 
Rickard found that music participants had self-identified as emotionally powerful led to the greatest increases in skin conductance and chills, in addition to higher ratings on the self-reported measures. No correlation was found between these effects and participant gender or musical training. Krumhansl (1997) explored how music affects the peripheral nervous system in eliciting emotions in college-aged music students. Classical music selections approximately 180 seconds long were chosen which expressed sadness, happiness or fear. While listening, ha |
A Novel Surgical Technique to Correct Intra-areolar Polythelia. | Polythelia is a rare congenital malformation that occurs in 1-2% of the population. Intra-areolar polythelia, the presence of one or more supernumerary nipples located within the areola, is extremely rare. This article presents 3 cases of intra-areolar polythelia treated at our Department. These cases did not present any other associated malformations. Surgical correction was performed for psychological and cosmetic reasons using advancement flaps. The aesthetic and functional results were satisfactory. |
Evaluating the validity and applicability of automated essay scoring in two massive open online courses | The use of massive open online courses (MOOCs) to expand students’ access to higher education has raised questions regarding the extent to which this course model can provide and assess authentic, higher level student learning. In response to this need, MOOC platforms have begun utilizing automated essay scoring (AES) systems that allow students to engage in critical writing and free-response activities. However, there is a lack of research investigating the validity of such systems in MOOCs. This research examined the effectiveness of an AES tool to score writing assignments in two MOOCs. Results indicated that some significant differences existed between Instructor grading, AES-Holistic scores, and AES-Rubric Total scores within two MOOC courses. However, use of the AES system may still be useful given instructors’ assessment needs and intent. Findings from this research have implications for instructional technology administrators, educational designers, and instructors implementing AES learning activities in MOOC courses. |
Eosinophilic Esophagitis in Patients with Refractory Gastroesophageal Reflux Disease | Background Eosinophilic esophagitis is among the causes of refractory reflux disease. Biopsy of the esophagus is the gold standard for diagnosis. In this study we determined the frequency of eosinophilic esophagitis (EE) in refractory reflux cases referred to the Motility Department of Shahid Beheshti Research Center of Gastroenterology and Liver Disease, Tehran, Iran. Methods In this cross-sectional study, 68 cases with refractory reflux disease underwent endoscopy and had biopsies taken. Specimens were stained with hematoxylin and eosin, and two independent pathologists confirmed the diagnosis of eosinophilic esophagitis. Results Mean (standard deviation, SD) age at diagnosis was 41.8 (10.94) years. All had allergy or atopy, and unexplained dysphagia was noted in 66%. Endoscopic findings were as follows: esophagitis (33.3%), rings (33.3%), and whitish plaques (33.3%). Prevalence of eosinophilic esophagitis was 8.8% (N = 6; one man and five women). No statistical difference in demographic variables was found between eosinophilic esophagitis cases and the others, except for history of atopy, food impaction, and endoscopic features (P value <0.005). Conclusion Eosinophilic esophagitis should be considered in the differential diagnosis of any case of refractory reflux with chronic unexplained dysphagia, a history of recurrent food impaction, atopy, or abnormal endoscopic features. |
Ecological compatibility of GM crops and biological control | Insect-resistant and herbicide-tolerant genetically modified (GM) crops pervade many modern cropping systems (especially field-cropping systems), and present challenges and opportunities for developing biologically based pest-management programs. Interactions between biological control agents (insect predators, parasitoids, and pathogens) and GM crops exceed simple toxicological relationships, a priority for assessing risk of GM crops to non-target species. To determine the compatibility of biological control and insect-resistant and herbicide-tolerant GM crop traits within integrated pest-management programs, this synthesis prioritizes understanding the bi-trophic and prey/host-mediated ecological pathways through which natural enemies interact within cropland communities, and how GM crops alter the agroecosystems in which natural enemies live. Insect-resistant crops can affect the quantity and quality of non-prey foods for natural enemies, as well as the availability and quality of both target and non-target pests that serve as prey/hosts. When they are used to locally eradicate weeds, herbicide-tolerant crops alter the agricultural landscape by reducing or changing the remaining vegetational diversity. This vegetational diversity is fundamental to biological control when it serves as a source of habitat and nutritional resources. Some inherent qualities of both biological control and GM crops provide opportunities to improve upon sustainable IPM systems. For example, biological control agents may delay the evolution of pest resistance to GM crops, and suppress outbreaks of secondary pests not targeted by GM plants, while herbicide-tolerant crops facilitate within-field management of vegetational diversity that can enhance the efficacy of biological control agents. 
By examining the ecological compatibility of biological control and GM crops, and employing them within an IPM framework, the sustainability and profitability of farming may be improved. |
Do intrauterine or genetic influences explain the foetal origins of chronic disease? A novel experimental method for disentangling effects | BACKGROUND
There is much evidence to suggest that risk for common clinical disorders begins in foetal life. Exposure to environmental risk factors, however, is often not random. Many commonly used indices of prenatal adversity (e.g. maternal gestational stress, gestational diabetes, smoking in pregnancy) are influenced by maternal genes and genetically influenced maternal behaviour. As the mother provides the baby with both genes and prenatal environment, associations between prenatal risk factors and offspring disease may be attributable to true prenatal risk effects or to the "confounding" effects of genetic liability that are shared by mother and offspring. Cross-fostering designs, including those that involve embryo transfer, have proved useful in animal studies. However, disentangling these effects in humans poses significant problems for traditional genetic epidemiological research designs.
METHODS
We present a novel research strategy aimed at disentangling maternally provided pre-natal environmental and inherited genetic effects. Families of children aged 5 to 9 years born by assisted reproductive technologies, specifically homologous IVF, sperm donation, egg donation, embryo donation and gestational surrogacy were contacted through fertility clinics and mailed a package of questionnaires on health and mental health related risk factors and outcomes. Further data were obtained from antenatal records.
RESULTS
To date 741 families from 18 fertility clinics have participated. The degree of association between a maternally provided prenatal risk factor and child outcome in the group of families where the woman undergoing the pregnancy and the offspring are genetically related (homologous IVF, sperm donation) is compared to the association in the group where the offspring are genetically unrelated to the woman who undergoes the pregnancy (egg donation, embryo donation, surrogacy). These comparisons can then be examined to infer the extent to which prenatal effects are genetically and environmentally mediated.
CONCLUSION
A study based on children born by IVF treatment and who differ in genetic relatedness to the woman undergoing the pregnancy is feasible. The present report outlines a novel experimental method that permits disaggregation of maternally provided inherited genetic and post-implantation prenatal effects. |
Combined use of positron emission tomography and volume doubling time in lung cancer screening with low-dose CT scanning. | BACKGROUND
In lung cancer screening the ability to distinguish malignant from benign nodules is a key issue. This study evaluates the ability of positron emission tomography (PET) and volume doubling time (VDT) to discriminate between benign and malignant nodules.
METHODS
From the Danish Lung Cancer Screening Trial, participants with indeterminate nodules who were referred for a 3-month rescan were investigated. Resected nodules and indolent nodules (ie, stable for at least 2 years) were included. Between the initial scan and the 3-month rescan, participants were referred for PET. Uptake on PET was categorised as most likely benign to malignant (grades I-IV). VDT was calculated from volume measurements on repeated CT scans using semiautomated pulmonary nodule evaluation software. Receiver operating characteristic (ROC) analyses were used to determine the sensitivity and specificity of PET and VDT.
RESULTS
A total of 54 nodules were included. The prevalence of lung cancer was 37%. In the multivariate model both PET (OR 2.63, p<0.01) and VDT (OR 2.69, p<0.01) were associated with lung cancer. PET and VDT each had a sensitivity of 71% and a specificity of 91%, with cut-off points for malignancy of PET grade >II and VDT <1 year. Combining PET and VDT resulted in a sensitivity of 90% and a specificity of 82%; the ROC cut-off point was either PET or VDT indicating malignancy.
CONCLUSION
PET and VDT predict lung cancer independently of each other. The use of both PET and VDT in combination is recommended when screening for lung cancer with low-dose CT. |
Hearing and saying. The functional neuro-anatomy of auditory word processing. | The neural systems involved in hearing and repeating single words were investigated in a series of experiments using PET. Neuropsychological and psycholinguistic studies implicate the involvement of posterior and anterior left perisylvian regions (Wernicke's and Broca's areas). Although previous functional neuroimaging studies have consistently shown activation of Wernicke's area, there has been only variable implication of Broca's area. This study demonstrates that Broca's area is involved in both auditory word perception and repetition but activation is dependent on task (greater during repetition than hearing) and stimulus presentation (greater when hearing words at a slow rate). The peak of frontal activation in response to hearing words is anterior to that associated with repeating words; the former is probably located in Brodmann's area 45, the latter in Brodmann's area 44 and the adjacent precentral sulcus. As Broca's area activation is more subtle and complex than that in Wernicke's area during these tasks, the likelihood of observing it is influenced by both the study design and the image analysis technique employed. As a secondary outcome from the study, the response of bilateral auditory association cortex to 'own voice' during repetition was shown to be the same as when listening to 'other voice' from a prerecorded tape. |
A Simplification-Translation-Restoration Framework for Cross-Domain SMT Applications | Integration of domain specific knowledge into a general purpose statistical machine translation (SMT) system poses challenges due to insufficient bilingual corpora. In this paper we propose a simplification-translation-restoration (STR) framework for domain adaptation in SMT by simplifying domain specific segments of a text. For an in-domain text, we identify the critical segments and modify them to alleviate the data sparseness problem in the out-domain SMT system. After we receive the translation result, these critical segments are then restored according to the provided in-domain knowledge. We conduct experiments on an English-to-Chinese translation task in the medical domain and evaluate each step of the STR framework. The translation results show significant improvement of our approach over the out-domain and the naïve in-domain SMT systems. |
Trust Among Strangers in Internet Transactions: Empirical Analysis of eBay’s Reputation System | Reputations that are transmitted from person to person can deter moral hazard and discourage entry by bad types in markets where players repeat transactions but rarely with the same player. On the Internet, information about past transactions may be both limited and potentially unreliable, but it can be distributed far more systematically than the informal gossip among friends that characterizes conventional marketplaces. One of the earliest and best known Internet reputation systems is run by eBay, which gathers comments from buyers and sellers about each other after each transaction. Examination of a large data set from 1999 reveals several interesting features of this system, which facilitates many millions of sales each month. First, despite incentives to free ride, feedback was provided more than half the time. Second, well beyond reasonable expectation, it was almost always positive. Third, reputation profiles were predictive of future performance. However, the net feedback scores that eBay displays encourage Pollyanna assessments of reputations and are far from the best predictor available. Fourth, although sellers with better reputations were more likely to sell their items, they enjoyed no boost in price, at least for the two sets of items that we examined. Fifth, there was a high correlation between buyer and seller feedback, suggesting that the players reciprocate and retaliate. |
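The abstract's point that eBay's displayed net score is far from the best available predictor can be illustrated with a toy contrast between the net score and the proportion of positive feedback; the seller counts below are hypothetical, not from the paper's data:

```python
def net_score(pos, neg):
    """eBay-style net feedback score: positives minus negatives."""
    return pos - neg

def prop_positive(pos, neg):
    """Alternative predictor: share of feedback that is positive."""
    total = pos + neg
    return pos / total if total else 0.0

# Hypothetical sellers: high volume with some complaints vs. low volume, clean record.
big_seller = (980, 20)
small_seller = (50, 0)

print(net_score(*big_seller), net_score(*small_seller))          # volume dominates
print(prop_positive(*big_seller), prop_positive(*small_seller))  # rate says the opposite
```

The net score ranks the high-volume seller far above the clean low-volume seller even though its feedback is proportionally worse, which is one way a displayed aggregate can encourage rosy assessments.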
Racial Discrimination in the Sharing Economy: Evidence from a Field Experiment | In an experiment on Airbnb, we find that applications from guests with distinctively African-American names are 16% less likely to be accepted relative to identical guests with distinctively White names. Discrimination occurs among landlords of all sizes, including small landlords sharing the property and larger landlords with multiple properties. It is most pronounced among hosts who have never had an African-American guest, suggesting only a subset of hosts discriminate. While rental markets have achieved significant reductions in discrimination in recent decades, our results suggest that Airbnb’s current design choices facilitate discrimination and raise the possibility of erasing some of these civil rights gains. |
Demand Response Management With Multiple Utility Companies: A Two-Level Game Approach | Demand Response Management (DRM) is a key component of the future smart grid that helps to reduce power peak load and variation. Different from most existing studies that focus on the scenario with a single utility company, this paper studies DRM with multiple utility companies. First, the interaction between utility companies and residential users is modeled as a two-level game. That is, the competition among the utility companies is formulated as a non-cooperative game, while the interaction among the residential users is formulated as an evolutionary game. Then, we prove that the proposed strategies are able to make both games converge to their own equilibrium. In addition, the strategies for the utility companies and the residential users are implemented by distributed algorithms. Illustrative examples show that the proposed scheme is able to significantly reduce peak load and demand variation. |
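The user-side evolutionary game described above can be sketched with a discrete replicator update, in which users gradually shift toward the utility company offering the better payoff; the prices, payoff function, and step size below are illustrative assumptions, not the paper's actual formulation:

```python
def replicator_step(shares, payoffs, eta=0.1):
    """One discrete replicator update: companies offering above-average
    payoff (here, a lower price) attract a larger share of users."""
    avg = sum(s * p for s, p in zip(shares, payoffs))
    new = [s * (1 + eta * (p - avg)) for s, p in zip(shares, payoffs)]
    total = sum(new)
    return [s / total for s in new]

prices = [0.30, 0.20, 0.25]      # illustrative per-unit prices of 3 companies
payoffs = [-p for p in prices]   # users prefer cheaper electricity
shares = [1 / 3, 1 / 3, 1 / 3]   # initial population shares
for _ in range(1000):
    shares = replicator_step(shares, payoffs)
# The population converges on the cheapest company (index 1).
```

With fixed payoffs the dynamic settles on the best option; in the paper the company-side non-cooperative game adjusts prices on top of this, so the two levels interact.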
Light field geometry of a Standard Plenoptic Camera. | The Standard Plenoptic Camera (SPC) is an innovation in photography, allowing two-dimensional images focused at different depths to be acquired from a single exposure. In contrast to conventional cameras, the SPC consists of a micro lens array and a main lens projecting virtual lenses into object space. For the first time, the present research provides an approach to estimate the distance and depth of refocused images extracted from captures obtained by an SPC. Furthermore, estimates for the position and baseline of virtual lenses which correspond to an equivalent camera array are derived. On the basis of paraxial approximation, a ray tracing model employing linear equations has been developed and implemented using Matlab. The optics simulation tool Zemax is utilized for validation purposes. Experiments with a realistically designed SPC demonstrate that a predicted image refocusing distance at 3.5 m deviates by less than 11% from the simulation in Zemax, whereas baseline estimations indicate no significant difference. Applying the proposed methodology will enable an alternative to traditional depth map acquisition by disparity analysis. |
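The paraxial, linear ray-tracing idea underlying such a model can be sketched with free-space and thin-lens transfer steps on a (height, angle) ray; the focal length and distances below are illustrative values, not those of the paper's simulated SPC:

```python
def propagate(ray, d):
    """Paraxial free-space transfer over distance d: y' = y + d * theta."""
    y, theta = ray
    return (y + d * theta, theta)

def thin_lens(ray, f):
    """Paraxial thin-lens refraction: theta' = theta - y / f."""
    y, theta = ray
    return (y, theta - y / f)

f = 0.05                          # illustrative focal length of the main lens (m)
s_obj = 0.10                      # object placed at 2f
s_img = 1 / (1 / f - 1 / s_obj)   # thin-lens equation -> image also at 2f

ray = (0.01, -0.2)                # ray from an object point: (height, angle)
ray = propagate(ray, s_obj)       # object plane -> lens
ray = thin_lens(ray, f)           # refraction at the main lens
ray = propagate(ray, s_img)       # lens -> image plane
# Every ray leaving height +0.01 arrives at height -0.01 (magnification -1).
```

Chaining such linear steps for the main lens and each micro lens is what makes the whole model expressible as linear equations.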
The US Air Force R&M 2000 Initiative Special Introduction | IEEE Transactions on Reliability dedicated to the US Air are incorporated into a system from inception to producFo Transactionseon Reliability anddedilicted tM te UIition and deployment. The future operational effectiveness Force Reliability and Maintainability (R&M) 2000 Iniof our systems will be determined by how well we integrate tiative. For several years we have focused intense attention reliability and maintainability into the design process. To on improved combat readiness. At the very top of our list focus defense contractors on this goal, we have drastically of ways to accomplish this is improved reliability and altered acquisition priorities. Reliability, maintainability, maintainability. The rationale is compelling: broken and producibility now stand as the first items in the highest equipment and unusable systems don't deter war or prevail ranked areas for source selection. on the battle field. R&M improvements will translate hardIn the Air Force, R&M factors have come of age with ware on the ramp into improved sortie rates, increased priority equal to or greater than such traditional factors as mobility, decreased manpower and lower costs. All that performance, cost, and schedule. This R&M attention is adds up to more warfighting capability and, hence, more clear recognition that logistics is not just in the "nice to deterrence. have" category. Logistics is the essential prerequisite for deterrence. ~~~~~~~~and supporter of warfighting capability. Reliability and For the R&M initiative to be fully effective, it must be ain a ina modernlitic takdable. I understood and applied by all those involved in design, challenge all of you in the engineering and academic comresearch, development, acquisition, and management of munities to join me in this important initiative to build Air Force systems. R&M 2000 cuts across academic, insupportable Air Force capabilities. 
dustrial, and governmental lines to redefine system capability not only in terms of performance, but also in Manuscript TR8711 received 1986 December 15; revised 1987 April 21. terms of reliability, maintainability, and producibility. The IEEE Log Number 15946 4 TR > |
Interpolation technique for encoder resolution improvement in permanent magnet synchronous motor drives | This paper presents an interpolation scheme for high-resolution rotor position estimation using data from a low-resolution encoder. The estimator is produced by incorporating a geometric polynomial equation into the rotor movement profile. The low and transient speed characteristics are examined, and it is shown that the technique can produce effective resolution improvements by a factor of 16. |
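The basic idea of interpolating rotor position between coarse encoder edges can be sketched as follows; for brevity this uses a constant-speed (first-order) extrapolation from the last two edge timestamps, whereas the paper fits a polynomial motion profile, and the counts-per-revolution figure is illustrative:

```python
import math

COUNTS_PER_REV = 64                  # illustrative low-resolution encoder
STEP = 2 * math.pi / COUNTS_PER_REV  # angle per encoder count (rad)

def interpolate(last_count, t_prev, t_last, t_now):
    """Estimate rotor angle between encoder edges by extrapolating the
    speed observed over the last two edge timestamps (constant-speed
    model; a polynomial motion profile would refine this further)."""
    speed = STEP / (t_last - t_prev)  # rad/s between the last two edges
    return last_count * STEP + speed * (t_now - t_last)

# Edges arriving every 1 ms; query halfway to the next expected edge.
angle = interpolate(last_count=10, t_prev=0.009, t_last=0.010, t_now=0.0105)
# angle corresponds to 10.5 encoder counts: half a step beyond the last edge.
```

The interpolated estimate changes smoothly between edges, which is what yields an effective resolution higher than the raw count.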
Efficient Large-Scale Approximate Nearest Neighbor Search on the GPU | We present a new approach for efficient approximate nearest neighbor (ANN) search in high dimensional spaces, extending the idea of Product Quantization. We propose a two level product and vector quantization tree that reduces the number of vector comparisons required during tree traversal. Our approach also includes a novel highly parallelizable re-ranking method for candidate vectors by efficiently reusing already computed intermediate values. Due to its small memory footprint during traversal the method lends itself to an efficient, parallel GPU implementation. This Product Quantization Tree (PQT) approach significantly outperforms recent state of the art methods for high dimensional nearest neighbor queries on standard reference datasets. Ours is the first work that demonstrates GPU performance superior to CPU performance on high dimensional, large scale ANN problems in time-critical real-world applications, like loop-closing in videos. |
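The product-quantization building block that the approach above extends can be sketched in plain Python (plain PQ encoding with asymmetric distance computation; the paper's two-level tree and GPU re-ranking are not shown, and the codebooks are toy values):

```python
def sqdist(a, b):
    """Squared Euclidean distance between two equal-length tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def encode(vec, codebooks):
    """Quantize each subvector to the index of its nearest centroid."""
    m = len(codebooks)
    d = len(vec) // m
    return [min(range(len(cb)), key=lambda k: sqdist(vec[i*d:(i+1)*d], cb[k]))
            for i, cb in enumerate(codebooks)]

def adc(query, code, codebooks):
    """Asymmetric distance: exact query subvectors vs. a quantized code."""
    m = len(codebooks)
    d = len(query) // m
    return sum(sqdist(query[i*d:(i+1)*d], codebooks[i][code[i]])
               for i in range(m))

# Two 2-D subspaces with two centroids each (toy codebooks).
codebooks = [[(0.0, 0.0), (1.0, 1.0)], [(0.0, 1.0), (1.0, 0.0)]]
db_vec = (0.9, 1.1, 0.1, 0.9)
code = encode(db_vec, codebooks)     # compact code stored per database vector
dist = adc(db_vec, code, codebooks)  # quantization error of db_vec itself
```

Because distances decompose over subspaces, per-subspace lookup tables can be precomputed for a query, which is the property GPU implementations exploit.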
Black-box Importance Sampling | Importance sampling is widely used in machine learning and statistics, but its power is limited by the restriction of using simple proposals for which the importance weights can be tractably calculated. We address this problem by studying black-box importance sampling methods that calculate importance weights for samples generated from any unknown proposal or black-box mechanism. Our method allows us to use better and richer proposals to solve difficult problems, and (somewhat counter-intuitively) also has the additional benefit of improving the estimation accuracy beyond typical importance sampling. Both theoretical and empirical analyses are provided. |
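For contrast with the black-box setting described above, the standard self-normalized importance sampling estimator that this line of work generalizes (it requires knowing the proposal density q, an assumption the black-box method removes) can be sketched as:

```python
import math
import random

def snis(xs, log_p, log_q, f):
    """Self-normalized importance sampling estimate of E_p[f(X)] from
    samples xs drawn from a proposal with log-density log_q."""
    logw = [log_p(x) - log_q(x) for x in xs]
    top = max(logw)
    w = [math.exp(lw - top) for lw in logw]  # stabilized, unnormalized weights
    z = sum(w)
    return sum(wi * f(x) for wi, x in zip(w, xs)) / z

rng = random.Random(0)
# Target p = N(0, 1), proposal q = N(1, 2); log-densities up to constants.
xs = [rng.gauss(1.0, 2.0) for _ in range(20000)]
log_p = lambda x: -0.5 * x * x
log_q = lambda x: -0.5 * ((x - 1.0) / 2.0) ** 2
est = snis(xs, log_p, log_q, lambda x: x)  # estimates E_p[X] = 0
```

The black-box approach instead assigns weights to the samples without evaluating log_q at all, which is exactly what makes unknown or mechanism-generated proposals usable.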
Enhancing security and privacy in biometrics-based authentication systems | Because biometrics-based authentication offers several advantages over other authentication methods, there has been a significant surge in the use of biometrics for user authentication in recent years. It is important that such biometrics-based authentication systems be designed to withstand attacks when employed in security-critical applications, especially in unattended remote applications such as e-commerce. In this paper we outline the inherent strengths of biometrics-based authentication, identify the weak links in systems employing biometrics-based authentication, and present new solutions for eliminating some of these weak links. Although, for illustration purposes, fingerprint authentication is used throughout, our analysis extends to other biometrics-based methods. |
Abnormal coronary flow velocity reserve after coronary intervention is associated with cardiac marker elevation. | BACKGROUND
Residual reduction of relative coronary flow velocity reserve (rCVR) after successful coronary intervention has been related to microvascular impairment. However, the incidence of cardiac enzyme elevation as a surrogate marker of an underlying embolic myocardial injury in these cases has not been studied.
METHODS AND RESULTS
A series of 55 consecutive patients with successful coronary stenting, periprocedural intracoronary Doppler analysis, and determination of creatine kinase (CK; upper limit of normal [ULN] for women 70 IU/L, for men 80 IU/L) and cardiac troponin T (cTnT; bedside test, threshold 0.1 ng/mL) before and 6, 12, and 24 hours after intervention were studied. Postprocedural rCVR was the only intracoronary Doppler parameter that independently correlated with cTnT (r=-0.498, P<0.001) and CK outcome (r=-0.406, P=0.002). Receiver operating characteristic analysis identified a postprocedural rCVR of 0.78 as the best discriminating value, with a sensitivity of 83.3% and 69.2% and a specificity of 79.1% and 76.2% for detection of cTnT and CK elevation, respectively. Stratified according to this cutoff value, the incidence of cTnT elevation was 52.6% in patients with (n=19) and 5.6% in patients without (n=36) a postprocedural rCVR <0.78 (P<0.001), associated with a CK elevation >1 times the ULN in 36.8% and 5.6% (P=0.005) of patients, respectively.
CONCLUSIONS
Cardiac marker elevation can frequently be found after coronary procedures that are associated with a persistent reduction of rCVR, indicating procedural embolization of atherothrombotic debris with microvascular impairment and myocardial injury as a potential underlying mechanism. |
Resilient packet delivery through software defined redundancy: An experimental study | Mission-critical applications that use computer networks require guaranteed communication with bounds on the time needed to deliver packets, even if hardware along a path fails or congestion occurs. Current networking technologies follow a reactive approach in which retransmission mechanisms and routing update protocols respond once packet loss occurs. We present a proactive approach that pre-builds and monitors redundant paths to ensure timely delivery with no recovery latency, even in the case of a single-link failure. The paper outlines the approach, describes a new mechanism we call a redundancy controller, and explains the use of SDN to establish and change paths. The paper also reports on a prototype system and gives experimental measurements. |
Unsupervised Domain Adaptation With Label and Structural Consistency | Unsupervised domain adaptation deals with scenarios in which labeled data are available in the source domain, but only unlabeled data can be observed in the target domain. Since classifiers trained on source-domain data cannot be expected to generalize well in the target domain, how to transfer the label information from source- to target-domain data is a challenging task. A common technique for unsupervised domain adaptation is to match cross-domain data distributions, so that the domain and distribution differences can be suppressed. In this paper, we propose to utilize the label information inferred from the source domain while jointly exploiting the structural information of the unlabeled target-domain data for adaptation purposes. Our proposed model not only reduces the distribution mismatch between domains but also simultaneously improves the recognition of target-domain data. In the experiments, we show that our approach performs favorably against state-of-the-art unsupervised domain adaptation methods on benchmark data sets. We also provide convergence, sensitivity, and robustness analyses, which support the use of our model for cross-domain classification. |
Radio Link Frequency Assignment | The radio frequency assignment problem is to provide communication channels from limited spectral resources whilst keeping to a minimum the interference suffered by those wishing to communicate in a given radio communication network. This is a combinatorial (NP-hard) optimization problem. In 1993, the CELAR (the French “Centre d'Electronique de l'Armement”) built a suite of simplified versions of Radio Link Frequency Assignment Problems (RLFAP), starting from data on a real network [Roisnel93]. Initially designed for assessing the performance of several Constraint Logic Programming languages, these benchmarks have been made available to the public in the framework of the European EUCLID project CALMA (Combinatorial Algorithms for Military Applications). These problems should look very attractive to the CSP community: the problem is simple to represent, and all constraints are binary and involve finite-domain variables. They nevertheless have some of the flavor of real problems (including large size and several optimization criteria). This paper gives essential facts about the CELAR instances and also introduces the GRAPH instances, which were generated during the CALMA project. |
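Since the instances are binary CSPs over finite frequency domains, the core problem can be sketched with a tiny backtracking solver: variables are radio links, and each binary constraint forces a minimum frequency separation between interfering links. The link names, domains, and separations below are invented for illustration; real CELAR instances are far larger and add optimization criteria.

```python
def assign(links, domains, constraints, partial=None):
    # Chronological backtracking over |f_i - f_j| >= sep constraints
    partial = partial or {}
    if len(partial) == len(links):
        return partial
    var = next(l for l in links if l not in partial)
    for f in domains[var]:
        if all(abs(f - partial[o]) >= sep
               for (a, b), sep in constraints.items()
               for o in [b if a == var else a]
               if (a == var or b == var) and o in partial):
            result = assign(links, domains, constraints, {**partial, var: f})
            if result:
                return result
    return None  # dead end: no consistent frequency for this link

links = ["L1", "L2", "L3"]
domains = {l: [30, 40, 50, 60] for l in links}           # available frequencies
constraints = {("L1", "L2"): 20, ("L2", "L3"): 20, ("L1", "L3"): 10}
sol = assign(links, domains, constraints)
```

Solvers for the actual benchmarks add constraint propagation and optimization of criteria such as the number of distinct frequencies used, but the representation is exactly this simple.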
Orientalism in Euro-American and Indian psychology: historical representations of "natives" in Colonial and postcolonial contexts. | The author examines the historical role of Euro-American psychology in constructing Orientalist representations of the natives who were colonized by the European colonial powers. In particular, the author demonstrates how the power to represent the non-Western "Other" has always resided, and still continues to reside, primarily with psychologists working in Europe and America. It is argued that the theoretical frameworks that are used to represent non-Westerners in contemporary times continue to emerge from Euro-American psychology. Finally, the author discusses how non-Western psychologists internalized these Orientalist images and how such a move has led to a virtual abandonment of pursuing "native" forms of indigenous psychologies in Third World psychology departments. |
Artificial General Intelligence | Any agent that is part of the environment it interacts with and has versatile actuators (such as arms and fingers) will in principle have the ability to self-modify, for example by changing its own source code. As we continue to create more and more intelligent agents, the chances increase that they will learn about this ability. The question is: will they want to use it? For example, highly intelligent systems may find ways to change their goals to something more easily achievable, thereby ‘escaping’ the control of their creators. In an important paper, Omohundro (2008) argued that goal preservation is a fundamental drive of any intelligent system, since a goal is more likely to be achieved if future versions of the agent strive towards the same goal. In this paper, we formalise this argument in general reinforcement learning and explore situations where it fails. Our conclusion is that the self-modification possibility is harmless if and only if the value function of the agent anticipates the consequences of self-modifications and uses the current utility function when evaluating the future. |
Facial Performance Capture with Deep Neural Networks | We present a deep learning technique for facial performance capture, i.e., the transfer of video footage into a motion sequence of a 3D mesh representing an actor’s face. Specifically, we build on a conventional capture pipeline based on computer vision and multiview video, and use its results to train a deep neural network to produce similar output from a monocular video sequence. Once trained, our network produces high-quality results for unseen inputs with greatly reduced effort compared to the conventional system. In practice, we have found that approximately 10 minutes’ worth of high-quality data is sufficient for training a network that can then automatically process as much footage from video to 3D as needed. This yields major savings in the development of modern narrative-driven video games involving digital doubles of actors and potentially hours of animated dialogue per character. |
On complex event processing for sensor networks | Sensor networks have to cope with a high volume of events continuously. Conventional software architectures do not explicitly target the efficient processing of continuous event streams. Recently, event-driven architectures (EDA) have been proposed as a new paradigm for event-based applications. In this paper we propose a reference architecture for sensor-based networks that enables the analysis and processing of complex event streams in real time. Our approach is based on semantically rich event models using ontologies that allow representation of the structural properties of event types and the constraints between them. We then argue in favor of a declarative approach to complex event processing that draws upon rule languages such as Esper and JESS. Finally, we illustrate our approach in the domain of road traffic management for the high-capacity road network in Bilbao, Spain. |
ETS family-associated gene fusions in Japanese prostate cancer: analysis of 194 radical prostatectomy samples | The incidence and clinical significance of the TMPRSS2:ERG gene fusion in prostate cancer has been investigated with contradictory results. It is now common knowledge that significant variability in gene alterations exists according to ethnic background in various kinds of cancer. In this study, we evaluated gene fusions involving the ETS gene family in Japanese prostate cancer. Total RNA from 194 formalin-fixed and paraffin-embedded prostate cancer samples obtained by radical prostatectomy was subjected to reverse-transcriptase polymerase chain reaction to detect the common TMPRSS2:ERG T1-E4 and T1-E5 fusion transcripts and five other non-TMPRSS2:ERG fusion transcripts. We identified 54 TMPRSS2:ERG-positive cases (54/194, 28%) and two HNRPA2B1:ETV1-positive cases (2/194, 1%). The SLC45A3-ELK4 transcript, a fusion transcript without structural gene rearrangement, was detectable in five cases (5/194, 3%). The frequencies of both TMPRSS2:ERG- and non-TMPRSS2:ERG-positive cases were lower than those reported for European, North American or Brazilian patients. Internodular heterogeneity of TMPRSS2:ERG was observed in 5 out of 11 multifocal cases (45%); a frequency similar to that found in European and North American cases. We found a positive correlation between the TMPRSS2:ERG fusion and a Gleason score of ≤7 and patient age, but found no relationship with pT stage or plasma prostate-specific antigen concentration. To exclude the possibility that Japanese prostate cancer displays novel TMPRSS2:ERG transcript variants or has unique 5′ fusion partners for the ETS genes, we performed 5′ RACE using fresh-frozen prostate cancer samples. We identified only the normal 5′ cDNA ends for ERG, ETV1 and ETV5 in fusion-negative cases. 
Because we identified a relatively low frequency of TMPRSS2:ERG and other fusions, further evaluation is required before this promising molecular marker should be introduced into the management of Japanese prostate cancer patients. |
The Tower of Hanoi with Forbidden Moves | We consider a variant of the classical three-peg Tower of Hanoi problem in which limitations on the possible moves among the pegs are imposed. Each variant corresponds to a digraph whose vertices are the pegs, and an edge from one vertex to another designates the ability to move a disk from the first peg to the other, provided that the rules concerning the disk sizes are obeyed. There are five non-isomorphic digraphs on three vertices that are strongly connected, a sufficient condition for the existence of a solution to the problem. We provide optimal algorithms for the problem for all these graphs and find the number of moves each requires. |
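The move-restricted variant can be explored directly by breadth-first search over disk configurations, with the allowed moves given as digraph edges. This is a brute-force illustration of the problem statement, not the paper's optimal algorithms; it does recover the classical 2^n - 1 count when all six directed edges are allowed.

```python
from collections import deque

def min_moves(n, edges):
    # State: peg index of each disk (disk 0 is the smallest); BFS over legal moves
    start, goal = (0,) * n, (2,) * n
    dist = {start: 0}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        if s == goal:
            return dist[s]
        for a, b in edges:
            # Only the smallest disk currently on peg a may move
            top = next((d for d in range(n) if s[d] == a), None)
            if top is None:
                continue
            # Illegal if a smaller disk already sits on peg b
            if any(s[d] == b for d in range(top)):
                continue
            t = s[:top] + (b,) + s[top + 1:]
            if t not in dist:
                dist[t] = dist[s] + 1
                queue.append(t)
    return None  # unreachable only if the move digraph is not strongly connected

K3 = [(a, b) for a in range(3) for b in range(3) if a != b]  # unrestricted Hanoi
CYCLE = [(0, 1), (1, 2), (2, 0)]                             # clockwise moves only
```

Restricting the edges to the clockwise cycle already makes even tiny instances noticeably harder: a single disk needs two moves instead of one, and two disks need seven.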
Intensity-modulated extended-field chemoradiation plus simultaneous integrated boost in the pre-operative treatment of locally advanced cervical cancer: a dose-escalation study. | OBJECTIVE
To investigate the feasibility and determine the recommended pre-operative intensity-modulated radiotherapy (IMRT) dose of extended-field chemoradiation along with simultaneous integrated boost (SIB) dose escalation.
METHODS
A radiation dose of 40 Gy over 4 weeks, 2 Gy/fraction, was delivered to the tumour and the lymphatic drainage (planning target volume, PTV3), which encompassed a volume larger than standard (common iliac lymphatic area up to its apex, in front of the L3 vertebra), concurrently with chemotherapy (cisplatin and 5-fluorouracil). Radiation dose was escalated to the pelvis (PTV2) and to the macroscopic disease (PTV1) with the SIB-IMRT strategy. Three dose levels were planned: Level 1 (PTV3: 40/2 Gy; PTV2: 40/2 Gy; PTV1: 45/2.25 Gy), Level 2 (PTV3: 40/2 Gy; PTV2: 45/2.25 Gy; PTV1: 45/2.25 Gy) and Level 3 (PTV3: 40/2 Gy; PTV2: 45/2.25 Gy; PTV1: 50/2.5 Gy). All treatments were delivered in 20 fractions. Patients were treated in cohorts of between three and six per group using a Phase I study design. The recommended dose was exceeded if two of the six patients in a cohort experienced dose-limiting toxicity within 3 months from treatment.
RESULTS
19 patients [median age: 46 years; The International Federation of Gynecology and Obstetrics (FIGO) stage IB2: 3, IIB: 10, IIIA-IIIB: 6] were enrolled. Median follow-up was 24 months (9-60 months). The most common grade 3/4 toxicity was gastrointestinal (GI) (diarrhoea, mucous discharge, rectal/abdominal pain). At Levels 1 and 2, only one grade 3 GI toxicity per level was recorded, whereas at Level 3, two grade 3 GI toxicities (diarrhoea, emesis and nausea) were recorded.
CONCLUSION
The SIB-IMRT technique was found to be feasible and safe at the recommended doses of 45 Gy to PTV1 and PTV2 and 40 Gy to PTV3 in the pre-operative treatment of patients with locally advanced cervical cancer. Unfortunately, this complex technique was unable to safely escalate dose beyond levels already achieved with three-dimensional conformal radiotherapy technique given acute GI toxicity.
ADVANCES IN KNOWLEDGE
A Phase I radiotherapy dose-escalation trial with SIB-IMRT technique is proposed in cervical cancer. This complex technique is feasible and safe at the recommended doses. |
Goserelin versus leuprolide in the chemical castration of patients with prostate cancer | To evaluate the relative efficacy of leuprolide 3.75 mg, leuprolide 7.5 mg, and goserelin 3.6 mg with respect to the reduction of serum testosterone to castration levels. We prospectively evaluated 60 randomized patients with advanced prostate carcinoma and an indication for hormone blockade. The patients were divided into 3 groups of 20: Group (1) received leuprolide 3.75 mg; Group (2) received leuprolide 7.5 mg; and Group (3) received goserelin 3.6 mg. All groups were treated with monthly applications of the respective drugs. Serum testosterone levels were evaluated at two time points: before treatment and 3 months after treatment. The patients' ages were similar across the three groups, with medians of 72, 70, and 70 years in Groups 1, 2, and 3, respectively. Of the patients who received leuprolide 3.75 mg, leuprolide 7.5 mg, and goserelin 3.6 mg, 26.3, 25, and 35%, respectively, did not reach castration levels at a testosterone cutoff ≤ 50 ng/dl, and 68.4, 30, and 45%, respectively, did not reach castration levels at a testosterone cutoff ≤ 20 ng/dl. There were no statistically significant differences in castration levels when leuprolide 3.75 mg, leuprolide 7.5 mg, and goserelin 3.6 mg were compared together. When compared pairwise, there was a statistically significant difference between leuprolide 3.75 mg and leuprolide 7.5 mg, with the latter presenting better results in reaching castration levels at the cutoff ≤ 20 ng/dl. The importance of this difference, however, must be interpreted with caution, since the comparison of the three groups simultaneously did not reach the established significance level, even though it came close. |
Miniaturized Wilkinson power dividers utilizing capacitive loading | The authors report the miniaturization of a planar Wilkinson power divider by capacitive loading of the quarter-wave transmission lines employed in conventional Wilkinson power dividers. Reduction of the transmission-line segments from λ/4 to between λ/5 and λ/12 is reported here. The input and output lines at the three ports and the lines comprising the divider itself are coplanar waveguide (CPW) and asymmetric coplanar stripline (ACPS), respectively. The 10 GHz power dividers are fabricated on high-resistivity silicon (HRS) and alumina wafers. These miniaturized dividers are 74% smaller than conventional Wilkinson power dividers, and have a return loss better than 30 dB and an insertion loss less than 0.55 dB. Design equations and a discussion of the effect of parasitic reactance on the isolation are presented for the first time. |
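One standard equivalence behind such capacitive miniaturization (not necessarily the paper's exact design equations) replaces a λ/4 line of impedance Z0 with a shorter line of electrical length θ and impedance Z0/sin θ, loaded by shunt capacitors C = cos θ/(ω0·Z0) at both ends; the cascaded ABCD matrix then matches the quarter-wave line at the design frequency. A numerical check for θ = 30° (a λ/12 line) at 10 GHz:

```python
import cmath, math

def line_abcd(z, theta):
    # ABCD matrix of a lossless line: characteristic impedance z, length theta (rad)
    return [[cmath.cos(theta), 1j * z * cmath.sin(theta)],
            [1j * cmath.sin(theta) / z, cmath.cos(theta)]]

def shunt_abcd(y):
    # ABCD matrix of a shunt admittance y
    return [[1, 0], [y, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

z0 = 50.0             # impedance of the quarter-wave line being replaced
f0 = 10e9             # design frequency (10 GHz, as in the divider)
w0 = 2 * math.pi * f0
theta = math.pi / 6   # 30 degrees = lambda/12 physical line

z_short = z0 / math.sin(theta)        # higher-impedance shortened line (100 ohm)
c_load = math.cos(theta) / (w0 * z0)  # shunt capacitor at each end

loaded = matmul(shunt_abcd(1j * w0 * c_load),
                matmul(line_abcd(z_short, theta), shunt_abcd(1j * w0 * c_load)))
ideal = line_abcd(z0, math.pi / 2)    # the lambda/4 line it emulates
```

The two ABCD matrices agree exactly at f0; the equivalence is narrowband, which is why the paper's measured bandwidth and parasitic-reactance discussion matter in practice.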
Learning to handle negated language in medical records search | Negated language is frequently used by medical practitioners to indicate that a patient does not have a given medical condition. Traditionally, information retrieval systems do not distinguish between the positive and negative contexts of terms when indexing documents. For example, when searching for patients with angina, a retrieval system might wrongly consider a patient with a medical record stating "no evidence of angina" to be relevant. While it is possible to enhance a retrieval system by taking into account the context of terms within the indexing representation of a document, some non-relevant medical records can still be ranked highly, if they include some of the query terms with the intended context. In this paper, we propose a novel learning framework that effectively handles negated language. Based on features related to the positive and negative contexts of a term, the framework learns how to appropriately weight the occurrences of the opposite context of any query term, thus preventing documents that may not be relevant from being retrieved. We thoroughly evaluate our proposed framework using the TREC 2011 and 2012 Medical Records track test collections. Our results show significant improvements over existing strong baselines. In addition, in combination with a traditional query expansion and a conceptual representation approach, our proposed framework could achieve a retrieval effectiveness comparable to the performance of the best TREC 2011 and 2012 systems, while not addressing other challenges in medical records search, such as the exploitation of semantic relationships between medical terms. |
Microbuckling instability in elastomeric cellular solids | Compressive properties of elastic cellular solids are studied via experiments upon foam and upon single-cell models. Open-cell foam exhibits a monotonic stress-strain relation with a plateau region; deformation is localized in transverse bands. Single-cell models exhibit a force-deformation relation which is not monotonic. In view of recent concepts of the continuum theory of elasticity, the banding instability of the foam in compression is considered to be a consequence of the non-monotonic relation between force and deformation of the single cell. |
Fenestration in the P1 Segment of the Posterior Cerebral Artery | The posterior cerebral artery (PCA) has been noted in the literature to have anatomical variations, specifically fenestration. Cerebral arteries with fenestrations are uncommon, especially when associated with other vascular pathologies. We report a case of fenestration within the P1 segment of the right PCA associated with a right middle cerebral artery (MCA) aneurysm in an elderly male who presented with a new onset of headaches. The patient was treated with vascular clipping of the MCA aneurysm and has recovered well. Identifying anatomical variations with appropriate imaging is of particular importance in neuro-interventional procedures, as it may have an impact on the procedure itself and consequently on post-interventional outcomes. |
Multiple event analysis for large-scale power systems through cluster-based sparse coding | Accurate event analysis in real time is of paramount importance for high-fidelity situational awareness, such that proper actions can be taken before any isolated faults escalate to cascading blackouts. For large-scale power systems, due to the large intra-class variance and inter-class similarity, the nonlinear nature of the system, and the large dynamic range of the event scale, multi-event analysis presents an intriguing problem. Existing approaches are limited to detecting only single or double events or a specified event type. Although some previous works can distinguish multiple events well in small-scale power systems, the performance tends to degrade dramatically in large-scale systems. In this paper, we focus on multiple event detection, recognition, and temporal localization in large-scale power systems. We discover that there always exist groups of buses whose reactions to each event show a high degree of similarity, and the group membership generally remains the same regardless of the type of event(s). We further verify that the reaction to multiple events can be approximated as a linear combination of the reactions to each constituent event. Based on these findings, we propose a novel method, referred to as cluster-based sparse coding (CSC), to extract all the underlying single events involved in a multi-event scenario. Experimental results based on a simulated large-scale system model (NPCC) show that the proposed CSC algorithm achieves high detection and recognition rates with low false-alarm rates. |
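The key modeling assumption above, that a multi-event reaction is approximately a linear combination of single-event signatures, can be illustrated with a greedy sparse decomposition (matching pursuit). The "signatures" and mixture below are toy data, and this is a generic sparse-coding routine rather than the paper's CSC algorithm:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_events):
    # Greedily pick the best-correlated atom, subtract its projection, repeat
    residual = list(signal)
    coeffs = {}
    for _ in range(n_events):
        k = max(range(len(atoms)), key=lambda i: abs(dot(residual, atoms[i])))
        c = dot(residual, atoms[k]) / dot(atoms[k], atoms[k])
        coeffs[k] = coeffs.get(k, 0.0) + c
        residual = [r - c * a for r, a in zip(residual, atoms[k])]
    return coeffs, residual

# Hypothetical signatures: per-cluster bus reactions to three single events
atoms = [[1, 1, 0, 0, 0, 0],
         [0, 0, 1, 1, 0, 0],
         [0, 0, 0, 0, 1, 1]]
# Observed multi-event reaction = 0.7 * event 0 + 0.3 * event 2
mixed = [0.7, 0.7, 0.0, 0.0, 0.3, 0.3]
coeffs, residual = matching_pursuit(mixed, atoms, n_events=2)
```

With these (orthogonal) toy signatures the decomposition is exact; the paper's contribution is making such a decomposition reliable when the signatures come from clustered bus responses in a large, noisy system.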
ALDOCX: Detection of Unknown Malicious Microsoft Office Documents Using Designated Active Learning Methods Based on New Structural Feature Extraction Methodology | Attackers increasingly take advantage of innocent users who tend to casually open email messages assumed to be benign, carrying malicious documents. Recent targeted attacks aimed at organizations utilize the new Microsoft Word document format (*.docx). Anti-virus software fails to detect new unknown malicious files, including malicious docx files. In this paper, we present ALDOCX, a framework aimed at accurate detection of new unknown malicious docx files that also efficiently enhances the framework’s detection capabilities over time. Detection relies upon our new structural feature extraction methodology (SFEM), which is performed statically using meta-features extracted from docx files. Using machine-learning algorithms with SFEM, we created a detection model that successfully detects new unknown malicious docx files. In addition, because it is crucial to maintain the detection model’s updatability and incorporate new malicious files created daily, ALDOCX integrates our active-learning (AL) methods, which are designed to efficiently assist anti-virus vendors by better focusing their experts’ analytical efforts and enhancing detection capability. ALDOCX identifies and acquires new docx files that are most likely malicious, as well as informative benign files. These files are used for enhancing the knowledge stores of both the detection model and the anti-virus software. The evaluation results show that by using ALDOCX and SFEM, we achieved a high detection rate for malicious docx files (94.44% TPR) compared with the anti-virus software (85.9% TPR), with a very low false-positive rate (0.19%).
ALDOCX’s AL methods used only 14% of the labeled docx files, which led to a reduction of 95.5% in security experts’ labeling efforts compared with passive learning and the support vector machine (SVM)-Margin method (an existing active-learning method). Our AL methods also showed a significant improvement of 91% in the number of unknown docx malware acquired, compared with passive learning and SVM-Margin, thus providing an improved updating solution for the detection model, as well as for the anti-virus software widely used within organizations. |
The use of medical claims to assess incidence, diagnostic procedures and initial treatment of myelodysplastic syndromes and chronic myelomonocytic leukemia in the Netherlands. | Myelodysplastic syndromes (MDS) and chronic myelomonocytic leukemia (CMML) may be underreported in cancer registries such as the Netherlands Cancer Registry (NCR). Analysis of Dutch medical claims can complement NCR data on MDS and CMML. We analyzed data on 3681 MDS patients and 235 CMML patients aged ≥18 years with initial claims for MDS or CMML from the Dutch nationwide medical claims-based Diagnosis Treatment Combination Information System (DIS) between 2008 and 2010. Clinical information was available in the DIS. MDS and CMML were diagnosed without a bone marrow (BM) examination in almost half of the patients. The age-standardized incidence rate (ASR) per 100,000 in the cohort that underwent BM examinations compared with NCR data was 2.8 vs. 3.3 for MDS and 0.2 vs. 0.4 for CMML in 2008-2010. A conservative treatment approach was associated with increasing age and absence of BM examination in MDS (p<0.001 for both) and CMML patients (p<0.033 for both). In conclusion, the ASR of MDS in the cohort that underwent BM examinations was comparable with the NCR. The majority of elderly patients, either with or without BM examinations, received no therapy. Together, MDS and CMML may be misdiagnosed and inappropriately managed without a BM confirmation. |
Generalized quasi-geostrophy for spatially anisotropic rotationally constrained flows | Closed reduced equations analogous to the quasi-geostrophic equations are derived in the extratropics for small Rossby numbers and vertical scales that are comparable to or much larger than horizontal scales. On these scales, significant vertical motions are permitted and found to couple to balanced geostrophic dynamics. In the equatorial regions, similar reduced equations are derived for meridional scales much larger than the vertical and zonal scales. These equations are derived by a systematic exploration of different aspect ratios, and Froude and buoyancy numbers, and offer advantages similar to the standard quasi-geostrophic equations for studies of smaller-scale processes and/or of the equatorial regions. |
Measurement properties of the Flu-Like Symptom Index from the Hepatitis Physical Symptom Severity Diary | Chronic hepatitis C (CHC) virus infection is a serious health issue in the US. Standard treatment involves peginterferon alpha and ribavirin, often associated with adverse side effects including flu-like symptoms. These adverse effects are common reasons for the discontinuation of treatment and therefore represent a major obstacle to the effective treatment of CHC. The Hepatitis Physical Symptom Severity Diary is a newly developed patient-reported outcome measure for assessing physical symptoms in CHC patients. It contains four questions addressing flu-like symptoms [the Flu-Like Symptom Index (FLSI)]. Measurement properties of the FLSI in CHC patients were assessed using data from two randomized clinical trials. Exploratory factor analysis using data from baseline and the last visit while on treatment supported a single-factor solution for the FLSI. Internal reliability and test–retest reliability were acceptable (Cronbach’s alpha range 0.73–0.81; intraclass correlation coefficient range 0.85–0.97), and correspondence to several similar constructs was acceptable. The FLSI score was higher among those with investigator-reported flu-like symptoms (mean = 4.1) than among those without (1.4), although the difference was not statistically significant (p = 0.12). Responsiveness of the FLSI was moderate, as measured by standardized effect sizes and response means, and the minimum important difference (MID) was estimated at 2.5–3.0 points. While additional research should be conducted to evaluate validity with more closely related constructs and to utilize anchor-based methods for estimating the MID, the data suggest that the FLSI has acceptable measurement properties and can be an effective tool for assessing flu-like symptoms in CHC patients. |
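For readers unfamiliar with the reliability statistic reported above, Cronbach's alpha for a k-item scale is k/(k-1) * (1 - sum of item variances / variance of totals). A small worked example on hypothetical 4-item symptom scores (the data are invented, not from the FLSI trials):

```python
def variance(xs):
    # Sample variance (n - 1 denominator)
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    # rows: one list of k item scores per respondent
    k = len(rows[0])
    items = [[r[i] for r in rows] for i in range(k)]
    totals = [sum(r) for r in rows]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# Hypothetical 4-item flu-like symptom scores for 6 respondents (0-3 scale)
scores = [[0, 1, 0, 1],
          [2, 2, 1, 2],
          [3, 3, 3, 2],
          [1, 1, 0, 1],
          [2, 3, 2, 3],
          [0, 0, 1, 0]]
alpha = cronbach_alpha(scores)
```

Because these invented items track one another closely, alpha comes out high (about 0.94); the 0.73 to 0.81 range reported for the FLSI reflects the more realistic item correlations seen in the trials.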
Intelligent fuzzy controller of a quadrotor | The aim of this work is to describe an intelligent system based on fuzzy logic that is developed to control a quadrotor. A quadrotor is a helicopter with four rotors, that make the vehicle more stable but more complex to model and to control. The quadrotor has been used as a testing platform in the last years for various universities and research centres. A quadrotor has six degrees of freedom, three of them regarding the position: height, horizontal and vertical motions; and the other three are related to the orientation: pitch, roll and yaw. A fuzzy control is designed and implemented to control a simulation model of the quadrotor. The inputs are the desired values of the height, roll, pitch and yaw. The outputs are the power of each of the four rotors that is necessary to reach the specifications. Simulation results prove the efficiency of this intelligent control strategy. |
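A minimal sketch of the kind of Mamdani-style fuzzy controller described above, reduced to one input (height error) and one output (normalized rotor thrust), with triangular membership functions and centroid defuzzification. The membership ranges and rules are illustrative, not the paper's tuned controller:

```python
def tri(x, a, b, c):
    # Triangular membership function peaking at b, zero outside [a, c]
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Input: height error (desired - actual), in metres (ranges are illustrative)
err_sets = {"neg": (-2.0, -1.0, 0.0), "zero": (-1.0, 0.0, 1.0), "pos": (0.0, 1.0, 2.0)}
# Output: normalized thrust command (illustrative)
out_sets = {"low": (0.0, 0.25, 0.5), "mid": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0)}
rules = {"neg": "low", "zero": "mid", "pos": "high"}

def fuzzy_thrust(err, steps=100):
    # Mamdani inference: clip each output set by its rule's firing strength,
    # then take the centroid of the aggregated shape over [0, 1]
    firing = {rules[name]: tri(err, *params) for name, params in err_sets.items()}
    num = den = 0.0
    for i in range(steps + 1):
        u = i / steps
        mu = max(min(w, tri(u, *out_sets[o])) for o, w in firing.items())
        num += mu * u
        den += mu
    return num / den if den else 0.5
```

The full controller in the abstract has four such input channels (height, roll, pitch, yaw) and four outputs (one per rotor), but each channel follows this same fuzzify-infer-defuzzify pattern.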
Group Buying on the Web: A Comparison of Price-Discovery Mechanisms | Web-based Group-Buying mechanisms, a refinement of quantity discounting, are being used for both Business-to-Business (B2B) and Business-to-Consumer (B2C) transactions. In this paper, we survey currently operational online Group-Buying markets, and then study this phenomenon using analytical models. We surveyed over fifty active Group-Buying sites, and provide a comprehensive review of Group-Buying practices in the B2B, B2C and non-profit sectors, across three continents. On the modeling side, we build on the coordination literature in Information Economics and the quantity-discounts literature in Operations to develop an analytical model of a monopolist who uses web-based Group-Buying mechanisms under different kinds of demand uncertainty. We derive the monopolist's optimal Group-Buying schedule, and compare his profits with those that obtain under the more conventional posted-price mechanism. We also study the effect of heterogeneity in the demand regimes, in combination with uncertainty, on the relative performance of the two mechanisms. We further study the impact of the timing of the pricing decision (vis-à-vis the production decision) by modeling it as a two-stage game between the monopolist and buyers. Finally, we investigate how Group-Buying schemes compare with posted-price markets when buyers can revise their prior valuation of products based on information received from third parties (infomediaries). In all cases, we characterize the conditions under which one mechanism outperforms the other, and those under which the posted-price and Group-Buy mechanisms lead to identical seller revenues. Our results have implications for firms' choice of price-discovery mechanisms in electronic markets and the scheduling of production and pricing decisions in the presence (and absence) of scale economies of production. |
[Wright's peak flow meter in the assessment of respiratory tract obstruction]. | |
Introgression genetics and breeding between Upland and Pima cotton: a review | The narrow genetic base of elite Upland cotton (Gossypium hirsutum L.) germplasm has been a significant impediment to sustained progress in the development of cotton cultivars to meet the needs of growers and industry in recent years. The prospect of widening the genetic base of Upland cotton by accessing the genetic diversity and fiber quality of Pima cotton (Gossypium barbadense L.) has encouraged interspecific hybridization and introgression efforts for the past century. However, success is limited due mainly to genetic barriers between the two species in the forms of divergent gene regulatory systems, accumulated gene mutations, gene order rearrangements and cryptic chromosomal structure differences that have resulted in hybrid breakdown, hybrid sterility and selective elimination of genes. The objective of this paper is to provide a mini-review in interspecific hybridization between Upland and Pima cotton relevant to breeding under the following sections: (1) qualitative genetics; (2) cytogenetic stocks; (3) quantitative genetics; (4) heterosis, and (5) introgression breeding. Case studies of successful examples are provided. |
MPC-Based Approach to Active Steering for Autonomous Vehicle Systems | In this paper a novel approach to autonomous steering systems is presented. A model predictive control (MPC) scheme is designed in order to stabilize a vehicle along a desired path while fulfilling its physical constraints. Simulation results show the benefits of the systematic control methodology used. In particular we show how very effective steering maneuvers are obtained as a result of the MPC feedback policy. Moreover, we highlight the trade-off between the vehicle speed and the required preview on the desired path in order to stabilize the vehicle. The paper concludes with highlights on future research and on the necessary steps for experimental validation of the approach. |
Artificial intelligence - a modern approach: the intelligent agent book | The first edition of Artificial Intelligence: A Modern Approach has become a classic in the AI literature. It has been adopted by over 600 universities in 60 countries, and has been praised as the definitive synthesis of the field. In the second edition, every chapter has been extensively rewritten. Significant new material has been introduced to cover areas such as constraint satisfaction, fast propositional inference, planning graphs, internet agents, exact probabilistic inference, Markov Chain Monte Carlo techniques, Kalman filters, ensemble learning methods, statistical learning, probabilistic natural language models, probabilistic robotics, and ethical aspects of AI. The book is supported by a suite of online resources including source code, figures, lecture slides, a directory of over 800 links to "AI on the Web," and an online discussion group. All of this is available at: aima.cs.berkeley.edu. |
Stronger generalization bounds for deep nets via a compression approach | Deep nets generalize well despite having more parameters than the number of training samples. Recent works try to give an explanation using PAC-Bayes and Margin-based analyses, but do not as yet result in sample complexity bounds better than naive parameter counting. The current paper shows generalization bounds that are orders of magnitude better in practice. These rely upon new succinct reparametrizations of the trained net: a compression that is explicit and efficient. These yield generalization bounds via a simple compression-based framework introduced here. Our results also provide some theoretical justification for the widespread empirical success in compressing deep nets. Analysis of the correctness of our compression relies upon some newly identified “noise stability” properties of trained deep nets, which are also experimentally verified. The study of these properties and the resulting generalization bounds are also extended to convolutional nets, which had eluded earlier attempts at proving generalization. |
Projecting Functional Models of Imperative Programs | Functional modelling [17, 29] enables functional reasoning methods to be applied to programs written in imperative languages. It is, however, the view of many workers [4, 24] that it is not the notation in which a program is written, but the sheer size of a program which prohibits the application of reasoning and proof techniques. The beauty of Projection is that simple aspects of programs have correspondingly simple models, irrespective of the size and complexity of the overall imperative program. We describe the Projection technique, showing how it allows for the manageable application of functional language technology to 'real' imperative programs. We demonstrate how Projection may facilitate the development of a proof of a program in terms of its constituents and how Projection may allow programmers to select the safety-critical sections of programs for particular attention. We briefly discuss the connection between Projection and Slicing, introduced by Weiser [34, 35], and give examples of the functional programming techniques made available to imperative programmers by virtue of functional modelling. Specifically, we use formal proof [23, 24, 4], Transformation [5, 9] and Partial Evaluation [14, 16, 31]. We claim that the Projection of functional models lends itself to the analysis and proof of existing computer software written in imperative programming languages. |
The Formulation of Public Policy in the Absence of Democracy | The formulation of public policy in nations of a non-democratic nature takes different forms and depends on whether the nations govern from the left or the right. Both have command style governments. I review how policies affect the family; arts, education and language; land, property and possessions; nuclear energy and economic development. Some conclusions are drawn about states without a tolerated opposition. |
Naming Games in Spatially-Embedded Random Networks | We investigate a prototypical agent-based model, the Naming Game, on random geometric networks. The Naming Game is a minimal model, employing local communications, that captures the emergence of shared communication schemes (languages) in a population of autonomous semiotic agents. Implementing the Naming Game on random geometric graphs, with local communications being local broadcasts, serves as a model for agreement dynamics in large-scale, autonomously operating wireless sensor networks. Further, it captures essential features of the scaling properties of the agreement process for spatially-embedded autonomous agents. We also present results for the case when a small density of long-range communication links is added on top of the random geometric graph, resulting in a "small-world"-like network and yielding a significantly reduced time to reach global agreement. |
Inferring Ancestry: Mitochondrial Origins and Other Deep Branches in the Eukaryote Tree of Life | There are ~12 supergroups of complex-celled organisms (eukaryotes), but relationships among them (including the root) remain elusive. For Paper I, I developed a dataset of 37 eukaryotic proteins of ... |
The molecular chaperone SecB is released from the carboxy-terminus of SecA during initiation of precursor protein translocation. | The chaperone SecB keeps precursor proteins in a translocation-competent state and targets them to SecA at the translocation sites in the cytoplasmic membrane of Escherichia coli. SecA is thought to recognize SecB via its carboxy-terminus. To determine the minimal requirement for a SecB-binding site, fusion proteins were created between glutathione-S-transferase and different parts of the carboxy-terminus of SecA and analysed for SecB binding. A strikingly short amino acid sequence corresponding to only the most distal 22 aminoacyl residues of SecA suffices for the authentic binding of SecB or the SecB-precursor protein complex. SecAN880, a deletion mutant that lacks this highly conserved domain, still supports precursor protein translocation but is unable to bind SecB. Heterodimers of wild-type SecA and SecAN880 are defective in SecB binding, demonstrating that both carboxy-termini of the SecA dimer are needed to form a genuine SecB-binding site. SecB is released from the translocase at a very early stage in protein translocation when the membrane-bound SecA binds ATP to initiate translocation. It is concluded that the SecB-binding site on SecA is confined to the extreme carboxy-terminus of the SecA dimer, and that SecB is released from this site at the onset of translocation. |
Anxiety and sensory over-responsivity in toddlers with autism spectrum disorders: bidirectional effects across time. | This report focuses on the emergence of and bidirectional effects between anxiety and sensory over-responsivity (SOR) in toddlers with autism spectrum disorders (ASD). Participants were 149 toddlers with ASD and their mothers, assessed at 2 annual time points. A cross-lag analysis showed that anxiety symptoms increased over time while SOR remained relatively stable. SOR positively predicted changes in anxiety over and above child age, autism symptom severity, NVDQ, and maternal anxiety, but anxiety did not predict changes in SOR. Results suggest that SOR emerges earlier than anxiety, and predicts later development of anxiety. |
RAPSTROM™ first-in-man study long-term results of a biodegradable polymer sustained-release sirolimus-eluting stent in de novo coronary stenoses. | BACKGROUND
Durable polymers used for first-generation drug-eluting stents (DES) potentially contribute to persistent inflammation and late DES thrombosis. We report the first real-life human experience with the rapamycin-eluting biodegradable polymer-coated Rapstrom stent.
METHODS
All consecutive patients with single de novo native coronary stenosis (<30 mm and between 2.5 and 4.0 mm) were enrolled. Major adverse cardiac events (MACE) at 1 year (cardiac death, myocardial infarction [Q and non-Q], or ischemia-driven target lesion revascularization) were the primary end-point.
RESULTS
A total of 123 patients were enrolled. The stent was implanted without complications in all patients, and no MACE were recorded at 30 days. At 12-month follow-up 9 patients (7.3%) experienced a MACE and 4 (3.2%) required a target lesion revascularization, while 1 (1%) stent thrombosis was recorded. A planned angiographic follow-up (FU) was performed in 73 patients (59%) at 9.4 ± 2.6 months following the index procedure. In-stent late loss was 0.16 ± 0.09 mm, and in-segment late loss was 0.18 ± 0.8 mm.
CONCLUSION
The Rapstrom biodegradable polymer rapamycin-eluting stent appeared safe and efficacious in this first real-life human experience, due to a low late lumen loss. Larger randomized studies are required to confirm these preliminary results. |
An empirical study of identifier splitting techniques | Researchers have shown that program analyses that drive software development and maintenance tools supporting search, traceability and other tasks can benefit from leveraging the natural language information found in identifiers and comments. Accurate natural language information depends on correctly splitting the identifiers into their component words and abbreviations. While conventions such as camel-casing can ease this task, conventions are not well-defined in certain situations and may be modified to improve readability, thus making automatic splitting more challenging. This paper describes an empirical study of state-of-the-art identifier splitting techniques and the construction of a publicly available oracle to evaluate identifier splitting algorithms. In addition to comparing current approaches, the results help to guide future development and evaluation of improved identifier splitting approaches. |
Refining Individualized Consideration: Distinguishing Developmental Leadership and Supportive Leadership | This study explores the theoretical and empirical distinction between developmental leadership and supportive leadership, which are currently encompassed in a single subdimension of transformational leadership, individualized consideration. Items were selected to assess these constructs, and hypotheses regarding the differential effects of developmental and supportive leadership were proposed. Confirmatory factor analyses provided support for the proposed distinction between developmental and supportive leadership, although these leadership factors were very strongly associated. Structural equation modelling and multi-level modelling results indicated that both developmental leadership and supportive leadership displayed unique relationships with theoretically selected outcome measures. Developmental leadership displayed significantly stronger relationships with job satisfaction, career certainty, affective commitment to the organization and role breadth self-efficacy than did supportive leadership. Results provide initial evidence in support of the discriminant validity of these two types of leadership. Discussion focuses on the need to further examine the construct of developmental leadership. |
Assessment of structural and transport properties in fibrous C/C composite preforms as digitized by X-ray CMT. Part II : Heat and gas transport | Raw and partially infiltrated carbon-carbon composite preforms have been scanned by high-resolution synchrotron radiation x-ray computerized micro-tomography (CMT). Three-dimensional (3D) high-quality images of the pore space have been produced at two distinct resolutions and have been used for the computation of transport properties: heat conductivity, binary gas diffusivities, Knudsen diffusivities, and viscous flow permeabilities. The computation procedures are based on a double change-of-scale strategy suited to the bimodal nature of pore space, and on the local determination of transport anisotropy. Good agreement has been found between all calculated quantities and experimental data. |
Investigating The Adoption of Knowledge Management in a Non-Governmental Organization in Malaysia | Non-governmental organizations use knowledge extensively in humanitarian efforts, engagements with organizations concerning the welfare of the people, and other activities. The knowledge in non-governmental organizations is mostly tacit. Nevertheless, little is known about knowledge management practices in non-governmental organizations. The aim of this study is to investigate the adoption of knowledge management in a non-governmental organization. This study adopts a case study research design. Data was collected through a survey sent to a non-governmental organization. The questionnaire was based on the four pillars of knowledge management, namely management and organization, people and culture, content and processes, and infrastructure. A total of 31 respondents from one NGO participated in the survey. The data was analyzed using descriptive analyses. The findings indicate that knowledge is widely acknowledged as important in non-governmental organizations. To a certain extent, knowledge management principles have been practiced in the organization. However, non-governmental organizations have yet to establish an organization-wide knowledge management strategy. The study provides an understanding of how knowledge management is perceived by non-governmental organizations and provides valuable insights into the practice of knowledge management in non-governmental organizations. |
A Theory of Object Recognition: Computations and Circuits in the Feedforward Path of the Ventral Stream in Primate Visual Cortex | We describe a quantitative theory to account for the computations performed by the feedforward path of the ventral stream of visual cortex and the local circuits implementing them. We show that a model instantiating the theory is capable of performing recognition on datasets of complex images at the level of human observers in rapid categorization tasks. We also show that the theory is consistent with (and in some cases has predicted) several properties of neurons in V1, V4, IT and PFC. The theory seems sufficiently comprehensive, detailed and satisfactory to represent an interesting challenge for physiologists and modelers: either disprove its basic features or propose alternative theories of equivalent scope. The theory suggests a number of open questions for visual physiology and psychophysics. This version replaces the preliminary “Halloween” CBCL paper from Nov. 2005. |
Personae: a Corpus for Author and Personality Prediction from Text | We present a new corpus for computational stylometry, more specifically authorship attribution and the prediction of author personality from text. Because of the large number of authors (145), the corpus will allow previously impossible studies of variation in features considered predictive for writing style. The innovative meta-information (personality profiles of the authors) associated with these texts allows the study of personality prediction, a not yet very well researched aspect of style. In this paper, we describe the contents of the corpus and show its use in both authorship attribution and personality prediction. We focus on features that have been proven useful in the field of author recognition. Syntactic features like part-of-speech n-grams are generally accepted as not being under the author’s conscious control and therefore providing good clues for predicting gender or authorship. We want to test whether these features are helpful for personality prediction and authorship attribution on a large set of authors. Both tasks are approached as text categorization tasks. First a document representation is constructed based on feature selection from the linguistically analyzed corpus (using the Memory-Based Shallow Parser (MBSP)). These are associated with each of the 145 authors or each of the four components of the Myers-Briggs Type Indicator (Introverted-Extraverted, Sensing-iNtuitive, Thinking-Feeling, Judging-Perceiving). Authorship attribution on 145 authors achieves results around 50% accuracy. Preliminary results indicate that the first two personality dimensions can be predicted fairly accurately. |
The accuracy of prognostic scoring systems for post-operative morbidity and mortality in patients with perforated peptic ulcer | In urgent surgical procedures for perforated peptic ulcer (PPU), there is considerable postoperative morbidity and mortality. The overall mortality rate is about 9-27%.1-3 A large number of prognostic factors for morbidity and mortality in patients with perforated peptic ulcer have been reported.2-7 Several clinical scoring systems have been proposed for prognostic prediction. The best-known is the Boey score,3 which predicts mortality in PPU patients based on the time from perforation to admission, pre-operative systolic blood pressure, and comorbid conditions. Boey scores of 0, 1 and 2 were associated with mortality rates of 0%, 10% and 100%, respectively. Lohsiriwat et al. found that a higher Boey score was associated with increasing rates of both morbidity and mortality and could be considered a simple and appropriate prognostic marker in the management of PPU.1 |
JSForce: A Forced Execution Engine for Malicious JavaScript Detection | The drastic increase of JavaScript exploitation attacks has led to a strong interest in developing techniques to enable malicious JavaScript analysis. Existing analysis techniques fall into two general categories: static analysis and dynamic analysis. Static analysis tends to produce inaccurate results (both false positives and false negatives) and is vulnerable to a wide series of obfuscation techniques. Thus, dynamic analysis is constantly gaining popularity for exposing the typical features of malicious JavaScript. However, existing dynamic analysis techniques possess limitations such as limited code coverage and incomplete environment setup, leaving a broad attack surface for evading detection. To overcome these limitations, we present the design and implementation of a novel JavaScript forced execution engine named JSForce which drives an arbitrary JavaScript snippet to execute along different paths without any input or environment setup. We evaluate JSForce using 220,587 HTML and 23,509 PDF real-world samples. Experimental results show that by adopting our forced execution engine, the malicious JavaScript detection rate can be substantially boosted by 206.29% using the same detection policy without any noticeable false positive increase. We also make JSForce publicly available as an online service and will release the source code to the security community upon acceptance for publication. |
A Comparative Analysis of Selection Schemes Used in Genetic Algorithms | This paper considers a number of selection schemes commonly used in modern genetic algorithms. Specifically, proportionate reproduction, ranking selection, tournament selection, and Genitor (or "steady state") selection are compared on the basis of solutions to deterministic difference or differential equations, which are verified through computer simulations. The analysis provides convenient approximate or exact solutions as well as useful convergence time and growth ratio estimates. The paper recommends practical application of the analyses and suggests a number of paths for more detailed analytical investigation of selection techniques. |
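Two of the schemes compared in this record, tournament selection and proportionate (roulette-wheel) reproduction, can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's own code; the function names and toy fitness function are hypothetical:

```python
import random

def tournament_select(population, fitness, k=2, rng=random):
    """Sample k individuals uniformly at random and return the fittest.
    Selection pressure increases with tournament size k."""
    return max(rng.sample(population, k), key=fitness)

def proportionate_select(population, fitness, rng=random):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its (non-negative) fitness."""
    weights = [fitness(x) for x in population]
    return rng.choices(population, weights=weights, k=1)[0]

# toy run: individuals are the integers 0..9, fitness is the identity
pop = list(range(10))
rng = random.Random(0)
parent = tournament_select(pop, fitness=lambda x: x, k=3, rng=rng)
```

When k equals the population size, tournament selection deterministically returns the fittest individual, which matches the paper's growth-ratio intuition that larger tournaments mean stronger selection pressure.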
Three-dimensional particle simulation modeling of ion propulsion plasma environment for Deep Space One | A fully three-dimensional particle-in-cell simulation model was developed to obtain the ion propulsion charge-exchange plasma environment over the entire downstream-to-upstream region for the Deep Space 1 (DS1) spacecraft. Simulations are compared with in-flight measurements of charge-exchange plasma from the Ion Propulsion Diagnostics Subsystem on DS1, and the results show an excellent agreement. It is found that the plasma environment of the DS1 spacecraft is dominated by the charge-exchange plasma from the plume. For a typical ion thruster operating condition, the charge-exchange plasma near the spacecraft surface has a density ranging from 10⁶ cm⁻³ at the thruster end to 10⁴ cm⁻³ at the opposite end, and a current density of up to 10⁻⁷ A/cm². It is shown that for an interplanetary spacecraft with a moderate charging potential, charge-exchange ion backflow is through an expansion process similar to that of the expansion of a mesothermal plasma into a vacuum. |
Determined to Die! Ability to Act Following Multiple Self-inflicted Gunshot Wounds to the Head. The Cook County Office of Medical Examiner Experience (2005-2012) and Review of Literature. | Cases of multiple (considered 2+) self-inflicted gunshot wounds are a rarity and require careful examination of the scene of occurrence; thorough consideration of the decedent's psychiatric, medical, and social histories; and accurate postmortem documentation of the gunshot wounds. We present a series of four cases of multiple self-inflicted gunshot wounds to the head from the Cook County Medical Examiner's Office between 2005 and 2012 including the first case report of suicide involving eight gunshot wounds to the head. In addition, a review of the literature concerning multiple self-inflicted gunshot wounds to the head is performed. The majority of reported cases document two gunshot entrance wound defects. Temporal regions are the most common affected regions (especially the right and left temples). Determining the capability to act following a gunshot wound to the head is necessary in crime scene reconstruction and in differentiation between homicide and suicide. |
Modified singular value decomposition by means of independent component analysis | In multisensor signal processing (underwater acoustics, geophysics, etc.), the initial dataset is usually separated into complementary subspaces, called signal and noise subspaces, in order to enhance the signal-to-noise ratio. The Singular Value Decomposition (SVD) is a useful tool to achieve this separation. It provides two orthogonal matrices that convey information on normalized wavelets and propagation vectors. Although the signal and noise subspaces are on the whole well evaluated, the SVD procedure usually cannot correctly extract only the source waves with a high degree of sensor-to-sensor correlation. This is due to the constraint imposed by the orthogonality of the propagation vectors. To relax this condition, exploiting the concept of Independent Component Analysis (ICA), we propose another orthogonal matrix made up of statistically independent normalized wavelets. By using this combined SVD-ICA procedure, we obtain a better separation of the source waves in the signal subspace. The efficiency of this new separation procedure is shown on synthetic and real datasets. |
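The classical SVD-based subspace separation that this record builds on (before its ICA refinement) can be sketched as follows, assuming NumPy; the function name and rank-1 toy example are hypothetical, not from the paper:

```python
import numpy as np

def svd_subspace_split(X, rank):
    """Split a (sensors x samples) data matrix into a signal subspace
    (the leading `rank` singular components) and a noise subspace
    (the residual), as in classical SVD-based wavefield separation."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    signal = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
    noise = X - signal
    return signal, noise

# toy example: one perfectly correlated wave across 4 sensors plus weak noise
rng = np.random.default_rng(0)
wave = np.outer(np.ones(4), np.sin(np.linspace(0.0, 2.0 * np.pi, 100)))
X = wave + 0.01 * rng.standard_normal(wave.shape)
signal, noise = svd_subspace_split(X, rank=1)
```

The ICA step proposed in the paper then replaces the orthogonal wavelet matrix with statistically independent wavelets, relaxing the orthogonality constraint on the propagation vectors; that refinement is not shown here.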
ELECTORAL RULES AS CONSTRAINTS ON CORRUPTION: THE RISKS OF CLOSED-LIST PROPORTIONAL REPRESENTATION | This paper investigates how different electoral rules influence political corruption. We argue that closed-list proportional representation systems are most susceptible to corruption relative to open-list proportional representation and plurality systems. We argue that this effect is due to the way both closed party lists and geographically large districts limit voters’ ability to monitor incumbents. We also examine interaction effects between electoral rules and other institutional forms, namely presidentialism, federalism, and bi-cameralism. We test our main predictions, the proposed causal mechanism, and interaction effects empirically on a cross-section of 105 countries, controlling for economic, political, and social background factors. The empirical findings strongly support our theoretical hypothesis that closed-list PR systems, especially together with presidentialism, are associated with higher levels of corruption. This result is robust to different model specifications and deleting influential observations. |
Split hardware transactions: true nesting of transactions using best-effort hardware transactional memory | Transactional Memory (TM) is on its way to becoming the programming API of choice for writing correct, concurrent, and scalable programs. Hardware TM (HTM) implementations are expected to be significantly faster than pure software TM (STM); however, full hardware support for true closed and open nested transactions is unlikely to be practical.
This paper presents a novel mechanism, the split hardware transaction (SpHT), that uses minimal software support to combine multiple segments of an atomic block, each executed using a separate hardware transaction, into one atomic operation. The idea of segmenting transactions can be used for many purposes, including nesting, local retry, orElse, and user-level thread scheduling; in this paper we focus on how it allows linear closed and open nesting of transactions. SpHT overcomes the limited expressive power of best-effort HTM while imposing overheads dramatically lower than STM and preserving useful guarantees such as strong atomicity provided by the underlying HTM. |
Throughput Estimation for Short Lived TCP Cubic Flows | Mobile devices are increasingly becoming the dominant device for Internet access. The network throughput achieved by a mobile device directly affects the performance and user experience. Throughput measurement techniques thus play an important role in predicting expected performance. Measurement techniques that require the transfer of large amounts of data can provide higher accuracy but incur large overhead. Further, since most mobile cellular plans impose usage quota, the overhead of such measurements over cellular networks can become quite high. Smaller data transfers have also been used to measure the throughput. Due to the conservative TCP slow start behaviour, however, these measurements often underestimate the achievable throughput. Considering these weaknesses in existing throughput measurement techniques, we propose a throughput estimation technique for TCP Cubic that uses 1 MB of data transfer to predict the throughput for prevalent large transfer sizes in mobile traffic such as 5 MB, 10 MB and 20 MB. Our evaluation shows that our approach can achieve high accuracy with low overhead, in predicting the achievable throughput. |
Clinical parameters in male genital lichen sclerosus: a case series of 329 patients. | BACKGROUND
The dermatological aspects of male genital lichen sclerosus (MGLSc) have not received much prominence in the literature. Sexual morbidity appears under-appreciated, the role of histology is unclear, the relative places of topical medical treatment and circumcision are not established, the prognosis for sexual function, urinary function and penis cancer is uncertain and the pathogenesis has not been specifically studied although autoimmunity (as in women) and HPV infection have been mooted.
OBJECTIVE
To illuminate the above by analysing the clinical parameters of a large series of patients with MGLSc.
METHODS
A total of 329 patients with a clinical diagnosis of MGLSc were identified retrospectively from a dermatology-centred multidisciplinary setting. Their clinical and histopathological features and outcomes have been abstracted from the records and analysed by simple descriptive statistics.
RESULTS
The collation and analysis of clinical data derived from the largest series of men with MGLSc ever studied from a dermatological perspective has been achieved. These data allow the conclusions below to be drawn.
CONCLUSIONS
MGLSc is unequivocally a disease of the uncircumcised male; the adult peak is late in the fourth decade; dyspareunia is a common presenting complaint; non-specific histology requires careful interpretation; most men are either cured by topical treatment with ultrapotent steroid (50-60%) or by circumcision (>75%); effective and definitive management appears to abrogate the risk of developing penile squamous cell carcinoma; urinary contact is implicated in the pathogenesis of MGLSc; HPV infection and autoimmunity seem unimportant. |
Artificial Intelligence : Potential Benefits and Ethical Considerations | However, like all powerful technologies, great care must be taken in its development and deployment. To reap the societal benefits of AI systems, we will first need to trust them and make sure that they follow the same ethical principles, moral values, professional codes, and social norms that we humans would follow in the same scenario. Research and educational efforts, as well as carefully designed regulations, must be put in place to achieve this goal. |
Sit-to-Stand Trainer: An Apparatus for Training “Normal-Like” Sit to Stand Movement | Sit-to-stand (STS) transfer training is probably the most demanding task in rehabilitation. We have developed an innovative STS trainer that offers variable levels of mechanical support and speeds of STS transfer. In a group of neurologically intact individuals we compared kinematics, kinetics and electromyography (EMG) patterns of STS transfer assessed in three experimental conditions with increasing degree of mechanical support (MIN STS-T, MED STS-T, and MAX STS-T) to natural, unassisted STS movement (NO STS-T). The resulting ankle, knee, hip joint and trunk angles in experimental conditions MED STS-T and MIN STS-T were very similar to experimental condition NO STS-T. Vertical ground reaction forces and EMG patterns in the tibialis anterior, quadriceps and hamstrings show a clear trend toward “normal” patterns as the level of mechanical support from the device is progressively reduced. We have further tested the feasibility of the STS trainer in five stroke subjects at two levels of support showing that increased voluntary effort is needed when the support is reduced. Based on these results we conclude that negligible constraints are imposed by the device on a user's STS transfer kinematics, which is an important prerequisite for considering clinical use of the device for training in neurologically impaired. |
Removal of Congo red dye from water using carbon slurry waste | A cheap adsorbent has been prepared from carbon slurry waste obtained from National Fertilizer Limited (NFL), Panipat and investigated for the removal of Congo red, an anionic dye. Its adsorption on the prepared carbonaceous adsorbent was studied as a function of contact time, concentration and temperature. The results show that the carbonaceous adsorbent adsorbs the dye to a substantial extent (272 mg/g). A comparative study of the adsorption results with those obtained on activated charcoal shows that the carbonaceous adsorbent is ~95% as efficient as activated charcoal. Thus, it can be fruitfully used for the removal of dyes from wastewaters. |
Chocolate consumption and bone density in older women. | BACKGROUND
Nutrition is important for the development and maintenance of bone structure and for the prevention of osteoporosis and fracture. The relation of chocolate intake with bone has yet to be investigated.
OBJECTIVE
We investigated the relation of chocolate consumption with measurements of whole-body and regional bone density and strength.
DESIGN
Randomly selected women aged 70-85 y (n=1460) were recruited from the general population to a randomized controlled trial of calcium supplementation and fracture risk. We present here a cross-sectional analysis of 1001 of these women. Bone density and strength were measured with the use of dual-energy X-ray absorptiometry, peripheral quantitative computed tomography, and quantitative ultrasonography. Frequency of chocolate intake was assessed with the use of a questionnaire and condensed into 3 categories: <1 time/wk, 1-6 times/wk, ≥1 time/d.
RESULTS
Higher frequency of chocolate consumption was linearly related to lower bone density and strength (P<0.05). Daily (≥1 time/d) consumption of chocolate, in comparison with <1 time/wk, was associated with a 3.1% lower whole-body bone density; with similarly lower bone density of the total hip, femoral neck, tibia, and heel; and with lower bone strength in the tibia and the heel (P<0.05 for all). Adjustment for covariates did not influence interpretation of the results.
CONCLUSIONS
Older women who consumed chocolate daily had lower bone density and strength. Additional cross-sectional and longitudinal studies are needed to confirm these observations. Confirmation of these findings could have important implications for prevention of osteoporotic fracture. |
Smoking cessation via the internet: a randomized clinical trial of an internet intervention as adjuvant treatment in a smoking cessation intervention. | Internet interventions for smoking cessation are ubiquitous. Yet, to date, there are few randomized clinical trials that gauge their efficacy. This study is a randomized clinical trial (N=284; n=140 in the treatment group, n=144 in the control group) of an Internet smoking cessation intervention. Smokers were randomly assigned to receive either bupropion plus counseling alone, or bupropion and counseling in addition to 12 weeks of access to the Comprehensive Health Enhancement Support System for Smoking Cessation and Relapse Prevention (CHESS SCRP; a Web site which provided information on smoking cessation as well as support). We found that access to CHESS SCRP was not significantly related to abstinence at the end of the treatment period (OR=1.13, 95% CI 0.66-2.62) or at 6 months postquit (OR=1.48, 95% CI 0.66-2.62). However, the number of times participants used CHESS SCRP per week was related to abstinence both at the end of treatment (OR=1.79, 95% CI 1.25-2.56) and at the 6-month follow-up (OR=1.59, 95% CI 1.06-2.38). Participants with access to CHESS SCRP logged in an average of 33.64 times (SD=30.76) over the 90-day period of access. Rates of CHESS SCRP use did not differ by ethnicity, level of education or gender (all p>.05). In sum, results suggest that participants used CHESS SCRP frequently and that frequency of use was related to cessation success, but access alone did not produce significant between-group effects. |
PiOS: Detecting Privacy Leaks in iOS Applications | With the introduction of Apple’s iOS and Google’s Android operating systems, the sales of smartphones have exploded. These smartphones have become powerful devices that are basically miniature versions of personal computers. However, the growing popularity and sophistication of smartphones have also increased concerns about the privacy of users who operate these devices. These concerns have been exacerbated by the fact that it has become increasingly easy for users to install and execute third-party applications. To protect its users from malicious applications, Apple has introduced a vetting process. This vetting process should ensure that all applications conform to Apple’s (privacy) rules before they can be offered via the App Store. Unfortunately, this vetting process is not well-documented, and there have been cases where malicious applications had to be removed from the App Store after |
A controlled comparative investigation of psychological treatments for chronic sleep-onset insomnia. | A sample of physician-referred chronic insomniacs was randomly allocated to progressive relaxation, stimulus control, paradoxical intention, placebo or no-treatment conditions. Treatment process and outcome were investigated in terms of mean and standard deviation (night-to-night variability) measures of sleep pattern and sleep quality. Only the active treatments were associated with significant improvement, but the nature of treatment gains varied. In particular, stimulus control improved sleep pattern, whereas relaxation affected perception of sleep quality. All improvements were maintained at 17-month follow-up. Results are discussed with reference to previous research, and guidelines are given for clinical practice. |