title | abstract
---|---
Improved Surface Quality in 3D Printing by Optimizing the Printing Direction | We present a pipeline of algorithms that decomposes a given polygon model into parts such that each part can be 3D printed with high (outer) surface quality. For this we exploit the fact that most 3D printing technologies have an anisotropic resolution and hence the surface smoothness varies significantly with the orientation of the surface. Our pipeline starts by segmenting the input surface into patches such that their normals can be aligned perpendicularly to the printing direction. A 3D Voronoi diagram is computed such that the intersections of the Voronoi cells with the surface approximate these surface patches. The intersections of the Voronoi cells with the input model’s volume then provide an initial decomposition. We further present an algorithm to compute an assembly order for the parts and generate connectors between them. A post-processing step further optimizes the seams between segments to improve the visual quality. We run our pipeline on a wide range of 3D models and experimentally evaluate the obtained improvements in terms of numerical, visual, and haptic quality. |
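The cell-based volume decomposition can be illustrated with a nearest-seed assignment, since a point lies inside a seed's Voronoi cell exactly when that seed is its nearest neighbour. This is a minimal sketch using hypothetical random samples in place of the paper's exact cell/mesh intersection:

```python
import numpy as np
from scipy.spatial import cKDTree

def voronoi_partition(samples, seeds):
    """Label each volume sample with the index of its nearest seed,
    i.e. the Voronoi cell that contains it."""
    _, labels = cKDTree(seeds).query(samples)
    return labels

# Hypothetical stand-ins for the model's volume and the cell seeds.
rng = np.random.default_rng(42)
samples = rng.random((1000, 3))   # points sampled inside the model
seeds = rng.random((5, 3))        # one seed per intended part

labels = voronoi_partition(samples, seeds)
parts = [samples[labels == k] for k in range(len(seeds))]
```

Every sample lands in exactly one part, which is what makes the cells a valid initial decomposition of the volume.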
A review of podcasting in higher education: Its influence on the traditional lecture | This paper examines the possible influence of podcasting on the traditional lecture in higher education. Firstly, it explores some of the benefits and limitations of the lecture as one of the dominant forms of teaching in higher education. The review then moves to explore the emergence of podcasting in education and the purpose of its use, before examining recent relevant literature about podcasting for supporting, enhancing, and indeed replacing the traditional lecture. The review identifies three broad types of use of podcasting: substitutional, supplementary and creative use. Podcasting appears to be most commonly used to provide recordings of past lectures to students for the purposes of review and revision (substitutional use). The second most common use was in providing additional material, often in the form of study guides and summary notes, to broaden and deepen students’ understanding (supplementary use). The third and least common use reported in the literature involved the creation of student generated podcasts (creative use). The review examines three key questions: What are the educational uses of podcasting in teaching and learning in higher education? Can podcasting facilitate more flexible and mobile learning? In what ways will podcasting influence the traditional lecture? These questions are discussed in the final section of the paper, with reference to future policies and practices. |
Soft start-up for high frequency LLC resonant converter with optimal trajectory control | This paper investigates soft start-up for a high frequency LLC resonant converter with optimal trajectory control. Two methods are proposed to realize soft start-up for the high frequency LLC converter using commercial low-cost microcontrollers (MCUs). Both methods achieve soft start-up with minimum stress and optimal energy delivery. One method is a mixed-signal implementation that senses the resonant tank to minimize digital delay. The other is a digital implementation with a look-up table. Experimental results are demonstrated on a 500 kHz, 1 kW, 400 V/12 V LLC converter. |
Intuitionistic fuzzy hypergraphs with applications | |
Large-scale brain networks and psychopathology: a unifying triple network model | The science of large-scale brain networks offers a powerful paradigm for investigating cognitive and affective dysfunction in psychiatric and neurological disorders. This review examines recent conceptual and methodological developments which are contributing to a paradigm shift in the study of psychopathology. I summarize methods for characterizing aberrant brain networks and demonstrate how network analysis provides novel insights into dysfunctional brain architecture. Deficits in access, engagement and disengagement of large-scale neurocognitive networks are shown to play a prominent role in several disorders including schizophrenia, depression, anxiety, dementia and autism. Synthesizing recent research, I propose a triple network model of aberrant saliency mapping and cognitive dysfunction in psychopathology, emphasizing the surprising parallels that are beginning to emerge across psychiatric and neurological disorders. |
High Flow Nasal Oxygen Therapy: From Physiology to Clinic | INTRODUCTION Oxygen support therapy should be given to patients with acute hypoxic respiratory insufficiency to maintain tissue oxygenation until the underlying pathology improves. The inspiratory flow rate requirement of patients with respiratory insufficiency varies between 30 and 120 L/min. Low-flow and high-flow conventional oxygen support systems produce a maximum flow rate of 15 L/min, and FiO2 changes depending on the patient’s peak inspiratory flow rate, respiratory pattern, the mask that is used, or the characteristics of the cannula. The inability to provide adequate airflow leads to discomfort in tachypneic patients. With high-flow nasal oxygen (HFNO) cannulas, air warmed and humidified to body temperature can be delivered at flow rates of 5–60 L/min, with oxygen delivery varying between 21% and 100%. When HFNO, first used in infants, was reported to increase the risk of infection, its long-term use was stopped. This problem was later eliminated with the use of sterile water, and HFNO has since come back into use in critically ill adult patients as well. Studies show that HFNO treatment improves physiological parameters when compared to conventional oxygen systems. Although there are studies indicating successful applications in different patient groups, there are also studies indicating that it does not change clinical parameters; however, patient comfort is better with HFNO than with standard oxygen therapy and noninvasive mechanical ventilation (NIMV) (1-6). In this review, the physiological mechanisms of HFNO treatment and its use in various clinical situations are discussed in light of current studies. |
Microprogrammed significance arithmetic: a perspective and feasibility study | This study is an attempt to evaluate the feasibility of microprogrammed routines for monitoring significant digits in the numerical result of digital computers in real time. The first part is tutorial and, in the second part, microprograms for two methods of significance arithmetic are designed and evaluated. |
Active $du/dt$—New Output-Filtering Approach for Inverter-Fed Electric Drives | Active du/dt is a new output-filtering method to mitigate motor overvoltages. The inverter pulse pattern edges are broken down into narrower pulses, which control the filter LC circuit. This results in an output voltage that does not have to exhibit the overshoot typically seen in common LC circuits in output-filtering applications. Furthermore, the shape of the output-voltage edge has properties well suited for output-filtering applications. An appropriate filter rise time is selected according to the motor-cable length to eliminate the motor overvoltage. The basis of the active du/dt method is discussed in brief. Considerations on the application of the active du/dt filtering in electric drives are presented together with simulations and experimental data to verify the potential of the method. |
Systemic Diseases and Conditions Affecting Jaws. | This article discusses the radiographic manifestation of jaw lesions whose etiology may be traced to underlying systemic disease. Some changes may be related to hematologic or metabolic disorders. A group of bone changes may be associated with disorders of the endocrine system. It is imperative for the clinician to compare the constantly changing and dynamic maxillofacial skeleton to the observed radiographic pathology as revealed on intraoral and extraoral imagery. |
ICDAR 2015 competition on Robust Reading | Results of the ICDAR 2015 Robust Reading Competition are presented. A new Challenge 4 on Incidental Scene Text has been added to the Challenges on Born-Digital Images, Focused Scene Images and Video Text. Challenge 4 is run on a newly acquired dataset of 1,670 images evaluating Text Localisation, Word Recognition and End-to-End pipelines. In addition, the dataset for Challenge 3 on Video Text has been substantially updated with more video sequences and more accurate ground truth data. Finally, tasks assessing End-to-End system performance have been introduced to all Challenges. The competition took place in the first quarter of 2015, and received a total of 44 submissions. Only the tasks newly introduced in 2015 are reported on. The datasets, the ground truth specification and the evaluation protocols are presented together with the results and a brief summary of the participating methods. |
Quantitative video-oculography to help diagnose stroke in acute vertigo and dizziness: toward an ECG for the eyes. | BACKGROUND AND PURPOSE
Strokes can be distinguished from benign peripheral causes of acute vestibular syndrome using bedside oculomotor tests (head impulse test, nystagmus, test-of-skew). This test battery is more sensitive and less costly than early magnetic resonance imaging for stroke diagnosis in acute vestibular syndrome but requires expertise not routinely available in emergency departments. We sought to begin standardizing this diagnostic approach for eventual emergency department use through the novel application of a portable video-oculography device measuring vestibular physiology in real time. This approach is conceptually similar to ECG to diagnose acute cardiac ischemia.
METHODS
Proof-of-concept study (August 2011 to June 2012). We recruited adult emergency department patients with acute vestibular syndrome defined as new, persistent vertigo/dizziness, nystagmus, and (1) nausea/vomiting, (2) head motion intolerance, or (3) new gait unsteadiness. We recorded eye movements, including quantitative horizontal head impulse testing of vestibulo-ocular-reflex function. Two masked vestibular experts rated vestibular findings, which were compared with final radiographic gold-standard diagnoses. Masked neuroimaging raters determined stroke or no stroke using magnetic resonance imaging of the brain with diffusion-weighted imaging obtained 48 hours to 7 days after symptom onset.
RESULTS
We enrolled 12 consecutive patients who underwent confirmatory magnetic resonance imaging. Mean age was 61 years (range 30-73), and 10 were men. Expert-rated video-oculography-based head impulse test, nystagmus, test-of-skew examination was 100% accurate (6 strokes, 6 peripheral vestibular).
CONCLUSIONS
Device-based physiological diagnosis of vertebrobasilar stroke in acute vestibular syndrome should soon be possible. If confirmed in a larger sample, this bedside eye ECG approach could eventually help fulfill a critical need for timely, accurate, efficient diagnosis in emergency department patients with vertigo or dizziness who are at high risk for stroke. |
Consumer Attitude Metrics for Guiding Marketing Mix Decisions | Marketing managers often use consumer attitude metrics such as awareness, consideration, and preference as performance indicators because they represent their brand’s health and are readily connected to marketing activity. However, this does not mean that financially focused executives know how such metrics translate into sales performance, which would allow them to make beneficial marketing mix decisions. We propose four criteria – potential, responsiveness, stickiness, and sales conversion – that determine the connection between marketing actions, attitudinal metrics, and sales outcomes. We test our approach with a rich dataset of four-weekly marketing actions, attitude metrics, and sales for several consumer brands in four categories over a seven-year period. The results quantify how marketing actions affect sales performance through their differential impact on attitudinal metrics, as captured by our proposed criteria. We find that marketing-attitude and attitude-sales relationships are predominantly stable over time, but differ substantially across brands and across product categories with different levels of involvement. We also establish that combining marketing and attitudinal metrics improves the prediction of brand sales performance, often substantially so. Based on these insights, we provide specific recommendations on improving the marketing mix for different brands, and we validate them in a hold-out sample. For managers and researchers alike, our criteria offer a verifiable explanation for differences in marketing elasticities and an actionable connection between marketing and financial performance metrics. |
Lipid-based nano-delivery systems for skin delivery of drugs and bioactives | Topical drug delivery across the skin can offer many advantages, such as sustained drug release, lower fluctuations in plasma drug levels, avoidance of first-pass metabolism, improved patient compliance, and local (dermal) or systemic (transdermal) effects (Schäfer-Korting et al., 2007; El Maghraby et al., 2008). However, the barrier function of the skin, exerted by the horny layer of the stratum corneum, impairs the penetration and absorption of drugs (Bouwstra and Ponec, 2006). This layer prevents the penetration of hydrophilic compounds much more efficiently than that of lipophilic compounds (Bouwstra et al., 2003; Bouwstra and Ponec, 2006). Therefore, there has been wide interest in exploring new techniques to increase drug absorption through the skin. Novel topical drug delivery systems, designed with the use of nanotechnology, have been used to help overcome the skin barrier. This article summarizes recent findings on lipid-based nano-delivery systems for skin delivery of drugs and bioactive agents. |
Cavity Optomagnonics with Spin-Orbit Coupled Photons. | We experimentally implement a system of cavity optomagnonics, where a sphere of ferromagnetic material supports whispering gallery modes (WGMs) for photons and the magnetostatic mode for magnons. We observe pronounced nonreciprocity and asymmetry in the sideband signals generated by the magnon-induced Brillouin scattering of light. The spin-orbit coupled nature of the WGM photons, their geometrical birefringence, and the time-reversal symmetry breaking in the magnon dynamics impose the angular-momentum selection rules in the scattering process and account for the observed phenomena. The unique features of the system may find interesting applications at the crossroad between quantum optics and spintronics. |
Teachers’ Beliefs about Classroom Assessment and their Selection of Classroom Assessment Strategies | The use of classroom assessment is strongly supported to promote student learning. However, assessment for promoting learning is not yet widely used. On the contrary, summative assessments are emphasized and teachers continue to use classroom assessments primarily for grading students. Teachers’ attitudes and beliefs about students provide the foundation for their philosophy of teaching. Teachers enter teaching with prior knowledge and beliefs about learners, learning, and classroom teaching. These beliefs affect teachers’ choices of assessment strategies. This research was conducted to compare the beliefs of trained and untrained middle and secondary school teachers in Pakistan about classroom assessment. The data were collected from 123 teachers selected from 15 schools in various cities of Pakistan. The sample was selected using convenience sampling strategies (Gay, 1992; Gay & Airasian, 2003; Fraenkel & Wallen, 2006), and teachers were categorized into trained and untrained groups. The data gathered from the sample were tabulated and analyzed. The hypothesis was tested using the chi-square test. Apart from a few differences, the study revealed no significant difference in the beliefs of trained and untrained teachers regarding teacher knowledge and teaching approaches. The study opens up an issue: whether or not teacher training has a significant impact on teachers in Pakistan. It was recommended that teacher education institutes reconsider their teaching and that there be more professional development activities inside schools to encourage teachers to equip themselves with contemporary approaches to assessment. |
Effect size estimates: current use, calculations, and interpretation. | The Publication Manual of the American Psychological Association (American Psychological Association, 2001, American Psychological Association, 2010) calls for the reporting of effect sizes and their confidence intervals. Estimates of effect size are useful for determining the practical or theoretical importance of an effect, the relative contributions of factors, and the power of an analysis. We surveyed articles published in 2009 and 2010 in the Journal of Experimental Psychology: General, noting the statistical analyses reported and the associated reporting of effect size estimates. Effect sizes were reported for fewer than half of the analyses; no article reported a confidence interval for an effect size. The most often reported analysis was analysis of variance, and almost half of these reports were not accompanied by effect sizes. Partial η2 was the most commonly reported effect size estimate for analysis of variance. For t tests, 2/3 of the articles did not report an associated effect size estimate; Cohen's d was the most often reported. We provide a straightforward guide to understanding, selecting, calculating, and interpreting effect sizes for many types of data and to methods for calculating effect size confidence intervals and power analysis. |
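The two estimates the survey found most common, Cohen's d for t tests and partial η² for ANOVA, can be computed directly from their textbook definitions; this is a generic sketch, not the authors' survey code:

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled SD."""
    nx, ny = len(x), len(y)
    pooled = ((nx - 1) * np.var(x, ddof=1)
              + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled)

def partial_eta_squared(ss_effect, ss_error):
    """Partial eta^2 from ANOVA sums of squares:
    SS_effect / (SS_effect + SS_error)."""
    return ss_effect / (ss_effect + ss_error)

# Two small groups whose means differ by 2 with pooled SD ~1.58.
d = cohens_d([1, 2, 3, 4, 5], [3, 4, 5, 6, 7])
```

Reporting either value alongside the test statistic is exactly the practice the survey found missing in roughly half of the analyses.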
Renal Lymphangiectasia: A Curious Cause of Pleural Effusion | Renal lymphangiectasia is a disorder of the lymphatic system of the kidneys, which can be congenital or acquired. Although the exact etiology remains unknown, an obstructive process resulting from several causes, including infection, inflammation or malignant infiltration, has been suggested to be responsible for the acquired form. This disorder may be associated with several pathologies. We report a case of a 24-year-old man with renal lymphangiectasia presenting with polycythemia, ascites and pleural effusion associated with hepatitis C virus (HCV) infection in an intravenous (IV) drug user. Our case is the first in the literature that shows an association between HCV infection and IV drug use. |
A review on travel behaviour modelling in dynamic traffic simulation models for evacuations | Dynamic traffic simulation models are frequently used to support decisions when planning an evacuation. This contribution reviews the different (mathematical) model formulations underlying these traffic simulation models used in evacuation studies and the behavioural assumptions that are made. The appropriateness of these behavioural assumptions is elaborated on in light of the current consensus on evacuation travel behaviour, based on the view from the social sciences as well as empirical studies on evacuation behaviour. The focus lies on how travellers’ decisions are predicted through simulation regarding the choice to evacuate, departure time choice, destination choice, and route choice. For the evacuation participation and departure time choice we argue in favour of the simultaneous approach to dynamic evacuation demand prediction using the repeated binary logit model. For the destination choice we show how further research is needed to generalize the current preliminary findings on the location-type specific destination choice models. For the evacuation route choice we argue in favour of hybrid route choice models that enable both following instructed routes and en-route switches. Within each of these discussions, we point at current limitations and make corresponding suggestions on promising future research directions. |
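The simultaneous approach the review favours, a repeated binary logit for evacuation participation and departure time, can be sketched as a per-period logit applied only to travellers who have not yet departed; the utilities below are hypothetical placeholders, not estimated coefficients:

```python
import math

def logit(u):
    """Binary logit probability of choosing 'evacuate now' over 'wait'."""
    return 1.0 / (1.0 + math.exp(-u))

def departure_profile(utilities):
    """Repeated binary logit: the unconditional share departing in each
    period, applying the per-period choice only to those still present."""
    remaining, profile = 1.0, []
    for u in utilities:
        p = logit(u)
        profile.append(remaining * p)   # share of the population leaving now
        remaining *= 1.0 - p            # survivors carry over to next period
    return profile

# Three periods with indifferent travellers (utility 0 => p = 0.5 each).
profile = departure_profile([0.0, 0.0, 0.0])
```

The survival-style recursion is what makes participation and departure time a single simultaneous choice rather than two separate models.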
Battery powered BION FES network | The Alfred Mann Foundation is completing development of a coordinated network of BION® microstimulators/sensors (hereinafter implants) with broad stimulating, sensing, and communication capabilities. The network consists of a master control unit (MCU) in communication with a group of BION implants. Each implant is powered by a custom lithium-ion rechargeable 10 mW-hr battery. The charging, discharging, safety, stimulating, sensing, and communication circuits are designed to be highly efficient to minimize energy use and maximize battery life and time between charges. The stimulator can be programmed to deliver pulses anywhere in the following range: 5 µA to 20 mA in 3.3% constant current steps, 7 µs to 2000 µs in 7 µs pulse width steps, and 1 to 4000 Hz in frequency. The preamp voltage sensor covers the range 10 µV to 1.0 V with bandpass filtering and several forms of data analysis. The implant also contains sensors that can read out pressure, temperature, DC magnetic field, and distance (via a low frequency magnetic field) up to 20 cm between any two BION implants. The MCU contains a microprocessor, user interface, two-way communication system, and a rechargeable battery. The MCU can command and interrogate in excess of 800 BION implants every 10 ms, i.e., 100 times a second. |
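Reading "5 µA to 20 mA in 3.3% constant current steps" as a geometric progression, one can enumerate the implied current settings; this is our interpretation of the quoted spec, not a published register map:

```python
def current_levels(i_min=5e-6, i_max=20e-3, step=1.033):
    """Enumerate geometrically spaced stimulation currents from i_min
    up to i_max, each 3.3% above the previous one."""
    levels = []
    i = i_min
    while i <= i_max * (1 + 1e-9):   # tiny tolerance for float drift
        levels.append(i)
        i *= step
    return levels

levels = current_levels()
# 256 settings span the range, i.e. the 3.3% step fits an 8-bit current code.
```

A constant-percentage step keeps the relative resolution uniform across the four-decade current range, which matters far more physiologically than uniform absolute steps would.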
Corrigendum: Antibacterial activities of bacteriocins: application in foods and pharmaceuticals | Shih-Chun Yang, Chih-Hung Lin, Calvin T. Sung and Jia-You Fang* 1 Research Center for Industry of Human Ecology, Chang Gung University of Science and Technology, Taoyuan, Taiwan 2 Pharmaceutics Laboratory, Graduate Institute of Natural Products, Chang Gung University, Taoyuan, Taiwan 3 Chang Gung Memorial Hospital, Taoyuan, Taiwan 4 Center for General Education, Chang Gung University of Science and Technology, Taoyuan, Taiwan 5 Chronic Diseases and Health Promotion Research Center, Chang Gung University of Science and Technology, Taoyuan, Taiwan 6 Department of Microbiology, Immunology, and Molecular Genetics, University of California, Los Angeles, Los Angeles, CA, USA 7 Chinese Herbal Medicine Research Team, Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan *Correspondence: [email protected] †These authors have contributed equally to this work. |
The Impact of Enterprise Risk Management on the Marginal Cost of Reducing Risk: Evidence from the Insurance Industry | We test the hypothesis that practicing enterprise risk management (ERM) reduces firms’ cost of reducing risk. Adoption of ERM represents a radical paradigm shift from the traditional method of managing risks individually to managing risks collectively allowing ERM-adopting firms to better recognize natural hedges, prioritize hedging activities towards the risks that contribute most to the total risk of the firm, and optimize the evaluation and selection of available hedging instruments. We hypothesize that these advantages allow ERM-adopting firms to produce greater risk reduction per dollar spent. Our hypothesis further predicts that, after implementing ERM, firms experience profit maximizing incentives to lower risk. Consistent with this hypothesis, we find that firms adopting ERM experience a reduction in stock return volatility. We also find that the reduction in return volatility for ERM-adopting firms becomes stronger over time. Further, we find that operating profits per unit of risk (ROA/return volatility) increase post ERM adoption. |
Signal/Collect: Graph Algorithms for the (Semantic) Web | The Semantic Web graph is growing at an incredible pace, enabling opportunities to discover new knowledge by interlinking and analyzing previously unconnected data sets. This confronts researchers with a conundrum: Whilst the data is available the programming models that facilitate scalability and the infrastructure to run various algorithms on the graph are missing. Some use MapReduce – a good solution for many problems. However, even some simple iterative graph algorithms do not map nicely to that programming model requiring programmers to shoehorn their problem to the MapReduce model. This paper presents the Signal/Collect programming model for synchronous and asynchronous graph algorithms. We demonstrate that this abstraction can capture the essence of many algorithms on graphs in a concise and elegant way by giving Signal/Collect adaptations of various relevant algorithms. Furthermore, we built and evaluated a prototype Signal/Collect framework that executes algorithms in our programming model. We empirically show that this prototype transparently scales and that guiding computations by scoring as well as asynchronicity can greatly improve the convergence of some example algorithms. We released the framework under the Apache License 2.0 (at http://www.ifi.uzh.ch/ddis/research/sc). |
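The abstraction itself fits in a few lines for the synchronous case: edges signal, vertices collect. This sketch mirrors the programming model, not the released Scala framework's API, and uses single-source shortest paths as the example algorithm:

```python
def signal_collect(init, edges, signal, collect, steps):
    """Synchronous Signal/Collect: in each step, every edge sends
    signal(source_state) to its target, then every vertex folds its
    inbox into a new state via collect."""
    state = dict(init)
    for _ in range(steps):
        inbox = {v: [] for v in state}
        for src, dst in edges:
            inbox[dst].append(signal(state[src]))
        state = {v: collect(s, inbox[v]) for v, s in state.items()}
    return state

# Single-source shortest path from 'a' on a 3-vertex graph.
inf = float("inf")
dist = signal_collect(
    init={"a": 0, "b": inf, "c": inf},
    edges=[("a", "b"), ("b", "c"), ("a", "c")],
    signal=lambda s: s + 1,                 # path length grows by one hop
    collect=lambda s, msgs: min([s] + msgs),  # keep the best distance seen
    steps=2,
)
```

Swapping in different `signal`/`collect` functions yields PageRank-style or propagation algorithms with no change to the loop, which is the conciseness the paper claims for the model.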
Free Flight Separation Assurance Using Distributed Algorithms | Many of the nation’s airspace users desire more freedom in selecting and modifying their routes. This desire has been expressed in the free flight concept, which has gained increased attention in the last few years. Free flight offers the potential for more efficient routes, decreased fuel costs, and less dependence on air traffic control. The greatest challenge, however, is maintaining the safe separation between aircraft. This problem is often referred to as conflict detection and resolution (CD&R). This paper describes a technique by which aircraft may simultaneously and independently determine collision-free routes in a free flight operational environment. The technique, derived from potential-field models, has demonstrated tremendous robustness in a variety of scenarios ranging from simple two-aircraft conflicts and contrived geometric formations to complex, randomized multi-aircraft conflicts. Communication failures and restrictive maneuverability constraints have also been considered. The results of this work suggest that potential field algorithms are an extremely robust solution to the problem of CD&R. The results also show that these algorithms can be adapted to a situation requiring distributed computation and resolution. The advantage of a distributed approach is the decreased reliance on a central command authority. In simulation, separation can be maintained even with an unreasonable number of aircraft, in close proximity, with only partially reliable communications, and operating under tight constraints on maneuverability. This paper explores the technical feasibility of performing autonomous CD&R. The results are very promising. This paper does not address the more difficult issue of transitioning the airspace into a free flight environment. Nor do we address the general questions about user acceptance and regulatory adoption of free flight concepts. 
However, it is believed that if the financial advantages of making the transition to free flight can be adequately demonstrated, the reality of a more user-centric airspace management system may be closer than most people think. |
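A minimal potential-field step illustrates the distributed CD&R idea: each aircraft independently sums an attraction toward its own goal and inverse-square repulsions from traffic inside a protection radius. The gains and radius below are illustrative constants, not the paper's tuned values:

```python
import numpy as np

def heading_update(pos, goals, k_att=0.1, k_rep=1.0, r_safe=5.0):
    """One step each aircraft could compute locally: attractive pull
    toward its goal plus inverse-square repulsion from nearby traffic."""
    vel = k_att * (goals - pos)               # attraction toward goals
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            if 0 < dist < r_safe:             # intruder inside protection zone
                vel[i] += k_rep * d / dist**2  # push away, stronger when close
    return vel

# Two aircraft closing head-on along the x axis.
pos = np.array([[0.0, 0.0], [4.0, 0.0]])
goals = np.array([[10.0, 0.0], [-6.0, 0.0]])
vel = heading_update(pos, goals)
```

Because every aircraft evaluates only its own neighbourhood, the computation distributes naturally; practical versions add a small tangential or random term so perfectly symmetric head-on encounters break to one side.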
RF MEMS based impedance matching networks for tunable multi-band microwave low noise amplifiers | In this paper, we present different types of reconfigurable RF MEMS based matching networks intended for frequency-agile (multi-band) LNAs. Measured results of 2-bit matching networks show a centre frequency tuning range of 2–3 GHz (10–13%) around 20 GHz and 1.5–2.0 dB of minimum losses. Simulated tunable LNA results based on measured data of the RF MEMS matching networks show the possibility of achieving similarly high gain, good matching, and low NF over the whole tuning range. The results demonstrate the potential of using RF MEMS switches for the realization of tunable LNAs at microwave and millimetre-wave frequencies. |
A randomized, controlled trial of linopirdine in the treatment of Alzheimer's disease. | OBJECTIVES
We tested the efficacy and safety of linopirdine, a novel phenylindolinone, in the treatment of Alzheimer's disease.
METHODS
A multicentre, randomized, double-blind, parallel group, placebo-controlled trial of linopirdine (30 mg three times per day or placebo). Patients (n = 382, 55% male, 98% Caucasian, age range 51-95 years) with mild or moderate Alzheimer's disease were enrolled; the 375 who received at least one treatment dose were analysed. There were no important differences between the groups at baseline.
RESULTS
No difference was seen in Clinical Global Impression scores between patients receiving placebo and those receiving linopirdine (n = 189). Small differences in the Alzheimer's Disease Assessment Scale-Cognitive Subscale (ADAS-Cog) scores were seen throughout the study favouring linopirdine; at 6 months the ADAS-Cog scores were 20.2 (linopirdine) and 22.1 (placebo) p = 0.01.
CONCLUSIONS
This trial did not detect clinically meaningful differences in patients receiving linopirdine for 6 months, despite evidence of a small degree of improved cognitive function. Further studies may benefit from more sensitive tests of treatment effects in Alzheimer's disease. |
Minimum clinically important difference for the COPD Assessment Test: a prospective analysis. | BACKGROUND
The COPD Assessment Test (CAT) is responsive to change in patients with chronic obstructive pulmonary disease (COPD). However, the minimum clinically important difference (MCID) has not been established. We aimed to identify the MCID for the CAT using anchor-based and distribution-based methods.
METHODS
We did three studies at two centres in London (UK) between April 1, 2010, and Dec 31, 2012. Study 1 assessed CAT score before and after 8 weeks of outpatient pulmonary rehabilitation in patients with COPD who were able to walk 5 m, and had no contraindication to exercise. Study 2 assessed change in CAT score at discharge and after 3 months in patients admitted to hospital for more than 24 h for acute exacerbation of COPD. Study 3 assessed change in CAT score at baseline and at 12 months in stable outpatients with COPD. We focused on identifying the minimum clinically important improvement in CAT score. The St George's Respiratory Questionnaire (SGRQ) and Chronic Respiratory Questionnaire (CRQ) were measured concurrently as anchors. We used receiver operating characteristic curves, linear regression, and distribution-based methods (half SD, SE of measurement) to estimate the MCID for the CAT; we included only patients with paired CAT scores in the analysis.
FINDINGS
In Study 1, 565 of 675 (84%) patients had paired CAT scores. The mean change in CAT score with pulmonary rehabilitation was -2·5 (95% CI -3·0 to -1·9), which correlated significantly with change in SGRQ score (r=0·32; p<0·0001) and CRQ score (r=-0·46; p<0·0001). In Study 2, of 200 patients recruited, 147 (74%) had paired CAT scores. Mean change in CAT score from hospital discharge to 3 months after discharge was -3·0 (95% CI -4·4 to -1·6), which correlated with change in SGRQ score (r=0·47; p<0·0001). In Study 3, of 200 patients recruited, 164 (82%) had paired CAT scores. Although no significant change in CAT score was identified after 12 months (mean 0·6, 95% CI -0·4 to 1·5), change in CAT score correlated significantly with change in SGRQ score (r=0·36; p<0·0001). Linear regression estimated the minimum clinically important improvement for the CAT to range between -1·2 and -2·8 with receiver operating characteristic curves consistently identifying -2 as the MCID. Distribution-based estimates for the MCID ranged from -3·3 to -3·8.
INTERPRETATION
The most reliable estimate of the minimum important difference of the CAT is 2 points. This estimate could be useful in the clinical interpretation of CAT data, particularly in response to intervention studies.
FUNDING
Medical Research Council and UK National Institute for Health Research. |
Forecasting Using Elman Recurrent Neural Network | Forecasting is an important data analysis technique that studies historical data in order to predict future values. Many methods, from regression models to neural networks, have been tested and applied to forecasting. In this research, we propose an Elman Recurrent Neural Network (ERNN) to forecast elements of the Mackey-Glass time series. Experimental results show that our scheme outperforms other state-of-the-art approaches. |
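As a rough illustration of the Elman architecture (not the paper's implementation), the sketch below trains a tiny Elman network with numpy on a sine wave standing in for the Mackey-Glass series; the network sizes, learning rate, and one-step truncated gradient are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: a sine wave standing in for a chaotic series such as Mackey-Glass.
t = np.arange(0, 60, 0.1)
series = np.sin(t)

n_in, n_hid = 1, 8
Wxh = rng.normal(0, 0.5, (n_hid, n_in))   # input -> hidden
Whh = rng.normal(0, 0.5, (n_hid, n_hid))  # context (previous hidden) -> hidden
Why = rng.normal(0, 0.5, (1, n_hid))      # hidden -> output
lr = 0.01

for epoch in range(30):
    h = np.zeros((n_hid, 1))
    for x, target in zip(series[:-1], series[1:]):
        xv = np.array([[x]])
        h_new = np.tanh(Wxh @ xv + Whh @ h)   # Elman recurrence
        y = Why @ h_new
        err = y - target
        # One-step truncated gradient (context treated as a fixed input):
        dWhy = err * h_new.T
        dh = (Why.T * err) * (1 - h_new ** 2)
        Wxh -= lr * dh @ xv.T
        Whh -= lr * dh @ h.T
        Why -= lr * dWhy
        h = h_new

# One-step-ahead predictions on the training series
h = np.zeros((n_hid, 1))
preds = []
for x in series[:-1]:
    h = np.tanh(Wxh @ np.array([[x]]) + Whh @ h)
    preds.append(float(Why @ h))
mse = float(np.mean((np.array(preds) - series[1:]) ** 2))
print(round(mse, 4))
```

The context weights Whh are what distinguish the Elman network from a plain feed-forward predictor: the hidden state carries a summary of past inputs forward in time.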
Impaired gastric motility and its relationship to reflux symptoms in patients with nonerosive gastroesophageal reflux disease | More than half of patients with reflux-related symptoms have no endoscopic evidence of mucosal breaks. These patients are considered to have nonerosive gastroesophageal reflux disease (NERD). The pathogenesis of NERD may be multifactorial, but the role played by gastric motility in symptom generation in patients with NERD has not been examined. In this study, we elucidate gastric motility in patients with NERD and the efficacy of a prokinetic agent in the treatment of NERD. Gastric motility was evaluated with electrogastrography (EGG) and by measurement of gastric emptying using the acetaminophen method in 26 patients with NERD and in 11 matched healthy controls. NERD patients were treated with a prokinetic agent (mosapride 15 mg, orally three times daily) for a period of 4 weeks, after which gastric motility was measured again. Compared with the healthy controls, the NERD patients showed a significantly lower percentage of normogastria, a lower power ratio in EGG, and delayed gastric emptying. Ten patients had normal gastric motor function (group A), and 16 showed abnormalities of either gastric myoelectrical activity or gastric emptying (group B). After treatment with mosapride, gastric motility improved significantly in both groups of patients compared with pretreatment values. The subjective assessment by the patient after the treatment was improved in 20.0% of group A versus 62.5% of group B patients (P < 0.05). Gastric hypomotility appears to be an important factor in reflux symptom generation in some NERD patients. |
[Hyaluronic acid rheology: Basics and clinical applications in facial rejuvenation]. | Hyaluronic acid (HA) is the most widely used dermal filler to treat facial volume deficits and wrinkles, especially for facial rejuvenation. Depending on the area of the face, the filler is exposed to two different forces (shear deformation and compression/stretching forces) resulting from intrinsic and external mechanical stress. The purpose of this technical note is to explain how rheology, which is the study of the flow and deformation of matter under strain, can be used in our clinical practice of facial volumization with fillers. Indeed, comprehension of the rheological properties of HA has become essential in selecting the dermal filler targeted to the area of the face. Viscosity, elasticity and cohesivity are the three main properties to be taken into consideration in this selection. Aesthetic physicians and surgeons must familiarize themselves with these basics in order to select the HA with the right rheological properties to achieve a natural-looking and long-lasting outcome. |
Biomechanical evaluation of three surgical scenarios of posterior lumbar interbody fusion by finite element analysis | BACKGROUND
For the treatment of low back pain, the following three scenarios of posterior lumbar interbody fusion (PLIF) are commonly used: the PLIF procedure with autogenous iliac bone (PAIB model), and PLIF with cages made of PEEK (PCP model) or titanium (Ti) (PCT model) material. However, the benefits and adverse effects of the three surgical scenarios are still not fully understood.
METHOD
Finite element analysis (FEA), an efficient tool for the analysis of lumbar diseases, was used to establish a three-dimensional nonlinear L1-S1 FE model (intact model) with the ligaments modeled as solid elements. The model was then modified to simulate the three scenarios of PLIF. Moments of 10 Nm with a 400 N preload were applied to the upper L1 vertebral body under the loading conditions of extension, flexion, lateral bending and torsion, respectively.
RESULTS
Different mechanical parameters were calculated to evaluate the differences among the three surgical models. The lowest stresses on the bone grafts and the greatest stresses on the endplate were found in the PCT model. The PCP model showed considerable stresses on the bone grafts and lower stresses on the ligaments, whereas changes in the stresses on the adjacent discs and endplate were minimal in the PAIB model.
CONCLUSIONS
The PCT model was inferior to the other two models. Both the PCP and PAIB models had their own relative merits. The findings provide theoretical basis for the choice of a suitable surgical scenario for different patients. |
A low complexity Orthogonal Matching Pursuit for sparse signal approximation with shift-invariant dictionaries | We propose a variant of Orthogonal Matching Pursuit (OMP), called LoCOMP, for scalable sparse signal approximation. The algorithm is designed for shift-invariant signal dictionaries with localized atoms, such as time-frequency dictionaries, and achieves approximation performance comparable to OMP at a computational cost similar to Matching Pursuit. Numerical experiments with a large audio signal show that, compared to OMP and Gradient Pursuit, the proposed algorithm runs over 500 times faster while leaving the approximation error almost unchanged. |
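For context, plain OMP (the baseline that LoCOMP approximates) can be sketched in a few lines of numpy; the random dictionary, sparsity level, and `omp` helper below are invented for illustration and do not reproduce the paper's shift-invariant implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Overcomplete dictionary with unit-norm atoms (columns).
n, m = 64, 256
D = rng.normal(size=(n, m))
D /= np.linalg.norm(D, axis=0)

# Signal that is exactly 5-sparse in D.
true_support = rng.choice(m, size=5, replace=False)
x_true = np.zeros(m)
x_true[true_support] = rng.normal(size=5)
y = D @ x_true

def omp(D, y, k):
    """Plain Orthogonal Matching Pursuit: greedy atom selection followed
    by a full least-squares re-fit over all selected atoms. LoCOMP's key
    saving is replacing this global re-fit with a local one restricted to
    atoms overlapping the newly selected (localized) atom."""
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # best-correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x, support

x_hat, support = omp(D, y, 5)
err = float(np.linalg.norm(D @ x_hat - y) / np.linalg.norm(y))
print(sorted(support) == sorted(true_support.tolist()), err < 1e-6)
```

The global least-squares step is what makes OMP expensive for long signals; Matching Pursuit skips it entirely, and LoCOMP sits between the two.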
Semi-supervised Spectral Clustering for Image Set Classification | We present an image set classification algorithm based on unsupervised clustering of labeled training and unlabeled test data where labels are only used in the stopping criterion. The probability distribution of each class over the set of clusters is used to define a true set-based similarity measure. To this end, we propose an iterative sparse spectral clustering algorithm. In each iteration, a proximity matrix is efficiently recomputed to better represent the local subspace structure. Initial clusters capture the global data structure and finer clusters at the later stages capture the subtle class differences not visible at the global scale. Image sets are compactly represented with multiple Grassmannian manifolds which are subsequently embedded in Euclidean space with the proposed spectral clustering algorithm. We also propose an efficient eigenvector solver which not only reduces the computational cost of spectral clustering manyfold but also improves the clustering quality and final classification results. Experiments on five standard datasets and comparison with seven existing techniques show the efficacy of our algorithm. |
Multidomain Document Layout Understanding using Few Shot Object Detection | We address the problem of document layout understanding using a simple algorithm which generalizes across multiple domains while training on just a few examples per domain. We approach this problem via a supervised object detection method and propose a methodology to overcome the requirement of large datasets. We use the concept of transfer learning by pre-training our object detector on a simple artificial (source) dataset and fine-tuning it on a tiny domain-specific (target) dataset. We show that this methodology works for multiple domains with as few as 10 training documents. We demonstrate the contribution of each component of the methodology to the end result and show the superiority of this methodology over simple object detectors. |
Longitudinal Neurostimulation in Older Adults Improves Working Memory | An increasing concern affecting a growing aging population is working memory (WM) decline. Consequently, there is great interest in improving or stabilizing WM, which drives expanded use of brain training exercises. Such regimens generally result in temporary WM benefits to the trained tasks but minimal transfer of benefit to untrained tasks. Pairing training with neurostimulation may stabilize or improve WM performance by enhancing plasticity and strengthening WM-related cortical networks. We tested this possibility in healthy older adults. Participants received 10 sessions of sham (control) or active (anodal, 1.5 mA) tDCS to the right prefrontal, parietal, or prefrontal/parietal (alternating) cortices. After ten minutes of sham or active tDCS, participants performed verbal and visual WM training tasks. On the first, tenth, and follow-up sessions, participants performed transfer WM tasks including the spatial 2-back, Stroop, and digit span tasks. The results demonstrated that all groups benefited from WM training, as expected. However, at follow-up 1-month after training ended, only the participants in the active tDCS groups maintained significant improvement. Importantly, this pattern was observed for both trained and transfer tasks. These results demonstrate that tDCS-linked WM training can provide long-term benefits in maintaining cognitive training benefits and extending them to untrained tasks. |
Immunoglobulin light chain class multiplicity and alternative organizational forms in early vertebrate phylogeny | The prototypic chondrichthyan immunoglobulin (Ig) light chain type (type I) isolated from Heterodontus francisci (horned shark) has a clustered organization in which variable (V), joining (J), and constant (C) elements are in relatively close linkage (V-J-C). Using a polymerase chain reaction-based approach on a light chain peptide sequence from the holocephalan, Hydrolagus colliei (spotted ratfish), it was possible to isolate members of a second light chain gene family. A probe to this light chain (type II) detects homologs in two orders of elasmobranchs, Heterodontus, a galeomorph and Raja erinacea (little skate), a batoid, suggesting that this light chain type may be present throughout the cartilaginous fishes. In all cases, V, J, and C regions of the type II gene are arranged in closely linked clusters typical of all known Ig genes in cartilaginous fishes. All representatives of this type II gene family are joined in the germline. A third (kappa-like) light chain type from Heterodontus is described. These findings establish that a degree of light chain class complexity comparable to that of the mammals is present in the most phylogenetically distant extant jawed vertebrates and that the phenomenon of germline-joined (pre-rearranged) genes, described originally in the heavy chain genes of cartilaginous fishes, extends to light chain genes. |
Community trends in the use and characteristics of persons with acute myocardial infarction who are transported by emergency medical services. | OBJECTIVE
Limited data exist on recent trends in ambulance use and factors associated with ambulance use in patients hospitalized with acute myocardial infarction (AMI), particularly from the more generalizable perspective of a community-wide investigation. This population-based prospective epidemiologic study describes the decade-long trends (1997-2007) in the use of emergency medical services (EMS) by residents of the Worcester, Massachusetts, metropolitan area who are hospitalized for AMI and the characteristics of patients with AMI who are transported to the hospital by EMS (n = 3789) compared with those transported by other means (n = 1505).
METHODS
The study population consisted of 5294 patients hospitalized for AMI at 11 greater Worcester medical centers in 5 annual periods between 1997 and 2007. Information on the use of EMS and the factors associated with EMS use was obtained through the review of hospital medical records.
RESULTS
There was a progressive increase in the proportion of greater Worcester residents with AMI who were transported to central Massachusetts hospitals by ambulance over time (66.9% transported in 1997; 74.9% transported in 2007). Patients transported by EMS were older, more likely to be women, and had a greater prevalence of previously diagnosed comorbidities.
CONCLUSION
Our findings provide encouragement for the use of EMS in residents of a large central New England community hospitalized with AMI. Despite increasing trends in ambulance use, more research is needed to explore the reasons why patients with AMI do not use EMS in the setting of an acute cardiac emergency. |
Intraneural ganglion cyst of the tibial nerve | Intraneural ganglion cyst of the tibial nerve is very rare. To date, only 5 cases of this entity in the popliteal fossa have been reported. We report a new case and review the previously reported cases. A 40-year-old man experienced a mild vague pain in the medial half of his right foot for 3 years. Magnetic resonance imaging scan demonstrated a soft-tissue mass along the right tibial nerve. At surgery, an intraneural ganglion cyst was evacuated. After 12 months, the patient was pain-free with no signs of recurrence. Trauma might be a contributing factor to the development of intraneural ganglion cysts. Application of microsurgical techniques is encouraged. |
Multilinear Algebra and Chess Endgames | This article has three chief aims: (1) To show the wide utility of multilinear algebraic formalism for high-performance computing. (2) To describe an application of this formalism in the analysis of chess endgames, and results obtained thereby that would have been impossible to compute using earlier techniques, including a win requiring a record 243 moves. (3) To contribute to the study of the history of chess endgames, by focusing on the work of Friedrich Amelung (in particular his apparently lost analysis of certain six-piece endgames) and that of Theodor Molien, one of the founders of modern group representation theory and the first person to have systematically numerically analyzed a pawnless endgame. |
Towards performance measurements for the Java Virtual Machine's invokedynamic | This paper presents a study of a Java Virtual Machine prototype from the Da Vinci Machine project, defined by JSR 292. It describes binary translation techniques to prepare benchmarks to run on the invokedynamic mode of the prototype, resulting in the invokedynamic version of the SciMark 2.0 suite. The benchmark preparation techniques presented in this paper proved useful, as the invokedynamic version of the benchmark programs successfully identified unexpectedly slow behavior in the invokedynamic mode of the server virtual machine.
Surprisingly, benchmarking results show that the invokedynamic mode with direct method handles on the server virtual machine is only 2-5 times slower than native Java invocations, except for the Monte Carlo benchmark. This mode on the client virtual machine, however, still requires further performance tuning. |
Adaptation of Prototype Sets in On-line Recognition of Isolated Handwritten Latin Characters | Results on a comparison of adaptive recognition techniques for on-line recognition of handwritten Latin alphabets are presented. The emphasis is on five adaptive classification strategies described in this paper. The strategies are based on first generating a user-independent set of prototype characters and then modifying this set in order to adapt it to each user's personal writing style. The initial set is formed by a simple clustering algorithm. The modification of the prototype set is performed using three modes of operation: 1) new prototypes are added, 2) existing prototypes are reshaped to better match the input, and 3) prototypes which produce false classifications are removed. The classification decision uses the k-Nearest Neighbor (k-NN) rule for the distances between the unknown character and the stored prototypes. The distances are calculated by using template matching with Dynamic Time Warping (DTW). The reshaping of the existing prototypes is performed by utilizing a modified version of the Learning Vector Quantization (LVQ) algorithm. The presented experiments show that the recognition system is able to adapt well to the user's writing style with only a few (say one hundred) handwritten characters. |
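The DTW distance used for template matching against stored prototypes can be sketched as a standard dynamic program; the toy sequences and the unit (absolute-difference) local cost below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two 1-D sequences,
    with absolute difference as the local cost."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # stretch a
                                 cost[i, j - 1],      # stretch b
                                 cost[i - 1, j - 1])  # match step
    return float(cost[n, m])

# A time-shifted copy warps onto the original at zero cost...
a = [0, 0, 1, 2, 1, 0, 0]
b = [0, 1, 2, 1, 0, 0, 0]
print(dtw(a, b))   # → 0.0
# ...while a genuinely different sequence stays far away.
print(dtw(a, [3] * 7) > 0)
```

This tolerance to local time shifts is what makes DTW attractive for handwriting, where the same character is traced at different speeds by different writers.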
TV Viewing and Advertising Targeting | Television (TV), the predominant advertising medium, is being transformed by the micro-targeting capabilities of set-top boxes (STBs). By procuring impressions at the STB level (often denoted programmatic television), advertisers can now lower per-exposure costs and/or reach viewers most responsive to advertising creatives. Accordingly, this paper uses a proprietary, household-level, single-source data set to develop an instantaneous show and advertisement viewing model to forecast consumers' exposure to advertising and the downstream consequences for impressions and sales. Viewing data suggest person-specific factors dwarf brand- or show-specific factors in explaining advertising avoidance, thereby suggesting that device-level advertising targeting can be more effective than existing show-level targeting. Consistent with this observation, the model indicates that micro-targeting lowers advertising costs and raises incremental profits considerably relative to show-level targeting. Further, these advantages are amplified when advertisers are allowed to buy in real time as opposed to up-front. |
Absent Multiple Kernel Learning | Multiple kernel learning (MKL) optimally combines the multiple channels of each sample to improve classification performance. However, existing MKL algorithms cannot effectively handle the situation where some channels are missing, which is common in practical applications. This paper proposes an absent MKL (AMKL) algorithm to address this issue. Different from existing approaches, where missing channels are first imputed and then a standard MKL algorithm is deployed on the imputed data, our algorithm directly classifies each sample with its observed channels. Specifically, we define a margin for each sample in its own relevant space, which corresponds to the observed channels of that sample. The proposed AMKL algorithm then maximizes the minimum of all sample-based margins, which leads to a difficult optimization problem. We show that this problem can be reformulated as a convex one by applying the representer theorem, so that it can be readily solved via existing convex optimization packages. Extensive experiments are conducted on five MKL benchmark data sets to compare the proposed algorithm with existing imputation-based methods. As observed, our algorithm achieves superior performance and the improvement is more significant with increasing missing ratio. |
Gender and Justice in Plato | Plato's proposal for the equality of the sexes remains one of the most controversial aspects of his argument in the Republic. I explore this argument with special emphasis on locating it within larger themes of the work, themes whose relevance to the argument on gender equality have often been ignored. On this basis, I find that Plato's defense of gender equality is serious, but that the foundation and the consequences of that argument have not usually been well understood. Plato's argument for gender equality rests on a distinctive view of human nature, and his elaboration of the consequences of pursuing gender equality reveal that a price would have to be paid for it that few are willing to accept. His argument should be considered by contemporary advocates of gender equality. |
Atheromatic™: Symptomatic vs. asymptomatic classification of carotid ultrasound plaque using a combination of HOS, DWT & texture | Quantitative characterization of carotid atherosclerosis and classification into either symptomatic or asymptomatic is crucial in terms of diagnosis and treatment planning for a range of cardiovascular diseases. This paper presents a computer-aided diagnosis (CAD) system (Atheromatic™, patented technology from Biomedical Technologies, Inc., CA, USA) which analyzes ultrasound images and classifies them into symptomatic and asymptomatic. The classification result is based on a combination of discrete wavelet transform, higher order spectra and textural features. In this study, we compare support vector machine (SVM) classifiers with different kernels. The classifier with a radial basis function (RBF) kernel achieved an accuracy of 91.7% as well as a sensitivity of 97%, and specificity of 80%. Encouraged by this result, we feel that these features can be used to identify the plaque tissue type. Therefore, we propose an integrated index, a unique number called symptomatic asymptomatic carotid index (SACI) to discriminate symptomatic and asymptomatic carotid ultrasound images. We hope this SACI can be used as an adjunct tool by the vascular surgeons for daily screening. |
A survey on tree edit distance and related problems | We survey the problem of comparing labeled trees based on simple local operations of deleting, inserting, and relabeling nodes. These operations lead to the tree edit distance, alignment distance, and inclusion problems. For each problem we review the results available and present, in detail, one or more of the central algorithms for solving the problem. Keywords: tree matching, edit distance |
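A minimal (and intentionally naive) sketch of the classic ordered-forest recursion behind tree edit distance, assuming unit costs for delete, insert, and relabel; the `ted` helper and tuple tree encoding are invented for illustration and are far less efficient than the algorithms the survey covers:

```python
from functools import lru_cache

# A tree is (label, (child, child, ...)). Unit costs:
# delete = insert = 1; relabel = 1 if labels differ, else 0.

def ted(t1, t2):
    def cost(a, b):
        return 0 if a == b else 1

    @lru_cache(maxsize=None)
    def size(f):          # number of nodes in a forest
        return sum(1 + size(t[1]) for t in f)

    @lru_cache(maxsize=None)
    def fdist(f1, f2):    # edit distance between ordered forests
        if not f1:
            return size(f2)           # insert everything remaining
        if not f2:
            return size(f1)           # delete everything remaining
        (l1, c1), rest1 = f1[0], f1[1:]
        (l2, c2), rest2 = f2[0], f2[1:]
        return min(
            1 + fdist(c1 + rest1, f2),   # delete root of first tree
            1 + fdist(f1, c2 + rest2),   # insert root of second tree
            cost(l1, l2) + fdist(c1, c2) + fdist(rest1, rest2),  # match roots
        )

    return fdist((t1,), (t2,))

t1 = ("f", (("a", ()), ("b", (("c", ()),))))
t2 = ("f", (("a", ()), ("d", (("c", ()),))))
print(ted(t1, t2))   # → 1 (single relabel: b -> d)
```

Memoization via `lru_cache` keeps this usable for small trees; the surveyed algorithms (e.g. Zhang-Shasha-style decompositions) achieve polynomial time by choosing which subproblems to solve far more carefully.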
Firing back: how great leaders rebound after career disasters. | Among the tests of a leader, few are more challenging, and more painful, than recovering from a career catastrophe. Most fallen leaders, in fact, don't recover. Still, two decades of consulting experience, scholarly research, and their own personal experiences have convinced the authors that leaders can triumph over tragedy, if they do so deliberately. Great business leaders have much in common with the great heroes of universal myth, and they can learn to overcome profound setbacks by thinking in heroic terms. First, they must decide whether or not to fight back. Either way, they must recruit others into their battle. They must then take steps to recover their heroic status, in the process proving, both to others and to themselves, that they have the mettle necessary to recover their heroic mission. Bernie Marcus exemplifies this process. Devastated after Sandy Sigoloff fired him from Handy Dan, Marcus decided to forgo the distraction of litigation and instead make the marketplace his battleground. Drawing from his network of carefully nurtured relationships with both close and more distant acquaintances, Marcus was able to get funding for a new venture. He proved that he had the mettle, and recovered his heroic status, by building Home Depot, whose entrepreneurial spirit embodied his heroic mission. As Bank One's Jamie Dimon, J.Crew's Mickey Drexler, and even Jimmy Carter, Martha Stewart, and Michael Milken have shown, stunning comebacks are possible in all industries and walks of life. Whatever the cause of your predicament, it makes sense to get your story out. The alternative is likely to be long-lasting unemployment. If the facts of your dismissal cannot be made public because they are damning, then show authentic remorse. The public is often enormously forgiving when it sees genuine contrition and atonement. |
Evolution of co-management: role of knowledge generation, bridging organizations and social learning. | Over a period of some 20 years, different aspects of co-management (the sharing of power and responsibility between the government and local resource users) have come to the forefront. The paper focuses on a selection of these: knowledge generation, bridging organizations, social learning, and the emergence of adaptive co-management. Co-management can be considered a knowledge partnership. Different levels of organization, from local to international, have comparative advantages in the generation and mobilization of knowledge acquired at different scales. Bridging organizations provide a forum for the interaction of these different kinds of knowledge, and the coordination of other tasks that enable co-operation: accessing resources, bringing together different actors, building trust, resolving conflict, and networking. Social learning is one of these tasks, essential both for the co-operation of partners and an outcome of the co-operation of partners. It occurs most efficiently through joint problem solving and reflection within learning networks. Through successive rounds of learning and problem solving, learning networks can incorporate new knowledge to deal with problems at increasingly larger scales, with the result that maturing co-management arrangements become adaptive co-management in time. |
Outcomes after Pylorus-preserving Gastrectomy for Early Gastric Cancer: A Prospective Multicenter Trial | The aim of the present study was to compare, in a prospective multicenter trial, the early and late results of pylorus-preserving gastrectomy (PPG) versus conventional distal gastrectomy (CDG) with Billroth I anastomosis for early gastric cancer. Eighty-one patients with early gastric cancer were randomized and then underwent either PPG or CDG. Duration of operation, intraoperative blood loss, days until removal of the nasogastric tube, days until start of oral intake, and decrease in body weight were studied as parameters for early postoperative outcomes. Late results were studied in patients followed for longer than 3 years. Change in body weight, status of oral intake, symptoms suggesting early dumping syndrome, and overall satisfaction were addressed in the questionnaire. The presence of gallstones was examined with ultrasonography. There were no differences in early results between PPG and CDG. The incidence of early dumping syndrome was lower with PPG (8%) than with CDG (33%). Other late results, including the incidence of gallstones, were not different between the 2 groups. These results indicate that PPG is as safe as CDG and has an advantage in terms of early dumping syndrome. |
Stationary and nonstationary learning characteristics of the LMS adaptive filter | This paper describes the performance characteristics of the LMS adaptive filter, a digital filter composed of a tapped delay line and adjustable weights, whose impulse response is controlled by an adaptive algorithm. For stationary stochastic inputs, the mean-square error, the difference between the filter output and an externally supplied input called the "desired response," is a quadratic function of the weights, a paraboloid with a single fixed minimum point that can be sought by gradient techniques. The gradient estimation process is shown to introduce noise into the weight vector that is proportional to the speed of adaptation and number of weights. The effect of this noise is expressed in terms of a dimensionless quantity "misadjustment" that is a measure of the deviation from optimal Wiener performance. Analysis of a simple nonstationary case, in which the minimum point of the error surface is moving according to an assumed first-order Markov process, shows that an additional contribution to misadjustment arises from "lag" of the adaptive process in tracking the moving minimum point. This contribution, which is additive, is proportional to the number of weights but inversely proportional to the speed of adaptation. The sum of the misadjustments can be minimized by choosing the speed of adaptation to make equal the two contributions. It is further shown, in Appendix A, that for stationary inputs the LMS adaptive algorithm, based on the method of steepest descent, approaches the theoretical limit of efficiency in terms of misadjustment and speed of adaptation when the eigenvalues of the input correlation matrix are equal or close in value. When the eigenvalues are highly disparate (λmax/λmin> 10), an algorithm similar to LMS but based on Newton's method would approach this theoretical limit very closely. |
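The LMS weight update described above can be sketched in a few lines of numpy; the toy system-identification setup, filter length, noise level, and step size below are illustrative assumptions, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unknown FIR system to identify (its weights are the Wiener solution).
w_true = np.array([0.6, -0.3, 0.1])
n_taps = len(w_true)

x = rng.normal(size=5000)                          # stationary input
d = np.convolve(x, w_true, mode="full")[:len(x)]   # desired response
d += 0.01 * rng.normal(size=len(x))                # measurement noise

w = np.zeros(n_taps)
mu = 0.01                                          # speed of adaptation
for k in range(n_taps, len(x)):
    u = x[k - n_taps + 1:k + 1][::-1]              # tapped delay line
    e = d[k] - w @ u                               # instantaneous error
    w += 2 * mu * e * u                            # LMS (stochastic gradient) step

print(np.round(w, 2))   # close to w_true
```

Raising `mu` speeds convergence but inflates the gradient-noise contribution to misadjustment; lowering it does the reverse, which is exactly the trade-off the paper quantifies (and, for nonstationary inputs, balances against tracking lag).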
Modern French drama, 1940-1990 | List of illustrations Preface Acknowledgements 1. Introduction: the inter-war years 2. The Occupation 3. The Parisian theatre I: philosophical melodrama 4. The Parisian theatre II: the new theatre 5. The decentralised theatre I: the fifties 6. The decentralised theatre II: Planchon and Adamov 7. The decentralised theatre III: the sixties 8. Total theatre 9. La creation collective 10. Playwrights of the seventies 11. The eighties Bibliography Historical table of productions Index. |
A Reinforcement Learning Based Approach for Automated Lane Change Maneuvers | Lane change is a crucial vehicle maneuver which needs coordination with surrounding vehicles. Automated lane changing functions built on rule-based models may perform well under pre-defined operating conditions, but they may be prone to failure when unexpected situations are encountered. In our study, we proposed a Reinforcement Learning based approach to train the vehicle agent to learn an automated lane change behavior such that it can intelligently make a lane change under diverse and even unforeseen scenarios. Particularly, we treated both state space and action space as continuous, and designed a Q-function approximator that has a closed-form greedy policy, which contributes to the computation efficiency of our deep Q-learning algorithm. Extensive simulations are conducted for training the algorithm, and the results illustrate that the Reinforcement Learning based vehicle agent is capable of learning a smooth and efficient driving policy for lane change maneuvers. |
A Unified Gradient Regularization Family for Adversarial Examples | Adversarial examples are augmented data points generated by imperceptible perturbation of input samples. They have recently drawn much attention within the machine learning and data mining communities. Being difficult to distinguish from real examples, such adversarial examples can change the predictions of many of the best learning models, including state-of-the-art deep learning models. Recent attempts have been made to build robust models that take adversarial examples into account. However, these methods can either lead to performance drops or lack mathematical motivation. In this paper, we propose a unified framework to build robust machine learning models against adversarial examples. More specifically, using the unified framework, we develop a family of gradient regularization methods that effectively penalize the gradient of the loss function w.r.t. the inputs. Our proposed framework is appealing in that it offers a unified view on dealing with adversarial examples, and it incorporates another recently proposed perturbation-based approach as a special case. In addition, we present some visual effects that reveal semantic meaning in those perturbations, and thus support our regularization method and provide another explanation for the generalizability of adversarial examples. By applying this technique to Maxout networks, we conduct a series of experiments and achieve encouraging results on two benchmark datasets. In particular, we attain the best accuracy on MNIST data (without data augmentation) and competitive performance on CIFAR-10 data. |
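For intuition, perturbation-based approaches of the kind the framework subsumes resemble a fast-gradient-sign step on the input; the toy logistic model below (fixed weights and input, all invented for illustration) sketches how a small step along the sign of the input gradient of the loss can flip a prediction:

```python
import numpy as np

# Fixed logistic model standing in for a trained classifier.
w = np.array([1.0, -2.0, 0.5, 1.5])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -0.5, 1.0, 0.2])   # clean input, true label y = 1
y = 1.0

p = sigmoid(w @ x)
# Gradient of the logistic loss w.r.t. the *input* (not the weights);
# this is the quantity the gradient regularization family penalizes.
grad_x = (p - y) * w

eps = 0.5
x_adv = x + eps * np.sign(grad_x)      # fast-gradient-sign perturbation

# Confidence on the true class drops below 0.5: the prediction flips.
print(round(float(sigmoid(w @ x)), 3), round(float(sigmoid(w @ x_adv)), 3))
```

Penalizing the input gradient of the loss during training shrinks `grad_x`, so a perturbation of the same budget `eps` moves the decision far less; this is the mechanism the regularization family exploits.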
A handbook of protocols for standardised and easy measurement of plant functional traits worldwide | . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 336 Introduction and discussion . . . . . . . . . . . . . . . . . . . . 336 The protocol handbook . . . . . . . . . . . . . . . . . . . . . . . . 337 1. Selection of plants and statistical considerations . . . 337 1.1 Selection of species in a community or ecosystem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 337 1.2 Selection of individuals within a species . . . . . . 339 1.3 Statistical considerations . . . . . . . . . . . . . . . . . . 339 2. Vegetative traits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341 2.1. Whole-plant traits . . . . . . . . . . . . . . . . . . . . . . . 341 Growth form . . . . . . . . . . . . . . . . . . . . . . . . . . . 341 Life form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341 Plant height . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342 Clonality (and belowground storage organs) . . 343 Spinescence . . . . . . . . . . . . . . . . . . . . . . . . . . . . 343 Flammability . . . . . . . . . . . . . . . . . . . . . . . . . . . 344 2.2. Leaf traits. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 345 Specific leaf area (SLA) . . . . . . . . . . . . . . . . . . 345 Leaf size (individual leaf area) . . . . . . . . . . . . . 347 Leaf dry matter content (LDMC) . . . . . . . . . . . 348 Leaf nitrogen concentration (LNC) and leaf phosphorus concentration (LPC) . . . . . . . . . 349 Physical strength of leaves . . . . . . . . . . . . . . . . . 350 Leaf lifespan. . . . . . . . . . . . . . . . . . . . . . . . . . . . 351 Leaf phenology (seasonal timing of foliage) . . . 352 Photosynthetic pathway . . . . . . . . . . . . . . . . . . . 353 Leaf frost sensitivity. . . . . . . . . . . . . . . . . . . . . . 355 2.3. Stem traits. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
356 Stem specific density (SSD) . . . . . . . . . . . . . . . 356 Twig dry matter content (TDMC) and twig drying time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357 Bark thickness (and bark quality) . . . . . . . . . . . 358 2.4. Belowground traits . . . . . . . . . . . . . . . . . . . . . . . 359 Specific root length (SRL) and fine root diameter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359 Root depth distribution and 95% rooting depth. 360 Nutrient uptake strategy . . . . . . . . . . . . . . . . . . . 362 3. Regenerative traits. . . . . . . . . . . . . . . . . . . . . . . . . . . . 368 Dispersal mode. . . . . . . . . . . . . . . . . . . . . . . . . . 368 Dispersule shape and size . . . . . . . . . . . . . . . . . 368 Seed mass. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 369 Resprouting capacity after major disturbance . . 370 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 372 336 Australian Journal of Botany J. H. C. Cornelissen et al. Introduction and discussion This paper is not just another handbook on ecological methodology, but serves a particular and urgent demand as well as a global ambition. Classifying plant species according to their higher taxonomy has strong limitations when it comes to answering important ecological questions at the scale of ecosystems, landscapes or biomes (Woodward and Diament 1991; Keddy 1992; Körner 1993). These questions include those on responses of vegetation to environmental variation or changes, notably in climate, atmospheric chemistry, landuse and natural disturbance regimes. Reciprocal questions are concerned with the impacts of vegetation on these large-scale environmental parameters (see Lavorel and Garnier 2002 for a review on response and effect issues). 
A fast-growing scientific community has come to the realisation that a promising way forward for answering such questions, as well as various other ecological questions, is by classifying plant species on functional grounds (e.g. Díaz et al. 2002). Plant functional types and plant strategies, the units within functional classification schemes, can be defined as groups of plant species sharing similar functioning at the organismic level, similar responses to environmental factors and/or similar roles in (or effects on) ecosystems or biomes (see reviews by Box 1981; Chapin et al. 1996; Lavorel et al. 1997; Smith et al. 1997; Westoby 1998; McIntyre et al. 1999a; McIntyre et al. 1999b; Semenova and van der Maarel 2000; Grime 2001; Lavorel and Garnier 2002). These similarities are based on the fact that they tend to share a set of key functional traits (e.g. Grime and Hunt 1975; Thompson et al. 1993; Brzeziecki and Kienast 1994; Chapin et al. 1996; Noble and Gitay 1996; Thompson et al. 1996; Díaz and Cabido 1997; Grime et al. 1997; Westoby 1998; Weiher et al. 1999; Cornelissen et al. 2001; McIntyre and Lavorel 2001; Lavorel and Garnier 2002; Pausas and Lavorel 2003). Empirical studies on plant functional types and traits have flourished recently and are rapidly progressing towards an understanding of plant traits relevant to local vegetation and ecosystem dynamics. However, functional classifications are not fully resolved with regard to application in regional- to global-scale modelling, or to the interpretation of vegetation–environment relationships in the paleo-record. Recent empirical work has tended to adopt a ‘bottom-up’ approach in which detailed analyses relate (responses of) plant traits to specific environmental factors. Some of the difficulties associated with this approach concern the identification of actual plant functional groups from the knowledge of relevant traits and the scaling from individual plant traits to ecosystem functioning. 
On the other hand, geo-biosphere modellers as well as paleo-ecologists have tended to focus on ‘top-down’ classifications where functional types or life forms are defined a priori from a small set of postulated characteristics. These are often characteristics that can be observed without empirical measurement and have only limited functional explanatory power. The modellers and paleo-ecologists are aware that their functional type classifications do not suffice to tackle some of the pressing large-scale ecological issues (Steffen and Cramer 1997). In an attempt to bridge the gap between the ‘bottom-up’ and ‘top-down’ approaches (see Canadell et al. 2000), scientists from both sides joined a workshop (at Isle sur la Sorgue, France, in October 2000) organised by the International Geosphere–Biosphere Programme (IGBP, project Global Change and Terrestrial Ecosystems). One of the main objectives of the workshop was to assemble a minimal list of functional traits of terrestrial vascular plants that (1) can together represent the key responses and effects of vegetation at various scales from ecosystems to landscapes to biomes to continents, (2) can be used to devise a satisfactory functional classification as a tool in regional and global-scale modelling and paleo-ecology of the geo-biosphere, (3) can help answer some further questions of ecological theory, nature conservation and land management (see Table 1 and Weiher et al. 1999) and (4) are candidates for relatively easy, inexpensive and standardised measurement. Abstract: There is growing recognition that classifying terrestrial plant species on the basis of their function (into ‘functional types’) rather than their higher taxonomic identity is a promising way forward for tackling important ecological questions at the scale of ecosystems, landscapes or biomes. These questions include those on vegetation responses to, and vegetation effects on, environmental changes (e.g. 
changes in climate, atmospheric chemistry, land use or other disturbances). There is also growing consensus about a shortlist of plant traits that should underlie such functional plant classifications, because they have strong predictive power of important ecosystem responses to environmental change and/or they themselves have strong impacts on ecosystem processes. The most favoured traits are those that are also relatively easy and inexpensive to measure for large numbers of plant species. Large international research efforts, promoted by the IGBP–GCTE Programme, are underway to screen predominant plant species in various ecosystems and biomes worldwide for such traits. This paper provides an international methodological protocol aimed at standardising this research effort, based on consensus among a broad group of scientists in this field. It features a practical handbook with step-by-step recipes, with relatively brief information about the ecological context, for 28 functional traits recognised as critical for tackling large-scale ecological questions. |
The use of mobile technology for online shopping and entertainment among older adults in Finland | Older adults are becoming an important market segment for all internet-based services, but few studies to date have considered older adults as online shoppers and users of entertainment media. Utilising the concept of life course, this article investigates the use of mobile technologies for online shopping and entertainment among consumers aged 55 to 74. The data were collected with a web-based survey completed by a panel of respondents representing Finnish television viewers (N=322). The results reveal that consumers aged 55 to 74 use a smartphone or tablet to purchase products or services online as often as younger consumers. In contrast, listening to internet radio and watching videos or programmes online with a smartphone or tablet are most typical for younger male consumers. The results demonstrate that mobile-based online shopping is best predicted by age, higher education, and household type (children living at home), and use of entertainment media by age and gender. |
SenseClusters: Unsupervised Clustering and Labeling of Similar Contexts | SenseClusters is a freely available system that identifies similar contexts in text. It relies on lexical features to build first and second order representations of contexts, which are then clustered using unsupervised methods. It was originally developed to discriminate among contexts centered around a given target word, but can now be applied more generally. It also supports methods that create descriptive and discriminating labels for the discovered clusters. |
Head Reconstruction from Internet Photos | 3D face reconstruction from Internet photos has recently produced exciting results. A person’s face, e.g., Tom Hanks, can be modeled and animated in 3D from a completely uncalibrated photo collection. Most methods, however, focus solely on face area and mask out the rest of the head. This paper proposes that head modeling from the Internet is a problem we can solve. We target reconstruction of the rough shape of the head. Our method is to gradually “grow” the head mesh starting from the frontal face and extending to the rest of views using photometric stereo constraints. We call our method boundary-value growing algorithm. Results on photos of celebrities downloaded from the Internet are presented. |
Hardware software co-design in Haskell | We present a library in Haskell for programming Field Programmable Gate Arrays (FPGAs), including hardware software co-design. Code for software (in C) and hardware (in VHDL) is generated from a single program, along with the code to support communication between hardware and software. We present type-based techniques for the simultaneous implementation of more than one embedded domain specific language (EDSL). We build upon a generic representation of imperative programs that is loosely coupled to instruction and expression types, allowing the individual parts to be developed and improved separately. Code generation is implemented as a series of translations between progressively smaller, typed EDSLs, safeguarding against errors that arise in untyped translations. Initial case studies show promising performance. |
Online gaming and risks predict cyberbullying perpetration and victimization in adolescents | The present study examined factors associated with the emergence and cessation of youth cyberbullying and victimization in Taiwan. A total of 2,315 students from 26 high schools were assessed in the 10th grade, with follow-up performed in the 11th grade. Self-administered questionnaires were collected in 2010 and 2011. Multiple logistic regression was conducted to examine the factors. Multivariate analysis results indicated that higher levels of risk factors (online game use, exposure to violence in media, Internet risk behaviors, cyber/school bullying experiences) in the 10th grade coupled with an increase in risk factors from grades 10 to 11 could be used to predict the emergence of cyberbullying perpetration/victimization. In contrast, lower levels of risk factors in the 10th grade and higher levels of protective factors coupled with a decrease in risk factors predicted the cessation of cyberbullying perpetration/victimization. Online game use, exposure to violence in media, Internet risk behaviors, and cyber/school bullying experiences can be used to predict the emergence and cessation of youth cyberbullying perpetration and victimization. |
Does abciximab provide long-term benefit for patients with acute myocardial infarction treated with stents? | BACKGROUND The number of revascularization procedures performed has been shown to be reduced when stents are used in primary percutaneous coronary intervention (PCI) for patients with acute myocardial infarction (AMI). The Abciximab Before Direct Angioplasty and Stenting in Myocardial Infarction Regarding Acute and Long-term Follow-up (ADMIRAL) trial has already shown that adjunctive glycoprotein IIb/IIIa inhibition with abciximab can effectively lower mortality, the incidence of myocardial infarction (MI) and the number of revascularization procedures at 30 days compared with placebo; however, the long-term effects of abciximab are not known. |
Clock skew scheduling for improved reliability via quadratic programming | This paper considers the problem of determining an optimal clock skew schedule for a synchronous VLSI circuit. A novel formulation of clock skew scheduling as a constrained quadratic programming (QP) problem is introduced. The concept of a permissible range, or a valid interval, for the clock skew of each local data path is key to this QP approach. From a reliability perspective, the ideal clock schedule corresponds to each clock skew within the circuit being at the center of the respective permissible range. However, this ideal clock schedule is not practically implementable because of limitations imposed by the connectivity among the registers within the circuit. To evaluate reliability, a quadratic cost function is introduced as the Euclidean distance between the ideal schedule and a given practically feasible clock schedule. This cost function is the minimization objective of the described algorithms for the solution of the previously mentioned quadratic program. Furthermore, the work described here substantially differs from previous research in that it permits complete control over specific clock signal delays or skews within the circuit. Specifically, the algorithms described here can be employed to obtain results with explicitly specified target values of important clock delays/skews within a circuit, such as, for example, the clock delays/skews for I/O registers. An additional benefit is a potential reduction in clock period of up to 10%.
An efficient mathematical algorithm is derived for the solution of the QP problem with O(r^3) run time complexity and O(r^2) storage complexity, where r is the number of registers in the circuit. The algorithm is implemented as a C++ program and demonstrated on the ISCAS'89 suite of benchmark circuits as well as on a number of industrial circuits. The work described here yields additional insights into the correlation between circuit structure and circuit timing by characterizing the degree to which specific signal paths limit the overall performance and reliability of a circuit. This information is directly applicable to logic and architectural synthesis. |
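The geometry behind this QP, an ideal schedule of range-center skews that connectivity makes infeasible, and the feasible schedule nearest to it in Euclidean distance, can be sketched on a three-register toy circuit. This is a minimal illustration (the range-center values are made up, and a plain least-squares solve stands in for the paper's specialized O(r^3) algorithm and its explicit range constraints):

```python
import numpy as np

# Toy circuit: registers r0, r1, r2 joined in a cycle of local data paths
# r0->r1, r1->r2, r2->r0.  With clock delays t, the skew on path (i, j)
# is t_i - t_j, giving this path/delay incidence matrix:
A = np.array([[ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0],
              [-1.0,  0.0,  1.0]])

# Ideal skews = centers of the (hypothetical) permissible ranges.
# Around a cycle the realizable skews must sum to zero, so this ideal
# schedule is infeasible: no delay assignment t produces it exactly.
b = np.array([1.0, 0.5, 0.3])

# The feasible schedule closest to the ideal in Euclidean distance is the
# projection of b onto the range of A, i.e. the least-squares solution.
t, *_ = np.linalg.lstsq(A, b, rcond=None)
skews = A @ t
print(skews)   # sums to zero around the cycle, unlike the ideal b
```

The realized skews are exactly `b` shifted to satisfy the cycle constraint, which is the sense in which the quadratic cost picks the "most reliable" implementable schedule.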
The Critical Role of Vocabulary Development for English Language Learners | English language learners (ELLs) who experience slow vocabulary development are less able to comprehend text at grade level than their English-only peers. Such students are likely to perform poorly on assessments in these areas and are at risk of being diagnosed as learning disabled. In this article, we review the research on methods to develop the vocabulary knowledge of ELLs and present lessons learned from the research concerning effective instructional practices for ELLs. The review suggests that several strategies are especially valuable for ELLs, including taking advantage of students’ first language if the language shares cognates with English; ensuring that ELLs know the meaning of basic words, and providing sufficient review and reinforcement. Finally, we discuss challenges in designing effective vocabulary instruction for ELLs. Important issues are determining which words to teach, taking into account the large deficits in second-language vocabulary of ELLs, and working with the limited time that is typically available for direct instruction in vocabulary. |
Modern Heat Extraction Systems for Power Traction Machines—A Review | This paper presents a review of modern cooling systems employed for the thermal management of power traction machines. Various solutions for heat extraction are described: high thermal conductivity insulation materials, spray cooling, high thermal conductivity fluids, combined liquid and air forced convection, and loss mitigation techniques. |
Inflows of capital to developing countries in the 1990s | Half a decade has passed since the resurgence of international capital flows to many developing countries. The recent surge in capital inflows was initially attributed to domestic developments, such as the sound policies and stronger economic performance of a handful of countries. Eventually, it became clear that the phenomenon was widespread, affecting countries with very diverse characteristics. This pattern suggested that global factors, like cyclical movements in interest rates, were especially important. This paper discusses the principal facts, developments and policies that characterize the current episode of capital inflows to Asia and Latin America. |
Anchorage capacity of osseointegrated and conventional anchorage systems: a randomized controlled trial. | INTRODUCTION
Our aim in this investigation was to evaluate and compare orthodontic anchorage capacity of 4 anchorage systems during leveling/aligning and space closure after maxillary premolar extractions.
METHODS
One hundred twenty patients (60 girls, 60 boys; mean age, 14.3 years; SD 1.73) were recruited and randomized into 4 anchorage systems: Onplant (Nobel Biocare, Gothenburg, Sweden), Orthosystem implant (Institut Straumann AG, Basel, Switzerland), headgear, and transpalatal bar. The main outcome measures were cephalometric analysis of maxillary first molar and incisor movement, sagittal growth changes of the maxilla, and treatment time. The results were also analyzed on an intention-to-treat basis.
RESULTS
The maxillary molars were stable during the leveling/aligning in the Onplant, Orthosystem implant, and headgear groups, but the transpalatal bar group had anchorage loss (mean, 1.0 mm; P <.001). During the space-closure phase, the molars were still stable in the Onplant and Orthosystem groups, whereas the headgear and transpalatal bar groups had anchorage loss (means, 1.6 and 1.0 mm, respectively; P <.001). Thus, the Onplant and the Orthosystem implant groups had significantly higher success rates for anchorage than did the headgear and transpalatal bar groups. Compared with the Orthosystem implant, there were more technical problems with the Onplant.
CONCLUSIONS
If maximum anchorage is required, the Orthosystem implant is the system of choice. |
The influence of political events and ideologies on Nathaniel Hawthorne's political vision and writings | Introduction: Nathaniel Hawthorne and political reform; Nathaniel Hawthorne and conservative philosophy; Hawthorne's concept of human nature and original sin in the philosophy of conservative reform; Hawthorne's conservatism and the democratic dilemma; the role of the "past" in Hawthorne's philosophy of conservative reform; Hawthorne's response to Comtean positivism in the philosophy of conservative reform; Hawthorne and conservative reform in the American Revolution; Nathaniel Hawthorne and Edmund Burke: conservative perspectives on reform in "Earth's Holocaust" and the French Revolution; Conclusion: Nathaniel Hawthorne and the philosophy of conservative reform. |
An Attribute-Based High-Level Image Representation for Scene Classification | Scene classification is increasingly popular due to its extensive usage in many real-world applications such as object detection, image retrieval, and so on. Traditionally, low-level hand-crafted image representations are adopted to describe scene images. However, they usually fail to detect semantic features of visual concepts, especially in handling complex scenes. In this paper, we propose a novel high-level image representation which utilizes image attributes as features for scene classification. More specifically, the attributes of each image are first extracted by a deep convolutional neural network (CNN), which is trained to be a multi-label classifier by minimizing an element-wise logistic loss function. The process of generating attributes reduces the “semantic gap” between the low-level feature representation and the high-level scene meaning. Based on the attributes, we then build a system to discover semantically meaningful descriptions of the scene classes. Extensive experiments on four large-scale scene classification datasets show that our proposed algorithm considerably outperforms other state-of-the-art methods. |
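The "element-wise logistic loss" named above is the standard multi-label objective: one independent sigmoid per attribute, averaged over all (image, attribute) entries. A minimal numpy sketch (the attribute names and logit values are hypothetical; the paper computes this over CNN outputs):

```python
import numpy as np

def elementwise_logistic_loss(logits, targets):
    """Mean element-wise logistic loss for multi-label attribute prediction.

    Each attribute gets its own sigmoid, so an image can activate several
    attributes at once, unlike a single softmax over exclusive classes.
    """
    p = 1.0 / (1.0 + np.exp(-logits))
    eps = 1e-12   # numerical guard for log
    return -np.mean(targets * np.log(p + eps)
                    + (1.0 - targets) * np.log(1.0 - p + eps))

# Two images, three hypothetical attributes ("indoor", "vegetation", "water").
targets = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 0.0]])
good = np.array([[ 5.0, -5.0,  5.0],
                 [-5.0,  5.0, -5.0]])   # logits agreeing with the targets
bad = -good                             # logits disagreeing everywhere
```

Logits that agree with the attribute targets yield a near-zero loss, while disagreeing logits are penalized on every entry independently.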
A novel CNTFET-based ternary logic gate design | This paper presents a novel design of ternary logic inverters using carbon nanotube FETs (CNTFETs). Multiple-valued logic (MVL) circuits have attracted substantial interest due to the capability of increasing information content per unit area. In the past, extensive design techniques for MVL circuits (especially ternary logic inverters) have been proposed for implementation in CMOS technology. In a CNTFET device, the threshold voltage of the transistor can be controlled by controlling the chirality vector (i.e. the diameter); in this paper, this feature is exploited to design ternary logic inverters. New designs are proposed and compared with existing CNTFET-based designs. Extensive simulation results using SPICE demonstrate that the power-delay product is improved by 300% compared with the conventional ternary gate design. |
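For readers outside MVL design, the three standard ternary inverters discussed in this literature can be pinned down by their behavioral truth tables. This models logic function only (levels 0, 1, 2 standing for 0, Vdd/2, Vdd), not the transistor-level chirality/diameter sizing that the paper actually engineers:

```python
def sti(x):
    """Standard ternary inverter: full complement, 0->2, 1->1, 2->0."""
    return 2 - x

def nti(x):
    """Negative ternary inverter: output high only for input 0."""
    return 2 if x == 0 else 0

def pti(x):
    """Positive ternary inverter: output low only for input 2."""
    return 0 if x == 2 else 2

# Truth tables for the three inverters over the ternary alphabet {0, 1, 2}.
for x in (0, 1, 2):
    print(x, sti(x), nti(x), pti(x))
```

The STI is the gate whose intermediate level (output 1 for input 1) requires the threshold-voltage control that CNTFET diameter tuning provides.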
Simultaneous Traffic Sign Detection and Boundary Estimation Using Convolutional Neural Network | We propose a novel traffic sign detection system that simultaneously estimates the location and precise boundary of traffic signs using a convolutional neural network (CNN). Estimating the precise boundary of traffic signs is important in navigation systems for intelligent vehicles, where traffic signs can be used as 3-D landmarks for the road environment. Previous traffic sign detection systems, including recent methods based on CNNs, only provide bounding boxes of traffic signs as output, and thus require additional processes such as contour estimation or image segmentation to obtain the precise boundary of signs. In this paper, the boundary estimation of a traffic sign is formulated as a 2-D pose and shape class prediction problem, and this is effectively solved by a single CNN. With the predicted 2-D pose and the shape class of a target traffic sign in the input, we estimate the actual boundary of the target sign by projecting the boundary of a corresponding template sign image into the input image plane. By formulating the boundary estimation problem as a CNN-based pose and shape prediction task, our method is end-to-end trainable and more robust to occlusion and small targets than other boundary estimation methods that rely on contour estimation or image segmentation. With our architectural optimization of the CNN-based traffic sign detection network, the proposed method shows a detection frame rate higher than seven frames/second while providing highly accurate and robust traffic sign detection and boundary estimation results on a low-power mobile platform. |
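The projection step, mapping a template boundary into the image with a predicted 2-D pose, can be sketched with a similarity transform. This is a simplified stand-in: the triangle template and the (scale, rotation, translation) pose parameterization are illustrative assumptions, not the paper's exact pose representation:

```python
import numpy as np

def project_template_boundary(template_pts, pose):
    """Map template sign boundary vertices into the image plane.

    pose = (scale, rotation, tx, ty): a 2-D similarity transform applied
    to each template vertex, standing in for the pose the network predicts
    once the shape class has selected the right template.
    """
    s, th, tx, ty = pose
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    return s * template_pts @ R.T + np.array([tx, ty])

# Unit triangle as a hypothetical "yield"-class template boundary.
template = np.array([[0.0, 1.0], [-0.87, -0.5], [0.87, -0.5]])
boundary = project_template_boundary(template, (20.0, 0.0, 100.0, 50.0))
# First vertex lands at (100, 70): scaled by 20, translated by (100, 50).
```

Because the boundary is generated from a clean template rather than traced in the image, occlusion of part of the sign does not corrupt the estimated contour.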
Automatic Protocol Format Reverse Engineering through Context-Aware Monitored Execution | Protocol reverse engineering has often been a manual process that is considered time-consuming, tedious and error-prone. To address this limitation, a number of solutions have recently been proposed to allow for automatic protocol reverse engineering. Unfortunately, they are either limited in extracting protocol fields due to lack of program semantics in network traces or primitive in only revealing the flat structure of protocol format. In this paper, we present a system called AutoFormat that aims at not only extracting protocol fields with high accuracy, but also revealing the inherently “non-flat”, hierarchical structures of protocol messages. AutoFormat is based on the key insight that different protocol fields in the same message are typically handled in different execution contexts (e.g., the runtime call stack). As such, by monitoring the program execution, we can collect the execution context information for every message byte (annotated with its offset in the entire message) and cluster them to derive the protocol format. We have evaluated our system with more than 30 protocol messages from seven protocols, including two text-based protocols (HTTP and SIP), three binary-based protocols (DHCP, RIP, and OSPF), one hybrid protocol (CIFS/SMB), as well as one unknown protocol used by a real-world malware. Our results show that AutoFormat can not only identify individual message fields automatically and with high accuracy (an average 93.4% match ratio compared with Wireshark), but also unveil the structure of the protocol format by revealing possible relations (e.g., sequential, parallel, and hierarchical) among the message fields. Part of this research has been supported by the National Science Foundation under grants CNS-0716376 and CNS-0716444. The bulk of this work was performed when the first author was visiting George Mason University in Summer 2007. |
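The key insight above, that bytes of the same field tend to be handled in the same execution context, suggests a simple field-inference rule: merge contiguous byte offsets that share a call-stack signature. A toy sketch under that assumption (the `parse_verb`/`parse_uri` context labels and the example message are hypothetical, and real AutoFormat also recovers hierarchical structure, which this flat version does not):

```python
def infer_fields(byte_contexts):
    """Derive flat protocol fields from per-byte execution contexts.

    byte_contexts: list of (offset, context_id) pairs, one per message
    byte, where context_id stands for the call stack observed while that
    byte was processed.  Contiguous runs of bytes sharing a context are
    merged into inferred fields (start_offset, end_offset, context).
    """
    obs = sorted(byte_contexts)
    start, prev_off, prev_ctx = obs[0][0], obs[0][0], obs[0][1]
    fields = []
    for off, ctx in obs[1:]:
        if ctx != prev_ctx or off != prev_off + 1:
            fields.append((start, prev_off, prev_ctx))
            start = off
        prev_off, prev_ctx = off, ctx
    fields.append((start, prev_off, prev_ctx))
    return fields

# Hypothetical trace: bytes 0-1 touched in parse_verb, 2-4 in parse_uri.
msg = [(0, "parse_verb"), (1, "parse_verb"),
       (2, "parse_uri"), (3, "parse_uri"), (4, "parse_uri")]
print(infer_fields(msg))   # [(0, 1, 'parse_verb'), (2, 4, 'parse_uri')]
```

Nesting call-stack prefixes instead of flat context ids is what lets the full system recover the hierarchical (sequential/parallel) field relations the abstract describes.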
Blind Source Separation Algorithms with Matrix Constraints | In many applications of Independent Component Analysis (ICA) and Blind Source Separation (BSS), the estimated source signals and the mixing or separating matrices have some special structure, or constraints are imposed on the matrices, such as symmetry, orthogonality, non-negativity, sparseness, and a specified invariant norm of the separating matrix. In this paper, we present several algorithms and overview some known transformations which allow us to preserve several important constraints. Computer simulation experiments confirmed the validity and usefulness of the developed algorithms. Key words: blind source separation, independent component analysis with constraints, non-negative blind source separation |
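One of the most common constraint-preserving transformations in this setting is the projection of a separating matrix onto the orthogonal group after each unconstrained update. A minimal sketch (this SVD-based projection is a standard device in constrained ICA, offered here as an illustration rather than as any specific algorithm from the paper):

```python
import numpy as np

def nearest_orthogonal(W):
    """Project a separating matrix onto the orthogonal group.

    The closest orthogonal matrix to W in Frobenius norm is U V^T, where
    W = U S V^T is the SVD; applying this after each learning step keeps
    the orthogonality constraint satisfied exactly.
    """
    U, _, Vt = np.linalg.svd(W)
    return U @ Vt

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))   # an unconstrained update of W
Q = nearest_orthogonal(W)         # orthogonal: Q @ Q.T == I
```

The same projection idea carries over to other constraint sets (e.g. clipping negative entries for non-negativity), which is the common thread among such algorithms.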
Monte Carlo arithmetic: how to gamble with floating point and win | Monte Carlo arithmetic (MCA) is a form of floating-point arithmetic in which arithmetic operators and their operands are randomized. For example, rather than insist that floating-point addition obey x ⊕ y = fl[x + y] = (x + y)(1 + δ), where δ is some deterministically defined value, we allow δ to be a random variable. MCA is an extension of floating-point arithmetic that permits randomization to be used both in rounding and in processing inexact operands—the result of every arithmetic operation is randomized in a predefined way. As a result, x + y can yield different values if evaluated multiple times. With MCA, then, if a program runs with the same input n times, each run yields a slightly different answer. These answers constitute a sample distribution to which we can apply the whole array of standard statistical methods. Typically, after n such runs, the sample mean μ̂ estimates the exact solution, the sample standard deviation σ̂ estimates the error in the answer of any single run, and the sample standard error σ̂/√n estimates the error in μ̂ after n such runs. Each recalculation is an experiment in a Monte Carlo simulation—a simulation of the sensitivity to rounding of this particular combination of input and program. MCA makes numerical computing empirical. It has practical applications—end users can gauge the number of significant digits in computed values as well as a program's stability. They can also use Monte Carlo methods to estimate round-off error accumulation. By treating individual rounding errors as random variables, we can use statistical methods to analyze errors. MCA also has theoretical applications. In fact, we can use MCA to circumvent certain well-known anomalies in floating-point operations.1 This article gives a short review of several years' investigation.2 For a copy of our full work and a C demonstration program that runs all the examples in this article, visit www.cs.ucla.edu/~stott/mca. 
We assume the reader is familiar with basic issues in floating-point arithmetic.2 David Goldberg’s tutorial3 and Nicholas Higham’s encyclopedia4 are great references. |
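The run-it-n-times recipe above can be sketched directly. This is a toy model only: the perturbation here is an explicit relative error of about one ulp injected in double precision, whereas real MCA randomizes IEEE rounding and inexact operands at the actual working precision:

```python
import random
import statistics

def mc_add(x, y, ulp=1e-12):
    """Toy MCA addition: perturb the sum by a random relative error of
    roughly one ulp, i.e. allow the delta in fl[x + y] = (x + y)(1 + delta)
    to be a random variable instead of a deterministic rounding error."""
    return (x + y) * (1.0 + random.uniform(-0.5, 0.5) * ulp)

def mc_sum(values, n_runs=200):
    """Re-run a summation n_runs times under randomized rounding.

    The sample mean estimates the exact result; the sample standard
    deviation estimates how sensitive this particular computation is to
    rounding (the 'error in the answer of any single run').
    """
    runs = []
    for _ in range(n_runs):
        acc = 0.0
        for v in values:
            acc = mc_add(acc, v)
        runs.append(acc)
    return statistics.mean(runs), statistics.stdev(runs)

mean, spread = mc_sum([1.0] * 100)   # mean near 100, spread near the ulp scale
```

A numerically unstable program would show a much larger `spread` relative to `mean`, which is exactly the empirical stability signal the article advocates.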
Cerebral palsy | Cerebral palsy is the most common cause of childhood-onset, lifelong physical disability in most countries, affecting about 1 in 500 neonates with an estimated prevalence of 17 million people worldwide. Cerebral palsy is not a disease entity in the traditional sense but a clinical description of children who share features of a non-progressive brain injury or lesion acquired during the antenatal, perinatal or early postnatal period. The clinical manifestations of cerebral palsy vary greatly in the type of movement disorder, the degree of functional ability and limitation and the affected parts of the body. There is currently no cure, but progress is being made in both the prevention and the amelioration of the brain injury. For example, administration of magnesium sulfate during premature labour and cooling of high-risk infants can reduce the rate and severity of cerebral palsy. Although the disorder affects individuals throughout their lifetime, most cerebral palsy research efforts and management strategies currently focus on the needs of children. Clinical management of children with cerebral palsy is directed towards maximizing function and participation in activities and minimizing the effects of the factors that can make the condition worse, such as epilepsy, feeding challenges, hip dislocation and scoliosis. These management strategies include enhancing neurological function during early development; managing medical co-morbidities, weakness and hypertonia; using rehabilitation technologies to enhance motor function; and preventing secondary musculoskeletal problems. Meeting the needs of people with cerebral palsy in resource-poor settings is particularly challenging. |
Energy-efficient radio resource and power allocation for device-to-device communication underlaying cellular networks | Device-to-device (D2D) communication underlaying cellular networks brings significant benefits to resource utilization, improving users' throughput and extending the battery life of user equipment. However, the allocation of radio resources and power to D2D communication needs elaborate coordination, as D2D communication causes interference to cellular networks. In this paper, we propose a novel joint radio resource and power allocation scheme to improve the performance of the system in the uplink period. Energy efficiency is considered as our optimization objective since devices are handheld equipment with limited battery life. We formulate the allocation problem as a reverse iterative combinatorial auction game, in which radio resources occupied by cellular users are considered as bidders competing for D2D packages and their corresponding transmit power, and we propose an algorithm to solve it. We also perform numerical simulations to prove the efficacy of the proposed algorithm. |
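To make the allocation objective concrete, here is a much-simplified greedy stand-in for the paper's reverse iterative combinatorial auction: resources and D2D pairs are matched one-to-one in order of energy efficiency. The efficiency matrix values are hypothetical, and a real combinatorial auction handles package bids and iterative price updates that this greedy pass does not:

```python
def greedy_allocate(efficiency):
    """Assign each cellular resource to at most one D2D pair, greedily
    maximizing energy efficiency.

    efficiency[r][d]: hypothetical energy efficiency (e.g. bits/Joule)
    achieved when resource r, occupied by a cellular user, is shared with
    D2D pair d at its corresponding transmit power.
    """
    ranked = sorted(((e, r, d)
                     for r, row in enumerate(efficiency)
                     for d, e in enumerate(row)), reverse=True)
    used_r, used_d, alloc = set(), set(), {}
    for e, r, d in ranked:
        if r not in used_r and d not in used_d:
            alloc[r] = d          # best remaining (resource, pair) match
            used_r.add(r)
            used_d.add(d)
    return alloc

# Two cellular resources, two D2D pairs.
print(greedy_allocate([[3.0, 1.0], [2.0, 5.0]]))   # {1: 1, 0: 0}
```

Even this crude matching shows the trade-off the auction formalizes: taking the single best pairing first can force later resources onto lower-efficiency partners.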
The effect of body-mind relaxation meditation induction on major depressive disorder: A resting-state fMRI study. | BACKGROUND
Meditation has been increasingly evaluated as an important complementary therapeutic tool for the treatment of depression. The present study employed resting-state functional magnetic resonance imaging (rs-fMRI) to examine the effect of body-mind relaxation meditation induction (BMRMI) on the brain activity of depressed patients and to investigate possible mechanisms of action for this complex intervention.
METHOD
21 major depressive disorder patients (MDDs) and 24 age- and gender-matched healthy controls (HCs) received rs-fMRI scans at baseline and after listening to an audio selection designed to induce body-mind relaxation meditation. The rs-fMRI data were analyzed using a MATLAB toolbox to obtain the amplitude of low-frequency fluctuations (ALFF) of the BOLD signal for the whole brain. A mixed-design repeated measures analysis of variance (ANOVA) was performed on the whole brain to find which brain regions were affected by the BMRMI. An additional functional connectivity analysis was used to identify any atypical connection patterns after the BMRMI.
RESULTS
After the BMRMI experience, both the MDDs and HCs showed decreased ALFF values in the bilateral frontal pole (BA10). Additionally, increased functional connectivity from the right dorsal medial prefrontal cortex (dmPFC) to the left dorsal lateral prefrontal cortex (dlPFC) and the left lateral orbitofrontal cortex (OFC) was identified only in the MDDs after the BMRMI.
LIMITATION
In order to exclude the impact of other events on the participants' brain activity, the Hamilton Rating Scale for Depression (HDRS) was not administered after the body-mind relaxation induction.
CONCLUSION
Our findings support the hypothesis that body-mind relaxation meditation induction may regulate the activities of the prefrontal cortex and thus may have the potential to help patients construct reappraisal strategies that can modulate the brain activity in multiple emotion-processing systems. |
Remembering through lifelogging: A survey of human memory augmentation | Human memory is unquestionably a vital cognitive ability but one that can often be unreliable. External memory aids such as diaries, photos, alarms and calendars are often employed to assist in remembering important events in our past and future. The recent trend for lifelogging, continuously documenting one's life through wearable sensors and cameras, presents a clear opportunity to augment human memory beyond simple reminders and actually improve its capacity to remember. This article surveys work from the fields of computer science and psychology to understand the potential for such augmentation and the technologies necessary for realising this opportunity, and to investigate the possible benefits and ethical pitfalls of using such technology. |
Time series classification with ensembles of elastic distance measures | Several alternative distance measures for comparing time series have recently been proposed and evaluated on time series classification (TSC) problems. These include variants of dynamic time warping (DTW), such as weighted and derivative DTW, and edit distance-based measures, including longest common subsequence, edit distance with real penalty, time warp with edit, and move–split–merge. These measures have the common characteristic that they operate in the time domain and compensate for potential localised misalignment through some elastic adjustment. Our aim is to experimentally test two hypotheses related to these distance measures. Firstly, we test whether there is any significant difference in accuracy for TSC problems between nearest neighbour classifiers using these distance measures. Secondly, we test whether combining these elastic distance measures through simple ensemble schemes gives significantly better accuracy. We test these hypotheses by carrying out one of the largest experimental studies ever conducted into time series classification. Our first key finding is that there is no significant difference between the elastic distance measures in terms of classification accuracy on our data sets. Our second finding, and the major contribution of this work, is to define an ensemble classifier that significantly outperforms the individual classifiers. We also demonstrate that the ensemble is more accurate than approaches not based in the time domain. Nearly all TSC papers in the data mining literature cite DTW (with warping window set through cross validation) as the benchmark for comparison. We believe that our ensemble is the first ever classifier to significantly outperform DTW and as such raises the bar for future work in this area. |
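The ensemble idea described in this abstract can be illustrated with a minimal Python sketch (a toy, not the authors' implementation: full-window DTW without the cross-validated warping window, plus plain Euclidean distance standing in for the other elastic measures, combined by an equal-weight vote over 1-NN predictions):

```python
# Toy sketch of a 1-NN ensemble over two distance measures.
# Measure names and the equal-weight vote are illustrative assumptions.

def dtw(a, b):
    """Full-window dynamic time warping distance (squared-error cost)."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def euclidean(a, b):
    """Squared Euclidean distance between equal-length series."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nn_predict(train, query, dist):
    """1-nearest-neighbour label under a given distance measure.
    `train` is a list of (series, label) pairs."""
    return min(train, key=lambda item: dist(item[0], query))[1]

def ensemble_predict(train, query, measures):
    """Equal-weight majority vote over the 1-NN prediction of each measure."""
    votes = [nn_predict(train, query, d) for d in measures]
    return max(set(votes), key=votes.count)
```

A call such as `ensemble_predict(train, query, [dtw, euclidean])` returns the majority label; the paper's actual ensemble combines many more elastic measures over far larger datasets.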
Reduction and IR-drop compensation techniques for reliable neuromorphic computing systems | The neuromorphic computing system (NCS) is a promising architecture to combat the well-known memory bottleneck of the Von Neumann architecture. Recent breakthroughs in memristor devices have made an important step toward realizing a low-power, small-footprint NCS on a chip. However, the currently low manufacturing reliability of nano-devices and the voltage IR-drop along metal wires and memristor arrays severely limit the scale of memristor-crossbar-based NCS and hinder design scalability. In this work, we propose a novel system reduction scheme that significantly lowers the required dimension of the memristor crossbars in an NCS while maintaining high computing accuracy. An IR-drop compensation technique is also proposed to overcome the adverse impacts of wire resistance and the sneak-path problem in large memristor crossbar designs. Our simulation results show that the proposed techniques can improve computing accuracy by 27.0% and reduce circuit area by 38.7% compared to the original NCS design. |
Spanish version of the Parkinson’s Disease Questionnaire–Carer (PDQ-Carer) | BACKGROUND
Parkinson's caregivers are frequently affected by a range of physical and psychological factors affecting the quality of life (QoL) of both patients and caregivers. However, while there are well-validated QoL instruments for patients, few specific measures have been developed for caregivers of patients with PD. This study examined the psychometric properties of the Spanish version of the Parkinson's Disease Questionnaire-Carer (PDQ-Carer) for use in PD caregivers.
METHODS
The PDQ-Carer and the Short Form-36 Health Survey (SF-36) were completed by a sample of 73 caregivers of patients with PD in Spain (71.8% female; 63.6 ± 12.3 years old).
RESULTS
Psychometric analysis confirmed the reliability and validity of the Spanish version of the PDQ-Carer. The internal consistency was found to be satisfactory for the four PDQ-Carer domains (Personal and Social Activities, Depression and Anxiety, Self-care, and Stress), with Cronbach's alpha values ranging from 0.80 to 0.95. The PDQ-Carer was significantly correlated with the eight SF-36 domains (r = -0.31 to -0.59, p < 0.001), supporting the concurrent validity of the instrument.
CONCLUSIONS
Overall, these results provide preliminary evidence of the utility of the Spanish version of the PDQ-Carer in non-professional caregivers. |
A Distributed Test System Architecture for Open-source IoT Software | In this paper, we discuss challenges that are specific to testing of open IoT software systems. The analysis reveals gaps compared to wireless sensor networks as well as embedded software. We propose a testing framework which (a) supports continuous integration techniques, (b) allows for the integration of project contributors to volunteer hardware and software resources to the test system, and (c) can function as a permanent distributed plugtest for network interoperability testing. The focus of this paper lies in open-source IoT development but many aspects are also applicable to closed-source projects. |
Corrective 3D reconstruction of lips from monocular video | In facial animation, the accurate shape and motion of the lips of virtual humans is of paramount importance, since subtle nuances in mouth expression strongly influence the interpretation of speech and the conveyed emotion. Unfortunately, passive photometric reconstruction of expressive lip motions, such as a kiss or rolling lips, is fundamentally hard even with multi-view methods in controlled studios. To alleviate this problem, we present a novel approach for fully automatic reconstruction of detailed and expressive lip shapes along with the dense geometry of the entire face, from just monocular RGB video. To this end, we learn the difference between inaccurate lip shapes found by a state-of-the-art monocular facial performance capture approach, and the true 3D lip shapes reconstructed using a high-quality multi-view system in combination with applied lip tattoos that are easy to track. A robust gradient domain regressor is trained to infer accurate lip shapes from coarse monocular reconstructions, with the additional help of automatically extracted inner and outer 2D lip contours. We quantitatively and qualitatively show that our monocular approach reconstructs higher quality lip shapes, even for complex shapes like a kiss or lip rolling, than previous monocular approaches. Furthermore, we compare the performance of person-specific and multi-person generic regression strategies and show that our approach generalizes to new individuals and general scenes, enabling high-fidelity reconstruction even from commodity video footage. |
The incidence of hypoglycaemia in children with type 1 diabetes and treated asthma. | AIMS
To investigate whether treatment of coexisting asthma has any effect on the incidence of hypoglycaemia and on glycaemic control in children with type 1 diabetes.
METHODS
An observational study of children attending the paediatric diabetes clinics of five hospitals in the North Trent Region. Information on the frequency of hypoglycaemia in the preceding three months, treatment for asthma, and the individual's latest HbA1c, was recorded when they attended for review.
RESULTS
Data were collected on 226 children, of whom 27 (12%) had treated asthma. Only 11/27 children with asthma were taking their prescribed inhaled steroids. All used beta agonists at least once a week. There was a reduction of 20% in the incidence of hypoglycaemia in the diabetic children with treated asthma. Of the children with diabetes and treated asthma, 52% reported an episode of hypoglycaemia in the previous three months compared to 72% of those with only diabetes. There was no difference in the proportion of children experiencing nocturnal or severe hypoglycaemia. Although not significant, those with asthma and diabetes also had better overall control (HbA1c 8.8%) compared to those with diabetes alone (HbA1c 9.3%).
CONCLUSIONS
Diabetic children with treated asthma have significantly fewer episodes of hypoglycaemia and better glycaemic control compared to children with diabetes alone. This observation needs further investigation but raises an interesting question. Do the drugs used to treat asthma, in particular beta agonists, have the therapeutic potential to reduce hypoglycaemia and facilitate an improvement in glycaemic control? |
High frequency active-clamp buck converter for low power automotive applications | Low power (5 W-25 W) automotive DC-DC converters have a very wide input voltage range from 4.5 V to 42 V and are usually operated at high switching frequency in order to comply with the strict CISPR-25 standard for EMI performance. The conventional buck converter is currently employed for such applications owing to its simplicity and low cost, but it has low efficiency at high switching frequencies and high EMI emission due to hard switching. To solve these issues, an active-clamp buck converter is proposed in this paper, which features zero-voltage-switching (ZVS) and hence high efficiency and low EMI emission over a wide input voltage range, making it suitable for low power automotive applications. The operating principles including the ZVS mechanism, detailed design considerations and experimental results from a 1 MHz prototype are presented. |
Traumatic hemipelvectomy with free gluteus maximus fillet flap covers: a case report. | ABSTRACT
Traumatic hemipelvectomy is an uncommon and life-threatening injury. We report a case of a 16-year-old boy involved in a traffic accident who presented with an almost circumferential pelvic wound with wide diastasis of the right sacroiliac joint and symphysis pubis. The injury was associated with complete avulsion of the external and internal iliac vessels as well as the femoral and sciatic nerves. He also had ipsilateral open comminuted fractures of the femur and tibia. Emergency debridement and completion of the amputation with preservation of the posterior gluteal flap and primary anastomosis of the inferior gluteal vessels to the internal iliac artery stump were performed. A free fillet flap was used to close the massive exposed area.
KEY WORDS
traumatic hemipelvectomy, amputation, and free gluteus maximus fillet flap. |
Italian colonialism : legacy and memory | Until relatively recently, the Italian colonial experience was largely regarded as an incidental aspect of Italy's past. Studies of liberal Italy and even fascism underplayed both the significance of the state's colonial ambition and its broader cultural impact. In the post-war era, even less consideration has been given to how this colonial legacy still affects Italy and the countries it occupied and colonized. This book arises out of a major two-day international conference held at the Italian Cultural Institute in London in December 2001. The essays investigate the ways in which the Italian colonial experience continues to be relevant even after the end of empire. They explore the ways in which the memories of Italy's colonial past have been crafted to accommodate the needs of the present and the extent to which forgetting colonialism became an integral part of Italian culture and national identity. These issues have come into sharper relief of late as labour migration to Italy has led to new social and cultural encounters within Italy. The essays additionally investigate the colonial legacy from the perspective of Italy's former colonies, highlighting the enduring social, cultural and political ramifications of the colonial relationship. This interdisciplinary collection contains contributions from international experts in the fields of history, cultural studies (literature and film), politics and sociology. |
Recommendations for designers and researchers resulting from the world-wide failure exercise | The World-Wide Failure Exercise (WWFE) contained a detailed assessment of 19 theoretical approaches for predicting the deformation and failure response of polymer composite laminates when subjected to complex states of stress. The leading five theories are explored in greater detail to demonstrate their strengths and weaknesses in predicting various types of structural failure. Recommendations are then derived, as to how the theories can be best utilised to provide safe and economic predictions in a wide range of engineering design applications. Further guidance is provided for designers on the level of confidence and bounds of applicability of the current theories. The need for careful interpretation of initial failure predictions is emphasised, as is the need to allow for multiple sources of non-linearity (including progressive damage) where accuracy is sought for certain classes of large deformation and final failure strength predictions. Aspects requiring further experimental and theoretical investigation are identified. Direction is also provided to the research community by highlighting specific, tightly focussed, experimental and theoretical studies that, if carried out in the very near future, would pay great dividends from the designer's perspective, by increasing their confidence in the theoretical foundations. © 2003 QinetiQ Ltd. Published by Elsevier Ltd. All rights reserved. |
Self-disclosure and Privacy Calculus on Social Networking Sites: The Role of Culture - Intercultural Dynamics of Privacy Calculus | Social Network Sites (SNSs) rely exclusively on user-generated content to offer an engaging and rewarding experience to their members. As a result, stimulating user communication and self-disclosure is vital for the sustainability of SNSs. However, considering that SNS users are increasingly culturally diverse, motivating this audience to self-disclose requires understanding of their cultural intricacies. Yet existing research offers only limited insights into the role of culture behind the motivation of SNS users to self-disclose. Building on the privacy calculus framework, this study explores the role of two cultural dimensions – individualism and uncertainty avoidance – in the self-disclosure decisions of SNS users. Survey responses of US and German Facebook members are used as the basis for our analysis. Structural equation modeling and multi-group analysis results reveal the distinct role of culture in the cognitive patterns of SNS users. The authors find that trusting beliefs play a key role in the self-disclosure decisions of users from individualistic cultures. At the same time, uncertainty avoidance determines the impact of privacy concerns. This paper contributes to the theory by rejecting the universal nature of privacy calculus processes. The findings provide for an array of managerial implications for SNS providers as they strive to encourage content creation and sharing by their heterogeneous members. |
Emotional intelligence as a moderator in the stress-burnout relationship: a questionnaire study on nurses. | AIMS AND OBJECTIVES
To investigate inter-relationships between emotional intelligence (EI), work stress and burnout in a group of nurses in the Western Cape Province, South Africa. The moderating effect of EI in the stress-burnout relationship and group differences (nurses working in different wards) in burnout were also investigated.
BACKGROUND
Stress and subsequent burnout commonly threaten the occupational health and well-being of nurses in South Africa and elsewhere. Developing EI in nurses may increase individual stress resistance and combat burnout.
DESIGN
A cross-sectional research design with anonymous questionnaires was conducted. Self-report data were used.
METHODS
Survey data were collected from 122 nurses working in different wards at four hospitals from a private hospital group. The Swinburne University Emotional Intelligence Test, Sources of Work Stress Inventory and Maslach Burnout Inventory were used to measure EI, stress and burnout, respectively.
RESULTS
Consistent inverse relationships between emotional control and management as dimensions of EI, and stress and burnout emerged. A differential effect of high vs. low EI on the stress-burnout relationship was evident. Workload and the work/family interface emerged as significant predictors of burnout. Respondents working in maternity, paediatric and ER wards reported more feelings of personal accomplishment than those working in general wards.
CONCLUSIONS
Higher EI is significantly related with lower stress and burnout in a sample of South African nurses. The moderator effect of EI in the stress-burnout relationship suggests that enhanced EI may help diminish burnout development when chronic stress is experienced.
RELEVANCE TO CLINICAL PRACTICE
EI developmental interventions, if introduced in nursing curricula, may increase emotional coping resources and enhanced social skills, which may benefit the long-term occupational health of nurses. This may be relevant in developing countries, where environmental stressors related to the organisational context (budget constraints) and wider social factors (shortage of qualified nurses) are difficult to address. |
The Christie hospital adjuvant tamoxifen trial--status at 10 years. | From November 1976 to June 1982, a randomised clinical trial was carried out at the Christie Hospital, Manchester, to test the clinical efficacy of tamoxifen (TAM) as an adjuvant to surgery for patients with operable breast carcinoma. Following surgery, premenopausal women were randomly allocated to receive either TAM 20 mg/day for one year or an irradiation menopause (the previous standard treatment). Postmenopausal women received TAM 20 mg/day for one year or no further treatment (controls). A total of 1005 patients were entered into the trial, of whom 961 were evaluable at 10 years from inception. At 10 years the analysis shows no significant difference in overall and disease-free survival between premenopausal women given TAM or an irradiation menopause. For premenopausal node-negative patients there would appear to be a trend in favour of the TAM-treated patients, with a 93% ten-year survival vs. 82% for the irradiation menopause group (P = 0.09). When the disease-free survival of all 961 patients is analysed, allowing for node status, there is a marked trend in favour of the TAM-treated patients (P = 0.07). Of the patients originally allocated to TAM, 47% had an irradiation menopause on relapse, and 73% of the postmenopausal control patients had TAM on relapse. The incidence of side effects and second primary tumours is discussed, as well as the possible effects of varying the length of time over which adjuvant TAM is administered. |
Utilizing Purchase Intervals in Latent Clusters for Product Recommendation | Collaborative filtering has become increasingly important with the development of Web 2.0. Online shopping service providers aim to provide users with quality lists of recommended items that will enhance user satisfaction and loyalty. Matrix factorization approaches have become the dominant method as they can reduce the dimension of the data set and alleviate the sparsity problem. However, matrix factorization approaches are limited because they depict each user as one preference vector. In practice, we observe that users may have different preferences when purchasing different subsets of items, and the periods between purchases also vary from one user to another. In this work, we propose a probabilistic approach to learn latent clusters in the large user-item matrix, and incorporate temporal information into the recommendation process. Experimental results on a real world dataset demonstrate that our approach significantly improves the conversion rate, precision and recall of state-of-the-art methods. |
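The role of purchase intervals can be illustrated with a minimal sketch (a hypothetical scoring rule for illustration, not the paper's probabilistic cluster model): an item's repurchase score rises as the time elapsed since the user's last purchase approaches that user's typical inter-purchase interval for the item.

```python
# Illustrative sketch: temporal "due for repurchase" scoring.
# The linear ramp and the cap at 1.0 are assumptions made for this example.
from datetime import date

def mean_interval(purchase_dates):
    """Average gap in days between consecutive purchases of one item."""
    ds = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(ds, ds[1:])]
    return sum(gaps) / len(gaps) if gaps else None

def repurchase_score(purchase_dates, today):
    """Score ramps up as elapsed time approaches the user's typical
    interval; items with a single purchase carry no interval signal."""
    interval = mean_interval(purchase_dates)
    if interval is None:
        return 0.0
    elapsed = (today - max(purchase_dates)).days
    return min(elapsed / interval, 1.0)  # capped at 1.0 once "due"
```

In a full recommender, such a temporal score would be combined with the preference score from the latent-cluster factorization rather than used on its own.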
Two dimensional model of a permanent magnet spur gear | This paper extends an analysis method developed for a radially magnetized spur gear. The extension describes how a parallel magnetized spur gear can be modeled analytically. The analytical method for the parallel magnetized gear is verified with the finite element method, which shows good agreement. The results for the parallel magnetized gear are also compared with results from the previously developed analytical method for the radially magnetized gear, and the parallel magnetized version turns out to give better performance. A test model was also built to verify the theoretical calculations. |
Semiotic schemas: A framework for grounding language in action and perception | A theoretical framework for grounding language is introduced that provides a computational path from sensing and motor action to words and speech acts. The approach combines concepts from semiotics and schema theory to develop a holistic approach to linguistic meaning. Schemas serve as structured beliefs that are grounded in an agent's physical environment through a causal-predictive cycle of action and perception. Words and basic speech acts are interpreted in terms of grounded schemas. The framework reflects lessons learned from implementations of several language processing robots. It provides a basis for the analysis and design of situated, multimodal communication systems that straddle symbolic and non-symbolic realms. © 2005 Published by Elsevier B.V. |
A forensic evidence recovery from mobile device applications | In the recent past, there have been many research advancements in mobile forensics tools, owing to the increased use of mobile phones for information storage, law enforcement and mobile online transactions, and also to their negative use by criminals enabled by increased computational capabilities. Mobile device forensics remains a very challenging task due to poor user data retrieval techniques for evidence recovery. Recently, third-party applications have assumed a prominent role because they are supported by the majority of mobile device platforms, making it easy to extract their users' information for future criminal audit. This paper proposes an evidence data retrieval method for the Instagram app using two network-based platforms [that is, pure peer-to-peer (PPP) and special cluster peer (SCP) based], whose concept is to manage mobile device communication and generate multiple copies of users' data/information to be dumped across three servers. The forensic test results were obtained from the PPP and SCP platforms developed to securely extract data from mobile devices. They show that SCP outperformed PPP in terms of the time taken to fulfil a forensic auditor's request, throughput and broadband utilisation, which are 42.82% to 57.18%, 56.81% to 43.19% and 35.41% to 64.53% respectively. |
The Discipline and Practice of Qualitative Research | Writing about scientific research, including qualitative research, from the vantage point of the colonized, a position that she chooses to privilege, Linda Tuhiwai Smith (1999) states that “the term ‘research’ is inextricably linked to European imperialism and colonialism.” She continues, “The word itself is probably one of the dirtiest words in the indigenous world’s vocabulary. . . . It is implicated in the worst excesses of colonialism,” with the ways in which “knowledge about indigenous peoples was collected, classified, and then represented back to the West” (p. 1). This dirty word stirs up anger, silence, distrust. “It is so powerful that indigenous people even write poetry about research” (p. 1). It is one of colonialism’s most sordid legacies. Sadly, qualitative research, in many if not all of its forms (observation, participation, interviewing, ethnography), serves as a metaphor for colonial knowledge, for power, and for truth. The metaphor works this way. Research, quantitative and qualitative, is scientific. Research provides the foundation for reports about and representations of “the Other.” In the colonial context, research becomes an objective way of representing the dark-skinned Other to the white world. Colonizing nations relied on the human disciplines, especially sociology and anthropology, to produce knowledge about strange and foreign worlds. This close involvement with the colonial project contributed, in significant ways, to qualitative research’s long and anguished history, to its becoming a dirty word (for reviews, see in this volume Foley & Valenzuela, Chapter 9; Tedlock, Chapter 18). In sociology, the work of the “Chicago school” in the 1920s and 1930s established the importance of qualitative inquiry for the study of human group life. 
In anthropology during the same period, the discipline-defining studies of Boas, Mead, Benedict, Bateson, Evans-Pritchard, Radcliffe-Brown, and Malinowski charted the outlines of the fieldwork method (see Gupta & Ferguson, 1997; Stocking, 1986, 1989). |
Prenatal maternal stress: effects on pregnancy and the (unborn) child. | BACKGROUND
Animal experiments have convincingly demonstrated that prenatal maternal stress affects pregnancy outcome and results in early programming of brain functions with permanent changes in neuroendocrine regulation and behaviour in offspring.
AIM
To evaluate the existing evidence of comparable effects of prenatal stress on human pregnancy and child development.
STUDY DESIGN
Data sources used included a computerized literature search of PubMed (1966-2001) and PsycLIT (1987-2001), and a manual search of bibliographies of pertinent articles.
RESULTS
Recent well-controlled human studies indicate that pregnant women with high stress and anxiety levels are at increased risk for spontaneous abortion and preterm labour and for having a malformed or growth-retarded baby (reduced head circumference in particular). Evidence of long-term functional disorders after prenatal exposure to stress is limited, but retrospective studies and two prospective studies support the possibility of such effects. A comprehensive model of putative interrelationships between maternal, placental, and fetal factors is presented.
CONCLUSIONS
Apart from the well-known negative effects of biomedical risks, maternal psychological factors may significantly contribute to pregnancy complications and unfavourable development of the (unborn) child. These problems might be reduced by specific stress reduction in high anxious pregnant women, although much more research is needed. |
Humanoids and personal robots: Design and experiments | This paper addresses the field of humanoid and personal robotics—its objectives, motivations, and technical problems. The approach described in the paper is based on the analysis of humanoid and personal robots as an evolution from industrial to advanced and service robotics driven by the need for helpful machines, as well as a synthesis of the dream of replicating humans. The first part of the paper describes the development of anthropomorphic components for humanoid robots, with particular regard to anthropomorphic sensors for vision and touch, an eight-d.o.f. arm, a three-fingered hand with sensorized fingertips, and control schemes for grasping. Then, the authors propose a user-oriented design methodology for personal robots, and describe their experience in the design, development, and validation of a real personal robot composed of a mobile unit integrating some of the anthropomorphic components introduced previously and aimed at operating in a distributed working environment. Based on the analysis of experimental results, the authors conclude that humanoid robotics is a tremendous and attractive technical and scientific challenge for robotics research. The real utility of humanoids has still to be demonstrated, but personal assistance can be envisaged as a promising application domain. Personal robotics also poses difficult technical problems, especially related to the need for achieving adequate safety, proper human–robot interaction, useful performance, and affordable cost. When these problems are solved, personal robots will have an excellent chance for significant application opportunities, especially if integrated into future home automation systems, and if supported by the availability of humanoid robots. © 2001 John Wiley & Sons, Inc. |
The Architectural Treatise of the Italian Renaissance: Architectural Invention, Ornament, and Literary Culture | 1. Of archaeology and license 2. Vitruvius 3. Literary grids and artistic intersections 4. Alberti 5. Francesco di Giorgio Martini 6. Serlio and the theoretization of ornament 7. Spini and the Architectural Imitatio 8. Palladio and aesthetic necessità 9. Scamozzi and Gesamttheorie. |
The Necessary Dream: A Study of the Novels of Manuel Puig | The Latin American novelist Manuel Puig is perhaps best known for his novel Kiss of the Spider Woman. The Necessary Dream provides an introduction to and interpretation of his seven novels written from 1968 to 1982. While each novel is given a separate chapter, the homogeneous thread of attitudes and themes, which touch on psychology, feminism, Argentine politics and popular culture, is clearly displayed. Contents: Introduction; 'La Vie est ailleurs': La traición de Rita Hayworth (1968); 'The Rules of the Game': Boquitas pintadas (1969); 'The Divided Self': The Buenos Aires Affair (1973); 'The Kiss of Death': El beso de la mujer araña (1976); 'Only Make-Believe': Pubis angelical (1979); 'Les Liaisons dangereuses': Maldición eterna a quien lea estas páginas (1980); 'Life's a Dream': Sangre de amor correspondido (1982); Notes; Bibliography; Index |
On What We See | This paper investigates the idea that perception can be, at once, a mode of direct awareness of the world and an encounter, in the first instance, with mere appearances. In developing this point, I introduce a sensorimotor account of perception according to which the senses are ways of exploring the environment mediated by different patterns of sensorimotor contingency (i.e. by the distinctive ways in which what the perceiver does affects how things appear). The world we live in is the world of sense-data; but the world we talk about is the world of |
Adversarial Learning for Image Forensics Deep Matching with Atrous Convolution | Constrained image splicing detection and localization (CISDL) is a newly proposed challenging task for image forensics, which investigates two input suspected images and identifies whether one image has suspected regions pasted from the other. In this paper, we propose a novel adversarial learning framework to train the deep matching network for CISDL. Our framework mainly consists of three building blocks: 1) the deep matching network based on atrous convolution (DMAC), which aims to generate two high-quality candidate masks indicating the suspected regions of the two input images; 2) the detection network, which is designed to rectify inconsistencies between the two corresponding candidate masks; and 3) the discriminative network, which drives the DMAC network to produce masks that are hard to distinguish from ground-truth ones. In DMAC, atrous convolution is adopted to extract features with rich spatial information, a correlation layer based on the skip architecture is proposed to capture hierarchical features, and atrous spatial pyramid pooling is constructed to localize tampered regions at multiple scales. The detection network and the discriminative network act as losses with auxiliary parameters to supervise the training of DMAC in an adversarial way. Extensive experiments, conducted on 21 generated testing sets and two public datasets, demonstrate the effectiveness of the proposed framework and the superior performance of DMAC. |
Functional neuroanatomy of human rapid-eye-movement sleep and dreaming | RAPID-EYE-MOVEMENT (REM) sleep is associated with intense neuronal activity, ocular saccades, muscular atonia and dreaming [1, 2]. The function of REM sleep remains elusive and its neural correlates have not been characterized precisely in man. Here we use positron emission tomography and statistical parametric mapping to study the brain state associated with REM sleep in humans. We report a group study of seven subjects who maintained steady REM sleep during brain scanning and recalled dreams upon awakening. The results show that regional cerebral blood flow is positively correlated with REM sleep in pontine tegmentum, left thalamus, both amygdaloid complexes, anterior cingulate cortex and right parietal operculum. Negative correlations between regional cerebral blood flow and REM sleep are observed bilaterally, in a vast area of dorsolateral prefrontal cortex, in parietal cortex (supramarginal gyrus) as well as in posterior cingulate cortex and precuneus. Given the role of the amygdaloid complexes in the acquisition of emotionally influenced memories, the pattern of activation in the amygdala and the cortical areas provides a biological basis for the processing of some types of memory during REM sleep. |