Two-Stage Method for Large-Scale Acquisition of Contradiction Pattern Pairs using Entailment
In this paper we propose a two-stage method for acquiring contradiction relations between typed lexico-syntactic patterns such as "X_drug prevents Y_disease" and "Y_disease caused by X_drug". In the first stage, we train an SVM classifier to detect contradiction pattern pairs in a large web archive by exploiting the excitation polarity (Hashimoto et al., 2012) of the patterns. In the second stage, we enlarge the first-stage classifier's training data with new contradiction pairs obtained by combining the output of the first-stage classifier with that of an entailment classifier. In this way we acquired 750,000 typed Japanese contradiction pattern pairs with an estimated precision of 80%. We plan to release this resource to the NLP community.
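To make the two-stage idea concrete, here is a minimal sketch in Python, assuming pattern pairs are already featurized as vectors; the combination rule and all names (X_seed, entailment_clf, the threshold) are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.svm import SVC

def two_stage_training(X_seed, y_seed, X_unlabeled, entailment_clf, threshold=0.9):
    """Stage 1: train an SVM on seed contradiction pairs.
    Stage 2: add unlabeled pairs that the stage-1 classifier scores confidently
    and the entailment classifier rejects, then retrain on the enlarged data."""
    stage1 = SVC(probability=True).fit(X_seed, y_seed)

    p_contra = stage1.predict_proba(X_unlabeled)[:, 1]
    p_entail = entailment_clf.predict_proba(X_unlabeled)[:, 1]
    # Assumed combination rule: a confident contradiction should not also be
    # a confident entailment.
    keep = (p_contra > threshold) & (p_entail < 1.0 - threshold)

    X_aug = np.vstack([X_seed, X_unlabeled[keep]])
    y_aug = np.concatenate([y_seed, np.ones(keep.sum(), dtype=int)])
    return SVC(probability=True).fit(X_aug, y_aug)
```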
High-density MIM capacitors using AlTaOx dielectrics
The authors have obtained good MIM capacitor integrity with a high capacitance density of 10 fF/μm² using high-κ AlTaOx fabricated at 400 °C. In addition, a small voltage dependence of capacitance of <600 ppm (quadratic voltage coefficient of only 130 ppm/V²) is obtained at 1 GHz using their mathematical derivation from measured high-frequency S-parameters. These results show that high-κ AlTaOx MIM capacitor technology is useful for high-precision circuits operating in the RF frequency regime.
Local shape feature fusion for improved matching, pose estimation and 3D object recognition.
We provide new insights to the problem of shape feature description and matching, techniques that are often applied within 3D object recognition pipelines. We subject several state of the art features to systematic evaluations based on multiple datasets from different sources in a uniform manner. We have carefully prepared and performed a neutral test on the datasets for which the descriptors have shown good recognition performance. Our results expose an important fallacy of previous results, namely that the performance of the recognition system does not correlate well with the performance of the descriptor employed by the recognition system. In addition to this, we evaluate several aspects of the matching task, including the efficiency of the different features, and the potential in using dimension reduction. To arrive at better generalization properties, we introduce a method for fusing several feature matches with a limited processing overhead. Our fused feature matches provide a significant increase in matching accuracy, which is consistent over all tested datasets. Finally, we benchmark all features in a 3D object recognition setting, providing further evidence of the advantage of fused features, both in terms of accuracy and efficiency.
Biaxial-Type Concentrated Solar Tracking System with a Fresnel Lens for Solar-Thermal Applications
In this paper, an electromechanical, biaxial-type concentrated solar tracking system was designed for solar-thermal applications. In our tracking system, sunlight is concentrated by the microstructure of a Fresnel lens onto the heating head of a Stirling engine, and two solar cells are installed to power the tracking system's operation. In order to harvest the maximum solar power, the tracking system traces the sun with the altitude-azimuth biaxial tracking method and accurately keeps the sun's radiation perpendicular to the plane of the heating head. The results indicated that the position of the heating head is an important factor for power collection. If the sunlight can be concentrated to completely cover the heating head with small heat loss, the temperature of the heating head of the Stirling engine is maximized. In our experiment, the temperature of the heating head exceeded 1000 °C on a sunny day. Moreover, the results also revealed that the temperature decrease of the heating head is smaller than the power decrease of solar irradiation, because of the latent heat of copper and the small heat loss from the heating head.
Heart Disease Prediction System Using Data Mining and Hybrid Intelligent Techniques: A Review
Heart disease is one of the leading causes of death worldwide, and it is important to predict the disease at an early stage. Computer-aided systems assist the doctor as tools for predicting and diagnosing heart disease. The objective of this review is to give an overview of heart-related cardiovascular disease and of existing decision support systems for the prediction and diagnosis of heart disease based on data mining and hybrid intelligent techniques.
Soft Mask Methods for Single-Channel Speaker Separation
The problem of single-channel speaker separation attempts to extract a speech signal uttered by the speaker of interest from a signal containing a mixture of acoustic signals. Most algorithms that deal with this problem are based on masking, wherein unreliable frequency components from the mixed signal spectrogram are suppressed, and the reliable components are inverted to obtain the speech signal of the speaker of interest. Most current techniques estimate this mask in a binary fashion, resulting in a hard mask. In this paper, we present two techniques to separate out the speech signal of the speaker of interest from a mixture of speech signals. One technique estimates all the spectral components of the desired speaker. The second technique estimates a soft mask that weights the frequency subbands of the mixed signal. In both cases, the speech signal of the speaker of interest is reconstructed from the complete spectral descriptions obtained. In their native form, these algorithms are computationally expensive. We also present fast factored approximations to the algorithms. Experiments reveal that the proposed algorithms can result in significant enhancement of individual speakers in mixed recordings, consistently achieving better performance than that obtained with hard binary masks.
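To illustrate the hard- vs. soft-mask distinction, the sketch below uses a Wiener-style ratio mask on spectrograms; the paper's own estimators are model-based, so this is only a generic illustration of the masking idea.

```python
import numpy as np

def hard_mask(S_target, S_interf):
    """Binary mask: keep a time-frequency cell only if the target dominates."""
    return (np.abs(S_target) > np.abs(S_interf)).astype(float)

def soft_mask(S_target, S_interf, eps=1e-12):
    """Soft mask: weight each cell by the target's share of the power."""
    p_t = np.abs(S_target) ** 2
    p_i = np.abs(S_interf) ** 2
    return p_t / (p_t + p_i + eps)

# Applying either mask to the mixture spectrogram and inverting (e.g. with an
# inverse STFT) reconstructs an estimate of the target speaker's signal.
```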
Aim-mat: the auditory image model in MATLAB
This paper describes a version of the auditory image model (AIM) [1] implemented in MATLAB. It is referred to as "aim-mat" and it includes the basic modules that enable AIM to simulate the spectral analysis, neural encoding, and temporal integration performed by the auditory system. The dynamic representations produced by non-static sounds can be viewed on a frame-by-frame basis or in movies with synchronized sound. The software has a sophisticated graphical user interface designed to facilitate auditory modelling. It is also possible to add MATLAB code and complete modules to aim-mat. The software can be downloaded from http://www.mrccbu.cam.ac.uk/cnbh/aimmanual
Face Recognition: A Survey
Face recognition has gained a significant position among the most commonly used applications of image processing, and the availability of viable technologies in this field has contributed a great deal to it. In spite of rapid progress, the field still has to overcome various challenges, such as aging, partial occlusion, and facial expressions, that affect the performance of a recognition system; these are covered in the first part of the survey. This part also highlights the most commonly used databases available as standards for face recognition tests; the AT&T, AR, FERET, ORL, and Yale databases are outlined here. The second part of the survey presents a detailed overview of some important existing methods used to deal with the issues of face recognition, including Eigenfaces, Neural Networks (NN), Support Vector Machines (SVM), Gabor wavelets, and Hidden Markov Models (HMM). The last part of the survey discusses several applications of a face recognition system, such as video surveillance, access control, and pervasive computing.
Factors influencing the Cloud Computing adoption in Higher Education Institutions of Punjab, Pakistan
Cloud Computing is one of the most important trends and newest areas in the field of information technology, in which resources (e.g. CPU and storage) can be leased and released by customers through the Internet on an on-demand basis. The adoption of Cloud Computing in education and in developing countries is a real opportunity. Although Cloud Computing has gained popularity in Pakistan, especially in education and industry, its impact in Pakistan is still unexplored, especially in the Higher Education Department. Published work has investigated the factors influencing the adoption of cloud computing, but very few studies have done so in developing countries. The Higher Education Institutions (HEIs) of Punjab, Pakistan have yet to examine cloud adoption factors. In this study, we developed a cloud adoption model for the HEIs of Punjab, and a survey was carried out among 900 students all over Punjab. The survey was designed based upon the literature and on discussions with, and opinions of, academicians. In this paper, 34 hypotheses affecting cloud computing adoption in HEIs were developed and tested using the statistical analysis tools SPSS and SmartPLS. The findings show that 84.44% of students voted in favor of cloud computing adoption in their colleges, while 99% supported Reduced Cost as the most important factor in cloud adoption.
A low-power, 26-GHz transformer-based regulated cascode transimpedance amplifier in 0.25µm SiGe BiCMOS
A 26 GHz transimpedance amplifier (TIA) with a transformer-based regulated cascode (RGC) input stage is proposed and analyzed. The transformer enhances the effective transconductance of the TIA's input common-base transistor, reducing the input resistance and providing considerable bandwidth extension. The TIA is implemented in a 0.25 µm BiCMOS technology. Measurements show a single-ended transimpedance gain of 53 dBΩ with a −3 dB bandwidth of 26 GHz. Total chip power, including an output buffer, is 28.2 mW from a 2.5 V supply, while the core TIA power is 8.2 mW. The measured average input-referred noise current spectral density is 21.3 pA/√Hz. Total chip area, including pads, is 960 µm × 780 µm.
Trade-offs? What trade-offs? Competence and competitiveness in manufacturing strategy
Manufacturing strategy is increasingly recognized by academics as essential in achieving sustainable competitive advantage. The reason it has not yet penetrated equally far into the world of business is partly the lack of simple conceptual foundations. As a result, a number of important observations showing up recently in the manufacturing strategy literature have not captured the attention they deserve, despite their broad implications for managers. This article attempts to clarify some of these issues. It is important to distinguish clearly between internal competences and external measures of competitiveness; ensuring a proper link between the two is a critical factor for success. The points we emphasize are (1) competences don't have to hurt each other, (2) performance relative to the competition is what counts, (3) each product has to meet some minimum requirements to have a chance of selling, and (4) these requirements are continually becoming tougher. We then sketch a scenario where even superior manufacturing may no longer be a source of competitive advantage, but simply a ticket to the ball game. The field of manufacturing strategy has been around for more than 20 years. Over this period the field has advanced considerably as an academic discipline, but practical achievements have been limited to date. The father of the field, Wickham Skinner, claims that this is partly due to the lack of solid conceptual foundations. Indeed, some important recent developments in thinking on manufacturing strategy and their managerial implications have not drawn as much attention as we believe they deserve. In this paper, we bring a number of these developments together and state them more explicitly. In so doing, we hope to clarify them. We also extrapolate the recent developments in question to an extreme-sounding but currently emerging scenario, describing how manufacturing may eventually cease to be a source of competitive advantage, leaving human resources as the ultimate critical factor. In particular, we use these emerging concepts in manufacturing strategy to provide a conceptual basis for the often-heard complaint among managers that "markets are always becoming more competitive" or "margins are continuously getting smaller." What seems to emerge is the concept of a competitive dimensions life-cycle, suggesting that the well-known product-process matrix is losing its validity.
Organizational strategies for promoting patient and provider uptake of personal health records
OBJECTIVE To investigate organizational strategies to promote personal health records (PHRs) adoption with a focus on patients with chronic disease. METHODS Using semi-structured interviews and a web-based survey, we sampled US health delivery organizations which had implemented PHRs for at least 12 months, were recognized as PHR innovators, and had scored highly in national patient satisfaction surveys. Respondents had lead positions for clinical information systems or high-risk population management. Using grounded theory approach, thematic categories were derived from interviews and coupled with data from the survey. RESULTS Interviews were conducted with 30 informants from 16 identified organizations. Organizational strategies were directed towards raising patient awareness via multimedia communications, and provider acceptance and uptake. Strategies for providers were grouped into six main themes: organizational vision, governance and policies, work process redesign, staff training, information technology (IT) support, and monitoring and incentives. Successful organizations actively communicated their vision, engaged leaders at all levels, had clear governance, planning, and protocols, set targets, and celebrated achievement. The most effective strategy for patient uptake was through health professional encouragement. No specific outreach efforts targeted patients with chronic disease. Registration and PHR activity was routinely measured but without reference to a denominator population or high risk subpopulations. DISCUSSION AND CONCLUSION Successful PHR implementation represents a social change and operational project catalyzed by a technical solution. The key to clinician acceptance is making their work easier. However, organizations will likely not achieve the value they want from PHRs unless they target specific populations and monitor their uptake.
Study on the MFCC similarity-based voice activity detection algorithm
The accuracy of voice activity detection (VAD) is one of the most important factors influencing the capability of a speech recognition system, and detecting endpoints precisely in noisy environments is still a difficult task. In this paper, we propose a new VAD method based on Mel-frequency cepstral coefficient (MFCC) similarity. We first extract the MFCC of the voice signal for each frame, then calculate the MFCC Euclidean distance and the MFCC correlation coefficient between the test frame and the background noise, and finally present experimental results. The results show that in low-SNR conditions the MFCC similarity detection method is better than the traditional short-term energy method, and that the correlation coefficient measure outperforms the Euclidean distance measure.
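A minimal sketch of this frame-wise similarity test, assuming the first few frames of the recording contain only background noise; the thresholds are illustrative, and librosa is used only for MFCC extraction.

```python
import numpy as np
import librosa

def mfcc_similarity_vad(y, sr, noise_frames=10, dist_thresh=50.0):
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # shape (13, n_frames)
    noise_ref = mfcc[:, :noise_frames].mean(axis=1)       # average noise MFCC

    speech = []
    for t in range(mfcc.shape[1]):
        frame = mfcc[:, t]
        dist = np.linalg.norm(frame - noise_ref)          # Euclidean distance
        corr = np.corrcoef(frame, noise_ref)[0, 1]        # correlation coeff.
        # A frame far from the noise reference (large distance, low
        # correlation) is labeled as speech; thresholds are illustrative.
        speech.append(dist > dist_thresh and corr < 0.9)
    return np.array(speech)
```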
Policy Recognition in the Abstract Hidden Markov Model
In this paper, we present a method for recognising an agent's behaviour in dynamic, noisy, uncertain domains, and across multiple levels of abstraction. We term this problem on-line plan recognition under uncertainty and view it generally as probabilistic inference on the stochastic process representing the execution of the agent's plan. Our contributions in this paper are twofold. In terms of plan recognition, we propose a novel plan recognition framework based on the Abstract Hidden Markov Model (AHMM) as the plan execution model. In terms of probabilistic inference, we introduce the AHMM, a novel type of stochastic process, provide its dynamic Bayesian network (DBN) structure and analyse the properties of this network model. We describe a novel application of the Rao-Blackwellisation procedure to general DBNs, which allows us to construct hybrid inference methods that take advantage of the tractable sub-structures present in the DBN. When applied to the AHMM, we are able to take advantage of the independence properties inherent in a model of action and plan execution and derive an online probabilistic plan recognition algorithm that scales well with the number of levels in the plan hierarchy. This illustrates that while stochastic models for plan execution can be complex, they exhibit special structures which, if exploited, can lead to efficient plan recognition algorithms.
Stereo-based road boundary tracking for mobile robot navigation
This paper describes a method of stereo-based road boundary tracking for mobile robot navigation. Since sensory evidence for road boundaries might change from place to place, we cannot depend on a single cue but have to use multiple sensory features. The method uses color, edge, and height information obtained from a single stereo camera. To cope with a variety of road types and shapes and that of their changes, we adopt a particle filter in which road boundary hypotheses are represented by particles. The proposed method has been tested in various road scenes and conditions, and verified to be effective for autonomous driving of a mobile robot.
Enalapril versus atenolol in the treatment of hypertensive smokers
A randomised crossover study was done to compare the antihypertensive efficacy of enalapril and atenolol in 45 smoking, hypertensive men. Treatment was started with enalapril 20 mg/d or atenolol 50 mg/d and, if necessary, the doses were doubled after 4 weeks to achieve a sitting diastolic blood pressure ≤ 95 mm Hg, after which hydrochlorothiazide was added, if necessary. Both drugs lowered blood pressure significantly. However, enalapril was more effective in lowering both systolic and diastolic blood pressure; the mean difference was significant after both 4 and 8 weeks in the sitting systolic (11.6 mm Hg and 7.9 mm Hg) and diastolic (3.3 mm Hg and 3.0 mm Hg) pressures and in the erect systolic pressures (8.2 mm Hg and 7.2 mm Hg), and after 8 weeks in the supine systolic pressure, too (8.9 mm Hg). The effect of enalapril was especially marked in moderate (<20 cigarettes/day) smokers. The need for diuretics was also significantly less in the enalapril group. It appears that angiotensin-converting enzyme inhibitors may be superior to β-adrenoceptor blockers in the treatment of hypertensive smoking patients.
Joint subcarrier-pairing and resource allocation for two-way multi-relay OFDM networks
In this paper, we investigate the joint subcarrier-pairing and resource allocation scheme for multi-relay aided two-way relay OFDM networks, where power allocation, subcarriers assignment and relay selection are taken into account. It is assumed that amplify-and-forward relaying protocol is deployed on all relay nodes to assists the information exchange between two sources via orthogonal subchannels. In this case, we formulate an optimization problem to maximize the total end-to-end transmission rate of the system under individual power constraints at each node. The goal is to seek the jointly optimized subcarrier pairing, subcarrier-pair-to-relay selection and power allocation. To solve the problem, we derive an asymptotically optimal scheme by adopting the dual decomposition approach of mixed-integer programming problems. Finally, simulation results are presented to demonstrate the performance of the proposed scheme.
Automatic parking path optimization based on Bezier curve fitting
The autonomous parking system is an intelligent technology for parking a car into a small space, and more and more studies have addressed this issue. However, the paths planned in these studies are not smooth enough. Based on Ackermann steering geometry, a kinematic model of the car is established in this paper, the possibility of collision between the car body and obstacles in the parking space is analyzed, and the bounds of the start point and the collision-free space are then derived. Based on this, the parking trajectory is planned on multiple circular arcs, and a Bézier curve is used to smooth the trajectory, realizing a path of continuous curvature. Finally, we build a simulation environment in MATLAB and use a fuzzy PID control method to track the pre-planned trajectory. The simulation results show that the car can not only avoid obstacles effectively, but also, thanks to the Bézier curve, move smoothly through turning points. The design meets the continuity requirements of parking very well.
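A small sketch of Bézier-based corner smoothing, assuming the planned path is a polyline; the paper smooths a multi-arc trajectory, so this is a simplified illustration of the same principle.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Evaluate a cubic Bézier curve B(t) for t in [0, 1]."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Round the sharp corner a-b-c: start and end halfway along the adjoining
# segments, with both inner control points at the corner b. The result is
# tangent to both segments, so curvature stays continuous at the joins.
a, b, c = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([4.0, 4.0])
p0, p3 = (a + b) / 2, (b + c) / 2
smooth_corner = cubic_bezier(p0, b, b, p3)
```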
Rivastigmine reduces tobacco craving in alcohol-dependent smokers.
INTRODUCTION Although alcohol-dependent smokers represent an important group for applying smoking interventions, a sufficient pharmacotherapy has not been established in this high-risk group so far. METHODS In order to examine the effect of the acetylcholinesterase inhibitor rivastigmine on tobacco dependence, we performed a 12-week, randomized, placebo-controlled trial. 26 alcohol-dependent smokers were randomized to rivastigmine 6 mg/day (n=14) or placebo (n=12). Assessments on addictive behavior included carbon monoxide (CO), severity of tobacco dependence (FTND), daily smoked cigarettes (diaries), and craving for tobacco (QSU) and alcohol (AUQ). RESULTS ANOVA revealed a significant treatment-by-time interaction for tobacco consumption and tobacco craving (each p<0.0001). The rivastigmine group showed a decrease in daily smoked cigarettes (-30%), in exhaled carbon monoxide (-32%) and in tobacco craving (-18%) whereas controls did not show significant changes. ANCOVA revealed rivastigmine effects to be more prominent in smokers suffering from more severe tobacco dependence. None of the patients developed an alcohol relapse or an increase in alcohol craving. DISCUSSION Our preliminary data indicate an effect of rivastigmine on tobacco craving and consumption. This pilot study encourages further investigation of acetylcholinesterase-inhibitors as a promising treatment approach regarding tobacco dependence.
Densely Connected Convolutional Networks
Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections (one between each layer and its subsequent layer), our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, and substantially reduce the number of parameters. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring less memory and computation to achieve high performance. Code and pre-trained models are available at https://github.com/liuzhuang13/DenseNet.
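The connectivity pattern is compact to express in code. Below is a minimal dense block in PyTorch that mirrors the description (each layer consumes the concatenation of all earlier feature maps); the official implementation is at the GitHub link above, so this sketch is only illustrative.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Layer i sees the input plus i earlier feature maps.
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(in_channels + i * growth_rate),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_channels + i * growth_rate, growth_rate,
                          kernel_size=3, padding=1, bias=False),
            ))

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))  # concat all earlier maps
            features.append(out)
        return torch.cat(features, dim=1)
```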
Car parking control using a trajectory tracking controller
One of the key technologies of future automobiles is parking assist, or automatic parking control. Control problems of a car-like vehicle are not easy because of nonholonomic velocity constraints. This paper proposes a parking control strategy which is composed of an open-loop path planner and a feedback tracking controller. By employing a trajectory tracking controller for a two-wheeled robot, a car-like vehicle can be successfully controlled to the desired configuration. Experimental results with a radio-controlled model car clearly show that the proposed control scheme is practically useful.
Dydrogesterone has no effect on uterine fibroids when used to prevent miscarriage in pregnant women with uterine fibroids.
OBJECTIVES To analyse the effect of dydrogesterone use during pregnancy on uterine fibroids, pregnancy complications, and pregnancy outcome. MATERIAL AND METHODS In all, 372 pregnant women with uterine fibroids who were treated at the Affiliated Provincial Hospital of Shandong University were included in this study. Thirty-three of these women received dydrogesterone and constituted the treatment group, and the 27 women who were found to have uterine fibroids during the first trimester but did not receive intervention to prevent miscarriage composed the control group. The changes in uterine fibroids before and after pregnancy and the pregnancy complications were recorded; immunohistochemistry was used to detect the expression of progesterone receptor (PR) and proliferation- and apoptosis-related proteins in the uterine fibroid tissue. RESULTS No significant difference was observed in the change in uterine fibroid volume during pregnancy between the treatment group and the control group (p > 0.05). The percentage of uterine fibroids with red degeneration was lower in the treatment group than in the control group, but the difference was not statistically significant. No significant difference was observed in newborn weight, height, Apgar score, threatened miscarriage, or premature birth, among other characteristics, between the two groups (p > 0.05). Immunohistochemistry showed no significant difference in the expression of PR, cyclinD1, insulin-like growth factor (IGF1), or B-cell lymphoma 2 (Bcl2) between the two groups. CONCLUSIONS The use of dydrogesterone during pregnancy has no significant effect on uterine fibroids, pregnancy progression, or pregnancy outcomes in pregnant patients with uterine fibroids.
Chainsaw: Chained Automated Workflow-based Exploit Generation
We tackle the problem of automated exploit generation for web applications. In this regard, we present an approach that significantly improves the state of the art in web injection vulnerability identification and exploit generation. Our approach for exploit generation tackles various challenges associated with typical web application characteristics: their multi-module nature, interposed user input, and multi-tier architectures using a database backend. Our approach develops precise models of application workflows, database schemas, and native functions to achieve high-quality exploit generation. We implemented our approach in a tool called Chainsaw. Chainsaw was used to analyze 9 open source applications and generated over 199 first- and second-order injection exploits combined, significantly outperforming several related approaches.
Performance Evaluation of Microservices Architectures Using Containers
Microservices architecture has started a new trend in application development for a number of reasons: (1) to reduce complexity by using tiny services, (2) to scale, remove, and deploy parts of the system easily, (3) to improve flexibility to use different frameworks and tools, (4) to increase the overall scalability, and (5) to improve the resilience of the system. Containers have empowered the usage of microservices architectures by being lightweight, providing fast start-up times, and having low overhead. Containers can be used to develop applications based on monolithic architectures, where the whole system runs inside a single container, or based on a microservices architecture, where one or a few processes run inside each container. Two models can be used to implement a microservices architecture using containers: master-slave or nested-container. The goal of this work is to compare the CPU and network performance of benchmarks running in the two aforementioned models of microservices architecture, and hence to provide benchmarking guidance for system designers.
Capturing missing edges in social networks using vertex similarity
We introduce a graph vertex similarity measure, Relation Strength Similarity (RSS), that utilizes a network's topology to discover and capture similar vertices. RSS has the advantages that it is asymmetric, can be used in a weighted network, and has an adjustable "discovery range" parameter that enables exploration of friend-of-friend connections in a social network. To evaluate RSS we perform experiments on a coauthorship network from the CiteSeerX database. Our method significantly outperforms other vertex similarity measures in its ability to predict future coauthoring behavior among authors in the CiteSeerX database for the near future (0 to 4 years out), and performs reasonably well for 4 to 6 years out.
Topic Models for Mortality Modeling in Intensive Care Units
Mortality prediction is an important problem in the intensive care unit (ICU) because it is helpful for understanding patients' evolving severity, assessing quality of care, and comparing treatments. Most ICU mortality models primarily consider structured data and physiological waveforms (Le Gall et al., 1993). An important limitation of these structured data approaches is that they miss a lot of vital information captured in providers' free text notes and reports. In this paper, we propose an approach to mortality prediction that incorporates the information from free text notes using topic modeling.
Understanding challenges in the front lines of home health care: a human-systems approach.
A human-systems perspective is a fruitful approach to understanding home health care because it emphasizes major individual components of the system - persons, equipment/technology, tasks, and environments - as well as the interaction between these components. The goal of this research was to apply a human-system perspective to consider the capabilities and limitations of the persons, in relation to the demands of the tasks and equipment/technology in home health care. Identification of challenges and mismatches between the person(s) capabilities and the demands of providing care provide guidance for human factors interventions. A qualitative study was conducted with 8 home health Certified Nursing Assistants and 8 home health Registered Nurses interviewed about challenges they encounter in their jobs. A systematic categorization of the challenges the care providers reported was conducted and human factors recommendations were proposed in response, to improve home health. The challenges inform a human-systems model of home health care.
The Demand for Energy-Using Assets among the World's Rising Middle Classes
We study household decisions to acquire energy-using assets in the presence of rising incomes. We develop a theoretical framework to characterize the effect of income growth on asset purchases when consumers face credit constraints. We use large and plausibly exogenous shocks to household income generated by the conditional-cash-transfer program in Mexico, Oportunidades, to show that asset acquisition is nonlinear, depends, as predicted in the presence of credit constraints, on the pace of income growth, and both effects are economically large among beneficiaries. Our results may help explain important worldwide trends in the relationship between energy use and income growth.
Non-Rigid Registration Under Isometric Deformations
We present a robust and efficient algorithm for the pairwise non-rigid registration of partially overlapped 3D surfaces. Our approach treats non-rigid registration as an optimization problem and solves it by alternating between correspondence and deformation optimization. Assuming approximately isometric deformations, robust correspondences are generated using a pruning mechanism based on geodesic consistency. We iteratively learn an appropriate deformation discretization from the current set of correspondences and use it to update the correspondences in the next iteration. Our algorithm is able to register partially similar point clouds that undergo large deformations, in just a few seconds. We demonstrate the potential of our algorithm in various applications such as example based articulated segmentation, and shape interpolation.
Tall Buildings and Elevators : A Review of Recent Technological Advances
Efficient vertical mobility is a critical component of tall building development and construction. This paper investigates recent advances in elevator technology and examines their impact on tall building development. It maps out, organizes, and collates complex and scattered information on multiple aspects of elevator design, and presents them in an accessible and non-technical discourse. Importantly, the paper contextualizes recent technological innovations by examining their implementations in recent major projects including One World Trade Center in New York; Shanghai Tower in Shanghai; Burj Khalifa in Dubai; Kingdom Tower in Jeddah, Saudi Arabia; and the green retrofit project of the Empire State Building in New York. Further, the paper discusses future vertical transportation models including a vertical subway concept, a space lift, and electromagnetic levitation technology. As these new technological advancements in elevator design empower architects to create new forms and shapes of large-scale, mixed-use developments, this paper concludes by highlighting the need for interdisciplinary research in incorporating elevators in skyscrapers.
Introductory essay: the social shaping of technology
FTR-18: Collecting rumours on football transfer news
This paper describes ongoing work on the creation of a multilingual rumour dataset on football transfer news, FTR-18. Transfer rumours are continuously published by sports media. They can harm the image of a player or a club, or increase the player's market value. The proposed dataset includes transfer articles written in English, Spanish and Portuguese. It also comprises Twitter reactions related to the transfer rumours. FTR-18 is suited for rumour classification tasks and allows the research on the linguistic patterns used in
The Second PASCAL Recognising Textual Entailment Challenge
This paper describes the Second PASCAL Recognising Textual Entailment Challenge (RTE-2). We describe the RTE-2 dataset and give an overview of the submissions for the challenge. One of the main goals for this year's dataset was to provide more "realistic" text-hypothesis examples, based mostly on outputs of actual systems. The 23 submissions for the challenge present diverse approaches and research directions, and the best results achieved this year are considerably higher than last year's state of the art.
Microstrip series fed antenna array for millimeter wave automotive radar applications
A 2D tapered microstrip antenna array is analyzed and designed in this paper using the Modified Transmission Line Model method. The E-plane pattern is determined by 20 linear series-fed rectangular microstrip patches with tapered widths, and 16 linear arrays compose the 2D array, fed by a four-stage T-junction power divider network. The excitation coefficients in both the E and H planes follow the Taylor distribution. The antenna array achieves a pencil beam of 29 dB gain at W band, and its half-power beamwidths (HPBW) in the E and H planes are both about 5°. The side lobe level (SLL) of the pattern at the center frequency is lower than −17 dB and −19 dB in the E and H planes, respectively.
The MacLean Committee: Scotland's answer to the ‘dangerous people with severe personality disorder’ proposals?
The MacLean Committee was established in 1999 by the Scottish Office to review and make recommendations concerning the sentencing of serious violent and sexual offenders, including those with personality disorder. It provides an alternative perspective on the problem of offenders with personality
Effectiveness of a barber-based intervention for improving hypertension control in black men: the BARBER-1 study: a cluster randomized trial.
BACKGROUND Barbershop-based hypertension (HTN) outreach programs for black men are becoming increasingly common, but whether they are an effective approach for improving HTN control remains uncertain. METHODS To evaluate whether a continuous high blood pressure (BP) monitoring and referral program conducted by barbers motivates male patrons with elevated BP to pursue physician follow-up, leading to improved HTN control, a cluster randomized trial (BARBER-1) of HTN control was conducted among black male patrons of 17 black-owned barbershops in Dallas County, Texas (March 2006-December 2008). Participants underwent 10-week baseline BP screening, and then study sites were randomized to a comparison group that received standard BP pamphlets (8 shops, 77 hypertensive patrons per shop) or an intervention group in which barbers continually offered BP checks with haircuts and promoted physician follow-up with sex-specific peer-based health messaging (9 shops, 75 hypertensive patrons per shop). After 10 months, follow-up data were obtained. The primary outcome measure was change in HTN control rate for each barbershop. RESULTS The HTN control rate increased more in intervention barbershops than in comparison barbershops (absolute group difference, 8.8% [95% confidence interval (CI), 0.8%-16.9%]) (P = .04); the intervention effect persisted after adjustment for covariates (P = .03). A marginal intervention effect was found for systolic BP change (absolute group difference, -2.5 mm Hg [95% CI, -5.3 to 0.3 mm Hg]) (P = .08). CONCLUSIONS The effect of BP screening on HTN control among black male barbershop patrons was improved when barbers were enabled to become health educators, monitor BP, and promote physician follow-up. Further research is warranted. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00325533.
Continuum Robot Dynamics Utilizing the Principle of Virtual Power
Efficient formulations for the dynamics of continuum robots are necessary to enable accurate modeling of the robot's shape during operation. Previous work in continuum robotics has focused on low-fidelity lumped parameter models, in which actuated segments are modeled as circular arcs, or computationally intensive high-fidelity distributed parameter models, in which continuum robots are modeled as a parameterized spatial curve. In this paper, a novel dynamic modeling methodology is studied that captures curvature variations along a segment using a finite set of kinematic variables. This dynamic model is implemented using the principle of virtual power (also called Kane's method) for a continuum robot. The model is derived to account for inertial, actuation, friction, elastic, and gravitational effects. The model is inherently adaptable for including any type of external force or moment, including dissipative effects and external loading. Three case studies are simulated on a cable-driven continuum robot structure to study the dynamic properties of the numerical model. Cross validation is performed in comparison to both experimental results and finite-element analysis.
Virtual worlds in competitive contexts: Analyzing eSports consumer needs
Recently, 3D graphical environments on the Internet, that is, virtual worlds, have moved to the center of scientific interest. Since virtual worlds are suggested to mold social computing, research has predominantly focused on collaborative virtual worlds. Yet, virtual worlds increasingly move toward competitive environments, leaving operating businesses with the question of what to offer in order to fulfill customers' needs. To close this knowledge gap, we examine competitive virtual worlds in terms of eSports services, which intrinsically tie together cooperation and competition; we illuminate competitive and hedonic need gratifications of continuous eSports use. We apply Uses and Gratifications theory, reporting on ten in-depth expert interviews as well as survey data collected from 360 eSports players. We reveal that both competitive (competition and challenge) and hedonic need gratifications (escapism) drive continuous eSports use.
Creativity, Intelligence, and Personality
Divergent thinking; creativity in women; hemispheric specialization opposing right brain to left as the source of intuition, metaphor, and imagery; the contribution of altered states of consciousness to creative thinking; an organismic interpretation of the relationship of creativity to personality and intelligence; new methods of analysis of biographical material and a new emphasis on psychohistory; the relationship of thought disorder to originality; the inheritance of intellectual and personal traits important to creativity; the enhancement of creativity by training; these have been the main themes emerging in research on creativity since the last major reviews of the field (Stein 1968; Dellas & Gaier 1970; Freeman, Butcher & Christie 1971; Gilchrist 1972). Much indeed has happened in the field of creativity research since 1950, when J. P. Guilford in his parting address as president of the American Psychological Association pointed out that up to that time only 186 out of 121,000 entries in Psychological Abstracts dealt with creative imagination. By 1956, when the first national research conference on creativity was organized by C. W. Taylor at the University of Utah (under the sponsorship of the National Science Foundation), this number had doubled. By 1962, when Scientific Creativity (compiled by C. W. Taylor and F. Barron) went to press with a summary of the
Harmonisation of knowledge management - comparing 160 KM frameworks around the globe
Purpose – The purpose of this paper is to look at how knowledge management (KM) has entered a new phase in which consolidation and harmonisation of concepts is required. Some first standards have been published in Europe and Australia in order to foster a common understanding of terms and concepts. The aim of this study was to analyse KM frameworks from research and practice regarding their model elements and to discover differences and correspondences. Design/methodology/approach – A total of 160 KM frameworks from science, practice, associations and standardization bodies were collected worldwide. These frameworks were analysed regarding the use and understanding of the term knowledge, the terms used to describe the knowledge process activities, and the factors influencing the success of knowledge management. Quantitative and qualitative content analysis methods were applied. Findings – The results show that, despite the wide range of terms used in the KM frameworks, an underlying consensus was detected regarding the basic categories used to describe the knowledge management activities and the critical success factors of KM. Nevertheless, regarding the core term knowledge, there is still a need to develop an improved understanding in research and practice. Originality/value – The first quantitative and qualitative analysis of 160 KM frameworks of different origins worldwide.
Elastance-Based Control of a Mock Circulatory System
A new control strategy for a mock circulatory system (MCS) has been developed to mimic the Starling response of the natural heart. The control scheme is based on Suga's elastance model, which is implemented using nested elastance and pressure feedback control systems. The elastance control loop calculates the desired chamber pressure using a time-varying elastance function and the ventricular chamber volume signal. The pressure control loop regulates the chamber pressure according to this reference signal. Simulations and tests on MCS hardware showed that the elastance-based controller responds to changes in preload, afterload, and contractility in a manner similar to the natural heart. Since the elastance function is an arbitrary function of time, the controller allows modification of ventricular chamber contractility, giving researchers a new tool to mimic various pathological conditions which can be used in the evaluation of cardiac devices such as ventricular assist devices. © 2001 Biomedical Engineering Society.
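A minimal sketch of the nested loops described above, assuming the common elastance relation P_des(t) = E(t)·(V(t) − V0); the elastance waveform, V0, and the gain are illustrative values, not the authors'.

```python
import numpy as np

def elastance(t, period=1.0, e_min=0.06, e_max=2.0):
    """A smooth time-varying elastance over one cardiac cycle (mmHg/mL)."""
    phase = (t % period) / period
    return e_min + (e_max - e_min) * np.sin(np.pi * phase) ** 2

def control_step(t, volume, pressure_meas, v0=10.0, kp=5.0):
    """One step of the nested controller.

    Outer loop: compute the desired chamber pressure from the elastance
    function and the measured chamber volume. Inner loop: a proportional
    pressure controller drives the actuator toward that reference.
    """
    p_des = elastance(t) * (volume - v0)     # outer elastance loop
    u = kp * (p_des - pressure_meas)         # inner pressure loop
    return p_des, u
```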
Evolution of land tenure institutions and development of agroforestry: evidence from customary land areas of Sumatra
It is widely believed that land tenure insecurity under a customary tenure system leads to a socially inefficient resource allocation. This article demonstrates that the practice of granting secure individual ownership to tree planters spurs earlier tree planting, which is inefficient from the private point of view but could be efficient from the viewpoint of the global environment. Regression analysis, based on primary data collected in Sumatra, indicates that an expected increase in tenure security in fact led to early tree planting. It is also found that customary land tenure institutions have been evolving towards greater tenure security responding to increasing scarcity of land. © 2001 Elsevier Science B.V. All rights reserved.
Road Sign Detection and Recognition Using Hidden Markov Model
This paper proposes a novel road sign detection and recognition method. The position of a road sign in an image is detected using the projection technique. The hidden Markov model is then used to match the detected road sign with those in the database so that the goal of road sign recognition can be achieved. More specifically, the color images in terms of the RGB color system are first converted to the HSV color system and then quantized into specific colors. The horizontal and vertical projections of whole images in the specific colors existing in road signs are then used to detect the positions of road signs. In the recognition stage, only local features around the detected positions are used and a two-step strategy is adopted. The horizontal and vertical projections of background in the local area are used to prune irrelevant road signs. The candidate road signs are then sorted using the hidden Markov model. The one with the first rank is regarded as the recognition result. The effectiveness of the proposed method has been demonstrated by various experiments.
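A sketch of the detection step, assuming the sign color occupies a known HSV range; the hue/saturation thresholds and the 0.2 cut on the projections are illustrative assumptions.

```python
import numpy as np

def detect_sign(hsv, hue_lo=0.55, hue_hi=0.70, sat_min=0.5):
    """hsv: float image (H, W, 3) with channels in [0, 1]; returns a box.

    Quantize pixels to the sign color, then locate the sign from the
    horizontal (row) and vertical (column) projections of the color mask.
    """
    mask = ((hsv[..., 0] >= hue_lo) & (hsv[..., 0] <= hue_hi)
            & (hsv[..., 1] >= sat_min))
    rows = mask.sum(axis=1)            # horizontal projection
    cols = mask.sum(axis=0)            # vertical projection
    r = np.flatnonzero(rows > rows.max() * 0.2)
    c = np.flatnonzero(cols > cols.max() * 0.2)
    return (r[0], r[-1], c[0], c[-1]) if r.size and c.size else None
```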
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
We consider learning representations of entities and relations in KBs using the neural-embedding approach. We show that most existing models, including NTN (Socher et al., 2013) and TransE (Bordes et al., 2013b), can be generalized under a unified learning framework, where entities are low-dimensional vectors learned from a neural network and relations are bilinear and/or linear mapping functions. Under this framework, we compare a variety of embedding models on the link prediction task. We show that a simple bilinear formulation achieves new state-of-the-art results for the task (achieving a top-10 accuracy of 73.2% vs. 54.7% by TransE on Freebase). Furthermore, we introduce a novel approach that utilizes the learned relation embeddings to mine logical rules such as BornInCity(a, b) ∧ CityInCountry(b, c) ⟹ Nationality(a, c). We find that embeddings learned from the bilinear objective are particularly good at capturing relational semantics, and that the composition of relations is characterized by matrix multiplication. More interestingly, we demonstrate that our embedding-based rule extraction approach successfully outperforms a state-of-the-art confidence-based rule mining approach in mining Horn rules that involve compositional reasoning.
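A toy version of the bilinear scoring and relation composition discussed above, using diagonal relation matrices (the simple bilinear formulation); the embeddings are random here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_entities, n_relations = 8, 5, 3
E = rng.normal(size=(n_entities, dim))   # entity embeddings
W = rng.normal(size=(n_relations, dim))  # diagonal relation embeddings

def score(h, r, t):
    """Bilinear score h^T diag(w_r) t with a diagonal relation matrix."""
    return E[h] @ np.diag(W[r]) @ E[t]

# Composition of two relations corresponds to multiplying their (diagonal)
# matrices, i.e. the elementwise product of the relation vectors; scoring
# with `composed` approximates following relation 0 and then relation 1.
composed = W[0] * W[1]
```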
Solving Single-digit Sudoku Subproblems
We show that single-digit "Nishio" subproblems in n×n Sudoku puzzles may be solved in time o(2^n), faster than previous solutions such as the pattern overlay method. We also show that single-digit deduction in Sudoku is NP-hard.
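To make the subproblem concrete, the sketch below solves a single-digit placement by brute-force backtracking on a standard 9×9 grid: given the candidate cells for one digit, place it so that every row, column, and 3×3 box contains it exactly once. This naive search only defines the problem; the paper's algorithm is asymptotically faster.

```python
def nishio(candidates, placed=(), row=0):
    """candidates: set of (row, col) cells where the digit may go.

    Recurse row by row, placing the digit once per row while avoiding
    column and box conflicts; returns a complete placement or None.
    """
    if row == 9:
        return placed
    for col in range(9):
        if (row, col) not in candidates:
            continue
        if any(pc == col or (pr // 3, pc // 3) == (row // 3, col // 3)
               for pr, pc in placed):
            continue
        result = nishio(candidates, placed + ((row, col),), row + 1)
        if result is not None:
            return result
    return None
```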
A smartphone-based sensing platform to model aggressive driving behaviors
Driving aggressively increases the risk of accidents. Assessing a person's driving style is a useful way to guide aggressive drivers toward having safer driving behaviors. A number of studies have investigated driving style, but they often rely on the use of self-reports or simulators, which are not suitable for the real-time, continuous, automated assessment and feedback on the road. In order to understand and model aggressive driving style, we construct an in-vehicle sensing platform that uses a smartphone instead of using heavyweight, expensive systems. Utilizing additional cheap sensors, our sensing platform can collect useful information about vehicle movement, maneuvering and steering wheel movement. We use this data and apply machine learning to build a driver model that evaluates drivers' driving styles based on a number of driving-related features. From a naturalistic data collection from 22 drivers for 3 weeks, we analyzed the characteristics of drivers who have an aggressive driving style. Our model classified those drivers with an accuracy of 90.5% (violation-class) and 81% (questionnaire-class). We describe how, in future work, our model can be used to provide real-time feedback to drivers using only their current smartphone.
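A hedged sketch of the modeling step: summary statistics over sensor windows feed a standard classifier. The features and the classifier choice are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(accel, gyro):
    """Summarize one driving window (arrays of shape (n, 3)) into a vector."""
    feats = []
    for sig in (accel, gyro):
        mag = np.linalg.norm(sig, axis=1)
        feats += [mag.mean(), mag.std(), mag.max(),
                  np.abs(np.diff(mag)).mean()]   # jerkiness proxy
    return np.array(feats)

# With X as stacked window features and y as aggressive (1) vs. not (0):
# clf = RandomForestClassifier(n_estimators=100).fit(X, y)
```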
Compressive light field photography using overcomplete dictionaries and optimized projections
Light field photography has gained a significant research interest in the last two decades; today, commercial light field cameras are widely available. Nevertheless, most existing acquisition approaches either multiplex a low-resolution light field into a single 2D sensor image or require multiple photographs to be taken for acquiring a high-resolution light field. We propose a compressive light field camera architecture that allows for higher-resolution light fields to be recovered than previously possible from a single image. The proposed architecture comprises three key components: light field atoms as a sparse representation of natural light fields, an optical design that allows for capturing optimized 2D light field projections, and robust sparse reconstruction methods to recover a 4D light field from a single coded 2D projection. In addition, we demonstrate a variety of other applications for light field atoms and sparse coding, including 4D light field compression and denoising.
False-Positive Results and Contamination in Nucleic Acid Amplification Assays: Suggestions for a Prevent and Destroy Strategy
Contamination of samples with DNA is still a major problem in microbiology laboratories, despite the wide acceptance of PCR and other amplification techniques for the detection of frequently low amounts of target DNA. This review focuses on the implications of contamination in the diagnosis and research of infectious diseases, possible sources of contaminants, strategies for prevention and destruction, and quality control. Contamination of samples in diagnostic PCR can have far-reaching consequences for patients, as illustrated by several examples in this review. Furthermore, it appears that the (sometimes very unexpected) sources of contaminants are diverse (including water, reagents, disposables, sample carry over, and amplicon), and contaminants can also be introduced by unrelated activities in neighboring laboratories. Therefore, lack of communication between researchers using the same laboratory space can be considered a risk factor. Only a very limited number of multicenter quality control studies have been published so far, but these showed false-positive rates of 9–57%. The overall conclusion is that although nucleic acid amplification assays are basically useful both in research and in the clinic, their accuracy depends on awareness of risk factors and the proper use of procedures for the prevention of nucleic acid contamination. The discussion of prevention and destruction strategies included in this review may serve as a guide to help improve laboratory practices and reduce the number of false-positive amplification results.
Graph stream classification using labeled and unlabeled graphs
Graph classification is becoming increasingly popular due to the rapidly rising number of applications involving data with structural dependency. The widespread use of graph applications and the inherently complex relationships between graph objects have made the labels of graph data expensive and/or difficult to obtain, especially for applications involving dynamically changing graph records. While labeled graphs are limited, copious amounts of unlabeled graphs are often easy to obtain with trivial effort. In this paper, we propose a framework to build a stream-based graph classification model by combining both labeled and unlabeled graphs. Our method, called gSLU, employs an ensemble-based framework to partition graph streams into a number of graph chunks, each containing some labeled and unlabeled graphs. For each individual chunk, we propose a minimum-redundancy subgraph feature selection module to select a set of informative subgraph features to build a classifier. To tackle the concept drifting in graph streams, an instance-level weighting mechanism is used to dynamically adjust the instance weights, through which the subgraph feature selection can emphasize difficult graph samples. The classifiers built from different graph chunks form an ensemble for graph stream classification. Experiments on real-world graph streams demonstrate clear benefits of using minimum-redundancy subgraph features to build accurate classifiers. By employing instance-level weighting, our graph ensemble model can effectively adapt to the concept drifting in the graph stream for classification.
The Meaning of the Notion 'Nature' in the Political Doctrine of Thomas Hobbes against the Background of the 'Zoon Politikon' Conception
Piotr Świercz, "The meaning of the notion 'nature' in the political doctrine of Thomas Hobbes against the background of the 'zoon politikon' conception". The main aim of this article is to examine the meaning of "nature", "state of nature" and "natural condition of mankind" in the political doctrine of Thomas Hobbes. It should be emphasised that the matter at hand is the notion of "nature", not Hobbes' description of the natural state. Several possibilities are taken into consideration: "nature" as an essence, "nature" (the "state of nature") as wildness, "nature" as opposed to the "artificial", and "nature" as an abstract conception of the human individual. To emphasize the importance of the title problem, it is presented against the background of the classical (Greek and Christian) idea of zoon politikon. This seemed justified because Hobbes' political doctrine is considered (even by Hobbes himself) to be opposed to the classical understanding of human nature. In the recapitulation of this thesis, the author tries to indicate the reasons for the obscurity of Hobbes' doctrine and the consequences of this obscurity for modern political ideas.
Semantic soft segmentation
Accurate representation of soft transitions between image regions is essential for high-quality image editing and compositing. Current techniques for generating such representations depend heavily on interaction by a skilled visual artist, as creating such accurate object selections is a tedious task. In this work, we introduce semantic soft segments, a set of layers that correspond to semantically meaningful regions in an image with accurate soft transitions between different objects. We approach this problem from a spectral segmentation angle and propose a graph structure that embeds texture and color features from the image as well as higher-level semantic information generated by a neural network. The soft segments are generated via eigendecomposition of the carefully constructed Laplacian matrix fully automatically. We demonstrate that otherwise complex image editing tasks can be done with little effort using semantic soft segments.
Noncontact Measurement of Complex Permittivity and Thickness by Using Planar Resonators
This paper presents a novel noncontact measurement technique that entails using a single-compound triple complementary split-ring resonator (SC-TCSRR) to determine the complex permittivity and thickness of a material under test (MUT). The proposed technique overcomes the problem engendered by the existence of air gaps between the sensor ground plane and the MUT. In the proposed approach, a derived governing equation of the resonance frequencies is used to estimate the thickness and complex permittivity of the MUT by calculating the resonant frequency (fr) and magnitude response in a single-step noncontact measurement process. This study theoretically analyzed and experimentally verified a simple and low-cost SC-TCSRR measurement method for assessing materials in a noncontact manner. For a 0.2-mm air gap, the experiments yielded average measurement errors of 4.32% and 5.05% for the thickness and permittivity, respectively. The proposed SC-TCSRR technique provides excellent solutions for reducing the effect of air-gap conditions on permittivity, thickness, and loss tangent in noncontact measurements.
Flexicurity, Happiness, and Satisfaction
Many policy analysts and researchers endorse a "flexicurity" approach to labor market protections: a package of policies that simultaneously offers workers security and firms flexibility. Definitions of flexicurity are often vague, and analysts focus on studying the unintended effects of policy on unemployment or labor market participation rates rather than the policies' goals, such as improving workers' security and satisfaction. This article examines the success of flexicurity on these alternative outcomes, while tightly defining flexicurity as the substitution of a safety net for employment protections. We review the development of the two policy indexes measuring safety nets and employment protections, and the validation of the indexes against unemployment outcomes. The indexes are then used to examine the policies' impact on life satisfaction and happiness. The analysis of satisfaction finds that flexicurity generally has better outcomes than a free market approach or labor protections based exclusively on employment protections.
Question detection from acoustic features using recurrent neural network with gated recurrent unit
Question detection is important for many speech applications. Only parts of a speech utterance provide useful clues for question detection. Previous work on question detection using acoustic features in Mandarin conversation is weak at capturing such time-context information, which can be modeled naturally in a recurrent neural network (RNN) structure. In this paper, we investigate recurrent approaches to this problem. Based on the gated recurrent unit (GRU), we build different RNN and bidirectional RNN (BRNN) models to extract efficient features at the segment and utterance level. The particular advantage of the GRU is that it can determine a proper time scale for extracting high-level contextual features. Experimental results show that the features extracted within a proper time scale make the classifier perform better than the baseline method with a pre-designed lexical and acoustic feature set.
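A minimal sketch, in the spirit of the recurrent models the abstract describes, of a bidirectional GRU classifier over frame-level acoustic features. PyTorch and the toy feature dimensions are assumptions, not the paper's setup.

```python
# Bidirectional GRU utterance classifier over acoustic frames.
import torch
import torch.nn as nn

class QuestionDetector(nn.Module):
    def __init__(self, n_features=40, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True,
                          bidirectional=True)          # BRNN variant
        self.out = nn.Linear(2 * hidden, 2)            # question / statement

    def forward(self, x):                # x: (batch, frames, n_features)
        h, _ = self.gru(x)
        return self.out(h.mean(dim=1))   # pool over time, then classify

model = QuestionDetector()
logits = model(torch.randn(8, 200, 40)) # 8 utterances, 200 frames each
print(logits.shape)
```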
Moisture Buffering Effects on Indoor Air Quality— Experimental and Simulation Results
The ability of building materials to control indoor air humidity is studied in this paper. First, moisture capacity and transient response of building materials were investigated in small-scale laboratory experiments. Effects of moisture-absorbing interior wall materials on indoor air humidity were measured in a full-scale room under controlled conditions with known ventilation rates and moisture production schedule. The measured interior surface materials included wood, porous wood fiberboard, gypsum board with hygroscopic insulation, perforated plywood board, and, in a reference case, aluminium foil. Second, numerical simulation tools for hygrothermal performance analyses of building envelope parts and for buildings as a whole were used to assess the impact of hygroscopic mass on indoor air humidity. Two levels of testing and simulation were carried out: First, the moisture capacity of building materials in dynamic conditions was tested in small-scale laboratory tests. Second, the materials were placed in a room with intermittent moisture production. Moisture production and ventilation rates were set to correspond to those typical in residential buildings. Mass transfer between the finishing materials and indoor air affects the humidity both in indoor air and in the building envelope. The effect of coatings and their vapor permeance on the moisture exchange was investigated. A sensitivity study looking at the hygrothermal material properties and their effects on the performance was carried out. The results show that building materials exposed to indoor air can have a strong effect on the indoor air humidity. Potentials, practical applications, and design concepts for utilizing the moisture-buffering effect of building materials are discussed.
Huffman Based LZW Lossless Image Compression Using Retinex Algorithm
Image compression is an application of data compression that encodes the original image with few bits. The objective of image compression is to reduce the irrelevance and redundancy of image data so that it can be stored or transmitted efficiently, reducing transmission time over the network and increasing transmission speed. In lossless image compression, no data are lost when the compression is performed. In this research, a new lossless compression scheme is presented, named Huffman Based LZW Lossless Image Compression using Retinex Algorithm, which consists of three stages: in the first stage, Huffman coding is used to compress the image; in the second stage, all Huffman code words are concatenated together and then compressed with LZW coding and decoding; in the third stage, the Retinex algorithm is applied to the compressed image to enhance its contrast and improve its quality. The proposed technique is evaluated in MATLAB in terms of compression ratio (CR), peak signal-to-noise ratio (PSNR), and mean square error (MSE). Keywords: compression, encode, decode, Huffman, LZW.
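A minimal sketch of the two coding stages the abstract chains together: Huffman-code the pixel bytes, then LZW-compress the concatenated code words. The Retinex enhancement stage and the MATLAB evaluation are omitted here; the input is toy data.

```python
# Huffman coding followed by LZW over the concatenated code words.
import heapq
from collections import Counter

def huffman_codes(data):
    heap = [[f, [sym, ""]] for sym, f in Counter(data).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]: pair[1] = "0" + pair[1]
        for pair in hi[1:]: pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

def lzw_encode(s):
    table, out, w = {chr(i): i for i in range(256)}, [], ""
    for c in s:
        if w + c in table: w += c
        else:
            out.append(table[w]); table[w + c] = len(table); w = c
    if w: out.append(table[w])
    return out

pixels = bytes([10, 10, 10, 200, 200, 10])       # toy "image" data
codes = huffman_codes(pixels)
bitstream = "".join(codes[b] for b in pixels)    # concatenated code words
print(lzw_encode(bitstream))                     # LZW over the bit string
```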
Independent motion detection with event-driven cameras
Unlike standard cameras that send intensity images at a constant frame rate, event-driven cameras asynchronously report pixel-level brightness changes, offering low latency and high temporal resolution (both in the order of micro-seconds). As such, they have great potential for fast and low power vision algorithms for robots. Visual tracking, for example, is easily achieved even for very fast stimuli, as only moving objects cause brightness changes. However, cameras mounted on a moving robot are typically non-stationary and the same tracking problem becomes confounded by background clutter events due to the robot ego-motion. In this paper, we propose a method for segmenting the motion of an independently moving object for event-driven cameras. Our method detects and tracks corners in the event stream and learns the statistics of their motion as a function of the robot's joint velocities when no independently moving objects are present. During robot operation, independently moving objects are identified by discrepancies between the predicted corner velocities from ego-motion and the measured corner velocities. We validate the algorithm on data collected from the neuromorphic iCub robot. We achieve a precision of ∼ 90% and show that the method is robust to changes in speed of both the head and the target.
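A minimal sketch of the detection rule described above: compare corner velocities predicted from the robot's ego-motion with measured corner velocities and flag large discrepancies as independent motion. The learned ego-motion model is stubbed out as a fixed linear map W, and the threshold is a tuning assumption.

```python
# Flag corners whose measured velocity departs from the ego-motion prediction.
import numpy as np

W = np.array([[0.5, 0.1], [0.0, 0.4]])   # stub: joint vel -> corner vel
THRESHOLD = 1.0                           # pixels/ms, tuning assumption

def independently_moving(joint_vel, measured_corner_vel):
    predicted = W @ joint_vel             # velocity expected from ego-motion
    residual = np.linalg.norm(measured_corner_vel - predicted)
    return residual > THRESHOLD

print(independently_moving(np.array([1.0, 2.0]), np.array([3.0, -1.0])))
```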
High School Students’ Perceptions of Motivations for Cyberbullying: An Exploratory Study
OBJECTIVES Internet usage has increased in recent years resulting in a growing number of documented reports of cyberbullying. Despite the rise in cyberbullying incidents, there is a dearth of research regarding high school students' motivations for cyberbullying. The purpose of this study was to investigate high school students' perceptions of the motivations for cyberbullying. METHOD We undertook an exploratory qualitative study with 20 high school students, conducting individual interviews using a semi-structured interview protocol. Data were analyzed using Grounded Theory. RESULTS The developed coding hierarchy provides a framework to conceptualize motivations, which can be used to facilitate future research about motivations and to develop preventive interventions designed to thwart the negative effects of cyberbullying. The findings revealed that high school students more often identified internally motivated reasons for cyberbullying (e.g., redirect feelings) than externally motivated (no consequences, non-confrontational, target was different). CONCLUSION Uncovering the motivations for cyberbullying should promote greater understanding of this phenomenon and potentially reduce the interpersonal violence that can result from it. By providing a framework that begins to clarify the internal and external factors motivating the behavior, there is enhanced potential to develop effective preventive interventions to prevent cyberbullying and its negative effects.
Sustainability and corporate social responsibility as a competitive strategy in hotels.
ABSTRACT - This article aims to discuss the relationship between sustainability actions and corporate social responsibility in the hotel industry and their possible contributions to the competitiveness of the hotel sector. The method involved collecting data from hotels in the city of Belo Horizonte, Minas Gerais (Brazil), through a descriptive case study. The formal discourse, the spoken and implied discourse, and what was left unsaid by the person responsible for each hotel were evaluated in the researchers' interpretation. The study sought to describe the actions of hotel managers with causal effects on organizational performance in a strategic manner. The strategies investigated with emphasis were the hotel's affiliation with a hotel chain or local tourism institution, and the practice of actions related to corporate social responsibility and sustainability. The operational factors highlighted by the hoteliers and identified as drivers of the competitiveness of the enterprises were: (1) organization in chains and franchises; (2) maintenance of and investment in infrastructure and leisure facilities; (3) qualification of the workforce, notably in services; (4) strategic market orientation; and (5) corporate social responsibility actions. Finally, the sustainable actions in the hotels studied focus mainly on strict cash management, with emphasis on controlling operational cost centers. Corporate social responsibility and sustainability have lower strategic priority for the managers. Keywords: hotel performance; sustainability in the hotel industry; social responsibility; competitiveness in the hotel sector.
The MetaCrawler architecture for resource aggregation on the Web
The MetaCrawler Softbot is a parallel Web search service that has been available at the University of Washington since June of 1995. It provides users with a single interface with which they can query popular general-purpose Web search services, such as Lycos[6] and AltaVista[1], and has some sophisticated features that allow it to obtain results of much higher quality than simply regurgitating the output from each search service. In this article, we briefly outline the motivation for MetaCrawler and highlight previous work, and then discuss the architecture of MetaCrawler and how it enables MetaCrawler to perform well and to scale and adapt to a dynamic Internet.
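A minimal sketch of the aggregation pattern described above: fan a query out to several search services in parallel, then merge and de-duplicate the results. The endpoints and the parsing stub are hypothetical, not MetaCrawler's actual code.

```python
# Parallel meta-search fan-out with result collation.
import concurrent.futures

def query_service(service, query):
    # Placeholder: a real implementation would issue an HTTP request to
    # the service and parse its result page into (url, title) pairs.
    return [(f"https://{service}/result-for-{query}", service)]

def metasearch(query, services=("lycos.example", "altavista.example")):
    with concurrent.futures.ThreadPoolExecutor() as pool:
        hits = pool.map(lambda s: query_service(s, query), services)
    seen, merged = set(), []
    for result_list in hits:
        for url, title in result_list:
            if url not in seen:           # collate, dropping duplicates
                seen.add(url)
                merged.append((url, title))
    return merged

print(metasearch("softbot"))
```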
Comparison Research on Text Pre-processing Methods on Twitter Sentiment Analysis
Twitter sentiment analysis offers organizations the ability to monitor public feeling towards products and events related to them in real time. The first step of sentiment analysis is the text pre-processing of Twitter data. Most existing research on Twitter sentiment analysis focuses on the extraction of new sentiment features, while the choice of pre-processing method is ignored. This paper discusses the effects of text pre-processing methods on sentiment classification performance in two types of classification tasks, and sums up the classification performance of six pre-processing methods using two feature models and four classifiers on five Twitter datasets. The experiments show that the accuracy and F1-measure of Twitter sentiment classifiers improve when the pre-processing methods of expanding acronyms and replacing negation are used, but barely change when URLs, numbers, or stop words are removed. The Naive Bayes and Random Forest classifiers are more sensitive to the choice of pre-processing method than the Logistic Regression and support vector machine classifiers.
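A minimal sketch of the pre-processing steps compared above. The acronym dictionary and negation word list are illustrative stubs, not the paper's resources.

```python
# Tweet pre-processing: URL removal, acronym expansion, negation replacement.
import re

ACRONYMS = {"gr8": "great", "lol": "laughing out loud"}   # stub dictionary

def preprocess(tweet, expand=True, negate=True, strip_urls=True):
    if strip_urls:
        tweet = re.sub(r"https?://\S+", "", tweet)        # remove URLs
    tokens = tweet.lower().split()
    if expand:                                            # expand acronyms
        tokens = [ACRONYMS.get(t, t) for t in tokens]
    if negate:                                            # replace negation
        tokens = ["not" if t in {"n't", "never", "no"} else t for t in tokens]
    return " ".join(tokens)

print(preprocess("gr8 show, never boring! http://t.co/x"))
```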
Studying Teamwork in Global IT Support
As modern organizations increasingly operate in a global economy, they need IT support around the globe; favorable economic conditions also encourage the use of offshore IT teams. However, when IT efforts "go global," issues and challenges typical of IT development and support are magnified. In this paper, we review and integrate three research areas that contribute to our understanding and management of global IT support teams: studies of global teamwork practices, small group dynamics theory, and studies of virtual teams. We review key findings from these areas and discuss a case example of global IT support to illustrate the insights possible through these research perspectives. We conclude by outlining an agenda for future research on teamwork in global IT support. 1. Overview and motivation. The United Nations Conference on Trade and Development (UNCTAD) [37], in its World Investment Report 2000, reported that international production by transnational corporations (TNCs), now numbering 63,000 parent firms with about 690,000 foreign affiliates, spans virtually all countries and economic activities. In this global economy, effective use of information technologies (IT) is essential to the operation of modern business organizations. Information technologies enable business managers to redesign core business processes, products, or services and provide the necessary information for decision-making [31]. King and Sethi [21] suggest that IT is fundamental to effective global operations in two primary ways: providing a coordination mechanism for geographically dispersed activities, and facilitating the reshaping of separate organizations into global cooperatives. For example, enterprise resource planning (ERP) software makes it possible to operate a multinational firm as a unified enterprise with global product and service sourcing, rather than a loose network of international subsidiaries serving local markets.
Who Owns Pragmatism
This paper compares the approach to legal theory associated with the neopragmatism of Richard Rorty with that of classical pragmatism, as it emerged from the Metaphysical Club, the discussion group in Cambridge, Massachusetts in the early 1870s. The two approaches derive from distinct historical and intellectual origins. Neopragmatism's prominent feature is its proximity to, and common purpose with, Continental writers of an anti-foundational bent, including some for whom the pragmatic label would not previously have been comfortable, for example, Nietzsche, Heidegger, and Foucault. For Rorty's neopragmatism, the exhaustion of Enlightenment foundationalism has implied an end to epistemology and metaphysics; the early pragmatists instead attributed that exhaustion to a failure to be adequately consequentialist and inclusive. For classical pragmatism, generalizing was tested by consequences and connected to the solution of human problems. The Metaphysical Club included lawyers like N. St. John Green and Oliver Wendell Holmes, Jr. While its main agenda is often seen as undressing traditional metaphysical problems with a consequentialist test for meaning and belief, for the lawyers this also involved a political mission: articulating a coherent legal theory for a revolutionary democratic republic still less than a century old. In law, science, and philosophy itself, meaning can be described as the best consensus of all those confronted with a practical stake in the outcome; in law this highlights the degree of inclusion. Pragmatism emerged as the philosophy of democracy, ideally an unbounded democracy of unlimited inquiry, not apologizing for the existing order but rather making actual inquiry, as well as real democratic institutions, always appear inadequate. It originated as the first philosophy founded not on foundational certainty but on exigency, fallibility, and representativity.
Follow me cloud: interworking federated clouds and distributed mobile networks
This article introduces the Follow-Me Cloud (FMC) concept and proposes its framework. The proposed framework is aimed at the smooth migration of all, or only a required portion, of an ongoing IP service, delivered from a data center (DC) to user equipment (UE) of a 3GPP mobile network, to another optimal DC with no service disruption. Service migration and continuity are supported by replacing IP addressing with service identification. Indeed, an FMC service/application is identified, upon establishment, by a session/service ID that changes dynamically along with the service being delivered over the session; it consists of a unique identifier of the UE within the 3GPP mobile network, an identifier of the cloud service, and dynamically changing characteristics of the cloud service. Service migration in FMC is triggered by a change in the IP address of the UE due to a change of data anchor gateway in the mobile network, in turn due to UE mobility and/or load balancing. An optimal DC is then selected based on the features of the new data anchor gateway. Smooth service migration and continuity are supported thanks to logic installed at the UE and DCs that maps features of IP flows to the session/service ID.
A Distributed Demand Response Algorithm and Its Application to PHEV Charging in Smart Grids
This paper proposes a distributed framework for demand response and user adaptation in smart grid networks. In particular, we borrow the concept of congestion pricing in Internet traffic control and show that pricing information is very useful to regulate user demand and hence balance network load. User preference is modeled as a willingness to pay parameter which can be seen as an indicator of differential quality of service. Both analysis and simulation results are presented to demonstrate the dynamics and convergence behavior of the algorithm. Based on this algorithm, we then propose a novel charging method for plug-in hybrid electric vehicles (PHEVs) in a smart grid, where users or PHEVs can adapt their charging rates according to their preferences. Simulation results are presented to demonstrate the dynamic behavior of the charging algorithm and impact of different parameters on system performance.
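A minimal sketch of the pricing dynamic described above, as a generic congestion-pricing loop rather than the paper's exact algorithm: the grid raises the price when aggregate demand exceeds capacity, and each user sets demand proportional to its willingness-to-pay over the current price. The capacity, willingness parameters, and step size are illustrative assumptions.

```python
# Distributed congestion-pricing iteration for demand response.
CAPACITY = 10.0                 # network capacity (kW), assumption
willingness = [2.0, 3.0, 5.0]   # per-user willingness-to-pay parameters
price, step = 0.5, 0.05

for _ in range(200):            # iterate until (approximate) convergence
    demands = [w / price for w in willingness]      # user best response
    excess = sum(demands) - CAPACITY
    price = max(0.01, price + step * excess)        # congestion price update

print(round(price, 3), [round(d, 2) for d in demands])
```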
Hormone replacement therapy in postmenopausal women with schizophrenia: positive effect on negative symptoms?
BACKGROUND Some studies of premenopausal women suggest that the severity of psychopathology associated with schizophrenia may be related to levels of estrogen. METHODS We examined psychopathology in community-dwelling postmenopausal women with schizophrenia who had received (n = 24) versus had never received (n = 28) hormone replacement therapy. RESULTS Users of hormone replacement therapy and nonusers did not differ significantly with respect to age, ethnicity, education, age of onset, duration of schizophrenia, global cognitive functioning, or neuroleptic-induced movement disorders. The hormone replacement therapy users received lower average daily doses of antipsychotic medication; they had similar levels of positive symptoms but significantly less severe negative symptoms compared with hormone replacement therapy nonusers, independent of differences in antipsychotic dosage. CONCLUSIONS Our results suggest that the use of hormone replacement therapy in conjunction with antipsychotic medication in postmenopausal women with schizophrenia may help reduce negative, but not positive, symptoms.
Exhaled 8-isoprostane in childhood asthma
BACKGROUND Exhaled breath condensate (EBC) is a non-invasive method to assess airway inflammation and oxidative stress and may be useful in the assessment of childhood asthma. METHODS Exhaled 8-isoprostane, a stable marker of oxidative stress, was measured in EBC in children (5-17 years) with asthma (13 steroid-naïve and 12 inhaled steroid-treated) and 11 healthy controls. RESULTS Mean exhaled 8-isoprostane concentration was significantly elevated in steroid-naïve asthmatic children compared to healthy children: 9.3 (SEM 1.7) vs. 3.8 (0.6) pg/ml, p < 0.01. Children on inhaled steroids also had significantly higher 8-isoprostane levels than normal subjects: 6.7 (0.7) vs. 3.8 (0.6) pg/ml, p < 0.01. Steroid-naïve asthmatics had higher exhaled nitric oxide (eNO) than controls: 28.5 (4.7) vs. 12.6 (1.5) ppb, p < 0.01. eNO in steroid-treated asthmatics was similar to that of control subjects: 27.5 (8.8) vs. 12.6 (1.5) ppb. Exhaled 8-isoprostane did not correlate with duration of asthma, dose of inhaled steroids, or eNO. CONCLUSION We conclude that 8-isoprostane is elevated in asthmatic children, indicating increased oxidative stress, and that this does not appear to be normalized by inhaled steroid therapy. This suggests that 8-isoprostane is a useful non-invasive measurement of oxidative stress in children and that antioxidant therapy may be useful in the future.
Systems thinking: Foundations for enhancing system of systems engineering
This paper explores Systems Thinking (ST) as an essential foundation for more effective System of Systems Engineering (SoSE). ST is explored as a particular worldview grounded in appreciation of system behavior (performance) stemming not from system elements, but rather from the interaction of the elements. Following a short introduction, three primary avenues are explored in this paper, including: (1) examination of the nature of ST with emphasis on the central tenets of the field, (2) identification of the contributions and role of ST for SoSE, and (3) exploration of thirteen points defining a ST worldview and their implications for SoSE. The paper closes with a set of implications for practitioners and challenges for advancing the SoSE field through further inclusion of ST.
Compact Dual-Band Branch-Line and Rat-Race Couplers With Stepped-Impedance-Stub Lines
This study constructs stepped-impedance-stub lines for a dual-band branch-line coupler design with improved design flexibility. The proposed structure demonstrates dual-band performance and a compact size due to additional stepped-impedance stubs to branches. The developed synthesis method has two degrees of freedom which can be exploited to miniaturize circuit size and/or replace impractical impedances with more realizable ones. Observations also show the advantage of a wide-range realizable frequency ratio of dual bands. The current work fabricates three experimental dual-band branch-line couplers, including a two-section coupler, and achieves a size reduction up to 21.7%, compared with conventional structures. The measured results validate good dual-band performance at 2.4/5.8 GHz with enhanced bandwidths up to 21% and 12%, respectively. This research also successfully applies the proposed circuit to synthesize a dual-band rat-race coupler.
Understanding requirement prioritization techniques
Software is becoming an increasingly integral part of day-to-day life. Developing software that meets stakeholders' needs is the ultimate goal in today's environment. As the complexity of software increases, so do the requirements. Many requirements must be fulfilled within the given time duration, while some requirements should be considered first to reduce risks. Hence, properly gathering and prioritizing requirements can lead to the successful development of the software. In the literature there are a number of techniques that focus on the requirement prioritization problem. This paper presents a comparative study of various requirement prioritization techniques.
Automatic malware classification and new malware detection using machine learning
The explosive growth of malware variants poses a major threat to information security. Traditional anti-virus systems based on signatures fail to classify unknown malware into their corresponding families and to detect new kinds of malware programs. Therefore, we propose a machine learning based malware analysis system, which is composed of three modules: data processing, decision making, and new malware detection. The data processing module deals with gray-scale images, Opcode n-gram, and import functions, which are employed to extract the features of the malware. The decision-making module uses the features to classify the malware and to identify suspicious malware. Finally, the detection module uses the shared nearest neighbor (SNN) clustering algorithm to discover new malware families. Our approach is evaluated on more than 20 000 malware instances, which were collected by Kingsoft, ESET NOD32, and Anubis. The results show that our system can effectively classify the unknown malware with a best accuracy of 98.9%, and successfully detects 86.7% of the new malware.
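A minimal sketch of one feature pipeline the abstract mentions: opcode n-gram counts fed to a supervised classifier. The opcode sequences and labels below are toy stand-ins for real disassembly output, and the classifier choice is an assumption.

```python
# Opcode n-gram features + random forest for malware family classification.
from collections import Counter
from sklearn.feature_extraction import DictVectorizer
from sklearn.ensemble import RandomForestClassifier

def ngrams(opcodes, n=2):
    return Counter(zip(*(opcodes[i:] for i in range(n))))

samples = [["mov", "push", "call", "mov"], ["xor", "xor", "jmp", "call"]]
labels = [0, 1]                                   # two malware families
X = DictVectorizer(sparse=False).fit_transform(
    [{" ".join(k): v for k, v in ngrams(s).items()} for s in samples])
clf = RandomForestClassifier(n_estimators=10).fit(X, labels)
print(clf.predict(X))
```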
Performance comparison of LQR and ANFIS controller for stabilizing double inverted pendulum system
In this paper, the performance of LQR and ANFIS controllers for a double inverted pendulum system is compared. The double inverted pendulum system is highly unstable and nonlinear. A mathematical model is presented by linearizing the system about its vertical position. The system is analyzed for stability, controllability, and observability. Furthermore, an LQR controller and an ANFIS controller based on state-variable fusion are proposed for the control of the double inverted pendulum system, and simulation results show that the ANFIS controller has better tracking and disturbance-rejection performance than the LQR controller.
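A minimal sketch of LQR gain synthesis for a generic linearized state-space model: solve the continuous-time algebraic Riccati equation and form the state-feedback gain K = R^{-1} B^T P. The A and B matrices below are an illustrative unstable toy plant, not the paper's double-pendulum model.

```python
# LQR gain from the continuous-time algebraic Riccati equation.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [10.0, 0.0]])   # toy unstable plant
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                   # state weighting
R = np.array([[1.0]])                      # control effort weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)            # optimal feedback u = -K x
print(K)
```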
Resting-state functional connectivity reflects structural connectivity in the default mode network.
Resting-state functional connectivity magnetic resonance imaging (fcMRI) studies constitute a growing proportion of functional brain imaging publications. This approach detects temporal correlations in spontaneous blood oxygen level-dependent (BOLD) signal oscillations while subjects rest quietly in the scanner. Although distinct resting-state networks related to vision, language, executive processing, and other sensory and cognitive domains have been identified, considerable skepticism remains as to whether resting-state functional connectivity maps reflect neural connectivity or simply track BOLD signal correlations driven by nonneural artifact. Here we combine diffusion tensor imaging (DTI) tractography with resting-state fcMRI to test the hypothesis that resting-state functional connectivity reflects structural connectivity. These 2 modalities were used to investigate connectivity within the default mode network, a set of brain regions--including medial prefrontal cortex (MPFC), medial temporal lobes (MTLs), and posterior cingulate cortex (PCC)/retrosplenial cortex (RSC)--implicated in episodic memory processing. Using seed regions from the functional connectivity maps, the DTI analysis revealed robust structural connections between the MTLs and the retrosplenial cortex whereas tracts from the MPFC contacted the PCC (just rostral to the RSC). The results demonstrate that resting-state functional connectivity reflects structural connectivity and that combining modalities can enrich our understanding of these canonical brain networks.
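A minimal sketch of the seed-based functional connectivity computation the abstract relies on: correlate a seed region's BOLD time course with every other voxel's time course. The data here are random stand-ins for preprocessed BOLD signals.

```python
# Seed-based temporal correlation map over toy BOLD data.
import numpy as np

bold = np.random.randn(200, 1000)          # 200 timepoints x 1000 voxels
seed = bold[:, :10].mean(axis=1)           # mean time course of a seed ROI

z = (bold - bold.mean(0)) / bold.std(0)    # z-score each voxel over time
zs = (seed - seed.mean()) / seed.std()
fc_map = zs @ z / len(seed)                # Pearson r per voxel
print(fc_map.shape)
```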
Decision trees and forests: a probabilistic perspective
Decision trees and ensembles of decision trees are very popular in machine learning and often achieve state-of-the-art performance on black-box prediction tasks. However, popular variants such as C4.5, CART, boosted trees and random forests lack a probabilistic interpretation since they usually just specify an algorithm for training a model. We take a probabilistic approach where we cast the decision tree structures and the parameters associated with the nodes of a decision tree as a probabilistic model; given labeled examples, we can train the probabilistic model using a variety of approaches (Bayesian learning, maximum likelihood, etc). The probabilistic approach allows us to encode prior assumptions about tree structures and share statistical strength between node parameters; furthermore, it offers a principled mechanism to obtain probabilistic predictions which is crucial for applications where uncertainty quantification is important. Existing work on Bayesian decision trees relies on Markov chain Monte Carlo which can be computationally slow and suffer from poor mixing. We propose a novel sequential Monte Carlo algorithm that computes a particle approximation to the posterior over trees in a top-down fashion. We also propose a novel sampler for Bayesian additive regression trees by combining the above top-down particle filtering algorithm with the Particle Gibbs (Andrieu et al., 2010) framework. Finally, we propose Mondrian forests (MFs), a computationally efficient hybrid solution that is competitive with non-probabilistic counterparts in terms of speed and accuracy, but additionally produces well-calibrated uncertainty estimates. MFs use the Mondrian process (Roy and Teh, 2009) as the randomization mechanism and hierarchically smooth the node parameters within each tree (using a hierarchical probabilistic model and approximate Bayesian updates), but combine the trees in a non-Bayesian fashion. MFs can be grown in an incremental/online fashion and remarkably, the distribution of online MFs is the same as that of batch MFs.
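A minimal sketch of the probabilistic reading of a decision tree leaf: treat leaf label counts as draws from a Bernoulli with a Beta prior, so predictions are smoothed posterior means rather than raw frequencies. This illustrates the calibrated-uncertainty idea generically; it is not the paper's SMC, Particle Gibbs, or Mondrian forest machinery.

```python
# Beta-Binomial smoothing of a leaf's label frequency.
def leaf_posterior_mean(n_pos, n_neg, alpha=1.0, beta=1.0):
    # Beta(alpha, beta) prior + Binomial likelihood -> Beta posterior mean
    return (n_pos + alpha) / (n_pos + n_neg + alpha + beta)

print(leaf_posterior_mean(3, 0))   # 0.8, not an overconfident 1.0
```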
Design of an 99%-efficient, 5kW, phase-shift PWM DC-DC converter for telecom applications
In the last decade, power electronics research has focused on power density maximization, mainly to reduce initial system costs [1]. In the field of data centers and telecom applications, the costs for powering and cooling exceed the purchasing cost in less than 2 years [2]. This shifts the driving force in the development of new power supplies toward efficiency, while the power density should stay at a high level. The commonly used DC-DC converter in the power supply unit (PSU) for data centers and telecom applications is the full-bridge phase-shift converter, since it meets the demands of high-power, efficient power conversion and a compact design, and its constant operating frequency allows a simple control and EMI design. The development of the converter with respect to high efficiency has many degrees of freedom. An optimization procedure based on comprehensive analytical models leads to the optimal parameters (e.g., switching frequency, number of switching devices in parallel, and transformer design) for the most efficient design. In this paper, a 5 kW, 400 V to 48-56 V phase-shift PWM converter with an LC output filter is designed for highest efficiency (η ≥ 99%) with a volume limitation and consideration of the part-load efficiency. The component dependencies as well as the optimal design are explained. The realized prototype design reaches a calculated efficiency of η = 99.2% under full-load conditions and a power density of ρ = 36 W/in³ (2.2 kW/liter).
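A minimal sketch of the kind of design-space sweep such an optimization procedure implies: evaluate an analytical loss model over candidate switching frequencies and keep the most efficient design. The loss model below is an illustrative placeholder with made-up coefficients, not the authors' converter model.

```python
# Brute-force sweep of switching frequency against a toy loss model.
P_OUT = 5000.0                                    # W, rated output

def losses(f_sw):                                 # toy loss model (W)
    switching = 2e-3 * f_sw                       # grows with frequency
    magnetics = 4e7 / f_sw                        # shrinks with frequency
    return switching + magnetics + 15.0           # + conduction floor

best_f = min(range(20_000, 200_001, 1000), key=losses)
print(best_f, P_OUT / (P_OUT + losses(best_f)))   # frequency, efficiency
```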
Local, sustainable, small-scale cellular networks
Over five billion people are active cellular subscribers, spending over a trillion dollars a year on communications. Despite this, hundreds of millions of people are still not connected. Implicit in these networks is a top-down design, with large nation-scale telecommunication firms deciding when and where coverage will be available. This is enforced by the large capital investment required to run cellular systems; base stations can cost upwards of US$100,000 and require expensive related core infrastructure. Recent technological innovations have enabled much cheaper cellular equipment; a base station now costs around US$10,000 and requires none of the other related systems. This reduction in cost is enabling new models of cellular telephony. Small organizations are suddenly capable of being service providers. In this work we ask, "How successful would bottom-up cellular networks be?" Essentially we argue for and demonstrate a local cellular network, utilizing existing infrastructure (e.g., power, network, and people) to operate at much lower cost, with less required capital, bringing coverage to areas not traditionally able to support cellular deployments. This network also provides sustainable employment and revenue to local entrepreneurs and services for the local community. We demonstrate the value of this concept by conducting an ongoing six-month long field deployment in rural Papua, Indonesia, in partnership with local NGOs. This network is currently live, with 187 subscribers sustainably providing US$830 per month in revenue (US$368 in profit) for the operator and employment for three different credit sellers in the village. We also show that this network provides a valuable service to the community through usage logs and user interviews.
Erasure code-based low storage blockchain node
The concept of a decentralized ledger usually implies that each node of a blockchain network stores the entire blockchain. However, in the case of popular blockchains, each of which weighs several hundred GB, the large amount of data to be stored can incite new or low-capacity nodes to run lightweight clients. Such nodes do not participate in the global storage effort and can result in a centralization of the blockchain in very few nodes, which is contrary to the basic concepts of a blockchain. To avoid this problem, we propose new low storage nodes that store a reduced amount of data generated from the blockchain by using erasure codes. The properties of this technique ensure that any block of the chain can be easily rebuilt from a small number of such nodes. This system should encourage low storage nodes to contribute to the storage of the blockchain and to maintain decentralization despite the globally increasing size of the blockchain. It paves the way to new types of blockchains which could be managed entirely by low-capacity nodes.
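A minimal sketch of the storage idea: encode a block into k data chunks plus parity so that the stored pieces held by a subset of nodes suffice to rebuild it. For brevity this uses a single XOR parity chunk (tolerating one loss); real deployments would use a stronger code such as Reed-Solomon.

```python
# XOR-parity erasure coding of a blockchain block into chunks.
from functools import reduce

def encode(block: bytes, k: int):
    size = -(-len(block) // k)                       # chunk size, rounded up
    chunks = [block[i*size:(i+1)*size].ljust(size, b"\0") for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks)
    return chunks, parity

def rebuild_missing(chunks, parity, missing_idx):
    rest = [c for i, c in enumerate(chunks) if i != missing_idx] + [parity]
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), rest)

chunks, parity = encode(b"block #42 of the chain", k=4)
print(rebuild_missing(chunks, parity, missing_idx=2) == chunks[2])
```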
Echo Chamber: A Persuasive Game on Climate Change Rhetoric
Echo Chamber is a game that persuades players to re-examine their argumentation style and adopt new rhetorical techniques procedurally delivered through gameplay. Several games have been made addressing the environmental impacts of climate change; none have examined the gap between scientific and public discourse over climate change, and our goal was to teach players more effective communication techniques for conveying climate change in public venues. Our game provides other developers insight into persuasion through game mechanics with good design practices for similar persuasive games.
A Multi-Stage Strategy to Perspective Rectification for Mobile Phone Camera-Based Document Images
Document images captured by a mobile phone camera often have perspective distortions. Efficiency and accuracy are two important issues in designing a rectification system for such perspective documents. In this paper, we propose a new perspective rectification system based on vanishing point detection. This system achieves both the desired efficiency and accuracy using a multi-stage strategy: at the first stage, document boundaries and straight lines are used to compute vanishing points; at the second stage, text baselines and block aligns are utilized; and at the last stage, character tilt orientations are voted for the vertical vanishing point. A profit function is introduced to evaluate the reliability of detected vanishing points at each stage. If vanishing points at one stage are reliable, then rectification is ended at that stage. Otherwise, our method continues to seek more reliable vanishing points in the next stage. We have tested this method with more than 400 images including paper documents, signboards and posters. The image acceptance rate is more than 98.5% with an average speed of only about 60 ms.
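A minimal sketch of the multi-stage fallback logic described above: try successively finer cues and accept the first stage whose vanishing-point estimate scores above a reliability threshold. The detectors and the profit function are stubs standing in for the paper's components.

```python
# Coarse-to-fine staged vanishing-point estimation with early exit.
THRESHOLD = 0.8   # reliability cutoff, tuning assumption

def rectify(image, stages):
    for detect in stages:                 # coarse-to-fine cue order
        vps, profit = detect(image)
        if profit >= THRESHOLD:           # reliable enough: stop here
            return vps
    return vps                            # fall back to the last estimate

stages = [
    lambda img: (("vp_from_boundaries",), 0.55),   # stage 1: lines/borders
    lambda img: (("vp_from_baselines",), 0.90),    # stage 2: text baselines
    lambda img: (("vp_from_char_tilt",), 0.70),    # stage 3: character tilt
]
print(rectify(None, stages))              # stops at stage 2
```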
Diagnostic value of endobronchial and endoscopic ultrasound-guided fine needle aspiration for accessible lung cancer lesions after non-diagnostic conventional techniques: a prospective study
Lung cancer diagnosis is usually achieved through a set of bronchoscopic techniques or computed tomography guided-transthoracic needle aspiration (CT-TTNA). However these procedures have a variable diagnostic yield and some patients remain without a definite diagnosis despite being submitted to an extensive workup. The aim of this study was to evaluate the efficacy and cost of linear endobronchial (EBUS) and endoscopic ultrasound (EUS) guided fine needle aspiration (FNA), performed with one echoendoscope, for the diagnosis of suspicious lung cancer lesions after failure of conventional procedures. One hundred and twenty three patients with an undiagnosed but suspected malignant lung lesion (paratracheal, parabronchial, paraesophageal) or with a peripheral lesion and positron emission tomography positive mediastinal lymph nodes who had undergone at least one diagnostic flexible bronchoscopy or CT-TTNA attempt were submitted to EBUS and EUS-FNA. Patients with endobronchial lesions were excluded. Of the 123 patients, 88 had a pulmonary nodule/mass and 35 were selected based on mediastinal PET positive lymph nodes. Two patients were excluded because an endobronchial mass was detected at the time of the procedure. The target lesion could be visualized in 121 cases and FNA was performed in 118 cases. A definitive diagnosis was obtained in 106 cases (87.6%). Eighty-eight patients (72.7%) had non-small cell lung cancer, 15 (12.4%) had small cell lung cancer and metastatic disease was found in 3 patients (2.5%). The remaining 15 negative cases were subsequently diagnosed by surgical procedures. Twelve patients (9.9%) had a malignant tumor and in 3 (2.5%) a benign lesion was found. The overall sensitivity, specificity, positive and negative predictive values of EBUS and EUS-FNA to diagnose malignancy were 89.8%, 100%, 100% and 20.0% respectively. The diagnostic accuracy was 90.1% in a population with 97.5% prevalence of cancer. The ultrasonographic approach avoided expensive surgical procedures and significantly reduced costs (p < 0.001). Linear EBUS and EUS-FNA are able to improve the diagnostic yield of suspicious lung cancer lesions after non-diagnostic conventional techniques. These techniques, performed with one scope, can be offered to patients with accessible lesions as an intermediate step for diagnosis since they may avoid more invasive procedures and hence reduce costs.
A quantum leap for AI
Quantum computing and AI. Subhash Kak, Louisiana State University. Every few years, we hear of a new technology that will revolutionize AI. After careful reflection, we find that the advance is within the framework of the Turing machine model and equivalent, in many cases, to existing statistical techniques. But this time, in quantum computing, we seem to be on the threshold of a real revolution, a "quantum" leap, because it is a true frontier beyond classical computing. But will these possibilities be realized any time soon? Classical computers work on classical logic and can be viewed as an embodiment of classical physics. Quantum computers, on the other hand, are based on the superpositional logic of quantum mechanics, which is an entirely different paradigm. The conventional explanation sees consciousness arising as an emergent property of the classical computations taking place in the circuits of the brain, but this does not address the question of how thoughts and feelings arise. If brains perform quantum processing, this might be the secret behind consciousness. Furthermore, it might explain several puzzling features of animal and human intelligence and provide a new direction for developing AI machines. In this brief survey, I present the rationale for the convergence between quantum computing and AI and discuss prospects for realizing the technology.
On the Design of Virtual Reality Learning Environments in Engineering
Currently, virtual reality (VR) is being widely applied in different fields, especially in computer science, engineering, and medicine. Concretely, engineering applications account for approximately half of the total number of VR resources (considering the research works published up to last year, 2016). In this paper, the capabilities of different computational software for designing VR applications in engineering education are discussed. As a result, a general flowchart is proposed as a guide for designing VR resources in any application. Although this study is based on applications used in the engineering field, the obtained results can be easily extrapolated to other knowledge areas without any loss of generality. In this way, the paper can serve as a guide for creating a VR application.
MATHEMATICS ANXIETY AND ACHIEVEMENT AMONG SECONDARY SCHOOL STUDENTS
Research has shown that mathematics achievement in students is influenced by psychological factors such as mathematics anxiety. Weaknesses among students in learning mathematics in particular will affect the efforts of various sectors in making Malaysia a fully developed nation by 2020. The purpose of this study was to determine mathematics anxiety and mathematics achievement among secondary school students in Selangor, Malaysia. The research examined the differences in mathematics anxiety according to gender as well as the differences in mathematics achievement of students based on the level of mathematics anxiety. The study involved 195 Form Four students (86 male and 109 female). The instrument used to measure differences was adapted from the Fennema-Sherman Mathematics Attitudes Scale. The data were analyzed using the Statistical Package for the Social Sciences (SPSS) to determine the mean, frequency, t-test, and one-way ANOVA. The findings of the study indicated that there is mathematics anxiety among secondary school students. The t-test showed that the mean difference between mathematics anxiety and gender is not significant. The ANOVA test showed that there were significant differences in achievement based on the level of mathematics anxiety. Thus, math anxiety is one factor that affects student achievement. Therefore, teachers should strive to understand mathematics anxiety and implement teaching and learning strategies so that students can overcome their anxiety.
A Corpus of Natural Language for Visual Reasoning
Property-based Features: Given a sentence-representation pair, for each property listed in Table 2, we compute whether it holds for the representation. For each property that holds and for each n-gram in the sentence, we trigger a feature. Consider the first example in Table 1. The features triggered for this example include touches-wall#two-boxes-have and touches-wall#touching-the-side, computed from the property touches-wall and the tri-grams two boxes have and touching the side. We observe that the MaxEnt model learns a higher weight for features which combine similar properties of the world and the sentence, such as touches-wall#touching-the-side.
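A minimal sketch of the feature template described above: cross each world property that holds with each n-gram of the sentence to produce conjunction features like touches-wall#touching-the-side. The property dictionary below is a toy stand-in for the real representation check.

```python
# Conjoin held world properties with sentence n-grams into features.
def property_ngram_features(properties, sentence, n=3):
    tokens = sentence.lower().split()
    ngrams = ["-".join(tokens[i:i+n]) for i in range(len(tokens) - n + 1)]
    return [f"{p}#{g}" for p in properties if properties[p] for g in ngrams]

props = {"touches-wall": True, "all-same-color": False}
print(property_ngram_features(props, "two boxes have items touching the side"))
```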
Matrix Completion for Weakly-Supervised Multi-Label Image Classification
In the last few years, image classification has become an incredibly active research topic, with widespread applications. Most methods for visual recognition are fully supervised, as they make use of bounding boxes or pixelwise segmentations to locate objects of interest. However, this type of manual labeling is time consuming, error prone and it has been shown that manual segmentations are not necessarily the optimal spatial enclosure for object classifiers. This paper proposes a weakly-supervised system for multi-label image classification. In this setting, training images are annotated with a set of keywords describing their contents, but the visual concepts are not explicitly segmented in the images. We formulate the weakly-supervised image classification as a low-rank matrix completion problem. Compared to previous work, our proposed framework has three advantages: (1) Unlike existing solutions based on multiple-instance learning methods, our model is convex. We propose two alternative algorithms for matrix completion specifically tailored to visual data, and prove their convergence. (2) Unlike existing discriminative methods, our algorithm is robust to labeling errors, background noise and partial occlusions. (3) Our method can potentially be used for semantic segmentation. Experimental validation on several data sets shows that our method outperforms state-of-the-art classification algorithms, while effectively capturing each class appearance.
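A minimal sketch of generic low-rank matrix completion via singular value thresholding, the family of methods the abstract builds on (not the authors' tailored algorithms): iteratively shrink the singular values and re-impose the observed entries.

```python
# Singular value thresholding for matrix completion.
import numpy as np

def complete(M, observed, tau=1.0, iters=200):
    X = np.where(observed, M, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0)) @ Vt   # shrink spectrum
        X[observed] = M[observed]                      # keep known entries
    return X

M = np.outer([1, 2, 3], [1, 0, 2]).astype(float)       # rank-1 ground truth
mask = np.array([[1,1,0],[1,0,1],[0,1,1]], dtype=bool) # observed pattern
print(np.round(complete(M, mask), 2))
```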
Decoding two-dimensional movement trajectories using electrocorticographic signals in humans.
Signals from the brain could provide a non-muscular communication and control system, a brain-computer interface (BCI), for people who are severely paralyzed. A common BCI research strategy begins by decoding kinematic parameters from brain signals recorded during actual arm movement. It has been assumed that these parameters can be derived accurately only from signals recorded by intracortical microelectrodes, but the long-term stability of such electrodes is uncertain. The present study disproves this widespread assumption by showing in humans that kinematic parameters can also be decoded from signals recorded by subdural electrodes on the cortical surface (ECoG) with an accuracy comparable to that achieved in monkey studies using intracortical microelectrodes. A new ECoG feature labeled the local motor potential (LMP) provided the most information about movement. Furthermore, features displayed cosine tuning that has previously been described only for signals recorded within the brain. These results suggest that ECoG could be a more stable and less invasive alternative to intracortical electrodes for BCI systems, and could also prove useful in studies of motor function.
Tumor-associated macrophages: unwitting accomplices in breast cancer malignancy
Deleterious inflammation is a primary feature of breast cancer. Accumulating evidence demonstrates that macrophages, the most abundant leukocyte population in mammary tumors, have a critical role at each stage of cancer progression. Such tumor-associated macrophages facilitate neoplastic transformation, tumor immune evasion and the subsequent metastatic cascade. Herein, we discuss the dynamic process whereby molecular and cellular features of the tumor microenvironment act to license tissue-repair mechanisms of macrophages, fostering angiogenesis, metastasis and the support of cancer stem cells. We illustrate how tumors induce, then exploit trophic macrophages to subvert innate and adaptive immune responses capable of destroying malignant cells. Finally, we discuss compelling evidence from murine models of cancer and early clinical trials in support of macrophage-targeted intervention strategies with the potential to dramatically reduce breast cancer morbidity and mortality.
A Survey of Computation Offloading for Mobile Systems
Mobile systems have limited resources, such as battery life, network bandwidth, storage capacity, and processor performance. These restrictions may be alleviated by computation offloading: sending heavy computation to resourceful servers and receiving the results from these servers. Many issues related to offloading have been investigated in the past decade. This survey paper provides an overview of the background, techniques, systems, and research areas for offloading computation. We also describe directions for future research.
Pain sensitisation and the risk of poor outcome following physiotherapy for patients with moderate to severe knee osteoarthritis: protocol for a prospective cohort study
INTRODUCTION Pain is the dominant symptom of knee osteoarthritis (OA), and recent evidence suggests factors outside of local joint pathology, such as pain sensitisation, can contribute significantly to the pain experience. It is unknown how pain sensitisation influences outcomes from commonly employed interventions such as physiotherapy. The aims of this study are, first, to provide a comprehensive description of the somatosensory characteristics of people with pain associated with knee OA. Second, we will investigate if indicators of pain sensitisation in patients with knee osteoarthritis are predictive of non-response to physiotherapy. METHODS AND ANALYSIS This is a multicentre prospective cohort study with 140 participants. Eligible patients with moderate to severe symptomatic knee osteoarthritis will be identified at outpatient orthopaedic and rheumatology clinics. A baseline assessment will provide a comprehensive description of the somatosensory characteristics of each participant by means of clinical examination, quantitative sensory testing, and validated questionnaires measuring pain and functional capacity. Participants will then undergo physiotherapy treatment. The primary outcome will be non-response to physiotherapy on completion of the physiotherapy treatment programme as defined by the Osteoarthritis Research Society International treatment responder criteria. A principal component analysis will identify measures related to pain sensitisation to include in the predictive model. Regression analyses will explore the relationship between responder status and pain sensitisation while accounting for confounders. ETHICS AND DISSEMINATION This study has been approved by St James' Hospital/AMNCH Research Ethics Committee and by the St Vincent's Healthcare Group Ethics and Medical Research Committee. The results will be presented at international conferences and published in a peer review journal. TRIAL REGISTRATION NUMBER NCT02310945.
A phase I pharmacokinetic and safety evaluation of oral pazopanib dosing administered as crushed tablet or oral suspension in patients with advanced solid tumors
Because cancer patients may have difficulty swallowing whole tablets, crushing tablets or ingesting an oral suspension is a practical alternative. This open-label, 2-part, randomized crossover, phase I study evaluated the pharmacokinetics and tolerability of pazopanib administered as a crushed tablet or an oral suspension relative to whole tablet in patients with advanced cancer (Part 1). Patients completing Part 1 were eligible for continuous daily pazopanib 800 mg (Part 2). Administration of a single pazopanib 400 mg crushed tablet increased the area under the curve from 0 to 72 h (AUC(0–72); 46%) and maximum observed plasma concentration (Cmax; ~2-fold), and decreased time to achieve maximum plasma concentration (Tmax; ~2 h), indicating increased rate and extent of oral absorption relative to whole-tablet administration. Similarly, a single dose of pazopanib 400 mg suspension increased AUC(0–72) (33%) and Cmax (29%), and decreased Tmax (1 h). These changes in pharmacokinetic parameters were not associated with increases in the magnitude or duration of short-term (ie, up to 72 h) blood pressure elevation compared with whole-tablet administration.
Characterization of Heavy Metals in Vegetables
The heavy metals or trace elements play an important role in the metabolic pathways during the growth and development of plants, when available in required concentration. The heavy metal concentration of Cadmium (Cd), Cobalt (Co), Copper (Cu), Iron (Fe), Nickel (Ni), Lead (Pb) and Zinc (Zn) was analyzed using Inductive Coupled Plasma Analyzer (ICPA) (Perkin-Elmer ICP Optima 3300 RL, USA) in 21 vegetables collected from Vegetable Market of Anand town, Gujarat. The vegetables are Lady’s Finger (Abelmoschus esculentus), Onion (Alium sepa), Cauliflower (Brassica oleracea var. botrytis), Beat (Brassica oleracea), Chilli (Capsicum annum), Tindora (Coccinia indica), Pattarveli (Colocasia sp.), Coriander (Coriandrum sativum), Cucumber (Cucumis sativus), Turmeric (Curcuma longa), Vetches/Gavar (Cyamopsis soralioides), Bean Pods (Dolichos lablab), Carrot (Ductus carrotus), Ginger (Gingiber officinalis), Sweet Potato (Ipomoea batatas), Bottle Gourd (Lagernaria vulgaris), Tomato (Lycopersicum esculentum), Bitter Gourd (Momordica charantia), Drumstick (Moringa oleifera), Brinjal (Solanum melongena) and Parwar (Trichosanthes dioicea). The high concentration of Cd was found in Onion, Coriander and Cauliflower, while Co and Cu content was recorded high in Cauliflower and Bottle Gourd. On the other hand, high content of Fe was observed in Cauliflower and Cucumber. Vetches and Lady’s Finger had shown high concentration of Ni. Cauliflower and Onion showed high amount of Pb. On the other hand, Cucumber and Cauliflower registered maximum content of Zn. The heavy metal concentration in vegetables was within the prescribed safety limits except Fe owing to iron-rich soil of the area. The distribution and characterization of heavy metals in vegetables was studied in detail and discussed in this paper. @JASEM There are 35 metals that concerned us because of occupational or residential exposure; 23 of these are the heavy elements or "heavy metals". Distribution of heavy metals in plant body depends upon availability and concentration of heavy metals as well as particular plant species and its population (Punz and Seighardt 1973). Many researchers have shown that some common vegetables are capable of accumulating high levels of metals from the soil (Garcia et al. 1981, Khan and Frankland 1983, Xiong 1998, Cobb et al. 2000). Certain species of Brassica (Cabbage) are hyperaccumulators of heavy metals into the edible tissues of plant (Xiong 1998). Many people could be at risk of adverse health effects from consuming common market vegetables cultivated in contaminated soil. Often the condition of the soil is unknown or undocumented; therefore, exposure to toxic levels can occur. Xu and Thornton (1985) suggested that there are health risks from consuming vegetables with elevated heavy metal concentrations. The populations most affected by heavy metal toxicity are pregnant women or very young children (Boon and Soltanpour 1992). Neurological disorders, CNS destruction, and cancers of various body organs are some of the reported effects of heavy metal poisoning (ATSDR 1994a,b, ATSDR 1999a, ATSDR 1999b, ATSDR 2000). Low birth weight and severe mental retardation of newborn children have been reported in some cases where the pregnant mother ingested toxic amounts of a heavy metal through direct or indirect consumption of vegetables (Mahaffey et al. 1981). Studies on Cd, Cu and Ni levels in vegetables from industrial and residential areas of Lagos City, Nigeria was carried out by Yusuf et al. 
(2002) which revealed that the levels of Cd, Cu and Ni in different edible vegetables along with its soils on which they were grown were higher in industrial areas than those of the residential areas due to pollution. Trace element and heavy metal concentrations in fruits and vegetables of the Gediz River region was intensively studied by Delibacak et al. (2002). Also edible portions of five varieties of green vegetables, viz. Amaranth, Chinese Cabbage, Cowpea leaves, Leafy Cabbage and Pumpkin leaves collected from several areas in Dar Es Salaam, Africa, were analyzed for Pb, Cd, Cr, Zn, Ni and Cu. There was a direct positive correlation between Zn and Pb levels in soils with the levels in vegetables. The relation was absent for other heavy metals (Othman et al. 2002). Characterization of Heavy Metals in Vegetables... NIRMAL KUMAR, J I ; HIREN SONI; RITA N. KUMAR 76 In India, similar kind of study was undertaken by Somasundaram et al. (2003) on heavy metal content of plant samples of sewage-irrigated area of Coimbatore district. Leafy vegetables were found with very high levels of heavy metal contamination including Cd, Zn, Cu, Mn and Pb. A similar research was carried out in Delhi and its surrounding regions on `Vegetables eating up vegetarians' found the presence of deadly heavy metals in vegetable samples collected from across the capital (The Hindu 2003). Rana and Nirmal Kumar (1988) and Nirmal Kumar et al. (1989) have investigated elemental composition of certain aquatic plants by Energy Dispersive X-Ray Analysis (EDAX). Heavy metals like Al, Si, Mn, Fe were found accumulated in Vallisnaria spiralis, Hydrilla verticillata and Azolla pinnata. Nirmal Kumar and Rita N. Kumar (1997) also carried out elemental composition of certain economic important plants by EDAX. Some studies on distribution and characterization of heavy metals in vegetable plants and its parts collected from organic farms and village agriculture fields around Anand province, Gujarat, was also carried out by Nirmal Kumar et al. (2004). The present investigation aims to study the distribution and characterization of heavy metals in common vegetables sold in vegetable market of Anand town, Milk City of Asia, Gujarat. MATERIALS AND METHODS In the present study, 21 fresh vegetables sold in vegetable market of Anand town, Gujarat were collected and brought to the laboratory. The common vegetables available in market are Bean Pods, Beet, Bitter Gourd, Bottle Gourd, Brinjal, Carrot, Cauliflower, Coriander, Cucumber, Drumstick, Gavar, Tindora, Ginger, Chilli, Lady’s Finger, Onion, Parwar, Pattarveli, Sweet Potato, Tomato and Turmeric. The edible parts of these vegetables like roots, stems, leaves, flowers and fruits were collected. The samples were brought to the laboratory and washed under tape-water gently. The moisture and water droplets were removed with the help of blotting papers. 0.5 gm dry powder was weighed by electric monopan balance (Dhona 200D) and digested with Sulphuric acid (H2SO4), Nitric acid (HNO3) and Hydrogen Peroxide (H2O2) (2:6:6) as prescribed by Saison et al. (2004). The samples were analyzed in Inductive Coupled Plasma Analyzer (ICPA) (PerkinElmer ICP Optima 3300 Rl, USA) at Sophisticated Instrumentation Center for Applied Research and Testing (SICART), Vallabh Vidyanagar, Gujarat. The concentration of heavy metals such as Cadmium (Cd), Cobalt (Co), Copper (Cu), Iron (Fe), Nickel (Ni), Lead (Pb) and Zinc (Zn) were analyzed. 
The reading for each sample was calculated as the average of duplicate sets. All values are expressed in mg/g dry weight.
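As a concrete illustration of this duplicate-averaging step, the following minimal Python sketch averages hypothetical duplicate ICP readings and flags them against equally hypothetical safety limits; neither the readings nor the limits are values from the study.

```python
# Minimal sketch of the duplicate-averaging step described above; the
# element names match the study, but the sample readings and the safety
# limits below are hypothetical placeholders, not values from the paper.
from statistics import mean

# Duplicate ICP readings per element (mg/g dry weight) for one vegetable.
duplicate_readings = {
    "Cd": (0.002, 0.003),
    "Fe": (1.85, 1.91),
    "Zn": (0.41, 0.39),
}

# Hypothetical permissible limits (mg/g dry weight) for illustration only.
safety_limits = {"Cd": 0.005, "Fe": 1.50, "Zn": 0.60}

for element, readings in duplicate_readings.items():
    value = mean(readings)  # average of the duplicate set
    status = "within" if value <= safety_limits[element] else "exceeds"
    print(f"{element}: {value:.3f} mg/g dry weight ({status} limit)")
```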
Automatic Generation of Personal Chinese Handwriting by Capturing the Characteristics of Personal Handwriting
Personal handwriting can add color to human communication. Handwriting, however, takes more time and is less favored than typing in the digital age. In this paper we propose an intelligent algorithm which can generate imitations of a person's Chinese handwriting, requiring only a very small set of training characters written by that person. Our method first decomposes the sample Chinese handwriting characters into a hierarchy of reusable components, called character components. During handwriting generation, the algorithm tries and compares different possible ways to compose the target character. The likeliness of a given personal handwriting generation result is evaluated according to the captured characteristics of the person's handwriting. We then find, among all the candidate generation results, an optimal one which maximizes the likeliness estimate. Experimental results show that our algorithm works reasonably well in the majority of cases and sometimes remarkably well, which was verified through comparison with ground-truth data and by a small-scale user survey.
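To make the composition search concrete, here is a minimal Python sketch of choosing, among candidate component combinations, the one that maximizes a likeliness score. The component inventory, slot structure, and scoring function are all hypothetical placeholders; the paper's hierarchical decomposition and learned likeliness model are not reproduced here.

```python
# Illustrative sketch of the candidate-composition search described in the
# abstract. The component names, the mapping from target character to
# alternative compositions, and the likeliness scores are hypothetical.
from itertools import product

# Hypothetical: each slot of the target character can be filled by one of
# several reusable components harvested from the writer's training samples.
candidate_components = {
    "left_part":  ["sample_radical_a1", "sample_radical_a2"],
    "right_part": ["sample_radical_b1", "sample_radical_b2", "sample_radical_b3"],
}

def likeliness(composition):
    # Placeholder for the learned likeliness estimate; here we simply
    # prefer lower-indexed samples to keep the sketch deterministic.
    return -sum(int(name[-1]) for name in composition.values())

def best_composition(candidates):
    slots = list(candidates)
    best, best_score = None, float("-inf")
    # Enumerate every way to compose the target character from components.
    for choice in product(*(candidates[s] for s in slots)):
        composition = dict(zip(slots, choice))
        score = likeliness(composition)
        if score > best_score:
            best, best_score = composition, score
    return best, best_score

print(best_composition(candidate_components))
```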
Bogus currency authorization using HSV techniques
In the present scenario, the Indian government has announced the demonetization of all Rs 500 and Rs 1000 Reserve Bank notes of the Mahatma Gandhi series and has introduced new Rs 500 and Rs 2000 notes to curb the funding of illegal activity in India. Even so, fake or bogus versions of the new notes circulate in society. The main objective of this work is to identify fake currency among the real. From the currency, the strip lines or continuous lines of real and fake notes are detected using edge detection techniques, and HSV techniques are used to work with the saturation and value of an input image, to achieve enhanced reliability and a dynamic way of detecting counterfeit currency.
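A minimal sketch of such an HSV-plus-edge-detection pipeline, assuming OpenCV; the file name and the Canny thresholds are illustrative choices, not parameters from the paper.

```python
# Toy HSV + edge-detection pipeline in the spirit of the abstract, using
# OpenCV. "banknote.jpg" and the 50/150 thresholds are assumptions.
import cv2

image = cv2.imread("banknote.jpg")          # hypothetical input image
assert image is not None, "could not read banknote.jpg"

# Convert to HSV so the saturation/value channels can be examined.
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
_, saturation, value = cv2.split(hsv)

# Detect the continuous security-strip lines with Canny edge detection;
# 50/150 are common hysteresis defaults, not tuned values.
edges = cv2.Canny(value, 50, 150)

# A genuine note should show a long, continuous vertical strip; counting
# edge pixels column-wise is one crude way to look for it.
column_strength = edges.sum(axis=0)
strip_column = int(column_strength.argmax())
print(f"strongest vertical edge response at column {strip_column}")
```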
A Modified Secure Remote Password (SRP) Protocol for Key Initialization and Exchange in Bluetooth Systems
This paper presents a novel solution to previously published weaknesses identified within the Bluetooth initialization key generation process. The current initialization key generation protocol is replaced by a more robust technique, the Secure Remote Password (SRP) protocol; the proposed algorithm is adapted to Bluetooth's constrained environment by replacing the exponentiation operations with elliptic curve multiplications. This is followed by an analytical performance evaluation of the new protocol. The analysis suggests the suitability of the proposed BT-EC-SRP solution for the constrained environment of Bluetooth devices.
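For orientation, here is a compact sketch of the core SRP exchange that the paper adapts, written with classic modular exponentiation; the proposed BT-EC-SRP variant would replace these exponentiations with elliptic curve point multiplications, which is not shown. The modulus, salt, and ephemeral values are toy constants with no security value.

```python
# Compact SRP-style exchange; both sides are shown in one place for
# brevity. The tiny Mersenne-prime modulus and fixed "random" values are
# for illustration only and are NOT secure.
import hashlib

N = 2**127 - 1   # toy prime modulus, NOT secure
g = 2

def H(*parts):
    data = b"".join(p.to_bytes((p.bit_length() + 7) // 8 or 1, "big")
                    if isinstance(p, int) else p for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

# Registration: the verifier v is stored instead of the password/PIN.
s = 0x1234                       # salt (fixed here for determinism)
x = H(s, b"bluetooth-pin")       # private value derived from salt + PIN
v = pow(g, x, N)                 # password verifier

# One run of the exchange.
a, b = 0x60975527, 0x152AB4D1    # ephemeral secrets (random in practice)
A = pow(g, a, N)
k = H(N, g)
B = (k * v + pow(g, b, N)) % N
u = H(A, B)

client_S = pow((B - k * pow(g, x, N)) % N, a + u * x, N)
server_S = pow(A * pow(v, u, N), b, N)
assert client_S == server_S      # both sides derive the same session secret
print("shared session key:", hex(H(client_S)))
```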
Automated video looping with progressive dynamism
Given a short video, we create a representation that captures a spectrum of looping videos with varying levels of dynamism, ranging from a static image to a highly animated loop. In such a progressively dynamic video, scene liveliness can be adjusted interactively using a slider control. Applications include background images and slideshows, where the desired level of activity may depend on personal taste or mood. The representation also provides a segmentation of the scene into independently looping regions, enabling interactive local adjustment of dynamism. For a landscape scene, this control might correspond to selective animation and deanimation of grass motion, water ripples, and swaying trees. Converting arbitrary video to looping content is a challenging research problem. Unlike prior work, we explore an optimization in which each pixel automatically determines its own looping period. The resulting nested segmentation of static and dynamic scene regions forms an extremely compact representation.
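The per-pixel period search can be illustrated with a toy version, assuming a small random video; the real method solves a spatially regularized optimization, whereas this sketch scores each (start, period) pair independently per pixel.

```python
# Toy per-pixel loop-period search: each pixel independently picks the
# (start, period) whose loop endpoints match best. Video shape and the
# period range are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
video = rng.random((60, 32, 32, 3))   # (frames, height, width, RGB), hypothetical

min_period, n_frames = 8, video.shape[0]
best_cost = np.full(video.shape[1:3], np.inf)
best_start = np.zeros(video.shape[1:3], dtype=int)
best_period = np.zeros(video.shape[1:3], dtype=int)

for period in range(min_period, n_frames // 2):
    for start in range(n_frames - period):
        # Color mismatch between the loop's endpoint frames, per pixel.
        cost = np.abs(video[start] - video[start + period]).sum(axis=-1)
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_start[better] = start
        best_period[better] = period

print("median per-pixel loop period:", int(np.median(best_period)))
```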
Training Shallow and Thin Networks for Acceleration via Knowledge Distillation with Conditional Adversarial Networks
There is increasing interest in accelerating neural networks for real-time applications. We study the student-teacher strategy, in which a small and fast student network is trained with auxiliary information learned from a large and accurate teacher network. We propose to use conditional adversarial networks to learn the loss function for transferring knowledge from teacher to student. The proposed method is particularly effective for relatively small student networks. Moreover, experimental results show the effect of network size when modern networks are used as students. We empirically study the trade-off between inference time and classification accuracy, and provide suggestions on choosing a proper student network.
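A minimal PyTorch sketch of this idea: a discriminator is trained to tell teacher logits from student logits, and the student learns by fooling it. Network sizes, the discriminator design, and the training schedule are illustrative assumptions, not the paper's configuration.

```python
# Distillation with an adversarial (learned) loss on output logits.
# "Conditional" here means the discriminator also sees the input features.
import torch
import torch.nn as nn

n_classes = 10
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, n_classes))
student = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, n_classes))
discriminator = nn.Sequential(
    nn.Linear(32 + n_classes, 64), nn.ReLU(), nn.Linear(64, 1))

bce = nn.BCEWithLogitsLoss()
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(100):
    x = torch.randn(64, 32)                  # stand-in training batch
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)

    # Discriminator update: teacher logits labeled real, student's fake.
    d_real = discriminator(torch.cat([x, t_logits], dim=1))
    d_fake = discriminator(torch.cat([x, s_logits.detach()], dim=1))
    loss_d = bce(d_real, torch.ones_like(d_real)) + \
             bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Student update: fool the discriminator (the learned distillation loss).
    d_out = discriminator(torch.cat([x, s_logits], dim=1))
    loss_s = bce(d_out, torch.ones_like(d_out))
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()
```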
Comparison of Traditional Breeding and Transgenesis in Farmed Fish with Implications for Growth Enhancement and Fitness
Improvements in the performance of fish species used in aquaculture are being accomplished using a variety of approaches, including both traditional and molecular genetic methodologies. Historical gains in productivity have been achieved by domestication, selection, interspecific and interstrain crossbreeding, polyploidy, and synthesis of monosex populations. More recently, transgenesis has been explored as a technique to enhance growth rate and other performance characteristics. Domestication of species, without directed selection, can yield improvement in production characteristics. Domesticated strains of farmed fish usually grow faster than wild strains, and this effect can be achieved fairly rapidly: for example, in channel catfish, Ictalurus punctatus, domestication can improve the growth rate by approximately 2–6% per generation. In contrast, directed selection (mass selection) for body weight in fish has resulted in up to a 55% increase in body weight after four to ten generations of selection. In channel catfish, correlated responses to selection include higher dressing percentage, but a decreased ability to tolerate low concentrations of dissolved oxygen. Intraspecific crossbreeding can increase growth in channel catfish, common carp and salmonids, but crossbreeding does not always result in heterosis. Interspecific hybridization seldom results in overdominant performance in fish. However, one catfish hybrid, channel catfish female × blue catfish (I. furcatus) male, exhibits improved performance for several traits including growth, disease resistance, survival, tolerance of low dissolved oxygen, angling vulnerability, seinability, and dressing and fillet percentage. Ploidy manipulation and sex-control technologies have also played an important role in enhancing production performance. Induction of triploidy does not improve performance in catfish hybrids, but in salmonids triploidy can enhance flesh quality by preventing sexual maturation, although growth rate is somewhat reduced.
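A quick arithmetic check relates these figures: compounding the 2–6% per-generation domestication gain over several generations can be compared with the up-to-55% gain reported for four to ten generations of directed selection. The short Python snippet below works this out; it is purely illustrative.

```python
# Compound a per-generation gain into a cumulative gain for comparison
# with the abstract's up-to-55% directed-selection figure.
for rate in (0.02, 0.06):
    for generations in (4, 10):
        cumulative = (1 + rate) ** generations - 1
        print(f"{rate:.0%}/generation over {generations} generations "
              f"-> {cumulative:.0%} cumulative gain")
```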
Confidence Measures for Neural Network Classifiers
Neural networks are commonly used in classification and decision tasks. In this paper, we focus on the problem of the local confidence of their results. We review some notions from statistical decision theory that offer insight into the determination and use of confidence measures for classification with neural networks. We then present an overview of existing confidence measures and finally propose a simple measure which combines the benefits of the probabilistic interpretation of network outputs and the estimation of the quality of the model by bootstrap error estimation. We discuss empirical results on a real-world application and an artificial problem, and show that the simplest measure often behaves better than more sophisticated ones, but may be dangerous in certain situations.
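In the spirit of this combination, the sketch below pairs the probabilistic confidence of a classifier's output with the disagreement across bootstrap replicas. For brevity it uses scikit-learn MLP classifiers on synthetic data, which is an assumption rather than the paper's setup.

```python
# Bootstrap-based confidence: the mean predicted probability gives the
# probabilistic confidence, and the spread across bootstrap replicas
# estimates model quality. Data and model choices are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
rng = np.random.default_rng(0)

probas = []
for b in range(10):                          # 10 bootstrap replicas
    idx = rng.integers(0, len(X), len(X))    # resample with replacement
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=b)
    clf.fit(X[idx], y[idx])
    probas.append(clf.predict_proba(X[:5]))  # probabilities for 5 points

probas = np.stack(probas)                    # (replicas, points, classes)
mean_p = probas.mean(axis=0)                 # probabilistic confidence
spread = probas.std(axis=0)                  # bootstrap disagreement
for i, (p, s) in enumerate(zip(mean_p.max(axis=1), spread.max(axis=1))):
    print(f"point {i}: confidence {p:.2f}, bootstrap spread {s:.2f}")
```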
Intensity- and Duration-Based Options to Regulate Endurance Training
The regulation of endurance training is usually based on the prescription of exercise intensity. Exercise duration, another important variable of training load, is rarely prescribed by individual measures and is mostly set from experience. As the specific exercise duration at any intensity plays a substantial role regarding the different kinds of cellular stressors, the degree and kind of fatigue, and the training effects, concepts integrating the prescription of both intensity and duration within one model are needed. A recent approach along these lines was the critical power concept, which seems to have a physiological basis; however, the mathematical approach of this concept does not allow the application of the three-zone/two-threshold model of metabolism and its different physiological consequences. Here we show how exercise intensity and duration can be prescribed jointly on an individual basis by applying the power/speed to distance/time relationship. The concept is based both on the differentiation of intensities by two turn points derived from lactate or gas exchange variables, and on the relationship between power (or velocity) and duration (or distance). The turn points define three zones of intensity with distinct acute metabolic, hormonal, and cardio-respiratory responses to endurance exercise. A maximal duration exists for any single power or velocity, as described by the power-duration relationship. Using percentages of the maximal duration allows the regulation of fatigue, recovery time, and adaptation for any single endurance training session. Four domains of duration with respect to induced fatigue can be derived from the maximal duration obtained from the power-duration curve. For any micro-cycle, target intensities and durations may be chosen on an individual basis. The model described here is the first conceptual framework integrating physiologically defined intensities and fatigue-related durations to optimize high-performance exercise training.
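A worked example of the power-duration side of this model, using the standard two-parameter critical-power formulation t_max = W'/(P - CP) as a stand-in for the individual power-duration curve; the CP and W' values below are hypothetical, not data from the article.

```python
# Maximal duration at a given power from the two-parameter critical-power
# model, and fractions of that duration as a way to regulate fatigue.
CP = 250.0         # critical power, watts (hypothetical athlete)
W_prime = 20000.0  # finite work capacity above CP, joules (hypothetical)

def maximal_duration(power_w):
    """Longest sustainable duration (s) at a power above CP."""
    assert power_w > CP, "model only applies above critical power"
    return W_prime / (power_w - CP)

for power in (300, 350, 400):
    t_max = maximal_duration(power)
    # Prescribing a fraction of t_max regulates the induced fatigue.
    for fraction in (0.5, 0.8):
        print(f"{power} W: t_max {t_max:.0f} s, "
              f"{fraction:.0%} of t_max -> {fraction * t_max:.0f} s")
```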
Culture and politics
Introductory Chapter: On Cultural Analysis of Politics Part 1: On the Methodology of Cultural Analysis 1. Cultural Enquiry: Scientific Objectivity and Ethical Neutrality 2. Cultural Studies: Subjectivism and 'Verstehen' Part 2: Elements of Culture 3. Ethnicity 4. Religion 5. Tradition 6. Religion or Tradition: The Lack of Modernization in the Arab world Part 3: Research into Value-Orientations: Some Critiques 7. Methodology for the Enquiry into Value-Orientations 8. Gender and Homosexuality 9. Basic Value-Orientations 10. Social Capital: Identification of the Concept(s) 11. Ambiguity of Post-Materialism 12. What are Self-Expression or Emancipatory Values? Part 4: Exploration into New Values 13. Happiness or Life Satisfaction 14. Culture and Ecology 15. Globalization Values 16. Culture and Conflicts 17. Causes of Cultural Conflicts 18. New Aspects of the Attitudes towards Homosexuality and Gender