On the Number of RF Chains and Phase Shifters, and Scheduling Design With Hybrid Analog–Digital Beamforming
This paper considers hybrid beamforming (HB) for downlink multiuser massive multiple-input multiple-output (MIMO) systems with frequency-selective channels. The proposed HB design employs sets of digitally controlled, fixed-phase paired phase shifters (PSs) and switches. For this system, we first determine the number of radio frequency (RF) chains and PSs required for the proposed HB to achieve the same performance as digital beamforming (DB), which utilizes N (the number of transmitter antennas) RF chains. We show that the performance of the DB can be matched by our HB using only rt RF chains and 2rt(N-rt+1) PSs, where rt ≤ N is the rank of the combined digital precoder matrices of all subcarriers. Second, we provide a simple and novel approach to reduce the number of PSs with only negligible performance degradation. Numerical results reveal that only 20-40 PSs per RF chain are sufficient for practically relevant parameter settings. Finally, for the scenario where the deployed number of RF chains (Na) is less than rt, we propose a simple user scheduling algorithm to select the best set of users in each subcarrier. Simulation results validate the theoretical expressions and demonstrate the superiority of the proposed HB design over existing HB designs in both flat-fading and frequency-selective channels.
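The claim that a pair of constant-modulus phase shifters can realize an arbitrary digital precoder entry rests on a standard identity: any complex gain a with |a| ≤ 2c is the sum of two phasors of fixed magnitude c. A minimal numeric sketch of this decomposition (the function name and the normalization c are illustrative, not from the paper):

```python
import numpy as np

def two_ps_decompose(a, c=1.0):
    """Split complex gain a (|a| <= 2c) into two constant-modulus
    phase-shifter terms: a = c*exp(j*t1) + c*exp(j*t2)."""
    mag = abs(a)
    assert mag <= 2 * c
    phi = np.angle(a)
    delta = np.arccos(mag / (2 * c))  # half-angle between the two phasors
    return phi + delta, phi - delta

# any precoder entry within the amplitude budget is realized exactly
a = 0.7 - 0.4j
t1, t2 = two_ps_decompose(a)
recon = np.exp(1j * t1) + np.exp(1j * t2)
```

Since c·e^{j(φ+δ)} + c·e^{j(φ−δ)} = 2c·cos(δ)·e^{jφ}, choosing δ = arccos(|a|/2c) recovers a exactly, which is why two PSs per precoder entry suffice.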
Sympathetic Nerve Reconstruction for Compensatory Hyperhidrosis after Sympathetic Surgery for Primary Hyperhidrosis
We performed sympathetic nerve reconstruction using an intercostal nerve in patients with severe compensatory hyperhidrosis after sympathetic surgery for primary hyperhidrosis, and analyzed the surgical results. From February 2004 to August 2007, sympathetic nerve reconstruction using an intercostal nerve was performed in 19 patients. These patients presented with severe compensatory hyperhidrosis after thoracoscopic sympathetic surgery for primary hyperhidrosis. Reconstruction of the sympathetic nerve was performed by thoracoscopic surgery, except in 1 patient with severe pleural adhesion. The median interval between the initial sympathetic surgery and sympathetic nerve reconstruction was 47.2 (range: 3.5-110.7) months. Compensatory sweating after the reconstruction surgery improved in 9 patients, 3 of whom had markedly improved symptoms. Sympathetic nerve reconstruction using an intercostal nerve may be a useful surgical option for severe compensatory hyperhidrosis following sympathetic surgery for primary hyperhidrosis.
Effect of a low dose of sea buckthorn berries on circulating concentrations of cholesterol, triacylglycerols, and flavonols in healthy adults.
BACKGROUND Epidemiological studies indicate beneficial effects of flavonoids on cardiovascular disease (CVD) risk. AIM OF THE STUDY To study the effect of flavonoid-rich sea buckthorn berries (SBB) on circulating lipid markers associated with CVD risk and on plasma flavonol concentrations. We also investigated whether changes in circulating flavonol concentrations correlate with the SBB-induced changes in C-reactive protein (CRP) concentration observed previously. SUBJECTS AND METHODS In all, 229 healthy participants completed the randomized double-blind study and consumed 28 g of SBB or placebo daily for 3 months. Fasting blood samples for the analysis of lipid markers and flavonols were obtained at the beginning and end of the study. RESULTS Compared with placebo, the consumption of SBB significantly increased the plasma concentrations of the flavonols quercetin and isorhamnetin [treatment differences 3.0 ng/ml (P = 0.03) and 3.9 ng/ml (P < 0.01), respectively]. The increase in kaempferol concentration was not significant [treatment difference 0.7 ng/ml (P = 0.08)]. SBB did not affect serum total, HDL, or LDL cholesterol, or serum triacylglycerol concentrations. There was no correlation between the changes in flavonol and CRP concentrations of participants. CONCLUSIONS The consumption of SBB significantly increased the fasting plasma concentrations of quercetin and isorhamnetin, indicating that it is a good dietary source of flavonols. However, this did not translate into changes in the circulating concentrations of lipid markers in healthy, normolipidemic adults consuming healthy diets.
A Large-Scale Hidden Semi-Markov Model for Anomaly Detection on User Browsing Behaviors
Many methods designed to defend against distributed denial of service (DDoS) attacks focus on the IP and TCP layers rather than the higher layers, and are therefore not suitable for handling the new type of attack based on the application layer. In this paper, we introduce a new scheme to achieve early attack detection and filtering for application-layer DDoS attacks. An extended hidden semi-Markov model is proposed to describe the browsing behaviors of web surfers. To reduce the computational cost introduced by the model's large state space, a novel forward algorithm is derived for the online implementation of the model based on the M-algorithm. The entropy of a user's HTTP request sequence, as fitted to the model, is used as a criterion to measure the user's normality. Finally, experiments are conducted to validate our model and algorithm.
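As a simplified illustration of the entropy criterion (with a plain first-order Markov model standing in for the paper's extended hidden semi-Markov model, and made-up page-transition probabilities), the average negative log-likelihood of a request sequence separates mechanical attack traffic from normal browsing:

```python
import numpy as np

def avg_neg_loglik(seq, trans, init):
    """Average negative log-likelihood (empirical entropy rate) of a
    request sequence under a first-order Markov model of page visits."""
    ll = np.log(init[seq[0]])
    for a, b in zip(seq, seq[1:]):
        ll += np.log(trans[a][b])
    return -ll / len(seq)

# toy model: two page types; normal users mostly alternate between them
init = np.array([0.5, 0.5])
trans = np.array([[0.1, 0.9],
                  [0.9, 0.1]])
normal = [0, 1, 0, 1, 0, 1]
attack = [0, 0, 0, 0, 0, 0]   # repeated identical requests score as abnormal
h_normal = avg_neg_loglik(normal, trans, init)
h_attack = avg_neg_loglik(attack, trans, init)
```

A threshold on this score then flags users whose sequences fit the model poorly, which is the filtering idea the abstract describes.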
From single cells to deep phenotypes in cancer
In recent years, major advances in single-cell measurement systems have included the introduction of high-throughput versions of traditional flow cytometry that are now capable of measuring intracellular network activity, the emergence of isotope labels that can enable the tracking of a greater variety of cell markers and the development of super-resolution microscopy techniques that allow measurement of RNA expression in single living cells. These technologies will facilitate our capacity to catalog and bring order to the inherent diversity present in cancer cell populations. Alongside these developments, new computational approaches that mine deep data sets are facilitating the visualization of the shape of the data and enabling the extraction of meaningful outputs. These applications have the potential to reveal new insights into cancer biology at the intersections of stem cell function, tumor-initiating cells and multilineage tumor development. In the clinic, they may also prove important not only in the development of new diagnostic modalities but also in understanding how the emergence of tumor cell clones harboring different sets of mutations predispose patients to relapse or disease progression.
Toward Standardization of Authenticated Caller ID Transmission
With the cost of telecommunication becoming as cheap as Internet data, the telephone network today is rife with telephone spam and scams. In recent years, the U.S. government has received record numbers of complaints on phone fraud and unwanted calls. Caller ID is at the heart of stopping telephone spam -- a variety of apps and services, including law enforcement, rely on caller ID information to defend against unwanted calls. However, spammers are using spoofed caller IDs to defeat call blockers, to evade identification, and to further a variety of scams. To provide a solution to this problem, this article proposes a standardized authentication scheme for caller ID that enables the possibility of a security indicator for telecommunication. The goal of this proposal is to help prevent users from falling victim to telephone spam and scams, as well as provide a foundation for future and existing defenses to stop unwanted telephone communication based on caller ID information.
B-trees, shadowing, and clones
B-trees are used by many file systems to represent files and directories. They provide guaranteed logarithmic time key-search, insert, and remove. File systems like WAFL and ZFS use shadowing, or copy-on-write, to implement snapshots, crash recovery, write-batching, and RAID. Serious difficulties arise when trying to use b-trees and shadowing in a single system. This article is about a set of b-tree algorithms that respects shadowing, achieves good concurrency, and implements cloning (writeable snapshots). Our cloning algorithm is efficient and allows the creation of a large number of clones. We believe that using our b-trees would allow shadowing file systems to better scale their on-disk data structures.
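The core of shadowing can be sketched in a few lines: an update never modifies a node in place but clones every node on the root-to-leaf path, so the old root remains a consistent snapshot. This is a toy dictionary-based tree, not the paper's concurrent b-tree algorithms:

```python
class Node:
    """Toy tree node; stands in for a b-tree node."""
    def __init__(self, keys, children=None):
        self.keys = dict(keys)
        self.children = dict(children or {})

def shadow_insert(node, path, key, value):
    # Copy-on-write: clone every node on the root-to-leaf path;
    # the original version stays untouched and serves as a snapshot.
    clone = Node(node.keys, node.children)
    if not path:
        clone.keys[key] = value
    else:
        head, rest = path[0], path[1:]
        clone.children[head] = shadow_insert(node.children[head],
                                             rest, key, value)
    return clone

old_root = Node({}, {"leaf": Node({"a": 1})})
new_root = shadow_insert(old_root, ["leaf"], "b", 2)
```

The new root sees the inserted key while the old root still sees the pre-update leaf, which is exactly the property that makes snapshots and crash recovery cheap, and the difficulty the paper addresses is doing this efficiently for b-trees with rebalancing and many clones.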
Six-Step Operation of PMSM With Instantaneous Current Control
Six-step operation has many advantages in permanent-magnet synchronous machine (PMSM) drives, such as maximum power utilization and a widened flux-weakening region. However, because the inverter output is fully utilized, saturation of the current regulator makes it difficult to maintain instantaneous current control capability. Accordingly, most conventional studies implement six-step operation by voltage angle control without current regulation, whose dynamic performance is quite unsatisfactory. This paper proposes a control scheme for the six-step operation of a PMSM with enhanced dynamic current-control performance. Through the collaborative operation of dynamic overmodulation, flux weakening, and a technique for enhanced dynamic performance, six-step operation is realized without losing instantaneous current control capability. Simulations and experiments are carried out to verify the effectiveness of the proposed control scheme. The experimental results show a 17% extension of the constant-torque region and a 27% enhancement of torque capability at three times the base speed.
Effects of wine, alcohol and polyphenols on cardiovascular disease risk factors: evidences from human studies.
AIMS The aim of this review was to summarize current knowledge of the cardiovascular benefits of moderate alcohol consumption, and to analyze the effects of the different types of alcoholic beverages. METHODS Systematic review of human clinical studies and meta-analyses related to moderate alcohol consumption and cardiovascular disease (CVD) from 2000 to 2012. RESULTS Heavy or binge alcohol consumption unquestionably leads to increased morbidity and mortality. Nevertheless, moderate alcohol consumption, especially of alcoholic beverages rich in polyphenols such as wine and beer, seems to confer cardiovascular protective effects in patients with documented CVD and even in healthy subjects. CONCLUSIONS In conclusion, wine and beer (especially red wine) seem to confer greater cardiovascular protection than spirits because of their polyphenolic content. However, caution should be taken when making recommendations related to alcohol consumption.
The cultural construction of self-enhancement: an examination of group-serving biases.
Self-serving biases, found routinely in Western samples, have not been observed in Asian samples. Yet given the orientation toward individualism and collectivism in these 2 cultures, respectively, it is imperative to examine whether parallel differences emerge when the target of evaluation is the group. It may be that Asians show a group-serving bias parallel to the Western self-serving bias. In 2 studies, group-serving biases were compared across European Canadian, Asian Canadian, and Japanese students. Study 1 revealed that Japanese students evaluated a family member less positively than did both groups of Canadian students. Study 2 replicated this pattern with students' evaluations of their universities. The data suggest that cultural differences in enhancement biases are robust, generalizing to individuals' evaluations of their groups.
Relative Entropy Inverse Reinforcement Learning
We consider the problem of imitation learning where the examples, demonstrated by an expert, cover only a small part of a large state space. Inverse Reinforcement Learning (IRL) provides an efficient tool for generalizing the demonstration, based on the assumption that the expert is optimally acting in a Markov Decision Process (MDP). Most of the past work on IRL requires that a (near)optimal policy can be computed for different reward functions. However, this requirement can hardly be satisfied in systems with a large, or continuous, state space. In this paper, we propose a model-free IRL algorithm, where the relative entropy between the empirical distribution of the state-action trajectories under a uniform policy and their distribution under the learned policy is minimized by stochastic gradient descent. We compare this new approach to well-known IRL algorithms using approximate MDP models. Empirical results on simulated car racing, gridworld and ball-in-a-cup problems show that our approach is able to learn good policies from a small number of demonstrations.
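A rough sketch of the update at the heart of this approach: for linear reward weights, the stochastic gradient is the expert feature expectation minus an importance-weighted feature expectation of trajectories sampled from the uniform policy, with weights proportional to the exponentiated reward. The function name and the toy two-feature setup are illustrative, not from the paper:

```python
import numpy as np

def relent_irl_step(w, expert_f, sample_f, lr=0.1):
    """One gradient step: pull reward weights toward the expert's feature
    expectation and away from the exp(reward)-weighted sample average."""
    scores = sample_f @ w
    iw = np.exp(scores - scores.max())   # self-normalized importance weights
    iw /= iw.sum()
    return w + lr * (expert_f - iw @ sample_f)

expert_f = np.array([1.0, 0.0])          # the expert visits feature 0
sample_f = np.array([[1.0, 0.0],         # trajectory features sampled
                     [0.0, 1.0]])        # under a uniform policy
w = np.zeros(2)
for _ in range(50):
    w = relent_irl_step(w, expert_f, sample_f)
```

No MDP model or optimal-policy solver appears anywhere in the update, which is the "model-free" property the abstract emphasizes.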
Notes on Convolutional Neural Networks
We discuss the derivation and implementation of convolutional neural networks, followed by an extension which allows one to learn sparse combinations of feature maps. The derivation we present is specific to two-dimensional data and convolutions, but can be extended without much additional effort to an arbitrary number of dimensions. Throughout the discussion, we emphasize efficiency of the implementation, and give small snippets of MATLAB code to accompany the equations.
A novel functional VKORC1 promoter polymorphism is associated with inter-individual and inter-ethnic differences in warfarin sensitivity.
Warfarin, a commonly prescribed anticoagulant, exhibited large inter-individual and inter-ethnic differences in the dose required for its anticoagulation effect. Asian populations, including Chinese, require a much lower maintenance dose than Caucasians, for which the mechanisms still remain unknown. We determined DNA sequence variants in CYP2C9 and VKORC1 in 16 Chinese patients having warfarin sensitivity (< or = 1.5 mg/day, n = 11) or resistance (> or = 6.0 mg/day, n = 5), 104 randomly selected Chinese patients receiving warfarin, 95 normal Chinese controls and 92 normal Caucasians. We identified three CYP2C9 variants, CYP2C9*3, T299A and P382L, in four warfarin-sensitive patients. A novel VKORC1 promoter polymorphism (-1639 G > A) presented in the homozygous form (genotype AA) was found in all warfarin-sensitive patients. The resistant patients were either AG or GG. Among the 104 randomly selected Chinese patients receiving warfarin, AA genotype also had lower dose than the AG/GG genotype (P < 0.0001). Frequencies of AA, AG and GG genotypes were comparable in Chinese patients receiving warfarin (79.7, 17.6 and 2.7%) and normal Chinese controls (82, 18 and 0%), but differed significantly from Caucasians (14, 47 and 39%) (P < 0.0001). The promoter polymorphism abolished the E-box consensus sequences and dual luciferase assay revealed that VOKRC1 promoter with the G allele had a 44% increase of activity when compared with the A allele. The differences in allele frequencies of A/G allele and its levels of VKORC1 promoter activity may underscore the inter-individual differences in warfarin dosage as well as inter-ethnic differences between Chinese and Caucasians.
On Evaluation of 6D Object Pose Estimation
A pose of a rigid object has 6 degrees of freedom, and its full knowledge is required in many robotic and scene understanding applications. Evaluation of 6D object pose estimates is not straightforward: object pose may be ambiguous due to object symmetries and occlusions, i.e. there can be multiple object poses that are indistinguishable in the given image and should therefore be treated as equivalent. The paper defines 6D object pose estimation problems, proposes an evaluation methodology and introduces three new pose error functions that deal with pose ambiguity. The new error functions are compared with functions commonly used in the literature and shown to remove certain types of non-intuitive outcomes. Evaluation tools are provided at: https://github.com/thodan/obj_pose_eval
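For context, two pose error functions commonly used in the literature (the baselines such new functions are compared against) are ADD, the mean distance between corresponding transformed model points, and ADI, its closest-point variant that tolerates symmetry. A minimal numpy sketch, with a made-up 4-fold symmetric object:

```python
import numpy as np

def add_err(R_est, t_est, R_gt, t_gt, pts):
    """ADD: mean distance between *corresponding* transformed model points."""
    d = (pts @ R_est.T + t_est) - (pts @ R_gt.T + t_gt)
    return np.linalg.norm(d, axis=1).mean()

def adi_err(R_est, t_est, R_gt, t_gt, pts):
    """ADI: mean distance to the *closest* transformed point; poses that
    are indistinguishable due to symmetry score (near) zero."""
    est = pts @ R_est.T + t_est
    gt = pts @ R_gt.T + t_gt
    d = np.linalg.norm(est[:, None, :] - gt[None, :, :], axis=2)
    return d.min(axis=1).mean()

# a 4-fold symmetric object: a 90-degree rotation about z is
# indistinguishable in the image, so ADD penalizes it but ADI does not
pts = np.array([[1., 0, 0], [0, 1, 0], [-1, 0, 0], [0, -1, 0]])
R90 = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])
I, t0 = np.eye(3), np.zeros(3)
```

The gap between the two scores on this example is the kind of non-intuitive outcome that motivates ambiguity-aware error functions.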
Malton: Towards On-Device Non-Invasive Mobile Malware Analysis for ART
Understanding malware's behaviors is an essential step toward developing effective countermeasures. Though a number of systems have been proposed to analyze Android malware, they are limited by the incomplete view obtained from inspecting a single layer. Worse, various new techniques (e.g., packing and anti-emulation) employed by the latest malware samples render these systems ineffective. In this paper, we propose Malton, a novel on-device, non-invasive analysis platform for the new Android runtime (i.e., the ART runtime). As a dynamic analysis tool, Malton runs on real mobile devices and provides a comprehensive view of malware's behaviors by conducting multi-layer monitoring and information flow tracking, as well as efficient path exploration. We have carefully evaluated Malton using real-world malware samples. The experimental results show that Malton is more effective than existing tools, with the capability to analyze sophisticated malware samples and provide a comprehensive view of their malicious behaviors.
An Algorithm for Finding the Minimum Cost of Storing and Regenerating Datasets in Multiple Clouds
The proliferation of cloud computing allows users to flexibly store, re-compute or transfer large generated datasets with multiple cloud service providers. However, under the pay-as-you-go model, the total cost of using cloud services depends on the consumption of storage, computation and bandwidth resources, the three key cost factors for IaaS-based cloud resources. To reduce the total cost for data, given cloud service providers with different pricing models on their resources, users can flexibly choose a cloud service to store a generated dataset, or delete it and choose a cloud service to regenerate it whenever it is reused. However, finding the minimum cost is a complicated and hitherto unsolved problem. In this paper, we propose a novel algorithm that calculates the minimum cost for storing and regenerating datasets in clouds, i.e., whether datasets should be stored or deleted, and furthermore where to store or regenerate them whenever they are reused. This minimum cost also achieves the best trade-off among computation, storage and bandwidth costs in multiple clouds. Comprehensive analysis and rigorous theorems guarantee the theoretical soundness of the paper, and general (random) simulations conducted with popular cloud service providers' pricing models demonstrate the excellent performance of our approach.
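The per-dataset decision can be illustrated with a deliberately simplified cost comparison. All prices and field names below are made up for illustration; the paper's algorithm additionally has to account for regeneration chains, where regenerating one deleted dataset may require regenerating its deleted predecessors first:

```python
def min_cost(providers, comp_hours, gb_stored, gb_moved, uses_per_month):
    """Pick the cheaper of (a) storing the dataset with some provider or
    (b) deleting it and regenerating it on each reuse (compute + transfer)."""
    best = None
    for name, p in providers.items():
        options = {
            "store": p["storage_gb"] * gb_stored,
            "regenerate": uses_per_month * (p["compute_hr"] * comp_hours
                                            + p["egress_gb"] * gb_moved),
        }
        for action, cost in options.items():
            if best is None or cost < best[2]:
                best = (name, action, cost)
    return best

providers = {  # illustrative monthly prices, not real price sheets
    "A": {"storage_gb": 0.02, "compute_hr": 0.50, "egress_gb": 0.09},
    "B": {"storage_gb": 0.05, "compute_hr": 0.10, "egress_gb": 0.09},
}
choice = min_cost(providers, comp_hours=2, gb_stored=1000,
                  gb_moved=10, uses_per_month=1)
```

With these numbers, rarely reused bulky data is cheapest to delete and regenerate on the cheap-compute provider, which is the store-versus-regenerate trade-off the abstract describes.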
Analysis of quadratic dickson based envelope detectors for IoE sensor node applications
This paper presents a study of passive Dickson-based envelope detectors operating in the quadratic small-signal regime, specifically intended for use in the RF front ends of the sensing units of IoE sensor nodes. Critical parameters such as open-circuit voltage sensitivity (OCVS), charge time, input impedance, and output noise are studied, and simplified circuit models are proposed to predict the behavior of the detector, yielding practical design intuitions. There is strong agreement between model predictions, simulation results and measurements of 15 representative test structures fabricated in a 130 nm RF CMOS process.
An Introduction to the Theory of Mechanism Design
The authors will donate all payments that they receive from the publisher for this book to Amnesty International.
Wireless sensor and actor networks: research challenges
Wireless sensor and actor networks (WSANs) refer to a group of sensors and actors linked by a wireless medium to perform distributed sensing and acting tasks. The realization of WSANs needs to satisfy the requirements introduced by the coexistence of sensors and actors. In WSANs, sensors gather information about the physical world, while actors take decisions and then perform appropriate actions upon the environment, which allows a user to effectively sense and act from a distance. In order to provide effective sensing and acting, coordination mechanisms are required among sensors and actors. Moreover, to perform right and timely actions, sensor data must be valid at the time of acting. This paper explores sensor-actor and actor-actor coordination and describes research challenges for coordination and communication problems.
Private Enforcement of Securities Law in China: A Ten-Year Retrospective
This paper undertakes the first comprehensive study of the nature and extent of private securities litigation in China, critically examining whether private securities litigation was effectively carried out during the first decade after it was formally permitted by China's top court in 2002. Methodologically, it goes beyond the doctrinal analysis common in the extant literature on the subject to empirically investigate both the quantity and quality of securities civil actions in China. It finds that there were a much-lower-than-expected number of securities civil suits in China during the ten-year study period, but that the percentage of recovery generated by securities civil suits in China is significantly higher than that in the US. The policy implications of the empirical findings are discussed with a view to improving the Chinese legal regime for private securities litigation. In particular, the paper casts doubt on the popular belief that China should adopt US-style class actions. Further, as the local courts are found to be dysfunctional in handling securities civil cases, a reform proposal is made to advocate choice of court for bringing securities civil actions in China.
Emission Factors for High-Emitting Vehicles Based on On-Road Measurements of Individual Vehicle Exhaust with a Mobile Measurement Platform.
Fuel-based emission factors for 143 light-duty gasoline vehicles (LDGVs) and 93 heavy-duty diesel trucks (HDDTs) were measured in Wilmington, CA using a zero-emission mobile measurement platform (MMP). The frequency distributions of emission factors of carbon monoxide (CO), nitrogen oxides (NOx), and particle mass with aerodynamic diameter below 2.5 μm (PM2.5) varied widely, whereas the averages of the individual vehicle emission factors were comparable to those reported in previous tunnel and remote sensing studies, as well as to the predictions of the Emission Factors (EMFAC) 2007 mobile source emission model for Los Angeles County. Variation in emissions due to different driving modes (idle, low- and high-speed acceleration, low- and high-speed cruise) was found to be relatively small in comparison to intervehicle variability and did not appear to interfere with the identification of high emitters, defined as vehicles whose emissions were more than 5 times the fleet-average values. Using this definition, approximately 5% of the LDGVs and HDDTs measured were high emitters. Among the 143 LDGVs, the average emission factors of NOx, black carbon (BC), PM2.5, and ultrafine particles (UFP) would be reduced by 34%, 39%, 44%, and 31%, respectively, by removing the highest-emitting 5% of vehicles, whereas the CO emission factor would be reduced by 50%. The emission distributions of the 93 HDDTs measured were even more skewed: approximately half of the NOx and CO fleet-average emission factors and more than 60% of the PM2.5, UFP, and BC fleet-average emission factors would be eliminated by removing the highest-emitting 5% of HDDTs. Furthermore, high emissions of BC, PM2.5, and NOx tended to cluster among the same vehicles.
mm Wave Initial Cell Search Analysis under UE Rotational Motion
Millimeter wave (mmW) communication has emerged as a promising component of the access link for 5G cellular systems. In order to overcome the higher free-space path loss at these frequencies, high-gain, and therefore highly directional, antennas are being proposed at both ends of the link. Furthermore, in order to maintain a mobile connection with an acceptable quality of service (QoS), these highly directional antennas also need to be electronically steerable. This required dual-end steerability adds significant system complexity in terms of both initial access and connected-mode procedures. Therefore, these various procedures, originally designed for current sub-6 GHz systems, need to be re-investigated in light of this added complexity. Traditionally, sub-6 GHz mobility studies mainly focused on translational motion; however, with the dual-end highly directional links being proposed for mmW communications, rotational motion also becomes a concern. In this paper, we study the effects these steerable links have on initial network access procedures, especially in the presence of user equipment (UE) rotational motion.
A Giant Hemangioma of the Tongue
Introduction: Vascular abnormalities are relatively uncommon lesions, but the head and neck is a common region for vascular malformations, which are classified as benign tumors. In this paper, the authors report a rare presentation of a vascular malformation in the tongue and its management. Case Report: An 18-month-old child presented with a giant mass of the tongue that caused functional and aesthetic problems. The rapidly growing cavernous hemangioma was refractory to corticosteroid therapy. The lesion was excised without any complication. Since the mass not only filled the entire oral cavity but also protruded outside it, airway management was a great challenge for the anesthesia plan, and at the same time the surgical technique was difficult to select. Conclusion: Despite the different modalities recommended for managing hemangiomas of the tongue, in cases of huge malformations surgery can be the mainstay of treatment and, provided that critical care measures are taken into account, can be performed very safely.
Can Illegal Corporate Behavior be Predicted? An Event History Analysis
A model of illegal corporate behavior was developed and tested for a 19-year period using event history analysis and data on clearly illegal acts. Results indicated that large firms operating in dynamic, munificent environments were the most likely of the firms studied to behave illegally, and firms with poor performance were not prone to commit wrongdoing. Membership in certain industries and a history of prior violations also increased the likelihood that a firm would behave illegally.
System and method for interpreting and analyzing dynamic characteristics of the current state of tasks performed
FIELD: information technology. SUBSTANCE: the invention relates to means for analyzing the dynamic characteristics of parallel programs and supercomputers. The system contains a set of computational nodes of a supercomputer, each equipped with a monitoring system (MS) and a task flow control system (TFCS), and an information processing server that includes a module for aggregating data from the monitoring systems of each computing node, a module for analyzing data from the task flow control system of each computing node, a data storage unit, and a tool for visualizing the processing results. The MS includes system monitoring sensors that provide information on the status and degree of use of available resources from each of the available monitoring systems, while the TFCS is designed to obtain information about the status of tasks, their distribution across nodes, and the nature of resource use at the compute nodes. The method describes the operation of the system. EFFECT: the technical result consists in increasing the efficiency of the supercomputer by analyzing the current state of the problem being solved. 11 cl, 7 dwg, 1 tbl, 1 ex
On regularization parameter estimation under covariate shift
This paper identifies a problem with the usual procedure for L2-regularization parameter estimation in a domain adaptation setting. In such a setting, there are differences between the distributions generating the training data (source domain) and the test data (target domain). The usual cross-validation procedure requires validation data, which can not be obtained from the unlabeled target data. The problem is that if one decides to use source validation data, the regularization parameter is underestimated. One possible solution is to scale the source validation data through importance weighting, but we show that this correction is not sufficient. We conclude the paper with an empirical analysis of the effect of several importance weight estimators on the estimation of the regularization parameter.
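The importance-weighting correction discussed here can be sketched for ridge regression: errors on source validation data are reweighted by an estimate of p_target(x)/p_source(x) before the regularization parameter is chosen. The helper name and synthetic data below are ours, and the paper's point is precisely that even this correction can be insufficient:

```python
import numpy as np

def select_lambda(Xt, yt, Xv, yv, w, lambdas):
    """Pick the ridge parameter minimizing the importance-weighted
    validation loss; w[i] approximates p_target(x_i) / p_source(x_i)."""
    best_lam, best_loss = None, np.inf
    d = Xt.shape[1]
    for lam in lambdas:
        # closed-form ridge solution on the source training data
        beta = np.linalg.solve(Xt.T @ Xt + lam * np.eye(d), Xt.T @ yt)
        loss = np.average((Xv @ beta - yv) ** 2, weights=w)
        if loss < best_loss:
            best_lam, best_loss = lam, loss
    return best_lam

rng = np.random.default_rng(0)
Xt = rng.normal(size=(50, 2)); yt = Xt @ [1.0, 2.0] + 0.1 * rng.normal(size=50)
Xv = rng.normal(size=(20, 2)); yv = Xv @ [1.0, 2.0] + 0.1 * rng.normal(size=20)
w = np.ones(20)   # uniform weights reduce to ordinary cross-validation
lams = [0.01, 0.1, 1.0, 10.0]
lam = select_lambda(Xt, yt, Xv, yv, w, lams)
```

Setting w from a density-ratio estimator instead of ones gives the importance-weighted procedure whose limitations the paper analyzes.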
Paper Prototyping: The Fast and Easy Way to Design and Refine User Interfaces
Paper prototyping highlights cost-effective usability testing techniques that produce fast results for improving an interface design. Practitioners and students interested in the design, development, and support of user interfaces will appreciate Snyder’s text for its focus on practical information and application. This book’s best features are the real life examples, anecdotes, and case studies that the author presents to demonstrate the uses of paper prototyping and its many benefits. While the author advocates paper prototyping, she also notes that paper prototyping techniques are one of many usability evaluation methods and that paper prototyping works best only in certain situations. Snyder reminds her readers that paper prototyping does not produce precise usability measurements, but rather it is a “blunt instrument” that rapidly uncovers qualitative information from actual users performing real tasks (p. 185). Hence, this book excludes in-depth theoretical discussions about methods and validity, but its pragmatic discussion on test design prepares the practitioner for dealing with several circumstances and making sound decisions based on testing method considerations.
Performance analysis of image segmentation using watershed algorithm, fuzzy C-means of clustering algorithm and Simulink design
The integration of image processing and soft computing techniques for image analysis makes a significant contribution to the field of image processing. One important class of image analysis techniques is image segmentation, with applications in various fields, including product sorting in industry, surveillance systems in security zones, biomedical processing, medical imaging, and environmental prediction. This paper studies the fuzzy C-means (FCM) clustering method, watershed-based image segmentation, and Simulink-design-based image segmentation, together with an enhancement design algorithm for improving overall segmentation performance. The main objective of this paper is to demonstrate, through performance analysis metrics, the utility of watershed, Simulink, and FCM based segmentation for applications in medical imaging and other images where abundant spatial and inherent information is available. This performance analysis is used to justify and decide which method is appropriate and more convincing for the effective analysis of more complex imaging systems.
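A bare-bones version of the fuzzy C-means iteration referred to above, run on 1-D pixel intensities only (made-up sample values; a real segmentation pipeline would operate on full images and add the enhancements the paper studies):

```python
import numpy as np

def fcm(pixels, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy C-means on 1-D pixel intensities: alternate the membership
    update and the (membership**m)-weighted centroid update."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(pixels), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centers = um.T @ pixels / um.sum(axis=0)       # weighted centroids
        d = np.abs(pixels[:, None] - centers[None, :]) + 1e-12
        u = d ** (-2.0 / (m - 1))                      # membership update
        u /= u.sum(axis=1, keepdims=True)
    return np.sort(centers), u

pixels = np.array([0.10, 0.12, 0.09, 0.90, 0.88, 0.91])
centers, u = fcm(pixels)
```

Thresholding the final memberships (e.g. `u.argmax(axis=1)`) yields the hard segmentation labels that the performance metrics are then computed on.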
Grounded running in quails: simulations indicate benefits of observed fixed aperture angle between legs before touch-down.
Many birds use grounded running (running without aerial phases) over a wide range of speeds. In contrast to walking and running, numerical investigations of this gait based on the BSLIP (bipedal spring-loaded inverted pendulum) template are rare. To obtain template-related parameters of quails (e.g. leg stiffness), we used x-ray cinematography combined with ground reaction force measurements of quail grounded running. Interestingly, with increasing speed the quails did not adjust the swing leg's angle of attack with respect to the ground but instead adapted the angle between the legs (which we term the aperture angle), fixing it about 30 ms before touchdown. In simulations with the BSLIP, we compared this swing-leg alignment policy with the fixed angle of attack with respect to the ground typically used in the literature. We found symmetric periodic grounded running in a simply connected subset comprising one third of the investigated parameter space. The fixed aperture angle strategy revealed improved local stability and surprising tolerance of large perturbations. Starting from the periodic solutions, after step-down step-up or step-up step-down perturbations of 10% of leg rest length, in the vast majority of cases the bipedal SLIP could complete at least 50 steps before falling. The fixed angle of attack strategy was not feasible. We propose that, in small animals in particular, grounded running may be a common gait that allows highly compliant systems to exploit energy storage without the necessity of quick changes in the locomotor program when facing perturbations.
A multidisciplinary approach towards computational thinking for science majors
This paper describes the development and initial evaluation of a new course, "Introduction to Computational Thinking," taken by science majors to fulfill a college computing requirement. The course was developed by computer science faculty in collaboration with science faculty, and it focuses on the role of computing and computational principles in scientific inquiry. It uses Python and Python libraries to teach computational thinking via basic programming concepts, data management concepts, simulation, and visualization. Problems with a computational aspect are drawn from different scientific disciplines and are complemented with lectures from faculty in those areas. Our initial evaluation indicates that the problem-driven approach focused on scientific discovery and computational principles increases students' interest in computing.
Learning to Detect Vehicles by Clustering Appearance Patterns
This paper studies efficient means of dealing with intra-category diversity in object detection. Strategies for occlusion and orientation handling are explored by learning an ensemble of detection models from visual and geometrical clusters of object instances. An AdaBoost detection scheme is employed with pixel lookup features for fast detection. The analysis provides insight into the design of a robust vehicle detection system, showing promise in terms of detection performance and orientation estimation accuracy.
Fault based cryptanalysis of the Advanced Encryption Standard
In this paper we describe several fault attacks on the Advanced Encryption Standard (AES). First, using optical fault induction attacks as recently publicly presented by Skorobogatov and Anderson [SA], we present an implementation-independent fault attack on AES. This attack is able to determine the complete 128-bit secret key of a sealed tamper-proof smartcard by generating 128 faulty ciphertexts. Second, we present several implementation-dependent fault attacks on AES. These attacks rely on the observation that, due to the AES's known timing analysis vulnerability (as pointed out by Koeune and Quisquater [KQ]), any implementation of the AES must ensure a data-independent timing behavior for the so-called xtime operation of the AES. We present fault attacks on AES based on various timing-analysis-resistant implementations of the xtime operation. Our strongest attack in this direction uses a very liberal fault model and requires only 256 faulty encryptions to determine a 128-bit key.
Hormonal contraceptives and the length of their use are not independent risk factors for high-risk HPV infections or high-grade CIN.
AIMS To evaluate the role of hormonal contraceptives as a risk factor for high-risk human papillomavirus (HR-HPV) infection, cervical intraepithelial neoplasia (CIN) and cervical cancer in our multi-center population-based LAMS (Latin American Screening) study. METHODS A cohort study with >12,000 women from Brazil and Argentina using logistic regression to analyze the covariates of hormonal contraception (HOC - oral, injections, patches, implants, vaginal ring and progesterone intrauterine system) use, followed by multivariate modeling for predictors of HR-HPV and CIN2+. RESULTS HR-HPV infection was a consistent risk factor for high-grade CIN in all three groups of women. The length of HOC use was not significantly related to high-grade squamous intraepithelial lesion (HSIL)+ Pap (p = 0.069), LSIL+ Pap (p = 0.781) or ASCUS+ (p = 0.231). The same was true for the length of HOC use and histological CIN3+ (p = 0.115) and CIN2+ (p = 0.515). HOC users had frequently shown more HPV-related lesions previously, as well as lower HPV prevalence if they were current smokers, but HOC use and length of use were not independent risk factors for either HR-HPV infection or high-grade CIN in multiple logistic regression. CONCLUSIONS No evidence was found for an association between the use of HOC and an increased risk of HR-HPV infection or high-grade CIN in this cohort.
Computer-based classification of dermoscopy images of melanocytic lesions on acral volar skin.
We describe a fully automated system for the classification of acral volar melanomas. We used a total of 213 acral dermoscopy images (176 nevi and 37 melanomas). Our automatic tumor area extraction algorithm successfully extracted the tumor in 199 cases (169 nevi and 30 melanomas), and we developed a diagnostic classifier using these images. Our linear classifier achieved a sensitivity (SE) of 100%, a specificity (SP) of 95.9%, and an area under the receiver operating characteristic curve (AUC) of 0.993 using a leave-one-out cross-validation strategy (81.1% SE, 92.1% SP; considering 14 unsuccessful extraction cases as false classification). In addition, we developed three pattern detectors for typical dermoscopic structures such as parallel ridge, parallel furrow, and fibrillar patterns. These also achieved good detection accuracy as indicated by their AUC values: 0.985, 0.931, and 0.890, respectively. The features used in the melanoma-nevus classifier and the parallel ridge detector have significant overlap.
Sensor data fusion using a probability density grid
A novel technique has been developed at DRDC Ottawa for fusing electronic warfare (EW) sensor data by numerically combining the probability density function representing the measured value and error estimate provided by each sensor. Multiple measurements are sampled at common discrete intervals to form a probability density grid and combined to produce the fused estimate of the measured parameter. This technique, called the discrete probability density (DPD) method, is used to combine sensor measurements taken from different locations for the EW function of emitter geolocation. Results are presented using simulated line of bearing measurements and are shown to approach the theoretical location accuracy limit predicted by the Cramer-Rao lower bound. The DPD method is proposed for fusing other geolocation sensor data including time of arrival, time difference of arrival, and a priori information.
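The core of the DPD method as described above — sampling each sensor's probability density at common discrete intervals and numerically combining the samples — can be sketched as a pointwise product of densities on a shared grid. The Gaussian measurement model, grid bounds, and function names below are illustrative assumptions, not DRDC's actual implementation.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal measurement model at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fuse(measurements, lo=-10.0, hi=10.0, n=2001):
    """Sample each measurement PDF on a common grid, multiply pointwise,
    normalize, and return the fused (posterior-mean) estimate."""
    step = (hi - lo) / (n - 1)
    grid = [lo + i * step for i in range(n)]
    density = [1.0] * n
    for mu, sigma in measurements:           # measurements assumed independent
        density = [d * gaussian_pdf(x, mu, sigma) for d, x in zip(density, grid)]
    total = sum(density) * step              # normalize to unit area
    density = [d / total for d in density]
    est = sum(x * d for x, d in zip(grid, density)) * step
    return est, grid, density
```

Fusing two equally uncertain measurements centred at 0 and 2 yields a fused estimate of 1 with reduced spread, as expected for a product of Gaussians.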
Intuitionistic Fuzzy Aggregation and Clustering
Robust Image Hashing With Ring Partition and Invariant Vector Distance
Robustness and discrimination are two of the most important objectives in image hashing. We incorporate ring partition and invariant vector distance into an image hashing algorithm to enhance rotation robustness and discriminative capability. As ring partition is unrelated to image rotation, the statistical features extracted from image rings in a perceptually uniform color space, i.e., the CIE L*a*b* color space, are rotation invariant and stable. In particular, the Euclidean distance between vectors of these perceptual features is invariant to commonly used digital operations on images (e.g., JPEG compression, gamma correction, and brightness/contrast adjustment), which helps make the image hash compact and discriminative. We conduct experiments to evaluate the efficiency with 250 color images, and demonstrate that the proposed hashing algorithm is robust to commonly used digital operations on images. In addition, using the receiver operating characteristic curve, we illustrate that our hashing is much better than existing popular hashing algorithms in terms of robustness and discrimination.
Barriers to Physical Activity Among Patients With Type 1 Diabetes
OBJECTIVE To determine, in an adult population with type 1 diabetes, barriers to regular physical activity using a diabetes-specific barriers measure (the Barriers to Physical Activity in Diabetes [type 1] [BAPAD1] scale) and factors associated with these barriers. RESEARCH DESIGN AND METHODS One hundred adults with type 1 diabetes answered a questionnaire assessing perceived barriers to physical activity and related factors. A1C was obtained from the medical chart of each individual. RESULTS Fear of hypoglycemia was identified as being the strongest barrier to physical activity. Greater knowledge about insulin pharmacokinetics and using appropriate approaches to minimize exercise-induced hypoglycemia were factors associated with fewer perceived barriers. Greater barriers were positively correlated with A1C levels (r = 0.203; P = 0.042) and negatively with well-being (r = -0.45; P < 0.001). CONCLUSIONS Fear of hypoglycemia is the strongest barrier to regular physical activity in adults with type 1 diabetes, who should therefore be informed and supported in hypoglycemia management.
A Guide to Visual Multi-Level Interface Design From Synthesis of Empirical Study Evidence
A phase III randomized study to evaluate the efficacy and safety of CT-P13 compared with reference infliximab in patients with active rheumatoid arthritis: 54-week results from the PLANETRA study
BACKGROUND CT-P13 (Remsima®, Inflectra®) is a biosimilar of the infliximab reference product (RP; Remicade®). The aim of this study was to compare the 54-week efficacy, immunogenicity, safety, pharmacokinetics (PK) and pharmacodynamics (PD) of CT-P13 and RP in patients with active rheumatoid arthritis (RA). METHODS In this multinational phase III double-blind study, patients with active RA and an inadequate response to methotrexate (MTX) were randomized (1:1) to receive CT-P13 (3 mg/kg) or RP (3 mg/kg) at weeks 0, 2, 6 and then every 8 weeks to week 54 in combination with MTX (12.5-25 mg/week). Efficacy endpoints included American College of Rheumatology (ACR)20, ACR50 and ACR70 response rates, Disease Activity Score in 28 joints (DAS28), Simplified Disease Activity Index (SDAI), Clinical Disease Activity Index (CDAI), European League Against Rheumatism (EULAR) response rates, patient-reported outcomes and joint damage progression. Immunogenicity, safety and PK/PD outcomes were also assessed. RESULTS Of 606 randomized patients, 455 (CT-P13 233, RP 222) were treated up to week 54. At week 54, ACR20 response rate was highly similar between groups (CT-P13 74.7 %, RP 71.3 %). ACR50 and ACR70 response rates were also comparable between groups (CT-P13 43.6 % and 21.3 %, respectively; RP 43.1 % and 19.9 %, respectively). DAS28, SDAI and CDAI decreased from baseline to week 54 to a similar extent with CT-P13 and RP. Radiographic progression measured by Sharp scores as modified by van der Heijde was also comparable. With both treatments, patient assessments of pain, disease activity and physical ability, as well as mean scores on the Medical Outcomes Study Short Form Health Survey (SF-36), improved markedly at week 14 and remained stable thereafter up to week 54. The proportion of patients positive for antidrug antibodies at week 54 was similar between the two groups: 41.1 % and 36.0 % with CT-P13 and RP, respectively. 
CT-P13 was well tolerated and had a similar safety profile to RP. PK/PD results were also comparable between CT-P13 and RP. CONCLUSIONS CT-P13 and RP were comparable in terms of efficacy (including radiographic progression), immunogenicity and PK/PD up to week 54. The safety profile of CT-P13 was also similar to that of RP. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT01217086 . Registered 4 Oct 2010.
Gigapixel Computational Imaging
Today, consumer cameras produce photographs with tens of millions of pixels. The recent trend in image sensor resolution seems to suggest that we will soon have cameras with billions of pixels. However, the resolution of any camera is fundamentally limited by geometric aberrations. We derive a scaling law that shows that, by using computations to correct for aberrations, we can create cameras with unprecedented resolution that have low lens complexity and compact form factor. In this paper, we present an architecture for gigapixel imaging that is compact and utilizes a simple optical design. The architecture consists of a ball lens shared by several small planar sensors, and a post-capture image processing stage. Several variants of this architecture are shown for capturing a contiguous hemispherical field of view as well as a complete spherical field of view. We demonstrate the effectiveness of our architecture by showing example images captured with two proof-of-concept gigapixel cameras.
Benchmarking the Task Graph Scheduling Algorithms
The problem of scheduling a weighted directed acyclic graph (DAG) to a set of homogeneous processors to minimize the completion time has been extensively studied. The NP-completeness of the problem has instigated researchers to propose a myriad of heuristic algorithms. While these algorithms are individually reported to be efficient, it is not clear how effective they are and how well they compare against each other. A comprehensive performance evaluation and comparison of these algorithms entails addressing a number of difficult issues. One of the issues is that a large number of scheduling algorithms are based upon radically different assumptions, making their comparison on a unified basis a rather intricate task. Another issue is that there is no standard set of benchmarks that can be used to evaluate and compare these algorithms. Furthermore, most algorithms are evaluated using small problem sizes, and it is not clear how their performance scales with the problem size. In this paper, we first provide a taxonomy for classifying various algorithms into different categories according to their assumptions and functionalities. We then propose a set of benchmarks which are of diverse structures without being biased towards a particular scheduling technique and still allow variations in important parameters. We have evaluated 15 scheduling algorithms and compared them using the proposed benchmarks. Based upon the design philosophies and principles behind these algorithms, we interpret the results and discuss why some algorithms perform better than others.
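Most of the algorithms benchmarked in such studies are list-scheduling heuristics. A minimal sketch of that class — tasks prioritized by bottom level (longest computation-cost path to an exit task), each task assigned to the processor allowing the earliest start, communication costs ignored — is shown below; it is a generic illustration, not any specific algorithm from the comparison.

```python
from collections import defaultdict

def list_schedule(tasks, deps, weights, n_procs=2):
    """List scheduling of a weighted DAG on homogeneous processors.

    tasks   - iterable of task ids
    deps    - dict: task -> set of predecessor tasks
    weights - dict: task -> computation cost
    """
    # Build successor sets from the dependency map.
    succ = defaultdict(set)
    for t, preds in deps.items():
        for pred in preds:
            succ[pred].add(t)

    # Bottom level: longest path (in computation cost) from a task to an exit.
    blevel = {}
    def bl(t):
        if t not in blevel:
            blevel[t] = weights[t] + max((bl(s) for s in succ[t]), default=0)
        return blevel[t]

    # Descending bottom level is a valid topological order for positive weights.
    order = sorted(tasks, key=bl, reverse=True)

    proc_free = [0.0] * n_procs   # time each processor becomes idle
    finish = {}                   # task -> finish time
    for t in order:
        ready = max((finish[pred] for pred in deps.get(t, ())), default=0.0)
        # Pick the processor allowing the earliest start time.
        proc = min(range(n_procs), key=lambda i: max(proc_free[i], ready))
        start = max(proc_free[proc], ready)
        finish[t] = start + weights[t]
        proc_free[proc] = finish[t]
    return finish
```

For a unit-weight diamond DAG (a before b and c, both before d) on two processors, b and c run in parallel and the makespan is 3.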
Information visualization in regard to the customer journey map in a three-dimensional format
The customer journey map used in service design is considered an important tool for discovering issues related to service experiences, since it can logically and contextually find problems and solve them. This article applies the idea of information visualization and presents the customer journey map in a three-dimensional format; designers can therefore not only acquire information from a 2D diagram, but also manipulate intangible information with this designed tool. Through three service design case studies, this article aims to develop various application tools that allow designers to explore more opportunity gaps in users' experiences.
Composite nanoplatelets combining soft-magnetic iron oxide with hard-magnetic barium hexaferrite.
By coupling two different magnetic materials inside a single composite nanoparticle, the shape of the magnetic hysteresis can be engineered to meet the requirements of specific applications. Sandwich-like composite nanoparticles composed of a hard-magnetic Ba-hexaferrite (BaFe12O19) platelet core in between two soft-magnetic spinel iron oxide maghemite (γ-Fe2O3) layers were synthesized using a new, simple and inexpensive method based on the co-precipitation of Fe(3+)/Fe(2+) ions in an aqueous suspension of hexaferrite core nanoparticles. The required close control of the supersaturation of the precipitating species was enabled by the controlled release of the Fe(3+) ions from the nitrate complex with urea ([Fe((H2N)2C=O)6](NO3)3) and by using Mg(OH)2 as a solid precipitating agent. The platelet Ba-hexaferrite nanoparticles of different sizes were used as the cores. The controlled coating resulted in an exclusively heterogeneous nucleation and the topotactic growth of the spinel layers on both basal surfaces of the larger hexaferrite nanoplatelets. The direct magnetic coupling between the core and the shell resulted in a strong increase of the energy product |BH|max. Ultrafine core nanoparticles reacted with the precipitating species and homogeneous product nanoparticles were formed, which differ in terms of the structure and composition compared to any other compound in the BaO-Fe2O3 system.
Multi-Document Summarization via the Minimum Dominating Set
Multi-document summarization has been an important problem in information retrieval. It aims to distill the most important information from a set of documents to generate a compressed summary. Given a sentence graph generated from a set of documents where vertices represent sentences and edges indicate that the corresponding vertices are similar, the extracted summary can be described using the idea of graph domination. In this paper, we propose a new principled and versatile framework for multi-document summarization using the minimum dominating set. We show that four well-known summarization tasks including generic, query-focused, update, and comparative summarization can be modeled as different variations derived from the proposed framework. Approximation algorithms for performing summarization are also proposed and empirical experiments are conducted to demonstrate the effectiveness of our proposed framework.
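The textbook greedy approximation for minimum dominating set — repeatedly pick the vertex that dominates the most not-yet-dominated vertices — illustrates the kind of approximation algorithm such a framework builds on. This sketch is the generic greedy (with its well-known logarithmic approximation guarantee), not necessarily the exact variant proposed in the paper.

```python
def greedy_dominating_set(n, edges):
    """Greedy approximation for minimum dominating set: repeatedly pick the
    vertex covering the most not-yet-dominated vertices (itself + neighbours)."""
    adj = {v: {v} for v in range(n)}  # closed neighbourhoods
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    uncovered = set(range(n))
    chosen = []                        # selected "summary" sentences
    while uncovered:
        best = max(range(n), key=lambda v: len(adj[v] & uncovered))
        chosen.append(best)
        uncovered -= adj[best]
    return chosen
```

In the summarization setting, vertices are sentences and an edge marks two similar sentences; on a star-shaped sentence graph the single hub sentence dominates everything and is the whole summary.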
Profiling safety behaviors: exploration of the sociocognitive variables that best discriminate between different behavioral patterns.
This study combines contributions from both safety climate literature and prominent social influence theories. It was developed to identify the combination of sociocognitive variables that differentiate between different profiles of safety behaviors. This empirical approach has hardly been explored in the literature on behavioral aspects related to safety. The research setting for this study was a transportation company (N = 356). The results of discriminant analysis showed that different combinations of dispositional and situational influences may lead to diverse profiles of compliance and proactive safety behaviors. Perceived behavioral control was revealed to be the variable that best differentiated the group with more safe behaviors from the others. However, results also revealed that high attitudes and perceived behavioral control are very important, but not sufficient, to promote proactive safety. Co-workers' descriptive safety norms were a major differentiating variable in proactive safety. Theoretical and practical implications of the findings are discussed.
Evaluation of a non-point source pollution model, AnnAGNPS, in a tropical watershed
Impaired water quality caused by human activity and the spread of invasive plant and animal species has been identified as a major factor in the degradation of coastal ecosystems in the tropics. The main goal of this study was to evaluate the performance of AnnAGNPS (Annualized NonPoint Source Pollution Model) in simulating runoff and soil erosion in a 48 km² watershed located on the Island of Kauai, Hawaii. The model was calibrated and validated using 2 years of observed stream flow and sediment load data. Alternative scenarios of spatial rainfall distribution and canopy interception were evaluated. Monthly runoff volumes predicted by AnnAGNPS compared well with the measured data (R = 0.90, P < 0.05); however, up to 60% difference between the actual and simulated runoff was observed during the driest months (May and July). Prediction of daily runoff was less accurate (R = 0.55, P < 0.05). Predicted and observed sediment yield on a daily basis was poorly correlated (R = 0.5, P < 0.05). For events of small magnitude, the model generally overestimated sediment yield, while the opposite was true for larger events. Total monthly sediment yield varied within 50% of the observed values, except for May 2004. Among the input parameters, the model was most sensitive to the values of ground residue cover and canopy cover. It was found that approximately one third of the watershed area had low sediment yield (0–1 t ha⁻¹ y⁻¹) and presented limited erosion threat. However, 5% of the area had sediment yields in excess of 5 t ha⁻¹ y⁻¹. Overall, the model performed reasonably well, and it can be used as a management tool on tropical watersheds to estimate and compare sediment loads and identify "hot spots" on the landscape.
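The runoff comparisons above are reported as correlation coefficients between observed and simulated series. For reference, a Pearson correlation can be computed as follows; this is a generic sketch, and the study's exact goodness-of-fit statistic may differ (for instance, it may report R² rather than R).

```python
import math

def pearson_r(obs, sim):
    """Pearson correlation between observed and simulated series."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    return cov / (so * ss)
```

A perfectly linear relationship gives R = 1; a perfectly inverted one gives R = -1.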
Semantic 3D motion retargeting for facial animation
We present a system for realistic facial animation that decomposes facial motion capture data into semantically meaningful motion channels based on the Facial Action Coding System. A captured performance is retargeted onto a morphable 3D face model based on a semantic correspondence between motion capture and 3D scan data. The resulting facial animation reveals a high level of realism by combining the high spatial resolution of a 3D scanner with the high temporal accuracy of motion capture data that accounts for subtle facial movements with sparse measurements. Such an animation system allows us to systematically investigate human perception of moving faces. It offers control over many aspects of the appearance of a dynamic face, while utilizing as much measured data as possible to avoid artistic biases. Using our animation system, we report results of an experiment that investigates the perceived naturalness of facial motion in a preference task. For expressions with small amounts of head motion, we find a benefit for our part-based generative animation system over an example-based approach that deforms the whole face at once.
Testing normative and self-appraisal feedback in an online slot-machine pop-up in a real-world setting
Over the last few years, there have been an increasing number of gaming operators that have incorporated on-screen pop-up messages while gamblers play on slot machines and/or online as one of a range of tools to help encourage responsible gambling. Coupled with this, there has also been an increase in empirical research into whether such pop-up messages are effective, particularly in laboratory settings. However, very few studies have been conducted on the utility of pop-up messages in real-world gambling settings. The present study investigated the effects of normative and self-appraisal feedback in a slot machine pop-up message compared to a simple (non-enhanced) pop-up message. The study was conducted in a real-world gambling environment by comparing the behavioral tracking data of two representative random samples of 800,000 gambling sessions (i.e., 1.6 million sessions in total) across two conditions (i.e., simple pop-up message versus an enhanced pop-up message). The results indicated that the additional normative and self-appraisal content doubled the number of gamblers who stopped playing after they received the enhanced pop-up message (1.39%) compared to the simple pop-up message (0.67%). The data suggest that pop-up messages influence only a small number of gamblers to cease long playing sessions and that enhanced messages are slightly more effective in helping gamblers to stop playing in-session.
Using dermoscopy to detect tinea of vellus hair.
DEAR EDITOR, Classically, systemic antifungal treatment is usually mandatory in ringworm affecting cutaneous appendages: the hair or nails. In contrast, in the hairless skin only topical treatment can be used, except in cases of extensive, multiple or recurrent lesions, or in immunocompromised patients. Recently, we described in this journal a new criterion to start systemic antifungal therapy in tinea of vellus hair skin: the observation of parasitized vellus hairs on direct examination. To date, 20 isolated cases of ringworm of the vellus hair have been described, and we reported the largest series of tinea of the vellus hair. We highlighted the potential importance of this finding, which is one cause of resistance to topical treatment in tinea of vellus hair skin. We assessed whether dermoscopy could be useful to identify parasitism of vellus hair, in the same way that it has proven useful in the diagnosis of tinea capitis. We present a prospective observational study of six patients with tinea of vellus hair skin who attended our mycology unit at the Regional University Hospital of Malaga, Spain, during the period 2011–2014. Cases were analysed from a clinical, dermoscopic, mycological and therapeutic standpoint. All patients underwent the following: history taking, clinical examination, digital photography of any lesion using a Canon PowerShot G11 camera (Canon, Lake Success, NY, U.S.A.), microscopic examination of skin scraping, and use of KOH 20%. Fungal culture was performed on Sabouraud dextrose agar with and without actidione, incubated at 27 °C and examined frequently for 3 weeks. Dermoscopic examination (DermLite Pro II HR; 3Gen Inc., San Juan Capistrano, CA, U.S.A.) using a transparent film dressing was performed, after disinfection of the lens with an alcohol swab to avoid transmission of infection. The patients (three male and three female) were aged 4–40 years (mean 20.8). Six patients had single ringworm lesions at the time of diagnosis (Table 1).
Most (67%) had received topical antifungal agents before seeing the dermatologist, and two cases (33%) had received topical corticosteroids before. Fifty per cent reported previous contact with a pet. Two patients had lesions located on the arm (Figs 1a, 2a), one on the axillae, one on the chest, one on the leg and one on the cheek (Figs 1a, 2a). Clinically, follicular micropustules were seen in four lesions (67%). Under dermoscopy (Figs 1b, 2b) we observed scaly plaques (100%), translucent hairs (83%), follicular pustules (67%), broken hairs (50%), corkscrew hair (17%), black dots (17%), dystrophic hair (17%) and Morse code hair (17%). Empty follicles were observed in only one lesion. In all cases, during the skin scraping, short thin hairs fell easily onto the slide (Fig. 1c). In all patients all the few vellus hairs identified on direct examination with KOH were affected (Fig. 2c). When antifungal topical cream had been used previously there was no evidence of parasitism of the scales, with the condition of the infected vellus hair being the only finding under direct microscopy; this occurred in five lesions (83%). Geophilic species were identified in one lesion (17%), with the remaining cases all being zoophilic (50%) or anthropophilic species (33%). All cases healed properly after 6 weeks of oral antifungal treatment, with no recurrences. We used griseofulvin when Microsporum canis or Microsporum gypseum were isolated, and terbinafine in the remaining cases.
A computer vision-aided motion sensing algorithm for mobile robot's indoor navigation
This paper presents the design and analysis of a computer vision-aided motion sensing algorithm for a wheeled mobile robot's indoor navigation. The algorithm is realized using two vision cameras attached to a wheeled mobile robot. The first camera is positioned in a front-looking direction while the second camera is positioned in a downward-looking direction. An algorithm is developed to process the images acquired from the cameras to yield the mobile robot's positions and orientations. The proposed algorithm is implemented on a wheeled mobile robot for real-world effectiveness testing. Results are compared and demonstrate the accuracy of the proposed algorithm. At the end of the paper, an artificial landmark approach is introduced to improve navigation efficiency. Future work involves implementing the proposed artificial landmarks for indoor navigation applications with minimized accumulated errors.
Is a patient's self-reported health-related quality of life a prognostic factor for survival in non-small-cell lung cancer patients? A multivariate analysis of prognostic factors of EORTC study 08975.
BACKGROUND The aim of this prognostic factor analysis was to investigate if a patient's self-reported health-related quality of life (HRQOL) provided independent prognostic information for survival in non-small cell lung cancer (NSCLC) patients. PATIENTS AND METHODS Pretreatment HRQOL was measured in 391 advanced NSCLC patients using the EORTC QLQ-C30 and the EORTC Lung Cancer module (QLQ-LC13). The Cox proportional hazards regression model was used for both univariate and multivariate analyses of survival. In addition, a bootstrap validation technique was used to assess the stability of the outcomes. RESULTS The final multivariate Cox regression model retained four parameters as independent prognostic factors for survival: male gender with a hazard ratio (HR) = 1.32 (95% CI 1.03-1.69; P = 0.03); performance status (0 to 1 versus 2) with HR = 1.63 (95% CI 1.04-2.54; P = 0.032); patient's self-reported score of pain with HR = 1.11 (95% CI 1.07-1.16; P < 0.001) and dysphagia with HR = 1.12 (95% CI 1.04-1.21; P = 0.003). A 10-point worsening on the scales measuring pain and dysphagia translated into an 11% and a 12% increase in the likelihood of death, respectively. A risk group categorization was also developed. CONCLUSION The results suggest that patients' self-reported HRQOL provides independent prognostic information for survival. This finding supports the collection of such data in routine clinical practice.
Sentence Ranking with the Semantic Link Network in Scientific Paper
Sentence ranking is one of the most important research issues in text analysis. It can be used in text summarization and information retrieval. Graph-based methods are a common way of ranking and extracting sentences. In graph-based methods, sentences are nodes of a graph and edges are built based on sentence similarities or on sentence co-occurrence relationships. PageRank-style algorithms can be applied to obtain sentence ranks. In this paper, we focus on how to rank sentences in a single scientific paper. A scientific paper has more structural information than general text, and this structural information has not yet been fully explored in graph-based ranking models. We investigated several methods that use the is-part-of link on paragraphs and sections, together with similarity and co-occurrence links, to construct a heterogeneous graph for ranking sentences. We conducted experiments on these methods to compare the results on sentence ranking. The results show that structural information can help identify more representative sentences.
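The PageRank-style ranking mentioned above can be sketched with a few lines of power iteration over an undirected sentence-similarity graph. This minimal version omits the paper's heterogeneous links (is-part-of, co-occurrence) and uses a uniform damping factor of 0.85 as an illustrative assumption.

```python
def pagerank(adj, d=0.85, iters=50):
    """Power iteration over an undirected graph given as {node: [neighbours]}."""
    n = len(adj)
    rank = {v: 1.0 / n for v in adj}   # uniform initial distribution
    for _ in range(iters):
        # Each node receives an equal share of every neighbour's rank.
        rank = {v: (1 - d) / n + d * sum(rank[u] / len(adj[u]) for u in adj[v])
                for v in adj}
    return rank
```

On a three-sentence chain graph, the middle sentence is linked to both others and therefore receives the highest rank, matching the intuition that well-connected sentences are more representative.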
Behavior recognition based on machine learning algorithms for a wireless canine machine interface
Training and handling working dogs is a costly process and requires specialized skills and techniques. Less subjective and lower-cost training techniques would not only improve our partnership with these dogs but also enable us to benefit from their skills more efficiently. To facilitate this, we are developing a canine body-area-network (cBAN) to combine sensing technologies and computational modeling to provide handlers with a more accurate interpretation for dog training. As the first step of this, we used inertial measurement units (IMU) to remotely detect the behavioral activity of canines. Decision tree classifiers and Hidden Markov Models were used to detect static postures (sitting, standing, lying down, standing on two legs and eating off the ground) and dynamic activities (walking, climbing stairs and walking down a ramp) based on the heuristic features of the accelerometer and gyroscope data provided by the wireless sensing system deployed on a canine vest. Data was collected from 6 Labrador Retrievers and a Kai Ken. The analysis of IMU location and orientation helped to achieve high classification accuracies for static and dynamic activity recognition.
DSRD-Based High-Power Repetitive Short-Pulse Generator Containing GDT: Theory and Experiment
A high-power short-pulse generator based on the diode step recovery phenomenon and high repetition rate discharges in a two-electrode gas discharge tube is presented. The proposed circuit is simple and low cost and driven by a low-power source. A full analysis of this generator is presented which, considering the nonlinear behavior of the gas tube, predicts the waveform of the output pulse. The proposed method has been shown to work properly by implementation of a kW-range prototype. Experimental measurements of the output pulse characteristics showed a rise time of 3.5 ns, with pulse repetition rate of 2.3 kHz for a 47-Ω load. The input peak power was 2.4 W, which translated to about 0.65-kW output, showing more than 270 times increase in the pulse peak power. The efficiency of the prototype was 57%. The overall price of the employed components in the prototype was less than U.S. $2.0. An excellent agreement between the analytical and experimental test results was established. The analysis predicts that the proposed circuit can generate nanosecond pulses with more than 100-kW peak powers by using a subkW power supply.
An Equilibrium Model of Wealth Distribution
I present an explicitly solved equilibrium model for the distribution of wealth and income in an incomplete-markets economy. I first propose a self-insurance model with an inter-temporally dependent preference (Uzawa (1968), Lucas and Stokey (1984), and Obstfeld (1990)). I then derive an analytical consumption rule which captures stochastic precautionary saving motive and generates stationary wealth accumulation. Finally, I provide a complete characterization for the equilibrium cross-sectional distribution of wealth and income in closed form by developing a recursive formulation for the moments of the distribution of wealth and income. Using this recursive formulation, I show that income persistence and the degree of wealth mean reversion are the main determinants of wealth-income correlation and relative dispersions of wealth to income, such as skewness and kurtosis ratios between wealth and income. JEL Classification: D91, E21
Online Sequence-to-Sequence Active Learning for Open-Domain Dialogue Generation
We propose an online, end-to-end, neural generative conversational model for open-domain dialog. It is trained using a unique combination of offline two-phase supervised learning and online human-in-the-loop active learning. While most existing research proposes offline supervision or hand-crafted reward functions for online reinforcement, we devise a novel interactive learning mechanism based on a diversity-promoting heuristic for response generation and one-character user feedback at each step. Experiments show that our model inherently promotes the generation of meaningful, relevant and interesting responses, and can be used to train agents with customized personas, moods and conversational styles.
Inventory forecasting model using genetic programming and Holt-Winter's exponential smoothing method
Accurate and reliable inventory forecasting can save an organization from overstock, under-stock and stock-out situations. Overstocking leads to high storage and maintenance costs, under-stocking leads to failure to meet demand and loss of profit and customers, and a stock-out leads to a complete halt of production or sales activities. Inventory transactions generate time-series data characterized by volume, speed, range and regularity. The inventory level of an item depends on many factors, namely current stock, stock-on-order, lead time, and annual/monthly targets. In this paper, we present a perspective of treating inventory management as a Genetic Programming problem based on inventory transaction data. A Genetic Programming-Symbolic Regression (GP-SR) based mathematical model is developed and subsequently used to make forecasts using the Holt-Winters exponential smoothing method for time-series modeling. The GP-SR model evolves with RMSE as the fitness function, and the performance of the model is measured in terms of RMSE and MAE. The item-demand estimates from the GP-SR model are finally used to simulate a time series, and forecasts are generated for the inventory required on a monthly time horizon.
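To make the smoothing side concrete, here is a minimal, hedged sketch of additive Holt-Winters exponential smoothing (the paper's GP-SR demand model is not reproduced, and the initialization and parameter values below are illustrative assumptions):

```python
def holt_winters_additive(series, season_len, alpha, beta, gamma, n_forecast):
    """Additive Holt-Winters: maintain level, trend and seasonal
    components, then extrapolate n_forecast steps ahead."""
    m = season_len
    # simple initialization from the first two seasons
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / m ** 2
    seasonal = [series[i] - level for i in range(m)]
    for t in range(m, len(series)):
        last_level = level
        s = seasonal[t % m]
        level = alpha * (series[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (series[t] - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + seasonal[(len(series) + h) % m]
            for h in range(n_forecast)]
```

For a perfectly flat demand history the forecast simply reproduces the constant level; with trending, seasonal data the level, trend and seasonal terms each track their component.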
Cloud-assisted Industrial Internet of Things (IIoT) - Enabled framework for health monitoring
The promising potential of the emerging Internet of Things (IoT) technologies for interconnected medical devices and sensors has played an important role in the next-generation healthcare industry for quality patient care. Because of the increasing number of elderly and disabled people, there is an urgent need for a real-time health monitoring infrastructure for analyzing patients’ healthcare data to avoid preventable deaths. Healthcare Industrial IoT (HealthIIoT) has significant potential for the realization of such monitoring. HealthIIoT is a combination of communication technologies, interconnected apps, Things (devices and sensors), and people that would function together as one smart system to monitor, track, and store patients’ healthcare information for ongoing care. This paper presents a HealthIIoT-enabled monitoring framework, where ECG and other healthcare data are collected by mobile devices and sensors and securely sent to the cloud for seamless access by healthcare professionals. Signal enhancement, watermarking, and other related analytics will be used to avoid identity theft or clinical error by healthcare professionals. The suitability of this approach has been validated through both experimental evaluation and simulation by deploying an IoT-driven ECG-based health monitoring service in the cloud. © 2016 Elsevier B.V. All rights reserved.
Confusion Matrix
A confusion matrix (Kohavi and Provost, 1998) contains information about actual and predicted classifications done by a classification system. Performance of such systems is commonly evaluated using the data in the matrix. The following table shows the confusion matrix for a two-class classifier. The entries in the confusion matrix have the following meaning in the context of our study:
● a is the number of correct predictions that an instance is negative,
● b is the number of incorrect predictions that an instance is positive,
● c is the number of incorrect predictions that an instance is negative, and
● d is the number of correct predictions that an instance is positive.

                        Predicted
                   Negative   Positive
Actual  Negative      a          b
        Positive      c          d
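The four entries above can be counted directly; a minimal sketch, assuming labels 0 (negative) and 1 (positive):

```python
def confusion_matrix(actual, predicted):
    """Return (a, b, c, d) for a two-class problem:
    a = correct negatives, b = incorrect positives,
    c = incorrect negatives, d = correct positives."""
    a = b = c = d = 0
    for y, p in zip(actual, predicted):
        if y == 0 and p == 0:
            a += 1   # correctly predicted negative
        elif y == 0 and p == 1:
            b += 1   # incorrectly predicted positive
        elif y == 1 and p == 0:
            c += 1   # incorrectly predicted negative
        else:
            d += 1   # correctly predicted positive
    return a, b, c, d
```

Common metrics follow from these counts, e.g. accuracy = (a + d) / (a + b + c + d).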
Precision Landing System on H-Octocopter Drone Using Complementary Filter
An automatic landing system is required on a long-range drone because the position of the vehicle cannot be reached visually by the pilot. The autopilot system must be able to correct the drone's movement dynamically in accordance with its flying altitude. The current article describes an autopilot system on an H-octocopter drone using image processing and a complementary filter, and proposes a new approach to reduce oscillations during the landing phase on a big drone. The drone flies above 10 meters to a provided coordinate using GPS data to check for the existence of the landing area; this check is done visually using the camera. A PID controller is used to correct the movement by calculating the error distance detected by the camera. The controller also includes altitude parameters in its calculations through a complementary filter. The controller output is the PWM signals that control the movement and altitude of the vehicle. These signals are then transferred to the flight controller through serial communication so that the drone is able to correct its movement. From the experiments, the accuracy is around 0.56 meters and landing can be completed in 18 seconds.
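As a hedged sketch of the sensor-fusion idea (not the authors' exact controller; the blend coefficient below is an illustrative assumption), a complementary filter combines the integrated gyro rate, which is smooth but drifts, with the accelerometer angle, which is noisy but drift-free:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step: high-pass weight on the integrated gyro,
    low-pass weight on the accelerometer angle estimate."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Repeated application pulls the estimate toward the accelerometer reading while preserving short-term gyro dynamics, which is what suppresses oscillation in the fused altitude/attitude signal.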
Audience empathy: a phenomenological method for mediated performance
This research investigates audience experience of empathy with a performer during a digitally mediated performance. Theatrical performance necessitates social interaction between performers and audience. We present a performance-based study that explores audience awareness of performer's kinaesthetic activity in 2 ways: by isolating the audience's senses (visual, auditory, and kinaesthetic) and by focusing audience perception through defamiliarization. By positioning the performer behind the audience: in their 'backspace', we focus the audience's attention to the performer in an unfamiliar way. We describe two research contributions to the study of audience empathic experience during performance. The first is the development of a phenomenological interview method designed for extracting empirical evaluations of experience of audience members in a performance scenario. The second is a descriptive model for a poetics of reception. Our model is based on an empathetic audience-performer relationship that includes 3 components of audience awareness: contextual, interpersonal, and sense-based. Our research contributions are of particular benefit to performances involving digital media, and can provide insight into audience experience of empathy.
Karma: A System for Mapping Structured Sources into the Semantic Web
The Linked Data cloud contains large amounts of RDF data generated from databases. Much of this RDF data, generated using tools such as D2R, is expressed in terms of vocabularies automatically derived from the schema of the original database. The generated RDF would be significantly more useful if it were expressed in terms of commonly used vocabularies. Using today’s tools, it is labor-intensive to do this. For example, one can first use D2R to automatically generate RDF from a database and then use R2R to translate the automatically generated RDF into RDF expressed in a new vocabulary. The problem is that defining the R2R mappings is difficult and labor-intensive because one needs to write the mapping rules in terms of SPARQL graph patterns. In this work, we present a semi-automatic approach for building mappings that translate data in structured sources to RDF expressed in terms of a vocabulary of the user’s choice. Our system, Karma, automatically derives these mappings, and provides an easy-to-use interface that enables users to control the automated process to guide the system to produce the desired mappings. In our evaluation, users need to interact with the system less than once per column (on average) in order to construct the desired mapping rules. The system then uses these mapping rules to generate semantically rich RDF for the data sources. We demonstrate Karma using a bioinformatics example and contrast it with other approaches used in that community. Bio2RDF [7] and Semantic MediaWiki Linked Data Extension (SMW-LDE) [2] are examples of efforts that integrate bioinformatics datasets by mapping them to a common vocabulary. We applied our approach to a scenario used in the SMW-LDE that integrates ABA, Uniprot, KEGG Pathway, PharmGKB and Linking Open Drug Data datasets using a
On the Power of Quantum Finite State Automata
In this paper, we introduce 1-way and 2-way quantum finite state automata (1qfa's and 2qfa's), which are the quantum analogues of deterministic, nondeterministic and probabilistic 1-way and 2-way finite state automata. We prove the following facts regarding 2qfa's. 1. For any ε > 0, there is a 2qfa M which recognizes the non-regular language L = {a^m b^m | m ≥ 1} with (one-sided) error bounded by ε, and which halts in linear time. Specifically, M accepts any string in L with probability 1 and rejects any string not in L with probability at least 1 − ε. 2. For every regular language L, there is a reversible (and hence quantum) 2-way finite state automaton which recognizes L and which runs in linear time. In fact, it is possible to define 2qfa's which recognize the non-context-free language {a^m b^m c^m | m ≥ 1}, based on the same technique used for 1. Consequently, the class of languages recognized by linear-time, bounded-error 2qfa's properly includes the regular languages. Since it is known that 2-way deterministic, nondeterministic and polynomial expected time, bounded-error probabilistic finite automata can recognize only regular languages, it follows that 2qfa's are strictly more powerful than these "classical" models. In the case of 1-way automata, the situation is reversed. We prove that the class of languages recognizable by bounded-error 1qfa's is properly contained in the class of regular languages.
Noncovalent complexes between DNA and basic polypeptides or polyamines by MALDI-TOF
MALDI-MS was evaluated as a method for the study of noncovalent complexes involving DNA oligonucleotides and various polybasic compounds (basic polypeptides and polyamines). Complexes involving single-stranded DNA were successfully detected using DHAP matrix in the presence of an ammonium salt. Control experiments confirmed that the interactions involved basic sites of the polybasic compounds and that the complexes were not formed in the gas phase but were pre-existing in the matrix crystals. Moreover, the pre-existence in solution was probed by isothermal titration calorimetry at concentration and ionic strength similar to those used for mass spectrometry. Spectra showed no important difference between negative and positive ion modes. The influence of nature and size of DNA and polybasic compound on the relative intensities and stoichiometries of the complexes was investigated. Despite the fact that relative intensities can be affected by ionization yields and the gas-phase stabilities of the different species, numerous trends observed in the MALDI study were consistent with the expected in-solution behaviors. Experimental conditions related to sample preparation were investigated also. Complex abundance generally decreased when increasing the ammonium acetate concentration. It was dramatically decreased when using ATT instead of DHAP. Penta-L-arginine is an exception to these observations. Lastly, in the case of complexes involving DNA duplex, the ATT matrix was shown to favor the observation of specific DNA duplex but not that of its complex with polybasic compounds. Inversely, DHAP was appropriate for the conservation of DNA-polybasic compound interaction but not for the transfer of intact duplex.
Intelligent parking Cloud services based on IoT using MQTT protocol
This paper presents the concept of a vehicular cloud service network using IoT and the Cloud together. These two technologies (IoT and Cloud Computing) are able to solve real-time problems faced by the population, and the tremendous growth of the Internet of Things (IoT) and Cloud Computing together has provided great solutions to increasing transportation issues. In this paper we propose creating a vehicular cloud service network using the MQTT protocol. The main objective of this paper is to design a cloud vehicular service for parking based on the basic communication principle of the MQTT protocol. We propose an intelligent parking-space service to make IoT more suitable for both small-sized and large-scale information retrieval by the cloud. This paper briefly describes this emerging IoT paradigm for parking cloud services.
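To illustrate the MQTT communication principle such a service builds on (a sketch of the protocol's standard topic-filter semantics, not the paper's implementation; the parking topic names are hypothetical), subscriptions are matched to published topics level by level, with '+' matching one level and '#' matching all remaining levels:

```python
def topic_matches(pattern, topic):
    """MQTT topic-filter matching: '+' matches exactly one level,
    '#' (allowed only as the last level) matches the rest."""
    p_levels = pattern.split('/')
    t_levels = topic.split('/')
    for i, p in enumerate(p_levels):
        if p == '#':
            return i == len(p_levels) - 1   # '#' must be the final level
        if i >= len(t_levels):
            return False                    # topic ran out of levels
        if p != '+' and p != t_levels[i]:
            return False                    # literal level mismatch
    return len(p_levels) == len(t_levels)
```

A broker would use this test to decide which subscribers (e.g. of "parking/+/status") receive a message published on a concrete topic such as "parking/lot42/status".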
Effects of prior hamstring strain injury on strength, flexibility, and running mechanics.
BACKGROUND Previous studies have shown evidence of residual scar tissue at the musculotendon junction following a hamstring strain injury, which could influence re-injury risk. The purpose of this study was to investigate whether bilateral differences in strength, neuromuscular patterns, and musculotendon kinematics during sprinting are present in individuals with a history of unilateral hamstring injury, and whether such differences are linked to the presence of scar tissue. METHODS Eighteen subjects with a previous hamstring injury (>5 months prior) participated in a magnetic resonance (MR) imaging exam, isokinetic strength testing, and a biomechanical assessment of treadmill sprinting. Bilateral comparisons were made for peak knee flexion torque, angle of peak torque, and the hamstrings:quadriceps strength ratio, as well as muscle activations and peak hamstring stretch during sprinting. MR images were used to measure the volumes of the proximal tendon/aponeurosis of the biceps femoris, with asymmetries considered indicative of scar tissue. FINDINGS A significantly enlarged proximal biceps femoris tendon volume was measured on the side of prior injury. However, no significant differences between the previously injured and uninjured limbs were found in strength measures, peak hamstring stretch, or muscle activation patterns. Further, the degree of asymmetry in tendon volume was not correlated to any of the functional measures. INTERPRETATION Injury-induced changes in morphology do not seem discernable from strength measures, running kinematics, or muscle activation patterns. Further research is warranted to ascertain whether residual scarring alters localized musculotendon mechanics in a way that may contribute to the high rates of muscle re-injury that are observed clinically.
Influence of pole-arc width on cogging torque in permanent magnet motors
This work investigates the effect of the pole-arc width on the cogging torque of permanent magnet motors based on the assumption of a distributed force instead of the previously used concentrated one. On the basis of the motors' mechanical and electrical symmetry, the superposition method is employed to examine this effect, and the phase-tuning relationship between the pole-arc width and the cogging torque is obtained. This study presents a tuning method for the prediction and suppression of magnetically induced vibration, which in turn provides a theoretical foundation for the dynamic design of permanent magnet motors. The theoretical conclusions are verified by the finite element method.
Decoding the patterns of self and nonself by the innate immune system.
The innate immune system evolved several strategies of self/nonself discrimination that are based on the recognition of molecular patterns demarcating infectious nonself, as well as normal and abnormal self. These patterns are deciphered by receptors that either induce or inhibit an immune response, depending on the meaning of these signals.
Covariance Matrix Adaptation Revisited - The CMSA Evolution Strategy -
The covariance matrix adaptation evolution strategy (CMA-ES) rates among the most successful evolutionary algorithms for continuous parameter optimization. Nevertheless, it is plagued by some drawbacks, such as the complexity of the adaptation process and the reliance on a number of sophisticatedly constructed strategy-parameter formulae for which little or no theoretical substantiation is available. Furthermore, the CMA-ES does not work well for large population sizes. In this paper, we propose an alternative, simpler adaptation step for the covariance matrix which is closer to the "traditional" mutative self-adaptation. We compare the newly proposed algorithm, which we term the CMSA-ES, with the CMA-ES on a number of different test functions and are able to demonstrate its superiority, in particular for large population sizes.
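As a hedged sketch of the "traditional" mutative self-adaptation that the CMSA-ES generalizes to full covariance matrices (isotropic step size only here; the (1, λ) selection scheme and all parameter values are illustrative assumptions, not the paper's algorithm):

```python
import math
import random

def self_adaptive_es(f, x0, sigma0, lam=10, gens=200, seed=1):
    """(1, lambda)-ES with mutative step-size self-adaptation:
    each offspring first mutates its own step size log-normally,
    then uses it to perturb the parent; the best offspring survives."""
    rng = random.Random(seed)
    tau = 1.0 / math.sqrt(2 * len(x0))      # learning rate for sigma
    x, sigma = list(x0), sigma0
    for _ in range(gens):
        offspring = []
        for _ in range(lam):
            s = sigma * math.exp(tau * rng.gauss(0, 1))
            y = [xi + s * rng.gauss(0, 1) for xi in x]
            offspring.append((f(y), y, s))
        _, x, sigma = min(offspring)        # comma selection on fitness
    return x, sigma
```

The key self-adaptation idea is visible in the inner loop: the step size is itself part of the mutated genome, so step sizes that produce good offspring are inherited, without any explicitly constructed adaptation formula.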
Remote Sensing Scene Classification by Unsupervised Representation Learning
With the rapid development of satellite sensor technology, high spatial resolution remote sensing (HSR) data have attracted extensive attention in military and civilian applications. In order to make full use of these data, remote sensing scene classification becomes an important and necessary prerequisite task. In this paper, an unsupervised representation learning method is proposed to investigate deconvolution networks for remote sensing scene classification. First, a shallow weighted deconvolution network is utilized to learn a set of feature maps and filters for each image by minimizing the reconstruction error between the input image and the convolution result. The learned feature maps can capture the abundant edge and texture information of high spatial resolution images, which is definitely important for remote sensing images. After that, the spatial pyramid model (SPM) is used to aggregate features at different scales to maintain the spatial layout of the HSR image scene. A discriminative representation for the HSR image is obtained by combining the proposed weighted deconvolution model and SPM. Finally, the representation vector is input into a support vector machine to finish classification. We apply our method on two challenging HSR image data sets: the UCMerced data set with 21 scene categories and the Sydney data set with seven land-use categories. All the experimental results achieved by the proposed method outperform most state-of-the-art methods, which demonstrates the effectiveness of the proposed method.
Outcome study of the surgical management of panniculitis.
Patients with panniculus morbidus have an abdominal panniculus that becomes a pathologic entity, associated with the development of candidal intertrigo, dermatitis, lymphedema, and ischemic panniculitis. Panniculectomy is a standard treatment for this problem. The objective of this study was to determine risk factors for complications associated with panniculectomy surgery to lower the complication rate. We performed a retrospective chart review of patients who underwent panniculectomy between 1999 and 2007 by looking at data related to surgical complications, comorbidities, age, and gender. In 563 patients, we recorded the incidence of the following complications: wound-related (infection, dehiscence, and/or necrosis), hematoma/seroma, respiratory distress, blood transfusions, deep venous thrombosis or pulmonary embolism, and death. Overall, 34.3% of patients suffered at least 1 complication. In patients with wound complications specifically, there was a significantly higher body mass index versus those with no wound complications (43.7% vs. 30.7%, P < 0.0001). Smokers also had a higher rate of wound complications (40.5% vs. 19.5%, P < 0.0001).
Micronutrients reduce stress and anxiety in adults with Attention-Deficit/Hyperactivity Disorder following a 7.1 earthquake
The role of good nutrition for resilience in the face of stress is a topic of interest, but difficult to study. A 7.1 earthquake took place in the midst of research on a micronutrient treatment for Attention-Deficit/Hyperactivity Disorder (ADHD), providing a unique opportunity to examine whether individuals with ADHD taking micronutrients demonstrated more emotional resilience post-earthquake than individuals with ADHD not taking micronutrients. Thirty-three adults with ADHD were assessed twice following the earthquake using a measure of depression, anxiety and stress also completed at some point pre-earthquake (baseline). Seventeen were not taking micronutrients at the time of the earthquake (control group), 16 were (micronutrient group). While there were no between-group differences one week post-quake (Time 1), at two weeks post-quake (Time 2), the micronutrient group reported significantly less anxiety and stress than the controls (effect size 0.69). These between group differences could not be explained by other variables, such as pre-earthquake measures of emotions, demographics, psychiatric status, and personal loss or damage following the earthquake. The results suggest that micronutrients may increase resilience to ongoing stress and anxiety associated with a highly stressful event in individuals with ADHD and are consistent with controlled studies showing benefit of micronutrients for mental health.
A Generic Exact Solver for Vehicle Routing and Related Problems
Major advances were recently obtained in the exact solution of Vehicle Routing Problems (VRPs). Sophisticated Branch-Cut-and-Price (BCP) algorithms for some of the most classical VRP variants now solve many instances with up to a few hundreds of customers. However, adapting and reimplementing those successful algorithms for other variants can be a very demanding task. This work proposes a BCP solver for a generic model that encompasses a wide class of VRPs. It incorporates the key elements found in the best recent VRP algorithms: ng-path relaxation, rank-1 cuts with limited memory, and route enumeration; all generalized through the new concept of "packing set". This concept is also used to derive a new branch rule based on accumulated resource consumption and to generalize the Ryan and Foster branch rule. Extensive experiments on several variants show that the generic solver has an excellent overall performance, in many problems being better than the best existing specific algorithms. Even some non-VRPs, like bin packing, vector packing and generalized assignment, can be modeled and effectively solved.
Toward a Coherent Theory of Environmentally Significant Behavior
This article develops a conceptual framework for advancing theories of environmentally significant individual behavior and reports on the attempts of the author’s research group and others to develop such a theory. It discusses definitions of environmentally significant behavior; classifies the behaviors and their causes; assesses theories of environmentalism, focusing especially on value-belief-norm theory; evaluates the relationship between environmental concern and behavior; and summarizes evidence on the factors that determine environmentally significant behaviors and that can effectively alter them. The article concludes by presenting some major propositions supported by available research and some principles for guiding future research and informing the design of behavioral programs for environmental protection.
Image resizing via non-homogeneous warping
Image resizing aims to adapt images to displays with different sizes and aspect ratios. In this paper, we provide a new image resizing approach for efficiently determining the non-homogeneous warp that better preserves the global image configuration and concentrates the distortion in regions of the image where it is least likely to be noticed. Considering the different properties of large displays and small displays, we design different strategies for upsizing and downsizing. We define a variety of quadratic metrics to measure image distortion. We introduce a patch-linking scheme that can better preserve the global image configuration. We formulate image resizing as a quadratic minimization problem, which can be efficiently solved. We experiment with our method on a variety of image categories and compare our results to the state of the art.
MarkIt: privacy markers for protecting visual secrets
The increasing popularity of wearable devices that continuously capture video, and the prevalence of third-party applications that utilize these feeds have resulted in a new threat to privacy. In many situations, sensitive objects/regions are maliciously (or accidentally) captured in a video frame by third-party applications. However, current solutions do not allow users to specify and enforce fine grained access control over video feeds. In this paper, we describe MarkIt, a computer vision based privacy marker framework, that allows users to specify and enforce fine grained access control over video feeds. We present two example privacy marker systems -- PrivateEye and WaveOff. We conclude with a discussion of the computer vision, privacy and systems challenges in building a comprehensive system for fine grained access control over video feeds.
A Workflow for Fast Evaluation of Mapping Heuristics Targeting Cloud Infrastructures
Resource allocation is today an integral part of cloud infrastructure management to efficiently exploit resources. Cloud infrastructure centers generally use custom-built heuristics to define resource allocations. It is an immediate requirement for the management tools of these centers to have a fast yet reasonably accurate simulation and evaluation platform to define the resource allocation for cloud applications. This work proposes a framework allowing users to easily specify mappings for cloud applications described in the AMALTHEA format, used in the context of the DreamCloud European project, and to assess the quality of these mappings. The two quality metrics provided by the framework are execution time and energy consumption.
A randomized trial of a single bolus dosage regimen of recombinant tissue plasminogen activator in patients with acute pulmonary embolism.
Experiments in animals have demonstrated that recombinant tissue plasminogen activator (rt-PA) produces continuing thrombolysis after it is cleared from the circulation and that thrombolysis is both increased and accelerated, and bleeding is reduced when rt-PA is administered over a short period. In previous studies in patients with thrombotic disease, rt-PA has been shown to be an effective thrombolytic agent when administered by continuous infusion over a period between 90 minutes and 8 hours. To determine whether a short course regimen of rt-PA can achieve thrombolysis, a double-blind randomized trial has been conducted in which patients with objectively established acute symptomatic pulmonary embolism who were receiving heparin were allocated to either a 2-minute infusion of rt-PA at a dose of 0.6 mg/kg (33 patients) or saline placebo (25 patients). Perfusion lung scanning was used to assess the change in pulmonary perfusion at 24 hours and seven days post-study drug administration. Thirty-four percent of the rt-PA patients had a greater than 50 percent resolution in the perfusion defect at 24 hours compared to 12 percent of placebo patients (p = 0.026). At 24 hours, the mean relative improvement in the perfusion defect was 37.0 percent in rt-PA treated patients compared to 18.8 percent in the placebo group (p = 0.017). By day 7, no difference in lung scan resolution was detected between the groups. There were no major bleeds in either group nor were there any differences in transfusion requirements between groups. Minor bleeding occurred in 15 of the rt-PA patients mainly at angiogram-catheter insertion and venipuncture sites. These results suggest that a bolus regimen of rt-PA produces accelerated thrombolysis and provides an alternative and convenient approach to thrombolytic therapy in patients with pulmonary embolism.
A 56GS/S 6b DAC in 65nm CMOS with 256×6b memory
Modern optical systems increasingly rely on DSP techniques for data transmission at 40Gb/s and recently at 100Gb/s and above. A significant challenge towards CMOS TX DSP SoC integration arises from the requirement for four 6b DACs (Fig. 10.8.1) to operate at 56Gs/s with low power and a small footprint. To date, the highest sampling rate reported for a 6b DAC is 43Gs/s, in a SiGe BiCMOS process [1]. CMOS DAC implementations are constrained to 12Gs/s with the output signal frequency limited to 1.5GHz [2-4]. This paper demonstrates more than one order of magnitude improvement in 6b CMOS DAC design with a test circuit operating at 56Gs/s, achieving SFDR >30dBc and ENOB >4.3b up to an output frequency of 26.9GHz. Total power dissipation is less than 750mW and the core DAC die area is less than 0.6×0.4 mm².
Attributing Conversion Credit in an Online Environment: An Analysis and Classification
In the context of marketing, attribution is the process of quantifying the value of marketing activities relative to the final outcome. It is a topic rapidly growing in importance as acknowledged by the industry. However, despite numerous tools and techniques designed for its measurement, the absence of a comprehensive assessment and classification scheme persists. Thus, we aim to bridge this gap by providing an academic review to accumulate and comprehend current knowledge in attribution modeling, leading to a road map to guide future research, expediting new knowledge creation.
Anonymous Walk Embeddings
The task of representing entire graphs has seen a surge of prominent results, mainly due to learning convolutional neural networks (CNNs) on graph-structured data. While CNNs demonstrate state-of-the-art performance in the graph classification task, such methods are supervised and therefore steer away from the original problem of network representation in a task-agnostic manner. Here, we propose an approach for embedding entire graphs and show that our feature representations with an SVM classifier increase the classification accuracy of CNN algorithms and traditional graph kernels. For this we describe a recently discovered graph object, the anonymous walk, on which we design task-independent algorithms for learning graph representations in explicit and distributed ways. Overall, our work represents a new scalable unsupervised learning of state-of-the-art representations of entire graphs.
On the Limits of Recursively Self-Improving AGI
Self-improving software has been a goal of computer scientists since the founding of the field of Artificial Intelligence. In this work we analyze limits on computation that might restrict recursive self-improvement. We also introduce Convergence Theory, which aims to predict the general behavior of recursively self-improving (RSI) systems.
A Randomized Trial Comparing Cisplatin Plus 5-fluorouracil With or Without Levamisole in Operable Gastric Cancer
OBJECTIVES To determine the effectiveness and toxicity of adding levamisole to adjuvant combination chemotherapy in patients with operable gastric cancer. METHODS After en bloc resection of gastric cancer without gross or microscopic evidence of residual disease, from April 1991 to December 1992, 100 patients were randomized to 6 months of 5-fluorouracil 1,000 mg/m2/day administered as a continuous infusion for 5 days and cisplatin 60 mg/m2/day as an intravenous infusion for 1 day, with or without levamisole (50 mg every eight hours P.O. for a period of three days every 2 weeks for 6 months). Chemotherapy was begun within 2 to 4 weeks after surgery and consisted of discrete 5-day courses administered at 4-week intervals. All 100 patients were assessable. RESULTS Fifty patients were assigned to each treatment group. There was no statistical difference and no bias in the distribution of characteristics of the 100 evaluable patients between the two groups. A total of 274 courses of treatment were given in the levamisole group and 260 in the non-levamisole group. Eleven patients in each group did not finish the planned 6 courses of treatment, mainly due to non-compliance. At a median follow-up of 39 months, 32 patients had relapsed: 19 in the levamisole group and 13 in the non-levamisole group (p = 0.284). Twenty-five patients died of relapsed disease: 15 in the levamisole group and 10 in the non-levamisole group. The levamisole group tended to show a higher overall death rate and recurrence risk than the non-levamisole group, although this result was not statistically significant at 3 years. The treatment was well tolerated in both groups. The grade 2-3 toxicities were nausea/vomiting (levamisole and non-levamisole groups: 31.7% and 29.3% of treatment courses, respectively), diarrhea (7.6%, 8.4%), mucositis (11.6%, 12.3%), and leukopenia (9.8%, 9.6%).
CONCLUSION Levamisole had negative effects on disease-free survival and overall survival when added to adjuvant combination chemotherapy of cisplatin and 5-fluorouracil in patients with operable gastric cancer. Both treatment arms were generally well tolerated and the toxicity profile was similar with or without levamisole.
Systematic approach in testing the viability of mechanical partial-cut singulation process towards tin-plateable sidewalls for wettable flank on automotive QFN technology
Satisfying the stringent customer requirement of a visually detectable solder joint termination for high-reliability applications requires the implementation of robust wettable flank strategies. One strategy involves exposing the sidewall via partial-cut singulation, where the exposed surface can be made wettable through a tin (Sn) electroplating process. Herein, we report our systematic approach to evaluating the viability of a mechanical partial-cut singulation process to produce Sn-plateable sidewalls, enabling wettable flank technology using an automotive QFN package as the technology carrier. An optimization DOE produced a robust set of parameters showing that mechanical partial cut is a promising solution for producing sidewalls appropriate for Sn electroplating, yielding excellent wettable flanks.
EPPN: Extended Prime Product Number based wormhole detection scheme for MANETs
MANETs are an emerging technology that has gained momentum in recent years. Due to their unique characteristics, MANETs suffer from a wide range of security attacks. The wormhole attack is a common security issue encountered in MANET routing protocols. A new routing protocol named extended prime product number (EPPN), based on a hop count model, is proposed in this article. Here the hop count between source and destination is obtained from the current active route. This hop count model is integrated into the AODV protocol. In the proposed scheme, the route is first selected on the basis of the RREP, and then the hop count model calculates the hop count between source and destination. Finally, the wormhole detection procedure is started if the calculated hop count is greater than the hop count received in the route, in order to identify suspected nodes.
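A minimal sketch of the detection criterion described above: a route becomes suspect when the hop count computed from the model exceeds the hop count advertised in the RREP, since a wormhole tunnel makes a long path look deceptively short. The function names and data shapes are illustrative assumptions, not the paper's implementation.

```python
def detect_wormhole(computed_hops: int, received_hops: int) -> bool:
    """Flag a route when the hop count computed by the model exceeds
    the hop count carried in the RREP, the signature of a wormhole
    tunnel advertising an artificially short path."""
    return computed_hops > received_hops

def suspected_routes(routes):
    """Filter candidate routes, given as (route_id, computed_hops,
    received_hops) tuples, down to those flagged by the check."""
    return [rid for rid, computed, received in routes
            if detect_wormhole(computed, received)]
```

On a route table such as `[("r1", 5, 5), ("r2", 9, 4)]`, only `r2` is flagged: its model-derived distance of 9 hops contradicts the 4 hops it advertised.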
Structural, electrical and dielectric characteristics of strontium-modified CaCu3Ti4O12
In order to produce an ultra-high dielectric constant and a low loss factor in a polycrystalline material, Sr2+-modified (15% concentration) CaCu3Ti4O12 (i.e., Ca0.85Sr0.15Cu3Ti4O12, abbreviated as CSCTO-15) was prepared using a cost-effective solid-state reaction technique. Structural analysis using Rietveld refinement of the X-ray diffraction data confirmed that the sample crystallizes in a cubic system with space group symmetry Im3. Morphological/microstructural analysis of the natural surface of a pellet shows dense grain growth, with clearly visible grain boundaries. The dielectric relaxation mechanism of CSCTO-15 has been revealed by a detailed study of the frequency- and temperature-dependent dielectric parameters (εr and tan δ). Analysis of the frequency and temperature dependence of impedance and related parameters, collected by a complex impedance spectroscopic technique, has provided estimates of the contributions of grains, grain boundaries, and the electrode to the electrical processes in the material. The activation energy at high temperatures has been calculated from the temperature-dependent ac conductivity plots at selected frequencies.
The cognitive control of emotion
The capacity to control emotion is important for human adaptation. Questions about the neural bases of emotion regulation have recently taken on new importance, as functional imaging studies in humans have permitted direct investigation of control strategies that draw upon higher cognitive processes difficult to study in nonhumans. Such studies have examined (1) controlling attention to, and (2) cognitively changing the meaning of, emotionally evocative stimuli. These two forms of emotion regulation depend upon interactions between prefrontal and cingulate control systems and cortical and subcortical emotion-generative systems. Taken together, the results suggest a functional architecture for the cognitive control of emotion that dovetails with findings from other human and nonhuman research on emotion.
Current-Only Directional Overcurrent Relay
Overcurrent relays are widely used for the protection of power systems: directional relays on the transmission side and nondirectional relays on the distribution side. The fault direction may be forward (between the relay and the grid) or reverse (between the relay and the source), the normal power flow being from the source to the grid. Known directional overcurrent relays rely on a reference voltage phasor to estimate the direction of the fault, requiring both current and voltage sensors. This increases the cost of the relays, prohibiting their use in distribution-side protection and automation, which is set to be a key part of the smart grid initiative. In this paper, a novel current-only directional detection possibility is highlighted.
THREE-AXIS AIR-BEARING BASED PLATFORM FOR SMALL SATELLITE ATTITUDE DETERMINATION AND CONTROL SIMULATION
A frictionless environment simulation platform, used to carry out three-axis attitude control tests for small satellites, is introduced. It is employed to develop, improve, and objectively test sensors, actuators, and algorithms in an experimental framework. Different sensors (i.e. sun, earth, magnetometer, and an inertial measurement unit) are used to assess three-axis deviations. A set of three inertial wheels serves as primary actuators for attitude control, together with three mutually perpendicular magnetic coils intended for desaturation purposes and as a backup control system. Accurate balancing, through relocation of the platform's center of mass onto the geometrical center of the spherical air-bearing, significantly reduces gravitational torques, generating a virtually torque-free environment. A very practical balancing procedure was developed for equilibrating the table in the local horizontal plane, with a reduced final residual torque. A wireless monitoring system was developed for on-line and post-processing analysis; attitude data are displayed and stored, allowing proper evaluation of the sensors, actuators, and algorithms. A specifically designed onboard computer and a set of microcontrollers carry out attitude determination and control tasks in a distributed control scheme. The main components and subsystems of the simulation platform are described in detail.
Adaptive Adversarial Attack on Scene Text Recognition
Recent studies have shown that state-of-the-art deep learning models are vulnerable to inputs with small perturbations (adversarial examples). We observe two critical obstacles in adversarial examples: (i) strong adversarial attacks (e.g., the C&W attack) require manual tuning of hyper-parameters and take a long time to construct an adversarial example, making it impractical to attack real-time systems; (ii) most studies focus on non-sequential tasks, such as image classification, and only a few consider sequential tasks. In this work, we speed up adversarial attacks, especially on sequential learning tasks. By leveraging the uncertainty of each task, we directly learn adaptive multi-task weightings, without manually searching hyper-parameters. A unified architecture is developed and evaluated for both non-sequential and sequential tasks. To validate the effectiveness, we take scene text recognition as a case study. To the best of our knowledge, our proposed method is the first attempt at adversarial attacks on scene text recognition. Adaptive Attack achieves over 99.9% success rate with a 3 to 6× speedup compared to state-of-the-art adversarial attacks.
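The abstract does not spell out its exact weighting rule; the sketch below shows the standard uncertainty-based multi-task weighting it alludes to, in which each task loss is scaled by a learned log-variance term plus a regularizer. Treat this as an assumption about the general scheme, not the paper's own formulation.

```python
import math

def adaptive_multitask_loss(losses, log_vars):
    """Uncertainty-weighted multi-task objective: each task loss L_i
    is scaled by exp(-s_i), where s_i = log(sigma_i^2) is learned
    jointly with the model, and the additive s_i term penalizes
    inflating sigma_i to zero out a task."""
    total = 0.0
    for L, s in zip(losses, log_vars):
        total += math.exp(-s) * L + s
    return total
```

With all `log_vars` at zero the objective reduces to the plain sum of task losses; raising a task's log-variance down-weights that task while paying the additive penalty, which is how the weighting adapts without a manual hyper-parameter search.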
An analysis of legal warnings after drug approval in Thailand.
Drug risk management has many tools for minimizing risk, and black-boxed warnings (BBWs) are one of them. Some serious adverse drug reactions (ADRs) emerge only after a drug is marketed and used in a larger population. In Thailand, additional legal warnings after drug approval, in the form of black-boxed warnings, may be applied. A review of their characteristics can assist in the development of effective risk mitigation. This study was a cross-sectional review of all legal warnings imposed in Thailand after drug approval (2003-2012). Any boxed warnings for biological products and revised warnings not related to safety were excluded. Nine legal warnings were evaluated: seven related to drug classes and two to individual drugs. The warnings involved four main types of predictable ADRs: drug-disease interactions, side effects, overdose, and drug-drug interactions. The average time from the first reported ADR to legal warning implementation was 12 years. The triggers came from both safety signals in Thailand and regulatory measures in countries outside Thailand.
American College of Cardiology/American Heart Association clinical practice guidelines: Part II: evolutionary changes in a continuous quality improvement project.
Case Presentation: A 50-year-old male presents to your office for evaluation of a recent episode of atrial fibrillation. The patient had no prior history of atrial fibrillation until 3 days ago. At that time, he was working in his home woodworking shop on a large cabinet. The cabinet slipped out of the clamps holding it and fell on his right great toe. The patient was having moderate to severe toe pain when he noticed that his heart was beating rapidly and irregularly. His palpitations were not associated with any chest pain, shortness of breath, or lightheadedness. He went to his local emergency room. His blood pressure was 135/80, and his pulse was rapid and irregularly irregular. An ECG showed that he was in atrial fibrillation with a ventricular response rate of 160. There were no ST segment changes. Before the patient received any therapy, he converted to normal sinus rhythm. The total duration of his episode of atrial fibrillation was 2 hours. A subsequent ECG was entirely normal. X-rays of his right foot did not show any fracture. He was discharged from the local emergency room and advised to see you. The patient is physically active. He denies any history of chest pain or chest pressure. He has no history of hypertension, diabetes, or tobacco use. His only other medical problem is mild asthma, treated with occasional inhalers. Both his mother and father lived into their late 80s and died of cancer. His two siblings are both alive and well without any cardiovascular disease. On physical examination, his blood pressure is 120/80 mm Hg. His heart rate is 70 bpm and regular. His cardiac examination is normal. On lung examination, there are rare wheezes over both lung fields. His right great toe is badly bruised. His ECG from the local emergency room is available and is normal.
What are the appropriate next steps in the evaluation of this patient, as outlined in the American College of Cardiology (ACC)/American Heart Association (AHA)/European Society of Cardiology (ESC) Guidelines for the Management of Patients With Atrial Fibrillation?1 Appropriate clinical evaluation is shown in Table 1. This patient merits a chest x-ray (to evaluate his rare wheezes), a transthoracic echocardiogram, and blood tests of thyroid function. None of the additional tests listed, ie, exercise testing, Holter monitoring, transesophageal echocardiography, or electrophysiological study, are appropriate at this time.
Timing of first alcohol use and alcohol dependence: evidence of common genetic influences.
AIMS To estimate the magnitude of genetic and environmental influences on the timing of first alcohol use and alcohol dependence (AD) and to quantify the overlap in these influences across the two alcohol-related outcomes. PARTICIPANTS The sample consisted of 5382 twins (2691 complete pairs), aged 24-36 years, from the Australian Twin Registry. MEASUREMENTS History of alcohol use and DSM-IV alcohol dependence were assessed by structured telephone interview. FINDINGS In both sexes, the relationship between age at first alcohol use and risk for AD followed a linear trend, such that the highest rates of AD were observed in individuals who began drinking at an earlier than average age (14 years or younger). Heritability estimates for timing of first alcohol use and AD were 36% and 53%, respectively. Shared environmental factors accounted for 15% of variance in initiation. There was no evidence of shared environmental influences on AD. The genetic correlation between timing of first alcohol use and AD was 0.59. CONCLUSIONS Findings highlight the substantial role of genetics in the development of AD and the early manifestation of that genetic risk in the timing of alcohol use initiation, which, unlike AD, is also influenced to a modest degree by shared environmental factors. The considerable overlap in heritable influences, and the virtual absence of overlap in individual-specific environmental influences, on initiation of alcohol use and AD indicates that the association between age at first drink and AD is attributable in large part to common genetic sources of variance.
Focus on Extracellular Vesicles: Therapeutic Potential of Stem Cell-Derived Extracellular Vesicles
The intense research focus on stem and progenitor cells could be attributed to their differentiation potential to generate new cells to replace diseased or lost cells in many highly intractable degenerative diseases, such as Alzheimer disease, multiple sclerosis, and heart diseases. However, experimental and clinical studies have increasingly attributed the therapeutic efficacy of these cells to their secretions. While stem and progenitor cells secrete many therapeutic molecules, none of these molecules, singly or in combination, could recapitulate the functional effects of stem cell transplantations. Recently, it was reported that extracellular vesicles (EVs) could recapitulate the therapeutic effects of stem cell transplantation. Based on the observations reported thus far, the prevailing hypothesis is that stem cell EVs exert their therapeutic effects by transferring biologically active molecules such as proteins, lipids, mRNA, and microRNA from the stem cells to injured or diseased cells. In this respect, stem cell EVs are similar to EVs from other cell types. Both are primarily vehicles for intercellular communication. Therefore, the differentiating factor is likely the composition of their cargo. The cargo of EVs from different cell types is known to include a common set of proteins as well as proteins that reflect the cell source of the EVs and the physiological or pathological state of the cell source. Hence, elucidation of the stem cell EV cargo would provide insight into the multiple physiological or biochemical changes necessary to effect the many reported stem cell-based therapeutic outcomes in a variety of experimental models and clinical trials.
Nipple-Sparing Mastectomy for Breast Cancer and Risk-Reducing Surgery: The Memorial Sloan-Kettering Cancer Center Experience
Nipple-sparing mastectomy (NSM) has been gathering increased recognition as an alternative to more traditional mastectomy approaches. Initially, questions concerning its oncologic safety limited the use of NSM. Nevertheless, mounting evidence supporting the practice of NSM for both prophylactic and oncologic purposes is leading to its more widespread use and broadened indications. Using a prospectively maintained database, we reviewed our experience of 353 NSM procedures performed in 200 patients over the past 10 years. The indications for surgery were: 196 prophylactic risk-reduction (55.5%), 74 ductal carcinoma in situ (DCIS) (20.8%), 82 invasive cancer (23.2%), and 1 phyllodes tumor (0.5%). The nipple areolar complex (NAC) was entirely preserved in 341 mastectomies (96.7%). There were 11 patients (3.1%) who were found to have cancer at the nipple margin, warranting further excision. A total of 69 breasts (19.5%) had some degree of skin desquamation or necrosis, but only 12 (3.3%) required operative debridement, of which 3 breasts (1%) necessitated removal of a breast implant. Also, 6 patients (2%) were treated for infection. Of the 196 prophylactic NSMs, 11 specimens (5.6%) were found to harbor occult cancer (8 DCIS and 3 invasive cancers). One patient who underwent NSM for invasive ductal carcinoma in 2006 developed metastatic disease to her brain. No other recurrences are attributable to the 353 NSMs. The trends demonstrate the increasing acceptance of NSM as a prophylactic procedure as well as for therapeutic purposes. Although NSM is not standard, our experience supports the selective use of NSM in both prophylactic and malignant settings.
Strain rate imaging after dynamic stress provides objective evidence of persistent regional myocardial dysfunction in ischaemic myocardium: regional stunning identified?
OBJECTIVE To investigate whether persistent ischaemic dysfunction of the myocardium after dynamic stress can be diagnosed from changes in ultrasonic strain rate and strain. DESIGN Prospective observational study, with age matched controls. SETTING University hospital. PATIENTS AND METHODS 26 patients (23 men, mean (SD) age 58.9 (8.1) years) with coronary artery disease but no infarction and 12 controls (9 men, aged 56.1 (8.8) years) with normal coronary arteriography and negative exercise test underwent treadmill exercise (Bruce protocol). Tissue Doppler echocardiography was performed at baseline, at peak exercise, and at intervals up to one hour. Systolic and diastolic velocity, strain, and strain rate were recorded in the basal anterior segment of 16 patients with proximal left anterior descending coronary artery disease. RESULTS Patients developed ischaemia, since they experienced angina, exercised for less time, and reached a lower workload than the control group, and had ST segment depression (-2.4 mm). Myocardial systolic velocity immediately after exercise increased by 31% and strain rate fell by 25% compared with increases of 92% and 62%, respectively, in the control group (p < 0.05). During recovery, myocardial systolic velocity and strain rate normalised quickly, whereas systolic strain remained depressed at 30 and 60 minutes after exercise, by 21% and 23%, respectively, compared with baseline (p < 0.05 versus controls). Myocardial diastolic velocities and strain rate normalised but early diastolic strain remained depressed by 32% compared with controls for 60 minutes (p < 0.05). Strain during atrial contraction was abnormal for 30 minutes. CONCLUSIONS Myocardial strain shows regional post-ischaemic dysfunction in systole and diastole and may become a useful diagnostic tool in patients presenting with chest pain with a normal ECG.
Electronic media use and adolescent health and well-being: cross-sectional community study.
OBJECTIVE To describe the time adolescents spend using electronic media (television, computer, video games, and telephone); and to examine associations between self-reported health/well-being and daily time spent using electronic media overall and each type of electronic media. METHODS Design-Cross-sectional data from the third (2005) wave of the Health of Young Victorians Study, an Australian school-based population study. Outcome Measures-Global health, health-related quality of life (HRQoL; KIDSCREEN), health status (Pediatric Quality of Life Inventory 4.0; PedsQL), depression/anxiety (Kessler-10), and behavior problems (Strengths and Difficulties Questionnaire). Exposure Measures-Duration of electronic media use averaged over 1 to 4 days, recalled with the Multimedia Activity Recall for Children and Adolescents (MARCA) computerized time-use diary. Analysis-Linear and logistic regression; adjusted for demographic variables and body mass index z score. RESULTS A total of 925 adolescents (mean ± SD age, 16.1 ± 1.2 years) spent, on average, 3 hours 16 minutes per day using electronic media (television, 128 minutes per day; video games, 35; computers, 19; telephone, 13). High overall electronic media use was associated with poorer behavior, health status, and HRQoL. Associations with duration of specific media exposures were mixed; there was a favorable association between computer use (typing/Internet) and psychological distress, whereas high video game use was associated with poorer health status, HRQoL, global health, and depression/anxiety. Television and telephone durations were not associated with any outcome measure. CONCLUSIONS Despite television's associations with obesity, time spent in other forms of media use appears more strongly related to adolescent health and well-being. This study supports efforts to reduce high video game use and further exploration of the role of computers in health enhancement.
Ganglion cyst of the wrist treated with electroacupuncture: a case report.
Objective To illustrate the clinical management of a ganglion cyst presenting on the right dorsal wrist. Clinical Features A 38-year-old female complaining of a symptomatic right dorsal wrist ganglion of four years' duration. Intervention and Outcome The patient was treated with high-frequency electroacupuncture in six consecutive treatments over a four-week period and reported symptomatic improvement and a decrease in the size of the cyst following therapeutic intervention. Conclusion Ganglion cysts of the wrist are rather common benign connective tissue masses with variable treatment interventions. Electroacupuncture may be a novel and non-invasive conservative approach for the treatment of ganglion cysts. Further evaluation of its efficacy is warranted.
Optimizing Interventions via Offline Policy Evaluation: Studies in Citizen Science
Volunteers who help with online crowdsourcing such as citizen science tasks typically make only a few contributions before exiting. We propose a computational approach for increasing users' engagement in such settings that is based on optimizing policies for displaying motivational messages to users. The approach, which we refer to as Trajectory Corrected Intervention (TCI), reasons about the tradeoff between the long-term influence of engagement messages on participants' contributions and the potential risk of disrupting their current work. We combine model-based reinforcement learning with offline policy evaluation to generate intervention policies, without relying on a fixed representation of the domain. TCI works iteratively to learn the best representation from a set of random intervention trials and to generate candidate intervention policies. It is able to refine selected policies offline by exploiting the fact that users can only be interrupted once per session. We implemented TCI in the wild with Galaxy Zoo, one of the largest citizen science platforms on the web. We found that TCI outperformed the state-of-the-art intervention policy for this domain and significantly increased the contributions of thousands of users. This work demonstrates the benefit of combining traditional AI planning with offline policy methods to generate intelligent intervention strategies.
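A minimal sketch of the offline (off-policy) evaluation ingredient: estimating a candidate intervention policy's value from logged trajectories via per-trajectory importance sampling, which reweights observed returns by the likelihood ratio of the candidate policy to the logging policy. The data layout and function names are illustrative assumptions, not the paper's TCI implementation.

```python
def importance_sampling_value(trajectories, target_policy, behavior_policy):
    """Estimate the expected return of target_policy from trajectories
    logged under behavior_policy. Each trajectory is a list of
    (state, action, reward) tuples; each policy is a callable
    policy(action, state) -> probability."""
    total = 0.0
    for traj in trajectories:
        weight = 1.0   # cumulative likelihood ratio along the trajectory
        ret = 0.0      # undiscounted return of the trajectory
        for state, action, reward in traj:
            weight *= target_policy(action, state) / behavior_policy(action, state)
            ret += reward
        total += weight * ret
    return total / len(trajectories)
```

When the candidate policy equals the logging policy every weight is 1 and the estimator reduces to the average logged return; shifting probability mass toward better-rewarded actions raises the estimate, which is what lets candidate intervention policies be ranked without new online trials.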