title | abstract |
---|---|
A novel CNTFET-based ternary logic gate design | This paper presents a novel design of ternary logic inverters using carbon nanotube FETs (CNTFETs). Multiple-valued logic (MVL) circuits have attracted substantial interest due to their capability of increasing the information content per unit area. In the past, extensive design techniques for MVL circuits (especially ternary logic inverters) have been proposed for implementation in CMOS technology. In a CNTFET, the threshold voltage of the transistor can be controlled through the chirality vector (i.e. the diameter); in this paper, this feature is exploited to design ternary logic inverters. New designs are proposed and compared with existing CNTFET-based designs. Extensive SPICE simulation results demonstrate that the power-delay product is improved by 300% compared with the conventional ternary gate design. |
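As background for the diameter/threshold-voltage link mentioned above, a first-order relation commonly quoted in the CNTFET literature (an assumption here, since the abstract does not state it) is

$$D_{CNT} = \frac{a\sqrt{n_1^2 + n_1 n_2 + n_2^2}}{\pi}, \qquad V_{th} \approx \frac{\sqrt{3}}{3}\,\frac{a\,V_\pi}{e\,D_{CNT}} \approx \frac{0.43}{D_{CNT}\,[\mathrm{nm}]}\ \mathrm{V},$$

where $(n_1, n_2)$ is the chirality vector, $a \approx 0.249\,\mathrm{nm}$ is the carbon lattice constant and $V_\pi \approx 3.033\,\mathrm{eV}$ the carbon $\pi$-bond energy. For example, a (19,0) tube gives $D_{CNT} \approx 1.5\,\mathrm{nm}$ and $V_{th} \approx 0.29\,\mathrm{V}$, while a (10,0) tube gives $D_{CNT} \approx 0.8\,\mathrm{nm}$ and $V_{th} \approx 0.55\,\mathrm{V}$; pairs of such diameters are how distinct threshold voltages for ternary logic are typically obtained.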
Tapping on the potential of q&a community by recommending answer providers | The rapidly increasing popularity of community-based Question Answering (cQA) services, e.g. Yahoo! Answers and Baidu Zhidao, has attracted great attention from both academia and industry. Beyond basic problems such as question searching and answer finding, the low participation rate of users in cQA services is a crucial problem that limits their development potential. In this paper, we address this problem by recommending answer providers: a question is given as a query and a ranked list of users is returned according to the likelihood of answering the question. Building on this intuitive idea, we introduce a topic-level model to improve on heuristic term-level methods, which serve as baselines. The proposed approach consists of two steps: (1) discovering latent topics in the content of questions and answers, as well as latent interests of users, to build user profiles; (2) recommending answerers for newly arrived questions based on the latent topics and the term-level model. Specifically, we develop a general generative model for questions and answers in cQA, which is then altered to obtain a novel, computationally tractable Bayesian network model. Experiments are carried out on real-world data crawled from Yahoo! Answers between Jun 12, 2007 and Aug 04, 2007, consisting of 118,510 questions, 772,962 answers and 150,324 users. The experimental results reveal significant improvements over the baseline methods and validate the positive influence of topic-level information. |
Marine Pollution Prevention in Bangladesh: A Way Forward for Implement Comprehensive National Legal Framework | After the permanent demarcation of its maritime boundaries with Myanmar and India, the Bangladesh government has emphasized the development of marine resources to enhance its economy. However, Bangladesh faces several challenges in achieving blue growth, among which protecting the marine environment from coastal and marine pollution is important. Without appropriate control measures, coastal and marine ecosystems remain largely unprotected from land-based and sea-based pollution sources, including sewage, industrial pollution, coastal development and habitat destruction, which collectively create microbial pollution. The oceanic waters of the country are rich in marine resources, including a diversity of fishes, but pollution has devastating impacts on marine biota as well as on human health. At present, Bangladesh has no ocean governance framework or policy to protect these resources from pollution. In this context, this paper provides a comprehensive policy framework for marine pollution control in Bangladesh, with an analysis of national and international legislation. A case study is also presented to illustrate the present status of ship-breaking pollution in this region. |
On the relationship between the “default mode network” and the “social brain” | The default mode network (DMN) of the brain consists of areas that are typically more active during rest than during active task performance. Recently however, this network has been shown to be activated by certain types of tasks. Social cognition, particularly higher-order tasks such as attributing mental states to others, has been suggested to activate a network of areas at least partly overlapping with the DMN. Here, we explore this claim, drawing on evidence from meta-analyses of functional MRI data and recent studies investigating the structural and functional connectivity of the social brain. In addition, we discuss recent evidence for the existence of a DMN in non-human primates. We conclude by discussing some of the implications of these observations. |
A Deep Learning Approach for Network Intrusion Detection System | A Network Intrusion Detection System (NIDS) helps system administrators to detect network security breaches in their organization. However, many challenges arise while developing a flexible and effective NIDS for unforeseen and unpredictable attacks. In this work, we propose a deep learning based approach to implement such an effective and flexible NIDS. We use Self-taught Learning (STL), a deep learning based technique, on NSL-KDD, a benchmark dataset for network intrusion detection. We present the performance of our approach and compare it with previous work, using accuracy, precision, recall, and F-measure as metrics. |
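As an illustration of the self-taught-learning pipeline sketched above (unsupervised feature learning on unlabeled traffic, then a supervised classifier on the learned features), here is a minimal Python sketch. It uses scikit-learn, substitutes a plain reconstruction MLP for the sparse autoencoder, and the arrays are random placeholders rather than NSL-KDD data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler

# Placeholder data standing in for NSL-KDD features/labels (0 = normal, 1 = attack).
rng = np.random.default_rng(0)
X_unlabeled = rng.random((2000, 41))      # unlabeled traffic records
X_train, y_train = rng.random((500, 41)), rng.integers(0, 2, 500)
X_test = rng.random((200, 41))

scaler = MinMaxScaler().fit(X_unlabeled)
Xu, Xtr, Xte = (scaler.transform(X) for X in (X_unlabeled, X_train, X_test))

# Stage 1: learn a feature representation by reconstructing unlabeled inputs
# (a stand-in for the sparse autoencoder used in self-taught learning).
ae = MLPRegressor(hidden_layer_sizes=(20,), activation='relu',
                  max_iter=500, random_state=0)
ae.fit(Xu, Xu)

def encode(X):
    # Forward pass through the learned hidden layer only.
    return np.maximum(0, X @ ae.coefs_[0] + ae.intercepts_[0])

# Stage 2: train a supervised classifier on the learned features.
clf = LogisticRegression(max_iter=1000).fit(encode(Xtr), y_train)
pred = clf.predict(encode(Xte))
```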
Treatment of heart failure in real-world clinical practice: findings from the REFLECT-HF registry in patients with NYHA class II symptoms and a reduced ejection fraction. | BACKGROUND
Optimal medical therapy (OMT) for patients with chronic heart failure and a reduced ejection fraction (HF-REF) includes angiotensin-converting enzyme inhibitors/angiotensin receptor blockers, β-blockers, and mineralocorticoid receptor antagonists, plus a diuretic.
HYPOTHESIS
We hypothesized that OMT is less often prescribed in HF-REF patients (ejection fraction ≤35%) with New York Heart Association (NYHA) class II symptoms compared with those with NYHA class III/IV symptoms.
METHODS
This was a cross-sectional, observational, multicenter survey of hospital-based cardiologists, office-based cardiologists, and general practitioners in Germany.
RESULTS
Out of a total of 384 patients enrolled, 144 had REF ≤35%. Patients with REF had NYHA class II symptoms in 39.6% (n = 57) and NYHA class III/IV symptoms in 60.4% (n = 87). The REF/NYHA class II group had a higher proportion of males than the REF/NYHA class III/IV group. For angiotensin-converting enzyme inhibitors/angiotensin receptor blockers and β-blockers, prescription rates were high and comparable between groups. However, prescription rates for mineralocorticoid receptor antagonists were lower compared with other guideline-recommended treatments. Multivariate analyses indicated that OMT prescription was reduced for older patients and increased for patients cared for by an office-based cardiologist.
CONCLUSIONS
Given the high proportion of patients with reduced left ventricular systolic function but only minor symptoms, HF-REF appears to be underdiagnosed, and a higher proportion of patients than are currently recognized could potentially be candidates for OMT. |
A Reconfigurable Circularly Polarized Microstrip Antenna With a Slotted Ground Plane | This letter describes a square patch antenna with a switchable circular polarization (CP) sense. The proposed antenna has four L-shaped slots on the ground plane, and CP radiation is generated by the current perturbation due to the slotted ground plane. Because the CP sense of the proposed antenna is altered by the current path, which is redirected with switched p-i-n diodes on the slots, the CP direction can be simply switched between right-handed and left-handed CP. As the slots and bias circuits are not placed on the patch side, the proposed antenna has a simple structure and can radiate a CP wave without altering the main beam direction. The experimental results show that the antenna has excellent switchable radiation performance at 2.4 GHz. |
Line-to-Line Fault Detection for Photovoltaic Arrays Based on Multiresolution Signal Decomposition and Two-Stage Support Vector Machine | Fault detection in photovoltaic (PV) arrays becomes difficult as the number of PV panels increases. Particularly, under low irradiance conditions with an active maximum power point tracking algorithm, line-to-line (L-L) faults may remain undetected because of low fault currents, resulting in loss of energy and potential fire hazards. This paper proposes a fault detection algorithm based on multiresolution signal decomposition for feature extraction, and two-stage support vector machine (SVM) classifiers for decision making. This detection method only requires data of the total voltage and current from a PV array and a limited amount of labeled data for training the SVM. Both simulation and experimental case studies verify the accuracy of the proposed method. |
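A minimal sketch of the described two-stage pipeline, assuming PyWavelets and scikit-learn; the wavelet family, decomposition level, and the synthetic waveforms and labels below are illustrative placeholders, not the paper's settings.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def mrsd_features(voltage, current, wavelet='db4', level=3):
    """Multiresolution signal decomposition: energy in each wavelet band."""
    feats = []
    for signal in (voltage, current):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        feats.extend(np.sum(c ** 2) for c in coeffs)   # band energies
    return np.array(feats)

# Placeholder array-level waveforms and labels.
rng = np.random.default_rng(1)
segments = [(rng.standard_normal(256), rng.standard_normal(256)) for _ in range(200)]
X = np.array([mrsd_features(v, i) for v, i in segments])
y_normal = rng.integers(0, 2, 200)   # stage 1 label: normal (0) vs. abnormal (1)
y_fault = rng.integers(0, 2, 200)    # stage 2 label: L-L fault vs. other abnormal condition

# Stage 1: normal vs. abnormal; Stage 2 is trained on abnormal samples only.
stage1 = SVC(kernel='rbf').fit(X, y_normal)
abnormal = y_normal == 1
stage2 = SVC(kernel='rbf').fit(X[abnormal], y_fault[abnormal])

# Inference: run stage 2 only on segments that stage 1 flags as abnormal.
flagged = stage1.predict(X) == 1
ll_fault = stage2.predict(X[flagged]) if flagged.any() else np.array([])
```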
Cluster-based concept invention for statistical relational learning | We use clustering to derive new relations which augment the database schema used in the automatic generation of predictive features in statistical relational learning. Entities derived from clusters increase the expressivity of feature spaces by creating new first-class concepts which contribute to the creation of new features. For example, in CiteSeer, papers can be clustered based on words or citations, giving "topics", and authors can be clustered based on the documents they co-author, giving "communities". Such cluster-derived concepts become part of more complex feature expressions. Out of the large number of generated features, those which improve predictive accuracy are kept in the model, as decided by statistical feature selection criteria. We present results demonstrating improved accuracy on two tasks, venue prediction and link prediction, using CiteSeer data. |
Prediction of antidepressant response to venlafaxine by a combination of early response assessment and therapeutic drug monitoring. | INTRODUCTION
Early assessment of a therapeutic response is a central goal in antidepressant treatment. The present study examined the potential for therapeutic drug monitoring and symptom rating to predict venlafaxine treatment efficacy (measured by overall patient response and remission).
METHODS
88 patients were uptitrated homogenously to 225 mg/day venlafaxine. Serum concentrations of venlafaxine (VEN) and its active metabolite O-desmethylvenlafaxine (ODV) were measured at week 2. Continuous psychopathometric ratings were measured for up to 6 weeks by independent study raters.
RESULTS
An early improvement was significantly more common in venlafaxine responders than non-responders (χ² test; p=0.007). While ODV serum levels were significantly higher in responders (t test; p=0.006), VEN serum levels, the sum of VEN+ODV levels, and the ODV/VEN ratio were not. Moreover, patients who showed an early response combined with an ODV serum level above the median of 222 ng/mL were significantly more likely to achieve full response (binary logistic model; p<0.01). Sensitivity (84% for early response) and specificity (81% for the combination of early response and therapeutic drug monitoring) were sufficient to qualify as a reasonable screening instrument.
CONCLUSION
Our results indicate that early improvement and ODV serum concentration are predictive of therapeutic outcome and can thus be used to guide use of the antidepressant venlafaxine. |
Heterogeneous Integration for Mid-infrared Silicon Photonics | Heterogeneous integration enables the construction of silicon (Si) photonic systems, which are fully integrated with a range of passive and active elements including lasers and detectors. Numerous advancements in recent years have shown that heterogeneous Si platforms can be extended beyond near-infrared telecommunication wavelengths to the mid-infrared (MIR) (2–20 μm) regime. These wavelengths hold potential for an extensive range of sensing applications and the necessary components for fully integrated heterogeneous MIR Si photonic technologies have now been demonstrated. However, due to the broad wavelength range and the diverse assortment of MIR technologies, the optimal platform for each specific application is unclear. Here, we overview Si photonic waveguide platforms and lasers at the MIR, including quantum cascade lasers on Si. We also discuss progress toward building an integrated multispectral source, which can be constructed by wavelength beam combining the outputs from multiple lasers with arrayed waveguide gratings and duplexing adiabatic couplers. |
Improving Automated Controversy Detection on the Web | Automatically detecting controversy on the Web is a useful capability for a search engine to help users review web content with a more balanced and critical view. The current state-of-the-art approach is to find the K-Nearest-Neighbors in Wikipedia to the document query, and to aggregate their controversy scores, which are automatically computed from Wikipedia edit-history features. In this paper, we identify two major weaknesses in the prior work and propose modifications. First, the single query generated from the document to find KNN Wikipedia pages easily becomes ambiguous. We therefore propose to generate multiple queries from smaller but more topically coherent paragraphs of the document. Second, the automatically computed controversy scores of Wikipedia articles, which depend on "edit war" features, have a drawback: without an edit history, there can be no edit wars. To infer more reliable controversy scores for articles with little edit history, we smooth the original score using the scores of neighbors with a more established edit history. We show that the modified framework improves binary controversy classification by up to 5% on a publicly available dataset. |
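The smoothing step can be pictured as a simple interpolation weighted by the size of an article's own edit history; the pseudo-count weighting below is an illustrative assumption, not the paper's exact formula.

```python
def smoothed_controversy(score, n_edits, neighbor_scores, k=50.0):
    """Interpolate an article's own controversy score with its neighbors' scores,
    trusting the article's own edit history more as it grows (k is a pseudo-count)."""
    neighbor_avg = sum(neighbor_scores) / len(neighbor_scores)
    w = n_edits / (n_edits + k)
    return w * score + (1.0 - w) * neighbor_avg

# A page with almost no edit history leans heavily on its neighbors:
print(smoothed_controversy(0.0, n_edits=3, neighbor_scores=[0.7, 0.5, 0.9]))
```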
Survival of childhood polycystic kidney disease following renal transplantation: the impact of advanced hepatobiliary disease. | Childhood PKD encompasses the diagnoses of AR and ADPKD, glomerulocystic disease, and syndromes such as tuberous sclerosis or Jeune's syndrome. Given that a majority of PKD children with ESRD carry the diagnosis of ARPKD, natural history studies assessing the long-term prognosis of PKD patients following renal transplantation must focus on morbidity and mortality issues related to complications from congenital hepatic fibrosis. Using the NAPRTCS registry, we analyzed the patient and graft survival rates of 203 PKD patients and 7044 non-PKD patients undergoing renal transplantation between 1987 and 2001. Deceased PKD patients, all with a diagnosis of ARPKD, were further identified and characterized using a special questionnaire submitted to the principal investigators. Overall graft and patient survival rates were not significantly different between PKD and non-PKD patients. No differences in rates of acute rejection or time to first rejection were noted between PKD and non-PKD patients. The relative risk of living longer than 3 yr in the PKD patients was not significantly different from that in non-PKD patients (RR = 0.70, p = 0.28). Sepsis was identified as a likely factor in the cause of death in nine (64%) ARPKD patients and was confirmed with a positive blood culture in four patients. Despite similar graft and patient survival rates among PKD and non-PKD children following renal transplantation, our results suggest that ARPKD transplant recipients appear to be at increased risk for sepsis that may be related to hepatic fibrosis and ascending cholangitis. The utility of early liver transplantation in ARPKD patients with significant hepatobiliary disease is discussed. |
Weapon Bias: Split-Second Decisions and Unintended Stereotyping | Race stereotypes can lead people to claim to see a weapon where there is none. Split-second decisions magnify the bias by limiting people's ability to control responses. Such a bias could have important consequences for decision making by police officers and other authorities interacting with racial minorities. The bias requires no intentional racial animus, occurring even for those who are actively trying to avoid it. This research thus raises difficult questions about intent and responsibility for racially biased errors. KEYWORDS: implicit; attitude; stereotyping; prejudice; weapon. The trouble with split-second decisions is that they seem to make themselves. It is not simply that snap decisions are less accurate than "snail" decisions; it is easy to understand why people might make random errors when thinking fast. If you only have 30 seconds, it is probably a bad idea to do your taxes, pick a stock, or solve any problem beginning with "Two trains leave the station..." The real puzzle is when snap judgments show systematic biases that differ from our considered decisions. Should I consider those decisions my decisions if they differ from my intentions? Who is responsible? These questions are asked most loudly when decisions have immense consequences, as when a split-second decision has to be made by a surgeon, a soldier, or a police officer. Four New York City police officers had to make that kind of decision while patrolling the Bronx on a February night in 1999. When the officers ordered Amadou Diallo to stop because he matched a suspect's description, Diallo reacted unexpectedly. Rather than raising his hands, he reached for his pocket. The Ghanaian immigrant may have misunderstood the order, or maybe he meant to show his identification. The misunderstanding was mutual: one officer shouted, "Gun!" and the rest opened fire. Only after the shooting stopped was it clear that Diallo held only his wallet. Many in the public were outraged. Some accused the NYPD of racial bias. Congress introduced legislation. Protests followed the officers' acquittal, in which the defense successfully argued that at the moment of decision, the officers believed their lives were in danger and that they therefore did not have the conscious intent, the mens rea (literally, "guilty mind"), to commit a crime. The court did not consider the mechanisms that might produce such a belief. The death of Amadou Diallo dragged into the spotlight some of the disquieting questions that have run through implicit social cognition research for some time. Can stereotypes about race influence such split-second decisions? And can that kind of race bias take place without intent to discriminate? To answer these questions, it is necessary to move away from the particulars of the Diallo case and toward controlled studies in which causes and mechanisms can be identified. What are the psychological factors that would lead a person, in the crucial moment, to shout, "Gun"? |
Feature Extraction and Image Processing | Errata: Aguado is incorrect on the spine and on the rear cover.
P21: It's not Matlab 5.3.1 now; line 13 should read "The current version is Matlab 6, but...."
Page 35: e ..... gives the frequency components in p(t)
Page 36: (as plotted in Figure 2.3(b)) suggests that
P65: Change Bob Damper's book (Damper 1995) for Ifeachor's excellent book (Ifeachor 2002).
P65: Remove the Damper reference (as it is now out of print): Damper, R. I., Introduction to Discrete-Time Signals and Systems, Chapman and Hall, London UK, 1995.
P66: Introduce the Ifeachor reference: Ifeachor, E. C., and Jervis, B. W., Digital Signal Processing, 2nd Ed., Prentice Hall, Hemel Hempstead UK, 2002.
P83: Code 3.7: swap the order of the "for x" and "for y" statements. So Code 3.7 is (with approximations to the fancy symbols)
Clinical performance of serum prostate-specific antigen isoform [-2]proPSA (p2PSA) and its derivatives, %p2PSA and the prostate health index (PHI), in men with a family history of prostate cancer: results from a multicentre European study, the PROMEtheuS project. | OBJECTIVES
To test the sensitivity, specificity and accuracy of serum prostate-specific antigen isoform [-2]proPSA (p2PSA), %p2PSA and the prostate health index (PHI), in men with a family history of prostate cancer (PCa) undergoing prostate biopsy for suspected PCa. To evaluate the potential reduction in unnecessary biopsies and the characteristics of potentially missed cases of PCa that would result from using serum p2PSA, %p2PSA and PHI.
PATIENTS AND METHODS
The analysis consisted of a nested case-control study from the PRO-PSA Multicentric European Study, the PROMEtheuS project. All patients had a first-degree relative (father, brother, son) with PCa. Multivariable logistic regression models were complemented by predictive accuracy analysis and decision-curve analysis.
RESULTS
Of the 1026 patients included in the PROMEtheuS cohort, 158 (15.4%) had a first-degree relative with PCa. p2PSA, %p2PSA and PHI values were significantly higher (P < 0.001), and free/total PSA (%fPSA) values significantly lower (P < 0.001) in the 71 patients with PCa (44.9%) than in patients without PCa. Univariable accuracy analysis showed %p2PSA (area under the receiver-operating characteristic curve [AUC]: 0.733) and PHI (AUC: 0.733) to be the most accurate predictors of PCa at biopsy, significantly outperforming total PSA ([tPSA] AUC: 0.549), free PSA ([fPSA] AUC: 0.489) and %fPSA (AUC: 0.600) (P ≤ 0.001). For %p2PSA a threshold of 1.66 was found to have the best balance between sensitivity and specificity (70.4 and 70.1%; 95% confidence interval [CI]: 58.4-80.7 and 59.4-79.5 respectively). A PHI threshold of 40 was found to have the best balance between sensitivity and specificity (64.8 and 71.3%, respectively; 95% CI 52.5-75.8 and 60.6-80.5). At 90% sensitivity, the thresholds for %p2PSA and PHI were 1.20 and 25.5, with a specificity of 37.9 and 25.5%, respectively. At a %p2PSA threshold of 1.20, a total of 39 (24.8%) biopsies could have been avoided, but two cancers with a Gleason score (GS) of 7 would have been missed. At a PHI threshold of 25.5 a total of 27 (17.2%) biopsies could have been avoided and two (3.8%) cancers with a GS of 7 would have been missed. In multivariable logistic regression models, %p2PSA and PHI achieved independent predictor status and significantly increased the accuracy of multivariable models including PSA and prostate volume by 8.7 and 10%, respectively (P ≤ 0.001). p2PSA, %p2PSA and PHI were directly correlated with Gleason score (ρ: 0.247, P = 0.038; ρ: 0.366, P = 0.002; ρ: 0.464, P < 0.001, respectively).
CONCLUSIONS
%p2PSA and PHI are more accurate than tPSA, fPSA and %fPSA in predicting PCa in men with a family history of PCa. Consideration of %p2PSA and PHI results in the avoidance of several unnecessary biopsies. p2PSA, %p2PSA and PHI correlate with cancer aggressiveness. |
Discovering social circles in ego networks | People's personal social networks are big and cluttered, and currently there is no good way to automatically organize them. Social networking sites allow users to manually categorize their friends into social circles (e.g., “circles” on Google+, and “lists” on Facebook and Twitter). However, circles are laborious to construct and must be manually updated whenever a user's network grows. In this article, we study the novel task of automatically identifying users' social circles. We pose this task as a multimembership node clustering problem on a user's ego network, a network of connections between her friends. We develop a model for detecting circles that combines network structure as well as user profile information. For each circle, we learn its members and the circle-specific user profile similarity metric. Modeling node membership to multiple circles allows us to detect overlapping as well as hierarchically nested circles. Experiments show that our model accurately identifies circles on a diverse set of data from Facebook, Google+, and Twitter, for all of which we obtain hand-labeled ground truth. |
Development of Textile Antennas for Body Wearable Applications and Investigations on Their Performance Under Bent Conditions | The utilization of wearable textile materials for the development of microstrip antennas has grown rapidly with the recent miniaturization of wireless devices. A wearable antenna is meant to be a part of the clothing used for communication purposes, including tracking and navigation, mobile computing and public safety. This paper describes the design and development of four rectangular patch antennas employing different varieties of cotton and polyester clothing for on-body wireless communications in the 2.45 GHz WLAN band. The impedance and radiation characteristics are determined experimentally when the antennas are kept in a flat position. The performance deterioration of the wearable antennas is also analyzed under bent conditions to check compatibility with wearable applications. Results demonstrate the suitability of these patch antennas for on-body wireless communications. |
Ranking Sentences for Extractive Summarization with Reinforcement Learning | Single document summarization is the task of producing a shorter version of a document while preserving its principal information content. In this paper we conceptualize extractive summarization as a sentence ranking task and propose a novel training algorithm which globally optimizes the ROUGE evaluation metric through a reinforcement learning objective. We use our algorithm to train a neural summarization model on the CNN and DailyMail datasets and demonstrate experimentally that it outperforms state-of-the-art extractive and abstractive systems when evaluated automatically and by humans. |
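A minimal sketch of a REINFORCE-style update of the kind the abstract alludes to, assuming PyTorch, per-sentence selection logits produced by some encoder, and a `rouge_reward` stub; it is not the paper's exact training procedure.

```python
import torch

def reinforce_step(sentence_logits, reference_summary, sentences, rouge_reward, baseline=0.0):
    """One policy-gradient step: sample an extract, score it with ROUGE, scale log-probs."""
    probs = torch.sigmoid(sentence_logits)              # independent selection probabilities
    dist = torch.distributions.Bernoulli(probs)
    labels = dist.sample()                              # 1 = sentence included in the summary
    extract = [s for s, keep in zip(sentences, labels.tolist()) if keep > 0.5]
    reward = rouge_reward(extract, reference_summary)   # stub: returns a float ROUGE score
    loss = -(reward - baseline) * dist.log_prob(labels).sum()
    return loss

# Example with dummy inputs; in practice the logits come from the sentence encoder.
logits = torch.randn(5, requires_grad=True)
dummy_rouge = lambda extract, ref: float(len(extract)) / 5.0   # stand-in for a real ROUGE scorer
loss = reinforce_step(logits, "reference text", ["s%d" % i for i in range(5)], dummy_rouge)
loss.backward()   # propagates the reward-scaled gradient into the encoder parameters
```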
Consciousness: a unique way of processing information | In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the “single voice” of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention. |
Cosmological and astrophysical aspects of finite density QCD | The different phases of QCD at finite temperature and density lead to interesting effects in cosmology and astrophysics. In this work I review some aspects of the cosmological QCD transition and of astrophysics at high baryon density. |
What's in a translation rule? | We propose a theory that gives formal semantics to word-level alignments defined over parallel corpora. We use our theory to introduce a linear algorithm that can be used to derive from word-aligned, parallel corpora the minimal set of syntactically motivated transformation rules that explain human translation data. |
Electrical power distribution system (HV270DC), for application in more electric aircraft | In new designs of military and unmanned aircraft, there is a clear trend towards an increasing demand for electrical power. This is mainly due to the replacement of mechanical, pneumatic and hydraulic equipment by partially or completely electrical systems. In general, the use of electrical power onboard is continuously increasing within the areas of communications, surveillance and general systems, such as radar, cooling, landing gear and actuator systems. To cope with this growing demand for electric power, new voltage levels (270 VDC), architectures and power electronics devices are being applied to onboard electrical power distribution systems. The purpose of this paper is to present and describe the technological project HV270DC. In this project, an Electrical Power Distribution System (EPDS) applicable to more electric aircraft has been developed. This system has been integrated by EADS in order to study the benefits and the possible problems or risks that affect this kind of power distribution system, in comparison with conventional distribution systems. |
Attentive Recurrent Comparators | Rapid learning requires flexible representations to quickly adapt to new evidence. We develop a novel class of models called Attentive Recurrent Comparators (ARCs) that form representations of objects by cycling through them and making observations. Using the representations extracted by ARCs, we develop a way of approximating a dynamic representation space and use it for one-shot learning. In the task of one-shot classification on the Omniglot dataset, we achieve state-of-the-art performance with an error rate of 1.5%. This represents the first super-human result achieved for this task with a generic model that uses only pixel information. |
NON-SPEECH ENVIRONMENTAL SOUND CLASSIFICATION USING SVMS WITH A NEW SET OF FEATURES | Mel Frequency Cepstrum Coefficients (MFCCs) are a method of stationary/pseudo-stationary feature extraction. They work very well for the classification of speech and music signals. MFCCs have also been used to classify non-speech sounds for audio surveillance systems, even though they do not fully capture the time-varying characteristics of non-stationary non-speech signals. We introduce a new 2D feature set, computed with a feature extraction method based on the pitch range (PR) of non-speech sounds and the autocorrelation function. We compare the classification accuracy of the proposed features with that of MFCCs using Support Vector Machine (SVM) and Radial Basis Function Neural Network classifiers. Non-speech environmental sounds (gunshot, glass breaking, scream, dog barking, rain, engine, and restaurant noise) were studied. The new feature set provides high accuracy rates when used for classification. Combining it with MFCCs further improves the accuracy of the given classifiers by 4% to 35%, depending on the classifier used, suggesting that the two feature sets are complementary. The SVM classifier with a Gaussian kernel provided the highest accuracy rates among the classifiers used in this study. |
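A rough Python sketch of combining MFCCs with an autocorrelation-based pitch feature and an RBF-kernel SVM, assuming librosa and scikit-learn; the simple autocorrelation-peak statistic below stands in for the paper's PR/ACF features, and the synthetic clips are placeholders.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def features(y, sr):
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)   # averaged MFCCs
    # Crude pitch-range proxy: lag and strength of the dominant autocorrelation peak.
    ac = np.correlate(y, y, mode='full')[len(y) - 1:]
    ac = ac / (ac[0] + 1e-12)
    lo, hi = 20, sr // 50                       # plausible pitch lags
    lag = lo + int(np.argmax(ac[lo:hi]))
    return np.concatenate([mfcc, [lag / sr, ac[lag]]])

# Toy usage: two synthetic "classes", then an RBF-kernel SVM.
sr = 22050
clips = [np.sin(2 * np.pi * f * np.arange(8192) / sr) for f in (220, 440, 230, 450)]
labels = [0, 1, 0, 1]
X = np.array([features(c, sr) for c in clips])
clf = SVC(kernel='rbf').fit(X, labels)
```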
Experimental implementation of an Invariant Extended Kalman Filter-based scan matching SLAM | We describe an application of the Invariant Extended Kalman Filter (IEKF) design methodology to the scan matching SLAM problem. We review the theoretical foundations of the IEKF and its practical benefit of guaranteeing robustness to poor state estimates, then implement the filter on a wheeled robot hardware platform. The proposed design is successfully validated in experimental testing. |
Turing Patterns in Memristive Cellular Nonlinear Networks | The formation of ordered structures, in particular Turing patterns, in complex spatially extended systems has been observed in many different contexts, spanning from natural sciences (chemistry, physics, and biology) to technology (mechanics and electronics). In this paper, it is shown that the use of memristors in a simple cell of a spatially-extended circuit architecture allows us to design systems able to generate Turing patterns. In addition, the memristor parameters play a key role in the selection of the type and characteristics of the emerging pattern, which is also influenced by the initial conditions. The problem of finding the regions of parameters where Turing patterns may emerge in the proposed cellular architecture is solved in an analytic way, and numerical results are shown to illustrate the system behavior with respect to its parameters. |
Evaluation of bone thickness in the anterior hard palate relative to midsagittal orthodontic implants. | PURPOSE
The Straumann Orthosystem (Institut Straumann, Waldenburg, Switzerland) describes a technique that involves placement of titanium implants (4 or 6 mm long and 3.3 mm in diameter) into the midsagittal hard palate for orthodontic anchorage. The aim of this study was to determine the quantity of bone in the midline of the anterior hard palate, and specifically the thickness inferior to the incisive canal.
MATERIALS AND METHODS
Twenty-five dry skulls were radiographed with a standardized cephalometric technique. The vertical thickness of the midsagittal palate was then measured to the nearest tenth of a millimeter. Next, gutta-percha was injected into the incisive canal, and the radiograph was repeated. The bone thickness was then measured from the inferior hard palate to the most inferior part of the radiopaque canal. This is defined as the actual bone available for the implant without violating the canal.
RESULTS
The measurements have shown that an average of 8.6 ± 1.3 mm of bone is theoretically available for the implant. However, considering the canal (where only bone thickness inferior to it is utilized and measured), only 4.3 ± 1.6 mm of bone exists. The canal itself averaged 2.5 ± 0.6 mm in diameter.
DISCUSSION
Prior studies have overestimated the amount of bone available for implants in the median hard palate. The main reason for this is that the incisive canal is not well visualized on cephalometric radiographs of live patients.
CONCLUSION
This study supports the continued use of implants, as approximately 50% of skulls still had the requisite minimum 4 mm of bone inferior to the incisive canal for maximum osseointegration with the 4-mm implants. However, 6-mm implants should be used with caution. |
VHDL procedure for combinational divider | In this paper, a synthesizable combinational integer divider VHDL model, suitable for implementation in FPGA devices, is described. The algorithm the divider is based on is briefly introduced. Alongside the model, a testbench for its functional verification is presented. Results of the implementation in Xilinx Spartan-3 and Spartan-6 devices (the amount of FPGA resources used and the maximum delay) are given in tables. |
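The abstract only says that the underlying algorithm is briefly introduced; as an illustration, here is a bit-level restoring-division reference model in Python of the kind such a combinational array divider commonly implements (an assumption, since the paper's exact algorithm is not reproduced here).

```python
def restoring_divide(dividend: int, divisor: int, width: int = 8):
    """Bit-serial restoring division, mirroring the row-per-bit structure of a
    combinational array divider: each iteration corresponds to one subtractor row."""
    if divisor == 0:
        raise ZeroDivisionError("division by zero")
    remainder, quotient = 0, 0
    for i in range(width - 1, -1, -1):
        remainder = (remainder << 1) | ((dividend >> i) & 1)  # shift in next dividend bit
        if remainder >= divisor:                              # trial subtraction succeeds
            remainder -= divisor
            quotient |= (1 << i)
    return quotient, remainder

assert restoring_divide(200, 9) == (22, 2)   # 200 = 9 * 22 + 2
```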
End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures | We present a novel end-to-end neural model to extract entities and relations between them. Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs. This allows our model to jointly represent both entities and relations with shared parameters in a single model. We further encourage detection of entities during training and use of entity information in relation extraction via entity pretraining and scheduled sampling. Our model improves over the state-of-the-art feature-based model on end-to-end relation extraction, achieving 3.5% and 4.8% relative error reductions in F1-score on ACE2004 and ACE2005, respectively. We also show a 2.5% relative error reduction in F1-score over the state-of-the-art convolutional neural network based model on nominal relation classification (SemEval-2010 Task 8). |
Efficient 3D scene abstraction using line segments | Extracting 3D information from a moving camera is traditionally based on interest point detection and matching. This is especially challenging in urban indoor and outdoor environments, where the number of distinctive interest points is naturally limited. While common Structure-from-Motion (SfM) approaches usually manage to obtain the correct camera poses, the number of accurate 3D points is very small due to the low number of matchable features. Subsequent Multi-view Stereo approaches may help to overcome this problem, but suffer from a high computational complexity. We propose a novel approach for the task of 3D scene abstraction, which uses straight line segments as underlying features. We use purely geometric constraints to match 2D line segments from different images, and formulate the reconstruction procedure as a graph-clustering problem. We show that our method generates accurate 3D models with low computational costs, which makes it especially useful for large-scale urban datasets. |
Reliability and validity of a modified PHQ-9 item inventory (PHQ-12) as a screening instrument for assessing depression in Asian Indians (CURES-65). | OBJECTIVES
To evaluate the validity and reliability of the modified Patient Health Questionnaire (PHQ) 12-item instrument as a screening tool for assessing depression, compared with the PHQ-9, in a representative south Indian urban population.
METHODS
The Chennai Urban Rural Epidemiology Study [CURES] is a large cross-sectional study conducted in Chennai, South India. In Phase 1 of CURES (urban component), 26,001 individuals aged ≥20 years were selected by a systematic sampling technique, of whom one hundred subjects were randomly selected for this validation study using computer-generated numbers. Two self-reported questionnaires (the modified PHQ-12 item and the PHQ-9 item) were administered to the subjects to compare their effectiveness in detecting depression. Reliability and validity were assessed and Receiver Operating Characteristic (ROC) curves were plotted. Pearson's correlation was used to compare the two questionnaires.
RESULTS
The mean age of the study population was 38.6 ± 11.6 years and 48% were male. The Pearson's correlation coefficient between the modified PHQ-12 and the PHQ-9 was 0.913 [p < 0.0001]. Factor analysis revealed that the modified PHQ-12 item scale can be used as a unidimensional scale and had excellent internal consistency (Cronbach's alpha: 0.88). A cut point of >4, calculated using the ROC curves for the modified PHQ-12, had the highest sensitivity (92.0%) and specificity (90.7%) using the PHQ-9 as the gold standard. The positive predictive value was 76.7%, the negative predictive value 97.1%, and the area under the ROC curve 0.979 (95% Confidence Interval: 0.929 - 0.997, p < 0.0001).
CONCLUSION
The modified PHQ-12 item is a valid and reliable instrument for large scale population based screening of depression in Asian Indians and a cut point score of greater than 4 gave the highest sensitivity and specificity. |
Evaluation of the FICA Tool for Spiritual Assessment. | CONTEXT
The National Consensus Project for Quality Palliative Care includes spiritual care as one of the eight clinical practice domains. There are very few standardized spirituality history tools.
OBJECTIVES
The purpose of this pilot study was to test the feasibility of the Faith, Importance and Influence, Community, and Address (FICA) Spiritual History Tool in clinical settings. Correlations between the FICA qualitative data and quality of life (QOL) quantitative data were also examined to provide additional insight into spiritual concerns.
METHODS
The framework of the FICA tool includes Faith or belief, Importance of spirituality, individual's spiritual Community, and interventions to Address spiritual needs. Patients with solid tumors were recruited from ambulatory clinics of a comprehensive cancer center. Items assessing aspects of spirituality within the Functional Assessment of Cancer Therapy QOL tools were used, and all patients were assessed using the FICA. The sample (n=76) had a mean age of 57, and almost half were of diverse religions.
RESULTS
Most patients rated faith or belief as very important in their lives (mean 8.4; 0-10 scale). FICA quantitative ratings and qualitative comments were closely correlated with items from the QOL tools assessing aspects of spirituality.
CONCLUSION
Findings suggest that the FICA tool is a feasible tool for clinical assessment of spirituality. Addressing spiritual needs and concerns in clinical settings is critical in enhancing QOL. Additional use and evaluation by clinicians of the FICA Spiritual Assessment Tool in usual practice settings are needed. |
Computing Marginals Using MapReduce: Keynote talk paper | We consider the problem of computing the data-cube marginals of a fixed order k (i.e., all marginals that aggregate over k dimensions), using a single round of MapReduce. The focus is on the relationship between the reducer size (the number of inputs allowed at a single reducer) and the replication rate (the number of reducers to which an input is sent). We show that the replication rate is minimized when the reducers receive all the inputs necessary to compute one marginal of higher order. That observation lets us view the problem as one of covering sets of k dimensions with sets of a larger size m, a problem that has been studied under the name "covering numbers." We offer a number of constructions that, for different values of k and m, meet or come close to yielding the minimum possible replication rate for a given reducer size. |
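The covering view can be made concrete with a small greedy sketch: choose m-subsets of the n dimensions until every k-subset of dimensions lies inside some chosen set; the replication rate is then roughly the number of covering sets, since each input is sent to one reducer per chosen m-set. Greedy set cover is used here purely for illustration and is not one of the paper's constructions.

```python
from itertools import combinations

def greedy_cover(n, k, m):
    """Pick m-subsets of the n dimensions until every k-subset lies inside one of them."""
    uncovered = set(combinations(range(n), k))
    cover = []
    while uncovered:
        best = max(combinations(range(n), m),
                   key=lambda cand: sum(1 for t in uncovered if set(t) <= set(cand)))
        cover.append(best)
        uncovered = {t for t in uncovered if not set(t) <= set(best)}
    return cover

# e.g. n = 6 dimensions, all 2nd-order marginals (k = 2), reducers large enough
# to hold one 4th-order marginal (m = 4):
print(len(greedy_cover(6, 2, 4)))   # size of the cover, roughly the replication rate
```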
The evolution of society. | Although the social mechanisms responsible for the development and maintenance of societies in animals and man have fascinated and intrigued philosophers and scientists since classical times, the first systematic consideration of their evolution appears in the Origin of species (Darwin 1859/1958). Much of Darwin's thinking about the evolution of societies in animals and humans has a distinctly modern feel about it and he commonly anticipates theoretical developments that only occurred 100 years later. Although he did not confront the problem of altruistic behaviour directly, he was aware of the challenge to his theory posed by the evolution of sterile castes in some social insects (Darwin 1859/1958). In Chapter VIII of the 'Origin of species', he describes how he thought, at first, that this was fatal to his whole theory of natural selection. Then, in a paragraph that presages Hamilton's subsequent extension of evolutionary theory, he describes how he realised that 'the problem is lessened, or, as I believe, disappears, when it is remembered that selection may be applied to the family, as well as to the individual, and may thus gain the desired end.' (Darwin 1859, p. 230). In The descent of man (1871), Darwin turned to the evolution of human societies. In Chapter VI, he stresses the contrast between humans and other animals: 'I fully subscribe to the judgement of those writers who maintain that of all the differences between man and the lesser animals, the moral sense or conscience is by far the most important' (The descent of man, p. 97). He then goes on to argue that the evolution of mutual assistance and the moral senses in humans and other animals are maintained by benefits shared by members of cooperative groups, a suggestion that clearly parallels modern theories of social evolution (Boyd & Richerson 1996; Clutton-Brock 2002). He goes on to point out that many animals live in groups and cooperate with each other and describes how 'wolves and some other beasts of prey hunt in packs, and aid one another in attacking their victims', how 'pelicans fish in concert' and 'social animals mutually defend each other'. He describes how vervet monkeys stretch out and groom each other's coats and ends by telling a story illustrating the benefits of cooperation: |
Nonlinear observers for predicting state-of-charge and state-of-health of lead-acid batteries for hybrid-electric vehicles | This paper describes the application of state-estimation techniques for the real-time prediction of the state-of-charge (SoC) and state-of-health (SoH) of lead-acid cells. Specifically, approaches based on the well-known Kalman Filter (KF) and Extended Kalman Filter (EKF), are presented, using a generic cell model, to provide correction for offset, drift, and long-term state divergence-an unfortunate feature of more traditional coulomb-counting techniques. The underlying dynamic behavior of each cell is modeled using two capacitors (bulk and surface) and three resistors (terminal, surface, and end), from which the SoC is determined from the voltage present on the bulk capacitor. Although the structure of the model has been previously reported for describing the characteristics of lithium-ion cells, here it is shown to also provide an alternative to commonly employed models of lead-acid cells when used in conjunction with a KF to estimate SoC and an EKF to predict state-of-health (SoH). Measurements using real-time road data are used to compare the performance of conventional integration-based methods for estimating SoC with those predicted from the presented state estimation schemes. Results show that the proposed methodologies are superior to more traditional techniques, with accuracy in determining the SoC within 2% being demonstrated. Moreover, by accounting for the nonlinearities present within the dynamic cell model, the application of an EKF is shown to provide verifiable indications of SoH of the cell pack. |
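A skeletal discrete-time Kalman filter of the kind described, written in Python with NumPy; it assumes the two-capacitor/three-resistor network has already been reduced to state-space matrices (the numbers below are placeholders, not the paper's cell parameters), and reads SoC from the bulk-capacitor voltage state.

```python
import numpy as np

# Placeholder discretized state-space model of the RC cell network:
# state x = [V_bulk, V_surface], input u = cell current, output y = terminal voltage.
A = np.array([[0.9999, 0.0],
              [0.0,    0.98]])
B = np.array([[1e-5], [1e-3]])
C = np.array([[1.0, 1.0]])
D = np.array([[0.01]])          # terminal resistance term
Q = np.eye(2) * 1e-7            # process noise covariance
R = np.array([[1e-3]])          # measurement noise covariance

x = np.array([[12.5], [0.0]])   # initial state estimate
P = np.eye(2) * 1e-2            # initial estimate covariance

def kf_step(x, P, u, y):
    # Predict
    x = A @ x + B * u
    P = A @ P @ A.T + Q
    # Update with the measured terminal voltage
    y_hat = C @ x + D * u
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)
    x = x + K @ (np.atleast_2d(y) - y_hat)
    P = (np.eye(2) - K @ C) @ P
    return x, P

def soc_from_bulk_voltage(v_bulk, v_empty=11.5, v_full=13.0):
    # Linear map from bulk-capacitor voltage to SoC; a placeholder calibration.
    return np.clip((v_bulk - v_empty) / (v_full - v_empty), 0.0, 1.0)

x, P = kf_step(x, P, u=-5.0, y=12.4)        # 5 A discharge, measured 12.4 V
print(soc_from_bulk_voltage(x[0, 0]))
```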
Challenges and Opportunities for Implementing Integrated Mental Health Care: A District Level Situation Analysis from Five Low- and Middle-Income Countries | BACKGROUND
Little is known about how to tailor implementation of mental health services in low- and middle-income countries (LMICs) to the diverse settings encountered within and between countries. In this paper we compare the baseline context, challenges and opportunities in districts in five LMICs (Ethiopia, India, Nepal, South Africa and Uganda) participating in the PRogramme for Improving Mental health carE (PRIME). The purpose was to inform development and implementation of a comprehensive district plan to integrate mental health into primary care.
METHODS
A situation analysis tool was developed for the study, drawing on existing tools and expert consensus. Cross-sectional information obtained was largely in the public domain in all five districts.
RESULTS
The PRIME study districts face substantial contextual and health system challenges, many of which are common across sites. Reliable information on existing treatment coverage for mental disorders was unavailable. Particularly in the low-income countries, many health service organisational requirements for mental health care were absent, including specialist mental health professionals to support the service and reliable supplies of medication. Across all sites, community mental health literacy was low and there were no models of multi-sectoral working or collaborations with traditional or religious healers. Nonetheless, health system opportunities were apparent. In each district there was potential to apply existing models of care for tuberculosis and HIV or non-communicable disorders, which have established mechanisms for detection of drop-out from care, outreach and adherence support. The extensive networks of community-based health workers and volunteers in most districts provide further opportunities to expand mental health care.
CONCLUSIONS
The low level of baseline health system preparedness across sites underlines that interventions at the levels of health care organisation, health facility and community will all be essential for sustainable delivery of quality mental health care integrated into primary care. |
Image Pivoting for Learning Multilingual Multimodal Representations | In this paper we propose a model to learn multimodal multilingual representations for matching images and sentences in different languages, with the aim of advancing multilingual versions of image search and image understanding. Our model learns a common representation for images and their descriptions in two different languages (which need not be parallel) by considering the image as a pivot between the two languages. We introduce a new pairwise ranking loss function which can handle both symmetric and asymmetric similarity between the two modalities. We evaluate our models on image-description ranking for German and English, and on semantic textual similarity of image descriptions in English. In both cases we achieve state-of-the-art performance. |
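For reference, a standard symmetric image-sentence ranking objective of the kind such models build on looks as follows; the paper's loss generalizes this to asymmetric similarities, so the formula below is a textbook baseline rather than the proposed function:

$$\mathcal{L}(i, c) = \sum_{c'} \max\big(0,\ \alpha - s(i, c) + s(i, c')\big) + \sum_{i'} \max\big(0,\ \alpha - s(i, c) + s(i', c)\big),$$

where $(i, c)$ is a matching image-caption pair, $c'$ and $i'$ are contrastive (non-matching) captions and images, $s(\cdot,\cdot)$ is a similarity score in the shared space, and $\alpha$ is a margin.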
Traditional chinese medicine in treatment of metabolic syndrome. | In the management of metabolic syndrome, traditional Chinese medicine (TCM) is an excellent representative of alternative and complementary medicine, with a complete theory system and substantial herb remedies. In this article, the basic principles of TCM are introduced and 25 traditional Chinese herbs are reviewed for their potential activities in the treatment of metabolic syndrome. Three herbs, ginseng, rhizoma coptidis (berberine, the major active compound) and bitter melon, are discussed in detail with respect to their therapeutic potential. Ginseng extracts made from the root, rootlet, berry and leaf of Panax quinquefolium (American ginseng) and Panax ginseng (Asian ginseng) have demonstrated anti-hyperglycemic, insulin-sensitizing, islet-protective, anti-obesity and anti-oxidative activities in many model systems. Energy expenditure is enhanced by ginseng through thermogenesis. Ginseng-specific saponins (ginsenosides) are considered the major bioactive compounds responsible for the metabolic activities of ginseng. Berberine from rhizoma coptidis is an oral hypoglycemic agent. It also has anti-obesity and anti-dyslipidemia activities. Its mechanism of action involves inhibition of mitochondrial function, stimulation of glycolysis, activation of the AMPK pathway, suppression of adipogenesis and induction of low-density lipoprotein (LDL) receptor expression. Bitter melon or bitter gourd (Momordica charantia) is able to reduce blood glucose and lipids in both normal and diabetic animals. It may also protect beta cells, enhance insulin sensitivity and reduce oxidative stress. Although evidence from animals and humans supports the therapeutic activities of ginseng, berberine and bitter melon, multi-center large-scale clinical trials have not been conducted to evaluate the efficacy and safety of these herbal medicines. |
Use of mental health counseling as adolescents become young adults. | PURPOSE
Despite parallels in mental health needs among adolescents and young adults, there is a paucity of evidence regarding use of mental health services in young adulthood. Using a longitudinal sample, this study compares rates of mental health counseling use between adolescents and young adults, examines characteristics and predictors of counseling use for young adults, and identifies reasons for foregone care among those with mental health needs in young adulthood.
METHODS
Secondary data analysis was conducted on a nationally representative sample of 10,817 participants from the National Longitudinal Study of Adolescent Health. Data were derived from an initial survey collected in 1995 (mean age, 15.8 years) and a follow-up survey collected 7 years later (mean age, 21.5 years).
RESULTS
Among individuals with depressive symptomology, young adults reported significantly lower rates of counseling use compared with adolescents. When taking into account the severity of mental health problems, female gender, high maternal education, school attendance, and receipt of routine physical examinations were significantly predictive of counseling use among young adults. Young adults of black ethnicity were significantly less likely to receive counseling compared with those of white ethnicity. Overall, 4% of young adults reported foregoing health care in the past year, despite self-reported mental health needs. Inability to pay, belief that the problem would go away, and lack of time were commonly cited reasons for any type of foregone health care. However, concerns regarding physician's care (i.e., fear of what the doctor would say or do, and belief that the doctor would be unable to help) were more frequently mentioned by those who acknowledged a need for counseling services.
CONCLUSIONS
Low rates of mental health counseling persist from adolescence to young adulthood. Findings such as increased counseling service use among those receiving routine physical examinations, as well as reported concerns of physician care, point to possible areas of intervention within the pediatric community. |
AHA Council on Clinical Cardiology: bringing the best science to the bedside for more than 50 years. | It is indeed a special privilege to be officers of the Council on Clinical Cardiology, as we, along with our members, celebrate the 50th Anniversary of our Council. As we review the state of the Council and look toward its future, it is very clear that we have drawn from our past. The Council on Clinical Cardiology was initially established by the American Heart Association as the Section of Clinical Cardiology on June 5, 1952. The minutes of its first meeting held on April 11, 1953, are clear evidence that a strong foundation was set by the collective wisdom and vision of its founding members, then led by the Council's first chairman, A. Carlton Ernstene. The AHA Section on Clinical Cardiology was established with a clear purpose: "to facilitate and encourage investigations, prevention, treatment, and education in the field of clinical cardiology."1 In the minutes of this meeting, Samuel A. Levine heralded the future of this Council and its steadfast dedication to the development of practice guidelines and scientific statements, when he declared that "… the American Heart Association's research program is of a very high caliber, but very serious thought should be given to methods whereby the results of research are made available to the medical profession at large." That same year, the group developed a meeting structure of presenting brief versions of original investigations (what we now know as abstracts) along with … |
Satellite Attitude Control and Power Tracking with Energy / Momentum Wheels | A control law for an integrated power/attitude control system (IPACS) for a satellite is presented. Four or more energy/momentum wheels in an arbitrary noncoplanar configuration and a set of three thrusters are used to implement the torque inputs. The energy/momentum wheels are used as attitude-control actuators, as well as an energy storage mechanism, providing power to the spacecraft. In that respect, they can replace the currently used heavy chemical batteries. The thrusters are used to implement the torques for large and fast (slew) maneuvers during the attitude-initialization and target-acquisition phases and to implement the momentum management strategies. The energy/momentum wheels are used to provide the reference-tracking torques and the torques for spinning up or down the wheels for storing or releasing kinetic energy. The controller published in a previous work by the authors is adopted here for the attitude-tracking function of the wheels. Power tracking for charging and discharging the wheels is added to complete the IPACS framework. The torques applied by the energy/momentum wheels are decomposed into two spaces that are orthogonal to each other, with the attitude-control torques and power-tracking torques in each space. This control law can be easily incorporated in an IPACS system onboard a satellite. The possibility of the occurrence of singularities, in which no arbitrary energy profile can be tracked, is studied for a generic wheel cluster configuration. A standard momentum management scheme is considered to null the total angular momentum of the wheels so as to minimize the gyroscopic effects and prevent the singularity from occurring. A numerical example for a satellite in a low Earth near-polar orbit is provided to test the proposed IPACS algorithm. The satellite's boresight axis is required to track a ground station, and the satellite is required to rotate about its boresight axis so that the solar panel axis is perpendicular to the satellite–sun vector. |
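One common way to write the simultaneous attitude/power allocation the abstract describes, in generic notation that is not taken verbatim from the paper: with $A \in \mathbb{R}^{3\times n}$ mapping the $n \ge 4$ wheel torques $\tau_w$ to body torque, $\Omega$ the wheel-speed vector, and stored kinetic-energy rate $\dot E = \Omega^{\top}\tau_w$, the wheels must satisfy

$$\begin{bmatrix} A \\ \Omega^{\top} \end{bmatrix}\tau_w \;=\; \begin{bmatrix} L_{\mathrm{att}} \\ P_{\mathrm{req}} \end{bmatrix},$$

and a minimum-norm choice uses the pseudoinverse of the stacked matrix. The singularities mentioned above correspond to this matrix losing rank, which the momentum management scheme helps avoid.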
O-MaSE: A Customizable Approach to Developing Multiagent Development Processes | This paper describes the Organization-based Multiagent System Engineering (O-MaSE) Process Framework, which helps process engineers define custom multiagent systems development processes. O-MaSE builds on the MaSE methodology and is adapted from the OPEN Process Framework (OPF), which implements a Method Engineering approach to process construction. The goal of O-MaSE is to allow designers to create customized agent-oriented software development processes. O-MaSE consists of three basic structures: (1) a metamodel, (2) a set of method fragments, and (3) a set of guidelines. The O-MaSE metamodel defines the key concepts needed to design and implement multiagent systems. The method fragments are operations or tasks that are executed to produce a set of work products, which may include models, documents, or code. The guidelines define how the method fragments are related to one another. The paper also presents two O-MaSE process examples. |
Iterative Adaptive Approaches to MIMO Radar Imaging | Multiple-input multiple-output (MIMO) radar can achieve superior performance through waveform diversity over conventional phased-array radar systems. When a MIMO radar transmits orthogonal waveforms, the reflected signals from scatterers are linearly independent of each other. Therefore, adaptive receive filters, such as Capon and amplitude and phase estimation (APES) filters, can be directly employed in MIMO radar applications. High levels of noise and strong clutter, however, significantly worsen detection performance of the data-dependent beamformers due to a shortage of snapshots. The iterative adaptive approach (IAA), a nonparametric and user parameter-free weighted least-squares algorithm, was recently shown to offer improved resolution and interference rejection performance in several passive and active sensing applications. In this paper, we show how IAA can be extended to MIMO radar imaging, in both the negligible and nonnegligible intrapulse Doppler cases, and we also establish some theoretical convergence properties of IAA. In addition, we propose a regularized IAA algorithm, referred to as IAA-R, which can perform better than IAA by accounting for unrepresented additive noise terms in the signal model. Numerical examples are presented to demonstrate the superior performance of MIMO radar over single-input multiple-output (SIMO) radar, and further highlight the improved performance achieved with the proposed IAA-R method for target imaging. |
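As a reference point for readers unfamiliar with IAA, here is a minimal single-snapshot sketch of the basic iteration (periodogram initialization, covariance built from the current power estimates, weighted-least-squares amplitude update). It omits the MIMO imaging specifics, the Doppler handling, and the IAA-R noise term discussed in the paper, and the toy array geometry is made up.

```python
import numpy as np

def iaa(A, y, n_iter=15):
    """Basic single-snapshot IAA spectral estimate (sketch).

    A : N x K matrix of steering vectors for a grid of K scan points
    y : length-N data snapshot
    Returns the length-K vector of estimated powers on the grid.
    """
    # matched-filter (periodogram) initialization
    p = np.abs(A.conj().T @ y) ** 2 / (np.sum(np.abs(A) ** 2, axis=0) ** 2)
    for _ in range(n_iter):
        R = (A * p) @ A.conj().T                   # R = A diag(p) A^H
        Rinv_y = np.linalg.solve(R, y)
        Rinv_A = np.linalg.solve(R, A)
        num = (A.conj() * Rinv_y[:, None]).sum(axis=0)   # a_k^H R^-1 y
        den = (A.conj() * Rinv_A).sum(axis=0)            # a_k^H R^-1 a_k
        p = np.abs(num / den) ** 2
    return p

# toy uniform linear array: two sources at -10 and 25 degrees
N, K = 10, 181
angles = np.deg2rad(np.linspace(-90, 90, K))
A = np.exp(1j * np.pi * np.arange(N)[:, None] * np.sin(angles)[None, :])
y = A[:, 80] + 0.5 * A[:, 115] + 0.05 * (np.random.randn(N) + 1j * np.random.randn(N))
p = iaa(A, y)
print(np.rad2deg(angles[np.argsort(p)[-2:]]))   # strongest grid points, expected near the true angles
```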
Emergency ad-hoc networks by using drone mounted base stations for a disaster scenario | In case of a large scale disaster, the wireless access network can become quickly saturated. This is undesirable because it is precisely in this kind of situation that reliable wireless connectivity is needed. In this study, the potential of mounting LTE femtocell base stations on drones to offer an alternative for the saturated existing wireless infrastructure is investigated. Our preliminary results show that this is a very promising approach, although a high number of drones is needed to cover all users in the city center of Ghent, Belgium during a 1 h intervention. The number of drones can be significantly reduced (up to 64%) by using a more advanced type of drone, by decreasing the user coverage requirement (11% fewer drones when requiring 80% instead of 90%) or by increasing the fly height of the drones (about 10% fewer drones needed when increasing the fly height by 10 m). This study shows that it is interesting to further investigate the use of drones to provide an emergency wireless access network.
Evaluation of liver fibrosis by transient elastography in methotrexate treated patients. | BACKGROUND AND AIMS
Methotrexate (MTX) safety is questioned by the risk of inducing liver fibrosis (LF). As transient elastography (FibroScan®) is an effective non-invasive technique to evaluate LF, our aims were to assess LF in MTX-treated patients, to evaluate LF regarding treatment duration and cumulative dose, and to determine differences depending on the underlying disease.
PATIENTS AND METHODS
Prospective study including patients with rheumatoid arthritis, inflammatory bowel disease, and psoriasis treated with MTX. Hepatic stiffness was determined by FibroScan®. The LF cut-off values were established using METAVIR score.
RESULTS
Of 53 patients, 22 were men (41.5%), mean age was 55 (15) years, 17 (32%) had rheumatoid arthritis, 18 (34%) inflammatory bowel disease, and 18 (34%) psoriasis. Mean MTX cumulative dose was 1,805 (1,560) mg, and mean treatment duration was 178 weeks. Mean hepatic stiffness was 6.19 (2.43) kPa. In 49 patients (92.5%), absence of or mild LF was found (F ≤ 2), and 4 patients (7.5%) had advanced LF (F ≥ 3). Neither treatment duration nor cumulative MTX dose was associated with LF.
CONCLUSIONS
Regarding LF development, MTX therapy is safe. FibroScan® is useful for monitoring LF in MTX-treated patients. |
A FRAME THEORY PRIMER FOR THE KADISON-SINGER PROBLEM | This is a primer on frame theory geared towards the parts of the theory needed for people who want to understand the relationship between the Kadison-Singer Problem and frame theory. |
Learning to rank for information retrieval | This tutorial is concerned with a comprehensive introduction to the research area of learning to rank for information retrieval. In the first part of the tutorial, we will introduce three major approaches to learning to rank, i.e., the pointwise, pairwise, and listwise approaches, analyze the relationship between the loss functions used in these approaches and the widely-used IR evaluation measures, evaluate the performance of these approaches on the LETOR benchmark datasets, and demonstrate how to use these approaches to solve real ranking applications. In the second part of the tutorial, we will discuss some advanced topics regarding learning to rank, such as relational ranking, diverse ranking, semi-supervised ranking, transfer ranking, query-dependent ranking, and training data preprocessing. In the third part, we will briefly mention the recent advances on statistical learning theory for ranking, which explain the generalization ability and statistical consistency of different ranking methods. In the last part, we will conclude the tutorial and show several future research directions. |
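To give a feel for the pairwise approach mentioned in the tutorial outline, below is a minimal RankNet-style sketch with a linear scoring function and a logistic pairwise loss; the data, feature dimensions, and learning rate are illustrative only.

```python
import numpy as np

def ranknet_step(w, X, y, lr=0.1):
    """One gradient step of a RankNet-style pairwise loss for a linear scorer.

    X : n x d feature matrix of documents for ONE query
    y : n-vector of graded relevance labels
    For every pair (i, j) with y[i] > y[j], the loss is log(1 + exp(-(s_i - s_j))).
    """
    s = X @ w
    grad = np.zeros_like(w)
    for i in range(len(y)):
        for j in range(len(y)):
            if y[i] > y[j]:
                # gradient of log(1 + exp(-(s_i - s_j))) w.r.t. w
                sigma = 1.0 / (1.0 + np.exp(s[i] - s[j]))
                grad += -sigma * (X[i] - X[j])
    return w - lr * grad

# toy data: 4 documents for one query, 3 features, graded relevance labels
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = np.array([2, 1, 0, 0])
w = np.zeros(3)
for _ in range(100):
    w = ranknet_step(w, X, y)
print(np.argsort(-(X @ w)))   # ranking induced by the learned scorer
```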
Security Analysis of PHP Bytecode Protection Mechanisms | PHP is the most popular scripting language for web applications. Because no native solution to compile or protect PHP scripts exists, PHP applications are usually shipped as plain source code which is easily understood or copied by an adversary. In order to prevent such attacks, commercial products such as ionCube, Zend Guard, and Source Guardian promise a source code protection. In this paper, we analyze the inner working and security of these tools and propose a method to recover the source code by leveraging static and dynamic analysis techniques. We introduce a generic approach for decompilation of obfuscated bytecode and show that it is possible to automatically recover the original source code of protected software. As a result, we discovered previously unknown vulnerabilities and backdoors in 1 million lines of recovered source code of 10 protected applications. |
Localization for mobile sensor networks | Many sensor network applications require location awareness, but it is often too expensive to include a GPS receiver in a sensor network node. Hence, localization schemes for sensor networks typically use a small number of seed nodes that know their location and protocols whereby other nodes estimate their location from the messages they receive. Several such localization techniques have been proposed, but none of them consider mobile nodes and seeds. Although mobility would appear to make localization more difficult, in this paper we introduce the sequential Monte Carlo Localization method and argue that it can exploit mobility to improve the accuracy and precision of localization. Our approach does not require additional hardware on the nodes and works even when the movement of seeds and nodes is uncontrollable. We analyze the properties of our technique and report experimental results from simulations. Our scheme outperforms the best known static localization schemes under a wide range of conditions. |
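A stripped-down sketch of one sequential Monte Carlo localization step is given below: candidate positions are propagated under a maximum-speed motion model and then filtered against the seeds currently heard. It keeps only the direct-seed constraint and omits the two-hop filtering and weighting details of the full method; the radio range, seed positions, and sample counts are arbitrary.

```python
import numpy as np

def mcl_step(samples, v_max, heard_seeds, r, n_samples=50):
    """One prediction + filtering step of a Monte Carlo localization sketch.

    samples     : (n, 2) array of candidate positions from the previous step
    v_max       : maximum node speed per time step
    heard_seeds : (m, 2) array of positions of seeds heard this step
    r           : radio range
    """
    kept = []
    while len(kept) < n_samples:
        # prediction: the node moved at most v_max from some previous sample
        base = samples[np.random.randint(len(samples))]
        ang = np.random.uniform(0, 2 * np.pi)
        dist = np.random.uniform(0, v_max)
        cand = base + dist * np.array([np.cos(ang), np.sin(ang)])
        # filtering: a valid position must be within range of every heard seed
        if all(np.linalg.norm(cand - s) <= r for s in heard_seeds):
            kept.append(cand)
    return np.array(kept)

# toy run: true node near (5, 5), two seeds within radio range
seeds = np.array([[4.0, 6.0], [6.5, 4.5]])
samples = np.random.uniform(0, 10, size=(50, 2))     # initial ignorance
for _ in range(5):
    samples = mcl_step(samples, v_max=1.0, heard_seeds=seeds, r=3.0)
print(samples.mean(axis=0))                          # location estimate (mean of retained samples)
```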
Ontology-based semantic similarity: A new feature-based approach | Estimation of the semantic likeness between words is of great importance in many applications dealing with textual data such as natural language processing, knowledge acquisition and information retrieval. Semantic similarity measures exploit knowledge sources as the base to perform the estimations. In recent years, ontologies have grown in interest thanks to global initiatives such as the Semantic Web, offering a structured knowledge representation. Thanks to the possibilities that ontologies enable regarding the semantic interpretation of terms, many ontology-based similarity measures have been developed. According to the principle on which those measures base the similarity assessment, and the way in which ontologies are exploited or complemented with other sources, several families of measures can be identified. In this paper, we survey and classify most of the ontology-based approaches developed in order to evaluate their advantages and limitations and compare their expected performance both from theoretical and practical points of view. We also present a new ontology-based measure relying on the exploitation of taxonomical features. The evaluation and comparison of our approach’s results against those reported by related works under a common framework suggest that our measure provides a high accuracy without some of the limitations observed in other works.
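As an illustration of the general feature-based idea (not the exact measure proposed in the paper), the sketch below scores two concepts by a Tversky-style ratio over their sets of taxonomical ancestors; the toy is-a hierarchy and the alpha/beta weights are assumptions.

```python
def ancestors(concept, is_a):
    """All superconcepts of `concept` (including itself) in a toy taxonomy."""
    result = {concept}
    stack = [concept]
    while stack:
        c = stack.pop()
        for parent in is_a.get(c, []):
            if parent not in result:
                result.add(parent)
                stack.append(parent)
    return result

def feature_similarity(c1, c2, is_a, alpha=0.5, beta=0.5):
    """Tversky-style ratio over taxonomical features (sets of ancestors)."""
    A, B = ancestors(c1, is_a), ancestors(c2, is_a)
    common, only_a, only_b = len(A & B), len(A - B), len(B - A)
    return common / (common + alpha * only_a + beta * only_b)

# toy is-a taxonomy (hypothetical)
is_a = {
    "dog": ["canine"], "wolf": ["canine"], "cat": ["feline"],
    "canine": ["carnivore"], "feline": ["carnivore"],
    "carnivore": ["mammal"], "mammal": ["animal"],
}
print(feature_similarity("dog", "wolf", is_a))   # high: the concepts share canine..animal
print(feature_similarity("dog", "cat", is_a))    # lower: they diverge below carnivore
```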
Impact of Small and Medium Enterprises on Economic Growth and Development | This paper seeks to investigate Small and Medium Enterprises as a veritable tool for economic growth and development. A survey method was used to gather data from 200 SME/entrepreneurial officers and managers from five selected local government areas in Nigeria, namely Ijebu North, Yewa South, Sagamu, Odeda and Ogun Waterside. Data were collected with a structured questionnaire and analyzed with several descriptive statistics to identify perceptions of the roles of SMEs in Nigeria. The results of the study reveal that the most common constraints hindering small and medium scale business growth in Nigeria are lack of financial support, poor management, corruption, lack of training and experience, poor infrastructure, insufficient profits, and low demand for products and services. The paper therefore recommends that government should, as a matter of urgency, assist prospective entrepreneurs to gain access to finance and to the necessary information relating to business opportunities, modern technology, raw materials, markets, plant and machinery, which would enable them to reduce their operating costs and be more efficient in meeting market competition.
Tailoring shear banding behaviors in high entropy bulk metallic glass by minor Sn addition: A nanoindentation study | The present work systematically investigated the glass forming ability, the mechanical properties and the shear deformation of the TiZrHfCuBe high entropy bulk metallic glass (HE-BMG) with minor Sn addition. The results revealed that the glass forming ability and the thermal stability are enhanced by minor Sn addition, but only within a rather limited composition range. A nanoindentation study of the mechanical response in the (TiZrHfCuBe)1-xSnx (x = 0, 1, 2 and 3 at.%) HE-BMGs showed that the structural heterogeneity caused by Sn addition resulted in higher hardness, a larger number of shear bands and a smaller shear transformation zone size, ascribed to the positive enthalpy of mixing between Sn and the Cu/Be elements. A statistical analysis of the serrated flow revealed that the chemical heterogeneities promote a relatively large population of shear deformation units and lead to multiple nucleation of shear bands. The present work may have implications for establishing the link between minor element addition and structural heterogeneity, which is of critical importance for the understanding of shear banding behaviors in BMGs.
The Impact of Game Patterns on Player Experience and Social Interaction in Co-Located Multiplayer Games | Outstanding multiplayer games engage players by providing rich social interactions. Yet, it is still not clear how to purposefully design these interactions. Our paper tackles this issue by establishing a research model for social player interaction highlighting the impact of the game design, the player group, and the gaming setting. Based on that model, we investigated the influence of three particular game design patterns - player interdependence, time pressure, and shared control - on the interaction of players. For that purpose, we developed a co-located game to systematically test variations of those patterns. We analyzed the resulting player experience and social interaction by applying questionnaires and videotaping game play sessions. Results indicate that high player interdependence implies more communication and less frustration, whereas shared control results in less perceived competence and autonomy. Moreover, individual player characteristics also impact the social interaction. |
Assumption-Free Anomaly Detection in Time Series | Recent advancements in sensor technology have made it possible to collect enormous amounts of data in real time. However, because of the sheer volume of data, most of it will never be inspected by an algorithm, much less a human being. One way to mitigate this problem is to perform some type of anomaly (novelty/interestingness/surprisingness) detection and flag unusual patterns for further inspection by humans or more CPU intensive algorithms. Most current solutions are “custom made” for particular domains, such as ECG monitoring, valve pressure monitoring, etc. This customization requires extensive effort by a domain expert. Furthermore, hand-crafted systems tend to be very brittle to concept drift. In this demonstration, we will show an online anomaly detection system that does not need to be customized for individual domains, yet performs with exceptionally high precision/recall. The system is based on the recently introduced idea of time series bitmaps. To demonstrate the universality of our system, we will allow testing on independently annotated datasets from domains as diverse as ECGs, Space Shuttle telemetry monitoring, video surveillance, and respiratory data. In addition, we invite attendees to test our system with any dataset available on the web.
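A much-simplified sketch of the time-series-bitmap idea is shown below: the series is discretized into symbols, subword frequencies are counted in a lag window and a lead window, and the distance between the two frequency "bitmaps" is used as an anomaly score. The breakpoints, window sizes, and word length here are arbitrary choices, not the demo system's settings.

```python
import numpy as np
from collections import Counter

def symbolize(x, n_symbols=4):
    """Map a real-valued series to symbols 0..n_symbols-1 via quantile breakpoints."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def bitmap(symbols, word_len=2):
    """Normalized frequencies of all length-`word_len` symbol subwords."""
    words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def anomaly_scores(x, window=50, word_len=2):
    """Score each point by the distance between the bitmaps of the
    lag window (recent past) and the lead window (immediate future)."""
    s = symbolize(x)
    scores = np.zeros(len(x))
    for t in range(window, len(x) - window):
        lag = bitmap(s[t - window:t], word_len)
        lead = bitmap(s[t:t + window], word_len)
        keys = set(lag) | set(lead)
        scores[t] = sum((lag.get(k, 0) - lead.get(k, 0)) ** 2 for k in keys)
    return scores

# toy signal: a sine wave with an injected anomaly around t = 600
t = np.arange(1000)
x = np.sin(2 * np.pi * t / 50) + 0.05 * np.random.randn(1000)
x[600:620] += 2.0
print(int(np.argmax(anomaly_scores(x))))   # peaks close to the injected anomaly at t = 600
```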
Bilingual Correspondence Recursive Autoencoder for Statistical Machine Translation | Learning semantic representations and tree structures of bilingual phrases is beneficial for statistical machine translation. In this paper, we propose a new neural network model called Bilingual Correspondence Recursive Autoencoder (BCorrRAE) to model bilingual phrases in translation. We incorporate word alignments into BCorrRAE to allow it freely access bilingual constraints at different levels. BCorrRAE minimizes a joint objective on the combination of a recursive autoencoder reconstruction error, a structural alignment consistency error and a crosslingual reconstruction error so as to not only generate alignment-consistent phrase structures, but also capture different levels of semantic relations within bilingual phrases. In order to examine the effectiveness of BCorrRAE, we incorporate both semantic and structural similarity features built on bilingual phrase representations and tree structures learned by BCorrRAE into a state-of-the-art SMT system. Experiments on NIST Chinese-English test sets show that our model achieves a substantial improvement of up to 1.55 BLEU points over the baseline. |
A persistence landscapes toolbox for topological statistics | Topological data analysis provides a multiscale description of the geometry and topology of quantitative data. The persistence landscape is a topological summary that can be easily combined with tools from statistics and machine learning. We give efficient algorithms for calculating persistence landscapes, their averages, and distances between such averages. We discuss an implementation of these algorithms and some related procedures. These are intended to facilitate the combination of statistics and machine learning with topological data analysis. We present an experiment showing that the low-dimensional persistence landscapes of points sampled from spheres (and boxes) of varying dimensions differ. |
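For readers who want to see what the summary looks like in code, the sketch below evaluates the first few landscape functions of a persistence diagram on a grid; the toolbox described in the paper works with exact piecewise-linear representations and is considerably more efficient, so this is only an illustration.

```python
import numpy as np

def persistence_landscape(pairs, grid, k_max=3):
    """Evaluate the first k_max landscape functions on a grid of t values.

    pairs : iterable of (birth, death) intervals from a persistence diagram
    The k-th landscape lambda_k(t) is the k-th largest value of the tent
    functions  f_(b,d)(t) = max(0, min(t - b, d - t)).
    """
    tents = np.array([np.maximum(0.0, np.minimum(grid - b, d - grid))
                      for b, d in pairs])                 # (n_pairs, n_grid)
    tents_sorted = -np.sort(-tents, axis=0)               # descending per t
    lam = np.zeros((k_max, len(grid)))
    lam[:min(k_max, len(pairs))] = tents_sorted[:k_max]
    return lam

# toy diagram with two overlapping intervals
pairs = [(0.0, 4.0), (1.0, 3.0)]
grid = np.linspace(0, 5, 11)
lam = persistence_landscape(pairs, grid, k_max=2)
print(lam[0])   # lambda_1: upper envelope of the two tents
print(lam[1])   # lambda_2: nonzero where both tents overlap

# averaging landscapes, as used for the statistics in the paper, is then just a
# pointwise mean of such arrays over a sample of diagrams.
```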
The FIGO classification of causes of abnormal uterine bleeding in the reproductive years. | At this juncture, clinical management, education for medical providers, and the design and interpretation of clinical trials have been hampered by the absence of a consensus system for nomenclature for the description of symptoms as well as classification of causes or potential causes of abnormal uterine bleeding (AUB). To address this issue, the Fédération Internationale de Gynécologie et d'Obstétrique (FIGO) has designed the PALM-COEIN (Polyp, Adenomyosis, Leiomyoma, Malignancy and Hyperplasia, Coagulopathy, Ovulatory Disorders, Endometrial Disorders, Iatrogenic Causes, and Not Classified) classification system for causes of AUB in the reproductive years. |
Balancing plasticity/stability across brain development. | The potency of the environment to shape brain function changes dramatically across the lifespan. Neural circuits exhibit profound plasticity during early life and are later stabilized. A focus on the cellular and molecular bases of these developmental trajectories has begun to unravel mechanisms, which control the onset and closure of such critical periods. Two important concepts have emerged from the study of critical periods in the visual cortex: (1) excitatory-inhibitory circuit balance is a trigger; and (2) molecular "brakes" limit adult plasticity. The onset of the critical period is determined by the maturation of specific GABA circuits. Targeting these circuits using pharmacological or genetic approaches can trigger premature onset or induce a delay. These manipulations are so powerful that animals of identical chronological age may be at the peak, before, or past their plastic window. Thus, critical period timing per se is plastic. Conversely, one of the outcomes of normal development is to stabilize the neural networks initially sculpted by experience. Rather than being passively lost, the brain's intrinsic potential for plasticity is actively dampened. This is demonstrated by the late expression of brake-like factors, which reversibly limit excessive circuit rewiring beyond a critical period. Interestingly, many of these plasticity regulators are found in the extracellular milieu. Understanding why so many regulators exist, how they interact and, ultimately, how to lift them in noninvasive ways may hold the key to novel therapies and lifelong learning. |
Dynamic encoding of action selection by the medial striatum. | Successful foragers respond flexibly to environmental stimuli. Behavioral flexibility depends on a number of brain areas that send convergent projections to the medial striatum, such as the medial prefrontal cortex, orbital frontal cortex, and amygdala. Here, we tested the hypothesis that neurons in the medial striatum are involved in flexible action selection, by representing changes in stimulus-reward contingencies. Using a novel Go/No-go reaction-time task, we changed the reward value of individual stimuli within single experimental sessions. We simultaneously recorded neuronal activity in the medial and ventral parts of the striatum of rats. The rats modified their actions in the task after the changes in stimulus-reward contingencies. This was preceded by dynamic modulations of spike activity in the medial, but not the ventral, striatum. Our results suggest that the medial striatum biases animals to collect rewards to potentially valuable stimuli and can rapidly influence flexible behavior. |
Why Bank Governance is Different | This paper reviews the pattern of bank failures during the financial crisis and asks whether there was a link with corporate governance. It revisits the theory of bank governance and suggests a multiconstituency approach that emphasizes the role of weak creditors. The empirical evidence suggests that, on average, banks with stronger risk officers, less independent boards, and executives with less variable remuneration incurred fewer losses. There is no evidence that institutional shareholders opposed aggressive risk-taking. The Financial Stability Board published Principles for Sound Compensation Practices in 2009, and the Basel Committee on Banking Supervision issued principles for enhancing corporate governance in 1999, 2006, and 2010. The reports have in common that shareholders retain residual control and executive pay continues to be aligned with shareholder interests. However, we argue that bank governance is different and requires more radical departures from traditional governance for non-financial firms. |
Combining Shape-Changing Interfaces and Spatial Augmented Reality Enables Extended Object Appearance | We propose combining shape-changing interfaces and spatial augmented reality for extending the space of appearances and interactions of actuated interfaces. While shape-changing interfaces can dynamically alter the physical appearance of objects, the integration of spatial augmented reality additionally allows for dynamically changing objects' optical appearance with high detail. This way, devices can render currently challenging features such as high frequency texture or fast motion. We frame this combination in the context of computer graphics with analogies to established techniques for increasing the realism of 3D objects such as bump mapping. This extensible framework helps us identify challenges of the two techniques and benefits of their combination. We utilize our prototype shape-changing device enriched with spatial augmented reality through projection mapping to demonstrate the concept. We present a novel mechanical distance-fields algorithm for real-time fitting of mechanically constrained shape-changing devices to arbitrary 3D graphics. Furthermore, we present a technique for increasing effective screen real estate for spatial augmented reality through view-dependent shape change. |
The Electronic Health Record: Standardization and Implementation | Electronic patient record systems (EPRS) are doubtless a key component of every institutional healthcare information system (HCIS). With their capability of storing patient-related data concerning facts like problems, diagnoses and illness history, together with data concerning medical activities and their results like "Medical treatment planning by Dr. X: treatment plan" or "Taking an ECG by Dr. Y: ECG", EPRS are a central means of documentation, information exchange and collaboration in a modern healthcare organization. Like EPRS for HCIS, electronic health record systems (EHRS) are a key component of current and coming health telematic platforms. EHRS are a means for exchanging health data concerning an individual person between communication partners within the healthcare sector, controlled by the individuals to whom the communicated data belong. EHRS integrate the EPRS of healthcare organisations as well as the personal health record of the person themselves, including, for example, illness-related diary entries. The objectives of an EHRS are diverse, but as for EPRS the main objective is clearly to support the treatment of patients by providing the information needed for decisions by healthcare professionals. Problems concerning the introduction of EHRS as part of a regional, nationwide or, at best, worldwide health telematic platform are diverse, and so "the" EHRS does not exist yet. The major problem is the huge number of different proprietary or standardized interfaces used today by the information systems that would have to be integrated: message or interface standards like HL7, EDIFACT and DICOM, more content-oriented standards like LOINC, ICD-10 and ICPM, or hybrid approaches like CEN 13606 and openEHR, to name but a few. Standards are the key to a successful implementation of any EHRS. Four layers of standardization can be recognized. The content layer and the structure layer are both concerned with the standardization of the elements of an EHR that are meant to be exchanged between communication partners. The content layer addresses aspects of coding the content of EHR elements using terminological systems like classifications or controlled vocabularies. The structure layer focuses on regulations concerning the structure of communicated EHR elements, e.g. XML files following corresponding DTDs or XML schemas. The border between the structure and content layers is often blurred, because several content-oriented aspects of, for example, a discharge letter are usually modelled by defining its structure. The two remaining layers are the technological and the organizational layer. The technological layer contains regulations concerning aspects like software and hardware components, distribution, objects and services, the PKI, etc. The organizational layer focuses on organizational changes caused by the usage of an EHRS: business processes, guidelines, protocols, roles, PKI, etc. Organizational and technological regulations are even more dependent on national strategies than regulations on the structural or content layer, and therefore the rest of the paper deals with standardization activities concerned mainly with the structural and the content layer.
User Interface Goals, AI Opportunities | Is AI antithetical to good user interface design? From the earliest times in the development of computers, activities in human-computer interaction (HCI) and AI have been intertwined. But as subfields of computer science, HCI and AI have always had a love-hate relationship. The goal of HCI is to make computers easier to use and more helpful to their users. The goal of artificial intelligence is to model human thinking and to embody those mechanisms in computers. How are these goals related? Some in HCI have seen these goals sometimes in opposition. They worry that the heuristic nature of many AI algorithms will lead to unreliability in the interface. They worry that AI’s emphasis on mimicking human decision-making functions might usurp the decision-making prerogative of the human user. These concerns are not completely without merit. There are certainly many examples of failed attempts to prematurely foist AI on the public. These attempts gave AI a bad name, at least at the time. But so too have there been failed attempts to popularize new HCI approaches. The first commercial versions of window systems, such as the Xerox Star and early versions of Microsoft Windows, weren’t well accepted at the time of their introduction. Later design iterations of window systems, such as the Macintosh and Windows 3.0, finally achieved success. Key was that these early failures did not lead their developers to conclude window systems were a bad idea. Researchers shouldn’t construe these (perceived) AI failures as a refutation of the idea of AI in interfaces. Modern PDA, smartphone, and tablet computers are now beginning to have quite usable handwriting recognition. Voice recognition is being increasingly employed on phones, and even in the noisy environment of cars. Animated agents, more polite, less intrusive, and better thought out, might also make a |
A 13-bit 200MS/s PIPELINE ADC in 0.13µm CMOS | This paper presents a 0.13μm SHA-less pipeline ADC with LMS calibration technique. The nonlinearity of the first three stages is calibrated with blind LMS algorithm. Opamps and switches are carefully considered and co-designed with the calibration system. Around 7LSB closed-loop nonlinearity of MDAC is achieved. Simulation shows the SNDR of the proposed ADC at 200MS/s sampling rate is 78dB with 3.13MHz input and 75dB with 83.13MHz input. |
Modeling and Indexing Spatiotemporal Trajectory Data in Non-Relational Databases | With the ever-growing nature of spatiotemporal data, it is inevitable to use non-relational and distributed database systems for storing massive spatiotemporal datasets. In this chapter, the important aspects of non-relational (NoSQL) databases for storing large-scale spatiotemporal trajectory data are investigated. Mainly, two data storage schemata are proposed for storing trajectories, which are called traditional and partitioned data models. Additionally spatiotemporal and non-spatiotemporal indexing structures are designed for efficiently retrieving data under different usage scenarios. The results of the experiments exhibit the advantages of utilizing data models and indexing structures for various query types. |
Histogram-based image hashing scheme robust against geometric deformations | In this paper, we propose a robust image hash algorithm by using the invariance of the image histogram shape to geometric deformations. Robustness and uniqueness of the proposed hash function are investigated in detail by representing the histogram shape as the relative relations in the number of pixels among groups of two different bins. It is found from extensive testing that the histogram-based hash function has a satisfactory performance to various geometric deformations, and is also robust to most common signal processing operations thanks to the use of Gaussian kernel low-pass filter in the preprocessing phase. |
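A rough sketch of the pipeline described (Gaussian low-pass filtering followed by a binary hash built from the relative sizes of pairs of histogram bins) is given below; the filter width, bin count, and bin pairing are placeholders rather than the paper's parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def histogram_hash(image, n_bins=32, sigma=3.0):
    """Binary hash from the shape of the (low-pass filtered) intensity histogram.

    Each bit encodes whether one bin of a pair holds more pixels than the other;
    the histogram shape, and hence the hash, is largely unchanged by geometric
    deformations that mainly rearrange pixel positions.
    """
    smoothed = gaussian_filter(image.astype(float), sigma)
    hist, _ = np.histogram(smoothed, bins=n_bins, range=(0, 255))
    # compare disjoint bin pairs (bin 2i vs bin 2i+1)
    bits = [int(hist[2 * i] >= hist[2 * i + 1]) for i in range(n_bins // 2)]
    return np.array(bits, dtype=np.uint8)

def hamming(h1, h2):
    return int(np.sum(h1 != h2))

# toy check: a rotated copy of an image keeps its histogram shape
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(128, 128))
rotated = np.rot90(img)
print(hamming(histogram_hash(img), histogram_hash(rotated)))   # small distance expected
```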
Erythropoietin treatment in chemotherapy-induced anemia in previously untreated advanced esophagogastric cancer patients | The impact of erythropoiesis-stimulating agents in chemotherapy-induced anemia has been a constant topic of debate over recent years. We prospectively assessed the efficacy of epoetin beta (Epo-b) in improving hemoglobin (Hb) levels and outcome in patients within an open label, randomized clinical phase II trial with advanced or metastatic gastric/esophagogastric cancer. Previously untreated patients were randomized to receive 3-weekly cycles of capecitabine (1000 mg/m2 bid) for 14 days plus on day 1 either irinotecan 250 mg/m2 or cisplatin 80 mg/m2. Epo-b (30000 IU once weekly) was initiated in patients with Hb <11 g/dl and continued until Hb ≥12 g/dl was reached. If after 4 weeks the Hb increase was <0.5 g/dl, Epo-b was increased to 30000 IU, twice weekly. Of 118 patients enrolled, 32 received Epo-b treatment; of these, 65 % achieved an increase in Hb levels of at least 2 g/dl, with 74 % achieving the target Hb of ≥12 g/dl. Within the study population, patients receiving Epo-b showed better overall survival (median 14.5 vs. 8.0 months, P = 0.056) as well as a significantly improved disease control rate (78 vs. 55 %, P = 0.025). Patients in the irinotecan group profited significantly (P < 0.05) in terms of progression-free survival and overall survival under Epo-b treatment (median 6.5 vs 4.1 months and median 15.4 vs 8.4 months, respectively). Epo-b was effective in raising Hb levels in patients with advanced esophagogastric cancer. Patients receiving Epo-b had a significantly increased response to chemotherapy and a clear trend to improved survival. |
Multi-level Attention Networks for Visual Question Answering | Inspired by the recent success of text-based question answering, visual question answering (VQA) is proposed to automatically answer natural language questions with reference to a given image. Compared with text-based QA, VQA is more challenging because reasoning in the visual domain needs both effective semantic embedding and fine-grained visual understanding. Existing approaches predominantly infer answers from abstract low-level visual features, while neglecting the modeling of high-level image semantics and the rich spatial context of regions. To address these challenges, we propose a multi-level attention network for visual question answering that can simultaneously reduce the semantic gap by semantic attention and benefit fine-grained spatial inference by visual attention. First, we generate semantic concepts from high-level semantics in convolutional neural networks (CNN) and select those question-related concepts as semantic attention. Second, we encode region-based middle-level outputs from the CNN into a spatially-embedded representation by a bidirectional recurrent neural network, and further pinpoint the answer-related regions by a multilayer perceptron as visual attention. Third, we jointly optimize semantic attention, visual attention and question embedding by a softmax classifier to infer the final answer. Extensive experiments show the proposed approach outperforms the state of the art on two challenging VQA datasets.
Multiresolution Gray-Scale and Rotation Invariant Texture Classification with Local Binary Patterns | This paper presents a theoretically very simple, yet efficient, multiresolution approach to gray-scale and rotation invariant texture classification based on local binary patterns and nonparametric discrimination of sample and prototype distributions. The method is based on recognizing that certain local binary patterns, termed "uniform," are fundamental properties of local image texture and their occurrence histogram is proven to be a very powerful texture feature. We derive a generalized gray-scale and rotation invariant operator presentation that allows for detecting the "uniform" patterns for any quantization of the angular space and for any spatial resolution and presents a method for combining multiple operators for multiresolution analysis. The proposed approach is very robust in terms of gray-scale variations since the operator is, by definition, invariant against any monotonic transformation of the gray scale. Another advantage is computational simplicity as the operator can be realized with a few operations in a small neighborhood and a lookup table. Excellent experimental results obtained in true problems of rotation invariance, where the classifier is trained at one particular rotation angle and tested with samples from other rotation angles, demonstrate that good discrimination can be achieved with the occurrence statistics of simple rotation invariant local binary patterns. These operators characterize the spatial configuration of local image texture and the performance can be further improved by combining them with rotation invariant variance measures that characterize the contrast of local image texture. The joint distributions of these orthogonal measures are shown to be very powerful tools for rotation invariant texture analysis. Index Terms: Nonparametric, texture analysis, Outex, Brodatz, distribution, histogram, contrast.
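The rotation-invariant "uniform" operator is compact enough to sketch directly; the version below uses nearest-neighbour sampling of the circular neighbourhood instead of the interpolation used in the paper, so treat it as an illustration of the pattern coding rather than a faithful reimplementation.

```python
import numpy as np

def lbp_riu2(image, P=8, R=1):
    """Rotation-invariant 'uniform' local binary pattern codes (sketch).

    For each interior pixel, P neighbours on a circle of radius R are
    thresholded at the centre value. If the resulting bit ring has at most
    two 0/1 transitions ("uniform"), the code is the number of 1s (0..P);
    otherwise the code is P + 1. A histogram of these codes is the texture
    feature.
    """
    img = image.astype(float)
    rows, cols = img.shape
    # neighbour offsets on the circle, rounded to the nearest pixel
    angles = 2 * np.pi * np.arange(P) / P
    dy = np.rint(-R * np.sin(angles)).astype(int)
    dx = np.rint(R * np.cos(angles)).astype(int)
    codes = np.full((rows, cols), -1, dtype=int)     # -1 marks untouched borders
    for r in range(R, rows - R):
        for c in range(R, cols - R):
            bits = (img[r + dy, c + dx] >= img[r, c]).astype(int)
            transitions = np.sum(bits != np.roll(bits, 1))
            codes[r, c] = bits.sum() if transitions <= 2 else P + 1
    return codes

# texture feature = normalized histogram of the P + 2 possible codes
img = np.random.default_rng(0).integers(0, 256, size=(64, 64))
codes = lbp_riu2(img)
hist = np.bincount(codes[codes >= 0], minlength=10) / np.sum(codes >= 0)
print(hist)
```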
A phase II study of the combination of endocrine treatment and bortezomib in patients with endocrine-resistant metastatic breast cancer. | The majority of patients with hormone receptor-positive metastatic breast cancer die from disease progression despite different types of anti-hormonal treatments. Preclinical studies have indicated that resistance to anti-hormonal therapies may be the result of an activated NF-κB signalling pathway in breast cancer. Bortezomib is a proteasome inhibitor that blocks the NF-κB pathway. Recent pharmacodynamic and pharmacokinetic xenograft studies have shown that drug exposure may be a crucial factor for the efficacy of bortezomib in solid tumours. The aim was to investigate whether the addition of bortezomib to anti-hormonal therapy would result in regained antitumour activity in patients with progressive and measurable disease being treated with an endocrine agent. Clinical benefit was defined as patients obtaining stable disease, partial response or complete response after 2 cycles, lasting for at least another five weeks. Bortezomib was administered on days 1, 8, 15 and 22 of a 5-week regimen (1.6 mg/m2). Eight patients received an aromatase inhibitor and bortezomib, while one received tamoxifen and bortezomib. There were 3 grade 3 gastrointestinal toxicities. Median time to treatment failure was 69 days (range, 35-140). Two out of the 9 patients had stable disease for more than 10 weeks. Despite an effective target inhibition, suggested in peripheral blood mononuclear cells and available tumour samples, no objective antitumour responses were observed. Addition of a proteasome inhibitor to anti-hormonal therapy resulted in a clinical benefit rate of 22% in a limited number of patients with endocrine resistant and progressive metastatic breast cancer. The demonstrated proteasome inhibition in tumour tissue provides evidence that the lack of clinical responses is not attributable to deficient drug exposure.
Energy Metabolism of the Brain, Including the Cooperation between Astrocytes and Neurons, Especially in the Context of Glycogen Metabolism | Glycogen metabolism has important implications for the functioning of the brain, especially the cooperation between astrocytes and neurons. According to various research data, in a glycogen deficiency (for example during hypoglycemia) glycogen supplies are used to generate lactate, which is then transported to neighboring neurons. Likewise, during periods of intense activity of the nervous system, when the energy demand exceeds supply, astrocyte glycogen is immediately converted to lactate, some of which is transported to the neurons. Thus, glycogen from astrocytes functions as a kind of protection against hypoglycemia, ensuring preservation of neuronal function. The neuroprotective effect of lactate during hypoglycemia or cerebral ischemia has been reported in literature. This review goes on to emphasize that while neurons and astrocytes differ in metabolic profile, they interact to form a common metabolic cooperation. |
Automatic Rule Induction for Unknown-Word Guessing | Words unknown to the lexicon present a substantial problem to NLP modules that rely on morphosyntactic information, such as part-of-speech taggers or syntactic parsers. In this paper we present a technique for fully automatic acquisition of rules that guess possible part-of-speech tags for unknown words using their starting and ending segments. The learning is performed from a general-purpose lexicon and word frequencies collected from a raw corpus. Three complementary sets of word-guessing rules are statistically induced: prefix morphological rules, suffix morphological rules and ending-guessing rules. Using the proposed technique, unknown-word-guessing rule sets were induced and integrated into a stochastic tagger and a rule-based tagger, which were then applied to texts with unknown words.
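A toy sketch of ending-guessing induction from a tagged lexicon is shown below; the frequency and purity thresholds stand in for the statistical scoring of the paper, and the miniature lexicon and tags are made up.

```python
from collections import defaultdict, Counter

def induce_ending_rules(lexicon, max_len=4, min_count=2, min_purity=0.8):
    """Induce ending-guessing rules `ending -> set of likely tags` (sketch).

    lexicon : dict mapping word -> set of possible part-of-speech tags
    An ending is kept as a rule if it is seen often enough and one tag set
    dominates the words carrying it.
    """
    by_ending = defaultdict(Counter)
    for word, tags in lexicon.items():
        for n in range(1, min(max_len, len(word) - 1) + 1):
            by_ending[word[-n:]][frozenset(tags)] += 1
    rules = {}
    for ending, tagset_counts in by_ending.items():
        total = sum(tagset_counts.values())
        best, count = tagset_counts.most_common(1)[0]
        if total >= min_count and count / total >= min_purity:
            rules[ending] = set(best)
    return rules

def guess_tags(word, rules, default={"NN"}):
    """Guess tags for an unknown word using its longest matching ending."""
    for n in range(len(word) - 1, 0, -1):
        if word[-n:] in rules:
            return rules[word[-n:]]
    return default

# toy lexicon (hypothetical tags)
lexicon = {"running": {"VBG"}, "walking": {"VBG"}, "singing": {"VBG"},
           "quickly": {"RB"}, "slowly": {"RB"}, "happily": {"RB"},
           "table": {"NN"}, "cable": {"NN"}}
rules = induce_ending_rules(lexicon)
print(guess_tags("jogging", rules))   # -> {'VBG'} via the '-ing' rule
print(guess_tags("bravely", rules))   # -> {'RB'} via the '-ly' rule
```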
Multi-labelled classification using maximum entropy method | Many classification problems require classifiers to assign each single document to more than one category, which is called multi-labelled classification. The categories in such problems are usually neither conditionally independent of each other nor mutually exclusive, therefore it is not trivial to directly employ state-of-the-art classification algorithms without losing information about the relations among categories. In this paper, we explore correlations among categories with the maximum entropy method and derive a classification algorithm for multi-labelled documents. Our experiments show that this method significantly outperforms combinations of single-label approaches.
Predicting Depth, Surface Normals and Semantic Labels with a Common Multi-scale Convolutional Architecture | In this paper we address three different computer vision tasks using a single basic architecture: depth prediction, surface normal estimation, and semantic labeling. We use a multiscale convolutional network that is able to adapt easily to each task using only small modifications, regressing from the input image to the output map directly. Our method progressively refines predictions using a sequence of scales, and captures many image details without any superpixels or low-level segmentation. We achieve state-of-the-art performance on benchmarks for all three tasks. |
Work in progress - ‘Real World Problems’ as assessment of software engineering | This project evaluated the effectiveness of using certification-like real-world problems for regular assessment of student performance in a software engineering content course. The approach utilized the Turning Point™ personal response system as a means to present "Real World" software engineering problems and anonymously assess student learning. The preliminary results indicated that the method was not useful for assessment, but may have promise for stimulating discussion and student learning.
Short- and long-term prognosis among veterans with neurological disorders and subsequent lower-extremity amputation. | BACKGROUND
Although comorbid neurological conditions are not uncommon for individuals undergoing lower-extremity (LE) amputation, short- and long-term prognosis is unclear.
METHODS
This cohort study on the survival of United States veterans with LE amputations examined the association between different preexisting neurological conditions and short- and long-term (in-hospital and within 1 year of surgical amputation) mortality. Chi-square and t test statistics compared baseline characteristics for patients with and without neurological disorders. Multiple logistic regression and Cox proportional hazards models were used to examine short- and long-term survival and identify predictors, limited to the subset of those with neurological conditions, adjusting for age, amputation level and etiology, and co-morbidities.
RESULTS
Of 4,720 patients, 43.3% had neurological disorders documented. Most prevalent were stroke or hemiparesis (18.3%) and peripheral nervous system (PNS) disorders (20.3%). Among patients with neurological conditions, those with a PNS disorder or spinal cord injury (or paralysis) were significantly less likely to die in hospital and within 1 year (p < 0.05) when compared to the other types of neurological condition groups including stroke (or hemiparesis), cerebral degenerative diseases, movement disorders and autonomic disorders.
CONCLUSIONS
The high prevalence of preexisting neurological disorders among LE amputees and the varying effect of different conditions on risk of mortality highlight the need to further characterize the diversity of this understudied subpopulation. While preexisting spinal cord injury and PNS disorders appear to carry a decreased risk among amputees, those with central nervous system disorders have comparatively greater mortality.
Working with structuralism : essays and reviews on nineteenth- and twentieth-century literature | `This is a sane, highly intelligent, lucidly-written book, and one which everyone concerned with literature must read at least once.' - "Daily Telegraph". |
Why don't families initiate treatment? A qualitative multicentre study investigating parents' reasons for declining paediatric weight management. | BACKGROUND
Many families referred to specialized health services for managing paediatric obesity do not initiate treatment; however, reasons for noninitiation are poorly understood.
OBJECTIVE
To understand parents' reasons for declining tertiary-level health services for paediatric weight management.
METHOD
Interviews were conducted with 18 parents of children (10 to 17 years of age; body mass index ≥85th percentile) who were referred for weight management, but did not initiate treatment at one of three Canadian multidisciplinary weight management clinics. A semi-structured interview guide was used to elicit parents' responses about reasons for noninitiation. Interviews were audio-recorded and transcribed verbatim. Data were managed using NVivo 9 (QSR International, Australia) and analyzed thematically.
RESULTS
Most parents (mean age 44.1 years; range 34 to 55 years) were female (n=16 [89%]), obese (n=12 [66%]) and had a university degree (n=13 [71%]). Parents' reasons for not initiating health services were grouped into five themes: no perceived need for paediatric weight management (eg, perceived children did not have a weight or health problem); no perceived need for further actions (eg, perceived children already had a healthy lifestyle); no intention to initiate recommended care (eg, perceived clinical program was not efficacious); participation barriers (eg, children's lack of motivation); and situational factors (eg, weather).
CONCLUSION
Physicians should not only discuss the need for and value of specialized care for managing paediatric obesity, but also explore parents' intention to initiate treatment and address reasons for noninitiation that are within their control. |
A Richly Annotated Dataset for Pedestrian Attribute Recognition | In this paper, we aim to improve the dataset foundation for pedestrian attribute recognition in real surveillance scenarios. Recognition of human attributes, such as gender and clothing type, has great prospects in real applications. However, the development of suitable benchmark datasets for attribute recognition has lagged behind. Existing human attribute datasets are collected from various sources or by integrating pedestrian re-identification datasets. Such heterogeneous collection poses a big challenge for developing high quality fine-grained attribute recognition algorithms. Furthermore, human attribute recognition is generally severely affected by environmental or contextual factors, such as viewpoints, occlusions and body parts, while existing attribute datasets barely take these factors into account. To tackle these problems, we build a Richly Annotated Pedestrian (RAP) dataset from real multi-camera surveillance scenarios with long term collection, where data samples are annotated with not only fine-grained human attributes but also environmental and contextual factors. RAP has in total 41,585 pedestrian samples, each of which is annotated with 72 attributes as well as viewpoint, occlusion and body part information. To our knowledge, the RAP dataset is the largest pedestrian attribute dataset, and it is expected to greatly promote the study of large-scale attribute recognition systems. Furthermore, we empirically analyze the effects of different environmental and contextual factors on pedestrian attribute recognition. Experimental results demonstrate that viewpoint, occlusion and body part information can assist attribute recognition a lot in real applications.
Population-based prevalence and age distribution of human papillomavirus among women in Santiago, Chile. | UNLABELLED
More than 18 types of human papillomavirus (HPV) are associated with cervical cancer; the relative importance of the HPV types may vary in different populations.
OBJECTIVE
To investigate the types of HPV, age distribution, and risk factors for HPV infection in women from Santiago, Chile.
METHODS
We interviewed and obtained two cervical specimens from a population-based random sample of 1,038 sexually active women (age range, 15-69 years). Specimens were tested for the presence of HPV DNA using a GP5+/6+ primer-mediated PCR and for cervical cytologic abnormalities by Papanicolaou smears.
RESULTS
122 women tested positive for HPV DNA, 87 with high-risk (HR) types and 35 with low-risk (LR) types only. The standardized prevalence of HPV DNA was 14.0% [95% confidence interval (95% CI), 11.5-16.4]. HR HPV prevalence by age showed a reverse J-shaped curve, whereas LR HPV showed a U-shaped curve, both statistically significant in comparison with no effect or with a linear effect. We found 34 HPV types (13 HR and 21 LR); HPV 16, 56, 31, 58, 59, 18, and 52 accounted for 75.4% of HR infections. Thirty-four (3.6%) women had cytologic lesions. The main risk factor for HPV and for cytologic abnormalities was the number of lifetime sexual partners; odds ratios for ≥3 versus 1 were 2.8 (95% CI, 1.6-5.0) and 3.8 (95% CI, 1.3-11.4), respectively.
CONCLUSIONS
LR HPV presented a clear bimodal age pattern; HR HPV presented a reverse J-shaped curve. HPV prevalence was similar to that described in most Latin American countries.
Blogel: A Block-Centric Framework for Distributed Computation on Real-World Graphs | The rapid growth in the volume of many real-world graphs (e.g., social networks, web graphs, and spatial networks) has led to the development of various vertex-centric distributed graph computing systems in recent years. However, real-world graphs from different domains have very different characteristics, which often create bottlenecks in vertex-centric parallel graph computation. We identify three such important characteristics from a wide spectrum of real-world graphs, namely (1) skewed degree distribution, (2) large diameter, and (3) (relatively) high density. Among them, only (1) has been studied by existing systems, but many real-world power-law graphs also exhibit the characteristics of (2) and (3). In this paper, we propose a block-centric framework, called Blogel, which naturally handles all three adverse graph characteristics. Blogel programmers may think like a block and develop efficient algorithms for various graph problems. We propose parallel algorithms to partition an arbitrary graph into blocks efficiently, and block-centric programs are then run over these blocks. Our experiments on large real-world graphs verified that Blogel is able to achieve orders of magnitude performance improvements over the state-of-the-art distributed graph computing systems.
Applications of Artificial Intelligence to Network Security | Attacks on networks are becoming more complex and sophisticated every day. Beyond the so-called script kiddies and hacking newbies, there is a myriad of professional attackers seeking to make serious profits by infiltrating corporate networks. Hostile governments, big corporations and mafias are constantly increasing their resources and skills in cybercrime in order to spy, steal or cause damage more effectively. With the ability and resources of hackers growing, the traditional approaches to network security seem to be hitting their limits, and the need for a smarter approach to threat detection is being recognized. This paper provides an introduction to the need for evolution of cyber security techniques and to how Artificial Intelligence (AI) could be applied to help solve some of the problems. It also provides a high-level overview of some state-of-the-art AI network security techniques, and concludes by analysing the foreseeable future of the application of AI to network security.
Role of long term antibiotics in chronic respiratory diseases. | Antibiotics are commonly used in the management of respiratory disorders such as cystic fibrosis (CF), non-CF bronchiectasis, asthma and COPD. In those conditions, long-term antibiotics can be delivered as nebulised aerosols or administered orally. In CF, nebulised colomycin or tobramycin improve lung function, reduce the number of exacerbations and improve quality of life (QoL). Oral antibiotics, such as macrolides, have acquired wide use not only as anti-microbial agents but also due to their anti-inflammatory and pro-kinetic properties. In CF, macrolides such as azithromycin have been shown to improve lung function and reduce the frequency of infective exacerbations. Similarly, macrolides have been shown to have some benefits in COPD, including a reduction in the number of exacerbations. In asthma, macrolides have been reported to improve some subjective parameters, bronchial hyperresponsiveness and airway inflammation; however, they have no benefits on lung function or overall asthma control. Macrolides have also been used with beneficial effects in less common disorders such as diffuse panbronchiolitis or post-transplant bronchiolitis obliterans syndrome. In this review we describe our current knowledge of the use of long-term antibiotics in conditions such as CF, non-CF bronchiectasis, asthma and COPD, together with up-to-date clinical and scientific evidence to support our understanding of the use of antibiotics in these conditions.
Circularly Polarized Substrate-Integrated Waveguide Tapered Slot Antenna for Millimeter-Wave Applications | A low-cost wideband circularly polarized (CP) tapered slot antenna based on substrate-integrated waveguide (SIW) is presented for millimeter-wave applications. After using two linearly symmetrical slots on the two sides of the SIW, a bandwidth from 80.1 to 108 GHz for both axial ratio < 3 dB and VSWR < 2 can be achieved. The measured results also show that the antenna can generate good endfire radiation patterns and a gain of around 7.9 dBi over this bandwidth. The mechanism for exciting the CP wave of the proposed antenna and the influence of the size of the metalized via holes of the SIW are also analyzed. The proposed antenna was fabricated by standard single-layered printed circuit board (PCB) technology, and the measured results agree reasonably with the simulated ones.
Construction of Abstract State Graphs with PVS | In this paper, we propose a method for the automatic construction of an abstract state graph of an arbitrary system using the PVS theorem prover. Given a parallel composition of sequential processes and a partition of the state space induced by predicates φ1, ..., φn on the program variables, which defines an abstract state space, we construct an abstract state graph, starting in the abstract initial state. The possible successors of a state are computed using the PVS theorem prover by verifying for each index i whether φi or ¬φi is a postcondition of it. This allows an abstract state space exploration for arbitrary programs. Keywords: abstract interpretation, state graph exploration, theorem proving
Hindsight Experience Replay | Dealing with sparse rewards is one of the biggest challenges in Reinforcement Learning (RL). We present a novel technique called Hindsight Experience Replay which allows sample-efficient learning from rewards which are sparse and binary and therefore avoid the need for complicated reward engineering. It can be combined with an arbitrary off-policy RL algorithm and may be seen as a form of implicit curriculum. We demonstrate our approach on the task of manipulating objects with a robotic arm. In particular, we run experiments on three different tasks: pushing, sliding, and pick-and-place, in each case using only binary rewards indicating whether or not the task is completed. Our ablation studies show that Hindsight Experience Replay is a crucial ingredient which makes training possible in these challenging environments. We show that our policies trained on a physics simulation can be deployed on a physical robot and successfully complete the task. The video presenting our experiments is available at https://goo.gl/SMrQnI. |
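The relabelling at the heart of the method is simple enough to sketch; below is a minimal version of the "final" goal-replacement strategy on top of a plain transition list. The goal-conditioned transition fields and the compute_reward interface are assumptions in the spirit of goal-based RL environments, not the paper's exact API.

```python
def her_augment(episode, compute_reward):
    """Hindsight Experience Replay relabelling (the simple 'final' strategy).

    episode        : list of dicts with keys
                     'obs', 'action', 'achieved_goal', 'desired_goal', 'next_obs'
    compute_reward : function(achieved_goal, goal) -> sparse/binary reward
    Returns the original transitions plus copies whose desired goal is
    replaced by the goal actually achieved at the end of the episode.
    """
    transitions = []
    final_goal = episode[-1]["achieved_goal"]
    for t in episode:
        # original transition with the original (usually unreached) goal
        transitions.append({**t, "reward": compute_reward(t["achieved_goal"],
                                                          t["desired_goal"])})
        # hindsight transition: pretend the finally achieved state was the goal
        transitions.append({**t, "desired_goal": final_goal,
                            "reward": compute_reward(t["achieved_goal"], final_goal)})
    return transitions

# toy 1-D reaching task: the goal counts as reached if within 0.1
def compute_reward(achieved, goal):
    return 0.0 if abs(achieved - goal) < 0.1 else -1.0

episode = [{"obs": x, "action": 0.2, "achieved_goal": x + 0.2,
            "desired_goal": 3.0, "next_obs": x + 0.2}
           for x in [0.0, 0.2, 0.4]]
buffer = her_augment(episode, compute_reward)
print(sum(tr["reward"] == 0.0 for tr in buffer))  # at least the final hindsight transition earns reward 0
```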
Machine Learning for Coreference Resolution: From Local Classification to Global Ranking | In this paper, we view coreference resolution as a problem of ranking candidate partitions generated by different coreference systems. We propose a set of partition-based features to learn a ranking model for distinguishing good and bad partitions. Our approach compares favorably to two state-of-the-art coreference systems when evaluated on three standard coreference data sets. |
External attribution of intentional notions to explain and predict agent behaviour |
Standard Compliant Hazard and Threat Analysis for the Automotive Domain | The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose to apply ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system. |
Team-Based Project Design of an Autonomous Robot | In this paper, we discuss the design and engineering of the C-P30, a custom robot design at Cal Poly State University, San Luis Obispo. This robot is designed by undergraduate computer engineering students at the university. This robot project is intended to be continually developed as students enter and leave the project. In addition, working on the project allows the students to fulfill a university senior design experience requirement. As this project is a continual work-in-progress, this paper outlines the current state of the hardware and software design. This paper covers the technical aspects of the robot design, as well as the educational objectives that are achieved. |
The Web as a Graph: Measurements, Models, and Methods | The pages and hyperlinks of the World Wide Web may be viewed as nodes and edges in a directed graph. This graph is a fascinating object of study: it has several hundred million nodes today, over a billion links, and appears to grow exponentially with time. There are many reasons, mathematical, sociological, and commercial, for studying the evolution of this graph. In this paper we begin by describing two algorithms that operate on the Web graph, addressing problems from Web search and automatic community discovery. We then report a number of measurements and properties of this graph that manifested themselves as we ran these algorithms on the Web. Finally, we observe that traditional random graph models do not explain these observations, and we propose a new family of random graph models. These models point to a rich new subfield of the study of random graphs, and raise questions about the analysis of graph algorithms on the Web.
The impact of intercultural factors on global software development | This paper examines the concept of culture and the potential impact of intercultural dynamics on software development. Many of the difficulties confronting today's global software development (GSD) environment have little to do with technical issues; rather, they are "human" issues that occur when extensive collaboration and communication among developers with distinct cultural backgrounds are required. Although project managers report that intercultural factors are impacting software practices and artifacts and deserve more detailed study, little analytical research has been conducted in this area other than anecdotal testimonials by software professionals. This paper presents an introductory analysis of the effect that intercultural factors have on global software development. The paper first establishes a framework for intercultural variations by introducing several models commonly used to define culture. Cross-cultural issues that often arise in software development are then identified. The paper continues by explaining the importance of taking intercultural issues seriously and proposes some ideas for future research in the area. |
A Common Mechanism Underlying Food Choice and Social Decisions | People make numerous decisions every day including perceptual decisions such as walking through a crowd, decisions over primary rewards such as what to eat, and social decisions that require balancing own and others' benefits. The unifying principles behind choices in various domains are, however, still not well understood. Mathematical models that describe choice behavior in specific contexts have provided important insights into the computations that may underlie decision making in the brain. However, a critical and largely unanswered question is whether these models generalize from one choice context to another. Here we show that a model adapted from the perceptual decision-making domain and estimated on choices over food rewards accurately predicts choices and reaction times in four independent sets of subjects making social decisions. The robustness of the model across domains provides behavioral evidence for a common decision-making process in perceptual, primary reward, and social decision making. |
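A hedged sketch of a drift-diffusion style accumulator, the general family of perceptual decision models the abstract refers to; the specific formulation and parameter values below are assumptions for illustration, not the paper's fitted model. Noisy evidence accumulates toward one of two bounds; the bound hit gives the choice and the number of steps gives the reaction time.

```python
# Illustrative drift-diffusion style simulation (parameters are assumptions).
import random

def simulate_ddm(drift, bound=1.0, noise=0.1, dt=0.01, max_steps=10000):
    """Return (choice, reaction_time) for one simulated decision."""
    evidence, t = 0.0, 0
    while abs(evidence) < bound and t < max_steps:
        evidence += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
        t += 1
    choice = 1 if evidence >= bound else 0
    return choice, t * dt

random.seed(0)
trials = [simulate_ddm(drift=0.5) for _ in range(1000)]
rate = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"choice-1 rate: {rate:.2f}, mean RT: {mean_rt:.2f}s")
```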
Damage Assessment from Social Media Imagery Data During Disasters | Rapid access to situation-sensitive data through social media networks creates new opportunities to address a number of real-world problems. Damage assessment during disasters is a core situational awareness task for many humanitarian organizations that traditionally takes weeks and months. In this work, we analyze images posted on social media platforms during natural disasters to determine the level of damage caused by the disasters. We employ state-of-the-art machine learning techniques to perform an extensive experimentation of damage assessment using images from four major natural disasters. We show that the domain-specific fine-tuning of deep Convolutional Neural Networks (CNN) outperforms other state-of-the-art techniques such as Bag-of-Visual-Words (BoVW). High classification accuracy under both event-specific and cross-event test settings demonstrates that the proposed approach can effectively adapt deep-CNN features to identify the severity of destruction from social media images taken after a disaster strikes. |
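An illustrative fine-tuning sketch, assuming torchvision's pretrained VGG16 as the backbone and a three-way damage-severity label set; both are assumptions for illustration rather than the paper's exact setup. The ImageNet feature extractor is frozen and only a replaced final layer is trained on the new labels.

```python
# Sketch of domain-specific fine-tuning (backbone and label set are assumed).
import torch
import torch.nn as nn
from torchvision import models

num_classes = 3  # e.g. severe / mild / no damage (assumed label set)

model = models.vgg16(weights="IMAGENET1K_V1")      # downloads ImageNet weights
for p in model.features.parameters():
    p.requires_grad = False                        # keep low-level features
model.classifier[6] = nn.Linear(4096, num_classes) # replace the final layer

optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One toy training step on random tensors standing in for disaster imagery.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, num_classes, (4,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```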
Training Very Deep Networks | Theoretical and empirical evidence indicates that the depth of neural networks is crucial for their success. However, training becomes more difficult as depth increases, and training of very deep networks remains an open problem. Here we introduce a new architecture designed to overcome this. Our so-called highway networks allow unimpeded information flow across many layers on information highways. They are inspired by Long Short-Term Memory recurrent networks and use adaptive gating units to regulate the information flow. Even with hundreds of layers, highway networks can be trained directly through simple gradient descent. This enables the study of extremely deep and efficient architectures. |
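The gating mechanism is concrete enough to sketch directly. The layer sizes and activation choice below are illustrative, but the carry/transform formulation y = H(x) * T(x) + x * (1 - T(x)) and the negative initialization of the transform-gate bias follow the paper.

```python
# Minimal highway layer: a transform gate T decides, per unit, how much of the
# transformed signal H(x) versus the raw input x to pass through.
import torch
import torch.nn as nn

class HighwayLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.H = nn.Linear(dim, dim)
        self.T = nn.Linear(dim, dim)
        # Bias the gate toward "carry" initially so information flows freely.
        nn.init.constant_(self.T.bias, -2.0)

    def forward(self, x):
        h = torch.relu(self.H(x))
        t = torch.sigmoid(self.T(x))
        return h * t + x * (1.0 - t)

x = torch.randn(8, 64)
layers = nn.Sequential(*[HighwayLayer(64) for _ in range(20)])
print(layers(x).shape)  # a deep stack remains trainable by plain gradient descent
```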
Visual Question Answering as a Meta Learning Task | The predominant approach to Visual Question Answering (VQA) demands that the model represents within its weights all of the information required to answer any question about any image. Learning this information from any real training set seems unlikely, and representing it in a reasonable number of weights doubly so. We propose instead to approach VQA as a meta learning task, thus separating the question answering method from the information required. At test time, the method is provided with a support set of example questions/answers, over which it reasons to resolve the given question. The support set is not fixed and can be extended without retraining, thereby expanding the capabilities of the model. To exploit this dynamically provided information, we adapt a state-of-the-art VQA model with two techniques from the recent meta learning literature, namely prototypical networks and meta networks. Experiments demonstrate the capability of the system to learn to produce completely novel answers (i.e. never seen during training) from examples provided at test time. In comparison to the existing state of the art, the proposed method produces qualitatively distinct results with higher recall of rare answers, and a better sample efficiency that allows training with little initial data. More importantly, it represents an important step towards vision-and-language methods that can learn and reason on-the-fly. |
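A hedged sketch of the prototypical-network component only; the embeddings below are random stand-ins for the model's joint image-question features, and the helper names are assumptions. Each candidate answer's prototype is the mean of its support-set embeddings, and a query is assigned to the nearest prototype, which is how new answers can be added without retraining.

```python
# Illustrative prototypical scoring over a dynamically provided support set.
import torch

def prototypes(support_embeddings, support_answers):
    """Mean embedding per answer label in the support set."""
    protos = {}
    for ans in set(support_answers):
        idx = [i for i, a in enumerate(support_answers) if a == ans]
        protos[ans] = support_embeddings[idx].mean(dim=0)
    return protos

def predict(query_embedding, protos):
    """Pick the answer whose prototype is closest in Euclidean distance."""
    return min(protos, key=lambda a: torch.dist(query_embedding, protos[a]).item())

support_emb = torch.randn(6, 32)                      # stand-in features
support_ans = ["red", "red", "blue", "blue", "two", "two"]
query_emb = torch.randn(32)
print(predict(query_emb, prototypes(support_emb, support_ans)))
```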
LVCSR rescoring with modified loss functions: a decision theoretic perspective | The problem of speech decoding is considered here in a Decision Theoretic framework and a modified speech decoding procedure to minimize the expected risk under a general loss function is formulated. A specific word error rate loss function is considered and an implementation in an N-best list rescoring procedure is presented. Methods for estimation of the parameters of the resulting decision rules are provided for both supervised and unsupervised training. Preliminary experiments on an LVCSR task show small but statistically significant error rate improvements. |
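A toy illustration of expected-risk (minimum Bayes risk) rescoring of an N-best list under a word edit-distance loss; the hypotheses, scores, and decision-rule details are assumptions for illustration, not the paper's estimation procedure.

```python
# Toy N-best rescoring: pick the hypothesis minimizing the posterior-weighted
# edit distance to the other hypotheses (an expected-loss criterion).
import math

def edit_distance(a, b):
    """Word-level Levenshtein distance."""
    dp = list(range(len(b) + 1))
    for i, wa in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, wb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (wa != wb))
    return dp[len(b)]

def mbr_rescore(nbest):
    """nbest: list of (hypothesis_words, log_score); return the MBR-best hypothesis."""
    total = sum(math.exp(s) for _, s in nbest)
    posteriors = [math.exp(s) / total for _, s in nbest]
    risks = [sum(p * edit_distance(h, h2)
                 for (h2, _), p in zip(nbest, posteriors))
             for h, _ in nbest]
    return nbest[min(range(len(nbest)), key=risks.__getitem__)][0]

nbest = [("the cat sat".split(), -1.0),
         ("the cat sat down".split(), -1.2),
         ("a cat sat".split(), -2.0)]
print(mbr_rescore(nbest))
```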
Solving the 5G Mobile Antenna Puzzle: Assessing Future Directions for the 5G Mobile Antenna Paradigm Shift | Advances in antenna technologies for cellular hand-held devices have been synchronous with the evolution of mobile phones over nearly 40 years. Having gone through four major wireless evolutions [1], [2], starting with the analog-based first generation to the current fourth-generation (4G) mobile broadband, technologies from manufacturers and their wireless network capacities today are advancing at unprecedented rates to meet our unrelenting service demands. These ever-growing demands, driven by exponential growth in wireless data usage around the globe [3], have gone hand in hand with major technological milestones achieved by the antenna design community. For instance, realizing the theory regarding the physical limitation of antennas [4]-[6] was paramount to the elimination of external antennas for mobile phones in the 1990s. This achievement triggered a variety of revolutionary mobile phone designs and the creation of new wireless services, establishing the current cycle of cellular advances and advances in mobile antenna technologies. |
A Scalable, Lexicon Based Technique for Sentiment Analysis | The rapid increase in the volume of sentiment-rich social media on the web has resulted in increased interest among researchers in sentiment analysis and opinion mining. However, with so much social media available on the web, sentiment analysis is now considered a big data task, and conventional sentiment analysis approaches fail to efficiently handle the vast amount of sentiment data available today. The main focus of this research was to find a technique that can efficiently perform sentiment analysis on big data sets: one that can categorize text as positive, negative, or neutral in a fast and accurate manner. In this research, sentiment analysis was performed on a large data set of tweets using Hadoop, and the performance of the technique was measured in terms of speed and accuracy. The experimental results show that the technique handles big sentiment data sets with very good efficiency. |
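A minimal, hedged sketch of lexicon-based polarity scoring of the kind described; the tiny lexicon and the thresholding are illustrative only. The paper runs such scoring at scale on Hadoop, where a function like this would serve as the map step over tweets.

```python
# Illustrative lexicon-based polarity scoring (lexicon is a toy example).
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def classify(tweet):
    """Label a tweet positive, negative, or neutral from lexicon hits."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = ["I love this phone, great battery",
          "terrible service, I hate waiting",
          "the package arrived on Tuesday"]
print([classify(t) for t in tweets])
```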