title | abstract
---|---|
Building real-world trajectory warehouses | The flow of data generated by low-cost modern sensing technologies and wireless telecommunication devices enables novel research fields related to the management of this new kind of data and the implementation of appropriate analytics for knowledge extraction. In this work, we investigate how the traditional data cube model can be adapted to trajectory warehouses in order to transform raw location data into valuable information. In particular, we focus our research on three issues that are critical to trajectory data warehousing: (a) the trajectory reconstruction procedure that takes place when loading a moving object database with sampled location data originating, e.g., from GPS recordings, (b) the ETL procedure that feeds a trajectory data warehouse, and (c) the aggregation of cube measures for OLAP purposes. We provide design solutions for all these issues and test their applicability and efficiency in real-world settings. |
Prediction of parking space availability in real time | Intelligent parking reservation (IPR) systems allow customers to select a parking facility according to their preferences, rapidly park their vehicle without searching for a free stall, and pay their reservation in advance, avoiding queues. Some IPR systems interact with in-vehicle navigation systems and provide users with real-time information such as capacity, parking fee, and current parking utilization. However, few of these systems provide information on forecast utilization at specific hours – a process that requires studying the competition between parking alternatives for market share. This paper proposes a methodology for predicting real-time parking space availability in IPR architectures. The methodology consists of three subroutines to allocate simulated parking requests, estimate future departures, and forecast parking availability. Parking requests are allocated iteratively using an aggregated approach as a function of simulated drivers' preferences and parking availability. This approach is based on a calibrated discrete choice model for selecting parking alternatives. A numerical comparison between a one-by-one simulation-based forecast and the proposed aggregated approach indicates that no significant discrepancies exist, validating and suggesting the use of the less time-consuming aggregated methodology. Results obtained from contrasting predictions with real data yielded small average availability errors. The forecast improves as the system registers arrivals and departures. Thus, the forecast is suitable for real-time distribution through different media such as the Internet, navigation systems, cell phones or GIS. |
Anomaly-based network intrusion detection: techniques, systems and challenges | The Internet and computer networks are exposed to an increasing number of security threats. With new types of attacks appearing continually, developing flexible and adaptive security-oriented approaches is a severe challenge. In this context, anomaly-based network intrusion detection techniques are a valuable technology for protecting target systems and networks against malicious activities. However, despite the variety of such methods described in the literature in recent years, security tools incorporating anomaly detection functionalities are just starting to appear, and several important problems remain to be solved. This paper begins with a review of the most well-known anomaly-based intrusion detection techniques. Then, available platforms, systems under development and research projects in the area are presented. Finally, we outline the main challenges to be dealt with for the wide-scale deployment of anomaly-based intrusion detectors, with special emphasis on assessment issues. |
REGISPRO, a Geohydrological Information System | Water resources management, nature conservation and environmental legislation in general have become very complex in recent years, particularly in industrialised countries. Environmental legislation has developed into a difficult and unpleasant task for many planners at national, regional and local governmental levels, because of the need to reconcile the interests of different users with the functions of certain elements of the hydrological cycle. Groundwater supply for industry and drinking water, nature conservation, recreation, agriculture, pollution control, etc. are examples of such rivalling users with often strongly conflicting interests. Planners, managers and legislators face the difficult task of finding realistic and economically feasible solutions that satisfy the needs and demands of all parties involved. Furthermore, owing to publicity about ecology and the environment, the general public has become aware of the ongoing deterioration of ecosystems and of groundwater quality and is urging policy makers to take firm action. |
The large-scale organization of metabolic networks | In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents. |
How physical text layout affects reading from screen | The primary objective of this paper is to critically evaluate empirical research on some variables relating to the configuration of text on screen to consolidate our current knowledge in these areas. The text layout variables are line length, columns, window size and interlinear spacing, with an emphasis on line length due to the larger number of studies related to this variable. Methodological issues arising from individual studies and from comparisons among studies are identified. A synthesis of results is offered which provides alternative interpretations of some findings and identifies the number of characters per line as the critical variable in looking at line length. Further studies are needed to explore the interactions between characters per line and eye movements, scrolling movements, reading patterns and familiarity with formats. |
The Heeger & Bergen Pyramid Based Texture Synthesis Algorithm | This contribution deals with the Heeger-Bergen pyramid-based texture analysis/synthesis algorithm. It provides a detailed explanation of the original algorithm, tested on many characteristic examples. Our analysis reproduces the original results and also brings a minor improvement concerning non-periodic textures. Inspired by visual perception theories, Heeger and Bergen proposed to characterize a texture by the first-order statistics of both its color and its responses to multiscale and multi-orientation filters, namely the steerable pyramid. The Heeger-Bergen algorithm consists of the following procedure: starting from a white noise image, histogram matchings are performed on the image alternately in the image domain and the steerable pyramid domain, so that the corresponding output histograms match those of the input texture. Source code: an on-line demo of the Heeger-Bergen pyramid-based texture synthesis algorithm is available. The demo allows users to upload a color image, extract a subimage, and run the texture synthesis algorithm on this subimage. The algorithm available in the demo is a slightly improved version treating non-periodic textures by a "periodic+smooth" decomposition [13]. The algorithm works with color textures and is able to synthesize textures larger than the input image. The original version of the Heeger-Bergen algorithm (where the boundaries are handled by mirror symmetrization) is optional in the source code. An ANSI C implementation is available for download. It is provided with an illustrated HTML documentation and source code. This code requires libpng, libfftw3, openmp, and getopt. Compilation and usage instructions are included in the README.txt file of the zip archive. The illustrated HTML documentation can be reproduced from the source code using doxygen (see the README.txt file of the zip archive for details). |
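The core operation of the Heeger-Bergen iteration, histogram matching, can be sketched as follows. This is a minimal illustration for equal-length 1-D arrays only; the actual algorithm applies the operation to images and steerable-pyramid subbands, and the function name is ours:

```python
import numpy as np

def histogram_matching(source, reference):
    # Exact histogram specification for equal-length arrays: each value in
    # `source` is replaced by the reference value of the same rank, so the
    # output has exactly the reference histogram while preserving the
    # ordering of the source values.
    out = np.empty_like(source)
    out[np.argsort(source)] = np.sort(reference)
    return out

noise = np.array([0.3, 0.9, 0.1, 0.5])        # stand-in for white noise
texture = np.array([10.0, 20.0, 30.0, 40.0])  # stand-in for texture values
matched = histogram_matching(noise, texture)
print(matched)  # [20. 40. 10. 30.]
```

The synthesis loop would alternate this matching in the image domain and in each pyramid subband until the histograms stabilize.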
Peer-to-Peer Markets | Peer-to-peer markets such as eBay, Uber, and Airbnb allow small suppliers to compete with traditional providers of goods or services. We view the primary function of these markets as making it easy for buyers to find sellers and engage in convenient, trustworthy transactions. We discuss elements of market design that make this possible, including search and matching algorithms, pricing, and reputation systems. We then develop a simple model of how these markets enable entry by small or flexible suppliers, and the resulting impact on existing firms. Finally, we consider the regulation of peer-to-peer markets, and the economic arguments for different approaches to licensing and certification, data and employment regulation. |
Illustrate It! An Arabic Multimedia Text-to-Picture m-Learning System | Multimedia learning is the process of building mental representations from words associated with images. Due to the intuitiveness and vividness of visual illustration, many text-to-picture systems have been proposed. However, we observe some common limitations in existing systems; for example, the retrieved pictures may not be suitable for educational purposes. Also, finding pedagogic illustrations still requires manual work, which is difficult and time-consuming. Commonly used systems based on best keyword selection and best sentence selection may suffer from loss of information. In this paper, we present an Arabic multimedia text-to-picture mobile learning system based on conceptual graph matching. Using a knowledge base, a conceptual graph is built from the text accompanying the pictures in the multimedia repository, as well as from the text entered by the user. Based on the matching scores of the two conceptual graphs, matched pictures are assigned relative rankings. The proposed system demonstrated its effectiveness in the domain of Arabic stories; however, it can easily be adapted to any educational domain to yield pedagogical illustrations for organizational or institutional needs. Comparisons with current state-of-the-art systems, based on best keyword selection and best sentence selection techniques, demonstrated significant improvements in performance. In addition, to facilitate educational needs, conceptual graph visualization and visual illustrative assessment modules are also developed. The conceptual graph visualization enables learners to discover relationships between words, and the visual illustrative assessment allows the system to automatically assess a learner's performance. Thorough user studies demonstrated the efficiency of the proposed multimedia learning system. |
A mobile robot for inspection of overhead transmission lines | A new mobile robot prototype for inspection of overhead transmission lines is proposed. The mobile platform is composed of three arms, each with a motorized rubber wheel at its end. On the two end arms, a gripper is designed to clamp firmly onto the conductors from below to secure the robot. Each arm has a motor providing two degrees of freedom, realized by moving along a curve. The robot can roll over some obstacles (compression splices, vibration dampers, etc.) and clear other types of obstacles (spacers, suspension clamps, etc.). |
Deep Residual Learning for Image Recognition | Deeper neural networks are more difficult to train. We present a residual learning framework to ease the training of networks that are substantially deeper than those used previously. We explicitly reformulate the layers as learning residual functions with reference to the layer inputs, instead of learning unreferenced functions. We provide comprehensive empirical evidence showing that these residual networks are easier to optimize, and can gain accuracy from considerably increased depth. On the ImageNet dataset we evaluate residual nets with a depth of up to 152 layers - 8× deeper than VGG nets [40] but still having lower complexity. An ensemble of these residual nets achieves 3.57% error on the ImageNet test set. This result won the 1st place on the ILSVRC 2015 classification task. We also present analysis on CIFAR-10 with 100 and 1000 layers. The depth of representations is of central importance for many visual recognition tasks. Solely due to our extremely deep representations, we obtain a 28% relative improvement on the COCO object detection dataset. Deep residual nets are foundations of our submissions to ILSVRC & COCO 2015 competitions1, where we also won the 1st places on the tasks of ImageNet detection, ImageNet localization, COCO detection, and COCO segmentation. |
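The residual reformulation described above can be illustrated with a toy numpy sketch (not the paper's actual convolutional layers; `residual_block` and its weight names are ours). The key property is that with zero residual weights the block reduces to the identity mapping, which is what makes very deep stacks easy to optimize:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def residual_block(x, w1, w2):
    # Residual function F(x) = W2 @ relu(W1 @ x); the block outputs
    # relu(F(x) + x), i.e. the residual plus an identity shortcut.
    return relu(w2 @ relu(w1 @ x) + x)

x = np.ones(4)
w_zero = np.zeros((4, 4))
# With all-zero weights, F(x) = 0 and the block is the identity mapping.
print(np.allclose(residual_block(x, w_zero, w_zero), x))  # True
```

The shortcut means each layer only needs to learn a perturbation of the identity rather than a full unreferenced transformation.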
Anti-nociceptive and anti-inflammatory properties of the leaf extracts of Hedranthera barteri in rats and mice | ed by: African Index Medicus (WHO), CAB Abstracts, Global Health Abstracts, Asian Science Index, Index Veterinarius Full-text available at http://www.ajbrui.com & http://www.bioline.br/md Received: January, 2006 Accepted (Revised): March, 2006 Published May, 2006 Full Length Research Article Anti-nociceptive and anti-inflammatory properties of the leaf extracts of Hedranthera barteri in rats and mice *Onasanwo SA and Elegbe RA Department of Physiology, College of Medicine, University of Ibadan. Ibadan, Nigeria. ABSTRACT Hedranthera barteri, HB (Apocynaceae) is a shrub in the closed-forest in some parts of West Africa and used among the natives for inflammatory pain relief. This study was carried out to assess the anti-nociceptive and anti-inflammatory effects of its leaf extracts to confirm folkloric claims. Phytochemical screening and acute toxicity were carried out on the leaf of the plant. Aqueous (AEHB), methanol (MEHB) and chloroform (CEHB) extracts of the leaf were assessed for anti-nociceptive and antiinflammatory properties. The probable mechanism of action of the extracts in analgesia was assessed using naloxone. Student’s t-test was used to test for statistical significance.Phytochemistry of the extracts revealed the presence of alkaloids, cardenolides, saponins and flavonoids. The rats tolerated thermal pain significantly more (P<0.05) with the extracts than the control. The inhibitory rates of the extracts on acetic acid-induced writhing, formalin-induced paw licking (late and early phase) and carrageenan-induced paw oedema when compared with the control were significant. Graded doses of MEHB tolerated thermal pain more significantly (P<0.05), compared with the control. 
Likewise, the inhibition produced by the graded doses of MEHB on acetic acid-induced writhing, formalin-induced paw licking (early and late phases) and carrageenaninduced paw oedema were significant compared with the control (P<0.05). Pre-treatment with naloxone partially prevented the analgesia induced by MEHB in thermally and chemically induced pains. Hedranthera barteri reduced nociception and inflammation in dose-dependent manner. Interactions with naloxone depicted its partial mediation through opioid receptors. (Afr. J. Biomed. Res. 9:109 118, May 2006)Hedranthera barteri, HB (Apocynaceae) is a shrub in the closed-forest in some parts of West Africa and used among the natives for inflammatory pain relief. This study was carried out to assess the anti-nociceptive and anti-inflammatory effects of its leaf extracts to confirm folkloric claims. Phytochemical screening and acute toxicity were carried out on the leaf of the plant. Aqueous (AEHB), methanol (MEHB) and chloroform (CEHB) extracts of the leaf were assessed for anti-nociceptive and antiinflammatory properties. The probable mechanism of action of the extracts in analgesia was assessed using naloxone. Student’s t-test was used to test for statistical significance.Phytochemistry of the extracts revealed the presence of alkaloids, cardenolides, saponins and flavonoids. The rats tolerated thermal pain significantly more (P<0.05) with the extracts than the control. The inhibitory rates of the extracts on acetic acid-induced writhing, formalin-induced paw licking (late and early phase) and carrageenan-induced paw oedema when compared with the control were significant. Graded doses of MEHB tolerated thermal pain more significantly (P<0.05), compared with the control. Likewise, the inhibition produced by the graded doses of MEHB on acetic acid-induced writhing, formalin-induced paw licking (early and late phases) and carrageenaninduced paw oedema were significant compared with the control (P<0.05). 
Pre-treatment with naloxone partially prevented the analgesia induced by MEHB in thermally and chemically induced pains. Hedranthera barteri reduced nociception and inflammation in dose-dependent manner. Interactions with naloxone depicted its partial mediation through opioid receptors. (Afr. J. Biomed. Res. 9:109 118, May 2006) |
Controlled trial of anti-tuberculous chemotherapy for two years in Crohn's disease. | One hundred and thirty patients with active symptoms of Crohn's disease were treated in a double blind randomised controlled trial with rifampicin, isoniazid, and ethambutol, or identical placebos, for up to two years. All other treatment considered necessary was continued. Analyses were based on 126 patients, 63 in each treatment group. Thirty seven in the active and 30 in the placebo group had had previous surgical procedures. There was no difference in concomitant treatment between the two groups. Thirty in the active and 46 in the placebo group were taking corticosteroids at entry to the trial. Forty eight of 63 patients in the active and 49 of 63 in the placebo group completed at least 12 months' therapy. Reasons for early withdrawal included pregnancy, adverse reaction, and failure to comply. There was no significant difference in the mean number of months completed between the two groups. Nineteen adverse reactions were recorded for 17 patients in the active group compared with three reactions in patients on placebo. All nine patients withdrawn early because of adverse reactions were in the active group. Fifteen patients on active treatment and 14 on placebo had surgery during the trial, with no difference in the type of surgery required between the groups. Radiological assessments based on 98 patients at the end of the trial showed no significant differences between groups in changes in the extent of disease. More patients developed strictures on placebo compared with active treatment, but without a statistically significant difference. No differences were found between groups for the total prednisolone dose or the number of days on which the prednisolone dose was 10 mg or above. Serial measurements of body weight and Crohn's disease activity index (CDAI), together with blood values for albumin, haemoglobin, white cell count, and platelets, showed no consistent differences between groups. There were occasional significant differences for some of these values between groups, which were not sustained. The trial provides little evidence of tangible benefit from the trial treatment. |
Pushover Analysis of Reinforced Concrete Buildings Using Full Jacket Techniques: A Case Study on an Existing Old Building in Madinah | Retrofitting existing buildings to resist seismic loads is very important to avoid loss of life or financial disaster. The aim of retrofitting is to increase the total strength of the structure by increasing its stiffness or ductility ratio. In addition, the response modification factors (R) have to satisfy the code requirements for the suggested retrofitting types. In this study, two types of jackets are used: full reinforced concrete jackets and surrounding steel plate jackets. The study is carried out on an existing building in Madinah by performing static pushover analysis before and after retrofitting the columns. The selected model building is representative of typical deficient structures built about 30 years ago in Madinah City, KSA. The comparison of the results indicates a good enhancement of the structure's response to the applied seismic forces. Also, the response modification factor of the RC building is evaluated for the studied cases before and after retrofitting. The design of all vertical elements (columns) is given. The results show that the design of the retrofitted columns satisfies the code's design stress requirements. However, for some retrofitting types, the ductility requirements represented by the response modification factor do not satisfy the KSA design code (SBC301). Keywords—Concrete jackets, steel jackets, RC buildings, pushover analysis, non-linear analysis. |
Autoparallel variational description of the free relativistic top third order dynamics | A second order variational description of the autoparallel curves of some differential-geometric connection for the third order Mathisson's 'new mechanics' of a relativistic free spinning particle is suggested starting from general requirements of invariance and 'variationality'. |
Social Control as an Element of Social Management (Социальный контроль как элемент социального управления) | The article concerns the specificity of social control as an element of social management. The author indicates that social control is carried out within subject-object relations. The author gives a detailed description of the participants in social relations in social management and control. Particular attention is paid to the analysis of the essential features acquired by participants in social relations during interaction. |
Dynamic Few-Shot Visual Learning Without Forgetting | The human visual system has the remarkable ability to effortlessly learn novel concepts from only a few examples. Mimicking the same behavior in machine learning vision systems is an interesting and very challenging research problem with many practical advantages for real-world vision applications. In this context, the goal of our work is to devise a few-shot visual learning system that, at test time, can efficiently learn novel categories from only a few training examples while at the same time not forgetting the initial categories on which it was trained (here called base categories). To achieve that goal we propose (a) to extend an object recognition system with an attention-based few-shot classification weight generator, and (b) to redesign the classifier of a ConvNet model as the cosine similarity function between feature representations and classification weight vectors. The latter, apart from unifying the recognition of both novel and base categories, also leads to feature representations that generalize better on "unseen" categories. We extensively evaluate our approach on Mini-ImageNet, where we improve the prior state-of-the-art on few-shot recognition (i.e., we achieve 56.20% and 73.00% on the 1-shot and 5-shot settings respectively) while at the same time not sacrificing any accuracy on the base categories, a characteristic that most prior approaches lack. Finally, we apply our approach to the recently introduced few-shot benchmark of Bharath and Girshick [4], where we also achieve state-of-the-art results. |
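The cosine-similarity classifier of point (b) can be sketched in a few lines of numpy. This is a simplified illustration, not the paper's exact formulation; the scale factor and function name are assumptions:

```python
import numpy as np

def cosine_classifier(features, weights, scale=10.0):
    # Score each feature vector against each class weight vector by cosine
    # similarity: L2-normalize both, take dot products, and apply a scale
    # factor so the scores are suitable for a softmax.
    f = features / np.linalg.norm(features, axis=-1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=-1, keepdims=True)
    return scale * f @ w.T

feats = np.array([[3.0, 0.0], [0.0, 5.0]])  # two example feature vectors
ws = np.array([[1.0, 0.0], [0.0, 2.0]])     # two class weight vectors
print(cosine_classifier(feats, ws).argmax(axis=1))  # [0 1]
```

Because the magnitudes of both features and weights are normalized away, weight vectors generated from only a few examples can be compared on an equal footing with the learned base-category weights.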
Heart rate monitoring from wrist-type photoplethysmographic (PPG) signals during intensive physical exercise | Heart rate monitoring from wrist-type photoplethysmographic (PPG) signals during subjects' intensive exercise is a difficult problem, since the PPG signals are contaminated by extremely strong motion artifacts caused by subjects' hand movements. In this work, we formulate the heart rate estimation problem as a sparse signal recovery problem, and use a sparse signal recovery algorithm to calculate high-resolution power spectra of PPG signals, from which heart rates are estimated by selecting corresponding spectrum peaks. To facilitate the use of sparse signal recovery, we propose using bandpass filtering, singular spectrum analysis, and temporal difference operation to partially remove motion artifacts and sparsify PPG spectra. The proposed method was tested on PPG recordings from 10 subjects who were fast running at the peak speed of 15km/hour. The results showed that the averaged absolute estimation error was only 2.56 Beats/Minute, or 1.94% error compared to ground-truth heart rates from simultaneously recorded ECG. |
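A heavily simplified sketch of the spectral-peak heart-rate estimation step is shown below. It uses a plain windowed FFT instead of the paper's sparse signal recovery, and omits the motion-artifact removal stages; the sampling rate and function name are assumptions:

```python
import numpy as np

def estimate_hr(ppg, fs):
    # Estimate heart rate (BPM) from the dominant spectral peak of a PPG
    # window, searching only the physiologically plausible 0.5-4 Hz band
    # (30-240 BPM). The paper replaces this FFT with sparse recovery.
    spec = np.abs(np.fft.rfft(ppg * np.hanning(len(ppg))))
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fs)
    band = (freqs >= 0.5) & (freqs <= 4.0)
    peak = freqs[band][np.argmax(spec[band])]
    return 60.0 * peak

fs = 125.0                                # assumed PPG sampling rate (Hz)
t = np.arange(0, 8, 1 / fs)               # 8-second analysis window
ppg = np.sin(2 * np.pi * 2.0 * t)         # synthetic 2 Hz pulse = 120 BPM
print(round(estimate_hr(ppg, fs)))        # 120
```

On real wrist PPG, the artifact-removal steps (bandpass filtering, singular spectrum analysis, temporal differencing) would run before this spectral peak selection.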
Clinical efficacy and safety of arbekacin for high-risk infections in patients with hematological malignancies. | We performed a clinical trial to investigate the efficacy and safety of arbekacin (ABK), a unique aminoglycoside with activity against methicillin-resistant Staphylococcus aureus (MRSA), in patients with hematological malignancies complicated by high-risk infections. ABK was administered intravenously at a dose of approximately 5 mg/kg with various broad-spectrum β-lactams, followed by therapeutic drug monitoring (TDM). A total of 54 febrile or infectious episodes were registered, and TDM was performed in 44 (81%) cases. The absolute neutrophil count was below 500/μl in 49 (91%) cases, and cytotoxic chemotherapy was being administered in 47 (87%) cases. Before initiation of ABK, 52 (96%) patients had received fluoroquinolones (n = 37) and/or broad-spectrum β-lactams (n = 34). There were 10 cases of documented infections including one of MRSA pneumonia, and 44 cases of febrile neutropenia. The efficacy at the end of treatment was 80% for all patients, and efficacy was significantly higher in patients attaining maximum concentrations ≥ 16 µg/ml or receiving TDM-guided dose-adjustment of ABK (n = 19, 95 vs. 71%, P = 0.039). Renal toxicity was observed in six cases (11%) but was generally acceptable. This study demonstrated that TDM-guided ABK administration may be applicable under limited conditions for patients with hematological malignancies. |
Effective Large-Scale Online Influence Maximization | In this paper, we study a highly generic version of influence maximization (IM), one of optimizing influence campaigns by sequentially selecting "spread seeds" from a set of candidates, a small subset of the node population, under the hypothesis that, in a given campaign, previously activated nodes remain "persistently" active throughout and thus do not yield further rewards. We call this problem online influence maximization with persistence. We introduce an estimator on the candidates' missing mass – the expected number of nodes that can still be reached from a given seed candidate – and justify its strength to rapidly estimate the desired value. We then describe a novel algorithm, GT-UCB, relying on upper confidence bounds on the missing mass. We show that our approach leads to high-quality spreads on classic IM datasets, even though it makes almost no assumptions on the diffusion medium. Importantly, it is orders of magnitude faster than state-of-the-art IM methods. |
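The "missing mass" idea can be illustrated with the classic Good-Turing estimator, which is the standard tool for this quantity (this is our assumed reading of the estimator, hedged; the paper's exact construction may differ):

```python
from collections import Counter

def good_turing_missing_mass(activations):
    # Good-Turing estimate of the missing mass: the fraction of observed
    # samples that are singletons (seen exactly once). Intuitively, many
    # singletons suggest much of the distribution is still unseen, so the
    # seed candidate likely has more reachable nodes left.
    counts = Counter(activations)
    singletons = sum(1 for c in counts.values() if c == 1)
    return singletons / len(activations)

# Nodes activated across past spreads initiated from one seed candidate:
spreads = ["a", "a", "b", "c", "c", "d"]
print(good_turing_missing_mass(spreads))  # 2 singletons out of 6 samples
```

A UCB-style algorithm such as GT-UCB would add a confidence bonus to this estimate and repeatedly pick the seed with the largest optimistic value.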
A Unified Feature Selection Framework for Graph Embedding on High Dimensional Data | Although graph embedding has been a powerful tool for modeling data intrinsic structures, simply employing all features for data structure discovery may result in noise amplification. This is particularly severe for high dimensional data with small samples. To meet this challenge, this paper proposes a novel efficient framework to perform feature selection for graph embedding, in which a category of graph embedding methods is cast as a least squares regression problem. In this framework, a binary feature selector is introduced to naturally handle the feature cardinality in the least squares formulation. The resultant integer programming problem is then relaxed into a convex Quadratically Constrained Quadratic Program (QCQP) learning problem, which can be efficiently solved via a sequence of accelerated proximal gradient (APG) methods. Since each APG optimization is w.r.t. only a subset of features, the proposed method is fast and memory efficient. The proposed framework is applied to several graph embedding learning problems, including supervised, unsupervised, and semi-supervised graph embedding. Experimental results on several high-dimensional datasets demonstrate that the proposed method outperforms the considered state-of-the-art methods. |
A Probabilistic Programming Approach for Outlier Detection in Healthcare Claims | Healthcare is an integral component in people's lives, especially for the rising elderly population. Medicare is one such healthcare program that provides for the needs of the elderly. It is imperative that these healthcare programs are affordable, but this is not always the case. Out of the many possible factors for the rising cost of healthcare, claims fraud is a major contributor, but its impact can be lessened through effective fraud detection. We propose a general outlier detection model, based on Bayesian inference, using probabilistic programming. Our model provides probability distributions rather than just point values, as with most common outlier detection methods. Credible intervals are also generated to further enhance confidence that the detected outliers should in fact be considered outliers. Two case studies are presented demonstrating our model's effectiveness in detecting outliers. The first case study uses temperature data in order to provide a clear comparison of several outlier detection techniques. The second case study uses a Medicare dataset to showcase our proposed outlier detection model. Our results show that the successful detection of outliers, which indicate possible fraudulent activities, can provide effective and meaningful results for further investigation within medical specialties or by using real-world, medical provider fraud investigation cases. |
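The interval-based flagging idea can be sketched as follows. This toy uses bootstrap resampling as a stand-in for posterior sampling (the paper uses a probabilistic program producing genuine credible intervals); the data and names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy claim amounts; the last value is suspicious.
data = np.array([10.2, 9.8, 10.1, 9.9, 10.0, 25.0])

# Approximate a distribution over the typical claim mean by resampling the
# non-test observations (a crude surrogate for Bayesian posterior draws).
draws = np.array([rng.choice(data[:-1], size=5).mean() for _ in range(2000)])
lo, hi = np.percentile(draws, [2.5, 97.5])  # 95% interval for the mean

# Flag the candidate if it falls outside the interval.
print(data[-1] < lo or data[-1] > hi)  # True: flagged as an outlier
```

Working with a full distribution rather than a point threshold is what lets the approach attach a confidence statement to each flagged claim.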
Ranging RFID Tags With Ultrasound | Indoor localization and tracking of persons and assets with centimeter-level accuracy for inventory, security, medical monitoring, and training, as well as gesture interfaces for domotics, is highly desirable in the framework of the emerging IoT paradigm. Low cost, tiny, battery, or batteryless operated position sensors are required. 3-D localization can be computed by combining three or more distance measurements between sensor and reference points. Our aim is to give the capability of measuring the distance from a reference point to radio frequency identification (RFID) tags. The main challenge is in the estimation of the distances with millimeter accuracy in presence of both size and power supply strict constraints, and thus with very limited computational power. An accurate ranging technique using cross-correlation and small RFID-based sensors is proposed. Its originality resides in moving the main computational efforts from the sensor to an external processing unit with sufficient computational and supply power, thus overcoming the sensor limits. The system is composed of a beacon that emits ultrasound chirps and RF sync signals, a RFID-based distance sensor, a commercial RFID reader, and a processing unit. Main advantages are the high miniaturization and low power consumption of the remote sensor, and its compatibility with existing RFID standards. |
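The cross-correlation ranging step can be sketched as below. This is a simplified model (ideal channel, no noise, assumed sampling rate and chirp parameters), showing only how a time-of-flight delay is recovered by correlating the known chirp against the received signal:

```python
import numpy as np

def tof_distance(chirp, received, fs, c=343.0):
    # Cross-correlate the known chirp template with the received signal;
    # the correlation peak gives the propagation delay in samples, which
    # is converted to meters using the speed of sound c (m/s).
    corr = np.correlate(received, chirp, mode="full")
    delay = (np.argmax(corr) - (len(chirp) - 1)) / fs
    return delay * c

fs = 40000.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 0.005, 1 / fs)                # 5 ms chirp
chirp = np.sin(2 * np.pi * (8000 + 4e5 * t) * t)  # simple linear chirp
received = np.concatenate([np.zeros(400), chirp])  # 400-sample delay
print(round(tof_distance(chirp, received, fs), 2))  # 3.43 (meters)
```

In the actual system the RF sync signal marks the emission instant, and the heavy correlation computation is offloaded from the tag to the external processing unit.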
Compressing LSTMs into CNNs | We show that a deep convolutional network with an architecture inspired by the models used in image recognition can yield accuracy similar to a long short-term memory (LSTM) network, which achieves state-of-the-art performance on the standard Switchboard automatic speech recognition task. Moreover, we demonstrate that merging the knowledge in the CNN and LSTM models via model compression further improves the accuracy of the convolutional model. |
iKnow how you walk — A smartphone based personalized gait diagnosing system | Humans, due to aging and hormonal changes, are prone to pain in their limbs. As a result, the fundamental human activity of limb movement, known as gait, is affected. By exerting unequal weight on both limbs in order to avoid pain in one leg, humans slowly develop an abnormal gait pattern consisting of limping and a sideways bend in posture. This often goes unnoticed for a long time. We propose a system using smartphones that can sense and detect abnormal walking patterns. The sensing of limb movement is performed by the accelerometer embedded in a smartphone, and detection of abnormal walk patterns is performed by classifying features such as stride length and walk velocity. Using naive Bayes and decision tree classifiers, we obtained close to 89% accuracy in classifying different levels of abnormalities. Further validation was done by implementing a decision-tree-based gait variation detector smartphone application across five users, which resulted in 90% accuracy. |
Optimization of EEG frequency bands for improved diagnosis of Alzheimer disease | Many clinical studies have shown that electroencephalograms (EEG) of Alzheimer's disease (AD) patients often have an abnormal power spectrum. In this paper a frequency band analysis of AD EEG signals is presented, with the aim of improving the diagnosis of AD from EEG signals. Relative power in different EEG frequency bands is used as a feature to distinguish between AD patients and healthy control subjects. Many different frequency bands between 4 and 30 Hz are systematically tested, besides the traditional frequency bands, e.g., the theta band (4–8 Hz). The discriminative power of the resulting spectral features is assessed through statistical tests (Mann-Whitney U test). Moreover, linear discriminant analysis is conducted with those spectral features. The optimized frequency ranges (4–7 Hz, 8–15 Hz, 19–24 Hz) yield substantially better classification performance than the traditional frequency bands (4–8 Hz, 8–12 Hz, 12–30 Hz); the frequency band 4–7 Hz is the optimal range for detecting AD, and is similar to the classical theta band. The frequency bands were also optimized as features through leave-one-out cross-validation, resulting in error-free classification. The optimized frequency bands may improve existing EEG-based diagnostic tools for AD. Additional testing on larger AD datasets is required to verify the effectiveness of the proposed approach. |
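A sketch of how relative band-power features of this kind might be computed, assuming Welch's method for the power spectral density; the synthetic 6 Hz signal and the 128 Hz sampling rate are illustrative stand-ins for a real EEG channel:

```python
import numpy as np
from scipy.signal import welch

fs = 128                                     # assumed EEG sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
# Stand-in "EEG": a theta-range (6 Hz) oscillation plus broadband noise.
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)

f, psd = welch(eeg, fs=fs, nperseg=4 * fs)   # PSD via Welch's method

def relative_power(band, total_band=(4, 30)):
    """Band power as a fraction of total power in the 4-30 Hz range."""
    in_total = (f >= total_band[0]) & (f <= total_band[1])
    in_band = (f >= band[0]) & (f <= band[1])
    return psd[in_band].sum() / psd[in_total].sum()

# Features for the optimized bands reported in the abstract.
features = {b: relative_power(b) for b in [(4, 7), (8, 15), (19, 24)]}
```

Each subject contributes one such feature vector, which can then feed the statistical tests or linear discriminant analysis described above.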
Multi-Source Domain Adaptation with Mixture of Experts | We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and the different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on the various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer. |
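The point-to-set weighting idea can be illustrated with a hand-crafted metric (in the paper the metric is learned by meta-training); the toy domains, the nearest-neighbor distance, the temperature, and the scalar "experts" below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Three toy source domains: 2-D clouds centered at (0,0), (4,4), (8,8).
source_data = [rng.normal(c, 1.0, size=(50, 2)) for c in (0.0, 4.0, 8.0)]

# Toy per-domain "experts": each just outputs a fixed scalar prediction.
expert_preds = np.array([0.0, 1.0, 2.0])

def predict(x, temperature=1.0):
    # Point-to-set metric: distance from x to the nearest example of each
    # source domain (a stand-in for the learned metric).
    d = np.array([np.linalg.norm(s - x, axis=1).min() for s in source_data])
    # Softmax-style weights: nearby domains get large weight, distant
    # domains (potential negative transfer) are down-weighted.
    w = np.exp(-d / temperature)
    w /= w.sum()
    return w @ expert_preds, w

# A target example near the second domain's cloud.
y, w = predict(np.array([4.1, 3.8]))
```

The combined prediction is dominated by the expert whose source domain is closest to the target example, which is the mechanism the abstract credits for robustness to negative transfer.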
Topology Preserving Parallel Smoothing for 3D Binary Images | This paper presents a new algorithm for smoothing 3D binary images in a topology preserving way. Our algorithm is a reduction operator: some border points that are considered extremities are removed. The proposed method is composed of two parallel reduction operators. We apply our smoothing algorithm as an iteration-by-iteration pruning step to reduce the noise sensitivity of 3D parallel surface-thinning algorithms. An efficient implementation of our algorithm is sketched and its topological correctness for (26,6) pictures is proved. |
ScreenerNet: Learning Curriculum for Neural Networks | We propose to learn a curriculum, or syllabus, for supervised learning with deep neural networks. Specifically, we learn a weight for each training sample via a neural network, called ScreenerNet, attached to the original network, and jointly train both in an end-to-end fashion. We show that networks augmented with our ScreenerNet achieve early convergence with better accuracy than state-of-the-art rule-based curriculum learning methods in extensive experiments using three popular vision datasets, MNIST, CIFAR10 and Pascal VOC2012, and a Cartpole task using deep Q-learning. |
A Hybrid Word-Character Model for Abstractive Summarization | Automatic abstractive text summarization is an important and challenging research topic in natural language processing. Among many widely used languages, the Chinese language has a special property: a Chinese character carries rich information comparable to a word. Existing Chinese text summarization methods, which adopt either purely character-based or purely word-based representations, fail to fully exploit the information carried by both. To accurately capture the essence of articles, we propose a hybrid word-character approach (HWC) which preserves the advantages of both word-based and character-based representations. We evaluate the advantage of the proposed HWC approach by applying it to two existing methods, and find that it achieves state-of-the-art performance with a margin of 24 ROUGE points on the widely used LCSTS dataset. In addition, we find an issue in the LCSTS dataset and offer a script to remove overlapping pairs (a summary and a short text) to create a clean dataset for the community. The proposed HWC approach also achieves the best performance on the new, clean LCSTS dataset. |
Design and Realization of Multimedia-Examinations for large Numbers of Participants in University | We report on successfully accomplished multimedia examinations in a physics lecture for engineering freshmen. The use of new media enables us to create new types of questions, including Java-based applets and questions requiring Internet-based research. These new forms are presented in detail. Economically priced hardware solutions and user-friendly software for both teachers and students were realised in collaboration with PROMETHEAN CORPORATION. A first evaluation, which is very encouraging, is presented; our e-assessment finds general approval in the participants' opinions. |
Image Critique and the Fall of the Berlin Wall | Although we are now accustomed to watching history unfold live on the air, the fall of the Berlin Wall was one of the first instances when history was produced on television. Inspired by the Wall and its powerful resonances, Sunil Manghani’s breakthrough study presents the new critical concept of “image critique,” a method of critiquing images while simultaneously using them as a means to engage with contemporary culture. Manghani examines current debates surrounding visual culture, ranging from such topics as Francis Fukuyama’s end of history thesis to metapictures and East German film. The resulting volume is an exhilarating interweaving of history, politics, and visual culture |
Peripheral artery tonometry demonstrates altered endothelial function in children with type 1 diabetes. | OBJECTIVES
To assess the ability of reactive hyperemia-peripheral artery tonometry (RH-PAT) to serve as a surrogate marker of endothelial dysfunction in children with type 1 diabetes (T1D).
RESEARCH DESIGN AND METHODS
Forty-four children with T1D [age 14.6 +/- 2.7 yr; duration of diabetes 6.01 +/- 4 yr; range of diabetes duration 1-16 yr; and hemoglobin A1c (HbA1c) 8.34 +/- 1.2%] and 20 children without diabetes (age 14.1 +/- 1.5 yr) underwent RH-PAT endothelial function testing after an overnight fast. Height, weight, body mass index (BMI), blood pressure (BP), fasting lipid profile, and glucose level were determined in each child. Children with T1D underwent a second RH-PAT study 4 wk after their initial study to determine the intrapatient variability of the technique.
RESULTS
Children with T1D had endothelial dysfunction as evidenced by lower mean RH-PAT scores (1.63 +/- 0.5) when compared with children without diabetes (mean RH-PAT score 1.95 +/- 0.3) (p = 0.01). Repeat RH-PAT scores were predicted by initial RH-PAT scores (p = 0.0025). Mean intrapatient standard deviation of RH-PAT score was 0.261 and mean coefficient of variation was 14.8. Variations in RH-PAT score were not explained by differences in glucose, HbA1c, BMI, systolic BP, diastolic BP, or lipids.
CONCLUSIONS
Although larger validation studies are required, RH-PAT is a promising non-invasive technique to assess endothelial function in children with T1D. Non-invasive measures of endothelial dysfunction may provide the additional risk stratification data needed to justify more aggressive primary prevention of cardiovascular disease in children with T1D. |
Learning to Explain: An Information-Theoretic Perspective on Model Interpretation | We introduce instancewise feature selection as a methodology for model interpretation. Our method is based on learning a function to extract a subset of features that are most informative for each given example. This feature selector is trained to maximize the mutual information between selected features and the response variable, where the conditional distribution of the response variable given the input is the model to be explained. We develop an efficient variational approximation to the mutual information, and show the effectiveness of our method on a variety of synthetic and real data sets using both quantitative metrics and human evaluation. |
Decentralized Edge Clouds | Cloud computing services are traditionally deployed on centralized computing infrastructures confined to a few data centers, while cloud applications run in a single data center. However, the cloud's centralized nature can be limiting in terms of performance and cost for applications where users, data, and computation are distributed. The authors present an overview of distributed clouds that might be better suited for such applications. They briefly describe the distributed cloud landscape and introduce Nebula, a highly decentralized cloud that uses volunteer edge resources. The authors provide insights into some of its key properties and design issues, and describe a distributed MapReduce application scenario to illustrate the benefits and trade-offs of using distributed and decentralized clouds for distributed data-intensive computing applications. |
Design of a sensorless commutation IC for BLDC motors | This paper presents the design and realization of a sensorless commutation integrated circuit (IC) for brushless dc motors (BLDCMs) using mixed-mode IC design methodology. The developed IC can generate accurate commutation signals for BLDCMs by using a modified back-EMF sensing scheme instead of Hall-effect sensors. The IC can also be easily interfaced with a microcontroller or a digital signal processor (DSP) to complete the closed-loop control of a BLDCM. The developed sensorless commutation IC consists of an analog back-EMF processing circuit and a programmable digital commutation control circuit. Since commutation control is critical for BLDCM control, the proposed sensorless commutation IC provides a phase compensation circuit to compensate for phase errors due to low-pass filtering, noise, and nonideal effects of the back-EMFs. By using mixed-mode IC design methodology, this IC solution requires fewer analog compensation circuits compared with other commercially available motor control ICs. Therefore, high maintainability and flexibility can both be achieved. The proposed sensorless commutation IC is integrated in a standard 0.35-μm single-poly four-metal CMOS process, and the realization technique of this mixed-mode IC is given. The proposed control scheme and the developed realization techniques provide illustrative engineering procedures for a system-on-a-chip solution for advanced digital motor control. Simulation and experimental results verify the proposed control scheme. |
Spectrum- and Energy-Efficient OFDM Based on Simultaneous Multi-Channel Reconstruction | Time domain synchronous OFDM (TDS-OFDM) has a higher spectrum and energy efficiency than standard cyclic prefix OFDM (CP-OFDM) by replacing the unknown CP with a known pseudorandom noise (PN) sequence. However, due to mutual interference between the PN sequence and the OFDM data block, TDS-OFDM cannot support high-order modulation schemes such as 256QAM in realistic static channels with large delay spread or high-definition television (HDTV) delivery in fast fading channels. To solve these problems, we propose the idea of using multiple inter-block-interference (IBI)-free regions of small size to realize simultaneous multi-channel reconstruction under the framework of structured compressive sensing (SCS). This is enabled by jointly exploiting the sparsity of wireless channels as well as the characteristic that path delays vary much slower than path gains. In this way, the mutually conditional time-domain channel estimation and frequency-domain data demodulation in TDS-OFDM can be decoupled without the use of iterative interference removal. The Cramér-Rao lower bound (CRLB) of the proposed estimation scheme is also derived. Moreover, the guard interval amplitude in TDS-OFDM can be reduced to improve the energy efficiency, which is infeasible for CP-OFDM. Simulation results demonstrate that the proposed SCS-aided TDS-OFDM scheme has a higher spectrum and energy efficiency than CP-OFDM by more than 10% and 20% respectively in typical applications. |
Sarcasm Detection : Building a Contextual Hierarchy | The conundrum of understanding and classifying sarcasm has traditionally been approached as an analysis of a sarcastic utterance and the ironic situation that surrounds it. The problem with such an approach is that it is too narrow, as it fails to sufficiently account for the two indispensable agents in making such an utterance, viz. the speaker and the listener. It undermines the necessary context required to comprehend a sarcastic utterance. In this paper, we propose a novel approach to understanding sarcasm in terms of the existing knowledge hierarchy between the two participants, which forms the basis of the context that both agents share. The difference in the relationship between the speaker of the sarcastic utterance and the disparate audience found on social media, such as Twitter, is also captured. We then apply our model to a corpus of tweets, achieve significant results and, consequently, shed light on the subjective nature of context, which is contingent on the relation between the speaker and the listener. |
Nasal ventilation in acute exacerbations of chronic obstructive pulmonary disease: effect of ventilator mode on arterial blood gas tensions. | BACKGROUND
There are no controlled trials of the use of different modes of nasal intermittent positive pressure ventilation (NIPPV) in patients with exacerbations of chronic obstructive pulmonary disease (COPD). This study describes the effect on blood gas tensions of four different modes of nasal ventilation.
METHODS
Twelve patients with acute exacerbations of COPD were studied (mean (SD) FEV1 0.59 (0.13) l, PaO2 (air) 5.10 (1.12) kPa, PaCO2 9.28 (1.97) kPa, pH 7.32 (0.03)). Each patient underwent four one-hour periods of nasal ventilation in randomised order: (a) inspiratory pressure support 18 cm H2O; (b) pressure support 18 cm H2O+positive end expiratory pressure (PEEP) 6 cm H2O (IPAP+EPAP); (c) continuous positive airway pressure (CPAP) 8 cm H2O; and (d) volume cycled NIPPV. Arterial blood samples were obtained before each period of ventilation and at one hour.
RESULTS
Pressure support, CPAP, and volume cycled NIPPV all produced significant improvements in PaO2; there was no difference between these three modes. The change in PaO2 with IPAP+EPAP did not reach statistical significance. None of the modes produced significant changes in mean PaCO2; patients with higher baseline levels tended to show a rise in PaCO2 whereas those with lower baseline levels tended to show a fall.
CONCLUSIONS
Although PaO2 improved in all patients, there are differences in efficacy between the modes, while the changes in PaCO2 were variable. The addition of EPAP conferred no advantage in terms of blood gas tensions. |
Chromosome 1q21 gains confer inferior outcomes in multiple myeloma treated with bortezomib but copy number variation and percentage of plasma cells involved have no additional prognostic value. | Chromosome 1q21 aberrations have not yet been made part of routine clinical tests and their effect in multiple myeloma is still under investigation. The prognostic value of copy number variation and the percentage of plasma cells involved has remained unclear. In the present study, we analyzed the prognostic value of 1q21 in a series of 290 cases of newly diagnosed multiple myeloma treated in a prospective, non-randomized clinical trial (BDH 2008/02). We found that the incidence of 1q21 aberration increased at relapse, but its copy numbers and the proportion of cells involved did not change. Gains of 1q21 had no impact on survival in patients receiving thalidomide-based treatment but conferred a significantly inferior prognosis in patients under bortezomib-based chemotherapy and were an independent adverse prognostic factor for progression-free survival (HR 3.831; 95%CI: 2.125-6.907; P<0.001) and overall survival (HR 3.245; 95%CI: 1.555-6.773; P=0.002). Strikingly, our results showed that the copy number variation and the size of the clone harboring 1q21 gains carried no additional prognostic value, and patients with 1q21 gains did not benefit significantly from regimens incorporating bortezomib. Our results indicate that three copies of 1q21 and 20% of plasma cells with this abnormality were enough to confer bortezomib resistance. Therefore, chromosome 1q21 gains should be considered a high-risk feature in multiple myeloma patients receiving bortezomib therapy. |
Investigating the global trend of RF power amplifiers with the arrival of 5G | To satisfy the continuously increasing demand for high data rates and mobility required by new wireless applications, 5G has gained much attention recently. Radio frequency power amplifiers (RF-PAs), as one of the critical components of the 5G transmitter system, have become a focus of research. In this paper, a statistical analysis of RF-PA papers shows that research on RF-PAs in the Asia-Pacific region and cooperation between different affiliations and countries are gradually becoming more prominent, reflecting the globalization trend of RF-PA research and 5G technologies. The shortening research cycle of RF-PAs shows that research on PAs and 5G technologies is speeding up. Some promising RF-PA technologies for 5G wireless communication systems are also discussed. |
Solventless and solvent-minimized sample preparation techniques for determining currently used pesticides in water samples: a review. | The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because they make human beings more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which may themselves be a source of additional contamination and errors. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. In view of this, several novel techniques are being developed in order to reduce the analysis steps, increase sample throughput and improve the quality and sensitivity of analytical methods. 
The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are also discussed, and some solutions to their limitations are proposed. |
Machine translation evaluation versus quality estimation | Most evaluation metrics for machine translation (MT) require reference translations for each sentence in order to produce a score reflecting certain aspects of its quality. The de facto metrics, BLEU and NIST, are known to correlate well with human evaluation at the corpus level, but this is not the case at the segment level. In an attempt to overcome these two limitations, we address the problem of evaluating the quality of MT as a prediction task, where reference-independent features are extracted from the input sentences and their translations, and a quality score is obtained from models produced from training data. We show that this approach yields better correlation with human evaluation than commonly used metrics, even with models trained on different MT systems, language pairs and text domains. |
Time-Triggered Ethernet and IEEE 1588 Clock Synchronization | Time-triggered Ethernet unifies real-time and non-real-time traffic in a single communication architecture. We have built a prototype implementation of an FPGA TT Ethernet switch and an FPGA TT Ethernet communication controller supporting a network bandwidth of 100 Mbit/s. Time-triggered Ethernet introduces two message classes: i) standard event-triggered Ethernet messages, denoted as ET messages, and ii) time-triggered Ethernet messages, denoted as TT messages. All TT messages are transmitted periodically and are scheduled a priori such that there are no conflicts on the network. The network handles these messages according to the cut-through paradigm. Computer nodes containing TT Ethernet communication controllers establish and maintain a global time base. However, nodes containing standard Ethernet controllers can be connected to a TT Ethernet system and can send ET messages without affecting the temporal properties of the TT messages. The global time format of TT Ethernet deploys the UTC time format, which is compatible with the time format of the IEEE 1588 standard. In this work we present how we deploy IEEE 1588 to synchronize the TT Ethernet controllers, which require tight synchronization among them. Additionally, the IEEE 1588-based clock synchronization protocol is implemented on standard Ethernet controllers so that they, too, can establish and maintain a global time base. |
Role of Na+ /H+ exchanger 3 in the acidification of the male reproductive tract and male fertility. | 1. Male fertility is a complex process that is dependent on sex hormones and the normal function of the reproductive organs. Defects of these organs result in abnormal sperm production and function, which, in turn, lead to infertility. 2. Spermatozoa released from the testis are unable to move or to fertilize eggs. These capabilities, acquired during transit through the epididymis, constitute sperm maturation. 3. Among the several processes that take place in the epididymis, absorption and acidification of the luminal fluid are essential for sperm maturation, sperm storage and fertility. Currently, the mechanism by which acidification occurs in the epididymis is still not fully understood. 4. The epididymis is fully equipped with the proteins required for acid/base transport, such as Na(+) /H(+) exchanger 3 (NHE3, SLC9A3), vacuolar-type adenosine triphosphatase (V-ATPase) and various isoforms of the enzyme carbonic anhydrase (CA). 5. Most studies so far have focused on the role of V-ATPase in H(+) secretion and acidification of the epididymis. The involvement of NHE3 in creating the acidic environment surrounding epididymal spermatozoa has received little attention. 6. This review presents evidence for and discusses the role of NHE3 in the acidification of the male reproductive tract and its requirement for male fertility. |
Recent Trends in Deep Learning Based Natural Language Processing [Review Article] | Deep learning methods employ multiple processing layers to learn hierarchical representations of data, and have produced state-of-the-art results in many domains. Recently, a variety of model designs and methods have blossomed in the context of natural language processing (NLP). In this paper, we review significant deep learning related models and methods that have been employed for numerous NLP tasks and provide a walk-through of their evolution. We also summarize, compare and contrast the various models and put forward a detailed understanding of the past, present and future of deep learning in NLP. |
A Comparative Study of Image Steganography Techniques | Steganography is the art of hiding the fact that communication is taking place by hiding information in other information. Many different carrier file formats can be used, but digital images are the most popular because of their prevalence on the Internet. From the past to the present, the security of confidential information has always been an important issue. The main purpose of steganography is to hide the existence of a message so that it becomes difficult for an attacker to detect it. Any type of cover object, whether text, image or video, can be used to embed the secret information. In this paper, different image steganography techniques are briefly analyzed and compared. |
Vision-based pedestrian detection -reliable pedestrian candidate detection by combining IPM and a 1D profile | This article presents an improvement of an Inverse Perspective Mapping (IPM) based obstacle detection algorithm and its utilization in the pedestrian candidate detection module of our vision-based pedestrian detection system. A vertical 1D profile of the IPM detection on the region of interest, termed the Pedestrian Detection Strip (PDS), is created first, and candidates are chosen by applying a threshold to the profile. The use of the vertical profile significantly increases the robustness of detection on low-contrast images as well as for distant pedestrians. A low-level pedestrian-oriented segmentation and a fast symmetry search on the leg region of pedestrians are also presented. |
Char2Wav: End-to-End Speech Synthesis | We present Char2Wav, an end-to-end model for speech synthesis. Char2Wav has two components: a reader and a neural vocoder. The reader is an encoder-decoder model with attention. The encoder is a bidirectional recurrent neural network that accepts text or phonemes as inputs, while the decoder is a recurrent neural network (RNN) with attention that produces vocoder acoustic features. The neural vocoder is a conditional extension of SampleRNN that generates raw waveform samples from intermediate representations. Unlike traditional models for speech synthesis, Char2Wav learns to produce audio directly from text. |
Deep Scattering Spectrum with deep neural networks | State-of-the-art convolutional neural networks (CNNs) typically use a log-mel spectral representation of the speech signal. However, this representation is limited by the spectro-temporal resolution afforded by log-mel filter-banks. A novel technique known as Deep Scattering Spectrum (DSS) addresses this limitation and preserves higher resolution information, while ensuring time warp stability, through the cascaded application of the wavelet-modulus operator. The first order scatter is equivalent to log-mel features, and standard CNN modeling techniques can be used directly with these features. However, the higher order scatter, which preserves the higher resolution information, presents new challenges in modeling. This paper explores how to effectively use DSS features with CNN acoustic models. Specifically, we identify the normalization, neural network topology and regularization techniques needed to effectively model higher order scatter. The use of these higher order scatter features, in conjunction with CNNs, results in a relative improvement of 7% compared to log-mel features on TIMIT, providing a phonetic error rate (PER) of 17.4%, one of the lowest reported PERs to date on this task. |
Near-infrared spectroscopy-mediated neurofeedback enhances efficacy of motor imagery-based training in poststroke victims: a pilot study. | BACKGROUND AND PURPOSE
Despite the findings that motor imagery and execution are supposed to share common neural networks, previous studies using imagery-based rehabilitation have revealed inconsistent results. In the present study, we investigated whether feedback of cortical activities (neurofeedback) using near-infrared spectroscopy could enhance the efficacy of imagery-based rehabilitation in stroke patients.
METHODS
Twenty hemiplegic patients with subcortical stroke received 6 sessions of mental practice with motor imagery of the distal upper limb in addition to standard rehabilitation. Subjects were randomly allocated to REAL and SHAM groups. In the REAL group, cortical hemoglobin signals detected by near-infrared spectroscopy were fed back during imagery. In the SHAM group, irrelevant randomized signals were fed back. Upper limb function was assessed using the finger and arm subscales of the Fugl-Meyer assessment and the Action Research Arm Test.
RESULTS
The hand/finger subscale of the Fugl-Meyer assessment showed greater functional gain in the REAL group, with a significant interaction between time and group (F(2,36)=15.5; P<0.001). A significant effect of neurofeedback was revealed even in severely impaired subjects. Imagery-related cortical activation in the premotor area was significantly greater in the REAL group than in the SHAM group (T(58)=2.4; P<0.05).
CONCLUSIONS
Our results suggest that near-infrared spectroscopy-mediated neurofeedback may enhance the efficacy of mental practice with motor imagery and augment motor recovery in poststroke patients with severe hemiparesis. |
Walking through Architectural Spaces: The Impact of Interior Forms on Human Brain Dynamics | Neuroarchitecture uses neuroscientific tools to better understand architectural design and its impact on human perception and subjective experience. The form or shape of the built environment is fundamental to architectural design, but not many studies have shown the impact of different forms on the inhabitants' emotions. This study investigated the neurophysiological correlates of different interior forms on the perceivers' affective state and the accompanying brain activity. To understand the impact of naturalistic three-dimensional (3D) architectural forms, it is essential to perceive forms from different perspectives. We computed clusters of form features extracted from pictures of residential interiors and constructed exemplary 3D room models based on and representing different formal clusters. To investigate human brain activity during 3D perception of architectural spaces, we used a mobile brain/body imaging (MoBI) approach recording the electroencephalogram (EEG) of participants while they naturally walk through different interior forms in virtual reality (VR). The results revealed a strong impact of curvature geometries on activity in the anterior cingulate cortex (ACC). Theta band activity in ACC correlated with specific feature types (rs (14) = 0.525, p = 0.037) and geometry (rs (14) = -0.579, p = 0.019), providing evidence for a role of this structure in processing architectural features beyond their emotional impact. The posterior cingulate cortex and the occipital lobe were involved in the perception of different room perspectives during the stroll through the rooms. This study sheds new light on the use of mobile EEG and VR in architectural studies and provides the opportunity to study human brain dynamics in participants that actively explore and realistically experience architectural spaces. |
A flow based approach for SSH traffic detection | The basic objective of this work is to assess the utility of two supervised learning algorithms, AdaBoost and RIPPER, for classifying SSH traffic from log files without using features such as payload, IP addresses and source/destination ports. Pre-processing is applied to express the traffic data as traffic flows. Results of 10-fold cross validation for each learning algorithm indicate that a detection rate of 99% and a false positive rate of 0.7% can be achieved using RIPPER. Moreover, promising preliminary results were obtained when RIPPER was employed to identify which service was running over SSH. Thus, it is possible to detect SSH traffic with high accuracy without using features such as payload, IP addresses and source/destination ports; this represents a particularly useful characteristic when generic, scalable solutions are required. |
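A minimal sketch of the flow-based classification step, with hypothetical per-flow features and a decision tree standing in for RIPPER (scikit-learn ships no RIPPER rule learner); the feature values are invented for illustration only:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # stand-in for the RIPPER rule learner

# hypothetical per-flow features: [duration_s, fwd_packets, bwd_packets, mean_pkt_bytes]
flows = np.array([
    [12.0, 40, 38, 96],  [30.0, 80, 75, 100],   # SSH-like: long-lived, small packets
    [0.4, 10, 6, 900],   [0.9, 14, 9, 1100],    # bulk-transfer-like: short, large packets
])
labels = np.array([1, 1, 0, 0])                 # 1 = SSH, 0 = other
clf = DecisionTreeClassifier(random_state=0).fit(flows, labels)
```

The key point is that none of the features reference payload, addresses, or ports, so the classifier stays applicable when those fields are encrypted or rewritten.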
Applying the Clique Percolation Method to analyzing cross-market branch banking network structure: the case of Illinois | This study applies the Clique Percolation Method (CPM) to an investigation of the changing spatial organization of the Illinois cross-market branch banking network. Nonoverlapping community detection algorithms assign nodes into exclusive communities and, when results are mapped, these techniques may generate spatially disjointed geographical regions, an undesirable characteristic for geographical study. Alternative overlapping community detection algorithms allow overlapping membership where a node can be a member of different communities. Such a structure simultaneously accommodates spatial proximity and spatial separation which occur with respect to a node in relation to other nodes in the system. Applying such a structure in geographical analysis helps preserve well-established principles regarding spatial relationships within the geography discipline. The result can also be mapped for display and correct interpretation. The CPM is chosen in this study due to the complete connection within cliques which simulates the practice by banking institutions of forming highly connected networks through multi-location operations in order to diversify their business and hedge against risks. Applying the CPM helps reveal the spatial pattern of branch banking connections which would otherwise be difficult to see. However, the CPM has been shown to not be among the best performing overlapping community detection algorithms. Future research should explore other possible algorithms for detecting overlapping communities. Detecting communities in a network only reveals certain characteristics of the spatial organization of the network, rather than providing explanation of the spatial-network patterns revealed. Full interpretation of the pattern must rely on the attribute data and additional information. 
This may illustrate the value of an integrated approach in geographical analysis using both social network analysis and spatial analysis techniques. |
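As a sketch of how overlapping communities emerge from the CPM, the toy example below (a hypothetical graph, not the Illinois banking data) uses NetworkX's k-clique community routine; two 4-cliques share a single node, which therefore belongs to both communities, much as a multi-market bank belongs to several regional clusters:

```python
import networkx as nx
from networkx.algorithms.community import k_clique_communities

# two fully connected "markets" that share one multi-location node (4)
G = nx.Graph()
G.add_edges_from(nx.complete_graph([1, 2, 3, 4]).edges())
G.add_edges_from(nx.complete_graph([4, 5, 6, 7]).edges())

# CPM with k=3: a community is a union of 3-cliques that chain together
# by sharing k-1 = 2 nodes; the two 4-cliques touch only at node 4,
# so they form two communities with overlapping membership
comms = [frozenset(c) for c in k_clique_communities(G, 3)]
```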
Specific ion effects on membrane potential and the permselectivity of ion exchange membranes. | Membrane potential and permselectivity are critical parameters for a variety of electrochemically-driven separation and energy technologies. An electric potential is developed when a membrane separates electrolyte solutions of different concentrations, and a permselective membrane allows specific species to be transported while restricting the passage of other species. Ion exchange membranes are commonly used in applications that require advanced ionic electrolytes and span technologies such as alkaline batteries to ammonium bicarbonate reverse electrodialysis, but membranes are often only characterized in sodium chloride solutions. Our goal in this work was to better understand membrane behaviour in aqueous ammonium bicarbonate, which is of interest for closed-loop energy generation processes. Here we characterized the permselectivity of four commercial ion exchange membranes in aqueous solutions of sodium chloride, ammonium chloride, sodium bicarbonate, and ammonium bicarbonate. This stepwise approach, using four different ions in aqueous solution, was used to better understand how these specific ions affect ion transport in ion exchange membranes. Characterization of cation and anion exchange membrane permselectivity, using these ions, is discussed from the perspective of the difference in the physical chemistry of the hydrated ions, along with an accompanying re-derivation and examination of the basic equations that describe membrane potential. In general, permselectivity was highest in sodium chloride and lowest in ammonium bicarbonate solutions, and the nature of both the counter- and co-ions appeared to influence measured permselectivity. 
The counter-ion type influences the binding affinity between counter-ions and polymer fixed charge groups, and higher binding affinity between fixed charge sites and counter-ions within the membrane decreases the effective membrane charge density. As a result, permselectivity decreases. The charge density and polarizability of the co-ions also appeared to influence permselectivity, leading to ion-specific effects; co-ions that are charge dense and have low polarizability tended to result in high membrane permselectivity. |
Fiscal adjustments and income inequality: a first assessment | Using a statistical approach to identify fiscal adjustments, we find that fiscal consolidation appears to narrow the income gap. Fiscal austerity plans that succeed in bringing public debt to a sustainable path seem to be more likely to reduce inequality. Expansionary fiscal adjustments are particularly important in promoting changes in the income distribution. |
24.5 A 4.5nW wake-up radio with −69dBm sensitivity | Wake-up receivers (WuRXs) are low-power radios that continuously monitor the RF environment to wake up a higher-power radio upon detection of a predetermined RF signature. Prior-art WuRXs have 100s of kHz of bandwidth [1] with low signature-to-wake-up-signal latency to help synchronize communication amongst nominally asynchronous wireless devices. However, applications such as unattended ground sensors and smart home appliances wake up infrequently in an event-driven manner, and thus WuRX bandwidth and latency are less critical; instead, the most important metrics are power consumption and sensitivity. Unfortunately, current state-of-the-art WuRXs utilizing direct envelope-detecting [2] and IF/uncertain-IF [1,3] architectures (Fig. 24.5.1) achieve only modest sensitivity at low power (e.g., −39dBm at 104nW [2]), or achieve excellent sensitivity at higher power (e.g., −97dBm at 99µW [3]) via active IF gain elements. Neither approach meets the needs of next-generation event-driven sensing networks. |
Organ Segmentation in Poultry Viscera Using RGB-D | We present a pattern recognition framework for semantic segmentation of visual structures, that is, multi-class labelling at pixel level, and apply it to the task of segmenting organs in the eviscerated viscera from slaughtered poultry in RGB-D images. This is a step towards replacing the current strenuous manual inspection at poultry processing plants. Features are extracted from feature maps such as activation maps from a convolutional neural network (CNN). A random forest classifier assigns class probabilities, which are further refined by utilizing context in a conditional random field. The presented method is compatible with both 2D and 3D features, which allows us to explore the value of adding 3D and CNN-derived features. The dataset consists of 604 RGB-D images showing 151 unique sets of eviscerated viscera from four different perspectives. A mean Jaccard index of 78.11 % is achieved across the four classes of organs by using features derived from 2D, 3D and a CNN, compared to 74.28 % using only basic 2D image features. |
Combining classifiers: A theoretical framework | The problem of classifier combination is considered in the context of the two main fusion scenarios: fusion of opinions based on identical and on distinct representations. We develop a theoretical framework for classifier combination for these two scenarios. For multiple experts using distinct representations we argue that many existing schemes such as the product rule, sum rule, min rule, max rule, majority voting, and weighted combination, can be considered as special cases of compound classification. We then consider the effect of classifier combination in the case of multiple experts using a shared representation where the aim of fusion is to obtain a better estimate of the appropriate a posteriori class probabilities. We also show that the two theoretical frameworks can be used for devising fusion strategies when the individual experts use features some of which are shared and the remaining ones distinct. We show that in both cases (distinct and shared representations), the expert fusion involves the computation of a linear or nonlinear function of the a posteriori class probabilities estimated by the individual experts. Classifier combination can therefore be viewed as a multistage classification process whereby the a posteriori class probabilities generated by the individual classifiers are considered as features for a second stage classification scheme. Most importantly, when the linear or nonlinear combination functions are obtained by training, the distinctions between the two scenarios fade away, and one can view classifier fusion in a unified way. |
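The fixed combination rules named above (sum, product, max, min) admit a compact sketch; the posterior values below are invented purely for illustration:

```python
import numpy as np

def combine(posteriors, rule="sum"):
    """Fuse per-expert a posteriori class probabilities (experts x classes)."""
    P = np.asarray(posteriors, dtype=float)
    if rule == "sum":
        scores = P.sum(axis=0)
    elif rule == "product":
        scores = P.prod(axis=0)
    elif rule == "max":
        scores = P.max(axis=0)
    elif rule == "min":
        scores = P.min(axis=0)
    else:
        raise ValueError(f"unknown rule: {rule}")
    return scores / scores.sum()   # renormalize to a probability vector

# three experts, two classes (made-up numbers)
P = [[0.6, 0.4], [0.7, 0.3], [0.3, 0.7]]
```

Majority voting fits the same shape by first hardening each expert's posterior into a one-hot vote and then applying the sum rule.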
Automatic Non-rigid 3D Modeling from Video | We present a robust framework for estimating non-rigid 3D shape and motion in video sequences. Given an input video sequence, and a user-specified region to reconstruct, the algorithm automatically solves for the 3D time-varying shape and motion of the object, and estimates which pixels are outliers, while learning all system parameters, including a PDF over non-rigid deformations. There are no user-tuned parameters (other than initialization); all parameters are learned by maximizing the likelihood of the entire image stream. We apply our method to both rigid and non-rigid shape reconstruction, and demonstrate it in challenging cases of occlusion and variable illumination. |
Immunomodulatory effects of total intravenous and balanced inhalation anesthesia in patients with bladder cancer undergoing elective radical Cystectomy: preliminary results | BACKGROUND
Although surgery and anesthesia induce immunosuppression, it remains largely unknown whether various anesthetic techniques have different immunosuppressive effects on cancer patients. Therefore, the aim of this study was to investigate the influence of total intravenous anesthesia with target-controlled infusion (TIVA-TCI) and balanced inhalation anesthesia (BAL) on the peri-operative levels of inflammatory cytokines and regulatory T cells (Tregs) in patients with bladder cancer undergoing surgery.
METHODS
Twenty-eight consecutive patients with bladder cancer who underwent radical cystectomy were prospectively randomized into two groups to receive TIVA-TCI (n = 14) or BAL (n = 14). Before the induction of anesthesia (T0), 6-8 hours post-surgery (T1), and 5 days post-surgery (T2), Tregs and serum levels of interleukin-1 beta (IL-1β), interferon-gamma (IFN-γ), tumor necrosis factor-alpha (TNF-α), interleukin-2 (IL-2), interleukin-6 (IL-6), and interleukin-10 (IL-10) were measured.
RESULTS
In the peri-operative period all cancer patients showed a marked and significant increase in IL-6. Moreover, TIVA-TCI patients also showed a higher increase in IFN-γ, whereas in BAL patients Tregs were reduced by approximately 30% during surgery. The incidence of infections, metastases, and death was similar in both groups.
CONCLUSIONS
The increase in the Th1 response in the TIVA-TCI group and the reduction in Tregs in the BAL group seem to balance the immunosuppressive effect induced by IL-6. Therefore, TIVA-TCI and BAL can both be used in major surgery in patients with bladder cancer without worsening the outcome. |
Groundnut leaf disease detection and classification by using a back propagation algorithm | Many studies show that the quality of agricultural products can be reduced by many causes. One of the most important factors contributing to low yield is disease attack; plant diseases are caused by pathogens such as fungi, bacteria and viruses. Leaf disease completely destroys the quality of the leaf. A common groundnut disease is cercospora, which appears at an early stage of the groundnut leaf. The proposed processing pipeline comprises four leading steps. Initially, a color transformation of the input RGB image is performed: the RGB image is converted into HSV, because RGB is intended for color generation and description rather than analysis. The next step is plane separation, followed by extraction of color features. Finally, leaf disease is detected using a back propagation algorithm. |
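As an illustration of the final step, here is a minimal one-hidden-layer network trained by back propagation on hypothetical mean-HSV leaf features; the features, labels, learning rate and network size are invented assumptions, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical mean-HSV features per leaf region; 0 = healthy, 1 = diseased
X = np.array([[0.30, 0.8, 0.6], [0.31, 0.7, 0.5],
              [0.10, 0.6, 0.4], [0.08, 0.5, 0.3]])
y = np.array([[0.0], [0.0], [1.0], [1.0]])

# one hidden layer with sigmoid activations
W1, b1 = rng.normal(size=(3, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.5, np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sig(X @ W1 + b1)
    return h, sig(h @ W2 + b2)

losses, lr = [], 0.5
for _ in range(2000):
    h, out = forward(X)
    losses.append(float(np.mean((out - y) ** 2)))
    # back-propagate the squared-error gradient through both layers
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h = d_out @ W2.T * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```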
Learning content–social influential features for influence analysis | We address how to measure the information propagation probability between users given certain contents. In sharp contrast to existing works that oversimplify the propagation model as predefined distributions, our approach fundamentally attempts to answer why users are influenced (e.g., by content or relations) and whether the corresponding influential features (e.g., hidden factors) can be inferred from the propagation in the entire network. In particular, we propose a novel method to deeply learn the unified feature representations for both user pair and content, where the homogeneous feature similarity can be used to estimate the propagation probability between users with given content. The features are dubbed content–social influential feature since we consider not only the content of the propagation information but also how it propagates over the social network. We design a fast asynchronous parallel algorithm for the feature learning. Through extensive experiments on a real-world social network with 53 million users and 838 million tweets, we show significantly improved performance as compared to other state-of-the-art methods on various social influence analysis tasks. |
Prediction of the failure properties of short fiber reinforced composites with metal and polymer matrix | The statistical strength of short fiber reinforced composites such as metal matrix composites (MMCs) and polymer matrix composites (PMCs) with different fiber volume fractions is investigated in this work using combined cell models (CCM) and a Weibull statistical approach. Statistical combined cell models (SCCM), originally developed for fiber fracture in composites, are employed. They allow the two types of unit cells, i.e. unit cells with unbroken fibers and unit cells with broken fibers, to be calculated separately. The global mechanical behavior of metal or polymer matrix composites reinforced with randomly oriented short fibers (e.g., an Al/15 vol.% Al2O3 aluminium matrix composite and PMCs with 20 or 30 vol.% glass fibers) is calculated based on the rule of mixtures. In all cases, predictions of the behavior by the SCCM are compared with experiments and show good agreement. |
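The two key ingredients, Weibull fiber-failure statistics and the rule of mixtures, can be sketched as below; this survival-weighted rule of mixtures is a deliberate simplification of the SCCM (which averages full unit-cell responses), and all material parameters in the test values are illustrative:

```python
import numpy as np

def weibull_survival(stress, sigma0, m):
    """Two-parameter Weibull probability that a fiber survives `stress`."""
    return np.exp(-(stress / sigma0) ** m)

def composite_stress(strain, Ef, Em, Vf, sigma0, m):
    """Rule-of-mixtures stress: intact fibers carry load, and the broken
    fraction (1 - survival probability) is assumed to carry none."""
    sf = Ef * strain                        # stress in a fiber at this strain
    p = weibull_survival(sf, sigma0, m)     # fraction of unbroken fibers
    return Vf * p * sf + (1.0 - Vf) * Em * strain
```

At small strains nearly all fibers survive and the expression reduces to the classical rule of mixtures; as fiber stress approaches sigma0 the survival term progressively sheds the fiber contribution.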
Selecting a characteristic set of reviews | Online reviews provide consumers with valuable information that guides their decisions on a variety of fronts: from entertainment and shopping to medical services. Although the proliferation of online reviews gives insights about different aspects of a product, it can also prove a serious drawback: consumers cannot and will not read thousands of reviews before making a purchase decision. This need to extract useful information from large review corpora has spawned considerable prior work, but so far all approaches have drawbacks. Review summarization (generating statistical descriptions of review sets) sacrifices the immediacy and narrative structure of reviews. Likewise, review selection (identifying a subset of 'helpful' or 'important' reviews) leads to redundant or non-representative summaries. In this paper, we fill the gap between existing review-summarization and review-selection methods by selecting a small subset of reviews that together preserve the statistical properties of the entire review corpus. We formalize this task as a combinatorial optimization problem and show that it is NP-hard both to solve and to approximate. We also design effective algorithms that prove to work well in practice. Our experiments with real review corpora on different types of products demonstrate the utility of our methods, and our user studies indicate that our methods provide a better summary than prior approaches. |
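Since the exact problem is NP-hard, practical algorithms are necessarily heuristic; a minimal greedy sketch of the "preserve corpus statistics" idea follows, where the attribute matrix and the L1 objective are illustrative assumptions, not the paper's exact quality measure:

```python
import numpy as np

def select_reviews(attr_matrix, k):
    """Greedily pick k reviews whose mean attribute vector best matches the
    whole corpus (L1 distance), so the subset preserves corpus statistics."""
    target = attr_matrix.mean(axis=0)
    chosen, remaining = [], list(range(len(attr_matrix)))
    for _ in range(k):
        best = min(remaining, key=lambda i: float(np.abs(
            attr_matrix[chosen + [i]].mean(axis=0) - target).sum()))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# rows = reviews, columns = product aspects mentioned (made-up data)
attrs = np.array([[1, 0], [0, 1], [1, 1], [1, 0]], dtype=float)
picked = select_reviews(attrs, 2)
```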
L2-Nonexpansive Neural Networks | This paper proposes a class of well-conditioned neural networks in which a unit amount of change in the inputs causes at most a unit amount of change in the outputs or any of the internal layers. We develop the known methodology of controlling Lipschitz constants to realize its full potential in maximizing robustness, with a new regularization scheme for linear layers, new ways to adapt nonlinearities and a new loss function. With MNIST and CIFAR-10 classifiers, we demonstrate a number of advantages. Without needing any adversarial training, the proposed classifiers exceed the state of the art in robustness against white-box L2-bounded adversarial attacks. Their outputs are quantitatively more meaningful than ordinary networks and indicate levels of confidence and generalization. They are also free of exploding gradients, among other desirable properties. |
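One standard way to control the Lipschitz constant of a linear layer is to bound its spectral norm; the post-hoc rescaling below illustrates the general idea only, as the paper proposes its own regularization scheme rather than this rescaling:

```python
import numpy as np

def make_nonexpansive(W):
    """Rescale a weight matrix so its spectral norm is at most 1; the map
    x -> W x is then 1-Lipschitz (nonexpansive) in the L2 sense."""
    sigma = np.linalg.norm(W, 2)   # largest singular value
    return W / max(1.0, float(sigma))
```

Composing such layers with 1-Lipschitz nonlinearities keeps the network as a whole L2-nonexpansive, since Lipschitz constants multiply under composition.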
Output constraint control on path following of four-wheel independently actuated autonomous vehicles | This paper investigates the path following problem for four-wheel independently actuated (FWIA) autonomous vehicles (AVs). A novel output constraint controller is proposed to deal with the lateral offset control in path following and maintain the vehicle lateral stability in the presence of tire sliding effects. The innovation of this work includes the following two aspects: (1) A novel output constraint control strategy, the hyperbolic projection method, is proposed to strictly bound the lateral offset and prevent the vehicle from transgressing the safety bound during path following; (2) A robust LQR controller is adopted to obtain the optimal active front steering (AFS) and direct yaw moment control (DYC) inputs, with vehicle stability taken into consideration, and to eliminate parameter uncertainties and external disturbances. CarSim-Simulink joint simulation results indicate that the proposed controller can tightly bound the lateral offset to avoid transgressing the safe boundary during the path following process, especially in extreme driving conditions, in the presence of tire sliding effects, system uncertainties and disturbances. |
Beneficial effects of using a minimal extracorporeal circulation system during coronary artery bypass grafting. | BACKGROUND
In this study, we assessed clinical results by using a minimal extracorporeal circuit (MECC) and compared it to a conventional cardiopulmonary bypass (CPB) system in patients undergoing coronary artery bypass grafting (CABG) procedures.
METHODS AND MATERIALS
From August to October 2006, forty consecutive patients undergoing isolated CABG procedures were randomly assigned to either a miniaturized closed circuit CPB with the Maquet-Cardiopulmonary MECC system (Group M, n=20) or to a conventional CPB system (Group C, n=20). Clinical outcomes were observed before, during and after the operation. Besides evaluating the perioperative clinical data, serial venous blood samples were obtained after induction, 30 minutes after CPB initiation, and 2h, 6h, 12h, and 24h post-CPB. The focus of our study was on myocardial damage (cTnI), neutrophil and platelet counts, activated partial thromboplastin time (aPTT) and free hemoglobin.
RESULTS
Both the transfusion of packed red blood cells and fresh frozen plasma were significantly lower in Group M compared to Group C (p<0.05). The levels of cTnI were lower in Group M at 2h, 6h and 12h post-CPB than in Group C (p<0.01). The values of aPTT in Group M recovered to normal levels after surgery, but were prolonged in Group C at early post-CPB and were statistically longer than Group M at 2h, 6h, and 12h post-CPB (p<0.05). The concentrations of free hemoglobin in Group C were higher than in Group M during and post-CPB, and there was a statistical difference at 2h post-CPB (p<0.05).
CONCLUSION
In conclusion, the MECC system is a safe alternative for patients who undergo extracorporeal circulation (ECC) for CABG surgery. Lower transfusion requirements and less damage to red cells may further promote the use of MECC systems, especially in higher-risk patients. |
Notifications and awareness: a field study of alert usage and preferences | Desktop notifications are designed to provide awareness of information while a user is attending to a primary task. Unfortunately, the awareness can come at the price of disruption to the focal task. We review results of a field study on the use and perceived value of email notifications in the workplace. We recorded users' interactions with software applications for two weeks and studied how notifications or their forced absence influenced users' quest for awareness of new email arrival, as well as the impact of notifications on their overall task focus. Results showed that users view notifications as a mechanism to provide passive awareness rather than a trigger to switch tasks. Turning off notifications caused some users to self-interrupt more in order to explicitly monitor email arrival, while others appeared to be able to better focus on their tasks. Users acknowledge notifications as disruptive, yet opt for them because of their perceived value in providing awareness. |
Association between ADAMTS1 matrix metalloproteinase gene variation, coronary heart disease, and benefit of statin therapy. | OBJECTIVE
The purpose of this study was to investigate the association between the Ala227Pro polymorphism in the ADAMTS1 metalloproteinase gene and coronary heart disease and benefit from statin therapy in 2 independent cohorts.
METHODS AND RESULTS
The frequency of the ADAMTS1 227Pro minor allele was 0.24 in 2421 male subjects from CARE, a randomized trial of pravastatin versus placebo. In the placebo arm, homozygotes (6.3% of study population) had a significantly increased risk of fatal coronary disease or nonfatal myocardial infarction (D/MI) compared with noncarriers (OR 2.12, 95% CI 1.07 to 4.19, P=0.03), and in the entire study the benefit of pravastatin in reducing the risk of D/MI was greater in these subjects (OR 0.21, 95% CI 0.06 to 0.69) than in heterozygotes (OR 0.74, 95% CI 0.48 to 1.14) or noncarriers (OR 0.99, 95% CI 0.68 to 1.42; P(interaction)=0.044). Results were tested in 1565 male subjects from WOSCOPS, also a randomized trial of pravastatin versus placebo. Similar to the results in CARE, in the placebo arm subjects homozygous for the minor allele were at increased risk of D/MI (OR 1.72, P=0.052) and in the entire study the benefit of pravastatin in reducing D/MI was greater in these subjects (OR 0.24, 95% CI 0.09 to 0.68) than in heterozygotes (OR 0.73, 95% CI 0.48 to 1.11) or noncarriers (OR 0.65, 95% CI 0.20 to 2.09) (P(interaction)=0.029).
CONCLUSIONS
In men not on pravastatin, those homozygous for the 227Pro allele of ADAMTS1 have a nearly 2-fold increased risk of coronary heart disease events compared with noncarriers. In this high-risk group, treatment with pravastatin is highly efficacious, reducing the odds of fatal coronary disease or nonfatal MI by approximately 75%, as compared with 25% in noncarriers or heterozygotes. |
Learning Image Embeddings using Convolutional Neural Networks for Improved Multi-Modal Semantics | We construct multi-modal concept representations by concatenating a skip-gram linguistic representation vector with a visual concept representation vector computed using the feature extraction layers of a deep convolutional neural network (CNN) trained on a large labeled object recognition dataset. This transfer learning approach brings a clear performance gain over features based on the traditional bag-of-visual-word approach. Experimental results are reported on the WordSim353 and MEN semantic relatedness evaluation tasks. We use visual features computed using either ImageNet or ESP Game images. |
Coverage of Author Identifiers in Web of Science and Scopus | As digital collections of scientific literature are widespread and used frequently in knowledge-intense working environments, it has become a challenge to identify author names correctly. The treatment of homonyms is crucial for the reliable resolution of author names. Apart from varying handling of first, middle and last names, vendors as well as the digital library community have created tools to address the problem of author name disambiguation. This technical report focuses on two widespread collections of scientific literature, Web of Science (WoS) and Scopus, and their coverage with author identification information such as ResearcherID, ORCID and the Scopus Author Identifier in the period 1996 to 2014. The goal of this study is to describe the significant differences between the two collections with respect to the overall distribution of author identifiers and their use across different subject domains. We found that the STM disciplines show the best coverage of author identifiers in our dataset of 6,032,000 publications covered by both WoS and Scopus. In our dataset we found 184,823 distinct ResearcherIDs and 70,043 distinct ORCIDs. In the appendix of this report we list a complete overview of all WoS subject areas and the number of author identifiers in these subject areas. |
On the requirements for successful GPS spoofing attacks | An increasing number of wireless applications rely on GPS signals for localization, navigation, and time synchronization. However, civilian GPS signals are known to be susceptible to spoofing attacks which make GPS receivers in range believe that they reside at locations different than their real physical locations. In this paper, we investigate the requirements for successful GPS spoofing attacks on individuals and groups of victims with civilian or military GPS receivers. In particular, we are interested in identifying from which locations and with which precision the attacker needs to generate its signals in order to successfully spoof the receivers. We will show, for example, that any number of receivers can easily be spoofed to one arbitrary location; however, the attacker is restricted to only a few transmission locations when spoofing a group of receivers while preserving their constellation. In addition, we investigate the practical aspects of a satellite-lock takeover, in which a victim receives spoofed signals after first being locked on to legitimate GPS signals. Using a civilian GPS signal generator, we perform a set of experiments and find the minimal precision of the attacker's spoofing signals required for covert satellite-lock takeover. |
Polio endgame: the global switch from tOPV to bOPV. | Globally, polio cases have reached an all-time low, and type 2 poliovirus (one of three) is eradicated. Oral polio vaccine (OPV) has been the primary tool; however, in rare cases, OPV induces paralysis. In 2013, the World Health Assembly endorsed the phased withdrawal of OPV and introduction of inactivated poliovirus vaccine (IPV) into childhood routine immunization schedules. Type 2 OPV will be withdrawn through a globally synchronized "switch" from trivalent OPV (all three types) to bivalent OPV (types 1 and 3). The switch will happen in 155 OPV-using countries between April 17th and May 1st, 2016. Planned activities to reduce type 2 outbreak risks post-switch include the following: tOPV campaigns to increase type 2 immunity prior to the switch, monovalent OPV2 stockpiling to respond to outbreaks should they occur, containment of both wild and vaccine type 2 viruses, enhanced acute flaccid paralysis (AFP) and environmental surveillance, outbreak response protocols, and ensured access to IPV and bivalent OPV. |
A robust scene-change detection method for video segmentation | This paper proposes a new method that combines intensity and motion information to detect scene changes, both abrupt and gradual. Two major features are chosen as the basic dissimilarity measures, and self- and cross-validation mechanisms are employed via a static scene test. We also develop a novel intensity statistics model for detecting gradual scene changes. Experimental results show that the proposed algorithms are effective and outperform previous approaches. |
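A minimal sketch of the intensity-based dissimilarity idea for abrupt cuts; the histogram measure and threshold are generic illustrations, not the paper's exact features or validation mechanisms:

```python
import numpy as np

def hist_dissimilarity(frame_a, frame_b, bins=64):
    """Intensity-histogram dissimilarity between two grayscale frames (0..255),
    normalized to lie in [0, 1]."""
    ha, _ = np.histogram(frame_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(frame_b, bins=bins, range=(0, 256))
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return 0.5 * np.abs(ha - hb).sum()

def detect_cuts(frames, threshold=0.5):
    """Flag abrupt scene changes where consecutive-frame dissimilarity spikes."""
    return [i for i in range(1, len(frames))
            if hist_dissimilarity(frames[i - 1], frames[i]) > threshold]
```

Gradual transitions need the statistical modeling the paper describes, since per-pair dissimilarities stay below any single-cut threshold during a slow dissolve.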
Adaptive Information Extraction from Text by Rule Induction and Generalisation | (LP)2 is a covering algorithm for adaptive Information Extraction from text (IE). It induces symbolic rules that insert SGML tags into texts by learning from examples found in a user-defined tagged corpus. Training is performed in two steps: initially a set of tagging rules is learned; then additional rules are induced to correct mistakes and imprecision in tagging. Induction is performed by bottom-up generalization of examples in the training corpus. Shallow knowledge about Natural Language Processing (NLP) is used in the generalization process. The algorithm has a considerable record of success. From a scientific point of view, experiments report excellent results with respect to the current state of the art on two publicly available corpora. From an application point of view, a successful industrial IE tool has been based on (LP)2. Real world applications have been developed and licenses have been released to external companies for building other applications. This paper presents (LP)2, experimental results and applications, and discusses the role of shallow NLP in rule induction. |
Approaching 80 years of passive radar | The history of passive radar dates back to the early days of radar in 1935 when the Daventry experiment was conducted in the UK. It continues in WW II with the German Klein Heidelberg passive radar and receives new interest today, as passive covert radar (PCR) systems like Silent Sentry, Homeland Alerter 100, Aulos and PARADE are ready for operation. The future of PCR will strongly depend on the availability of transmitters of opportunity such as FM-radio and digital broadcast networks. |
Performance evaluation for Modbus/TCP using Network Simulator NS3 | Modbus protocol is a de-facto standard protocol in industrial automation. As the size and complexity of industry systems increase rapidly, the importance of real-time communication protocols arises as well. In this paper, we analyze the performance of the Modbus/TCP communication protocol which is implemented using the Network Simulator version 3 (NS-3). The performance evaluation focuses on the response time depending on the number of nodes and topology. |
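Independent of the NS-3 simulation model, the Modbus/TCP frame layout itself is compact: an MBAP header (transaction id, protocol id, length, unit id) followed by the PDU. A sketch for the standard Read Holding Registers request (function code 0x03), using the spec's classic example of reading 3 registers at address 0x006B:

```python
import struct

def modbus_read_holding_registers(tid, unit, start_addr, count):
    """Build a Modbus/TCP ADU: MBAP header + Read Holding Registers (0x03) PDU."""
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    # MBAP: transaction id, protocol id (always 0), length (unit id + PDU), unit id
    mbap = struct.pack(">HHHB", tid, 0, len(pdu) + 1, unit)
    return mbap + pdu

frame = modbus_read_holding_registers(tid=1, unit=0x11, start_addr=0x006B, count=3)
```

The length field counts the unit identifier plus the PDU, so a 5-byte request PDU yields a length of 6 and a 12-byte frame in total.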
Water purification by photocatalysis on semiconductors | This paper describes the basics of photocatalysis on semiconductors, mainly on TiO2 and the application of photocatalytic processes to water purification from organic matter. The second chapter is devoted to metal cocatalysts introduced in order to improve the photocatalytic action of titania. Finally, a short review of more interesting practical applications of the processes is presented. |
PalmHashing: a novel approach for cancelable biometrics | We propose a novel cancelable biometric approach, known as PalmHashing, to address the non-revocability of biometric templates. The proposed method hashes palmprint templates with a set of pseudo-random keys to obtain a unique code called a palmhash. The palmhash code can be stored in portable devices such as tokens and smartcards for verification. Multiple sets of palmhash codes can be maintained in multiple applications; thus the privacy and security of the applications can be greatly enhanced. If compromised, revocation can be achieved via direct replacement with a new set of palmhash codes. In addition, PalmHashing offers several advantages over contemporary biometric approaches, such as clear separation of the genuine-imposter populations and zero EER occurrences. In this paper, we outline the implementation details of this method and also highlight its potential in security-critical applications. |
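A BioHashing-style sketch of the core idea, projecting a feature template onto key-derived pseudo-random directions and thresholding to a binary code; this illustrates the family of methods, not the paper's exact construction:

```python
import numpy as np

def palmhash(template, key_seed, n_bits=32):
    """Project `template` onto an orthonormalised pseudo-random basis derived
    from the user's key, then threshold at zero to get a binary palmhash.
    Assumes template.size >= n_bits."""
    rng = np.random.default_rng(key_seed)
    R = rng.normal(size=(template.size, n_bits))
    Q, _ = np.linalg.qr(R)            # orthonormal projection directions
    return (template @ Q > 0).astype(np.uint8)
```

Revocation then amounts to key replacement: issuing a new key_seed yields a fresh code from the same palmprint template, without re-enrolling the biometric.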
Determinants of Capital Structure in Nigeria | Capital structure represents one of the most discussed concepts in financial management. Capital structure refers to how a company finances its operations, whether through shareholders' equity, debt, or a combination of both. Various internal and external factors contribute to the choice among these sources of funds. The external factors include tax policy and capital market conditions, among others, while the internal factors are those that relate to individual firm characteristics. This study examines the determinants of capital structure in Nigeria using the descriptive research design. The population comprised the eighty-six manufacturing firms listed on the Nigerian Stock Exchange, and the sample firms were selected using the simple random sampling method. Secondary data were obtained from the annual accounts of 24 randomly selected manufacturing firms over a 10-year period, culminating in 240 firm-year observations. The results of the regression analysis revealed that leverage (a measure of capital structure) has a negative relationship with firm size and tax on one hand, and a positive relationship with tangibility of assets, profitability, and growth on the other. However, a significant relationship is established only for tangibility of assets and firm size. It is recommended that future researchers carry out similar studies in multiple sectors. |
Potential of Phosphorus Nuclear Magnetic Resonance Spectroscopy in Studies of the Energy Metabolism of Skeletal Muscles | The aim of the present study was to investigate the potential of phosphorus magnetic resonance spectroscopy (MR spectroscopy) in the diagnosis of metabolic lesions of skeletal musculature in patients with intermittent claudication syndrome, chronic cardiac failure, and varicose disease of the lower limbs. Studies included 50 males: 20 patients with intermittent claudication, 10 patients with chronic cardiac failure, and 10 patients with varicose veins. The control group consisted of 10 healthy volunteers. The following measures were determined: the phosphocreatine index, the intracellular pH in the gastrocnemius muscle, and the half-recovery time for the phosphocreatine index. The phosphocreatine index and the pH at rest did not differ between study groups. Isotonic exercise produced no change in the phosphocreatine index in the control group; patients with intermittent claudication showed a 26.1% decrease, patients with chronic cardiac failure showed an 8% decrease, and patients with varicose veins showed a 25.6% decrease. The only group showing a significant decrease in pH during exercise was the group of patients with intermittent claudication. This group also showed an inverse correlation between the pressure index and the extent of the decrease in the phosphocreatine index. Thus, MR spectroscopy provides a non-invasive diagnostic method for lesions of energy metabolism in skeletal musculature in patients with deranged peripheral hemodynamics. |
Prophylaxis and treatment of bladder dysfunction after Wertheim-Meigs operation: the positive effect of early postoperative detrusor stimulation using the cholinergic drug betanecholchloride | The efficacy of betanecholchloride in the postoperative treatment of bladder dysfunction is controversial. We therefore performed a comparative study on the effect of this therapy for the prophylaxis of detrusor hypotonia after Wertheim-Meigs operation. Forty patients with cervical cancer FIGO stage Ib/IIa were divided into two study groups. The control group (24 patients) only received betanecholchloride if the residual urine persisted above 50 ml after the 10th postoperative day. The study group (16 patients) received 50 mg betanecholchloride three times a day from the 3rd postoperative day onward. In this group postoperative catheter treatment, and consequently hospital stay, were significantly shorter (9.6 versus 13.3 days and 15.5 versus 18.6 days). The residual urinary volume normalized faster (8.0 versus 13.0 days) and the rate of cystitis was lower (18.8 versus 25%). According to our study, a prophylactic application of the parasympathomimetic drug betanecholchloride diminishes postoperative complications associated with bladder dysfunction after Wertheim-Meigs operation. |
How We Got Into The Dark Matter Fix And How We Can Get Out | The recent discovery of cosmic repulsion represents a major challenge to standard gravity, adding an apparent missing energy problem to its still not yet adequately resolved missing mass one, while simultaneously requiring the current universe to be a very special, fine-tuned one. We suggest that the resolution of these difficulties lies not in the existence of more and more exotic gravitational sources, but rather in the lack of reliability of the extrapolation of standard gravity from its solar system origins to the altogether larger distance scales associated with cosmology. We suggest a very different such extrapolation, namely that associated with conformal gravity, to find that all of these issues are then readily and naturally resolved. |
Passive Sensor Integration for Vehicle Self-Localization in Urban Traffic Environment † | This research proposes an accurate vehicular positioning system which can achieve lane-level performance in urban canyons. Multiple passive sensors, which include Global Navigation Satellite System (GNSS) receivers, onboard cameras and inertial sensors, are integrated in the proposed system. As the main source for the localization, the GNSS technique suffers from Non-Line-Of-Sight (NLOS) propagation and multipath effects in urban canyons. This paper proposes to employ a novel GNSS positioning technique in the integration. The employed GNSS technique reduces the multipath and NLOS effects by using the 3D building map. In addition, the inertial sensor can describe the vehicle motion, but has a drift problem as time increases. This paper develops vision-based lane detection, which is firstly used for controlling the drift of the inertial sensor. Moreover, the lane keeping and changing behaviors are extracted from the lane detection function, and further reduce the lateral positioning error in the proposed localization system. We evaluate the integrated localization system in the challenging city urban scenario. The experiments demonstrate the proposed method has sub-meter accuracy with respect to mean positioning error. |
A Framework for Transfer in Reinforcement Learning | We present a conceptual framework for transfer in reinforcement learning based on the idea that related tasks share a common space. The framework attempts to capture the notion of tasks that are related (so that transfer is possible) but distinct (so that transfer is non-trivial). We define three types of transfer (knowledge, skill and model transfer) in terms of the framework, and illustrate them with an example scenario. |
Predicting online e-marketplace sales performances: A big data approach | To manage supply chain efficiently, e-business organizations need to understand their sales effectively. Previous research has shown that product review plays an important role in influencing sales performance, especially review volume and rating. However, limited attention has been paid to understanding how other factors moderate the effect of product review on online sales. This study aims to confirm the importance of review volume and rating on improving sales performance, and further examine the moderating roles of product category, answered questions, discount and review usefulness in such relationships. By analyzing 2939 records of data extracted from Amazon.com using a big data architecture, it is found that review volume and rating have stronger influence on sales rank for search products than for experience products. Also, review usefulness significantly moderates the effects of review volume and rating on product sales rank. In addition, the relationship between review volume and sales rank is significantly moderated by both answered questions and discount. However, answered questions and discount do not have a significant moderation effect on the relationship between review rating and sales rank. The findings expand previous literature by confirming important interactions between customer review features and other factors, and provide practical guidelines to manage e-businesses. This study also explains a big data architecture and illustrates the use of big data technologies in testing theoretical hypotheses. |
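The moderation analysis described in this abstract is conventionally tested by adding an interaction term to a regression. Below is a minimal numpy sketch on synthetic data (the variable names and coefficients are hypothetical stand-ins, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the study's data: sales rank driven by review
# volume, with the effect moderated by a discount indicator. The true
# coefficients (100, -3, 0, -2) are invented for illustration only.
n = 500
volume = rng.uniform(0, 10, n)
discount = rng.integers(0, 2, n).astype(float)
rank = 100 - 3.0 * volume - 2.0 * volume * discount + rng.normal(0, 1, n)

# Moderation is tested via the interaction term volume*discount:
# rank = b0 + b1*volume + b2*discount + b3*(volume*discount)
X = np.column_stack([np.ones(n), volume, discount, volume * discount])
beta, *_ = np.linalg.lstsq(X, rank, rcond=None)

# A b3 estimate far from zero indicates that discount moderates the
# volume-rank relationship; here it should recover a value near -2.
print(np.round(beta, 1))
```

In the study's setting, a significant `b3` corresponds to the reported moderation of the review-volume effect by discount and answered questions.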
Ground reaction forces during downhill and uphill running. | We investigated the normal and parallel ground reaction forces during downhill and uphill running. Our rationale was that these force data would aid in the understanding of hill running injuries and energetics. Based on a simple spring-mass model, we hypothesized that the normal force peaks, both impact and active, would increase during downhill running and decrease during uphill running. We anticipated that the parallel braking force peaks would increase during downhill running and the parallel propulsive force peaks would increase during uphill running. But, we could not predict the magnitude of these changes. Five male and five female subjects ran at 3m/s on a force treadmill mounted on the level and on 3 degrees, 6 degrees, and 9 degrees wedges. During downhill running, normal impact force peaks and parallel braking force peaks were larger compared to the level. At -9 degrees, the normal impact force peaks increased by 54%, and the parallel braking force peaks increased by 73%. During uphill running, normal impact force peaks were smaller and parallel propulsive force peaks were larger compared to the level. At +9 degrees, normal impact force peaks were absent, and parallel propulsive peaks increased by 75%. Neither downhill nor uphill running affected normal active force peaks. Combined with previous biomechanics studies, our normal impact force data suggest that downhill running substantially increases the probability of overuse running injury. Our parallel force data provide insight into past energetic studies, which show that the metabolic cost increases during downhill running at steep angles. |
Explicit Reasoning over End-to-End Neural Architectures for Visual Question Answering | Many vision and language tasks require commonsense reasoning beyond data-driven image and natural language processing. Here we adopt Visual Question Answering (VQA) as an example task, where a system is expected to answer a question in natural language about an image. Current state-of-the-art systems have attempted to solve the task using deep neural architectures and achieved promising performance. However, the resulting systems are generally opaque, and they struggle to understand questions for which extra knowledge is required. In this paper, we present an explicit reasoning layer on top of a set of penultimate neural network based systems. The reasoning layer enables reasoning and answering questions where additional knowledge is required, and at the same time provides an interpretable interface to the end users. Specifically, the reasoning layer adopts a Probabilistic Soft Logic (PSL) based engine to reason over a basket of inputs: visual relations, the semantic parse of the question, and background ontological knowledge from word2vec and ConceptNet. Experimental analysis of the answers and the key evidential predicates generated on the VQA dataset validates our approach. |
Aspirin increases nitric oxide formation in chronic stable coronary disease. | INTRODUCTION: There are no published randomized data on secondary prevention in humans about whether aspirin affects nitric oxide (NO) formation. In patients with chronic stable coronary disease, we tested whether aspirin at clinically relevant doses increases NO formation. MATERIALS AND METHODS: In a randomized, double-blind trial, 37 patients from 2 cardiology office practices were assigned to daily doses of 81, 162.5, 325, 650, or 1300 mg of aspirin for 12 weeks. Primary prespecified outcome measures were changes in heme oxygenase (HO-1), a downstream target of NO formation, and asymmetrical dimethyl arginine (ADMA), a competitive inhibitor of NO synthase. RESULTS: There were no significant differences for HO-1 or ADMA between any of the clinically relevant doses of aspirin tested, so all were combined. For HO-1, there was a significant increase (10.29 ± 2.44, P < .001) from baseline (15.37 ± 1.85) to week 12 (25.66 ± 1.57). The mean ratio (MR) of week 12 to baseline for HO-1 was significantly higher than 1.0 (1.67, confidence interval [CI] from 1.60 to 1.74, P < .001). For ADMA, there was a significant decrease (-0.24 ± 0.11, P < .001) from baseline (0.78 ± 0.08) to week 12 (0.54 ± 0.07). The MR of week 12 to baseline for ADMA was significantly lower than 1.0 (0.69, CI from 0.66 to 0.73, P < .001). CONCLUSIONS: In patients with chronic stable coronary disease, all clinically relevant daily doses of aspirin tested, from 81 to 1300 mg, produce similar and statistically significant increases in HO-1 and decreases in ADMA. These are the first randomized data on secondary prevention patients. These data support the hypothesis that aspirin has additional beneficial effects mediated through NO formation. Further research, including direct randomized comparisons on atherosclerosis using noninvasive techniques as well as on occlusive vascular disease events, is necessary. |
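The reported statistics in this abstract are internally consistent, which can be checked with simple arithmetic. Note that the paper's mean ratio (MR) is computed per patient; the ratio of group means used below is only an approximation, but it reproduces the reported values:

```python
# Group means quoted in the abstract.
ho1_baseline, ho1_week12 = 15.37, 25.66
adma_baseline, adma_week12 = 0.78, 0.54

# Ratios of group means approximate the reported per-patient mean ratios.
mr_ho1 = ho1_week12 / ho1_baseline    # reported MR for HO-1: 1.67
mr_adma = adma_week12 / adma_baseline # reported MR for ADMA: 0.69
print(round(mr_ho1, 2), round(mr_adma, 2))

# The absolute changes also match the quoted values:
delta_ho1 = ho1_week12 - ho1_baseline    # reported: 10.29
delta_adma = adma_week12 - adma_baseline # reported: -0.24
```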
Complex suicide by self-stabbing with subsequent drowning in the sea. | The paper presents a unique case of a complex suicide committed by a young man, mostly probably triggered by a disappointment in love. The uniqueness of the suicide lies in the fact that the victim inflicted several deep stab wounds on himself, in the chest and abdomen, while standing partly submerged in the sea and, having done so, he dropped and disappeared in the water. The postmortem examination showed, apart from deep wounds in the trunk, characteristics of drowning that manifested itself in the form of aqueous emphysema of the lungs. Suicide was clearly determined on the basis of the circumstances preceding death, the location, and arrangement of the trunk wounds and the testimony given by a witness of the incident. The circumstances preceding the suicidal act clearly suggest an underlying undiagnosed mental disorder. |
Controlling interfaces: a key to project success | Most problems cited in papers dealing with project success factors are often linked to the responsibility of particular players in the construction phase, rather than to the relations among them and with their environment. Challenging the current belief that "dividing is ruling", this new vision groups problems seemingly different in nature and provides, from their analysis, a unified set of recommendations that, if applied, could help drastically reduce the unpredictability of a project's outcome and boost productivity. Yet, as confirmed by the few articles that have been written about this approach, little has been done in the construction field. The appearance of a new form of project management, lean construction, and the verifiable improvements it triggers offer the interface perspective a more comprehensive and supportive environment, in which it can be more easily developed, implemented, and perfected. This study focuses on what should be considered the three most important interfaces during the construction phase of a project, and compares the improvements suggested by this approach to the principles of lean construction, to support the utility of this new way of perceiving problems. Thesis Supervisor: Dr. Yehiel Rosenfeld. Title: Visiting Professor of Civil and Environmental Engineering |
Adaptive Gesture Recognition with Variation Estimation for Interactive Systems | This article presents a gesture recognition/adaptation system for human-computer interaction applications that goes beyond activity classification and that, as a complement to gesture labeling, characterizes the movement execution. We describe a template-based recognition method that simultaneously aligns the input gesture to the templates using a Sequential Monte Carlo inference technique. Contrary to standard template-based methods based on dynamic programming, such as Dynamic Time Warping, the algorithm has an adaptation process that tracks gesture variation in real time. The method continuously updates, during execution of the gesture, the estimated parameters and recognition results, which offers key advantages for continuous human-machine interaction. The technique is evaluated in several different ways: Recognition and early recognition are evaluated on 2D onscreen pen gestures; adaptation is assessed on synthetic data; and both early recognition and adaptation are evaluated in a user study involving 3D free-space gestures. The method is robust to noise, and successfully adapts to parameter variation. Moreover, it performs recognition as well as or better than nonadapting offline template-based methods. |
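For reference, the offline dynamic-programming baseline this abstract contrasts with, Dynamic Time Warping, can be sketched in a few lines. This is a generic textbook DTW over 1-D sequences, not the paper's Sequential Monte Carlo method:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic Dynamic Time Warping distance between two 1-D sequences:
    the offline, non-adapting template-matching baseline."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed alignment moves:
            # insertion, deletion, or match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

template = [0.0, 1.0, 2.0, 1.0, 0.0]
gesture  = [0.0, 0.9, 2.1, 2.0, 1.0, 0.1]  # same shape, slightly stretched
print(dtw_distance(template, gesture))     # small distance despite the stretch
```

Because DTW aligns whole sequences after the fact, it cannot update its estimate mid-gesture, which is exactly the limitation the article's online adaptive method addresses.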
Decoupling for power gating: Sources of power noise and design strategies | Power gating is essential for controlling leakage power dissipation of modern chip designs. However, power gating introduces unique power delivery integrity issues and tradeoffs between switching and rush current (wake-up) supply noises. In addition, in power-gated power delivery networks (PDNs), the amount of power saving intrinsically trades off with power integrity. In this paper, we propose systemic decoupling capacitance optimization strategies that optimally balance between switching and rush current noises, and tradeoff between power integrity and wake-up time, hence power saving. Furthermore, we propose a novel re-routable decoupling capacitance concept to break the tight interaction between power integrity and power saving, providing further improved tradeoffs between the two. Our design strategies have been implemented in a simulation-based optimization flow and the conducted experimental results have demonstrated significant improvement on leakage power saving through the presented techniques. |
Self-referenced laser system for optical 3D seafloor mapping | Underwater high resolution 3D mapping implies a very accurate sensor pose estimation to fuse the sensor readings over time into one consistent model. In the case of optical mapping systems, the needed accuracy can easily lie outside of the specification of Doppler Velocity Logs normally used for pose estimation by remotely operated and autonomous underwater vehicles. This is especially the case for difficult terrains or for low altitude situations. Therefore, to improve performance in such situations, a self-referenced laser system is proposed performing micro navigation to significantly improve the resolution of optical underwater 3D seabed mapping. |
Supervised probabilistic principal component analysis | Principal component analysis (PCA) has been extensively applied in data mining, pattern recognition and information retrieval for unsupervised dimensionality reduction. When labels of data are available, e.g., in a classification or regression task, PCA is however not able to use this information. The problem is more interesting if only part of the input data are labeled, i.e., in a semi-supervised setting. In this paper we propose a supervised PCA model called SPPCA and a semi-supervised PCA model called S2PPCA, both of which are extensions of a probabilistic PCA model. The proposed models are able to incorporate the label information into the projection phase, and can naturally handle multiple outputs (i.e., in multi-task learning problems). We derive an efficient EM learning algorithm for both models, and also provide theoretical justifications of the model behaviors. SPPCA and S2PPCA are compared with other supervised projection methods on various learning tasks, and show not only promising performance but also good scalability. |
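The core idea behind supervised PCA, steering the projection toward label-relevant directions, can be illustrated crudely in numpy. This sketch is NOT the paper's EM algorithm for SPPCA; it simply takes principal directions of the joint, label-weighted data matrix, with the weight `alpha` a made-up knob for the illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 5-D where only the first coordinate predicts y.
X = rng.normal(size=(200, 5))
y = X[:, 0] + 0.1 * rng.normal(size=200)

def supervised_pca_sketch(X, y, k=1, alpha=1.0):
    """Project onto the top principal directions of the centered joint
    matrix [X, alpha*y], so label-relevant directions dominate. A crude
    stand-in for SPPCA's probabilistic, EM-trained model."""
    Z = np.column_stack([X, alpha * y.reshape(-1, 1)])
    Z = Z - Z.mean(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    W = Vt[:k, :X.shape[1]].T          # keep only the input-space loadings
    return (X - X.mean(axis=0)) @ W    # k-dimensional projection of X

proj = supervised_pca_sketch(X, y, k=1, alpha=5.0)
# With a large alpha the first direction aligns with feature 0, so the
# 1-D projection correlates strongly with the labels.
corr = abs(np.corrcoef(proj[:, 0], y)[0, 1])
print(corr > 0.8)
```

Unsupervised PCA on this `X` would have no reason to prefer feature 0, which is the motivating gap the SPPCA and S2PPCA models fill in a principled, probabilistic way.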
Roborobo! a Fast Robot Simulator for Swarm and Collective Robotics | Roborobo! is a multi-platform, highly portable robot simulator for large-scale collective robotics experiments. Roborobo! is coded in C++ and follows the KISS guideline ("Keep it simple"); its only external dependency is the widely available SDL library for fast 2D graphics. Roborobo! is based on a Khepera/ePuck model. It is targeted at fast single- and multi-robot simulation, and has already been used in more than a dozen published studies, mainly concerned with evolutionary swarm robotics, including environment-driven self-adaptation and distributed evolutionary optimization, as well as online onboard embodied evolution and embodied morphogenesis. |
TREC-10 Experiments at KAIST: Batch Filtering and Question Answering | 1. Introduction In TREC-10, we participated in two tasks: the batch filtering sub-task of the filtering task, and the question answering task. In the question answering task, we participated in three sub-tasks (main task, list task, and context task). In the batch filtering task, we experimented with a filtering technique that unifies the results of support vector machines for subtopics subdivided by incremental clustering. For a topic, we generated subtopics by detecting similar documents among the training relevant documents, and unified the results of the SVM classifiers for the subtopics with an OR set operation. In the question answering task, we submitted two runs for the main task (KAISTQAMAIN1, KAISTQAMAIN2), two runs for the list task (KAISTQALIST1, KAISTQALIST2), and one run for the context task (KAISTQACTX). |
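The OR set operation described above (accept a document for a topic if any per-subtopic classifier accepts it) is easy to sketch. The centroid-threshold "classifiers" below are a hypothetical stand-in for the SVMs in the actual run, kept dependency-free for illustration:

```python
import numpy as np

def or_filter(subtopic_classifiers, doc):
    """Unify per-subtopic decisions with an OR set operation: a document
    passes the topic filter if ANY subtopic classifier accepts it."""
    return any(clf(doc) for clf in subtopic_classifiers)

def make_centroid_clf(centroid, threshold=1.0):
    """Hypothetical subtopic model: accept a document vector if it lies
    within `threshold` of the subtopic centroid (stand-in for an SVM)."""
    centroid = np.asarray(centroid, dtype=float)
    return lambda doc: np.linalg.norm(np.asarray(doc, float) - centroid) < threshold

# Two subtopics of one topic, e.g. found by incremental clustering.
clfs = [make_centroid_clf([0.0, 0.0]), make_centroid_clf([5.0, 5.0])]

print(or_filter(clfs, [0.2, 0.1]))   # near subtopic 1 -> accepted
print(or_filter(clfs, [5.1, 4.9]))   # near subtopic 2 -> accepted
print(or_filter(clfs, [2.5, 2.5]))   # near neither    -> rejected
```

The point of the OR combination is recall: a document matching any facet of a multi-faceted topic is kept, which a single monolithic classifier might miss.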
MAPP: a Scalable Multi-Agent Path Planning Algorithm with Tractability and Completeness Guarantees | Multi-agent path planning is a challenging problem with numerous real-life applications. Running a centralized search such as A* in the combined state space of all units is complete and cost-optimal, but scales poorly, as the state space size is exponential in the number of mobile units. Traditional decentralized approaches, such as FAR and WHCA*, are faster and more scalable, being based on problem decomposition. However, such methods are incomplete and provide no guarantees with respect to the running time or the solution quality. They are not necessarily able to tell in a reasonable time whether they would succeed in finding a solution to a given instance. We introduce MAPP, a tractable algorithm for multi-agent path planning on undirected graphs. We present a basic version and several extensions. They have low-polynomial worst-case upper bounds for the running time, the memory requirements, and the length of solutions. Even though all algorithmic versions are incomplete in the general case, each provides formal guarantees on problems it can solve. For each version, we discuss the algorithm's completeness with respect to clearly defined subclasses of instances. Experiments were run on realistic game grid maps. MAPP solved 99.86% of all mobile units, which is 18–22% better than the percentage of FAR and WHCA*. MAPP marked 98.82% of all units as provably solvable during the first stage of plan computation. Parts of MAPP's computation can be re-used across instances on the same map. Speed-wise, MAPP is competitive with or significantly faster than WHCA*, depending on whether MAPP performs all computations from scratch. When data that MAPP can re-use are preprocessed offline and readily available, MAPP is slower than the very fast FAR algorithm by a factor of 2.18 on average. MAPP's solutions are on average 20% longer than FAR's solutions and 7–31% longer than WHCA*'s solutions. |
Online digital filters for biological signals: some fast designs for a small computer | After reviewing the design of a class of lowpass recursive digital filters having integer multiplier and linear phase characteristics, the possibilities for extending the class to include highpass, bandpass, and bandstop ('notch') filters are described. Experience with a PDP 11 computer has shown that these filters may be programmed simply using machine code, and that online operation at sampling rates up to about 8 kHz is possible. The practical application of such filters is illustrated by using a notch design to remove mains-frequency interference from an e.c.g. waveform. |
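The filter class reviewed in this abstract (recursive, integer multipliers, linear phase) includes the moving-sum comb filter, which needs only additions and subtractions and places spectral nulls at multiples of fs/M. The sketch below is a generic illustration of that class, not a specific design from the paper; the 500 Hz sampling rate and M = 10 are assumed values chosen to null 50 Hz mains:

```python
import numpy as np

def integer_comb_lowpass(x, M):
    """Recursive moving-sum filter y[n] = y[n-1] + x[n] - x[n-M].
    Coefficients are all +/-1 (integer multipliers), phase is linear,
    and the response has exact zeros at multiples of fs/M."""
    y = np.zeros(len(x))
    acc = 0.0
    buf = np.zeros(M)           # circular buffer holding the last M samples
    for n, sample in enumerate(x):
        acc += sample - buf[n % M]  # add new sample, drop the one M steps back
        buf[n % M] = sample
        y[n] = acc / M              # normalise the DC gain of M to 1
    return y

fs = 500.0                      # assumed sampling rate (Hz)
M = int(fs / 50.0)              # fs/M = 50 Hz -> null on the mains frequency
t = np.arange(0, 1.0, 1.0 / fs)
slow = np.sin(2 * np.pi * 2.0 * t)          # slow e.c.g.-like component
mains = 0.5 * np.sin(2 * np.pi * 50.0 * t)  # interference to remove
out = integer_comb_lowpass(slow + mains, M)

# After the M-sample start-up transient, the 50 Hz term is nulled and the
# slow component survives, delayed by the filter's (M-1)/2 sample group delay.
residual = out[M:] - np.sin(2 * np.pi * 2.0 * (t[M:] - (M - 1) / (2 * fs)))
```

The per-sample cost is one add, one subtract, and one buffer update, which is why such designs were fast enough for online use on a small machine like the PDP 11.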