Construction of neoglans penis: a new sculpturing technique from rectus abdominis myofascial flap.
INTRODUCTION Construction of a neoglans penis may be required following glans amputation at circumcision, strangulation by a hair coil, or self-mutilation, among other indications. It may also be combined with phalloplasty to imitate the natural appearance and to support a penile prosthesis. AIM This is a report on a novel technique of neoglans construction for a patient with an amputated glans penis as a result of circumcision injury. METHODS A rectus abdominis myofascial flap was used. The flap was designed to be a 12 x 4 cm segment of the infraumbilical portion of the muscle, based on the inferior epigastric vessels. The flap was harvested through a paramedian incision. The penis was partially degloved through a circumferential incision 1 cm below its summit. The distal penile skin was utilized to elongate the urethra, so that the urethral meatus would be at the tip of the neoglans. The flap was reflected and tunneled underneath the mons veneris and alongside the penis, to emerge distal to the summit of the penis. The flap was fashioned into the shape of a glans and secured in place around the neourethra. The impression of a corona was achieved by tucking the proximal edge of the flap to its undersurface. RESULT Six months following surgery, the patient had a neoglans penis, a corona, and a urethral meatus at the very tip. The neoglans had similar consistency, color, and shape to the normal glans. CONCLUSION Construction of a neoglans penis is possible using the described sculpturing techniques, with satisfactory cosmetic results.
Automated classification of contact lens type in iris images
Textured cosmetic lenses have long been known to present a problem for iris recognition. It was once believed that clear, soft contact lenses did not impact iris recognition accuracy. However, it has recently been shown that persons wearing clear, soft contact lenses experience an increased false non-match rate relative to persons not wearing contact lenses. Iris recognition systems need the ability to automatically determine if a person is (a) wearing no contact lens, (b) wearing a clear prescription lens, or (c) wearing a textured cosmetic lens. This work presents results of the first attempt that we are aware of to solve this three-class classification problem. Results show that it is possible to identify with high accuracy (96.5%) the images in which a textured cosmetic contact lens is present, but that correctly distinguishing between no lenses and soft lenses is a challenging problem.
RFID Coverage Extension Using Microstrip-Patch Antenna Array [Wireless Corner]
In this paper, a UHF-band 2 × 2 microstrip phased-array antenna is designed and implemented to extend the coverage of an RFID reader system. The phased-array antenna has four microstrip-patch antennas, three Wilkinson power dividers, and a transmission-line phase shifter. These are printed on a dielectric substrate with a dielectric constant of 4.5. The array has dimensions of 34 cm × 45 cm, operating at a frequency of 867 MHz, as specified in RFID Gen2 protocol European standards. The phased-array antenna has a measured directivity of 12.1 dB, and the main-beam direction can be steered to angles of ±40°, with a HPBW of 90°. The phased-array antenna is used as the receiving antenna in a commercial reader system. Experimental results indicate that the coverage of the RFID system with the phased-array antenna is superior to the coverage with a conventional broader-beamwidth microstrip-patch antenna. The proposed system can also be used for a wireless positioning system.
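The beam steering described in this abstract can be illustrated with a minimal sketch of an ideal uniform array factor. The sketch below assumes half-wavelength element spacing and lossless progressive phasing (the paper's actual element spacing and shifter design are not given here); a 2 × 2 planar array separates into two such linear factors, one per axis.

```python
import math, cmath

def array_factor_db(theta_deg, n=2, d_lambda=0.5, steer_deg=0.0):
    """Normalized array factor (dB) of an n-element uniform linear array
    with spacing d (in wavelengths) and a progressive inter-element phase
    that steers the main beam toward steer_deg."""
    theta = math.radians(theta_deg)
    steer = math.radians(steer_deg)
    # Progressive phase: the transmission-line shifter cancels the path
    # difference toward the steering direction.
    psi = 2 * math.pi * d_lambda * (math.sin(theta) - math.sin(steer))
    af = sum(cmath.exp(1j * k * psi) for k in range(n))
    return 20 * math.log10(max(abs(af) / n, 1e-12))

# Sweep the pattern: the peak sits at the steering angle.
angles = range(-90, 91)
steered = max(angles, key=lambda a: array_factor_db(a, steer_deg=40.0))
print(steered)  # -> 40
```

Steering to ±40°, as in the paper, simply means choosing the progressive phase so that `psi` vanishes at θ = ±40°.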
Participant uptake of the fecal immunochemical test decreases with the two-sample regimen compared with one-sample FIT
BACKGROUND Fecal immunochemical tests (FITs) are recommended to screen average-risk adults for colorectal cancer (CRC). Little research has examined whether a two-sample FIT affects participant uptake, compared with a one-sample FIT. Examining participant uptake is important, as evidence suggests that a two-sample FIT may increase the sensitivity to detect CRC. OBJECTIVE This study had two objectives: (i) to evaluate FIT completion in a population that received either a one-sample FIT kit (1-FIT) or a two-sample FIT kit (2-FIT) and (ii) to understand whether uptake varies by age, sex, or receipt of prior CRC screening. METHODS We conducted a randomized controlled trial in which 3081 participants who were aged between 50 and 75 years and were at an average risk for CRC, and who had requested FITs, randomly received 1-FIT (n=1540) or 2-FIT (n=1541) kits. FIT completion was defined as the completion and return of a one-sample test by the patients in the 1-FIT group or of both sample tests by those in the 2-FIT group. Cox proportional hazard regression models were used to determine the independent effect of group type (2-FIT vs. 1-FIT) on the completion of the FIT, adjusting for age, sex, and receipt of prior CRC screening. RESULTS The 2-FIT group had lower test completion rates (hazard ratio=0.87; 95% confidence interval=0.78-0.97; P=0.01) after adjusting for age, sex, and receipt of prior CRC screening. Participant uptake did not vary by age, sex, or receipt of prior CRC screening. CONCLUSION This unique, rigorous randomized controlled trial found that the 2-FIT regimen decreases completion of FIT. Further research is needed to understand whether decreases in participant uptake are offset by increased gains in test sensitivity.
Maximum Expected Likelihood Estimation for Zero-resource Neural Machine Translation
While neural machine translation (NMT) has made remarkable progress in translating a handful of resource-rich language pairs recently, parallel corpora are not always readily available for most language pairs. To deal with this problem, we propose an approach to zero-resource NMT via maximum expected likelihood estimation. The basic idea is to maximize the expectation with respect to a pivot-to-source translation model for the intended source-to-target model on a pivot-target parallel corpus. To approximate the expectation, we propose two methods to connect the pivot-to-source and source-to-target models. Experiments on two zero-resource language pairs show that the proposed approach yields substantial gains over baseline methods. We also observe that when trained jointly with the source-to-target model, the pivot-to-source translation model also obtains improvements over independent training.
The Edge of Violence : Towards Telling the Difference Between Violent and Non-Violent Radicalization
The use of NARX neural networks to predict chaotic time series
The prediction of chaotic time series with neural networks is a classic practical problem in dynamic systems. This paper does not propose a new model or methodology; rather, it carefully and thoroughly studies several aspects of a model for which little experimental data has been reported, and derives conclusions of general interest. Recurrent neural network (RNN) models are important not only for forecasting time series but also, more generally, for the control of dynamical systems. An RNN with a sufficiently large number of neurons is a nonlinear autoregressive moving average (NARMA) model, with “moving average” referring to the inputs. Prediction can thus be treated as identification of a dynamic process. This paper analyzes an RNN architecture with embedded memory, the Nonlinear AutoRegressive model with eXogenous inputs (NARX), which shows promising qualities for dynamic system applications. The performance of the NARX model is verified on several types of chaotic or fractal time series applied as input to the neural network, in relation to the number of neurons, the training algorithms, and the dimension of its embedded memory. In addition, this work attempts to use classic statistical methodologies (R/S rescaled range analysis and the Hurst exponent) to derive new methods for improving the efficiency of chaotic time series prediction with NARX. Key-Words: Chaotic Time Series, Hurst Exponent, Prediction, Recurrent Neural Networks, NARX Model
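The NARX structure itself (regress the next output on lagged outputs and an exogenous input) can be sketched without a neural network. The toy below is not the paper's model: a fixed quadratic feature map plus linear least squares stands in for the network's nonlinearity, and the series is a logistic map with a small exogenous drive; both are illustrative choices only.

```python
import random

def lstsq(X, y):
    """Solve the normal equations (X^T X) w = X^T y by Gaussian elimination."""
    n = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for i in reversed(range(n)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

# Toy chaotic series: logistic map perturbed by an exogenous input u.
random.seed(0)
u = [random.uniform(-1, 1) for _ in range(400)]
y = [0.3]
for t in range(399):
    y.append(3.8 * y[t] * (1 - y[t]) + 0.01 * u[t])

# NARX-style regression: next output from the lagged output, its square,
# and the exogenous input at the same step.
feats = lambda t: [1.0, y[t], y[t] * y[t], u[t]]
X = [feats(t) for t in range(300)]
w = lstsq(X, [y[t + 1] for t in range(300)])

# One-step-ahead prediction error on held-out points.
err = max(abs(sum(wi * f for wi, f in zip(w, feats(t))) - y[t + 1])
          for t in range(300, 399))
print(err)  # essentially zero: the feature map matches the true dynamics
```

Because the feature map spans the true dynamics exactly, least squares recovers the generating coefficients; a real NARX network learns such a map from data instead.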
Behavioural Finance : A Review and Synthesis
I provide a synthesis of the Behavioural finance literature over the past two decades. I review the literature in three parts, namely, (i) empirical and theoretical analyses of patterns in the cross-section of average stock returns, (ii) studies on trading activity, and (iii) research in corporate finance. Behavioural finance is an exciting new field because it presents a number of normative implications for both individual investors and CEOs. The papers reviewed here allow us to learn more about these specific implications.
Traffic congestion detection in large-scale scenarios using vehicle-to-vehicle communications
Cooperative vehicular systems are currently being investigated to design innovative ITS (Intelligent Transportation Systems) solutions for road traffic management and safety. Through the wireless exchange of information between vehicles, and between vehicles and infrastructure nodes, cooperative systems can support novel decentralized strategies for ubiquitous and more cost-attractive traffic monitoring. In this context, this paper presents and evaluates CoTEC (COperative Traffic congestion detECtion), a novel cooperative technique based on Vehicle-to-Vehicle (V2V) communications designed to detect road traffic congestion. CoTEC is evaluated under large-scale highway scenarios using iTETRIS, a unique open source simulation platform created to investigate the impact of cooperative vehicular systems. The obtained results demonstrate CoTEC’s capability to accurately detect and characterize road traffic congestion conditions under different traffic scenarios and V2V penetration rates. In particular, CoTEC achieves congestion detection probabilities higher than 90%. These results are obtained without overloading the cooperative communications channel. In fact, CoTEC reduces the communications overhead needed to detect road traffic congestion by 88% compared to related techniques.
A flat car-roof antenna module for phone and GPS applications
A flat car-roof antenna module for phone and GPS applications is proposed. The module consists of a planar inverted-F antenna (PIFA) for phone operation and a planar loop antenna for GPS operation. The two antennas lie in the same plane, parallel to the car roof, and are placed close to each other, so the module's footprint is only 50 × 80 mm2. The gap between the antenna plane and the car roof is only 19 mm; this low profile makes the module almost invisible when installed on a car roof. Even with this narrow gap, both antennas maintain high performance in their operating frequency bands.
Recurrent Topic-Transition GAN for Visual Paragraph Generation
A natural image usually conveys rich semantic content and can be viewed from different angles. Existing image description methods are largely restricted by small sets of biased visual paragraph annotations, and fail to cover rich underlying semantics. In this paper, we investigate a semi-supervised paragraph generative framework that is able to synthesize diverse and semantically coherent paragraph descriptions by reasoning over local semantic regions and exploiting linguistic knowledge. The proposed Recurrent Topic-Transition Generative Adversarial Network (RTT-GAN) builds an adversarial framework between a structured paragraph generator and multi-level paragraph discriminators. The paragraph generator generates sentences recurrently by incorporating region-based visual and language attention mechanisms at each step. The quality of generated paragraph sentences is assessed by multi-level adversarial discriminators from two aspects, namely, plausibility at sentence level and topic-transition coherence at paragraph level. The joint adversarial training of RTT-GAN drives the model to generate realistic paragraphs with smooth logical transition between sentence topics. Extensive quantitative experiments on image and video paragraph datasets demonstrate the effectiveness of our RTT-GAN in both supervised and semi-supervised settings. Qualitative results on telling diverse stories for an image verify the interpretability of RTT-GAN.
Life Cycle Modeling of Concrete Bridge Design: Comparison of Engineered Cementitious Composite Link Slabs and Conventional Steel Expansion Joints
Concrete infrastructure represents an enormous investment of materials, energy, and capital, and results in significant environmental burdens and social costs. There is an ongoing effort to identify material alternatives to conventional concrete. Life cycle assessment (LCA) is an important tool to evaluate the environmental performance of alternative infrastructure materials and designs. Here, we present a comparative LCA of two bridge deck systems over a 60 year service life: one using conventional steel expansion joints, and the other based on a link slab design using a concrete alternative, engineered cementitious composites (ECC). The ECC link slab design is expected to extend the bridge deck service life and reduce maintenance activities. A life cycle model was developed that accounts for materials production and distribution, construction and maintenance processes, construction-related traffic congestion, and end-of-life management. Results indicate that the ECC bridge deck system has significant advantages in environmental performance, including less life cycle energy consumption, 50% less solid waste generation, and 38% less raw material consumption. Construction-related traffic congestion is the greatest contributor to most life cycle impact categories. DOI: 10.1061/(ASCE)1076-0342(2005)11:1(51) CE Database subject headings: Energy consumption; Environmental impacts; Bridge design; Portland cements; Concrete pavements; Fiber reinforced materials.
Conformal field theory approach to Fermi liquids and other highly entangled states
The Fermi surface may be usefully viewed as a collection of ($1+1$)-dimensional chiral conformal field theories. This approach permits straightforward calculation of many anomalous ground-state properties of the Fermi gas, including entanglement entropy and number fluctuations. The ($1+1$)-dimensional picture also generalizes to finite temperature and the presence of interactions. We argue that the low-energy entanglement structure of Fermi liquid theory is universal, depending only on the geometry of the interacting Fermi surface. We also describe three additional systems in $3+1$ dimensions where a similar mechanism leads to a violation of the boundary law for entanglement entropy.
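The boundary-law violation mentioned here has a simple mode-counting gloss (a schematic sketch in the spirit of the abstract, with all constants omitted): partitioning the Fermi surface of a region of linear size $L$ into $\sim (L/\epsilon)^{d-1}$ patches, each behaving as a $(1+1)$-dimensional chiral mode whose entanglement entropy grows logarithmically, gives

$$
S(L) \;\sim\; N_{\text{patch}} \times \frac{c}{6}\,\ln L \;\propto\; L^{d-1}\ln L ,
$$

a logarithmic enhancement over the boundary law $S \propto L^{d-1}$ obeyed by gapped and many gapless systems.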
A Lazy Concurrent List-Based Set Algorithm
List-based implementations of sets are a fundamental building block of many concurrent algorithms. A skiplist based on the lock-free list-based set algorithm of Michael will be included in the Java Concurrency Package of JDK 1.6.0. However, Michael’s lock-free algorithm has several drawbacks, most notably that it requires all list traversal operations, including membership tests, to perform cleanup operations of logically removed nodes, and that it uses the equivalent of an atomically markable reference, a pointer that can be atomically “marked,” which is expensive in some languages and unavailable in others. We present a novel “lazy” list-based implementation of a concurrent set object. It is based on an optimistic locking scheme for inserts and removes, eliminating the need to use the equivalent of an atomically markable reference. It also has a novel wait-free membership test operation (as opposed to Michael’s lock-free one) that does not need to perform cleanup operations and is more efficient than that of all previous algorithms. Empirical testing shows that the new lazy-list algorithm consistently outperforms all known algorithms, including Michael’s lock-free algorithm, throughout the concurrency range. At high load, with 90% membership tests, the lazy algorithm is more than twice as fast as Michael’s. This is encouraging given that typical search structure usage patterns include around 90% membership tests. By replacing the lock-free membership test of Michael’s algorithm with our new wait-free one, we achieve an algorithm that slightly outperforms our new lazy-list (though it may not be as efficient in other contexts as it uses Java’s RTTI mechanism to create pointers that can be atomically marked).
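The lazy-list algorithm described above can be sketched compactly. The sketch below (the paper's context is Java; Python is used here for brevity, and CPython's GIL means it illustrates the algorithm rather than its performance) shows the three signature ingredients: per-node locks with post-lock validation, logical removal via a marked bit before physical unlinking, and a contains() that traverses once with no locks and no cleanup.

```python
import threading

class Node:
    __slots__ = ("key", "next", "marked", "lock")
    def __init__(self, key):
        self.key = key
        self.next = None
        self.marked = False          # logical-deletion flag
        self.lock = threading.Lock()

class LazyListSet:
    """Lazy concurrent list-based set: optimistic lock-based add/remove
    with validation, and a cleanup-free membership test."""
    def __init__(self):
        self.head = Node(float("-inf"))      # sentinel nodes
        self.head.next = Node(float("inf"))

    def _find(self, key):
        pred = self.head
        curr = pred.next
        while curr.key < key:
            pred, curr = curr, curr.next
        return pred, curr

    def _validate(self, pred, curr):
        # Neither node logically removed, and they are still adjacent.
        return (not pred.marked) and (not curr.marked) and pred.next is curr

    def add(self, key):
        while True:
            pred, curr = self._find(key)      # optimistic, lock-free search
            with pred.lock, curr.lock:
                if self._validate(pred, curr):
                    if curr.key == key:
                        return False          # already present
                    node = Node(key)
                    node.next = curr
                    pred.next = node          # physical insertion
                    return True
            # validation failed: a concurrent operation interfered; retry

    def remove(self, key):
        while True:
            pred, curr = self._find(key)
            with pred.lock, curr.lock:
                if self._validate(pred, curr):
                    if curr.key != key:
                        return False
                    curr.marked = True        # logical removal first...
                    pred.next = curr.next     # ...then physical unlink
                    return True

    def contains(self, key):
        # Membership test: one traversal, no locks, no cleanup.
        curr = self.head
        while curr.key < key:
            curr = curr.next
        return curr.key == key and not curr.marked

s = LazyListSet()
s.add(3); s.add(1)
s.remove(3)
print(s.contains(1), s.contains(3))  # True False
```

Marking before unlinking is what makes the unsynchronized contains() safe: a traversal that lands on a removed node still sees `marked` set and reports absence correctly.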
Context based object categorization: A critical survey
The goal of object categorization is to locate and identify instances of an object category within an image. Recognizing an object in an image is difficult when images include occlusion, poor quality, noise or background clutter, and this task becomes even more challenging when many objects are present in the same scene. Several models for object categorization use appearance and context information from objects to improve recognition accuracy. Appearance information, based on visual cues, can successfully identify object classes up to a certain extent. Context information, based on the interaction among objects in the scene or global scene statistics, can help successfully disambiguate appearance inputs in recognition tasks. In this work we address the problem of incorporating different types of contextual information for robust object categorization in computer vision. We review different ways of using contextual information in the field of object categorization, considering the most common levels of extraction of context and the different levels of contextual interactions. We also examine common machine learning models that integrate context information into object recognition frameworks and discuss scalability, optimizations and possible future approaches.
Human Young Children as well as Adults Demonstrate ‘Superior’ Rapid Snake Detection When Typical Striking Posture Is Displayed by the Snake
Humans as well as some nonhuman primates have an evolved predisposition to associate snakes with fear, detecting their presence as fear-relevant stimuli more rapidly than fear-irrelevant ones. In the present experiment, a total of 74 participants (3- to 4-year-old children and adults) were asked to find a single target black-and-white photo of a snake among an array of eight black-and-white photos of flowers as distracters. As target stimuli, we prepared two groups of snake photos: one in which the snake displayed a typical striking posture, and one in which a resting snake was shown. When reaction time to find the snake photo was compared between these two types of stimuli, mean reaction time was significantly shorter for the photos of snakes displaying a striking posture than for the photos of resting snakes, in both adults and children. These findings suggest that the human perceptual bias for snakes per se could be differentiated according to the degree to which their presence acts as a fear-relevant stimulus.
Brain size and cognitive ability: Correlations with age, sex, social class, and race.
Using data from magnetic resonance imaging (MRI), autopsy, endocranial measurements, and other techniques, we show that (1) brain size is correlated with cognitive ability about .44 using MRI; (2) brain size varies by age, sex, social class, and race; and (3) cognitive ability varies by age, sex, social class, and race. Brain size and cognitive ability show a curvilinear relation with age, increasing to young adulthood and then decreasing; increasing from women to men; increasing with socioeconomic status; and increasing from Africans to Europeans to Asians. Although only further research can determine if such correlations represent cause and effect, it is clear that the direction of the brain-size/cognitive-ability relationships described by Paul Broca (1824-1880), Francis Galton (1822-1911), and other nineteenth-century visionaries is true, and that the null hypothesis of no relation, strongly advocated over the last half century, is false.
Collision detection between general geometric models: a survey
In this paper, we survey the state of the art in collision detection between general geometric models. The set of models includes polygonal objects, spline and algebraic surfaces, CSG models, and deformable bodies. We present a number of techniques and systems available for contact determination. We also describe several N-body algorithms to reduce the number of pairwise intersection tests.
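One standard N-body technique of the kind surveyed here is sweep-and-prune over axis-aligned bounding boxes (AABBs): sort boxes along one axis and sweep, so only boxes whose intervals overlap on that axis are ever tested pairwise. A minimal 2-D sketch (broad phase only; a real system would follow with narrow-phase contact determination on the surviving pairs):

```python
def sweep_and_prune(boxes):
    """Broad-phase pruning: boxes is a list of (min_x, min_y, max_x, max_y)
    AABBs; returns the index pairs whose boxes overlap, found without
    testing all O(n^2) pairs."""
    order = sorted(range(len(boxes)), key=lambda i: boxes[i][0])
    pairs, active = [], []
    for i in order:
        min_x = boxes[i][0]
        # Drop boxes whose x-interval ended before this one begins.
        active = [j for j in active if boxes[j][2] >= min_x]
        for j in active:
            # Candidate pair: x-intervals overlap; confirm on y.
            if boxes[i][1] <= boxes[j][3] and boxes[j][1] <= boxes[i][3]:
                pairs.append(tuple(sorted((i, j))))
        active.append(i)
    return pairs

boxes = [(0, 0, 2, 2), (1, 1, 3, 3), (5, 5, 6, 6), (1.5, 4, 2.5, 5.5)]
print(sweep_and_prune(boxes))  # -> [(0, 1)]
```

In animated scenes the sort is typically maintained incrementally (insertion sort over nearly-sorted lists), which is what makes the method attractive for large N-body environments.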
Approximation Methods for Gaussian Process Regression
A wealth of computationally efficient approximation methods for Gaussian process regression have recently been proposed. We give a unifying overview of sparse approximations, following Quiñonero-Candela and Rasmussen (2005), and a brief review of approximate matrix-vector multiplication methods.
Video Identification Solution Using a “Video Signature”
NEC’s video identification technology enables instant video content identification by extracting a unique descriptor called a “video signature” from video content. This technology is approved as the MPEG-7 Video Signature Tool, an international standard of interoperable descriptors used for video identification. The video signature is an extremely robust tool for identifying videos that have undergone alterations and editing effects, and it facilitates search for very short video scenes. It also has a compact design, making ultrafast searches possible on a compact system. In this paper, we propose video identification solutions for the mass media industries. They adopt video signatures as metadata descriptions to enable efficient video registration operations and the visualization of relationships between video content.
A Guided Genetic Algorithm for the Planning in Lunar Lander Games
We propose a guided genetic algorithm (GA) for planning in games. In a guided GA, an extra reinforcement component is inserted into the evolution procedure of the GA. During each evolution procedure, the reinforcement component simulates the execution of an individual's series of actions before the real trial and adjusts the series according to the reinforcement, thereby attempting to improve performance. We then apply the method to a Lunar Lander game, in which a falling lunar module must learn to land safely on a platform. We compare the performance of the guided GA with that of a general GA as well as Q-learning on the game. The results show that the guided GA is guaranteed to reach the goal and achieves much higher performance than the general GA and Q-learning.
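The guidance loop described above can be sketched on a one-dimensional toy lander. Everything below is an illustrative stand-in, not the paper's implementation: the genome is a thrust on/off sequence, and the "reinforcement component" rehearses the plan, nudges one action toward the goal, and keeps the nudge only if the rehearsal improves fitness.

```python
import random

G, THRUST, T, SAFE = 1.0, 2.0, 60, 1.0  # gravity, thrust accel, steps, safe speed

def simulate(genes, h=100.0, v=0.0):
    """Integrate the fall step by step; return (landed, touchdown_speed)."""
    for g in genes:
        v += G - THRUST * g              # thrust counteracts gravity
        h -= v
        if h <= 0:
            return True, abs(v)
    return False, abs(v)

def fitness(genes):
    landed, speed = simulate(genes)
    return -speed if landed else -1000.0  # never landing is worst

def guide(genes):
    """Reinforcement component: rehearse the action series, nudge one
    action toward a soft landing, keep the nudge only if it helps."""
    cand = genes[:]
    landed, speed = simulate(cand)
    if landed and speed > SAFE:
        zeros = [i for i, g in enumerate(cand) if g == 0]
        if zeros:
            cand[random.choice(zeros)] = 1   # brake somewhere
    elif not landed:
        ones = [i for i, g in enumerate(cand) if g == 1]
        if ones:
            cand[random.choice(ones)] = 0    # thrust less, descend
    return cand if fitness(cand) >= fitness(genes) else genes

def guided_ga(pop_size=40, gens=200, seed=1):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(T)] for _ in range(pop_size)]
    for _ in range(gens):
        pop = [guide(ind) for ind in pop]            # guidance phase
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) >= -SAFE:                 # goal: soft landing
            break
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, T)
            child = a[:cut] + b[cut:]                # one-point crossover
            child[random.randrange(T)] ^= 1          # point mutation
            children.append(child)
        pop = parents + children
    return pop[0]

best = guided_ga()
landed, speed = simulate(best)
print(landed, speed)
```

With the accept-only-if-better rule, guidance can never degrade an individual, so the elite's fitness is monotone; the toy run typically reaches a soft landing well before the generation budget.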
Confidence sets for persistence diagrams
Persistent homology is a method for probing topological properties of point clouds and functions. The method involves tracking the birth and death of topological features as one varies a tuning parameter. Features with short lifetimes are informally considered to be “topological noise,” and those with a long lifetime are considered to be “topological signal.” In this paper, we bring some statistical ideas to persistent homology. In particular, we derive confidence sets that allow us to separate topological signal from topological noise.
Component-Based Representation in Automated Face Recognition
This paper presents a framework for component-based face alignment and representation that demonstrates improvements in matching performance over the more common holistic approach to face alignment and representation. This work is motivated by recent evidence from the cognitive science community demonstrating the efficacy of component-based facial representations. The component-based framework presented in this paper consists of the following major steps: 1) landmark extraction using Active Shape Models (ASM), 2) alignment and cropping of components using Procrustes Analysis, 3) representation of components with Multiscale Local Binary Patterns (MLBP), 4) per-component measurement of facial similarity, and 5) fusion of per-component similarities. We demonstrate on three public datasets and an operational dataset consisting of face images of 8000 subjects, that the proposed component-based representation provides higher recognition accuracies over holistic-based representations. Additionally, we show that the proposed component-based representations: 1) are more robust to changes in facial pose, and 2) improve recognition accuracy on occluded face images in forensic scenarios.
Broadband biquad UHF antenna array for DOA
In this contribution, a broadband biquad antenna array for direction-of-arrival (DOA) applications is investigated. Simulation and measurement results are presented for the single broadband biquad antenna element, along with results for antenna array configurations for vertically polarised, horizontally polarised, or polarisation-independent direction-of-arrival estimation using different algorithms.
Safety and efficacy of 4-aminopyridine in humans with spinal cord injury: a long-term, controlled trial.
STUDY OBJECTIVE To determine the effects of the long-term administration of 4-aminopyridine (4-AP) on sensorimotor function in humans with long-standing spinal cord injury (SCI). DESIGN Randomized, open-label, active-treatment control, dosage-blinded study. SETTING University-affiliated, tertiary-level care, Department of Veterans Affairs Medical Center. PATIENTS Twenty-one healthy men and women outpatients suffering from traumatic SCI (14 tetraplegic, 7 paraplegic) for 2 years or more. INTERVENTIONS Dosages of an immediate-release formulation of 4-AP were titrated. At 3 months, 16 subjects were receiving 4-AP 30 mg/day (high dose); 5 subjects were receiving 4-AP 6 mg/day (low dose) and served as an active-treatment control group. MEASUREMENTS AND MAIN RESULTS Composite motor and sensory scores had statistically significant increases at 3 months. Maximal expiratory pressure, maximal inspiratory pressure, forced vital capacity, and forced expiratory volume in 1 second showed clinically meaningful and/or statistically significant increases among patients receiving 4-AP 30 mg/day. These subjects also had significant decreases in spasticity (modified Ashworth Scale). Serial biochemical profiles and electroencephalographs were unchanged from baseline, and no clinically significant drug toxicity was encountered. CONCLUSIONS Long-term oral administration of immediate-release 4-AP was associated with improvement in and recovery of sensory and motor function, enhanced pulmonary function, and diminished spasticity in patients with long-standing SCI. 4-Aminopyridine appears to be safe and relatively free from toxicity when administered orally over 3 months. Each patient who received immediate-release 4-AP 30 mg/day showed a response in one or more of the outcome measures.
Real-Time Simulation of a Wind Turbine Generator Coupled With a Battery Supercapacitor Energy Storage System
Wind power generation studies of slow phenomena using a detailed model can be difficult to perform with a conventional offline simulation program. Due to the computational power and high-speed input and output, a real-time simulator is capable of conducting repetitive simulations of wind profiles in a short time with detailed models of critical components and allows testing of prototype controllers through hardware-in-the-loop (HIL). This paper discusses methods to overcome the challenges of real-time simulation of wind systems, characterized by their complexity and high-frequency switching. A hybrid flow-battery supercapacitor energy storage system (ESS), coupled in a wind turbine generator to smooth wind power, is studied by real-time HIL simulation. The prototype controller is embedded in one real-time simulator, while the rest of the system is implemented in another independent simulator. The simulation results of the detailed wind system model show that the hybrid ESS has a lower battery cost, higher battery longevity, and improved overall efficiency over its reference ESS.
WEIGHTED TEMPORAL PATTERN MINING WITH DIMENSIONALITY REDUCTION USING MODIFIED AFCM TECHNIQUE
Frequent itemset mining from a time series database is a difficult task. Various techniques have been proposed to mine frequent associations from temporal databases, but the huge size of the database and frequent time-based updates make frequent itemset mining inefficient. Hence, we propose a dimensionality reduction method that reduces the quantity of data considered for mining. In the proposed system, the time-based data are first converted into fuzzy data. These fuzzy data are provided as input to the proposed Modified Adaptive Fuzzy C-Means (MoAFCM) algorithm, which combines the FCM clustering algorithm with cuckoo search optimization. FCM performs dimensionality reduction on the fuzzy data, and clustering is performed by the combination of FCM and cuckoo search optimization, yielding optimized clusters. The resulting clusters contain reference points instead of the original data, and optimization by cuckoo search leads to better-quality clusters. Weighted temporal pattern mining is then performed on these clusters to identify effective temporal patterns, taking into account patterns with low frequency but high weight in a database that undergoes time-based updates. The proposed technique is implemented on the MATLAB platform, and its performance is evaluated using a weather forecast dataset. KEYWORDS: Time series database, Dimensionality reduction, Modified Adaptive Fuzzy C Means (MoAFCM), Cuckoo search optimization, Weighted temporal pattern mining.
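The core of the clustering step above is standard fuzzy c-means, which the paper modifies and couples with cuckoo search. A minimal sketch of plain FCM on 1-D data (the cuckoo-search component and the temporal weighting are omitted; the returned centers play the role of the "reference points" mentioned above):

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=100):
    """Plain FCM: alternate membership updates u[j][i] proportional to
    1/d(x_j, c_i)^(2/(m-1)) (normalized per point) and center updates as
    u^m-weighted means. Returns (centers, memberships)."""
    centers = [min(xs), max(xs)] if c == 2 else xs[:c]
    for _ in range(iters):
        u = []
        for x in xs:
            d = [abs(x - ck) or 1e-12 for ck in centers]  # guard zero distance
            row = [1.0 / sum((d[i] / d[k]) ** (2 / (m - 1)) for k in range(c))
                   for i in range(c)]
            u.append(row)
        centers = [sum(u[j][i] ** m * xs[j] for j in range(len(xs))) /
                   sum(u[j][i] ** m for j in range(len(xs)))
                   for i in range(c)]
    return centers, u

xs = [0.8, 1.0, 1.2, 8.9, 9.0, 9.1]
centers, u = fuzzy_c_means(xs)
print(sorted(round(ck, 1) for ck in centers))  # centers near 1.0 and 9.0
```

Each data point carries a graded membership in every cluster, which is what lets downstream pattern mining weight points rather than commit to hard assignments.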
Interface Structure of Graphene on SiC for Various Preparation Conditions
Security analysis and enhancements of 3GPP authentication and key agreement protocol
This paper analyzes the authentication and key agreement protocol adopted by the Universal Mobile Telecommunication System (UMTS), an emerging standard for third-generation (3G) wireless communications. The protocol, known as 3GPP AKA, is based on the security framework in GSM and provides significant enhancements to address and correct real and perceived weaknesses in GSM and other wireless communication systems. In this paper, we first show that the 3GPP AKA protocol is vulnerable to a variant of the so-called false base station attack. The vulnerability allows an adversary to redirect user traffic from one network to another. It also allows an adversary to use authentication vectors obtained from one corrupted network to impersonate all other networks. Moreover, we demonstrate that the use of synchronization between a mobile station and its home network incurs considerable difficulty for the normal operation of 3GPP AKA. To address these security problems in the current 3GPP AKA, we then present a new authentication and key agreement protocol which defeats the redirection attack and drastically lowers the impact of network corruption. The protocol, called AP-AKA, also eliminates the need for synchronization between a mobile station and its home network. AP-AKA specifies a sequence of six flows. Depending on the execution environment, entities in the protocol have the flexibility of adaptively selecting flows for execution, which helps to optimize the efficiency of AP-AKA both in the home network and in foreign networks.
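The redirection defense described above can be illustrated with a toy challenge-response sketch. This is not the actual 3GPP algorithm set or the AP-AKA flows: all function names are hypothetical, and HMAC-SHA256 stands in for the standardized functions. The point shown is only the binding idea: when the serving network's identity is included in the MAC, a challenge redirected to a different network fails verification at the mobile station.

```python
import hmac, hashlib, os

K = os.urandom(32)  # long-term secret shared by the SIM and the home network

def prf(key, *parts):
    """Keyed derivation for MACs and expected responses (toy stand-in)."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

def make_challenge(net_id):
    """Home network: build a challenge bound to the serving network's id."""
    rand = os.urandom(16)
    mac = prf(K, b"mac", rand, net_id)
    xres = prf(K, b"res", rand, net_id)   # expected response kept server-side
    return rand, mac, xres

def ms_respond(k, rand, mac, serving_net_id):
    """Mobile station: check the MAC against the network it is actually
    talking to; a redirected challenge fails this check."""
    if not hmac.compare_digest(mac, prf(k, b"mac", rand, serving_net_id)):
        return None                        # reject: wrong or spoofed network
    return prf(k, b"res", rand, serving_net_id)

rand, mac, xres = make_challenge(b"net-A")
res = ms_respond(K, rand, mac, b"net-A")
print(res == xres)                          # True: legitimate network
print(ms_respond(K, rand, mac, b"net-B"))   # None: redirection rejected
```

In plain 3GPP AKA the authentication vector does not bind the serving network's identity in this way, which is exactly the gap the redirection attack exploits.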
leanTAP: Lean Tableau-Based Theorem Proving (Extended Abstract)
"prove((E,F),A,B,C,D) :- !, prove(E,[F|A],B,C,D). prove((E;F),A,B,C,D) :- !, prove(E,A,B,C,D), prove(F,A,B,C,D). prove(all(H,I),A,B,C,D) :- !, \+ length(C,D), copy_term((H,I,C),(G,F,C)), append(A,[all(H,I)],E), prove(F,E,B,[G|C],D). prove(A,_,[C|D],_,_) :- ((A = -(B); -(A) = B)) -> (unify(B,C); prove(A,[],D,_,_)). prove(A,[E|F],B,C,D) :- prove(E,F,[A|B],C,D)." implements a first-order theorem prover based on free-variable semantic tableaux. It is complete, sound, and efficient.
'The dots just don't join up': Understanding the support needs of families of children on the autism spectrum.
Much research has documented the elevated levels of stress experienced by families of autistic children. Yet remarkably little research has examined the types of support that these families perceive to be beneficial to their lives. This study, co-produced by researchers and school-based professionals, sought to establish these families' support needs from their own perspectives. In total, 139 parents of autistic children with additional intellectual disabilities and limited spoken communication, all attending an inner-city London school, participated in an initial survey examining parental wellbeing, self-efficacy and the extent to which they felt supported. Semi-structured interviews were conducted with a subgroup of parents ( n = 17), some of whom reported in the survey that they felt unsupported, in order to gain their in-depth perspectives. The results from both the survey and the interviews suggested that existing support (particularly from formal support services) was not meeting parents' needs, which ultimately made them feel isolated and alienated. Parents who were interviewed called for service provision that adopted a relational, family-centred approach - one that understands the specific needs of the whole family, builds a close working relationship with them and ensures that they are supported at times when the parents and families feel they need it most.
Ranibizumab for neovascular age-related macular degeneration.
PURPOSE The pharmacology, pharmacokinetics, clinical efficacy, safety, pharmacoeconomics, and place in therapy of ranibizumab are reviewed. SUMMARY Ranibizumab is the humanized fragment of the murine monoclonal antibody that binds all the active forms of the vascular endothelial growth factor, leading to the inhibition of the neovascular process underlying age-related macular degeneration (AMD). In animal studies, intravitreal administration of ranibizumab resulted in penetration of the drug into all layers of the retina and subsequent slow absorption into the systemic circulation. Improvement in visual acuity by 15 or more letters has been observed in 33.8-40.3% of patients treated with ranibizumab in pivotal clinical trials, compared with 5% of patients treated with sham injections and photodynamic therapy (PDT). The addition of PDT to ranibizumab has not been shown to offer any benefit in terms of efficacy and has been found to worsen ocular adverse reactions. The most common adverse ocular reactions reported in patients receiving ranibizumab during clinical trials include conjunctival hemorrhage, eye pain, vitreous floaters, increased intraocular pressure, and intraocular inflammation. Ranibizumab's efficacy in the treatment of neovascular AMD is well established; however, questions remain regarding the drug's optimal dosing strategy, duration of therapy, and combined therapy with other agents. While ranibizumab has been defined as the best available weapon against AMD, it is also the most expensive. CONCLUSION The efficacy of ranibizumab in the treatment of AMD is well established, but more studies are needed to determine ranibizumab's optimal dosage interval, duration of therapy, and combined use with other agents.
Pulse transit time and pulse width as potential measure for estimating beat-to-beat systolic and diastolic blood pressure
Two cardiovascular parameters of emerging interest suitable for estimation of non-invasive, beat-to-beat, and without cuff, blood pressure parameters are pulse width (PW) and pulse transit time (PTT). In this study the performance of both parameters in estimating beat-to-beat systolic blood pressure (SBP) and diastolic blood pressure (DBP) is analyzed. The overall data set used in the study includes synchronous electrocardiogram signal (ECG), pulse photoplethysmography signal (PPG) and continuous blood pressure signal of 16 healthy subjects during tilt table test, which provokes significant changes in SBP and DBP due to postural changes.
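Studies of this kind commonly calibrate a per-subject linear model from PTT to blood pressure as a baseline. The sketch below shows such a calibration; the function names and the linear form are illustrative assumptions, not necessarily the authors' exact estimation model:

```python
import numpy as np

def fit_bp_from_ptt(ptt, sbp):
    """Least-squares fit of the common linear surrogate SBP ~ a*PTT + b.

    A per-subject calibration like this is a standard baseline in
    cuffless blood pressure estimation; it is shown here only to
    illustrate the PTT-to-BP mapping idea from the abstract.
    """
    A = np.column_stack([ptt, np.ones_like(ptt)])
    (a, b), *_ = np.linalg.lstsq(A, sbp, rcond=None)
    return a, b

def estimate_sbp(ptt, a, b):
    """Beat-to-beat SBP estimate from a single PTT measurement."""
    return a * ptt + b
```

The same form can be refit for DBP, and the pulse width (PW) can be added as a second regressor in the design matrix.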
Percutaneous tumor curettage and interstitial delivery of samarium-153 coupled with kyphoplasty for treatment of vertebral metastases.
OBJECT The object of this study was to investigate the use of a minimally invasive technique for treating metastatic tumors of the vertebral body, aimed at relieving pain, preventing further tumor growth, and minimizing the adverse effects of systemic use of samarium-153 ((153)Sm). METHODS The procedure is performed in the same fashion as a kyphoplasty, using a unilateral extrapedicular approach under local anesthesia/mild general sedation, with the patient in the lateral decubitus position. The tumor is accessed as in a standard kyphoplasty. The side is chosen according to the location of the metastasis. Prior to inflation of the balloon the tumor is debulked by percutaneous curettage. Balloon inflation is carried out as per standard kyphoplasty in an attempt to create a larger space and reduce a possible kyphotic deformity. Three mCi of (153)Sm-EDTMP (ethylenediaminetetramethylenephosphonic acid) is then mixed with bone cement (polymethylmethacrylate) and injected into the void created by the balloon tamp. RESULTS Twenty-four procedures were performed in 19 patients. There was reliable and reproducible delivery of the radiolabeled (153)Sm-EDTMP to the metastatic site, without spillage. The procedure was safe. There were no procedure-related complications. There was no hematological toxicity with the low doses of (153)Sm used. Pain improved in all patients. The long-term results related to tumor control continue to be investigated. CONCLUSIONS Combined percutaneous debulking of confined vertebral metastases and administration of local (153)Sm is feasible and safe. Furthermore, this technique leads to immediate relief of cancer-related pain and may help prevent or slow down the progression of vertebral metastatic tumors.
A Bayesian game approach for intrusion detection in wireless ad hoc networks
In wireless ad hoc networks, although defense strategies such as intrusion detection systems (IDSs) can be deployed at each mobile node, significant constraints are imposed in terms of the energy expenditure of such systems. In this paper, we propose a game theoretic framework to analyze the interactions between pairs of attacking/defending nodes using a Bayesian formulation. We study the achievable Nash equilibrium for the attacker/defender game in both static and dynamic scenarios. The dynamic Bayesian game is a more realistic model, since it allows the defender to consistently update his belief on his opponent's maliciousness as the game evolves. A new Bayesian hybrid detection approach is suggested for the defender, in which a lightweight monitoring system is used to estimate his opponent's actions, and a heavyweight monitoring system acts as a last resort of defense. We show that the dynamic game produces energy-efficient monitoring strategies for the defender, while improving the overall hybrid detection power.
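The defender's belief update described above is a standard application of Bayes' rule: the posterior probability that the opponent is malicious is recomputed after each observed action. The likelihood values below are illustrative placeholders, not parameters from the paper:

```python
def update_belief(prior, obs_attack,
                  p_attack_if_malicious=0.7, p_attack_if_benign=0.05):
    """Bayes update of the defender's belief that the opponent is malicious.

    prior: current belief P(malicious). obs_attack: whether the lightweight
    monitor flagged an attack this stage. The two conditional probabilities
    are hypothetical likelihoods chosen for illustration only.
    """
    if obs_attack:
        like_m, like_b = p_attack_if_malicious, p_attack_if_benign
    else:
        like_m, like_b = 1 - p_attack_if_malicious, 1 - p_attack_if_benign
    num = prior * like_m
    return num / (num + (1 - prior) * like_b)
```

In the dynamic game, the defender would activate the heavyweight monitoring system only once this posterior crosses a threshold, which is what makes the strategy energy-efficient.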
Firefly Algorithm for Unconstrained Optimization
Meta-heuristic algorithms prove to be competent in outperforming deterministic algorithms for real-world optimization problems. The Firefly algorithm is one such recently developed algorithm, inspired by the flashing behavior of fireflies. In this work, a detailed formulation and explanation of the Firefly algorithm implementation is given. The Firefly algorithm is then verified using six unimodal engineering optimization problems reported in the specialized literature.
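The canonical firefly movement rule (as in Yang's original formulation) can be sketched as follows. Parameter values are typical defaults, not necessarily those used in the cited experiments:

```python
import numpy as np

def firefly_step(X, f, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    """One iteration of the canonical firefly update, minimizing f.

    Each firefly i moves toward every brighter firefly j:
        x_i += beta0 * exp(-gamma * r_ij^2) * (x_j - x_i) + alpha * (rand - 0.5)
    Lower f(x) means brighter. alpha, beta0, gamma are common defaults,
    shown only to illustrate the update described in the abstract.
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    intensity = np.array([f(x) for x in X])
    X_new = X.copy()
    for i in range(n):
        for j in range(n):
            if intensity[j] < intensity[i]:          # j is brighter than i
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                X_new[i] += beta * (X[j] - X_new[i]) + alpha * (rng.random(d) - 0.5)
    return X_new
```

In practice alpha is decayed over iterations so the swarm first explores, then contracts around the best solution found.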
Ka-Band Rectangular Waveguide to Suspended Stripline Transition
A Ka-band transition between a rectangular waveguide and a suspended stripline (SSL) is proposed. It uses a configuration in which the SSL substrate is perpendicular to the waveguide main axis. A patch printed on the SSL substrate is used to match the characteristic impedance of the waveguide. A set of formulas to facilitate the design of the element is presented. A prototype of the transition was fabricated and the experimental results agree well with the simulations. The proposed element can have the input matching below -15 dB in a 13% bandwidth while maintaining the insertion loss below 0.22 dB.
ADMIT: anomaly-based data mining for intrusions
Security of computer systems is essential to their acceptance and utility. Computer security analysts use intrusion detection systems to assist them in maintaining computer system security. This paper deals with the problem of differentiating between masqueraders and the true user of a computer terminal. Prior efficient solutions are less suited to real time application, often requiring all training data to be labeled, and do not inherently provide an intuitive idea of what the data model means. Our system, called ADMIT, relaxes these constraints, by creating user profiles using semi-incremental techniques. It is a real-time intrusion detection system with host-based data collection and processing. Our method also suggests ideas for dealing with concept drift and affords a detection rate as high as 80.3% and a false positive rate as low as 15.3%.
Deep Learning for solar power forecasting — An approach using AutoEncoder and LSTM Neural Networks
Power forecasting of renewable energy power plants is a very active research field, as reliable information about future power generation allows for safe operation of the power grid and helps to minimize the operational costs of these energy sources. Deep Learning algorithms have been shown to be very powerful in forecasting tasks, such as economic time series or speech recognition. Up to now, Deep Learning algorithms have only been applied sparsely for forecasting renewable energy power plants. By using different Deep Learning and Artificial Neural Network algorithms, such as Deep Belief Networks, AutoEncoder, and LSTM, we introduce these powerful algorithms to the field of renewable energy power forecasting. In our experiments, we used combinations of these algorithms to show their forecasting strength compared to a standard MLP and a physical forecasting model in forecasting the energy output of 21 solar power plants. Our results show that the Deep Learning algorithms deliver superior forecasting performance compared to Artificial Neural Networks as well as other reference models such as physical models.
Collision Detection and Avoidance in Computer Controlled Manipulators
The problem of planning safe trajectories for computer controlled manipulators with two movable links and multiple degrees of freedom is analyzed, and a solution to the problem proposed. The key features of the solution are: 1. the identification of trajectory primitives and a hierarchy of abstraction spaces that permit simple manipulator models, 2. the characterization of empty space by approximating it with easily describable entities called charts; the approximation is dynamic and can be selective, 3. a scheme for planning motions close to obstacles that is computationally viable, and that suggests how proximity sensors might be used to do the planning, and 4. the use of hierarchical decomposition to reduce the complexity of the planning problem. 1. INTRODUCTION the 2D and 3D solution noted, it is easy to visualize the solution for the 3D manipulator. Section 2 of this paper presents an example, and Section 3 a statement and analysis of the problem. Sections 4 and 5 present the solution. Section 6 summarizes the key ideas in the solution and indicates areas for future work. 2. AN EXAMPLE This section describes an example (Figure 2.1) of the collision detection and avoidance problem for a two-dimensional manipulator. The example highlights features of the problem and its solution. 2.1 The Problem The manipulator has two links and three degrees of freedom. The larger link, called the boom, slides back and forth and can rotate about the origin. The smaller link, called the forearm, has a rotational degree of freedom about the tip of the boom. The tip of the forearm is called the hand. S and G are the initial and final configurations of the manipulator. Any real manipulator's links will have physical dimensions. The line segment representation of the link is an abstraction; the physical dimensions can be accounted for and how this is done is described later.
The problem of planning safe trajectories for computer controlled manipulators with two movable links and multiple degrees of freedom is analyzed, and a solution to the problem is presented. The trajectory planning system is initialized with a description of the part of the environment that the manipulator is to maneuver in. When given the goal position and orientation of the hand, the system plans a complete trajectory that will safely maneuver the manipulator into the goal configuration. The executive system in charge of operating the hardware uses this trajectory to physically move the manipulator. …
Culture Wires the Brain: A Cognitive Neuroscience Perspective.
There is clear evidence that sustained experiences may affect both brain structure and function. Thus, it is quite reasonable to posit that sustained exposure to a set of cultural experiences and behavioral practices will affect neural structure and function. The burgeoning field of cultural psychology has often demonstrated the subtle differences in the way individuals process information, differences that appear to be a product of cultural experiences. We review evidence that the collectivistic and individualistic biases of East Asian and Western cultures, respectively, affect neural structure and function. We conclude that there is limited evidence that cultural experiences affect brain structure and considerably more evidence that neural function is affected by culture, particularly activations in the ventral visual cortex, areas associated with perceptual processing.
A novel approach for medical assistance using trained chatbot
There are many treatments available for various diseases, and no human can possibly know about all the medicines and the diseases. The problem is that there isn't any single place where anyone can find the details of the diseases or the medicines. What if there were a place where you could identify your health problem just by entering symptoms or scanning an ECG, or check whether a prescribed medicine is supposed to be used the way you were told? It would help us to deduce the problem and to verify the solution. The proposed idea is to create a system with artificial intelligence that can meet these requirements. The AI can predict diseases based on the symptoms and give a list of available treatments. The system can also give the composition of the medicines and their prescribed uses, helping users take the correct treatment. Hence, people can get an idea about their health and obtain the right protection.
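The symptom-to-disease matching idea above can be sketched as a toy retrieval function that ranks candidate conditions by symptom overlap. This is only an illustration of the matching concept; the paper's actual AI model, and any real medical knowledge base, are not reproduced here:

```python
def predict_diseases(symptoms, knowledge_base):
    """Rank candidate conditions by fraction of known symptoms matched.

    knowledge_base maps a condition name to its list of known symptoms.
    Both the scoring rule and the data format are hypothetical; a real
    system would use a trained model and a vetted medical database.
    """
    scores = {
        disease: len(set(symptoms) & set(known)) / len(known)
        for disease, known in knowledge_base.items()
    }
    # Return only conditions with at least one matching symptom, best first
    return sorted((d for d, s in scores.items() if s > 0),
                  key=lambda d: -scores[d])
```

A chatbot front end would collect the symptom list from the user's messages and present the ranked candidates along with their treatment information.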
First Multicenter Study of Modified Release Phosphatidylcholine “LT-02” in Ulcerative Colitis: A Randomized, Placebo-Controlled Trial in Mesalazine-Refractory Courses
OBJECTIVES:Phosphatidylcholine is a key component of the mucosal barrier. Treatment with modified release phosphatidylcholine aims to improve the impaired barrier function. The primary objective is to evaluate the efficacy of LT-02, a newly designed modified release phosphatidylcholine formula, in a multicenter setting.METHODS:This is a double-blinded, randomized, placebo-controlled, superiority study conducted in 24 ambulatory referral centers in Germany, Lithuania, and Romania. A total of 156 patients with an inadequate response to mesalazine, a disease activity score (Simple Clinical Colitis Activity Index (SCCAI)) of ≥5, and bloody diarrhea underwent treatment with 0, 0.8, 1.6, or 3.2 g LT-02. The primary end point was defined a priori as changes in SCCAI from baseline to the end of treatment. The primary statistical model was a general linear least-squares model. The study was funded by the sponsor Lipid Therapeutics, Heidelberg, Germany, and registered at http://clinicaltrials.gov/show/NCT01011322.RESULTS:Baseline characteristics and dropouts were well balanced between all groups. The primary analyses revealed an SCCAI drop of 33.3% in the placebo group (from 9.0 to 6.0 points) compared with 44.3% in the 0.8 g LT-02 (from 8.8 to 4.9, P>0.05) and 40.7% in the 1.6 g groups (from 8.6 to 5.1, P>0.05). The 3.2 g group improved 51.7% from 8.5 to 4.1 (P=0.030 in comparison with placebo). The remission rate was 15% (6/40) in the placebo group compared with 31.4% (11/35) in the highest LT-02 dose group (P=0.089). Mucosal healing was achieved in 32.5% of placebo patients compared with 47.4% of LT-02 patients (P=0.098); the rates for histologic remission were 20% compared with 40.5%, respectively (P=0.016). 
There were 17 (48.6%) treatment-emergent adverse events in the highest dose group (and 0 serious adverse events (SAEs)) compared with 22 (55%) in the placebo group (4 SAEs).CONCLUSIONS:The primary end point analysis showed a statistically significant improvement in disease activity during LT-02 treatment in comparison with placebo. The drug was found to be very safe.
Genetics and Conservation of Rare Plants
Active lower limb prosthetics: a systematic review of design issues and solutions
This paper presents a review on design issues and solutions found in active lower limb prostheses. This review is based on a systematic literature search with a methodical search strategy. The search was carried out across four major technical databases and the retrieved records were screened for their relevance. A total of 21 different active prostheses, including 8 above-knee, 9 below-knee and 4 combined knee-ankle prostheses were identified. While an active prosthesis may help to restore the functional performance of an amputee, the requirements regarding the actuation unit as well as for the control system are high and the development becomes a challenging task. Regarding mechanical design and the actuation unit high force/torque delivery, high efficiency, low size and low weight are conflicting goals. The actuation principle and variable impedance actuators are discussed. The control system is paramount for a "natural functioning" of the prosthesis. The control system has to enable locomotion and should react to the amputee's intent. For this, multi-level control approaches are reviewed.
Applying an extended model of deterrence across cultures: An investigation of information systems misuse in the U.S. and South Korea
Intentional employee misuse of IS is a global problem. Research suggests that security countermeasures can deter misuse by increasing the perceived certainty and severity of punishment for such behavior. However, the value of generalizing this work beyond Western cultures is open to question. In our study, we examined whether national culture influenced the deterrent capabilities of security policies, security education, training, and awareness programs and computer monitoring. Using U.S. and Korean samples, we found evidence that the deterrent effect of certain security countermeasures varied between the two countries, as did the influence of age and gender. The results have implications for information security management practices in global businesses.
A Multimode 1-MHz PFC Front End With Digital Peak Current Modulation
This work presents a novel mixed-signal control scheme for a boost power factor correction (PFC) rectifier. The digital controller modulates the inductor peak current to produce a low-distortion ac line current in discontinuous conduction mode (DCM) and continuous conduction mode (CCM), without the need for average current sensing. A lookup table (LUT) optimizes efficiency at low input currents, by allowing operation at 125-500-kHz DCM based on calculated thresholds. At high input currents, the converter operates at 1-MHz CCM for reduced inductor footprint. An analog off-time generator with a digital frequency locked loop facilitates CCM operation, eliminating the need for slope compensation in the current loop and reduces frequency variations. The LUT is programmed with an adaptive output voltage of 250/450 V for low/high mains line voltage (85-265 Vrms) to optimize efficiency over a broad range of conditions. The 150-W PFC prototype operates up to 1 MHz with a peak efficiency of 95% and a total harmonic distortion of 5%.
Interpreting Dimensions of Consumer Trust in E-Commerce
Consumer trust in an Internet vendor is an issue commanding ever more attention. Based on an extensive review of literature, this paper proposes dimensions of trust in an Internet vendor. These are competence, integrity and benevolence. Competence refers to a company’s ability to fulfill promises made with the consumers. Integrity suggests that a company acts in a consistent, reliable, and honest manner. Benevolence is the ability of a company to hold consumer interests ahead of its own self-interest and indicates sincere concern for the welfare of the customers. In a further analysis various sources where trust might reside are also identified. Drawing on the literature in marketing and general management, the sources of trust are classified as characteristics of the consumer, the firm, the website and the interaction between the consumer and the firm. Given the dimensions and sources of trust, a path model for developing consumer trust in E-commerce is suggested. This research makes a contribution to the development of a theoretical understanding of trust in E-commerce. Although the concepts presented in this paper can be used to carry out further empirical research, they can also be used by practitioners to identify particular trust characteristics for realizing the potential of a business-to-consumer E-commerce venture.
Deep or shallow, NLP is breaking out
Neural net advances improve computers' language ability in many fields.
Concentrated winding segmented rotor switched reluctance machine (SRM) using three-phase standard inverters
Two new topologies of three-phase segmented rotor switched reluctance machine (SRM) that enable the use of standard voltage source inverters (VSIs) for their operation are presented. The topologies have shorter end-turn and axial lengths compared with SRM topologies that use three-phase inverters; compared with the conventional SRM (CSRM), these new topologies have the advantage of shorter flux paths, which results in lower core losses. FEA-based optimization has been performed for a given design specification. The new concentrated winding segmented SRMs demonstrate competitive performance with three-phase standard inverters compared with the CSRM.
Effect of drip irrigation on growth and yield of onion (Allium cepa L.)
Drip irrigation is one of the essential, advanced and innovative irrigation methods compared with surface irrigation. In view of this, an experiment was conducted to study the efficiency of a drip irrigation system over surface irrigation in onion during Rabi 2013-14 and 2014-15 and Kharif 2014 and 2015. The results revealed that the drip irrigation system performed better than the surface irrigation system in terms of plant morphology, yield and bulb quality. Drip irrigation recorded the maximum plant height (66.37 cm and 61.88 cm), number of leaves (9.23 and 8.00) and neck thickness (1.62 cm and 1.30 cm) in the Rabi and Kharif seasons, respectively. The highest bulb equatorial and polar diameters, gross yield and marketable yield were also obtained under the drip irrigation system. Under drip irrigation, gross yield and marketable yield increased by 18.16% and 24.49%, respectively, over the surface irrigation method, with better water use efficiency and water savings of 29.36% and 27.12% during the Rabi and Kharif seasons, respectively.
Ambient intelligence: A survey
In this article we survey ambient intelligence (AmI), including its applications, some of the technologies it uses, and its social and ethical implications. The applications include AmI at home, care of the elderly, healthcare, commerce, and business, recommender systems, museums and tourist scenarios, and group decision making. Among technologies, we focus on ambient data management and artificial intelligence; for example planning, learning, event-condition-action rules, temporal reasoning, and agent-oriented technologies. The survey is not intended to be exhaustive, but to convey a broad range of applications, technologies, and technical, social, and ethical challenges.
The Use of Exothermic Reactions in the Synthesis and Densification of Ceramic Materials
This article will provide information about chemical processes which rely on the heat evolved during reaction to synthesize and, in some cases, to simultaneously densify single-phase or composite ceramic materials. Although the basic concept underlying these processes is simple, the high-temperature reactions are complex and require careful study with individual systems before their potential as fabrication processes can be fully realized. Many reactions between solids involving elements and/or compounds, or between solids and gases, are highly exothermic. Listed in Table I is a selected group of typical chemical reactions accompanied by their calculated adiabatic temperatures. As a general rule, any reaction with an adiabatic temperature of ~2000°C or over can be reacted under combustion conditions. For example, suppose that a cold-pressed cylindrical compact of titanium and boron powder (Eq. 1) is ignited at the top surface with a convenient source of heat such as a laser. From the igniting surface a combustion wave rapidly self-propagates down the compact, transforming the reactants into the TiB2 product. Figure 1 shows a Ti and C powder compact where the combustion wave has progressed about halfway down the compact.
Randomized study of early hospital discharge following autologous blood SCT: medical outcomes and hospital costs
We report the first randomized study comparing early hospital discharge with standard hospital-based follow-up after high-dose chemotherapy (HDCT) and PBSCT. Patients aged 18–65 years, with an indication of PBSCT for non-leukemic malignant diseases were randomly assigned between two arms. Arm A consisted of early hospital discharge (HDCT during hospitalization, discharge at day 0, home stay with a caregiver, outpatient clinic follow-up). In arm B patients were followed up as inpatients. In total 131 patients were analyzed (66 in arm A and 65 in arm B). Patient characteristics and hematological reconstitution were comparable between the two groups. In arm A, 26 patients were actually discharged early. Patients in group A spent fewer days in hospital (11 vs 12 days, P=0.006). This strategy resulted in a 6% mean cost reduction per patient when compared with the conventional hospital-based group. The early discharge approach within the French health system, while safe and feasible, is highly dependent on social criteria (caregiver availability and home to hospital distance). It is almost always associated with conventional hospital readmission during the aplasia phase, and limits cost savings when considering the whole population of patients benefiting from HDCT in routine clinical practice.
Deep Reinforcement Learning for Dialogue Generation
Current neural network models for dialogue generation have drawbacks; the proposed solution is deep reinforcement learning. (CISC850 Cyber Analytics)
Restaurant Revenue Management at Chevys: Determining the Best Table Mix
Revenue management has been used in a variety of industries and generally takes the form of managing demand by manipulating length of customer usage and price. Supply mix is rarely considered, although it can have considerable impact on revenue. In this research, we focused on developing an optimal supply mix, specifically on determining the supply mix that would maximize revenue. We used data from a Chevys restaurant, part of a large chain of Mexican restaurants, in conjunction with a simulation model to evaluate and enumerate all possible supply (table) mixes. Compared to the restaurant’s existing table mix, the optimal mix is capable of handling a 30% increase in customer volume without increasing waiting times beyond their original levels. While our study was in a restaurant context, the results of this research are applicable to other service businesses.
A Crowdsourcing Worker Quality Evaluation Algorithm on MapReduce for Big Data Applications
Crowdsourcing is a new emerging distributed computing and business model on the backdrop of Internet blossoming. With the development of crowdsourcing systems, the data size of crowdsourcers, contractors and tasks grows rapidly. The worker quality evaluation based on big data analysis technology has become a critical challenge. This paper first proposes a general worker quality evaluation algorithm that is applied to any critical tasks such as tagging, matching, filtering, categorization and many other emerging applications, without wasting resources. Second, we realize the evaluation algorithm in the Hadoop platform using the MapReduce parallel programming model. Finally, to effectively verify the accuracy and the effectiveness of the algorithm in a wide variety of big data scenarios, we conduct a series of experiments. The experimental results demonstrate that the proposed algorithm is accurate and effective. It has high computing performance and horizontal scalability. And it is suitable for large-scale worker quality evaluations in a big data environment.
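The evaluation idea above maps naturally onto the MapReduce model: a map phase groups answers by task, and a reduce phase aggregates per-task consensus into per-worker scores. The sketch below uses majority-vote agreement as the quality estimator; this is one simple choice for illustration, since the abstract does not specify the paper's exact scoring rule:

```python
from collections import Counter, defaultdict

def worker_quality(labels):
    """Score each crowd worker by agreement with the per-task majority vote.

    labels is a list of (worker, task, answer) triples. Majority voting is
    only an illustrative quality estimator; the paper's actual algorithm
    and its Hadoop/MapReduce realization are not reproduced here.
    """
    # "Map" phase: group submitted answers by task
    by_task = defaultdict(list)
    for worker, task, ans in labels:
        by_task[task].append((worker, ans))
    # "Reduce" phase: majority answer per task, then per-worker agreement rate
    agree, total = Counter(), Counter()
    for task, pairs in by_task.items():
        majority = Counter(a for _, a in pairs).most_common(1)[0][0]
        for worker, ans in pairs:
            total[worker] += 1
            agree[worker] += (ans == majority)
    return {w: agree[w] / total[w] for w in total}
```

On an actual Hadoop cluster, the grouping step would be expressed as the mapper's key emission and the aggregation as the reducer, letting the score computation scale horizontally over large label sets.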
All-arthroscopic repair of Palmer 1B triangular fibrocartilage complex tears using the FasT-Fix device.
PURPOSE The FasT-Fix device (Smith and Nephew Endoscopy, Andover, MA), initially developed for knee meniscal tears, is described for all-arthroscopic triangular fibrocartilage complex (TFCC) repairs. Potential benefits of this technique are ease of use, the lack of prominent suture knots, and strength of repair. This case series evaluates the early clinical outcomes of this technique. METHODS We conducted a retrospective review of patients with TFCC Palmer type 1B injuries treated by 1 hand surgeon from 2005 to 2009. The patients' charts were reviewed for postoperative complications, range of motion, grip strength (percentage of contralateral), and return to full activity. In addition, each patient completed Quick Disabilities of the Arm, Shoulder, and Hand (QuickDASH) and Patient-Rated Wrist Evaluation (PRWE) questionnaires. RESULTS Twelve patients had all-arthroscopic peripheral (1B) TFCC repairs using the FasT-Fix suture device. The mean follow-up period was 17.5 months (range, 11-27). Mean supination was 78° (± 14°), and mean grip strength was 64% (±16%) of the nonsurgical extremity by 3 months after surgery. All other range of motion was full. The mean QuickDASH score was 11 (±12), and the mean PRWE score was 19 (±14). Average time to full activity was 5 months. There were no surgical complications of the procedure. One patient complained of persistent ulnar-sided wrist pain 12 months after surgery and had an ulnar shortening osteotomy. Arthroscopy at the time of the osteotomy revealed that the TFCC was stable. CONCLUSIONS At mean 1-year follow-up, 11 out of 12 patients achieved excellent subjective outcomes based on QuickDASH and PRWE questionnaires. Although range of motion and grip strength were slightly decreased compared to prior case series reports, the short-term results indicate that the FasT-Fix all-arthroscopic, all-inside technique is a safe and effective technique for repair of Palmer type 1B TFCC tears.
Breastfeeding and dummy use have a protective effect on sudden infant death syndrome
UNLABELLED We conducted a literature review on the effect of breastfeeding and dummy (pacifier) use on sudden infant death syndrome (SIDS). From 4343 abstracts, we identified 35 relevant studies on breastfeeding and SIDS, 27 on dummy use and SIDS and 59 on dummy use versus breastfeeding. CONCLUSION We found ample evidence that both breastfeeding and dummy use reduce the risk of SIDS. There has been a general reluctance to endorse dummy use in case it has a detrimental effect on breastfeeding. However, recent evidence suggests that dummy use might not be as harmful to breastfeeding as previously believed.
Gender Stereotypes and Discrimination: How Sexism Impacts Development.
In this chapter, we summarize and integrate some of the latest developmental science research on gender stereotypes and discrimination in childhood and adolescence. We focus on five forms of sexism: (a) stereotypes and discrimination against boys regarding their school behaviors and disciplinary actions; (b) stereotypes and discrimination against girls in science, technology, engineering, and mathematics (STEM) domains; (c) stereotypes and discrimination in sports; (d) peer gendered harassment, including sexual harassment and teasing because of gender atypicality or nonconformity; and (e) sexualized gender stereotypes that sexually objectify girls and assume boys are sexually voracious. First, we document each type of sexism and examine children's awareness and perceptions of that bias, including their own self-reports and attributions. We examine the implications of this sexism for children's and adolescents' developmental health (i.e., social, academic, and psychological well-being). We then draw connections between these various areas of research, focusing on how these different forms of sexism interact to reduce equity and justice among children and negatively impact positive developmental outcomes. The chapter concludes with suggestions for future research.
Proton and metal ion binding to natural organic polyelectrolytes-II. Preliminary investigation with a peat and a humic acid
Abstract We summarize here experimental studies of proton and metal ion binding to a peat and a humic acid. Data analysis is based on a unified physico-chemical model for the reaction of simple ions with polyelectrolytes employing a modified Henderson-Hasselbalch equation. Peat exhibited an apparent intrinsic acid dissociation constant of 10^-4.05, and apparent intrinsic metal ion binding constants of 400 for cadmium ion, 600 for zinc ion, 4000 for copper ion, and 20,000 for lead ion. A humic acid was found to have an apparent intrinsic proton binding constant of 10^-2.6. Copper ion binding to this humic acid sample occurred at two types of sites. The first site exhibited reaction characteristics which were independent of solution pH and required the interaction of two ligands on the humic acid matrix to simultaneously complex with each copper ion. The second complex species is assumed to be a simple monodentate copper ion-carboxylate species with a stability constant of 18.
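The modified Henderson-Hasselbalch relation referred to above can be written in its standard polyelectrolyte form, with α the degree of dissociation and n an empirical broadening parameter (n = 1 recovers the simple-acid case); this is a sketch of the textbook expression, and the paper's exact modification may differ in detail:

```latex
% Modified Henderson-Hasselbalch equation for a polyelectrolyte.
% pK_app is the apparent intrinsic dissociation constant fitted to the
% titration data; alpha is the degree of dissociation.
\[
  \mathrm{pH} \;=\; \mathrm{p}K_{\mathrm{app}} \;+\; n \,\log_{10}\!\frac{\alpha}{1-\alpha}
\]
```

Fitting pK_app and n to a titration curve in this way is what yields apparent intrinsic constants such as the 10^-4.05 value reported for peat.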
Developing a crack inspection robot for bridge maintenance
One of the important tasks in bridge maintenance is bridge deck crack inspection. Traditionally, a human inspector detects cracks visually and locates them manually, so the accuracy of the inspection result is low due to the subjective nature of human judgement. We propose a system that uses a mobile robot to conduct the inspection, in which the robot collects bridge deck images with a high-resolution camera. In this method, the Laplacian of Gaussian algorithm is used to detect cracks, and the global crack map is obtained through camera calibration and robot localization. To ensure that the robot collects images covering the entire bridge deck, we develop a complete coverage path planning algorithm for the mobile robot and compare it with other path planning strategies. Finally, we validate the proposed system through experiments and simulation.
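The Laplacian of Gaussian (LoG) crack-detection step can be sketched as below; the kernel width `sigma`, the threshold multiplier `k`, and the synthetic deck image are illustrative assumptions, not parameters taken from the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def detect_cracks(image, sigma=2.0, k=3.0):
    """Flag probable crack pixels with a Laplacian-of-Gaussian filter.

    Dark, thin cracks on a brighter deck surface produce strong positive
    LoG responses; pixels more than k standard deviations above the mean
    response are flagged.
    """
    response = gaussian_laplace(image.astype(float), sigma=sigma)
    return response > response.mean() + k * response.std()

# Synthetic deck image: bright background with one dark vertical "crack".
img = np.full((64, 64), 200.0)
img[:, 31] = 20.0
mask = detect_cracks(img)  # True along the crack column
```

The paper then maps flagged pixels into a global crack map via camera calibration and robot localization, which this sketch omits.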
A Simple Planar Polarization Reconfigurable Monopole Antenna for GNSS/PCS
The design of a simple planar polarization reconfigurable monopole antenna for the Global Navigation Satellite System (GNSS) and Personal Communications System (PCS) is presented. The antenna consists of two meandered monopoles, a feeding network using the Wilkinson power divider, two switchable 90°-phase shifters implemented using λ/4-microstrip lines, and a defected ground structure (DGS). The meandered monopoles, resonating at about 1.55 GHz, are placed perpendicular to each other. The input signal is divided into two signals with equal amplitude and phase by the power divider and fed to the meandered monopoles via the phase shifters. The two signals arriving at the two monopoles have a phase difference of 90°, -90° or 0°, depending on the phase shifters controlled using six PIN-diode switches, hence generating a right/left-handed circularly polarized (CP) or linearly polarized (LP) signal. We propose a novel biasing technique to control the six PIN diodes using five voltages. Measurement results show that the antenna in CP has an impedance bandwidth of 1.06-1.64 GHz and an axial-ratio bandwidth of 1.43-1.84 GHz, and in LP has an impedance bandwidth of 1.63-1.88 GHz. Simulated and measured results on S11, axial ratio, radiation patterns, and gains show good agreement.
Comparing Machine Learning Approaches for Table Recognition in Historical Register Books
We present in this paper experiments on table recognition in handwritten register books. We first explain how the problem of row and column detection is modelled, and then compare two machine learning approaches (Conditional Random Field and Graph Convolutional Network) for detecting these table elements. Evaluation was conducted on death records provided by the Archives of the Diocese of Passau. With an F1 score of 89%, both methods provide a quality which allows for information extraction. Software and dataset are released as open source and open data.
Security Requirements Analysis of ADS-B Networks
Due to their many advantages over their hardware-based counterparts, Software Defined Radios are becoming the new paradigm for radio and radar applications. In particular, Automatic Dependent Surveillance-Broadcast (ADS-B) is an emerging software defined radar technology, which has already been deployed in Europe and Australia. Deployment in the US is underway as part of the Next Generation Air Transportation System (NextGen). In spite of its several benefits, this technology has been widely criticized for being designed without security in mind, making it vulnerable to numerous attacks. Most approaches addressing this issue fail to adopt a holistic viewpoint, focusing only on part of the problem. In this paper, we propose a methodology that uses semantic technologies to address security requirements definition from a systemic perspective. More specifically, knowledge engineering focused on misuse scenarios is applied for building customized resilient software defined radar applications, as well as classifying cyber attack severity according to measurable security metrics. We showcase our ideas using an ADS-B-related scenario developed to evaluate our approach.
A Pipeline for the Segmentation and Classification of 3D Point Clouds
This paper presents algorithms for fast segmentation of 3D point clouds and subsequent classification of the obtained 3D segments. The method jointly determines the ground surface and segments individual objects in 3D, including overhanging structures. When compared to six other terrain modelling techniques, this approach has minimal error between the sensed data and the representation, and is fast (processing a Velodyne scan in approximately 2 seconds). Applications include improved alignment of successive scans by enabling operations in sections (Velodyne scans are aligned 7% more sharply compared to an approach using raw points) and more informed decision-making (paths move around overhangs). The use of segmentation to aid classification through 3D features, such as the Spin Image or the Spherical Harmonic Descriptor, is discussed and experimentally compared. Moreover, the segmentation facilitates a novel approach to 3D classification that bypasses feature extraction and directly compares 3D shapes via the ICP algorithm. This technique is shown to achieve accuracy on par with the best feature-based classifier (92.1%) while being significantly faster and allowing a clearer understanding of the classifier's behaviour.
Microstructure of porous anodic oxide films on aluminium
The microstructure of porous anodized films of aluminium prepared in sulphuric acid solution differs from that of films prepared in oxalic or phosphoric acid solutions. Transmission electron microscopy reveals a multilayer or higher-order structure in the former films. Infrared spectra and specific surface area were also studied for these films, and new functional properties of the films suitable for new materials were found. In contrast to the fibrous colloidal structure in the cells and barrier layer of conventional films anodized in a sulphuric acid solution at d.c. 15 V, a network structure is formed in the cells and barrier layer of the hard films prepared at the higher voltage of d.c. 25 V. The microstructure changes according to the anodizing conditions. A new model for these sulphuric acid films is presented: the cell walls are constructed from five layers, and fracture of the films occurs at the centre of the cell walls. A central barrier layer (4 to 6 nm in thickness) composed of aluminium oxide of high crystallinity was found in the barrier layer at the bottom of the pore, and its thickness is independent of the applied anodizing voltage. The increase in thickness of the barrier layer with applied voltage is governed by that of the outer barrier layer.
Making Sense of Entities and Quantities in Web Tables
HTML tables and spreadsheets on the Internet or in enterprise intranets often contain valuable information, but are created ad-hoc. As a result, they usually lack systematic names for column headers and clear vocabulary for cell values. This limits the re-use of such tables and creates a huge heterogeneity problem when comparing or aggregating multiple tables. This paper aims to overcome this problem by automatically canonicalizing header names and cell values onto concepts, classes, entities and uniquely represented quantities registered in a knowledge base. To this end, we devise a probabilistic graphical model that captures coherence dependencies between cells in tables and candidate items in the space of concepts, entities and quantities. We give specific consideration to quantities which are mapped into a "measure, value, unit" triple over a taxonomy of physical (e.g. power consumption), monetary (e.g. revenue), temporal (e.g. date) and dimensionless (e.g. counts) measures. Our experiments with Web tables from diverse domains demonstrate the viability of our method and its benefits over baselines.
GPUs and the Future of Parallel Computing
This article discusses the capabilities of state-of-the art GPU-based high-throughput computing systems and considers the challenges to scaling single-chip parallel-computing systems, highlighting high-impact areas that the computing research community can address. Nvidia Research is investigating an architecture for a heterogeneous high-performance computing system that seeks to address these challenges.
Abnormal processing of affective words by psychopaths.
We tested the hypothesis that psychopathy is associated with abnormal processing of affective verbal material. Criminal psychopaths and nonpsychopaths, defined by the Psychopathy Checklist, performed a lexical decision task ("Is it a word or not?") while we recorded reaction time and event-related potentials in response to letter-strings consisting of affective and neutral words and pronounceable nonwords. On the assumption that they do not make efficient use of affective information, our primary prediction was that psychopaths would show less behavioral and event-related potential differentiation between affective and neutral words than would nonpsychopaths. The results were in accordance with this prediction. The lexical decisions of nonpsychopaths were significantly faster, and relevant event-related potential components were significantly larger, to affective words than to neutral words. In sharp contrast, psychopaths failed to show reaction time facilitation or larger amplitude event-related potentials to affective words. We suggest that psychopaths extract less information from affective words than do other individuals. Possible implications of these and related findings for understanding the behavior of psychopaths are discussed.
Multi-document Summarization via Budgeted Maximization of Submodular Functions
We treat the text summarization problem as maximizing a submodular function under a budget constraint. We show, both theoretically and empirically, that a modified greedy algorithm can efficiently solve the budgeted submodular maximization problem near-optimally, and we derive new approximation bounds in doing so. Experiments on the DUC'04 task show that our approach is superior to the best-performing method from the DUC'04 evaluation on ROUGE-1 scores.
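The modified greedy procedure can be sketched as a cost-scaled greedy loop. The toy word-coverage objective, the sentence set, and the scaling exponent `r` are illustrative; the paper's full algorithm additionally compares the greedy solution against the best single affordable element, which is omitted here:

```python
def budgeted_greedy(items, cost, gain, budget, r=1.0):
    """Greedily add the affordable item with the best marginal gain
    per (cost ** r) until no affordable item improves the objective."""
    selected, spent = [], 0.0
    remaining = set(items)
    while remaining:
        best, best_score = None, 0.0
        for i in sorted(remaining):
            if spent + cost[i] > budget:
                continue
            score = gain(selected, i) / (cost[i] ** r)
            if score > best_score:
                best, best_score = i, score
        if best is None:
            break
        selected.append(best)
        spent += cost[best]
        remaining.remove(best)
    return selected

# Toy extractive summarization: f(S) = number of distinct words covered,
# cost = sentence length in words, budget = 6 words.
sents = {0: "the cat sat", 1: "the cat", 2: "dogs run fast"}
cost = {i: len(s.split()) for i, s in sents.items()}

def coverage_gain(S, i):
    covered = {w for j in S for w in sents[j].split()}
    return len(set(sents[i].split()) - covered)

summary = budgeted_greedy(sents, cost, coverage_gain, budget=6)  # [0, 2]
```

Because the coverage objective is monotone submodular, marginal gains shrink as sentences are added, which is what the greedy ratio test exploits: sentence 1 is skipped because all of its words are already covered.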
Joint Event Extraction via Structured Prediction with Global Features
Traditional approaches to the task of ACE event extraction usually rely on sequential pipelines with multiple stages, which suffer from error propagation since event triggers and arguments are predicted in isolation by independent local classifiers. By contrast, we propose a joint framework based on structured prediction which extracts triggers and arguments together so that the local predictions can be mutually improved. In addition, we propose to incorporate global features which explicitly capture the dependencies of multiple triggers and arguments. Experimental results show that our joint approach with local features outperforms the pipelined baseline, and adding global features further improves the performance significantly. Our approach advances state-of-the-art sentence-level event extraction, and even outperforms previous argument labeling methods which use external knowledge from other sentences and documents.
Long-term interactive group education for type 1 diabetic patients
The aim of this study was to assess the feasibility and efficacy of an Interactive Educational and Support Group programme (IESG) for patients with type 1 diabetes. A sample of 96 type 1 diabetic outpatients was studied, measuring the effects of participation in the IESG on metabolic control and diabetes-related quality of life (QoL). Those refusing to participate (n=48) and a sample of 37 patients who were not invited to the IESG (controls) were studied for comparison. After one year, participants showed a significant (p<0.05) improvement in HbA1c from 7.7±1.6 to 7.2±1.5%, whereas no variation in HbA1c was observed in non-participants and controls. No significant variation in QoL was observed in any of the three groups. At the two-year follow-up, HbA1c of the patients attending the IESG was not significantly different from that at the one-year follow-up, and it was significantly lower than that observed at enrolment. QoL showed a significant improvement at 2 years with respect to baseline and the one-year follow-up. In conclusion, this programme appears to be effective in improving medium-term metabolic control and QoL.
Implementing a Rule-Based Contract Compliance Checker
The paper describes the design and implementation of an independent, third-party contract monitoring service called the Contract Compliance Checker (CCC). The CCC is provided with the specification of the contract in force, and is capable of observing and logging the relevant business-to-business (B2B) interaction events in order to determine whether the actions of the business partners are consistent with the contract. A contract specification language called EROP (for Events, Rights, Obligations and Prohibitions) has been developed for the CCC based on business rules; it provides constructs to specify which rights, obligations and prohibitions become active and inactive after the occurrence of events related to the execution of business operations. The system has been designed to work with B2B industry standards such as ebXML and RosettaNet.
Stress management versus lifestyle modification on systolic hypertension and medication elimination: a randomized trial.
Isolated systolic hypertension is common in the elderly, but decreasing systolic blood pressure (SBP) without lowering diastolic blood pressure (DBP) remains a therapeutic challenge. Although stress management training, in particular eliciting the relaxation response, reduces essential hypertension, its efficacy in treating isolated systolic hypertension has not been evaluated. We conducted a double-blind, randomized trial comparing 8 weeks of stress management, specifically relaxation response training (61 patients), versus lifestyle modification (control, 61 patients). Inclusion criteria were age ≥55 years, SBP 140-159 mm Hg, DBP <90 mm Hg, and at least two antihypertensive medications. The primary outcome measure was change in SBP after 8 weeks. Patients who achieved SBP <140 mm Hg and a ≥5 mm Hg reduction in SBP were eligible for 8 additional weeks of training with supervised medication elimination. SBP decreased 9.4 (standard deviation [SD] 11.4) and 8.8 (SD 13.0) mm Hg in the relaxation response and control groups, respectively (both p<0.0001), without a group difference (p=0.75). DBP decreased 1.5 (SD 6.2) and 2.4 (SD 6.9) mm Hg (p=0.05 and 0.01, respectively), without a group difference (p=0.48). Forty-four patients in the relaxation response group and 36 in the control group were eligible for supervised antihypertensive medication elimination. After controlling for differences in characteristics at the start of medication elimination, patients in the relaxation response group were more likely to successfully eliminate an antihypertensive medication (odds ratio 4.3, 95% confidence interval 1.2-15.9, p=0.03). Although both groups had similar reductions in SBP, significantly more participants in the relaxation response group eliminated an antihypertensive medication while maintaining adequate blood pressure control.
Ectopic pregnancy
Ectopic pregnancy is implantation occurring elsewhere than in the cavity of the uterus; ninety-nine percent of extrauterine pregnancies occur in the fallopian tube. The incidence of extrauterine pregnancy has increased from 0.5% thirty years ago to a present-day 1-2%. The most frequent cause of tubal pregnancy is previous salpingitis. Mortality rates for tubal pregnancies were approximately 1.7% in the 1970s but dropped to 0.3% in the 1980s. Diagnosis: Using transvaginal ultrasound it is possible to obtain positive evidence of an ectopic pregnancy at a very early stage. In cases of hCG titers >2000 IU/l, intrauterine pregnancy can be diagnosed with certainty. The most important differential diagnosis of ectopic pregnancy is early intrauterine pregnancy. Clinical management and therapy: Regardless of the therapeutic strategy selected by the physician, informing the patient is a major aspect of the management of ectopic pregnancy. If surgery is considered appropriate, the patient must be informed about the nature, side effects and complications of the procedure. However, it should be remembered that in some cases the actual chances of cure first become apparent at surgery. In asymptomatic patients with a serum hCG titer <1000 IU/l that is falling, it is appropriate to wait and watch. In clinically stable patients with an unruptured tubal pregnancy and steady hCG levels, systemic treatment with methotrexate might also be considered. In unruptured tubal pregnancy with an hCG titer between 1000 and 2500 IU/l, a further therapeutic alternative is intratubal injection of prostaglandins, hyperosmolar glucose or NaCl. Generally speaking, the currently widespread laparoscopic surgical treatment of the fallopian tube hardly influences the risk of recurrence. If the gestational mass is larger, the serum hCG titer is higher than the approximate limit of 2500 mU/ml and/or the tube has already ruptured, surgery is usually required.
Prevention: The most effective prevention is to avoid tubal inflammation or, in cases of preexisting inflammation, to administer effective therapy.
TAGME: on-the-fly annotation of short text fragments (by wikipedia entities)
We designed and implemented TAGME, a system that is able to efficiently and judiciously augment a plain text with pertinent hyperlinks to Wikipedia pages. The specialty of TAGME with respect to known systems [5,8] is that it may annotate texts which are short and poorly composed, such as snippets of search-engine results, tweets, news, etc. This annotation is extremely informative, so any task that is currently addressed using the bag-of-words paradigm could benefit from using it to draw upon (the millions of) Wikipedia pages and their inter-relations.
Holoprosencephaly, bilateral cleft lip and palate and ectrodactyly: another case and follow up.
We describe a male patient with lobar holoprosencephaly, ectrodactyly, and cleft lip/palate, a syndrome which has been seen previously in only six patients. In addition, our patient developed hypernatraemia, which has been described in three patients before.
Protocol for the examination of specimens from patients with merkel cell carcinoma of the skin.
Authors: Priya Rao, MD, FCAP* (Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, California); Bonnie L. Balzer, MD, PhD, FCAP (Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, California); Bianca D. Lemos, MD (Division of Dermatology, University of Washington Medical Center, Seattle, Washington); Nanette J. Liegeois, MD (Department of Dermatology, Johns Hopkins University School of Medicine, Baltimore, Maryland); Jennifer M. McNiff, MD, FASCP (Departments of Dermatology and Pathology, Yale University School of Medicine, New Haven, Connecticut); Paul Nghiem, MD, PhD (Division of Dermatology, University of Washington Medical Center, Seattle, Washington); Victor G. Prieto, MD, PhD, FCAP (Departments of Pathology and Dermatology, MD Anderson Cancer Center, University of Texas, Houston, Texas); M. Timothy Smith, MD (Department of Pathology and Laboratory Medicine, Medical University of South Carolina, Charleston, South Carolina); Bruce Robert Smoller, MD, FCAP (Department of Pathology, University of Arkansas for Medical Sciences, Little Rock, Arkansas); Mark R. Wick, MD, FCAP (Department of Pathology, University of Virginia Health System, Charlottesville, Virginia); David P. Frishberg, MD, FCAP† (Department of Pathology and Laboratory Medicine, Cedars-Sinai Medical Center, Los Angeles, California)
Genetic Ablation of Orexin Neurons in Mice Results in Narcolepsy, Hypophagia, and Obesity
Orexins (hypocretins) are a pair of neuropeptides implicated in energy homeostasis and arousal. Recent reports suggest that loss of orexin-containing neurons occurs in human patients with narcolepsy. We generated transgenic mice in which orexin-containing neurons are ablated by orexinergic-specific expression of a truncated Machado-Joseph disease gene product (ataxin-3) with an expanded polyglutamine stretch. These mice showed a phenotype strikingly similar to human narcolepsy, including behavioral arrests, premature entry into rapid eye movement (REM) sleep, poorly consolidated sleep patterns, and a late-onset obesity, despite eating less than nontransgenic littermates. These results provide evidence that orexin-containing neurons play important roles in regulating vigilance states and energy homeostasis. Orexin/ataxin-3 mice provide a valuable model for studying the pathophysiology and treatment of narcolepsy.
3D bioprinting of tissues and organs
Additive manufacturing, otherwise known as three-dimensional (3D) printing, is driving major innovations in many areas, such as engineering, manufacturing, art, education and medicine. Recent advances have enabled 3D printing of biocompatible materials, cells and supporting components into complex 3D functional living tissues. 3D bioprinting is being applied to regenerative medicine to address the need for tissues and organs suitable for transplantation. Compared with non-biological printing, 3D bioprinting involves additional complexities, such as the choice of materials, cell types, growth and differentiation factors, and technical challenges related to the sensitivities of living cells and the construction of tissues. Addressing these complexities requires the integration of technologies from the fields of engineering, biomaterials science, cell biology, physics and medicine. 3D bioprinting has already been used for the generation and transplantation of several tissues, including multilayered skin, bone, vascular grafts, tracheal splints, heart tissue and cartilaginous structures. Other applications include developing high-throughput 3D-bioprinted tissue models for research, drug discovery and toxicology.
Getting it right the second time.
Once a business performs a complex activity well, the parent organization often wants to replicate that success. But doing that is surprisingly difficult, and businesses nearly always fail when they try to reproduce a best practice. The reason? People approaching best-practice replication are overly optimistic and overconfident. They try to perfect an operation that's running nearly flawlessly, or they try to piece together different practices to create the perfect hybrid. Getting it right the second time (and all the times after that) involves adjusting for overconfidence in your own abilities and imposing strict discipline on the process and the organization. The authors studied numerous business settings to find out how organizational routines were successfully reproduced, and they identified five steps for successful replication. First, make sure you've got something that can be copied and that's worth copying. Some processes don't lend themselves to duplication; others can be copied but maybe shouldn't be. Second, work from a single template. It provides proof of success, performance measurements, a tactical approach, and a reference for when problems arise. Third, copy the example exactly, and fourth, make changes only after you achieve acceptable results. The people who developed the template have probably already encountered many of the problems you want to "fix," so it's best to create a working system before you introduce changes. Fifth, don't throw away the template. If your copy doesn't work, you can use the template to identify and solve problems. Best-practice replication, while less glamorous than pure innovation, contributes enormously to the bottom line of most companies. The article's examples--Banc One, Rank Xerox, Intel, Starbucks, and Re/Max Israel--prove that exact copying is a non-trivial, challenging accomplishment.
Balancing fairness and efficiency in tiered storage systems with bottleneck-aware allocation
Multi-tiered storage systems made up of heterogeneous devices raise new challenges in allocating throughput fairly among concurrent clients. The fundamental problem is finding an appropriate balance between fairness to the clients and maximizing system utilization. In this paper we cast the problem within the broader framework of fair allocation for multiple resources. We present a new allocation model, BAA, based on the notion of per-device bottleneck sets. Clients bottlenecked on the same device receive throughputs in proportion to their fair shares, while allocation ratios between clients in different bottleneck sets are chosen to maximize system utilization. We show formally that BAA satisfies the fairness properties of Envy Freedom and Sharing Incentive. We evaluated the performance of our method using both simulation and an implementation on a Linux platform. The experimental results show that our method can provide both high efficiency and fairness.
The Elaboration Likelihood Model of Persuasion
Contents outline (fragment): A. Argument/Message Quality; B. Peripheral Cues; C. Affecting Elaboration; V. Postulate 4: Objective Elaboration; A. Distraction; B. Repetition; C. Personal Relevance/Involvement; D. Personal Responsibility; E. Need for Cognition.
Comparison of noninvasive diagnostic tests for Helicobacter pylori infection.
OBJECTIVES Since the 13C-urea breath test (UBT) has become a highly reliable method for the noninvasive diagnosis of Helicobacter pylori infection, this study was performed to compare the sensitivity, specificity and accuracy of noninvasive tests, including capsule UBT, conventional UBT and serology, in the diagnosis of H. pylori infection. PATIENTS AND METHODS One hundred patients received capsule UBT and conventional UBT and gave blood samples for the diagnosis of H. pylori infection. Upper gastrointestinal endoscopy was performed in all patients. H. pylori infection was defined as the presence of a positive culture or positive results of both histology and the rapid urease test (CLO test). McNemar's test was used to determine the significance of differences among capsule UBT, conventional UBT and serology. Differences were considered significant at p < 0.05. RESULTS According to the predefined criteria, the sensitivity, specificity, positive predictive value and negative predictive value were 100, 95.7, 96.4 and 100% for capsule UBT; 100, 85.1, 88.3 and 100% for conventional UBT; and 90.6, 85.1, 82.7 and 88.9% for serology. The accuracy of capsule UBT was higher than that of conventional UBT and serology (98 vs. 93 and 88%, respectively). Capsule UBT had a similar ability to detect H. pylori infection compared with conventional UBT and serology (McNemar's test, p > 0.05). CONCLUSIONS In our study, capsule UBT was highly accurate compared with the other noninvasive tests, including conventional UBT and serology. It could become a good alternative to endoscopy for the diagnosis of H. pylori infection.
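The reported figures follow from the standard 2x2 contingency-table definitions, sketched below. The counts used here are a hypothetical reconstruction chosen to be consistent with the capsule UBT percentages quoted above, not the study's actual data table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values and accuracy (in %)
    from the four cells of a 2x2 diagnostic contingency table."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": 100 * tp / (tp + fn),  # true positives among infected
        "specificity": 100 * tn / (tn + fp),  # true negatives among uninfected
        "ppv": 100 * tp / (tp + fp),          # positive predictive value
        "npv": 100 * tn / (tn + fn),          # negative predictive value
        "accuracy": 100 * (tp + tn) / total,
    }

# Hypothetical counts for a 100-patient cohort (53 infected, 47 not),
# chosen to match the reported capsule UBT figures.
m = diagnostic_metrics(tp=53, fp=2, fn=0, tn=45)
```

With these counts the function returns sensitivity 100%, specificity 95.7%, PPV 96.4%, NPV 100% and accuracy 98%, matching the capsule UBT row of the results.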
Cleverhans V0.1: an Adversarial Machine Learning Library
cleverhans is a software library that provides standardized reference implementations of adversarial example construction techniques and adversarial training. The library may be used to develop more robust machine learning models and to provide standardized benchmarks of models’ performance in the adversarial setting. Benchmarks constructed without a standardized implementation of adversarial example construction are not comparable to each other, because a good result may indicate a robust model or it may merely indicate a weak implementation of the adversarial example construction procedure. This technical report is structured as follows. Section 1 provides an overview of adversarial examples in machine learning and of the cleverhans software. Section 2 presents the core functionalities of the library: namely the attacks based on adversarial examples and defenses to improve the robustness of machine learning models to these attacks. Section 3 describes how to report benchmark results using the library. Section 4 describes the versioning system.
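As an illustration of the kind of attack the library standardizes, here is a framework-agnostic NumPy sketch of the Fast Gradient Sign Method (FGSM), the canonical adversarial example construction; this is not cleverhans's own API, and the toy logistic model is an invented example:

```python
import numpy as np

def fgsm(x, grad_loss_wrt_x, eps=0.1, clip=(0.0, 1.0)):
    """Fast Gradient Sign Method: perturb every input feature by eps in
    the direction that increases the loss, then clip to the valid range."""
    return np.clip(x + eps * np.sign(grad_loss_wrt_x), *clip)

# Toy logistic model p(y=1|x) = sigmoid(w.x); for the true label y=1 the
# gradient of the cross-entropy loss w.r.t. x is (p - 1) * w.
w = np.array([2.0, -3.0])
x = np.array([0.5, 0.5])
p = 1.0 / (1.0 + np.exp(-(w @ x)))
grad = (p - 1.0) * w
x_adv = fgsm(x, grad, eps=0.1)  # [0.4, 0.6]
```

Because the perturbation follows the sign of the loss gradient, the adversarial input lowers the model's confidence in the correct label, which is exactly the effect a standardized benchmark must measure consistently across implementations.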
Glaserian and Straussian grounded theory: similar or completely different?
Grounded Theory is growing in popularity as a research method in ICT research areas such as Information Systems and Software Engineering. Although there are two distinct methods, namely the Glaserian and the Straussian versions, a substantial number of research articles tend to ignore the difference and just claim that they are using grounded theory. To a researcher new to the grounded theory method, the two methods look very similar. Because the Straussian method is more prescriptive, most opt to follow this method, without investigating the Glaserian version. The few who try to use a hybrid of the two methods (not appreciating that the two methods are substantially different), only realize after a significant investment in time that the methods are not reconcilable and that either the one or the other should be followed. To contribute towards eliminating the confusion, this paper investigates the differences between the two methods. This will, hopefully, enable ICT researchers to make a more informed decision on which method to follow.
Understanding other minds: linking developmental psychology and functional neuroimaging.
Evidence from developmental psychology suggests that understanding other minds constitutes a special domain of cognition with at least two components: an early-developing system for reasoning about goals, perceptions, and emotions, and a later-developing system for representing the contents of beliefs. Neuroimaging reinforces and elaborates upon this view by providing evidence that (a) domain-specific brain regions exist for representing belief contents, (b) these regions are apparently distinct from other regions engaged in reasoning about goals and actions (suggesting that the two developmental stages reflect the emergence of two distinct systems, rather than the elaboration of a single system), and (c) these regions are distinct from brain regions engaged in inhibitory control and in syntactic processing. The clear neural distinction between these processes is evidence that belief attribution is not dependent on either inhibitory control or syntax, but is subserved by a specialized neural system for theory of mind.
Self-driving cars and lidar
Before graduating from X as Waymo, Google's self-driving car project had been using custom lidars for several years. In their latest revision, the lidars are designed to meet the challenging requirements we discovered in autonomously driving 2 million highly-telemetered miles on public roads. Our goal is to approach price points required for advanced driver assistance systems (ADAS) while meeting the performance needed for safe self-driving. This talk will review some history of the project and describe a few use-cases for lidars on Waymo cars. Out of that will emerge key differences between lidars for self-driving and traditional applications (e.g. mapping) which may provide opportunities for semiconductor lasers.
Ketogenic Diet for Obesity: Friend or Foe?
Obesity is reaching epidemic proportions and is a strong risk factor for a number of cardiovascular and metabolic disorders such as hypertension, type 2 diabetes, dyslipidemia, atherosclerosis, and also certain types of cancers. Despite the constant recommendations of health care organizations regarding the importance of weight control, this goal often fails. Genetic predisposition in combination with inactive lifestyles and high caloric intake leads to excessive weight gain. Even though there may be agreement about the concept that lifestyle changes affecting dietary habits and physical activity are essential to promote weight loss and weight control, the ideal amount and type of exercise and also the ideal diet are still under debate. For many years, nutritional intervention studies have been focused on reducing dietary fat with little positive results over the long-term. One of the most studied strategies in recent years for weight loss is the ketogenic diet. Many studies have shown that this kind of nutritional approach has a solid physiological and biochemical basis and is able to induce effective weight loss along with improvement in several cardiovascular risk parameters. This review discusses the physiological basis of ketogenic diets and the rationale for their use in obesity, discussing the strengths and the weaknesses of these diets together with cautions that should be used in obese patients.
A Single-Stage LED Driver Based on Interleaved Buck-boost Circuit and LLC Resonant Converter
A single-stage LED driver based on an interleaved buck-boost circuit and an LLC resonant converter is proposed. The buck-boost circuit and LLC resonant converter are integrated by sharing switches, which decreases the system cost and improves system efficiency. The input voltage of the buck-boost circuit is half of the rectified voltage, and two buck-boost circuits are formed with the two half-bridge switches and corresponding diodes. The two buck-boost circuits work in interleaved mode and the inductor current is in discontinuous conduction mode, both of which help to achieve power factor correction. The half-bridge LLC resonant converter is adopted here, and the soft-switching characteristic of the LLC resonant converter is not changed by the switch integration. The primary-side switches still work in zero voltage switching (ZVS) mode, and the secondary diodes still work in zero current switching (ZCS) mode, which both reduce the switching losses and improve the efficiency of the system.
Performance Prediction for Apache Spark Platform
Apache Spark is an open source distributed data processing platform that uses distributed memory abstraction to process large volumes of data efficiently. However, the performance of a particular job on the Apache Spark platform can vary significantly depending on the input data type and size, the design and implementation of the algorithm, and computing capability, making it extremely difficult to predict job performance metrics such as execution time, memory footprint, and I/O cost. To address this challenge, in this paper, we present a simulation-driven prediction model that can predict job performance with high accuracy for the Apache Spark platform. Specifically, as Apache Spark jobs often consist of multiple sequential stages, the presented prediction model simulates the execution of the actual job using only a fraction of the input data, and collects execution traces (e.g., I/O overhead, memory consumption, execution time) to predict job performance for each execution stage individually. We evaluated our prediction framework using four real-life applications on a 13-node cluster, and experimental results show that the model can achieve high prediction accuracy.
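The core idea of tracing a run on a data fraction and extrapolating per stage can be sketched as follows. The linear per-stage scaling model and the stage names and timings below are illustrative assumptions, not the authors' actual prediction model:

```python
def predict_job_time(sample_stage_times, sample_fraction):
    """Predict full-job execution time from a trace collected on a
    fraction of the input data, assuming each sequential stage's
    cost scales linearly with input size (an illustrative model;
    real stages such as shuffles can scale super-linearly)."""
    scale = 1.0 / sample_fraction
    per_stage = {stage: t * scale for stage, t in sample_stage_times.items()}
    return per_stage, sum(per_stage.values())

# Hypothetical trace from running the job on 10% of the input
# (seconds spent in each sequential stage).
trace = {"stage0_map": 1.2, "stage1_shuffle": 0.8, "stage2_reduce": 0.5}
per_stage, total = predict_job_time(trace, sample_fraction=0.10)
# Each stage is extrapolated individually; their sum predicts the job.
```

Predicting each stage separately, as the abstract describes, lets a model account for stages with different resource profiles instead of scaling the whole job by a single factor.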
Facial Landmark Detection with Tweaked Convolutional Neural Networks
This paper concerns the problem of facial landmark detection. We provide a unique new analysis of the features produced at intermediate layers of a convolutional neural network (CNN) trained to regress facial landmark coordinates. This analysis shows that while being processed by the CNN, face images can be partitioned in an unsupervised manner into subsets containing faces in similar poses (i.e., 3D views) and facial properties (e.g., presence or absence of eye-wear). Based on this finding, we describe a novel CNN architecture, specialized to regress the facial landmark coordinates of faces in specific poses and appearances. To address the shortage of training data, particularly in extreme profile poses, we additionally present data augmentation techniques designed to provide sufficient training examples for each of these specialized sub-networks. The proposed Tweaked CNN (TCNN) architecture is shown to outperform existing landmark detection methods in an extensive battery of tests on the AFW, AFLW, and 300W benchmarks. Finally, to promote reproducibility of our results, we make code and trained models publicly available through our project webpage.
A Conversation with Linda Tillery and Mary Watkins
and June Millington have brought in pop/rock influences; Izquierda and Holly Near have drawn on Latin rhythms. Watkins and Tillery are largely responsible for the strong strain in women's music of various black musical idioms-jazz, rhythm and blues, gospel. I spoke with Tillery and Watkins the day after their featured performance at a women's music festival in Austin, Texas. The festival had been a perfect illustration of the eclecticism of women's music. All the performers except these two were local, and all were women, but they represented many different musical traditions. The circumstances of the interview seemed to steer the conversation toward the topics of originality versus tradition, the dynamics between the musician and the audience, and women in music. And because I believe that we owe it to ourselves and our artists to understand the processes of their art, I asked some theoretical questions about how songs are written, how arrangements are established, how it feels to collaborate with other musicians, and what is involved in the recording process.
Attention, Distraction, and Cognitive Control Under Load
The extent to which people can focus attention in the face of irrelevant distractions has been shown to critically depend on the level and type of information load involved in their current task. The ability to focus attention improves under task conditions of high perceptual load but deteriorates under conditions of high load on cognitive control processes such as working memory. I review recent research on the effects of load on visual awareness and brain activity, including changing effects over the life span, and I outline the consequences for distraction and inattention in daily life and in clinical populations.
Quasi-newton methods for real-time simulation of hyperelastic materials
We present a new method for real-time physics-based simulation supporting many different types of hyperelastic materials. Previous methods such as Position Based or Projective Dynamics are fast, but support only a limited selection of materials; even classical materials such as Neo-Hookean elasticity are not supported. Recently, Xu et al. [2015] introduced new “spline-based materials” which can be easily controlled by artists to achieve desired animation effects. Simulation of these types of materials currently relies on Newton’s method, which is slow, even with only one iteration per timestep. In this paper, we show that Projective Dynamics can be interpreted as a quasi-Newton method. This insight enables very efficient simulation of a large class of hyperelastic materials, including the Neo-Hookean, spline-based materials, and others. The quasi-Newton interpretation also allows us to leverage ideas from numerical optimization. In particular, we show that our solver can be further accelerated using L-BFGS updates (limited-memory Broyden-Fletcher-Goldfarb-Shanno algorithm). Our final method is typically more than 10 times faster than one iteration of Newton’s method without compromising quality. In fact, our result is often more accurate than the result obtained with one iteration of Newton’s method. Our method is also easier to implement, implying reduced software development costs.
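The quasi-Newton reading of Projective Dynamics amounts to replacing the exact Hessian of the elastic energy with a fixed matrix that can be prefactored once and reused at every time step. A minimal one-dimensional sketch of that step rule follows; the toy spring energy and the constant Hessian estimate are illustrative, not the paper's actual solver:

```python
def quasi_newton_minimize(grad, h_approx, x0, iters=50):
    """Minimize an energy by x <- x - grad(x) / h_approx, where
    h_approx is a constant approximation of the Hessian. Because it
    never changes, the full-dimensional analogue can be factorized
    once and back-substituted cheaply at every iteration."""
    x = x0
    for _ in range(iters):
        x -= grad(x) / h_approx
    return x

# Toy "material" energy E(x) = 0.5 * k * (x - rest)**2 with stiffness
# k = 4 and rest length 1. The exact Hessian is k, but we deliberately
# use a constant over-estimate h_approx = 6, mimicking the fixed
# system matrix of a Projective Dynamics-style solver.
k, rest = 4.0, 1.0
grad = lambda x: k * (x - rest)
x_min = quasi_newton_minimize(grad, h_approx=6.0, x0=3.0)
# The iteration contracts the error by k / h_approx each step and
# converges to the rest configuration x = 1.0.
```

Over-estimating the Hessian trades some convergence speed for guaranteed descent, which is one reason such fixed-matrix solvers remain stable at real-time rates.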
Strategic niche management and sustainable innovation journeys: theory, findings, research agenda, and policy
This article discusses empirical findings and conceptual elaborations of the last 10 years in strategic niche management research (SNM). The SNM approach suggests that sustainable innovation journeys can be facilitated by creating technological niches, i.e. protected spaces that allow the experimentation with the co-evolution of technology, user practices, and regulatory structures. The assumption was that if such niches were constructed appropriately, they would act as building blocks for broader societal changes towards sustainable development. The article shows how concepts and ideas have evolved over time and new complexities were introduced. Research focused on the role of various niche-internal processes such as learning, networking, visioning and the relationship between local projects and global rule sets that guide actor behaviour. The empirical findings showed that the analysis of these niche-internal dimensions needed to be complemented with attention to niche external processes. In this respect, the multi-level perspective proved useful for contextualising SNM. This contextualisation led to modifications in claims about the dynamics of sustainable innovation journeys. Niches are to be perceived as crucial for bringing about regime shifts, but they cannot do this on their own. Linkages with ongoing external processes are also important. Although substantial insights have been gained, the SNM approach is still an unfinished research programme. We identify various promising research directions, as well as policy implications.
Neural Approaches to Conversational AI
This tutorial surveys neural approaches to conversational AI that were developed in the last few years. We group conversational systems into three categories: (1) question answering agents, (2) task-oriented dialogue agents, and (3) social bots. For each category, we present a review of state-of-the-art neural approaches, draw the connection between neural approaches and traditional symbolic approaches, and discuss the progress we have made and challenges we are facing, using specific systems and models as case studies.
An Experimental Evaluation of Apple Siri and Google Speech Recognition
We perform an experimental evaluation of two popular cloud-based speech recognition systems. Cloud-based speech recognition systems enhance Web surfing, transportation, health care, etc. Using voice commands helps drivers stay connected to the Internet by avoiding traffic safety risks. The performance of these types of applications should be robust under difficult network conditions. User frustration with network traffic problems can affect the usability of these applications. We evaluate the performance of two popular cloud-based speech recognition applications, Apple Siri and Google Speech Recognition (GSR), under various network conditions. We evaluate the transcription delay and transcription accuracy of each application under different packet loss and jitter values. Results of our study show that the performance of cloud-based speech recognition systems can be affected by jitter and packet loss, which commonly occur over WiFi and cellular network connections. Keywords: Cloud Speech Recognition, Quality of Experience, Software Measurement, Streaming Media, Real-time Systems.
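Transcription accuracy in such evaluations is commonly scored by word error rate (WER): the word-level edit distance between the recognizer output and a reference transcript, normalized by the reference length. A minimal sketch of the standard dynamic-programming computation follows; the example sentences are illustrative:

```python
def word_error_rate(reference, hypothesis):
    """Word error rate: (substitutions + insertions + deletions) / N,
    computed with the classic dynamic-programming edit distance
    over word tokens, where N is the reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution/match
    return dp[len(ref)][len(hyp)] / len(ref)

wer = word_error_rate("turn left at the next light",
                      "turn left at next night")
# One deletion ("the") plus one substitution ("light" -> "night")
# over a 6-word reference gives WER = 2/6.
```

Reporting WER alongside transcription delay separates the accuracy cost of degraded network conditions from their latency cost, which is how the two metrics in the abstract complement each other.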
Lumbar spondylolysis: a review
Spondylolysis is an osseous defect of the pars interarticularis, thought to be a developmental or acquired stress fracture secondary to chronic low-grade trauma. It is encountered most frequently in adolescents, most commonly involving the lower lumbar spine, with particularly high prevalence among athletes involved in certain sports or activities. Spondylolysis can be asymptomatic or can be a cause of spine instability, back pain, and radiculopathy. The biomechanics and pathophysiology of spondylolysis are complex and debated. Imaging is utilized to detect spondylolysis, distinguish acute and active lesions from chronic inactive non-union, help establish prognosis, guide treatment, and to assess bony healing. Radiography with satisfactory technical quality can often demonstrate a pars defect. Multislice CT with multiplanar reformats is the most accurate modality for detecting the bony defect and may also be used for assessment of osseous healing; however, as with radiographs, it is not sensitive for detection of the early edematous stress response without a fracture line and exposes the patient to ionizing radiation. Magnetic resonance (MR) imaging should be used as the primary investigation for adolescents with back pain and suspected stress reactions of the lumbar pars interarticularis. Several imaging pitfalls render MR imaging less sensitive than CT for directly visualizing the pars defects (regional degenerative changes and sclerosis). Nevertheless, the presence of bone marrow edema on fluid-sensitive images is an important early finding that may suggest stress response without a visible fracture line. Moreover, MR is the imaging modality of choice for identifying associated nerve root compression. Single-photon emission computed tomography (SPECT) use is limited by a high rate of false-positive and false-negative results and by considerable ionizing radiation exposure. 
In this article, we provide a review of the current concepts regarding spondylolysis, its epidemiology, pathogenesis, and general treatment guidelines, as well as a detailed review and discussion of the imaging principles for the diagnosis and follow-up of this condition.