title | abstract
---|---|
Optimal Trajectory Generation Using Model Predictive Control for Aerially Towed Cable Systems | This paper studies trajectory generation for a mothership that tows a drogue using a flexible cable. The contributions of this paper include model validation for the towed cable system described by a lumped mass extensible cable using flight data, and optimal trajectory generation for the towed cable system with tension constraints using model predictive control. The optimization problem is formulated using a combination of the squared-error and L1-norm objective functions. Different desired circular trajectories of the towed body are used to calculate optimal trajectories for the towing vehicle subject to performance limits and wind disturbances. Trajectory generation for transitions from straight and level flight into an orbit is also presented. The computational efficiency is demonstrated, which is essential for potential real-time applications. This paper gives a framework for specifying an arbitrary flight path for the towed body by optimizing the action of the towing vehicle subject to constraints. |
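The mixed objective described above (squared tracking error plus an L1 term, subject to input limits) can be sketched on a toy problem. The 1D double-integrator model, horizon, weights, and proximal-gradient solver below are illustrative stand-ins, not the paper's towed-cable model or MPC formulation:

```python
import numpy as np

# Toy stand-in for the paper's trajectory optimization: a 1D double
# integrator must track a reference path, minimizing squared tracking
# error plus an L1 penalty on the input, under an input bound. Solved by
# proximal gradient (ISTA); horizon, weights and limits are illustrative.

dt, N = 0.1, 20
ref = np.linspace(0.0, 1.0, N)                 # desired positions
lam, u_max = 1e-3, 5.0

# positions are a linear map of the inputs: x = M u
M = np.array([[dt * dt * (k - j + 1) if j <= k else 0.0
               for j in range(N)] for k in range(N)])

def obj(u):
    return np.sum((M @ u - ref) ** 2) + lam * np.sum(np.abs(u))

L = np.linalg.norm(M, 2) ** 2                  # Lipschitz constant of the gradient
u = np.zeros(N)
for _ in range(3000):
    g = M.T @ (M @ u - ref)                    # gradient of the squared error
    u = u - g / L
    u = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)   # L1 prox
    u = np.clip(u, -u_max, u_max)              # input (performance) limit

final_err = abs((M @ u)[-1] - ref[-1])
print(final_err)
```

The same structure (quadratic tracking cost, L1 regularization, box constraints on the action) carries over to the full cable model, only with richer dynamics inside `M`.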
The Impact of Institutional Credit on Agricultural Production in Pakistan | Three main factors contribute to agricultural growth: the increased use of agricultural inputs, technological change and technical efficiency. Technological change is the result of research and development efforts, while the technical efficiency with which new technology is adopted and used is affected by the flow of information, better infrastructure, the availability of funds and farmers' managerial capabilities. A higher use and better mix of inputs also require funds at the farmers' disposal. These funds can come either from farmers' own savings or through borrowing. In less developed countries like Pakistan, where savings are negligible, especially among small farmers, agricultural credit appears to be an essential input, along with modern technology, for higher productivity. The credit requirements of the farming sector have increased rapidly over the past few decades as a result of the rising use of fertiliser, biocides, improved seeds and mechanisation, and the hike in their prices. The agricultural credit system of Pakistan consists of informal and formal sources of credit supply. The informal sources include friends, relatives, commission agents, traders and private moneylenders. At present, the formal credit sources comprise financial institutions such as Zarai Taraqiati Bank Limited (ZTBL), formerly known as the Agricultural Development Bank of Pakistan (ADBP), commercial banks, and the Federal Bank for Cooperatives. Recently, some non-government organisations (NGOs) have also been advancing agricultural credit to rural communities. As in most developing countries, the expansion of subsidised institutional credit has been widely exercised in Pakistan. The target is to attain higher |
The autophagy gene BbATG5, involved in the formation of the autophagosome, contributes to cell differentiation and growth but is dispensable for pathogenesis in the entomopathogenic fungus Beauveria bassiana. | Autophagy is a highly conserved process, representing the major eukaryotic degradative pathway of cellular components. Autophagy-mediated recycling of cellular materials contributes to cell differentiation, tissue remodelling and proper development. In fungi, autophagy is required for normal growth and cell differentiation. The entomopathogenic fungus Beauveria bassiana and its invertebrate targets represent a unique model system with which to examine host-pathogen interactions. The ATG5 gene is one of 17 involved in autophagosome formation, and the B. bassiana homologue (BbATG5) was identified. The role of autophagy in B. bassiana growth and virulence was investigated via construction of a targeted gene knockout of BbATG5. The mutant strain displayed increased sensitivity to nutrient limitation, with decreased germination and growth as compared with the wild-type parent. Conidiation was severely compromised and conidia derived from the ΔBbATG5 strain were altered in morphology. Cell differentiation into blastospores was also greatly reduced. Despite the significant growth and developmental defects, insect bioassays using the oriental leafworm moth, Spodoptera litura, indicated a modest (~40 %) decrease in virulence in the ΔBbATG5 strain. The phenotypic defects of the ΔBbATG5 strain could be restored by introduction of an intact copy of BbATG5. These data suggest that unlike several plant and animal pathogenic fungi, where ATG5 is required for infection, in B. bassiana it is dispensable for pathogenesis. |
THE RESTORATIVE BENEFITS OF NATURE: TOWARD AN INTEGRATIVE FRAMEWORK | Directed attention plays an important role in human information processing; its fatigue, in turn, has far-reaching consequences. Attention Restoration Theory provides an analysis of the kinds of experiences that lead to recovery from such fatigue. Natural environments turn out to be particularly rich in the characteristics necessary for restorative experiences. An integrative framework is proposed that places both directed attention and stress in the larger context of human-environment relationships. © 1995 Academic Press Limited |
Concept Mask: Large-Scale Segmentation from Semantic Concepts | Existing works on semantic segmentation typically consider a small number of labels, ranging from tens to a few hundreds. With a large number of labels, training and evaluation of such task become extremely challenging due to correlation between labels and lack of datasets with complete annotations. We formulate semantic segmentation as a problem of image segmentation given a semantic concept, and propose a novel system which can potentially handle an unlimited number of concepts, including objects, parts, stuff, and attributes. We achieve this using a weakly and semi-supervised framework leveraging multiple datasets with different levels of supervision. We first train a deep neural network on a 6M stock image dataset with only image-level labels to learn visual-semantic embedding on 18K concepts. Then, we refine and extend the embedding network to predict an attention map, using a curated dataset with bounding box annotations on 750 concepts. Finally, we train an attention-driven class agnostic segmentation network using an 80-category fully annotated dataset. We perform extensive experiments to validate that the proposed system performs competitively to the state of the art on fully supervised concepts, and is capable of producing accurate segmentations for weakly learned and unseen concepts. |
Deep Logic Networks: Inserting and Extracting Knowledge From Deep Belief Networks | Developments in deep learning have seen the use of layerwise unsupervised learning combined with supervised learning for fine-tuning. With this layerwise approach, a deep network can be seen as a more modular system that lends itself well to learning representations. In this paper, we investigate whether such modularity can be useful for the insertion of background knowledge into deep networks and whether it can improve learning performance when such knowledge is available, as well as for the extraction of knowledge from trained deep networks and whether it can offer a better understanding of the representations learned by such networks. To this end, we use a simple symbolic language, a set of logical rules that we call confidence rules, and show that it is suitable for the representation of quantitative reasoning in deep networks. We show by knowledge extraction that confidence rules can offer a low-cost representation for layerwise networks (or restricted Boltzmann machines). We also show that layerwise extraction can produce an improvement in the accuracy of deep belief networks. Furthermore, the proposed symbolic characterization of deep networks provides a novel method for the insertion of prior knowledge and training of deep networks. With the use of this method, a deep neural-symbolic system is proposed and evaluated, with the experimental results indicating that modularity through the use of confidence rules and knowledge insertion can be beneficial to network performance. |
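The flavour of a confidence rule can be shown with a minimal sketch. The encoding below (weights of plus or minus the confidence value for positive and negated literals, a bias at half the confidence) is a simplification for illustration, not the paper's exact construction:

```python
import numpy as np

# Sketch of inserting a "confidence rule" into an RBM-style unit: the rule
# c : h <-> x1 AND NOT x2 is encoded as weight +c on the positive literal,
# -c on the negated one, with a bias that makes the hidden unit fire only
# when the rule body is satisfied. Encoding details are a simplification.

c = 5.0                                     # confidence value of the rule
w = np.array([c, -c])                       # x1 positive, x2 negated
b = -c / 2                                  # threshold between 0 and 1 satisfied literals

def hidden_on(x):
    return 1 / (1 + np.exp(-(w @ x + b))) > 0.5

print(hidden_on(np.array([1, 0])),   # body true   -> unit fires
      hidden_on(np.array([1, 1])),   # x2 violates -> silent
      hidden_on(np.array([0, 0])))   # x1 missing  -> silent
```

Extraction runs this encoding in reverse: large-magnitude weights of a trained unit are read off as (possibly negated) literals of a rule whose confidence reflects the weight magnitude.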
Bow-tie rectenna arrays | This paper presents experimental results on several self-complementary bow-tie antenna arrays populated with rectifier diodes. Results with two different diodes are compared, and the number of diodes, antenna polarization, DC load, DC connection (parallel/series) and incident power density are varied. The transmitter is in the far field and the incident power density ranges from 10 to 125 μW/cm² at 2.3 GHz. The main conclusion of this work is that it is possible to transmit useful power wirelessly, using low and safe power density levels (tens of μW/cm²), and receive it with a scalable device. |
Maximum likelihood multiple-source localization using acoustic energy measurements with wireless sensor networks | A maximum likelihood (ML) acoustic source location estimation method is presented for application in a wireless ad hoc sensor network. This method uses acoustic signal energy measurements taken at individual sensors of an ad hoc wireless sensor network to estimate the locations of multiple acoustic sources. Compared to existing acoustic energy based source localization methods, the proposed ML method delivers more accurate results and offers the enhanced capability of multiple-source localization. A multiresolution search algorithm and an expectation-maximization (EM)-like iterative algorithm are proposed to expedite the computation of source locations. The Cramér-Rao Bound (CRB) of the ML source location estimate has been derived. The CRB is used to analyze the impact of sensor placement on the accuracy of location estimates for the single-target scenario. Extensive simulations have been conducted, and it is observed that the proposed ML method consistently outperforms existing acoustic energy based source localization methods. An example applying this method to track military vehicles using real-world experimental data also demonstrates the performance advantage of the proposed method over a previously proposed acoustic energy source localization method. |
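The energy-based ML idea can be sketched for a single source. Sensor energies follow the usual inverse-square decay, and a two-stage coarse-to-fine grid search stands in for the paper's multiresolution algorithm; the field size, sensor count, and noise level are invented:

```python
import numpy as np

# Toy sketch of energy-based ML localization (single source): each sensor
# reads energy S / d^2 plus noise; for every candidate location the source
# power is fitted by least squares and the residual plays the role of the
# negative log-likelihood. A coarse pass is refined around the winner.

rng = np.random.default_rng(0)
sensors = rng.uniform(0, 10, size=(20, 2))
true_src, true_power = np.array([6.3, 4.1]), 50.0

d2 = np.sum((sensors - true_src) ** 2, axis=1)
energy = true_power / d2 + rng.normal(0, 0.01, size=20)

def best_grid_point(xs, ys):
    best, best_pt = np.inf, None
    for x in xs:
        for y in ys:
            dd = np.sum((sensors - np.array([x, y])) ** 2, axis=1)
            g = 1.0 / dd
            s_hat = g @ energy / (g @ g)        # ML estimate of source power
            resid = np.sum((energy - s_hat * g) ** 2)
            if resid < best:
                best, best_pt = resid, (x, y)
    return best_pt

# coarse pass over the whole field, then a fine pass around the winner
cx, cy = best_grid_point(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
fx, fy = best_grid_point(np.linspace(cx - 0.5, cx + 0.5, 21),
                         np.linspace(cy - 0.5, cy + 0.5, 21))
err = np.hypot(fx - true_src[0], fy - true_src[1])
print(err)
```

The multiple-source case replaces the single fitted power with a vector of powers and alternates between assigning energy to sources and re-estimating locations, which is where the EM-like iteration comes in.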
Detection of Road Cracks with Multiple Images | Extracting road pavement defects from images is difficult, and, most of the time, a single image is used alone. The difficulties of this task are illumination changes, objects on the road, and artefacts due to the dynamic acquisition. In this work, we try to solve some of these problems by using acquisitions from different points of view. Consequently, we present a new methodology based on these steps: the detection of defects in each image, the matching of the images, and the merging of the different extractions. We show the increase in performance and, more particularly, how false detections are reduced. |
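The merging step can be illustrated with a minimal voting scheme. The detection maps below are assumed to be already registered, and the crack, noise rate, and vote threshold are invented for illustration:

```python
import numpy as np

# Minimal sketch of the fusion idea: after matching (registration), the
# per-image crack detections are merged by majority vote, which suppresses
# false positives that appear in only one view while keeping the crack.

rng = np.random.default_rng(1)
h, w = 50, 50
truth = np.zeros((h, w), bool)
truth[25, 5:45] = True                      # a horizontal crack

views = []
for _ in range(3):
    det = truth.copy()
    det |= rng.random((h, w)) < 0.02        # independent false detections
    views.append(det)

fused = np.sum(views, axis=0) >= 2          # keep pixels seen in >= 2 views

fp_single = np.sum(views[0] & ~truth)
fp_fused = np.sum(fused & ~truth)
print(fp_single, fp_fused)
```

Because independent false detections rarely coincide across views, the fused map has far fewer false positives than any single-view map while the true crack, visible in every view, survives the vote.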
The Microprocessor: Engine of the Technology Revolution | computer industry. Things change so fast, no simple extrapolation of the present is likely to resemble the future in any important way. The impact of the microprocessor is even more daunting to predict, since it is revolutionizing so many facets of modern life. When Bob Noyce and I founded Intel in 1968, our challenge was to make semiconductor memory practical. This goal was quite a stretch, considering that silicon memory was at least 100 times more expensive than magnetic core memory, the dominant technology at the time. Having delivered our first couple of memory products to market, we were looking for new worlds to conquer when the Japanese manufacturer Busicom asked us to design a set of chips for a family of programmable scientific and business calculators. At the time, all logic chips were hard-wired for a particular application. Busicom's family of calculators required at least a dozen different chips, a task that far outstripped the design engineering capacity of tiny Intel. But Intel engineer Ted Hoff realized a general-purpose computer architecture could do all the calculator functions and need not be more complex than the memory chips we were already working with. He proposed a CPU chip that took instructions from a stored program in read-only memory. |
Dependency grammars as Haskell programs | In this paper we try to show that a lazy functional language such as Haskell is a convenient framework not only for implementing dependency parsers but also for expressing dependency grammars directly in the programming language in a compact, readable and mathematically clean way. The parser core, supplying the necessary types and functions, is presented together with two examples of grammars: one trivial and one more elaborate, allowing the expression of a range of complex grammatical constraints such as long-distance agreement. The complete Haskell code of the parser core as well as the grammar examples is included. |
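The paper embeds grammars in Haskell; as a language-neutral sketch of the same idea, a tiny dependency grammar can likewise be written directly as program data, here a map from head part-of-speech to the dependents it licenses, plus a checker over trees. The tags and example sentence are invented:

```python
# A dependency grammar expressed directly as program values: LICENSES maps
# a head POS to the set of dependent POS tags it may govern, and
# well_formed checks that every arc in a dependency tree is licensed.

LICENSES = {
    "V": {"N", "ADV"},      # a verb may govern nouns and adverbs
    "N": {"DET", "ADJ"},    # a noun may govern determiners and adjectives
}

def well_formed(tree):
    """tree: (pos, [subtrees]); every arc must be licensed by the head."""
    pos, deps = tree
    return all(d[0] in LICENSES.get(pos, set()) and well_formed(d)
               for d in deps)

# "the big dog barks loudly"
sent = ("V", [("N", [("DET", []), ("ADJ", [])]), ("ADV", [])])
bad = ("N", [("ADV", [])])                  # an adverb under a noun: rejected
print(well_formed(sent), well_formed(bad))
```

In Haskell the same grammar becomes ordinary typed values and functions, which is what lets constraints like long-distance agreement be stated compactly and checked by the parser core.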
Vaccine injection technique and reactogenicity--evidence for practice. | There are inconsistencies in recommendations and practice with regard to how best to administer vaccines. This review evaluates the literature on intramuscular vaccine administration technique in primarily paediatric populations and concludes from available evidence which aspects of vaccine administration are associated with reactogenicity. Variables with the best evidence to support practice to reduce reactogenicity were: site of injection (less reactogenicity has been noted when the buttock is used rather than the thigh); tissue (fewer reactions are noted when vaccine is administered intramuscularly rather than subcutaneously); needle length (longer needles are associated with less reactogenicity); and angle of injection (a 90-degree angle is associated with less reactogenicity than a reduced angle). Despite a need for more empirical studies, there appear to be several vaccine administration techniques relating to needle angle, length, site and depth of injection that result in fewer reactions, and these could be considered for public health policy, in conjunction with immunogenicity. |
Native Language Identification Using Large, Longitudinal Data | Native Language Identification (NLI) is a task aimed at determining the native language (L1) of learners of second language (L2) on the basis of their written texts. To date, research on NLI has focused on relatively small corpora. We apply NLI to the recently released EFCamDat corpus which is not only multiple times larger than previous L2 corpora but also provides longitudinal data at several proficiency levels. Our investigation using accurate machine learning with a wide range of linguistic features reveals interesting patterns in the longitudinal data which are useful for both further development of NLI and its application to research on L2 acquisition. |
Composite Scale of Morningness: psychometric properties, validity with Munich ChronoType Questionnaire and age/sex differences in Poland. | The present study aimed at testing psychometric properties of the Composite Scale of Morningness (CSM) and validating it with mid-sleep on free days (MSF) derived from the Munich ChronoType Questionnaire (MCTQ) in Poland, along with analyzing age and sex differences in the CSM and MSF. A sample of 952 Polish residents (62.6% females) aged between 13 and 46 was tested. Additionally, a sample of 33 university students were given the MCTQ and filled in a sleep diary for 8 days. MSF derived from the MCTQ was related to the one from the sleep diary (r=.44). The study revealed good reliability of the CSM (α=.84) and its validity: greater morningness preference was associated with earlier MSF from the MCTQ (r=-.52). CSM scores were distributed over its full range, with a mean of 34, and did not differ between sexes, although females were earlier than males by 23 minutes in MSF. Regarding age, eveningness estimated with both CSM and MSF was greatest in subjects aged 16-18 years, and a shift toward eveningness during puberty and a shift back toward morningness in older age was observed. The Polish version of the CSM consisted of two components of morningness. Cutoff scores were: for evening types (lower 10%) 24 or less, for morning types (upper 10%) 43 or more. The Polish CSM presents good psychometric properties, which are similar to those reported in other language versions, and also presents sex/age patterns similar to those found previously. |
Microstructures of superhydrophobic plant leaves - inspiration for efficient oil spill cleanup materials. | The cleanup of accidental oil spills in water is an enormous challenge; conventional oil sorbents absorb large amounts of water in addition to oil and other cleanup methods can cause secondary pollution. In contrast, fresh leaves of the aquatic ferns Salvinia are superhydrophobic and superoleophilic, and can selectively absorb oil while repelling water. These selective wetting properties are optimal for natural oil absorbent applications and bioinspired oil sorbent materials. In this paper we quantify the oil absorption capacity of four Salvinia species with different surface structures, water lettuce (Pistia stratiotes) and Lotus leaves (Nelumbo nucifera), and compare their absorption capacity to artificial oil sorbents. Interestingly, the oil absorption capacities of Salvinia molesta and Pistia stratiotes leaves are comparable to artificial oil sorbents. Therefore, these pantropical invasive plants, often considered pests, qualify as environmentally friendly materials for oil spill cleanup. Furthermore, we investigated the influence of oil density and viscosity on the oil absorption, and examine how the presence and morphology of trichomes affect the amount of oil absorbed by their surfaces. Specifically, the influence of hair length and shape is analyzed by comparing different hair types ranging from single trichomes of Salvinia cucullata to complex eggbeater-shaped trichomes of Salvinia molesta to establish a basis for improving artificial bioinspired oil absorbents. |
VLSI Design of SVM-Based Seizure Detection System With On-Chip Learning Capability | Portable automatic seizure detection system is very convenient for epilepsy patients to carry. In order to make the system on-chip trainable with high efficiency and attain high detection accuracy, this paper presents a very large scale integration (VLSI) design based on the nonlinear support vector machine (SVM). The proposed design mainly consists of a feature extraction (FE) module and an SVM module. The FE module performs the three-level Daubechies discrete wavelet transform to fit the physiological bands of the electroencephalogram (EEG) signal and extracts the time–frequency domain features reflecting the nonstationary signal properties. The SVM module integrates the modified sequential minimal optimization algorithm with the table-driven-based Gaussian kernel to enable efficient on-chip learning. The presented design is verified on an Altera Cyclone II field-programmable gate array and tested using the two publicly available EEG datasets. Experiment results show that the designed VLSI system improves the detection accuracy and training efficiency. |
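The table-driven Gaussian kernel mentioned above can be sketched in software: precompute exp(-t) on a grid, index it by the scaled squared distance, and compare against the exact kernel. The table size, range, and gamma are illustrative values, not the paper's design parameters:

```python
import numpy as np

# Software sketch of a table-driven Gaussian kernel: hardware replaces the
# exp() evaluation with a lookup table over the exponent t = gamma*||x-y||^2,
# trading a small approximation error for a much cheaper circuit.

gamma, table_bits = 0.5, 12
t_max = 20.0                                   # exp(-t) is ~0 beyond this
table = np.exp(-np.linspace(0.0, t_max, 2 ** table_bits))
step = t_max / (2 ** table_bits - 1)

def kernel_table(x, y):
    t = gamma * np.sum((x - y) ** 2)
    idx = min(int(round(t / step)), len(table) - 1)
    return table[idx]

rng = np.random.default_rng(2)
errs = []
for _ in range(200):
    x, y = rng.normal(size=4), rng.normal(size=4)
    exact = np.exp(-gamma * np.sum((x - y) ** 2))
    errs.append(abs(kernel_table(x, y) - exact))
print(max(errs))
```

With a 12-bit table the worst-case error is bounded by roughly half a table step, small enough that the SVM decision values, and hence the detection labels, are essentially unchanged.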
Data replication policy in a cloud computing environment | In a cloud computing environment, data replication is frequently used for failure recovery. However, data replication can also be used to improve application execution performance. In this context, this article presents a data replication policy for a computational cloud holding bioinformatics data, such that the execution time of bioinformatics applications is reduced. |
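A policy of the kind the abstract describes can be sketched as a simple greedy rule: replicate the most frequently accessed datasets onto the least-loaded nodes so that jobs can read a local copy. The dataset names, access counts, node loads, and replica budget below are all invented:

```python
# Hypothetical sketch of a performance-oriented replication policy:
# the hottest datasets are replicated first, onto the least-loaded nodes,
# so future jobs can read a nearby copy instead of a remote one.

access_count = {"genome_a": 120, "genome_b": 15, "proteins": 64}
node_load = {"n1": 0.9, "n2": 0.2, "n3": 0.5}
replica_budget = 2                      # replicas per hot dataset

hot = sorted(access_count, key=access_count.get, reverse=True)
cool_nodes = sorted(node_load, key=node_load.get)

# replicate the two hottest datasets onto the two coolest nodes
placement = {ds: cool_nodes[:replica_budget] for ds in hot[:2]}
print(placement)
```

A real policy would also weigh storage cost and re-replication on failure; this sketch only shows the access-frequency/load trade-off that drives the execution-time reduction.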
Skip-Thought Vectors | We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice. We will make our encoder publicly available. |
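The vocabulary-expansion step amounts to fitting a linear map from a large pretrained word space into the encoder's word space using words present in both, then pushing unseen words through that map. The embeddings below are synthetic; dimensions and sample counts are illustrative:

```python
import numpy as np

# Sketch of vocabulary expansion by linear mapping: W is learned by least
# squares on the shared vocabulary, then any word with a pretrained vector
# can be embedded in the encoder's space, even if never seen in training.

rng = np.random.default_rng(3)
d_big, d_enc, shared = 50, 20, 300

W_true = rng.normal(size=(d_big, d_enc)) / np.sqrt(d_big)
X_big = rng.normal(size=(shared, d_big))            # pretrained word vectors
X_enc = X_big @ W_true + 0.01 * rng.normal(size=(shared, d_enc))

W, *_ = np.linalg.lstsq(X_big, X_enc, rcond=None)   # least-squares fit

unseen = rng.normal(size=d_big)                     # word absent from training
projected = unseen @ W
err = np.linalg.norm(projected - unseen @ W_true)
print(err)
```

Because the map is linear and fitted once, expanding from the training vocabulary to a million words costs only one matrix multiply per new word.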
A spiral antenna over a high-impedance surface consisting of fan-shaped patch cells | Radiation characteristics of a spiral antenna over a high-impedance surface (HIS) are analyzed. The HIS consists of fan-shaped patch cells. The fan-shaped patches are arranged homogeneously in the circumferential direction but are set non-homogeneously in the radial direction. The analysis is performed using method of moment. Radiation characteristics of a spiral antenna with a perfect electric conductor (PEC) reflector are analyzed. It is reaffirmed that wideband radiation characteristics, including input impedance and axial ratio are deteriorate. Subsequently, Radiation characteristics of a spiral antenna with a fan-shaped HIS reflector are analyzed. It is revealed that input impedance and axial ratio are mitigated by replacing the PEC reflector with the fan-shaped HIS reflector. |
Stably free modules over R[X] of rank > dim R are free | We prove that for any finite-dimensional ring R and n > dim R + 2, the group E_n(R[X]) acts transitively on Um_n(R[X]). In particular, we obtain that for any finite-dimensional ring R, all finitely generated stably free modules over R[X] of rank > dim R are free. This result was previously known only for Noetherian rings. The proof we give is short, simple, and constructive. |
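The statement can be written compactly in symbols; the display below merely restates the abstract's claim about unimodular rows, with the standard convention that the action is by right multiplication:

```latex
% For a commutative ring R of Krull dimension d = \dim R:
\[
  n \ge \dim R + 3 \;\Longrightarrow\;
  \mathrm{E}_n(R[X]) \ \text{acts transitively on}\ \mathrm{Um}_n(R[X]),
\]
% i.e. every unimodular row can be reduced to the first standard basis row
% by elementary matrices:
\[
  \forall\, v \in \mathrm{Um}_n(R[X])\ \exists\, \sigma \in \mathrm{E}_n(R[X]):
  \quad v\,\sigma = (1, 0, \dots, 0).
\]
```

Completing unimodular rows in this way is exactly what makes the kernel of the corresponding projection, and hence any stably free module of rank greater than dim R, free.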
Clearwater: An Extensible, Pliable, and Customizable Approach to Code Generation | Since the advent of the RPC stub generator, software tools that translate a high-level specification into executable programs have been instrumental in facilitating the development of distributed software systems. Developers write programs at a high level of abstraction, with high readability and reduced initial development cost. However, existing approaches to building code generation tools for such systems have difficulty evolving these tools to meet the challenges of new standards, new platforms and languages, or changing product scopes, resulting in generator tools with a limited lifespan.
The difficulties in evolving generator tools can be characterized as a combination of three challenges that appear inherently difficult to solve simultaneously: the abstraction mapping challenge (translating a high-level abstraction into a low-level implementation), the interoperable heterogeneity challenge stemming from multiple input and output formats, and the flexible customization challenge of extending base functionality for evolution or new applications. The Clearwater approach to code generation uses XML-based technologies and software tools to resolve these three challenges with three important code generation features: specification extensibility, whereby an existing specification format can accommodate extensions or variations at low cost; generator pliability, which allows the generator to operate on an extensible specification and/or support multiple and new platforms; and flexible customization, which allows an application developer to make controlled changes to the output of a code generator to support application-specific goals.
The presentation will outline the Clearwater approach and apply it to meet the above three challenges in two domain areas. The first area is information flow applications (e.g., multimedia streaming and event processing), a horizontal domain in which the ISG code generator creates QoS-customized communication code using the Infopipe abstraction and specification language. The second area is enterprise application staging (e.g., complex N-tier distributed applications), a vertical domain in which the Mulini code generator creates multiple types of source code supporting automatic staging of distributed heterogeneous applications in a data center environment. The success of applying Clearwater to these domains shows the effectiveness of our approach. |
Generic summarization and keyphrase extraction using mutual reinforcement principle and sentence clustering | A novel method for simultaneous keyphrase extraction and generic text summarization is proposed by modeling text documents as weighted undirected and weighted bipartite graphs. Spectral graph clustering algorithms are used for partitioning sentences of the documents into topical groups, with sentence link priors being exploited to enhance clustering quality. Within each topical group, saliency scores for keyphrases and sentences are generated based on a mutual reinforcement principle. The keyphrases and sentences are then ranked according to their saliency scores and selected for inclusion in the top keyphrase list and summaries of the document. The idea of building a hierarchy of summaries for documents capturing different levels of granularity is also briefly discussed. Our method is illustrated using several examples from news articles, news broadcast transcripts and web documents. |
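The mutual reinforcement principle can be shown on a toy term-sentence weight matrix: term saliency and sentence saliency boost each other until the iteration converges. The matrix below is invented for illustration; this is just the power iteration for the bipartite graph:

```python
import numpy as np

# Mutual reinforcement on a toy bipartite graph. W[i, j] is the weight of
# term i in sentence j; a term is salient if it occurs in salient sentences
# and vice versa, so u and v are updated from each other and normalized.

W = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 5.0]])

u = np.ones(W.shape[0])      # term saliency
v = np.ones(W.shape[1])      # sentence saliency
for _ in range(100):
    u = W @ v
    u /= np.linalg.norm(u)
    v = W.T @ u
    v /= np.linalg.norm(v)

top_term = int(np.argmax(u))
top_sent = int(np.argmax(v))
print(top_term, top_sent)
```

The fixed point is the dominant singular-vector pair of W, so the heavily weighted term 3 and the sentence carrying it come out on top; within each topical group, the top-ranked items become the keyphrases and summary sentences.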
Impact of maternal HIV-1 viremia on lymphocyte subsets among HIV-exposed uninfected infants: protective mechanism or immunodeficiency | BACKGROUND
Reports of increased morbidity and mortality from infectious diseases among HIV Exposed Uninfected (HEU) infants have raised concern about a possible underlying immunodeficiency among them. The objective of this study was to assess the immunological profile of HEU infants born to mothers exhibiting different levels of HIV-1 viremia at the time of delivery.
METHODS
Study subjects were enrolled in the Centre maternel et infantile sur le SIDA (CMIS) mother-child cohort between 1997 and 2010 (n =585). Infant CD4+ T cell, CD8+ T cell and CD19+ B cell counts were assessed at 2 and 6 months of age, and compared among HEU infants in groups defined by maternal viral load (VL) at the time of delivery (VL < 50 copies/ml, VL 50-1000 copies/ml, and VL > 1000 copies/ml) in a multivariable analysis.
RESULTS
At 2 months of age, infants born to mothers with VL > 1000 copies/ml had lower CD4+ T cell counts compared to those born to mothers with VL < 50 copies/ml at the time of delivery (44.3% versus 48.3%, p = 0.007, and 2884 vs. 2432 cells/mm3, p = 0.02). These differences remained significant after adjusting for maternal and infant antiretroviral drug use, gender, race and gestational age, and persisted at 6 months of age. There were no differences in CD8+ T cell count or absolute CD19+ B cell count between groups, though higher CD19+ B cell percentage was seen among infants born to mothers with VL > 1000 copies/ml.
CONCLUSIONS
These results suggest that exposure to high levels of HIV-1 viremia in utero, even in the absence of perinatal transmission, may affect the infant's developing immune system. While further work needs to be done to confirm these findings, they reinforce the need for optimal treatment of HIV infected pregnant women, and careful follow-up of HEU infants. |
Double-Quadrant State-of-Charge-Based Droop Control Method for Distributed Energy Storage Systems in Autonomous DC Microgrids | In this paper, a double-quadrant state-of-charge (SoC)-based droop control method for distributed energy storage systems is proposed to reach the proper power distribution in autonomous dc microgrids. In order to prolong the lifetime of the energy storage units (ESUs) and avoid the overuse of a certain unit, the SoC of each unit should be balanced and the injected/output power should be gradually equalized. Droop control, as a decentralized approach, is used as the basis of the power sharing method for distributed energy storage units. In the charging process, the droop coefficient is set to be proportional to the nth order of SoC, while in the discharging process, the droop coefficient is set to be inversely proportional to the nth order of SoC. Since the injected/output power is inversely proportional to the droop coefficient, it follows that in the charging process the ESU with higher SoC absorbs less power, while the one with lower SoC absorbs more power. Meanwhile, in the discharging process, the ESU with higher SoC delivers more power and the one with lower SoC delivers less power. Hence, SoC balancing and injected/output power equalization can be gradually realized. The exponent n of SoC is employed in the control diagram to regulate the speed of SoC balancing. It is found that with larger exponent n, the balancing speed is higher. A MATLAB/Simulink model comprising three ESUs is implemented and the simulation results are shown to verify the proposed approach. |
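The discharging rule above can be sketched numerically: with the droop coefficient inversely proportional to SoC^n, the delivered power is proportional to SoC^n, so the higher-SoC unit discharges faster and the SoC gap closes, faster for larger n. The capacities, load, and exponents below are illustrative values, not the paper's:

```python
import numpy as np

# Sketch of SoC-based droop in discharge: droop coefficient ~ 1/SoC^n,
# so each unit's share of the load is ~ SoC^n. Two unbalanced ESUs feed a
# constant total load; the SoC gap shrinks over time, faster for larger n.

def simulate(n, steps=2000, dt=1.0):
    soc = np.array([0.9, 0.6])          # two ESUs, unbalanced at t = 0
    load, cap = 0.02, 100.0             # total load power, unit capacity
    gaps = [soc[0] - soc[1]]
    for _ in range(steps):
        share = soc ** n
        p = load * share / share.sum()  # power inversely prop. to droop coeff.
        soc = soc - p * dt / cap
        gaps.append(soc[0] - soc[1])
    return gaps

g2, g6 = simulate(n=2), simulate(n=6)
print(g2[-1], g6[-1])
```

The run with n = 6 closes the gap noticeably faster than n = 2, which is the balancing-speed effect the abstract attributes to the exponent.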
Procedural Content Generation in Games | This chapter introduces the field of procedural content generation (PCG), as well as the book. We start by defining key terms, such as game content and procedural generation. We then give examples of games that use PCG, outline desirable properties, and provide a taxonomy of different types of PCG. Applications of and approaches to PCG can be described in many different ways, and another section is devoted to seeing PCG through the lens of design metaphors. The chapter finishes by providing an overview of the rest of the book.

1.1 What is procedural content generation?

You have just started reading a book about Procedural Content Generation in Games. This book will contain quite a lot of algorithms and other technical content, and plenty of discussion of game design. But before we get to the meat of the book, let us start with something a bit more dry: definitions. In particular, let us define Procedural Content Generation, which we will frequently abbreviate as PCG. The definition we will use is that PCG is the algorithmic creation of game content with limited or indirect user input [32]. In other words, PCG refers to computer software that can create game content on its own, or together with one or many human players or designers. A key term here is “content”. In our definition, content is most of what is contained in a game: levels, maps, game rules, textures, stories, items, quests, music, weapons, vehicles, characters, etc. The game engine itself is not considered to be content in our definition. Further, non-player character behaviour—NPC AI—is not considered to be content either. The reason for this narrowing of the definition of content is that within the field of artificial and computational intelligence in games, there is much more research done in applying CI and AI methods to character behaviour than there is on procedural content generation.
While the field of PCG is mostly based on AI methods, we want to set it apart from the more “mainstream” use of game-based tasks to test AI algorithms, where AI is most often used to learn to play a game. Like all definitions (except perhaps those in mathematics), our definition of PCG is somewhat arbitrary and rather fuzzy around the edges. We will treat it as such, and are mindful that other people define the term differently. In particular, some would rather use the term “generative methods” for a superset of what we call PCG [8]. Another important term is “games”. Games are famously hard to define (see Wittgenstein’s discussion of the matter [36]), and we will not attempt this here. Suffice it to say that by games we mean such things as videogames, computer games, board games, card games, puzzles, etc. It is important that the content generation system takes the design, affordances and constraints of the game that it is being generated for into account. This sets PCG apart from such endeavours as generative art and many types of computer graphics, which do not take the particular constraints and affordances of game design into account. In particular, a key requirement of generated content is that it must be playable—it should be possible to finish a generated level, ascend a generated staircase, use a generated weapon or win a generated game. The terms “procedural” and “generation” imply that we are dealing with computer procedures, or algorithms, that create something. A PCG method can be run by a computer (perhaps with human help), and will output something. A PCG system refers to a system that incorporates a PCG method as one of its parts, for example an adaptive game or an AI-assisted game design tool.
This book will contain plenty of discussion of algorithms and quite a lot of pseudocode, and most of the exercises that accompany the chapters will involve programming. To make this discussion more concrete, we will list a few things we consider to be PCG: • A software tool that creates dungeons for an action adventure game such as The Legend of Zelda without any human input—each time the tool is run, a new level is created; • a system that creates new weapons in a space shooter game in response to what the collective of players do, so that the weapons that a player is presented with are evolved versions of weapons other players found fun to use; • a program that generates complete, playable and balanced board games on its own, perhaps using some existing board games as starting points; • game engine middleware that rapidly populates a game world with vegetation; • a graphical design tool that lets a user design maps for a strategy game, while continuously evaluating the designed map for its gameplay properties and suggesting improvements to the map to make it better balanced and more interesting. In the upcoming chapters, you will find descriptions of all of those things described above. Let us now list a few things that we do not consider to be PCG: • A map editor for a strategy game that simply lets the user place and remove items, without taking any initiative or doing any generation on its own; • an artificial player for a board game; • a game engine capable of integrating automatically generated vegetation. |
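To make the definition concrete in code, here is a minimal sketch in the spirit of the dungeon-generator example above (not from the book; all names and parameters are our own): it carves random rooms into a wall grid, connects them with corridors, and verifies the playability requirement with a reachability check.

```python
import random
from collections import deque

def generate_dungeon(width, height, n_rooms, seed=None):
    """Carve rectangular rooms into a wall grid ('#' = wall, '.' = floor)."""
    rng = random.Random(seed)
    grid = [['#'] * width for _ in range(height)]
    centers = []
    for _ in range(n_rooms):
        w, h = rng.randint(3, 6), rng.randint(3, 4)
        x, y = rng.randint(1, width - w - 1), rng.randint(1, height - h - 1)
        for j in range(y, y + h):
            for i in range(x, x + w):
                grid[j][i] = '.'
        centers.append((x + w // 2, y + h // 2))
    # Connect consecutive room centers with L-shaped corridors so the level is playable.
    for (x0, y0), (x1, y1) in zip(centers, centers[1:]):
        for i in range(min(x0, x1), max(x0, x1) + 1):
            grid[y0][i] = '.'
        for j in range(min(y0, y1), max(y0, y1) + 1):
            grid[j][x1] = '.'
    return grid, centers

def connected(grid, a, b):
    """Playability check via BFS: can the player walk from a to b on floor tiles?"""
    q, seen = deque([a]), {a}
    while q:
        x, y = q.popleft()
        if (x, y) == b:
            return True
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if (0 <= ny < len(grid) and 0 <= nx < len(grid[0])
                    and grid[ny][nx] == '.' and (nx, ny) not in seen):
                seen.add((nx, ny))
                q.append((nx, ny))
    return False
```

The reachability check is what distinguishes this from generic generative art: the output is validated against a constraint of the game design (the level must be finishable).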
Histogram of oriented gradients for human detection in video | Computer Vision (CV) is currently one of the most popular research topics worldwide, because it can support human daily life and can be applied to a wide range of theories and research. Human detection is one of the most active research topics within Computer Vision. In this paper, we present a study of a technique for human detection in video, the Histogram of Oriented Gradients (HOG), by developing an application that imports a video and detects the humans in it. We use the HOG algorithm to analyze every frame of the video to find and count people. After analyzing the video from start to end, the program generates a histogram showing the number of detected people over the playing period of the video. The expected results were obtained, including the detection of people in the video and the generation of a histogram showing when detected humans appear in the video file. |
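As an illustration of the core HOG step described above (a toy sketch, not the implementation used in the paper), the following pure-Python function accumulates gradient magnitudes into orientation bins for one cell of a grayscale image:

```python
import math

def hog_cell_histogram(cell, n_bins=9):
    """Toy HOG core: bin gradient magnitudes by orientation (unsigned,
    0-180 degrees) for one cell of a grayscale image given as a 2-D list."""
    hist = [0.0] * n_bins
    h, w = len(cell), len(cell[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = cell[y][x + 1] - cell[y][x - 1]   # central differences
            gy = cell[y + 1][x] - cell[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang // (180.0 / n_bins)) % n_bins] += mag
    return hist
```

A full detector would normalize these histograms over blocks and feed the concatenated descriptor to a classifier (a linear SVM in the original HOG formulation).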
Development of a multidimensional assessment tool for uraemic pruritus: Uraemic Pruritus in Dialysis Patients (UP-Dial). | BACKGROUND
Dialysis patients with uraemic pruritus (UP) have significantly impaired quality of life. To assess the therapeutic effect of UP treatments, a well-validated, comprehensive and multidimensional instrument needed to be established.
OBJECTIVES
To develop and validate a multidimensional scale assessing UP in patients on dialysis: the Uraemic Pruritus in Dialysis Patients (UP-Dial).
METHODS
The development and validation of the UP-Dial instrument were conducted in four phases: (i) item generation, (ii) development of a pilot questionnaire, (iii) refinement of the questionnaire with patient recruitment and (iv) psychometric validation. Participants completed the UP-Dial, the visual analogue scale (VAS) of UP, the Dermatology Life Quality Index (DLQI), the Kidney Disease Quality of Life-36 (KDQOL-36), the Pittsburgh Sleep Quality Index (PSQI) and the Beck Depression Inventory (BDI) between 15 May 2012 and 30 November 2015.
RESULTS
The 27-item pilot UP-Dial was generated, with 168 participants completing the pilot scale. After factor analysis was performed, the final 14-item UP-Dial encompassed three domains: signs and symptoms, psychosocial, and sleep. Face and content validity were satisfied through the item generation process and expert review. Psychometric analysis demonstrated that the UP-Dial had good convergent and discriminant validity. The UP-Dial was significantly correlated [Spearman rank coefficient, 95% confidence interval (CI)] with the VAS-UP (0·76, 0·69-0·83), DLQI (0·78, 0·71-0·85), KDQOL-36 (-0·86, -0·91 to -0·81), PSQI (0·85, 0·80-0·89) and BDI (0·70, 0·61-0·79). The UP-Dial revealed excellent internal consistency (Cronbach's α 0·90, 95% CI 0·87-0·92) and reproducibility (intraclass correlation 0·95, 95% CI 0·90-0·98).
CONCLUSIONS
The UP-Dial is valid and reliable for assessing UP among patients on dialysis. Future research should focus on the cross-cultural adaptation and translation of the scale to other languages. |
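As a side note on the psychometrics reported in the RESULTS section, internal consistency (Cronbach's α) can be computed from item scores as follows; this is a generic illustrative sketch with made-up data, not the study's analysis code:

```python
def cronbach_alpha(items):
    """items: list of k lists, each holding one item's scores across n respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))
```

When every item carries identical information the statistic reaches its maximum of 1; values around 0.90, as reported for the UP-Dial, indicate excellent internal consistency.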
A state-of-the-art 3D sensor for robot navigation | This paper relates first experiences with a state-of-the-art time-of-flight sensor that is able to deliver 3D images. The properties and capabilities of the sensor make it a potentially powerful tool for applications within mobile robotics, especially for real-time tasks, as the sensor features a frame rate of up to 30 frames per second. Its capabilities in terms of basic obstacle avoidance and local path-planning are evaluated and compared to the performance of a standard laser scanner. |
A Novel Cryptosystem Based on Iris Key Generation | Biometric cryptography is a technique that uses biometric features to encrypt data, which can improve the security of the encrypted data and overcome the shortcomings of traditional cryptography. This paper proposes a novel biometric cryptosystem based on the most accurate biometric feature: the iris. In the encryption phase, a quantified 256-dimension textural feature vector is first extracted from the preprocessed iris image using a set of 2-D Gabor filters. At the same time, an error-correcting code (ECC) is generated using the Reed-Solomon algorithm. The feature vector is then translated to a cipher key using a hash function. General encryption algorithms use this cipher key to encrypt the secret information. In the decryption phase, a feature vector extracted from the input iris is first corrected using the ECC. It is then translated to the cipher key using the same hash function. Finally, the corresponding general decryption algorithms use the key to decrypt the information. Experimental results demonstrate the feasibility of the proposed system. |
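The hash step of such a scheme can be sketched as follows; SHA-256 stands in for the unspecified hash function, and the byte serialization of the quantized feature vector is our own assumption:

```python
import hashlib

def feature_vector_to_key(features, key_bytes=16):
    """Map a quantized iris feature vector to a fixed-length cipher key.
    The same (error-corrected) vector always yields the same key, so any
    acquisition noise must be removed by the ECC step before hashing."""
    data = bytes(f % 256 for f in features)   # serialize quantized features
    digest = hashlib.sha256(data).digest()
    return digest[:key_bytes]
```

This is why the Reed-Solomon correction matters: without it, even a one-bit difference between enrollment and verification features would produce an entirely different key.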
Shared-Memory Parallelism Can Be Simple, Fast, and Scalable | Parallelism is the key to achieving high performance in computing. However, writing efficient and scalable parallel programs is notoriously difficult, and often requires significant expertise. To address this challenge, it is crucial to provide programmers with high-level tools to enable them to develop solutions efficiently, and at the same time emphasize the theoretical and practical aspects of algorithm design to allow the solutions developed to run efficiently under all possible settings. This thesis addresses this challenge using a three-pronged approach consisting of the design of shared-memory programming techniques, frameworks, and algorithms for important problems in computing. The thesis provides evidence that with appropriate programming techniques, frameworks, and algorithms, shared-memory programs can be simple, fast, and scalable, both in theory and in practice. The results developed in this thesis serve to ease the transition into the multicore era. The first part of this thesis introduces tools and techniques for deterministic parallel programming, including means for encapsulating nondeterminism via powerful commutative building blocks, as well as a novel framework for executing sequential iterative loops in parallel, which lead to deterministic parallel algorithms that are efficient both in theory and in practice. The second part of this thesis introduces Ligra, the first high-level shared-memory framework for parallel graph traversal algorithms. The framework allows programmers to express graph traversal algorithms using very short and concise code, delivers performance competitive with that of highly-optimized code, and is up to orders of magnitude faster than existing systems designed for distributed memory. 
This part of the thesis also introduces Ligra+, which extends Ligra with graph compression techniques to reduce space usage and improve parallel performance at the same time, and is also the first graph processing system to support in-memory graph compression. The third and fourth parts of this thesis bridge the gap between theory and practice in parallel algorithm design by introducing the first algorithms for a variety of important problems on graphs and strings that are efficient both in theory and in practice. For example, the thesis develops the first linear-work and polylogarithmic-depth algorithms for suffix tree construction and graph connectivity that are also practical, as well as a work-efficient, polylogarithmic-depth, and cache-efficient shared-memory algorithm for triangle computations that achieves a 2–5x speedup over the best existing algorithms on 40 cores. |
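Ligra's programming model centres on vertexMap/edgeMap operations applied to subsets of vertices. The following sequential Python sketch (not Ligra's actual C++ API; names are ours) illustrates how a traversal such as BFS is expressed in that style:

```python
def edge_map(graph, frontier, update, cond):
    """Apply update(src, dst) to edges leaving the frontier; dst joins the
    next frontier when cond(dst) holds and the update succeeds."""
    nxt = set()
    for u in frontier:
        for v in graph[u]:
            if cond(v) and update(u, v):
                nxt.add(v)
    return nxt

def bfs(graph, root):
    """BFS written as repeated edge_map rounds over a shrinking frontier."""
    parent = {root: root}
    frontier = {root}
    while frontier:
        frontier = edge_map(
            graph, frontier,
            update=lambda u, v: parent.setdefault(v, u) == u,  # claim v once
            cond=lambda v: v not in parent,
        )
    return parent
```

In Ligra itself, edgeMap additionally switches between sparse and dense traversal strategies depending on the frontier size, which is a large part of its performance story.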
Monitoring and controlling of smart equipments using Android compatible devices towards IoT applications and services in manufacturing industry | The ever-increasing requirement for information to be accessible at any time, from any place, regardless of the type of remote device or planned operation, together with the need for complete control of a specific scenario or device, has paved the way towards the next technological revolution, the Internet of Things (IoT), and has led to several major research projects. Within this paper, the authors' vision regarding the architecture of an IoT network and an experimental testing bench for one of the first steps towards implementing the IoT vision are briefly introduced. The first part presents an overview of the Internet of Things. In the second part, the authors' concept for an IoT architecture and the vision for implementing it in manufacturing environments are presented. The third part illustrates the implementation and testing of the chosen solution for connectivity between a smart equipment and Android compatible devices. In the last part, conclusions are highlighted and the roadmap for concept implementation is defined. |
The Hill-Sachs lesion: diagnosis, classification, and management. | The Hill-Sachs lesion is an osseous defect of the humeral head that is typically associated with anterior shoulder instability. The incidence of these lesions in the setting of glenohumeral instability is relatively high and approaches 100% in persons with recurrent anterior shoulder instability. Reverse Hill-Sachs lesion has been described in patients with posterior shoulder instability. Glenoid bone loss is typically associated with the Hill-Sachs lesion in patients with recurrent anterior shoulder instability. The lesion is a bipolar injury, and identification of concomitant glenoid bone loss is essential to optimize clinical outcome. Other pathology (eg, Bankart tear, labral or capsular injuries) must be identified, as well. Treatment is dictated by subjective and objective findings of shoulder instability and radiographic findings. Nonsurgical management, including focused rehabilitation, is acceptable in cases of small bony defects and nonengaging lesions in which the glenohumeral joint remains stable during desired activities. Surgical options include arthroscopic and open techniques. |
Chapter 2 Bacillus as PGPR in Crop Ecosystem | Plant growth promoting rhizobacteria (PGPR) are beneficial bacteria which have the ability to colonize the roots and either promote plant growth through direct action or via biological control of plant diseases (Kloepper and Schroth 1978). They are associated with many plant species and are commonly present in varied environments. Strains with PGPR activity, belonging to genera Azoarcus, Azospirillum, Azotobacter, Arthrobacter, Bacillus, Clostridium, Enterobacter, Gluconacetobacter, Pseudomonas, and Serratia, have been reported (Hurek and Reinhold-Hurek 2003). Among these, species of Pseudomonas and Bacillus are the most extensively studied. These bacteria competitively colonize the roots of plant and can act as biofertilizers and/or antagonists (biopesticides) or simultaneously both. Diversified populations of aerobic endospore forming bacteria (AEFB), viz., species of Bacillus, occur in agricultural fields and contribute to crop productivity directly or indirectly. Physiological traits, such as multilayered cell wall, stress resistant endospore formation, and secretion of peptide antibiotics, peptide signal molecules, and extracellular enzymes, are ubiquitous to these bacilli and contribute to their survival under adverse environmental conditions for extended periods of time. Multiple species of Bacillus and Paenibacillus are known to promote plant growth. The principal mechanisms of growth promotion include production of growth stimulating phytohormones, solubilization and mobilization of phosphate, siderophore production, antibiosis, i.e., production of antibiotics, inhibition of plant ethylene synthesis, and induction of plant systemic resistance to pathogens (Richardson et al. 2009; Idris et al. 2007; Gutierrez-Manero et al. 2001; |
In-place Activated BatchNorm for Memory-Optimized Training of DNNs | In this work we present In-Place Activated Batch Normalization (INPLACE-ABN) - a novel approach to drastically reduce the training memory footprint of modern deep neural networks in a computationally efficient way. Our solution substitutes the conventionally used succession of BatchNorm + Activation layers with a single plugin layer, hence avoiding invasive framework surgery while providing straightforward applicability for existing deep learning frameworks. We obtain memory savings of up to 50% by dropping intermediate results and by recovering required information during the backward pass through the inversion of stored forward results, with only minor increase (0.8-2%) in computation time. Also, we demonstrate how frequently used checkpointing approaches can be made computationally as efficient as INPLACE-ABN. In our experiments on image classification, we demonstrate on-par results on ImageNet-1k with state-of-the-art approaches. On the memory-demanding task of semantic segmentation, we report competitive results for COCO-Stuff and set new state-of-the-art results for Cityscapes and Mapillary Vistas. Code can be found at https://github.com/mapillary/inplace_abn. |
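The central trick, recovering intermediate values from stored forward results rather than keeping them in memory, can be sketched for a single scalar feature (plain Python, illustrative only; the paper pairs BatchNorm with an invertible leaky activation):

```python
def forward(xhat, gamma, beta, slope=0.01):
    """BN affine transform followed by an invertible leaky ReLU.
    Only the output z needs to be stored; xhat can be discarded."""
    y = gamma * xhat + beta
    z = y if y >= 0 else slope * y
    return z

def recover(z, gamma, beta, slope=0.01):
    """Backward-pass helper: invert the activation, then the affine
    transform, to rebuild the normalized input xhat on demand."""
    y = z if z >= 0 else z / slope
    return (y - beta) / gamma
```

Because the recovery is exact (up to floating-point error), gradients computed from the recovered values match those of the conventional BatchNorm + activation pair, which is what makes the memory saving essentially free.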
[A perfect family medicine storm]. |
Research has, historically and arguably, been an “issue” for family medicine. Although the preponderance of Canadian clinical activity occurs in the family medicine primary care world, most published medical research continues to arise from non–family physicians, most often |
Comparison of pegfilgrastim on day 2 vs. day 4 as primary prophylaxis of intense dose-dense chemotherapy in patients with node-positive primary breast cancer within the prospective, multi-center GAIN study (GBG 33) | Preliminary data suggest that pegfilgrastim given on day 4 (P4) might be superior to pegfilgrastim given on day 2 (P2) in reducing grade 4 leucopenia. Patients with node-positive primary breast cancer receiving epirubicin–paclitaxel–cyclophosphamide chemotherapy were randomized to receive P2 versus P4. The primary endpoint was grade 4 leucopenia, assuming a relative risk reduction of 50%, from 50% with P2 to 25% with P4. Three hundred fifty-one patients were randomized to P2 (n = 174) versus P4 (n = 177). The rate of grade 4 leucopenia was 47.1% with P2 and 42.0% with P4 (p = 0.387), grade 3 + 4 neutropenia was 47.9% versus 40.8% (p = 0.337), FN was 4.7% versus 8.0% (p = 0.271), and the rate of infections was 29.9% versus 25.4% (p = 0.404), respectively. This study failed to demonstrate that pegfilgrastim on day 4 was more efficacious than on day 2 with respect to grade 4 leucopenia (the primary endpoint), febrile neutropenia, or infections. |
Caesar : a social code review tool for programming education | Caesar is a distributed, social code review tool designed for the specific constraints and goals of a programming course. Caesar is capable of scaling to a large and diverse reviewer population, provides automated tools for increasing reviewer efficiency, and implements a social web interface for reviewing that encourages discussion and participation. Our system is implemented in three loosely-coupled components: a language-specific code preprocessor that partitions code into small pieces, filters out uninteresting ones, runs static analysis, and detects clusters of similar code; an incremental task router that dynamically assigns reviewers to tasks; and a language-agnostic web interface for reviewing code. Our evaluation using actual student code and a user study indicate that Caesar provides a significant improvement over existing code review workflows and interfaces. We also believe that this work contributes a modular framework for code reviewing systems that can be easily extended and improved. Thesis Supervisor: Robert C. Miller Title: Associate Professor of Computer Science and Engineering |
Digital Innovation and Strategic Transformation | Digital technologies—and how we use them in our personal lives, work, and society—have changed the face of business and will continue to do so. Moreover, research organizations such as Gartner and Accenture, along with MIT professor Erik Brynjolfsson,1 indicate that the digital technologies that underlie computers, robots, and smart equipment are changing rapidly, becoming more powerful, and transforming organizations much faster than in the past (that is, the second machine age). We are in the Fourth Industrial Revolution, the digital revolution of cyber-physical systems that has been unfolding since the middle of the last century.2 The possibility of billions of people connected by mobile devices, in conjunction with unprecedented processing power, storage capacity, and access to knowledge via smart machines, creates enormous opportunities for entrepreneurs and innovative managers alike. |
HALUX: projection-based interactive skin for digital sports | Entertainment content employing users' whole-body action is now becoming popular, along with the prevalence of low-cost whole-body motion capture systems. To add a haptic modality in this context, latency becomes a critical issue because it leads to spatial disparity between the assumed contact location and the tactile stimulation position. To cope with this issue, we propose to project the drive signal in advance so as to eliminate the latency derived from communication. We do not explicitly control each vibrator; instead, we project a "position-dependent, vibration-strength distribution" image. Furthermore, the system becomes highly scalable, enabling simultaneous drive of hundreds of units attached to the body. |
Severe Traumatic Brain Injury in Austria VI: Effects of guideline-based management | OBJECTIVES: The goal of this paper is to report relations between health outcomes and implementation of individual recommendations of the guidelines. PATIENTS AND METHODS: Data sets from 405 patients included by 5 Austrian hospitals were available. The analysis focused on the compliance of treatment modalities with the recommendations of the Brain Trauma Foundation TBI guidelines. Compliance was evaluated day by day based on scores developed specifically for this purpose. 
To evaluate the relations between TBI guidelines compliance and outcomes, odds ratios were estimated using logistic regression, with age, ISS and initial GCS used to control for confounding. RESULTS: The option on prehospital resuscitation was followed in 84%, and the guideline on early resuscitation was followed in 79%. The guideline on the intracranial pressure treatment threshold was the most closely followed (89%). The option on cerebral perfusion pressure was followed in less than 30% of patients. Only the scores on resuscitation of blood pressure and oxygenation and on cerebral perfusion pressure were positively and statistically significantly related to ICU survival. Positive relations were also found for adherence to the recommendations on the type of monitoring, hyperventilation (guideline), and prophylactic use of anti-seizure drugs, and for the total of scores. The other recommendations were negatively related to ICU survival, but the computed odds ratios were not statistically significant. Analysis of relations between compliance scores and length of ICU and hospital stay in survivors showed that adherence to the recommendations on type of monitoring was related to a reduction of length of stay in the ICU and hospital, adherence to the hyperventilation guideline was related to a shortened ICU stay but an increased hospital stay, and adherence to the guideline on mannitol was related to reduced days in hospital, but not to days in the ICU. Implementing the standard on corticosteroid use was related to a reduction of days in both hospital and ICU. Using the standard on prophylactic use of anti-seizure drugs was related to a reduction in ICU days. If all the recommendations were closely followed, an increase of days in the ICU would be observed, while the length of stay in hospital would be reduced. 
CONCLUSIONS: The relatively strong relation between initial resuscitation in the hospital and ICU survival provides a firm basis for future efforts of emergency teams. The positive influence of some of the recommendations on reduction of ICU or hospital days may provide economic incentives to promote guidelines implementation. |
A Taxonomy of Workflow Management Systems for Grid Computing | With the advent of Grid and application technologies, scientists and engineers are building more and more complex applications to manage and process large data sets, and execute scientific experiments on distributed resources. Such application scenarios require means for composing and executing complex workflows. Therefore, many efforts have been made towards the development of workflow management systems for Grid computing. In this paper, we propose a taxonomy that characterizes and classifies various approaches for building and executing workflows on Grids. We also survey several representative Grid workflow systems developed by various projects world-wide to demonstrate the comprehensiveness of the taxonomy. The taxonomy not only highlights the design and engineering similarities and differences of state-of-the-art in Grid workflow systems, but also identifies the areas that need further research. |
Spectral karyotyping refines cytogenetic diagnostics of constitutional chromosomal abnormalities | Karyotype analysis by chromosome banding is the standard method for identifying numerical and structural chromosomal aberrations in pre- and postnatal cytogenetics laboratories. However, the chromosomal origins of markers, subtle translocations, or complex chromosomal rearrangements are often difficult to identify with certainty. We have developed a novel karyotyping technique, termed spectral karyotyping (SKY), which is based on the simultaneous hybridization of 24 chromosome-specific painting probes labeled with different fluorochromes or fluorochrome combinations. The measurement of defined emission spectra by means of interferometer-based spectral imaging allows for the definitive discernment of all human chromosomes in different colors. Here, we report the comprehensive karyotype analysis of 16 samples from different cytogenetic laboratories by merging conventional cytogenetic methodology and spectral karyotyping. This approach could become a powerful tool for the cytogeneticists, because it results in a considerable improvement of karyotype analysis by identifying chromosomal aberrations not previously detected by G-banding alone. Advantages, limitations, and future directions of spectral karyotyping are discussed. |
SmartParking: A Secure and Intelligent Parking System Using NOTICE | Parking is costly and limited in almost every major city in the world. Innovative parking systems for meeting near-term parking demand are needed. In this paper, we propose a novel, secure and intelligent parking system based on the concept and framework of NOTICE [WO07], a secure and privacy-aware architecture for the notification of traffic incidents. The proposed system, called SmartParking, is a service-oriented intelligent parking system through which drivers can view and reserve a parking spot on the fly. The parking process can then be a straightforward, non-stop process. More importantly, SmartParking is a secure and privacy-aware parking system whose infrastructure prevents most security and privacy attacks. We describe the hardware/software architecture and implementations. Our evaluation of the proposed system demonstrates its efficiency. |
Effect of sulfur partial pressure on the growth of CuInS2 single crystals | CuInS2 single crystals of moderate size have been produced by the gradient freeze technique under different sulfur pressures. Results from EDX show that with higher sulfur pressures the cracking along the ingot decreases, but at 2 bar these cracks are still present. XRD shows that CuS is present at these cracks, due to the loss of In2S into the gas phase. Despite this loss of In2S, PL reveals the material to be In-rich. |
Effects of atrial fibrillation on treatment of mitral regurgitation in the EVEREST II (Endovascular Valve Edge-to-Edge Repair Study) randomized trial. | OBJECTIVES
The purpose of this study was to characterize patients with mitral regurgitation (MR) and atrial fibrillation (AF) treated percutaneously using the MitraClip device (Abbott Vascular, Abbott Park, Illinois) and compare the results with surgery in this population.
BACKGROUND
The EVEREST II (Endovascular Valve Edge-to-Edge Repair Study) randomized controlled trial compared a less invasive catheter-based treatment for MR with surgery, providing an opportunity to assess the impact of AF on the outcomes of both the MitraClip procedure and surgical repair.
METHODS
The study population included 264 patients with moderately severe or severe MR assessed by an independent echocardiographic core laboratory. Comparison of safety and effectiveness study endpoints at 30 days and 1 year were made using both intention-to-treat and per-protocol (cohort of patients with MR ≤2+ at discharge) analyses.
RESULTS
Pre-existing AF was present in 27% of patients. These patients were older, had more advanced disease, and were more likely to have a functional etiology. Similar reduction of MR to ≤2+ before discharge was achieved in patients with AF (83%) and in patients without AF (75%, p = 0.3). Freedom from death, mitral valve surgery for valve dysfunction, and MR >2+ was similar at 12 months for AF patients (64%) and for no-AF patients (61%, p = 0.3). At 12 months, MR reduction to <2+ was greater with surgery than with MitraClip, but there was no interaction between rhythm and MR reduction, and no difference in all-cause mortality between patients with and patients without AF.
CONCLUSIONS
Atrial fibrillation is associated with more advanced valvular disease and noncardiac comorbidities. However, acute procedural success, safety, and 1-year efficacy with MitraClip therapy is similar for patients with AF and without AF. |
Comparison of ex vivo expansion culture conditions of mesenchymal stem cells for human cell therapy. | BACKGROUND
Mesenchymal stem cells (MSCs) are multipotent stem cells. Based on their properties, several clinical trials have been designed to explore their potential therapeutic effect. Fetal calf serum (FCS, commonly used for in vitro expansion) is an undesirable source of xenogeneic antigens and carries the risk of transmitting contamination. As an alternative to FCS, platelet lysate (PL) and both autologous and allogeneic human serum have been proposed. The aim of this study is to compare the culture of bone marrow (BM)-derived MSCs in the presence of different serum supplements to determine the effect on cell growth, differentiation potential, and immunologic function.
STUDY DESIGN AND METHODS
MSCs from BM of healthy volunteer donors were grown in the presence of 10% FCS supplemented with 1 ng/mL basic fibroblast growth factor (bFGF), 10% human serum supplemented with 1 ng/mL bFGF, 5% PL, and PL 5% supplemented with 1 ng/mL bFGF (PL plus bFGF).
RESULTS
MSCs that expanded in either medium showed comparable morphology, phenotype, and proliferative and differentiation capacity. While the presence of MSCs in vitro significantly decreased CD3/CD28-mediated T-cell activation, this effect was significantly greater for MSCs cultured with human serum. Production of interferon-gamma was inhibited in media cocultured with MSCs, and MSCs also induced a significant inhibition of the cell cycle in T cells.
DISCUSSION
In conclusion, PL or autologous serum could offer an alternative to the use of FCS in MSC expansion for clinical use, maintaining the same growth potential, phenotype, immunomodulatory properties, and differentiation potential. |
Wide angle colonoscopy with a prototype instrument: impact on miss rates and efficiency as determined by back-to-back colonoscopies | OBJECTIVE: Polyps are missed during conventional colonoscopy, even with meticulous technique. The aim of this study was to investigate whether a prototype wide angle colonoscope is associated with a reduced miss rate for polyps. METHODS: Two studies were performed. In study 1, a total of 50 patients underwent back-to-back, same-day colonoscopy by a single examiner with the prototype wide angle colonoscope and with a standard colonoscope, with the order of scopes randomized. In study 1, an attempt was made to keep examination time with the two colonoscopes equal. In study 2, a total of 20 patients were examined, 10 by the same colonoscopist who performed study 1 and 10 by a second colonoscopist. In study 2, examiners tried to perform the examinations as quickly as accuracy would allow. RESULTS: In study 1, the miss rate for all polyps was lower with the wide angle colonoscope (20% vs 31%; p = 0.046), although the mean examination time with the wide angle instrument was shorter (6.75 min vs 7.64 min; p = 0.0005). There was no significant difference in detection of adenomas. Polyps, including adenomas, were missed in the peripheral endoscopic field more frequently with the standard colonoscope. In study 2, wide angle colonoscopy was associated with reductions in examination time of 25% and 30% for the two examiners, respectively. Miss rates were the same for one colonoscopist but were higher for the other colonoscopist when the wide angle instrument was used. CONCLUSION: A prototype wide angle colonoscope did not eliminate polyp miss rates. Wide angle colonoscopy has the potential to reduce examination time and improve visualization of the periphery of the endoscopic field of view, but improvements in resolution are needed. |
Proposing a theory of gamification effectiveness | Gamification informally refers to making a system more game-like. More specifically, gamification denotes applying game mechanics to a non-game system. We theorize that gamification success depends on the game mechanics employed and their effects on user motivation and immersion. The proposed theory may be tested using an experiment or questionnaire study. |
Indicators of pretreatment suicidal ideation in adults with major depressive disorder. | OBJECTIVE
To evaluate the presence of treatment-emergent suicidal ideation (SI), it is first necessary to identify those patients with SI at the onset of treatment. The purpose of this report is to identify sociodemographic and clinical features that are associated with SI in major depressive disorder (MDD) patients prior to treatment with a selective serotonin reuptake inhibitor.
METHOD
This multisite study enrolled 265 out-patients with non-psychotic MDD. Sociodemographic and clinical features of participants with and without SI were compared post hoc.
RESULTS
Social phobia, bulimia nervosa, number of past depressive episodes, and race were independently associated with SI by one or more SI measures.
CONCLUSION
Concurrent social phobia and bulimia nervosa may be potential risk factors for SI in patients with non-psychotic MDD. Additionally, patients with more than one past depressive episode may also be at increased risk of SI. |
3 Data Mining for Web Personalization | In this chapter we present an overview of the Web personalization process viewed as an application of data mining requiring support for all the phases of a typical data mining cycle. These phases include data collection and preprocessing, pattern discovery and evaluation, and finally applying the discovered knowledge in real-time to mediate between the user and the Web. This view of the personalization process provides added flexibility in leveraging multiple data sources and in effectively using the discovered models in an automatic personalization system. The chapter provides a detailed discussion of a host of activities and techniques used at different stages of this cycle, including the preprocessing and integration of data from multiple sources, as well as pattern discovery techniques that are typically applied to this data. We consider a number of classes of data mining algorithms used particularly for Web personalization, including techniques based on clustering, association rule discovery, sequential pattern mining, Markov models, and probabilistic mixture and hidden (latent) variable models. Finally, we discuss hybrid data mining frameworks that leverage data from a variety of channels to provide more effective personalization solutions. |
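The association-rule step surveyed in this chapter abstract can be sketched with a minimal co-occurrence miner over user sessions; the page names, the pair-only rule form, and the min-support threshold below are illustrative assumptions, not details from the chapter:

```python
from collections import Counter
from itertools import combinations

def mine_pair_rules(sessions, min_support=2):
    """Count co-occurring page pairs across user sessions and keep
    those seen at least `min_support` times (a toy association-rule step)."""
    pair_counts = Counter()
    for session in sessions:
        for a, b in combinations(sorted(set(session)), 2):
            pair_counts[(a, b)] += 1
    return {pair: n for pair, n in pair_counts.items() if n >= min_support}

def recommend(current_page, rules):
    """Recommend pages that frequently co-occur with the current page."""
    recs = Counter()
    for (a, b), n in rules.items():
        if a == current_page:
            recs[b] += n
        elif b == current_page:
            recs[a] += n
    return [page for page, _ in recs.most_common()]

# Toy session log (invented for illustration).
sessions = [["home", "laptops", "cart"],
            ["home", "laptops", "reviews"],
            ["laptops", "reviews", "cart"]]
rules = mine_pair_rules(sessions)
print(recommend("laptops", rules))
```

In a real personalization system this mining step would run offline over preprocessed clickstream data, with the learned rules applied online, as the chapter's data-mining-cycle view suggests.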
A Distributed Access Control System for Cloud Federations | Cloud federations are a new collaboration paradigm where organizations share data across their private cloud infrastructures. However, the adoption of cloud federations is hindered by federated organizations' concerns about potential risks of data leakage and data misuse. For cloud federations to be viable, federated organizations' privacy concerns should be alleviated by providing mechanisms that allow organizations to control which users from other federated organizations can access which data. We propose a novel identity and access management system for cloud federations. The system allows federated organizations to enforce attribute-based access control policies on their data in a privacy-preserving fashion. Users are granted access to federated data when their identity attributes match the policies, but without revealing their attributes to the federated organization owning the data. The system also guarantees the integrity of the policy evaluation process by using blockchain technology and Intel SGX trusted hardware. It uses blockchain to ensure that users' identity attributes and access control policies cannot be modified by a malicious user, while Intel SGX protects the integrity and confidentiality of the policy enforcement process. We present the access control protocol and the system architecture, and discuss future extensions. |
School Choice, Racial Segregation, and Test-Score Gaps: Evidence from North Carolina's Charter School Program. | Introduction Among the most vexing and persistent issues in American education are the racial segregation of students and the achievement gap between black and white students. With the Brown v. Board of Education ruling in 1954, de jure segregation of schools was prohibited. Nonetheless de facto segregation remains, and recent growth in the nonwhite student population has exacerbated the problem, especially in large urban areas. In 2000, for example, more than 70 percent of black students attended majority nonwhite schools (Clotfelter, 2004). Potentially related to the racial segregation of students is the achievement gap between black and white students. Although this gap decreased by half during the 1970s, it has been widening since the late 1980s (Perie, Moran & Lutkus 2005). Given their salience, it is not surprising that these issues of racial segregation and achievement gaps are part of the public debate about expanding parental choice of schools. Opponents of expanding school choice are concerned that, in the absence of provisions carefully designed to counter such trends, the more motivated and advantaged students will sort into high quality schools with other students largely like themselves, thereby concentrating less motivated, more disadvantaged students in lower quality educational environments. In stark contrast, proponents of school choice argue that expanding parental choice of schools is likely to reduce segregation and achievement gaps. They start with the observation that many disadvantaged students, particularly poor and minority students in urban areas, currently attend some of the most segregated and poorest performing schools in the country. 
By replacing dysfunctional bureaucratic control with market-like competition, choice proponents assert that policies that expand parental choice among schools will push the underperforming schools that serve disadvantaged students to improve. Even in the absence of such competitive effects on productivity, expanded forms of school choice will allow many poor and minority students to find their way into less segregated, and higher quality, schools. Thus, at the very least, this argument concludes, disadvantaged students who take advantage of newly available schooling options are likely to benefit. This argument finds some empirical support in the voucher studies conducted by Paul Peterson and his colleagues. These studies find that although access to vouchers did not improve the average test scores of the full set of participating students, African American students who used vouchers to attend a private school exhibited statistically significant positive gains in test scores (Howell et al. 2002; Howell and Peterson 2002). In addition, a study by Derek … |
Personalisation: How a computer can know you better than yourself | Every time you go to one of the top 100 book/music e-commerce sites, you will come into contact with personalisation systems that attempt to judge your interests to increase sales. There are three methods for making these personalised recommendations: content-based filtering, collaborative filtering and a hybrid of the two. Understanding each of these methods will give you insight as to how your personal information is used on the Internet, and remove some of the mystery associated with these systems. This will allow you to understand how these systems work and how they could be improved, so that you can make an informed decision as to whether this is a good thing. |
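The collaborative-filtering method named in this abstract can be sketched as user-based neighbourhood recommendation over a toy rating matrix; the user names, item names, and the choice of cosine similarity with k nearest neighbours are illustrative assumptions:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating dicts."""
    common = set(u) & set(v)
    num = sum(u[i] * v[i] for i in common)
    den = math.sqrt(sum(x * x for x in u.values())) * \
          math.sqrt(sum(x * x for x in v.values()))
    return num / den if den else 0.0

def recommend(target, ratings, k=2):
    """User-based collaborative filtering: score items the target has not
    rated by similarity-weighted ratings from the k most similar users."""
    sims = sorted(((cosine(ratings[target], ratings[u]), u)
                   for u in ratings if u != target), reverse=True)[:k]
    scores = {}
    for sim, u in sims:
        for item, r in ratings[u].items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

# Toy rating matrix (invented for illustration).
ratings = {
    "alice": {"book_a": 5, "book_b": 3},
    "bob":   {"book_a": 5, "book_b": 3, "book_c": 4},
    "carol": {"book_a": 1, "book_c": 5},
}
print(recommend("alice", ratings))
```

A content-based filter would instead compare item feature vectors to a profile of the items the user already rated; a hybrid combines both scores.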
Image super-resolution: The techniques, applications, and future | The super-resolution (SR) technique reconstructs a higher-resolution image or sequence from observed low-resolution (LR) images. As SR has been developed for more than three decades, both multi-frame and single-frame SR have significant applications in our daily life. This paper aims to provide a review of SR from the perspective of techniques and applications, and especially the main contributions in recent years. Regularized SR methods have been most commonly employed in the last decade. Technical details are discussed in this article, including reconstruction models, parameter selection methods, optimization algorithms and acceleration strategies. Moreover, an exhaustive summary of the current applications using SR techniques is presented. Lastly, the article discusses the current obstacles for future research. |
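The regularized reconstruction models this review covers can be illustrated on a 1-D toy problem; the block-averaging observation model, the Tikhonov smoothness term, and all step sizes below are illustrative assumptions, not any specific method from the review:

```python
import numpy as np

def downsample(x, factor):
    """Toy observation model: block-average the signal by `factor`."""
    return x.reshape(-1, factor).mean(axis=1)

def super_resolve(y, factor, lam=0.1, steps=500, lr=0.2):
    """Estimate a high-resolution signal x from low-resolution y by
    gradient descent on ||D(x) - y||^2 + lam * ||diff(x)||^2."""
    x = np.repeat(y, factor).astype(float)  # nearest-neighbour initial guess
    for _ in range(steps):
        residual = downsample(x, factor) - y            # data-fidelity term
        grad_data = 2 * np.repeat(residual, factor) / factor
        d = np.diff(x)                                  # smoothness (Tikhonov) term
        grad_smooth = np.zeros_like(x)
        grad_smooth[:-1] -= 2 * d
        grad_smooth[1:] += 2 * d
        x -= lr * (grad_data + lam * grad_smooth)
    return x

hi = np.sin(np.linspace(0, np.pi, 16))  # "true" high-resolution signal
lo = downsample(hi, 4)                  # observed low-resolution signal
est = super_resolve(lo, 4)
```

The regularization weight `lam` is exactly the kind of parameter the review's parameter selection methods address: too small and the estimate overfits noise, too large and it over-smooths.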
Quantum dot behavior in graphene nanoconstrictions. | Graphene nanoribbons display an imperfectly understood transport gap. We measure transport through nanoribbon devices of several lengths. In long (≥250 nm) nanoribbons we observe transport through multiple quantum dots in series, while shorter (≤60 nm) constrictions display behavior characteristic of single and double quantum dots. New measurements indicate that dot size may scale with constriction width. We propose a model where transport occurs through quantum dots that are nucleated by background disorder potential in the presence of a confinement gap. |
Confidential clinician-reported surveillance of adverse events among medical inpatients | BACKGROUND: Although iatrogenic injury poses a significant risk to hospitalized patients, detection of adverse events (AEs) is costly and difficult. METHODS: The authors developed a confidential reporting method for detecting AEs on a medicine unit of a teaching hospital. Adverse events were defined as patient injuries. Potential adverse events (PAEs) represented errors that could have, but did not result in harm. Investigators interviewed house officers during morning rounds and by e-mail, asking them to identify obstacles to high quality care and iatrogenic injuries. They compared house officer reports with hospital incident reports and patients’ medical records. A multivariate regression model identified correlates of reporting. RESULTS: One hundred ten events occurred, affecting 84 patients. Queries by e-mail (incidence rate ratio [IRR]=0.16; 95% confidence interval [95% CI], 0.05 to 0.49) and on days when house officers rotated to a new service (IRR=0.12; 95% CI, 0.02 to 0.91) resulted in fewer reports. The most commonly reported process of care problems were inadequate evaluation of the patient (16.4%), failure to monitor or follow up (12.7%), and failure of the laboratory to perform a test (12.7%). Respondents identified 29 (26.4%) AEs, 52 (47.3%) PAEs, and 29 (26.4%) other house officer-identified quality problems. An AE occurred in 2.6% of admissions. The hospital incident reporting system detected only one house officer-reported event. Chart review corroborated 72.9% of events. CONCLUSIONS: House officers detect many AEs among inpatients. Confidential peer interviews of front-line providers are a promising method for identifying medical errors and substandard quality. |
Characterizing the affective responses to an acute bout of moderate-intensity exercise among outpatients with schizophrenia | In addition to offering many physical health benefits, exercise may help improve mental health among individuals with schizophrenia through regulating affect. Therefore, the purpose of this study is to characterize affective responses experienced before, during and after a 10-min bout of exercise versus passive sitting among individuals with schizophrenia. A randomized crossover design compared affect related to feelings of pleasure and arousal at baseline, 6 min into the task, immediately post-task, and 10 min post-task to sitting. Thirty participants enrolled in the study; 28 participants completed the study. Separate mixed model analyses of variance were conducted for pleasure and arousal, with test order as the between-subject factor, and time and task as within-subject factors. For pleasure, a significant main effect for time and a time × task interaction effect emerged. Post-hoc Bonferroni corrected t-tests (α=.0125) revealed significant differences between pleasure at baseline and both immediately post-task and 10 min post-task. No other main effects or interactions emerged. Individuals with schizophrenia derive acute feelings of pleasure from exercise. Thus, exercise may provide a method of regulating affect to improve mental health. Future studies should examine the links between affective responses to exercise and long-term adherence to such health behaviours within this population. |
Sleep deprivation and false memories. | Many studies have investigated factors that affect susceptibility to false memories. However, few have investigated the role of sleep deprivation in the formation of false memories, despite overwhelming evidence that sleep deprivation impairs cognitive function. We examined the relationship between self-reported sleep duration and false memories and the effect of 24 hr of total sleep deprivation on susceptibility to false memories. We found that under certain conditions, sleep deprivation can increase the risk of developing false memories. Specifically, sleep deprivation increased false memories in a misinformation task when participants were sleep deprived during event encoding, but did not have a significant effect when the deprivation occurred after event encoding. These experiments are the first to investigate the effect of sleep deprivation on susceptibility to false memories, which can have dire consequences. |
Acupuncture for shoulder pain after stroke: a systematic review. | OBJECTIVES
Shoulder pain is a common complication after a stroke that interferes with the function of the upper extremities, and acupuncture has been used to treat it. The aim of this systematic review is to summarize and evaluate the effects of acupuncture for shoulder pain after stroke.
METHODS
Randomized controlled trials (RCTs) involving the effects of acupuncture for shoulder pain, published between January 1990 and August 2009, were obtained from the National Library of Medicine, MEDLINE®, CINAHL, AMED, Embase, the Cochrane Controlled Trials Register 2009, the Korean Medical Database (Korea Institute of Science Technology Information, DBPIA, KoreaMed, and Research Information Service System), and the Chinese Database (China Academic Journal).
RESULTS
Among the 453 studies that were obtained (300 written in English, 137 in Chinese, and 16 in Korean), 7 studies met the inclusion criteria for this review. All of them were RCTs published in China and reported positive effects of the treatment. The quality of the studies was assessed by the Modified Jadad Scores (MJS) and the Cochrane Back Review Group Criteria List for Methodologic Quality Assessment of RCTs (CBRG); the studies scored between 2 and 3 points on MJS, and between 4 and 7 points on CBRG.
CONCLUSIONS
It is concluded from this systematic review that acupuncture combined with exercise is effective for shoulder pain after stroke. It is recommended that future trials be carefully conducted on this topic. |
High speed locomotion for a quadrupedal microrobot | Research over the past several decades has elucidated some of the mechanisms behind high speed, highly efficient and robust locomotion in insects such as cockroaches. Roboticists have used this information to create biologically-inspired machines capable of running, jumping, and climbing robustly over a variety of terrains. To date, little work has been done to develop an at-scale insect-inspired robot capable of similar feats due to challenges in fabrication, actuation, and electronics integration for a centimeter-scale device. This paper addresses these challenges through the design, fabrication, and control of a 1.27g walking robot, the Harvard Ambulatory MicroRobot (HAMR). The current design is manufactured using a method inspired by pop-up books that enables fast and repeatable assembly of the miniature walking robot. Methods to drive HAMR at low and high speeds are presented, resulting in speeds up to 0.44m/s (10.1 body lengths per second) and the ability to maneuver and control the robot along desired trajectories. |
Substrate integrated waveguide fed LTCC microstrip patch antenna for 94 GHz applications | This paper presents the design, electromagnetic simulation and experimental results of a 94 GHz microstrip patch antenna with four parasitic patches. The structure was designed for a six tape Low Temperature Co-Fired Ceramics process (LTCC). The antenna element is fed by a substrate integrated waveguide, taking advantage of the 3D vertical integration possibilities of the LTCC technology. The measured input matching bandwidth (|S11| <; - 10 dB) is between 89.5 - 95.9 GHz. The measured half-power gain bandwidth is between 80 - 103 GHz. |
Treadmill test responses to an early exercise program after myocardial infarction: a randomized study. | The effects of an exercise program started early after myocardial infraction and the added effects of an outpatient teaching-counseling program were studied. At random, 84 patients were allocated to a control group (A), 88 patients to an exercise group (B1) and 86 patients to an exercise and teaching-counseling group (B2). The same exercise program was prescribed for patients in groups B1 and B2 and was started about 4.5 days after myocardial infarction and continued for 3 months. The outpatient teaching-counseling program consisted of eight group sessions pertaining to risk factor reduction and psychosocial adjustment to myocardial infraction. A low-level treadmill test and an exercise test were performed at 3 months and the exercise test was repeated at 6 months. The clinical, hemodynamic and electrocardiographic responses to these tests were not different among the three groups. However, by the end of 3 months, patients in group B1 and B2 reported walking greater distances than patients in group A. The incidence of morbidity and mortality was not different between the groups. No deleterious or beneficial physiologic effects of an exercise program either by itself or combined with a teaching-counseling program were demonstrated. Routine medical care and our interventions were equally effective in permitting the spontaneous hemodynamics improvements after myocardial infraction. More than 3 months after myocardial infarction, the group as a whole manifested spontaneous recovery in the form of a significant decrease in resting heart rate (p less than 0.001) and a significant increase in systolic and diastolic blood pressure at rest and with submaximal exercise (p less than 0.001). No further improvements were observed between 3 and 6 months. |
A comparison between semi-supervised and supervised text mining techniques on detecting irony in greek political tweets | The present work describes a classification schema for irony detection in Greek political tweets. Our hypothesis states that humorous political tweets could predict actual election results. The irony detection concept is based on subjective perceptions, so only relying on human-annotator driven labor might not be the best route. The proposed approach relies on limited labeled training data, thus a semi-supervised approach is followed, where collective-learning algorithms take both labeled and unlabeled data into consideration. We compare the semi-supervised results with the supervised ones from a previous research of ours. The hypothesis is evaluated via a correlation study between the irony that a party receives on Twitter, its respective actual election results during the Greek parliamentary elections of May 2012, and the difference between these results and the ones of the preceding elections of 2009. & 2016 Elsevier Ltd. All rights reserved. |
Perimesencephalic subarachnoid hemorrhage. Additional perspectives from four cases. | BACKGROUND
Nonaneurysmal perimesencephalic hemorrhage, a distinct form of subarachnoid hemorrhage, is a recently described variant of intracranial hemorrhage. We describe two patients who presented with unusual features of this type of subarachnoid hemorrhage and also two patients who had a perimesencephalic pattern of hemorrhage due to a ruptured posterior circulation aneurysm.
CASE DESCRIPTIONS
The first patient, a 41-year-old woman with perimesencephalic hemorrhage, underwent an exploratory craniotomy because angiography had suggested an anomaly of the basilar tip. No source of hemorrhage could be identified at the time of surgery. The second patient was a 3-year-old boy who presented with opisthotonos and who was found to have a perimesencephalic hemorrhage. Angiography revealed no source for the hemorrhage. The third patient, a 54-year-old man, had a perimesencephalic pattern of subarachnoid hemorrhage from a vertebrobasilar junction aneurysm associated with a fenestration that was missed on the initial angiographic study. The fourth patient, a 43-year-old man, suffered a perimesencephalic pattern of subarachnoid hemorrhage from a small posterior cerebral artery aneurysm, which had not been recognized on two angiograms.
CONCLUSIONS
These cases expand the clinical spectrum of subarachnoid hemorrhage with a perimesencephalic pattern. First, a negative exploratory craniotomy suggests that the source of nonaneurysmal perimesencephalic hemorrhage may not be arterial. Second, nonaneurysmal perimesencephalic hemorrhage may also occur in children. Finally, the index of suspicion for a posterior circulation aneurysm should remain high in patients who present with a perimesencephalic pattern of subarachnoid hemorrhage, and these aneurysms may arise from unusual locations. |
First report of junctional epidermolysis bullosa (JEB) in the Italian draft horse | BACKGROUND
Epitheliogenesis imperfecta in horses was first recognized at the beginning of the 20th century when it was proposed that the disease could have a genetic cause and an autosomal recessive inheritance pattern. Electron microscopy studies confirmed that the lesions were characterized by a defect in the lamina lucida and the disease was therefore reclassified as epidermolysis bullosa. Molecular studies targeted two mutations affecting genes involved in the dermal-epidermal junction: an insertion in LAMC2 in Belgians and other draft breeds and one large deletion in LAMA3 in the American Saddlebred.
CASE PRESENTATION
A mechanobullous disease was suspected in a newborn, Italian draft horse foal, which presented with multifocal to coalescing erosions and ulceration on the distal extremities. Histological examination of skin biopsies revealed a subepidermal cleft formation and transmission electron microscopy demonstrated that the lamina densa of the basement membrane remained attached to the dermis. According to clinical, histological and ultrastructural findings, a diagnosis of junctional epidermolysis bullosa (JEB) was made. Genetic tests confirmed the presence of 1368insC in LAMC2 in the foal and its relatives.
CONCLUSION
This is the first report of JEB in Italy. The disease was characterized by typical macroscopic, histologic and ultrastructural findings. Genetic tests confirmed the presence of the 1368insC in LAMC2 in this case; further investigations are required to assess whether the mutation could be present at a low frequency in the Italian draft horse population. Atypical breeding practices were responsible in this case, increasing the odds of pairing unfavourable alleles. Identification of carriers is fundamental in order to prevent economic losses for the horse industry. |
Test Case Selection and Prioritization Using Multiple Criteria | Regression testing activities such as test case selection and test case prioritization are ordinarily based on criteria such as code coverage, code modifications and test execution costs. The approach presented here is based on multiple code coverage criteria and performs efficient test case selection. The method aims to maximize coverage by executing the test cases effectively. The selected test cases are then prioritized, with priority depending on code coverage, to achieve the desired results. Keywords—Test case selection, Test case prioritization, Code coverage, Jaccard distance, Greedy algorithm. |
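A minimal sketch of the two keyword techniques named in this abstract, greedy coverage-based selection followed by Jaccard-distance prioritization; the test names and statement sets below are invented for illustration, not taken from the paper:

```python
def greedy_select(coverage, requirements):
    """Greedy set cover: repeatedly pick the test that covers the most
    still-uncovered statements until everything reachable is covered."""
    uncovered, selected = set(requirements), []
    while uncovered:
        best = max(coverage, key=lambda t: len(coverage[t] & uncovered))
        gain = coverage[best] & uncovered
        if not gain:  # remaining statements are covered by no test
            break
        selected.append(best)
        uncovered -= gain
    return selected

def jaccard_distance(a, b):
    """1 - |a ∩ b| / |a ∪ b|; 0 for two empty sets."""
    return 1 - len(a & b) / len(a | b) if (a | b) else 0.0

def prioritize(tests, coverage):
    """Order tests so that each next test is maximally dissimilar
    (by Jaccard distance) from those already scheduled."""
    remaining = list(tests)
    ordered = [max(remaining, key=lambda t: len(coverage[t]))]
    remaining.remove(ordered[0])
    while remaining:
        nxt = max(remaining, key=lambda t: min(
            jaccard_distance(coverage[t], coverage[s]) for s in ordered))
        ordered.append(nxt)
        remaining.remove(nxt)
    return ordered

# Toy coverage map: test id -> set of covered statement ids.
coverage = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {5}, "t4": {1, 2}}
selected = greedy_select(coverage, {1, 2, 3, 4, 5})
print(prioritize(selected, coverage))
```

Here `t4` is dropped because its coverage is redundant, and the dissimilar test `t3` is scheduled before `t2` so that new statements are reached as early as possible.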
Frequency compensation of high-speed, low-voltage CMOS multistage amplifiers | This paper presents the frequency compensation of high-speed, low-voltage multistage amplifiers. Two frequency compensation techniques, the Nested Miller Compensation with Nulling Resistors (NMCNR) and Reversed Nested Indirect Compensation (RNIC), are discussed and employed on two multistage amplifier architectures. A four-stage pseudo-differential amplifier with CMFF and CMFB is designed in a 1.2 V, 65-nm CMOS process. With NMCNR, it achieves a phase margin (PM) of 59° with a DC gain of 75 dB and unity-gain frequency (fug) of 712 MHz. With RNIC, the same four-stage amplifier achieves a phase margin of 84°, DC gain of 76 dB and fug of 2 GHz. Further, a three-stage single-ended amplifier is designed in a 1.1-V, 40-nm CMOS process. The three-stage OTA with RNIC achieves PM of 81°, DC gain of 80 dB and fug of 770 MHz. The same OTA achieves PM of 59° with NMCNR, while maintaining a DC gain of 75 dB and fug of 262 MHz. Pole-splitting, to achieve increased stability, is illustrated for both compensation schemes. Simulations illustrate that the RNIC scheme achieves much higher PM and fug for lower values of compensation capacitance compared to NMCNR, despite the growing number of low voltage amplifier stages. |
MHC class II proteins and disease: a structural perspective | MHC class II molecules on the surface of antigen-presenting cells display a range of peptides for recognition by the T-cell receptors of CD4+ T helper cells. Therefore, MHC class II molecules are central to effective adaptive immune responses, but conversely, genetic and epidemiological data have implicated these molecules in the pathogenesis of autoimmune diseases. Indeed, the strength of the associations between particular MHC class II alleles and disease render them the main genetic risk factors for autoimmune disorders such as type 1 diabetes. Here, we discuss the insights that the crystal structures of MHC class II molecules provide into the molecular mechanisms by which sequence polymorphisms might contribute to disease susceptibility. |
BILAG 2004. Development and initial validation of an updated version of the British Isles Lupus Assessment Group's disease activity index for patients with systemic lupus erythematosus. | OBJECTIVE
To devise a more discriminating version of the British Isles Lupus Assessment Group (BILAG) disease activity index and to show that it is reliable.
METHODS
A nominal consensus approach was undertaken by members of BILAG to update and improve the BILAG lupus disease activity index. The index has been revised following intense consultations over a 1-yr period. It has been assessed in two real-patient exercises. These involved patients with diverse clinical features of SLE, including gastrointestinal, hepatic and ophthalmic problems, which the earlier versions of the index did not fully take into account. Reliability in terms of the ability to differentiate patients was assessed by calculating intraclass correlation coefficients. The level of agreement between physicians was determined by calculating the ratio of estimates of the standard error (SE) attributable to the physicians to the SE attributable to the patients.
RESULTS
Good reliability and high levels of physician agreement were observed in one or both exercises in the constitutional, mucocutaneous, neurological, cardiorespiratory, renal, ophthalmic and haematological systems. In contrast, the musculoskeletal system did not score as well, although providing more clear-cut glossary definitions should greatly improve the situation.
CONCLUSIONS
Some significant changes in the BILAG disease activity index to assess patients with SLE are proposed. The process of demonstrating validity and reliability has started with these two exercises assessing real patients. Further validation studies are under way. BILAG 2004 is likely to be valuable in clinical trials assessing new therapies for the treatment of SLE, as it provides a more comprehensive system-based disease activity measure than has been available previously. |
Catastrophic Importance of Catastrophic Forgetting | This paper describes some of the possibilities for artificial neural networks that open up after solving the problem of catastrophic forgetting. A simple model and reinforcement learning applications of existing methods are also proposed. |
The gut microbiota in IBD | IBD—ulcerative colitis and Crohn's disease—is emerging as a worldwide epidemic. An association between the increased incidence of IBD and environmental factors linked to socioeconomic development has been persistently detected in different parts of the world. The lifestyle in developed countries might impair the natural patterns of microbial colonization of the human gut. The interaction of microbes with mucosal immune compartments in the gut seems to have a major role in priming and regulating immunity. In IBD, mucosal lesions are generated by an excessive or dysregulated immune response against commensal microbes in the gut. In individuals with a genetic susceptibility to IBD, abnormal microbial colonization of the gastrointestinal tract might be the origin of such dysregulation. Developments in gene-sequencing technologies, as well as increased availability of powerful bioinformatic tools, have enabled novel insights into the microbial composition of the human gut microbiota and the effect of microbial communities on human physiology and disease. Studies that used these technologies indicate that dysbiosis (that is, abnormal microbiota composition) and decreased complexity of the gut microbial ecosystem are common features in patients with Crohn's disease or ulcerative colitis. Whether such changes are a cause or a consequence of the disease remains to be elucidated. |
Picking Pesky Parameters: Optimizing Regular Expression Matching in Practice | Network security systems inspect packet payloads for signatures of attacks. These systems use regular expression matching at their core. Many techniques for implementing regular expression matching at line rate have been proposed. Solutions differ in the type of automaton used (i.e., deterministic vs. non-deterministic) and in the configuration of implementation-specific parameters. While each solution has been shown to perform well on specific rule sets and traffic patterns, there has been no systematic comparison across a large set of solutions, rule sets and traffic patterns. Thus, it is extremely challenging for a practitioner to make an informed decision within the plethora of existing algorithmic and architectural proposals. To address this problem, we present a comprehensive evaluation of a broad set of regular expression matching techniques. We consider both algorithmic and architectural aspects. Specifically, we explore the performance, area requirements, and power consumption of implementations targeting processors and field programmable gate arrays using rule sets of practical size and complexity. We present detailed performance results and specific guidelines for determining optimal configurations based on a simple evaluation of the rule set. These guidelines can help significantly when implementing regular expression matching systems in practice. |
Identifying the sentiment styles of YouTube's vloggers | Vlogs provide a rich public source of data in a novel setting. This paper examined the continuous sentiment styles employed in 27,333 vlogs using a dynamic intra-textual approach to sentiment analysis. Using unsupervised clustering, we identified seven distinct continuous sentiment trajectories characterized by fluctuations of sentiment throughout a vlog’s narrative time. We provide a taxonomy of these seven continuous sentiment styles and found that vlogs whose sentiment builds up towards a positive ending are the most prevalent in our sample. Gender was associated with preferences for different continuous sentiment trajectories. This paper discusses the findings with respect to previous work and concludes with an outlook towards possible uses of the corpus, method and findings of this paper for related areas of research. |
The Relationship between Self-esteem and Listening Comprehension of EFL Students | The present study is aimed at investigating the relationship between self-esteem and listening comprehension of EFL students. Sixty intermediate students (30 male and 30 female) from Shahid Tondgouyan Petroleum University of Abadan, Iran, were selected using a sample proficiency test. Students' English language listening comprehension scores were calculated using a model TOEFL test, including 34 audio conversations and 34 written-form tests, and their self-esteem was estimated using Coopersmith's (1967) questionnaire. The results showed that the students' listening comprehension was significantly affected by their self-esteem; that is, self-esteem as a psychological factor had a positive relationship with students' English language listening comprehension. |
Intermittent Computing: Challenges and Opportunities | The maturation of energy-harvesting technology and ultra-low-power computer systems has led to the advent of intermittently-powered, batteryless devices that operate entirely using energy extracted from their environment. Intermittently operating devices present a rich vein of programming languages research challenges and the purpose of this paper is to illustrate these challenges to the PL research community. To provide depth, this paper includes a survey of the hardware and software design space of intermittent computing platforms. On the foundation of these research challenges and the state of the art in intermittent hardware and software, this paper describes several future PL research directions, emphasizing a connection between intermittence, distributed computing, energy-aware programming and compilation, and approximate computing. We illustrate these connections with a discussion of our ongoing work on programming for intermittence, and on building and simulating intermittent distributed systems. 1998 ACM Subject Classification C.0 Hardware/Software Interfaces, D.4.5 Reliability |
Time Series Prediction: Forecasting the Future and Understanding the Past | Gain more knowledge in less time every day. You may not always be able to spend the time and money to go abroad and gain experience and knowledge yourself. Reading is a good alternative for acquiring this desirable knowledge and experience. You may gain many things from direct experience, but of course it costs much money. So here, by reading Time Series Prediction: Forecasting the Future and Understanding the Past, you can gain more advantages on a limited budget. |
Direct and indirect pathways of basal ganglia: a critical reappraisal | The basal ganglia are subcortical nuclei controlling voluntary actions and have been implicated in Parkinson's disease (PD). The prevailing model of basal ganglia function states that two circuits, the direct and indirect pathways, originate from distinct populations of striatal medium spiny neurons (MSNs) and project to different output structures. These circuits are believed to have opposite effects on movement. Specifically, the activity of direct pathway MSNs is postulated to promote movement, whereas the activation of indirect pathway MSNs is hypothesized to inhibit it. Recent findings have revealed that this model might not fully account for the concurrent activation of both pathways during movement. Accordingly, we propose a model in which intrastriatal connections are critical and the two pathways are structurally and functionally intertwined. Thus, all MSNs might either facilitate or inhibit movement depending on the form of synaptic plasticity expressed at a certain moment. In PD, alterations of dopamine-dependent synaptic plasticity could alter this coordinated activity. |
CLINICAL AND DIAGNOSTIC CRITERIA OF MYOTONIA | Myotonia may occur in hereditary neuromuscular diseases with different causes, associated with ion channels of the membrane in skeletal muscle fibers. The myotonic phenomenon is a symptom of neuromuscular disorders, characterized by slow relaxation (prolonged contracture) of skeletal muscles after voluntary contraction or electrical stimulation. Myotonia may involve all groups of muscles. However, the nature of muscle destruction can vary depending upon the particular disease. |
The periosteum. Part 1: Anatomy, histology and molecular biology. | The periosteum is a thin layer of connective tissue that covers the outer surface of a bone in all places except at joints (which are protected by articular cartilage). As opposed to bone itself, it has nociceptive nerve endings, making it very sensitive to manipulation. It also provides nourishment in the form of blood supply to the bone. The periosteum is connected to the bone by strong collagenous fibres called Sharpey's fibres, which extend to the outer circumferential and interstitial lamellae of bone. The periosteum consists of an outer "fibrous layer" and inner "cambium layer". The fibrous layer contains fibroblasts while the cambium layer contains progenitor cells which develop into osteoblasts that are responsible for increasing bone width. After a bone fracture the progenitor cells develop into osteoblasts and chondroblasts which are essential to the healing process. This review discusses the anatomy, histology and molecular biology of the periosteum in detail. |
Cerebellum and Ocular Motor Control | An intact cerebellum is a prerequisite for optimal ocular motor performance. The cerebellum fine-tunes each of the subtypes of eye movements so they work together to bring and maintain images of objects of interest on the fovea. Here we review the major aspects of the contribution of the cerebellum to ocular motor control. The approach will be based on structural-functional correlation, combining the effects of lesions and the results from physiologic studies, with the emphasis on the cerebellar regions known to be most closely related to ocular motor function: (1) the flocculus/paraflocculus for high-frequency (brief) vestibular responses, sustained pursuit eye movements, and gaze holding, (2) the nodulus/ventral uvula for low-frequency (sustained) vestibular responses, and (3) the dorsal oculomotor vermis and its target in the posterior portion of the fastigial nucleus (the fastigial oculomotor region) for saccades and pursuit initiation. |
Influence of PEEK Coating on Hip Implant Stress Shielding: A Finite Element Analysis | Stress shielding is a well-known failure factor in hip implants. This work proposes a design concept for hip implants, using a combination of metallic stem with a polymer coating (polyether ether ketone (PEEK)). The proposed design concept is simulated using titanium alloy stems and PEEK coatings with thicknesses varying from 100 to 400 μm. The Finite Element analysis of the cancellous bone surrounding the implant shows promising results. The effective von Mises stress increases between 81 and 92% for the complete volume of cancellous bone. When focusing on the proximal zone of the implant, the increased stress transmission to the cancellous bone reaches between 47 and 60%. This increment in load transferred to the bone can influence mineral bone loss due to stress shielding, minimizing such effect, and thus prolonging implant lifespan. |
CHRONIC WOUND TISSUE CLASSIFICATION USING CONVOLUTIONAL NETWORKS AND COLOR SPACE REDUCTION | Chronic Wounds are ulcers presenting a difficult or nearly interrupted cicatrization process that increase the risk of complications to the health of patients, like amputation and infections. This research proposes a general noninvasive methodology for the segmentation and analysis of chronic wounds images by computing the wound areas affected by necrosis. Invasive techniques are usually used for this calculation, such as manual planimetry with plastic films. We investigated algorithms to perform the segmentation of wounds as well as the use of several convolutional networks for classifying tissue as Necrotic, Granulation or Slough. We tested four architectures: U-Net, Segnet, FCN8 and FCN32, and proposed a color space reduction methodology that increased the reported accuracies, specificities, sensitivities and Dice coefficients for all 4 networks, achieving very good levels. |
A richly interactive exploratory data analysis and visualization tool using electronic medical records | BACKGROUND
Electronic medical records (EMRs) contain vast amounts of data that are of great interest to physicians, clinical researchers, and medical policy makers. As the size, complexity, and accessibility of EMRs grow, the ability to extract meaningful information from them has become an increasingly important problem to solve.
METHODS
We develop a standardized data analysis process to support cohort studies with a focus on a particular disease. We use an interactive divide-and-conquer approach to classify patients into groups that are relatively uniform internally. It is an iterative process that enables the user to divide the data into homogeneous subsets that can be visually examined, compared, and refined. The final visualization is driven by the transformed data, and user feedback is directed to the corresponding operators, completing the iterative process. The output results are shown in a Sankey diagram-style timeline, a particular kind of flow diagram for showing factors' states and transitions over time.
RESULTS
This paper presents a visually rich, interactive web-based application that enables researchers to study any cohort over time using EMR data. The resulting visualizations help uncover hidden information in the data, compare differences between patient groups, determine critical factors that influence a particular disease, and direct further analyses. We introduce and demonstrate this tool using EMRs of 14,567 Chronic Kidney Disease (CKD) patients.
CONCLUSIONS
We developed a visual mining system to support exploratory data analysis of multi-dimensional categorical EMR data. Using CKD as a model disease, the system was assembled from automated correlational analysis and human-curated visual evaluation. Visualization methods such as the Sankey diagram can reveal useful knowledge about a particular disease cohort and the trajectories of the disease over time. |
Vishit: A Visualizer for Hindi Text | We outline the design of a visualizer, named Vishit, for texts in the Hindi language. The Hindi language is a lingua franca in many states of India where people speak different languages. The visualized text serves as a universal language where seamless communication is needed by many people who speak different languages and have different cultures. Vishit consists of the following three major processing steps: language processing, knowledge base creation and scene generation. Initial results from the Vishit prototype are encouraging. |
On the notion of canonical dimension for algebraic groups | We define and study a new numerical invariant of an algebraic group action which we call the canonical dimension. We then apply the resulting theory to the problem of computing the minimal number of parameters required to define a generic hypersurface of degree d in P^{n-1}. |
The AdobeIndoorNav Dataset: Towards Deep Reinforcement Learning based Real-world Indoor Robot Visual Navigation | Deep reinforcement learning (DRL) demonstrates its potential in learning a model-free navigation policy for robot visual navigation. However, the data-demanding algorithm relies on a large number of navigation trajectories in training. Existing datasets supporting training such robot navigation algorithms consist of either 3D synthetic scenes or reconstructed scenes. Synthetic data suffers from a domain gap to real-world scenes, while visual inputs rendered from 3D reconstructed scenes have undesired holes and artifacts. In this paper, we present a new dataset collected in the real world to facilitate research in DRL based visual navigation. Our dataset includes 3D reconstruction for real-world scenes as well as densely captured real 2D images from the scenes. It provides high-quality visual inputs with real-world scene complexity to the robot at dense grid locations. We further study and benchmark one recent DRL based navigation algorithm [1] and present our attempts and thoughts on improving its generalizability to unseen test targets in the scenes. |
A Novel Low-Ringing Monocycle Picosecond Pulse Generator Based on Step Recovery Diode | This paper presents a high-performance low-ringing ultra-wideband monocycle picosecond pulse generator, formed using a step recovery diode (SRD), simulated in ADS software and generated through experimentation. The pulse generator comprises three parts, a step recovery diode, a field-effect transistor and a Schottky diode, used to eliminate the positive and negative ringing of the pulse. Simulated results validate the design. Measured results indicate an output waveform of 1.88 peak-to-peak amplitude and 307 ps pulse duration with minimal ringing of -22.5 dB, providing good symmetry and a low level of ringing. A high degree of coordination between the simulated and measured results is achieved. |
Channel Attention and Multi-level Features Fusion for Single Image Super-Resolution | Convolutional neural networks (CNNs) have demonstrated superior performance in super-resolution (SR). However, most CNN-based SR methods neglect the different importance among feature channels or fail to take full advantage of the hierarchical features. To address these issues, this paper presents a novel recursive unit. Firstly, at the beginning of each unit, we adopt a compact channel attention mechanism to adaptively recalibrate the channel importance of input features. Then, the multi-level features, rather than only deep-level features, are extracted and fused. Additionally, we find that it will force our model to learn more details by using the learnable upsampling method (i.e., transposed convolution) only on residual branch (instead of using it both on residual branch and identity branch) while using the bicubic interpolation on the other branch. Analytic experiments show that our method achieves competitive results compared with the state-of-the-art methods and maintains faster speed as well. |
Towards semantic context-aware drones for aerial scenes understanding | Visual object tracking with unmanned aerial vehicles (UAVs) plays a central role in aerial surveillance. Reliable object detection depends on many factors, such as large displacements, occlusions, image noise, illumination and pose changes, or image blur, that may compromise the object labeling. The paper presents a proposal for a hybrid solution that adds semantic information to the video tracking processing: along with the tracked objects, the scene is completely depicted by data from places, natural features, or in general Points of Interest (POIs). Each scene from a video sequence is semantically described by ontological statements which, by inference, support the object identification that often suffers from weaknesses in object tracking methods. The synergy between the tracking methods and semantic technologies seems to bridge the object-labeling gap and to enhance the understanding of situation awareness, as well as of critical alarm situations. |
Emerging non-volatile memories: Opportunities and challenges | In recent years, non-volatile memory (NVM) technologies have emerged as candidates for future universal memory. NVMs generally have advantages such as low leakage power, high density, and fast read speed. At the same time, NVMs also have disadvantages. For example, NVMs often have asymmetric read and write speed and energy cost, which poses new challenges when applying NVMs. This paper contains a collection of four contributions, presenting a basic introduction to three emerging NVM technologies, their unique characteristics, potential challenges, and new opportunities that they may bring forward in memory systems. |
Multi-Hop WBAN Construction for Healthcare IoT Systems | It is expected that Internet of Things (IoT) applications for medical services can be one of the most remarkable solutions for taking care of the rapidly growing aging population. IoT consists of communications and sensors to accomplish its purpose. Among the diverse kinds of networks, the wireless body area network (WBAN) is a highly suitable communication tool for medical IoT devices. There is much research on WBANs and sensor networks, mainly focused on energy efficiency. However, in this paper, we discuss more practical issues for the implementation of WBANs in healthcare services. Therefore, we propose a multi-hop WBAN construction scheme that consists of four operations, including clustered topology setup, mobility support, and transmission efficiency enhancement. As an auxiliary benefit, the proposed scheme achieves energy efficiency by reducing the number of total control messages. Extensive simulation shows that the proposed scheme remarkably improves the performance of WBAN. |
Accelerating Big Data Analytics Using FPGAs | Emerging big data analytics applications require a significant amount of server computational power. As chips are hitting power limits, computing systems are moving away from general-purpose designs and toward greater specialization. Hardware acceleration through specialization has received renewed interest in recent years, mainly due to the dark silicon challenge. To address the computing requirements of big data, and based on the benchmarking and characterization results, we envision a data-driven heterogeneous architecture for next generation big data server platforms that leverage the power of field-programmable gate array (FPGA) to build custom accelerators in a Hadoop MapReduce framework. Unlike a full and dedicated implementation of Hadoop MapReduce algorithm on FPGA, we propose the hardware/software (HW/SW) co-design of the algorithm, which trades some speedup at a benefit of less hardware. Considering communication overhead with FPGA and other overheads involved in Hadoop MapReduce environment such as compression and decompression, shuffling and sorting, our experimental results show significant potential for accelerating Hadoop MapReduce machine learning kernels using HW/SW co-design methodology. |
Data Clustering: Algorithms and Applications | This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint. Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers. 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged. Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe. Data clustering: algorithms and applications / [edited by] Charu C. Aggarwal, Chandan K. Reddy. pages cm. (Chapman & Hall/CRC data mining and knowledge discovery series) Includes bibliographical references and index. |
Spectrum sharing scenarios and resulting technical requirements for 5G systems | Cellular networks today are designed for and operate in dedicated licensed spectrum. At the same time there are other spectrum usage authorization models for wireless communication, such as unlicensed spectrum or, as widely discussed currently but not yet implemented in practice, various forms of licensed shared spectrum. Hence, cellular technology as of today can only operate in a subset of the spectrum that is in principle available, and a future wireless system may benefit from the ability to access spectrum opportunities other than dedicated licensed spectrum. It is therefore important to identify which additional ways of authorizing spectrum usage are deemed to become relevant in the future and to analyze the resulting technical requirements. The implications of sharing spectrum between different technologies are analyzed in this paper, from both an efficiency and a technology neutrality perspective. Different known sharing techniques are outlined and their applicability to the relevant range of future spectrum regulatory regimes is discussed. Based on an assumed range of relevant (according to the views of the authors) future spectrum sharing scenarios, a toolbox of certain spectrum sharing techniques is proposed as the basis for the design of spectrum sharing related functionality in future mobile broadband systems. |
A line-structure-preserving approach to image resizing | This paper proposes a content-aware image resizing method which simultaneously preserves both salient image features and important line structure properties: parallelism, collinearity and orientation. When there are prominent line structures in the image, image resizing methods without explicitly taking these properties into account could produce line structure distortions in their results. Since the human visual system is very sensitive to line structures, such distortions often become noticeable and disturbing. Our method couples mesh deformations for image resizing with similarity transforms for line features. Mesh deformations are used to control content preservation while similarity transforms are analyzed in the Hough space to maintain line structure properties. Our method strikes a good balance between preserving content and maintaining line structure properties. Experiments show the proposed method often outperforms methods without taking line structures into account, especially for scenes with prominent line structures. |
DCAN: Dual Channel-Wise Alignment Networks for Unsupervised Scene Adaptation | Harvesting dense pixel-level annotations to train deep neural networks for semantic segmentation is extremely expensive and unwieldy at scale. While learning from synthetic data where labels are readily available sounds promising, performance degrades significantly when testing on novel realistic data due to domain discrepancies. We present Dual Channel-wise Alignment Networks (DCAN), a simple yet effective approach to reduce domain shift at both pixel-level and feature-level. Exploring statistics in each channel of CNN feature maps, our framework performs channel-wise feature alignment, which preserves spatial structures and semantic information, in both an image generator and a segmentation network. In particular, given an image from the source domain and unlabeled samples from the target domain, the generator synthesizes new images on-the-fly to resemble samples from the target domain in appearance and the segmentation network further refines high-level features before predicting semantic maps, both of which leverage feature statistics of sampled images from the target domain. Unlike much recent and concurrent work relying on adversarial training, our framework is lightweight and easy to train. Extensive experiments on adapting models trained on synthetic segmentation benchmarks to real urban scenes demonstrate the effectiveness of the proposed framework. |
Inverse Reinforcement Learning in Relational Domains | In this work, we introduce the first approach to the Inverse Reinforcement Learning (IRL) problem in relational domains. IRL has been used to recover a more compact representation of the expert policy leading to better generalization performances among different contexts. On the other hand, relational learning allows representing problems with a varying number of objects (potentially infinite), thus provides more generalizable representations of problems and skills. We show how these different formalisms allow one to create a new IRL algorithm for relational domains that can recover with great efficiency rewards from expert data that have strong generalization and transfer properties. We evaluate our algorithm in representative tasks and study the impact of diverse experimental conditions such as: the number of demonstrations, knowledge about the dynamics, transfer among varying dimensions of a problem, and changing dynamics. |
Dating the colonization of Sahul ( Pleistocene Australia – New Guinea ) : a review of recent research | The date for the initial colonization of Sahul is a key benchmark in human history and the topic of a long-running debate. Most analysts favor either a 40,000 BP or 60,000 BP arrival time, though some have proposed a much earlier date. Here we review data from more than 30 archaeological sites with basal ages >20,000 years reported since 1993, giving special attention to five sites with purported ages >45,000 years. We conclude that while the continent was probably occupied by 42–45,000 BP, earlier arrival dates are not well-supported. This observation undercuts claims for modern human migrations out of Africa and beyond the Levant before 50,000 BP. It also has critical but not yet conclusive implications for arguments about a human role in the extinction of Sahul megafauna. © 2003 Elsevier Ltd. All rights reserved. |
The effect of "Internet of Things" on supply chain integration and performance: An organisational capability perspective | The Internet of Things (IoT) is a next generation of Internet-connected embedded ICT systems in a digital environment that seamlessly integrate supply chain and logistics processes. Integrating the emerging IoT into current ICT systems can be unique because of its intelligent, autonomous and pervasive applications. However, research on IoT adoption in the supply chain domain is scarce, and acceptance of the IoT in retail services specifically has been largely rhetorical. This study draws upon organisational capability theory to develop an empirical model considering the effect of IoT capabilities on multiple dimensions of supply chain process integration, which in turn improves supply chain performance as well as organisational performance. Cross-sectional survey data from 227 Australian retail firms was analysed using structural equation modelling (SEM). The results indicate that IoT capability has a positive and significant effect on internal, customer-, and supplier-related process integration, which in turn positively affects supply chain performance and organisational performance. Theoretically, the study contributes to a body of knowledge that integrates information systems research into supply chain integration by establishing empirical evidence of how IoT-enabled process integration can enhance performance at both the supply chain and organisational level. Practically, the results inform managers of the likely investment in IoT that can lead to the chain's performance outcomes. |
Empirical research in on-line trust: a review and critical assessment | Lack of trust is one of the most frequently cited reasons for consumers not purchasing from Internet vendors. During the last four years a number of empirical studies have investigated the role of trust in the specific context of e-commerce, focusing on different aspects of this multi-dimensional construct. However, empirical research in this area is beset by conflicting conceptualizations of the trust construct, inadequate understanding of the relationships between trust, its antecedents and consequents, and the frequent use of trust scales that are neither theoretically derived nor rigorously validated. The major objective of this paper is to provide an integrative review of the empirical literature on trust in e-commerce in order to allow cumulative analysis of results. The interpretation and comparison of different empirical studies on on-line trust first requires conceptual clarification. A set of trust constructs is proposed that reflects both institutional phenomena (system trust) and personal and interpersonal forms of trust (dispositional trust, trusting beliefs, trusting intentions and trust-related behaviours), thus facilitating a multi-level and multi-dimensional analysis of research problems related to trust in e-commerce. © 2003 Elsevier Science Ltd. All rights reserved. |