title | abstract |
---|---|
Anterior segment dysgenesis in the eyes of mice deficient for the high-mobility-group transcription factor Sox11. | We show that Sox11, a member of group C of the Sox transcription factor family, is critically required during the morphogenetic processes of early eye development, and that lack of Sox11 results in ocular anterior segment dysgenesis (ASD). Sox11-deficient mice show a persistent lens stalk, a delay in lens formation, and the phenotypes of Peters' anomaly and microphthalmia at birth. In addition, the optic fissure does not close in the anterior halves of the eyes, resulting in anterior coloboma. The delay in lens formation is associated with reduced mitotic activity in the lens placode during its invagination into the optic cup. No changes in Pax6 expression are observed in the developing eyes of Sox11-/- mice, whereas the expression of Sox11 is reduced in the optic cup, optic vesicle and lens placode of Pax6+/- embryos and in the optic vesicle of Pax6-/- mice. Transfection experiments show an increase in Sox11 expression when higher doses of Pax6 are present. Considerably smaller amounts of BMP7 are expressed in the lens and optic cup of Sox11-/- mice than in their wild-type littermates. We conclude that Sox11 is required during separation of the lens vesicle from the surface ectoderm and during closure of the anterior optic fissure. The expression of Sox11 in early eye development is under the control of Pax6, and changes in BMP7 signalling appear to be involved in the effects of Sox11 on anterior eye development. Our findings suggest that SOX11 might similarly be involved in the pathogenesis of ASD in human patients. |
Saffron aqueous extract prevents metabolic syndrome in patients with schizophrenia on olanzapine treatment: a randomized, triple-blind, placebo-controlled study. | OBJECTIVE
The aim of this study was to assess whether saffron aqueous extract (SAE) or its active constituent, crocin, prevents olanzapine-induced metabolic syndrome (MetS) and insulin resistance in patients with schizophrenia.
METHODS
66 patients diagnosed with schizophrenia who were on olanzapine treatment (5-20 mg daily) were randomly allocated to receive a capsule of SAE (n=22; 30 mg daily), crocin (n=22; 30 mg daily) or placebo (n=22) in a 12-week triple-blind trial. Patients were screened to confirm the absence of MetS at baseline, and further assessments were done at weeks 6 and 12. Measurements of fasting blood glucose (FBS) and serum lipids were repeated at weeks 2, 6 and 12. Fasting blood levels of insulin and HbA1c were also measured at baseline and week 12. HOMA-IR and HOMA-β were determined to evaluate insulin resistance.
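For reference, the HOMA indices are not defined in the abstract; assuming the classic Matthews et al. formulation (a common but unconfirmed choice, with fasting glucose in mmol/L and fasting insulin in µU/mL), they are computed as:

```latex
% G_0: fasting glucose (mmol/L), I_0: fasting insulin (uU/mL) -- notation ours
\mathrm{HOMA\text{-}IR} = \frac{G_0 \times I_0}{22.5},
\qquad
\mathrm{HOMA\text{-}\beta} = \frac{20 \times I_0}{G_0 - 3.5}
```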
RESULTS
61 patients completed the trial, and no serious adverse effects were reported. The time-treatment interaction showed a significant difference in FBS in both the SAE and crocin groups compared with placebo (p=0.004). In addition, SAE effectively prevented patients from meeting the criteria for metabolic syndrome (0 patients), compared with crocin (9.1%) and placebo (27.3%), as early as week 6.
CONCLUSION
SAE could prevent metabolic syndrome compared to crocin and placebo. Furthermore, both SAE and crocin prevented increases in blood glucose during the study. |
A generic auto-provisioning framework for cloud databases | We discuss the problem of resource provisioning for database management systems operating on top of an Infrastructure-As-A-Service (IaaS) cloud. To solve this problem, we describe an extensible framework that, given a target query workload, continually optimizes the system's operational cost, estimated based on the IaaS provider's pricing model, while satisfying QoS expectations. Specifically, we describe two different approaches, a “white-box” approach that uses a fine-grained estimation of the expected resource consumption for a workload, and a “black-box” approach that relies on coarse-grained profiling to characterize the workload's end-to-end performance across various cloud resources. We formalize both approaches as a constraint programming problem and use a generic constraint solver to efficiently tackle them. We present preliminary experimental results, obtained by running TPC-H queries with PostgreSQL on Amazon's EC2, that provide evidence of the feasibility and utility of our approaches. We also briefly discuss the pertinent challenges and directions of ongoing research. |
SERVQUAL: A Multiple-Item Scale for Measuring Consumer Perceptions of Service Quality | This paper describes the development of a 22-item instrument (called SERVQUAL) for assessing customer perceptions of service quality in service and retailing organizations. After a discussion of the conceptualization and operationalization of the service quality construct, the procedures used in constructing and refining a multiple-item scale to measure the construct are described. Evidence of the scale's reliability, factor structure, and validity on the basis of analyzing data from four independent samples is presented next. The paper concludes with a discussion of potential applications of the scale. |
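As a sketch of the scoring logic (standard SERVQUAL usage; the notation is ours, not the paper's): each of the 22 items is rated twice, once for expectations and once for perceptions, and perceived quality is the gap between them:

```latex
% P_i: perception rating for item i, E_i: expectation rating for item i
Q_i = P_i - E_i, \qquad
\mathrm{SERVQUAL\ score} = \frac{1}{22}\sum_{i=1}^{22} \left( P_i - E_i \right)
```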
Analytical inversion-mode varactor modeling based on the EKV model and its application to RF VCO design | An analytical modeling approach for the CV-characteristics of inversion-mode MOS varactors that is based on the continuous EKV model equations is presented. Based on this approach it is possible to obtain an analytical expression for the effective large signal capacitance of varactors incorporated into a VCO and to calculate and optimize the resulting VCO tuning sensitivity K_VCO. We present a simple but quite accurate hyperbolic tangent approximation for the CV-characteristic of inversion-mode MOS varactors that depends on selected design variables and is therefore well suited to be used in a systematic VCO design flow. In order to verify the validity and accuracy of the modeling approach the CV-characteristics obtained by using the EKV based simulation model are compared with Spectre (Cadence) simulations using a BSIM 3.3 transistor model. As reference semiconductor technology we use a 0.35 µm CMOS process (C35) from austriamicrosystems (AMS). |
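A generic form of the hyperbolic tangent CV approximation mentioned above (an illustrative sketch; the symbols and exact parameterization are assumptions, not taken from the paper):

```latex
% C_min/C_max: depletion/inversion limit capacitances; V_0 sets the
% transition voltage and V_1 its steepness -- all symbols are illustrative
C_\mathrm{eff}(V) \approx C_\mathrm{min}
+ \frac{C_\mathrm{max} - C_\mathrm{min}}{2}
\left[ 1 + \tanh\!\left( \frac{V - V_0}{V_1} \right) \right]
```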
The effects of prosocial video games on prosocial behaviors: international evidence from correlational, longitudinal, and experimental studies. | Although dozens of studies have documented a relationship between violent video games and aggressive behaviors, very little attention has been paid to potential effects of prosocial games. Theoretically, games in which game characters help and support each other in nonviolent ways should increase both short-term and long-term prosocial behaviors. We report three studies conducted in three countries with three age groups to test this hypothesis. In the correlational study, Singaporean middle-school students who played more prosocial games behaved more prosocially. In the two longitudinal samples of Japanese children and adolescents, prosocial game play predicted later increases in prosocial behavior. In the experimental study, U.S. undergraduates randomly assigned to play prosocial games behaved more prosocially toward another student. These similar results across different methodologies, ages, and cultures provide robust evidence of a prosocial game content effect, and they provide support for the General Learning Model. |
Field effect diode for effective CDM ESD protection in 45 nm SOI technology | In this paper, the improved field-effect diode (FED) has been characterized and modeled in 45 nm silicon-on-insulator (SOI) technology. It has been experimentally shown to be suitable for pad-based local clamping under the normal supply voltage (Vdd) range (below 1 V) in high-speed integrated circuits. ESD protection capabilities are investigated using very fast transmission line pulse (VF-TLP) tests to predict the device's performance in charged device model (CDM) ESD events. The FED's advantages in improving transient turn-on behavior and reducing DC leakage current have been analyzed and compared with other Silicon-Controlled-Rectifier (SCR)-based SOI device variations. Technology CAD (TCAD) simulations are used to interpret the turn-on behavior and the physical effects. Process tradeoffs have been evaluated. The work prepares the device for direct application to high-speed Input/Output (I/O) circuits and addresses the severe challenge of CDM ESD protection. The improved device enables the adoption of a local clamping scheme that expands the ESD design window. |
Physical activity in older people: a systematic review | BACKGROUND
Physical activity (PA) in older people is critically important in the prevention of disease, maintenance of independence and improvement of quality of life. Little is known about the physical activity of older adults or their compliance with current physical activity guidelines.
METHODS
A systematic literature search of the published literature was conducted. Included were published reports of original research that independently reported: the PA level of non-institutional older adults (aged 60 years and over); and the proportion of older adults in the different samples who met PA recommendations or guidelines. The review was restricted to studies published since 2000 to provide a current picture of older adults' PA levels.
RESULTS
Fifty-three papers were included in the review. The percentage of older adults meeting recommended physical activity levels ranged from 2.4% to 83.0% across the studies. Definitions of "recommended" physical activity in older adults varied across the studies, as did approaches to measurement, which posed methodological challenges to data analysis. Older age groups were less likely than the reference group to be regularly active, and women were less likely than men to achieve regular physical activity, especially leisure-time physical activity, when measured by both subjective and objective criteria.
CONCLUSION
The review highlights the need for studies which recruit representative random samples of community based older people and employ validated measurement methods consistently to enable comparison of PA levels over time and between countries. |
Thirty wasted years: Australian social security development, 1950–80, in comparative perspective | This article seeks to examine the reasons why Australia, in the postwar period of economic growth and social security expansion, so resolutely remained at the bottom of the international league table of welfare state development. Four possible explanations are located — programme inertia, economic resource growth, the age structure of the population and right-wing political hegemony — and each is shown to have some impact on social security development. The article concludes with a brief discussion of the traditional dilemma for democratic socialist parties of combining reformist political goals with sufficient economic growth. |
Foveated 3D graphics | We exploit the falloff of acuity in the visual periphery to accelerate graphics computation by a factor of 5-6 on a desktop HD display (1920x1080). Our method tracks the user's gaze point and renders three image layers around it at progressively higher angular size but lower sampling rate. The three layers are then magnified to display resolution and smoothly composited. We develop a general and efficient antialiasing algorithm easily retrofitted into existing graphics code to minimize "twinkling" artifacts in the lower-resolution layers. A standard psychophysical model for acuity falloff assumes that minimum detectable angular size increases linearly as a function of eccentricity. Given the slope characterizing this falloff, we automatically compute layer sizes and sampling rates. The result looks like a full-resolution image but reduces the number of pixels shaded by a factor of 10-15.
We performed a user study to validate these results. It identifies two levels of foveation quality: a more conservative one in which users reported foveated rendering quality as equivalent to or better than non-foveated when directly shown both, and a more aggressive one in which users were unable to correctly label as increasing or decreasing a short quality progression relative to a high-quality foveated reference. Based on this user study, we obtain a slope value for the model of 1.32-1.65 arc minutes per degree of eccentricity. This allows us to predict two future advantages of foveated rendering: (1) bigger savings with larger, sharper displays than exist currently (e.g. 100 times speedup at a field of view of 70° and resolution matching foveal acuity), and (2) a roughly linear (rather than quadratic or worse) increase in rendering cost with increasing display field of view, for planar displays at a constant sharpness. |
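The acuity model underlying the layer sizing can be written compactly (our rendering of the standard linear model, not the paper's exact symbols): the minimum detectable angular size ω at eccentricity e is

```latex
% \omega_0: foveal minimum angle of resolution; m: acuity falloff slope,
% estimated in the user study as 1.32--1.65 arcmin per degree of eccentricity
\omega(e) = \omega_0 + m\,e
```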
Module mismatch loss and recoverable power in unshaded PV installations | Distributed electronics which optimize power in PV systems have the potential to improve energy production even under unshaded conditions. This work investigates the extent to which mismatch in the unshaded electrical characteristics of PV panels causes system-level power losses, which can be recovered in arrays employing power optimizers. Of particular interest is how this potential for power recovery is affected by factors such as available light, cell temperature, panel technology, and field degradation. A system for simultaneous collection of panel-level I-V curves over an entire array is designed. This system is used to acquire high and low light module performance data for a variety of arrays at the National Renewable Energy Laboratory (NREL) test facility. The measured data show moderately low variation in module maximum power and maximum power producing current in all of the arrays. As a group, the tested arrays do not show any strong correlations between this variation and array age, technology type, or operating conditions. The measured data are used to create individual panel performance models for high and low light conditions. These models are then incorporated in annual hourly energy simulations for each array. Annual mismatch loss (and thus potential for increased energy capture using power optimizers) is found to be minimal, <1% for all of the sampled arrays. Due to the nature of the tested arrays, these results may or may not be indicative of typical PV array behavior; further investigation is planned over a larger group of PV installations to determine the general applicability of this study's results. |
Cooperative (rather than autonomous) vehicle-highway automation systems | The recent DARPA-sponsored automated vehicle "Challenges" have generated strong interest in both the research community and the general public, raising consciousness about the possibilities for vehicle automation. Driverless vehicles make good subjects for the visually-oriented media, and they pose enough interesting research challenges to occupy generations of graduate students. However, automated vehicles also have the potential to help solve a variety of real-world problems. Engineers need to think carefully about which of those problems we are actually solving. A well-engineered system should be designed to satisfy specific needs, and those needs should be reflected in the definition of system requirements. Alternative technological approaches can then be evaluated and traded off based on their ability to meet the requirements. The article describes the rather different needs of the public road transportation system and the military transportation system, and then shows how those needs influence the requirements for automated vehicle systems. These requirements point toward significantly different technical approaches, but it is possible to find some limited areas of technical commonality. |
The C Programming Language | In this chapter we will learn how to • write simple computer programs using the C programming language; • perform basic mathematical calculations; • manage data stored in the computer memory and disk; • generate meaningful output on the screen or into a computer file. The C programming language was developed in the early 1970s by Ken Thompson and Dennis Ritchie at the Bell Telephone Laboratories. It was designed and implemented in parallel with the operating system Unix and was aimed mostly as a system implementation language [1]. It, nevertheless, evolved into one of the most flexible and widely used computer programming languages today. In 1989, the American National Standards Institute (ANSI) adopted a document that standardized the C language. The particular version of the language that it describes is widely referred to as ANSI C or C89. It was subsequently adopted by the International Organization for Standardization (ISO) as C90 and was later expanded to the current standard, which is often referred to as C99. The C language consists of a small set of core commands and a large number of library functions that can be incorporated in a program, if necessary. One can find many excellent books that discuss the C language in detail, starting from the first book, The C Programming Language, by B. W. Kernighan and D. M. Ritchie [2]. This book describes ANSI C and remains today one of the easiest texts on the subject. In this chapter, we will cover the most basic of the core commands of the language, as well as those library functions that are useful in developing programs that involve scientific computations. 1.1 The first program It is a tradition in the culture of C programming that the first program compiled and executed by a new student of the language is one that prints on the screen the happy message "Hello World!" [2]
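The traditional first program referred to at the end of this passage is, in ANSI C:

```c
#include <stdio.h>              /* declares printf() */

int main(void)
{
    printf("Hello World!\n");   /* print the traditional greeting */
    return 0;                   /* report success to the operating system */
}
```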
Design and control of an air-bearing supported three degree-of-freedom fine positioner | In this paper we discuss design considerations and control issues of a fine-motion device which provides very high performance, precision and speed (2 g acceleration, 0.2 µm resolution). Coupled with a standard industrial robot and external sensing, improvements in precision of up to two orders of magnitude can be achieved. Software enables it to be operated from a serial port of an IBM PC or PS/2. The moving element of the positioning head can be commanded to move over a ±1 mm range in x and y, and rotated up to ±1.75° in θ about the z axis. The device provides variable compliance, with compliance set by loop gain in a digital controller. It incorporates a novel three degree-of-freedom motor, a pair of internal position sensors to enable controlled motion in the plane, and an air bearing to support the moving element. |
A five-year analysis of MODIS NDVI and NDWI for grassland drought assessment over the central Great Plains of the United States | A five-year (2001–2005) history of moderate resolution imaging spectroradiometer (MODIS) normalized difference vegetation index (NDVI) and normalized difference water index (NDWI) data was analyzed for grassland drought assessment within the central United States, specifically for the Flint Hills of Kansas and Oklahoma. Initial results show strong relationships among NDVI, NDWI, and drought conditions. During the summer over the Tallgrass Prairie National Preserve, the average NDVI and NDWI were consistently lower (NDVI < 0.5 and NDWI < 0.3) under drought conditions than under non-drought conditions (NDVI > 0.6 and NDWI > 0.4). NDWI values exhibited a quicker response to drought conditions than NDVI. Analysis revealed that combining information from visible, near infrared, and short wave infrared channels improved sensitivity to drought severity. The proposed normalized difference drought index (NDDI) had a stronger response to summer drought conditions than a simple difference between NDVI and NDWI, and is therefore a more sensitive indicator of drought in grasslands than NDVI alone. Citation: Gu, Y., J. F. Brown, J. P. Verdin, and B. Wardlow (2007), A five-year analysis of MODIS NDVI and NDWI for grassland drought assessment over the central Great Plains of the United States, Geophys. Res. Lett., 34, L06407, doi:10.1029/2006GL029127. |
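The indices combine visible, near-infrared (NIR) and shortwave-infrared (SWIR) reflectances as follows (standard definitions, with the NDDI ratio form as commonly attributed to this paper; ρ denotes surface reflectance):

```latex
\mathrm{NDVI} = \frac{\rho_{NIR} - \rho_{red}}{\rho_{NIR} + \rho_{red}}, \qquad
\mathrm{NDWI} = \frac{\rho_{NIR} - \rho_{SWIR}}{\rho_{NIR} + \rho_{SWIR}}, \qquad
\mathrm{NDDI} = \frac{\mathrm{NDVI} - \mathrm{NDWI}}{\mathrm{NDVI} + \mathrm{NDWI}}
```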
Frequency-dependence of psychophysical and physiological responses to hand-transmitted vibration. | This invited paper reviews experimental studies of the frequency-dependence of absolute thresholds for the perception of vibration, equivalent comfort contours, temporary changes in sensation caused by vibration, and reductions in finger blood flow caused by hand-transmitted vibration. Absolute thresholds depend on the contact conditions but for a typical hand grip the thresholds show greatest sensitivity to acceleration around 125 Hz. The frequency-dependence of discomfort caused by hand-transmitted vibration depends on vibration magnitude: similar to absolute thresholds at low magnitudes, but the discomfort at higher magnitudes is similar when the vibration velocity is similar (at frequencies between about 16 and 400 Hz). Hand-transmitted vibration induces temporary elevations in vibrotactile thresholds that reflect the sensory mechanisms excited by the vibration and are therefore highly dependent on the frequency of vibration. Hand-transmitted vibration reduces finger blood flow during and after exposure; when the vibration velocity is similar at all frequencies there is more vasoconstriction at frequencies greater than 63 Hz than at lower frequencies. A single frequency weighting cannot provide a good indication of how all effects of hand-transmitted vibration depend on vibration frequency. Furthermore, a single frequency weighting provides only an approximate indication of any single response, because many factors influence the frequency-dependence of responses to hand-transmitted vibration, including the magnitude of vibration, contact conditions, and individual differences. Although the frequency weighting in current standards extends from 8 to 1,000 Hz, frequencies greater than 400 Hz rarely increase the weighted value on tools and there is currently little psychophysical or physiological evidence of their effects. |
Violent Crime Rate Studies in Philosophical Context: A Destructive Testing Approach to Heat and Southern Culture of Violence Effects | The logic behind the translation of conceptual hypotheses into testable propositions was illustrated with the heat hypothesis. The destructive testing philosophy was introduced and applied. This consists of first showing that a predicted empirical relation exists, then attempting to break that relation by adding competitor variables. The key question in destructive testing is "How difficult was it to break the relation?" This approach was used to analyze the heat effect on violent crime rates (Study 1) and on White violent crime arrest rates (Study 2) in U.S. cities. One competitor variable was the particular focus of analysis: southern culture of violence. The heat hypothesis was supported by highly significant correlations between the warmth of a city and its violence rate. This heat effect survived multiple destructive tests. Some support for the southern culture effect was also found, but this effect was more easily broken. |
A Survey of Biometric Gait Recognition: Approaches, Security and Challenges | Biometric systems are becoming increasingly important, since they provide more reliable and efficient means of identity verification. Biometric gait recognition (i.e. recognizing people from the way they walk) is one of the recent attractive topics in biometric research. This paper presents biometric user recognition based on gait. Biometric gait recognition is categorized into three groups based on: machine vision, floor sensors and wearable sensors. An overview of each gait recognition category is presented. In addition, factors that may influence gait recognition are outlined. Furthermore, the security evaluations of biometric gait under various attack scenarios are also presented. |
Testicular sperm recovery in nine 47,XXY Klinefelter patients. | Klinefelter's syndrome is generally characterized by hypergonadotrophic hypogonadism and azoospermia. The clinical features, however, are variable, and occasionally severe oligozoospermia may be present. Usually in these cases a 46,XY/47,XXY mosaic karyotype is involved. However, focal spermatogenesis and severe oligozoospermia have been reported in 47,XXY individuals too. In the present study we investigated whether testicular spermatozoa can be recovered in 47,XXY patients with a view to intracytoplasmic sperm injection (ICSI). In four out of nine apparently non-mosaic 47,XXY patients, spermatozoa were recovered from the wet preparations of testicular tissue and ICSI was performed in three couples. In one patient in whom spermatozoa were successfully recovered and used for ICSI, no spermatozoa were retrieved at a second trial. Although these results show that in some 47,XXY individuals testicular spermatozoa can be successfully recovered and even used for ICSI, at present this approach should be considered experimental. There may indeed be some concern about the chromosomal normality of the embryos generated through this infertility treatment. Patients with Klinefelter's syndrome should therefore be counselled about the complexity of this treatment, which involves multiple testicular biopsies from hypogonadal testes, ICSI and preimplantation diagnosis by fluorescence-in-situ hybridization. |
Lifelong Multi-Agent Path Finding for Online Pickup and Delivery Tasks | The multi-agent path-finding (MAPF) problem has recently received a lot of attention. However, it does not capture important characteristics of many real-world domains, such as automated warehouses, where agents are constantly engaged with new tasks. In this paper, we therefore study a lifelong version of the MAPF problem, called the multiagent pickup and delivery (MAPD) problem. In the MAPD problem, agents have to attend to a stream of delivery tasks in an online setting. One agent has to be assigned to each delivery task. This agent has to first move to a given pickup location and then to a given delivery location while avoiding collisions with other agents. We present two decoupled MAPD algorithms, Token Passing (TP) and Token Passing with Task Swaps (TPTS). Theoretically, we show that they solve all well-formed MAPD instances, a realistic subclass of MAPD instances. Experimentally, we compare them against a centralized strawman MAPD algorithm without this guarantee in a simulated warehouse system. TP can easily be extended to a fully distributed MAPD algorithm and is the best choice when real-time computation is of primary concern since it remains efficient for MAPD instances with hundreds of agents and tasks. TPTS requires limited communication among agents and balances well between TP and the centralized MAPD algorithm. |
Effect of time of day on walking capacity and self-reported fatigue in persons with multiple sclerosis: a multi-center trial. | BACKGROUND
Many persons with multiple sclerosis (PwMS) report increased fatigue in the afternoon and evening compared with the morning. It is commonly accepted that physical capacity also decreases as time of day progresses, potentially influencing the outcomes of testing.
OBJECTIVE
The objective of this article was to determine whether self-reported fatigue level and walking capacity are influenced by time of day in PwMS.
METHODS
A total of 102 PwMS from 8 centers in 5 countries, with a diverse level of ambulatory dysfunction (Expanded Disability Status Scale [EDSS] <6.5), participated. Patients performed walking capacity tests and reported fatigue level at three different time points (morning, noon, afternoon) during 1 day. Walking capacity was measured with the 6-Minute Walk Test (6MWT) and the 10-m walk test performed at usual and fastest speed. Self-reported fatigue was measured by the Rochester Fatigue Diary (RFD). Subgroups with mild (EDSS 1.5-4.0, n = 53) and moderate (EDSS 4.5-6.5, n = 49) ambulatory dysfunction were formed, as changes during the day were hypothesized to depend on disability status.
RESULTS
Subgroups differed in their degree of ambulatory dysfunction (p < 0.001) but reported similar fatigue levels. Although RFD scores were affected by time of day, with significant differences between morning and noon/afternoon (p < 0.0001), no changes in walking capacity were found in any subgroup. Additional analyses on subgroups distinguished by diurnal change in self-reported fatigue failed to reveal analogous changes in walking capacity.
CONCLUSIONS
Testing of walking capacity is unaffected by time of day, despite changes in subjective fatigue. |
Ideological Segregation and the Effects of Social Media on News Consumption | Scholars have argued that online social networks and personalized web search increase ideological segregation. We investigate the impact of these potentially polarizing channels on news consumption by examining web browsing histories for 50,000 U.S.-located users who regularly read online news. We find that individuals indeed exhibit substantially higher segregation when reading articles shared on social networks or returned by search engines, a pattern driven by opinion pieces. However, these polarizing articles from social media and web search constitute only 2% of news consumption. Consequently, while recent technological changes do increase ideological segregation, the magnitude of the effect is limited. JEL: D83, L86, L82 |
Webis: An Ensemble for Twitter Sentiment Detection | We reproduce four Twitter sentiment classification approaches that participated in previous SemEval editions with diverse feature sets. The reproduced approaches are combined in an ensemble, averaging the individual classifiers’ confidence scores for the three classes (positive, neutral, negative) and deciding sentiment polarity based on these averages. The experimental evaluation on SemEval data shows our re-implementations to slightly outperform their respective originals. Moreover, not too surprisingly, the ensemble of the reproduced approaches serves as a strong baseline in the current edition where it is top-ranked on the 2015 test set. |
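The combination rule described above can be written compactly (notation ours): with N reproduced classifiers and s_i(c | d) the confidence score of classifier i for class c on tweet d, the ensemble predicts

```latex
% average the classifiers' confidence scores per class, pick the largest
\hat{y}(d) = \arg\max_{c \,\in\, \{\text{pos},\,\text{neu},\,\text{neg}\}}
\frac{1}{N} \sum_{i=1}^{N} s_i(c \mid d)
```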
ANN-based fault classification and location in MVDC shipboard power systems | Uninterrupted power supply is an important requirement for an electric ship, since it has to cope with frequent travel and hostile conditions. However, the occurrence of faults in shipboard power systems interrupts power service continuity and leads to severe damage to electrical equipment. Faults need to be quickly detected and isolated in order to restore the power supply and prevent a massive cascading outage effect on the electrical equipment. Proper protective actions need to be implemented to ensure smooth and continuous operation of the electric ship. This paper proposes an Artificial Neural Network (ANN) based method for fault classification and location in MVDC shipboard power systems using the transient information in the fault voltage and current waveforms. The proposed approach is applied to the cable of an equivalent MVDC system which is simulated using PSCAD. It is found to be efficient in detecting the type and location of DC cable faults. |
Active Grammatical Inference: A New Learning Methodology | In this paper a new methodology for inferring regular grammars from a set of positive and negative examples and a priori knowledge (if available), which we have called active grammatical inference, is presented. The methodology is based on a combination of neural and symbolic techniques, with the interesting feature that the learning process can be guided by previously validated results and/or by introducing constraints or (positive/negative) rules dynamically. This is possible thanks to a new neural learning scheme, in which the inserted symbolic knowledge, which is also preserved, controls the learning mechanism. The whole process is conceived as a sequence of learning cycles, each one including the steps of FSA insertion, neural training, FSA extraction, symbolic manipulation and validation. The insertion of symbolic information into a recurrent neural network is carried out through linear system solving. An example that allows comparison of the active grammatical inference methodology with purely neural or symbolic approaches is also included. |
Issues in Code-Switching: Competing Theories and Models | This paper provides a critical overview of the theoretical, analytical, and practical questions most prevalent in the study of the structural and the sociolinguistic dimensions of code-switching (CS). In doing so, it reviews a range of empirical studies from around the world. The paper first looks at the linguistic research on the structural features of CS focusing in particular on the code-switching versus borrowing distinction, and the syntactic constraints governing its operation. It then critically reviews sociological, anthropological, and linguistic perspectives dominating the sociolinguistic research on CS over the past three decades. Major empirical studies on the discourse functions of CS are discussed, noting the similarities and differences between socially motivated CS and style-shifting. Finally, directions for future research on CS are discussed, giving particular emphasis to the methodological issue of its applicability to the analysis of bilingual classroom interaction. |
3D face texture modeling from uncalibrated frontal and profile images | 3D face modeling from 2D face images is of significant importance for face analysis, animation and recognition. Previous research on this topic mainly focused on 3D face modeling from a single 2D face image; however, a single face image can only provide a limited description of a 3D face. In many applications, for example, law enforcement, multi-view face images are usually captured for a subject during enrollment, which makes it desirable to build a 3D face texture model, given a pair of frontal and profile face images. We first determine the correspondence between uncalibrated frontal and profile face images through facial landmark alignment. An initial 3D face shape is then reconstructed from the frontal face image, followed by shape refinement utilizing the depth information provided by the profile image. Finally, face texture is extracted by mapping the frontal face image on the recovered 3D face shape. The proposed method is utilized for 2D face recognition in two scenarios: (i) normalization of probe image, and (ii) enhancing the representation capability of gallery set. Experimental results comparing the proposed method with a state-of-the-art commercial face matcher and densely sampled LBP on a subset of the FERET database show the effectiveness of the proposed 3D face texture model. |
Step-up therapy for children with uncontrolled asthma receiving inhaled corticosteroids. | BACKGROUND
For children who have uncontrolled asthma despite the use of low-dose inhaled corticosteroids (ICS), evidence to guide step-up therapy is lacking.
METHODS
We randomly assigned 182 children (6 to 17 years of age), who had uncontrolled asthma while receiving 100 microg of fluticasone twice daily, to receive each of three blinded step-up therapies in random order for 16 weeks: 250 microg of fluticasone twice daily (ICS step-up), 100 microg of fluticasone plus 50 microg of a long-acting beta-agonist twice daily (LABA step-up), or 100 microg of fluticasone twice daily plus 5 or 10 mg of a leukotriene-receptor antagonist daily (LTRA step-up). We used a triple-crossover design and a composite of three outcomes (exacerbations, asthma-control days, and the forced expiratory volume in 1 second) to determine whether the frequency of a differential response to the step-up regimens was more than 25%.
RESULTS
A differential response occurred in 161 of 165 patients who were evaluated (P<0.001). The response to LABA step-up therapy was most likely to be the best response, as compared with responses to LTRA step-up (relative probability, 1.6; 95% confidence interval [CI], 1.1 to 2.3; P=0.004) and ICS step-up (relative probability, 1.7; 95% CI, 1.2 to 2.4; P=0.002). Higher scores on the Asthma Control Test before randomization (indicating better control at baseline) predicted a better response to LABA step-up (P=0.009). White race predicted a better response to LABA step-up, whereas black patients were least likely to have a best response to LTRA step-up (P=0.005).
CONCLUSIONS
Nearly all the children had a differential response to each step-up therapy. LABA step-up was significantly more likely to provide the best response than either ICS or LTRA step-up. However, many children had a best response to ICS or LTRA step-up therapy, highlighting the need to regularly monitor and appropriately adjust each child's asthma therapy. (ClinicalTrials.gov number, NCT00395304.) |
An XML-Based Multiagent System for Supporting Online Recruitment Services | In this paper, we propose an Extensible Markup Language (XML)-based multiagent recommender system for supporting online recruitment services. Our system is characterized by the following features: 1) it handles user profiles for personalizing the job search over the Internet; 2) it is based on the intelligent agent technology; and 3) it uses XML for guaranteeing a light, versatile, and standard mechanism for information representation, storing, and exchange. This paper discusses the basic features of the proposed system, presents the results of an experimental study we have carried out for evaluating its performance, and makes a comparison between the proposed system and other e-recruitment systems already presented in the past. |
Applying Fuzzy Logic in Assessing the Readiness of the Company for Implementing ERP | ERP (Enterprise Resource Planning) plays an important role in integrating an organization's information and functions in all areas, finally resulting in successful operation in global markets. In underdeveloped countries, most companies have been forced to join global markets. But ERP implementation is a costly and time-consuming project, and unreadiness of the company for this implementation can lead to the loss of many valuable resources. Therefore, assessing the readiness of the company for implementing ERP before its implementation is essential. Many factors are essential in determining a company's readiness for ERP implementation. Since most of these factors are qualitative and the relations between them are very complicated, determining their exact quantitative values is quite difficult. Using fuzzy logic can help simplify the calculations and finally leads to a more precise result for qualitative problems like the readiness of a company for ERP implementation. In this study, we gathered critical success factors for ERP implementation and designed a Fuzzy System to assess the readiness of a company for ERP. This research was done in a well-known auto company in Iran. According to this study, the readiness of this company for ERP implementation was low. |
Classical Humanism and Republicanism in English Political Thought 1570–1640: Bibliography | The Journal of British Studies, founded in 1961, is published by the University of Chicago Press under the auspices of the North American Conference on British Studies (NACBS). It was the result of the imaginative generosity of a Trinity College alumnus, Frederick E. Hasler (Hon. LL.D. 1957), who contributed funds to the college for the specific purpose of establishing a learned periodical in the field of British history. The North American Conference on British Studies is a scholarly society affiliated with the American Historical Association and open to anyone in the United States and Canada interested in British civilization in its several aspects: historical, archaeological, literary, artistic, political, and sociological. Its North American constituency comprises about eight hundred members drawn from the fifty states and ten provinces. Affiliated with the parent organization are seven regional conferences (New England, Middle Atlantic, South, Midwest, Western, Pacific Coast, and Northwest) each having its own officers and programs and with a combined membership of more than two thousand. The Conference convenes at least once a year in the autumn, usually in joint session with one of its regional affiliates. It is concerned with advancing and promoting all aspects of the study of British history and culture through its publication program and through meetings, book prizes, and association with likeminded organizations in North America, Britain, and elsewhere. |
Graph Partitioning using Quantum Annealing on the D-Wave System | Graph partitioning (GP) applications are ubiquitous throughout mathematics, computer science, chemistry, physics, bio-science, machine learning, and complex systems. Post-Moore's-era supercomputing has provided us with an opportunity to explore new approaches for traditional graph algorithms on quantum computing architectures. In this work, we explore graph partitioning using quantum annealing on the D-Wave 2X machine. Motivated by a recently proposed graph-based electronic structure theory applied to quantum molecular dynamics (QMD) simulations, graph partitioning is used for reducing the calculation of the density matrix into smaller subsystems, rendering the calculation more computationally efficient. Unconstrained graph partitioning as community clustering based on the modularity metric can be naturally mapped into the Hamiltonian of the quantum annealer. On the other hand, when constraints are imposed for partitioning into equal parts and minimizing the number of cut edges between parts, a quadratic unconstrained binary optimization (QUBO) reformulation is required. This reformulation may employ the graph complement to fit the problem in the Chimera graph of the quantum annealer. Partitioning into 2 parts and k parts concurrently for arbitrary k is demonstrated with benchmark graphs, random graphs, and small material system density matrix based graphs. Results for graph partitioning using quantum and hybrid classical-quantum approaches are shown to be comparable to current "state of the art" methods and sometimes better. |
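A sketch of the two formulations mentioned above (standard forms; the exact encodings and penalty weights used on the D-Wave hardware are assumptions here): modularity-based clustering maximizes Q over spin variables, while balanced two-way min-cut is a penalized Ising/QUBO objective:

```latex
% A: adjacency matrix, k_i: degree of vertex i, m: number of edges
Q = \frac{1}{4m}\, s^{T} B\, s, \qquad
B_{ij} = A_{ij} - \frac{k_i k_j}{2m}, \qquad s_i \in \{-1,+1\}

% first term counts cut edges, second penalizes unbalanced parts (weight alpha)
H = \sum_{(i,j) \in E} \frac{1 - s_i s_j}{2} \;+\; \alpha \Big( \sum_i s_i \Big)^{2}
```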
Prostate cancer diagnosis using deep learning with 3D multiparametric MRI | A novel deep learning architecture (XmasNet) based on convolutional neural networks was developed for the classification of prostate cancer lesions, using the 3D multiparametric MRI data provided by the PROSTATEx challenge. End-to-end training was performed for XmasNet, with data augmentation done through 3D rotation and slicing, in order to incorporate the 3D information of the lesion. XmasNet outperformed traditional machine learning models based on engineered features, for both train and test data. For the test data, XmasNet outperformed 69 methods from 33 participating groups and achieved the second highest AUC (0.84) in the PROSTATEx challenge. This study shows the great potential of deep learning for cancer imaging. |
Five-times-sit-to-stand test in children with cerebral palsy: reliability and concurrent validity. | BACKGROUND/PURPOSE
Five-times-sit-to-stand test (FTSST) is a reliable tool for measuring lower limb functional strength and balance ability. However, reports of the reliability of FTSST in children with cerebral palsy have been scarce. The purposes of this study were (1) to investigate the test-retest and inter-rater reliability of the FTSST and (2) to investigate the correlation between the FTSST and standard functional balance tests in children with cerebral palsy.
STUDY DESIGN
Cross-sectional study.
MATERIALS AND METHODS
Thirty-three school children aged 6 to 18 years with Gross Motor Function Classification System, Expanded and Revised (GMFCS-E&R), levels I to III were recruited. The reliability of the FTSST and the concurrent validity between the FTSST and the Timed Up and Go test (TUG) and Berg Balance Scale (BBS) were determined using the Pearson product-moment correlation.
RESULTS
The intra-class correlation coefficients (ICC) for test-retest and inter-rater reliability of the FTSST were 0.91 and 0.88, respectively. The FTSST showed moderate correlation with the TUG (r = 0.552, P < 0.01) and with the BBS (r = -0.561, P < 0.01).
CONCLUSION
FTSST is a reliable assessment tool and correlates with functional balance ability tests in children with mild to moderate cerebral palsy. |
Social capital, knowledge management, and sustained superior performance | Purpose – This article attempts to begin the process of removing the cloak of causal ambiguity by examining the role that knowledge management has in the creation of the wide variety of competitive advantages found in some organizations. Specifically, this article aims to extend understanding in the field of knowledge management by examining how knowledge management can affect organizational performance, and by examining one possible determinant of an organization’s capacity to manage knowledge. Design/methodology/approach – Reviews literature on resource-advantage theory of the firm, social capital and knowledge management to propose ways for organizations to improve their ability to manage knowledge and achieve sustained superior performance. The paper is structured around the following constructs: resource-advantage theory of the firm, social capital, and knowledge management. Findings – Describes the relationship between social capital and knowledge management and how both help organizations achieve a sustained superior performance within the market. Suggests that organizations with high levels of social capital have more knowledge-management capabilities than organizations with low levels of social capital. Research limitations/implications – This article extends prior research of knowledge management by proposing how social capital can positively impact the ability of organizations to manage knowledge. Practical implications – Since resources within all businesses are relatively limited, and particularly so when the business is small relative to its competitors, the revelation that social capital can lead to more effective knowledge management makes the decision to support and nurture social-capital development much more credible. Originality/value – Because there is no existing literature that has examined the relationship between social capital, knowledge management, and organizational performance, this paper provides a foundation for future studies that examine the relationship between social capital and knowledge management. |
Using Android in an Introductory Java Course | An electronic control method for externally controlling processes in the computer system is provided in which electronic data is presented as graphical information to a user on a display device. In addition, electronic data input by the user is received. Software routines are written in a script language specifically designed for computer monitoring and control operations through interactively presenting data to and receiving data from the user. Subsequently, processing rules are interpreted from the software routines with an electronic data processor. Operations of the computer system are controlled based on the processing rules with the electronic data processor such that the computer system can be automatically externally controlled with the processing rules by only utilizing preexisting computer system signals received from the computer system and control signals sent to the computer system. In addition, a control system for implementing the external control method is provided. |
YAFIM: A Parallel Frequent Itemset Mining Algorithm with Spark | Frequent itemset mining (FIM) is one of the most important techniques for extracting knowledge from data in many real-world applications. The Apriori algorithm is a widely used algorithm for mining frequent itemsets from a transactional dataset. However, the FIM process is both data-intensive and computing-intensive. On the one hand, large-scale datasets are usually adopted in data mining nowadays; on the other hand, in order to generate valid information, the algorithm needs to scan the dataset iteratively many times. These make the FIM algorithm very time-consuming over big data. Parallel and distributed computing is an effective and widely used strategy for speeding up algorithms on large-scale datasets. However, the existing parallel Apriori algorithms implemented with the MapReduce model are not efficient enough for iterative computation. In this paper, we propose YAFIM (Yet Another Frequent Itemset Mining), a parallel Apriori algorithm based on the Spark RDD framework -- a specially designed in-memory parallel computing model to support iterative algorithms and interactive data mining. Experimental results show that, compared with the algorithms implemented with MapReduce, YAFIM achieved an 18× speedup on average across various benchmarks. In particular, we apply YAFIM in a real-world medical application to explore relationships in medicine. It outperforms the MapReduce method by around 25×. |
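The level-wise structure that makes Apriori iterative, and thus a good fit for Spark's cached in-memory RDDs, rests on the anti-monotonicity of support (standard property; notation ours):

```latex
X \subseteq Y \;\Rightarrow\; \mathrm{supp}(Y) \le \mathrm{supp}(X)
% hence candidates C_k are joined only from frequent (k-1)-itemsets L_{k-1},
% and each pass over the dataset keeps
% L_k = \{ c \in C_k : \mathrm{supp}(c) \ge \theta \} for minimum support \theta
```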
Temporal lobe asymmetry in patients with Alzheimer's disease with delusions. | OBJECTIVE
To test the hypothesis that delusions are associated with asymmetric involvement of the temporal lobe regions in Alzheimer's disease.
METHODS
Temporal lobe atrophy was assessed with a linear measure of width of the temporal horn (WTH) taken from CT films. Temporal asymmetry was computed as the right/left (R/L) ratio of the WTH in 22 non-delusional and 19 delusional patients with Alzheimer's disease. Delusional patients had paranoid delusions (of theft, jealousy, persecution). None of the patients had misidentifications or other delusions of non-paranoid content.
RESULTS
The R/L ratio indicated symmetric temporal horn size in the non-delusional patients (mean 1.05 (SD 0.20)), and right greater than left temporal horn in the delusional patients (mean 1.30 (SD 0.46); t=2.27, df=39, p=0.03). When patients were stratified into three groups according to the R/L ratio, 47% of the delusional (9/19) and 14% of the non-delusional patients (3/21; chi(2)=5.6, df=1, p=0.02) showed right markedly greater than left WTH.
CONCLUSIONS
Predominantly right involvement of the medial temporal lobe might be a determinant of paranoid delusions in the mild stages of Alzheimer's disease. |
Geometrically-Correct Projection-Based Texture Mapping onto a Deformable Object | Projection-based Augmented Reality commonly employs a rigid substrate as the projection surface and does not support scenarios where the substrate can be reshaped. This investigation presents a projection-based AR system that supports deformable substrates that can be bent, twisted or folded. We demonstrate a new invisible marker embedded into a deformable substrate and an algorithm that identifies deformations to project geometrically correct textures onto the deformable object. The geometrically correct projection-based texture mapping onto a deformable marker is conducted using the measurement of the 3D shape through the detection of the retro-reflective marker on the surface. In order to achieve accurate texture mapping, we propose a marker pattern that can be partially recognized and can be registered to an object's surface. The outcome of this work addresses a fundamental vision recognition challenge that allows the underlying material to change shape and be recognized by the system. Our evaluation demonstrated the system achieved geometrically correct projection under extreme deformation conditions. We envisage the techniques presented are useful for domains including prototype development, design, entertainment and information based AR systems. |
Fulfilling Papert's Dream: Computational Fluency for All | Fifty years ago, Seymour Papert and colleagues developed Logo as the first programming language for children. Today, millions of children are participating in learn-to-code initiatives, but Papert's dream remains unfulfilled. Papert (who passed away last summer) saw programming not as a set of technical skills but as a new form of fluency - a new way for all children to explore, experiment, and express themselves. In this presentation, I will examine strategies for fulfilling Papert's dream. Drawing on examples from our Scratch online coding community, I will discuss how we can design programming environments and activities to help all children, from all backgrounds, to develop their thinking, develop their voices, and develop their identities. |
Basic Properties of the Blockchain: (Invited Talk) | As the first decentralized cryptocurrency, Bitcoin [1] has ignited much excitement, not only for its novel realization of a central bank-free financial instrument, but also as an alternative approach to classical distributed computing problems, such as reaching agreement distributedly in the presence of misbehaving parties, as well as to numerous other applications: contracts, reputation systems, name services, etc. The soundness and security of these applications, however, hinges on a thorough understanding of the fundamental properties of its underlying blockchain data structure, which parties ("miners") maintain and try to extend by generating "proofs of work" (POW, aka "cryptographic puzzle"). In this talk we follow the approach introduced in [2], formulating such fundamental properties of the blockchain, and then showing how applications such as consensus and a robust public transaction ledger can be built "on top" of them. The properties are as follows, assuming the adversary's hashing power is strictly less than ½ (our analysis holds against arbitrary attacks) and high network synchrony:
Common prefix: The blockchains maintained by the honest parties possess a large common prefix. More specifically, if two honest parties "prune" (i.e., cut off) k blocks from the end of their local chains, the probability that the resulting pruned chains will not be mutual prefixes of each other drops exponentially in that parameter.
Chain quality: We show a bound on the ratio of blocks in the chain of any honest party contributed by malicious parties. In particular, as the adversary's hashing power approaches ½, we show that blockchains are only guaranteed to have few, but still some, blocks contributed by honest parties.
Chain growth: We quantify the number of blocks that are added to the blockchain during any given number of rounds during the execution of the protocol. (N.B.: This property, which in [2] was proven and used directly in the form of a lemma, was explicitly introduced in [3]. Identifying it as a separate property enables modular proofs of applications' properties.)
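Stated compactly (our paraphrase of the formalizations in [2,3]; C^⌈k denotes chain C with its last k blocks pruned, and ⪯ the prefix relation):

```latex
\textbf{Common prefix } (k):\; C_1^{\lceil k} \preceq C_2
  \text{ for the chains } C_1, C_2 \text{ of any two honest parties.} \\
\textbf{Chain quality } (\mu, \ell):\; \text{any } \ell
  \text{ consecutive blocks of an honest chain contain at least } \mu\ell
  \text{ honest blocks.} \\
\textbf{Chain growth } (\tau, s):\; \text{over any } s
  \text{ consecutive rounds an honest chain grows by at least } \tau s
  \text{ blocks.}
```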
The above properties hold assuming that all parties (honest and adversarial) "wake up" and start computing at the same time, or, alternatively, that they compute on a common random string (the "genesis" block) only made available at the exact time when the protocol execution is to begin. In this talk we also consider the question of whether such a trusted setup/behavioral assumption is necessary, answering it in the negative by presenting a Bitcoin-like blockchain protocol that is provably secure without trusted setup and, further, overcomes such lack in a scalable way, i.e., with running time independent of the number of parties [4].
A direct consequence of our construction above is that consensus can be solved directly by a blockchain protocol without trusted setup assuming an honest majority (in terms of computational power). |
Brief report: If you build it, they will come | BACKGROUND: Latinos have low representation in cancer prevention trials and intervention studies. Culturally appropriate recruitment strategies are needed to address this issue. OBJECTIVE: To describe and summarize the effectiveness of recruitment strategies used by the Latin American Cancer Research Coalition (LACRC). DESIGN: Descriptive report of recruitment methods. PARTICIPANTS: Uninsured Latino immigrants (N=1,170; 77% female, 23% male) from Central and South America recruited to 7 cancer control studies. APPROACH: The LACRC recruitment model involved inclusion of Latino researchers and providers, and use of culturally acceptable materials released through culturally appropriate outlets such as Latino radio stations. RESULTS: The overall participation rate was high—96% of patients identified as eligible agreed to participate. Women were excellent referrals for recruiting men to research studies. Additionally, a local Latino radio program was used to efficiently recruit eligible study participants. CONCLUSIONS: Latinos are interested and willing to participate in cancer control studies when culturally relevant approaches are used. Research teams that partner with Latino researchers and with Latino service providers are important in educating Latinos about cancer control and encouraging participation in research. |
Analyzing Data Sets with Missing Data: An Empirical Evaluation of Imputation Methods and Likelihood-Based Methods | Missing data are often encountered in data sets used to construct effort prediction models. Thus far, the common practice has been to ignore observations with missing data. This may result in biased prediction models. In this paper, we evaluate four missing data techniques (MDTs) in the context of software cost modeling: listwise deletion (LD), mean imputation (MI), similar response pattern imputation (SRPI), and full information maximum likelihood (FIML). We apply the MDTs to an ERP data set, and thereafter construct regression-based prediction models using the resulting data sets. The evaluation suggests that only FIML is appropriate when the data are not missing completely at random (MCAR). Unlike FIML, prediction models constructed on LD, MI and SRPI data sets will be biased unless the data are MCAR. Furthermore, compared to LD, MI and SRPI seem appropriate only if the resulting LD data set is too small to enable the construction of a meaningful regression-based prediction model.
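As an illustration of the two simplest MDTs discussed above, the following pandas sketch contrasts listwise deletion and mean imputation; SRPI and FIML require specialized (e.g., SEM) tooling and are not shown, and the column names and values are hypothetical rather than taken from the paper's ERP data set.

```python
import pandas as pd
import numpy as np

# Hypothetical effort data with missing entries (not the paper's data).
df = pd.DataFrame({
    "size_kloc": [12.0, 8.5, np.nan, 20.1, 15.3],
    "team_size": [5, np.nan, 3, 7, np.nan],
    "effort_pm": [30.0, 18.0, 11.0, 55.0, 40.0],
})

# Listwise deletion (LD): drop any observation with a missing value.
# Unbiased only if data are missing completely at random (MCAR).
ld = df.dropna()

# Mean imputation (MI): replace each missing value with the column mean.
# Keeps the sample size but shrinks variance; also assumes MCAR.
mi = df.fillna(df.mean())

print(len(ld), len(mi))  # 2 observations survive LD; MI keeps all 5
```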
Towards Procedural Quest Generation: A Structural Analysis of RPG Quests | An analysis of several role playing games indicates that player quests share common elements, and that these quests may be abstractly represented using a small expressive language. One benefit of this representation is that it can guide procedural content generation by allowing quests to be generated using this abstraction, and then later converting them into a concrete form within a game’s domain.
Structural and optical properties of ZnO films grown on ion-plated Ga-doped ZnO buffer layers by atmospheric-pressure chemical vapor deposition using Zn and H2O as source materials | Zinc oxide (ZnO) films were grown on glass substrates with ion-plated Ga-doped ZnO (GZO) buffer layers at various substrate temperatures by atmospheric-pressure chemical vapor deposition using Zn powder and water as precursors. All the X-ray diffraction patterns of the ZnO/GZO films were dominated by a ZnO(002) peak, indicating the successful growth of highly c-axis oriented films. The substrate temperature dependence of the growth rate was divided into three regions with different activation energies, i.e., re-evaporation, mass-transport-controlled and surface-controlled regions. Scanning-electron-microscope observations revealed that the films grown at substrate temperatures in the mass-transport-controlled region exhibited a terrace-like surface morphology with rock-like structures. Photoluminescence spectra of the ZnO/GZO films were composed of a near-band-edge (NBE) emission at a wavelength of about 380 nm and a broad-band emission spreading over the visible wavelength region, regardless of the substrate temperature. The visible broad-band emissions observed on many samples could be decomposed into two or three peak emissions with Lorentzian shapes. With increasing substrate temperature, the wavelength of the emission showing the strongest intensity, which was obtained by the decomposition of the visible broad-band emission, shifted towards longer wavelengths together with a broadening of its width.
Biomechanical comparison of different surface modifications for dental implants. | PURPOSE
A satisfactory clinical outcome in dental implant treatment relies on primary stability for immediate load bearing. While the geometric design of an implant contributes to mechanical stability, the nature of the implant surface itself is also critically important. Biomechanical and microcomputerized tomographic evaluation of implant osseointegration was performed to compare alternative structural, chemical and biochemical, and/or pharmaceutical surface treatments applied to an identical established implant design.
MATERIALS AND METHODS
Dental implants with the same geometry but with 6 different surface treatments were tested in vivo in a sheep model (pelvis). Peri-implant bone density and removal torque were compared at 2, 4, and 8 weeks after implantation. Implant surfaces tested were: sandblasted and acid-etched titanium (Ti), sandblasted and etched zirconia, Ti coated with calcium phosphate (CaP), Ti modified via anodic plasma-chemical treatment (APC), bisphosphonate-coated Ti (Ti + Bisphos), and Ti coated with collagen containing chondroitin sulfate (CS).
RESULTS
All dental implants were well integrated at the time of sacrifice. There were no significant differences observed in peri-implant bone density between implant groups. After 8 weeks of healing, removal torque values for Ti, Ti + CaP, Ti + Bisphos, and Ti + collagen + CS were significantly higher than those for zirconia and Ti + APC.
CONCLUSIONS
Whereas the sandblasted/acid-etched Ti implant can still be considered the reference standard surface for dental implants, functional surface modifications such as bisphosphonate or collagen coating seem to enhance early peri-implant bone formation and should be studied further. |
THE EFFECT OF EXTENDED-RELEASE METOPROLOL SUCCINATE ON C-REACTIVE PROTEIN LEVELS IN PERSONS WITH HYPERTENSION | The objective of this study was to determine whether 3 months of treatment with extended-release metoprolol succinate would reduce C-reactive protein (CRP) levels. Seventy-five patients aged 30–65 years with uncontrolled hypertension were treated with extended-release metoprolol at 25–50 mg, titrated up to 100–200 mg daily. CRP was evaluated at baseline and at 1 and 3 months. In the 61 hypertensive patients who completed the study, CRP decreased from 6.2±7.5 mg/L at baseline to 5.4±7.0 mg/L (p=0.03) at 1 month and showed no further change at 3 months (5.6±6.5 mg/L; p=0.13). The 13 patients who received 200 mg of extended-release metoprolol had a 32% decline in CRP from 7.0±9.0 mg/L to 4.8±6.6 mg/L (–2.2 mg/L) (p=0.005) over the 3-month period, whereas lower doses did not reduce CRP (p>0.05). Age, sex, and change in blood pressure were not related to the reduction in CRP in multivariate analysis. If CRP evolves into a confirmed modifiable risk factor, a β-blocker such as metoprolol may be a useful addition to pharmacotherapy options.
Chemical modulators of the innate immune response alter gypsy moth larval susceptibility to Bacillus thuringiensis | The gut comprises an essential barrier that protects both invertebrate and vertebrate animals from invasion by microorganisms. Disruption of the balanced relationship between indigenous gut microbiota and their host can result in gut bacteria eliciting host responses similar to those caused by invasive pathogens. For example, ingestion of Bacillus thuringiensis by larvae of some species of susceptible Lepidoptera can result in normally benign enteric bacteria exerting pathogenic effects. We explored the potential role of the insect immune response in mortality caused by B. thuringiensis in conjunction with gut bacteria. Two lines of evidence support such a role. First, ingestion of B. thuringiensis by gypsy moth larvae led to the depletion of their hemocytes. Second, pharmacological agents that are known to modulate innate immune responses of invertebrates and vertebrates altered larval mortality induced by B. thuringiensis. Specifically, Gram-negative peptidoglycan pre-treated with lysozyme accelerated B. thuringiensis-induced killing of larvae previously made less susceptible due to treatment with antibiotics. Conversely, several inhibitors of the innate immune response (eicosanoid inhibitors and antioxidants) increased the host's survival time following ingestion of B. thuringiensis. This study demonstrates that B. thuringiensis infection provokes changes in the cellular immune response of gypsy moth larvae. The effects of chemicals known to modulate the innate immune response of many invertebrates and vertebrates, including Lepidoptera, also indicate a role of this response in B. thuringiensis killing. Interactions among B. thuringiensis toxin, enteric bacteria, and aspects of the gypsy moth immune response may provide a novel model to decipher mechanisms of sepsis associated with bacteria of gut origin. |
AC Versus DC Distribution Systems: Did We Get It Right? | We presently enjoy a predominantly AC electrical distribution system, the engineering basis for which was designed over 100 years ago. While AC distribution systems have served us well, we should periodically pause to assess what opportunities we have accepted or been denied by the overwhelming predominance of AC electrical power distribution systems. What opportunities could be obtained by engineering DC distribution into at least portions of our present system? What advantages of the present AC distribution system should be recognized and protected? This paper will focus on distribution within premise and low-voltage distribution systems. Specifically, we will address the conversion efficiency costs of adopting various premise AC and DC distribution system topologies. According to a simple predictive model formulated in this paper, premise residential DC distribution will incur unfavorable total conversion efficiency compared with existing AC premise distribution. However, if a residence is supplied by a fuel cell or another DC generator, the total conversion efficiency within a residential DC distribution system could be similar to, or even better than, that for AC distribution.
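The paper's predictive model is not spelled out in the abstract, but the generic idea behind such comparisons can be sketched: the total conversion efficiency of a distribution path is the product of its converter-stage efficiencies. All stage values below are illustrative assumptions, not numbers from the paper.

```python
from math import prod

def path_efficiency(stage_efficiencies):
    """Total conversion efficiency of a distribution path is the
    product of its individual converter-stage efficiencies."""
    return prod(stage_efficiencies)

# Illustrative stage efficiencies (assumed, not from the paper):
# AC grid -> AC load with an internal rectifier, versus
# DC source (e.g., fuel cell) -> DC bus -> DC load.
ac_path = [0.97, 0.90]   # distribution transformer, load-side AC/DC rectifier
dc_path = [0.95, 0.96]   # DC/DC converter to bus, point-of-load DC/DC

print(f"AC path: {path_efficiency(ac_path):.3f}")
print(f"DC path: {path_efficiency(dc_path):.3f}")
```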
Complexities of the uniquely human vagina. | The vaginal microbiome in healthy women changes over short periods of time, differs among individuals, and varies in its response to sexual intercourse. |
Protection and social order | Abstract Consider a simple world populated with two types of individuals, those who work and create wealth (peasants) and those who steal the property of others (bandits). With bandits about, peasants need to protect their output and can do so individually or collectively. Either way protection is costly; it consumes resources and interferes with an individual's ability to create wealth. This study investigates how individuals might make decisions in such circumstances, how those decisions evolve over time, and how broader societal characteristics can emerge from such decisions. |
Pornographic image detection utilizing deep convolutional neural networks | Many internet users are potential victims of pornographic images, and a large part of them are underage children. Thus, content-based pornographic image detection is an important task in computer vision and multimedia research. Previous solutions usually rely on hand-engineered visual features that are hard to analyze and select. In this paper, to detect pornographic images in any style accurately and efficiently with a single model, a novel scheme utilizing deep convolutional neural networks (CNN) is proposed. The training data are obtained from the internet, followed by an improved sliding window method and some novel data augmentation approaches. Then a highly efficient training algorithm is proposed based on two strategies. The first is a non-fixed fine-tuning strategy for pre-trained mid-level representations. The second is adjusting the training data at the appropriate time on the basis of the performance of the proposed network on the validation set. Furthermore, we introduce a fast image scanning method, also based on the sliding window approach, in the test phase. We further propose a fast forward pass method based on the "fixed-point algorithm", so our CNN can process images at all scales quickly in one forward pass. The effectiveness of the proposed method is demonstrated in experiments on the proposed dataset, and the comparative results show that our method leads to state-of-the-art detection performance.
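A minimal sketch of the sliding-window scanning idea the abstract relies on (the window size, stride, and the stand-in patch classifier are assumptions; the paper's actual CNN and its fixed-point forward pass are not reproduced):

```python
import numpy as np

def sliding_window_scan(image, classify_patch, window=224, stride=112):
    """Slide a fixed-size window over the image, collect per-patch
    scores from a patch classifier (a stand-in for the trained CNN),
    and report the maximum score as the image-level score."""
    h, w = image.shape[:2]
    scores = []
    for top in range(0, max(h - window, 0) + 1, stride):
        for left in range(0, max(w - window, 0) + 1, stride):
            patch = image[top:top + window, left:left + window]
            scores.append(classify_patch(patch))
    return max(scores) if scores else 0.0

# Usage with a dummy classifier that scores mean intensity:
img = np.random.rand(448, 448, 3)
score = sliding_window_scan(img, classify_patch=lambda p: float(p.mean()))
```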
Intraoperative volume restriction in esophageal cancer surgery: an exploratory randomized clinical trial | AIM
To investigate whether the fluid volume administered during esophageal cancer surgery affects pulmonary gas exchange and tissue perfusion.
METHODS
An exploratory single-center randomized clinical trial was performed. Patients with esophageal cancer who underwent Lewis-Tanner procedure between June 2011 and August 2012 at the Department of Thoracic surgery "Jordanovac", Zagreb were analyzed. Patients were randomized (1:1) to receive a restrictive volume of intraoperative fluid (≤8 mL/kg/h) or a liberal volume (>8 mL/kg/h). Changes in oxygen partial pressure (Pao2), inspired oxygen fraction (FiO2), creatinine, and lactate were measured during and after surgery.
RESULTS
Overall, 16 patients were randomized and all were analyzed (restrictive group n=8, liberal group n=8). In the restrictive group, the baseline Pao2/FiO2 ratio was 345.01±35.31 and the value six hours after extubation was 315.51±32.91; in the liberal group, the baseline Pao2/FiO2 ratio was 330.11±34.71 and the value six hours after extubation was 307.11±30.31. The baseline creatinine value was 91.91±12.67 (restrictive) and 90.88±14.99 (liberal); six hours after extubation these values were 100.88±18.33 and 93.51±16.37, respectively. The baseline lactate value was 3.93±1.33 (restrictive) and 3.26±1.25 (liberal); six hours after extubation these values were 2.69±0.91 and 2.40±1.08, respectively. The two groups showed no significant differences in Pao2/FiO2 ratio (P=0.410), creatinine (P=0.410), or lactate (P=0.574).
CONCLUSIONS
Restriction of intraoperative applied volume does not significantly affect pulmonary exchange function or tissue perfusion in patients undergoing surgical treatment for esophageal cancer. |
Less-redundant Text Summarization Using Ensemble Clustering Algorithm Based on GA and PSO | In this paper, a novel text clustering technique is proposed to summarize text documents. The clustering method, the so-called 'Ensemble Clustering Method', combines both genetic algorithms (GA) and particle swarm optimization (PSO) efficiently and automatically to obtain the best clustering results. Summarization with this clustering method effectively avoids redundancy in the summarized document and yields good results, extracting the most significant and non-redundant sentence from each cluster of sentences in a document. We tested this technique with various text documents in the open benchmark datasets DUC01 and DUC02. To evaluate the performance, we used F-measure and ROUGE. The experimental results show that the performance of our method is about 11% to 24% better than other summarization algorithms. Keywords: Text Summarization; Extractive Summarization; Ensemble Clustering; Genetic Algorithms; Particle Swarm Optimization
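The extraction step can be illustrated as follows, assuming the GA/PSO ensemble has already assigned sentences to clusters: from each cluster, pick the sentence most similar on average to the rest of its cluster, so the summary keeps one representative, non-redundant sentence per topic. The TF-IDF representation is an assumption made for this sketch.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def extract_summary(sentences, labels):
    """From each cluster of sentences, pick the sentence most similar
    on average to the rest of its cluster (the cluster's most
    representative, least redundant member)."""
    tfidf = TfidfVectorizer().fit_transform(sentences)
    summary = []
    for c in sorted(set(labels)):
        idx = [i for i, label in enumerate(labels) if label == c]
        sims = cosine_similarity(tfidf[idx])
        best = idx[int(np.argmax(sims.mean(axis=1)))]
        summary.append(sentences[best])
    return summary

sents = ["The cat sat.", "A cat sat down.", "Stocks fell.", "Markets fell hard."]
print(extract_summary(sents, labels=[0, 0, 1, 1]))  # one sentence per cluster
```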
PISA: Pixelwise Image Saliency by Aggregating Complementary Appearance Contrast Measures With Edge-Preserving Coherence | Driven by recent vision and graphics applications such as image segmentation and object recognition, computing pixel-accurate saliency values to uniformly highlight foreground objects becomes increasingly important. In this paper, we propose a unified framework called pixelwise image saliency aggregating (PISA) various bottom-up cues and priors. It generates spatially coherent yet detail-preserving, pixel-accurate, and fine-grained saliency, and overcomes the limitations of previous methods, which rely on homogeneous superpixel-based processing and color-only treatment. PISA aggregates multiple saliency cues in a global context, such as complementary color and structure contrast measures, with their spatial priors in the image domain. The saliency confidence is further jointly modeled with a neighborhood consistence constraint into an energy minimization formulation, in which each pixel will be evaluated with multiple hypothetical saliency levels. Instead of using global discrete optimization methods, we employ the cost-volume filtering technique to solve our formulation, assigning the saliency levels smoothly while preserving the edge-aware structure details. In addition, a faster version of PISA is developed using a gradient-driven image subsampling strategy to greatly improve the runtime efficiency while keeping comparable detection accuracy. Extensive experiments on a number of public data sets suggest that PISA convincingly outperforms other state-of-the-art approaches. With this work, we also create a new data set containing 800 commodity images for evaluating saliency detection.
Calibrated, Registered Images of an Extended Urban Area | We describe a dataset of several thousand calibrated, time-stamped, geo-referenced, high dynamic range color images, acquired under uncontrolled, variable illumination conditions in an outdoor region spanning several hundred meters. The image data is grouped into several regions which have little mutual inter-visibility. For each group, the calibration data is globally consistent on average to roughly five centimeters and 0.1°, or about four pixels of epipolar registration. All image, feature and calibration data is available for interactive inspection and downloading at http://city.lcs.mit.edu/data. Calibrated imagery is of fundamental interest in a variety of applications. We have made this data available in the belief that researchers in computer graphics, computer vision, photogrammetry and digital cartography will find it of value as a test set for their own image registration algorithms, as a calibrated image set for applications such as image-based rendering, metric 3D reconstruction, and appearance recovery, and as input for existing GIS applications.
OpenIoT: An open service framework for the Internet of Things | The Internet of Things (IoT) has been a hot topic for the future of computing and communication. It will not only have a broad impact on our everyday life in the near future, but also create a new ecosystem involving a wide array of players such as device developers, service providers, software developers, network operators, and service users. In this paper, we present an open service framework for the Internet of Things, facilitating entrance into the IoT-related mass market and establishing a global IoT ecosystem with the worldwide use of IoT devices and software. We expect that the proposed open IoT service framework will play an important role in the widespread adoption of the Internet of Things in our everyday life, enhancing our quality of life with a large number of innovative applications and services, while also offering endless opportunities to all of the stakeholders in the world of information and communication technologies.
ArikIturri: An Automatic Question Generator Based on Corpora and NLP Techniques | Knowledge construction is expensive for Computer Assisted Assessment. When setting exercise questions, teachers use Test Makers to construct Question Banks. The addition of Automatic Generation to assessment applications decreases the time spent on constructing examination papers. In this article, we present ArikIturri, an Automatic Question Generator for Basque language test questions, which is independent from the test assessment application that uses it. The information source for this question generator consists of linguistically analysed real corpora, represented in the XML markup language. ArikIturri makes use of NLP tools. The influence of the robustness of those tools and of the corpora used is highlighted in the article. We have proved the viability of ArikIturri when constructing fill-in-the-blank, word formation, multiple choice, and error correction question types. In the evaluation of this automatic generator, we have obtained positive results as regards the generation process and its usefulness.
Retrieval based interactive cartoon synthesis via unsupervised bi-distance metric learning | Cartoons play important roles in many areas, but it requires a lot of labor to produce new cartoon clips. In this paper, we propose a gesture recognition method for cartoon character images with two applications, namely content-based cartoon image retrieval and cartoon clip synthesis. We first define Edge Features (EF) and Motion Direction Features (MDF) for cartoon character images. The features are classified into two different groups, namely intra-features and inter-features. An Unsupervised Bi-Distance Metric Learning (UBDML) algorithm is proposed to recognize the gestures of cartoon character images. Different from the previous research efforts on distance metric learning, UBDML learns the optimal distance metric from the heterogeneous distance metrics derived from intra-features and inter-features. Content-based cartoon character image retrieval and cartoon clip synthesis can be carried out based on the distance metric learned by UBDML. Experiments show that the cartoon character image retrieval has a high precision and that the cartoon clip synthesis can be carried out efficiently. |
Data Quality and Data Cleaning: An Overview | Data quality is a serious concern in any data-driven enterprise, often creating misleading findings during data mining and causing process disruptions in operational databases. The manifestations of data quality problems can be very expensive: "losing" customers, "misplacing" billions of dollars worth of equipment, misallocated resources due to glitched forecasts, and so on. Solving data quality problems typically requires a very large investment of time and energy; often 80% to 90% of a data analysis project is spent making the data reliable enough that the results can be trusted. In this tutorial, we present a multidisciplinary approach to data quality problems. We start by discussing the meaning of data quality and the sources of data quality problems. We show how these problems can be addressed by a multidisciplinary approach, combining techniques from management science, statistics, database research, and metadata management. Next, we present an updated definition of data quality metrics, and illustrate their application with a case study. We conclude with a survey of recent database research that is relevant to data quality problems, and suggest directions for future research.
Prevalence of somatoform disorders in a psychiatric population: an Italian nationwide survey | The survey involved 50 centres comprising both hospital and community psychiatric care services throughout Italy. Overall, 2620 patients were recruited, and of those 2002 (76%) completed the Somatoform Disorders Schedule (SDS), a CIDI-derived interview. The NOS somatoform disorders (SDs) diagnosis appeared to be the most common (60%) (and they showed the highest number of co-morbid diagnoses), followed by pain disorders (8%). The prevalence of undifferentiated somatoform and hypochondriacal disorders was 1.6%: older age groups showed a tendency towards higher rates of the latter. In general, the study found that a significant percentage of patients with SDs are referred to psychiatric services, but mainly because of other psychopathological problems: in fact, somatic complaints are cross-sectionally present in different psychiatric nosological categories. This study also emphasizes some limitations of the current classification of SDs.
High-order Graph-based Neural Dependency Parsing | In this work, we present a novel way of using a neural network for graph-based dependency parsing, which fits the neural network into a simple probabilistic model and can furthermore be generalized to high-order parsing. Instead of the sparse features used in traditional methods, we utilize distributed dense feature representations for the neural network, which provide better feature representations. The proposed parsers are evaluated on the English and Chinese Penn Treebanks. Compared to existing work, our parsers give competitive performance with much more efficient inference.
Machine Learning and Social Robotics for Detecting Early Signs of Dementia | This paper presents the EACare project, an ambitious multidisciplinary collaboration with the aim to develop an embodied system, capable of carrying out neuropsychological tests to detect early signs of dementia, e.g., due to Alzheimer’s disease. The system will use methods from Machine Learning and Social Robotics, and be trained with examples of recorded clinician-patient interactions. The interaction will be developed using a participatory design approach. We describe the scope and method of the project, and report on a first Wizard of Oz prototype. |
Geometry of Interaction V: Logic in the hyperfinite factor | Geometry of Interaction is a transcendental syntax developed in the framework of operator algebras. This fifth installment of the program takes place inside a von Neumann algebra, the hyperfinite factor. It provides a built-in interpretation of cut-elimination as well as an explanation for light, i.e., complexity sensitive, logics. |
Understanding Human Behaviors in Crowds by Imitating the Decision-Making Process | Crowd behavior understanding is crucial yet challenging across a wide range of applications, since crowd behavior is inherently determined by a sequential decision-making process based on various factors, such as the pedestrians’ own destinations, interaction with nearby pedestrians and anticipation of upcoming events. In this paper, we propose a novel framework of Social-Aware Generative Adversarial Imitation Learning (SA-GAIL) to mimic the underlying decision-making process of pedestrians in crowds. Specifically, we infer the latent factors of the human decision-making process in an unsupervised manner by extending the Generative Adversarial Imitation Learning framework to anticipate future paths of pedestrians. Different factors of human decision making are disentangled with mutual information maximization, with the process modeled by collision avoidance regularization and Social-Aware LSTMs. Experimental results demonstrate the potential of our framework in disentangling the latent decision-making factors of pedestrians and stronger abilities in predicting future trajectories.
Estimation of rotor angles of synchronous machines using artificial neural networks and local PMU-based quantities | This paper investigates a possibility for estimating rotor angles in the time frame of transient (angle) stability of electric power systems, for use in real time. The proposed dynamic state estimation technique is based on the use of voltage and current phasors obtained from a phasor measurement unit supposed to be installed on the extra-high voltage side of the substation of a power plant, together with a multilayer perceptron trained off-line from simulations. We demonstrate that an intuitive approach that directly maps the phasor measurement inputs of the neural network to the generator rotor angle does not offer satisfactory results. We found that a good way to approach the angle estimation problem is to use two neural networks to estimate sin(δ) and cos(δ) of the angle and to recover the latter from these values by simple post-processing. Simulation results on a part of the Mexican interconnected system show that the approach could yield satisfactory accuracy for real-time monitoring and control of transient instability.
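The post-processing step the abstract describes, recovering the angle from the two network outputs, can be sketched directly; using atan2 preserves the full four-quadrant range and tolerates estimates that fall slightly off the unit circle.

```python
import numpy as np

def recover_angle(sin_hat, cos_hat):
    """Recover the rotor angle (radians) from the two neural-network
    outputs estimating sin(delta) and cos(delta). atan2 keeps the
    full four-quadrant range and tolerates small estimation errors
    that leave (sin_hat, cos_hat) off the unit circle."""
    return np.arctan2(sin_hat, cos_hat)

# Example: slightly noisy estimates of a 120-degree rotor angle.
delta = np.deg2rad(120.0)
sin_hat, cos_hat = np.sin(delta) + 0.01, np.cos(delta) - 0.01
print(np.rad2deg(recover_angle(sin_hat, cos_hat)))  # close to 120
```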
A rare presentation of ventriculitis and brain abscess caused by Fusobacterium nucleatum. | Anaerobic ventriculitis is rare, and usually seen in patients with predisposing factors such as otitis media, mastoiditis, sinusitis or recent neurosurgery. We report what we believe to be the first case of ventriculitis and brain abscess due to Fusobacterium nucleatum infection in a man with no significant predisposing factors. He was successfully treated with antibiotic therapy. |
The Impact of Soft Resource Allocation on n-Tier Application Scalability | Good performance and efficiency, in terms of high quality of service and resource utilization for example, are important goals in a cloud environment. Through extensive measurements of an n-tier application benchmark (RUBBoS), we show that overall system performance is surprisingly sensitive to appropriate allocation of soft resources (e.g., server thread pool size). Inappropriate soft resource allocation can quickly degrade overall application performance significantly. Concretely, both under-allocation and over-allocation of the thread pool can lead to bottlenecks in other resources because of non-trivial dependencies. We have observed some non-obvious phenomena due to these correlated bottlenecks. For instance, the number of threads in the Apache web server can limit the total useful throughput, causing the CPU utilization of the C-JDBC clustering middleware to decrease as the workload increases. We provide a practical iterative solution approach to this challenge through an algorithmic combination of operational queuing laws and measurement data. Our results show that soft resource allocation plays a central role in the performance scalability of complex systems such as n-tier applications in cloud environments.
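One operational law such an iterative approach can build on is the Utilization Law, U = X·S. A minimal sketch (the tier names echo the benchmark, but all throughput and service-demand numbers are invented for illustration) flags the tier nearest saturation as the candidate for thread-pool adjustment:

```python
def utilization(throughput, service_demand):
    """Utilization Law: U = X * S, where X is measured throughput
    (requests/s) and S is the service demand per request (s)."""
    return throughput * service_demand

# Illustrative measurements (assumed, not RUBBoS data): check each
# tier and adjust the thread pool of whichever tier saturates first.
tiers = {"apache": (800, 0.0009), "tomcat": (800, 0.0011),
         "cjdbc": (800, 0.0013)}
for name, (x, s) in tiers.items():
    u = utilization(x, s)
    flag = "  <- near saturation" if u > 0.9 else ""
    print(f"{name}: U = {u:.2f}{flag}")
```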
Comdb2: Bloomberg's Highly Available Relational Database System | Comdb2 is a distributed database system designed for geographical replication and high availability. In contrast with the latest trends in this field, Comdb2 offers full transactional support, a standard relational model, and the expressivity of SQL. Moreover, the system allows for rich stored procedures using a dialect of Lua. Comdb2 implements a serializable system in which reads from any node always return current values. Comdb2 provides transparent High Availability through built-in service discovery and sophisticated retry logic embedded in the standard API. In addition to the relational data model, Comdb2 implements queues for publisher-to-subscriber message delivery. Queues can be combined with table triggers for time-consistent log distribution, providing functionality commonly needed in modern OLTP. In this paper we give an overview of our last twelve years of work. We focus on the design choices that have made Comdb2 the primary database solution within our company, Bloomberg LP (BLP).
The importance of sputum cytology in the diagnosis of lung cancer: a cost-effectiveness analysis. | STUDY OBJECTIVE
To assess the potential health and cost effects of initial testing with sputum cytology to diagnose lung cancer.
DESIGN
Cost-effectiveness analysis.
DATA SOURCES
Surveillance Epidemiology and End Results (SEER) program; cost data from Northern California Kaiser Permanente Hospitals and Universities of Stanford and Iowa; National Center for Health Statistics; and a MEDLINE search.
INTERVENTIONS
The use of sputum cytologies preceding other tests (ie, fine-needle aspiration, bronchoscopy, thoracoscopy) in patients with suspected lung cancer.
MAIN OUTCOME MEASURES
Mortality associated with testing and initial surgical treatment (eg, performance of thoracoscopy to remove a local-stage, centrally located cancer), cost of testing and initial treatment, life expectancy, lifetime cost of medical care, and cost-effectiveness.
RESULTS
In central lesions, sputum cytology as the first test was the dominant strategy because it both lowers medical-care costs ($2,516 per patient) and lowers the mortality risk (19 deaths in 100,000 patients) of the evaluation without adversely affecting long-term survival. In peripheral lesions, sputum cytology costs less than $25,000 per year of life saved if the pretest probability of cancer exceeds 50%. The estimated annual savings of adopting sputum cytology as the first test for diagnosing lung cancer in the United States is at least $30 million.
CONCLUSIONS
Experience in regional centers indicates that sputum cytologic testing is infrequently ordered before implementing invasive diagnostic techniques, even in patients with central lung masses. The study findings suggest that sputum cytology as the first test in suspected lung cancer is likely to be cost saving without adversely affecting patient outcomes. |
Towards Extracting Faithful and Descriptive Representations of Latent Variable Models | Methods that use latent representations of data, such as matrix and tensor factorization or deep neural methods, are becoming increasingly popular for applications such as knowledge base population and recommendation systems. These approaches have been shown to be very robust and scalable but, in contrast to more symbolic approaches, lack interpretability. This makes debugging such models difficult, and might result in users not trusting the predictions of such systems. To overcome this issue we propose to extract an interpretable proxy model from a predictive latent variable model. We use a so-called pedagogical method, where we query our predictive model to obtain observations needed for learning a descriptive model. We describe two families of (presumably more) descriptive models, simple logic rules and Bayesian networks, and show how members of these families provide descriptive representations of matrix factorization models. Preliminary experiments on knowledge extraction from text indicate that even though Bayesian networks may be more faithful to a matrix factorization model than the logic rules, the latter are possibly more useful for interpretation and debugging.
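The pedagogical method can be sketched in a few lines: sample inputs, label them with the predictive model's own outputs, and fit an interpretable learner on the resulting pairs. Here a shallow decision tree stands in for the paper's logic rules or Bayesian networks, and the black-box function is a toy stand-in for a matrix factorization model's thresholded scores.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

def extract_proxy(black_box_predict, n_features, n_queries=1000, seed=0):
    """Pedagogical extraction: sample inputs, label them with the
    predictive model's own outputs, and train a descriptive proxy
    on these (input, prediction) pairs."""
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, size=(n_queries, n_features))  # binary facts
    y = black_box_predict(X)
    return DecisionTreeClassifier(max_depth=3).fit(X, y)

# Toy stand-in for a latent-variable model's thresholded predictions:
bb = lambda X: (X[:, 0] & X[:, 2]).astype(int)
tree = extract_proxy(bb, n_features=5)
print(export_text(tree))  # human-readable rules approximating bb
```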
Multilinear Spatial Discriminant Analysis for Dimensionality Reduction | In the last few years, great efforts have been made to extend the linear projection technique (LPT) for multidimensional data (i.e., tensor), generally referred to as the multilinear projection technique (MPT). The vectorized nature of LPT requires high-dimensional data to be converted into vector, and hence may lose spatial neighborhood information of raw data. MPT well addresses this problem by encoding multidimensional data as general tensors of a second or even higher order. In this paper, we propose a novel multilinear projection technique, called multilinear spatial discriminant analysis (MSDA), to identify the underlying manifold of high-order tensor data. MSDA considers both the nonlocal structure and the local structure of data in the transform domain, seeking to learn the projection matrices from all directions of tensor data that simultaneously maximize the nonlocal structure and minimize the local structure. Different from multilinear principal component analysis (MPCA) that aims to preserve the global structure and tensor locality preserving projection (TLPP) that is in favor of preserving the local structure, MSDA seeks a tradeoff between the nonlocal (global) and local structures so as to drive its discriminant information from the range of the non-local structure and the range of the local structure. This spatial discriminant characteristic makes MSDA have more powerful manifold preserving ability than TLPP and MPCA. Theoretical analysis shows that traditional MPTs, such as multilinear linear discriminant analysis, TLPP, MPCA, and tensor maximum margin criterion, could be derived from the MSDA model by setting different graphs and constraints. Extensive experiments on face databases (ORL, CMU PIE, and the extended Yale-B) and the Weizmann action database demonstrate the effectiveness of the proposed MSDA method. |
The role of media violence in violent behavior. | Media violence poses a threat to public health inasmuch as it leads to an increase in real-world violence and aggression. Research shows that fictional television and film violence contribute to both a short-term and a long-term increase in aggression and violence in young viewers. Television news violence also contributes to increased violence, principally in the form of imitative suicides and acts of aggression. Video games are clearly capable of producing an increase in aggression and violence in the short term, although no long-term longitudinal studies capable of demonstrating long-term effects have been conducted. The relationship between media violence and real-world violence and aggression is moderated by the nature of the media content and characteristics of and social influences on the individual exposed to that content. Still, the average overall size of the effect is large enough to place it in the category of known threats to public health. |
Avoidance of overlearning characterises the spacing effect | The spacing of a fixed amount of study time across multiple sessions usually increases subsequent test performance, a finding known as the spacing effect. In the spacing experiment reported here, subjects completed multiple learning trials, and each included a study phase and a test. Once a subject achieved a perfect test, the remaining learning trials within that session comprised what is known as overlearning. The number of these overlearning trials was reduced when learning trials were spaced across multiple sessions rather than massed in a single session. In addition, the degree to which spacing reduced overlearning predicted the size of the spacing effect, which is consistent with the possibility that spacing increases subsequent recall by reducing the occurrence of overlearning. By this account, overlearning is an inefficient use of study time, and the efficacy of spacing depends at least partly on the degree to which it reduces the occurrence of overlearning.
Control for balancing line follower robot using discrete cascaded PID algorithm on ADROIT V1 education robot | Robotics has been widely used in education as a learning tool to attract and motivate students in performing laboratory experiments within the context of mechatronics, electronics, microcomputers, and control. In this paper we propose an implementation of a cascaded PID control algorithm for a line-following balancing robot. The algorithm is implemented on the ADROIT V1 education robot kit. The robot should be able to follow the trajectory given by a circular guideline while maintaining its balance. The controller is also designed to control the speed of the robot's movement while tracking the line. To achieve this, three controllers are used at the same time: a balancing controller, a speed controller, and a line-following controller. These three controllers are cascaded to control the movement of the robot, which uses two motors as its actuators. From the experiment, the proposed cascaded PID controller shows acceptable performance: the robot maintains its balance while following the circular line at the given speed setpoint.
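A minimal sketch of a discrete positional-form PID and one plausible way to cascade the three loops (all gains, rates, and the mixing of steering into the motor commands are illustrative assumptions, not the ADROIT V1 firmware):

```python
class DiscretePID:
    """Positional-form discrete PID: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative cascade (gains assumed): the line and speed loops
# bias the steering and the tilt setpoint of the inner balance loop.
line_pid    = DiscretePID(0.8, 0.0, 0.1, dt=0.01)
speed_pid   = DiscretePID(0.5, 0.05, 0.0, dt=0.01)
balance_pid = DiscretePID(25.0, 1.0, 0.8, dt=0.01)

def control_step(line_offset, speed, tilt):
    steer = line_pid.update(0.0, line_offset)        # center on the line
    tilt_setpoint = speed_pid.update(0.3, speed)     # hold 0.3 m/s
    drive = balance_pid.update(tilt_setpoint, tilt)  # keep balance
    return drive - steer, drive + steer              # left/right motor commands
```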
Can Maxwell’s equations be obtained from the continuity equation? | We formulate an existence theorem that states that given localized scalar and vector time-dependent sources satisfying the continuity equation, there exist two retarded fields that satisfy a set of four field equations. If the theorem is applied to the usual electromagnetic charge and current densities, the retarded fields are identified with the electric and magnetic fields and the associated field equations with Maxwell's equations. This application of the theorem suggests that charge conservation can be considered to be the fundamental assumption underlying Maxwell's equations. |
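For concreteness, the statement the abstract alludes to can be written out (a sketch in SI conventions): given sources satisfying the continuity equation, the theorem produces retarded fields obeying the four familiar field equations.

```latex
% Sources satisfying the continuity equation
\frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{J} = 0
% give rise to retarded fields E, B obeying
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J}
  + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```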
How the Weak Win Wars: A Theory of Asymmetric Conflict | No one had given Muhammad Ali a chance against George Foreman in the World Heavyweight Championship fight of October 30, 1974. Foreman, none of whose opponents had lasted more than three rounds in the ring, was the strongest, hardest hitting boxer of his generation. Ali, though not as powerful as Foreman, had a slightly faster punch and was lighter on his feet. In the weeks leading up to the fight, however, Foreman had practiced against nimble sparring partners. He was ready. But when the bell rang just after 4:00 a.m. in Kinshasa, something completely unexpected happened. In round two, instead of moving into the ring to meet Foreman, Ali appeared to cower against the ropes. Foreman, now confident of victory, pounded him again and again, while Ali whispered hoarse taunts: “George, you’re not hittin’,” “George, you disappoint me.” Foreman lost his temper, and his punches became a furious blur. To spectators, unaware that the elastic ring ropes were absorbing much of the force of Foreman’s blows, it looked as if Ali would surely fall. By the fifth round, however, Foreman was worn out. And in round eight, as stunned commentators and a delirious crowd looked on, Muhammad Ali knocked George Foreman to the canvas, and the fight was over. The outcome of that now-famous “rumble in the jungle” was completely unexpected. The two fighters were equally motivated to win: Both had boasted of victory, and both had enormous egos. Yet in the end, a fight that should have been over in three rounds went eight, and Foreman’s prodigious punches proved useless against Ali’s rope-a-dope strategy. This fight illustrates an important yet relatively unexplored feature of interstate conflict: how a weak actor’s strategy can make a strong actor’s power irrelevant.
Origins of Menger’s thought in French liberal economists | Carl Menger, who came to be regarded as the founder of the Austrian School, not only confronted the German Historical School and criticized British Classical Political Economy; he also read the French Liberal economists. The link between Say and Menger has already been asserted, but on an intuitive basis. It seemed necessary to give substantial proof of its true extent, as well as to document it with proper archival work—that is done in the present article. Menger’s reading of other French authors (Count Pellegrino Rossi, who succeeded Say at the Collège de France; Michel Chevalier, a major name of French industrialization; and Frédéric Bastiat, the famous defender of free trade) is less known. It is also documented here, bringing to light first-hand material, mainly from the Menger Collection located in Japan and the Perkins Library at Duke University. Thus are acknowledged the origins of Menger’s thought in French liberal economists.
Frailty as a novel predictor of mortality and hospitalization in individuals of all ages undergoing hemodialysis. | OBJECTIVES
To quantify the prevalence of frailty in adults of all ages undergoing chronic hemodialysis, its relationship to comorbidity and disability, and its association with adverse outcomes of mortality and hospitalization.
DESIGN
Prospective cohort study.
SETTING
Single hemodialysis center in Baltimore, Maryland.
PARTICIPANTS
One hundred forty-six individuals undergoing hemodialysis enrolled between January 2009 and March 2010 and followed through August 2012.
MEASUREMENTS
Frailty, comorbidity, and disability on enrollment in the study and subsequent mortality and hospitalizations.
RESULTS
At enrollment, 50.0% of older (≥ 65) and 35.4% of younger (<65) individuals undergoing hemodialysis were frail; 35.9% and 29.3%, respectively, were intermediately frail. Three-year mortality was 16.2% for nonfrail, 34.4% for intermediately frail, and 40.2% for frail participants. Intermediate frailty and frailty were associated with a 2.7 times (95% confidence interval (CI) = 1.02-7.07, P = .046) and 2.6 times (95% CI = 1.04-6.49, P = .04) greater risk of death independent of age, sex, comorbidity, and disability. In the year after enrollment, median number of hospitalizations was 1 (interquartile range 0-3). The proportion with two or more hospitalizations was 28.2% for nonfrail, 25.5% for intermediately frail, and 42.6% for frail participants. Although intermediate frailty was not associated with number of hospitalizations (relative risk = 0.76, 95% CI = 0.49-1.16, P = .21), frailty was associated with 1.4 times (95% CI = 1.00-2.03, P = .049) more hospitalizations independent of age, sex, comorbidity, and disability. The association between frailty and mortality (interaction P = .64) and hospitalizations (P = .14) did not differ between older and younger participants.
CONCLUSIONS
Adults of all ages undergoing hemodialysis have a high prevalence of frailty, more than five times as high as that of community-dwelling older adults. In this population, regardless of age, frailty is a strong, independent predictor of mortality and number of hospitalizations.
On Monotonicity-Preserving Stabilized Finite Element Approximations of Transport Problems | The aim of this work is to design monotonicity-preserving stabilized finite element techniques for transport problems as a blend of linear and nonlinear (shock-capturing) stabilization. As linear stabilization, we consider and analyze a novel symmetric projection stabilization technique based on a local Scott-Zhang projector. Next, we design a weighting of the aforementioned linear stabilization such that, when combined with a finite element discretization enjoying a discrete maximum principle (usually attained via nonlinear stabilization), it does not spoil these monotonicity properties. Then, we propose novel nonlinear stabilization schemes in the form of an artificial viscosity method where the amount of viscosity is proportional to gradient jumps either at finite element boundaries or nodes. For the nodal scheme, we prove a discrete maximum principle for time-dependent multidimensional transport problems. Numerical experiments support the numerical analysis and we show that the resulting methods provide excellent results. In particular, we observe that the proposed nonlinear stabilization techniques do an excellent work eliminating oscillations around shocks. |
BitBlaze: A New Approach to Computer Security via Binary Analysis | In this paper, we give an overview of the BitBlaze project, a new approach to computer security via binary analysis. In particular, BitBlaze focuses on building a unified binary analysis platform and using it to provide novel solutions to a broad spectrum of different security problems. The binary analysis platform is designed to enable accurate analysis, provide an extensible architecture, and combines static and dynamic analysis as well as program verification techniques to satisfy the common needs of security applications. By extracting security-related properties from binary programs directly, BitBlaze enables a principled, root-cause based approach to computer security, offering novel and effective solutions, as demonstrated with over a dozen different security applications. |
A Method for Building Personalized Ontology Summaries | In the context of ontology engineering, understanding an ontology is the basis for its further development and reuse. One intuitive and effective approach to support ontology understanding is the process of ontology summarization, which highlights the most important concepts of an ontology. Ontology summarization identifies an excerpt from an ontology that contains the most relevant concepts and produces an abridged ontology. In this article, we present a method for summarizing ontologies that represent a data source schema or describe a knowledge domain. We propose an algorithm to produce a personalized ontology summary based on user-defined parameters, e.g., summary size. The relevance of a concept is determined through user indication or centrality measures, commonly used to determine the relative importance of a vertex within a graph. The algorithm searches for the best ontology summary, i.e., the one containing the most relevant ontology concepts respecting all the original relationships between concepts and the parameters set by the user. Experiments were done comparing our generated ontology summaries against golden standard summaries as well as summaries produced by methods available in related work. We achieved on average more than 62.5% similarity with golden standard summaries.
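A hedged sketch of the centrality-driven selection step (degree centrality via networkx; the toy concepts, the seed mechanism, and the cutoff k are assumptions, and the paper's relationship-preservation search is not reproduced):

```python
import networkx as nx

def summarize_ontology(edges, k=3, seeds=()):
    """Rank concepts by degree centrality and keep the top k,
    always including any user-indicated seed concepts."""
    g = nx.Graph(edges)
    centrality = nx.degree_centrality(g)
    ranked = sorted(centrality, key=centrality.get, reverse=True)
    summary = list(seeds) + [c for c in ranked if c not in seeds]
    return summary[:k]

# Toy ontology fragment (hypothetical concepts):
edges = [("Person", "Student"), ("Person", "Professor"),
         ("Professor", "Course"), ("Student", "Course"),
         ("Course", "Department")]
print(summarize_ontology(edges, k=3, seeds=("Department",)))
```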
Assessing burn severity and comparing soil water repellency, Hayman Fire, Colorado | An important element of evaluating a large wildfire is to assess its effects on the soil in order to predict the potential watershed response. After the 55 000 ha Hayman Fire on the Colorado Front Range, 24 soil and vegetation variables were measured to determine the key variables that could be used for a rapid field assessment of burn severity. The percentage of exposed mineral soil and litter cover proved to be the best predictors of burn severity in this environment. Two burn severity classifications, one from a statistical classification tree and the other a Burned Area Emergency Response (BAER) burn severity map, were compared with measured ‘ground truth’ burn severity at 183 plots and were 56% and 69% accurate, respectively. This study also compared water repellency measurements made with the water drop penetration time (WDPT) test and a mini-disk infiltrometer (MDI) test. At the soil surface, the moderate and highly burned sites had the strongest water repellency, yet were not significantly different from each other. Areas burned at moderate severity had 1.5 times more plots that were strongly water repellent at the surface than the areas burned at high severity. However, the high severity plots most likely had a deeper water repellent layer that was not detected with our surface tests. The WDPT and MDI values had an overall correlation of r = 0.64 (p < 0.0001) and appeared to be compatible methods for assessing soil water repellency in the field. Both tests represent point measurements of a soil characteristic that has large spatial variability; hence, results from both tests reflect that variability, accounting for much of the remaining variance. The MDI is easier to use, takes about 1 min to assess a strongly water repellent soil and provides two indicators of water repellency: the time to start of infiltration and a relative infiltration rate.
General Duality for Perpetual American Options | In this paper, we investigate the generalization of the Call-Put duality equality obtained in [1] for perpetual American options when the Call-Put payoff $(y-x)^+$ is replaced by $\phi(x,y)$. It turns out that the duality still holds under monotonicity and concavity assumptions on $\phi$. The specific analytical form of the Call-Put payoff only makes calculations easier but is not crucial unlike in the derivation of the Call-Put duality equality for European options. Last, we give some examples for which the optimal strategy is known explicitly. |
MIMO radar theory and experimental results | The continuing progress of Moore's law has enabled the development of radar systems that simultaneously transmit and receive multiple coded waveforms from multiple phase centers and to process them in ways that have been unavailable in the past. The signals available for processing from these multiple-input multiple-output (MIMO) radar systems appear as spatial samples corresponding to the convolution of the transmit and receive aperture phase centers. The samples provide the ability to excite and measure the channel that consists of the transmit/receive propagation paths, the target and incidental scattering or clutter. These signals may be processed and combined to form an adaptive coherent transmit beam, or to search an extended area with high resolution in a single dwell. Adaptively combining the received data provides the effect of adaptively controlling the transmit beamshape and the spatial extent provides improved track-while-scan accuracy. This paper describes the theory behind the improved surveillance radar performance and illustrates this with measurements from experimental MIMO radars. |
Current Studies On Intrusion Detection System, Genetic Algorithm And Fuzzy Logic | Nowadays, the Intrusion Detection System (IDS), increasingly a key element of system security, is used to identify malicious activities in a computer system or network. Different approaches are employed in intrusion detection systems, but unfortunately none of the techniques so far is entirely ideal. The prediction process may produce false alarms in many anomaly-based intrusion detection systems. With the concept of fuzzy logic, the false alarm rate in establishing intrusive activities can be reduced. A set of efficient fuzzy rules can be used to define normal and abnormal behaviors in a computer network. Therefore, some strategy is needed for the best possible security to monitor anomalous behavior in a computer network. In this paper, I present a few research papers regarding the foundations of intrusion detection systems, the methodologies and good fuzzy classifiers using genetic algorithms that are the focus of current development efforts, and the solution of the problem of intrusion detection, in order to offer a real-world view of intrusion detection. Ultimately, a discussion of upcoming technologies and various methodologies which promise to improve the capability of computer systems to detect intrusions is offered.
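To illustrate how fuzzy rules soften a crisp anomaly threshold, here is a small sketch with triangular membership functions; the feature (connection rate), the breakpoints, and the rule weights are all invented for illustration, and no GA rule learning is shown.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alarm_degree(conn_rate):
    """Illustrative fuzzy rule base: IF rate IS high THEN alarm,
    blended with a 'medium' rule at half weight, so borderline
    traffic raises a graded alarm instead of a crisp false positive."""
    medium = triangular(conn_rate, 20, 50, 80)
    high = triangular(conn_rate, 60, 100, 140)
    return min(1.0, 0.5 * medium + 1.0 * high)

for rate in (30, 70, 110):
    print(rate, round(alarm_degree(rate), 2))
```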
for Cryo-EM Image Processing involving Gröbner Bases Using C++/Java/HOL/Scala/ScalaLab/ImageJ Software Environments – A Short Communication on Gröbner Bases With Applications in Signals and Systems Using JikesRVM/JVM | In this research communication on commutative algebra, we propose to deal with Gröbner bases and their applications in the signals and systems domain. This is one of the pioneering communications dealing with Cryo-EM image processing using multi-disciplinary concepts involving thermodynamics and electromagnetics based on a first-principles approach. Keywords: Commutative Algebra/HOL/Scala/JikesRVM/Cryo-EM Images/CoCoALib/JAS. Introduction & Inspiration: Cryo-Electron Microscopy (Cryo-EM) is an expanding structural-biology technique that has recently undergone a quantum leap in its applicability to the study of challenging nano-bio systems. Because crystallization is not required, only small amounts of sample are needed, and images can be classified using a computer, the technique has the promising potential to deal with compositional as well as conformational mixtures. Cryo-EM can be used to investigate complete and fully functional macromolecular complexes in different functional states, providing a wealth of insight into nano-bio systems. In this short communication we point, via references, to some of the principles behind the Cryo-EM methodology of single-particle analysis and discuss the application of Gröbner bases to challenging systems of paramount nano-bio importance. Special emphasis is on new methodological developments that are leading to an explosion of new studies, many of which are reaching resolutions that could only be dreamed of just a few years ago [1-9] [Figures I-IV]. There are two main challenges facing researchers in Cryo-EM image processing, according to Prof. Hadani: "(1) The first challenge is that the projection images are extremely noisy (due to the low electron dose that can interact with each molecule before it is destroyed). (2) The second is that the orientations of the molecules that produced every image is unknown (unlike crystallography where the molecules are packed in a form of a crystal and therefore share the same known orientation). Overcoming these two challenges are very much principal in the science of CryoEM." In the context of the above-mentioned challenges, we intend to investigate and suggest Gröbner bases to process Cryo-EM images using thermodynamics and electromagnetics principles. The inspiration to write this short communication was derived mainly from the works of Prof. Buchberger and Dr. Rolf Landauer. Sources: "The physical nature of information", Rolf Landauer, IBM T.J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598, USA; "Gröbner Bases: A Short Introduction for Systems Theorists", Bruno Buchberger, Research Institute for Symbolic Computation, University of Linz, A-4232 Schloss Hagenberg, Austria. Additional interesting facts are observed in an article by Jon Cohen: "Structural Biology – Is High-Tech View of HIV Too Good To Be True?" (http://davidcrowe.ca/SciHealthEnv/papers/9599-IsHighTechViewOfHIVTooGoodToBeTrue.pdf). Researchers are interested in finding better software tools to refine the Cryo-EM image-processing tasks at hand using all the mathematical tools at their disposal; commutative algebra is one such promising tool, hence the justification for using Gröbner bases. Informatics Framework Design, Implementation & Analysis:
[Figure I: Mathematical algorithm implementation and software architecture – overall idea presented in the paper (self-explanatory graphical algorithm); see also "Understanding JikesRVM in the Context of Cryo-EM/TEM/SEM Imaging Algorithms and Applications – A General Informatics Introduction from a Software Architecture View Point" by Nirmal & Gagik, 2016. Figure II: Mathematical algorithm with various Gröbner bases mathematical tools/software. Figure III: Scala- and Java-based software architecture flow. Figure IV: Mathematical algorithm involving EM field theory & thermodynamics.]
Consolidated Health Economic Evaluation Reporting Standards (CHEERS)--explanation and elaboration: a report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force. | BACKGROUND
Economic evaluations of health interventions pose a particular challenge for reporting because substantial information must be conveyed to allow scrutiny of study findings. Despite a growth in published reports, existing reporting guidelines are not widely adopted. There is also a need to consolidate and update existing guidelines and promote their use in a user-friendly manner. A checklist is one way to help authors, editors, and peer reviewers use guidelines to improve reporting.
OBJECTIVE
The task force's overall goal was to provide recommendations to optimize the reporting of health economic evaluations. The Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement is an attempt to consolidate and update previous health economic evaluation guidelines into one current, useful reporting guidance. The CHEERS Elaboration and Explanation Report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force facilitates the use of the CHEERS statement by providing examples and explanations for each recommendation. The primary audiences for the CHEERS statement are researchers reporting economic evaluations and the editors and peer reviewers assessing them for publication.
METHODS
The need for new reporting guidance was identified by a survey of medical editors. Previously published checklists or guidance documents related to reporting economic evaluations were identified from a systematic review and subsequent survey of task force members. A list of possible items from these efforts was created. A two-round, modified Delphi Panel with representatives from academia, clinical practice, industry, and government, as well as the editorial community, was used to identify a minimum set of items important for reporting from the larger list.
RESULTS
Out of 44 candidate items, 24 items and accompanying recommendations were developed, with some specific recommendations for single study-based and model-based economic evaluations. The final recommendations are subdivided into six main categories: 1) title and abstract, 2) introduction, 3) methods, 4) results, 5) discussion, and 6) other. The recommendations are contained in the CHEERS statement, a user-friendly 24-item checklist. The task force report provides explanation and elaboration, as well as an example for each recommendation. The ISPOR CHEERS statement is available online via Value in Health or the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices - CHEERS Task Force webpage (http://www.ispor.org/TaskForces/EconomicPubGuidelines.asp).
CONCLUSIONS
We hope that the ISPOR CHEERS statement and the accompanying task force report guidance will lead to more consistent and transparent reporting, and ultimately, better health decisions. To facilitate wider dissemination and uptake of this guidance, we are copublishing the CHEERS statement across 10 health economics and medical journals. We encourage other journals and groups to consider endorsing the CHEERS statement. The author team plans to review the checklist for an update in 5 years. |
An Internet computing model for ensuring continuity of healthcare | The lack of an integrated medical information service model has been considered a main obstacle to ensuring continuity of healthcare from doctors and healthcare professionals to patients; the resulting unavailable, inaccurate, or non-conformant healthcare information services have been recognized as main causes of the millions of medication errors that occur each year. This paper proposes an Internet computing model aimed at providing an affordable, interoperable, easy-to-integrate, and systematic approach to the development of a medical information service network that enables the delivery of continuity of healthcare. Web services, wireless, and advanced automatic identification technologies are fully integrated in the proposed service model. Some preliminary research results are presented. |
Non-singular inhomogeneous stiff fluid cosmology | In this talk we show a stiff fluid solution of the Einstein equations for a cylindrically symmetric spacetime. The main features of this metric are that it is non-separable in comoving coordinates for the congruence of the worldlines of the fluid and that it yields regular curvature invariants. |
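For context, a stiff fluid is one whose pressure equals its energy density. The abstract does not reproduce the specific solution, so the block below records only the equation of state together with a generic diagonal cylindrically symmetric line element in comoving coordinates; this is the general ansatz, not the particular non-separable metric of the talk.

```latex
% Stiff fluid equation of state and a generic diagonal, cylindrically
% symmetric line element (general ansatz only, not the talk's solution).
% Non-separable means the metric functions do not factor as f(t)g(r).
p = \rho, \qquad
ds^2 = -A^2(t,r)\,dt^2 + B^2(t,r)\,dr^2 + C^2(t,r)\,dz^2 + D^2(t,r)\,d\varphi^2
```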
Efficient mutation analysis by propagating and partitioning infected execution states | Mutation analysis evaluates a testing technique by measuring how well it detects seeded faults (mutants). Mutation analysis is hampered by inherent scalability problems — a test suite is executed for each of a large number of mutants. Despite numerous optimizations presented in the literature, this scalability issue remains, and this is one of the reasons why mutation analysis is hardly used in practice. Whereas most previous optimizations attempted to statically reduce the number of executions or their computational overhead, this paper exploits information available only at run time to further reduce the number of executions. First, state infection conditions can reveal — with a single test execution of the unmutated program — which mutants would lead to a different state, thus avoiding unnecessary test executions. Second, determining whether an infected execution state propagates can further reduce the number of executions. Mutants that are embedded in compound expressions may infect the state locally without affecting the outcome of the compound expression. Third, those mutants that do infect the state can be partitioned based on the resulting infected state — if two mutants lead to the same infected state, only one needs to be executed as the result of the other can be inferred. We have implemented these optimizations in the Major mutation framework and empirically evaluated them on 14 open source programs. The optimizations reduced the mutation analysis time by 40% on average. |
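The run-time optimizations described here lend themselves to a compact illustration. The Python sketch below is a hypothetical simplification, not the Major framework's actual implementation: it evaluates each mutated expression against the state of a single unmutated run, skips mutants whose local value is unchanged (no infection), and partitions the remaining mutants by their infected value so that only one representative per partition needs a full test execution. (Propagation analysis through enclosing compound expressions is omitted for brevity.)

```python
from collections import defaultdict

def analyze_mutants(original_expr, mutants, test_states):
    """original_expr/mutants: callables mapping a program state (dict) to
    the value of the (possibly mutated) expression under analysis."""
    to_execute = []
    for state in test_states:
        base_value = original_expr(state)      # one unmutated execution
        partitions = defaultdict(list)         # infected value -> mutant ids
        for m_id, mutant_expr in mutants.items():
            value = mutant_expr(state)
            if value == base_value:
                continue                       # no state infection: skip mutant
            partitions[value].append(m_id)     # same infected state, same outcome
        # one representative per infected-state partition needs a full test run
        to_execute.append([ms[0] for ms in partitions.values()])
    return to_execute

# Toy usage: mutants of the expression "a + b".
states = [{'a': 2, 'b': 2}]
orig = lambda s: s['a'] + s['b']
muts = {'a-b': lambda s: s['a'] - s['b'],
        'a*b': lambda s: s['a'] * s['b'],   # equals a+b here: not infected
        'a/b': lambda s: s['a'] / s['b']}
print(analyze_mutants(orig, muts, states))  # [['a-b', 'a/b']]
```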
Biodiversity for Food and Nutrition in Brazil | The Biodiversity for Food and Nutrition Project—officially the Mainstreaming Biodiversity Conservation and Sustainable Use for Improved Human Nutrition and Well-being project, or BFN project, is a multi-country initiative with an ambitious goal to mainstream biodiversity conservation to improve nutrition in four countries: Kenya, Sri Lanka, Turkey, and Brazil. In Brazil, the BFN project is just one of many initiatives within a national strategy to eradicate hunger and extreme poverty, specifically by mainstreaming biodiversity into national food and livelihood efforts. The project is influencing the national policy landscape, both by filling critical data gaps through an online portal which will streamline national data on Brazilian biodiversity and ecosystems, and through strong partnerships. |
Situational uses of syndromic surveillance. | Since 2001, many state and local health departments have implemented automated systems to monitor healthcare use and to promptly identify and track epidemics and other public health threats. In 2007-08, we conducted case studies of selected events with actual or potential public health impacts to determine whether and how health departments and hospitals used these new systems. We interviewed public health and hospital representatives and applied qualitative analysis methods to identify response themes. So-called "syndromic" surveillance methods were most useful in situations with widespread health effects, such as respiratory illness associated with seasonal influenza or exposures to smoke from wildfires. In other instances, such as a tornado or hazardous material exposures, these systems were useful for detecting or monitoring health impacts that affected relatively few people, or they were used to affirm the absence of outbreaks following natural disasters or the detection of a potential pathogen in air samples. Typically, these data supplemented information from traditional sources to provide a timelier or fuller mosaic of community health status, and use was shaped by long-standing contacts between health department and hospital staffs. State or local epidemiologists generally preferred syndromic systems they had developed over the CDC BioSense system, citing lesser familiarity with BioSense and less engagement in its development. Instances when BioSense data were most useful to state officials occurred when analyses and reports were provided by CDC staff. Understanding the uses of surveillance information during such events can inform further investments in surveillance capacity in public health emergency preparedness programs. |
Case Report Mowat-Wilson syndrome: clinical and molecular report of the first case in mainland China | Mowat-Wilson syndrome (MWS, MIM #235730) is a rare genetic disorder characterized by moderate-to-severe mental retardation, a recognizable facial gestalt and multiple congenital anomalies. The striking facial phenotype, in addition to other features such as microcephaly, congenital heart defects, Hirschsprung disease (HSCR), severely delayed motor/speech development, seizures, short stature, corpus callosum agenesis and hypospadias, provides particularly important clues for the initial clinical diagnosis. All molecularly confirmed cases with typical MWS have a heterozygous loss-of-function mutation in the ZEB2 (zinc finger E-box binding homeobox 2) gene, suggesting that haploinsufficiency of the protein is the main pathological mechanism. Here, we report the first individual with MWS in mainland China confirmed by molecular genetic testing. A 1-day-old girl was referred to the department of surgery for abdominal distension and failure to pass meconium. Targeted exome sequencing revealed a de novo heterozygous nonsense mutation (p.Arg302X) in ZEB2 in the patient. Medical record review revealed a mild facial gestalt, HSCR and severe congenital heart defects, supporting the diagnosis of MWS. We conclude that facial dysmorphism in newborn babies might be atypical; doctors should pay close attention during physical examination and be aware of MWS if multiple congenital defects are discovered. ZEB2 gene mutation screening would be an effective way to clarify the diagnosis. |
Teacher beliefs and technology integration practices: A critical relationship | Early studies indicated that teachers' enacted beliefs, particularly in terms of classroom technology practices, often did not align with their espoused beliefs. Researchers concluded this was due, at least in part, to a variety of external barriers that prevented teachers from using technology in ways that aligned more closely with their beliefs. However, many of these barriers (access, support, etc.) have since been eliminated in the majority of schools. This multiple case-study research was designed to revisit the question, "How do the pedagogical beliefs and classroom technology practices of teachers, recognized for their technology uses, align?" Twelve K-12 classroom teachers were purposefully selected based on their award-winning technology practices, supported by evidence from personal and/or classroom websites. Follow-up interviews were conducted to examine the correspondence between teachers' classroom practices and their pedagogical beliefs. |
The successive mean quantization transform | This paper presents the successive mean quantization transform (SMQT). The transform reveals the organization or structure of the data and removes properties such as gain and bias. The transform is described and applied in speech processing and image processing. The SMQT is considered as an extra processing step for the mel frequency cepstral coefficients commonly used in speech recognition. In image processing the transform is applied in automatic image enhancement and dynamic range compression. |
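The transform's recursive structure can be stated concretely. Below is a minimal Python sketch of an L-level SMQT under its usual formulation (split each subset at its mean and emit one output bit per level); the function and parameter names are illustrative, and edge cases such as constant inputs are handled only minimally.

```python
def smqt(values, levels):
    """Successive Mean Quantization Transform: maps each input value to an
    integer code in [0, 2**levels - 1], insensitive to gain and bias."""
    codes = {i: 0 for i in range(len(values))}

    def split(indices, level):
        if level == 0 or not indices:
            return
        mean = sum(values[i] for i in indices) / len(indices)
        low  = [i for i in indices if values[i] <= mean]
        high = [i for i in indices if values[i] >  mean]
        for i in high:                      # one output bit per level
            codes[i] |= 1 << (level - 1)
        split(low,  level - 1)              # recurse on each half
        split(high, level - 1)

    split(list(range(len(values))), levels)
    return [codes[i] for i in range(len(values))]

# Gain/bias insensitivity: scaling and shifting the input leaves codes unchanged.
x = [1, 2, 3, 10, 11, 12]
print(smqt(x, 2))                        # [0, 0, 1, 2, 2, 3]
print(smqt([5 * v + 7 for v in x], 2))   # same codes as above
```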
Indirubin and Indirubin Derivatives for Counteracting Proliferative Diseases | Indirubin is the active component of Danggui Longhui Wan, a traditional Chinese medicine formulation. The encouraging clinical results from the 1980s obtained in chronic myelocytic leukemia patients treated with indirubin stimulated numerous studies on this compound. These investigations explored the use of indirubin in different types of cancer and reported the synthesis of novel derivatives with improved chemical and pharmacokinetic properties. In this paper, we review the impressive progress that has been made in elucidating the mechanistic understanding of how indirubin and its derivatives affect physiological and pathophysiological processes, mainly by inhibition of cell proliferation and induction of cell death. Furthermore, we survey the therapeutic use of these compounds in combating proliferative diseases such as cancer, restenosis, and psoriasis. |
Behavior change techniques in top-ranked mobile apps for physical activity. | BACKGROUND
Mobile applications (apps) have potential for helping people increase their physical activity, but little is known about the behavior change techniques marketed in these apps.
PURPOSE
The aim of this study was to characterize the behavior change techniques represented in online descriptions of top-ranked apps for physical activity.
METHODS
Top-ranked apps (n=167) were identified on August 28, 2013, and coded using the Coventry, Aberdeen and London-Revised (CALO-RE) taxonomy of behavior change techniques during the following month. Analyses were conducted during 2013.
RESULTS
Most descriptions of apps incorporated fewer than four behavior change techniques. The most common techniques involved providing instruction on how to perform exercises, modeling how to perform exercises, providing feedback on performance, goal-setting for physical activity, and planning social support/change. A latent class analysis revealed the existence of two types of apps, educational and motivational, based on their configurations of behavior change techniques.
CONCLUSIONS
Behavior change techniques are not widely marketed in contemporary physical activity apps. Based on the available descriptions and functions of the observed techniques in contemporary health behavior theories, people may need multiple apps to initiate and maintain behavior change. This audit provides a starting point for scientists, developers, clinicians, and consumers to evaluate and enhance apps in this market. |
A production planning model for an unreliable production facility: Case of finite horizon and single demand | We study a two-level inventory system that is subject to failures and repairs. The objective is to minimize the expected total cost so as to determine the production plan for a single quantity demand. The expected total cost consists of the inventory carrying costs for finished and unfinished items, the backlog cost for not meeting the demand due-date, and the planning costs associated with the ordering schedule of unfinished items. The production plan consists of the optimal number of lots, the optimal size for each lot, the optimal ordering schedule for unfinished items, and the optimal due-date to be assigned to the demand. To gain insight, we solve special cases and use their results to devise an efficient solution approach for the main model. The models are solved to optimality and the solution is either obtained in closed form or through very efficient algorithms. |
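As a concrete, heavily simplified stand-in for the cost structure described (carrying costs for finished and unfinished items, a backlog cost tied to the due-date, and planning costs per order), the Python sketch below enumerates the number of equal-sized lots for a fixed demand quantity. All parameter names and the linear cost forms are illustrative assumptions; the paper's actual model is stochastic, with machine failures and repairs, and is solved to optimality rather than by enumeration.

```python
def total_cost(n_lots, demand, h_unfinished, h_finished,
               k_order, rate, b_backlog, due_date):
    """Toy deterministic cost: equal lot sizes, sequential production."""
    lot = demand / n_lots
    completion = demand / rate                    # time to finish all lots
    ordering = k_order * n_lots                   # planning/ordering cost
    # Crude carrying-cost proxies: unfinished items held per lot; finished
    # items held until the due-date if production ends early.
    carrying = (h_unfinished * lot * completion
                + h_finished * max(due_date - completion, 0) * demand)
    backlog = b_backlog * max(completion - due_date, 0)
    return ordering + carrying + backlog

demand, rate, due = 100.0, 10.0, 12.0
best = min(range(1, 21),
           key=lambda n: total_cost(n, demand, 0.5, 0.2, 25.0, rate, 40.0, due))
print(best, total_cost(best, demand, 0.5, 0.2, 25.0, rate, 40.0, due))
```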
Predicting Slice-to-Volume Transformation in Presence of Arbitrary Subject Motion | This paper aims to solve a fundamental problem in intensity-based 2D/3D registration, which concerns the limited capture range and need for very good initialization of state-of-the-art image registration methods. We propose a regression approach that learns to predict rotations and translations of arbitrary 2D image slices from 3D volumes, with respect to a learned canonical atlas co-ordinate system. To this end, we utilize Convolutional Neural Networks (CNNs) to learn the highly complex regression function that maps 2D image slices into their correct position and orientation in 3D space. Our approach is attractive in challenging imaging scenarios, where significant subject motion complicates reconstruction performance of 3D volumes from 2D slice data. We extensively evaluate the effectiveness of our approach quantitatively on simulated MRI brain data with extreme random motion. We further demonstrate qualitative results on fetal MRI where our method is integrated into a full reconstruction and motion compensation pipeline. With our CNN regression approach we obtain an average prediction error of 7 mm on simulated data, and convincing reconstruction quality of images of very young fetuses where previous methods fail. We further discuss applications to Computed Tomography and X-ray projections. Our approach is a general solution to the 2D/3D initialization problem. It is computationally efficient, with prediction times per slice of a few milliseconds, making it suitable for real-time scenarios. |
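A minimal sketch of the regression idea, written in PyTorch with an illustrative architecture (the paper does not prescribe this exact network): a small CNN maps a 2D slice to a 6-dimensional pose vector, three rotation parameters plus three translations, trained with a standard regression loss.

```python
import torch
import torch.nn as nn

class SlicePoseRegressor(nn.Module):
    """Predict a 6-DoF pose (3 rotation params + 3 translations) from a 2D slice."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.pose = nn.Linear(128, 6)   # [rx, ry, rz, tx, ty, tz]

    def forward(self, x):
        return self.pose(self.features(x).flatten(1))

model = SlicePoseRegressor()
slices = torch.randn(8, 1, 128, 128)          # batch of 2D slices (toy data)
target = torch.randn(8, 6)                    # ground-truth poses (toy data)
loss = nn.MSELoss()(model(slices), target)    # standard regression loss
loss.backward()
```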