title | abstract |
---|---|
Tricaine methane-sulfonate (MS-222) application in fish anaesthesia | By N. Topic Popovic, I. Strunjak-Perovic, R. Coz-Rakovac, J. Barisic, M. Jadan, A. Persin Berakovic and R. Sauerborn Klobucar. Laboratory of Ichthyopathology – Biological Materials, Division for Materials Chemistry, Rudjer Boskovic Institute, Zagreb, Croatia; Department of Anaesthesiology, University Hospital Clinic, Zagreb, Croatia |
Viewing artworks: Contributions of cognitive control and perceptual facilitation to aesthetic experience | When we view visual images in everyday life, our perception is oriented toward object identification. In contrast, when viewing visual images as artworks, we also tend to experience subjective reactions to their stylistic and structural properties. This experiment sought to determine how cognitive control and perceptual facilitation contribute to aesthetic perception along with the experience of emotion. Using functional MRI, we show that aesthetic perception activated bilateral insula which we attribute to the experience of emotion. Moreover, while adopting the aesthetic orientation activated the left lateral prefrontal cortex, paintings that facilitated visuospatial exploration activated the left superior parietal lobule. The results suggest that aesthetic experience is a function of the interaction between top-down orienting of attention and bottom-up perceptual facilitation. |
Technology affordances for intersubjective meaning making: A research agenda for CSCL | Now well into its second decade, the field of Computer Supported Collaborative Learning (CSCL) appears healthy, encompassing a diversity of topics of study, methodologies, and representatives of various research communities. It is an appropriate time to ask: what central questions can integrate our work into a coherent field? This paper proposes the study of technology affordances for intersubjective meaning making as an integrating research agenda for CSCL. A brief survey of epistemologies of collaborative learning and forms of computer support for that learning characterizes the field to be integrated and motivates the proposal. A hybrid of experimental, descriptive and design methodologies is proposed in support of this agenda. A working definition of intersubjective meaning making as joint composition of interpretations of a dynamically evolving context is provided, and used to propose a framework around which dialogue between analytic approaches can take place. |
A double-blind, randomised, parallel group, multinational, multicentre study comparing a single dose of ondansetron 24 mg p.o. with placebo and metoclopramide 10 mg t.d.s. p.o. in the treatment of opioid-induced nausea and emesis in cancer patients | Nausea and emesis are common side effects of opioid drugs administered for pain relief in cancer patients. The aim of this study was to compare the anti-emetic efficacy and safety of ondansetron, placebo and metoclopramide in the treatment of opioid-induced nausea and emesis (OIE) in cancer patients. This was a multinational, multicentre, double-blind, parallel group study in which cancer patients who were receiving a full opioid agonist for cancer pain were randomised to receive one of oral ondansetron 24 mg once daily, metoclopramide 10 mg three times daily, or placebo. Study medication was started only if the patient experienced nausea and/or emesis following opioid administration. Efficacy and safety assessments were made over a study period of 24 h from the time of the first dose of anti-emetics/placebo. The study was terminated prematurely because of the difficulties in recruiting patients satisfying the stringent entry criteria. Ninety-two patients were included in the intent-to-treat population: 30 patients received placebo, 29 patients ondansetron and 33 patients metoclopramide. There was no statistically significant difference between the groups in the proportion achieving complete control of emesis (33% of patients on placebo, 48% on ondansetron and 52% on metoclopramide) or complete control of nausea (23% of patients on placebo, 17% on ondansetron and 36% on metoclopramide). Rescue anti-emetics were required in 8 of 33 patients on metoclopramide, 4 of 29 on ondansetron, and 3 of 30 on placebo. The incidence of adverse events was very low and similar in all treatment groups. Neither ondansetron 24 mg once daily nor metoclopramide 10 mg t.d.s. given orally was significantly more effective than placebo in the control of OIE in cancer patients. |
Privacy-Enhanced Web Personalization | Consumer studies demonstrate that online users value personalized content. At the same time, providing personalization on websites seems quite profitable for web vendors. This win-win situation is however marred by privacy concerns since personalizing people's interaction entails gathering considerable amounts of data about them. As numerous recent surveys have consistently demonstrated, computer users are very concerned about their privacy on the Internet. Moreover, the collection of personal data is also subject to legal regulations in many countries and states. Both user concerns and privacy regulations impact frequently used personalization methods. This article analyzes the tension between personalization and privacy, and presents approaches to reconcile the two. |
Directional asymmetries in human smooth pursuit eye movements. | PURPOSE
Humans make smooth pursuit eye movements to bring the image of a moving object onto the fovea. Although pursuit accuracy is critical to prevent motion blur, the eye often falls behind the target. Previous studies suggest that pursuit accuracy differs between motion directions. Here, we systematically assess asymmetries in smooth pursuit.
METHODS
In experiment 1, binocular eye movements were recorded while observers (n = 20) tracked a small spot of light moving along one of four cardinal or diagonal axes across a featureless background. We analyzed pursuit latency, acceleration, peak velocity, gain, and catch-up saccade latency, number, and amplitude. In experiment 2 (n = 22), we examined the effects of spatial location and constrained stimulus motion within the upper or lower visual field.
RESULTS
Pursuit was significantly faster (higher acceleration, peak velocity, and gain) and smoother (fewer and later catch-up saccades) in response to downward versus upward motion in both the upper and the lower visual fields. Pursuit was also more accurate and smoother in response to horizontal versus vertical motion.
CONCLUSIONS
Our study is the first to report a consistent up-down asymmetry in human adults, regardless of visual field. Our findings suggest that pursuit asymmetries are adaptive responses to the requirements of the visual context: preferred motion directions (horizontal and downward) are more critical to our survival than nonpreferred ones. |
The crying baby: what approach? | PURPOSE OF REVIEW
Cry-fuss problems are among the most common clinical presentations in the first few months of life and are associated with adverse outcomes for some mothers and babies. Cry-fuss behaviour emerges out of a complex interplay of cultural, psychosocial, environmental and biologic factors, with organic disturbance implicated in only 5% of cases. A simplistic approach can have unintended consequences. This article reviews recent evidence in order to update clinical management.
RECENT FINDINGS
New research is considered in the domains of organic disturbance, feed management, maternal health, sleep management, and sensorimotor integration. This transdisciplinary approach takes into account the variable neurodevelopmental needs of healthy infants, the effects of feeding management on the highly plastic neonatal brain, and the bi-directional brain-gut-enteric microbiota axis. An individually tailored, mother-centred and family-centred approach is recommended.
SUMMARY
The family of the crying baby requires early intervention to assess for and manage potentially treatable problems. Cross-disciplinary collaboration is often necessary if outcomes are to be optimized. |
Information systems strategy: Quo vadis? | This article is a personal retrospective which traces the evolution of information systems strategy (ISS) since it emerged as a topic in the late 1970s and considers the nature of organisations' ISSs and how they have been influenced by the interplay of many factors over that period. In addition to responding to the rapidly evolving underlying technologies, ISS practice in organisations has had to deal with the combined effects of economic cycles and an increasingly global business context, which affect both the organisations themselves and the development of the IT industry. This article argues that the changing fortunes of the IT suppliers and their strategies are two of the most significant influences on organisations' ISSs. The influence and contribution of academics and their research is also discussed. The study of ISS has largely followed practice and attempted to explain its nature, role and impact using contemporary theoretical paradigms, but often based on relatively limited empirical data. In conclusion it is suggested that a new multi-centred, collaborative approach, involving both academic and practitioner experts to develop a comprehensive evidence base, would enable greater understanding of how the range of factors interact to determine the nature and value of ISS in 21st century organisations. |
DendroPy: a Python library for phylogenetic computing |
DendroPy is a cross-platform library for the Python programming language that provides for object-oriented reading, writing, simulation and manipulation of phylogenetic data, with an emphasis on phylogenetic tree operations. DendroPy uses a splits-hash mapping to perform rapid calculations of tree distances, similarities and shape under various metrics. It contains rich simulation routines to generate trees under a number of different phylogenetic and coalescent models. DendroPy's data simulation and manipulation facilities, in conjunction with its support of a broad range of phylogenetic data formats (NEXUS, Newick, PHYLIP, FASTA, NeXML, etc.), allow it to serve a useful role in various phyloinformatics and phylogeographic pipelines.
AVAILABILITY
The stable release of the library is available for download and automated installation through the Python Package Index site (http://pypi.python.org/pypi/DendroPy), while the active development source code repository is available to the public from GitHub (http://github.com/jeetsukumaran/DendroPy). |
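To make the library's role concrete, here is a minimal sketch using the DendroPy 4 API (the Newick strings and model rates are invented for illustration; consult the project documentation for the authoritative interface):

```python
import dendropy
from dendropy.calculate import treecompare
from dendropy.simulate import treesim

# A shared TaxonNamespace makes two trees comparable.
tns = dendropy.TaxonNamespace()
t1 = dendropy.Tree.get(data="((A,B),(C,D));", schema="newick", taxon_namespace=tns)
t2 = dendropy.Tree.get(data="((A,C),(B,D));", schema="newick", taxon_namespace=tns)

# Robinson-Foulds (symmetric difference) distance, computed via split encodings.
print(treecompare.symmetric_difference(t1, t2))

# Simulate a tree under a birth-death model.
sim = treesim.birth_death_tree(birth_rate=1.0, death_rate=0.5, num_extant_tips=8)
print(sim.as_string(schema="newick"))
```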
Ambulance location and relocation models | This article traces the evolution of ambulance location and relocation models proposed over the past 30 years. The models are classified in two main categories. Deterministic models are used at the planning stage and ignore stochastic considerations regarding the availability of ambulances. Probabilistic models reflect the fact that ambulances operate as servers in a queueing system and cannot always answer a call. In addition, dynamic models have been developed to repeatedly relocate ambulances throughout the day. |
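For concreteness, the location set covering model (Toregas et al.) is a typical example of the deterministic category surveyed here: open as few stations as possible while covering every demand point. In the sketch below, the notation is assumed, following common usage: I is the set of demand points, J the candidate sites, and N_i the sites that can reach demand i within the response-time standard.

```latex
\begin{align}
\min \quad & \sum_{j \in J} x_j \\
\text{s.t.} \quad & \sum_{j \in N_i} x_j \ge 1 \quad \forall i \in I \\
& x_j \in \{0, 1\} \quad \forall j \in J
\end{align}
```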
Energy Efficient Smartphone-Based Activity Recognition using Fixed-Point Arithmetic | In this paper we propose a novel energy efficient approach for the recognition of human activities using smartphones as wearable sensing devices, targeting assisted living applications such as remote patient activity monitoring for the disabled and the elderly. The method exploits fixed-point arithmetic to propose a modified multiclass Support Vector Machine (SVM) learning algorithm, allowing it to better preserve the smartphone battery lifetime with respect to the conventional floating-point based formulation while maintaining comparable system accuracy levels. Experiments show comparative results between this approach and the traditional SVM in terms of recognition performance and battery consumption, highlighting the advantages of the proposed method. |
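A rough illustration of the fixed-point idea follows (my own sketch, not the paper's modified SVM: the Q16 format, the linear kernel, and all numbers are assumptions). The decision value is computed entirely in integer arithmetic, which is the operation a battery-constrained device would repeat for every classification:

```python
import numpy as np

Q = 16              # fractional bits: Q16 fixed-point (illustrative choice)
SCALE = 1 << Q

def to_fixed(x):
    """Quantize floats to Q16 integers."""
    return np.round(np.asarray(x) * SCALE).astype(np.int64)

def svm_decision_fixed(w_fx, b_fx, x_fx):
    """Linear SVM decision value in pure integer arithmetic.
    A product of two Q16 numbers is Q32, so shift back by Q."""
    return (np.sum(w_fx * x_fx) >> Q) + b_fx

# Toy trained model (floats) and one input sample
w, b = np.array([0.75, -1.25, 0.5]), 0.25
x = np.array([1.0, 0.5, -2.0])

d_float = float(np.dot(w, x) + b)
d_fixed = svm_decision_fixed(to_fixed(w), to_fixed(b), to_fixed(x)) / SCALE
print(d_float, d_fixed)   # nearly identical; the integer path avoids the FPU
```

A multiclass extension would run one such decision function per class (e.g., one-vs-all) and pick the maximum, still in integer arithmetic.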
Neuroophthalmological outcomes associated with use of the Pipeline Embolization Device: analysis of the PUFS trial results. | OBJECT
Neuroophthalmological morbidity is commonly associated with large and giant cavernous and supraclinoid internal carotid artery (ICA) aneurysms. The authors sought to evaluate the neuroophthalmological outcomes after treatment of these aneurysms with the Pipeline Embolization Device (PED).
METHODS
The Pipeline for Uncoilable or Failed Aneurysms (PUFS) trial was an international, multicenter prospective trial evaluating the safety and efficacy of the PED. All patients underwent complete neuroophthalmological examinations both before the PED procedure and at a 6-month follow-up. All examinations were performed for the purpose of this study and according to study criteria.
RESULTS
In total, 108 patients were treated in the PUFS trial, 98 of whom had complete neuroophthalmological follow-up. Of the patients with complete follow-up, 39 (40%) presented with a neuroophthalmological baseline deficit that was presumed to be attributable to the aneurysm, and patients with these baseline deficits had significantly larger aneurysms. In 25 of these patients (64%), the baseline deficit showed at least some improvement 6 months after PED treatment, whereas in 1 patient (2.6%), the deficits only worsened. In 5 patients (5%), new deficits had developed at the 6-month follow-up, while in another 6 patients (6%), deficits that were not originally assumed to be related to the aneurysm had improved by that time. A history of diabetes was associated with failure of the baseline deficits to improve after the treatment. The aneurysm maximum diameter was significantly larger in patients with a new deficit or a worse baseline deficit at 6 months postprocedure.
CONCLUSIONS
Patients treated with the PED for large and giant ICA aneurysms had excellent neuroophthalmological outcomes 6 months after the procedure, with deficits improving in most of the patients, very few deficits worsening, and few new deficits developing. |
Patterns of Play: Play-Personas in User-Centred Game Development | In recent years certain trends from user-centered design have been seeping into the practice of designing computer games. The balance of power between game designers and players is being renegotiated in order to find a more active role for players and provide them with control in shaping the experiences that games are meant to evoke. Growing player agency can translate into both an increased sense of player immersion and potentially improved chances of critical acclaim. This paper presents a possible solution to the challenge of involving the user in the design of interactive entertainment by adopting and adapting the "persona" framework introduced by Alan Cooper in the field of Human Computer Interaction. The original method is improved by complementing the traditional ethnographic descriptions of personas with parametric, quantitative, data-oriented models of patterns of user behaviour for computer games. |
Current recommendations for the pharmacologic therapy in Kawasaki syndrome and management of its cardiovascular complications. | Kawasaki syndrome is a potentially life-threatening disease of early childhood that, if untreated, carries a risk of severe coronary involvement. Its diagnosis is made via a list of clinical signs because etiology and pathophysiology are still unknown and no specific laboratory test is available. Appropriate therapy with intravenous immunoglobulins and aspirin reduces the incidence of coronary abnormalities to less than 5%. Immunoglobulins have been shown to be highly effective in reducing disease symptoms or their severity and chiefly in reducing the rate of coronary artery aneurysm development. Aspirin is used first in high doses for its anti-inflammatory properties and then in low doses for its anti-thrombotic effects. Timely diagnosis and promptly administered treatment are two crucial points in defining the prognosis of Kawasaki syndrome. In this review, heart complications are discussed and therapeutic options are stratified according to both severity of coronary involvement and grading of cardiovascular risk. |
Achievement goals in sport: the development and validation of the Perception of Success Questionnaire. | Recent research into motivation and achievement behaviour in sport has focused on achievement goal theory. This theory states that two goal orientations manifest themselves in achievement contexts and impact on the motivation process. These two goals have been defined as 'task' and 'ego' goal orientations. This paper traces the development of the Perception of Success Questionnaire as a measure of achievement goals developed specifically for the sport context. The early development of the questionnaire is documented, in which the scale was shortened from the initial 29 to the current 12 question format. We demonstrate that task and ego goals are orthogonal, that internal reliabilities for the orientations are high, and that the scale has strong construct and concurrent validity. We conclude by reporting results from two recent confirmatory factor analyses that were conducted on the Children's and Adult versions of the questionnaire; these results show the Perception of Success Questionnaire to be a reliable and valid instrument to measure achievement goal orientations in sport. |
Long-term clinical outcome after a first angiographically confirmed coronary stent thrombosis: an analysis of 431 cases. | BACKGROUND
There are limited data on the long-term clinical outcome after an angiographically confirmed (definite) stent thrombosis (ST).
METHODS AND RESULTS
Four hundred thirty-one consecutive patients with a definite ST were enrolled in this multicenter registry. The primary end point was the composite of cardiac death and definite recurrent ST. Secondary end points were all-cause death, cardiac death, definite recurrent ST, definite and probable recurrent ST, any myocardial infarction, and any target-vessel revascularization. The primary end point occurred in 111 patients after a median follow-up of 27.1 months. The estimated cumulative event rates at 30 days and 1, 2, and 3 years were 18.0%, 23.6%, 25.2%, and 27.9%, respectively. The cumulative incidence rates of definite recurrent ST, definite or probable recurrent ST, any myocardial infarction, and any target-vessel revascularization were 18.8%, 20.1%, 21.3%, and 32.0%, respectively, at the longest available follow-up. Independent predictors for the primary end point were diabetes mellitus, total stent length, severe calcification, American College of Cardiology/American Heart Association B2-C lesions, TIMI (Thrombolysis In Myocardial Infarction) flow grade <3 after percutaneous coronary intervention, and left ventricular ejection fraction <45%. The implantation of an additional coronary stent during the first ST was also associated with unfavorable outcome. Clinical outcome was not affected by the type of previously implanted stent (drug-eluting or bare-metal stent) or the category of ST (early versus late).
CONCLUSIONS
The long-term clinical outcome after a first definite ST is unfavorable, with a high mortality and recurrence rate. Diabetes mellitus, left ventricular ejection fraction <45%, long total stent length, complex coronary lesions, TIMI flow grade <3 after percutaneous coronary intervention, and implantation of an additional coronary stent during the emergent percutaneous coronary intervention for the ST were associated with this unfavorable outcome. |
Improving Semantic Textual Similarity with Phrase Entity Alignment | Semantic Textual Similarity (STS) measures the degree of semantic equivalence between two segments of text, even when the similar content is expressed using different words. The textual segments are word phrases, sentences, paragraphs or documents. The similarity can be measured using lexical, syntactic and semantic information embedded in the sentences. The STS task in the SemEval workshop is viewed as a regression problem, where the real-valued output is clipped to the range 0-5 on a sentence pair. In this paper, empirical evaluations are carried out using lexical, syntactic and semantic features on the STS 2016 dataset. A new syntactic feature, Phrase Entity Alignment (PEA), is proposed. A phrase entity is a conceptual unit in a sentence with a subject or an object and its describing words. PEA aligns phrase entities present in the sentences based on their similarity scores. The STS score is measured by combining the similarity scores of all aligned phrase entities. The impact of PEA on semantic textual equivalence is assessed using the Pearson correlation between system-generated scores and the human annotations. The proposed system attains a mean score of 0.7454 using a random forest regression model. The results indicate that the system using lexical, syntactic and semantic features together with the PEA feature performs better than existing systems. |
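A toy sketch of the alignment step is given below. Everything here is a stand-in: phrase entities are supplied by hand rather than extracted from a parse, and a generic string ratio replaces the paper's combined lexical/syntactic/semantic similarity. It only illustrates the greedy one-to-one alignment and score combination:

```python
from difflib import SequenceMatcher

def phrase_sim(p1, p2):
    # Placeholder similarity; the paper combines lexical, syntactic
    # and semantic evidence rather than raw string overlap.
    return SequenceMatcher(None, p1, p2).ratio()

def pea_score(phrases_a, phrases_b):
    """Greedily align phrase entities one-to-one by pairwise similarity,
    then average the aligned pair scores into a sentence-level score."""
    pairs = sorted(((phrase_sim(a, b), a, b)
                    for a in phrases_a for b in phrases_b), reverse=True)
    used_a, used_b, scores = set(), set(), []
    for s, a, b in pairs:
        if a not in used_a and b not in used_b:
            used_a.add(a); used_b.add(b); scores.append(s)
    return sum(scores) / max(len(phrases_a), len(phrases_b))

# Hypothetical phrase entities from two sentences
sent1 = ["a black dog", "chased the cat"]
sent2 = ["the dark dog", "pursued a cat"]
print(pea_score(sent1, sent2) * 5)   # map [0,1] onto the 0-5 STS scale
```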
An Introduction to Deep Learning | The deep learning paradigm tackles problems on which shallow architectures (e.g. SVMs) are affected by the curse of dimensionality. As part of a two-stage learning scheme involving multiple layers of nonlinear processing, a set of statistically robust features is automatically extracted from the data. The present tutorial introducing the ESANN deep learning special session details the state-of-the-art models and summarizes the current understanding of this learning approach, which is a reference for many difficult classification tasks. |
Quality-of-service in cloud computing: modeling techniques and their applications | Recent years have seen the massive migration of enterprise applications to the cloud. One of the challenges posed by cloud applications is Quality-of-Service (QoS) management, which is the problem of allocating resources to the application to guarantee a service level along dimensions such as performance, availability and reliability. This paper aims at supporting research in this area by providing a survey of the state of the art of QoS modeling approaches suitable for cloud systems. We also review and classify their early application to some decision-making problems arising in cloud QoS management. |
A clinical experience of the supraclavicular flap used to reconstruct head and neck defects in late-stage cancer patients. | The supraclavicular island flap has been widely used in head and neck reconstruction, providing an alternative to traditional techniques such as regional or free flaps, mainly because of its thin skin island tissue and reliable vascularity. Head and neck patients who require large reconstructions usually present poor clinical and healing conditions. An early experience using this flap for late-stage head and neck tumour treatment is reported. Forty-seven supraclavicular artery flaps were used to treat head and neck oncologic defects after cutaneous, intraoral and pharyngeal tumour resections. Dissection time, complications, and donor and reconstructed area outcomes were assessed. The mean time for harvesting the flaps was 50 min for the senior author. All donor sites were closed primarily. Three cases of laryngopharyngectomy reconstruction developed a small controlled (salivary) leak that resolved with conservative measures. Small or no strictures were detected on radiologic swallowing examinations and all patients regained normal swallowing function. Five patients developed donor site dehiscence. These wounds were treated with regular dressing until healing was complete. There were four distal flap necroses in this series. These necroses were debrided and closed primarily. The supraclavicular flap is a pliable option for head and neck oncologic reconstruction in late-stage patients. High-risk patients and modified radical neck dissection are not contraindications for its use. The absence of the need to isolate the pedicle offers quick and reliable harvesting. The arc of rotation on the base of the neck provides adequate length for pharyngeal and oral lining reconstruction and for the middle and superior thirds of the face. |
The security challenges in the IoT enabled cyber-physical systems and opportunities for evolutionary computing & other computational intelligence | The Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT enabled cyber-physical systems, from the connected supply chain and the Big Data produced by huge numbers of IoT devices, to industrial control systems. Evolutionary computation combined with other computational intelligence will play an important role in cybersecurity, for example artificial immune mechanisms for IoT security architecture, data mining/fusion in IoT enabled cyber-physical systems, and data-driven cybersecurity. This paper provides an overview of the security challenges in IoT enabled cyber-physical systems and of what evolutionary computation and other computational intelligence technologies could contribute to these challenges. The overview could provide clues and guidance for research in IoT security with computational intelligence. |
A Comparative Study on Approaches of Vector Space Model in Information Retrieval | The vector space model is one of the classical and widely applied information retrieval models, ranking web pages based on similarity values. The retrieval operation consists of a cosine similarity function to compute the similarity values between a given query and the set of retrieved documents, and then ranks the documents according to relevance. In this paper, we present different approaches of the vector space model to compute similarity values of hits from a search engine for given queries based on term weights. In order to achieve the goal of an effective evaluation algorithm, our work provides an extensive analysis of the main aspects of the vector space model and its approaches, and a comprehensive comparison for Term-Count
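The core retrieval computation whose variants the paper compares is easy to state in code. Below is a minimal term-count cosine similarity (one of the weighting schemes mentioned; the documents and query are invented):

```python
import math
from collections import Counter

def cosine_similarity(query, doc):
    """Cosine similarity on raw term-count vectors (the Term-Count variant)."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    dot = sum(q[t] * d[t] for t in q)
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in d.values())))
    return dot / norm if norm else 0.0

docs = ["vector space model ranks web pages",
        "cosine similarity between query and documents",
        "unrelated text about cooking"]
query = "rank documents by cosine similarity"

# Rank documents by similarity to the query, as in the retrieval step
for score, doc in sorted(((cosine_similarity(query, d), d) for d in docs),
                         reverse=True):
    print(f"{score:.3f}  {doc}")
```

Swapping the raw counts for TF-IDF weights changes only how `q` and `d` are built; the cosine step is unchanged.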
Self Paced Deep Learning for Weakly Supervised Object Detection | In a weakly-supervised scenario, object detectors need to be trained using image-level annotation alone. Since bounding-box-level ground truth is not available, most of the solutions proposed so far are based on an iterative Multiple Instance Learning framework in which the current classifier is used to select the highest-confidence boxes in each image, which are treated as pseudo-ground truth in the next training iteration. However, the errors of an immature classifier can make the process drift, usually introducing many false positives into the training dataset. To alleviate this problem, we propose in this paper a training protocol based on the self-paced learning paradigm. The main idea is to iteratively select a subset of images and boxes that are the most reliable, and use them for training. While in the past few years similar strategies have been adopted for SVMs and other classifiers, we are the first to show that a self-paced approach can be used with deep-network-based classifiers in an end-to-end training pipeline. The method we propose is built on the fully-supervised Fast-RCNN architecture and can be applied to similar architectures which represent the input image as a bag of boxes. We show state-of-the-art results on Pascal VOC 2007, Pascal VOC 2010 and ILSVRC 2013. On ILSVRC 2013 our results based on a low-capacity AlexNet network outperform even those weakly-supervised approaches which are based on much higher-capacity networks. |
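The self-paced selection loop can be sketched in a few lines. The following is a deliberately toy version (random scores stand in for a detector, "training" is just an average) meant only to show the control flow: start from everything, then retrain on a growing subset of the currently most reliable images:

```python
import random

random.seed(0)
# Each "image" is a list of candidate-box confidences (toy stand-in for real data).
images = [[random.random() for _ in range(5)] for _ in range(20)]

def train(batch):
    """Toy 'training': the model is the mean of its pseudo-ground-truth scores."""
    top = [max(im) for im in batch]            # highest-confidence box per image
    return sum(top) / len(top)

def reliability(model, im):
    """How consistent an image looks under the current model (toy proxy)."""
    return -abs(max(im) - model)

model = train(images)                          # bootstrap round: use everything once
for it in range(5):
    frac = min(1.0, 0.3 + 0.15 * it)           # grow the "easy" subset each round
    ranked = sorted(images, key=lambda im: reliability(model, im), reverse=True)
    easy = ranked[: max(1, int(frac * len(images)))]
    model = train(easy)                        # retrain on the most reliable images
    print(f"iter {it}: subset={len(easy)} model={model:.3f}")
```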
Comparative effectiveness of early versus conventional timing of dialysis initiation in advanced CKD. | BACKGROUND
Previous observational studies examining outcomes associated with the timing of dialysis therapy initiation in the United States have often been limited by lead time and survivor bias.
STUDY DESIGN
Retrospective cohort study comparing the effectiveness of early versus later (conventional) dialysis therapy initiation in advanced chronic kidney disease (CKD). The analysis used inverse probability weighting to account for an individual's contribution to different exposure groups over time in a pooled logistic regression model. Patients contributed risk to both exposure categories (early and later initiation) until there was a clear treatment strategy (ie, dialysis therapy was initiated early or estimated glomerular filtration rate [eGFR] decreased to <10 mL/min/1.73 m²).
SETTING & PARTICIPANTS
Patients with CKD who had at least one face-to-face outpatient encounter with a Cleveland Clinic health care provider as of January 1, 2005, and at least 3 eGFRs in the range of 20-30 mL/min/1.73 m² measured at least 180 days apart.
PREDICTORS
Timing of dialysis therapy initiation as determined using model-based interpolation of eGFR trajectories over time. Timing was defined as early (interpolated eGFR at dialysis therapy initiation ≥10 mL/min/1.73 m²) or later (eGFR <10 mL/min/1.73 m²) and was time-varying.
OUTCOMES
Death from any cause occurring from the time that eGFR was equal to 20 mL/min/1.73 m² through September 15, 2009.
RESULTS
The study population consisted of 652 patients meeting inclusion criteria. Most (71.3%) of the study population did not initiate dialysis therapy during follow-up. Patients who did not initiate dialysis therapy (n=465) were older, more likely to be white, and had more favorable laboratory profiles than those who started dialysis therapy. Overall, 146 initiated dialysis early and 80 had eGFRs decrease to <10 mL/min/1.73 m². Many participants (n=426) were censored prior to attaining a clear treatment strategy and were considered undeclared. There was no statistically significant survival difference for the early compared with later initiation strategy (OR, 0.85; 95% CI, 0.65-1.11).
LIMITATIONS
Interpolated eGFR, moderate sample size, and likely unmeasured confounders.
CONCLUSIONS
In patients with advanced CKD, timing of dialysis therapy initiation was not associated with mortality when accounting for lead time bias and survivor bias. |
EFFECT OF HIGH-SPEED AND PLYOMETRIC TRAINING FOR 13-YEAR-OLD MALE SOCCER PLAYERS ON ACCELERATION AND AGILITY PERFORMANCE | Acceleration, sprint and agility performance are crucial in sports like soccer. There are few studies regarding the effect of training on agility performance and on sprint distances shorter than 30 meters in youth soccer players. Therefore, the aim of the present study was to examine the effect of a high-intensity sprint and plyometric training program on 13-year-old male soccer players. A training group of 14 adolescent male soccer players, mean age (±SD) 13.5 years (±0.24), followed an eight week intervention program for one hour per week, and a group of 12 adolescent male soccer players of corresponding age, mean age 13.5 years (±0.23), served as a control group. Pre- and post-tests assessed 10-m linear sprint, 20-m linear sprint and agility performance. Results showed a significant improvement in agility performance, pre 8.23 s (±0.34) to post 7.69 s (±0.34) (p<0.01), and a significant improvement in 0-20 m linear sprint, pre 3.54 s (±0.17) to post 3.42 s (±0.18) (p<0.05). In 0-10 m sprint the participants also showed an improvement, pre 2.02 s (±0.11) to post 1.96 s (±0.11); however, this was not significant. The correlation between 10-m sprint and agility was r = 0.53 (p<0.01), and between 20-m linear sprint and agility performance, r = 0.67 (p<0.01). The major finding of the study is the significant improvement in agility performance and in 0-20 m linear sprint in the intervention group. These findings suggest that organizing training sessions with short-burst high-intensity sprint and plyometric exercises interspersed with adequate recovery time may result in improvements in both agility and linear sprint performance in adolescent male soccer players. Another finding is the correlation between linear sprint and agility performance, indicating a difference when compared to adults. |
Scalable Property Aggregation for Linked Data Recommender Systems | Recommender systems are an integral part of today's internet landscape. Recently, the enhancement of recommendation services through Linked Open Data (LOD) has become a new research area. The ever growing amount of structured data on the web can be used as additional background information for recommender systems. But current approaches in Linked Data recommender systems (LDRS) lack an adequate item feature representation in their prediction model and efficient processing of LOD resources. In this paper, we present a scalable Linked Data recommender system that calculates preferences on multiple property dimensions. The system achieves scalability through parallelization of property-specific rating prediction on a MapReduce framework. Separate prediction results are combined through a stacking technique. Evaluation results show increased performance both in terms of accuracy and scalability. |
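Structurally, the pipeline described above amounts to grouping property-specific predictions by (user, item) and then stacking them. A single-machine sketch of that data flow follows (property names, ratings, and weights are all hypothetical; the real system distributes the same steps over a MapReduce cluster):

```python
from collections import defaultdict

# Partial predictions produced independently per LOD property dimension
per_property_predictions = [
    ("director", ("alice", "film42"), 3.8),
    ("genre",    ("alice", "film42"), 4.4),
    ("actor",    ("alice", "film42"), 4.0),
]

# "Map": key each partial prediction by (user, item)
grouped = defaultdict(list)
for prop, key, score in per_property_predictions:
    grouped[key].append((prop, score))

# "Reduce"/stacking: combine property-specific scores with learned weights
weights = {"director": 0.2, "genre": 0.5, "actor": 0.3}  # assumed meta-learner output
for key, parts in grouped.items():
    final = sum(weights[p] * s for p, s in parts)
    print(key, round(final, 2))
```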
Abiraterone acetate plus prednisone versus placebo plus prednisone in chemotherapy-naive men with metastatic castration-resistant prostate cancer (COU-AA-302): final overall survival analysis of a randomised, double-blind, placebo-controlled phase 3 study. | BACKGROUND
Abiraterone acetate plus prednisone significantly improved radiographic progression-free survival compared with placebo plus prednisone in men with chemotherapy-naive castration-resistant prostate cancer at the interim analyses of the COU-AA-302 trial. Here, we present the prespecified final analysis of the trial, assessing the effect of abiraterone acetate plus prednisone on overall survival, time to opiate use, and use of other subsequent therapies.
METHODS
In this placebo-controlled, double-blind, randomised phase 3 study, 1088 asymptomatic or mildly symptomatic patients with chemotherapy-naive prostate cancer stratified by Eastern Cooperative Oncology Group (ECOG) performance status (0 vs 1) were randomly assigned with a permuted block allocation scheme via a web response system in a 1:1 ratio to receive either abiraterone acetate (1000 mg once daily) plus prednisone (5 mg twice daily; abiraterone acetate group) or placebo plus prednisone (placebo group). Coprimary endpoints were radiographic progression-free survival and overall survival analysed in the intention-to-treat population. The study is registered with ClinicalTrials.gov, number NCT00887198.
FINDINGS
At a median follow-up of 49.2 months (IQR 47.0-51.8), 741 (96%) of the prespecified 773 death events for the final analysis had been observed: 354 (65%) of 546 patients in the abiraterone acetate group and 387 (71%) of 542 in the placebo group. 238 (44%) patients initially receiving prednisone alone subsequently received abiraterone acetate plus prednisone as crossover per protocol (93 patients) or as subsequent therapy (145 patients). Overall, 365 (67%) patients in the abiraterone acetate group and 435 (80%) in the placebo group received subsequent treatment with one or more approved agents. Median overall survival was significantly longer in the abiraterone acetate group than in the placebo group (34.7 months [95% CI 32.7-36.8] vs 30.3 months [28.7-33.3]; hazard ratio 0.81 [95% CI 0.70-0.93]; p=0.0033). The most common grade 3-4 adverse events of special interest were cardiac disorders (41 [8%] of 542 patients in the abiraterone acetate group vs 20 [4%] of 540 patients in the placebo group), increased alanine aminotransferase (32 [6%] vs four [<1%]), and hypertension (25 [5%] vs 17 [3%]).
INTERPRETATION
In this randomised phase 3 trial with a median follow-up of more than 4 years, treatment with abiraterone acetate prolonged overall survival compared with prednisone alone by a margin that was both clinically and statistically significant. These results further support the favourable safety profile of abiraterone acetate in patients with chemotherapy-naive metastatic castration-resistant prostate cancer.
FUNDING
Janssen Research & Development. |
Intellectual capital and performance in causal models Evidence from the information technology industry in Taiwan | Purpose – This paper seeks to investigate the impact of intellectual capital elements on business performance, as well as the relationship among intellectual capital elements from a cause-effect perspective. Design/methodology/approach – The partial least squares approach is used to examine the information technology (IT) industry in Taiwan. Findings – Results show that intellectual capital elements directly affect business performance, with the exception of human capital. Human capital indirectly affects performance through the other three elements: innovation capital, process capital, and customer capital. There also exists a cause-effect relationship among four elements of intellectual capital. Human capital affects innovation capital and process capital. Innovation capital affects process capital, which in turn influences customer capital. Finally, customer capital contributes to performance. The cause-effect relationship between leading elements and lagged elements provides implications for the management of firms in the IT industry. Research limitations/implications – The model proposed in this study is applicable to the high-tech IT industry. Modification of the proposed model may be needed in applying this model to other industries. Practical implications – This study helps management identify relevant intellectual capital elements and their indicators to enhance business performance. Originality/value – This paper is a seminal work to propose an integrated cause-effect model to investigate the relationship among elements of intellectual capital for IT in Taiwan. |
Microgrid power electronic converters: State of the art and future challenges | This paper presents a review of the state of the art of power electronic converters used in microgrids. The paper focuses primarily on grid connected converters. Different topologies and control and modulation strategies for these specific converters are critically reviewed. Moreover, future challenges in respect of these converters are identified along with their potential solutions. |
Critical Thinking, Cognitive Presence, and Computer Conferencing in Distance Education | This article describes a practical approach to judging the nature and quality of critical discourse in a computer conference. A model of a critical community of inquiry frames the research. A core concept in defining a community of inquiry is cognitive presence. In turn, the practical inquiry model operationalizes cognitive presence for the purpose of developing a tool to assess critical discourse and reflection. Encouraging empirical findings related to an attempt to create an efficient and reliable instrument to assess the nature and quality of critical discourse and thinking in a text-based educational context are presented. Finally, it is suggested that cognitive presence (i.e., critical, practical inquiry) can be created and supported in a computer conference environment with appropriate teaching and social presence. |
RankExplorer: Visualization of Ranking Changes in Large Time Series Data | For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations. |
Hierarchical Deep Reinforcement Learning: Integrating Temporal Abstraction and Intrinsic Motivation | Learning goal-directed behavior in environments with sparse feedback is a major challenge for reinforcement learning algorithms. One of the key difficulties is insufficient exploration, resulting in an agent being unable to learn robust policies. Intrinsically motivated agents can explore new behavior for their own sake rather than to directly solve external goals. Such intrinsic behaviors could eventually help the agent solve tasks posed by the environment. We present hierarchical-DQN (h-DQN), a framework to integrate hierarchical action-value functions, operating at different temporal scales, with goal-driven intrinsically motivated deep reinforcement learning. A top-level Q-value function learns a policy over intrinsic goals, while a lower-level function learns a policy over atomic actions to satisfy the given goals. h-DQN allows for flexible goal specifications, such as functions over entities and relations. This provides an efficient space for exploration in complicated environments. We demonstrate the strength of our approach on two problems with very sparse and delayed feedback: (1) a complex discrete stochastic decision process with stochastic transitions, and (2) the classic ATARI game 'Montezuma's Revenge'. |
Single Document Automatic Text Summarization using Term Frequency-Inverse Document Frequency (TF-IDF) | The increasing availability of online information has triggered intensive research in the area of automatic text summarization within Natural Language Processing (NLP). Text summarization reduces the text by removing the less useful information, which helps the reader find the required information quickly. There are many kinds of algorithms that can be used to summarize text. One of them is TF-IDF (Term Frequency-Inverse Document Frequency). This research aimed to produce an automatic text summarizer implemented with the TF-IDF algorithm and to compare it with various other online automatic text summarizers. To evaluate the summary produced by each summarizer, the F-measure was used as the standard comparison value. This research achieved 67% accuracy on three data samples, which is higher than the other online summarizers. |
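A compact version of the method is shown below (a sketch of one common single-document variant in which each sentence is treated as a "document" for the IDF statistics; the example text is made up):

```python
import math
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Score each sentence by the mean TF-IDF of its words; keep the top n."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    docs = [re.findall(r'\w+', s.lower()) for s in sentences]
    df = Counter(w for d in docs for w in set(d))      # sentence-level frequency
    n = len(docs)

    def score(words):
        if not words:
            return 0.0
        tf = Counter(words)
        return sum((tf[w] / len(words)) * math.log(n / df[w]) for w in tf) / len(tf)

    ranked = sorted(range(n), key=lambda i: score(docs[i]), reverse=True)[:n_sentences]
    return ' '.join(sentences[i] for i in sorted(ranked))  # keep original order

text = ("Automatic summarization shortens a text. "
        "TF-IDF weighs words that are frequent in a sentence but rare elsewhere. "
        "The highest scoring sentences form the summary.")
print(summarize(text))
```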
Artificial intelligence in medicine: the challenges ahead. | The modern study of artificial intelligence in medicine (AIM) is 25 years old. Throughout this period, the field has attracted many of the best computer scientists, and their work represents a remarkable achievement. However, AIM has not been successful, if success is judged as making an impact on the practice of medicine. Much recent work in AIM has been focused inward, addressing problems that are at the crossroads of the parent disciplines of medicine and artificial intelligence. Now, AIM must move forward with the insights that it has gained and focus on finding solutions for problems at the heart of medical practice. The growing emphasis within medicine on evidence-based practice should provide the right environment for that change. |
A Microsoft-Excel-based tool for running and critically appraising network meta-analyses—an overview and application of NetMetaXL | BACKGROUND
The use of network meta-analysis has increased dramatically in recent years. WinBUGS, a freely available Bayesian software package, has been the most widely used software package to conduct network meta-analyses. However, the learning curve for WinBUGS can be daunting, especially for new users. Furthermore, critical appraisal of network meta-analyses conducted in WinBUGS can be challenging given its limited data manipulation capabilities and the fact that generation of graphical output from network meta-analyses often relies on different software packages than the analyses themselves.
METHODS
We developed a freely available Microsoft-Excel-based tool called NetMetaXL, programmed in Visual Basic for Applications, which provides an interface for conducting a Bayesian network meta-analysis using WinBUGS from within Microsoft Excel. This tool allows the user to easily prepare and enter data, set model assumptions, and run the network meta-analysis, with results being automatically displayed in an Excel spreadsheet. It also contains macros that use NetMetaXL's interface to generate evidence network diagrams, forest plots, league tables of pairwise comparisons, probability plots (rankograms), and inconsistency plots within Microsoft Excel. All figures generated are publication quality, thereby increasing the efficiency of knowledge transfer and manuscript preparation.
RESULTS
We demonstrate the application of NetMetaXL using data from a network meta-analysis published previously which compares combined resynchronization and implantable defibrillator therapy in left ventricular dysfunction. We replicate results from the previous publication while demonstrating result summaries generated by the software.
CONCLUSIONS
Use of the freely available NetMetaXL successfully demonstrated its ability to make running network meta-analyses more accessible to novice WinBUGS users by allowing analyses to be conducted entirely within Microsoft Excel. NetMetaXL also allows for more efficient and transparent critical appraisal of network meta-analyses, enhanced standardization of reporting, and integration with health economic evaluations which are frequently Excel-based. |
Equity in the use of antiretroviral treatment in the public health care system in urban South Africa. | OBJECTIVES
The scaling up of antiretroviral treatment (ART) for HIV-infected adults requires a sizeable investment of resources in the South African public health care system. It is important that these resources are used productively and in ways that reach those in need, irrespective of social status or personal characteristics. In this study we evaluate whether the distribution of ART services in the public system reflects the distribution of need among adults in the urban population.
METHODS
Data from a 2008 national survey were used to estimate the distribution of socioeconomic status (SES) and sex in HIV-positive adults in urban areas. These findings were compared to SES and sex distributions in 635 ART users within 6 urban public ART facilities.
RESULTS
Close to 40% of those with HIV are in the lowest SES quintile, while 67% are women. The distributions in users of ART are similar to these distributions in HIV-positive people.
CONCLUSIONS
Patterns of ART use in study settings correspond to patterns of HIV in the urban population at the national level. This suggests that the South African ART programme is on track to ensure equitable delivery of treatment services in urban settings. |
Transcriptional control of a plant stem cell niche. | Despite the independent evolution of multicellularity in plants and animals, the basic organization of their stem cell niches is remarkably similar. Here, we report the genome-wide regulatory potential of WUSCHEL, the key transcription factor for stem cell maintenance in the shoot apical meristem of the reference plant Arabidopsis thaliana. WUSCHEL acts by directly binding to at least two distinct DNA motifs in more than 100 target promoters and preferentially affects the expression of genes with roles in hormone signaling, metabolism, and development. Striking examples are the direct transcriptional repression of CLAVATA1, which is part of a negative feedback regulation of WUSCHEL, and the immediate regulation of transcriptional repressors of the TOPLESS family, which are involved in auxin signaling. Our results shed light on the complex transcriptional programs required for the maintenance of a dynamic and essential stem cell niche. |
Adaptive traffic signal control system using camera sensor and embedded system | An adaptive traffic signal control system is needed to avoid traffic congestion, which has many disadvantages. This paper presents an adaptive traffic signal control system using a camera as an input sensor that provides real-time traffic data. Principal Component Analysis (PCA) is used to analyze and classify objects in video frames for detecting vehicles. A Distributed Constraint Satisfaction Problem (DCSP) method determines the duration of each traffic signal, based on the counted number of vehicles in each lane. The system is implemented as an embedded system using a BeagleBoard™. |
MaskLab: Instance Segmentation by Refining Object Detection with Semantic and Direction Features | In this work, we tackle the problem of instance segmentation, the task of simultaneously solving object detection and semantic segmentation. Towards this goal, we present a model, called MaskLab, which produces three outputs: box detection, semantic segmentation, and direction prediction. Building on top of the Faster-RCNN object detector, the predicted boxes provide accurate localization of object instances. Within each region of interest, MaskLab performs foreground/background segmentation by combining semantic and direction prediction. Semantic segmentation assists the model in distinguishing between objects of different semantic classes including background, while the direction prediction, estimating each pixel's direction towards its corresponding center, allows separating instances of the same semantic class. Moreover, we explore the effect of incorporating recent successful methods from both segmentation and detection (e.g., atrous convolution and hypercolumn). Our proposed model is evaluated on the COCO instance segmentation benchmark and shows comparable performance with other state-of-the-art models. |
Versatile low power media access for wireless sensor networks | We propose <i>B-MAC</i>, a carrier sense media access protocol for wireless sensor networks that provides a flexible interface to obtain ultra low power operation, effective collision avoidance, and high channel utilization. To achieve low power operation, <i>B-MAC</i> employs an adaptive preamble sampling scheme to reduce duty cycle and minimize idle listening. <i>B-MAC</i> supports on-the-fly reconfiguration and provides bidirectional interfaces for system services to optimize performance, whether it be for throughput, latency, or power conservation. We build an analytical model of a class of sensor network applications. We use the model to show the effect of changing <i>B-MAC</i>'s parameters and predict the behavior of sensor network applications. By comparing <i>B-MAC</i> to conventional 802.11-inspired protocols, specifically SMAC, we develop an experimental characterization of <i>B-MAC</i> over a wide range of network conditions. We show that <i>B-MAC</i>'s flexibility results in better packet delivery rates, throughput, latency, and energy consumption than S-MAC. By deploying a real world monitoring application with multihop networking, we validate our protocol design and model. Our results illustrate the need for flexible protocols to effectively realize energy efficient sensor network applications. |
Facial Expressions Tracking and Recognition: Database Protocols for Systems Validation and Evaluation | Each human face is unique. It has its own shape, topology, and distinguishing features. As such, developing and testing facial tracking systems are challenging tasks. The existing face recognition and tracking algorithms in Computer Vision mainly specify concrete situations according to particular goals and applications, requiring validation methodologies with data that fits their purposes. However, a database that covers all possible variations of external and internal factors does not exist, increasing researchers' work in acquiring their own data or compiling groups of databases.
To address this shortcoming, we propose a methodology for facial data acquisition through the definition of fundamental variables, such as subject characteristics, acquisition hardware, and performance parameters. Following this methodology, we also propose two protocols that allow the capturing of facial behaviors under uncontrolled and real-life situations. As validation, we executed both protocols, which led to the creation of two sample databases: FdMiee (Facial database with Multi input, expressions, and environments) and FACIA (Facial Multimodal database driven by emotional induced acting).
Using different types of hardware, FdMiee captures facial information under environmental and facial behavior variations. FACIA is an extension of FdMiee, introducing a pipeline to acquire additional facial behaviors and speech using an emotion-acting method. Therefore, this work eases the creation of adaptable databases according to an algorithm's requirements and applications, leading to simplified validation and testing processes. |
A Bound on the Error of Cross Validation Using the Approximation and Estimation Rates, with Consequences for the Training-Test Split | We give a theoretical and experimental analysis of the generalization error of cross validation using two natural measures of the problem under consideration. The approximation rate measures the accuracy to which the target function can be ideally approximated as a function of the number of parameters, and thus captures the complexity of the target function with respect to the hypothesis model. The estimation rate measures the deviation between the training and generalization errors as a function of the number of parameters, and thus captures the extent to which the hypothesis model suffers from overfitting. Using these two measures, we give a rigorous and general bound on the error of the simplest form of cross validation. The bound clearly shows the dangers of making the fraction of data saved for testing, γ, too large or too small. By optimizing the bound with respect to γ, we then argue that the following qualitative properties of cross-validation behavior should be quite robust to significant changes in the underlying model selection problem: When the target function complexity is small compared to the sample size, the performance of cross validation is relatively insensitive to the choice of γ. The importance of choosing γ optimally increases, and the optimal value for γ decreases, as the target function becomes more complex relative to the sample size. There is nevertheless a single fixed value for γ that works nearly optimally for a wide range of target function complexity. |
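The qualitative behavior the bound predicts is easy to reproduce numerically. Below is a small Monte-Carlo sketch (my own illustration, not the paper's analysis; the sine target, noise level, and polynomial hypothesis classes are arbitrary choices): select a polynomial degree on a held-out fraction γ, then measure the chosen model on fresh data:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(2 * np.pi * x)

def run(gamma, m=60, degrees=range(8), trials=300):
    """Pick a degree on a held-out fraction gamma of m samples; return the
    selected model's error on fresh data, averaged over trials."""
    out = []
    for _ in range(trials):
        x = rng.uniform(-1, 1, m)
        y = true_f(x) + rng.normal(0, 0.3, m)
        k = int(m * (1 - gamma))                      # training split
        fits = [np.polyfit(x[:k], y[:k], d) for d in degrees]
        test_err = [np.mean((np.polyval(c, x[k:]) - y[k:]) ** 2) for c in fits]
        best = fits[int(np.argmin(test_err))]
        xf = rng.uniform(-1, 1, 2000)                 # fresh data
        out.append(np.mean((np.polyval(best, xf) - true_f(xf)) ** 2))
    return float(np.mean(out))

for g in (0.05, 0.1, 0.2, 0.4, 0.6, 0.8):
    print(f"gamma={g:4.2f}  generalization error {run(g):.4f}")
```

Very small γ leaves the selection step with a noisy test estimate; very large γ starves training; intermediate values do best, matching the bound's qualitative message.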
ONTOLOGY ALIGNMENT USING MACHINE LEARNING TECHNIQUES | In the semantic web, ontology plays an important role in providing formal definitions of concepts and relationships. Therefore, aligning similar ontologies becomes essential to provide ontology interpretability and extendibility. It is inevitable to have similar but not identical ontologies in a particular domain, since there might be several definitions for a given concept. This paper presents a method to combine similarity measures of different categories, without ontology instances or any user feedback, for aligning two given ontologies. To align different ontologies efficiently, K Nearest Neighbor (KNN), Support Vector Machine (SVM), Decision Tree (DT) and AdaBoost classifiers are investigated. Each classifier is optimized based on lower cost and better classification rate. Experimental results demonstrate that the F-measure criterion improves to 99% using feature selection and a combination of AdaBoost and DT classifiers, which is highly comparable with, and outperforms, previously reported F-measures. |
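The classification step can be sketched with scikit-learn (version 1.2 or later for the `estimator` keyword). Every number below is hypothetical: each candidate concept pair is described by similarity scores from different categories, and the boosted decision trees label it match or no-match:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

# Feature columns: [string-based sim, linguistic (synonym) sim, structural sim]
X = np.array([
    [0.92, 1.00, 0.80],   # e.g. "Author" vs "Writer"     -> match
    [0.88, 0.90, 0.75],   # e.g. "Paper"  vs "Article"    -> match
    [0.15, 0.05, 0.30],   # e.g. "Author" vs "Conference" -> no match
    [0.20, 0.10, 0.25],   # e.g. "Venue"  vs "Abstract"   -> no match
])
y = np.array([1, 1, 0, 0])

# AdaBoost over shallow decision trees, the best-performing combination reported
clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=2),
                         n_estimators=50)
clf.fit(X, y)
print(clf.predict([[0.85, 0.95, 0.70]]))   # unseen pair -> likely a match
```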
Providing palliative care for cancer patients: the views and exposure of community general practitioners and district nurses in Japan. | CONTEXT
The role of general practitioners (GPs) and district nurses (DNs) is increasingly important to achieve dying at home.
OBJECTIVES
The primary aim of this region-based representative study was to clarify 1) clinical exposure of GPs and DNs to cancer patients dying at home, 2) availability of symptom control procedures, 3) willingness to participate in out-of-hours cooperation and palliative care consultation services, and 4) reasons for hospital admission of terminally ill cancer patients.
METHODS
Questionnaires were sent to 1106 GP clinics and 70 district nursing services in four areas across Japan.
RESULTS
Two hundred thirty-five GPs and 56 district nursing services responded. In total, 53% of GPs reported that they saw no cancer patients dying at home per year, and 40% had one to 10 such patients. In contrast, 31% of district nursing services cared for more than 10 cancer patients dying at home per year, and 59% had one to 10 such patients. Oral opioids, subcutaneous opioids, and subcutaneous haloperidol were available in more than 90% of district nursing services, whereas 35% of GPs reported that oral opioids were unavailable and 50% reported that subcutaneous opioids or haloperidol were unavailable. Sixty-seven percent of GPs and 93% of district nursing services were willing to use palliative care consultation services. Frequent reasons for admission were family burden of caregiving, unexpected change in physical condition, uncontrolled physical symptoms, and delirium.
CONCLUSION
Japanese GPs have little experience in caring for cancer patients dying at home, whereas DNs have more experience. To achieve quality palliative care programs for cancer patients at the regional level, educating GPs about opioids and psychiatric medications, easily available palliative care consultation services, systems to support home care technology, and coordinated systems to alleviate family burden are of importance.
The Scythian Animal Style | FROM the Ordos country in Inner Mongolia, between the Great Wall of China on the south and the Huang-ho River on the north, and from the new Chinese province of Suiyuan, Dr. Herbert Mueller, recently associated with the Berlin Royal Museums, has collected Scythian metal objects of much interest. After more study these will perhaps throw new light on the later development of the animal style. Together with the Sung-lin Collection, they are now on view for the first time in America, at the Herbert J. Devine Gallery.
Cognitive behavioral therapy for treatment of chronic primary insomnia: a randomized controlled trial. | CONTEXT
Use of nonpharmacological behavioral therapy has been suggested for treatment of chronic primary insomnia, but well-blinded, placebo-controlled trials demonstrating effective behavioral therapy for sleep-maintenance insomnia are lacking.
OBJECTIVE
To test the efficacy of a hybrid cognitive behavioral therapy (CBT) compared with both a first-generation behavioral treatment and a placebo therapy for treating primary sleep-maintenance insomnia.
DESIGN AND SETTING
Randomized, double-blind, placebo-controlled clinical trial conducted at a single academic medical center, with recruitment from January 1995 to July 1997.
PATIENTS
Seventy-five adults (n = 35 women; mean age, 55.3 years) with chronic primary sleep-maintenance insomnia (mean duration of symptoms, 13.6 years).
INTERVENTIONS
Patients were randomly assigned to receive CBT (sleep education, stimulus control, and time-in-bed restrictions; n = 25), progressive muscle relaxation training (RT; n = 25), or a quasi-desensitization (placebo) treatment (n = 25). Outpatient treatment lasted 6 weeks, with follow-up conducted at 6 months.
MAIN OUTCOME MEASURES
Objective (polysomnography) and subjective (sleep log) measures of total sleep time, middle and terminal wake time after sleep onset (WASO), and sleep efficiency; questionnaire measures of global insomnia symptoms, sleep-related self-efficacy, and mood.
RESULTS
Cognitive behavioral therapy produced larger improvements across the majority of outcome measures than did RT or placebo treatment. For example, sleep logs showed that CBT-treated patients achieved an average 54% reduction in their WASO whereas RT-treated and placebo-treated patients, respectively, achieved only 16% and 12% reductions in this measure. Recipients of CBT also showed a greater normalization of sleep and subjective symptoms than did the other groups with an average sleep time of more than 6 hours, middle WASO of 26.6 minutes, and sleep efficiency of 85.1%. In contrast, RT-treated patients continued to report a middle WASO of 43.3 minutes and sleep efficiency of 78.8%.
CONCLUSIONS
Our results suggest that CBT represents a viable intervention for primary sleep-maintenance insomnia. This treatment leads to clinically significant sleep improvements within 6 weeks and these improvements appear to endure through 6 months of follow-up. |
Effect of treatment environment on modified constraint-induced movement therapy results in children with spastic hemiplegic cerebral palsy: a randomized controlled trial. | PURPOSE
To determine the effects of treatment environment (home and clinic) on results of modified constraint-induced movement therapy (modified CIMT) in children with spastic hemiplegic cerebral palsy.
METHOD
In a single-blinded, randomized, controlled trial, 14 children with spastic hemiplegic cerebral palsy (5 females, 9 males; mean age: 74 months) received 15 hours of modified CIMT, delivered three times/week in 10 sessions held every other day, in two randomly assigned groups. Each session lasted one and a half hours. The treatment environment for the intervention group (n = 7) was the home and for the control group (n = 7) was the clinic. Measures were conducted pre-treatment, post-treatment and 3 months after the treatment period using the Pediatric Motor Activity Log and subtests 5 (upper limb coordination) and 8 (upper limb speed and dexterity) of the Bruininks-Oseretsky test of motor proficiency. Sample randomization and data analysis by analysis of variance with repeated measures were conducted with SPSS-16 software, with the α level set at p < 0.05.
RESULTS
All subjects showed significant improvement (p < 0.01) on post-test measures except subtest 5 of the Bruininks-Oseretsky test of motor proficiency. In contrast to the clinic group, subjects in the home group showed significantly continued improvement at the follow-up session in all measures.
CONCLUSIONS
Modified CIMT is effective in improving upper limb function in children with spastic hemiplegic cerebral palsy. In addition, the greater improvement in the home group suggests that practice in a natural context is the preferred approach for treating these children.
Multivalued logics: a uniform approach to reasoning in artificial intelligence | This paper describes a uniform formalization of much of the current work in artificial intelligence on inference systems. We show that many of these systems, including first-order theorem provers, assumption-based truth maintenance systems (ATMSs), and unimplemented formal systems such as default logic or circumscription, can be subsumed under a single general framework. We begin by defining this framework, which is based on a mathematical structure known as a bilattice. We present a formal definition of inference using this structure and show that this definition generalizes work involving ATMSs and some simple nonmonotonic logics. Following the theoretical description, we describe a constructive approach to inference in this setting; the resulting generalization of both conventional inference and ATMSs is achieved without incurring any substantial computational overhead. We show that our approach can also be used to implement a default reasoner, and discuss a combination of default and ATMS methods that enables us to formally describe an “incremental” default reasoning system. This incremental system does not need to perform consistency checks before drawing tentative conclusions, but can instead adjust its beliefs when a default premise or conclusion is overturned in the face of convincing contradictory evidence. The system is therefore much more computationally viable than earlier approaches. Finally, we discuss the implementation of our ideas. We begin by considering general issues that need to be addressed when implementing a multivalued approach such as that we are proposing, and then turn to specific examples showing the results of an existing implementation. This single implementation is used to solve a digital simulation task using first-order logic, a diagnostic task using ATMSs as suggested by de Kleer and Williams, a problem in default reasoning as in Reiter’s default logic or McCarthy’s circumscription, and to solve the same problem more efficiently by combining default methods with justification information. All of these applications use the same general-purpose bilattice theorem prover and differ only in the choice of bilattice being considered.
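To make the central structure concrete, here is a toy sketch (not the paper's implementation) of the smallest interesting bilattice, Belnap's FOUR, with its two orderings: a truth ordering used for logical connectives and a knowledge ordering used to merge evidence from multiple sources:

```python
# Belnap's four-valued bilattice FOUR: a value carries independent evidence
# "for" and "against" a proposition, encoded as (t_for, t_against) in {0,1}^2.
NONE, TRUE, FALSE, BOTH = (0, 0), (1, 0), (0, 1), (1, 1)

def join_k(a, b):
    """Least upper bound in the knowledge ordering: accept both sources."""
    return (max(a[0], b[0]), max(a[1], b[1]))

def meet_k(a, b):
    """Greatest lower bound in the knowledge ordering: keep the consensus."""
    return (min(a[0], b[0]), min(a[1], b[1]))

def and_t(a, b):
    """Conjunction in the truth ordering: evidence for requires both
    conjuncts; evidence against requires either."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def neg(a):
    """Negation swaps evidence for and against; knowledge is preserved."""
    return (a[1], a[0])

# Two sources disagree: combining them in the knowledge order yields BOTH,
# the "contradiction" value an ATMS-style reasoner can later retract.
print(join_k(TRUE, FALSE) == BOTH, and_t(TRUE, NONE) == NONE, neg(BOTH) == BOTH)
```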
Negative eigenvalues of the Hessian in deep neural networks | The loss function of deep networks is known to be non-convex but the precise nature of this nonconvexity is still an active area of research. In this work, we study the loss landscape of deep networks through the eigendecompositions of their Hessian matrix. In particular, we examine how important the negative eigenvalues are and the benefits one can observe in handling them appropriately. |
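A minimal numerical illustration of the object under study: for a tiny one-hidden-unit "network" (a hypothetical loss, not taken from the paper), the eigendecomposition of a finite-difference Hessian already exhibits a negative eigenvalue, i.e., a direction of negative curvature:

```python
import numpy as np

def loss(w):
    """A tiny non-convex 'network' loss: one hidden unit, tanh activation."""
    x, y = 1.5, 0.3                      # a single training example
    pred = w[1] * np.tanh(w[0] * x)
    return (pred - y) ** 2

def hessian(f, w, eps=1e-4):
    """Central finite-difference Hessian of f at w."""
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.eye(n)[i] * eps, np.eye(n)[j] * eps
            H[i, j] = (f(w + e_i + e_j) - f(w + e_i - e_j)
                       - f(w - e_i + e_j) + f(w - e_i - e_j)) / (4 * eps**2)
    return H

w = np.array([0.4, -1.2])
eigvals = np.linalg.eigvalsh(hessian(loss, w))
print("Hessian eigenvalues:", eigvals)   # a negative value flags non-convexity
```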
SVM-Prot: web-based support vector machine software for functional classification of a protein from its primary sequence | Prediction of protein function is of significance in studying biological processes. One approach for function prediction is to classify a protein into a functional family. Support vector machine (SVM) is a useful method for such classification, which may involve proteins with diverse sequence distribution. We have developed web-based software, SVMProt, for SVM classification of a protein into functional family from its primary sequence. The SVMProt classification system is trained from representative proteins of a number of functional families and seed proteins of Pfam curated protein families. It currently covers 54 functional families and additional families will be added in the near future. The computed accuracy for protein family classification is found to be in the range of 69.1-99.6%. SVMProt shows a certain degree of capability for the classification of distantly related proteins and homologous proteins of different function and thus may be used as a protein function prediction tool that complements sequence alignment methods. SVMProt can be accessed at http://jing.cz3.nus.edu.sg/cgi-bin/svmprot.cgi.
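The core idea, mapping a primary sequence to fixed-length features and classifying with an SVM, can be sketched in a few lines. The sequences and families below are invented for illustration; SVMProt itself uses richer descriptors than plain amino acid composition:

```python
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def composition(seq: str) -> np.ndarray:
    """Amino acid composition: fraction of each residue in the sequence."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AA])

# Hypothetical training sequences for two functional families.
family_a = ["MKKLLAAGG", "MKRLLAVGGA", "MKKILAGGG"]
family_b = ["MEEDDSSTT", "MEDDSSPTTY", "MEEDSSSTT"]
X = np.array([composition(s) for s in family_a + family_b])
y = np.array([1] * len(family_a) + [0] * len(family_b))

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([composition("MKKLLAVGG")]))  # expected: family 1
```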
Security Model for Hierarchical Clustered Wireless Sensor Networks | The proposed security system for the Wireless Sensor Network (WSN) is based on the WSN security design goal that ‘to design a completely secure WSN, security must be integrated into every node of the system’. This paper discusses on two main components of the security framework viz. the secure key management module and the secure routing scheme. The incorporation of security mechanism during the routing protocol design phase is the main focus of this paper. The proposed security framework viz. ‘Secure and Hierarchical, a Routing Protocol’ (SHARP) is designed for the wireless sensor network applications which is deployed particularly for data collection purpose in a battlefield where the security aspect of the network cannot be compromised at any cost. SHARP consists of three basic integrated modules and each module performs a well defined task to make the whole security framework a complete system on its own. |
Modeling using K-means clustering algorithm | Modeling is an abstract representation of a real-world process. Predicting likely behavior from observed behavior is legitimate only if a relationship can be found in the data. Two common data mining techniques for finding hidden patterns in data are clustering and classification analyses. Classification is a form of supervised learning, whereas clustering is unsupervised classification with no predefined classes. Clustering tries to group a set of objects and find whether there is some relationship between those objects. In this paper we use the numerical results generated through the Probability Density Function algorithm as the basis of recommendations in favor of K-means clustering for weather-related predictions. We propose a model for predicting the probability of the outcome of the Play class as YES or NO through K-means clustering on weather data. The main reason for our choice of the K-means clustering algorithm is that it is robust.
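A compact sketch of the proposed use: cluster historical weather records with K-means, then estimate the probability of Play = YES within the cluster a new day falls into. The records and attributes are invented for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical weather records: [temperature C, humidity %, wind km/h],
# with the historical Play outcome (1 = YES, 0 = NO) for each record.
X = np.array([[28, 85, 10], [27, 90, 12], [21, 70, 5], [20, 65, 4],
              [30, 95, 20], [19, 60, 6], [22, 72, 7], [29, 92, 18]])
play = np.array([0, 0, 1, 1, 0, 1, 1, 0])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Estimate P(Play = YES) per cluster from the historical labels, then
# predict for a new day by looking up the cluster it falls into.
p_yes = {c: play[km.labels_ == c].mean() for c in range(2)}
new_day = np.array([[23, 68, 6]])
cluster = km.predict(new_day)[0]
print(f"cluster {cluster}: P(play=YES) ~ {p_yes[cluster]:.2f}")
```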
Biomimetic walking robot SCORPION: Control and modeling | We present the biomimetic control scheme for the walking robot SCORPION. We used a concept of Basic Motion Patterns, which can be combined in a very flexible manner. In addition our modeling and simulation approach is described, which has been done based on the ADAMS(TM) simulator. Especially the motion patterns of real scorpions were analyzed and used for walking patterns and acceleration of the robot. |
Coherent noise for non-photorealistic rendering | A wide variety of non-photorealistic rendering techniques make use of random variation in the placement or appearance of primitives. In order to avoid the "shower-door" effect, this random variation should move with the objects in the scene. Here we present coherent noise tailored to this purpose. We compute the coherent noise with a specialized filter that uses the depth and velocity fields of a source sequence. The computation is fast and suitable for interactive applications like games. |
Self-Efficacy Among Young Men Who have Sex with Men: An Exploratory Analysis of HIV/AIDS Risk Behaviors Across Partner Types | HIV infection continues to rise among young men who have sex with men (YMSM). We explored whether unprotected receptive anal intercourse (URAI) occasions and partners, respectively, were associated with YMSM’s (N = 194; ages 18–24) self-efficacy for safe sex with regular and casual partners. We created four self-efficacy typologies: high self-efficacy with both partner types [HRHC; N = 73 (41.7%)], high self-efficacy with regular partners but low with casual partners [HRLC; N = 24 (13.7%)], low self-efficacy with regular partners but high with casual partners [LRHC; N = 21 (12.0%)], and low with both partner types [LRLC; N = 57 (32.6%)]. YMSM in the LRHC category reported fewer URAI occasions, whereas those in the HRLC group reported more URAI partners and occasions, respectively. YMSM having serodiscordant partners were more likely to report more URAI partners, and to be represented in the LRLC category. These findings underscore the importance of addressing differential self-efficacy across partner types, and highlight an urgent need to enhance YMSM’s self-efficacy with casual partners.
An Approach to Identify Duplicated Web Pages | A relevant consequence of the unceasing expansion of the Web and e-commerce is the growth of the demand of new Web sites and Web applications. As a result, Web sites and applications are usually developed without a formalized process, but Web pages are directly coded in an incremental way, where new pages are obtained by duplicating existing ones. Duplicated Web pages, having the same structure and differing only in the data they include, can be considered as clones. The identification of clones may reduce the effort devoted to test, maintain and evolve Web sites and applications. Moreover, clone detection among different Web sites aims to detect cases of possible plagiarism. In this paper we propose an approach, based on similarity metrics, to detect duplicated pages in Web sites and applications implemented with the HTML language and ASP technology. The proposed approach has been assessed by analyzing several Web sites and Web applications. The obtained results are reported in the paper with respect to some case studies.
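One simple instance of such a similarity metric (a sketch, not the paper's metric) compares the tag structure of two pages while ignoring the data they include; clones then score close to 1:

```python
from difflib import SequenceMatcher
from html.parser import HTMLParser

class TagExtractor(HTMLParser):
    """Reduce a page to its tag sequence, ignoring the data it includes."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def structure(html: str) -> list:
    p = TagExtractor()
    p.feed(html)
    return p.tags

def similarity(page_a: str, page_b: str) -> float:
    """Structural similarity in [0, 1]; near 1 suggests cloned pages."""
    return SequenceMatcher(None, structure(page_a), structure(page_b)).ratio()

a = "<html><body><h1>Item 1</h1><table><tr><td>10$</td></tr></table></body></html>"
b = "<html><body><h1>Item 2</h1><table><tr><td>12$</td></tr></table></body></html>"
print(similarity(a, b))  # same structure, different data -> 1.0
```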
Switched state-space model for a switched-capacitor power amplifier | This paper presents a switched state-space modeling approach for a switched-capacitor power amplifier. In contrast to state-of-the-art behavioral models for nonlinear devices like power amplifiers, the state-space representation allows a straightforward inclusion of the nonidealities of the applied input sources. Hence, adding noise on a power supply or phase distortions on the carrier signal does not require a redesign of the mathematical model. The derived state-space model (SSM), which can be efficiently implemented in any numerical simulation tool, allows a significant reduction of the required simulation run-time (14x speedup factor) with respect to standard Cadence Spectre simulations. The model has been implemented in MATLAB/Simulink and its results have been verified by comparison with Cadence Spectre simulations.
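The general pattern is easy to sketch outside MATLAB as well: keep one (A, B) realization per switch phase and let the switch clock select which one integrates the state. The matrices and supply-ripple input below are illustrative, not the amplifier model from the paper:

```python
import numpy as np

# Two linear state-space realizations, one per switch phase.
A = [np.array([[-1.0, 0.5], [0.0, -2.0]]),
     np.array([[-3.0, 0.0], [1.0, -0.5]])]
B = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
C = np.array([1.0, 1.0])

def simulate(u, f_sw=10.0, dt=1e-3, t_end=2.0):
    """Forward-Euler simulation: the active model follows the switch clock,
    and supply noise enters simply through the input u(t)."""
    x = np.zeros(2)
    ts, ys = np.arange(0, t_end, dt), []
    for t in ts:
        k = int(t * f_sw * 2) % 2           # which switch phase is active
        x = x + dt * (A[k] @ x + B[k] * u(t))
        ys.append(C @ x)
    return ts, np.array(ys)

# A supply with ripple needs no model redesign, only a different u(t).
u = lambda t: 1.0 + 0.05 * np.sin(50 * t)
ts, ys = simulate(u)
print(ys[-5:])
```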
NEXMark – A Benchmark for Queries over Data Streams DRAFT | A lot of research has focused recently on executing queries over data streams. This recent attention is due to the large number of data streams available, and the desire to get answers to queries on these streams in real time. There are many sources of data streams: environmental sensor networks, network routing traces, financial transactions and cell phone call records. Many systems are currently under development to execute queries over data streams [BW01, CCC02, MF02, NDM00, SH98]. Further, many ideas have been presented to improve the execution of queries over data streams [ABB02, MSHR02, TMSF02]. It is important to be able to measure the effectiveness of this work. To this end, we present the Niagara Extension to XMark benchmark (NEXMark). This is a work in progress. We are circulating it now for feedback. We have three goals: To define a benchmark, to provide stream generators, and to define metrics for measuring queries over continuous data streams. The XMark benchmark [SWK01] is designed to measure the performance of XML repositories. The benchmark provides a data generator that models the state of an auction in XML format, and various queries over the generated data. An abbreviated schema for XMark is shown in Figure 1. |
Rare breast cancer subtypes: histological, molecular, and clinical peculiarities. | Breast cancer encompasses a collection of different diseases characterized by different biological and pathological features, clinical presentation, response to treatments, clinical behavior, and outcome. On the basis of cell morphology, growth, and architecture patterns, breast cancer can be classified in up to 21 distinct histological types. Breast cancer special types, including the classic lobular invasive carcinoma, represent 25% of all breast cancers. The histological diversity of breast carcinomas has relevant prognostic implications. Indeed, the rare breast cancer group includes subtypes with very different prognoses, ranging from the tubular carcinoma, associated with an indolent clinical course, to metaplastic cancer, whose outcome is generally unfavorable. New approaches based on gene expression profiling allow the identification of molecularly defined breast cancer classes, with distinct biological features and clinical behavior. In clinical practice, immunohistochemical classification based on the expression of human epidermal growth factor receptor 2 and Ki67 is applied as a surrogate of the intrinsic molecular subtypes. However, the identification of intrinsic molecular subtypes was almost completely limited to the study of ductal invasive breast cancer. Moreover, some good-prognosis triple-negative histotypes, on the basis of gene expression profiling, can be classified among the poor-prognosis group. Therefore, histopathological classification remains a crucial component of breast cancer diagnosis. Special histologies can be very rare, and the majority of information on outcome and treatments derives from small series and case reports. As a consequence, clear recommendations about clinical management are still lacking. In this review, we summarize current knowledge about rare breast cancer histologies.
The Human Element in Autonomous Vehicles | Autonomous vehicle research has been prevalent for well over a decade, but only recently has there been a small amount of research conducted on the human interaction that occurs in autonomous vehicles. Although functional software and sensor technology are essential for safe operation, which has been the main focus of autonomous vehicle research, handling all elements of human interaction is also a very salient aspect of their success. This paper will provide an overview of the importance of human vehicle interaction in autonomous vehicles, while considering relevant related factors that are likely to impact adoption. Particular attention will be given to prior research conducted on germane areas relating to control in the automobile, in addition to the different elements that are expected to affect the likelihood of success for these vehicles initially developed for human operation. This paper will also include a discussion of the limited research conducted to consider interactions with humans and the current state of published functioning software and sensor technology.
Moving Object Detection using Background Subtraction, Shadow Removal and Post Processing | In many vision-based applications, identifying moving objects is an important and critical task. For many computer vision applications, background subtraction is a fast way to detect moving objects. Background subtraction separates the foreground from the background. However, background subtraction alone is unable to remove shadows from the foreground. Moving cast shadows associated with moving objects are also detected, which makes this a challenge for video surveillance. The shadow makes it difficult to detect the exact shape of an object and to recognize the object.
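As an illustration of the pipeline the title describes, the sketch below (OpenCV assumed; the video filename is hypothetical) combines MOG2 background subtraction, which labels shadow pixels separately so they can be thresholded away, with a morphological post-processing step:

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")          # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # 255 = foreground, 127 = shadow
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)  # drop shadows
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)       # post-process
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:           # keep plausible object sizes
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("moving objects", frame)
    if cv2.waitKey(30) == 27:                  # Esc to quit
        break
cap.release()
```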
Evaluation of eye gaze interaction | Eye gaze interaction can provide a convenient and natural addition to user-computer dialogues. We have previously reported on our interaction techniques using eye gaze [10]. While our techniques seemed useful in demonstration, we now investigate their strengths and weaknesses in a controlled setting. In this paper, we present two experiments that compare an interaction technique we developed for object selection based on where a person is looking with the most commonly used selection method using a mouse. We find that our eye gaze interaction technique is faster than selection with a mouse. The results show that our algorithm, which makes use of knowledge about how the eyes behave, preserves the natural quickness of the eye. Eye gaze interaction is a reasonable addition to computer interaction and is convenient in situations where it is important to use the hands for other tasks. It is particularly beneficial for the larger screen workspaces and virtual environments of the future, and it will become increasingly practical as eye tracker technology matures.
Multicamera fusion for shape estimation and visibility analysis of unknown deforming objects | A method is proposed for fused three-dimensional (3-D) shape estimation and visibility analysis of an unknown, markerless, deforming object through a multicamera vision system. Complete shape estimation is defined herein as the process of 3-D reconstruction of a model through fusion of stereo triangulation data and a visual hull. The differing accuracies of both methods depend on the number and placement of the cameras. Stereo triangulation yields a high-density, high-accuracy reconstruction of a surface patch from a small surface area, while a visual hull yields a complete, low-detail volumetric approximation of the object. The resultant complete 3-D model is, then, temporally projected based on the tracked object’s deformation, yielding a robust deformed shape prediction. Visibility and uncertainty analyses, on the projected model, estimate the expected accuracy of reconstruction at the next sampling instant. In contrast to common techniques that rely on a priori known models and identities of static objects, our method is distinct in its direct application to unknown, markerless, deforming objects, where the object model and identity are unknown to the system. Extensive simulations and comparisons, some of which are presented herein, thoroughly demonstrate the proposed method and its benefits over individual reconstruction techniques.
TrBagg: A Simple Transfer Learning Method and its Application to Personalization in Collaborative Tagging | The aim of transfer learning is to improve prediction accuracy on a target task by exploiting the training examples for tasks that are related to the target one. Transfer learning has received more attention in recent years, because this technique is considered to be helpful in reducing the cost of labeling. In this paper, we propose a very simple approach to transfer learning: TrBagg, an extension of bagging. TrBagg is composed of two stages: many weak classifiers are first generated as in standard bagging, and these classifiers are then filtered based on their usefulness for the target task. This simplicity allows it to work reasonably well without extensive tuning of learning parameters. Further, our algorithm incorporates a scheme to avoid negative transfer. We applied TrBagg to personalized tag prediction tasks for social bookmarks. Our approach has several convenient characteristics for this task, such as adaptation to multiple tasks at low computational cost.
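A minimal sketch of the two-stage idea (my reading of the abstract, not the authors' code; the filtering rule here is a simple stand-in for their target-error criterion):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

def trbagg_fit(X_src, y_src, X_tgt, y_tgt, n_estimators=100):
    """Stage 1: bag many weak classifiers over source + target data.
    Stage 2: keep only those that are useful on the target task,
    which mitigates negative transfer."""
    X_all = np.vstack([X_src, X_tgt])
    y_all = np.concatenate([y_src, y_tgt])

    pool = []
    for i in range(n_estimators):
        Xb, yb = resample(X_all, y_all, random_state=i)
        pool.append(DecisionTreeClassifier(max_depth=3).fit(Xb, yb))

    # Filter: a classifier must do at least as well on the target data as a
    # target-only baseline; otherwise fall back to the baseline alone.
    baseline = DecisionTreeClassifier(max_depth=3).fit(X_tgt, y_tgt)
    floor = baseline.score(X_tgt, y_tgt)
    kept = [c for c in pool if c.score(X_tgt, y_tgt) >= floor]
    return kept or [baseline]

def trbagg_predict(classifiers, X):
    """Unweighted vote over the kept classifiers (binary 0/1 labels)."""
    votes = np.mean([c.predict(X) for c in classifiers], axis=0)
    return (votes >= 0.5).astype(int)

# Toy demo: a slightly shifted source concept, scarce target labels.
rng = np.random.default_rng(1)
X_src = rng.normal(0, 1, (200, 2)); y_src = (X_src[:, 0] > 0.3).astype(int)
X_tgt = rng.normal(0, 1, (30, 2));  y_tgt = (X_tgt[:, 0] > 0).astype(int)
model = trbagg_fit(X_src, y_src, X_tgt, y_tgt)
print(len(model), "classifiers kept")
```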
K-Ar illite age constraints on the Proterozoic formation and reactivation history of a brittle fault in Fennoscandia | K-Ar ages of authigenic illite from two drill-core gouge samples of a fault in the Palaeoproterozoic basement of Finland record two distinct faulting events. The older sample yields apparent ages from 1240 ± 26 to 1006 ± 21 Ma for four grain size fractions between 6 and <0.1 μm. The second sample is structurally younger and yields statistically distinct ages ranging from 978 ± 20 to 886 ± 18 Ma. We interpret the ages of the <0.1 μm fractions, which are the youngest, as representing the actual time of faulting. XRD analysis and age modelling exclude significant age contamination of the finest dated fractions with inherited host rock components. These results provide therefore an example of meaningful isotopic dating of illite-type clay material formed during Precambrian faulting, demonstrate and constrain fault reactivation and give evidence for brittle Sveconorwegian Mesoproterozoic shortening and Neoproterozoic extension in Fennoscandia.
A case for dynamic frequency tuning in on-chip networks | Performance and power are the first order design metrics for Network-on-Chips (NoCs) that have become the de-facto standard in providing scalable communication backbones for multicores/CMPs. However, NoCs can be plagued by higher power consumption and degraded throughput if the network and router are not designed properly. Towards this end, this paper proposes a novel router architecture, where we tune the frequency of a router in response to network load to manage both performance and power. We propose three dynamic frequency tuning techniques, FreqBoost, FreqThrtl and FreqTune, targeted at congestion and power management in NoCs. As enablers for these techniques, we exploit Dynamic Voltage and Frequency Scaling (DVFS) and the imbalance in a generic router pipeline through time stealing. Experiments using synthetic workloads on a 8x8 wormhole-switched mesh interconnect show that FreqBoost is a better choice for reducing average latency (maximum 40%) while, FreqThrtl provides the maximum benefits in terms of power saving and energy delay product (EDP). The FreqTune scheme is a better candidate for optimizing both performance and power, achieving on an average 36% reduction in latency, 13% savings in power (up to 24% at high load), and 40% savings (up to 70% at high load) in EDP. With application benchmarks, we observe IPC improvement up to 23% using our design. The performance and power benefits also scale for larger NoCs. |
Cambricon-X: An accelerator for sparse neural networks | Neural networks (NNs) have been demonstrated to be useful in a broad range of applications such as image recognition, automatic translation and advertisement recommendation. State-of-the-art NNs are known to be both computationally and memory intensive, due to the ever-increasing deep structure, i.e., multiple layers with massive neurons and connections (i.e., synapses). Sparse neural networks have emerged as an effective solution to reduce the amount of computation and memory required. Though existing NN accelerators are able to efficiently process dense and regular networks, they cannot benefit from the reduction of synaptic weights. In this paper, we propose a novel accelerator, Cambricon-X, to exploit the sparsity and irregularity of NN models for increased efficiency. The proposed accelerator features a PE-based architecture consisting of multiple Processing Elements (PE). An Indexing Module (IM) efficiently selects and transfers needed neurons to connected PEs with reduced bandwidth requirement, while each PE stores irregular and compressed synapses for local computation in an asynchronous fashion. With 16 PEs, our accelerator is able to achieve at most 544 GOP/s in a small form factor (6.38 mm2 and 954 mW at 65 nm). Experimental results over a number of representative sparse networks show that our accelerator achieves, on average, 7.23x speedup and 6.43x energy saving against the state-of-the-art NN accelerator. |
A METHOD FOR OPTIMISED ALLOCATION OF SYSTEM ARCHITECTURES WITH REAL-TIME CONSTRAINTS | Optimised allocation of system architectures is a well-researched area, as it can greatly reduce the development cost of systems and increase performance and reliability in their respective applications. In conjunction with the recent shift from federated to integrated architectures in the automotive domain, and the increasing complexity of computer systems in terms of both software and hardware, the applications of design space exploration and optimised allocation of system architectures are of great interest. This thesis proposes a method to derive architectures and their allocations for systems with real-time constraints. The method implements integer linear programming to solve for an optimised allocation of system architectures according to a set of linear constraints, while taking resource requirements, communication dependencies, and manual design choices into account. Additionally, this thesis describes and evaluates an industrial use case in which the timing characteristics of a system were evaluated and the method was applied to simultaneously derive a system architecture and an optimised allocation of it. This thesis presents evidence and validations that suggest the viability of the method and its use case in an industrial setting. The work in this thesis sets a precedent for future research and development, as well as future applications of the method in both industry and academia.
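The flavor of such an ILP can be shown in a few lines. The sketch below (using the PuLP modeling library; tasks, utilizations, and costs are invented) places tasks on ECUs under a capacity constraint standing in for a schedulability test, honors one communication-dependency design choice, and minimizes platform cost:

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

tasks = {"t1": 20, "t2": 35, "t3": 15, "t4": 40}   # utilization (% of a core)
nodes = {"ecu1": 100, "ecu2": 100}                  # capacity per node
cost = {"ecu1": 3, "ecu2": 5}                       # relative hardware cost

prob = LpProblem("allocation", LpMinimize)
x = LpVariable.dicts("assign", (tasks, nodes), cat=LpBinary)
used = LpVariable.dicts("used", nodes, cat=LpBinary)

prob += lpSum(cost[n] * used[n] for n in nodes)     # minimize platform cost
for t in tasks:                                      # every task placed once
    prob += lpSum(x[t][n] for n in nodes) == 1
for n in nodes:                                      # schedulability proxy
    prob += lpSum(tasks[t] * x[t][n] for t in tasks) <= nodes[n] * used[n]

# Communication dependency: t1 and t2 exchange data, so keep them co-located
# (a manual design choice expressed as a linear constraint).
for n in nodes:
    prob += x["t1"][n] == x["t2"][n]

prob.solve(PULP_CBC_CMD(msg=False))
for t in tasks:
    print(t, "->", [n for n in nodes if x[t][n].value() == 1][0])
```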
Advances in Automated Knowledge Base Construction | Recent years have seen significant advances in the creation of large-scale knowledge bases (KBs). Extracting knowledge from Web pages, and integrating it into a coherent KB, is a task that spans the areas of natural language processing, information extraction, information integration, databases, search and machine learning. Some of the latest developments in the field were presented at the AKBC-WEKEX workshop on knowledge extraction at the NAACL-HLT 2012 conference. This workshop included 23 accepted papers, and 11 keynotes by senior researchers. The workshop had speakers from all major search engine providers, government institutions, and the leading universities in the field. In this survey, we summarize the papers, the keynotes, and the discussions at this workshop.
Integrating Document Clustering and Multidocument Summarization | Document understanding techniques such as document clustering and multidocument summarization have been receiving much attention recently. Current document clustering methods usually represent the given collection of documents as a document-term matrix and then conduct the clustering process. Although many of these clustering methods can group the documents effectively, it is still hard for people to capture the meaning of the documents since there is no satisfactory interpretation for each document cluster. A straightforward solution is to first cluster the documents and then summarize each document cluster using summarization methods. However, most of the current summarization methods are solely based on the sentence-term matrix and ignore the context dependence of the sentences. As a result, the generated summaries lack guidance from the document clusters. In this article, we propose a new language model to simultaneously cluster and summarize documents by making use of both the document-term and sentence-term matrices. By utilizing the mutual influence of document clustering and summarization, our method achieves: (1) a better document clustering method with more meaningful interpretation; and (2) an effective document summarization method with guidance from document clustering. Experimental results on various document datasets show the effectiveness of our proposed method and the high interpretability of the generated summaries.
Flexible transparent conducting hybrid film using a surface-embedded copper nanowire network: a highly oxidation-resistant copper nanowire electrode for flexible optoelectronics. | We report a flexible high-performance conducting film using an embedded copper nanowire transparent conducting electrode; this material can be used as a transparent electrode platform for typical flexible optoelectronic devices. The monolithic composite structure of our transparent conducting film simultaneously enables an outstanding oxidation stability of the copper nanowire network (14 d at 80 °C), an exceptionally smooth surface topography (R(rms) < 2 nm), and excellent opto-electrical performance (Rsh = 25 Ω sq(-1) and T = 82%). A flexible organic light emitting diode device is fabricated on the transparent conducting film to demonstrate its potential as a flexible copper nanowire electrode platform.
Relighting humans: occlusion-aware inverse rendering for full-body human images | Relighting of human images has various applications in image synthesis. For relighting, we must infer albedo, shape, and illumination from a human portrait. Previous techniques rely on human faces for this inference, based on spherical harmonics (SH) lighting. However, because they often ignore light occlusion, inferred shapes are biased and relit images are unnaturally bright, particularly at hollowed regions such as armpits, crotches, or garment wrinkles. This paper introduces the first attempt to infer light occlusion in the SH formulation directly. Based on supervised learning using convolutional neural networks (CNNs), we infer not only an albedo map and illumination but also a light transport map that encodes occlusion as nine SH coefficients per pixel. The main difficulty in this inference is the lack of training datasets compared to unlimited variations of human portraits. Surprisingly, geometric information including occlusion can be inferred plausibly even with a small dataset of synthesized human figures, by carefully preparing the dataset so that the CNNs can exploit the data coherency. Our method accomplishes more realistic relighting than the occlusion-ignored formulation.
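For readers unfamiliar with the nine-coefficient representation, here is a self-contained sketch of second-order SH shading (standard basis constants; the light vector and normal are made up). The occlusion-ignored transport used below is exactly what the paper improves on by learning a per-pixel transport map:

```python
import numpy as np

def sh9(n):
    """Second-order (9-term) spherical harmonics basis for direction n."""
    x, y, z = n
    return np.array([
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3 * z * z - 1),
        1.092548 * x * z,
        0.546274 * (x * x - y * y)])

def shade(albedo, transport, light_coeffs):
    """Per-pixel shading = albedo * <transport, light>."""
    return albedo * max(transport @ light_coeffs, 0.0)

# With occlusion ignored, the transport vector is just the SH projection of
# the clamped cosine lobe around the normal (band factors pi, 2pi/3, pi/4).
# An occlusion-aware transport would attenuate these coefficients wherever
# the body blocks light.
light = np.zeros(9); light[0] = 1.8; light[2] = 0.6   # sky-like illumination
n = np.array([0.0, 0.0, 1.0])                          # pixel normal
cosine_lobe = sh9(n) * np.array([np.pi, *([2 * np.pi / 3] * 3), *([np.pi / 4] * 5)])
print(shade(0.7, cosine_lobe, light))
```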
An adaptive hysteresis band current controller for inverter-based DG with reactive power compensation | In this paper the three-phase grid-connected inverter has been investigated. The inverter’s control strategy is based on the adaptive hysteresis current controller. The inverter connects the DG (distributed generation) source to the grid. The main advantages of this method are constant switching frequency, better current control, easy filter design and lower THD (total harmonic distortion). Since a constant and ripple-free dc bus voltage is not ensured at the output of alternate energy sources, the main aim of the proposed algorithm is to make the output of the inverter immune to fluctuations in the dc input voltage. This inverter can be used to connect medium and small-scale wind turbines and solar cells to the grid and to compensate local load reactive power. Reactive power compensation improves the SUF (system usage factor) from nearly 20% (in photovoltaic systems) to 100%. The simulation results confirm that the switching frequency is constant and the THD of the injected current is low.
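The underlying switching rule is simple to demonstrate. Below is a fixed-band, single-phase toy simulation (all values illustrative); the adaptive variant the paper proposes would additionally recompute the band width each step from the dc voltage and the reference slope so that the switching frequency stays constant:

```python
import numpy as np

def hysteresis_control(i_ref, L=2e-3, V_dc=400.0, dt=1e-6, band=0.5):
    """Fixed-band hysteresis current controller for one inverter leg:
    toggle the switch whenever the current error leaves the band.
    Grid back-EMF is omitted for brevity."""
    i, state, out = 0.0, 1, []
    for ref in i_ref:
        err = ref - i
        if err > band:
            state = 1                    # switch high: current must rise
        elif err < -band:
            state = 0                    # switch low: current must fall
        v = V_dc / 2 if state else -V_dc / 2
        i += (v / L) * dt                # di/dt = v/L through the filter
        out.append(i)
    return np.array(out)

t = np.arange(0, 0.02, 1e-6)
i_ref = 10 * np.sin(2 * np.pi * 50 * t)  # 50 Hz reference current
i_out = hysteresis_control(i_ref)
print("tracking error RMS:", np.sqrt(np.mean((i_out - i_ref) ** 2)))
```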
Frame-Based Ontology Population with PIKES | We present an approach for ontology population from natural language English texts that extracts RDF triples according to FrameBase, a Semantic Web ontology derived from FrameNet. Processing is decoupled in two independently-tunable phases. First, text is processed by several NLP tasks, including Semantic Role Labeling (SRL), whose results are integrated in an RDF graph of mentions, i.e., snippets of text denoting some entity/fact. Then, the mention graph is processed with SPARQL-like rules using a specifically created mapping resource from NomBank/PropBank/FrameNet annotations to FrameBase concepts, producing a knowledge graph whose content is linked to DBpedia and organized around semantic frames, i.e., prototypical descriptions of events and situations. A single RDF/OWL representation is used where each triple is related to the mentions/tools it comes from. We implemented the approach in PIKES, an open source tool that combines two complementary SRL systems and provides a working online demo. We evaluated PIKES on a manually annotated gold standard, assessing precision/recall in (i) populating FrameBase ontology, and (ii) extracting semantic frames modeled after standard predicate models, for comparison with state-of-the-art tools for the Semantic Web. We also evaluated (iii) sampled precision and execution times on a large corpus of 110 K Wikipedia-like pages. |
A hybrid lidar-based indoor navigation system enhanced by ceiling visual codes for mobile robots | Localization and navigation are fundamental issues for autonomous mobile robots. Even when an environmental map has been built, a traditional two-dimensional (2D) lidar localization and navigation system cannot match the robot's initial position in a dynamic environment and becomes unreliable when kidnapping occurs. Moreover, it relies on high-cost lidar for high accuracy and long range. In view of this, the paper presents a low-cost navigation system based on a low-cost lidar and a cheap webcam. In this approach, 2D-codes are attached to the ceiling to provide reference points that aid indoor robot localization. The mobile robot is equipped with a webcam pointing at the ceiling to identify the 2D-codes. On the other hand, a low-cost 2D laser scanner is applied to build a map of the unknown environment and detect obstacles. Adaptive Monte Carlo Localization (AMCL) is implemented for lidar positioning, and A* and the Dynamic Window Approach (DWA) are applied in path planning based on a 2D grid map. Error analysis and experiments have validated the proposed method.
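Of the planning components named above, the global planner is the most self-contained to illustrate. A toy A* search over a 2D occupancy grid (hypothetical map; in the described system, DWA would then track the returned waypoints locally):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D occupancy grid (1 = obstacle) with Manhattan heuristic."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, None)]
    came_from, cost = {}, {start: 0}
    while frontier:
        _, g, cur, parent = heapq.heappop(frontier)
        if cur in came_from:
            continue                      # already settled via a cheaper path
        came_from[cur] = parent
        if cur == goal:                   # reconstruct the path backwards
            path = []
            while cur:
                path.append(cur); cur = came_from[cur]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0 and g + 1 < cost.get(nxt, 1e9)):
                cost[nxt] = g + 1
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```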
An Optimization Framework for Conformal Radiation Treatment Planning | An optimization framework for three-dimensional conformal radiation therapy is presented. In conformal therapy, beams of radiation are applied to a patient from different directions, where the aperture through which the beam is delivered from each direction is chosen to match the shape of the tumor, as viewed from that direction. Wedge filters may be used to produce a gradient in beam intensity across the aperture. Given a set of equispaced beam angles, a mixed-integer linear program can be solved to determine the most effective angles to be used in a treatment plan, the weight (exposure time) to be used for each beam, and the type and orientation of wedges to be used. Practical solution techniques for this problem are described; they include strengthening of the formulation and solution of smaller approximate problems obtained by a reduced parametrization of the treatment region. In addition, techniques for controlling the dose-volume histogram implicitly for various parts of the treatment region using hot- and cold-spot control parameters are presented. Computational results are given that show the effectiveness of the proposed approach on practical data sets.
PERT – Perfect Random Tree Ensembles | Ensemble classifiers originated in the machine learning community. They work by fitting many individual classifiers and combining them by weighted or unweighted voting. The ensemble classifier is often much more accurate than the individual classifiers from which it is built. In fact, ensemble classifiers are among the most accurate general-purpose classifiers available. We introduce a new ensemble method, PERT, in which each individual classifier is a perfectly-fit classification tree with random selection of splits. Compared to other ensemble methods, PERT is very fast to fit. Given the randomness of the split selection, PERT is surprisingly accurate. Calculations suggest that one reason why PERT works so well is that although the individual tree classifiers are extremely weak, they are almost uncorrelated. The simple probabilistic nature of the classifier lends itself to theoretical analysis. We show that PERT is fitting a continuous posterior probability surface for each class. As such, it can be viewed as a classification-via-regression procedure that fits a continuous interpolating surface. In theory, this surface could be found using a one-shot procedure. |
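The construction is simple enough to sketch directly: each tree repeatedly picks two examples of different classes, a random feature, and a random threshold between their feature values, recursing until nodes are pure; the ensemble average then estimates the class posterior. A compact toy version (not the authors' code):

```python
import numpy as np

def grow(X, y, rng):
    """Grow one PERT-style tree: split a randomly chosen feature at a random
    point between two examples of different classes, until nodes are pure."""
    if len(set(y)) == 1:
        return y[0]
    while True:                                 # pick a mixed-class pair
        i, j = rng.integers(len(y), size=2)
        if y[i] != y[j]:
            break
    f = rng.integers(X.shape[1])
    thr = rng.uniform(min(X[i, f], X[j, f]), max(X[i, f], X[j, f]))
    left = X[:, f] <= thr
    if left.all() or not left.any():            # degenerate split: retry
        return grow(X, y, rng)
    return (f, thr, grow(X[left], y[left], rng), grow(X[~left], y[~left], rng))

def predict_one(tree, x):
    while isinstance(tree, tuple):
        f, thr, l, r = tree
        tree = l if x[f] <= thr else r
    return tree

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2.5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
forest = [grow(X, y, rng) for _ in range(100)]   # fast: no split search

x_new = np.array([2.0, 2.0])
votes = np.mean([predict_one(t, x_new) for t in forest])
print("P(class 1) ~", votes)                     # ensemble posterior estimate
```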
The Impact of Microenvironmental Heterogeneity on the Evolution of Drug Resistance in Cancer Cells | Therapeutic resistance arises as a result of evolutionary processes driven by dynamic feedback between a heterogeneous cell population and environmental selective pressures. Previous studies have suggested that mutations conferring resistance to epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors (TKI) in non-small-cell lung cancer (NSCLC) cells lower the fitness of resistant cells relative to drug-sensitive cells in a drug-free environment. Here, we hypothesize that the local tumor microenvironment could influence the magnitude and directionality of the selective effect, both in the presence and absence of a drug. Using a combined experimental and computational approach, we developed a mathematical model of preexisting drug resistance describing multiple cellular compartments, each representing a specific tumor environmental niche. This model was parameterized using a novel experimental dataset derived from the HCC827 erlotinib-sensitive and -resistant NSCLC cell lines. We found that, in contrast to in the drug-free environment, resistant cells may hold a fitness advantage compared to parental cells in microenvironments deficient in oxygen and nutrients. We then utilized the model to predict the impact of drug and nutrient gradients on tumor composition and recurrence times, demonstrating that these endpoints are strongly dependent on the microenvironment. Our interdisciplinary approach provides a model system to quantitatively investigate the impact of microenvironmental effects on the evolutionary dynamics of tumor cells. |
Academic procrastination: associations with personal, school, and family variables. | Procrastination is a common behavior, mainly in school settings. Only a few studies have analyzed the associations of academic procrastination with students' personal and family variables. In the present work, we analyzed the impact of socio-personal variables (e.g., parents' education, number of siblings, school grade level, and underachievement) on students' academic procrastination profiles. Two independent samples of 580 and 809 seventh to ninth graders, students attending the last three years of Portuguese Compulsory Education, have been taken. The findings, similar in both studies, reveal that procrastination decreases when the parents' education is higher, but it increases along with the number of siblings, the grade level, and the underachievement. The results are discussed in view of the findings of previous research. The implications for educational practice are also analyzed. |
Continuous low-dose cyclophosphamide and methotrexate combined with celecoxib for patients with advanced cancer | Background: Combined therapy of metronomic cyclophosphamide, methotrexate and high-dose celecoxib targeting angiogenesis was used in a phase II trial. Methods: Patients with advanced cancer received oral cyclophosphamide 50 mg o.d., celecoxib 400 mg b.d. and methotrexate 2.5 mg b.d. for two consecutive days each week. Response was determined every 8 weeks; toxicity was evaluated according to CTC version 2.0. Plasma markers of inflammation, coagulation and angiogenesis were measured. Results: Sixty-seven of 69 patients were evaluable for response. Twenty-three patients had stable disease (SD) after 8 weeks, but there were no objective responses to therapy. Median time to progression was 57 days. There was a low incidence of toxicities. Among plasma markers, levels of tissue factor were higher in the SD group of patients at baseline, and levels of both angiopoietin-1 and matrix metalloproteinase-9 increased in the progressive disease group only. There were no changes in other plasma markers. Conclusion: This metronomic approach has negligible activity in advanced cancer albeit with minimal toxicity. Analysis of plasma markers indicates minimal effects on endothelium in this trial. These data for this particular regimen do not support basic tenets of metronomic chemotherapy, such as the ability to overcome resistant tumours by targeting the endothelium.
Learning Dense Correspondence via 3D-Guided Cycle Consistency | Discriminative deep learning approaches have shown impressive results for problems where human-labeled ground truth is plentiful, but what about tasks where labels are difficult or impossible to obtain? This paper tackles one such problem: establishing dense visual correspondence across different object instances. For this task, although we do not know what the ground-truth is, we know it should be consistent across instances of that category. We exploit this consistency as a supervisory signal to train a convolutional neural network to predict cross-instance correspondences between pairs of images depicting objects of the same category. For each pair of training images we find an appropriate 3D CAD model and render two synthetic views to link in with the pair, establishing a correspondence flow 4-cycle. We use ground-truth synthetic-to-synthetic correspondences, provided by the rendering engine, to train a ConvNet to predict synthetic-to-real, real-to-real and real-to-synthetic correspondences that are cycle-consistent with the ground-truth. At test time, no CAD models are required. We demonstrate that our end-to-end trained ConvNet supervised by cycle-consistency outperforms state-of-the-art pairwise matching methods in correspondence-related tasks. |
Fog Removal from Color Images using Contrast Limited Adaptive Histogram Equalization | Images degraded by fog suffer from poor contrast. In order to remove the fog effect, a Contrast Limited Adaptive Histogram Equalization (CLAHE)-based method is presented in this paper. This method establishes a maximum value to clip the histogram and redistributes the clipped pixels equally to each gray level. It can limit the noise while enhancing the image contrast. In our method, firstly, the original image is converted from RGB to HSI. Secondly, the intensity component of the HSI image is processed by CLAHE. Finally, the HSI image is converted back to an RGB image. To evaluate the effectiveness of the proposed method, we experiment with a color image degraded by fog and apply edge detection to the image. The results show that our method is effective in comparison with traditional methods. Keywords: CLAHE, fog, degraded, remove, color image, HSI, edge detection.
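The pipeline maps almost line-for-line onto OpenCV primitives. A sketch (input filename hypothetical; HSV is used as a stand-in for HSI, since OpenCV lacks a direct HSI conversion, so CLAHE is applied to the intensity-like channel only):

```python
import cv2

img = cv2.imread("foggy_road.jpg")                 # hypothetical input image

# Convert to HSV and equalize only the V channel, leaving hue and
# saturation (the color information) untouched.
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
h, s, v = cv2.split(hsv)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
v = clahe.apply(v)                                 # clip + equalize per tile

restored = cv2.cvtColor(cv2.merge((h, s, v)), cv2.COLOR_HSV2BGR)

# Edge detection as an effectiveness check, as in the paper's evaluation.
edges = cv2.Canny(cv2.cvtColor(restored, cv2.COLOR_BGR2GRAY), 100, 200)
cv2.imwrite("defogged.jpg", restored)
cv2.imwrite("edges.jpg", edges)
```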
Dynamic quest plot generation using Petri net planning | In most cases, the story of a popular RPG is designed by professional designers as its main content. However, manual design of game content is limited in quantity: manual story generation requires a large amount of time and effort, and because gamers want ever more diverse and rich content, it is not easy to satisfy their needs with manual design. PCG (Procedural Content Generation) automatically generates game content. In this paper, we propose a quest generation engine using Petri net planning. By combining Petri net modules, a quest plot is created. The proposed method is applied to a commercial game platform to show its feasibility.
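The mechanism is concrete enough to sketch: each quest module is a Petri net transition, tokens mark the world state, and the sequence of fired transitions is the generated plot. The places and transitions below are invented examples:

```python
# A minimal Petri net executor: a transition fires when all of its input
# places hold tokens; chaining modules yields a quest plot.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)            # place -> token count
        self.transitions = []                   # (name, inputs, outputs)

    def add(self, name, inputs, outputs):
        self.transitions.append((name, inputs, outputs))

    def step(self):
        for name, ins, outs in self.transitions:
            if all(self.marking.get(p, 0) > 0 for p in ins):
                for p in ins:
                    self.marking[p] -= 1
                for p in outs:
                    self.marking[p] = self.marking.get(p, 0) + 1
                return name                     # fire one enabled transition
        return None

net = PetriNet({"quest_given": 1, "wolf_alive": 1})
net.add("travel_to_forest", ["quest_given"], ["at_forest"])
net.add("slay_wolf", ["at_forest", "wolf_alive"], ["wolf_dead"])
net.add("report_back", ["wolf_dead"], ["reward"])

plot = []
while (event := net.step()):
    plot.append(event)
print(" -> ".join(plot))   # travel_to_forest -> slay_wolf -> report_back
```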
Giving patients a lift - the robotic nursing assistant (RoNA) | Nursing has ranked as one of the top 10 occupations for work-related musculoskeletal injuries in the U.S. Constantly and manually lifting and repositioning patients around the bed and transferring them from bed to bed have been recognized as the major causes of nurses' work-related musculoskeletal injuries. We believe that advanced robotic technologies can assist nurses in performing these labor-intensive tasks and prevent musculoskeletal injuries among medical workers and nurses. In this paper, we present Hstar Technologies' 2nd generation Robotic Nursing Assistant (RoNA) system. Compared to the 1st generation RoNA system released in 2011, this RoNA has more powerful arms that can lift a patient of up to 500 pounds. RoNA has been equipped with many intelligent sensors that allow a nurse to easily and intuitively guide it in performing patient lifting. The paper also discusses the design improvements, the control system, and the software architecture of the RoNA system.
Comparison of the AT1-receptor blocker, candesartan cilexetil, and the ACE inhibitor, lisinopril, in fixed combination with low dose hydrochlorothiazide in hypertensive patients | Aim: To compare candesartan cilexetil and lisinopril in fixed combination with hydrochlorothiazide with respect to antihypertensive efficacy and tolerability. Methods: This was a double-blind (double-dummy), randomised, parallel group comparison in patients with a mean sitting diastolic blood pressure 95–115 mm Hg on prior antihypertensive monotherapy. Treatments were candesartan cilexetil/hydrochlorothiazide 8/12.5 mg once daily (n = 237) and lisinopril/hydrochlorothiazide 10/12.5 mg once daily (n = 116) for 26 weeks. The primary efficacy variable was change in trough sitting diastolic blood pressure. Results: Changes in mean sitting diastolic blood pressure did not differ significantly between the groups (mean difference 0.5 mm Hg; 95% confidence interval −1.6, 2.7, P = 0.20). No significant differences between the groups were found for other haemodynamic variables (sitting systolic blood pressure, standing blood pressure, sitting/erect heart rate, and proportion of responders and controlled patients). Both drugs were well tolerated but the proportion of patients with at least one adverse event was significantly greater in the lisinopril group (80% vs 69%, P = 0.020). The proportion of patients spontaneously reporting cough (23.1% vs 4.6%) and discontinuing therapy due to adverse events (12.0% vs 5.9%) was also higher in the lisinopril group compared with the candesartan cilexetil group. Conclusions: The fixed combinations of candesartan cilexetil and hydrochlorothiazide 8/12.5 mg and lisinopril and hydrochlorothiazide 10/12.5 mg once daily are equally effective as antihypertensive agents. The fixed combination containing candesartan cilexetil is better tolerated than that containing lisinopril.
Moral conviction: another contributor to attitude strength or something more? | Attitudes held with strong moral conviction (moral mandates) were predicted to have different interpersonal consequences than strong but nonmoral attitudes. After controlling for indices of attitude strength, the authors explored the unique effect of moral conviction on the degree that people preferred greater social (Studies 1 and 2) and physical (Study 3) distance from attitudinally dissimilar others and the effects of moral conviction on group interaction and decision making in attitudinally homogeneous versus heterogeneous groups (Study 4). Results supported the moral mandate hypothesis: Stronger moral conviction led to (a) greater preferred social and physical distance from attitudinally dissimilar others, (b) intolerance of attitudinally dissimilar others in both intimate (e.g., friend) and distant relationships (e.g., owner of a store one frequents), (c) lower levels of good will and cooperativeness in attitudinally heterogeneous groups, and (d) a greater inability to generate procedural solutions to resolve disagreements. |
Oncothermia treatment of cancer: from the laboratory to clinic. | Oncothermia has been applied in oncology for a long time (since 1989). Its clinical results clearly show its advantages; however, the details of its mechanism are still under investigation today. The method is based on a self-selective process of energy concentration and targets the membrane of the malignant cell, using the temperature gradient and the beta-dispersion of the membrane proteins. To support the theory, we present evidence from in vitro experiments in which we showed a definite difference between conventional heating and oncothermia at the same temperature. In the next step, we studied several xenograft nude-mice models, verifying the temperature-dependent and non-temperature-dependent factors. In addition, synergistic effects with some chemotherapies were studied; oncothermia with drugs was more efficacious than conventional heating. These experiments show the definite advantages of oncothermia compared to its classical counterpart acting at the same temperature. We have also demonstrated the beneficial effect of oncothermia treatment in veterinary practice. Oncothermia is applied in numerous clinics and hospitals, and we present some characteristic case reports as well as the clinical benefit of prolonged survival for liver, pancreas, brain, and lung tumor lesions.
Advances in vegetation management for power line corridor monitoring using aerial remote sensing techniques | This paper presents a comprehensive discussion of vegetation management approaches in power line corridors based on aerial remote sensing techniques. We address three issues: 1) strategies for risk management in power line corridors, 2) selection of suitable platforms and sensor suites for data collection, and 3) progress in automated data processing techniques for vegetation management. We present initial results from a series of experiments, as well as challenges and lessons learnt from our project.
5G: Personal mobile internet beyond what cellular did to telephony | Cellular technology has dramatically changed our society and the way we communicate. First it impacted voice telephony, and then it has been making inroads into data access, applications, and services. However, the potential capabilities of the Internet have not yet been fully exploited by cellular systems. With the advent of 5G we will have the opportunity to leapfrog beyond current Internet capabilities.
A large annotated corpus for learning natural language inference | Understanding entailment and contradiction is fundamental to understanding natural language, and inference about entailment and contradiction is a valuable testing ground for the development of semantic representations. However, machine learning research in this area has been dramatically limited by the lack of large-scale resources. To address this, we introduce the Stanford Natural Language Inference corpus, a new, freely available collection of labeled sentence pairs, written by humans doing a novel grounded task based on image captioning. At 570K pairs, it is two orders of magnitude larger than all other resources of its type. This increase in scale allows lexicalized classifiers to outperform some sophisticated existing entailment models, and it allows a neural network-based model to perform competitively on natural language inference benchmarks for the first time. |
KASL clinical practice guidelines: management of chronic hepatitis B | * Clinical Practice Guidelines Committee of KASL for the Management of Chronic Hepatitis B Kwan Sik Lee (Committee Chair, Yonsei University College of Medicine), Si Hyun Bae (Catholic University of Korea), Won Hyeok Choe (Konkuk University College of Medicine), Moon Seok Choi (Sungkyunkwan University School of Medicine), Woo Jin Chung (Keimyung University School of Medicine), Chang Wook Kim (Catholic University of Korea), Hyung Joon Kim (Chung-Ang University College of Medicine), Ja Kyung Kim (Yonsei University College of Medicine), Ji Hoon Kim (Korea University College of Medicine), Suk Bae Kim (Dankook University Medical College), Yoon Jun Kim (Seoul National University College of Medicine), Jae Sung Ko (Seoul National University College of Medicine), Byung Seok Lee (Chungnam National University College of Medicine), Jung Il Lee (Yonsei University College of Medicine), Young-Suk Lim (University of Ulsan College of Medicine), Won Young Tak (Kyungpook National University School of Medicine), Jong Eun Yeon (Korea University College of Medicine), Ki Tae Yoon (Pusan National University School of Medicine). |
BRAHMS: Novel middleware for integrated systems computation | Biological computational modellers are becoming increasingly interested in building large, eclectic models, including components on many different computational substrates, both biological and non-biological. At the same time, the rise of the philosophy of embodied modelling is generating a need to deploy biological models as controllers for robots in real-world environments. Finally, robotics engineers are beginning to find value in seconding biomimetic control strategies for use on practical robots. Together with the ubiquitous desire to make good on past software development effort, these trends are throwing up new challenges of intellectual and technological integration (for example across scales, across disciplines, and even across time) - challenges that are unmet by existing software frameworks. Here, we outline these challenges in detail, and go on to describe a newly developed software framework, BRAHMS, that meets them. BRAHMS is a tool for integrating computational process modules into a viable, computable system; its generality and flexibility facilitate integration across barriers, such as those described above, in a coherent and effective way. We go on to describe several cases where BRAHMS has been successfully deployed in practical situations. We also show excellent performance in comparison with a monolithic development approach. Additional benefits of developing in the framework include source code self-documentation, automatic coarse-grained parallelisation, cross-language integration, data logging, and performance monitoring; dynamic load-balancing and 'pause and continue' execution are planned. BRAHMS is built on the nascent, and similarly general-purpose, model markup language, SystemML. In future, this will also facilitate repeatability and accountability (same answers ten years from now), transparent automatic software distribution, and interfacing with other SystemML tools. |
Using Topic Segmentation Models for the Automatic Organisation of MOOCs resources | As online courses such as MOOCs become increasingly popular, there has been a dramatic increase in demand for methods to organise their resources. While resources for new courses are often freely available, they are generally not organised into easily manageable units. In this paper, we investigate how state-of-the-art topic segmentation models can be used to automatically transform unstructured text into coherent sections suitable for browsing MOOC content. The suitability of this method for course organisation is confirmed through experiments with a lecture corpus configured explicitly according to MOOC settings. Experimental results demonstrate the reliability and scalability of this approach across various academic disciplines. The findings also show that the topic segmentation model using discourse cues displayed the best results overall. |
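As an illustration of the general technique, the sketch below applies NLTK's classic TextTiling segmenter, a lexical-cohesion model, to a lecture transcript. The input file name is a hypothetical placeholder, and TextTiling is a stand-in for, not a reproduction of, the discourse-cue model the paper found best.

```python
# Minimal sketch of lexical-cohesion topic segmentation with NLTK's
# TextTiling implementation, as one way to split a transcript into
# coherent sections for content browsing.
import nltk
from nltk.tokenize import TextTilingTokenizer

nltk.download("stopwords")  # TextTiling uses a stopword list internally

tt = TextTilingTokenizer()
with open("lecture_transcript.txt") as f:  # hypothetical input file
    transcript = f.read()

# Note: TextTiling expects paragraph breaks ("\n\n") in the input text,
# which it uses as candidate segment boundaries.
segments = tt.tokenize(transcript)
for i, seg in enumerate(segments, 1):
    print(f"--- Section {i} ---")
    print(seg[:200], "...")
```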
Decision support and intelligent systems in the textile and apparel supply chain: An academic review of research articles | This article provides a comprehensive review of research articles related to the application of decision support and intelligent systems in textile and apparel supply chains. Data were obtained from 77 articles published from 1994 to 2009 in 35 journals. The articles were categorized according to their applicability into three basic sectors – textile production, apparel manufacture, and distribution/sales. They were further categorized into 16 subsectors based on their operational and management/control processes. A comprehensive list of categorized journal articles identified in this study provides insights and relevant references for both researchers and practitioners on the application of decision support and intelligent systems to various stages of a textile and apparel supply chain. In light of the developed classification framework, we identify gaps in extending the use of decision support and artificial intelligence systems in the industry and suggest potential and applicable research areas for further consideration in this subject area. |
Ensemble Methods for Multi-label Classification | Ensemble methods have been shown to be an effective tool for solving multi-label classification tasks. In the RAndom k-labELsets (RAKEL) algorithm, each member of the ensemble is associated with a small randomly selected subset of k labels. Then, a single-label classifier is trained according to each combination of elements in the subset. In this paper we adopt a similar approach; however, instead of randomly choosing subsets, we select the minimum required subsets of k labels that cover all labels and meet additional constraints such as coverage of inter-label correlations. Construction of the cover is achieved by formulating the subset selection as a minimum set covering problem (SCP) and solving it by using approximation algorithms. Every cover needs only to be prepared once by offline algorithms. Once prepared, a cover may be applied to the classification of any given multi-label dataset whose properties conform with those of the cover. The contribution of this paper is two-fold. First, we introduce SCP as a general framework for constructing label covers while allowing the user to incorporate cover construction constraints. We demonstrate the effectiveness of this framework by proposing two construction constraints whose enforcement produces covers that improve the prediction performance of random selection. Second, we provide theoretical bounds that quantify the probabilities of random selection to produce covers that meet the proposed construction criteria. The experimental results indicate that the proposed methods improve multi-label classification accuracy and stability compared with the RAKEL algorithm and other state-of-the-art algorithms. |
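The core construction here — covering all labels with k-sized subsets — maps directly onto the standard greedy approximation for minimum set cover. The sketch below shows only that plain coverage step, without the paper's inter-label correlation constraints; the label names are hypothetical.

```python
# Minimal sketch of the set-cover idea behind choosing k-labelsets:
# greedily pick k-sized label subsets until every label is covered.
# Enumerating all combinations is only practical for small label sets;
# this is an illustration, not an optimized implementation.
from itertools import combinations

def greedy_label_cover(labels, k):
    """Greedy approximation to minimum set cover with k-sized candidates."""
    uncovered = set(labels)
    candidates = [frozenset(c) for c in combinations(labels, k)]
    cover = []
    while uncovered:
        # Pick the candidate subset covering the most still-uncovered labels.
        best = max(candidates, key=lambda s: len(s & uncovered))
        cover.append(best)
        uncovered -= best
    return cover

labels = ["L1", "L2", "L3", "L4", "L5", "L6", "L7"]
for subset in greedy_label_cover(labels, k=3):
    print(sorted(subset))
```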
Wideband Circularly Polarized Aperture-Fed Rotated Stacked Patch Antenna | A novel aperture stacked patch (ASP) antenna with circular polarization is proposed. The antenna consists of four parasitic patches, each one being rotated by an angle of 30° relative to its adjacent patches. The proposed antenna has achieved a simultaneous axial ratio <3 dB and voltage standing wave ratio (VSWR) <2 bandwidth of 33.6% (7.2-10.11 GHz) in the single element and 36.15% (7.1-10.2 GHz) in a 2 × 1-element array configuration. The antenna behavior is explained by a thorough parameter study together with fabrication and measurement. |
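As a quick worked check of the quoted figures, fractional bandwidth is (f_high − f_low) / f_center; for the single-element band this reproduces the reported 33.6%:

```python
# Fractional bandwidth check for the single-element AR<3dB / VSWR<2 band.
# BW% = (f_high - f_low) / f_center * 100, with f_center the arithmetic mean.
f_low, f_high = 7.2, 10.11           # GHz, from the abstract
f_center = (f_low + f_high) / 2      # 8.655 GHz
bw_percent = (f_high - f_low) / f_center * 100
print(f"{bw_percent:.1f}%")          # 33.6%, matching the reported value
```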
A New Android Malware Detection Approach Using Bayesian Classification | Mobile malware has been growing in scale and complexity as smartphone usage continues to rise. Android has surpassed other mobile platforms as the most popular whilst also witnessing a dramatic increase in malware targeting the platform. A worrying trend that is emerging is the increasing sophistication of Android malware to evade detection by traditional signature-based scanners. As such, Android app marketplaces remain at risk of hosting malicious apps that could evade detection before being downloaded by unsuspecting users. Hence, in this paper we present an effective approach to alleviate this problem based on Bayesian classification models obtained from static code analysis. The models are built from a collection of code and app characteristics that provide indicators of potential malicious activities. The models are evaluated with real malware samples in the wild and results of experiments are presented to demonstrate the effectiveness of the proposed approach. |
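A minimal sketch of the classification step is shown below, using a Bernoulli naive Bayes model over binary static-analysis indicators. The feature names and training rows are hypothetical placeholders; the paper's actual feature set and extraction pipeline are not reproduced here.

```python
# Minimal sketch of Bayesian classification over binary static features
# (e.g., presence of permissions or API calls extracted from an app).
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Rows: apps; columns: hypothetical binary indicators such as
# [SEND_SMS, READ_CONTACTS, uses_crypto_api, loads_native_code]
X_train = np.array([
    [1, 1, 0, 1],   # known malware sample
    [1, 0, 1, 1],   # known malware sample
    [0, 0, 0, 0],   # benign sample
    [0, 1, 0, 0],   # benign sample
])
y_train = np.array([1, 1, 0, 0])  # 1 = malicious, 0 = benign

clf = BernoulliNB()
clf.fit(X_train, y_train)

new_app = np.array([[1, 1, 1, 0]])
print("P(benign), P(malicious):", clf.predict_proba(new_app)[0])
```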
To what extent does the longevity of fixed dental prostheses depend on the function of the cement? Working Group 4 materials: cementation. | AIMS/BACKGROUND
The objective of this review was to define the impact of cementation mode on the longevity of different types of single tooth restorations and fixed dental prostheses (FDP).
METHODS
A literature search using PubMed as the major database was performed with the terms adhesive techniques, all-ceramic crowns, cast-metal, cement, cementation, ceramic inlays, gold inlays, metal-ceramic, non-bonded fixed-partial-dentures, porcelain veneers, resin-bonded fixed-partial-dentures, porcelain-fused-to-metal, and implant-supported-restorations, together with a manual search of non-indexed literature. Cementation of root canal posts and cores was excluded. Due to the lack of randomized prospective clinical studies in some fields of cementation, recommendations for special applications of current cements had to be based on a lower evidence level (Centre of Evidence Based Medicine, Oxford).
RESULTS
One hundred and twenty-five articles were selected for the review. The primary functions of cementation are to establish reliable retention, to provide a durable seal of the space between the tooth and the restoration, and to provide adequate optical properties. The various types of cements used in dentistry can be divided into two main groups: water-based cements and polymerizing cements. Water-based cements exhibited satisfactory long-term clinical performance with cast-metal restorations (inlays, onlays, partial crowns) as well as single-unit metal-ceramic FDPs and multiple-unit FDPs with macroretentive preparation designs and adequate marginal fit. Early short-term clinical results with high-strength all-ceramic restorations luted with water-based cements are also promising. Current polymerizing cements cover almost all indications of water-based cements and, in addition, are mainly indicated for non-retentive restorations. They are able to seal the tooth completely, creating a hybrid layer. Furthermore, the adhesive capabilities of polymerizing cements allow for bonded restorations, promoting at the same time the preservation of dental tissues. |
A phase II study of cisplatin/S-1 in patients with carcinomas of unknown primary site | Background Carcinomas of unknown primary site (CUPs) are heterogeneous tumors associated with a poor prognosis. This phase II trial was designed to evaluate the efficacy and safety of a novel combination chemotherapy of S-1 and cisplatin (CDDP) in patients with CUP. Patients and Methods Patients with previously untreated CUPs were eligible for this trial. The treatment schedule consisted of oral S-1 (40 mg/m2) twice a day on days 1–21, and intravenous CDDP (60 mg/m2) on day 8. This schedule was repeated every 5 weeks. Results A total of 46 patients were enrolled. The overall response rate and the disease control rate were 41.3% and 80.4%, respectively. The median overall survival time was 17.4 months. Grade 3/4 neutropenia, thrombocytopenia, and febrile neutropenia occurred in 28.3%, 13.0%, and 2.2% of the patients, respectively. Conclusion CDDP plus S-1 combination chemotherapy is a well-tolerated and active first-line empiric therapy for patients with CUP. |
Phase I studies of AVE9633, an anti-CD33 antibody-maytansinoid conjugate, in adult patients with relapsed/refractory acute myeloid leukemia | The efficacy of anti-CD33 immunoconjugates had previously been demonstrated for gemtuzumab-ozogamicin. AVE9633 is an anti-CD33-maytansine conjugate created by ImmunoGen Inc. Phase I trials of AVE9633 were performed in patients with AML to evaluate tolerability, pharmacokinetics and pharmacodynamics. Three phase I studies of AVE9633 were performed in 54 patients with refractory/relapsed AML, evaluating drug infusion on day 1 of a 21-day cycle (Day 1 study), days 1 and 8 (Day 1/8 study), and days 1, 4 and 7 (Day 1/4/7 study) of a 28-day cycle. Toxicity consisted mainly of allergic reactions during infusion (three grade 3 bronchospasms). The DLT was reached for the D1–D7 schedule at 150 mg/m2 (one keratitis, one liver toxicity), and the MTD was set at 130 mg/m2 for this schedule. In the other two phase I studies, the DLT was not reached. In the Day 1/8 study, CD33 on peripheral blasts was saturated and down-modulated at doses of 75 mg/m2 × 2 or higher, which correlated with WBC kinetics and plasma levels of AVE9633. The decrease of the DM4/CD33 ratio on the blast surface between days 1 and 8 was the rationale for evaluating the Day 1/4/7 schedule. This induced relatively constant DM4/CD33 levels over the first 8 days; however, no activity was noted. One CRp, one PR and biological activity in five other patients were observed in this study. The Day 1 and Day 1/4/7 studies were discontinued early because of drug inactivity at doses significantly higher than CD33-saturating doses. No myelosuppression was observed in any trial of AVE9633. The pharmacokinetic/pharmacodynamic data obtained in these studies will provide very useful information for the design of the next generation of immunoconjugates. |