title | abstract
---|---
LocationSpark: A Distributed In-Memory Data Management System for Big Spatial Data | We present LocationSpark, a spatial data processing system built on top of Apache Spark, a widely used distributed data processing system. LocationSpark offers a rich set of spatial query operators, e.g., range search, kNN, spatio-textual operation, spatial-join, and kNN-join. To achieve high performance, LocationSpark employs various spatial indexes for in-memory data, and guarantees that immutable spatial indexes have low overhead with fault tolerance. In addition, we build two new layers over Spark, namely a query scheduler and a query executor. The query scheduler is responsible for mitigating skew in spatial queries, while the query executor selects the best plan based on the indexes and the nature of the spatial queries. Furthermore, to avoid unnecessary network communication overhead when processing overlapped spatial data, we embed an efficient spatial Bloom filter into LocationSpark’s indexes. Finally, LocationSpark tracks frequently accessed spatial data, and dynamically flushes less frequently accessed data to disk. We evaluate our system on real workloads and demonstrate that it achieves an order of magnitude performance gain over a baseline framework. |
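The spatial Bloom filter mentioned in the abstract can be illustrated with a minimal sketch (an illustrative toy, not LocationSpark's actual data structure; all names and parameters here are assumptions): points are discretized onto a grid, occupied cell ids are recorded in a Bloom filter, and a partition can then cheaply report with certainty that a location holds no data, avoiding a network round-trip.

```python
# Toy spatial Bloom filter: grid-discretize 2D points, hash occupied cells
# into a bit array. "May contain" can be a false positive; "does not
# contain" is always correct, which is what makes pruning safe.
import hashlib

class SpatialBloomFilter:
    def __init__(self, num_bits=1024, num_hashes=3, cell_size=1.0):
        self.bits = 0
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.cell_size = cell_size

    def _cell(self, x, y):
        # Map a point to its grid cell.
        return (int(x // self.cell_size), int(y // self.cell_size))

    def _positions(self, cell):
        # Derive num_hashes deterministic bit positions for a cell id.
        key = f"{cell[0]}:{cell[1]}".encode()
        for i in range(self.num_hashes):
            digest = hashlib.sha256(key + bytes([i])).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def insert(self, x, y):
        for pos in self._positions(self._cell(x, y)):
            self.bits |= 1 << pos

    def may_contain(self, x, y):
        # True may be a false positive; False is always correct.
        return all(self.bits >> pos & 1 for pos in self._positions(self._cell(x, y)))

sbf = SpatialBloomFilter()
sbf.insert(3.2, 4.7)
print(sbf.may_contain(3.9, 4.1))  # same grid cell -> True
```

A query router would consult such a filter per partition before shipping a point or range probe to it.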
Robust and Discriminative Self-Taught Learning | The lack of training data is a common challenge in many machine learning problems, which is often tackled by semi-supervised learning methods or transfer learning methods. The former requires unlabeled images from the same distribution as the labeled ones and the latter leverages labeled images from related homogeneous tasks. However, these restrictions often cannot be satisfied. To address this, we propose a novel robust and discriminative self-taught learning approach to utilize any unlabeled data without the above restrictions. Our new approach employs a robust loss function to learn the dictionary, and enforces the structured sparse regularization to automatically select the optimal dictionary basis vectors and incorporate the supervision information contained in the labeled data. We derive an efficient iterative algorithm to solve the optimization problem and rigorously prove its convergence. Promising results in extensive experiments have validated the proposed approach. |
Implicit Regularization in Deep Learning | In an attempt to better understand generalization in deep learning, we study several possible explanations. We show that implicit regularization induced by the optimization method is playing a key role in generalization and success of deep learning models. Motivated by this view, we study how different complexity measures can ensure generalization and explain how optimization algorithms can implicitly regularize complexity measures. We empirically investigate the ability of these measures to explain different observed phenomena in deep learning. We further study the invariances in neural networks, suggest complexity measures and optimization algorithms that have similar invariances to those in neural networks and evaluate them on a number of learning tasks. Thesis Advisor: Nathan Srebro Title: Professor |
An Enhanced Security for Government Base on Multifactor Biometric Authentication | This paper demonstrates a multifactor authentication system based on biometric verification. Our system uses the iris as the first factor and the fingerprint as the second factor. An attacker attempting to break into the system must defeat both factors; if one of them is compromised or broken, the attacker still has at least one more barrier to breach before successfully breaking into the target. Furthermore, this system will be implemented to enhance the security of login access control for government systems. |
Using mobile phone text messages to improve insulin injection technique and glycaemic control in patients with diabetes mellitus: a multi-centre study in Turkey. | AIM AND OBJECTIVES
To improve the knowledge and skills of diabetic patients on insulin injections using mobile phone short message services and to evaluate the association of this intervention with metabolic outcomes.
BACKGROUND
Mobile communication technologies are widely used in Turkey, which has a diabetic population of more than 6·5 million. However, only a limited number of studies have used mobile technologies in the challenging and complicated management of diabetes.
DESIGN
A one group pretest-posttest design was used in this study.
METHODS
The study sample consisted of 221 people with type 1 and type 2 Diabetes Mellitus from eight outpatient clinics in six cities in Turkey. The 'Demographic and diabetes-related information Form' and 'Insulin Injection Technique and Knowledge Form' were used in the initial interview. Subsequently, 12 short messages related to insulin administration were sent to patients twice a week for six months. Each patient's level of knowledge and skills regarding both the insulin injection technique and glycaemic control (glycated haemoglobin A1c) levels were measured at three months and six months during the text messaging period and six months later (12 months total) when text messaging was stopped.
RESULTS
The mean age of the patients with diabetes was 39·8 ± 16·2 years (min: 18; max: 75). More than half of the patients were female, with a mean duration of diabetes of 11·01 ± 7·22 years (min: 1; max: 32). Following the text message reminders, the patients' levels of knowledge and skills regarding the insulin injection technique improved at months 3 and 6 (p < 0·05). The patients' A1c levels decreased statistically significantly at the end of months 3, 6 and 12 compared to the baseline values (p < 0·05). The number of insulin injection sites and the frequency of rotation of skin sites for insulin injections also increased.
CONCLUSION
This study demonstrated that a short message services-based information and reminder system on insulin injection administration provided to insulin-dependent patients with diabetes by nurses resulted in improved self-administration of insulin and metabolic control.
RELEVANCE TO CLINICAL PRACTICE
Today, with the increased use of mobile communication technologies, it is possible for nurses to facilitate diabetes management by using these technologies. We believe that mobile technologies, which are not only easy to use and to follow-up with by healthcare providers, are associated with positive clinical outcomes for patients and should be more commonly used in the daily practice of diabetes management. |
The Pacific oyster, Crassostrea gigas, shows negative correlation to naturally elevated carbon dioxide levels: Implications for near-term ocean acidification effects | We report results from an oyster hatchery on the Oregon coast, where intake waters experienced variable carbonate chemistry (aragonite saturation state < 0.8 to > 3.2; pH < 7.6 to > 8.2) in the early summer of 2009. Both larval production and midstage growth (~120 to ~150 µm) of the oyster Crassostrea gigas were significantly negatively correlated with the aragonite saturation state of waters in which larval oysters were spawned and reared for the first 48 h of life. The effects of the initial spawning conditions did not have a significant effect on early-stage growth (growth from D-hinge stage to ~120 µm), suggesting a delayed effect of water chemistry on larval development. Rising atmospheric carbon dioxide (CO2) driven by anthropogenic emissions has resulted in the addition of over 140 Pg-C (1 Pg = 10^15 g) to the ocean (Sabine et al. 2011). The thermodynamics of the reactions between carbon dioxide and water require this addition to cause a decline of ocean pH and carbonate ion concentrations ([CO3^2-]). For the observed change between current-day and preindustrial atmospheric CO2, the surface oceans have lost approximately 16% of their [CO3^2-] and decreased in pH by 0.1 unit, although colder surface waters are likely to have experienced a greater effect (Feely et al. 2009). Projections for the open ocean suggest that wide areas, particularly at high latitudes, could reach low enough [CO3^2-] levels that dissolution of biogenic carbonate minerals is thermodynamically favored by the end of the century (Feely et al. 2009; Steinacher et al. 2009), with implications for commercially significant higher trophic levels (Aydin et al. 2005). 
There is considerable spatial and temporal variability in ocean carbonate chemistry, and there is evidence that these natural variations affect marine biota, with ecological assemblages next to cold-seep high-CO2 sources having been shown to be distinct from those nearby but less affected by the elevated CO2 levels (Hall-Spencer et al. 2008). Coastal environments that are subject to upwelling events also experience exposure to elevated CO2 conditions where deep water enriched by additions of respiratory CO2 is brought up from depth to the nearshore surface by physical processes. Feely et al. (2008) showed that upwelling on the Pacific coast of central North America markedly increased corrosiveness for calcium carbonate minerals in surface nearshore waters. A small but significant amount of anthropogenic CO2 present in the upwelled source waters provided enough additional CO2 to cause widespread corrosiveness on the continental shelves (Feely et al. 2008). Because the source water for upwelling on the North American Pacific coast takes on the order of decades to transit from the point of subduction to the upwelling locales (Feely et al. 2008), this anthropogenic CO2 was added to the water under a substantially lower-CO2 atmosphere than today’s, and water already en route to this location is likely carrying an increasing burden of anthropogenic CO2. Understanding the effects of natural variations in CO2 in these waters on the local fauna is critical for anticipating how more persistently corrosive conditions will affect marine ecosystems. The responses of organisms to rising CO2 are potentially numerous and include negative effects on respiration, motility, and fertility (Pörtner 2008). From a geochemical perspective, however, the easiest process to understand conceptually is that of solid calcium carbonate (CaCO3,s) mineral formation. 
In nearly all ocean surface waters, formation of CaCO3,s is thermodynamically favored by the abundance of the reactants, dissolved calcium ([Ca2+]) and carbonate ([CO3^2-]) ions. While oceanic [Ca2+] is relatively constant at high levels that are well described by conservative relationships with salinity, ocean [CO3^2-] decreases as atmospheric CO2 rises, lowering the energetic favorability of CaCO3,s formation. This energetic favorability is proportional to the saturation state, Ω, defined by |
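The definition that the abstract truncates is the standard one from carbonate chemistry; written out (a reconstruction from standard usage, not quoted from the paper):

```latex
\Omega = \frac{[\mathrm{Ca}^{2+}]\,[\mathrm{CO}_3^{2-}]}{K'_{\mathrm{sp}}}
```

where \(K'_{\mathrm{sp}}\) is the stoichiometric solubility product of the carbonate mineral phase (aragonite here); \(\Omega > 1\) thermodynamically favors precipitation and \(\Omega < 1\) favors dissolution.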
A multicenter prospective cohort study of the Strata valve for the management of hydrocephalus in pediatric patients | Object. Previous reports suggest that adjustable valves may improve the survival of cerebrospinal fluid shunts or relieve shunt-related symptoms. To evaluate these claims, the authors conducted a prospective multicenter cohort study of children who underwent placement of Strata valves. Methods. Patients undergoing initial shunt placement (Group 1) or shunt revision (Group 2) were treated using Strata valve shunt systems. Valves were adjustable to five performance level settings by using an externally applied magnet. The performance levels were checked using an externally applied hand tool and radiography. Patients were followed for 1 year or until they underwent shunt revision surgery. Between March 2000 and February 2002, 315 patients were enrolled in the study. In Group 1 (201 patients) the common causes of hydrocephalus were myelomeningocele (16%), aqueductal stenosis (14%), and hemorrhage (14%). The overall 1-year shunt survival was 67%. Causes of shunt failure were obstruction (17%), overdrainage (1.5%), loculated ventricles (2%), and infection (10.6%). Patients in Group 2 (114 patients) were older and the causes of hydrocephalus were similar. Among patients in Group 2 the 1-year shunt survival was 71%. There were 256 valve adjustments. Symptoms completely resolved (26%) or improved (37%) after 63% of adjustments. When symptoms improved or resolved, they did so within 24 hours in 89% of adjustments. Hand-tool and radiographic readings of valve settings were the same in 234 (98%) of 238 assessments. Conclusions. The 1-year shunt survival for the Strata valve shunt system when used in initial shunt insertion procedures or shunt revisions was similar to those demonstrated for other valves. Symptom relief or improvement following adjustment was observed in 63% of patients. 
Hand-tool assessment of performance level settings reliably predicted radiographic assessments. |
Antioxidant and antimicrobial activities of beet root pomace extracts. | Čanadanović-Brunet J.M., Savatović S.S., Ćetković G.S., Vulić J.J., Djilas S.M., Markov S.L., Cvetković D.D. (2011): Antioxidant and antimicrobial activities of beet root pomace extracts. Czech J. Food Sci., 29: 575–585. We described the in vitro antioxidant and antimicrobial activities of ethanol, acetone, and water extracts of beet root pomace. Total contents of phenolics (316.30–564.50 mg GAE/g of dry extract), flavonoids (316.30–564.50 mg RE/g of dry extract), betacyanins (18.78–24.18 mg/g of dry extract), and betaxanthins (11.19–22.90 mg/g of dry extract) after solid-phase extraction were determined spectrophotometrically. The antioxidant activity was determined by measuring the reducing power and DPPH scavenging activity with the spectrophotometric method, and the hydroxyl and superoxide anion radical scavenging activities by ESR spectroscopy. In general, the reducing power of all the beet root pomace extracts increased with increasing concentrations. The DPPH-free radical scavenging activity of the extracts, expressed as EC50, ranged from 0.133 mg/ml to 0.275 mg/ml. Significant correlation was observed between all phytochemical components and scavenging activity. At 0.5 mg/ml, the ethanol extract completely eliminated hydroxyl radicals generated in a Fenton system, while the same concentration of this extract scavenged 75% of superoxide anion radicals. In antibacterial tests, Staphylococcus aureus and Bacillus cereus showed higher susceptibility than Escherichia coli and Pseudomonas aeruginosa. |
Training and analyzing deep recurrent neural networks | Time series often have a temporal hierarchy, with information that is spread out over multiple time scales. Common recurrent neural networks, however, do not explicitly accommodate such a hierarchy, and most research on them has been focusing on training algorithms rather than on their basic architecture. In this paper we study the effect of a hierarchy of recurrent neural networks on processing time series. Here, each layer is a recurrent network which receives the hidden state of the previous layer as input. This architecture allows us to perform hierarchical processing on difficult temporal tasks, and more naturally capture the structure of time series. We show that they reach state-of-the-art performance for recurrent networks in character-level language modeling when trained with simple stochastic gradient descent. We also offer an analysis of the different emergent time scales. |
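The stacked-RNN idea this abstract describes can be sketched in a few lines (a minimal illustration, not the paper's architecture or training setup; dimensions and initialization are assumptions): each layer is a simple Elman recurrent network whose input at every time step is the hidden state of the layer below, so higher layers can integrate over slower time scales.

```python
# Forward pass of a stack of simple (Elman) RNN layers: layer i receives
# layer i-1's hidden state as its input at each time step.
import numpy as np

rng = np.random.default_rng(0)

def init_layer(in_dim, hid_dim):
    return {
        "W_in": rng.normal(0, 0.1, (hid_dim, in_dim)),
        "W_rec": rng.normal(0, 0.1, (hid_dim, hid_dim)),
        "b": np.zeros(hid_dim),
    }

def stacked_rnn_forward(layers, inputs):
    """Run the stack over a sequence; return the top layer's states."""
    states = [np.zeros(l["W_rec"].shape[0]) for l in layers]
    outputs = []
    for x in inputs:
        h = x
        for i, layer in enumerate(layers):
            states[i] = np.tanh(layer["W_in"] @ h + layer["W_rec"] @ states[i] + layer["b"])
            h = states[i]  # the layer above sees this hidden state as its input
        outputs.append(h)
    return outputs

layers = [init_layer(8, 16), init_layer(16, 16), init_layer(16, 16)]
seq = [rng.normal(size=8) for _ in range(5)]
top = stacked_rnn_forward(layers, seq)
print(len(top), top[0].shape)  # 5 (16,)
```

For character-level language modeling, the top state would feed a softmax over the character vocabulary, trained with stochastic gradient descent as the abstract notes.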
A flat pipeline inspection robot with two wheel chains | This paper presents a new pipeline inspection robot that carries multiple sensors for the inspection of 80–100 mm pipelines. The special feature of this robot is that it realizes driving and steering capability using only two wheel chains. Compared to the popularly employed pipeline robots that use three wheel chains, the new design allows simple robot control and an easy user interface, especially at T-branches. As another advantage, the flat shape of this robot allows mounting additional sensors on both sides of the robot. The kinematics and three control modes are described. Finally, the performance of this robot system is verified by experimentation. |
Cloud-enabled wireless body area networks for pervasive healthcare | With the support of mobile cloud computing, wireless body area networks can be significantly enhanced for massive deployment of pervasive healthcare applications. However, several technical issues and challenges are associated with the integration of WBANs and MCC. In this article, we study a cloud-enabled WBAN architecture and its applications in pervasive healthcare systems. We highlight the methodologies for transmitting vital sign data to the cloud by using energy-efficient routing, cloud resource allocation, semantic interactions, and data security mechanisms. |
OpenSense: open community driven sensing of environment | This paper outlines a vision for community-driven sensing of our environment. At its core, community sensing is a dynamic new form of mobile geosensor network. We believe that community sensing networks, in order to be widely deployable and sustainable, need to follow utilitarian approaches towards sensing and data management. Current projects exploring community sensing have paid less attention to these underlying fundamental principles. We illustrate this vision through OpenSense -- a large project that aims to explore community sensing driven by air pollution monitoring. |
Corpus callosum morphology and its relationship to cognitive function in neurofibromatosis type 1. | Neurofibromatosis type 1 (NF1) is associated with cognitive dysfunction and structural brain abnormalities such as an enlarged corpus callosum. This study aimed to determine the relationship between corpus callosum morphology and cognitive function in children with neurofibromatosis type 1 using quantitative neuroanatomic imaging techniques. Children with neurofibromatosis type 1 (n = 46) demonstrated a significantly larger total corpus callosum and corpus callosum index compared with control participants (n = 30). A larger corpus callosum index in children with neurofibromatosis type 1 was associated with significantly lower IQ, reduced abstract concept formation, reduced verbal memory, and diminished academic ability, specifically reading and math. Our results suggest an enlarged corpus callosum in children with neurofibromatosis type 1 is associated with cognitive impairment and may provide an early structural marker for the children at risk of cognitive difficulties. Cognitive deficits associated with structural brain abnormalities in neurofibromatosis type 1 are unlikely to be reversible and so may not respond to proposed pharmacological therapies for neurofibromatosis type 1-related cognitive impairments. |
Overview of ImageCLEFcaption 2017 - Image Caption Prediction and Concept Detection for Biomedical Images | This paper presents an overview of the ImageCLEF 2017 caption tasks on the analysis of images from the biomedical literature. Two subtasks were proposed to the participants: a concept detection task and a caption prediction task, both using only images as input. The two subtasks tackle the problem of providing image interpretation by extracting concepts and predicting a caption based on the visual information of an image alone. A dataset of 184,000 figure-caption pairs from the biomedical open access literature (PubMed Central) is provided as a testbed, with the majority used as training data and 10,000 each as validation and test data. Across the two tasks, 11 participating groups submitted 71 runs. While the domain remains challenging and the data highly heterogeneous, we note some surprisingly good results on this difficult task, with a quality that could be beneficial for health applications by better exploiting the visual content of biomedical figures. |
Health-related quality of life (HRQOL) in family members of cancer victims: results from a longitudinal intervention study in Norway and Sweden. | This study compared the health-related quality of life (HRQOL) of family members of patients who participated in a program of palliative care (intervention family members) with those in conventional care (control family members). The HRQOL was measured by the short-form (SF-36) health survey questionnaire, including eight subscales. The longitudinal intervention study includes two sites: Trondheim, Norway and Malmö, Sweden. Our first hypothesis was that the HRQOL of the family members would deteriorate over time in the terminal phase and reach a low point a few months after the death of the patients, and thereafter gradually increase. This hypothesis was fully supported by the trajectories for five scales, role limitation due to physical problems, vitality, social functioning, role limitation due to emotional problems, and mental health; but only partially so for the remaining three scales, physical functioning, bodily pain, and general health perception. From a second hypothesis, we expected the trajectories of the HRQOL scale scores for the two groups to show an increasing difference over time in quality of life in favor of the intervention group. This was the case for two of the scales: role limitation due to emotional problems and mental health. Before we may reach a definitive conclusion on the effects of palliative care programs for the HRQOL of family members, we need further longitudinal intervention studies with large samples. |
Question Answering Over Knowledge Graphs: Question Understanding Via Template Decomposition | The gap between unstructured natural language and structured data makes it challenging to build a system that supports using natural language to query large knowledge graphs. Many existing methods construct a structured query for the input question based on a syntactic parser. Once the input question is parsed incorrectly, a false structured query will be generated, which may result in false or incomplete answers. The problem gets worse especially for complex questions. In this paper, we propose a novel systematic method to understand natural language questions by using a large number of binary templates rather than semantic parsers. As sufficient templates are critical in the procedure, we present a low-cost approach that can build a huge number of templates automatically. To reduce the search space, we carefully devise an index to facilitate the online template decomposition. Moreover, we design effective strategies to perform the two-level disambiguations (i.e., entity-level ambiguity and structure-level ambiguity) by considering the query semantics. Extensive experiments over several benchmarks demonstrate that our proposed approach is effective as it significantly outperforms state-of-the-art methods in terms of both precision and recall. PVLDB Reference Format: Weiguo Zheng, Jeffrey Xu Yu, Lei Zou, and Hong Cheng. Question Answering Over Knowledge Graphs: Question Understanding Via Template Decomposition. PVLDB, 11 (11): 1373-1386, 2018. DOI: https://doi.org/10.14778/3236187.3236192. |
Convolutional neural networks at constrained time cost | Though recent advanced convolutional neural networks (CNNs) have been improving the image recognition accuracy, the models are getting more complex and time-consuming. For real-world applications in industrial and commercial scenarios, engineers and developers are often faced with the requirement of constrained time budget. In this paper, we investigate the accuracy of CNNs under constrained time cost. Under this constraint, the designs of the network architectures should exhibit as trade-offs among the factors like depth, numbers of filters, filter sizes, etc. With a series of controlled comparisons, we progressively modify a baseline model while preserving its time complexity. This is also helpful for understanding the importance of the factors in network designs. We present an architecture that achieves very competitive accuracy in the ImageNet dataset (11.8% top-5 error, 10-view test), yet is 20% faster than “AlexNet” [14] (16.0% top-5 error, 10-view test). |
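The trade-off this abstract describes can be made concrete with the usual time-complexity estimate for a convolutional stack, roughly proportional to the sum over layers of (input channels × filter size² × output channels × output map size²). The helper below is an illustration of that bookkeeping, not the paper's method; the layer shapes used are assumptions for demonstration.

```python
# Estimate the relative time cost of a conv layer stack:
# cost ~ sum_l (n_in * s^2 * n_out * m^2) over layers l, where n_in/n_out are
# channel counts, s the filter size, and m the output feature-map size.
def conv_stack_cost(layers):
    """layers: list of (in_channels, filter_size, out_channels, out_map_size)."""
    return sum(n_in * s * s * n_out * m * m for n_in, s, n_out, m in layers)

# Hypothetical two-layer stack and a variant with half the filters in layer 2:
base = conv_stack_cost([(3, 7, 96, 55), (96, 5, 256, 27)])
cheaper = conv_stack_cost([(3, 7, 96, 55), (96, 5, 128, 27)])
print(cheaper < base)  # True: halving a layer's filter count halves its term
```

Under a fixed budget for this sum, one can then trade depth against filter counts and filter sizes, which is the kind of controlled comparison the paper performs.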
Comparison of Using SVC and STATCOM for Wind Farm Integration | This paper studies system stability of wind farms based on fixed speed induction generators (FSIG) and investigates the use of the static Var compensator (SVC) and static synchronous compensator (STATCOM) for wind farm integration. Due to the nature of asynchronous operation, system instability of wind farms based on FSIG is largely caused by the excessive reactive power absorption by FSIG after fault due to the large rotor slip gained during fault. Wind farm models based on FSIG and equipped with either SVC or STATCOM are developed in PSCAD/EMTDC. It was found that the SVC and STATCOM considerably improve the system stability during and after disturbances, especially when the network is weak. Compared to SVC, STATCOM gave a much better dynamic performance, and provided a better reactive power support to the network, as its maximum reactive current output was virtually independent of the voltage at the point of common coupling (PCC). |
TCA: An Efficient Two-Mode Meta-Heuristic Algorithm for Combinatorial Test Generation (T) | Covering arrays (CAs) are often used as test suites for combinatorial interaction testing to discover interaction faults of real-world systems. Most real-world systems involve constraints, so improving algorithms for covering array generation (CAG) with constraints is beneficial. Two popular methods for constrained CAG are greedy construction and meta-heuristic search. Recently, a meta-heuristic framework called two-mode local search has shown great success in solving classic NP-hard problems. We are interested in whether this method is also powerful in solving the constrained CAG problem. This work proposes a two-mode meta-heuristic framework for solving constrained CAG efficiently and presents a new meta-heuristic algorithm called TCA. Experiments show that TCA significantly outperforms state-of-the-art solvers on 3-way constrained CAG. Further experiments demonstrate that TCA also performs much better than its competitors on 2-way constrained CAG. |
Motion of the rearfoot, ankle and subtalar joints and ankle moments when wearing lateral wedge insoles – results from bone anchored markers | Background Knee osteoarthritis is a debilitating condition and increased dynamic loading at the knee has been linked with increased progression of the disease. Lateral wedge insoles have been used in clinical practice since the late 1980s. It is theorised that lateral wedge insoles increase the subtalar joint valgus orientation and increase the ankle valgus moment [1], with subsequent reduced knee varus moments [2]. Results have shown that both clinical success and reductions in knee loading vary between people. Differences could be due to person specific foot biomechanics. The aim of this study was to determine the changes in frontal plane foot and ankle motion and moment due to a lateral wedge orthosis. |
Detecting Malicious Web Links and Identifying Their Attack Types | Malicious URLs have been widely used to mount various cyber attacks including spamming, phishing and malware. Detection of malicious URLs and identification of threat types are critical to thwart these attacks. Knowing the type of a threat enables estimation of the severity of the attack and helps adopt an effective countermeasure. Existing methods typically detect malicious URLs of a single attack type. In this paper, we propose a method that uses machine learning to detect malicious URLs of all the popular attack types and to identify the nature of the attack a malicious URL attempts to launch. Our method uses a variety of discriminative features including textual properties, link structures, webpage contents, DNS information, and network traffic. Many of these features are novel and highly effective. Our experimental studies with 40,000 benign URLs and 32,000 malicious URLs obtained from real-life Internet sources show that our method delivers a superior performance: the accuracy was over 98% in detecting malicious URLs and over 93% in identifying attack types. We also report our studies on the effectiveness of each group of discriminative features, and discuss their evadability. |
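The "textual properties" family of features can be illustrated with a short sketch (the feature names here are illustrative assumptions, not the paper's exact feature set): simple lexical statistics of a URL that a classifier could consume alongside link-structure, content, DNS, and traffic features.

```python
# Extract a few lexical features from a URL string for ML-based detection.
from urllib.parse import urlparse

def lexical_features(url):
    parsed = urlparse(url)
    host = parsed.netloc
    return {
        "url_length": len(url),
        "host_length": len(host),
        "num_dots_in_host": host.count("."),
        "num_digits": sum(ch.isdigit() for ch in url),
        # Raw-IP hosts are a classic phishing signal.
        "has_ip_host": host.replace(".", "").isdigit(),
        "num_special_chars": sum(url.count(c) for c in "-_@?=&%"),
    }

feats = lexical_features("http://192.168.0.1/login?user=admin")
print(feats["has_ip_host"])  # True
```

In a full pipeline such a feature dictionary would be vectorized and fed to a multi-class classifier that predicts both maliciousness and attack type.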
Silver nanoparticles and their orthopaedic applications. | Implant-associated infection is a major source of morbidity in orthopaedic surgery. There has been extensive research into the development of materials that prevent biofilm formation, and hence, reduce the risk of infection. Silver nanoparticle technology is receiving much interest in the field of orthopaedics for its antimicrobial properties, and the results of studies to date are encouraging. Antimicrobial effects have been seen when silver nanoparticles are used in trauma implants, tumour prostheses, bone cement, and also when combined with hydroxyapatite coatings. Although there are promising results with in vitro and in vivo studies, the number of clinical studies remains small. Future studies will be required to explore further the possible side effects associated with silver nanoparticles, to ensure their use in an effective and biocompatible manner. Here we present a review of the current literature relating to the production of nanosilver for medical use, and its orthopaedic applications. |
Ambulation training with and without partial weightbearing after traumatic brain injury: results of a randomized, controlled trial. | OBJECTIVE
To test the hypothesis that 8 wks of partial weight-bearing gait retraining improves functional ambulation to a greater extent than traditional physical therapy in individuals after traumatic brain injury.
DESIGN
A randomized, open-label, controlled, cohort study was conducted at two inpatient university-based rehabilitation hospitals. A total of 38 adults with a primary diagnosis of traumatic brain injury and significant gait abnormalities received either 8 wks of standard physical therapy or physical therapy supplemented with partial weight-bearing gait training twice weekly.
RESULTS
Significant (P < 0.05) improvements were detected in both groups on Functional Ambulation Category, Standing Balance Scale, Rivermead Mobility Index, and FIM. However, no differences were found between the treatment groups.
CONCLUSIONS
Results did not support the hypothesis that 8 wks of partial weight-bearing gait retraining improves functional ambulation to a greater extent than traditional physical therapy in individuals after traumatic brain injury based on common clinical measures. |
FLIR Image Segmentation and Natural Object Classification | In this paper we compare four classification techniques for classifying texture data of various natural objects found in FLIR images. The techniques compared include Linear Discriminant Analysis, Mean Classifier and two different models of K-Nearest Neighbour methods. Hermite functions are used for texture feature extraction from segmented regions of interest in natural scenes taken as a video sequence. A total of 2680 samples for a total of twelve different classes are used for object recognition. The results on correctly identifying twelve natural objects in scenes are compared across the four classifiers on both unnormalised and normalised data. On unnormalised data, the average best recognition rate obtained using a ten fold cross-validation is 96.5%, and on normalised data it is 86.1% with a single nearest neighbour technique. |
The Use of Zigs and Zags to Reduce Scarring over “Keloid Triangles” during Excisional Surgery: Biomechanics, Review and Recommendations | Aim: The sternal region, cervico-mandibular region and the intra-mammary region have been the bane of many cutaneous surgeons, with a higher propensity for poor scarring and wound complications. In this article, the author undertakes a review of different methods of breaking up scars by utilizing zigs and zags, and conducts a pigskin study to measure the reduction in tension that can be achieved by using a simple zigzag technique while performing excisions. Methods: A pigskin study conducted into the use of the simple zigzag to reduce the tension (and thereby scarring) of surgical wounds is reported here, and comparison and review is undertaken of the biomechanics of elliptical excisions and traditional Z-plasties. Results: Using a simple zigzag reduces tension across the midpoint of the scar more effectively than a Z-plasty or a simple elliptical excision. Conclusion: The techniques of breaking up a scar or incision line by using zigs and zags, as a means to reduce scarring, are not new. However, each of these techniques has specific advantages and disadvantages that need consideration by the surgeon. In this paper, a pigskin study is conducted into the use of the simple zigzag to reduce the tension (and thereby reduce the risk of poor scarring) of surgical wounds. |
Security and privacy framework for ubiquitous healthcare IoT devices | With the support of wearable devices, healthcare services have entered a new phase in serving patients' needs. The new technology adds facilities and convenience to healthcare services, and changes patients' lifestyles from traditional monitoring to remote home monitoring. This new approach faces many security challenges, as sensitive data are transferred through different types of channels. There are four main dimensions to the security scope: trusted sensing, computation, communication, and privacy and digital forensics. In this paper we focus on the security challenges of wearable devices and the IoT, and their advantages in healthcare sectors. |
Leafsnap: A Computer Vision System for Automatic Plant Species Identification | We describe the first mobile app for identifying plant species using automatic visual recognition. The system – called Leafsnap – identifies tree species from photographs of their leaves. Key to this system are computer vision components for discarding non-leaf images, segmenting the leaf from an untextured background, extracting features representing the curvature of the leaf’s contour over multiple scales, and identifying the species from a dataset of the 184 trees in the Northeastern United States. Our system obtains state-of-the-art performance on the real-world images from the new Leafsnap Dataset – the largest of its kind. Throughout the paper, we document many of the practical steps needed to produce a computer vision system such as ours, which currently has nearly a million users. |
Public Health Surveillance After the 2010 Haiti Earthquake: the Experience of Médecins Sans Frontières | Background In January 2010, Haiti was struck by a powerful earthquake, killing and wounding hundreds of thousands and leaving millions homeless. In order to better understand the severity of the crisis, and to provide early warning of epidemics or deteriorations in the health status of the population, Médecins Sans Frontières established surveillance for infections of epidemic potential and for death rates and malnutrition prevalence. Methods Trends in infections of epidemic potential were detected through passive surveillance at health facilities serving as sentinel sites. Active community surveillance of death rates and malnutrition prevalence was established through weekly home visits. Results There were 102,054 consultations at the 15 reporting sites during the 26 week period of operation. Acute respiratory infections, acute watery diarrhoea and malaria/fever of unknown origin accounted for the majority of proportional morbidity among the diseases under surveillance. Several alerts were triggered through the detection of immediately notifiable diseases and increasing trends in some conditions. Crude and under-5 death rates, and acute malnutrition prevalence, were below emergency thresholds. Conclusion Disease surveillance after disasters should include an alert and response component, requiring investment of resources in informal networks that improve sensitivity to alerts as well as on the more common systems of data collection, compilation and analysis. Information sharing between partners is necessary to strengthen early warning systems. Community-based surveillance of mortality and malnutrition is feasible but requires careful implementation and validation. |
Driving Semantic Parsing from the World's Response | Current approaches to semantic parsing, the task of converting text to a formal meaning representation, rely on annotated training data mapping sentences to logical forms. Providing this supervision is a major bottleneck in scaling semantic parsers. This paper presents a new learning paradigm aimed at alleviating the supervision burden. We develop two novel learning algorithms capable of predicting complex structures which rely only on a binary feedback signal based on the context of an external world. In addition, we reformulate the semantic parsing problem to reduce the dependency of the model on syntactic patterns, thus allowing our parser to scale better using less supervision. Our results surprisingly show that, without using any annotated meaning representations, learning with a weak feedback signal is capable of producing a parser that is competitive with fully supervised parsers. |
Behaviour Centred Design: towards an applied science of behaviour change | Behaviour change has become a hot topic. We describe a new approach, Behaviour Centred Design (BCD), which encompasses a theory of change, a suite of behavioural determinants and a programme design process. The theory of change is generic, assuming that successful interventions must create a cascade of effects via environments, through brains, to behaviour and hence to the desired impact, such as improved health. Changes in behaviour are viewed as the consequence of a reinforcement learning process involving the targeting of evolved motives and changes to behaviour settings, and are produced by three types of behavioural control mechanism (automatic, motivated and executive). The implications are that interventions must create surprise, revalue behaviour and disrupt performance in target behaviour settings. We then describe a sequence of five steps required to design an intervention to change specific behaviours: Assess, Build, Create, Deliver and Evaluate. The BCD approach has been shown to change hygiene, nutrition and exercise-related behaviours and has the advantages of being applicable to product, service or institutional design, as well as being able to incorporate future developments in behaviour science. We therefore argue that BCD can become the foundation for an applied science of behaviour change. |
Species richness in soil bacterial communities: a proposed approach to overcome sample size bias. | Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that regardless of the utilized approach, species richness estimates obtained are dependent on the size of the analyzed clone libraries. We here propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (Maximum likelihood-based and rarefaction curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. Species richness estimates obtained increased with the increase in library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909. |
Coffee Ingestion Enhances 1-Mile Running Race Performance. | CONTEXT
Caffeine, often in the form of coffee, is frequently used as a supplement by athletes in an attempt to facilitate improved performance during exercise.
PURPOSE
To investigate the effectiveness of coffee ingestion as an ergogenic aid prior to a 1-mile (1609 m) race.
METHODS
In a double-blind, randomized, cross-over, and placebo-controlled design, 13 trained male runners completed a 1-mile race 60 minutes following the ingestion of 0.09 g·kg-1 coffee (COF), 0.09 g·kg-1 decaffeinated coffee (DEC), or a placebo (PLA). In each trial, the drink was dissolved in 300 mL of hot water.
RESULTS
The race completion time was 1.3% faster following the ingestion of COF (04:35.37 [00:10.51] min:s.ms) compared with DEC (04:39.14 [00:11.21] min:s.ms; P = .018; 95% confidence interval [CI], -0.11 to -0.01; d = 0.32) and 1.9% faster compared with PLA (04:41.00 [00:09.57] min:s.ms; P = .006; 95% CI, -0.15 to -0.03; d = 0.51). A large trial and time interaction for salivary caffeine concentration was observed (P < .001; [Formula: see text]), with a very large increase (6.40 [1.57] μg·mL-1; 95% CI, 5.5-7.3; d = 3.86) following the ingestion of COF. However, only a trivial difference between DEC and PLA was observed (P = .602; 95% CI, -0.09 to 0.03; d = 0.17). Furthermore, only trivial differences were observed for blood glucose (P = .839; [Formula: see text]) and lactate (P = .096; [Formula: see text]) and maximal heart rate (P = .286; [Formula: see text]) between trials.
CONCLUSIONS
The results of this study show that 60 minutes after ingesting 0.09 g·kg-1 of caffeinated coffee, 1-mile race performance was enhanced by 1.9% and 1.3% compared with placebo and decaffeinated coffee, respectively, in trained male runners. |
Advanced persistent threats: Behind the scenes | Advanced persistent threats (APTs) pose a significant risk to nearly every infrastructure. Due to the sophistication of these attacks, they are able to bypass existing security systems and largely infiltrate the target network. The prevention and detection of APT campaigns is also challenging, because the attackers constantly change and evolve their advanced techniques and methods to stay undetected. In this paper we analyze 22 different APT reports and give an overview of the techniques and methods used. The analysis focuses on the three main phases of APT campaigns, which allows us to identify the relevant characteristics of such attacks. For each phase we describe the most commonly used techniques and methods. Through this analysis we reveal several relevant characteristics of APT campaigns, for example that the use of 0-day exploits is not common in APT attacks. Furthermore, the analysis shows that the dumping of credentials is a relevant step in the lateral movement phase for most APT campaigns. Based on the identified characteristics, we also propose concrete prevention and detection approaches that make it possible to identify crucial malicious activities performed during APT campaigns. |
Validating a Measurement Tool of Presence in Online Communities of Inquiry | This article examines work related to the development and validation of a measurement tool for the Community of Inquiry (CoI) framework in online settings. The framework consists of three elements: social presence, teaching presence and cognitive presence, each of which is integral to the instrument. The 34 item instrument, and thus framework, was tested after being administered at four institutions in the Summer of 2007. The article also includes a discussion of implications for the future use of the CoI survey and the CoI framework itself. |
ARTT taxonomy and cyber-attack framework | The security of computer systems has become essential, especially in the face of critical cyber-attacks that can compromise these systems, because any act against a system intends to harm one of the security properties (confidentiality, integrity and availability). Studying computer attacks is also essential for designing attack models that protect against attacks by modeling them. The development of taxonomies makes it possible to characterize and classify attacks, and thereby to understand them. In computer security taxonomy, we can distinguish two broad categories: cyber-attack taxonomy and cyber-security taxonomy. In this paper, we propose a taxonomy for cyber-attacks according to an attacker's vision and the aspects of achieving an attack. This taxonomy is based on 4 dimensions: Attack vector, Result, Type and Target. To generalize our approach, we have used the framework of the Discrete EVent system Specification (DEVS). This framework depicts the overall vision of cyber-attacks. To partially validate our work, a simulation is done on a case study of buffer overflow. A DEVS model is described and a simulation is done via this formalism. This case study aims to reinforce our proposal. |
Vaginal labiaplasty: current practices and a simplified classification system for labial protrusion. | BACKGROUND
Vaginal labiaplasty has been described for the management of functional and aesthetic problems associated with protrusion of the labia minora. Despite increasing numbers of procedures performed, there is a paucity of data to guide treatment paradigms. This systematic review aims to establish a simple, unifying classification scheme for labial protrusion and summarize current labiaplasty techniques and practices.
METHODS
A systematic literature review was performed using the PubMed database. Additional articles were selected after reviewing references of identified articles.
RESULTS
The search returned 247 articles. After applying inclusion criteria to identify prospective and retrospective studies evaluating different techniques, outcomes, complications, and patient satisfaction, 19 articles were selected. Labiaplasty of the labia minora was described in 1949 patients. Seven different surgical techniques were used for labiaplasty, including deepithelialization, direct excision, W-shaped resection, wedge resection, composite reduction, Z-plasty, and laser excision. Patient satisfaction rates for each technique ranged from 94 to 100 percent. The most common postoperative complication for all techniques was wound dehiscence (4.7 percent). Key areas for perioperative patient management were defined.
CONCLUSIONS
Labiaplasty is safe and carries a high satisfaction rate. However, current practices remain exceedingly diverse. The authors propose a simplified classification system based on the distance of the lateral edge of the labia minora from that of the labia majora, rather than from the introitus. Key areas for perioperative patient management include patient anesthesia, resection technique used, wound closure, and postoperative care. Further randomized studies using a standardized classification system are required to better compare different techniques and establish best practices. |
Compact ASIC Architectures for the 512-Bit Hash Function Whirlpool | Compact hardware architectures are proposed for the ISO/IEC 10118-3 standard hash function Whirlpool. In order to reduce the circuit area, the 512-bit function block ρ[k] for the main datapath is divided into smaller sub-blocks with 256-, 128-, or 64-bit buses, and the sub-blocks are used iteratively. Six architectures are designed by combining the three different datapath widths and two data scheduling techniques: interleave and pipeline. The six architectures in conjunction with three different types of S-box were synthesized using a 90-nm CMOS standard cell library, with two optimization options: size and speed. A total of 18 implementations were obtained, and their performances were compared with conventional designs using the same standard cell library. The highest hardware efficiency (defined by throughput per gate) of 372.3 Kbps/gate was achieved by the proposed pipeline architecture with the 256-bit datapath optimized for speed. The interleaved architecture with the 64-bit datapath optimized for size showed the smallest size of 13.6 Kgates, which requires only 46% of the resources of the conventional compact architecture. |
Multimedia learning object to build cognitive understanding in learning introductory programming | Programming is taught as a foundation module at the beginning of undergraduate studies and/or during a foundation year. Learning introductory programming languages such as Pascal, Basic / C (procedural) and C++ / Java (object oriented) requires learners to understand the underlying programming paradigm, syntax, logic and structure. Learning to program is considered hard for novice learners, and it is important to understand what makes learning to program so difficult and how students learn.
The prevailing focus on multimedia learning objects provides a promising approach to better knowledge transfer. This project aims to investigate: (a) students' perceptions of learning to program and its difficulties; and (b) the effectiveness of multimedia learning objects in learning an introductory programming language in a face-to-face learning environment. |
Security Concerns in Popular Cloud Storage Services | The authors provide a systematic security analysis on the sharing methods of three major cloud storage and synchronization services: Dropbox, Google Drive, and Microsoft SkyDrive. They show that all three services have security weaknesses that may result in data leakage without users' awareness. |
Mining Social Media for Newsgathering | Social media is becoming an increasingly important data source for learning about and tracking breaking news. This is possible thanks to mobile devices connected to the Internet, which allow anyone to post updates from anywhere, leading in turn to a growing presence of citizen journalism. Consequently, social media has become a go-to resource for journalists during newsgathering. Use of social media for newsgathering is however challenging, and suitable tools are needed in order to facilitate access to useful information for reporting. In this paper, we provide an overview of research in data mining and natural language processing for mining social media for newsgathering. We discuss seven different tasks that researchers have worked on to mitigate the challenges inherent to social media newsgathering: event detection, summarisation, news recommenders, content verification, finding information sources, development of newsgathering dashboards and other tasks. We outline the progress made so far in the field, summarise the current challenges as well as discuss future directions in the use of computational journalism to assist with social media newsgathering. This survey paper is relevant to computer scientists researching news in social media as well as for interdisciplinary researchers interested in the intersection of computer science and journalism. |
Natural mapping and intuitive interaction in videogames | Videogame control interfaces continue to evolve beyond their traditional roots, with devices encouraging more natural forms of interaction growing in number and pervasiveness. Yet little is known about their true potential for intuitive use. This paper proposes methods to leverage existing intuitive interaction theory for games research, specifically by examining different types of naturally mapped control interfaces for videogames using new measures for previous player experience. Three commercial control devices for a racing game were categorised using an existing typology, according to how the interface maps physical control inputs with the virtual gameplay actions. The devices were then used in a within-groups (n=64) experimental design aimed at measuring differences in intuitive use outcomes. Results from mixed design ANOVA are discussed, along with implications for the field. |
A Strategy Development Process for Enterprise Content Management | Today, many organizations maintain a variety of systems and databases in a complex ad-hoc architecture that does not seem to fulfill the needs for company-wide unstructured information management in business processes, business functions, and the extended enterprise. We describe a framework to implement Enterprise Content Management (ECM) in order to address this problem. ECM refers to the technologies, tools, and methods used to capture, manage, store, preserve, and deliver content (e.g. documents, graphics, drawings, web pages) across an enterprise. The framework helps to select content objects that can be brought under ECM to create business value and guide the IT investments needed to realize ECM. The framework was tested in a large high tech organization. |
Working memory deficits in adults with attention-deficit/hyperactivity disorder (ADHD): an examination of central executive and storage/rehearsal processes. | The current study was the first to use a regression approach to examine the unique contributions of central executive (CE) and storage/rehearsal processes to working memory (WM) deficits in adults with ADHD. Thirty-seven adults (ADHD = 21, HC = 16) completed phonological (PH) and visuospatial (VS) working memory tasks. While both groups performed significantly better during the PH task relative to the VS task, adults with ADHD exhibited significant deficits across both working memory modalities. Further, the ADHD group recalled disproportionately fewer PH and VS stimuli as set-size demands increased. Overall, the CE and PH storage/rehearsal processes of adults with ADHD were both significantly impaired relative to those of the healthy control adults; however, the magnitude of the CE effect size was much smaller compared to previous studies of children with the disorder. Collectively, results provide support for a lifelong trajectory of WM deficits in ADHD. |
Mobility of an in-pipe robot with screw drive mechanism inside curved pipes | This paper presents motion analyses and experiments of an in-pipe robot with a screw drive mechanism as it moves inside curved pipes. The robot is driven by only one motor and composed of two units: one unit works as a rotator and the other works as a stator. A screw drive mechanism is well suited to traveling inside small-diameter pipelines for inspection, because the space available for the robot's motion is narrow and the number of actuators can be reduced (to a minimum of just one). Therefore, the robot can be smaller and lighter, and its control becomes easier. Although many kinds of in-pipe robots with screw drive mechanisms have been reported to date, the most important problem of such a drive mechanism is the difficulty of traveling in pipes other than straight ones, such as curved pipes (elbows and bends), branch pipes (e.g., T-shapes), and pipes whose diameter changes. A concrete analysis has not been done yet. In this paper, we concentrate on the helical driving motion of the robot inside a curved pipe and finally perform experiments to determine its characteristics. |
Detection and Resolution of Rumours in Social Media: A Survey | Despite the increasing use of social media platforms for information and news gathering, its unmoderated nature often leads to the emergence and spread of rumours, i.e., items of information that are unverified at the time of posting. At the same time, the openness of social media platforms provides opportunities to study how users share and discuss rumours, and to explore how to automatically assess their veracity, using natural language processing and data mining techniques. In this article, we introduce and discuss two types of rumours that circulate on social media: long-standing rumours that circulate for long periods of time, and newly emerging rumours spawned during fast-paced events such as breaking news, where reports are released piecemeal and often with an unverified status in their early stages. We provide an overview of research into social media rumours with the ultimate goal of developing a rumour classification system that consists of four components: rumour detection, rumour tracking, rumour stance classification, and rumour veracity classification. We delve into the approaches presented in the scientific literature for the development of each of these four components. We summarise the efforts and achievements so far toward the development of rumour classification systems and conclude with suggestions for avenues for future research in social media mining for the detection and resolution of rumours. |
Forum Thread Recommendation for Massive Open Online Courses | Recently, Massive Open Online Courses (MOOCs) have garnered a high level of interest in the media. With larger and larger numbers of students participating in each course, finding useful and informative threads in increasingly crowded course discussion forums becomes a challenging issue for students. In this work, we address this thread overload problem by taking advantage of an adaptive feature-based matrix factorization framework to make thread recommendations. A key component of our approach is a feature space design that effectively characterizes student behaviors in the forum in order to match threads and users. This effort includes content level modeling, social peer connections, and other forum activities. The results from our experiment conducted on one MOOC course show promise that our thread recommendation method has potential to direct students to threads they might be interested in. |
Essence, Reflexion, and Immediacy in Hegel's Science of Logic | This companion provides original, scholarly, and cutting-edge essays that cover the whole range of Hegel's mature thought and his lasting influence. * A comprehensive guide to one of the most important modern philosophers * Essays are written in an accessible manner and draw on the most up-to-date Hegel research * Contributions are drawn from across the world and from a wide variety of philosophical approaches and traditions * Examines Hegel's influence on a range of thinkers, from Kierkegaard and Marx to Heidegger, Adorno and Derrida * Begins with a chronology of Hegel's life and work and is then split into sections covering topics such as Philosophy of Nature, Aesthetics, and Philosophy of Religion |
Family Business in the Middle East: An Exploratory Study of Retail Management in Kuwait and Lebanon | The Middle East is a growing, lucrative marketplace that has recently captured the interest of the world for political as well as economic reasons due to the War in Iraq, which began in 2003. This exploratory study examines the relationship between retail small/medium enterprises (SMEs) that are family business owned, organizational commitment, and management and employee perceptions of customer service on a number of dimensions. The results suggest that managers and employees of family-owned businesses in the Middle East behave in ways similar to those in Western countries; however, there are differences, probably related to cultural characteristics. The Middle East is a richly diverse region, a myriad of unique cultures. As the market becomes more sophisticated, the importance of service quality increases. Global retailers can benefit from this study by better understanding the managers and employees in the region and the pivotal role of the family on business. Implications for practice are discussed. |
Audit-Free Cloud Storage via Deniable Attribute-Based Encryption | Cloud storage services have become increasingly popular. Because of the importance of privacy, many cloud storage encryption schemes have been proposed to protect data from those who do not have access. All such schemes assumed that cloud storage providers are safe and cannot be hacked; however, in practice, some authorities (i.e., coercers) may force cloud storage providers to reveal user secrets or confidential data on the cloud, thus altogether circumventing storage encryption schemes. In this paper, we present our design for a new cloud storage encryption scheme that enables cloud storage providers to create convincing fake user secrets to protect user privacy. Since coercers cannot tell if obtained secrets are true or not, the cloud storage providers ensure that user privacy is still securely protected. |
A Survey of Communication Sub-systems for Intersatellite Linked Systems and CubeSat Missions | Intersatellite links or crosslinks provide direct connectivity between two or more satellites, thus eliminating the need for intermediate ground stations when sending data. Intersatellite links have been considered for satellite constellation missions involving earth observation and communications. Historically, a large satellite system has needed an extremely high financial budget. However, the advent of the successful CubeSat platform allows for small satellites of less than one kilogram. This low-mass pico-satellite class platform could provide financially feasible support for large platform satellite constellations. This article surveys past and planned large intersatellite linking systems. Then, the article chronicles CubeSat communication subsystems used historically and in the near future. Finally, we examine the history of inter-networking protocols in space and open research issues with the goal of moving towards the next generation intersatellite-linking constellation supported by CubeSat platform satellites. |
Mirage: Toward a Stealthier and Modular Malware Analysis Sandbox for Android | Nowadays, malware is affecting not only PCs but also mobile devices, which became pervasive in everyday life. Mobile devices can access and store personal information (e.g., location, photos, and messages) and thus are appealing to malware authors. One of the most promising approaches to analyze malware is by monitoring its execution in a sandbox (i.e., via dynamic analysis). In particular, most malware sandboxing solutions for Android rely on an emulator, rather than a real device. This motivates malware authors to include runtime checks in order to detect whether the malware is running in a virtualized environment. In that case, the malicious app does not trigger the malicious payload. The presence of differences between real devices and Android emulators started an arms race between security researchers and malware authors, where the former want to hide these differences and the latter try to seek them out. In this paper we present Mirage, a malware sandbox architecture for Android focused on dynamic analysis evasion attacks. We designed the components of Mirage to be extensible via software modules, in order to build specific countermeasures against such attacks. To the best of our knowledge, Mirage is the first modular sandbox architecture that is robust against sandbox detection techniques. As a representative case study, we present a proof of concept implementation of Mirage with a module that tackles evasion attacks based on sensors API return values. |
Shoulder injuries in the throwing athlete. | Pathologic conditions in the shoulder of a throwing athlete frequently represent a breakdown of multiple elements of the shoulder restraint system, both static and dynamic, and also a breakdown in the kinetic chain. Physical therapy and rehabilitation should be, with only a few exceptions, the primary treatment for throwing athletes before operative treatment is considered. Articular-sided partial rotator cuff tears and superior labral tears are common in throwing athletes. Operative treatment can be successful when nonoperative measures have failed. Throwing athletes who have a glenohumeral internal rotation deficit have a good response, in most cases, to stretching of the posteroinferior aspect of the capsule. |
Factorized Hidden Layer Adaptation for Deep Neural Network Based Acoustic Modeling | In this paper, we propose the factorized hidden layer (FHL) approach to adapt deep neural network (DNN) acoustic models for automatic speech recognition (ASR). FHL aims at modeling speaker-dependent (SD) hidden layers by representing an SD affine transformation as a linear combination of bases. The combination weights are low-dimensional speaker parameters that can be initialized using speaker representations like i-vectors and then reliably refined in an unsupervised adaptation fashion. Therefore, our method provides an efficient way to perform both adaptive training and test-time adaptation. Experimental results have shown that FHL adaptation improves ASR performance significantly, compared to standard DNN models, as well as other state-of-the-art DNN adaptation approaches, such as training with speaker-normalized CMLLR features, speaker-aware training using i-vectors, and learning hidden unit contributions (LHUC). For Aurora 4, FHL achieves 3.8% and 2.3% absolute improvements over the standard DNNs trained on the LDA + STC and CMLLR features, respectively. It also achieves 1.7% absolute performance improvement over a system that combines i-vector adaptive training with LHUC adaptation. For the AMI dataset, FHL achieved 1.4% and 1.9% absolute improvements over the sequence-trained CMLLR baseline systems, for the IHM and SDM tasks, respectively. |
Evidence for biological effects of metformin in operable breast cancer: a pre-operative, window-of-opportunity, randomized trial | Metformin may reduce the incidence of breast cancer and enhance response to neoadjuvant chemotherapy in diabetic women. This trial examined the effects of metformin on Ki67 and gene expression in primary breast cancer. Non-diabetic women with operable invasive breast cancer received pre-operative metformin. A pilot cohort of eight patients had core biopsy of the cancer at presentation, a week later (without treatment; internal control), and then following metformin 500 mg o.d. for 1 week, increased to 1 g b.d. for a further week and continued until surgery. A further 47 patients had core biopsy at diagnosis, were randomized to metformin (the same dose regimen) or no drug, and 2 weeks later had core biopsy at surgery. Ki67 immunohistochemistry, transcriptome analysis on formalin-fixed paraffin-embedded cores, and serum insulin determination were performed blinded to treatment. Seven patients (7/32, 21.9%) receiving metformin withdrew because of gastrointestinal upset. The mean percentage of cells staining for Ki67 fell significantly following metformin treatment in both the pilot cohort (P = 0.041, paired t-test) and the metformin arm (P = 0.027, Wilcoxon rank test) but was unchanged in the internal control or metformin control arms. Messenger RNA expression was significantly downregulated by metformin for PDE3B (phosphodiesterase 3B, cGMP-inhibited; a critical regulator of cAMP levels that affect activation of AMP-activated protein kinase, AMPK), confirmed by immunohistochemistry, and for SSR3, TP53 and CCDC14. By Ingenuity pathway analysis, the tumour necrosis factor receptor 1 (TNFR1) signaling pathway was most affected by metformin: TGFB and MEKK were upregulated and cdc42 downregulated; mTOR and AMPK pathways were also affected. 
Gene set analysis further revealed that p53, BRCA1 and cell cycle pathways had reduced expression following metformin. Mean serum insulin remained stable in patients receiving metformin but rose in control patients. This trial presents biomarker evidence for anti-proliferative effects of metformin in women with breast cancer and provides support for therapeutic trials of metformin. |
Class representative autoencoder for low resolution multi-spectral gender classification | Gender is one of the most common attributes used to describe an individual. It is used in multiple domains such as human computer interaction, marketing, security, and demographic reports. Research has been performed to automate the task of gender recognition in constrained environments using face images; however, limited attention has been given to gender classification in unconstrained scenarios. This work attempts to address the challenging problem of gender classification in multi-spectral low resolution face images. We propose a robust Class Representative Autoencoder model, termed AutoGen, for this task. The proposed model aims to minimize the intra-class variations while maximizing the inter-class variations for the learned feature representations. Results on visible as well as near infrared spectrum data for different resolutions and multiple databases depict the efficacy of the proposed model. Comparative results with existing approaches and two commercial off-the-shelf systems further motivate the use of class representative features for classification. |
Are Genetic Algorithms Function Optimizers? | Genetic Algorithms (GAs) have received a great deal of attention regarding their potential as optimization techniques for complex functions. The level of interest and success in this area has led to a number of improvements to GA-based function optimizers and a good deal of progress in characterizing the kinds of functions that are easy/hard for GAs to optimize. With all this activity, there has been a natural tendency to equate GAs with function optimization. However, the motivating context of Holland's initial GA work was the design and implementation of robust adaptive systems. In this paper we argue that a proper understanding of GAs in this broader adaptive systems context is a necessary prerequisite for understanding their potential application to any problem domain. We then use these insights to better understand the strengths and limitations of GAs as function optimizers. |
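As a concrete reference point for the GA-as-function-optimizer usage the paper examines, a minimal sketch (illustrative parameters, "onemax" fitness, binary tournament selection) might look like:

```python
import random

# Minimal genetic algorithm used as a function optimizer (sketch only):
# maximize the number of 1-bits in a binary string ("onemax").
# Population size, generations, and operators are illustrative choices.

random.seed(42)
L, POP, GENS = 20, 30, 60

def fitness(ind):
    return sum(ind)

def select(pop):
    # Binary tournament selection: the fitter of two random individuals.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randrange(1, L)  # one-point crossover
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=1.0 / L):
    return [bit ^ 1 if random.random() < rate else bit for bit in ind]

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

best = max(pop, key=fitness)
print(fitness(best))
```

The paper's point is that this optimization loop is only one specialization of Holland's broader adaptive-systems framing, in which the population tracks a changing environment rather than converging on a fixed optimum.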
Platelet aggregation in traumatic spinal cord injury | Study design: Collagen-induced platelet aggregation and platelet count of ten paraplegic patients (four females, six males, aged 16–42 years) with traumatic spinal cord injury (SCI) (posttraumatic 12–48 weeks) and of ten age-matched healthy volunteers (control group; five females, five males, aged 18–37 years) were investigated. Objectives: Investigation of platelet aggregation in the whole blood of the patients with SCI. Setting: Ankara/Turkey. Methods: Platelet aggregation was evaluated by impedance technique using Chrono Log Model 560 WB aggregometer in whole blood. Platelet count was determined by Medonic Cell Analyser 610. Results: Maximal intensity of collagen-induced platelet aggregation of the patients was 18.50±8.28 ohm (mean±SD) and of the controls was 7.60±4.25 ohm. Maximal rate of collagen-induced aggregation of platelets from the patients was 3.98±1.59 ohm/min, maximal rate of aggregation of platelets from the controls was 1.57±1.01 ohm/min. Platelet counts of the patients and controls were 290 500±50 357/mm3 and 273 000±48 343/mm3 respectively. It was determined that both maximal rate (P<0.001) and maximal intensity (P<0.01) of collagen-induced platelet aggregation of the patients were significantly higher than those of the controls. There was no significant difference between the two groups in respect to platelet counts. Conclusion: Collagen-induced platelet aggregation of patients with traumatic SCI 12–48 weeks after the trauma was significantly higher than that of the controls. Our results indicate that increased tendency of platelet aggregation, which is probably induced by free radicals, may have a great impact on the late thromboembolic complications reported in patients with traumatic SCI. |
From Strategy to Business Models and onto Tactics | Strategy scholars have used the notion of the Business Model to refer to the 'logic of the firm', that is, how it operates and creates value for its stakeholders. On the surface, this notion appears to be similar to that of strategy. We present a conceptual framework to separate and relate the concepts of strategy and business model: a business model, we argue, is a reflection of the firm's realized strategy. We find that in simple competitive situations there is a one-to-one mapping between strategy and business model, which makes it difficult to separate the two notions. We show that the concepts of strategy and business model differ when there are important contingencies on which a well-designed strategy must be based. Our framework also delivers a clear distinction between strategy and tactics, made possible because strategy and business model are different constructs. |
Challenge AI Mind: A Crowd System for Proactive AI Testing | Artificial Intelligence (AI) has burrowed into our lives in various aspects; however, without appropriate testing, deployed AI systems are often criticized for failing in critical and embarrassing cases. Existing testing approaches mainly depend on fixed and pre-defined datasets, providing limited testing coverage. In this paper, we propose the concept of proactive testing to dynamically generate testing data and evaluate the performance of AI systems. We further introduce Challenge.AI, a new crowd system that features the integration of crowdsourcing and machine learning techniques in the process of error generation, error validation, error categorization, and error analysis. We present experiences and insights into a participatory design with AI developers. The evaluation shows that the crowd workflow is more effective with the help of machine learning techniques. AI developers found that our system can help them discover unknown errors made by the AI models, and engage in the process of proactive testing. |
Interactions of N-desmethyl imatinib, an active metabolite of imatinib, with P-glycoprotein in human leukemia cells | We measured intracellular accumulation of N-desmethyl imatinib (CGP 74588), the main pharmacologically active metabolite of imatinib (Gleevec or STI-571), in Bcr–Abl-positive cells. Using a sensitive and robust non-radioactive in vitro assay, we observed that CGP74588 accumulates in significantly higher amounts than imatinib in sensitive K562 cells. In contrast, the intracellular level of CGP74588 was significantly lower than that of imatinib in K562/Dox cells, which represent a multidrug-resistant variant of K562 cells due to P-glycoprotein (P-gp, ABCB1, MDR1) overexpression. An in vitro enzyme-based assay provided evidence that CGP74588 might serve as an excellent substrate for P-gp. Accordingly, we found that CGP74588 at concentrations up to 20 μM neither induced apoptosis nor substantially inhibited cell proliferation in resistant K562/Dox cells. In contrast, CGP74588 was capable of inhibiting cell proliferation and inducing apoptosis in sensitive K562 cells, although its effect was approximately three to four times lower than that of imatinib in the same cell line. Our results indicate that CGP74588 is unlikely to contribute positively to the treatment of chronic myeloid leukemia (CML) where ABCB1 gene overexpression represents a possible mechanism of resistance to imatinib in vivo. |
A multi-scale convolutional neural network for phenotyping high-content cellular images | Motivation
Identifying phenotypes based on high-content cellular images is challenging. Conventional image analysis pipelines for phenotype identification comprise multiple independent steps, with each step requiring method customization and adjustment of multiple parameters.
Results
Here, we present an approach based on a multi-scale convolutional neural network (M-CNN) that classifies, in a single cohesive step, cellular images into phenotypes by using directly and solely the images' pixel intensity values. The only parameters in the approach are the weights of the neural network, which are automatically optimized based on training images. The approach requires no a priori knowledge or manual customization, and is applicable to single- or multi-channel images displaying single or multiple cells. We evaluated the classification performance of the approach on eight diverse benchmark datasets. The approach yielded higher classification accuracy overall compared with state-of-the-art results, including those of other deep CNN architectures. In addition to using the network to simply obtain a yes-or-no prediction for a given phenotype, we use the probability outputs calculated by the network to quantitatively describe the phenotypes. This study shows that these probability values correlate with chemical treatment concentrations. This finding further validates our approach and enables chemical treatment potency estimation via CNNs.
Availability and Implementation
The network specifications and solver definitions are provided in Supplementary Software 1.
Contact
[email protected] or [email protected].
Supplementary information
Supplementary data are available at Bioinformatics online. |
Recursive estimation of emitter location using TDOA measurements from two UAVs | This paper considers the recursive estimation of emitter location using time difference of arrival measurements formed by the correlation of signals received by two unmanned aerial vehicles. The time difference of arrival measurement defines a hyperbola of possible emitter locations. This hyperbola is used as a measurement in a nonlinear filter. The performance of two such filters, an extended Kalman filter (EKF) and an unscented Kalman filter (UKF), is analysed for stationary and moving emitters and compared with the Cramer-Rao lower bound. The UKF performs generally better than the EKF, but both algorithms suffer from track divergence. |
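The hyperbolic measurement geometry underlying the abstract can be illustrated with a short sketch: a TDOA between the two receivers fixes a constant range difference, so the emitter lies on one branch of a hyperbola whose foci are the UAV positions. The positions, propagation speed, and noise-free timing below are illustrative assumptions, not values from the paper:

```python
import math

# TDOA geometry sketch: with receivers (UAVs) as foci, a fixed range
# difference r1 - r2 = 2a defines one branch of a hyperbola of possible
# emitter locations. Positions and propagation speed are assumed values.

C = 3.0e8  # assumed propagation speed, m/s

uav1, uav2 = (0.0, 0.0), (1000.0, 0.0)
emitter = (400.0, 300.0)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

tdoa = (dist(emitter, uav1) - dist(emitter, uav2)) / C  # noise-free measurement
range_diff = tdoa * C  # signed range difference, equals 2a up to sign

# Parametrize the hyperbola branch consistent with the sign of range_diff
# (foci at the UAVs, centre midway between them), then verify that every
# point on it reproduces the same range difference.
a = abs(range_diff) / 2.0
c = dist(uav1, uav2) / 2.0
b = math.sqrt(c * c - a * a)
cx = (uav1[0] + uav2[0]) / 2.0
sign = -1.0 if range_diff < 0 else 1.0  # branch nearer uav1 when r1 < r2

points = [(cx + sign * a * math.cosh(t), b * math.sinh(t))
          for t in (-1.0, -0.3, 0.0, 0.575, 1.2)]
errors = [abs((dist(p, uav1) - dist(p, uav2)) - range_diff) for p in points]
print(max(errors) < 1e-6)
```

In the paper's filters this curve enters as the nonlinear measurement function, which the EKF linearizes and the UKF handles via sigma points.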
Sustained inhibition of receptor tyrosine kinases and macrophage depletion by PLX3397 and rapamycin as a potential new approach for the treatment of MPNSTs. | PURPOSE
Malignant peripheral nerve sheath tumor (MPNST) is a highly aggressive tumor type that is resistant to chemotherapy and for which there are no effective therapies. MPNSTs have been shown to have gene amplification for the receptor tyrosine kinases (RTKs) PDGFR and c-Kit. We tested the c-Kit inhibitor, imatinib, and PLX3397, a selective c-Fms and c-Kit inhibitor, to evaluate their efficacy against MPNST cells in vitro and in vivo.
EXPERIMENTAL DESIGN
We tested the efficacy of imatinib or PLX3397, either alone or in combination with the TORC1 inhibitor rapamycin, in a cell proliferation assay in vitro and by immunoblotting to determine target inhibition. Immunoblotting and immunohistochemical analyses were further carried out using xenograft samples in vivo.
RESULTS
Our in vitro studies show that imatinib and PLX3397 similarly inhibit cell growth and this can be enhanced with rapamycin with comparable target specificity. However, in vivo studies clearly demonstrate that compared with imatinib, PLX3397 results in sustained blockade of c-Kit, c-Fms, and PDGFRβ, resulting in significant suppression of tumor growth. Moreover, staining for Iba-1, a marker for macrophages, indicates that PLX3397 results in significant depletion of macrophages in the growing tumors. The combination of PLX3397 and rapamycin results in even greater macrophage depletion with continued growth suppression, even when the drug treatment is discontinued.
CONCLUSIONS
Taken together, our data strongly suggest that PLX3397 is superior to imatinib in the treatment of MPNSTs, and the combination of PLX3397 with a TORC1 inhibitor could provide a new therapeutic approach for the treatment of this disease. |
The Value of Calcaneal Bone Mass Measurement Using a Dual X-Ray Laser Calscan Device in Risk Screening for Osteoporosis | OBJECTIVE
To evaluate how bone mineral density in the calcaneus measured by a dual energy X-ray laser (DXL) correlates with bone mineral density in the spine and hip in Turkish women over 40 years of age, and to determine whether calcaneal dual energy X-ray laser variables are associated with clinical risk factors to the same extent as axial bone mineral density measurements obtained using dual energy X-ray absorptiometry (DXA).
MATERIALS AND METHODS
A total of 2,884 Turkish women, aged 40-90 years, living in Ankara were randomly selected. Calcaneal bone mineral density was evaluated using a dual energy X-ray laser Calscan device. Subjects exhibiting a calcaneal dual energy X-ray laser T-score ≤ -2.5 received a referral for DXA of the spine and hip. Besides dual energy X-ray laser measurements, all subjects were questioned about their medical history and the most relevant risk factors for osteoporosis.
RESULTS
Using a T-score threshold of -2.5, which is recommended by the World Health Organization (WHO), dual energy X-ray laser calcaneal measurements showed that 13% of the subjects had osteoporosis, while another 56% had osteopenia. The mean calcaneal dual energy X-ray laser T-score of postmenopausal subjects who were smokers with a positive history of fracture, hormone replacement therapy (HRT), covered dressing style, lower educational level, no regular exercise habits, and low tea consumption was significantly lower than that obtained for the other group (p<0.05). A significant correlation was observed between the calcaneal dual energy X-ray laser T-score and age (r= -0.465, p=0.001), body mass index (BMI) (r=0.223, p=0.001), number of live births (r= -0.229, p=0.001), breast feeding time (r= -0.064, p=0.001), and age at menarche (r= -0.050, p=0.008). The correlations between calcaneal DXL and DXA T-scores (r=0.340, p=0.001) and calcaneal DXL and DXA Z-scores (r=0.360, p=0.001) at the spine, and calcaneal DXL and DXA T-scores (r=0.28, p=0.001) and calcaneal DXL and DXA Z-scores (r=0.33, p=0.001) at the femoral neck were statistically significant.
CONCLUSION
Bone mineral density measurements in the calcaneus using a dual energy X-ray laser are valuable for screening Turkish women over 40 years of age for the risk of osteoporosis. |
Manipulating the soil microbiome to increase soil health and plant fertility | A variety of soil factors are known to increase nutrient availability and plant productivity. The most influential might be the organisms comprising the soil microbial community of the rhizosphere, which is the soil surrounding the roots of plants where complex interactions occur between the roots, soil, and microorganisms. Root exudates act as substrates and signaling molecules for microbes creating a complex and interwoven relationship between plants and the microbiome. While individual microorganisms such as endophytes, symbionts, pathogens, and plant growth promoting rhizobacteria are increasingly featured in the literature, the larger community of soil microorganisms, or soil microbiome, may have more far-reaching effects. Each microorganism functions in coordination with the overall soil microbiome to influence plant health and crop productivity. Increasing evidence indicates that plants can shape the soil microbiome through the secretion of root exudates. The molecular communication fluctuates according to the plant development stage, proximity to neighboring species, management techniques, and many other factors. This review seeks to summarize the current knowledge on this topic. |
Oxidation pre-treatment and electrophoretic deposition of SiC nanowires to improve the thermal shock resistance of SiC coating for C/C composites | To improve the thermal shock resistance of SiC coating for carbon/carbon composites (C/Cs) prepared by chemical vapor deposition (CVD), mild oxidation of the C/C substrate and electrophoretic deposition (EPD) of SiC nanowires were used before CVD, aiming to modify the mechanical properties of the SiC coating. In comparison to the pristine coating, the modified SiC coating shows 73.1% and 93.5% improvements in fracture toughness and adhesion to C/Cs, respectively, which endows the coating with fewer and smaller cracks and better thermal shock resistance between 1200 °C and room temperature. It has also been demonstrated that the combined pre-treatments increase the thermal shock performance of the CVD SiC coating more than either pre-treatment alone, which paves a way for greatly increasing the efficiency of SiC coating in protecting C/Cs against oxidation consumption at intermediate temperatures (800–1200 °C). |
Competing in the Information Economy | In the information economy, the successful organization will be a sophisticated information-gathering machine that analyzes and manages information flow with the same skill and focus that the industrial company managed work processes. In the information economy, the successful company will be competitive on price and quality as a given, but will also compress cycle time to a minimum. Time compression will become the principal basis for competitive advantage. The emerging business strategies that achieve competitive advantage through time compression are explored in this paper. |
Embodying gesture-based multimedia to improve learning | Introduction As revealed by the recent Horizon Report (Johnson, Smith, Willis, Levine & Haywood, 2011), the creation of gesture-based interfaces (eg, Microsoft Kinect, Nintendo Wii and Apple iPhone/iPad) creates promising opportunities for educators to offer students easier and more intuitive ways to interact with the content in multimedia learning environments than ever before. For instance, Kinect, a motion-sensing input device developed by Microsoft for Xbox360 (Redmond, WA, USA), enables students to use their body motions, such as swiping, jumping and moving, to interact with the content on the screen. Growing communities (eg, KinectEDucation; http://www. kinecteducation.com/) advocate integrating Kinect applications into classroom teaching to make students’ learning experiences more active and joyful. However, few studies have investigated the impact of gesture-based multimedia learning on students’ cognitive learning outcomes and/or its theoretical underpinnings. This paper briefly discusses the theoretical underpinnings for adopting gesture-based multimedia learning, then describes how we used Kinect to embody the most common type of multimedia learning in classroom (ie, PowerPoint presentations) and finally details a preliminary study exploring the impact of gesture-based multimedia learning on students’ cognitive learning outcomes. |
On the Effectiveness of Defensive Distillation | We report experimental results indicating that defensive distillation successfully mitigates adversarial samples crafted using the fast gradient sign method [2], in addition to those crafted using the Jacobian-based iterative attack [5] on which the defense mechanism was originally evaluated. |
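For reference, the fast gradient sign method the abstract refers to can be sketched on a toy logistic-regression "model", where the input gradient is exact; the weights, input, and epsilon below are illustrative, not the paper's setup:

```python
import numpy as np

# Sketch of the fast gradient sign method (FGSM): perturb the input in the
# direction of the sign of the loss gradient, x_adv = x + eps * sign(grad).
# A toy logistic-regression model stands in for the attacked network.

rng = np.random.default_rng(1)
w = rng.standard_normal(10)  # illustrative model weights
x = rng.standard_normal(10)  # illustrative clean input
y = 1.0                      # assumed true label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(inp):
    p = sigmoid(w @ inp)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))  # cross-entropy

# For logistic regression the input gradient of the loss is (p - y) * w.
grad_x = (sigmoid(w @ x) - y) * w

eps = 0.1
x_adv = x + eps * np.sign(grad_x)  # FGSM perturbation, bounded by eps per coordinate

print(loss(x_adv) >= loss(x))  # the attack never decreases the loss here
```

Defensive distillation, the defense evaluated in the paper, aims to flatten these gradients so that such single-step perturbations become less effective.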
5 Mbps optical wireless communication with error correction coding for underwater sensor nodes | One issue with underwater sensors is how to efficiently transfer large amounts of data collected by the node to an interrogating platform such as an underwater vehicle. It is often impractical to make a physical connection between the node and the vehicle, which suggests an acoustic or optical wireless solution. For large amounts of data, the high bandwidth of underwater optical wireless is an advantage. A small, low-cost platform demonstrating the potential of an optical wireless communications interface for underwater sensor nodes is presented. To enhance the reliability and robustness of the optical wireless communication, digital signal processing and error correction techniques are used. The system was tested in 3 and 7.7 meter tanks at 5 Mbps with the turbidity of the water controlled by the addition of Maalox. |
Effects of the kappa opioid receptor antagonist, norbinaltorphimine, on stress and drug-induced reinstatement of nicotine-conditioned place preference in mice | Several studies implicate stress as a risk factor for the development and maintenance of drug addictive behaviors and drug relapse. Kappa opioid receptor (KOR) antagonists have been shown to attenuate behavioral responses to stress and stress-induced reinstatement of cocaine and ethanol seeking and preference. In the current study, we determined whether the selective KOR antagonist, norbinaltorphimine (nor-BNI), would block stress-induced reinstatement of nicotine preference. Adult Institute of Cancer Research mice were conditioned with 0.5 mg/kg nicotine, injected subcutaneously (s.c.) for 3 days and tested in the nicotine-conditioned place preference (CPP) model. After 3 days extinction, nor-BNI (10 mg/kg, s.c.) was administered 16 h prior to a priming dose of nicotine (0.1 mg/kg, s.c.), and mice were tested in the CPP model for nicotine-induced reinstatement of CPP. A separate group of mice was subjected to a 2-day modified forced swim test (FST) paradigm to induce stress after 3 days extinction from CPP. Mice were given vehicle or nor-BNI (10 mg/kg, s.c.) 16 h prior to each FST session. Nor-BNI pretreatment significantly attenuated stress-induced reinstatement of nicotine-CPP, but had no effect on nicotine-primed reinstatement. Blockade of KORs by selective antagonists attenuates stress-induced reinstatement of nicotine-CPP. Overall, the kappa opioid system may serve as a therapeutic target for suppressing multiple signaling processes which contribute to maintenance of smoking, smoking relapse, and drug abuse in general. |
The effect of threshold values on association rule based classification accuracy | Classification Association Rule Mining (CARM) systems operate by applying an Association Rule Mining (ARM) method to obtain classification rules from a training set of previously-classified data. The rules thus generated will be influenced by the choice of ARM parameters employed by the algorithm (typically support and confidence threshold values). In this paper we examine the effect that this choice has on the predictive accuracy of CARM methods. We show that the accuracy can almost always be improved by a suitable choice of parameters, and describe a hill-climbing method for finding the best parameter settings. We also demonstrate that the proposed hill-climbing method is most effective when coupled with a fast CARM algorithm such as the TFPC algorithm which is also described. |
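The hill-climbing search over threshold settings can be sketched as follows; the accuracy function here is a synthetic stand-in, since in a real CARM system each step would retrain and evaluate the rule-based classifier (e.g. TFPC) at the candidate (support, confidence) pair:

```python
# Sketch of hill-climbing over ARM threshold parameters to maximize
# predictive accuracy. The accuracy surface is a hypothetical smooth
# stand-in peaking at support=2%, confidence=60%; a real implementation
# would evaluate the CARM classifier on held-out data at each setting.

def accuracy(support, confidence):
    return 1.0 - ((support - 2.0) ** 2) / 100.0 - ((confidence - 60.0) ** 2) / 10000.0

def hill_climb(support, confidence, step_s=0.5, step_c=5.0, iters=100):
    best = accuracy(support, confidence)
    for _ in range(iters):
        moved = False
        # Try moving each threshold up or down; accept any improvement.
        for ds, dc in ((step_s, 0), (-step_s, 0), (0, step_c), (0, -step_c)):
            s, c = support + ds, confidence + dc
            a = accuracy(s, c)
            if a > best:
                support, confidence, best, moved = s, c, a, True
        if not moved:
            break  # local optimum reached
    return support, confidence, best

s, c, acc = hill_climb(10.0, 90.0)
print(round(s, 1), round(c, 1))  # converges toward (2.0, 60.0)
```

The sketch illustrates the paper's central observation: accuracy varies systematically with the threshold choice, so searching the (support, confidence) space pays off, provided the underlying rule generator is fast enough to be re-run many times.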
Investigating the determinants and age and gender differences in the acceptance of mobile learning | With the proliferation of mobile computing technology, mobile learning (m-learning) will play a vital role in the rapidly growing electronic learning market. M-learning is the delivery of learning to students anytime and anywhere through the use of wireless Internet and mobile devices. However, acceptance of m-learning by individuals is critical to the successful implementation of m-learning systems. Thus, there is a need to research the factors that affect user intention to use m-learning. Based on the unified theory of acceptance and use of technology (UTAUT), which integrates elements across eight models of information technology use, this study was to investigate the determinants of m-learning acceptance and to discover if there exist either age or gender differences in the acceptance of m-learning, or both. Data collected from 330 respondents in Taiwan were tested against the research model using the structural equation modelling approach. The results indicate that performance expectancy, effort expectancy, social influence, perceived playfulness, and self-management of learning were all significant determinants of behavioural intention to use m-learning. We also found that age differences moderate the effects of effort expectancy and social influence on m-learning use intention, and that gender differences moderate the effects of social influence and self-management of learning on m-learning use intention. These findings provide several important implications for m-learning acceptance, in terms of both research and practice. British Journal of Educational Technology Vol 40 No 1 2009 92–118 doi:10.1111/j.1467-8535.2007.00809.x © 2007 The Authors. Journal compilation © 2007 Becta. Published by Blackwell Publishing, 9600 Garsington Road, Oxford OX4 2DQ, UK and 350 Main Street, Malden, MA 02148, USA. 
Introduction The use of information and communication technology (ICT) may improve learning, especially when coupled with more learner-centred instruction (Zhu & Kaplan, 2002). From notebook computers to wireless phones and handheld devices, the massive infusion of computing devices and rapidly improving Internet capabilities have altered the nature of higher education (Green, 2000). Mobile learning (m-learning) is the follow up of e-learning, which for its part originates from distance education. M-learning refers to the delivery of learning to students anytime and anywhere through the use of wireless Internet and mobile devices, including mobile phones, personal digital assistants (PDAs), smart phones and digital audio players. Namely, m-learning users can interact with educational resources while away from their normal place of learning— the classroom or desktop computer. The place independence of mobile devices provides several benefits for e-learning environments, such as allowing students and instructors to utilise their spare time while traveling in trains or buses to finish their homework or lesson preparation (Virvou & Alepis, 2005). If e-learning took learning away from the classroom, then m-learning is taking learning away from a fixed location (Cmuk, 2007). Motiwalla (2007) contends that learning on mobile devices will never replace classroom or other e-learning approaches. Thus, m-learning is a complementary activity to both e-learning and traditional learning. However, Motiwalla (2007) also suggests that if leveraged properly, mobile technology can complement and add value to the existing learning models, such as the social constructive theory of learning with technology (Brown & Campione, 1996) and conversation theory (Pask, 1975). Thus, some believe that m-learning is becoming progressively more significant, and that it will play a vital role in the rapidly growing e-learning market. 
Despite the tremendous growth and potential of the mobile devices and networks, wireless e-learning and m-learning are still in their infancy or embryonic stage (Motiwalla, 2007). While the opportunities provided by m-learning are new, there are several challenges facing m-learning, such as connectivity, small screen sizes, limited processing power and reduced input capabilities. Siau, Lim and Shen (2001) also note that mobile devices have ‘(1) small screens and small multifunction key pads; (2) less computational power, limited memory and disk capacity; (3) shorter battery life; (4) complicated text input mechanisms; (5) higher risk of data storage and transaction errors; (6) lower display resolution; (7) less surfability; (8) unfriendly user-interfaces; and (9) graphical limitations’ (p. 6). Equipped with a small phone-style keyboard or a touch screen, users might require more time to search for some information on a page than they need to read it (Motiwalla, 2007). These challenges mean that adapting existing e-learning services to m-learning is not easy work, and that users may be inclined not to accept m-learning. Thus, the success of m-learning may depend on whether or not users are willing to adopt the new technology that is different from what they have used in the past. While e-learning and mobile commerce/learning have received extensive attention (Concannon, Flynn & Campbell, 2005; Davies & Graff, 2005; Govindasamy, 2002; Harun, 2002; Ismail, 2002; Luarn & Lin, 2005; Mwanza & Engeström, 2005; Motiwalla, 2007; Pituch & Lee, 2006; Selim, 2007; Shee & Wang, in press; Ravenscroft & Matheson, 2002; Wang, 2003), thus far, little research has been conducted to investigate the factors affecting users’ intentions to adopt m-learning, and to explore the age and gender differences in terms of the acceptance of m-learning. 
As Pedersen and Ling (2003) suggest, even though traditional Internet services and mobile services are expected to converge into mobile Internet services, few attempts have been made to apply traditional information technology (IT) adoption models to explain their potential adoption. Consequently, the objective of this study was to investigate the determinants, as well as the age and gender differences, in the acceptance of m-learning based on the unified theory of acceptance and use of technology (UTAUT) proposed by Venkatesh, Morris, Davis and Davis (2003). The remainder of this paper is organised as follows. In the next section, we review the UTAUT and show our reasoning for adopting it as the theoretical framework of this study. This is followed by descriptions of the research model and methods. We then present the results of the data analysis and hypotheses testing. Finally, the implications and limitations of this study are discussed. Unified Theory of Acceptance and Use of Technology M-learning acceptance is the central theme of this study, and represents a fundamental managerial challenge in terms of m-learning implementation. A review of prior studies provided a theoretical foundation for hypotheses formulation. Based on eight prominent models in the field of IT acceptance research, Venkatesh et al (2003) proposed a unified model, called the unified theory of acceptance and use of technology (UTAUT), which integrates elements across the eight models. 
The eight models consist of the theory of reasoned action (TRA) (Fishbein & Ajzen, 1975), the technology acceptance model (TAM) (Davis, 1989), the motivational model (MM) (Davis, Bagozzi & Warshaw, 1992), the theory of planned behaviour (TPB) (Ajzen, 1991), the combined TAM and TPB (C-TAM-TPB) (Taylor & Todd, 1995a), the model of PC utilisation (MPCU) (Triandis, 1977; Thompson, Higgins & Howell, 1991), the innovation diffusion theory (IDT) (Rogers, 2003; Moore & Benbasat, 1991) and the social cognitive theory (SCT) (Bandura, 1986; Compeau & Higgins, 1995). Based on Venkatesh et al’s (2003) study, we briefly review the core constructs in each of the eight models, which have been theorised as the determinants of IT usage intention and/or behaviour. First, TRA has been considered to be one of the most fundamental and influential theories on human behaviour. Attitudes toward behaviour and subjective norms are the two core constructs in TRA. Second, TAM was originally developed to predict IT acceptance and usage on the job, and has been extensively applied to various types of technologies and users. Perceived usefulness and perceived ease of use are the two main constructs mentioned in TAM. More recently, Venkatesh and Davis (2000) presented TAM2 by adding subjective norms to the TAM in the case of mandatory settings. Third, Davis et al (1992) employed motivation theory to understand new technology acceptance and usage, focusing on the primary constructs of extrinsic motivation and intrinsic motivation. Fourth, TPB extended TRA by including the construct of perceived behavioural control, and has been successfully applied to the understanding of individual acceptance and usage of various technologies (Harrison, Mykytyn & Riemenschneider, 1997; Mathieson, 1991; Taylor & Todd, 1995b). 
Fifth, C-TAM-TPB is a hybrid model that combines the predictors of TPB with perceived usefulness from TAM. Sixth, based on Triandis’ (1977) theory of human behaviour, Thompson et al (1991) presented the MPCU and used this model to predict PC utilisation. MPCU consists of six constructs, including job fit, complexity, long-term consequences, affect towards use, social factors and facilitating conditions. Seventh, Moore and Benbasat (1991) adapted the properties of innovations posited by IDT and refined a set of constructs that could be used to explore individual technology acceptance. These constructs include relative advantage, ease of use, image, visibility, compatibility, results demonstrability and voluntariness of use. Finally, Compeau and Higgins (1995) applied and extended SCT to the context of computer utilisation (see also Compeau, Higgins & |
3D printing for the design and fabrication of polymer-based gradient scaffolds. | To accurately mimic the native tissue environment, tissue engineered scaffolds often need to have a highly controlled and varied display of three-dimensional (3D) architecture and geometrical cues. Additive manufacturing in tissue engineering has made possible the development of complex scaffolds that mimic the native tissue architectures. As such, architectural details that were previously unattainable or irreproducible can now be incorporated in an ordered and organized approach, further advancing the structural and chemical cues delivered to cells interacting with the scaffold. This control over the environment has given engineers the ability to unlock cellular machinery that is highly dependent upon the intricate heterogeneous environment of native tissue. Recent research into the incorporation of physical and chemical gradients within scaffolds indicates that integrating these features improves the function of a tissue engineered construct. This review covers recent advances on techniques to incorporate gradients into polymer scaffolds through additive manufacturing and evaluate the success of these techniques. As covered here, to best replicate different tissue types, one must be cognizant of the vastly different types of manufacturing techniques available to create these gradient scaffolds. We review the various types of additive manufacturing techniques that can be leveraged to fabricate scaffolds with heterogeneous properties and discuss methods to successfully characterize them.
STATEMENT OF SIGNIFICANCE
Additive manufacturing techniques have given tissue engineers the ability to precisely recapitulate the native architecture present within tissue. In addition, these techniques can be leveraged to create scaffolds with both physical and chemical gradients. This work offers insight into several techniques that can be used to generate graded scaffolds, depending on the desired gradient. Furthermore, it outlines methods to determine if the designed gradient was achieved. This review will help to condense the abundance of information that has been published on the creation and characterization of gradient scaffolds and to provide a single review discussing both methods for manufacturing gradient scaffolds and evaluating the establishment of a gradient. |
Second-line antiretroviral therapy in resource-limited settings: the experience of Médecins Sans Frontières. | OBJECTIVES
To describe the use of second-line protease-inhibitor regimens in Médecins Sans Frontières HIV programmes, and determine switch rates, clinical outcomes, and factors associated with survival.
DESIGN/METHODS
We used patient data from 62 Médecins Sans Frontières programmes and included all adults (>15 years) who were antiretroviral therapy-naive at the start of antiretroviral therapy and who switched to a protease inhibitor-containing regimen, with at least one nucleoside reverse transcriptase inhibitor change, after more than 6 months of nonnucleoside reverse transcriptase inhibitor first-line use. Cumulative switch rates and survival curves were estimated using Kaplan-Meier methods, and mortality predictors were investigated using Poisson regression.
RESULTS
Of 48,338 adults followed on antiretroviral therapy, 370 switched to a second-line regimen after a median of 20 months (switch rate 4.8/1000 person-years). Median CD4 cell count at switch was 99 cells/microl (interquartile range 39-200; n = 244). A lopinavir/ritonavir-based regimen was given to 51% of patients and a nelfinavir-based regimen to 43%; 29% changed one nucleoside reverse transcriptase inhibitor and 71% changed two. Median follow-up on second-line antiretroviral therapy was 8 months, and the probability of remaining in care at 12 months was 0.86. Median CD4 gains were 90 at 6 months and 135 at 12 months. Death rates were higher in patients in World Health Organization stage 4 at antiretroviral therapy initiation and in those with a CD4 nadir count of less than 50 cells/microl.
CONCLUSION
The rate of switch to second-line treatment in antiretroviral therapy-naive adults on non-nucleoside reverse transcriptase inhibitor-based first-line antiretroviral therapy was relatively low, with good early outcomes observed in protease inhibitor-based second-line regimens. Severe immunosuppression was associated with increased mortality on second-line treatment. |
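The survival curves in the methods above come from the Kaplan-Meier estimator. As a minimal sketch of that estimator, the following uses made-up follow-up times rather than data from the MSF cohort:

```python
# Minimal Kaplan-Meier survival estimator. The event times and
# censoring flags below are hypothetical, not MSF cohort data.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  : follow-up time for each patient
    events : 1 if the event (e.g. death) occurred, 0 if censored
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        # group all subjects sharing this time point; censored subjects
        # at time t still count as at risk for deaths at t
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Hypothetical follow-up months; 1 = died, 0 = censored.
curve = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0])
```

Each step multiplies the running survival probability by the fraction of at-risk subjects surviving that event time; censored subjects leave the risk set without contributing an event.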
Multifamily psychoeducation for improvement of mental health among relatives of patients with major depressive disorder lasting more than one year: study protocol for a randomized controlled trial | BACKGROUND
Major depressive disorder (MDD) is a long-lasting disorder with frequent relapses that have significant effects on the patient's family. Family psychoeducation is recognized as part of the optimal treatment for patients with psychotic disorder. A previous randomized controlled trial has found that family psychoeducation is effective in enhancing the treatment of MDD. Although MDD can easily become a chronic illness, there has been no intervention study on the families of patients with chronic depression. In the present study, we design a randomized controlled trial to examine the effectiveness of family psychoeducation in improving the mental health of relatives of patients with MDD lasting more than one year.
METHODS/DESIGN
Participants are patients with MDD lasting more than one year and their relatives. An individually randomized, parallel-group trial design will be employed. Participants will be allocated to one of two treatment conditions: relatives will receive (a) family psychoeducation (four two-hour biweekly multifamily psychoeducation sessions) plus treatment-as-usual for the patient (consultation by physicians), or (b) counseling for the family (one counseling session from a nurse) plus treatment-as-usual for the patient. The primary outcome measure will be relatives' mental health as measured by the K6 scale, which was developed to screen for DSM-IV depressive and anxiety disorders. Additionally, the severity of depressive symptoms in patients, measured by the Beck Depression Inventory-II (BDI-II), will be assessed. Data from the intention-to-treat sample will be analyzed 16 weeks after randomization.
DISCUSSION
This is the first study to evaluate the effectiveness of family psychoeducation for relatives of patients with MDD lasting more than one year. If this type of intervention is effective, it could be a new method of rehabilitation for patients with MDD lasting more than one year.
TRIAL REGISTRATION
ClinicalTrials.gov NCT01734291 (registration date: 18 October 2012). |
Schenkerian Analysis by Computer: A Proof of Concept | A system for automatically deriving a Schenkerian reduction of an extract of tonal music is described. Schenkerian theory is formalised in a quasi-grammatical manner, expressing a reduction as a binary-tree structure. Computer software which operates in the manner of a chart parser using this grammar has been implemented, capable of deriving a matrix of reduction possibilities, in polynomial time, from a representation of the score. A full reduction of the extract can be discovered by selecting a tree from this matrix. The number of possible valid reductions for even short extracts is found to be extremely large, so criteria are required to distinguish good reductions from bad ones. To find such criteria, themes from five Mozart piano sonatas are analysed and samples of 'good' reductions (defined by reference to pre-existing analyses of these themes) are compared with randomly sampled reductions. Nine criteria are thereby derived, which can be applied in the process of parsing and selecting a reduction. The results are promising, but the process is still too computationally expensive—only extracts of a few bars in length can be reduced—and more extensive testing is required before the system can be properly claimed to perform automatic Schenkerian analysis. |
Automatic generation of artistic chinese calligraphy | Chinese calligraphy is among the finest and most important of all Chinese art forms and an inseparable part of Chinese history. Its delicate aesthetic effects are generally considered to be unique among all calligraphic arts. Its subtle power is integral to traditional Chinese painting. A novel intelligent system uses a constraint-based analogous-reasoning process to automatically generate original Chinese calligraphy that meets visually aesthetic requirements. We propose an intelligent system that can automatically create novel, aesthetically appealing Chinese calligraphy from a few training examples of existing calligraphic styles. To demonstrate the proposed methodology's feasibility, we have implemented a prototype system that automatically generates new Chinese calligraphic art from a small training set. |
Quality of life after self-management cancer rehabilitation: a randomized controlled trial comparing physical and cognitive-behavioral training versus physical training. | OBJECTIVE
To conduct a randomized controlled trial and compare the effects on cancer survivors' quality of life in a 12-week group-based multidisciplinary self-management rehabilitation program, combining physical training (twice weekly) and cognitive-behavioral training (once weekly) with those of a 12-week group-based physical training (twice weekly). In addition, both interventions were compared with no intervention.
METHODS
Participants (all cancer types, medical treatment completed > or = 3 months ago) were randomly assigned to multidisciplinary rehabilitation (n = 76) or physical training (n = 71). The nonintervention comparison group consisted of 62 patients on a waiting list. Quality of life was measured using the RAND-36. The rehabilitation groups were measured at baseline, after rehabilitation, and 3-month follow-up, and the nonintervention group was measured at baseline and 12 weeks later.
RESULTS
The effects of multidisciplinary rehabilitation did not outperform those of physical training in role limitations due to emotional problems (primary outcome) or any other domain of quality of life (all p > .05). Compared with no intervention, participants in both rehabilitation groups showed significant and clinically relevant improvements in role limitations due to physical problems (primary outcome; effect size (ES) = 0.66), and in physical functioning (ES = 0.48), vitality (ES = 0.54), and health change (ES = 0.76) (all p < .01).
CONCLUSIONS
Adding a cognitive-behavioral training to group-based self-management physical training did not have additional beneficial effects on cancer survivors' quality of life. Compared with the nonintervention group, the group-based self-management rehabilitation improved cancer survivors' quality of life. |
Detection of diabetic foot hyperthermia by infrared imaging | In the diabetic foot, the occurrence of an ulcer is often associated with hyperthermia. Hyperthermia is defined as a temperature more than 2.2 °C higher in a given region of one foot compared with the temperature of the same region of the contralateral foot. Unfortunately, hyperthermia is not yet assessed in current diabetic foot therapy. In this paper, we propose an easy way to detect possible hyperthermia by using an infrared camera. A specific acquisition protocol for the thermal images is proposed. A dedicated image analysis is developed: it consists of a contour detection of the two feet using the Chan and Vese active contour method combined with the ICP rigid registration technique. Among 85 persons with type II diabetes recruited at the Dos de Mayo hospital in Lima, Peru, 9 individuals showed significant hyperthermia. It is expected that this new possibility of detecting hyperthermia in hospitals or diabetic health centers, now available thanks to the proposed method, will help reduce foot ulcer occurrence in diabetic persons. |
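The hyperthermia criterion itself is simple enough to sketch directly: a region is flagged when its temperature differs by more than 2.2 °C from the same region of the contralateral foot. The region names and temperatures below are hypothetical; the actual method derives per-region temperatures from the registered infrared images.

```python
# Sketch of the 2.2 °C contralateral-difference criterion.
THRESHOLD_C = 2.2

def detect_hyperthermia(left, right, threshold=THRESHOLD_C):
    """left/right: dict mapping foot region -> mean temperature (°C).
    Returns the regions whose left/right difference exceeds threshold."""
    flagged = []
    for region in left.keys() & right.keys():   # regions present in both feet
        if abs(left[region] - right[region]) > threshold:
            flagged.append(region)
    return sorted(flagged)

# Hypothetical per-region mean temperatures.
left  = {"hallux": 30.1, "heel": 28.4, "midfoot": 29.0}
right = {"hallux": 32.9, "heel": 28.9, "midfoot": 29.3}
flagged = detect_hyperthermia(left, right)   # hallux differs by 2.8 °C
```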
Antecedents and Dimensions of Online Service Expectations | Researchers find that customer satisfaction with both offline and online services can be modeled effectively based on expectations of future service performance, perceptions of actual performance, and a comparison of customers' initial expectations to subsequent perceptions of performance, known as disconfirmation. Research has examined antecedents to offline service expectations, and is now beginning to examine online service expectations. We add to this research by predicting that two antecedents, information-seeking need and prior service satisfaction, will have positive influences on initial expectations of a relational multichannel service provider, where the service provider and customer have a long-term relationship and services are offered through offline and online channels. We test these antecedents by surveying existing patients of a large health care provider during the introduction of a new online channel, regarding their information-seeking need, prior service satisfaction, and initial expectations. Three months later, we gathered data regarding patients' satisfaction with the online service. Conceptualizing the dimensions of expectations as usefulness, ease-of-use, and enjoyment, we show that information-seeking need and prior service satisfaction are significant antecedents to expectations, and that expectations have a positive influence on satisfaction. |
Active Appearance Models Revisited | Active Appearance Models (AAMs) and the closely related concepts of Morphable Models and Active Blobs are generative models of a certain visual phenomenon. Although linear in both shape and appearance, overall, AAMs are nonlinear parametric models in terms of the pixel intensities. Fitting an AAM to an image consists of minimising the error between the input image and the closest model instance; i.e. solving a nonlinear optimisation problem. We propose an efficient fitting algorithm for AAMs based on the inverse compositional image alignment algorithm. We show that the effects of appearance variation during fitting can be precomputed (“projected out”) using this algorithm and how it can be extended to include a global shape normalising warp, typically a 2D similarity transformation. We evaluate our algorithm to determine which of its novel aspects improve AAM fitting performance. |
Rock physics modeling to monitor movement of CO2 in Sleipner gas field, North Sea: An ideal CCS field | Sleipner gas field in the North Sea is the world's first industrial-scale CO2 injection project designed specifically to reduce the emission of greenhouse gas. Here CO2 separated from natural gas produced at Sleipner gas field is injected into the Utsira sand, which is a major saline aquifer in the North Sea basin. In time-lapse three-dimensional (4D) seismic data, the CO2 plume is imaged as a number of bright sub-horizontal reflections within the reservoir. Correlation of log data with the seismic data indicates that CO2 accumulates within a series of interbedded sandstones and mudstones beneath a thick cap rock of mudstone. Nine reflective horizons have been mapped within the reservoir on the six seismic surveys from 1999 to 2008. Comparison with the baseline seismic survey of 1994 (pre-injection) provides a clear impression of the migration of the CO2 plume. In this paper, we attempt to model the CO2 distribution quantitatively within the reservoir by applying a pressure-dependent differential effective medium (PDEM) theory to the 4D seismic data. Pre- and post-injection acoustic impedances are calculated by inverting the post-stack seismic data of 1994 and 2001, respectively, using a model-based inversion technique. A 3D CO2 saturation volume is estimated using PDEM theory from the inverted acoustic impedance of the year 2001, taking as reference the results from the pre-injection data of the year 1994. Since the gas distribution type is seldom known, we estimate the saturation distribution using both a homogeneous and a patchy distribution pattern in our rock physics model. We estimate saturation for a homogeneous distribution of CO2 to be 0-20% and for CO2 as patches of gas to be 0-80% of the total porosity within the ~200 m thick reservoir unit. A 5-7% uncertainty in the predicted CO2 saturation is estimated using a Monte-Carlo simulation technique.
Our results indicate that a large amount of CO2 is accumulated as patches of gas within sand layers capped by mud layers, though some amount of gas may have dissolved uniformly with water. |
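The Monte-Carlo uncertainty estimate mentioned above can be illustrated generically: perturb the measured impedance within its noise level, push each sample through the impedance-to-saturation relation, and read the spread of the outputs. The linear calibration and noise level below are hypothetical stand-ins for the PDEM model, which is not reproduced here.

```python
# Generic Monte-Carlo propagation of impedance noise to saturation.
# The impedance->saturation relation is a made-up linear calibration.
import random

def saturation_from_impedance(z):
    # hypothetical calibration: lower impedance -> higher CO2 saturation
    return max(0.0, min(1.0, (6.0e6 - z) / 2.0e6))

def mc_uncertainty(z_obs, z_sigma, n=10_000, seed=0):
    """Sample impedance with Gaussian noise and return the mean and
    standard deviation of the resulting saturation estimates."""
    rng = random.Random(seed)
    samples = [saturation_from_impedance(rng.gauss(z_obs, z_sigma))
               for _ in range(n)]
    mean = sum(samples) / n
    std = (sum((s - mean) ** 2 for s in samples) / n) ** 0.5
    return mean, std

mean, std = mc_uncertainty(z_obs=5.2e6, z_sigma=0.1e6)
```

Because the toy relation is linear over the sampled range, the output spread is simply the input spread scaled by the calibration slope; a nonlinear model like PDEM is exactly where the Monte-Carlo approach earns its keep.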
D-PAGE: Diverse Paraphrase Generation | In this paper, we investigate the diversity aspect of paraphrase generation. Prior deep learning models employ either decoding methods or add random input noise for varying outputs. We propose a simple method Diverse Paraphrase Generation (D-PAGE), which extends neural machine translation (NMT) models to support the generation of diverse paraphrases with implicit rewriting patterns. Our experimental results on two real-world benchmark datasets demonstrate that our model generates at least one order of magnitude more diverse outputs than the baselines in terms of a new evaluation metric Jeffrey’s Divergence. We have also conducted extensive experiments to understand various properties of our model with a focus on diversity. |
Benefits of mild cleansing: synthetic surfactant based (syndet) bars for patients with atopic dermatitis. | Atopic dermatitis (AD) is a recurring inflammatory skin disease, characterized by marked pruritus, which usually develops in early childhood. AD is associated with a wide array of symptoms, including itching, dryness, erythema, crusted lesions, and superficial inflammation. Topical steroid cream or ointment with proper washing is a primary treatment approach for AD. Nonsoap-based personal washing or syndet bars containing synthetic detergents or surfactants are milder than soaps; thus, they are widely used by patients with a variety of skin conditions, including AD. The primary goals of this study were to determine the compatibility of syndet bar use with the therapy of AD and the potential benefits of syndet bars compared with subjects' usual cleansing products, mostly soap bars. In this evaluation, 50 subjects (14 subjects were aged < or =15 years) with mild AD on a stable treatment regimen were recruited and asked to use 1 of 2 syndet bars as part of their normal shower routine for 28 days. The severity of eczematous lesions, skin condition (dryness, erythema, texture), and hydration were evaluated at baseline and after 28 days of syndet application by investigators and subjects. Syndet bar use reduced the severity of eczematous lesions, improved skin condition, and maintained hydration. Overall, the results of this study indicate that syndet formulations are compatible with the therapy of AD. |
An assessment of farmer-pastoralist conflict in Nigeria using GIS | Pastoralism in Nigeria faces challenges that hamper productivity and consequently affect the nation's economy. Available grazing lands are diminishing at an alarming rate, and livestock pathways are blocked by land use change, urbanisation and frontiers. The old grazing routes that existed for centuries are almost gone. Only 2.82% of the grazing reserves have been acquired, and these are poorly managed. The increase in population, the drying of waterholes, and shifts in rainfall patterns leading to drought as a result of the changing climate affect both pastoralists and farmers. Hence, they compete over land, leading to conflict; embedded within this are growing forms of capitalist land tenure and delays in the justice system that exacerbate the situation. This study examines the argument that land use conflict is the major cause of farmer-pastoralist conflict in Nigeria. The Nigerian Forestry Management Evaluation and Coordinating Unit (FORMECU) land use and land cover (LULC) dataset and published articles on previous farmer-pastoralist conflicts in the country are used. Results show that between 1976 and 1995, all land uses gained, attesting to the increase in population and competition over dwindling resources. However, overlap maps show that intensive crop farming has expanded into grazing lands in many areas over these years. These areas of encroachment agree with most of the conflict points recorded. For a lasting solution, we propose revisiting symbiotic engagements between farmers and pastoralists. The full engagement of communities, non-governmental organisations (NGOs), alternative dispute resolution (ADR) mechanisms, and government as overseer is suggested. |
Hepatitis in Albanian children: molecular analysis of hepatitis A virus isolates. | Hepatitis A is a common disease in developing countries, and Albania has a high prevalence of this disease associated with young age. In spite of the occurrence of a unique serotype, there are different genotypes, classified from I to VII. Genotype characterisation of HAV isolates circulating in Albania has been undertaken, as well as a study of the occurrence of antigenic variants in the proteins VP3 and VP1. To evaluate the genetic variability of the Albanian hepatitis A virus (HAV) isolates, samples were collected from 12 different cities, and the VP1/2A junction was amplified and sequenced. These sequences were aligned and a phylogenetic analysis performed. Additionally, the sequence of the amino-terminal half of protein VP3 and the complete sequence of VP1 were determined. Anti-HAV IgM was present in 66.2% of all the sera. Fifty HAV isolates were amplified, and the analysis revealed that all the isolates were sub-genotype IA with only limited mutations. When the deduced amino acid sequences were obtained, the alignment showed only two amino acid substitutions, at positions 22 and 34 of the 2A protein. A higher genomic stability of the VP1/2A region, in contrast with what occurs in other parts of the world, could be observed, indicating high endemicity of HAV in Albania. In addition, two potential antigenic variants were detected: the first at position 46 of VP3 in seven isolates and the second at position 23 of VP1 in six isolates. |
Severe thoracolumbar osteoporotic burst fractures: treatment combining open kyphoplasty and short-segment fixation. | INTRODUCTION
The majority of osteoporotic, spinal cord compressive, vertebral fractures occurs at the thoracolumbar junction level. When responsible for neurological impairment, these rare lesions require a decompression procedure. We present the results of a new option to treat these lesions: an open balloon kyphoplasty associated with a short-segment posterior internal fixation.
MATERIALS AND METHODS
Twelve patients, aged a mean of 72.3 years, were included in this prospective series; all of them presented osteoporotic burst fractures located between T11 and L2 associated with neurological impairment. The surgical procedure first consisted of a laminectomy, for decompression, followed by an open balloon kyphoplasty. A short-segment posterior internal fixation was subsequently put into place when the local kyphosis was considered severe. A CAT scan study evaluated local vertebral body height restoration using two pre- and postoperative radiological indices.
RESULTS
All of the patients in the series were followed up for a mean of 14 months. Local kyphosis improved by a mean of 10.8 (p<0.001). Vertebral body height was also substantially restored, with a mean gain of 26% according to the anterior height/adjacent height ratio and 28% according to the Beck Index (p<0.001). Two cases of cement leakage were recorded, with no adverse clinical side effects. Complete neurological recovery was observed in 10 patients; two retained a minimal neurological deficit but kept their walking capacity.
DISCUSSION
The results presented in this study confirm the data reported in the literature in terms of local kyphosis correction and vertebral body height restoration. The combination of this technique with laminectomy plus osteosynthesis allowed us to effectively treat burst fractures of the thoracolumbar junction and led to stable results 1 year after surgery. This can be advantageous in a population often carrying multiple co-morbidities. With a single operation, we can achieve neurological decompression and spinal column stability in a minimally invasive way; this avoids more substantial surgery in these fragile patients.
LEVEL OF EVIDENCE
Level IV. Therapeutic prospective study. |
Interaction between basal ganglia and limbic circuits in learning and memory processes. | Hippocampus and striatum play distinctive roles in memory processes since declarative and non-declarative memory systems may act independently. However, hippocampus and striatum can also be engaged to function in parallel as part of a dynamic system to integrate previous experience and adjust behavioral responses. In these structures the formation, storage, and retrieval of memory require a synaptic mechanism that is able to integrate multiple signals and to translate them into persistent molecular traces at both the corticostriatal and hippocampal/limbic synapses. The best cellular candidate for this complex synthesis is represented by long-term potentiation (LTP). A common feature of LTP expressed in these two memory systems is the critical requirement of convergence and coincidence of glutamatergic and dopaminergic inputs to the dendritic spines of the neurons expressing this form of synaptic plasticity. In experimental models of Parkinson's disease abnormal accumulation of α-synuclein affects these two memory systems by altering two major synaptic mechanisms underlying cognitive functions in cholinergic striatal neurons, likely implicated in basal ganglia dependent operative memory, and in the CA1 hippocampal region, playing a central function in episodic/declarative memory processes. |
A Neuro-fuzzy System for Fraud Detection in Electricity Distribution | The volume of energy loss that Brazilian electrical utilities have to deal with has been ever increasing. Electricity distribution companies have suffered significant and increasing losses in recent years due to theft, measurement errors and other irregularities. There is therefore great concern to identify the profile of irregular customers in order to reduce the volume of such losses. This paper presents a combined approach of a neural network committee and a neuro-fuzzy hierarchical system intended to increase the level of accuracy in the identification of irregularities among low-voltage consumers. The data used to test the proposed system are from Light S.A., the distribution company of Rio de Janeiro. The results obtained show a significant increase in the identification of irregular customers when compared to the current methodology employed by the company. Keywords: neural nets, hierarchical neuro-fuzzy systems, binary space partition, electricity distribution, fraud detection. |
A Resumption of Adolescent Development: Discussion of "School Refusal and the Parent-Child Relationship" | This discussion of Christogiorgos and Giannakopoulos's fine case, "School Refusal and the Parent-Child Relationship: A Psychodynamic Perspective," focuses on the young adolescent's overall development during his emotional crisis. I pay particular attention to the patient Peter's growth as a young man, his relationship to other males, including his father, and his anxious engagement with his peers. My purpose is to contextualize the clinical work within the larger fabric of Peter's life. |
Constraint-based motion optimization using a statistical dynamic model | In this paper, we present a technique for generating animation from a variety of user-defined constraints. We pose constraint-based motion synthesis as a maximum a posteriori (MAP) problem and develop an optimization framework that generates natural motion satisfying user constraints. The system automatically learns a statistical dynamic model from motion capture data and then enforces it as a motion prior. This motion prior, together with user-defined constraints, comprises a trajectory optimization problem. Solving this problem in the low-dimensional space yields optimal natural motion that achieves the goals specified by the user. We demonstrate the effectiveness of this approach by generating whole-body and facial motion from a variety of spatial-temporal constraints. |
COMPETITIVE PRIORITIES: INVESTIGATING THE NEED FOR TRADE-OFFS IN OPERATIONS STRATEGY* | A heated debate continues over the need for trade-offs in operations strategy. Some researchers call for plants to focus on a single manufacturing capability and devote their limited resources accordingly, while others claim that advanced manufacturing technology (AMT) enables concurrent improvements in quality, cost, flexibility, and delivery. Yet there is little empirical evidence for or against the trade-off model. In response, this study addresses the question: “To what extent do manufacturing plants view competitive priorities as trade-offs?” We employ survey data collected from managers and operators in 110 plants that have recently implemented AMT. Our findings suggest that trade-offs remain. However, perceived differences in competitive priorities are subtle and may vary across levels of the plant hierarchy. (COMPETITIVE PRIORITIES; OPERATIONS STRATEGY; SURVEY RESEARCH/DESIGN) |
Bayesian Regularization and Pruning Using a Laplace Prior | Standard techniques for improved generalization from neural networks include weight decay and pruning. Weight decay has a Bayesian interpretation, with the decay function corresponding to a prior over weights. The method of transformation groups and maximum entropy suggests a Laplace rather than a Gaussian prior. After training, the weights then arrange themselves into two classes: (1) those with a common sensitivity to the data error and (2) those failing to achieve this sensitivity, which therefore vanish. Since the critical value is determined adaptively during training, pruning, in the sense of setting weights to exact zeros, becomes an automatic consequence of regularization alone. The count of free parameters is also reduced automatically as weights are pruned. A comparison is made with results of MacKay using the evidence framework and a Gaussian regularizer. |
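The mechanism behind the "exact zeros" claim is that a Laplace prior corresponds to an L1 penalty on the weights, whose proximal (soft-thresholding) update drives any sub-threshold weight to exactly zero. The following is a standalone illustration of that mechanism, not the paper's training procedure; the weights and threshold are made up.

```python
# Soft thresholding: the proximal operator of lam * |w|.
# Weights below the threshold become exactly zero -- automatic pruning.

def soft_threshold(w, lam):
    """Shrink w toward zero by lam, zeroing it if |w| <= lam."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0

weights = [0.8, -0.05, 0.3, -0.6, 0.02]
pruned = [soft_threshold(w, 0.1) for w in weights]
# the two small weights are set to exactly 0.0; the rest shrink by 0.1
```

A Gaussian prior, by contrast, corresponds to an L2 penalty whose update only multiplies weights by a factor less than one, so weights decay but never reach exact zero.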
Strong Statements of Analysis | Examples are discussed of natural statements about irrational numbers that are equivalent, provably in ZFC, to strong set-theoretical hypotheses, and of apparently classical statements provable in ZFC of which the only known proofs use strong set-theoretical concepts. |
Lamivudine therapy for chemotherapy-induced reactivation of hepatitis B virus infection | A 54-yr-old man with lymphoma and serological evidence of prior hepatitis B virus (HBV) infection, with detectable anti-HBc and anti-HBs, was treated with intensive chemotherapy. He had reactivation of HBV infection with acute hepatitis B manifested by detectable HBsAg and elevated aminotransferase levels >1000 IU/L. He was treated with lamivudine 150 mg daily and had prompt resolution of acute hepatitis B with return of elevated aminotransferases to normal, and initial loss of HBeAg with later loss of HBsAg. Lamivudine was continued during the course of further chemotherapy as prophylaxis against repeat HBV reactivation. Lamivudine is a nucleoside analogue that is a potent inhibitor of HBV reverse transcriptase and HBV replication. Lamivudine therapy should be considered for the treatment of HBV reactivation and might play a future role as preemptive therapy of HBV reactivation in patients with prior hepatitis B or chronic hepatitis B with inactive viral replication. |
Investigating ICT using problem-based learning in face-to-face and online learning environments | This article reports on the design, implementation and evaluation of a module in the MEd (Business) in the Faculty of Education at the University of Hong Kong in which an explicit problem-based learning (PBL) approach was used to investigate the challenges associated with the adoption and use of information and communication technologies (ICT) in Hong Kong secondary school classrooms. PBL influenced both the way the curriculum was developed and the process by which students (n = 18) investigated topics related to the integration of ICT in business studies classrooms. The evaluation was based on five evaluative questions dealing with the implementation of PBL, the extent to which PBL facilitated academic discourse, the extent of new knowledge about ICT that had been created, the role of the tutor, and the online learning environment provided. The evaluation revealed that PBL provided a practical approach to investigating ICT in both face-to-face and online learning environments, leading to new knowledge about challenges associated with the adoption and use of new technologies in various educational settings. |
Travel time estimation and order batching in a 2-block warehouse | The order batching problem (OBP) is the problem of determining the number of orders to be picked together in one picking tour. Although various objectives may arise in practice, minimizing the average throughput time of a random order is a common concern. In this paper, we consider the OBP for a 2-block rectangular warehouse with the assumptions that orders arrive according to a Poisson process and the method used for routing the order-pickers is the well-known S-shape heuristic. We first elaborate on the first and second moment of the order-picker's travel time. Then we use these moments to estimate the average throughput time of a random order. This enables us to estimate the optimal picking batch size. Results from simulation show that the method provides a high accuracy level. Furthermore, the method is rather simple and can be easily applied in practice. |
More Effective Synchronization Scheme in ML Using Stale Parameters | In machine learning (ML), a model's parameters are adjusted by iteratively processing a training dataset until convergence. Data-parallel ML systems synchronize these model parameters across workers, but the synchronization can be slow to complete, a problem that generally worsens at scale, even though many ML algorithms tolerate a degree of error in the parameters they read. This paper presents a Bounded Asynchronous Parallel (BAP) model of computation that allows computation with stale model parameters in order to reduce synchronization overhead, while still providing theoretical convergence guarantees for large-scale data-parallel ML applications. The BAP model permits distributed workers to use stale parameters stored in a local cache instead of waiting for the Parameter Server (PS) to produce a new version, which greatly reduces the time workers spend waiting. It preserves the convergence of the ML algorithm by bounding the maximum staleness of the parameters. Experiments on 4 cluster nodes with up to 32 GPUs show that the model significantly improves the proportion of computing time relative to waiting time and yields a 1.2–2× speedup. We also discuss how to choose the staleness threshold given the trade-off between efficiency and speed. |
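The bounded-staleness idea in this abstract can be sketched as follows (the `ParameterCache` class and its methods are illustrative, not the paper's API): a worker keeps computing with a cached copy of the parameters as long as its version lags the parameter server's by at most a staleness bound; once the gap exceeds the bound, it must refresh, which is what bounds the error and preserves convergence.

```python
class ParameterCache:
    """Worker-side cache that tolerates at most `staleness` missed versions."""

    def __init__(self, staleness):
        self.staleness = staleness
        self.server_version = 0
        self.server_params = {"w": 0.0}
        self.local_version = 0
        self.local_params = dict(self.server_params)

    def push(self, params):
        """Parameter server publishes a new version."""
        self.server_version += 1
        self.server_params = dict(params)

    def get(self):
        """Return cached parameters, refreshing only when the bound is exceeded."""
        if self.server_version - self.local_version > self.staleness:
            self.local_version = self.server_version
            self.local_params = dict(self.server_params)
        return self.local_params

cache = ParameterCache(staleness=2)
cache.push({"w": 1.0})           # gap = 1 <= 2: stale read allowed
print(cache.get()["w"])          # prints 0.0 (cached copy, no waiting)
cache.push({"w": 2.0})
cache.push({"w": 3.0})           # gap = 3 > 2: forced refresh
print(cache.get()["w"])          # prints 3.0
```

In a real system `get` would block on the PS only in the refresh branch, so workers spend most of their time computing rather than waiting.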
Person Re-identification by Attributes | Visually identifying a target individual reliably in a crowded environment observed by a distributed camera network is critical to a variety of tasks in managing business information, border control, and crime prevention. Automatic re-identification of a human candidate from public space CCTV video is challenging due to spatiotemporal visual feature variations and strong visual similarity between different people, compounded by low-resolution and poor quality video data. In this work, we propose a novel method for re-identification that learns a selection and weighting of mid-level semantic attributes to describe people. Specifically, the model learns an attribute-centric, parts-based feature representation. This differs from and complements existing low-level features for re-identification that rely purely on bottom-up statistics for feature selection, which are limited in reliably discriminating and identifying the visual appearance of target people seen in different camera views under some degree of occlusion due to crowdedness. Our experiments demonstrate the effectiveness of our approach compared to existing feature representations when applied to benchmarking datasets. |