title | abstract
---|---
Phase I trial of dihydrolenperone in lung cancer patients: A novel compound with in vitro activity against lung cancer | Antitumor activity of the butyrophenone dihydrolenperone in non-small cell lung cancer was initially suggested by in vitro screening against tumor cells derived from fresh surgical samples using the human tumor colony-forming assay. We have completed a directed phase I trial in patients with lung cancer. Thirty-two patients with lung cancer have completed 25 courses of therapy at doses of 10 to 60 mg/square meter orally on a twice daily schedule. Twenty-three men and 9 women with a median age of 55 (range 24–69) were entered. Twenty-four were performance status 0 or 1 and 8 were 2. The maximum tolerated dose was 50 mg/square meter orally twice daily and the dose-limiting toxicity was somnolence. Of the 32 patients, 18 developed symptomatic hypotension (grade 1 or 2). There was no significant hematologic, renal, or hepatic toxicity. In vitro drug testing using the MTT [3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl-tetrazolium bromide (thiazolyl blue)] assay confirmed 50% inhibition of non-small cell and small cell lung cancer cell line growth at 70–450 micromolar concentrations. Plasma dihydrolenperone levels were at least 75-fold less than levels at which in vitro activity was observed. We conclude: 1) the maximum tolerated dose in our study is 50 mg/square meter orally twice daily, 2) the dose-limiting side effect of dihydrolenperone is somnolence, and 3) the concentrations of dihydrolenperone observed in plasma are significantly lower than those associated with in vitro activity. |
Neural dynamics of planned arm movements: emergent invariants and speed-accuracy properties during trajectory formation. | A real-time neural network model, called the vector-integration-to-endpoint (VITE) model, is developed and used to simulate quantitatively behavioral and neural data about planned and passive arm movements. Invariants of arm movements emerge through network interactions rather than through an explicitly precomputed trajectory. Motor planning occurs in the form of a target position command (TPC), which specifies where the arm intends to move, and an independently controlled GO command, which specifies the movement's overall speed. Automatic processes convert this information into an arm trajectory with invariant properties. These automatic processes include computation of a present position command (PPC) and a difference vector (DV). The DV is the difference between the PPC and the TPC at any time. The PPC is gradually updated by integrating the DV through time. The GO signal multiplies the DV before it is integrated by the PPC. The PPC generates an outflow movement command to its target muscle groups. Opponent interactions regulate the PPCs to agonist and antagonist muscle groups. This system generates synchronous movements across synergetic muscles by automatically compensating for the different total contractions that each muscle group must undergo.
Quantitative simulations are provided of Woodworth's law, of the speed-accuracy trade-off known as Fitts's law, of isotonic arm-movement properties before and after deafferentation, of synchronous and compensatory "central-error-correction" properties of isometric contractions, of velocity amplification during target switching, of velocity profile invariance and asymmetry, of the changes in velocity profile asymmetry at higher movement speeds, of the automatic compensation for staggered onset times of synergetic muscles, of vector cell properties in precentral motor cortex, of the inverse relation between movement duration and peak velocity, and of peak acceleration as a function of movement amplitude and duration. It is shown that TPC, PPC, and DV computations are needed to actively modulate, or gate, the learning of associative maps between TPCs of different modalities, such as between the eye-head system and the hand-arm system. By using such an associative map, looking at an object can activate a TPC of the hand-arm system, as Piaget noted. Then a VITE circuit can translate this TPC into an invariant movement trajectory. An auxiliary circuit, called the Passive Update of Position (PUP) model is described for using inflow signals to update the PPC during passive arm movements owing to external forces. Other uses of outflow and inflow signals are also noted, such as for adaptive linearization of a nonlinear muscle plant, and sequential readout of TPCs during a serial plan, as in reaching and grasping. Comparisons are made with other models of motor control, such as the mass-spring and minimum-jerk models. |
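The VITE dynamics described above reduce to two coupled update rules, which can be sketched in a few lines. This is a minimal illustration, not the authors' simulation code: the discretization step, the gain, and the zero starting position are assumed for the example; only the TPC/PPC/DV/GO names come from the abstract.

```python
def vite_trajectory(tpc, go, steps=1000, dt=0.001, gain=25.0):
    """Integrate the present position command (PPC) toward the target
    position command (TPC). DV = TPC - PPC, and the GO signal gates
    how fast the DV is integrated into the PPC."""
    ppc = 0.0
    traj = []
    for _ in range(steps):
        dv = tpc - ppc               # difference vector
        ppc += dt * gain * go * dv   # GO multiplies DV before integration
        traj.append(ppc)
    return traj

# A larger GO command yields a faster approach to the same endpoint.
slow = vite_trajectory(tpc=1.0, go=0.5)
fast = vite_trajectory(tpc=1.0, go=2.0)
```

Both runs converge on the same target position (an emergent invariant of the network), while GO controls only the overall speed, mirroring the division of labor the model proposes.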
Information handling by the brain: proposal of a new “paradigm” involving the roamer type of volume transmission and the tunneling nanotube type of wiring transmission | The current view on the organization of the central nervous system (CNS) is basically anchored to the paradigm describing the brain as formed by networks of neurons interconnected by synapses. Synaptic contacts are a fundamental characteristic for describing CNS operations, but increasing evidence accumulated in the last 30 years has pointed to a refinement of this view. A possible overcoming of the classical “neuroscience paradigm” is outlined here, based on the following hypotheses: (1) the basic morpho-functional unit in the brain is a compartment of tissue (functional module) where different resident cells (not only neurons) work as an integrated unit; (2) in these complex networks, a spectrum of intercellular communication processes is exploited, which can be classified according to a dichotomous criterion: wiring transmission (occurring through physically delimited channels) and volume transmission (exploiting diffusion in the extracellular space); (3) the connections between cells can themselves be described as a network, leading to information processing that occurs at different levels, from the cell network down to the molecular level; (4) recent evidence of the existence of specialized structures (microvesicles and tunneling nanotubes) for intercellular exchange of materials could allow a further type of polymorphism of the CNS networks based on at least transient changes in cell phenotype. When compared to the classical paradigm, the proposed scheme of cellular organization could allow a strong increase of the degrees of freedom available to the whole system and hence of its plasticity. Furthermore, long range coordination and correlation can be more easily accommodated within this framework. |
Assembly algorithms for next-generation sequencing data. | The emergence of next-generation sequencing platforms led to resurgence of research in whole-genome shotgun assembly algorithms and software. DNA sequencing data from the Roche 454, Illumina/Solexa, and ABI SOLiD platforms typically present shorter read lengths, higher coverage, and different error profiles compared with Sanger sequencing data. Since 2005, several assembly software packages have been created or revised specifically for de novo assembly of next-generation sequencing data. This review summarizes and compares the published descriptions of packages named SSAKE, SHARCGS, VCAKE, Newbler, Celera Assembler, Euler, Velvet, ABySS, AllPaths, and SOAPdenovo. More generally, it compares the two standard methods known as the de Bruijn graph approach and the overlap/layout/consensus approach to assembly. |
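The de Bruijn graph approach mentioned above can be illustrated in a few lines: each read is decomposed into k-mers, and every k-mer becomes an edge from its (k-1)-mer prefix to its (k-1)-mer suffix. This is a toy sketch that ignores error correction, coverage weighting, and reverse complements, all of which real assemblers handle.

```python
def de_bruijn_graph(reads, k):
    """Adjacency map of a de Bruijn graph: nodes are (k-1)-mers,
    and every k-mer in a read contributes one prefix -> suffix edge."""
    edges = {}
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            edges.setdefault(kmer[:-1], []).append(kmer[1:])
    return edges

# Two overlapping toy reads; repeated edges record coverage.
graph = de_bruijn_graph(["ATGGC", "TGGCG"], k=3)
```

An assembler then looks for an Eulerian-style walk through such a graph, whereas the overlap/layout/consensus approach instead computes pairwise overlaps between whole reads.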
Withdrawal of inhaled glucocorticoids and exacerbations of COPD. | BACKGROUND
Treatment with inhaled glucocorticoids in combination with long-acting bronchodilators is recommended in patients with frequent exacerbations of severe chronic obstructive pulmonary disease (COPD). However, the benefit of inhaled glucocorticoids in addition to two long-acting bronchodilators has not been fully explored.
METHODS
In this 12-month, double-blind, parallel-group study, 2485 patients with a history of exacerbation of COPD received triple therapy consisting of tiotropium (at a dose of 18 μg once daily), salmeterol (50 μg twice daily), and the inhaled glucocorticoid fluticasone propionate (500 μg twice daily) during a 6-week run-in period. Patients were then randomly assigned to continued triple therapy or withdrawal of fluticasone in three steps over a 12-week period. The primary end point was the time to the first moderate or severe COPD exacerbation. Spirometric findings, health status, and dyspnea were also monitored.
RESULTS
As compared with continued glucocorticoid use, glucocorticoid withdrawal met the prespecified noninferiority criterion of 1.20 for the upper limit of the 95% confidence interval (CI) with respect to the first moderate or severe COPD exacerbation (hazard ratio, 1.06; 95% CI, 0.94 to 1.19). At week 18, when glucocorticoid withdrawal was complete, the adjusted mean reduction from baseline in the trough forced expiratory volume in 1 second was 38 ml greater in the glucocorticoid-withdrawal group than in the glucocorticoid-continuation group (P<0.001); a similar between-group difference (43 ml) was seen at week 52 (P=0.001). No change in dyspnea and minor changes in health status occurred in the glucocorticoid-withdrawal group.
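The noninferiority conclusion above rests on a simple arithmetic comparison: withdrawal is declared noninferior when the upper limit of the 95% confidence interval for the hazard ratio stays below the prespecified margin. The values below are exactly those reported in the abstract.

```python
# Reported in the abstract: HR 1.06; 95% CI, 0.94 to 1.19;
# prespecified noninferiority margin for the upper CI limit: 1.20.
hazard_ratio = 1.06
ci_upper = 1.19
margin = 1.20

noninferior = ci_upper < margin  # 1.19 < 1.20, so the criterion is met
```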
CONCLUSIONS
In patients with severe COPD receiving tiotropium plus salmeterol, the risk of moderate or severe exacerbations was similar among those who discontinued inhaled glucocorticoids and those who continued glucocorticoid therapy. However, there was a greater decrease in lung function during the final step of glucocorticoid withdrawal. (Funded by Boehringer Ingelheim Pharma; WISDOM ClinicalTrials.gov number, NCT00975195.). |
Complications in spinal fusion for adolescent idiopathic scoliosis in the new millennium. A report of the Scoliosis Research Society Morbidity and Mortality Committee. | STUDY DESIGN
The Morbidity and Mortality database of the Scoliosis Research Society (SRS) was queried as to the incidence and type of complications as reported by its members for the treatment of adolescent idiopathic scoliosis (AIS) with spinal fusion and instrumentation procedures regarding surgical approach (anterior, posterior, or combined anterior-posterior) during a recent 3-year period.
OBJECTIVE
To evaluate the incidence of surgeon-reported complications in a large series of spinal fusions with instrumentation for a single spinal deformity diagnosis and age group regarding surgical approach.
SUMMARY OF BACKGROUND DATA
The SRS has been collecting morbidity and mortality data from its members since its formation in 1965 with the intent of using these data to assess the complications and adverse outcomes (death and/or spinal cord injury) of surgical treatment for spinal deformity. Surgical approaches to the management of treatment of AIS have a measurable impact on efficacy of correction, levels fused, and operative morbidity. However, there is a lack of consensus on the choice of surgical approach for the treatment of spinal deformity.
METHODS
Of the 58,197 surgical cases submitted by members of the SRS in the years 2001, 2002, and 2003, 10.9% were identified as having had anterior, posterior, or combined spinal fusion with instrumentation for the diagnosis of AIS, and comprised the study cohort. All reported complications were tabulated and totaled for each of the 3 types of procedures, and statistical analysis was conducted.
RESULTS
Complications were reported in 5.7% of the 6334 patients in this series. Of the 1164 patients who underwent anterior fusion and instrumentation, 5.2% had complications, of the 4369 who underwent posterior instrumentation and fusion, 5.1% had complications, and of the 801 who underwent combined instrumentation and fusion, 10.2% had complications. There were 2 patients (0.03%) who died of their complications. There was no statistical difference in overall complication rates between anterior and posterior procedures. However, the difference in complication rates between anterior or posterior procedures compared to combined procedures was highly significant (P < 0.0001). The differences in neurologic complication rates between combined and anterior procedures, as well as combined and posterior procedures were also highly statistically significant (P < 0.0001), but not between anterior and posterior procedures.
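The significance statements above can be checked approximately with a Pearson chi-square test on a 2x2 table. The counts below are reconstructed from the reported percentages (82 ≈ 10.2% of 801 combined cases, 61 ≈ 5.2% of 1164 anterior cases), so this is an illustrative recomputation, not the committee's exact analysis.

```python
def chi2_2x2(events_a, total_a, events_b, total_b):
    """Pearson chi-square statistic for a 2x2 table (no continuity correction)."""
    a, b = events_a, total_a - events_a
    c, d = events_b, total_b - events_b
    n = total_a + total_b
    return n * (a * d - b * c) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))

# Combined vs anterior complication counts, reconstructed from the percentages.
stat = chi2_2x2(82, 801, 61, 1164)
```

The statistic lands well above 10.83, the 1-df critical value for p < 0.001, consistent with the "highly significant" difference reported for combined versus single-approach procedures.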
CONCLUSIONS
This study shows that complication rates are similar for anterior versus posterior approaches to AIS deformity correction. Combined anterior and posterior instrumentation and fusion has double the complication rate of either anterior or posterior instrumentation and fusion alone. Combined anterior and posterior instrumentation and fusion also has a significantly higher rate of neurologic complications than anterior or posterior instrumentation and fusion alone. |
Expansion of Regulatory T Cells in Patients with Langerhans Cell Histiocytosis | BACKGROUND
Langerhans cell histiocytosis (LCH) is a rare clonal granulomatous disease that affects mainly children. LCH can involve various tissues such as bone, skin, lung, bone marrow, lymph nodes, and the central nervous system, and is frequently responsible for functional sequelae. The pathophysiology of LCH is unclear, but the uncontrolled proliferation of Langerhans cells (LCs) is believed to be the primary event in the formation of granulomas. The present study was designed to further investigate the nature of proliferating cells and the immune mechanisms involved in the LCH granulomas.
METHODS AND FINDINGS
Biopsies (n = 24) and/or blood samples (n = 25) from 40 patients aged 0.25 to 13 y (mean 7.8 y), were studied to identify cells that proliferate in blood and granulomas. We found that the proliferating index of LCs was low (approximately 1.9%), and we did not observe expansion of a monocyte or dendritic cell compartment in patients. We found that LCH lesions were a site of active inflammation, tissue remodeling, and neo-angiogenesis, and the majority of proliferating cells were endothelial cells, fibroblasts, and polyclonal T lymphocytes. Within granulomas, interleukin 10 was abundant, LCs expressed the TNF receptor family member RANK, and CD4(+) CD25(high) FoxP3(high) regulatory T cells (T-regs) represented 20% of T cells and were found in close contact with LCs. FoxP3(+) T-regs were also expanded, compared to controls, in the blood of LCH patients with active disease, among whom seven out of seven tested exhibited an impaired skin delayed-type hypersensitivity response. In contrast, the number of blood T-regs was normal after remission of LCH.
CONCLUSIONS
These findings indicate that LC accumulation in LCH results from survival rather than uncontrolled proliferation, and is associated with the expansion of T-regs. These data suggest that LCs may be involved in the expansion of T-regs in vivo, resulting in the failure of the host immune system to eliminate LCH cells. Thus T-regs could be a therapeutic target in LCH. |
Optical repetition MIMO transmission with multipulse PPM | We study the use of multiple laser transmitters combined with multiple photodetectors for atmospheric, line-of-sight optical communication, and focus upon the use of multipulse pulse-position modulation (PPM) as a power-efficient transmission format, with signal repetition across the laser array. Ideal (photon counting) photodetectors are assumed, with and without background radiation. The resulting multiple-input/multiple-output channel has the potential for combating fading effects on turbulent optical channels, for which both log-normal and Rayleigh-fading models are treated. Our focus is upon symbol error probability for uncoded transmission, and on capacity for coded transmission. Full spatial diversity is obtained naturally in this application. |
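The photon-counting channel described above lends itself to a quick Monte Carlo sketch. For brevity this uses single-pulse PPM with repetition across the detector array (the paper treats the multipulse format), ideal photon counting, and invented photon-count parameters; it is an illustration of the decision rule, not the paper's setup.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's inverse-transform Poisson sampler (adequate for small means)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def ppm_repetition_ser(q_slots=4, detectors=2, ks=10.0, kb=0.2,
                       trials=2000, seed=1):
    """Estimate symbol error rate: ks mean signal photons land in the pulsed
    slot of every detector, kb background photons in every slot; counts are
    summed across the array and the largest-count slot is chosen."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        # Symbol 0 is transmitted; slot 0 carries the pulse.
        counts = [sum(poisson((ks if s == 0 else 0.0) + kb, rng)
                      for _ in range(detectors)) for s in range(q_slots)]
        if max(range(q_slots), key=counts.__getitem__) != 0:
            errors += 1
    return errors / trials

ser = ppm_repetition_ser()
```

With a strong signal relative to background, the summed counts make errors vanishingly rare; adding fading draws per laser/detector pair (log-normal or Rayleigh, as in the paper) is a straightforward extension of the count means.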
Clifford Space as the Arena for Physics | A new theory is considered according to which extended objects in n-dimensional space are described in terms of multivector coordinates which are interpreted as generalizing the concept of centre of mass coordinates. While the usual centre of mass is a point, by generalizing the latter concept, we associate with every extended object a set of r-loops, r = 0, 1, ..., n − 1, enclosing oriented (r + 1)-dimensional surfaces represented by Clifford numbers called (r + 1)-vectors or multivectors. Superpositions of multivectors are called polyvectors or Clifford aggregates and they are elements of Clifford algebra. The set of all possible polyvectors forms a manifold, called C-space. We assume that the arena in which physics takes place is in fact not Minkowski space, but C-space. This has many far-reaching physical implications, some of which are discussed in this paper. The most notable is the finding that although we start from the constrained relativity in C-space we arrive at the unconstrained Stueckelberg relativistic dynamics in Minkowski space, which is a subspace of C-space. |
Smart Choice for the Smart Grid: Narrowband Internet of Things (NB-IoT) | Low power wide area network (LPWAN) technologies, which are now embracing a booming era with the development of the Internet of Things (IoT), may offer a brand new solution for current smart grid communications due to their excellent features of low power, long range, and high capacity. Mission-critical smart grid communications require secure and reliable connections between the utilities and the devices with high quality of service (QoS). This is difficult to achieve for unlicensed LPWAN technologies due to the crowded license-free band. Narrowband IoT (NB-IoT), as a licensed LPWAN technology, is developed based on the existing long-term evolution specifications and facilities. Thus, it is able to provide cellular-level QoS, and hence can be viewed as a promising candidate for smart grid communications. In this paper, we introduce NB-IoT to the smart grid and compare it with the existing representative communication technologies in the context of smart grid communications in terms of data rate, latency, range, etc. The overall requirements of communications in the smart grid from both quantitative and qualitative perspectives are comprehensively investigated and each of them is carefully examined for NB-IoT. We further explore the representative applications in the smart grid and analyze the corresponding feasibility of NB-IoT. Moreover, the performance of NB-IoT in typical scenarios of the smart grid communication environments, such as urban and rural areas, is carefully evaluated via Monte Carlo simulations. |
A comparative study on security features of Indian, Canadian and Dubai cheques | A cheque is a document that orders the bank to pay a specific amount of money from the holder’s account to the person whose details are drafted on the cheque. The cheque is one of the bank documents most likely to be counterfeited; therefore, to prevent counterfeiting, cheques are embedded with security features. A cheque is essentially considered a bill of exchange, created to make money transactions without having cash in hand.1 Section 13 of the Negotiable Instruments Act, 1881 (Act No. 26 of 1881) of India states that a negotiable instrument is a promissory note, bill of exchange or a cheque payable either to payee or to a bearer, and Section 6 of this Act exclusively defines a cheque as a bill of exchange drawn on a specified banker, and not expressed to be payable otherwise than on demand.2 A cheque is an instrument in writing containing an unconditional order, addressed to a banker, signed by the person who has deposited money with the banker, requiring him to pay on demand a certain sum of money only to or to the order of a certain person or to the bearer of the instrument.3 |
A mixed-method systematic review: support for ethical competence of nurses. | AIM
The aim was to appraise and synthesize evidence of empirical studies of how nurses' ethical competence can be supported.
BACKGROUND
Ethical competence is an essential element of nursing practice. Nurses increasingly need support in competence when carrying out their responsibilities towards their patients.
DESIGN
A mixed-method systematic review of quantitative and qualitative studies was undertaken according to the University of York's Centre for Reviews and Dissemination guidelines.
DATA SOURCES
Searches of the MEDLINE, Nursing Database and British Nursing Index databases were conducted, yielding 512 citations published between 1985 and 2012.
METHODS
After a two-stage application of inclusion and exclusion criteria, 34 articles were included. The quality of the studies was assessed using STROBE or COREQ criteria. Data were analysed by content analysis.
RESULTS
Nurses' ethical competence has been studied from different viewpoints: ethical decision-making, ethical sensitivity, ethical knowledge and ethical reflection. There was little empirical evidence of provided support, but studies offered recommendations on how to support ethical competence. The most common strategies to support ethical competence were ethics education, ethics rounds, ethics committee and consultation. Nurse leaders and colleagues have a key role in providing opportunities for nurses to gain ethical competence.
CONCLUSIONS
There is a need to develop evidence-based support at the organizational and individual level to support nurses' ethical competence. Barriers for multiprofessional cooperation in ethical issues should be recognized and addressed as part of the development of organizational ethical practices. Research should pay more attention to the conceptual, theoretical and practical perspectives of ethical competence. |
A weighted least-norm solution based scheme for avoiding joint limits for redundant joint manipulators | It is proposed to use the weighted least-norm solution to avoid joint limits for redundant joint manipulators. A comparison is made with the gradient projection method for avoiding joint limits. While the gradient projection method provides the optimal direction for the joint velocity vector within the null space, its magnitude is not unique and is adjusted by a scalar coefficient chosen by trial and error. It is shown in this paper that one fixed value of the scalar coefficient is not suitable even in a small workspace. The proposed manipulation scheme automatically chooses an appropriate magnitude of the self-motion throughout the workspace. This scheme, unlike the gradient projection method, guarantees joint limit avoidance, and also minimizes unnecessary self-motion. It was implemented and tested for real-time control of a seven-degree-of-freedom (7-DOF) Robotics Research Corporation (RRC) manipulator. |
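For a single task degree of freedom, the weighted least-norm solution has a simple closed form, q̇ = W⁻¹Jᵀ(JW⁻¹Jᵀ)⁻¹ẋ with W diagonal, which a short sketch can illustrate. The Jacobian row, task velocity, and weights below are made-up numbers for a planar two-joint example; this is not the RRC controller, and in the joint-limit scheme the weights would come from a limit-avoidance criterion (e.g. 1 + |∂H/∂q_i|).

```python
def wln_joint_velocities(jacobian_row, xdot, weights):
    """Weighted least-norm solution for one task DOF and n joints:
    qdot = W^-1 J^T (J W^-1 J^T)^-1 xdot, with W = diag(weights).
    A large weight penalizes (and so reduces) motion of that joint."""
    jwjt = sum(j * j / w for j, w in zip(jacobian_row, weights))
    scale = xdot / jwjt
    return [j / w * scale for j, w in zip(jacobian_row, weights)]

# Equal weights reduce to the ordinary least-norm (pseudoinverse) solution.
qdot_eq = wln_joint_velocities([1.0, 1.0], xdot=1.0, weights=[1.0, 1.0])

# Penalizing joint 2 (say, because it is approaching a limit) shifts the
# motion to joint 1 while the task velocity J * qdot is still met exactly.
qdot_w = wln_joint_velocities([1.0, 1.0], xdot=1.0, weights=[1.0, 4.0])
```

Raising a joint's weight as it nears a limit is what lets the scheme avoid the limit without the hand-tuned scalar coefficient of the gradient projection method.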
Switching control strategy to extend the ZVS operating range of a Dual Active Bridge AC/DC converter | A switching control strategy to extend the zero-voltage-switching (ZVS) operating range of a Dual Active Bridge (DAB) AC/DC converter to the entire input-voltage interval and the full power range is proposed. The converter topology consists of a DAB DC/DC converter, receiving a rectified AC line voltage via a synchronous rectifier. The DAB comprises a primary side half bridge and secondary side full bridge, linked by a high-frequency isolation transformer and inductor. Using conventional control strategies, the soft-switching boundary conditions are exceeded at the higher voltage conversion ratios of the AC input interval. A novel pulse-width-modulation strategy to fully eliminate these boundaries and its analysis are presented in this paper, allowing increased performance (in terms of efficiency and stresses). Additionally, by using a half bridge / full bridge configuration, the number of active components is reduced. A prototype converter was constructed and experimental results are given to validate the theoretical analyses and practical feasibility of the proposed strategy. |
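As background to the modulation analysis above, the lossless power-flow relation for a DAB under conventional single phase-shift control, P = n·V1·V2·φ·(1 − |φ|/π)/(2π·f_sw·L), is easy to sketch. Note this is the textbook relation for plain phase-shift operation, not the paper's extended-ZVS PWM strategy, and every component value below is invented for illustration.

```python
import math

def dab_power(v1, v2, n, phi, f_sw, leakage_l):
    """Lossless DAB power flow under single phase-shift control.
    phi is the bridge-to-bridge phase shift in radians; n is the
    transformer turns ratio; leakage_l is the series inductance."""
    return (n * v1 * v2 * phi * (1.0 - abs(phi) / math.pi)
            / (2.0 * math.pi * f_sw * leakage_l))

# Illustrative operating points: transferred power peaks at phi = pi/2.
p_half = dab_power(100.0, 100.0, 1.0, math.pi / 2, 100e3, 10e-6)
p_quarter = dab_power(100.0, 100.0, 1.0, math.pi / 4, 100e3, 10e-6)
```

The soft-switching difficulty the paper addresses arises because, as the voltage conversion ratio swings over the rectified AC interval, the phase shift demanded by this relation can leave the ZVS region, which is what the proposed PWM strategy eliminates.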
Fractional Order Fuzzy Control of Hybrid Power System with Renewable Generation Using Chaotic PSO | This paper investigates the operation of a hybrid power system through a novel fuzzy control scheme. The hybrid power system employs various autonomous generation systems such as wind turbine, solar photovoltaic, diesel engine, fuel cell, and aqua electrolyzer. Energy storage devices such as the battery, flywheel, and ultra-capacitor are also present in the network. A novel fractional order (FO) fuzzy control scheme is employed and its parameters are tuned with a particle swarm optimization (PSO) algorithm augmented with two chaotic maps to achieve improved performance. This FO fuzzy controller shows better performance over the classical PID and the integer order fuzzy PID controller in both linear and nonlinear operating regimes. The FO fuzzy controller also shows stronger robustness against system parameter variation and rate constraint nonlinearity than the other controller structures. Robustness is a highly desirable property in such a scenario since many components of the hybrid power system may be switched on/off or may run at lower/higher power output at different time instants. |
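A minimal sketch of the chaotic-augmentation idea: a logistic map replaces the pseudo-random generator that draws the PSO acceleration coefficients. The paper tunes FO fuzzy controller gains with two chaotic maps; here one map and a generic sphere objective stand in, and all PSO constants (inertia 0.7, coefficients 1.5, swarm size, iteration count) are illustrative assumptions.

```python
def chaotic_pso(objective, dim, n_particles=15, iters=200, x0=0.7):
    """PSO whose stochastic coefficients come from a logistic chaotic map
    (x <- 4x(1-x)) instead of a pseudo-random number generator."""
    state = [x0]

    def chaos():
        x = 4.0 * state[0] * (1.0 - state[0])
        if not 0.0 < x < 1.0:   # reseed if the orbit degenerates numerically
            x = x0
        state[0] = x
        return x

    pos = [[chaos() * 4.0 - 2.0 for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g_i = min(range(n_particles), key=pbest_val.__getitem__)
    gpos, gval = pbest[g_i][:], pbest_val[g_i]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * chaos() * (pbest[i][d] - pos[i][d])
                             + 1.5 * chaos() * (gpos[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gval:
                    gpos, gval = pos[i][:], val
    return gpos, gval

best, best_val = chaotic_pso(lambda p: sum(x * x for x in p), dim=2)
```

In the paper's setting, `objective` would be a time-domain cost (e.g. integral of frequency/power deviations) evaluated by simulating the hybrid system under the candidate FO fuzzy gains.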
A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System | Telecare Medical Information System (TMIS) provides a standard platform for patients to receive necessary medical treatment from doctor(s) via Internet communication. Security protection is important for the medical records (data) of patients because they contain very sensitive information. Besides, patient anonymity is another important property that must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS utilizing the concept of a cloud environment. They claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it fails to preserve patient anonymity. Further, the same protocol is not protected against a stolen mobile device attack. To improve the security level and reduce complexity, we design a lightweight authentication protocol for the same environment. Our security analysis shows resilience against all the relevant security attacks. The performance of our protocol is comparable with the related previous research. |
Human Swarms, a real-time method for collective intelligence | Although substantial research has explored the emergence of collective intelligence in real-time human-based collaborative systems, much of this work has focused on rigid scenarios such as the Prisoner’s Dilemma (PD) (Pinheiro et al., 2012; Santos et al., 2012). While such work is of great research value, there’s a growing need for a flexible real-world platform that fosters collective intelligence in authentic decision-making situations. This paper introduces a new platform called UNUM that allows groups of online users to collectively answer questions, make decisions, and resolve dilemmas by working together in unified dynamic systems. Modeled after biological swarms, the UNUM platform enables online groups to work in real-time synchrony, collaboratively exploring a decision-space and converging on preferred solutions in a matter of seconds. We call the process “social swarming” and early real-world testing suggests it has great potential for harnessing collective intelligence. |
Conjugated linoleic acid and disease prevention: a review of current knowledge. | Conjugated linoleic acid (CLA), a derivative of the fatty acid linoleic acid (LA), has been reported to decrease tumorigenesis in animals. CLA is unique because, unlike most antioxidants, which are components of plant products, it is present in food from animal sources such as dairy foods and meats. CLA concentrations in dairy products typically range from 2.9 to 8.92 mg/g fat, of which the 9-cis, 11-trans isomer makes up 73% to 93% of the total CLA. Low concentrations of CLA are found in human blood and tissues. In vitro results suggest that CLA is cytotoxic to MCF-7 cells and it inhibits the proliferation of human malignant melanoma and colorectal cancer cells. In animal studies, CLA has inhibited the development of mouse epidermal tumors, mouse forestomach cancer and rat mammary cancer. Hamsters fed CLA collectively had significantly reduced levels of plasma total cholesterol, non-high-density lipoprotein cholesterol (combined very-low and low-density lipoprotein) and triglycerides, with no effect on high-density lipoprotein cholesterol, as compared to controls. Dietary CLA modulated certain aspects of the immune defense but had no obvious effect on the growth of an established, aggressive mammary tumor in mice. It is now thought that CLA itself may not have anti-oxidant capabilities but may produce substances which protect cells from the detrimental effects of peroxides. There is, however, insufficient evidence from human epidemiological data, and very few of the animal studies have shown a dose-response relationship between the quantity of CLA fed and the extent of tumor growth. Further research with tumor models is needed to test the efficacy and utility of CLA in cancer and other disease prevention and form the basis of evaluating its effect in humans by observational studies and clinical trials. |
Speech compression and decompression using DWT and DCT | Speech compression reduces the digital speech signal using various techniques for transmission. This paper presents a transform methodology for compression of the speech signal. The speech is first compressed by the discrete wavelet transform (DWT) technique; the compressed signal is then further compressed by the discrete cosine transform (DCT), and the compressed signal is finally decompressed by the discrete wavelet transform. The performance on the speech signal is measured on the basis of peak signal-to-noise ratio (PSNR) and mean square error (MSE) using different filters of the wavelet family. Keywords—DCT, DWT, speech compression and decompression |
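The pipeline above (transform, discard small coefficients, reconstruct, score with PSNR/MSE) can be sketched with a one-level Haar DWT. This is a toy illustration under assumed signal values; a real implementation would use deeper decompositions, the DCT stage, and the various wavelet filters the paper compares.

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: perfect reconstruction when nothing is discarded."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out

def psnr_db(original, reconstructed):
    """Peak signal-to-noise ratio; higher means a closer reconstruction."""
    mse = sum((a - b) ** 2
              for a, b in zip(original, reconstructed)) / len(original)
    peak = max(abs(v) for v in original)
    return float("inf") if mse == 0.0 else 10.0 * math.log10(peak * peak / mse)

# "Compress" by zeroing small detail coefficients, then reconstruct and score.
x = [1.0, 1.1, 0.9, 1.0]
approx, detail = haar_dwt(x)
detail = [d if abs(d) > 0.2 else 0.0 for d in detail]
recon = haar_idwt(approx, detail)
quality = psnr_db(x, recon)
```

Zeroed detail coefficients are what get cheap to transmit; the PSNR/MSE pair then quantifies exactly the fidelity trade-off the paper evaluates across wavelet filters.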
Pixel-Based Skin Detection for Pornography Filtering | A robust skin detector is the primary need of many fields of computer vision, including face detection, gesture recognition, and pornography filtering. Less than 10 years ago, the first paper on automatic pornography filtering was published. Since then, different researchers claim different color spaces to be the best choice for skin detection in pornography filtering. Unfortunately, no comprehensive work has been performed on evaluating different color spaces and their performance for detecting naked persons. As such, researchers usually refer to the results of skin detection based on the work done for face detection, which underlies different imaging conditions. In this paper, we examine 21 color spaces in all their possible representations for pixel-based skin detection in pornographic images. Consequently, this paper holds a large investigation in the field of skin detection, and a specific run on the pornographic images. |
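As a concrete instance of the pixel-based classifiers being evaluated, the classic explicit RGB rule (a widely used heuristic from the skin-detection literature, not a result of this paper) labels a pixel as skin from its channel values alone:

```python
def is_skin_rgb(r, g, b):
    """Explicit RGB skin heuristic (Peer/Kovac-style daylight rule) for a
    single pixel, assuming 8-bit channel values (0-255)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

skin_pixel = is_skin_rgb(220, 160, 120)      # a typical skin tone
non_skin_pixel = is_skin_rgb(60, 120, 200)   # sky blue
```

Rules like this are fast but illumination-sensitive, which is exactly why the choice of color space, the subject of the paper, matters for pixel-based detection.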
Wireless Body Area Network (WBAN): A Survey on Reliability, Fault Tolerance, and Technologies Coexistence | Wireless Body Area Network (WBAN) has been a key element in e-health to monitor bodies. This technology enables new applications under the umbrella of different domains, including the medical field, the entertainment and ambient intelligence areas. This survey paper places substantial emphasis on the concept and key features of the WBAN technology. First, the WBAN concept is introduced and a review of key applications facilitated by this networking technology is provided. The study then explores a wide variety of communication standards and methods deployed in this technology. Due to the sensitivity and criticality of the data carried and handled by WBAN, fault tolerance is a critical issue and widely discussed in this paper. Hence, this survey investigates thoroughly the reliability and fault tolerance paradigms suggested for WBANs. Open research and challenging issues pertaining to fault tolerance, coexistence and interference management and power consumption are also discussed along with some suggested trends in these aspects. |
Web Content Mining Techniques Tools & Algorithms – A Comprehensive Study | Nowadays, the growth of the World Wide Web has exceeded all expectations. Large numbers of text documents, multimedia files, and images are available on the web, and the volume is still increasing in all its forms. Data mining is the process of extracting useful data available on the internet. Web mining is a branch of data mining that relates to various research communities such as information retrieval, database management systems, and artificial intelligence. The information in these forms is well structured from first principles. Web mining adopts many data mining techniques to discover potentially useful information from web content. In this paper, the concepts of web mining and its categories are discussed. The paper mainly focuses on web content mining tasks along with their techniques and algorithms. |
Effect of heliox on heart rate kinetics and dynamic hyperinflation during high-intensity exercise in COPD | Respiratory mechanical abnormalities in patients with chronic obstructive pulmonary disease (COPD) may impair cardiodynamic responses and slow heart rate (HR) kinetics compared with normal, resulting in reduced convective oxygen delivery during exercise. We reasoned that heliox breathing (79% helium–21% oxygen) and the attendant reduction of operating lung volumes should accelerate HR kinetics in the transition from rest to high-intensity exercise. Eleven male ambulatory patients with clinically stable COPD undertook constant work-rate cycle testing at 80% of each individual's maximum work capacity while breathing room air (RA) or heliox (HX), in randomized order. Mean response time (MRT) for HR and dynamic end-expiratory lung volume (EELV) were measured. Resting EELV was not affected by HX breathing, while exercise EELV decreased significantly by 0.23 L at isotime during HX breathing compared with RA. During HX breathing, MRT for HR significantly accelerated (p = 0.002) by an average of 20 s (i.e., 17%). Speeded MRT for HR correlated with indices of reduced lung hyperinflation, such as EELV at isotime (r = 0.88, p = 0.03), and with improved exercise endurance time (r = −0.64, p = 0.03). The results confirm that HX-induced reduction of dynamic lung hyperinflation is associated with consistent improvement in indices of cardio-circulatory function such as HR kinetics in the rest-to-exercise transition in COPD patients. |
IQA: Visual Question Answering in Interactive Environments | We introduce Interactive Question Answering (IQA), the task of answering questions that require an autonomous agent to interact with a dynamic visual environment. IQA presents the agent with a scene and a question, like: "Are there any apples in the fridge?" The agent must navigate around the scene, acquire visual understanding of scene elements, interact with objects (e.g. open refrigerators) and plan for a series of actions conditioned on the question. Popular reinforcement learning approaches with a single controller perform poorly on IQA owing to the large and diverse state space. We propose the Hierarchical Interactive Memory Network (HIMN), consisting of a factorized set of controllers, allowing the system to operate at multiple levels of temporal abstraction. To evaluate HIMN, we introduce IQUAD V1, a new dataset built upon AI2-THOR [35], a simulated photo-realistic environment of configurable indoor scenes with interactive objects. IQUAD V1 has 75,000 questions, each paired with a unique scene configuration. Our experiments show that our proposed model outperforms popular single controller based methods on IQUAD V1. For sample questions and results, please view our video: https://youtu.be/pXd3C-1jr98. |
DeepPlaylist: Using Recurrent Neural Networks to Predict Song Similarity |
The Q-matrix Method: Mining Student Response Data for Knowledge | Although many talented researchers have created excellent tools for computer-assisted instruction and intelligent tutoring systems, creating high-quality, effective, scalable but individualized tools for learning at a low cost is still an open research challenge. Many learning tools create complex models of student behavior that require extensive time on the part of subject experts, as well as cognitive science researchers, to create effective help and feedback strategies. In this research, we propose a different approach, called the q-matrix method, where data from student behavior is “mined” to create concept models of the material being taught. These models are then used to both understand student behavior and direct learning paths for future students. We describe the q-matrix method and present preliminary results that imply that the method can effectively predict which concepts need further review. |
Comparing the performance of different x86 SIMD instruction sets for a medical imaging application on modern multi- and manycore chips | Single Instruction, Multiple Data (SIMD) vectorization is a major driver of performance in current architectures, and is mandatory for achieving good performance with codes that are limited by instruction throughput. We investigate the efficiency of different SIMD-vectorized implementations of the RabbitCT benchmark. RabbitCT performs 3D image reconstruction by back projection, a vital operation in computed tomography applications. The underlying algorithm is a challenge for vectorization because it consists, apart from a streaming part, also of a bilinear interpolation requiring scattered access to image data. We analyze the performance of SSE (128 bit), AVX (256 bit), AVX2 (256 bit), and IMCI (512 bit) implementations on recent Intel x86 systems. A special emphasis is put on the vector gather implementation on Intel Haswell and Knights Corner microarchitectures. Finally we discuss why GPU implementations perform much better for this specific algorithm. |
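The bilinear interpolation step described above requires four scattered loads per output value, which is exactly why a vector gather is needed. A scalar Python sketch of that kernel (illustrative only, not the RabbitCT benchmark's actual code) makes the data-dependent addressing visible:

```python
def bilerp(img, x, y):
    """Bilinearly interpolate image `img` (a list of rows) at real coords (x, y).

    The four neighbour reads below have data-dependent addresses; in a
    SIMD version each becomes a lane of a gather instruction.
    """
    x0, y0 = int(x), int(y)      # top-left integer pixel
    dx, dy = x - x0, y - y0      # fractional offsets within the cell
    # Scattered loads: four neighbouring pixels.
    p00 = img[y0][x0]
    p01 = img[y0][x0 + 1]
    p10 = img[y0 + 1][x0]
    p11 = img[y0 + 1][x0 + 1]
    # Two interpolations along x, then one along y.
    top = p00 + dx * (p01 - p00)
    bot = p10 + dx * (p11 - p10)
    return top + dy * (bot - top)
```

In the vectorized implementations compared in the paper, the four loads map to hardware gather instructions on Haswell (AVX2) and Knights Corner (IMCI), while SSE/AVX must emulate them with scalar loads and inserts.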
Sequential Person Recognition in Photo Albums with a Recurrent Network | Recognizing the identities of people in everyday photos is still a very challenging problem for machine vision, due to issues such as non-frontal faces and changes in clothing, location, and lighting. Recent studies have shown that rich relational information between people in the same photo can help in recognizing their identities. In this work, we propose to model the relational information between people as a sequence prediction task. At the core of our work is a novel recurrent network architecture, in which relational information between instance labels and appearance is modeled jointly. In addition to relational cues, scene context is incorporated in our sequence prediction model with no additional cost. In this sense, our approach is a unified framework for modeling both contextual cues and visual appearance of person instances. Our model is trained end-to-end with a sequence of annotated instances in a photo as inputs, and a sequence of corresponding labels as targets. We demonstrate that this simple but elegant formulation achieves state-of-the-art performance on the newly released People In Photo Albums (PIPA) dataset. |
On the ecological environmental quality of Wukeshu | The ecosystems of the town of Wukeshu mainly include the following types: agro-ecosystem, water ecosystem, settlement ecosystem, and scenic tourist ecosystem. According to the research results on the environmental quality and ecosystems of the town of Wukeshu, and based on scientific analysis and forecasting, it is considered that the dust, waste water, and domestic waste will not produce harmful effects on the regional ecological environment. |
Multiracial Facial Golden Ratio and Evaluation of Facial Appearance | UNLABELLED
This study aimed to investigate the association of facial proportion and its relation to the golden ratio with the evaluation of facial appearance among the Malaysian population. This was a cross-sectional study with 286 subjects randomly selected from among Universiti Sains Malaysia (USM) Health Campus students (150 females and 136 males; 100 Malaysian Chinese, 100 Malaysian Malay and 86 Malaysian Indian), with a mean age of 21.54 ± 1.56 years (age range, 18-25). Facial indices obtained from direct facial measurements were used for the classification of facial shape into short, ideal and long. A validated structured questionnaire was used to assess subjects' evaluation of their own facial appearance. The mean facial indices of Malaysian Indian (MI), Malaysian Chinese (MC) and Malaysian Malay (MM) subjects were 1.59 ± 0.19, 1.57 ± 0.25 and 1.54 ± 0.23, respectively. Only MC showed significant sexual dimorphism in facial index (P = 0.047; P < 0.05), but no significant difference was found between races. Of the 286 subjects, 49 (17.1%) were of ideal facial shape, 156 (54.5%) short, and 81 (28.3%) long. The facial evaluation questionnaire showed that MC had the lowest satisfaction, with mean scores of 2.18 ± 0.97 for overall impression and 2.15 ± 1.04 for facial parts, compared to MM and MI, with mean scores of 1.80 ± 0.97 and 1.64 ± 0.74, respectively, for overall impression, and 1.75 ± 0.95 and 1.70 ± 0.83, respectively, for facial parts.
IN CONCLUSION
1) Only 17.1% of Malaysian facial proportions conformed to the golden ratio, with the majority of the population having a short face (54.5%); 2) Facial index did not depend significantly on race; 3) Significant sexual dimorphism was shown among Malaysian Chinese; 4) All three races were generally satisfied with their own facial appearance; 5) No significant association was found between the golden ratio and facial evaluation score among the Malaysian population. |
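The short/ideal/long classification used in such facial-index studies can be sketched as a simple ratio check against the golden ratio. The index definition (face height over width) and the 5% tolerance band below are illustrative assumptions for this sketch, not the study's exact protocol:

```python
GOLDEN_RATIO = 1.618

def classify_face(face_height_cm, face_width_cm, tolerance=0.05):
    """Classify a face as 'short', 'ideal', or 'long' by comparing its
    facial index (height / width) to the golden ratio.

    The height/width definition and the tolerance band are hypothetical
    choices for illustration only.
    """
    index = face_height_cm / face_width_cm
    if index < GOLDEN_RATIO * (1 - tolerance):
        return "short"
    if index > GOLDEN_RATIO * (1 + tolerance):
        return "long"
    return "ideal"
```

Under these assumptions, an index of 1.57 (the study's Malaysian Chinese mean) would fall just inside the "ideal" band, while the reported short-face majority corresponds to indices well below 1.54.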
Social cognitive theory of posttraumatic recovery: the role of perceived self-efficacy. | The present article integrates findings from diverse studies on the generalized role of perceived coping self-efficacy in recovery from different types of traumatic experiences. They include natural disasters, technological catastrophes, terrorist attacks, military combat, and sexual and criminal assaults. The various studies apply multiple controls for diverse sets of potential contributors to posttraumatic recovery. In these different multivariate analyses, perceived coping self-efficacy emerges as a focal mediator of posttraumatic recovery. Verification of its independent contribution to posttraumatic recovery across a wide range of traumas lends support to the centrality of the enabling and protective function of belief in one's capability to exercise some measure of control over traumatic adversity. |
Exploring Text Links for Coherent Multi-Document Summarization | Summarization aims to represent source documents by a shortened passage. Existing methods focus on the extraction of key information, but often neglect coherence. Hence the generated summaries suffer from a lack of readability. To address this problem, we have developed a graph-based method by exploring the links between text to produce coherent summaries. Our approach involves finding a sequence of sentences that best represent the key information in a coherent way. In contrast to the previous methods that focus only on salience, the proposed method addresses both coherence and informativeness based on textual linkages. We conduct experiments on the DUC2004 summarization task data set. A performance comparison reveals that the summaries generated by the proposed system achieve comparable results in terms of the ROUGE metric, and show improvements in readability by human evaluation. |
Machine-to-Machine Communication in LTE-A | The number of wireless-equipped machines has increased greatly in recent years, and among the options, cellular-network-based machine-to-machine (M2M) communication has shown the advantages of better coverage and lower network deployment cost. However, the current cellular network is designed for human-to-human communication, targeting voice/media transmission with low access delay and high throughput, so machines in GSM and UMTS are designed like screenless phones. To extend the market to machine-type communication, it is envisaged that specific optimizations for M2M will be introduced in LTE-A, especially as M2M communication accounts for a considerable part of total networking activity. In this paper, some considerations for network-side improvements are presented, spanning the physical layer, MAC, and core network. |
A new insight into the formation of odor active carbonyls by thermally-induced degradation of phospholipids in self-assembly structures. | The role of molecular organization in heated aqueous dispersions of egg phosphatidylcholine (PC) and egg phosphatidylethanolamine (PE) was studied with respect to the formation of key odorants. Evidence was found for the crucial role of self-assembly structures adopted by phospholipid molecules on the quantitative composition of volatile constituents. The concentrations of seven aldehydes and one vinyl ketone were determined by isotope dilution assay in heated aqueous dispersions of PC and PE present in various ratios. Addition of PE to PC drastically decreased the amount of (E,E)-2,4-decadienal formed, which cannot be explained by the differences in the fatty acid composition of PC and PE. The free amino group in PE does not explain this phenomenon either, as replacing PE by phosphatidic acid distearylester also reduced the amounts of (E,E)-2,4-decadienal. We suggest that the type of self-assembly structure adopted by phospholipids in water significantly influences the reaction yields. However, the mechanisms leading to the preferred formation of phospholipid-derived odorants in a lamellar phase, as compared to the reversed hexagonal phase, remain unknown. |
Deep Neural Decision Trees | Deep neural networks have been proven powerful at processing perceptual data, such as images and audio. However for tabular data, tree-based models are more popular. A nice property of tree-based models is their natural interpretability. In this work, we present Deep Neural Decision Trees (DNDT) – tree models realised by neural networks. A DNDT is intrinsically interpretable, as it is a tree. Yet as it is also a neural network (NN), it can be easily implemented in NN toolkits, and trained with gradient descent rather than greedy splitting. We evaluate DNDT on several tabular datasets, verify its efficacy, and investigate similarities and differences between DNDT and vanilla decision trees. Interestingly, DNDT self-prunes at both split and feature-level. |
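The core of DNDT is a differentiable "soft binning" function whose softmax output approaches a one-hot bin indicator at low temperature, so the cut points can be learned by gradient descent. A minimal dependency-free sketch of that construction (following the published soft-binning formulation, not the authors' code):

```python
import math

def soft_bin(x, cut_points, temperature=0.1):
    """Differentiable binning of a scalar x into the n+1 bins induced by
    n sorted cut points.

    Returns a softmax distribution over bins that becomes near one-hot
    as `temperature` shrinks; fixed slopes 1..n+1 and cumulative biases
    make each bin the argmax on its own interval (the DNDT construction).
    """
    n = len(cut_points)
    w = [i + 1.0 for i in range(n + 1)]        # fixed slopes 1..n+1
    b = [0.0]
    for beta in cut_points:
        b.append(b[-1] - beta)                 # cumulative biases 0, -b1, -b1-b2, ...
    logits = [(w[i] * x + b[i]) / temperature for i in range(n + 1)]
    m = max(logits)                            # stabilize the softmax
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]
```

With a single cut point at 0.5, an input of 0.0 lands almost entirely in bin 0 and an input of 1.0 in bin 1; a tree is then formed by taking outer products of these per-feature bin distributions and attaching leaf scores.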
A Wearable Computing Prototype for supporting training activities in Automotive Production | This paper presents the results of a wearable computing prototype supporting training and qualification activities at the SKODA production facilities in the Czech Republic. The emerging prototype is based upon the first of the two main "Variant Production Showcases" (training and assembly-line) to be implemented in the WearIT@work project (EC IP 004216). As an introduction, the authors of this paper investigate current training processes at Skoda and derive the potential benefits and risks of applying wearable computing technology. Accordingly, the approach of creating the wearable prototypes via usability experiments at the Skoda production site is explained in detail. As a preliminary result, the first functional prototypes, including a task recognition prototype based upon the components of the European Wearable Computing Platform, are described. The paper is rounded off by providing a short outlook on the second envisaged test case, which is focused upon selected assembly-line operations of blue-collar workers. |
Effects of methylphenidate and expectancy on children with ADHD: behavior, academic performance, and attributions in a summer treatment program and regular classroom settings. | Pharmacological and expectancy effects of 0.3 mg/kg methylphenidate on the behavior and attributions of boys with attention-deficit/hyperactivity disorder were evaluated. In a within-subject, balanced-placebo design, 136 boys received 4 medication-expectancy conditions. Attributions for success and failure on a daily report card were gathered. Assessments took place within the setting of a summer treatment program and were repeated in boys' regular classrooms. Expectancy did not affect the boys' behavior; only active medication improved their behavior. Boys attributed their success to their effort and ability and attributed failure to task difficulty and the pill, regardless of medication and expectancy. Results were generally equivalent across the two settings; where there were differences, beneficial effects of medication were more apparent in the school setting. The findings were unaffected by individual-difference factors. |
Pharmacokinetic and pharmacodynamic comparison of fluoropyrimidine derivatives, capecitabine and 5′-deoxy-5-fluorouridine (5′-DFUR) | Capecitabine is a three-step prodrug that was rationally designed to be a more effective and safer alternative to its intermediate metabolite, 5′-deoxy-5-fluorouridine (5′-DFUR). We compared the pharmacokinetics/pharmacodynamics of these drugs in metastatic breast cancer patients. Six patients received oral capecitabine at 1657 mg/m2 twice daily and 17 received 5′-DFUR at 400 mg three times daily. Both drugs were administered for 21 days followed by a 7-day rest. Median daily 5′-DFUR AUC was significantly higher for capecitabine than for 5′-DFUR (81.1 vs 32.6 mmol h/l; P=0.01). Following treatment with 5′-DFUR, the median AUC and Cmax of 5′-DFUR tended to be higher in patients with a partial response (3.83 μg h/ml and 4.88 μg/ml) and stable disease (6.46 μg h/ml and 4.96 μg/ml) than in those with disease progression (2.53 μg h/ml and 1.36 μg/ml). The AUC and Cmax of 5′-DFUR was significantly related to overall survival. These results support the superiority of capecitabine over 5′-DFUR. |
Characterization of laminar jet impingement cooling in portable computer application | A thermal characterization study of laminar air jet impingement cooling of electronic components within a geometry representative of the CPU compartment of a typical portable computer is reported. A finite control volume technique was used to solve for the velocity and temperature fields. Convection, conduction and radiation effects were included in the simulations. The range of jet Reynolds numbers considered was 63 to 1500; the applied compartment heat load ranged from 5 W to 15 W. Radiation effects were significant over the range of Reynolds numbers and heat loads considered, while the effect of natural convection was only noticeable for configurations in which the ratio Gr/Re exceeded 5. The predicted importance of Re rather than jet size was confirmed with test data. Proof of concept was demonstrated with a numerical model representative of a full laptop computer. Both simulations and lab tests showed that low flow rate jet impingement (JI) cooling schemes can provide cooling comparable to a high volume flow rate configuration, while using only a fraction of the air flow. Further, under the conservative assumption of steady-state, fully powered components, a hybrid cooling scheme utilizing a heat pipe and laminar JI was capable of cooling the processor chip to within 11°C of the vendor-specified maximum temperature for a system with a total power dissipation of over 21 W. Index Terms – jet impingement, laminar flow, portable computers, heat pipe |
MicroProteins: small size-big impact. | MicroProteins (miPs) are short, usually single-domain proteins that, in analogy to miRNAs, heterodimerize with their targets and exert a dominant-negative effect. Recent bioinformatic attempts to identify miPs have resulted in a list of potential miPs, many of which lack the defining characteristics of a miP. In this opinion article, we clearly state the characteristics of a miP as evidenced by known proteins that fit the definition; we explain why modulatory proteins misrepresented as miPs do not qualify as true miPs. We also discuss the evolutionary history of miPs, and how the miP concept can extend beyond transcription factors (TFs) to encompass different non-TF proteins that require dimerization for full function. |
Channel Gain Map Tracking via Distributed Kriging | A collaborative algorithm is developed to estimate the channel gains of wireless links in a geographical area. The spatiotemporal evolution of shadow fading is characterized by judiciously extending an experimentally verified spatial-loss field model. Kriged Kalman filtering (KKF), which is a tool with widely appreciated merits in spatial statistics and geosciences, is adopted and implemented in a distributed fashion to track the time-varying shadowing field using a network of radiometers. The novel distributed KKF requires only local message passing yet achieves a global view of the radio frequency environment through consensus iterations. Numerical tests demonstrate superior tracking accuracy of the collaborative algorithm compared with its noncollaborative counterpart. Furthermore, the efficacy of the global channel gain knowledge obtained is showcased in the context of cognitive radio resource allocation. |
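The temporal half of kriged Kalman filtering is an ordinary Kalman recursion over the shadowing state. A scalar predict/update step can be sketched as follows; the spatial kriging term and the distributed consensus iterations are omitted, and all model and noise parameters are illustrative assumptions:

```python
def kalman_step(x, P, z, A=1.0, Q=0.01, H=1.0, R=0.1):
    """One predict/update cycle of a scalar Kalman filter.

    x, P : prior state estimate and its variance
    z    : new (noisy) measurement, e.g. a radiometer gain reading
    A, Q : state transition and process-noise variance (assumed values)
    H, R : measurement model and measurement-noise variance (assumed)
    """
    # Predict: propagate the state and its uncertainty forward in time.
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: blend the prediction with the measurement via the Kalman gain.
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new
```

In the distributed KKF of the paper, each radiometer runs such a recursion on a shared field model and reconciles its estimate with neighbors through consensus message passing rather than a central fusion node.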
Security analysis of the constrained application protocol in the Internet of Things | The concept of the Internet of Things involves a huge number of constrained devices, such as wireless sensors, communicating in a machine-to-machine pattern. Depending on the implementation scenario, such communication might take place over a public network such as the Internet, which is based on the TCP/IP stack. However, different research working groups argue that some of these stack protocols, such as the Hyper Text Transfer Protocol (HTTP), might not be suitable for constrained devices. Therefore, the IETF Constrained RESTful Environments (CoRE) WG has proposed the Constrained Application Protocol (CoAP), an application layer protocol for constrained devices in the IoT. The CoRE WG proposed using IPsec or DTLS to secure CoAP communication at different levels of the protocol stack. However, to investigate the feasibility of such a proposal, we use the X.805 security standard to analyze the security aspects of such an implementation. The analysis highlights the main security drawbacks and hence argues for the need for a new integrated security solution. |
AffectButton: A method for reliable and valid affective self-report | In this article we report on a new digital interactive self-report method for the measurement of human affect. The AffectButton (Broekens & Brinkman, 2009) is a button that enables users to provide affective feedback in terms of values on the well-known three affective dimensions of Pleasure (Valence), Arousal and Dominance. The AffectButton is an interface component that functions and looks like a medium-sized button. The button presents one dynamically changing iconic facial expression that changes based on the coordinates of the user’s pointer in the button. To give affective feedback the user selects the most appropriate expression by clicking the button, effectively enabling 1-click affective self-report on 3 affective dimensions. Here we analyze 5 previously published studies, and 3 novel large-scale studies (n=325, n=202, n=128). Our results show the reliability, validity, and usability of the button for acquiring three types of affective feedback in various domains. The tested domains are holiday preferences, real-time music annotation, emotion words, and textual situation descriptions (ANET). The types of affective feedback tested are preferences, affect attribution to the previously mentioned stimuli, and self-reported mood. All of the subjects tested were Dutch and aged between 15 and 56 years. We end this article with a discussion of the limitations of the AffectButton and of its relevance to areas including recommender systems, preference elicitation, social computing, online surveys, coaching and tutoring, experimental psychology and psychometrics, content annotation, and game consoles. |
Survival After Extracorporeal Cardiopulmonary Resuscitation on Weekends in Comparison With Weekdays. | BACKGROUND
Extracorporeal cardiopulmonary resuscitation (ECPR) requires urgent decision-making and high-quality skills, which may not be uniformly available throughout the week. Few data exist on the outcomes of patients with cardiac arrest who receive in-hospital ECPR on the weekday versus weekend. Therefore, we investigated whether the outcome differed when patients with in-hospital cardiac arrest received ECPR during the weekend compared with a weekday.
METHODS
Two hundred patients underwent extracorporeal membrane oxygenation after in-hospital cardiac arrest between January 2004 and December 2013. Patients treated between 0800 Monday and 1759 Friday were considered to have received weekday care, and patients treated between 1800 Friday and 0759 Monday were considered to have received weekend care.
RESULTS
A total of 135 cases of ECPR for in-hospital cardiac arrest occurred during the weekday (64 during daytime hours and 71 during nighttime hours), and 65 cases occurred during the weekend (39 during daytime/evening hours and 26 during nighttime hours). Rates of survival to discharge were higher with weekday care than with weekend care (35.8% versus 21.5%, p = 0.041). Cannulation failure was more frequent in the weekend group (1.5% versus 7.7%, p = 0.038). Complication rates were higher on the weekend than on the weekday, including cannulation site bleeding (3.0% versus 10.8%, p = 0.041), limb ischemia (5.9% versus 15.6%, p = 0.026), and procedure-related infections (0.7% versus 9.2%, p = 0.005).
CONCLUSIONS
ECPR on the weekend was associated with a lower survival rate and lower resuscitation quality, including higher cannulation failure and higher complication rate. |
Type II diabetes mellitus and menopause: a multinational study. | BACKGROUND
Type II diabetes mellitus causes metabolic changes that may lead to early menopause and worsen climacteric symptoms.
OBJECTIVES
To determine the risk factors for type II diabetes mellitus and assess the impact of this disease on the age of menopause and on climacteric symptoms.
METHODS
A total of 6079 women aged between 40 and 59 years from 11 Latin American countries were requested to answer the Menopause Rating Scale and Goldberg Anxiety-Depression Scale.
RESULTS
The prevalence of diabetes was 6.7%. Diabetes mellitus was associated with arterial hypertension (odds ratio (OR) 4.49; 95% confidence interval (CI) 3.47-5.31), the use of psychotropic drugs (OR 1.54; 95% CI 1.22-1.94), hormonal therapy (OR 1.46; 95% CI 1.11-1.92), age ≥ 50 years (OR 1.48; 95% CI 1.17-1.86), being overweight or obese (OR 1.47; 95% CI 1.15-1.89), and waist circumference ≥ 88 cm (OR 1.32; 95% CI 1.06-1.65). Factors associated with a lower risk of diabetes were the use of hormonal contraceptives (OR 0.55; 95% CI 0.35-0.87), alcohol consumption (OR 0.73; 95% CI 0.54-0.98), and living in cities > 2500 meters above sea level (OR 0.70; 95% CI 0.53-0.91) or with high temperatures (OR 0.67; 95% CI 0.51-0.88). In turn, diabetes tripled the risk of menopause in women under 45 years of age. Diabetes did not increase the risk of deterioration of quality of life due to climacteric symptoms.
CONCLUSION
Menopause does not increase the risk of type II diabetes mellitus. Diabetes is associated with early menopause in women under 45 years of age. |
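Odds ratios with 95% confidence intervals, as reported in epidemiological abstracts like the one above, are conventionally computed from a 2x2 contingency table with a Wald interval on the log scale. A sketch with hypothetical counts (not this study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    The counts used in any example are hypothetical illustrations.
    """
    odds_ratio = (a * d) / (b * c)
    # Standard error of log(OR) under the Wald approximation.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se)
    upper = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lower, upper
```

For instance, a table of (10, 20, 5, 40) gives OR = 4.0 with a CI of roughly 1.2 to 13.3; an association is conventionally called significant when the CI excludes 1, as for the hypertension OR of 4.49 (CI 3.47-5.31) above.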
Piezoelectric properties of ScAlN thin films for piezo-MEMS devices | This paper reports the piezoelectric properties of ScAlN thin films. We evaluated the piezoelectric coefficients d<sub>33</sub> and d<sub>31</sub> of Sc<sub>x</sub>Al<sub>1-x</sub>N thin films directly deposited onto silicon wafers, as well the radio frequency (RF) electrical characteristics of Sc<sub>0.35</sub>Al<sub>0.65</sub>N bulk acoustic wave (BAW) resonators at around 2 GHz, and found a maximum value for d<sub>33</sub> of 28 pC/N and a maximum -d<sub>31</sub> of 13 pm/V at 40% scandium concentration. In BAW resonators that use Sc<sub>0.35</sub>Al<sub>0.65</sub>N as a piezoelectric film, the electromechanical coupling coefficient k<sup>2</sup> (=15.5%) was found to be 2.6 times that of resonators with AlN films. These experimental results are in very close agreement with first-principles calculations. The large electromechanical coupling coefficient and high sound velocity of these films should make them suitable for high frequency applications. |
German Werkkunstschules and the Establishment of Industrial Design Education in Turkey | Introduction Istanbul has been the pioneering city for product design instruction since the inception of this academic field in Turkey. The current umbrella organization under which industrial design instruction is governed is the State Academy of Fine Arts. In 1972, a separate division at the Academy (the "Interior Architecture and Industrial Design" division) was formed to specifically address instruction in industrial design at higher schools and universities nationwide. The precursor and inspiration for this division was the State School of Applied Fine Arts, an industrial product design school which opened in 1957. Originally a three-year school, in 1962 the State School of Applied Fine Arts became a four-year institution and its name was changed to the Istanbul State School of Applied Fine Arts. Following the example of the State School of Applied Fine Arts, a private school for the applied arts opened in 1968 but was taken over by the State Academy shortly thereafter (during the period in which private universities were banned in Turkey). This second applied arts school was renamed the School of Applied Industrial Arts, and included a department of industrial design until the 1980s. While the establishment of the State School of Applied Fine Arts, which forms the core of this article, was still in progress, the rapid developments seen in the State Academy of Fine Arts were realized thanks to the personal efforts of Önder Küçükerman, who, at the time, was head of the Interior Architecture Department. At the same time, researchers in the field considered the State School of Applied Fine Arts to be the pioneering school for industrial design training in Turkey. H. Alpay Er has observed: It can be claimed that training for industrial product design in Turkey stemmed from two main roots.
One of them is the two important art schools in Istanbul: the Istanbul State Academy of Fine Arts and the State School of Applied Fine Arts. Although not much was written about the latter school, relatively more resources are available to study the foundation of instruction at the Istanbul State Academy of Fine Arts. [Footnote 1: The subject of this article was first presented as "An Overview of the State Academy of Applied Arts and the Influence of German Werkkunstschules during the Establishment and Development Period of Industrial Product Design in Turkey" at "Mind the Map, Third International Conference on Design History & Design Studies" (Istanbul, July 2002, organized by the Istanbul Technical University in cooperation with the Kent Institute of Art and Design).] |
Calculus of cooperation and game-based reasoning about protocol privacy | The article introduces a new formal system, the calculus of cooperation, for reasoning about coalitions of players in a certain class of games. The calculus is an extension of the propositional intuitionistic logic that adds a coalition parameter to intuitionistic implication. The system is shown to be sound and complete with respect to a game semantics.
One intended application of the calculus of cooperation is the verification of privacy properties in multiparty computation protocols. The article argues that such properties can be established by providing a set of strategies for a non-zero-sum, perfect information game based on the protocol. It concludes with several examples of such verifications formalized in the calculus of cooperation. |
tmChem: a high performance approach for chemical named entity recognition and normalization | Chemical compounds and drugs are an important class of entities in biomedical research with great potential in a wide range of applications, including clinical medicine. Locating chemical named entities in the literature is a useful step in chemical text mining pipelines for identifying the chemical mentions, their properties, and their relationships as discussed in the literature. We introduce the tmChem system, a chemical named entity recognizer created by combining two independent machine learning models in an ensemble. We use the corpus released as part of the recent CHEMDNER task to develop and evaluate tmChem, achieving a micro-averaged f-measure of 0.8739 on the CEM subtask (mention-level evaluation) and 0.8745 f-measure on the CDI subtask (abstract-level evaluation). We also report a high-recall combination (0.9212 for CEM and 0.9224 for CDI). tmChem achieved the highest f-measure reported in the CHEMDNER task for the CEM subtask, and the high-recall variant achieved the highest recall on both the CEM and CDI tasks. We report that tmChem is a state-of-the-art tool for chemical named entity recognition and that performance for chemical named entity recognition has now tied (or exceeded) the performance previously reported for genes and diseases. Future research should focus on tighter integration between the named entity recognition and normalization steps for improved performance. The source code and a trained model for both models of tmChem are available at: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmChem. The results of running tmChem (Model 2) on PubMed are available in PubTator: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/PubTator. |
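The mention-level scores above are micro-averaged: true positives, false positives, and false negatives are pooled over all abstracts before precision and recall are computed. A minimal sketch of that computation follows; the per-document counts are invented for illustration and are not tmChem's actual evaluation data.

```python
def micro_f1(counts):
    """counts: iterable of (tp, fp, fn) tuples, one per document."""
    tp = sum(c[0] for c in counts)
    fp = sum(c[1] for c in counts)
    fn = sum(c[2] for c in counts)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative per-abstract counts (not tmChem's real results).
score = micro_f1([(8, 1, 2), (5, 0, 1), (10, 2, 1)])
```

Because the counts are pooled first, documents with many mentions weigh more than sparse ones, unlike a macro average over per-document scores.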
Psycho-physiological measures for assessing cognitive load | With a focus on presenting information at the right time, the ubicomp community can benefit greatly from learning the most salient human measures of cognitive load. Cognitive load can be used as a metric to determine when or whether to interrupt a user. In this paper, we collected data from multiple sensors and compared their ability to assess cognitive load. Our focus is on visual perception and cognitive speed-focused tasks that leverage cognitive abilities common in ubicomp applications. We found that across all participants, the electrocardiogram median absolute deviation and median heat flux measurements were the most accurate at distinguishing between low and high levels of cognitive load, providing a classification accuracy of over 80% when used together. Our contribution is a real-time, objective, and generalizable method for assessing cognitive load in cognitive tasks commonly found in ubicomp systems and situations of divided attention. |
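The strongest ECG feature named above is the median absolute deviation (MAD), a robust spread measure: the median of each sample's absolute distance from the overall median. A minimal sketch, with invented R-R interval values purely for illustration:

```python
def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def mad(xs):
    """Median absolute deviation: robust to the outliers common in raw ECG."""
    m = median(xs)
    return median([abs(x - m) for x in xs])

# Hypothetical R-R intervals in milliseconds (illustrative values only).
rest = [810, 805, 820, 798, 815, 830, 802]
spread = mad(rest)
```

Unlike standard deviation, a single artifact beat barely moves the MAD, which is one reason it is attractive for wearable-sensor data.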
Hatha yoga: improved vital capacity of college students. | CONTEXT
The vital capacity of the lungs is a critical component of good health. Vital capacity is an important concern for those with asthma, heart conditions, and lung ailments; those who smoke; and those who have no known lung problems.
OBJECTIVE
To determine the effects of yoga postures and breathing exercises on vital capacity.
DESIGN
Using the Spiropet spirometer, researchers measured vital capacity. Vital capacity measurements were taken near the beginning and end of two 17-week semesters. No control group was used.
SETTING
Midwestern university yoga classes taken for college credit.
PARTICIPANTS
A total of 287 college students, 89 men and 198 women.
INTERVENTION
Subjects were taught yoga poses, breathing techniques, and relaxation in two 50-minute class meetings for 15 weeks.
MAIN OUTCOME MEASURES
Vital capacity over time for smokers, asthmatics, and those with no known lung disease.
RESULTS
The study showed a statistically significant (P < .001) improvement in vital capacity across all categories over time.
CONCLUSIONS
It is not known whether these findings were the result of yoga poses, breathing techniques, relaxation, or other aspects of exercise in the subjects' lives. The subjects' adherence to attending class was 99.96%. The sample of 287 subjects is considered adequate for a study of this type. These findings are consistent with other research studies reporting the positive effect of yoga on the vital capacity of the lungs. |
EasyTracker: automatic transit tracking, mapping, and arrival time prediction using smartphones | In order to facilitate the introduction of transit tracking and arrival time prediction in smaller transit agencies, we investigate an automatic, smartphone-based system which we call EasyTracker. To use EasyTracker, a transit agency must obtain smartphones, install an app, and place a phone in each transit vehicle. Our goal is to require no other input.
This level of automation is possible through a set of algorithms that use GPS traces collected from instrumented transit vehicles to determine routes served, locate stops, and infer schedules. In addition, online algorithms automatically determine the route served by a given vehicle at a given time and predict its arrival time at upcoming stops.
We evaluate our algorithms on real datasets from two existing transit services. We demonstrate our ability to accurately reconstruct routes and schedules, and compare our system's arrival time prediction performance with the current "state of the art" for smaller transit operators: the official schedule. Finally, we discuss our current prototype implementation and the steps required to take it from a research prototype to a real system. |
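The offline stop-location step can be illustrated very roughly: consecutive low-speed GPS fixes are grouped, and each group's centroid becomes a candidate stop. EasyTracker's actual algorithms cluster across many traces, so the following is only a toy sketch with invented coordinates and an assumed speed threshold.

```python
def infer_stops(trace, speed_thresh=1.0):
    """trace: list of (lat, lon, speed_mps) fixes in time order.
    Groups consecutive low-speed fixes; each group's centroid is a stop."""
    stops, group = [], []
    for lat, lon, speed in trace:
        if speed < speed_thresh:
            group.append((lat, lon))
        else:
            if group:
                stops.append((sum(p[0] for p in group) / len(group),
                              sum(p[1] for p in group) / len(group)))
            group = []
    if group:
        stops.append((sum(p[0] for p in group) / len(group),
                      sum(p[1] for p in group) / len(group)))
    return stops

# Invented trace: dwell, movement, dwell.
trace = [(40.4400, -79.9960, 0.2), (40.4400, -79.9960, 0.4),
         (40.4410, -79.9950, 8.5), (40.4425, -79.9940, 9.1),
         (40.4430, -79.9930, 0.3), (40.4430, -79.9930, 0.1)]
stops = infer_stops(trace)
```

A production version would also merge stops found in different runs of the same route and filter out traffic-light dwells, e.g. by requiring a stop to recur across traces.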
A non-volatile organic electrochemical device as a low-voltage artificial synapse for neuromorphic computing. | The brain is capable of massively parallel information processing while consuming only ∼1-100 fJ per synaptic event. Inspired by the efficiency of the brain, CMOS-based neural architectures and memristors are being developed for pattern recognition and machine learning. However, the volatility, design complexity and high supply voltages for CMOS architectures, and the stochastic and energy-costly switching of memristors complicate the path to achieve the interconnectivity, information density, and energy efficiency of the brain using either approach. Here we describe an electrochemical neuromorphic organic device (ENODe) operating with a fundamentally different mechanism from existing memristors. ENODe switches at low voltage and energy (<10 pJ for 10³ μm² devices), displays >500 distinct, non-volatile conductance states within a ∼1 V range, and achieves high classification accuracy when implemented in neural network simulations. Plastic ENODes are also fabricated on flexible substrates enabling the integration of neuromorphic functionality in stretchable electronic systems. Mechanical flexibility makes ENODes compatible with three-dimensional architectures, opening a path towards extreme interconnectivity comparable to the human brain. |
An investigation of the efficacy of oral myofunctional therapy as a precursor to articulation therapy for pre-first grade children. | This study investigated the assumption that oral myofunctional services might facilitate remediation of articulation disorders. Subjects were ten 6-year-old elementary school children who exhibited both tongue-thrust behaviors and articulation errors. All subjects had equal service time for a 14-week period, half receiving articulation services only and the other half receiving oral myofunctional services prior to and in conjunction with articulation services. Results showed that children in both programs made essentially equal progress in correcting placement of tongue-tip sounds, remediating /s/ and /z/ misarticulation, and remediating general articulation errors. Only children who received oral myofunctional services remediated tongue-thrust behaviors. |
Reliability and validity in a nutshell. | AIMS
To explore and explain the different concepts of reliability and validity as they are related to measurement instruments in social science and health care.
BACKGROUND
There are different concepts contained in the terms reliability and validity and these are often explained poorly and there is often confusion between them.
DESIGN
To develop clarity about reliability and validity, a conceptual framework was built based on the existing literature.
RESULTS
The concepts of reliability, validity and utility are explored and explained.
CONCLUSIONS
Reliability contains the concepts of internal consistency, stability, and equivalence. Validity contains the concepts of content, face, criterion, concurrent, predictive, construct, convergent (and divergent), factorial and discriminant validity. In addition, for clinical practice and research, it is essential to establish the utility of a measurement instrument.
RELEVANCE TO CLINICAL PRACTICE
To use measurement instruments appropriately in clinical practice, the extent to which they are reliable, valid and usable must be established. |
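Internal consistency, one of the reliability concepts listed above, is commonly quantified with Cronbach's alpha: the number of items scaled by one minus the ratio of summed item variances to the variance of the total score. A minimal sketch with invented item scores:

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: list of k per-item score lists, one score per respondent each."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Two invented items scored by four respondents.
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 4, 6, 8]])
```

When items covary strongly the total-score variance dominates the summed item variances and alpha approaches 1; uncorrelated items drive it toward 0.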
A Kinect-Based Wearable Face Recognition System to Aid Visually Impaired Users | In this paper, we introduce a real-time face recognition (and announcement) system targeted at aiding the blind and low-vision people. The system uses a Microsoft Kinect sensor as a wearable device, performs face detection, and uses temporal coherence along with a simple biometric procedure to generate a sound associated with the identified person, virtualized at his/her estimated 3-D location. Our approach uses a variation of the K-nearest neighbors algorithm over histogram of oriented gradient descriptors dimensionally reduced by principal component analysis. The results show that our approach, on average, outperforms traditional face recognition methods while requiring much less computational resources (memory, processing power, and battery life) when compared with existing techniques in the literature, deeming it suitable for the wearable hardware constraints. We also show the performance of the system in the dark, using depth-only information acquired with Kinect's infrared camera. The validation uses a new dataset available for download, with 600 videos of 30 people, containing variation of illumination, background, and movement patterns. Experiments with existing datasets in the literature are also considered. Finally, we conducted user experience evaluations on both blindfolded and visually impaired users, showing encouraging results. |
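The recognition pipeline described (HOG descriptors reduced by PCA, classified with a K-nearest-neighbors variant) follows a standard pattern that can be sketched in plain NumPy. This is not the paper's implementation: the Gaussian clusters below merely stand in for per-person HOG descriptors, and the dimensions are arbitrary.

```python
import numpy as np

def pca_fit(X, n_components):
    """Principal axes from the SVD of the centered descriptor matrix."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_transform(X, mean, axes):
    return (X - mean) @ axes.T

def knn_predict(train_Z, train_y, z, k=3):
    """Majority vote among the k nearest projected descriptors."""
    order = np.argsort(np.linalg.norm(train_Z - z, axis=1))
    labels, counts = np.unique(train_y[order[:k]], return_counts=True)
    return labels[np.argmax(counts)]

rng = np.random.default_rng(0)
# Two 'people', each a cluster of 10-D stand-ins for HOG descriptors.
X = np.vstack([rng.normal(0.0, 0.5, (20, 10)), rng.normal(5.0, 0.5, (20, 10))])
y = np.array([0] * 20 + [1] * 20)
mean, axes = pca_fit(X, 2)
Z = pca_transform(X, mean, axes)
query = pca_transform(rng.normal(5.0, 0.5, (1, 10)), mean, axes)[0]
```

Projecting before the distance computation is what keeps the per-frame cost low enough for wearable hardware: the KNN search runs in the reduced space, not on raw descriptors.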
Comparison of inverse kinematics solutions using neural network for 6R robot manipulator with offset | An artificial neural network (ANN) using the backpropagation algorithm is applied to solve inverse kinematics problems of an industrial robot manipulator. A 6R robot manipulator with an offset wrist was chosen because the geometry of this robot does not allow the inverse kinematics problem to be solved analytically; in other words, there is no closed-form solution. In order to define the orientation of the robot end-effector, three different representations are used: the homogeneous transformation matrix, Euler angles and the equivalent angle-axis. These representations were compared in obtaining inverse kinematics solutions for the 6R robot manipulator with offset wrist. Simulation results show that the prediction performance, from the approximation-accuracy point of view, is satisfactory, with low effective errors at a 10-degree data resolution. |
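For contrast with the 6R offset-wrist case, which has no closed-form solution, a two-link planar arm does admit one, and that closed form can be sketched directly. This is only a didactic stand-in, not related to the paper's robot; the link lengths are chosen arbitrarily.

```python
import math

def fk(theta1, theta2, l1=1.0, l2=0.8):
    """Forward kinematics of a 2R planar arm: joint angles -> tip position."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def ik(x, y, l1=1.0, l2=0.8):
    """Closed-form inverse kinematics (elbow-down branch) via the law of cosines."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2
```

An offset wrist breaks the decoupling that makes such a derivation possible for 6R arms, which is why the paper resorts to a trained approximator instead.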
Dots and Incipients: Extended Features for Partial Fingerprint Matching | There are fundamental differences in the way fingerprints are compared by forensic examiners and current automatic systems. For example, while automatic systems focus mainly on the quantitative measures of fingerprint minutiae (ridge ending and bifurcation points), forensic examiners often analyze details of intrinsic ridge characteristics and relational information. This process, known as qualitative friction ridge analysis [1], includes examination of ridge shape, pores, dots, incipient ridges, etc. This explains the challenges that current automatic systems face in processing partial fingerprints, mostly seen in latents. The forensics and automatic fingerprint identification systems (AFIS) communities have been active in standardizing the definition of extended feature set, as well as quantifying the relevance and reliability of these features for automatic matching systems. CDEFFS (committee to define an extended feature set) has proposed a working draft on possible definitions and representations of extended features [2]. However, benefits of utilizing these extended features in automatic systems are not yet known. While fingerprint matching technology is quite mature for matching tenprints [3], matching partial fingerprints, especially latents, still needs a lot of improvement. We propose an algorithm to extract two major level 3 feature types, dots and incipients, based on local phase symmetry and demonstrate their effectiveness in partial print matching. Since dots and incipients can be easily encoded by forensic examiners, we believe the results of this research will have benefits to next generation identification (NGI) systems. |
"The Development of the Turtle Carapace" (1989), by Ann Campbell Burke | Ann Campbell Burke examines the development and evolution of vertebrates, in particular, turtles. Her Harvard University experiments, described in "Development of the Turtle Carapace: Implications for the Evolution of a Novel Bauplan," were published in 1989. Burke used molecular techniques to investigate the developmental mechanisms responsible for the formation of the turtle shell. Burke's work with turtle embryos has provided empirical evidence for the hypothesis that the evolutionary origins of turtle morphology depend on changes in the embryonic and developmental mechanisms underpinning shell production. |
Enhanced Simulated Annealing for Globally Minimizing Functions of Many Continuous Variables | A new global optimization algorithm for functions of many continuous variables is presented, derived from the basic simulated annealing method. Our main contribution lies in dealing with high-dimensionality minimization problems, which are often difficult to solve by all known minimization methods, with or without gradient. In this article we take a special interest in the variable discretization issue. We also develop and implement several complementary stopping criteria. The original Metropolis iterative random search, which takes place in a Euclidean space R^n, is replaced by another similar exploration, performed within a succession of Euclidean spaces R^p, with p << n. This Enhanced Simulated Annealing (ESA) algorithm was validated first on classical highly multimodal functions of 2 to 100 variables. We obtained significant reductions in the number of function evaluations compared to six other global optimization algorithms, selected according to previously published computational results for the same set of test functions. In most cases, ESA was able to closely approximate known global optima. The reduced ESA computational cost helped us to refine further the obtained global results, through the use of some local search. We have used this new minimizing procedure to solve complex circuit design problems, for which the objective function evaluation can be exceedingly costly. |
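The Metropolis core that ESA enhances can be sketched in a few lines. This basic version deliberately omits the paper's key contributions, subspace exploration in R^p and the extra stopping criteria, and its step size, cooling rate, and iteration budget are arbitrary illustrative choices.

```python
import math
import random

def anneal(f, x0, step=0.3, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Basic simulated annealing with a geometric cooling schedule."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest, t = list(x0), fx, t0
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        # Metropolis rule: always accept improvements; accept worse
        # moves with probability exp(-(fc - fx) / t).
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

sphere = lambda v: sum(vi * vi for vi in v)
best, fbest = anneal(sphere, [3.0] * 5)
```

ESA's subspace trick replaces the full-dimensional Gaussian move above with moves restricted to a low-dimensional subspace, which is what makes the method scale to 100 variables.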
Revealing the Traces of Median Filtering Using High-Order Local Ternary Patterns | Recently, detecting the traces introduced by content-preserving image manipulations has received a great deal of attention from forensic analyzers. It is well known that the median filter is a widely used nonlinear denoising operator; the detection of median filtering is therefore of real practical significance in image forensics. In this letter, a novel local texture operator, named the second-order local ternary pattern (LTP), is proposed for median filtering detection. The proposed operator encodes local derivative direction variations using a 3-valued coding function and is capable of effectively capturing the changes in local texture caused by median filtering. In addition, kernel principal component analysis (KPCA) is exploited to reduce the dimensionality of the proposed feature set, making the computational cost manageable. The experimental results show that the proposed scheme performs better than several state-of-the-art approaches investigated. |
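The 3-valued coding at the heart of an LTP can be shown on a single neighborhood. Note the paper's operator is second-order, applied to local derivative directions; the sketch below shows only the basic first-order coding rule, with an assumed threshold and invented intensities.

```python
def ltp_codes(neighbors, center, t=5):
    """Ternary code per neighbor: +1 above center+t, -1 below center-t, else 0."""
    codes = []
    for p in neighbors:
        if p >= center + t:
            codes.append(1)
        elif p <= center - t:
            codes.append(-1)
        else:
            codes.append(0)
    return codes

# Invented 8-neighborhood of a pixel with intensity 100, threshold 5.
codes = ltp_codes([100, 96, 90, 105, 112, 99, 94, 103], 100)
```

The zero band is what distinguishes LTP from the binary LBP: small fluctuations around the center intensity, such as those a median filter leaves behind, map to 0 instead of flipping between codes.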
A stabilized multigrid solver for hyperelastic image registration | Image registration is a central problem in a variety of areas involving imaging techniques and is known to be challenging and ill-posed. Regularization functionals based on hyperelasticity provide a powerful mechanism for limiting the ill-posedness. A key feature of hyperelastic image registration approaches is their ability to model large deformations while guaranteeing their invertibility, which is crucial in many applications. To ensure that numerical solutions satisfy this requirement, we discretize the variational problem using piecewise linear finite elements, and then solve the discrete optimization problem using the Gauss–Newton method. In this work, we focus on computational challenges arising in approximately solving the Hessian system. We show that the Hessian is a discretization of a strongly coupled system of partial differential equations whose coefficients can be severely inhomogeneous. Motivated by a local Fourier analysis, we stabilize the system by thresholding the coefficients. We propose a Galerkin-multigrid scheme with a collective pointwise smoother. We demonstrate the accuracy and effectiveness of the proposed scheme, first on a two-dimensional problem of a moderate size and then on a large-scale real-world application with almost 9 million degrees of freedom. |
Content-based recommendation techniques for requirements engineering | Assuring quality in software development processes is often a complex task. In many cases there are numerous needs which cannot be fulfilled with the limited resources given. Consequently, it is crucial to identify the set of necessary requirements for a software project, which needs to be complete and conflict-free. Additionally, the evolution of single requirements (artifacts) plays an important role because the quality of these artifacts has an impact on the overall quality of the project. To support stakeholders in mastering these tasks, there is increasing interest in AI techniques. In this paper we present two content-based recommendation approaches that support the Requirements Engineering (RE) process. First, we propose a Keyword Recommender to increase requirements reuse. Second, we define a thesaurus-enhanced Dependency Recommender to help stakeholders find complete and conflict-free requirements. Finally, we present studies conducted at the Graz University of Technology to evaluate the applicability of the proposed recommendation technologies. |
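A content-based keyword recommender of the kind described is typically built from TF-IDF vectors over requirement texts plus cosine similarity: keywords are reused from the most similar existing requirement. The sketch below shows only this generic pattern, not the paper's system; the requirement snippets are invented.

```python
import math
from collections import Counter

def tfidf(docs):
    """docs: dict of requirement name -> token list. Returns TF-IDF vectors."""
    n = len(docs)
    df = Counter()
    for toks in docs.values():
        df.update(set(toks))
    return {name: {t: (c / len(toks)) * math.log(n / df[t])
                   for t, c in Counter(toks).items()}
            for name, toks in docs.items()}

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented requirement texts.
docs = {"r1": "user login password authentication".split(),
        "r2": "login authentication token password".split(),
        "r3": "report export csv chart".split()}
vecs = tfidf(docs)
# Reuse keywords from the most similar existing requirement.
nearest = max((n for n in docs if n != "r1"),
              key=lambda n: cosine(vecs["r1"], vecs[n]))
```

The thesaurus enhancement mentioned in the abstract would additionally map synonymous tokens to a common term before vectorization, so lexically different but related requirements still score as similar.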
Object tracking using SIFT features and mean shift | The proposed method combines SIFT features with mean shift for object tracking: SIFT features establish correspondences for the region of interest across frames; mean shift conducts a similarity search based on the color histogram; and an expectation-maximization algorithm optimizes the probability function for a better similarity search. |
A systematic review of randomized controlled trials of mHealth interventions against non-communicable diseases in developing countries | BACKGROUND
The causes of death in developing countries are shifting from communicable diseases towards non-communicable diseases (NCDs). At the same time the number of health care interventions using mobile phones (mHealth interventions) is growing rapidly. We review studies assessing the health-related impacts of mHealth on NCDs in low- and middle-income countries (LAMICs).
METHODS
A systematic literature search of three major databases was performed in order to identify randomized controlled trials (RCTs) of mHealth interventions. Identified studies were reviewed concerning key characteristics of the trial and the intervention; and the relationship between intervention characteristics and outcomes was qualitatively assessed.
RESULTS
The search algorithms retrieved 994 titles. Eight RCTs were included in the review, with a total of 4375 participants. Trials took place mostly in urban areas, tested different interventions (ranging from health promotion through appointment reminders and medication adjustments to clinical decision support systems), and included patients with different diseases (diabetes, asthma, hypertension). All but one study showed rather positive effects of mHealth interventions on the reported outcome measures. Furthermore, our results suggest that particular types of mHealth interventions that were found to have positive effects on patients with communicable diseases and for improving maternal care are likely to be effective for NCDs as well.
CONCLUSIONS
Despite rather positive results of included RCTs, a firm conclusion about the effectiveness of mHealth interventions against NCDs is not yet possible because of the limited number of studies, the heterogeneity of evaluated mHealth interventions and the wide variety of reported outcome measures. More research is needed to better understand the specific effects of different types of mHealth interventions on different types of patients with NCDs in LAMICs. |
WCP-Nets: A Weighted Extension to CP-Nets for Web Service Selection | User preference often plays a key role in personalized applications such as web service selection. CP-nets is a compact and intuitive formalism for representing and reasoning with conditional preferences. However, the original CP-nets formalism does not support fine-grained preferences, which results in the inability to compare certain preference combinations (service patterns). In this paper, we propose a weighted extension to CP-nets, called WCP-nets, that allows users to specify the relative importance (weights) between attribute values and between attributes. Both linear and nonlinear methods are proposed to adjust the attribute weights when conflicts occur between users' explicit preferences and their actual service-selection behavior. Experimental results based on two real datasets show that our method can effectively enhance the expressiveness of user preference and select more accurate services than its counterparts. |
Endurance training effect on individuals with postpoliomyelitis. | OBJECTIVE
To determine the effects of an endurance training program on the exercise capacity and muscle structure and function in individuals with postpolio syndrome.
DESIGN
Preexercise and postexercise testing was performed with muscle strength evaluations using isokinetic testing as well as a hand-held myometer. Muscle fatigue was determined by use of isokinetic testing, and endurance was determined by exercise testing. Enzymatic evaluation was performed with muscle biopsies taken at the same site; preexercise and postexercise muscle cross-sectional area was measured by computed tomography. Disability and psychosocial evaluation was performed by a Functional Status Questionnaire.
SETTING
A university.
SUBJECTS
Seventeen postpolio subjects ranging in age from 39 to 49 years volunteered for a 6-month combined endurance and strength training program. They had a history of acute poliomyelitis at least 25 years earlier and were able to walk with or without aid.
INTERVENTION
Twelve of the subjects (mean age 42 years) completed the program, attending an average of 29 sessions, which were offered for 60 minutes twice a week.
MAIN OUTCOME MEASURES
Strength, endurance, enzymatic activity, and cross-sectional area were measured 3 months before the beginning of training, just before training, and at the completion of the exercise program.
RESULTS
Knee extension was reduced to an average of 60% of control values and did not change with training. Strength measured with a hand-held myometer increased significantly for elbow flexion, wrist extension, and hip abduction. An exercise test on a bicycle ergometer showed a significant reduction (6 beats/min) in heart rate at 70 W and an increase (12 beats/min) in maximal heart rate with training. The training program could be performed without major complications and resulted in an increase in muscle strength in some muscle groups and in work performance with respect to heart rate at submaximal work load. |
Employing a fully convolutional neural network for road marking detection | Since road markings are one of the main landmarks used for traffic guidance, perceiving them may be a crucial task for autonomous vehicles. In visual approaches, road marking detection consists in detecting the pixels of an image that correspond to a road marking. Recently, most approaches have aimed at detecting lane markings only, and few propose methods to detect other types of road markings. Moreover, most of those approaches are based on local gradients, which yield noisy detections in cluttered images. In this paper, we propose an alternative approach based on a deep Fully Convolutional Neural Network (FCNN) with an encoder-decoder architecture for road marking detection and segmentation. The experimental results reveal that the proposed approach can detect any road marking type with a high level of accuracy, resulting in a smooth segmentation. |
Correlation between toe flexor strength and ankle dorsiflexion ROM during the countermovement jump | [Purpose] This study assessed the relationships between peak toe flexor muscle strength, ankle dorsiflexion range of motion, and countermovement jump height. [Subjects and Methods] Eighteen healthy volunteers participated in the study. Each participant completed tests for peak toe flexor muscle strength, ankle dorsiflexion range of motion, and countermovement jump height. [Results] The results showed (1) a moderate correlation between ankle dorsiflexion range of motion and countermovement jump height and (2) a high correlation between peak first toe flexor muscle strength and countermovement jump height. Peak first toe flexor muscle strength and ankle dorsiflexion range of motion are the main contributors to countermovement jump performance. [Conclusion] These findings indicate that the measurement of peak first toe flexor muscle strength and ankle dorsiflexion range of motion may be useful in clinical practice for improving jump performance in athletes training for sports such as volleyball and basketball. |
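Correlations of this kind are typically Pearson product-moment coefficients: covariance of the two measures divided by the product of their standard deviations. A minimal sketch; the strength and jump-height pairs below are invented for illustration, not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented toe-flexor strength (N) vs. jump height (cm) pairs.
strength = [30, 35, 40, 45, 50]
jump = [28, 33, 36, 42, 47]
r = pearson_r(strength, jump)
```

Values near +1 correspond to the "high correlation" reported above, values around 0.4-0.6 to a "moderate" one, though the exact verbal cutoffs vary by convention.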
A Review of Statistical Approaches to Level Set Segmentation: Integrating Color, Texture, Motion and Shape | Since their introduction as a means of front propagation and their first application to edge-based segmentation in the early 1990s, level set methods have become increasingly popular as a general framework for image segmentation. In this paper, we present a survey of a specific class of region-based level set segmentation methods and clarify how they can all be derived from a common statistical framework. Region-based segmentation schemes aim at partitioning the image domain by progressively fitting statistical models to the intensity, color, texture or motion in each of a set of regions. In contrast to edge-based schemes such as the classical Snakes, region-based methods tend to be less sensitive to noise. For typical images, the respective cost functionals tend to have fewer local minima, which makes them particularly well-suited for local optimization methods such as the level set method. We detail a general statistical formulation for level set segmentation. Subsequently, we clarify how the integration of various low-level criteria leads to a set of cost functionals. We point out relations between the different segmentation schemes. In experimental results, we demonstrate how the level set function is driven to partition the image plane into domains of coherent color, texture, dynamic texture or motion. Moreover, the Bayesian formulation allows prior shape knowledge to be introduced into the level set method. We briefly review a number of advances in this domain. |
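The simplest instance of the region-based idea is the two-phase, piecewise-constant model (the classic Chan-Vese functional): the level set evolves by fitting one mean intensity inside the contour and one outside. The sketch below is a rough illustration, with a plain Laplacian standing in for the true curvature term and invented parameters.

```python
import numpy as np

def chan_vese_step(phi, img, mu=0.2, dt=0.5):
    """One explicit update of a two-phase piecewise-constant region model."""
    inside = phi > 0
    c1 = img[inside].mean() if inside.any() else 0.0   # mean inside contour
    c2 = img[~inside].mean() if (~inside).any() else 0.0
    # Smoothness: Laplacian of phi stands in for the curvature term.
    lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
           + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi)
    # Pixels closer to c1 than c2 are pushed inside, and vice versa.
    force = -(img - c1) ** 2 + (img - c2) ** 2
    return phi + dt * (force + mu * lap)

# Bright square on a dark background, initialized with a centered disk.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
yy, xx = np.mgrid[:32, :32]
phi = 10.0 - np.hypot(yy - 16.0, xx - 16.0)
for _ in range(50):
    phi = chan_vese_step(phi, img)
seg = phi > 0
```

The data term needs no image gradients at all, which is exactly the noise robustness the survey attributes to region-based schemes.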
Effects of diet treatment on some biochemical and physiological parameters in patients with type 2 diabetes mellitus. | The aim of the present study was to investigate the effect of diet treatment on serum glucose, triglyceride (TG), total cholesterol (TC), high density lipoprotein-(HDL) cholesterol, low density lipoprotein-(LDL) cholesterol and very low density lipoprotein-(VLDL) cholesterol levels, systolic and diastolic blood pressure and electrocardiograms (ECGs) in patients with type 2 diabetes mellitus (DM). Twenty healthy subjects (mean age 45.9 +/- 3.7 years) and newly diagnosed patients with type 2 diabetes prior to receiving diet treatment (mean age 47.6 +/- 6.2 years) were included in this study. Diabetic patients were given a standard dietary treatment composed of 50% to 55% carbohydrate and 30% fat (1200 kcal for women and 1600 kcal for men) for 2 months. No diet treatment was applied to the control group. For both groups, serum glucose, TG, TC, HDL-cholesterol, LDL-cholesterol and VLDL-cholesterol levels, systolic and diastolic blood pressure and ECGs were measured at the beginning and end of the diet treatment. Although diet treatment decreased the elevated serum glucose in diabetic patients, it still remained higher than that in the controls. Diet treatment also decreased the elevated TG and VLDL-cholesterol in diabetic patients to control values. Although heart rate and systolic blood pressure were higher, diastolic blood pressure was not different in diabetic patients than in controls. Ventricular hypertrophy was also observed in the ECGs of 10% of diabetic patients. Diet treatment normalized all of these findings, except systolic blood pressure. This study showed that diet treatment could not normalize the high systolic blood pressure in type 2 DM. Thus, an effective way of controlling blood pressure should be adopted to improve healing in DM. |
A Game Theoretic Framework for Incentives in P2P Systems | Peer-To-Peer (P2P) networks are self-organizing, distributed systems, with no centralized authority or infrastructure. Because of the voluntary participation, the availability of resources in a P2P system can be highly variable and unpredictable. In this paper, we use ideas from Game Theory to study the interaction of strategic and rational peers, and propose a differential service-based incentive scheme to improve the system’s performance. |
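The flavor of a differential-service incentive can be shown with a toy repeated game: cooperators serve only peers that themselves provided service in the previous round, so free-riders are quickly cut off. The reputation rule, payoffs, and round count below are invented for illustration and are not the paper's actual scheme.

```python
def simulate(strategies, rounds=50, benefit=1.0, cost=0.3):
    """strategies: list of 'cooperator' or 'freerider'. Returns total payoffs."""
    n = len(strategies)
    reputation = [True] * n            # newcomers start in good standing
    payoff = [0.0] * n
    for _ in range(rounds):
        served = [False] * n
        for provider in range(n):
            if strategies[provider] != "cooperator":
                continue
            for requester in range(n):
                # Differential service: only requesters in good standing.
                if requester != provider and reputation[requester]:
                    payoff[provider] -= cost
                    payoff[requester] += benefit
                    served[provider] = True
        reputation = served
    return payoff

payoffs = simulate(["cooperator", "cooperator", "cooperator", "freerider"])
```

In this toy model a free-rider exploits the first round only; once its reputation lapses, cooperators stop serving it, so contributing becomes the rational strategy whenever the benefit of being served exceeds the cost of serving.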
Swallowable smart pills for local drug delivery: present status and future perspectives. | Smart pills were originally developed for diagnosis; however, they are increasingly being applied to therapy - more specifically drug delivery. In addition to smart drug delivery systems, current research is also looking into localization systems for reaching the target areas, novel locomotion mechanisms and positioning systems. Focusing on the major application fields of such devices, this article reviews smart pills developed for local drug delivery. The review begins with the analysis of the medical needs and socio-economic benefits associated with the use of such devices and moves onto the discussion of the main implemented technological solutions with special attention given to locomotion systems, drug delivery systems and power supply. Finally, desired technical features of a fully autonomous robotic capsule for local drug delivery are defined and future research trends are highlighted. |
Study of Λb→Λνν¯ with polarized baryons | We investigate the decay Λb → Λ ν ν̄ with polarized Λb and Λ baryons. With the most general hadronic form factors, we first study the decay branching ratio and then derive the longitudinal, normal and transverse polarizations of Λ in terms of the spin unit vectors of Λb and Λ and the momentum of Λ. The polarization of Λb is also discussed. |
New Constraints on Upper Mantle Flow at Regional Scales. | The regional mantle flow beneath the westernmost Mediterranean basin and its transition to the Atlantic domain is addressed by inspecting the anisotropic properties of the mantle. More than 100 new sites, from the Variscan core of Iberia to the northern rim of the Western African Craton, are now investigated using the data provided by different temporary and permanent broad-band seismic arrays. Our main objective is to provide a larger regional framework for the results recently presented along the Gibraltar Arc in order to check the validity of the different geodynamic interpretations proposed so far. |
Preliminary Design of a Tandem-Wing Tail-Sitter UAV Using Multi-Disciplinary Design Optimisation | The optimisation of a tail-sitter UAV (Unmanned Aerial Vehicle) that uses a stall-tumble manoeuvre to transition from vertical to horizontal flight and a pull-up manoeuvre to regain the vertical is investigated. The tandem wing vehicle is controlled in the hover and vertical flight phases by prop-wash over wing mounted control surfaces. It represents an innovative and potentially simple solution to the dual requirements of VTOL (Vertical Take-off and Landing) and high speed forward flight by obviating the need for complex mechanical systems such as rotor heads or tilt-rotor systems. |
Physiological response to moderate exercise workloads in a pulmonary rehabilitation program in patients with varying degrees of airflow obstruction. | STUDY OBJECTIVES
To investigate whether a 12-week pulmonary rehabilitation program that includes moderately intensive exercise training performed twice weekly can induce a training effect in patients with a wide variation of airflow limitation.
PARTICIPANTS
Sixty patients with COPD (38 men) with a mean +/- SD FEV(1) % predicted of 55.1 +/- 19.8 (range, 0.51 to 2.99). All patients performed identical incremental symptom-limited cycle ergometer testing before and after a 12-week study period.
MEASUREMENTS AND RESULTS
After 12 weeks, the patients demonstrated a significant (p < 0.05) increase in the peak values for work rate (WR; 77 +/- 30 vs 91 +/- 36 W) and oxygen uptake (1.14 +/- 0.45 vs 1.20 +/- 0.52 L/min). Furthermore, at a given WR during incremental symptom-limited cycle ergometer testing, there were significant (p < 0.05) reductions in minute ventilation (42.4 +/- 16.1 vs 37.0 +/- 13.6 L/min), carbon dioxide output (1.13 +/- 0.49 vs 1.03 +/- 0.42 L/min), ventilatory equivalent for oxygen (37.6 +/- 8.1 vs 36.0 +/- 6.3), and heart rate (135 +/- 15 vs 128 +/- 16 beats/min). None of the observed physiologic changes correlated with FEV(1) % predicted.
CONCLUSIONS
A pulmonary rehabilitation program performed twice weekly with moderate exercise workloads can lead to a physiologic training response irrespective of the degree of airflow limitation. |
Measuring the surgical 'learning curve': methods, variables and competency. | OBJECTIVES
To describe how learning curves are measured and what procedural variables are used to establish a 'learning curve' (LC). To assess whether LCs are a valuable measure of competency.
PATIENTS AND METHODS
A review of the surgical literature pertaining to LCs was conducted using the Medline and OVID databases.
RESULTS
Variables should be fully defined and when possible, patient-specific variables should be used. Trainee's prior experience and level of supervision should be quantified; the case mix and complexity should ideally be constant. Logistic regression may be used to control for confounding variables. Ideally, a learning plateau should reach a predefined/expert-derived competency level, which should be fully defined. When the group splitting method is used, smaller cohorts should be used in order to narrow the range of the LC. Simulation technology and competence-based objective assessments may be used in training and assessment in LC studies.
CONCLUSIONS
Measuring the surgical LC has potential benefits for patient safety and surgical education. However, standardisation in the methods and variables used to measure LCs is required. Confounding variables, such as participant's prior experience, case mix, difficulty of procedures and level of supervision, should be controlled. Competency and expert performance should be fully defined. |
Derivative expansion of the renormalization group in O(N) scalar field theory | We apply a derivative expansion to the Legendre effective action flow equations of O(N) symmetric scalar field theory, making no other approximation. We calculate the critical exponents η, ν, and ω at both the leading and second order of the expansion, associated with the three dimensional Wilson-Fisher fixed points, at various values of N. In addition, we show how the derivative expansion reproduces exactly known results, at special values N = ∞,−2,−4, · · ·. |
[My experiences as editor and author in history journals (with a special emphasis in Past & Present)]. | This text records my thoughts on the editorial work done in academic journals in the area of history, with special attention to scholarly production on Latin America. My reflections are based on my participation in the editorial committee of Past & Present, one of the main history journals in the world, my research in the archives of this journal, and my knowledge of the processes of review and editing of journals devoted to Latin American history. |
Supramaximal testing to confirm attainment of VO2max in sedentary men and women. | Supramaximal testing is widely used to verify VO2max attainment, yet its efficacy in sedentary subjects is unknown. The aim of the study was to examine its efficacy in sedentary men and women completing maximal cycle ergometry. Fifteen sedentary subjects (age=22.4+/-3.9 year) completed incremental exercise, and returned at least 24 h later to complete constant load exercise at 105% peak work rate (Wmax). Another group of nine sedentary men and women (age=21.8+/-5 year) completed supramaximal exercise at 115% Wmax 1-1.5 h after incremental exercise. During exercise, gas exchange data and heart rate (HR) were continuously obtained. VO2max was similar (p>0.05) between incremental and supramaximal exercise in subjects in the first (32.32+/-4.81 mL/kg/min vs. 31.80+/-5.35 mL/kg/min) and second subset (40.63+/-3.61 mL/kg/min vs. 41.66+/-5.55 mL/kg/min). Maximal HR was lower (p<0.05) with supramaximal exercise, yet respiratory exchange ratio was higher (p<0.05). Test-retest reliability (r=0.81-0.89, p<0.05) for VO2max was high during repeated bouts of supramaximal testing. Findings support use of this protocol to confirm VO2max attainment in healthy, sedentary men and women completing incremental cycle ergometry. |
Association Between Palliative Care and Patient and Caregiver Outcomes: A Systematic Review and Meta-analysis. | Importance
The use of palliative care programs and the number of trials assessing their effectiveness have increased.
Objective
To determine the association of palliative care with quality of life (QOL), symptom burden, survival, and other outcomes for people with life-limiting illness and for their caregivers.
Data Sources
MEDLINE, EMBASE, CINAHL, and Cochrane CENTRAL to July 2016.
Study Selection
Randomized clinical trials of palliative care interventions in adults with life-limiting illness.
Data Extraction and Synthesis
Two reviewers independently extracted data. Narrative synthesis was conducted for all trials. Quality of life, symptom burden, and survival were analyzed using random-effects meta-analysis, with estimates of QOL translated to units of the Functional Assessment of Chronic Illness Therapy-palliative care scale (FACIT-Pal) instrument (range, 0-184 [worst-best]; minimal clinically important difference [MCID], 9 points); and symptom burden translated to the Edmonton Symptom Assessment Scale (ESAS) (range, 0-90 [best-worst]; MCID, 5.7 points).
Main Outcomes and Measures
Quality of life, symptom burden, survival, mood, advance care planning, site of death, health care satisfaction, resource utilization, and health care expenditures.
Results
Forty-three RCTs provided data on 12 731 patients (mean age, 67 years) and 2479 caregivers. Thirty-five trials used usual care as the control, and 14 took place in the ambulatory setting. In the meta-analysis, palliative care was associated with statistically and clinically significant improvements in patient QOL at the 1- to 3-month follow-up (standardized mean difference, 0.46; 95% CI, 0.08 to 0.83; FACIT-Pal mean difference, 11.36) and symptom burden at the 1- to 3-month follow-up (standardized mean difference, -0.66; 95% CI, -1.25 to -0.07; ESAS mean difference, -10.30). When analyses were limited to trials at low risk of bias (n = 5), the association between palliative care and QOL was attenuated but remained statistically significant (standardized mean difference, 0.20; 95% CI, 0.06 to 0.34; FACIT-Pal mean difference, 4.94), whereas the association with symptom burden was not statistically significant (standardized mean difference, -0.21; 95% CI, -0.42 to 0.00; ESAS mean difference, -3.28). There was no association between palliative care and survival (hazard ratio, 0.90; 95% CI, 0.69 to 1.17). Palliative care was associated consistently with improvements in advance care planning, patient and caregiver satisfaction, and lower health care utilization. Evidence of associations with other outcomes was mixed.
Conclusions and Relevance
In this meta-analysis, palliative care interventions were associated with improvements in patient QOL and symptom burden. Findings for caregiver outcomes were inconsistent. However, many associations were no longer significant when limited to trials at low risk of bias, and there was no significant association between palliative care and survival. |
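The unit translation reported in the entry above (an SMD re-expressed on the FACIT-Pal scale) is just a multiplication by an assumed instrument SD. A minimal sketch of the arithmetic; the 24.7-point SD is a hypothetical value chosen so the numbers line up with those reported, not a figure taken from the review:

```python
# Re-express a standardized mean difference (SMD) in instrument units:
# mean difference = SMD * assumed SD of the instrument's scores.
smd = 0.46                    # pooled SMD for quality of life (from the entry)
sd_facit_pal = 24.7           # hypothetical FACIT-Pal score SD (illustrative)
md = smd * sd_facit_pal       # mean difference in FACIT-Pal points
print(round(md, 2))           # 11.36, matching the reported FACIT-Pal difference

mcid = 9.0                    # minimal clinically important difference (from the entry)
print(md > mcid)              # True: the pooled effect exceeds the MCID
```

The same arithmetic with the ESAS SD would reproduce the symptom-burden translation.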
Optimal Prediction of Burgers's Equation | We examine an application of the optimal prediction framework to the truncated Fourier–Galerkin approximation of Burgers's equation. Under particular conditions on the density of the modes and the length of the memory kernel, optimal prediction introduces an additional term to the Fourier–Galerkin approximation which represents the influence of an arbitrary number of small wavelength unresolved modes on the long wavelength resolved modes. The modified system, called the t-model by previous authors, takes the form of a time-dependent cubic term added to the original quadratic system. Numerical experiments show that this additional term restores qualitative features of the solution in the case where the number of modes is insufficient to resolve the resulting shocks (i.e., zero or very small viscosity) and for which the original Fourier–Galerkin approximation is very poor. In particular, numerical examples are shown in which the kinetic energy decays in the same manner as in the exact solution, i.e., as t−2 when tshock ≪ t ≪ Re, even when a very small number of resolved modes is used. Correlation-like quantities related to the memory kernel are then computed, and these exhibit a t−3 tail for the same time period. |
Academia.edu: Social network or Academic Network? | Academic social network sites Academia.edu and ResearchGate and reference sharing sites Mendeley, Bibsonomy, Zotero, and CiteULike give scholars the ability to publicise their research outputs and connect to each other. With millions of users, these are a significant addition to the scholarly communication and academic information seeking eco-structure. There is thus a need to understand the role that they play and the changes, if any, that they can make to the dynamics of academic careers. This article investigates attributes of philosophy scholars on Academia.edu, introducing a median-based time-normalising method to adjust for time delays in joining the site. In comparison to students, faculty tend to attract more profile views but female philosophers did not attract more profile views than did males, suggesting that academic capital drives philosophers' use of the site more than friendship and networking. Secondary analyses of law, history and computer science confirmed the faculty advantage (in terms of higher profile views) except for females in law and females in computer science. It also found a female advantage for both faculty and students in law and computer science as well as for history students. Hence, Academia.edu overall seems to reflect a hybrid of scholarly norms (the faculty advantage) and a female advantage that is suggestive of general social networking norms. Finally, traditional bibliometric measures did not correlate with any Academia.edu metrics for philosophers, perhaps because more senior academics use the site less extensively or because of the range of informal scholarly activities that cannot be measured by bibliometric methods. |
Robust longitudinal spin-Seebeck effect in Bi-YIG thin films | In recent years, the coupling of magnetic insulators (bismuth-doped yttrium iron garnet, Bi-YIG) with platinum has garnered significant interest in spintronics research due to their applicability as spin-current-driven thermoelectric coatings. These coatings bridge the gap between spintronics technologies and thermoelectric materials, providing a novel means of transforming waste heat into electricity. However, there remain questions regarding the origins of the spin-Seebeck effect (SSE) as well as claims that observed effects are a manifestation of magnetic proximity effects, which would induce magnetic behavior in platinum. Here we provide evidence that the voltages observed in the Bi-YIG/Pt films are purely SSE voltages. We reaffirm claims that magnon transport theory provides an ample basis for explaining SSE behavior. Finally, we illustrate the advantages of pulsed-laser deposition, as these Bi-YIG films possess large SSE voltages (even in the absence of an external magnetic field), as much as twice those of films fabricated via solution-based methods. |
Internal implementation | We introduce a constrained mechanism design setting called internal implementation, in which the mechanism designer is explicitly modeled as a player in the game of interest. This distinguished player has the opportunity to modify the game before play. Specifically, the player is able to make reliable binding commitments of outcome-specific monetary transfers to the other players in the game. We characterize the power of internal implementation for certain interesting classes of games, and show that the impact of internal implementation on the players' utility and on social welfare is often counterintuitive; for example, the social welfare can be arbitrarily worse after an internal implementation. |
Chapter 2 Soil Liquefaction in Earthquakes 2.1. Definition of Soil Liquefaction | Soil liquefaction and related ground failures are commonly associated with large earthquakes. In common usage, liquefaction refers to the loss of strength in saturated, cohesionless soils due to the build-up of pore water pressures during dynamic loading. A more precise definition of soil liquefaction is given by Sladen et al. (1985): "Liquefaction is a phenomenon wherein a mass of soil loses a large percentage of its shear resistance, when subjected to monotonic, cyclic, or shock loading, and flows in a manner resembling a liquid until the shear stresses acting on the mass are as low as the reduced shear resistance." In a more general manner, soil liquefaction has been defined as the transformation "from a solid state to a liquefied state as a consequence of increased pore pressure and reduced effective stress" ("Definition of terms..." 1978). Some ground failures attributed to soil liquefaction are more correctly ascribed to "cyclic mobility" which results in limited soil deformations without liquid-like flow. The proper, concise definition for soil liquefaction has been the subject of a continuing debate within the geotechnical profession. While investigators have argued that liquefaction and cyclic mobility should be carefully distinguished (Castro and Poulos 1977), "liquefaction" is commonly used to describe all failure mechanisms resulting from the build-up of pore pressures during undrained cyclic shear of saturated soils. |
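The "increased pore pressure and reduced effective stress" definition above can be stated compactly via Terzaghi's effective stress principle, a standard geotechnical relation not written out as a formula in the excerpt:

```latex
\sigma' = \sigma - u, \qquad \tau_f = \sigma' \tan\varphi'
```

Here $\sigma$ is total stress, $u$ pore water pressure, $\sigma'$ effective stress, and $\tau_f$ shear resistance with friction angle $\varphi'$. Liquefaction corresponds to $u \to \sigma$, so $\sigma' \to 0$ and the shear resistance essentially vanishes.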
Validity and responsiveness of the Michigan Hand Questionnaire in patients with rheumatoid arthritis: a multicenter, international study. | OBJECTIVE
Millions of patients experience the disabling hand manifestations of rheumatoid arthritis (RA), yet few hand-specific instruments are validated in this population. Our objective was to assess the reliability, validity, and responsiveness of the Michigan Hand Questionnaire (MHQ) in patients with RA.
METHODS
At enrollment and at 6 months, 128 RA patients with severe subluxation of the metacarpophalangeal joints completed the MHQ, a 37-item questionnaire with 6 domains: function, activities of daily living (ADL), pain, work, aesthetics, and satisfaction. Reliability was measured using Spearman's correlation coefficients between time periods. Internal consistency was measured using Cronbach's alpha. Construct validity was measured by correlating MHQ responses with the Arthritis Impact Measurement Scales 2 (AIMS2). Responsiveness was measured by calculating standardized response means (SRMs) between time periods.
RESULTS
The MHQ demonstrated good test-retest reliability (r = 0.66, P < 0.001). Cronbach's alpha scores were high for ADL (α = 0.90), function (α = 0.87), aesthetics (α = 0.79), and satisfaction (α = 0.89), indicating high internal consistency. The MHQ correlated well with AIMS2 responses. Function (r = -0.63), ADL (r = -0.77), work (r = -0.64), pain (r = 0.59), and summary score (r = -0.74) were correlated with the physical domain. Affect was correlated with ADL (r = -0.47), work (r = -0.47), pain (r = 0.48), and summary score (r = -0.53). Responsiveness was excellent among arthroplasty patients in function (SRM 1.42), ADL (SRM 0.89), aesthetics (SRM 1.23), satisfaction (SRM 1.76), and summary score (SRM 1.61).
CONCLUSION
The MHQ is easily administered, reliable, and valid to measure rheumatoid hand function, and can be used to measure outcomes in rheumatic hand disease. |
Atypical lymphocytic lobular panniculitis: an overlap condition with features of subcutaneous panniculitis-like T-cell lymphoma and lupus profundus. | To cite: He A, Kwatra SG, Kazi N, et al. BMJ Case Rep. doi:10.1136/bcr-2016215335 DESCRIPTION A woman aged 45 years presented for evaluation of skin lesions. She reported an 8–9-year history of occasionally tender, waxing-and-waning skin nodules refractory to dapsone, prednisone and methotrexate. Examination revealed multiple indurated subcutaneous nodules distributed on the upper extremities, with scattered patches of lipoatrophy in areas of nodule regression (figure 1). Her medical history was unremarkable; CBC and CMP were within normal limits, with no history of radiotherapy or evidence of internal organ involvement. She had a positive ANA titre (1:160, speckled), but negative anti-dsDNA, anti-Smith, anti-Ro and anti-La antibodies. Differential diagnosis included erythema nodosum (EN), erythema induratum of Bazin (EIB), lupus profundus (LP) and cutaneous lymphoma. Initial wedge biopsy in 2008 disclosed a predominantly lobular panniculitic process with some septal involvement (figure 2A). Broad zones of necrosis were present (figure 2B). The infiltrate consisted of a pleomorphic population of lymphocytes with occasional larger atypical lymphocytes (figure 2C). There were foci of adipocyte rimming by the atypical lymphocytes (figure 2C). Immunophenotyping revealed predominance of CD3+ T cells with some CD20+ B-cell aggregates. The atypical cells stained CD4 and CD8 in approximately equal ratios. TIA-1 was positive in many of the atypical cells but not prominently enough to render a diagnosis of cytotoxic T-cell lymphoma. T-cell receptor PCR studies showed polyclonality. Subsequent biopsies performed annually after treatment with prednisone in 2008 and 2010, dapsone in 2009 and methotrexate in 2012 showed very similar pathological and molecular features. 
Adipocyte rimming and TCR polyclonality persisted. EN is characterised by subcutaneous nodules on the lower extremities in association with elevated erythrocyte sedimentation rate (ESR) and C reactive protein (CRP), influenza-like prodrome preceding nodule formation and self-limiting course. Histologically, EN shows a mostly septal panniculitis with radial granulomas. EN was ruled out on the basis of normal ESR (6) and CRP (<0.1), chronic relapsing course and predominantly lobular panniculitis process histologically. EIB typically presents with violaceous nodules located on the posterior lower extremities, with arms rarely affected, of patients with a history of tuberculosis (TB). Histologically, EIB shows granulomatous inflammation with focal necrosis, vasculitis and septal fibrosis. Our patient had no evidence or history of TB infection and presented with nodules of a different clinical morphology. Ultimately, this constellation of histological and immunophenotypic findings showed an atypical panniculitic T-lymphocytic infiltrate. Although the lesion showed a lobular panniculitis with features that could be seen in subcutaneous panniculitis-like T-cell lymphoma (SPTCL), the presence of plasma cells, absence of CD8 and TIA restriction and T-cell polyclonality did not definitively support that |
Product concepts for land mobile satellite communication terminals in Ku-/Ka-band | The details of two product concepts and the hardware realization aspects for low profile Ku-/Ka-band land mobile terminals for high data rate satellite communications are described. Beside the competitive low cost approach, the non-obstructive design and lower weight are key issues for many applications to achieve a broad customer acceptance. The introduction of a proprietary planar antenna technology opens the possibility to meet these requirements. |
A vision of industry 4.0 from an artificial intelligence point of view | During the first years of the so-called fourth industrial revolution, attempts to define the main ideas and tools behind this new era of manufacturing have always ended up referring to the concept of smart machines able to communicate with each other and with the environment. Indeed, cyber-physical systems, connected by the internet of things, take all the attention when referring to the new industry 4.0. Nevertheless, the new industrial environment will benefit from several tools and applications that complement the formation of a smart, embedded system able to perform autonomous tasks. Most of these revolutionary concepts rest on the same background theory as artificial intelligence, where the analysis and filtering of huge amounts of incoming information from different types of sensors assist in the interpretation and suggestion of the most recommended course of action. For that reason, artificial intelligence fits perfectly with the challenges that arise in the consolidation of the fourth industrial revolution. |
Sentiment Classification in Chinese Microblogs: Lexicon-based and Learning-based Approaches | Sentiment classification in Chinese microblogs is more challenging than that of Twitter for numerous reasons. In this paper, two kinds of approaches are proposed to classify opinionated Chinese microblog posts: 1) lexicon-based approaches combining the Simple Sentiment Word-Count Method with 3 Chinese sentiment lexicons, 2) machine learning models with multiple features. According to our experiment, lexicon-based approaches can yield relatively fine results and machine learning classifiers outperform both the majority baseline and lexicon-based approaches. Among all the machine learning-based approaches, Random Forests works best and the results are satisfactory. |
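A sentiment word-count method of the kind named in the entry above can be sketched in a few lines: count matches against positive and negative lexicons and take the majority. The toy English lexicons below are illustrative placeholders, not any of the three Chinese lexicons the paper combines:

```python
# Minimal lexicon-based "sentiment word-count" classifier (illustrative).
POSITIVE = {"good", "great", "happy", "excellent"}
NEGATIVE = {"bad", "terrible", "sad", "awful"}

def classify(tokens):
    """Label a tokenized post by comparing positive vs negative word counts."""
    pos = sum(1 for t in tokens if t in POSITIVE)
    neg = sum(1 for t in tokens if t in NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify("the movie was great and the cast was excellent".split()))  # positive
```

For Chinese text, the `split()` tokenizer would be replaced by a word segmenter, since Chinese has no whitespace word boundaries.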
Enhancing Security Event Management Systems with Unsupervised Anomaly Detection | Security Information and Event Management (SIEM) systems are today a key component of complex enterprise networks. They usually aggregate and correlate events from different machines and perform a rule-based analysis to detect threats. In this paper we present an enhancement of such systems which makes use of unsupervised anomaly detection algorithms without the need for any prior training of the system. For data acquisition, events are exported from an existing SIEM appliance, parsed, unified and preprocessed to fit the requirements of unsupervised anomaly detection algorithms. Six different algorithms are evaluated qualitatively and finally a global k-NN approach was selected for a practical deployment. The new system was able to detect misconfigurations and gave the security operation center team more insight about processes in the |
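A global k-NN anomaly score of the kind selected in the entry above can be sketched simply: each event's score is its distance to its k-th nearest neighbour, so isolated events score high. This assumes events have already been parsed and unified into numeric feature vectors; the function name and toy data are illustrative:

```python
# Global k-NN anomaly scoring over preprocessed event feature vectors.
import numpy as np

def knn_scores(X, k=3):
    """Score each row of X by its distance to its k-th nearest neighbour."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    d_sorted = np.sort(d, axis=1)   # column 0 is the zero self-distance
    return d_sorted[:, k]

# Four clustered "normal" events and one outlier.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1], [5.0, 5.0]])
scores = knn_scores(X, k=2)
print(scores.argmax())  # 4, the outlier
```

In practice the highest-scoring events would be surfaced to the security operation center for review; no labelled training data is required, matching the unsupervised setting.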
A Reliability-Aware Approach for Resource Efficient Virtual Network Function Deployment | Network function virtualization (NFV) is a promising technique aimed at reducing capital expenditures (CAPEX) and operating expenditures (OPEX), and improving the flexibility and scalability of an entire network. In contrast to traditional dispatching, NFV can separate network functions from proprietary infrastructure and gather these functions into a resource pool that can efficiently modify and adjust service function chains (SFCs). However, this emerging technique has some challenges. A major problem is reliability, which involves ensuring the availability of deployed SFCs, namely, the probability of successfully chaining a series of virtual network functions while considering both the feasibility and the specific requirements of clients, because the substrate network remains vulnerable to earthquakes, floods, and other natural disasters. Based on the premise of users' demands for SFC requirements, we present an ensure-reliability cost-saving algorithm to reduce the CAPEX and OPEX of telecommunication service providers while ensuring the reliability of the SFC deployments. The results of extensive experiments indicate that the proposed algorithms perform efficiently in terms of the blocking ratio, resource consumption, time consumption, and the first block. |
An overview of Fog computing and its security issues | Fog computing is a paradigm that extends Cloud computing and services to the edge of the network. Similar to Cloud, Fog provides data, compute, storage and application services to end users. In this article, we elaborate on the motivation and advantages of Fog computing and analyse its applications in a series of real scenarios, such as Smart Grid, smart traffic lights in vehicular networks and software defined networks. We discuss the state of the art of Fog computing and similar work under the same umbrella. Distinguished from other reviews of Fog computing, this paper further discloses the security and privacy issues of the current Fog computing paradigm. As an example, we study a typical attack, the man-in-the-middle attack, for the discussion of system security in Fog computing. We investigate the stealthy features of this attack by examining its CPU and memory consumption on a Fog device. In addition, we discuss the authentication and authorization techniques that can be used in Fog computing. An example of authentication techniques is introduced to address the security scenario where the connection between Fog and Cloud is fragile. Copyright © 2015 John Wiley & Sons, Ltd. |
The origin of multiple superconducting gaps in MgB2 | Magnesium diboride, MgB2, has the highest transition temperature (Tc = 39 K) of the known metallic superconductors. Whether the anomalously high Tc can be described within the conventional BCS (Bardeen–Cooper–Schrieffer) framework has been debated. The key to understanding superconductivity lies with the ‘superconducting energy gap’ associated with the formation of the superconducting pairs. Recently, the existence of two kinds of superconducting gaps in MgB2 has been suggested by several experiments; this is in contrast to both conventional and high-Tc superconductors. A clear demonstration of two gaps has not yet been made because the previous experiments lacked the ability to resolve the momentum of the superconducting electrons. Here we report direct experimental evidence for the two-band superconductivity in MgB2, by separately observing the superconducting gaps of the σ and π bands (as well as a surface band). The gaps have distinctly different sizes, which unambiguously establishes MgB2 as a two-gap superconductor. |
Block-Sparse Recurrent Neural Networks | Recurrent Neural Networks (RNNs) are used in state-of-the-art models in domains such as speech recognition, machine translation, and language modelling. Sparsity is a technique to reduce compute and memory requirements of deep learning models. Sparse RNNs are easier to deploy on devices and high-end server processors. Even though sparse operations need less compute and memory relative to their dense counterparts, the speed-up observed by using sparse operations is less than expected on different hardware platforms. In order to address this issue, we investigate two different approaches to induce block sparsity in RNNs: pruning blocks of weights in a layer and using group lasso regularization to create blocks of weights with zeros. Using these techniques, we demonstrate that we can create block-sparse RNNs with sparsity ranging from 80% to 90% with small loss in accuracy. This allows us to reduce the model size by roughly 10×. Additionally, we can prune a larger dense network to recover this loss in accuracy while maintaining high block sparsity and reducing the overall parameter count. Our technique works with a variety of block sizes up to 32×32. Block-sparse RNNs eliminate overheads related to data storage and irregular memory accesses while increasing hardware efficiency compared to unstructured sparsity. |
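The block-pruning approach named in the entry above can be illustrated on a single weight matrix: group weights into b×b blocks, rank blocks by their maximum magnitude, and zero out enough of the weakest blocks to reach a target sparsity. This is a hedged, one-shot sketch; the paper prunes during training with a threshold schedule, which is omitted here:

```python
# One-shot magnitude-based block pruning of a weight matrix (illustrative).
import numpy as np

def block_prune(W, b, target_sparsity):
    """Zero out the weakest b x b blocks of W until target_sparsity is reached."""
    n, m = W.shape                                   # assumes n, m divisible by b
    blocks = W.reshape(n // b, b, m // b, b)
    mags = np.abs(blocks).max(axis=(1, 3))           # one magnitude per block
    k = int(target_sparsity * mags.size)             # number of blocks to drop
    thresh = np.sort(mags, axis=None)[k - 1] if k else -np.inf
    mask = (mags > thresh)[:, None, :, None]         # keep blocks above threshold
    return (blocks * mask).reshape(n, m)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
Wp = block_prune(W, b=4, target_sparsity=0.5)        # half the blocks zeroed
```

Zeroing whole blocks rather than individual weights is what lets the resulting matrices map onto efficient block-sparse kernels instead of irregular memory accesses.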
A bioinspired multi-modal flying and walking robot. | With the aim to extend the versatility and adaptability of robots in complex environments, a novel multi-modal flying and walking robot is presented. The robot consists of a flying wing with adaptive morphology that can perform both long distance flight and walking in cluttered environments for local exploration. The robot's design is inspired by the common vampire bat Desmodus rotundus, which can perform aerial and terrestrial locomotion with limited trade-offs. The wings' adaptive morphology allows the robot to modify the shape of its body in order to increase its efficiency during terrestrial locomotion. Furthermore, aerial and terrestrial capabilities are powered by a single locomotor apparatus, thereby reducing the total complexity and weight of this multi-modal robot. |
Task-Driven Dictionary Learning | Modeling data with linear combinations of a few elements from a learned dictionary has been the focus of much recent research in machine learning, neuroscience, and signal processing. For signals such as natural images that admit such sparse representations, it is now well established that these models are well suited to restoration tasks. In this context, learning the dictionary amounts to solving a large-scale matrix factorization problem, which can be done efficiently with classical optimization tools. The same approach has also been used for learning features from data for other purposes, e.g., image classification, but tuning the dictionary in a supervised way for these tasks has proven to be more difficult. In this paper, we present a general formulation for supervised dictionary learning adapted to a wide variety of tasks, and present an efficient algorithm for solving the corresponding optimization problem. Experiments on handwritten digit classification, digital art identification, nonlinear inverse image problems, and compressed sensing demonstrate that our approach is effective in large-scale settings, and is well suited to supervised and semi-supervised classification, as well as regression tasks for data that admit sparse representations. |
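The inner problem of dictionary learning as framed in the entry above, computing a sparse code for a signal given a dictionary, is a lasso problem; a standard solver is iterative soft-thresholding (ISTA). The sketch below uses a fixed random dictionary as a stand-in and omits the paper's supervised, task-driven dictionary update entirely:

```python
# Sparse coding of one signal via ISTA (the inner step of dictionary learning).
import numpy as np

def ista(D, x, lam=0.1, n_iter=200):
    """Minimize 0.5*||x - D a||^2 + lam*||a||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ a - x)              # gradient of the smooth part
        z = a - g / L                      # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(1)
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
a_true = np.zeros(50); a_true[[3, 17]] = [1.0, -0.5]
x = D @ a_true                             # a signal with a 2-sparse code
a = ista(D, x)
```

In full dictionary learning, D itself is then updated, alternating with this coding step; the task-driven variant instead differentiates through the sparse code with respect to a supervised loss.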
Spinal epidural abscess in two calves. | OBJECTIVE
To report clinical signs, diagnostic and surgical or necropsy findings, and outcome in 2 calves with spinal epidural abscess (SEA).
STUDY DESIGN
Clinical report.
ANIMALS
Calves (n=2).
METHODS
Calves had neurologic examination, analysis and antimicrobial culture of cerebrospinal fluid (CSF), vertebral column radiographs, myelography, and in 1 calf, magnetic resonance imaging (MRI). A definitive diagnosis of SEA was confirmed by necropsy in 1 calf and during surgery and histologic examination of vertebral canal tissue in 1 calf.
RESULTS
Clinical signs were difficulty in rising, ataxia, fever, apparent spinal pain, hypoesthesia, and paresis/plegia which appeared 15 days before admission. Calf 1 had pelvic limb weakness and difficulty standing and calf 2 had severe ataxia involving both thoracic and pelvic limbs. Extradural spinal cord compression was identified by myelography. SEA suspected in calf 1 with discospondylitis was confirmed at necropsy whereas calf 2 had MRI identification of the lesion and was successfully decompressed by laminectomy and SEA excision. Both calves had peripheral neutrophilia and calf 2 had neutrophilic pleocytosis in CSF. Bacteria were not isolated from CSF, from the surgical site or during necropsy. Calf 2 improved neurologically and had a good long-term outcome.
CONCLUSION
Good outcome in a calf with SEA was obtained after adequate surgical decompression and antibiotic administration.
CLINICAL RELEVANCE
SEA should be included in the list of possible causes of fever, apparent spinal pain, and signs of myelopathy in calves. |