query — string, lengths 8 to 1.13k
pos — string, lengths 13 to 1.09k
neg — string, lengths 11 to 1.07k
query_lang — string, 32 classes
__index_level_0__ — int64, 41 to 1.05M
Model of Location for Highway Maintenance Site and Its Application
According to the requirements of economy and timeliness for setting up highway maintenance sites, a multi-objective location model for highway maintenance sites was established to minimize total cost, considering both the fixed costs and the transport costs of the maintenance sites. A corresponding algorithm was then given. Finally, the best number and locations of maintenance sites were obtained through a case study of a highway network. The results show that the proposed model and algorithm are both reasonable and offer referential value for selecting the locations of highway maintenance sites.
The paper describes the process of building a beautiful countryside and assesses the rural human settlement environment of Jubao Bay after construction, using the method of Post-Occupancy Evaluation. A survey of users was conducted in order to appraise the rural human settlement environment of Jubao Bay in Jiaxing City. Finally, the results of the rural planning practice of Jubao Bay were verified from the users' perspective.
eng_Latn
2,851
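The location model in the first abstract above (fixed site costs plus transport costs, minimized over candidate sites) can be illustrated with a small brute-force sketch. All data and function names here are hypothetical; the abstract does not specify the paper's actual multi-objective formulation or algorithm.

```python
from itertools import combinations

def total_cost(open_sites, fixed_cost, transport):
    """Fixed cost of the open sites plus, for every demand point,
    the cheapest transport cost to any open site."""
    fixed = sum(fixed_cost[s] for s in open_sites)
    move = sum(min(transport[d][s] for s in open_sites)
               for d in range(len(transport)))
    return fixed + move

def best_location(n_sites, fixed_cost, transport):
    """Brute-force search over all non-empty subsets of candidate sites."""
    best = None
    for k in range(1, n_sites + 1):
        for subset in combinations(range(n_sites), k):
            c = total_cost(subset, fixed_cost, transport)
            if best is None or c < best[0]:
                best = (c, subset)
    return best

# 3 candidate maintenance sites, 4 demand points (road sections)
fixed_cost = [10, 12, 8]
transport = [  # transport[d][s]: cost of serving demand d from site s
    [4, 9, 7],
    [6, 3, 8],
    [9, 4, 2],
    [5, 8, 3],
]
print(best_location(3, fixed_cost, transport))
```

Enumeration is only feasible for a handful of candidate sites; the paper presumably uses a dedicated algorithm for realistic highway networks.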
Policies and Procedures for Successful Implementation of Alternative Technical Concepts
Transportation agencies increasingly allow design–build teams and construction contractors to incorporate alternative technical concepts (ATCs) into proposals for highway projects. The ATC approach allows proposers to suggest modifications to a contract requirement that would result in the project being equal to or better than the design in the solicitation. This paper investigates current procurement policies and presubmittal procedures followed by state transportation agencies in the implementation of ATCs in transportation projects. The paper reports survey results of state departments of transportation (DOTs) that generated responses from 42 state DOTs. These results were compared with a content analysis of solicitation documents of 65 ATC projects from 24 DOTs. The results showed that although ATC usage was most common in design–build projects, ATCs have been successfully implemented in nearly all project delivery methods. The study showed that incorporation of ATCs into the procurement process allow...
Taking the product/service attribute system as a foundation, the article applies an expanded two-factor theory and means-end theory to establish a product/service positioning matrix based on customers' cognition and customers' satisfaction, so as to explore a new method of positioning products and services.
eng_Latn
2,856
Regression Test Case Prioritization Technique Using Genetic Algorithm
Regression testing is a maintenance activity that ensures the validity of changed software. Executing the entire test suite takes much time and is very costly. In this paper we present a technique based on Genetic Algorithms (GA) for test case prioritization. A genetic algorithm is a generative algorithm, based on natural evolution, that generates solutions to optimization problems. Here, a new genetic algorithm is used for regression testing that prioritizes test cases using a statement coverage technique; the algorithm computes its fitness function from statement coverage. The results show the efficiency of the algorithm with the help of the Average Percentage of Statement Coverage (APSC) metric. This prioritization technique yields optimal results for prioritizing test cases. The genetic algorithm produces the population and finds the optimal ordering of test cases in regression testing.
Large-scale construction projects appear frequently, drawing increasing attention to performance evaluation. This paper established a performance evaluation index system for large-scale construction projects and carried out a comprehensive evaluation of performance with a model combining fuzzy theory and matter-element theory. The analytic hierarchy process and the membership function were used to determine the weights of the indicator elements and the membership matter-element. Finally, the study used the performance indicator system in the construction phase of a large-scale project as an example to verify the correctness, rationality and feasibility of the fuzzy matter-element approach.
kor_Hang
2,889
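The APSC metric named in the test-prioritization abstract above is, by analogy with the well-known APFD metric, commonly computed from the position at which each statement is first covered: APSC = 1 − (ΣTSᵢ)/(n·m) + 1/(2n) for n tests and m statements. The sketch below assumes that definition (the abstract itself does not spell it out), and assumes every statement is covered by at least one test.

```python
def apsc(order, coverage, n_statements):
    """Average Percentage of Statement Coverage for a test ordering.

    order        -- test ids in execution order
    coverage[t]  -- set of statement ids covered by test t
    Assumes every statement is covered by some test in `order`.
    """
    n = len(order)
    first_positions = []
    for s in range(n_statements):
        # position (1-based) of the first test covering statement s
        for pos, t in enumerate(order, start=1):
            if s in coverage[t]:
                first_positions.append(pos)
                break
    return 1 - sum(first_positions) / (n * n_statements) + 1 / (2 * n)

cov = {0: {0, 1}, 1: {2}, 2: {0, 2, 3}}
# Running the broad test (2) first covers more statements earlier:
print(apsc([2, 0, 1], cov, 4))
print(apsc([1, 0, 2], cov, 4))
```

A GA-based prioritizer would use a value like this as the fitness of a candidate ordering (a chromosome being a permutation of test ids).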
Analysis on Construction of Grain Production in Huzhou City of Zhejiang Province
Construction conditions of the functional region for grain production were investigated and construction achievements in Huzhou City since 2010 were summarized. Many difficulties were identified in this study: the selection of appropriate regions, fund raising, subject cultivation and follow-up management in the process of construction. In view of these problems, some strategies were proposed, for example: strengthening leadership and cooperation among government branches, integrating resources and increasing government investment, perfecting the supporting facilities and improving service quality for the functional region, and establishing an innovative mechanism to strengthen follow-up management.
Taking the BOT project of the Chongqing Fuling Lidu Bridge as an example, this paper discusses the key links that need attention during BOT investment choice, contract signing and construction management. Some important points for successfully executing a BOT project are summarized. The paper can serve as a reference for investors in related building industries.
eng_Latn
2,892
Energy-Efficient Dimensioning with Traffic Engineering for Municipal Mesh Access Networks
We target the energy efficiency of municipal access networks. A municipal access network typically provides coverage for a city-wide area and is characterized by diverse and dynamic traffic profiles. Most existing approaches to energy savings adapt the network state to traffic variation; however, they fail to capture the potential savings available by exploiting the traffic already present in the network. We study network dimensioning for municipal mesh access networks to save energy by rearranging existing traffic and adjusting link rates. The problem of energy-efficient network dimensioning is formulated using mixed integer linear programming. We also propose an efficient heuristic algorithm that combines traffic engineering with link-rate adaptation. Numerical experimental results show that our scheme saves more energy than link-rate adaptation alone and achieves a good balance between energy efficiency and resource efficiency.
Aimed at the problem that the indexes of road maintenance machinery configuration programs are difficult to quantify, this paper applies the fuzzy comprehensive evaluation method to highway maintenance machinery configuration, describing fuzzy phenomena with subjective qualitative descriptions and closely integrating them with quantitative analysis, in order to find a reasonable and workable evaluation approach and make the mechanization of highway maintenance and construction more scientific and practical.
eng_Latn
2,893
The application of the bionic concept in the new urban waterfront district design:A case study of Kengkou area in Dinghu District of Zhaoqing City
Aim: To explore an innovative planning and design concept for new waterfront areas during rapid urbanization. Methods: Field investigation was used to determine the core issue, and the idea of bionics was used as the origin of thought and as a spatial prototype. Results: The overall plan achieved a "multi-core, one axis, nine areas" spatial structure, a "five vertical and three horizontal" road framework, and five styles of landscape partition based on different layouts and construction styles. Conclusion: Applying the biomimetic principle in the urban design of a new waterfront area is a new exploration. A new urban waterfront district plan derived from bionics will guide regional economic and social development and the harmonious coexistence of human beings and nature.
Taking the BOT project of the Chongqing Fuling Lidu Bridge as an example, this paper discusses the key links that need attention during BOT investment choice, contract signing and construction management. Some important points for successfully executing a BOT project are summarized. The paper can serve as a reference for investors in related building industries.
eng_Latn
2,895
The treatment of congenital dislocation and subluxation of the hip in the older child.
Two additional risks are involved in the treatment of congenital dislocation of the hip when diagnosis is made too late: avascular necrosis following the reduction, and residual coxo-femoral dysplasia. Our study and the experience that we acquired within the past 10 years on 142 hips show the value of closed reduction by slow progressive traction with abduction and internal rotation, and the value of innominate osteotomy. These are two strong points on which we can base our treatment in order to further improve the results of our efforts in this field.
This paper presents an exact combinatorial algorithm for solving the Discrete Berth Allocation Problem (DBAP) and the Hybrid Berth Allocation Problem (HBAP) with fixed handling times of vessels based on the original algorithm for solving combinatorial problems called Sedimentation Algorithm. We address the issues of DBAP and HBAP according to the Rashidi and Tsang model. To the best of our knowledge, the proposed algorithm is the first exact combinatorial algorithm for solving the general DBAP and HBAP based on Rashidi and Tsang model. Computational results prove the superiority of the proposed algorithms compared with the exact solvers based on the Mixed Integer Programming (MIP) models. Efficient C implementation enabled us to solve instances with up to 65 vessels. This resolves most of the real life problems, even in large ports.
eng_Latn
2,897
57Fe Mössbauer spectroscopic study of organic-rich sediments (source rocks) from test wells CTP-1 and MDP-1 located in Eastern Krishna–Godavari basin, India
Abstract A large number of sub-surface sedimentary samples from various depths of wells CTP-1 and MDP-1, drilled in the Eastern Krishna–Godavari basin (KG basin) of India, were studied using Mössbauer spectroscopy. Results indicate that iron is distributed among pyrite, siderite and clay minerals; apart from these minerals, an anomalously large presence of sulfate minerals was also found, indicating oxidizing conditions in the sediments. The significance of minerals indicating oxidizing conditions for source-rock characterization is discussed.
Different systems act as one of the most promising forms of integration in the urban planning structure. In forming plans for the social and economic development of major cities, situations increasingly arise in which improving resource efficiency requires not just a concentration of effort but also new, innovative forms of building production organization. It is proposed to establish in Odessa the "Corporate Scientific and Technical Complex of urban planning energy renovation" as an innovative organizational structure that practically applies the accumulated scientific and technical potential to the reconstruction of historic buildings in Odessa from 1820–1920 using energy efficiency standards. For the effective functioning of "KNTK GERek", it is necessary to organize accelerated training courses for workers in the occupation "master of finishing construction work", specialty "plasterer".
eng_Latn
2,900
A Modest Proposal for the Documentary Value of The Treasure of Religious Maintenance
The Treasure of Religious Maintenance was compiled by Guo Sizhou and other followers of the Three Instruct religion. Covering various aspects of the Three Instruct religion during the transition from the Qing Dynasty to the Republic of China, such as the construction of memorial temples, the revision of canons, and the improvement of the religion's class system, the book is of fairly high documentary value and thus constitutes indispensable historical material for studying the history of the Three Instruct religion.
Aimed at the problem that the indexes of road maintenance machinery configuration programs are difficult to quantify, this paper applies the fuzzy comprehensive evaluation method to highway maintenance machinery configuration, describing fuzzy phenomena with subjective qualitative descriptions and closely integrating them with quantitative analysis, in order to find a reasonable and workable evaluation approach and make the mechanization of highway maintenance and construction more scientific and practical.
eng_Latn
2,901
Combustion Optimization-oriented Spectral Measurement and Field Reconstruction of Furnace Parameters for Power Station Boilers
An analysis was made of the effects of furnace parameters on the efficiency, pollutant emission and service life of a boiler, so as to study the relationship between these parameters and boiler combustion optimization. A new technology for measuring furnace parameters based on laser absorption spectroscopy was described in terms of its measuring principle and system structure. Taking a 680 MW coal-fired boiler as an example, field measurement and reconstruction of furnace parameters were carried out based on laser spectroscopy. Results show that with this method, measurement and field reconstruction can be completed for multiple furnace parameters, which may serve as visible and effective guidance for optimizing boiler combustion. By adjusting combustion according to the measurement results, the flame can be positioned in the furnace center, which fills the furnace adequately and makes the temperature field evenly distributed.
Different systems act as one of the most promising forms of integration in the urban planning structure. In forming plans for the social and economic development of major cities, situations increasingly arise in which improving resource efficiency requires not just a concentration of effort but also new, innovative forms of building production organization. It is proposed to establish in Odessa the "Corporate Scientific and Technical Complex of urban planning energy renovation" as an innovative organizational structure that practically applies the accumulated scientific and technical potential to the reconstruction of historic buildings in Odessa from 1820–1920 using energy efficiency standards. For the effective functioning of "KNTK GERek", it is necessary to organize accelerated training courses for workers in the occupation "master of finishing construction work", specialty "plasterer".
eng_Latn
2,906
Box type girder hoisting device
The utility model belongs to the technical field of box type girder body mating workpieces, and particularly relates to a box type girder hoisting device. According to the technical scheme, the box type girder hoisting device comprises an L-shaped steel plate; a first moving plate and a second moving plate, connected by a shaft, are arranged at the two ends of the steel plate, and the ends of the first moving plate and the second moving plate are connected by a third moving shaft; a left hoisting ring and a right hoisting ring are arranged at the two ends of the first moving plate or the second moving plate. The box type girder hoisting device not only can be used repeatedly, but also overcomes the defect that hoisting rings mounted on a girder body impair the usability of the girder body.
Abstract : The purpose of this thesis is to evaluate the systems engineering effort by the Aviation Research and Development Activity (AVRADA), the Airborne Engineering Research Activity (AERA), and support contractor DOSS to install the Trimble Global Positioning System (GPS) receiver onto Army helicopter platforms. This study is an example of a successful systems engineering effort to install a non-developmental item (NDI) onto existing aircraft platforms in response to an urgent requirement created by the deployment of aircraft for Operation Desert Shield.
eng_Latn
2,907
Construction of Collaborative Innovation Platform for Universities and Colleges
Since the "innovation capacity improvement plan for universities and colleges" (abbr. "Plan 211") was carried out, many national universities have begun to nurture or implement collaborative innovation one by one. Combining the practical background with relevant theory, the article describes the content, functions and basic principles of constructing a collaborative innovation platform for universities and colleges. The platform's operation, safeguard mechanisms and institutional system construction are then discussed. Finally, the study may provide references and suggestions for advancing the collaborative innovation work of our universities and colleges, which is an important part of the national innovation system.
Taking the BOT project of Chongquin Fuling Lidu Bridge as example,this paper discusses key links needing attention during BOT investment choice,contract signing and construction management.Some important points are summarized in this paper for successful executing BOT project.This paper can be referenced by investors in related building industries.
eng_Latn
2,921
Innovation appraisal of the SpaceLiner concept
Innovation plays an important role in space. Consequently, constant and ongoing appraisal of innovations and breakthrough trends is necessary to remain competitive in research and development, as well as from an economic perspective. This work investigates methods currently used in industry for evaluating innovations, in terms of their transferability to complex space transportation projects. Their advantages, disadvantages and significance are highlighted. Finally, these methods are applied to evaluate the innovation of the 'SpaceLiner'.
The paper introduces the conditions under which cold recycling of the flexible base was implemented on the Ying–Da Line. It comprehensively expounds the investigation of the road conditions of the original pavement, the indoor trial study, the determination of the trial lot plan, and the completion of the trial lot.
eng_Latn
2,926
170-265 Rocks Transportation Road Construction and Optimization in Fujiawu Mining Area
This paper focuses on the construction method and optimization of the 170-265 rock transportation road in the Fujiawu mining area of the Dexing copper mine, introducing the construction procedure, quality control, safety guarantee measures and the transportation safety of large dump trucks in the open-pit mine. Through on-site practice, this optimization and construction method is shown to meet all economic and safety requirements, providing referential experience and guidance for future project construction and optimization.
We have modeled the AntSim case study for the GraBats 2008 tool contest with the Fujaba tool. It turned out that for this problem the moving of single ants is the most frequent operation. The execution time for this operation dominates the overall execution time. This paper will report how we addressed the move ant problem using dedicated Fujaba features and which performance we have achieved.
eng_Latn
2,931
Effective application of automatic auxiliary supervision equipment for Highway patrol task
Automatic auxiliary supervision equipment was launched for highway patrol tasks in Taiwan in 2015. This study uses filming instead of still photos to record road damage at night. In addition, virtual detection points were set to identify whether a patrol vehicle was on a viaduct or on the ground. More importantly, the data on damage types was analyzed further for road management and maintenance.
The analytic hierarchy process is adopted. According to the characteristics of the postal network and the transport network, and starting from four aspects, the economic, the adaptable, the coordinative and the social, we scientifically analyze the factors of postal transport logistics center location and rank all factors by their importance for the postal and email center. AHP is used to simplify the selection process of postal distribution centers and has certain applicability.
eng_Latn
2,934
Computational Enhancements in Tree-Growing Methods
In this paper we show how to avoid unnecessary calculations and considerably reduce the computational cost in a wide class of tree-based methods. So-called auxiliary statistics, which make it possible to avoid repeatedly handling the raw data, are introduced. In addition, a fast splitting algorithm is outlined that recognizes and avoids unnecessary split evaluations during the search for an optimal split. Relationships between the computational cost savings and the properties of both a specific method and the data are summarized.
Large-scale construction projects appear frequently, drawing increasing attention to performance evaluation. This paper established a performance evaluation index system for large-scale construction projects and carried out a comprehensive evaluation of performance with a model combining fuzzy theory and matter-element theory. The analytic hierarchy process and the membership function were used to determine the weights of the indicator elements and the membership matter-element. Finally, the study used the performance indicator system in the construction phase of a large-scale project as an example to verify the correctness, rationality and feasibility of the fuzzy matter-element approach.
eng_Latn
2,938
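The "auxiliary statistics" idea in the tree-growing abstract above, maintaining running sums so each candidate split can be scored in O(1) instead of rescanning the raw data, can be sketched for a regression split on one numeric feature. This is an illustrative reconstruction under that reading, not the paper's actual algorithm.

```python
def best_split(xs, ys):
    """Find the split threshold minimizing the within-node sum of squared
    errors, scanning sorted candidates once while maintaining running
    sums (the 'auxiliary statistics')."""
    pairs = sorted(zip(xs, ys))
    n = len(pairs)
    total = sum(y for _, y in pairs)
    total_sq = sum(y * y for _, y in pairs)
    left_sum = left_sq = 0.0
    best = (float("inf"), None)  # (sse, threshold)
    for i in range(1, n):
        x, y = pairs[i - 1]
        left_sum += y
        left_sq += y * y
        if pairs[i][0] == x:       # cannot split between equal x values
            continue
        right_sum = total - left_sum
        right_sq = total_sq - left_sq
        # SSE of each side via sum and sum-of-squares, no rescan needed
        sse = (left_sq - left_sum ** 2 / i) \
            + (right_sq - right_sum ** 2 / (n - i))
        if sse < best[0]:
            best = (sse, (x + pairs[i][0]) / 2)
    return best

print(best_split([1, 2, 3, 4], [1.0, 1.0, 5.0, 5.0]))
```

Each of the n−1 candidate splits is evaluated from the running sums alone, so the scan costs O(n) after the initial sort rather than O(n²).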
Hierarchical Networks-on-Chip Interconnect for Astrocyte-Neuron Network Hardware
Scalable hardware interconnect is a significant research challenge for neuromorphic systems; in particular, it becomes more pronounced when we seek to integrate neurons with astrocyte cells. This paper presents a novel interactive architecture for astrocyte-neuron network (ANN) hardware systems, the Hierarchical Astrocyte Network Architecture (HANA), which uses networks-on-chip (NoC) for efficient information exchange between astrocyte cells. The proposed HANA incorporates a two-level NoC packet transmission mechanism to increase the information exchange rate between astrocyte cells and to balance NoC traffic across local and global astrocyte networks. Experimental results demonstrate that HANA provides efficient information exchange rates for the ANN, while hardware synthesis results using 90 nm CMOS technology show a low area overhead that maintains scalability.
This report documents work performed by the University of California at Davis in collaboration with the California Department of Transportation (Caltrans) in relationship to a Precursor System Analysis dealing with a study of automated construction, maintenance, and operational requirements for Automated Highway Systems (AHSs). This study was conducted in the activity area "K" dealing with AHS roadway operational analysis. The report documents a design analysis for a robotic system for automated installation of discrete magnetic markers for AHSs. The design includes a manpower assessment and a cost-benefit analysis. Summaries for supporting research are included in appendices, including: a literature review, a reference architecture and classification system for an AHS, and a computer simulation and animation of the prototype discrete magnetic marker placement system described in this report.
eng_Latn
2,942
Combination of a dynamic-hybrid berth allocation problem with a quay crane scheduling problem
Recently due to the increasing pressure to improve the efficiency of port operations, a great deal of research has been devoted to optimizing container terminal operations. Most papers deal with either the berth allocation problem or the crane scheduling problem. However only a limited number of papers deal with both, berth allocation and crane scheduling through either a combined model of the two problems, or two models with nested loops of each problem feeding into each other. In this paper, however, a mixed integer programming model is proposed for the integrated berth allocation and quay crane scheduling. Numerical experiments are conducted to test the performance of the proposed approach, using commercial software. The results illustrate that predefining both the berth allocation and crane scheduling before the vessel is processed results in improved time optimization.
This paper presents a suitable solution of studying methods on BRBs’ critical load with finite element software.The effects of the relevant factors on the BRBs’ ultimate bearing capacity are investigated, and some practical suggestions for the design and production are provided.
eng_Latn
2,948
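The berth allocation side of the problem described above can be illustrated with a tiny brute-force sketch: assign each vessel to a berth and serve each berth's vessels in shortest-processing-time order, which minimizes total completion time on a single berth. This is a toy illustration only; the paper's actual model is a mixed integer program that also integrates quay crane scheduling.

```python
from itertools import product

def total_completion(assign, handling, n_berths):
    """Sum of vessel completion times given a vessel-to-berth assignment.
    Vessels on each berth are served in SPT (shortest handling time) order."""
    total = 0
    for b in range(n_berths):
        t = 0
        for h in sorted(h for v, h in zip(assign, handling) if v == b):
            t += h        # this vessel finishes at time t
            total += t
    return total

def best_allocation(handling, n_berths):
    """Brute force over every possible vessel-to-berth assignment."""
    best = None
    for assign in product(range(n_berths), repeat=len(handling)):
        c = total_completion(assign, handling, n_berths)
        if best is None or c < best[0]:
            best = (c, assign)
    return best

# 3 vessels with fixed handling times, 2 berths
print(best_allocation([2, 1, 3], 2))
```

With v vessels and b berths there are b^v assignments, which is why realistic instances call for MIP solvers or heuristics rather than enumeration.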
ROUTE PLANNING FOR ELECTRIC BUSES
Electric vehicles still lack the flexibility, and thus the prevalence, of internal combustion engines, mostly due to the inadequacy of battery technologies. Electric vehicles designed for general use often result in economic, operational and sometimes technical infeasibilities. Eliminating these feasibility issues is necessary to take advantage of electric vehicles, and many of them can be eliminated through individual customization and configuration of vehicles according to their intended use and areas. Our study addresses the criteria by which city buses should be electrified in order to maintain economic feasibility. These criteria are applied to selected city bus lines and the configuration parameters are determined.
Introduces the application functions of the raster-to-vector conversion software R2V, analyzing the realization and concrete operation of R2V in highway horizontal alignment design, and providing a certain reference value for improving highway horizontal alignment design.
yue_Hant
2,950
Regulations Relaxing,Structure Adjusting and Competition Efficiency:A Comparative Study on Electric Power System Reform of China,Japan and Korea
This dissertation makes a comparative study of the practice of Chinese, Japanese and Korean electric power system reform, and analyzes the starting point of the reform, the choice of targets, the mode of introducing competition, the supervision of electricity, electric power privatization and the effects of the reform. Finally, the author tries to draw experiences that are meaningful for deepening the reform of the Chinese electric power system.
Large-scale construction projects appear frequently, drawing increasing attention to performance evaluation. This paper established a performance evaluation index system for large-scale construction projects and carried out a comprehensive evaluation of performance with a model combining fuzzy theory and matter-element theory. The analytic hierarchy process and the membership function were used to determine the weights of the indicator elements and the membership matter-element. Finally, the study used the performance indicator system in the construction phase of a large-scale project as an example to verify the correctness, rationality and feasibility of the fuzzy matter-element approach.
eng_Latn
2,954
On the Construction of the Hardware environment of Independent English Study
This paper analyses the importance and present situation of independent English study in our country. Drawing on years of teaching practice, the author addresses the construction of a hardware environment for independent English study from such aspects as the building targets, building principles and building plans, together with the underlying thinking, possibilities and practicality.
Aimed at the problem that the indexes of road maintenance machinery configuration programs are difficult to quantify, this paper applies the fuzzy comprehensive evaluation method to highway maintenance machinery configuration, describing fuzzy phenomena with subjective qualitative descriptions and closely integrating them with quantitative analysis, in order to find a reasonable and workable evaluation approach and make the mechanization of highway maintenance and construction more scientific and practical.
eng_Latn
2,964
Compositions and methods for improving the cognitive function in animals and cognitive related functions
Compositions comprising melatonin and at least one carotenoid, and methods for improving cognitive function in animals and cognition-related functions (e.g., inhibiting or preventing the decline of social interaction, inhibiting or preventing age-related behavioral changes, increasing trainability, improving attention, maintaining optimal brain function, facilitating learning and memory, inhibiting amnesia, and preventing or treating cognitive loss due to dementia). Preferably, the composition is a food composition useful for improving cognitive function in humans and companion animals.
Large-scale construction projects appear frequently, drawing increasing attention to performance evaluation. This paper established a performance evaluation index system for large-scale construction projects and carried out a comprehensive evaluation of performance with a model combining fuzzy theory and matter-element theory. The analytic hierarchy process and the membership function were used to determine the weights of the indicator elements and the membership matter-element. Finally, the study used the performance indicator system in the construction phase of a large-scale project as an example to verify the correctness, rationality and feasibility of the fuzzy matter-element approach.
eng_Latn
2,973
Evaluation of supervising work under construction
For the objective evaluation of supervising work during construction stages, main evaluation factors are established for the door and window, decoration and roof engineering of supervising work. Using the fuzzy evaluation method, this paper puts forward factor, comment and weight sets for the evaluation. With this method, integrated evaluation results for all evaluation items of door and window, decoration and roof engineering can be obtained, providing supervising departments with a foundation for the second evaluation of the work quantity monitoring system.
Abstract: This study uses eyewitness accounts of enterprise operations as an unconventional source of information on Soviet productivity and worker behavior. Keywords: Soviet Union, worker productivity.
eng_Latn
2,983
Designing determination of procrastination level in students utilize genetic algorithms method in data mining classification
Someone who has difficulty doing something by a deadline, who often postpones, over-prepares, or even fails to complete the task by the deadline, is said to procrastinate. This study uses the Genetic Algorithm method to inform the design of an information system. The objective of this study is to apply database management to determining procrastination characteristics in students at school. In the Genetic Algorithm method, the study uses the most appropriate determined indicators, with a chromosome representation of the similarity between data characteristics in the constituent criteria classification that can influence procrastination in students. Based on the tests and results, the conclusion is that the Procrastination Determination Program would be a computerized information system, and would thus be useful for determining procrastination in students in the future.
The key factor for improving the efficiency of assembly sequence planning is to reduce the number of components involved in the search or generation of assembly sequences. This paper presents a methodology for automatically extracting subassemblies from a product through analyzing the interference matrices and connection matrices of an assembly in order to simplify the assembly sequence generation. By considering the engineering information of components, the number of components involved in the search of assembly sequences can be reduced greatly.
eng_Latn
2,987
Motion planning with pulley, rope, and baskets
We study a motion planning problem where items have to be transported from the top room of a tower to the bottom of the tower, while simultaneously other items have to be transported into the opposite direction. Item sets are moved in two baskets hanging on a rope and pulley. To guarantee stability of the system, the weight difference between the contents of the two baskets must always stay below a given threshold. We prove that it is Pi-2-p-complete to decide whether some given initial situation of the underlying discrete system can lead to a given goal situation. Furthermore we identify several polynomially solvable special cases of this reachability problem, and we also settle the computational complexity of a number of related questions.
Sooner or later each manufacturer, because of the rising level of competition on the market, will have to find a way to decrease expenses and increase profit. Optimization is the way to achieve that. In this paper, optimization of metal mast construction using ANSYS Mechanical and IOSO algorithms is discussed. Loads are automatically calculated according to the Code of Rules 20.13330.2011, and load-bearing elements are checked for strength and stability according to the Code of Rules 16.13330.2011. After the optimization run, a Pareto set is created which allows an optimal variant of the construction to be chosen manually.
eng_Latn
2,989
Study on the Associative Character of Dongba Script in Lijiang Area
Dongba script, which is at an early stage of writing, is an important pictograph, and the study of Dongba script helps in studying the origin and development of writing. Through long-term research on Dongba scripture in the Lijiang area, we find that the associative character in Dongba script is special. This article studies the associative characters in Dongba script.
Taking the BOT project of the Chongqing Fuling Lidu Bridge as an example, this paper discusses key links needing attention during BOT investment choice, contract signing, and construction management. Some important points for successfully executing a BOT project are summarized in this paper. This paper can be referenced by investors in related building industries.
eng_Latn
2,993
Arrangement and method for improving image quality, in particular for image projection arrangements
An image projection arrangement for projecting an image, comprising: a) a scanner for sweeping a main laser beam along mutually orthogonal scan directions to project a pattern of scan lines each having a number of pixels; b) a control device which is operatively connected to the scanner, for causing selected pixels to be illuminated, and rendered visible, to produce the image; and c) an optical arrangement for generating a plurality of constituent laser beams having respective output powers and mutually orthogonal polarizations, and a polarization beam combiner for combining the constituent laser beams to form the main laser beam with an output power greater than each output power of the constituent laser beams, to increase the brightness of the illuminated pixels.
Large-scale construction projects appear frequently, leading people to pay more attention to performance evaluation. This paper establishes a performance evaluation index system for large-scale construction projects and carries out a comprehensive evaluation of performance with a combined model of fuzzy theory and matter-element theory. The analytic hierarchy process and the membership function are used to determine the weights of indicator elements and the membership matter-element. Finally, the study uses the performance indicator system in the construction phase of a large-scale project as an example to verify the correctness, rationality, and feasibility of the fuzzy matter-element approach.
eng_Latn
2,994
Bucket elimination: a unifying framework for processing hard and soft constraints
This position paper argues that extending the CSP model to a richer set of tasks such as, constraint optimization, probabilistic inference and decision theoretic tasks can be done within a unifying framework called ‘‘bucket elimination’’. The framework allows uniform hybrids for combining elimination and conditioning guided by the problem's structure and for explicating the tradeoffs between space and time and between time and accuracy.
This paper systematically analyzes the main factors affecting the decision of bracing systems for deep foundation pits; establishes an index system for evaluating schemes and a multiobjective fuzzy optimum model of bracing schemes through fuzzy mathematical theory; and puts forward rational methods for determining objective weights and quantifying qualitative objectives. A case study shows that the decision method is effective and easy to operate.
eng_Latn
2,997
An application-independent concurrency skeleton in Ada 95
Foundations for the study of software architecture
Users' Preference Prediction of Real Estates Featuring Floor Plan Analysis using FloorNet
eng_Latn
3,044
A method for determining the state of charge of a rechargeable secondary intercalation battery
A method is proposed for determining the state of charge of a secondary intercalation cell having an anode, a cathode, a separator, and an electrolyte phase soaking the anode, the cathode, and the separator, wherein the state of charge is calculated back from measured values of the intercalation cell by means of an electrochemical simulation model. In the underlying simulation model, the physico-chemical properties in the anode and the cathode are each treated as homogeneously distributed, and Butler-Volmer kinetics are calculated for the anode and for the cathode. According to the invention, it is provided that the Butler-Volmer kinetics on the anode side include a potential portion (Φ
The investigation of cardiac dysrhythmias has become increasingly more complex, time consuming, and expensive. A complex case requires 5 to 8 hours of physician time. Although automated computer technologies have been applied to dysrhythmia recognition by surface ECG and post-case measurement of electrophysiology data, they have not, to our knowledge, been used to perform stimulation protocols or measure stimulated intracardiac intervals automatically during electrophysiologic testing on line. It is the purpose of this article to report the development of a stimulation/measurement system and programs, which automatically measure intracardiac intervals during electrophysiologic study, and to compare these measurements to those obtained in the usual manner.
eng_Latn
3,095
what sensor detects blood ph
Chemoreceptors detect the levels of carbon dioxide in the blood by monitoring the concentrations of hydrogen ions in the blood. An increase in carbon dioxide concentration leads to a decrease in the pH of blood due to the production of H+ ions from carbonic acid.
The Diagnostics of Extraesophageal Reflux With the Restech System. This study has been completed. The aim of the project is to define the frequency with which EER is present in patient with chronic rhinosinusitis (CHR). The measurement will be carried out with a 24-hour monitoring of the pH using the Restech system. This modern device is equipped with a narrow antimony probe. The sensor is able to record not only liquid but also aerosol reflux episodes. The second aim is to determine the relation among EER, CHR and asthma bronchiale.
eng_Latn
3,138
what is a pulse volume recording
What is a pulse volume recording (PVR) study? A PVR study is a noninvasive vascular test in which blood pressure cuffs and a hand-held ultrasound device (called a Doppler or transducer) are used to obtain information about arterial blood flow in the arms and legs.
pulse oximeter. A device for the continuous monitoring of the blood oxygen levels, both by visible and audible means, during general anaesthesia. This is a major aid to patient safety as a very small drop in blood oxygenation is immediately apparent, alerting the anaesthetist to investigate the cause.
eng_Latn
3,161
what is emg solvent
Electromyography (EMG) is an experimental technique used to record and analyse myoelectric signals. It can be defined as 'the study of muscle function through the inquiry of the electrical signals that the muscle emanates (gives out or emits)'.
Electromyography (EMG) is an electrodiagnostic medicine technique for evaluating and recording the electrical activity produced by skeletal muscles. EMG is performed using an instrument called an electromyograph, to produce a record called an electromyogram.
eng_Latn
3,167
how to check blood flow in legs
A common test for checking the blood flow in your legs is called a PVR (pulse volume recording) study. During this test, cuffs like the ones used to measure blood pressure in your arm are wrapped around your arm and your leg on the same side of your body. Four cuffs are wrapped around your leg--1 at the upper thigh, 1 at the lower thigh, 1 at the upper calf and 1 at the ankle.
A cardiac catheterization can check blood flow in the coronary arteries, check blood flow and blood pressure in the chambers of the heart, find out how well the heart valves work, and check for defects in the way the wall of the heart moves.
eng_Latn
3,181
define emg procedure
Electromyography (EMG) is a diagnostic procedure to assess the health of muscles and the nerve cells that control them (motor neurons). Motor neurons transmit electrical signals that cause muscles to contract. An EMG translates these signals into graphs, sounds or numerical values that a specialist interprets.
Highlights: Electromyography (EMG) is a procedure that assesses the health of muscles and nerves. There are usually two parts to an EMG procedure: the nerve conduction study and the needle EMG. Abnormal EMG results usually indicate nerve or muscle damage.
eng_Latn
3,193
anthropometric measurements definition in ergonomics
Definition: Anthropometry is the measure of wo/man (anthro=man, pometry=measure). The study of anthropometry is the study of human body measurements to assist in understanding human physical variations and aid in anthropological classification.
These important issues need to be understood and applied if the objective is to reduce work-related injuries, improve productivity, and improve the quality of life of the workers. ANTHROPOMETRY Anthropometry may be defined as the measurement (e.g., height, elbow-wrist length, etc.) of human beings.
eng_Latn
3,205
what test is for electrolyte
Electrolyte tests are performed from routine blood tests. The techniques are simple, automated, and fairly uniform throughout the United States. During the preparation of blood plasma or serum, health workers must take care not to break the red blood cells, especially when testing for serum potassium.
Electromyography (EMG) is a form of electrodiagnostic testing that is used to study nerve and muscle function. Commonly performed by a physiatrist or neurologist trained in this procedure, EMG testing can provide your doctor with specific information about the extent of nerve and/or muscle injury and can also determine the exact location of injury and give some indication whether the damage is reversible.
eng_Latn
3,206
electromyography is what kind of specialist
Commonly performed by a physiatrist or neurologist trained in this procedure, EMG testing can provide your doctor with specific information about the extent of nerve and/or muscle injury and can also determine the exact location of injury and give some indication whether the damage is reversible.
Electromyography (EMG) is an experimental technique used to record and analyse myoelectric signals. It can be defined as 'the study of muscle function through the inquiry of the electrical signals that the muscle emanates (gives out or emits)'.
eng_Latn
3,294
what does a nerve conduction test show
The speed of the response is called the conduction velocity. The same nerves on the other side of the body may be studied for comparison. When the test is done, the electrodes are removed. Nerve conduction studies are done before an EMG if both tests are being done. Nerve conduction tests may take from 15 minutes to 1 hour or more, depending on how many nerves and muscles are studied. Normal: The EMG recording shows no electrical activity when the muscle is at rest. There is a smooth, wavy line on the recording with each muscle contraction. The nerve conduction studies show that the nerves send electrical impulses to the muscles or along the sensory nerves at normal speeds, or conduction velocities.
A nerve conduction study (NCS) is a medical diagnostic test commonly used to evaluate the function, especially the ability of electrical conduction, of the motor and sensory nerves of the human body.
eng_Latn
3,302
what is a seismologist
What is a Seismologist? Seismology is the study of seismic waves, energy waves caused by rock suddenly breaking apart within the earth or the slipping of tectonic plates. We know these events as earthquakes.
A seismogram is a visual record that is created by a seismograph. A seismograph is a piece of equipment that records earthquake movements. These two items go hand in hand and are essential for the study of earthquakes. Without a seismograph, there would be no seismogram. A seismograph detects movement in the Earth's crust, translating that movement through its inner workings to move a recording device, often a needle, that makes markings on what becomes the seismogram.
eng_Latn
3,317
what is the primary tool scientist use to measure earthquakes
The study of seismic waves is known as seismology, a word derived from a Greek word meaning "to shake." Seismographs are the instruments which record earthquakes. Scientists use these instruments as their principal tool to study seismic waves. They are very sensitive instruments that can detect, measure and record ground vibrations and their intensities during an earthquake. A seismograph is a simple pendulum.
A seismometer is the scientific instrument used to detect earthquakes. Signals received from seismometers are recorded and this allows scientists to calculate the size (amount of energy released) of an earthquake, that is, its magnitude.
eng_Latn
3,318
This type of weather radar calculates the speed & direction of a weather system
How to use and interpret Doppler weather radar - d4n.nl: Doppler radar has not always been used for weather radar. Doppler radar ..... Since radar only sends pulses of energy in one direction per pulse, the wind speed detected is ... measure or calculate wind in more than one dimension. However...
Forensic Science Vocabulary Unit #1 Flashcards | Quizlet: Ballistics. The science that deals with the motion, behavior, and effects of projectiles, most often firearms and bullets.
eng_Latn
3,379
what are baseline studies
Baseline studies. What is a baseline study? The purpose of a baseline study is to provide an information base against which to monitor and assess an activity’s progress and effectiveness during implementation and after the activity is completed.
Baseline Courses: IS-700 NIMS, an Introduction: This independent study course introduces the NIMS concept. ICS-100 Introduction to the Incident Command System: This independent study course introduces ICS and provides the foundation for higher level ICS training.
eng_Latn
3,437
operational definition a study of matter
An operational definition is a detailed specification of how one would go about measuring a given variable. Operational definitions can range from very simple and straightforward to quite complex, depending on the nature of the variable and the needs of the researcher.
Operational Definition. The operational definition of a variable is the specific way in which it is measured in that study. Another study might measure the same conceptual measure differently. If you were studying ways of helping people stop smoking, smoking cessation would be an outcome measure (dependent variable).
eng_Latn
3,441
how is physics used in forensics science
Gun ballistics - the trajectory of a bullet can be reconstructed using physics equations; the impact between the bullet and the target (and shooter) is analyzed using conservation of momentum. The bullet also behaves as a gyroscope while in air and follows a precession or nutation oscillatory motion.
Forensic archaeology is the application of a combination of archaeological techniques and forensic science, typically in law enforcement. Forensic astronomy uses methods from astronomy to determine past celestial constellations for forensic purposes. Forensic psychology is the study of the mind of an individual, using forensic methods; usually it determines the circumstances behind a criminal's behavior. Forensic seismology is the study of techniques to distinguish the seismic signals generated by underground nuclear explosions from those generated by earthquakes.
eng_Latn
3,447
SOME INTERDISCIPLINARY ASPECTS OF MINERALOGY
This paper gives a brief discussion of three problems related to the interdisciplinary aspects of mineralogy: (1) The relationship between the external forms and internal microscopic structures of crystals is traced historically, with special emphasis on calaverite, whose indexing difficulty leads to an incommensurately modulated structure, a kind of quasiperiodic structure. (2) The real structure (including imperfections and heterogeneous phases) of mineral crystals has been found to be a fertile field of immense potential for mineralogy. (3) The study of the internal periodic structure of opals at the micron scale gave a first instance of mesoscopic periodic structure, and influenced materials science in the fabrication of photonic crystals and nanostructures.
ABSTRACTNavigation using geophysical information is often implemented on underwater vehicles to correct the position errors of inertial navigation system. In this paper, we fuse the information of ...
yue_Hant
3,498
what are three key aspects of minerals?
Amethyst, a variety of quartz. A mineral is a naturally occurring chemical compound, usually of crystalline form and abiogenic in origin. A mineral has one specific chemical composition, whereas a rock can be an aggregate of different minerals or mineraloids. The study of minerals is called mineralogy.
Minerals are important for your body to stay healthy. Your body uses minerals for many different jobs, including building bones, making hormones and regulating your heartbeat. There are two kinds of minerals: macrominerals and trace minerals. Macrominerals are minerals your body needs in larger amounts.
eng_Latn
3,671
what does the mineral name tell us
Amethyst, a variety of quartz. A mineral is a naturally occurring chemical compound, usually of crystalline form and abiogenic in origin. A mineral has one specific chemical composition, whereas a rock can be an aggregate of different minerals or mineraloids. The study of minerals is called mineralogy.
We can tell different minerals apart by what crystal shape they are. Sometimes minerals form in spaces where there is not a lot of room, so they don't have a crystal shape. When there is just a big hunk of a mineral, it is called a massive mineral. Most of the earth's crystals were formed millions of years ago. Crystals form when the liquid rock from inside the earth cools and hardens. Sometimes crystals form when liquids underground find their way into cracks and slowly deposit minerals.
eng_Latn
3,736
Here in this study we propose an efficient entanglement concentration protocol (ECP) for separate nitrogen-vacancy (NV) centers, resorting to the single-photon input---output process of the NV center and microtoroidal resonator coupled system. In the proposed ECP, one ancillary single-photon is prepared and passed through a hybrid quantum circuit. By measuring the photon under the suitable polarization basis, maximally entangled state between the separate NV centers can be obtained with a certain success probability. The solid entanglement will be preserved during the process, which can be iterated several rounds to obtain an optimal total success probability. We also discuss the experimental feasibility of the protocol by considering current technologies, and we believe that the protocol is useful in the future applications of long-distance quantum communication and distributed quantum computation.
The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state, which is a new type of logic-qubit entanglement, has attracted a lot of attention recently. We present a feasible entanglement concentration protocol (ECP) for logic-qubit entanglement. This ECP is based on linear optics, and it does not require knowing the initial coefficients of the less-entangled C-GHZ state. This protocol can be extended to an arbitrary C-GHZ state. This protocol may be useful in future quantum information processing tasks.
Since the end of the Cold War, the study of European defence has been dominated by a ‘Common Security and Defence Policy (CSDP)-centric’ approach, while largely neglecting the comparative analysis ...
eng_Latn
3,849
Recent demonstrations of macroscopic quantum coherence in Josephson junction based electronic circuits have opened an entirely new dimension for research and applications in the established field of Josephson electronics. In this article we discuss basic Josephson circuits for qubit applications, methods of quantum description of these circuits, and circuit solutions for qubit couplings. Principles of manipulation and readout of superconducting qubits are reviewed and illustrated with recent experiments using various qubit types.
Already in the first edition of this book (Barone and Paternò, "Physics and Applications of the Josephson Effect", Wiley 1982), a great number of interesting and important applications for Josephson junctions were discussed. In the decades that have passed since then, several new applications have emerged. This chapter treats one such new class of applications: quantum optics and quantum information processing (QIP) based on superconducting circuits with Josephson junctions. In this chapter, we aim to explain the basics of superconducting quantum circuits with Josephson junctions and demonstrate how these systems open up new prospects, both for QIP and for the study of quantum optics and atomic physics.
The reasons for the failure to develop a successful Josephson tunnel junction made from high-temperature superconducting cuprates is discussed. The difficulties in developing a theoretical analysis of even simple-to-make cuprate Josephson devices are pointed out. The development of alternative Josephson devices based on the fabrication of engineered microbridges is addressed, and an overview is given of successful YBCO microbridges. Emphasis is given to focused ion beam and step-edge microbridges.
eng_Latn
3,851
This work inquires into global climatic catastrophes of the past, presenting data not easily available outside of the socialist countries, and applies these results to the study of future climatic developments, especially as threatened in the case of nuclear warfare - nuclear winter. The authors discuss probable after-effects from the Soviet point of view on the basis of research, stressing the need to avoid all conflict which might lead to the next and final global climatic catastrophe.
The Flood/post-Flood boundary in the geologic column can be determined by investigating geophysical evidence in light of Scripture’s record of the Flood. The following evidences are investigated: (1) global sediment and post-Flood erosion, (2) volcanism and climatic impact, (3) changes in the global sea level, (4) formation of the mountains of Ararat, and (5) the formation of fossil fuels. The evidences suggest that the Flood/post-Flood boundary is very late in the Cainozoic and most likely in the Pleistocene.
We present a scheme for achieving coherent spin squeezing of nuclear spin states in semiconductor quantum dots. The nuclear polarization dependence of the electron spin resonance generates a unitary evolution that drives nuclear spins into a collective entangled state. The polarization dependence of the resonance generates an area-preserving, twisting dynamics that squeezes and stretches the nuclear spin Wigner distribution without the need for nuclear spin flips. Our estimates of squeezing times indicate that the entanglement threshold can be reached in current experiments.
eng_Latn
3,862
Papers presented at the fourth annual meeting of the Society for Medical Decision Making are discussed in the context of a review of the rapidly evolving interdisciplinary field of medical decision making (MDM). Advocates claim that probabilistic MDM techniques will incorporate diagnostic information, treatment options and outcomes, patient preferences, societal ethics, and financial considerations into a rational framework for making decisions in the face of uncertainty.
This article highlights some observations made in the American Occupational Therapy Association/American Occupational Therapy Foundation Clinical Reasoning Study, an ethnographic study of 14 occupational therapists working in a large teaching hospital. Concepts and premises that frequently appear in the clinical reasoning in medicine literature are discussed and compared and contrasted to observations and interpretations made of the practice and reasoning strategies of the occupational therapists who were participants in the Clinical Reasoning Study. It is postulated that similarities in the reasoning strategies of the members of the two professions are a result of use of the scientific model that calls for hypothetical reasoning. Differences, it is proposed, are accounted for by the difference in the particular focus, goals, and tasks of the two professions and the nature of the practice in those arenas. Five hypotheses are proposed as questions for further research in clinical reasoning in occupational therapy.
Perfect Quantum Cloning Machines (QCM) would allow to use quantum nonlocality for arbitrary fast signaling. However perfect QCM cannot exist. We derive a bound on the fidelity of QCM compatible with the no-signaling constraint. This bound equals the fidelity of the Bu\v{z}ek-Hillery QCM.
eng_Latn
3,868
Electrochemical Machining (ECM) is widely used in machining a variety of components used in aerospace, automotive, defense, and medical applications. Due to low machining accuracy, ECM is yet to become a best alternative process. This paper presents an experimental investigation of PECM parameters such as voltage, feed rate, pulse on time, and duty cycle on MRR. Keeping pressure constant, Taguchi's orthogonal array L9 has been effectively used to study the effect of independent process parameters. The results show PECM has enhanced MRR. The experimental results were analyzed using the analysis of variance (ANOVA) method and by plotting various graphs.
This paper presents the concept and prototype of a computer aided engineering (CAE) system that can be used to solve different task of electrochemical machining (ECM), such as: tool-electrode design, selection of optimal machining variant and input machining parameters optimization. The system uses computer simulation software that was developed for various kinds of ECM operations like: electrochemical (EC) sinking, EC milling, EC smoothing, ECM-CNC with a universal electrode and numerically controlled electrode movement, etc. The results of computer simulation of different ECM processes and results of experimental verifications are also presented in the paper.
Perfect Quantum Cloning Machines (QCM) would allow to use quantum nonlocality for arbitrary fast signaling. However perfect QCM cannot exist. We derive a bound on the fidelity of QCM compatible with the no-signaling constraint. This bound equals the fidelity of the Bu\v{z}ek-Hillery QCM.
eng_Latn
3,879
The purpose of this work is to investigate the material and device properties of GST-based PCM by studying relaxation oscillations [1, 2]. Our experimental results relate oscillation characteristics to applied voltage, load resistance and device thickness.
We survey progress in the PCM field over the past five years, ranging from large-scale PCM demonstrations to materials improvements for high–temperature retention and faster switching. Both materials and new cell designs that support lower-power switching are discussed, as well as higher reliability for long cycling endurance. Two paths towards higher density are discussed: through 3D integration by the combination of PCM and 3D-capable access devices, and through multiple bits per cell, by understanding and managing resistance drift caused by structural relaxation of the amorphous phase. We also briefly survey work in the nascent field of brain-inspired neuromorphic systems that use PCM to implement non-Von Neumann computing.
Perfect Quantum Cloning Machines (QCM) would allow to use quantum nonlocality for arbitrary fast signaling. However perfect QCM cannot exist. We derive a bound on the fidelity of QCM compatible with the no-signaling constraint. This bound equals the fidelity of the Bu\v{z}ek-Hillery QCM.
eng_Latn
3,880
Quantum simulations of a fiber squeezing experiment
We report on the excellent agreement of a first-principles, quantum dynamical simulation with the experimentally measured results of a fiber squeezer using intense, ultra-short laser pulses.
In applying reinforcement learning to continuous space problems, discretization or redefinition of the learning space can be a promising approach. Several methods and algorithms have been introduced to learning agents to respond to this problem. In our previous study, we introduced an FCCM clustering technique into Q-learning (called QL-FCCM) and its transfer learning in the Markov process. Since we could not respond to complicated environments like a non-Markov process, in this study, we propose a method in which an agent updates its Q-table by changing the trade-off ratio, Q-learning and QL-FCCM, based on the damping ratio. We conducted numerical experiments of the single pendulum standing problem and our model resulted in a smooth learning process.
eng_Latn
3,891
S-Matrix for AdS from General Boundary QFT
The General Boundary Formulation (GBF) is a new framework for studying quantum theories. After concise overviews of the GBF and Schrödinger-Feynman quantization, we apply the GBF to resolve a well-known problem on anti-de Sitter spacetime where, due to the lack of temporally asymptotic free states, the usual S-matrix cannot be defined. We construct a different type of S-matrix plus propagators for free and interacting real Klein-Gordon theory.
We consider the throughput performance of ARQ in interfering channels, where the signal of interest as well as the interferers are subject to independent distributed Nakagami-m block fading. The key contribution is the derivation of closed-form expressions for the rate-maximized throughput. For this purpose, we employ the powerful parameterization approach from [1], allowing the problem to be solved exactly in closed form. We also consider the scaled-power, and the interference-limited, case.
eng_Latn
3,894
Quantum Monte Carlo Methods for Strongly Correlated Electron Systems
We review some of the recent development in quantum Monte Carlo (QMC) methods for models of strongly correlated electron systems. QMC is a promising general theoretical tool to study many-body systems, and has been widely applied in areas spanning condensed-matter, high-energy, and nuclear physics. Recent progress has included two new methods, the ground-state and finite-temperature constrained path Monte Carlo methods. These methods significantly improve the capability of numerical approaches to lattice models of correlated electron systems. They allow calculations without any decay of the sign, making possible calculations for large system sizes and low temperatures. The methods are approximate. Benchmark calculations show that accurate results on energy and correlation functions can be obtained. This chapter gives a pedagogical introduction to quantum Monte Carlo, with a focus on the constrained path Monte Carlo methods.
A geometric approach in the design of codebooks for OR frequency-hopping multiple-access (FHMA) channels is developed by treating a signal matrix as a finite set of distinct points. The relationship among the parameters of an interference-free j-distinguishable-point codebook, j ≥ 1, is established by using coordinate-free arguments. Geometry induced by such a codebook is characterized, and the design of a well-structured j-distinguishable-point codebook is related to a block design problem. It is shown that a well-structured 1-distinguishable-point codebook implies the axiom system of a finite affine plane.
eng_Latn
3,897
Non-Markovian quantum state diffusion for an open quantum system in fermionic environments
Non-Markovian quantum state diffusion (NMQSD) provides a powerful approach to the dynamics of an open quantum system in bosonic environments. Here we develop an NMQSD method to study the open quantum system in fermionic environments. This problem involves anticommutative noise functions (i.e., Grassmann variables) that are intrinsically different from the noise functions of bosonic baths. We obtain the NMQSD equation for quantum states of the system and the non-Markovian master equation. Moreover, we apply this NMQSD method to single and double quantum-dot systems.
In this paper, a linear-quadratic leader-follower (LQLF) differential game is considered, where the game system is governed by a mean-field stochastic differential equation (MF-SDE). By the stochastic maximum principle, the optimal solution to the leader-follower stochastic differential game is expressed in feedback form of the state and its mean, with the aid of two systems of Riccati equations.
eng_Latn
3,899
Verifying cross-Kerr induced number squeezing: a case study
Abstract We analyse an experimental method for creating interesting nonclassical states by processing the entanglement generated when two large coherent states interact in a cross-Kerr medium. We specifically investigate the effects of loss and noise in every mode of the experiment, as well as the effect of ‘binning’ the post-selection outcomes. Even with these imperfections, we find an optimal set of currently achievable parameters which would allow a proof-of-principle demonstration of number squeezing in states with large mean photon number. We discuss other useful states which can be generated with the same experimental tools, including a class of states which contain coherent superpositions of differing photon numbers, e.g. good approximations to the state (1/√2)(|0⟩+|20⟩). Finally, we suggest one possible application of this state in the field of optomechanics.
Given a pair of simple dimension groups with isomorphic state spaces, we try to express it as the pair K_0(A), K_0(A ⋊_α R) of K_0-groups for a C*-algebra A with an action α of R, where both A and the crossed product A ⋊_α R are supposed to be simple AT algebras of real rank zero. We solve this when the state spaces are finite-dimensional.
eng_Latn
3,902
Generating Entangled Photons from the Vacuum by Accelerated Measurements: Quantum Information Theory Meets the Unruh-Davies Effect
We consider quantum communications between accelerated and stationary observers. We find that the projective measurement by a uniformly accelerated observer can excite real particles from the vacuum in the inertial frame, even if no additional particles are created by the measurement process in the accelerating frame. Furthermore, we show that the particles created by this accelerating measurement can be highly entangled in the inertial frame, and it is also possible to use this process to generate even maximally entangled two-qubit states by a certain arrangement of measurements. As a by-product of our analysis, we also show that a single qubit of information can be perfectly transmitted from the accelerating observer to the inertial one. In principle, such an effect could be exploited in designing an entangled-state generator for quantum communication.
Abstract In this paper, we study the long-time behavior of solutions for a non-autonomous strongly damped wave equation. We first prove the existence of a uniform attractor for the equation with a translation compact driving force and then obtain an upper estimate for the Kolmogorov e -entropy of the uniform attractor. Finally we obtain an upper bound of the fractal dimension of the uniform attractor with quasiperiodic force.
eng_Latn
3,904
On the Complexity of Quantum Languages
The standard inputs given to a quantum machine are classical binary strings. In this view, any quantum complexity class is a collection of subsets of {0, 1}*. However, a quantum machine can also accept quantum states as its input. T. Yamakami has introduced a general framework for quantum operators and inputs [18]. In this paper we present several quantum languages within this model and, by generalizing the complexity classes QMA and QCMA, we analyze the complexity of the introduced languages. We also discuss how to derive a classical language from a given quantum language and, as a result, we introduce new QCMA- and QMA-languages.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,909
Source for macroscopic entangled states
Abstract A new scheme to conditionally produce two types of macroscopic entangled states is developed. The studied system consists of a system of coupled down-converters with type-I phase matching pumped simultaneously by powerful optical fields in coherent states, one auxiliary photon in a superposition state of two input modes, and a projective measurement system used in the output generated modes. The given scheme works without photon-number-resolving detection. Moreover, an analysis of the “separation” between components of the entangled states and the influence of detector inefficiency on the fidelity of the scheme is accomplished.
Abstract The new interest of a study of the incoherent γ-ray scattering on bound electrons is pointed out. An experimental set-up is described; it allows angular- and energy-distribution measurements on germanium K-shell electrons. Performances and preliminary results obtained with an incoming energy of 662 keV are given.
eng_Latn
3,951
Measurement and the Quantum World
The quantum theory of measurement and its development are explained. The basis for viewing measurement in quantum mechanics as a genuine physical, rather than psychophysical process or a process that depends on consciousness or the mental, is given and defended against the critiques of Bell and others. The notion of quantum measurement as the actualization of quantum potentiality as grounded in the related versions advocated by Heisenberg and Shimony is explicated in the context of the theory of positive-operator-valued measures. Quantum interference is discussed as a process of interference of quantum potentialities in contradistinction to the interference of some material substance or the interference of probability waves. This provides the basis for a valid realist interpretation of quantum theory.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,952
Quantum statistical inference for density estimation
A new penalized likelihood method for non-parametric density estimation is proposed, which is based on a mathematical analogy to quantum statistical physics. The mathematical procedure for density estimation is related to maximum entropy methods for inverse problems; the penalty function is a convex information divergence enforcing global smoothing toward default models, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing may be enforced by constraints on the expectation values of differential operators. Although the hyperparameters, covariance, and linear response to perturbations can be estimated by a variety of statistical methods, we develop the Bayesian interpretation. The linear response of the MAP estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood. The method is demonstrated on standard data sets.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,957
Is the Preferred Basis selected by the environment?
We show that in a quantum measurement, the preferred basis is determined by the interaction between the apparatus and the quantum system, instead of by the environment. This interaction entangles three degrees of freedom: one system degree of freedom we are interested in, which is preserved by the interaction; one system degree of freedom that carries the change due to the interaction; and the apparatus degree of freedom, which is always ignored. Considering all three degrees of freedom, the composite state only has one decomposition, and this guarantees that the apparatus would end up in the expected preferred basis of our daily experiences. We also point out some problems with the environment-induced super-selection (Einselection) solution to the preferred basis problem, and clarify a common misunderstanding of environmental decoherence and the preferred basis problem.
The paper introduces the implementation conditions for cold recycling of the flexible base on the Ying-Da Line. It expounds overall the investigation of the road conditions of the original pavement, the indoor trial study, the determination of the trial-lot plan, as well as the completion of the trial lot.
eng_Latn
3,969
Selective correlations in finite quantum systems and the Desargues property
The Desargues property is well known in the context of projective geometry. An analogous property is presented in the context of both classical and Quantum Physics. In a classical context, the Desargues property implies that two logical circuits with the same input, show in their outputs selective correlations. In general their outputs are uncorrelated, but if the output of one has a particular value, then the output of the other has another particular value. In a quantum context, the Desargues property implies that two experiments each of which involves two successive projective measurements, have selective correlations. For a particular set of projectors, if in one experiment the second measurement does not change the output of the first measurement, then the same is true in the other experiment.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,977
Modeling quantum information dynamics achieved with time-dependent driven fields in the context of universal quantum processing
Quantum information is a useful resource for setting up information processing. Although the physical components are normally two-level systems, their combination with entangling interactions results in complex dynamics. Previously studied for piecewise field pulses, this work analyzes the modeling of quantum information operations with technologically affordable fields, working towards a universal quantum computation model.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,979
An analysis of quantum cryptography vulnerability by binary merge
In this paper, we note that the design process of quantum cryptography systems inevitably exposes open bit streams of pseudo-random numbers: multiple open channels exist between the parties, and situations arise in which a pair of bit streams must be shared. We apply basic randomness tests to the pseudo-random numbers and examine the randomness of the merged binary bit columns.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,980
Controlling chaos in higher dimensional maps with constant feedback: an analytical approach.
We introduce two methods to control chaos in higher-dimensional discrete maps with constant feedback. It is analytically shown for a general class of function vectors that chaotic attractors can be converted into fixed point attractors. Additionally, a method to choose an appropriate constant feedback is presented. The application of these methods does not require a priori knowledge of system equations, since time series information can be used. Desired periodic orbits can be accessed by varying the constant feedback. As an example, the methods are applied to the Hénon map.
Quantum Mechanics at the Planck scale is considered as a deformation of conventional Quantum Mechanics. Similar to the earlier works of the author, the main object of deformation is the density matrix. On this basis, a notion of the entropy density is introduced; this is a matrix-valued quantity used for a detailed study of the Information Problem in the Universe, and in particular, for the Information Paradox Problem.
eng_Latn
3,990
A spectral characterization for generalized quantum gates
In this article, we prove that a contraction A on a separable Hilbert space is not a generalized quantum gate (or a convex combination of unitaries) if and only if A is semi-Fredholm with ind A ≠ 0 and a compact perturbation of a partial isometry.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that pattern parameters become much more flexible through altering the Kaiser window. The mainlobe width, current distribution, side-lobe ratio are now adjustable. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements but they may highly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization keeps the same interesting similarity properties at several bands.
eng_Latn
3,991
Which book will be more prefer to study quantum mechanics?
What is the best book on Quantum Mechanics?
What are some good books to read to prepare for CAT?
eng_Latn
3,999
Hardware design of an NTT-based polynomial multiplier
Algorithms for quantum computation: discrete logarithms and factoring
Orientation sensitivity to graspable objects: An fMRI adaptation study
eng_Latn
4,178
Quantum Computation by Adiabatic Evolution
Strengths and Weaknesses of Quantum Computing
Group value and intention to use - A study of multi-agency disaster management information systems for public safety
kor_Hang
4,192
Which book is most useful for studying quantum mechanics?
Which book will be more prefer to study quantum mechanics?
What is the best book on quantum mechanics published in 2015?
eng_Latn
4,207
The magnetic design of a ten-period (each period 14 mm) prototype superconducting undulator is reported using RADIA. The results of modelling the magnetic flux density are presented in an analytical formula. The dependence of the field integrals and phase error on the current density and undulator gap has been calculated, and temperature curves are determined for the models and are compared with earlier reported Moser-Rossmanith fits.
In this paper, we study the magnetic field measurement of U20 undulator. The U20 is a prototype hybrid structure of NdFeB magnets and cobalt steel poles. The hybrid device is made up of 25 periods with a period length of 20-mm each and is designed to deliver magnetic flux density (in rms) from 2400G to 500G in the 10–20-mm gap range. The validity of the theoretical and analytical formulas is analyzed through the measured field and phase integrals.
We prove that groups acting geometrically on delta-quasiconvex spaces contain no essential Baumslag-Solitar quotients as subgroups. This implies that they are translation discrete, meaning that the translation numbers of their nontorsion elements are bounded away from zero.
eng_Latn
4,250
The paper aims to analyze the electromagnetic field components in the near field of a high frequency (HF) emission antenna. The typical case of a dipole antenna emitting Near Vertical Incident Skywave (NVIS) HF signals at two emitting frequencies (3.6 MHz and 7.1 MHz) and at 100 W input power is being discussed. The measurements have highlighted the places around the antenna where the field levels are high, from the perspective of safe exposure to electromagnetic radiation. Thus, it was found that on a line situated just beneath and parallel to the emitting antenna, the electric field strength may exceed the safety limit recommended for the population by ICNIRP guidelines. The magnetic field component however was lower than the limit in all considered cases.
High frequency data transmission over short distances (few tens-few hundreds of km) is a valuable option for communication links in special/critical situations. Present experimental study of noise level and channel availability for digital multi-carrier modulated signals in the range 2.5–7.6MHz showed that reliable transmission is possible when signal to noise ratio exceeds 14dB and when receiving location is well characterized empirically prior to link deployment.
The Superconducting Magnet Division at Brookhaven National Laboratory (BNL) is making and testing 20 insertion region dipoles for the Large Hadron Collider (LHC) at CERN. These 9.45 m-long, 8 cm aperture magnets have the same coil design as the arc dipoles now operating in the Relativistic Heavy Ion Collider (RHIC) at BNL and will be of single and twin aperture cold mass configurations. They will produce fields up to 4.14 T for operation at 7.56 TeV. The magnets will be tested at 4.5 K using either forced flow supercritical helium or liquid helium. This paper reports the results of tests of four D1 magnets, including spontaneous quench performance, verification of quench protection heater operation, and magnetic field quality.
eng_Latn
4,275
Dynamo bifurcations in an array of driven convectionlike rolls
The bifurcations in a three-dimensional incompressible, electrically conducting fluid with an external forcing of the Roberts type have been studied numerically. The corresponding flow can serve as a model for the convection in the outer core of the Earth and is realized in an ongoing laboratory experiment aimed at demonstrating a dynamo effect. The symmetry group of the problem has been determined and special attention has been paid to symmetry breaking by the bifurcations. The nonmagnetic, steady Roberts flow loses stability to a steady magnetic state, which in turn is subject to secondary bifurcations. The secondary solution branches have been traced until they end up in chaotic states.
Abstract A very promising spin physics programme will be soon on the way at the BNL Relativistic Heavy Ion Collider (RHIC). By studying the spin asymmetries for various processes (single photon, single jet and W ± production), we will compare the different predictions obtained using some sets of polarized parton distributions, available in the recent literature. We will put some emphasise on the analysis of the anticipated errors, given the event rates expected from this high luminosity new machine and the current acceptance for the detector systems at RHIC.
eng_Latn
4,279
Influence of an electric field on self-diffraction of light waves in titanosillenite
A titanosillenite crystal is used as an example to demonstrate a strong dependence of the characteristics of energy exchange by self-diffraction of light in photorefractive crystals on the initial polarizations of the interacting waves and on the presence of an external electric field.
In support of the TCV experimental campaign aiming at studying H-mode plasmas with snowflake (SF) divertor, free boundary equilibrium and stability studies were performed with the SPIDER and KINX codes. Due to the high flexibility of plasma shaping capabilities of TCV, SF divertor conditions can be reached for various plasma geometries. However, at high plasma current some configurations require poloidal field (PF) coil currents close to the machine limit. This is particularly important when the equilibrium sensitivity to the edge pedestal profiles, which is higher than for standard X-point configurations, is taken into account. That is why the configuration optimization should also include the profile sensitivity study when planning the shot scenario.
eng_Latn
4,282
Plan power supply for static rectifiers
Static rectifiers are widely used for HVDC conversion and in industrial plants. Rectifiers disturb a power system by generating harmonic-frequency voltages and currents which affect power system equipment in many ways. The planning required so that a rectifier plant will have the least detrimental effect on a power system is discussed. Recommendations for total harmonic voltage are given. (LCL)
In support of the TCV experimental campaign aiming at studying H-mode plasmas with snowflake (SF) divertor, free boundary equilibrium and stability studies were performed with the SPIDER and KINX codes. Due to the high flexibility of plasma shaping capabilities of TCV, SF divertor conditions can be reached for various plasma geometries. However, at high plasma current some configurations require poloidal field (PF) coil currents close to the machine limit. This is particularly important when the equilibrium sensitivity to the edge pedestal profiles, which is higher than for standard X-point configurations, is taken into account. That is why the configuration optimization should also include the profile sensitivity study when planning the shot scenario.
eng_Latn
4,313
130 MM aperture quadrupoles for the LHC luminosity upgrade
Several studies for the LHC luminosity upgrade pointed out the need for low-beta quadrupoles with apertures larger than the present baseline (70 mm). In this paper we focus on the design issues of a 130 mm aperture quadrupole. We first consider the Nb-Ti option, presenting a magnetic design with the LHC dipole and quadrupole cables. We study the electromagnetic forces and we discuss the field quality constraints. For the Nb3Sn option, we sketch three designs, two based on the LARP 10 mm width cable, and one on a larger cable with the same strand. The issue of the stress induced by the e.m. forces, which is critical for the Nb3Sn, is discussed using both scaling laws and finite element models.
4 pages. PACS numbers: 05.45.Xt, 87.10.+e. ArXiv preprint: http://arxiv.org/abs/nlin.CD/0512009. Final full-text version of the paper available at: http://dx.doi.org/10.1103/PhysRevE.73.055202.
eng_Latn
4,316
Transport of fast particles in turbulent fields
In light of planned ITER operation, it is important to improve understanding of fast particle behavior. This work focuses on the influence of both electrostatic and electromagnetic background microturbulent fields by studying passive fast ions in gyrokinetic turbulence simulations. Since magnetic fluctuations are the primary source of fast particle diffusion at high energies, the threshold values are investigated where this regime starts to dominate over the effects of electrostatic turbulence. Comparisons with analytical theory are presented.
This paper deals with the development of a mathematical model for very high power density actively shielded air-core superconducting (SC) machines. The interacting forces of SC coils in the actively shielded SC machine are studied using a two-dimensional analytical approach. The transfer relation methodology is employed to analyze the fully air-core SC motor that is designed to reduce the weight of the machine with actively shielded coil stator. Magnetic flux and force density distributions obtained by the proposed method are compared with those obtained from finite element analyses. The results can be considered as elements of a library of tools leading toward efficient optimization and mechanical design of actively shielded SC machines.
eng_Latn
4,329
A chain rule in \(L^{1}\left({\operatorname*{div};\Omega}\right)\) and its applications to lower semicontinuity
A chain rule in the space \(L^{1}\left(\operatorname*{div};\Omega\right) \) is obtained under weak regularity conditions. This chain rule has important applications in the study of lower semicontinuity problems for general functionals of the form \(\int_{\Omega}f(x,u,\nabla u) dx\) with respect to strong convergence in \(L^{1}\left(\Omega\right) \) . Classical results of Serrin and of De Giorgi, Buttazzo and Dal Maso are extended and generalized.
As part of the LANL/VNIIEF collaboration, a high-velocity cylindrical-liner-driven Hugoniot experiment is being designed to be driven by a VNIIEF Disk Explosive Magnetic (flux compression) Generator (DEMG). Several variations in drive current and liner thickness have been proposed. This presentation will describe the LANL 1D and 2D simulations used to evaluate those designs. The presentation will also propose an analysis technique to assess a high-current drive system's ability to stably and optimally drive a cylindrical aluminum liner for this type of experiment.
eng_Latn
4,335
Optimization of the snowflake diverted equilibria in the TCV tokamak
In support of the TCV experimental campaign aiming at studying H-mode plasmas with snowflake (SF) divertor, free boundary equilibrium and stability studies were performed with the SPIDER and KINX codes. Due to the high flexibility of plasma shaping capabilities of TCV, SF divertor conditions can be reached for various plasma geometries. However, at high plasma current some configurations require poloidal field (PF) coil currents close to the machine limit. This is particularly important when the equilibrium sensitivity to the edge pedestal profiles, which is higher than for standard X-point configurations, is taken into account. That is why the configuration optimization should also include the profile sensitivity study when planning the shot scenario.
Conditions are derived for the construction of total variation diminishing difference schemes with multi-point support. These conditions, which are proved for explicit, implicit, and semi-discrete schemes, correspond in a general sense to the introduction of upwind biasing.
eng_Latn
4,339
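The total variation diminishing (TVD) condition described in the abstract above can be illustrated with a minimal sketch. The minmod-limited upwind scheme below for linear advection is a generic textbook example, not one of the multi-point schemes from the paper, and all function names are my own:

```python
import numpy as np

def minmod(a, b):
    # Minmod limiter: 0 where the neighboring slopes disagree in sign,
    # otherwise the smaller-magnitude slope.
    return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def tvd_step(u, c):
    """One limited upwind step for u_t + a u_x = 0 (a > 0) on a periodic grid,
    with Courant number c = a*dt/dx in (0, 1]."""
    du_left = u - np.roll(u, 1)
    du_right = np.roll(u, -1) - u
    slope = minmod(du_left, du_right)
    # Interface value u_{i+1/2} reconstructed from the upwind (left) cell.
    u_face = u + 0.5 * (1.0 - c) * slope
    flux = c * u_face
    return u - (flux - np.roll(flux, 1))

def total_variation(u):
    return np.sum(np.abs(u - np.roll(u, 1)))
```

Stepping a square pulse forward, the total variation never grows, which is exactly the TVD property the abstract's conditions are designed to guarantee.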
Dust-Acoustic Shock Waves in a Self-Gravitating Opposite Polarity Dusty Plasmas With Trapped Ions
The basic characteristics of dust-acoustic (DA) shock waves (DASHWs) in self-gravitating dusty plasmas containing massive dust of opposite polarity, trapped ions, and Boltzmann electrons have been studied. The reductive perturbation technique has been employed to derive the standard modified Burgers equation (mBE). The basic properties (viz., amplitude, width, and speed) of small- but finite-amplitude DASHWs are significantly modified by the combined effects of the positively and negatively charged dust components, the self-gravitational force, and trapped ions. The implications of our investigation can be very useful for understanding and studying the nonlinear characteristics of DA waves (DAWs) in laboratory and space dusty plasmas.
In this paper, a study of a converter targeting very high efficiency, in the 99% range, is described. We have proposed a new soft-switching boost-type chopper based on snubber-assisted zero-voltage and zero-current transition (SAZZ), with a SiC Schottky diode as the output diode. An output power of 8 kW was obtained at an efficiency of 98.96%. The loss breakdown evaluation of the SiC-SAZZ is discussed.
eng_Latn
4,340
Giant sawtooth oscillations in the Doublet III tokamak
Large-amplitude sawtooth oscillations have been observed in the soft-X-ray emission, central electron temperature, neutron production rate, radiated power, Dα emission, fast-neutral flux and one-turn voltage during beam-heating experiments in the Doublet III tokamak. The necessary conditions for the appearance of giant sawteeth seem to be operation at low q_a and high β_T. Many features of the giant sawteeth have been simulated with a transport code to which a sawtooth mixing model and several diagnostic models were added. Substantial redistribution of particles and energy evidently occurs over the plasma interior. Global energy confinement, however, is only moderately degraded by the sawteeth.
This study investigates the effects of the Pierson-Moskowitz and JONSWAP spectra, which are typical irregular wave spectra, on a wind turbine system with a jacket support structure. Various offshore environmental parameters based on Korean local conditions were also used in our study. The loads acting on the system were considered by referring to the Design Load Cases from the IEC guideline, and the improved von Karman model was used as the turbulence model. As a result, various significant wave heights and peak spectral periods cause noticeable differences in extreme and fatigue load predictions.
eng_Latn
4,351
Electric and magnetic fields of 50 Hz are not associated with sudden infant death syndrome
Sudden infant death syndrome (SIDS) is a multifactorial condition. Power frequency magnetic fields have been implicated in SIDS. Through the use of a case-control study measuring 50 Hz electric and magnetic fields at the SIDS baby's last head position, no association could be found between SIDS and either electric (p = 0.327) or magnetic (p = 0.827) 50 Hz fields.
In support of the TCV experimental campaign aiming at studying H-mode plasmas with snowflake (SF) divertor, free boundary equilibrium and stability studies were performed with the SPIDER and KINX codes. Due to the high flexibility of plasma shaping capabilities of TCV, SF divertor conditions can be reached for various plasma geometries. However, at high plasma current some configurations require poloidal field (PF) coil currents close to the machine limit. This is particularly important when the equilibrium sensitivity to the edge pedestal profiles, which is higher than for standard X-point configurations, is taken into account. That is why the configuration optimization should also include the profile sensitivity study when planning the shot scenario.
eng_Latn
4,355
Bifurcation of nonlinear elliptic system from the first eigenvalue, Electron
We study the following bifurcation problem in a bounded domain Ω in ℝ^N: −Δ_p u = |u|^α |v|^β v + f(x, u, v, λ) in Ω, −Δ_q v = |u|^α |v|^β u + g(x, u, v, λ) in Ω, with (u, v) ∈ W_0^{1,p}(Ω) × W_0^{1,q}(Ω). We prove that the principal eigenvalue λ_1 of the following eigenvalue problem
It is shown that a finite-amplitude ion-acoustic wave in a uniform magneto- plasma can enhance two-dimensional plasma vortices. The latter results from a modulational instability. The growth rates are obtained analytically.
eng_Latn
4,371
Features of electron density and temperature in the 500–3500 km region of the plasmasphere—(1) Dawn and dusk sectors
Dawn-dusk features of the plasmasphere are examined for intervals in February and September 1969, using electrostatic probe data of N_e and T_e from the ISIS-I satellite. Clear plasmatrough formation is seen in the vicinity of 70° geomagnetic latitude in both dawn and dusk sectors in the 1500–3500 km region, but the plasmatrough is absent in the altitude range 500–1500 km. The plasmatrough minimum near 70°φ exhibits no asymmetry between dawn and dusk sectors in its latitudinal position. The T_e peak associated with the plasmatrough is more pronounced in the dawn sector. Dawn N_e is less than dusk N_e, but dawn T_e exceeds dusk T_e. The influence of processes in the magnetosphere in causing these features is examined.
A simplified treatment is proposed to study quantitatively the lattice dynamics of CsK, CsRb, and RbK alloy systems. The volume effect on the lattice dynamics of the pure constituent is considered, and the phonon dispersion relations of the local and band modes are obtained for the Rb0.71K0.29, Rb0.3K0.7, Cs0.7K0.3, Cs0.7Rb0.3, and Cs0.3Rb0.7 systems. Then, the x-dependence of the local and band mode frequencies is calculated for the Rb1−xKx, Cs1−xKx and Cs1−xRbx systems.
eng_Latn
4,381
SADE: The starspot and dynamo explorer
Abstract We propose a mission called SADE, the Starspot And Dynamo Explorer, to study dynamo activity in nearby late-type stars. The onboard instruments will be a Ca-K telescope for magnetically dominated chromospheric emission, and an X-ray grazing-incidence telescope to study coronal emission. We design the mission for a lifetime of 15 years or longer to capture a full activity cycle for most solar-type stars. We aim to firmly establish the spectrum of the relation between chromospheric and coronal emission in late-type stars, and capture one or more stars going into or coming out of a Maunder-type minimum. Operation costs will be kept to a minimum by automating mission operations to a maximum, and having the science operations carried out by students at Montana State University.
This paper presents the recent development of the pulsed power technology based on inductive energy storage (including Superconducting Magnetic Energy Storage, SMES) and its opening switch. It also introduces several circuit topologies with power electronics/superconducting opening switch.
eng_Latn
4,388
Constrained space-time zero-forcing pre-equalizer for the downlink channel of UMTS-TDD
The great diversity of services expected to be delivered by third generation mobile radio systems will impose severe operating conditions on the mobile terminal in terms of computational requirements and power consumption. Therefore, we propose to move the most demanding signal processing tasks, usually performed by the mobile unit, to the base station. This technique is developed for a UMTS-TDD downlink scenario through an equalizer synthesis method based on the redundancy between non-overlapping bands of a direct sequence spread spectrum (DS-SS) signal, with the design optimised for minimum power transmission under the zero-forcing criterion.
Abstract In this paper, we study the long-time behavior of solutions for a non-autonomous strongly damped wave equation. We first prove the existence of a uniform attractor for the equation with a translation compact driving force and then obtain an upper estimate for the Kolmogorov ε-entropy of the uniform attractor. Finally we obtain an upper bound of the fractal dimension of the uniform attractor with quasiperiodic force.
eng_Latn
4,421
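As a toy illustration of the pre-equalization idea above — moving equalization to the base station so the mobile terminal sees clean symbols — here is a minimal narrowband zero-forcing precoder sketch. It deliberately ignores the space-time structure, power constraints, and DS-SS band redundancy of the actual UMTS-TDD design; all names are illustrative:

```python
import numpy as np

def zf_precoder(H):
    """Right pseudo-inverse of the channel matrix H (n_rx x n_tx, n_rx <= n_tx),
    chosen so that H @ W = I and the receiver observes the symbols directly."""
    return H.conj().T @ np.linalg.inv(H @ H.conj().T)

def precode_and_transmit(H, symbols):
    """Precode at the transmitter; the channel then undoes the distortion
    (noise-free sketch of the zero-forcing criterion)."""
    W = zf_precoder(H)
    x = W @ symbols          # transmitted signal
    return H @ x             # what the receiver sees
```

Because H @ W = I by construction, the receiver needs no equalizer of its own, which is exactly the computational off-loading the abstract motivates.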
A power penalty method for linear complementarity problems
We propose a power penalty approach to a linear complementarity problem (LCP) in R^n based on approximating the LCP by a nonlinear equation. We prove that the solution to this equation converges to that of the LCP at an exponential rate when the penalty parameter tends to infinity.
Summary form only given, as follows. A study is presented of precision constraints imposed by a hybrid chip architecture with analog neurons and digital backpropagation calculations. Conversions between the analog and digital domains and weight storage restrictions impose precision limits on both analog and digital calculations. It is shown through simulations that a learning system of this nature can be implemented in spite of limited resolution in the analog circuits and using fixed-point arithmetic to implement the backpropagation algorithm.
eng_Latn
4,423
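The penalty idea in the abstract above can be sketched numerically. The variant below uses a linear (k = 1) penalty term λ·max(−x, 0) and a semismooth Newton solve; the exact penalized equation and power exponent in the paper may differ, so treat this as an illustrative sketch rather than the paper's method:

```python
import numpy as np

def penalized_lcp(M, q, lam, tol=1e-12, max_iter=100):
    """Approximate the LCP  x >= 0, Mx + q >= 0, x.(Mx + q) = 0  by solving
    the penalized nonlinear equation  Mx + q - lam * max(-x, 0) = 0."""
    n = len(q)
    x = np.zeros(n)
    for _ in range(max_iter):
        neg = x < 0
        F = M @ x + q - lam * np.where(neg, -x, 0.0)
        if np.linalg.norm(F) < tol:
            break
        # Generalized Jacobian: the penalty contributes lam on negative components.
        J = M + lam * np.diag(neg.astype(float))
        x = x - np.linalg.solve(J, F)
    return x
```

As λ grows, the negative components of the penalized solution are driven toward zero, mirroring the convergence result stated in the abstract.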
Information Aggregation in Probabilistic Prediction
Probabilistic prediction generally involves the consideration of information from many different sources, and this information must be aggregated to determine a single probability (or probability distribution). This paper is concerned with the aggregation process, and although some aspects of the paper are new, much of the paper is tutorial in nature. Models of the aggregation process are discussed, with particular emphasis on the question of the conditional dependence of information, and measures of the redundancy of information are developed. In addition, a review of previous experiments concerning the aggregation process is given, along with suggestions for experiments that should provide additional insight into the nature and "efficiency" of this process. In view of the importance of probabilistic prediction in inferential and decision-making situations, additional investigation and experimentation concerning the aggregation process should be of considerable value.
Summary form only given, as follows. A study is presented of precision constraints imposed by a hybrid chip architecture with analog neurons and digital backpropagation calculations. Conversions between the analog and digital domains and weight storage restrictions impose precision limits on both analog and digital calculations. It is shown through simulations that a learning system of this nature can be implemented in spite of limited resolution in the analog circuits and using fixed-point arithmetic to implement the backpropagation algorithm.
eng_Latn
4,424
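Two classic aggregation rules from this literature — generic illustrations, not formulas taken from the paper — can be sketched as follows; the weights and names are mine:

```python
import numpy as np

def linear_pool(probs, weights):
    """Linear opinion pool: weighted arithmetic average of probability vectors."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return w @ np.asarray(probs, dtype=float)

def log_pool(probs, weights):
    """Logarithmic opinion pool: weighted geometric average, renormalized.
    Penalizes disagreement more strongly than the linear pool."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    g = np.prod(np.asarray(probs, dtype=float) ** w[:, None], axis=0)
    return g / g.sum()
```

The choice between the two pools encodes an assumption about conditional dependence among sources, which is exactly the question the paper emphasizes.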
Hardware-efficient belief propagation
Fast approximate energy minimization via graph cuts
Basic principles of information protection: A. Considerations surrounding the study of …
eng_Latn
4,434
Importance sampling evaluation of digital phase detectors based on extended Kalman-Bucy filters
This paper proposes an importance sampling methodology for the performance evaluation of a class of open-loop receivers with random carrier phase tracking in additive white Gaussian noise channels. The receivers, consisting of a bank of extended Kalman-Bucy filters and a decision algorithm based on the filters' innovations processes, perform symbol-by-symbol phase detection while keeping track of the random phase process within the symbol interval. We use a large deviations approach to start a stochastic importance sampling optimization, both for the irreducible error floor and for the general noisy operation of the receiver. Our simulations show a practical coincidence with conventional Monte Carlo results, with considerable simulation time gains.
The generation and behavior of the fractal Koch array factor from a Kaiser window generator is studied. The main advantage of using Kaiser windows is that the pattern parameters become much more flexible: the mainlobe width, current distribution and side-lobe ratio all become adjustable through altering the window. Different reduced array structures can be obtained by using different threshold levels. Higher threshold values result in a highly reduced number of elements, but they may strongly distort the pattern and, hence, the multiband behavior. Finally, we study the effect of quantization of the feeding values. Quantization is necessary for implementation and simplification purposes. Several configurations of current distributions with the corresponding patterns are illustrated for different quantization levels. It is shown that moderate quantization preserves the same interesting similarity properties at several bands.
eng_Latn
4,445
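The mean-shift (exponential tilt) trick that a large-deviations analysis suggests can be shown on the simplest possible case — a Gaussian tail probability — rather than the full receiver simulation of the abstract above; everything here is a generic illustration with made-up names:

```python
import numpy as np

def is_tail_probability(a, n_samples, rng):
    """Estimate P(X > a) for X ~ N(0, 1) by importance sampling: draw from the
    tilted density N(a, 1) (the large-deviations-optimal mean shift) and
    reweight by the likelihood ratio."""
    y = rng.normal(loc=a, scale=1.0, size=n_samples)
    # Likelihood ratio phi(y) / phi(y - a) = exp(-a*y + a^2 / 2)
    w = np.exp(-a * y + 0.5 * a * a)
    return np.mean((y > a) * w)
```

A plain Monte Carlo estimate of a 1e-5 event needs millions of samples for even one hit; the tilted estimator is accurate with a small fraction of that, which is the "considerable simulation time gain" the abstract reports for its receiver.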
A lack of security metrics signifies that it is not possible to measure the success of security policies, mechanisms and implementations, and security cannot, in turn, be improved if it cannot be measured. The importance of the use of metrics to obtain security quality is thus widely accepted. However, the definition of security metrics concerns a discipline which is still in its first stages of development, meaning that few documented resources or works centring on this subject exist to date. In this paper we shall therefore study the latest existing models with which to define security metrics and their components as aspects that have a bearing on the quality of software products with the intention that this will serve as a basis for continued advancement in research into this area of knowledge.
Software security metrics are measurements used to assess security-related imperfections (or perfections) introduced during software development. A number of security metrics have been proposed, but not all perspectives of a software system have received specific attention. While most security metrics evaluate software from a system-level perspective, it can also be useful to analyze defects at a lower level, i.e., at the source code level. To address this issue, we propose some code-level security metrics which can be used to suggest the level of security of a code segment. We provide guidelines about where and how these metrics can be used to improve source code structures. We have also conducted two case studies to demonstrate the applicability of the proposed metrics.
Software security metrics are measurements used to assess security-related imperfections (or perfections) introduced during software development. A number of security metrics have been proposed, but not all perspectives of a software system have received specific attention. While most security metrics evaluate software from a system-level perspective, it can also be useful to analyze defects at a lower level, i.e., at the source code level. To address this issue, we propose some code-level security metrics which can be used to suggest the level of security of a code segment. We provide guidelines about where and how these metrics can be used to improve source code structures. We have also conducted two case studies to demonstrate the applicability of the proposed metrics.
eng_Latn
4,471
Automatic generation of the C# code for security protocols verified with Casper/FDR
Formal methods techniques offer a means of verifying the correctness of the design process used to create a security protocol. Notwithstanding the successful verification of the design of security protocols, the implementation code for them may contain security flaws, due to mistakes made by the programmers or bugs in the programming language itself. We propose an ACG-C# tool, which can be used to automatically generate C# implementation code for a security protocol verified with Casper and FDR. The ACG-C# approach has several distinctive features, namely automatic code generation, secure code, and high confidence. We conduct a case study on the Yahalom security protocol, using ACG-C# to generate the C# implementation code.
This paper introduces the layout, design and realization of the system in particular. It is based on the B/S structure, comprising a browser, a web server and a database server. The key data are encrypted in order to ensure the security of the system. A user who has logged on to the system can manipulate the database, and every operation is logged. The system also has powerful functions, supporting intelligent and fuzzy queries, and a self-contained data backup and restore function.
eng_Latn
4,480
Palindrome Detection Using On-line Position
The purpose of this study is to detect a reverse substring at any position i of the string y: u^R r u is a factor of y, where u^R is the reversal of u and r is a substring lying between u^R and u. We develop an efficient algorithm to detect all reverse repetitions in a string and describe an algorithm for computing the Longest Previous reverse Factor (LPrF) table using the on-line construction of the position-heap data structure. This algorithm runs in linear time on a fixed-size alphabet. For applications in bioinformatics, the LPrF table can be used to detect palindromes and gapped palindromes.
In this paper, we have presented an effective yield improvement methodology that can help manufacturing foundries, fabless and fab-lite companies to identify systematic failures. It uses the physical addresses of failing bits from wafer sort results to overlay them onto inline wafer defect inspection locations. The inline defect patterns, or the design patterns where the overlay results showed matches, were extracted and grouped by feature similarity or cell names. The potentially problematic design patterns can then be obtained and used for design debug and process improvement.
eng_Latn
4,490
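As a concrete (if naive) illustration of the u^R r u factors the abstract above describes, here is a brute-force quadratic search — deliberately not the linear-time position-heap algorithm of the paper; the parameter names are mine:

```python
def gapped_palindromes(y, min_arm=2, max_gap=3):
    """Brute-force search for factors u^R r u of y: a reversed arm u^R, a gap r
    of at most max_gap characters, then the arm u itself.
    Returns (start, arm_length, gap_length) triples."""
    n = len(y)
    hits = []
    for i in range(n):
        for arm in range(min_arm, n):
            if i + 2 * arm > n:        # both arms must fit even with zero gap
                break
            for gap in range(max_gap + 1):
                j = i + arm + gap      # start of the second arm u
                if j + arm > n:
                    break
                u_rev = y[i:i + arm]   # candidate u^R
                u = y[j:j + arm]
                if u_rev == u[::-1]:
                    hits.append((i, arm, gap))
    return hits
```

For DNA-scale inputs this O(n²·max_gap) scan is far too slow, which is precisely why the paper's linear-time LPrF construction matters.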
Buffer Overflow Exploit and Defensive Techniques
Buffer overflow attacks are currently the most common and dangerous attack method, so analysis of the principle of buffer overflows and of buffer overflow exploits is useful. The paper includes a didactic example to illustrate one method of buffer overflow exploitation: by adding a jmp esp instruction into the process space as a springboard, the shellcode is made to execute successfully. Finally, an overview of protecting and defending against buffer overflows is given.
In this paper, we have presented an effective yield improvement methodology that can help manufacturing foundries, fabless and fab-lite companies to identify systematic failures. It uses the physical addresses of failing bits from wafer sort results to overlay them onto inline wafer defect inspection locations. The inline defect patterns, or the design patterns where the overlay results showed matches, were extracted and grouped by feature similarity or cell names. The potentially problematic design patterns can then be obtained and used for design debug and process improvement.
eng_Latn
4,499
Understanding the Evolution of Code Smells by Observing Code Smell Clusters
Code smells are more likely to occur interconnected in software than as single instances. These code smell clusters create maintainability issues in evolving software. This paper aims to understand the evolution of code smells in software by analyzing the behavior of these clusters, such as their size, number and connectivity. For this, the clusters are first identified and then these characteristics are observed. The identification of code smell clusters is performed in three steps: detection of code smells (God Class, Long Method, Feature Envy, Type Checking) using smell detection tools; extraction of their relationships by analyzing the source code architecture; and generation of graphs from the identified smells and their relationships, which finally reveals the smelly clusters. This analysis was executed on JUnit as a case study, and four important cluster behaviors were reported.
In this paper, we have presented an effective yield improvement methodology that can help manufacturing foundries, fabless and fab-lite companies to identify systematic failures. It uses the physical addresses of failing bits from wafer sort results to overlay them onto inline wafer defect inspection locations. The inline defect patterns, or the design patterns where the overlay results showed matches, were extracted and grouped by feature similarity or cell names. The potentially problematic design patterns can then be obtained and used for design debug and process improvement.
eng_Latn
4,500
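The three-step pipeline described above — detect smells, extract their relationships, build a graph whose connected components are the clusters — can be sketched generically; the smell identifiers and relations below are made up for illustration:

```python
from collections import defaultdict, deque

def smell_clusters(smells, relations):
    """Group code-smell instances into clusters: connected components of the
    smell-relationship graph, found by breadth-first search.
    Returns a list of clusters (sets of smell ids)."""
    graph = defaultdict(set)
    for a, b in relations:
        graph[a].add(b)
        graph[b].add(a)
    seen, clusters = set(), []
    for s in smells:
        if s in seen:
            continue
        comp, queue = set(), deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            comp.add(u)
            for v in graph[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        clusters.append(comp)
    return clusters
```

Tracking len(clusters) and the component sizes across releases gives exactly the size, number and connectivity measures the paper observes on JUnit.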
Study on Chinese text digit watermarking algorithm
In order to improve the robustness of the watermark and to meet the needs of text integrity testing, much of the literature advocates a redundant multiple-watermark embedding method, which relies on blocked text. Based on the structure of Chinese characters, a design method for the blocked text of Chinese characters and a watermarking algorithm are put forward. The method has obvious Chinese-character features, can effectively improve the robustness of the watermark, and raises the watermark's recovery ability after an attack.
Formal methods techniques offer a means of verifying the correctness of the design process used to create a security protocol. Notwithstanding the successful verification of the design of security protocols, the implementation code for them may contain security flaws, due to mistakes made by the programmers or bugs in the programming language itself. We propose an ACG-C# tool, which can be used to automatically generate C# implementation code for a security protocol verified with Casper and FDR. The ACG-C# approach has several distinctive features, namely automatic code generation, secure code, and high confidence. We conduct a case study on the Yahalom security protocol, using ACG-C# to generate the C# implementation code.
eng_Latn
4,512