train_725
Banks pin their back-office hopes on successors to screen scrapers
The big name in account aggregation has been Yodlee, based in Redwood Shores, CA. It pioneered the art of screen scraping, or pulling data off Web sites and aggregating it into a single statement. That data, however, is a snapshot and does not include a customer's investment history. Also, because Web sites update data at different times, scraping them can provide an inaccurate picture of a customer's financial situation, making it difficult for reps to deliver timely and accurate advice. The objective is to access both fresh and historical data across a client's financial spectrum, from investments to checking accounts and from loans to insurance policies: a complete customer balance sheet. At least two technology vendors are progressing toward that goal, each from a different direction. One is Advent, based in San Francisco; another is Fincentric, out of Vancouver
checking;account aggregation;loans;insurance;bankers;yodlee;screen scraping;web sites;investment;advent;fincentric
train_726
New wrinkle on the Web? Hmm. [banking]
The financial sector produced its share of technology hype during the new economy years, so you can't blame folks if the next next thing, a wave of Internet-related innovation called Web services, is being met with healthy skepticism. Many gurus are placing their bets on Web services to drive the next chapter of finance technology, dramatically upgrading disappointing automated customer management strategies by electronically breaking down barriers between products, firms and customers, and perhaps creating a whole new line of business in the process. But it's not a magic wand. It doesn't change the need for a bank to reorganize and streamline its operations
bank;web services
train_727
Life after bankruptcy [telecom carriers]
How comeback telecom carriers are changing industry economics, and why others may have no choice but to follow their lead
debt levels;telecom carriers;bankruptcy;restructured companies;industry economics
train_728
Questioning the RFP process [telecom]
In the current climate, the most serious concern about the purchasing habits of telecom carriers is obviously the lack of spending. Even against a backdrop of economic constraints and financial struggles, however, genuine concerns about the purchasing process itself are being raised by some of those closest to it
telecom carriers;request for proposal;purchasing process;sales cycle;request for information
train_73
How does attitude impact IT implementation: a study of small business owners
According to previous studies, attitude towards information technology (IT) among small business owners appears to be a key factor in achieving high quality IT implementations. In an effort to extend this stream of research, we conducted case studies with small business owners and learned that high quality IT implementations resulted when owners had positive or negative attitudes toward IT, but not when owners had uncertain attitudes. Owners with a polar attitude, either positive or negative, all took action to temper the uncertainty and risk surrounding the use of new IT in their organization. In contrast, owners with uncertain attitudes did not make mitigating attempts to reduce uncertainty and risk. A consistent finding among those with high quality IT implementations was an entrepreneurial, or shared, management style. It is proposed, based on case study data, that small business owners with an uncertain attitude towards IT might achieve higher quality IT results in their organizations by practicing a more entrepreneurial, or shared, management style. The study provides insights for both computer specialists and small business owners planning IT implementations
small business owners;organization;information technology implementation;positive attitudes;planning;management style;risk;negative attitudes;computer specialists;uncertain attitude
train_730
Multi-hour design of survivable classical IP networks
Most Internet intra-domain routing protocols (OSPF, RIP, and IS-IS) are based on shortest path routing. The path length is defined as the sum of the metrics associated with the path's links. These metrics are often managed by the network administrator. In this context, the design of an Internet backbone network consists of dimensioning the network (routers and transmission links) and setting the metrics. Many requirements have to be satisfied. First, Internet traffic is not static: significant variations can be observed during the day. Second, many failures can occur (cable cuts, hardware failures, software failures, etc.). We present algorithms (meta-heuristics and a greedy heuristic) to design Internet backbone networks, taking into account the multi-hour behaviour of traffic and some survivability requirements. Several multi-hour and protection strategies are studied and numerically compared. Our algorithms can be extended to integrate other quality of service constraints
transmission links;internet backbone network;network protection;is-is;path links;path length;survivable classical ip networks;internet intra-domain routing protocols;survivability requirements;quality of service constraints;greedy heuristic algorithm;network dimensioning;meta-heuristics algorithm;shortest path routing;network administrator;multi-hour design;ospf;internet traffic;rip;network routers;network failures;qos constraints
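The shortest-path machinery the abstract builds on can be sketched in a few lines. This is plain Dijkstra over administrator-assigned link metrics, a toy illustration only (not the paper's meta-heuristic design algorithms); the example network and its metrics are invented:

```python
import heapq

def shortest_path_cost(graph, src, dst):
    """Dijkstra's algorithm over administrator-assigned link metrics.

    graph: dict mapping node -> list of (neighbour, metric) pairs.
    Returns the minimum sum of metrics from src to dst, or None if
    dst is unreachable.
    """
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return None

# Toy backbone: the metrics chosen by the administrator steer traffic.
net = {
    "A": [("B", 1), ("C", 5)],
    "B": [("C", 1), ("D", 4)],
    "C": [("D", 1)],
    "D": [],
}
print(shortest_path_cost(net, "A", "D"))  # 3, via A-B-C-D
```

Raising the metric on link B-C would push traffic onto the direct A-C or B-D links, which is exactly the lever the design problem tunes.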
train_731
Aggregate bandwidth estimation in stored video distribution systems
Multimedia applications like video on demand, distance learning, Internet video broadcast, etc. will play a fundamental role in future broadband networks. A common aspect of such applications is the transmission of video streams that require a sustained, relatively high bandwidth with stringent quality of service requirements. In this paper various original algorithms for evaluating, in a video distribution system, a statistical estimation of the aggregate bandwidth needed by a given number of smoothed video streams are proposed and discussed. The variable bit rate traffic generated by each video stream is characterized by its marginal distribution and by conditional probabilities between rates of temporally close streams. The developed iterative algorithms evaluate an upper and lower bound of the bandwidth needed to guarantee a given loss probability. The obtained results are compared with simulations and with other results, based on similar assumptions, already presented in the literature. Some considerations on the developed algorithms are made, in order to evaluate the effectiveness of the proposed methods
upper bound;distance learning;quality of service;simulations;multimedia applications;video streams transmission;lower bound;video coding;temporary closed streams;stored video distribution systems;conditional probabilities;vod;internet video broadcast;loss probability;aggregate bandwidth estimation;video on demand;iterative algorithms;statistical estimation;broadband networks;marginal distribution;qos;variable bit rate traffic
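As a rough illustration of sizing aggregate bandwidth against a loss probability from per-stream marginal distributions, here is a Monte Carlo sketch. It is a deliberate simplification of the paper's iterative bounds: it treats streams as independent (ignoring the conditional probabilities between rates), and the uniform 2-6 Mbit/s marginal is invented:

```python
import random

def aggregate_bandwidth(marginal_sampler, n_streams, loss_prob,
                        trials=10000, seed=1):
    """Monte Carlo estimate of the bandwidth needed so that the probability
    of the aggregate rate of n_streams independent streams exceeding it
    stays below loss_prob: the empirical (1 - loss_prob) quantile."""
    rng = random.Random(seed)
    totals = sorted(
        sum(marginal_sampler(rng) for _ in range(n_streams))
        for _ in range(trials)
    )
    idx = min(trials - 1, int((1.0 - loss_prob) * trials))
    return totals[idx]

def uniform_rate(rng):
    # Hypothetical marginal: a smoothed stream sends 2-6 Mbit/s uniformly.
    return rng.uniform(2.0, 6.0)

c = aggregate_bandwidth(uniform_rate, n_streams=50, loss_prob=0.01)
```

The estimate lands well below the 300 Mbit/s worst-case allocation of 50 x 6, which is the statistical-multiplexing gain such estimators quantify.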
train_732
A unifying co-operative Web caching architecture
Network caching of objects has become a standard way of reducing network traffic and latency in the Web. However, Web caches exhibit poor performance, with a hit rate of about 30%. One way to improve this hit rate is to have a group of proxies form a co-operative in which objects can be cached for later retrieval. A co-operative cache system includes protocols for hierarchical and transversal caching. The drawback of such a system lies in the resulting network load, due to the number of messages that need to be exchanged to locate an object. This paper proposes a new co-operative Web caching architecture, which unifies previous methods of Web caching. Performance results show that the architecture achieves up to a 70% co-operative hit rate and accesses the cached object in at most two hops. Moreover, the architecture is scalable, with low traffic and database overhead
protocols;network caching;hierarchical caching;co-operative hit rate;network latency reduction;web browser;co-operative web caching architecture;network load;transversal caching;low database overhead;network traffic reduction;world wide web;low traffic overhead;scalable architecture;cooperative cache system
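The two-hop property can be illustrated with a toy model: a shared directory tells a proxy which sibling holds an object, so a co-operative hit costs at most a directory lookup plus one fetch. This is an invented minimal sketch, not the paper's actual protocol:

```python
class CoopProxy:
    """Minimal sketch of a co-operative proxy: a local cache plus a shared
    directory mapping object keys to the proxy holding them, so any cached
    object is reachable in at most two hops (directory lookup + fetch)."""

    def __init__(self, name, directory):
        self.name = name
        self.cache = {}
        self.directory = directory  # shared: key -> proxy that holds it

    def store(self, key, obj):
        self.cache[key] = obj
        self.directory[key] = self

    def get(self, key):
        if key in self.cache:               # hop 0: local hit
            return self.cache[key], "local"
        holder = self.directory.get(key)    # hop 1: ask the directory
        if holder is not None:
            return holder.cache[key], "co-operative"
        return None, "miss"                 # would fetch from origin server

shared = {}
p1, p2 = CoopProxy("p1", shared), CoopProxy("p2", shared)
p1.store("/index.html", "<html>...</html>")
print(p2.get("/index.html"))  # ('<html>...</html>', 'co-operative')
```

The real architecture replaces the single shared dictionary with distributed state, which is where the traffic and database overheads it minimizes come from.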
train_733
VoIP makeover transforms ugly duckling network
Surrey County Council's Swan project is Europe's biggest implementation of voice over IP. Six WANs and countless LANs are being consolidated into a single network covering 6,000 users at 200 sites. The contract was signed in October 2001 for £13m over five years and rollout will be completed in May 2003
voice over ip;swan;lan;wan;surrey county council
train_734
Web services boost integration
Microsoft and IBM have announced products to help their database software co-exist with competitors' offerings. The products use web services technology allowing users to improve integration between databases and application software from rival vendors
database software;ibm;microsoft;web services technology
train_735
IT at the heart of joined-up policing
Police IT is to shift from application-focused to component-based technology. The change of strategy, part of the Valiant Programme, will make information held by individual forces available on a national basis
uk;police it;valiant programme
train_736
The year of the racehorse [China Telecom]
Does China really offer the telecoms industry a route out of the telecoms slump? According to the Chinese government it has yet to receive a single application from foreign companies looking to invest in the country's domestic telecoms sector since the country joined the World Trade Organisation
foreign investment;telecoms industry;china netcom;china unicom;china;china telecom
train_737
What's in a name? [mobile telephony branding]
Mobile operators are frantically consolidating businesses into single international brands
branding;mobile telephony;consolidating businesses
train_738
Playing for time [3G networks]
The delays in rolling out 3G networks across Europe should not always be seen with a negative slant
europe;mobile operators;delays;3g networks
train_739
Disposable mobiles
After many delays, the reusable, recyclable, disposable mobile phone is finally going on sale in the US. But with a business model largely dependent on niche markets, Elizabeth Biddlecombe asks if these simplified handsets will be good enough to survive a brutal market
simplified handsets;disposable mobile phone;recyclable;reusable
train_74
End-user perspectives on the uptake of computer supported cooperative working
Researchers in information systems have produced a rich collection of meta-analyses and models to further understanding of factors influencing the uptake of information technologies. In the domain of CSCW, however, these models have largely been neglected, and while there are many case studies, no systematic account of uptake has been produced. We use findings from information systems research to structure a meta-analysis of uptake issues as reported in CSCW case studies, supplemented by a detailed re-examination of one of our own case studies from this perspective. This shows that while there are some factors which seem to be largely specific to CSCW introductions, many of the case study results are very similar to standard IS findings. We conclude by suggesting how the two communities of researchers might build on each other's work, and finally propose activity theory as a means of integrating the two perspectives
end-user perspectives;information technology;cscw;computer supported cooperative work;meta-analyses;information systems;activity theory
train_740
The Malaysian model
Japan's first third generation service, Foma, is unlikely to be truly attractive to consumers until 2005. That still falls well within the financial planning of its operator Docomo. But where does that leave European 3G operators looking for reassurance? Malaysia, says Simon Marshall
malaysia;maxis communications;3g operators;telekom malaysia
train_741
Mothball mania [3G licences]
Telefonica Moviles has frozen its 3G operations in Germany, Austria, Italy and Switzerland. With other 3G licence holders questioning the logic of entering already saturated markets with unproven technology, Emma McClune asks if the mothball effect is set to snowball any further
saturated markets;mobile telephony;mothball;3g licence holders
train_742
Second term [International Telecommunication Union]
Later this month Yoshio Utsumi is expected to be re-elected for a second four year term as secretary general of the International Telecommunication Union. Here he talks to Matthew May about getting involved in internet addressing, the prospects for 3g, the need for further reform of his organisation... and the translating telephone
3g;translating telephone;internet addressing;international telecommunication union
train_743
Local satellite
Consumer based mobile satellite phone services went from boom to burn up in twelve months despite original forecasts predicting 10 million to 40 million users by 2005. Julian Bright wonders what prospects the technology has now and if going regional might be one answer
mobile satellite phone services
train_744
A virtual victory [virtual networks]
Newly fashionable virtual network operators look all set to clean up in the corporate sector
corporate sector;virtual network operators
train_745
Intensity based affine registration including feature similarity for spatial normalization
This paper presents a new spatial normalization with affine transformation. The quantitative comparison of brain architecture across different subjects requires a common coordinate system. For the analysis of a specific brain area, it is necessary to normalize and compare a region of interest as well as the global brain. The intensity based registration method matches the global brain well, but a region of interest may not be as well normalized locally as with the feature based method. The method in this paper uses feature similarities of local regions as well as intensity similarities. The lateral ventricle and the central gray nuclei of the brain, including the corpus callosum, which are used as features in schizophrenia detection, are appropriately normalized. Our method reduces the difference of feature areas such as the corpus callosum (7.7%, 2.4%) and lateral ventricle (8.2%, 13.5%) compared with the mutual information and Talairach methods
intensity based affine registration;global brain;corpus callosum;lateral ventricle;talairach method;central gray nuclei;affine transformation;schizophrenia detection;common coordinate system;brain architecture;feature similarities;spatial normalization;feature similarity;region of interest;mutual information method
train_746
Real-time transmission of pediatric echocardiograms using a single ISDN line
We tested the adequacy of a videoconferencing system using a single integrated services digital network (ISDN) line (128 kilobits per second) for the remote diagnosis of children with suspected congenital heart disease (CHD). Real-time echocardiogram interpretation was compared to subsequent videotape review in 401 studies, with concordance in 383 (95.5%) studies. A new diagnosis of CHD was made in 98 studies. Immediate patient transfer was arranged based upon a real-time diagnosis in five studies. In 300 studies, a normal diagnosis obviated further evaluation. A single ISDN line is adequate for transmission of pediatric echocardiograms and allows for remote management of patients with CHD
single isdn line;real-time echocardiogram interpretation;remote patient management;real-time pediatric echocardiogram transmission;remote diagnosis;videoconferencing system;videotape review;immediate patient transfer;children;suspected congenital heart disease
train_747
Simulation of cardiovascular physiology: the diastolic function(s) of the heart
The cardiovascular system was simulated by using an equivalent electronic circuit. Four sets of simulations were performed. The basic variables investigated were cardiac output and stroke volume. They were studied as functions (i) of right ventricular capacitance and negative intrathoracic pressure; (ii) of left ventricular relaxation and of heart rate; and (iii) of left ventricle failure. It seems that a satisfactory simulation of the systolic and diastolic functions of the heart is possible. The presented simulations improve our understanding of the role of the capacitance of both ventricles and of diastolic relaxation in cardiovascular physiology
cardiac output;heart;left ventricular relaxation;right ventricular capacitance;stroke volume;simulation;diastolic relaxation;systolic functions;cardiovascular physiology;negative intrathoracic pressure;diastolic function;left ventricle failure;heart rate;equivalent electronic circuit
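The equivalent-circuit idea can be illustrated with the simplest such model, a two-element Windkessel (peripheral resistance plus arterial capacitance). This sketch is far cruder than the four-part simulation the abstract describes, and all parameter values below are invented toy numbers:

```python
def windkessel(heart_rate, stroke_volume, R=1.0, C=1.5, beats=30, dt=1e-3):
    """Two-element Windkessel sketch of the arterial circuit:
    C * dP/dt = Q_in(t) - P / R, driven by a rectangular ejection pulse.
    Returns the mean arterial pressure over the final beat (mmHg)."""
    period = 60.0 / heart_rate           # seconds per beat
    systole = 0.3 * period               # ejection-phase duration
    peak_flow = stroke_volume / systole  # mL/s during ejection
    p = 80.0                             # initial arterial pressure
    t, trace = 0.0, []
    while t < beats * period:
        q_in = peak_flow if (t % period) < systole else 0.0
        p += dt * (q_in - p / R) / C     # forward-Euler step
        if t >= (beats - 1) * period:
            trace.append(p)
        t += dt
    return sum(trace) / len(trace)

mp = windkessel(heart_rate=72, stroke_volume=70)
```

In steady state the mean pressure settles near cardiac output times peripheral resistance, the flow-balance analogue of Ohm's law, which is why varying the capacitance shapes the pulse but not the mean.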
train_748
Simulation study of the cardiovascular functional status in hypertensive situation
An extended cardiovascular model was established based on our previous work to study the consequences of physiological or pathological changes to the homeostatic functions of the cardiovascular system. To study hemodynamic changes in hypertensive situations, the impacts of cardiovascular parameter variations (peripheral vascular resistance, arterial vessel wall stiffness and baroreflex gain) upon hemodynamics and the short-term regulation of the cardiovascular system were investigated. For the purpose of analyzing baroregulation function, the short-term regulation of arterial pressure in response to moderate dynamic exercise for normotensive and hypertensive cases was studied through computer simulation and clinical experiments. The simulation results agree well with clinical data. The results of this work suggest that the model presented in this paper provides a useful tool to investigate the functional status of the cardiovascular system in normal or pathological conditions
arterial pressure;arterial vessel wall stiffness;moderate dynamic exercise;computer simulation;normotensive cases;peripheral vascular resistance;baroreflex gain;extended cardiovascular model;pathological changes;physiological changes;hemodynamics;homeostatic functions;clinical experiments;short-term regulation;hypertensive situation;cardiovascular parameter variations;cardiovascular functional status
train_749
Numerical modeling of the flow in stenosed coronary artery. The relationship between main hemodynamic parameters
The severity of coronary arterial stenosis is usually measured by either simple geometrical parameters, such as percent diameter stenosis, or hemodynamically based parameters, such as the fractional flow reserve (FFR) or coronary flow reserve (CFR). The present study aimed to establish a relationship between actual hemodynamic conditions and the parameters that define stenosis severity in the clinical setting. We used a computational model of the blood flow in a vessel with a blunt stenosis and an autoregulated vascular bed to simulate a stenosed blood vessel. A key point in creating realistic simulations is to properly model arterial autoregulation. A constant flow regulation mechanism resulted in CFR and FFR values that were within the physiological range, while a constant wall-shear stress model yielded unrealistic values. The simulation tools developed in the present study may be useful in the clinical assessment of single and multiple stenoses by means of minimally invasive methods
arterial autoregulation;stenosis severity;stenosed blood vessel;constant flow regulation mechanism;physiological range;numerical modeling;constant wall shear stress model;simulation;computational model;blunt stenosis;minimally invasive methods;clinical setting;blood flow;hemodynamic parameters;coronary arterial stenosis;autoregulated vascular bed
train_75
A portable Auto Attendant System with sophisticated dialog structure
An attendant system connects the caller to the party he/she wants to talk to. Traditional systems require the caller to know the full name of the party; if the caller forgets the name, the system cannot provide service. In this paper we propose a portable Auto Attendant System (AAS) with a sophisticated dialog structure that gives a caller more flexibility while calling. The caller may interact with the system to request a phone number by providing just a work area, specialty, surname, or title. If the party is absent, the system may provide extra information such as where he went, when he will be back, and what he is doing. The system is built modularly, with components such as the speech recognizer, language model, dialog manager and text-to-speech that can be replaced if necessary. By simply changing the personnel record database, the system can easily be ported to other companies. The sophisticated dialog manager applies many strategies to allow natural interaction between user and system. Functions such as fuzzy request, user repairing, and extra information query, which are not provided by other systems, are integrated into our system. Experimental results and comparisons to other systems show that our approach provides a more user friendly and natural interaction for an auto attendant system
clear request;telephone;telephone-based system;fuzzy request;semantic frame;spoken dialog systems;dialog manager;auto attendant system;attendant system;speech recognizer
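The fuzzy-request idea, letting a caller constrain the search with whatever fields they remember instead of a full name, can be sketched as a filter over personnel records. The records and field names below are hypothetical, and the real system sits behind a speech recognizer and dialog manager rather than direct keyword input:

```python
def attendant_lookup(directory, **constraints):
    """Fuzzy-request sketch for an auto attendant: return every employee
    matching all the fields the caller managed to provide (surname,
    title, work area, ...), rather than requiring a full name."""
    def matches(rec):
        return all(rec.get(field) == value
                   for field, value in constraints.items())
    return [rec["name"] for rec in directory if matches(rec)]

# Hypothetical personnel records; swapping this database ports the system.
staff = [
    {"name": "Mei-Ling Chen", "surname": "Chen", "title": "manager",
     "area": "sales"},
    {"name": "Wei-Han Chen", "surname": "Chen", "title": "engineer",
     "area": "R&D"},
    {"name": "Yu-Ting Lin", "surname": "Lin", "title": "engineer",
     "area": "R&D"},
]
print(attendant_lookup(staff, surname="Chen", area="R&D"))
# ['Wei-Han Chen']
```

When several records match, a dialog manager would ask a follow-up question to narrow the list, which is the kind of strategy the abstract's "sophisticated dialog structure" covers.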
train_750
Automated cerebrum segmentation from three-dimensional sagittal brain MR images
We present a fully automated cerebrum segmentation algorithm for full three-dimensional sagittal brain MR images. First, cerebrum segmentation from a midsagittal brain MR image is performed utilizing landmarks, anatomical information, and a connectivity-based threshold segmentation algorithm as previously reported. Recognizing that the cerebrum in laterally adjacent slices tends to have similar size and shape, we use the cerebrum segmentation result from the midsagittal brain MR image as a mask to guide cerebrum segmentation in adjacent lateral slices in an iterative fashion. This masking operation yields a masked image (preliminary cerebrum segmentation) for the next lateral slice, which may truncate brain region(s). Truncated regions are restored by first finding end points of their boundaries, by comparing the mask image and masked image boundaries, and then applying a connectivity-based algorithm. The resulting final extracted cerebrum image for this slice is then used as a mask for the next lateral slice. The algorithm yielded satisfactory fully automated cerebrum segmentations in three-dimensional sagittal brain MR images, and had performance superior to conventional edge detection algorithms for segmentation of cerebrum from 3D sagittal brain MR images
connectivity-based threshold segmentation algorithm;masked image boundaries;full 3d sagittal brain mr images;midsagittal brain mr image;brain region truncation;laterally adjacent slices;boundary end points;connectivity-based algorithm;fully automated cerebrum segmentation algorithm;masking operation;anatomical information;landmarks
train_751
A new method of regression on latent variables. Application to spectral data
Several applications are based on the assessment of a linear model linking a set of variables Y to a set of predictors X. In the presence of strong colinearity among predictors, as is the case with spectral data, several alternative procedures to ordinary least squares (OLS) have been proposed. We discuss a new alternative approach, which we refer to as regression models through constrained principal components analysis (RM-CPCA). This method shares certain characteristics with PLS regression, as the dependent variables play a central role in determining the latent variables to be used as predictors. Unlike PLS, however, the approach discussed leads to straightforward models. The method also bears some similarity to latent root regression analysis (LRR), which has been discussed by several authors. Moreover, a tuning parameter that ranges between 0 and 1 is introduced, and the family of models thus formed includes several other methods as particular cases
strong colinearity;latent variables;regression models through constrained principal components analysis;latent root regression analysis;dependent variables;tuning parameter;spectral data;predictors;linear model;near-ir spectroscopy
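For orientation, here is the simplest member of the latent-variable family the abstract situates RM-CPCA within: principal-component regression, which copes with collinear predictors by regressing on the leading components. This is a generic sketch, not RM-CPCA itself, and unlike RM-CPCA or PLS it ignores y when choosing the latent variables; the data are synthetic:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal-component regression: project centered predictors onto
    their leading principal components, regress y on those scores, then
    map the coefficients back to the original predictor space."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    # SVD gives the principal directions of the (possibly collinear) X.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T                 # loadings of kept components
    T = Xc @ V                              # latent-variable scores
    b = np.linalg.lstsq(T, yc, rcond=None)[0]
    coef = V @ b                            # back to predictor space
    return coef, y_mean - x_mean @ coef

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
# Add a nearly duplicate column: strong colinearity, as in spectral data.
X = np.hstack([X, X[:, :1] + 1e-6 * rng.normal(size=(40, 1))])
y = X @ np.array([1.0, 2.0, 0.0, 0.0]) + 0.01 * rng.normal(size=40)
coef, intercept = pcr_fit(X, y, n_components=3)
pred = X @ coef + intercept
```

OLS on X would be numerically unstable here; dropping the near-null component restores a well-conditioned fit, which is the shared motivation behind PCR, PLS, LRR and RM-CPCA.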
train_752
Presenting-a better mousetrap [Leeza outboard video signal processor]
Scaling interlaced video to match high-resolution plasma, LCD, and DLP displays is a tough job, but Key Digital's Leeza is up to the task. And it's digitally bilingual, too. There's no question that outboard video signal processors like Leeza help overcome the inherent limitations of fixed-pixel displays. Being able to match a native display rate with heavily processed video makes the viewing experience much more enjoyable. But it seemed that 70% of the improvement in image quality came from using a digital interface to the DVD player, as most noise and picture artifacts are introduced in the analog video encoding process
heavily processed video;plasma displays;leeza;fixed-pixel displays;lcd displays;dlp displays;outboard video signal processors
train_753
In medias res [DVD formats]
Four years in the making, the DVD format war rages on, no winner in sight. Meanwhile, the spoils of war abound, and DVD media manufacturers stand poised to profit
compatibility;dvd-r;dvd+rw;dvd-rw;dvd media manufacturers;dvd-ram;dvd+r;dvd format war;writable dvd
train_754
Record makers [UK health records]
Plans for a massive cradle-to-grave electronic records project have been revealed by the government. Is the scheme really viable?
social care;integrated care records services;electronic records project;health care;uk health records
train_755
Hardware and software platform for real-time processing and visualization of echographic radiofrequency signals
In this paper the architecture of a hardware and software platform for ultrasonic investigation is presented. The platform, used in conjunction with analog front-end hardware for driving the ultrasonic transducers of any commercial echograph providing access to the radiofrequency echo signal, makes a powerful echographic system available for experimenting with any processing technique, even in a clinical environment in which real-time operation is an essential prerequisite. The platform transforms any echograph into a test system for evaluating the diagnostic effectiveness of new investigation techniques. A particular user interface was designed to allow real-time, simultaneous visualization of the results produced in the different stages of the chosen processing procedure, with the aim of better optimizing the processing algorithm. The most important aspect of the platform, which also constitutes the basic differentiation with respect to similar systems, is the direct processing of the radiofrequency echo signal, which is essential for a complete analysis of the particular ultrasound-media interaction phenomenon. The platform is completely integrated into the architecture of a personal computer (PC), giving rise to several benefits, such as the quick technological evolution in the PC field and an extreme degree of programmability for different applications. The PC also constitutes the user interface, as a flexible and intuitive visualization support, and performs some software signal processing, using custom algorithms and commercial libraries. The close synergy realized between hardware and software allows the acquisition and real-time processing of the echographic radiofrequency (RF) signal with fast data representation
real-time processing;hardware platform;clinical diagnosis;software platform;ultrasonic imaging;user interface;data visualization;personal computer;echographic radiofrequency signal
train_756
A new high resolution color flow system using an eigendecomposition-based adaptive filter for clutter rejection
We present a new signal processing strategy for high frequency color flow mapping in moving tissue environments. A new application of an eigendecomposition-based clutter rejection filter is presented, with modifications to deal with high blood-to-clutter ratios (BCR). Additionally, a new method for correcting blood velocity estimates with an estimated tissue motion profile is detailed. The performance of the clutter filter and velocity estimation strategies is quantified using a new swept-scan signal model. In vivo color flow images are presented to illustrate the potential of the system for mapping blood flow in the microcirculation with external tissue motion
hf colour flow mapping;echoes;blood velocity estimates correction;estimated tissue motion profile;signal processing strategy;clutter suppression performance;high resolution colour flow system;in vivo color flow images;swept-scan signal model;microcirculation;high blood-to-clutter ratios;high frequency color flow mapping;eigendecomposition-based adaptive filter;clutter rejection filter;blood flow mapping;moving tissue environments
train_757
Ultrafast compound imaging for 2-D motion vector estimation: application to transient elastography
This paper describes a new technique for two-dimensional (2-D) imaging of the motion vector at a very high frame rate with ultrasound. Its potential is experimentally demonstrated for transient elastography. But, beyond this application, it could also be promising for color flow and reflectivity imaging. To date, only axial displacements induced in human tissues by low-frequency vibrators were measured during transient elastography. The proposed technique allows us to follow both axial and lateral displacements during the shear wave propagation and thus should improve Young's modulus image reconstruction. The process is a combination of several ideas well-known in ultrasonic imaging: ultra-fast imaging, multisynthetic aperture beamforming, 1-D speckle tracking, and compound imaging. Classical beamforming in the transmit mode is replaced here by a single plane wave insonification, increasing the frame rate by at least a factor of 128. The beamforming is achieved only in the receive mode on two independent subapertures. Comparison of successive frames by a classical 1-D speckle tracking algorithm allows estimation of displacements along two different directions linked to the subaperture beams. The variance of the estimates is finally improved by tilting the emitting plane wave at each insonification, thus allowing reception of successive decorrelated speckle patterns
young's modulus image reconstruction;2d imaging;two-dimensional imaging;human tissues;decorrelated speckle patterns;multisynthetic aperture beamforming;1d speckle tracking algorithm;lateral displacements;high frame rate;ultrafast compound imaging;colour flow imaging;single plane wave insonification;reflectivity imaging;ultrasonic imaging;2d motion vector estimation;shear wave propagation;ultrasound;axial displacements;transient elastography
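The 1-D speckle tracking step amounts to finding the lag that maximizes the cross-correlation between corresponding lines of successive frames. A minimal sketch on synthetic speckle follows (invented data; integer-sample resolution only, whereas real implementations interpolate to sub-sample precision):

```python
import numpy as np

def estimate_shift(frame_a, frame_b, max_lag):
    """1-D speckle-tracking sketch: the displacement between two echo
    lines is the lag maximizing their normalized cross-correlation.
    Positive lag means frame_b is frame_a delayed by that many samples."""
    n = len(frame_a)
    best_lag, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = frame_a[:n - lag], frame_b[lag:]
        else:
            a, b = frame_a[-lag:], frame_b[:n + lag]
        corr = float(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

rng = np.random.default_rng(3)
speckle = rng.normal(size=256)          # synthetic speckle line
shifted = np.roll(speckle, 5)           # "tissue" moved by 5 samples
print(estimate_shift(speckle, shifted, max_lag=10))  # 5
```

Running the same estimator on the two subaperture beams yields displacements along two directions, which is how the paper assembles a 2-D motion vector.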
train_758
Four-terminal quantum resistor network for electron-wave computing
Interconnected ultrathin conducting wires or, equivalently, interconnected quasi-one-dimensional electron waveguides, which form a quantum resistor network, are presented here in four-terminal configurations. The transmission behaviors through such four-terminal networks are evaluated and classified. In addition, we show that such networks can be used as the basic building blocks for a possible massive wave computing machine in the future. In a network, each interconnection, a node point, is an elastic scatterer that routes the electron wave. Routing and rerouting of electron waves in a network is described in the framework of quantum transport from Landauer-Buttiker theory in the presence of multiple elastic scatterers. Transmissions through various types of four-terminal generalized clean Aharonov-Bohm rings are investigated at zero temperature. Useful logic functions are gathered based on the transmission probability to each terminal with the use of the Buttiker symmetry rule. In the generalized rings, even and odd numbers of terminals can possess some distinctly different transmission characteristics, as we have shown here and earlier. Just as an even or odd number of atoms in a ring is an important quantity for classifying the transmission behavior, we show here that whether the number of terminals is even or odd is just as important in understanding the physics of transmission through such a ring. Furthermore, we show that there are three basic classes of four-terminal rings, and the scaling relation for each class is provided. In particular, the existence of equitransmission among all four terminals is shown here. This physical phenomenon cannot exist in any three-terminal ring. Comparisons and discussions of transmission characteristics between three-terminal and four-terminal rings are also presented. The node-equation approach, which considers the Kirchhoff current conservation law at each node point, is used for this analysis. Many useful logic functions for electron-wave computing are shown here. In particular, we show that a full adder can be constructed very simply using the equitransmission property of the four-terminal ring. This is in sharp contrast with circuits based on transistor logic
equitransmission property;multiple elastic scatterers;four-terminal quantum resistor network;rerouting;transmission behavior;landauer-buttiker theory;quasi1d electron waveguides;buttiker symmetry rule;aharonov-bohm rings;electron-wave computing;kirchhoff current conservation law;transmission probability;interconnected ultrathin conducting wires;logic functions
train_759
Mathematical properties of dominant AHP and concurrent convergence method
This study discusses the mathematical structure of the dominant AHP and the concurrent convergence method, which were originally developed by Kinoshita and Nakanishi. They introduced a new concept of a regulating alternative into an analyzing tool for a simple evaluation problem with a criterion set and an alternative set. Although the original idea of the dominant AHP and the concurrent convergence method is unique, neither has been sufficiently analyzed in mathematical theory. This study shows that the dominant AHP consists of a pair of evaluation rules satisfying a certain property of overall evaluation vectors. This study also shows that the convergence of the concurrent convergence method is guaranteed theoretically
overall evaluation vectors;dominant ahp;concurrent convergence method
train_76
Reaching strong consensus in a general network
The strong consensus (SC) problem is a variant of the conventional distributed consensus problem (also known as the Byzantine agreement problem). The SC problem requires that the agreed value among fault-free processors be one of the fault-free processor's initial values. Originally, the problem was studied in a fully connected network with malicious faulty processors. In this paper, the SC problem is re-examined in a general network, in which the components (processors and communication links) may be subjected to different faulty types simultaneously (also called the hybrid fault model or mixed faulty types) and the network topology does not have to be fully connected. The proposed protocol can tolerate the maximum number of tolerable faulty components such that each fault-free processor obtains a common value for the SC problem in a general network
hybrid fault model;strong consensus;byzantine agreement;fully connected network;strong consensus problem;distributed consensus problem;fault-tolerant distributed system;fault-free processors
train_760
An improved fuzzy MCDM model based on ideal and anti-ideal concepts
Liang (1999) presented a fuzzy multiple criteria decision making (MCDM) method based on the concepts of ideal and anti-ideal points. Despite its merits, Liang's method has the following limitations: (i) the objective criteria are converted into dimensionless indices while the subjective criteria are not converted, which may prevent compatibility between these criteria, (ii) the formulas for converting objective criteria are not reliable, and (iii) an unreliable ranking method, i.e. maximizing set and minimizing set, is applied to rank the fuzzy numbers. This paper applies the Hsu and Chen method and suggests a fuzzy number ranking method to propose an improved fuzzy MCDM model based on ideal and anti-ideal concepts that overcomes the shortcomings of Liang's method. Numerical examples demonstrate the effectiveness and feasibility of the proposed ranking method and the improved model, respectively
ideal concepts;anti-ideal concepts;fuzzy number ranking;fuzzy mcdm model;dimensionless indices;multicriterion decision-making
train_761
Towards a NMR implementation of a quantum lattice gas algorithm
Recent theoretical results suggest that an array of quantum information processors communicating via classical channels can be used to solve fluid dynamics problems. Quantum lattice-gas algorithms (QLGA) running on such architectures have been shown to solve the diffusion equation and the nonlinear Burgers equations. In this report, we describe progress towards an ensemble nuclear magnetic resonance (NMR) implementation of a QLGA that solves the diffusion equation. The methods rely on NMR techniques to encode an initial mass density into an ensemble of two-qubit quantum information processors. Using standard pulse techniques, the mass density can then be manipulated and evolved through the steps of the algorithm. We provide the experimental results of our first attempt to realize the NMR implementation. The results qualitatively follow the ideal simulation, but the observed implementation errors highlight the need for improved control
quantum lattice gas algorithm;two-qubit quantum information processors;fluid dynamics problems;nuclear magnetic resonance;diffusion equation;quantum information processors;nmr implementation;nonlinear burgers equations
train_762
Quantum computing with spin qubits in semiconductor structures
We survey recent work on designing and evaluating quantum computing implementations based on nuclear or bound-electron spins in semiconductor heterostructures at low temperatures and in high magnetic fields. General overview is followed by a summary of results of our theoretical calculations of decoherence time scales and spin-spin interactions. The latter were carried out for systems for which the two-dimensional electron gas provides the dominant carrier for spin dynamics via exchange of spin-excitons in the integer quantum Hall regime
spin-spin interactions;low temperatures;semiconductor heterostructures;spin dynamics;2deg;spin qubits;quantum computing;spin-excitons exchange;2d electron gas;high magnetic fields;dominant carrier;semiconductor structures;integer quantum hall regime;integer qhe
train_763
A quantum full adder for a scalable nuclear spin quantum computer
We demonstrate a strategy for implementing a quantum full adder in a spin chain quantum computer. As an example, we simulate a quantum full adder in a chain containing 201 spins. Our simulations also demonstrate how one can minimize errors generated by non-resonant effects
scalable nuclear spin quantum computer;error minimization;nonresonant effects;quantum full adder
train_764
Lattice Boltzmann schemes for quantum applications
We review the basic ideas behind the quantum lattice Boltzmann equation (LBE), and present a few thoughts on the possible use of such an equation for simulating quantum many-body problems on both (parallel) electronic and quantum computers
quantum applications;quantum many-body problems;quantum computers;lattice boltzmann schemes;parallel computing
train_765
Simulating fermions on a quantum computer
The real-time probabilistic simulation of quantum systems in classical computers is known to be limited by the so-called dynamical sign problem, a problem leading to exponential complexity. In 1981 Richard Feynman raised some provocative questions in connection to the "exact imitation" of such systems using a special device named a "quantum computer". Feynman hesitated about the possibility of imitating fermion systems using such a device. Here we address some of his concerns and, in particular, investigate the simulation of fermionic systems. We show how quantum computers avoid the sign problem in some cases by reducing the complexity from exponential to polynomial. Our demonstration is based upon the use of isomorphisms of algebras. We present specific quantum algorithms that illustrate the main points of our algebraic approach
fermions simulation;sign problem;quantum computer;exponential complexity;real-time probabilistic simulation;isomorphisms;algebras;classical computers;dynamical sign problem;fermion systems
train_766
Physical quantum algorithms
I review the differences between classical and quantum systems, emphasizing the connection between no-hidden variable theorems and superior computational power of quantum computers. Using quantum lattice gas automata as examples, I describe possibilities for efficient simulation of quantum and classical systems with a quantum computer. I conclude with a list of research directions
quantum lattice gas automata;quantum computers;classical systems;physical quantum algorithms;no-hidden variable theorems
train_767
Quantum computation for physical modeling
One of the most famous American physicists of the twentieth century, Richard Feynman, in 1982 was the first to propose using a quantum mechanical computing device to efficiently simulate quantum mechanical many-body dynamics, a task that is exponentially complex in the number of particles treated and is completely intractable by any classical computing means for large systems of many particles. In the two decades following his work, remarkable progress has been made both theoretically and experimentally in the new field of quantum computation
quantum mechanical many-body dynamics;quantum computation;quantum mechanical computing;physical modeling
train_768
Critical lines identification on voltage collapse analysis
This paper deals with critical lines identification on voltage collapse analysis. It is known, from the literature, that voltage collapse is a local phenomenon that spreads around an initial neighborhood. Therefore, identifying the system critical bus plays an important role on voltage collapse prevention. For this purpose, the system critical transmission lines should also be identified. In this paper, these issues are addressed, yielding reliable results in a short computational time. Tests are done with the help of the IEEE-118 bus and the Southeastern Brazilian systems
brazil;computer simulation;power system voltage collapse analysis;local phenomenon;critical transmission lines identification;system critical bus identification;ieee-118 bus
train_769
Permission grids: practical, error-bounded simplification
We introduce the permission grid, a spatial occupancy grid which can be used to guide almost any standard polygonal surface simplification algorithm into generating an approximation with a guaranteed geometric error bound. In particular, all points on the approximation are guaranteed to be within some user-specified distance from the original surface. Such bounds are notably absent from many current simplification methods, and are becoming increasingly important for applications in scientific computing and adaptive level of detail control. Conceptually simple, the permission grid defines a volume in which the approximation must lie, and does not permit the underlying simplification algorithm to generate approximations outside the volume. The permission grid makes three important, practical improvements over current error-bounded simplification methods. First, it works on arbitrary triangular models, handling all manners of mesh degeneracies gracefully. Further, the error tolerance may be easily expanded as simplification proceeds, allowing the construction of an error-bounded level of detail hierarchy with vertex correspondences among all levels of detail. And finally, the permission grid has a representation complexity independent of the size of the input model, and a small running time overhead, making it more practical and efficient than current methods with similar guarantees
spatial occupancy grid;approximation;mesh degeneracies;error-bounded simplification;vertex correspondences;error tolerance;arbitrary triangular models;adaptive level of detail control;representation complexity;user-specified distance;running time overhead;scientific computing;permission grid;polygonal surface simplification algorithm;guaranteed geometric error bound
train_77
Modeling frequently accessed wireless data with weak consistency
To reduce the response times of wireless data access in a mobile network, caches are utilized in wireless handheld devices. If the original data entry has been updated, the cached data in the handheld device becomes stale. Thus, a mechanism is required to predict when the cached copy will expire. This paper studies a weakly consistent data access mechanism that computes the time-to-live (TTL) interval to predict the expiration time. We propose an analytic model to investigate this TTL-based algorithm for frequently accessed data. The analytic model is validated against simulation experiments. Our study quantitatively indicates how the TTL-based algorithm reduces the wireless communication cost by increasing the probability of stale accesses. Depending on the requirements of the application, appropriate parameter values can be selected based on the guidelines provided
time-to-live interval;mobile network;data entry;weak consistency;stale access probability;expiration time prediction;wireless data access;caches;simulation experiments;frequently accessed wireless data modeling;response time reduction;wireless communication cost;wireless handheld devices;analytic model
train_770
The 3D visibility complex
Visibility problems are central to many computer graphics applications. The most common examples include hidden-part removal for view computation, shadow boundaries, mutual visibility of objects for lighting simulation. In this paper, we present a theoretical study of 3D visibility properties for scenes of smooth convex objects. We work in the space of light rays, or more precisely, of maximal free segments. We group segments that "see" the same object; this defines the 3D visibility complex. The boundaries of these groups of segments correspond to the visual events of the scene (limits of shadows, disappearance of an object when the viewpoint is moved, etc.). We provide a worst case analysis of the complexity of the visibility complex of 3D scenes, as well as a probabilistic study under a simple assumption for "normal" scenes. We extend the visibility complex to handle temporal visibility. We give an output-sensitive construction algorithm and present applications of our approach
visual events;computer graphics;light rays;3d visibility complex;hidden-part removal;probabilistic study;output-sensitive construction algorithm;temporal visibility;view computation;maximal free segments;normal scenes;mutual object visibility;shadow boundaries;smooth convex objects;lighting simulation;worst case complexity analysis
train_771
Pareto-optimal formulations for cost versus colorimetric accuracy trade-offs in printer color management
Color management for the printing of digital images is a challenging task, due primarily to nonlinear ink-mixing behavior and the presence of redundant solutions for print devices with more than three inks. Algorithms for the conversion of image data to printer-specific format are typically designed to achieve a single predetermined rendering intent, such as colorimetric accuracy. We present two CIELAB to CMYK color conversion schemes based on a general Pareto-optimal formulation for printer color management. The schemes operate using a 149-color characterization data set selected to efficiently capture the entire CMYK gamut. The first scheme uses artificial neural networks as transfer functions between the CIELAB and CMYK spaces. The second scheme is based on a reformulation of tetrahedral interpolation as an optimization problem. Characterization data are divided into tetrahedra for the interpolation-based approach using the program Qhull, which removes the common restriction that characterization data be well organized. Both schemes offer user control over trade-off problems such as cost versus reproduction accuracy, allowing for user-specified print objectives and the use of constraints such as maximum allowable ink and maximum allowable Delta E*/sub ab/. A formulation for minimization of ink is shown to be particularly favorable, integrating both clipping and gamut compression features into a single methodology
cost versus colorimetric accuracy trade-offs;qhull program;clipping;color characterization data set;image data conversion;nonlinear ink-mixing behavior;gamut compression features;optimization;rendering intent;user-specified print objectives;interpolation-based approach;cost versus reproduction accuracy;digital image printing;printer color management;macbeth colorchecker chart;pareto-optimal formulations;transfer functions;redundant solutions;constraints;grey component replacement;tetrahedra;artificial neural networks;maximum allowable ink;tetrahedral interpolation;cielab to cmyk color conversion schemes;user control
train_772
Meshed atlases for real-time procedural solid texturing
We describe an implementation of procedural solid texturing that uses the texture atlas, a one-to-one mapping from an object's surface into its texture space. The method uses the graphics hardware to rasterize the solid texture coordinates as colors directly into the atlas. A texturing procedure is applied per-pixel to the texture map, replacing each solid texture coordinate with its corresponding procedural solid texture result. The procedural solid texture is then mapped back onto the object surface using standard texture mapping. The implementation renders procedural solid textures in real time, and the user can design them interactively. The quality of this technique depends greatly on the layout of the texture atlas. A broad survey of texture atlas schemes is used to develop a set of general purpose mesh atlases and tools for measuring their effectiveness at distributing as many available texture samples as evenly across the surface as possible. The main contribution of this paper is a new multiresolution texture atlas. It distributes all available texture samples in a nearly uniform distribution. This multiresolution texture atlas also supports MIP-mapped minification antialiasing and linear magnification filtering
object surface;texture atlas;graphics hardware;one-to-one mapping;real-time procedural solid texturing;rendering;texture space;meshed atlases;solid texture coordinates;linear magnification filtering;colors;mip-mapped minification antialiasing;multiresolution texture atlas;rasterization
train_773
Topology-reducing surface simplification using a discrete solid representation
This paper presents a new approach for generating coarse-level approximations of topologically complex models. Dramatic topology reduction is achieved by converting a 3D model to and from a volumetric representation. Our approach produces valid, error-bounded models and supports the creation of approximations that do not interpenetrate the original model, either being completely contained in the input solid or bounding it. Several simple to implement versions of our approach are presented and discussed. We show that these methods perform significantly better than other surface-based approaches when simplifying topologically-rich models such as scene parts and complex mechanical assemblies
topology-reducing surface simplification;topologically complex models;scene parts;discrete solid representation;3d model;complex mechanical assemblies;coarse-level approximations;volumetric representation;error-bounded models
train_774
Keeping Web accessibility in mind: I&R services for all
After presenting three compelling reasons for making Web sites accessible to persons with a broad range of disabilities (it's the morally right thing to do, it's the smart thing to do from an economic perspective, and it's required by law), the author discusses design issues that impact persons with particular types of disabilities. She presents practical advice for assessing and addressing accessibility problems. An extensive list of resources for further information is appended, as is a list of sites which simulate the impact of specific accessibility problems on persons with disabilities
disabilities;web site accessibility;information and referral services
train_775
Disability-related special libraries
One of the ways that the federal government works to improve services to people with disabilities is to fund disability-related information centers and clearinghouses that provide information resources and referrals to disabled individuals, their family members, service providers, and the general public. The Teaching Research Division of Western Oregon University operates two federally funded information centers for people with disabilities: OBIRN (the Oregon Brain Injury Resource Network) and DB-LINK (the National Information Clearinghouse on Children who are Deaf-Blind). Both have developed in-depth library collections and services in addition to typical clearinghouse services. The authors describe how OBIRN and DB-LINK were designed and developed, and how they are currently structured and maintained. Both information centers use many of the same strategies and tools in day-to-day operations, but differ in a number of ways, including materials and clientele
library collections;disability-related clearinghouses;disability-related special libraries;information resources;disability-related information centers;national information clearinghouse on children who are deaf-blind;db-link;oregon brain injury resource network;obirn;federal government;western oregon university;information referrals
train_776
Information access for all: meeting the needs of deaf and hard of hearing people
Discusses the nature of deafness and hearing impairments, with particular reference to the impact which the onset of hearing loss presents at various ages. The author goes on to present practical tips for interacting with deaf and hard of hearing clients in various communication contexts, including sightreading, TTY communications, and ASL interpreters. An annotated list of suggested readings is appended
information access;deaf clients;asl interpreters;deafness;communication contexts;hard of hearing clients;sightreading;hearing impairments;tty communications
train_777
Access to information for blind and visually impaired clients
This article guides I&R providers in establishing effective communication techniques for working with visually impaired consumers. The authors discuss common causes of vision impairment and the functional implications of each and offer information on disability etiquette and effective voice, accessible media and in-person communication. There is an overview of assistive technologies used by people who are visually impaired to facilitate written and electronic communications, as well as low-tech solutions for producing large-print and Braille materials in-house. Providers who implement these communication techniques will be well equipped to serve visually impaired consumers, and consumers will be more likely to avail themselves of these services when providers make them easily accessible
information and referral systems;information access;written communications;effective voice;accessible media;large-print materials;assistive technologies;communication techniques;disability etiquette;braille materials;visually impaired clients;electronic communications;in-person communication;blind clients
train_778
Access matters
Discusses accessibility needs of people with disabilities, both from the perspective of getting the information from I&R programs (including accessible Web sites, TTY access, Braille, and other mechanisms) and from the perspective of being aware of accessibility needs when referring clients to resources. Includes information on ADA legislation requiring accessibility to public places and recommends several organizations and Web sites for additional information
information and referral programs;public places;braille;accessibility needs;disabled people;ada legislation;accessible web sites;tty access
train_779
Domesticating computers and the Internet
The people who use computers and the ways they use them have changed substantially over the past 25 years. In the beginning highly educated people, mostly men, in technical professions used computers for work, but over time a much broader range of people are using computers for personal and domestic purposes. This trend is still continuing, and over a shorter time scale has been replicated with the use of the Internet. The paper uses data from four national surveys to document how personal computers and the Internet have become increasingly domesticated since 1995 and to explore the mechanisms for this shift. Now people log on more often from home than from places of employment and do so for pleasure and for personal purposes rather than for their jobs. Analyses comparing veteran Internet users to novices in 1998 and 2000 and analyses comparing the change in use within a single sample between 1995 and 1996 support two complementary explanations for how these technologies have become domesticated. Women, children, and less well-educated individuals are increasingly using computers and the Internet and have a more personal set of motives than well-educated men. In addition, the widespread diffusion of the PC and the Internet and the response of the computing industry to the diversity in consumers has led to a rich set of personal and domestic services
demographics;novices;personal motives;internet;women;veteran internet users;domestic purposes;national surveys;online behavior;highly educated people;personal usage;domestic services;pc diffusion;computer domestication;computing industry;technical professions;personal computers;children
train_78
Applying genetic algorithms to solve the fuzzy optimal profit problem
This study investigated the application of genetic algorithms in solving a fuzzy optimization problem that arises in business and economics. In this problem, a fuzzy price is determined using a linear or a quadratic fuzzy demand function as well as a linear cost function. The objective is to find the optimal fuzzy profit, which is derived from the fuzzy price and fuzzy cost. Traditional methods for solving this problem are (1) the extension principle, and (2) using interval arithmetic and alpha-cuts. However, we argue that traditional methods for solving this problem are too restrictive to produce an optimal solution, and that an alternative approach is possibly needed. We use genetic algorithms to obtain an approximate solution for this fuzzy optimal profit problem without using membership functions. We not only give empirical examples to show the effectiveness of this approach, but also give theoretical proofs to validate correctness of the algorithm. We conclude that genetic algorithms can produce good approximate solutions when applied to solve fuzzy optimization problems
linear cost function;fuzzy price;economics;theoretical proofs;business;fuzzy optimal profit problem;genetic algorithms;linear fuzzy demand function;algorithm correctness validation;fuzzy optimization problem;approximate solution;quadratic fuzzy demand function
train_780
Failures and successes: notes on the development of electronic cash
Between 1997 and 2001, two mid-sized communities in Canada hosted North America's most comprehensive experiment to introduce electronic cash and, in the process, replace physical cash for casual, low-value payments. The technology used was Mondex, and its implementation was supported by all the country's major banks. It was launched with an extensive publicity campaign to promote Mondex not only in the domestic but also in the global market, for which the Canadian implementation was to serve as a "showcase." However, soon after the start of the first field test it became apparent that the new technology did not work smoothly. On the contrary, it created a host of controversies, in areas as varied as computer security, consumer privacy, and monetary policy. In the following years, few of these controversies could be resolved and Mondex could not be established as a widely used payment mechanism. In 2001, the experiment was finally terminated. Using the concepts developed in recent science and technology studies (STS), the article analyzes these controversies as resulting from the difficulties of fitting electronic cash, a new sociotechnical system, into the complex setting of the existing payment system. The story of Mondex not only offers lessons on why technologies fail, but also offers insight into how short-term failures can contribute to long-term transformations. This suggests the need to rethink the dichotomy of success and failure
long-term transformations;short-term failures;global market;low-value payments;payment mechanism;mondex;canadian implementation;publicity campaign;major banks;canada;electronic cash;computer security;science and technology studies;consumer privacy;sociotechnical system;monetary policy
train_781
ICANN and Internet governance: leveraging technical coordination to realize global public policy
The Internet Corporation for Assigned Names and Numbers (ICANN) was created in 1998 to perform technical coordination of the Internet. ICANN also lays the foundations for governance, creating capabilities for promulgating and enforcing global regulations on Internet use. ICANN leverages the capabilities in the Internet domain name system (DNS) to implement four mechanisms of governance: authority, law, sanctions, and jurisdictions. These governance-related features are embodied in seemingly technical features of ICANN's institutional design. Recognition of ICANN's governance mechanisms allows us to better understand the Internet's emerging regulatory regime
global regulations;institutional design;governance-related features;internet governance;internet dns;global public policy;regulatory regime;internet domain name system;technical coordination;icann;internet use;internet corporation for assigned names and numbers
train_782
Community technology and democratic rationalization
The objective of the paper is to explore questions of human agency and democratic process in the technical sphere through the example of "virtual community." The formation of relatively stable long-term group associations (community in the broad sense of the term) is the scene on which a large share of human development occurs. As such it is a fundamental human value mobilizing diverse ideologies and sensitivities. The promise of realizing this value in a new domain naturally stirs up much excitement among optimistic observers of the Internet. At the same time, the eagerness to place hopes for community in a technical system flies in the face of an influential intellectual tradition of technology criticism. This eagerness seems even more naive in the light of the recent commercialization of so much Internet activity. Despite the widespread skepticism, we believe the growth of virtual community is significant for an inquiry into the democratization of technology. We show that conflicting answers to the central question of the present theoretical debate - Is community possible on computer networks? - generalize from particular features of systems and software prevalent at different stages in the development of computer networking. We conclude that research should focus instead on how to design computer networks to better support community activities and values
human development;human value;intellectual tradition;computer networking;stable long-term group associations;conflicting answers;democratic process;optimistic observers;technical system;democratic rationalization;internet activity;community activities;virtual community;technical sphere;computer networks;human agency;diverse ideologies;technology criticism;community technology
train_783
The network society as seen from Italy
Italy was behind the European average in Internet development for many years, but a new trend, which has brought considerable change, emerged at the end of 1998 and showed its effects in 2000 and the following years. Now Italy is one of the top ten countries worldwide in Internet hostcount and the fourth largest in Europe. The density of Internet activity in Italy in proportion to the population is still below the average in the European Union, but is growing faster than Germany, the UK and France, and faster than the worldwide or European average. From the point of view of media control there are several problems. Italy has democratic institutions and freedom of speech, but there is an alarming concentration in the control of mainstream media (especially broadcast). There are no officially declared restrictions in the use of the Internet, but several legal and regulatory decisions reveal a desire to limit freedom of opinion and dialogue and/or gain centralized control of the Net
network society;internet development;uk;europe;media control;freedom of speech;france;worldwide average;regulatory decisions;european union;internet activity;italy;legal decisions;broadcast media;centralized control;european average;democratic institutions;internet hostcount;germany;mainstream media
train_784
Where tech is cheap [servers]
Talk, consultancy and support, not the tech itself, are the expensive parts of network installations. It's a good job that small-scale servers can either be remotely managed, or require little actual management
small-scale servers;network installations;management
train_785
Networking without wires
Several types of devices use radio transmitters to send data over thin air. Are WLANs, wireless local area networks, the end to all cables? Will Dalrymple weighs up the costs and benefits
costs;benefits;wireless local area networks
train_787
New kit on the block [IT upgrades]
As time passes, new hardware and software replace the old. The hows are straightforward: IT resellers and consultants can help with upgrade practicalities. Will Dalrymple examines the business issues and costs involved in IT upgrades
microsoft;costs;business issues;it upgrades;it resellers;consultants
train_788
Rise of the supercompany [CRM]
All the thoughts, conversations and notes of employees help the firm create a wider picture of business. Customer relationship management (CRM) feeds on data, and it is hungry
customer relationship management;staff trained;database;central data repository
train_79
An efficient and stable ray tracing algorithm for parametric surfaces
In this paper, we propose an efficient and stable algorithm for finding ray-surface intersections. Newton's method and Bezier clipping are adapted to form the core of our algorithm. Ray coherence is used to find starting points for Newton iteration. We introduce an obstruction detection technique to verify whether an intersection point found using Newton's method is the closest. When Newton's method fails to achieve convergence, we use Bezier clipping substitution to find the intersection points. This combination achieves a significant improvement in tracing primary rays. A similar approach successfully improves the performance of tracing secondary rays
newton iteration;ray-surface intersections;primary ray tracing;ray coherence;efficient stable ray tracing algorithm;convergence;obstruction detection technique;bezier clipping;parametric surfaces;newton method;secondary ray tracing
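The Newton core of such an intersection solver can be sketched on a simpler implicit case. The following is a minimal illustration, not the paper's parametric-surface solver: Newton iteration on f(t) = |o + t*d - c|^2 - r^2 for a ray/sphere intersection. The starting value t0 stands in for a coherence-derived starting point, and the vanishing-derivative bail-out marks where a fallback such as Bezier clipping would take over; all names and constants are illustrative.

```python
def ray_sphere_newton(o, d, c, r, t0, tol=1e-12, max_iter=50):
    """Newton iteration for the ray/sphere intersection f(t) = |o + t*d - c|^2 - r^2 = 0.

    o: ray origin, d: ray direction, c: sphere center, r: radius (all illustrative).
    """
    t = t0
    for _ in range(max_iter):
        p = [o[i] + t * d[i] - c[i] for i in range(3)]   # point on the ray, relative to center
        f = sum(x * x for x in p) - r * r                # residual of the surface equation
        df = 2.0 * sum(d[i] * p[i] for i in range(3))    # derivative of f along the ray
        if abs(df) < 1e-15:
            break  # Newton step undefined: a robust solver falls back (cf. Bezier clipping)
        t_next = t - f / df
        if abs(t_next - t) < tol:
            return t_next
        t = t_next
    return t
```

From t0 = 1.0, a ray from (0, 0, -3) along (0, 0, 1) converges quadratically to the near intersection t = 2 with the unit sphere.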
train_790
Data assimilation of local model error forecasts in a deterministic model
Among the most popular data assimilation techniques in use today are those of the Kalman filter type, which provide an improved estimate of the state of a system up to the current time level, based on actual measurements. From a forecasting viewpoint, this corresponds to an updating of the initial conditions. The standard forecasting procedure is to then run the model uncorrected into the future, driven by predicted boundary and forcing conditions. The problem with this methodology is that the updated initial conditions quickly 'wash-out', thus, after a certain forecast horizon the model predictions are no better than from an initially uncorrected model. This study demonstrates that through the assimilation of error forecasts (in the present case made using so-called local models) entire model domains can be corrected for extended forecast horizons (i.e. long after updated initial conditions have become washed-out), thus demonstrating significant improvements over the conventional methodology. Some alternate uses of local models are also explored for the re-distribution of error forecasts over the entire model domain, which are then compared with more conventional Kalman filter type schemes
forcing conditions;error prediction;kalman filter;hydrodynamic modelling;deterministic model;local model error forecasts;forecast horizon;data assimilation
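The measurement update at the heart of any Kalman-type scheme fits in a few lines for a scalar state. This is a generic textbook sketch under simplifying assumptions (scalar state, identity observation model), not the authors' assimilation setup; x_est and p_est stand for the predicted state and its variance, z for a measurement with noise variance r.

```python
def kalman_update(x_est, p_est, z, r):
    """Scalar Kalman measurement update: blend the model prediction with an observation."""
    k = p_est / (p_est + r)          # Kalman gain: how much to trust the measurement
    x_new = x_est + k * (z - x_est)  # corrected state estimate (the updated "initial condition")
    p_new = (1.0 - k) * p_est        # variance shrinks after assimilating the measurement
    return x_new, p_new
```

With equal prior and measurement variance, the update lands halfway between prediction and observation; the forecast then runs forward from the corrected state, which is exactly the updating of initial conditions that, as the abstract notes, washes out over long horizons.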
train_791
The rise and fall and rise again of customer care
Taking care of customers has never gone out of style, but as the recession fades, interest is picking up in a significant retooling of the CRM solutions banks have been using. The goal: usable knowledge to help improve service
banks;customer relationship management;usable knowledge
train_792
Remember e-commerce? Yeah, well, it's still here
Sandy Kemper, the always outspoken CEO of successful e-commerce company eScout, offers his views on the purported demise of "commerce" in e-commerce, and what opportunities lie ahead for those bankers bold enough to act in a market turned tentative by early excesses
e-commerce;escout;bankers
train_793
Advancements during the past quarter century in on-line monitoring of motor and generator winding insulation
Electrical insulation plays a critical role in the operation of motor and generator rotor and stator windings. Premature failure of the insulation can cost millions of dollars per day. With advancements in electronics, sensors, computers and software, tremendous progress has been made in the past 25 years, which has transformed on-line insulation monitoring from a rarely used and expensive tool to the point where 50% of large utility generators in North America are now equipped for such monitoring. This review paper outlines the motivation for on-line monitoring, discusses the transition to today's technology, and describes the variety of methods now in use for rotor winding and stator winding monitoring
motor generator winding insulation;stator windings;ozone monitoring;sensors;endwinding vibration monitoring;electronics;rotor windings;winding insulation on-line monitoring;software;magnetic flux monitoring;partial discharge monitoring;temperature monitoring;pd monitoring;generator winding insulation;premature insulation failure;computers;tagging compounds;condition monitors;electrical insulation
train_794
On the discretization of double-bracket flows
This paper extends the method of Magnus series to Lie-algebraic equations originating in double-bracket flows. We show that the solution of the isospectral flow Y' = [[Y,N],Y], Y(0) = Y_0 in Sym(n), can be represented in the form Y(t) = e^{Omega(t)} Y_0 e^{-Omega(t)}, where the Taylor expansion of Omega can be constructed explicitly, term-by-term, identifying individual expansion terms with certain rooted trees with bicolor leaves. This approach is extended to other Lie-algebraic equations that can be appropriately expressed in terms of a finite "alphabet"
bicolor leaves;double-bracket flows discretization;isospectral flow;taylor expansion;lie-algebraic equations;magnus series
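The representation Y(t) = e^{Omega(t)} Y_0 e^{-Omega(t)} is isospectral for any Omega, which a short numerical check makes concrete. The sketch below takes only the leading Magnus term, Omega ~ h*[Y_0, N], for one step of Y' = [[Y,N],Y]; the 2x2 matrices, step size, and truncated-series matrix exponential are illustrative choices, not from the paper.

```python
def matmul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def madd(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

def mscale(a, s):
    return [[a[i][j] * s for j in range(2)] for i in range(2)]

def expm(a, terms=40):
    """2x2 matrix exponential via truncated Taylor series (adequate for small-norm a)."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = mscale(matmul(term, a), 1.0 / k)
        result = madd(result, term)
    return result

def sym_eigvals(a):
    """Eigenvalues of a symmetric 2x2 matrix from its trace and determinant."""
    tr = a[0][0] + a[1][1]
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    disc = (tr * tr / 4.0 - det) ** 0.5
    return [tr / 2.0 - disc, tr / 2.0 + disc]

# One Lie-group step of Y' = [[Y,N],Y]: Y1 = e^{Omega} Y0 e^{-Omega},
# with the leading Magnus term Omega ~ h*[Y0, N] (skew-symmetric for symmetric Y0, N)
y0 = [[2.0, 1.0], [1.0, 3.0]]
n = [[1.0, 0.0], [0.0, 2.0]]
h = 0.1
bracket = madd(matmul(y0, n), mscale(matmul(n, y0), -1.0))  # commutator [Y0, N]
omega = mscale(bracket, h)
y1 = matmul(matmul(expm(omega), y0), expm(mscale(omega, -1.0)))
```

However many expansion terms of Omega are kept, the eigenvalues of Y1 agree with those of Y0 to machine precision, which is the structural advantage of the exponential representation.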
train_795
Approximation and complexity. II. Iterated integration
For pt. I, see ibid., no. 1, p. 289-95 (2001). We introduce two classes of real analytic functions W subset of U on an interval. Starting with rational functions, to construct functions in W we allow the application of three types of operations: addition, integration, and multiplication by a polynomial with rational coefficients. In a similar way, to construct functions in U we allow integration, addition, and multiplication of functions already constructed in U and multiplication by rational numbers. Thus, U is a subring of the ring of Pfaffian functions. Two lower bounds on the L_infinity-norm are proved for a function f from W (or from U, respectively) in terms of the complexity of constructing f
integration;l-infinity norm;pfaffian functions;multiplication;lower bounds;polynomial;rational functions;real analytic functions;addition
train_796
Quadratic Newton iteration for systems with multiplicity
Newton's iterator is one of the most popular components of polynomial equation system solvers, either from the numeric or symbolic point of view. This iterator usually handles smooth situations only (when the Jacobian matrix associated to the system is invertible). This is often a restrictive factor. Generalizing Newton's iterator is still an open problem: How to design an efficient iterator with a quadratic convergence even in degenerate cases? We propose an answer for an m-adic topology when the ideal m can be chosen generic enough: compared to a smooth case we prove quadratic convergence with a small overhead that grows with the square of the multiplicity of the root
newton's iterator;jacobian matrix;polynomial equation system solvers;systems with multiplicity;m-adic topology;quadratic newton iteration;quadratic convergence
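The degeneracy the paper addresses is easiest to see in one variable: at an m-fold root, plain Newton slows to linear convergence, while the classical multiplicity-corrected step x -= m*f(x)/f'(x) recovers fast convergence. This scalar sketch only illustrates the multiplicity problem; the paper's contribution - quadratic convergence for systems in an m-adic topology - is far more general.

```python
def newton(f, df, x, steps):
    """Plain Newton iteration: only linearly convergent at a multiple root."""
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

def newton_mult(f, df, x, m, steps):
    """Multiplicity-corrected Newton step x -= m*f/f' for a known m-fold root."""
    for _ in range(steps):
        x = x - m * f(x) / df(x)
    return x

# f has a double root at x = 1, so f' also vanishes there (the Jacobian-singular case)
f = lambda x: (x - 1.0) ** 2
df = lambda x: 2.0 * (x - 1.0)
```

Starting from x = 2, plain Newton only halves the error each step, while the corrected step with m = 2 reaches the root immediately.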
train_797
Adaptive wavelet methods. II. Beyond the elliptic case
This paper is concerned with the design and analysis of adaptive wavelet methods for systems of operator equations. Its main accomplishment is to extend the range of applicability of the adaptive wavelet-based method developed previously for symmetric positive definite problems to indefinite or unsymmetric systems of operator equations. This is accomplished by first introducing techniques (such as the least squares formulation developed previously) that transform the original (continuous) problem into an equivalent infinite system of equations which is now well-posed in the Euclidean metric. It is then shown how to utilize adaptive techniques to solve the resulting infinite system of equations. It is shown that for a wide range of problems, this new adaptive method performs with asymptotically optimal complexity, i.e., it recovers an approximate solution with desired accuracy at a computational expense that stays proportional to the number of terms in a corresponding wavelet-best N-term approximation. An important advantage of this adaptive approach is that it automatically stabilizes the numerical procedure so that, for instance, compatibility constraints on the choice of trial spaces, like the LBB condition, no longer arise
asymptotically optimal complexity;operator equations;adaptive wavelet methods;elliptic case;euclidean metric;n-term approximation;least squares formulation
train_798
ClioWeb, ClioRequest, and Clio database: enhancing patron and staff satisfaction
Faced with increased demand from students and faculty for a speedier and more user-friendly method of obtaining materials from other institutions, the interlibrary loan (ILL) department sought to implement a management system which would accomplish the task. Students wanted remote interconnectivity to the system and staff wanted increased workflow efficiency, reduced paper work, and better data management. This paper focuses on Washington College's experience in selecting and implementing an interlibrary loan system, which would enhance student satisfaction as well as that of the library staff
clio database;user-friendly method;patron satisfaction;faculty;interlibrary loan department;washington college;data management;cliorequest;staff satisfaction;remote interconnectivity;clioweb;management system;workflow efficiency;students
train_799
Electronic reserves at University College London: understanding the needs of academic departments
This article describes a recent project at University College London to explore the feasibility of providing a service to improve access to electronic course materials. Funded by the Higher Education Funding Council for England (HEFCE), the project was not simply to set up an electronic reserve. By undertaking a needs analysis of academic departments, the project was able to tailor the design of the new service appropriately. While new initiatives in libraries are often established using project funding, this work was unique in being research-led. It also involved collaboration between library and computing staff and learning technologists
higher education funding council for england;electronic course materials;university college london;learning technologists;academic department needs;computing staff;electronic reserves
train_8
New investors get steal of a deal [Global Crossing]
Hutchison Telecommunications and Singapore Technologies take control of Global Crossing for a lot less money than they originally offered. The deal leaves the bankrupt carrier intact, but doesn't put it in the clear just yet
global crossing;hutchison telecommunications;bankrupt;singapore technologies
train_80
Evaluating the performance of a distributed database of repetitive elements in complete genomes
The original version of the Repeat Sequence Database (RSDB) was created based on centralized database systems (CDBSs). RSDB presently includes an enormous amount of data, with the amount of biological data increasing rapidly. Distributed RSDB (DRSDB) is developed to yield better performance. This study proposes many approaches to data distribution and experimentally determines the best approach to obtain good performance of our database. Experimental results indicate that DRSDB performs well for particular types of query
distributed repeat sequence database;complete genomes;repetitive elements;data distribution;performance evaluation;biological data;queries
train_800
A model for choosing an electronic reserves system: a pre-implementation study at the library of Long Island University's Brooklyn campus
This study explores the nature of electronic reserves (e-reserves) and investigates the possibilities of implementing the e-reserves at the Long Island University/Brooklyn Campus Library (LIU/BCL)
long island university brooklyn campus library;electronic reserves system
train_801
International customers, suppliers, and document delivery in a fee-based information service
The Purdue University Libraries' fee-based information service, the Technical Information Service (TIS), works with both international customers and international suppliers to meet its customers' needs for difficult and esoteric document requests. Successful completion of these orders requires the ability to verify fragmentary citations; ascertain documents' availability; obtain pricing information; calculate inclusive cost quotes; meet customers' deadlines; accept international payments; and ship across borders. While international orders make up a small percent of the total workload, these challenging and rewarding orders meet customers' needs and offer continuous improvement opportunities to the staff
fragmentary citation verification;inclusive cost quotes;pricing information;document requests;international customers;document availability;document delivery;continuous staff improvement;customer deadline meeting;purdue university libraries fee-based information service;technical information service;international payments;international suppliers
train_802
A brief history of electronic reserves
Electronic reserves has existed as a library service for barely ten years, yet its history, however brief, is important as an indicator of the direction being taken by the profession of Librarianship as a whole. Recent improvements in technology and a desire to provide better service to students and faculty have resulted in the implementation of e-reserves by ever greater numbers of academic libraries. Yet a great deal of confusion still surrounds the issue of copyright compliance. Negotiation, litigation, and legislation in particular have framed the debate over the application of fair use to an e-reserves environment, and the question of whether or not permission fees should be paid to rights holders, but as of yet no definitive answers or standards have emerged
library service;faculty;permission fees;copyright compliance;e-reserves environment;legislation;negotiation;librarianship;litigation;students;academic libraries;electronic reserves
train_803
The mutual effects of grid and wind turbine voltage stability control
This note considers the results of wind turbine modelling and power system stability investigations. Voltage stability of the power grid with grid-connected wind turbines will be improved by using blade angle control for a temporary reduction of the wind turbine power during and shortly after a short circuit fault in the grid
wind turbine modelling;power system stability;grid voltage stability control;wind turbine power reduction;wind turbine voltage stability control;short circuit fault;blade angle control;power grid;offshore wind turbines;grid-connected wind turbines
train_804
Voltage control methods with grid connected wind turbines: a tutorial review
Within electricity grid networks it is conventional for large-scale central generators to both provide power and control grid node voltage. Therefore when wind turbines replace conventional power stations on a substantial scale, they must not only generate power, but also control grid node voltages. This paper reviews the basic principles of voltage control for tutorial benefit and then considers application of grid-connected wind turbines for voltage control. The most widely used contemporary wind turbine types are considered and further detail is given for determining the range of variables that allow control
electricity grid networks;reactive power;direct drive synchronous generator;voltage control;direct drive;offshore wind park;converter rating;grid connected wind turbines;weak grid;squirrel cage induction generator;large-scale central generators;grid node voltages control;variable speed;doubly fed induction generator
train_805
Active pitch control in larger scale fixed speed horizontal axis wind turbine systems. I. Linear controller design
This paper reviews and addresses the principles of linear controller design for the fixed speed wind turbine system above rated wind speed, using pitch angle control of the blades and applying modern control theory. First, the nonlinear equations of the system are built under some reasonable assumptions. Then, the nonlinear equations are linearised at a set operating point and digital simulation results are shown in this paper. Finally, a linear quadratic optimal feedback controller is designed and the dynamics of the closed circle system are simulated with digital calculation. The advantages and disadvantages of the assumptions and design method are also discussed. Because of the inherent characteristics of linear system control theory, the performance of the linear controller is not sufficient for operating wind turbines, as is discussed
linear quadratic optimal feedback controller;aerodynamics;drive train dynamics;nonlinear equations;wind turbines;fixed speed wind turbine system;active pitch control;pitch angle control;horizontal axis wind turbine systems;linear system control theory;linear controller design;control theory;closed circle system;digital simulation
train_806
Flow measurement - future directions
Interest in the flow of liquids and its measurement can be traced back to early studies by the Egyptians, the Chinese and the Romans. Since these early times the science of flow measurement has undergone a massive change but during the last 25 years or so (1977-2002) it has matured enormously. One of the principal reasons for this is that higher accuracies and reliabilities have been demanded by industry in the measurement of fiscal transfers and today there is vigorous interest in the subject from both the flowmeter manufacturer and user viewpoints. This interest is coupled with the development of advanced computer techniques in fluid mechanics together with the application of increasingly sophisticated electronics
fiscal transfers;flow metering;liquid flow;egyptians;flowmeter manufacturer;advanced computer techniques;chinese;fluid mechanics;flow measurement;electronics application;signal processing;romans
train_807
Integrated optical metrology controls post etch CDs
Control of the transistor gate critical dimension (CD) on the order of a few nanometers is a top priority in many advanced IC fabs. Each nanometer deviation from the target gate length translates directly into the operational speed of these devices. However, using in-line process control by linking the lithography and etch tools can improve CD performance beyond what each individual tool can achieve. The integration of optical CD metrology tools to etch mainframes can result in excellent etcher stability and better control of post-etch CDs
operational speed;etcher stability;in-line process control;etch mainframes;optical cd metrology tools;cd performance;post etch cd control;target gate length;integrated optical metrology;lithography tools;transistor gate critical dimension;ic fabs;photolithography
train_808
A novel control logic for fast valving operations
This letter proposes new control logic for operating parallel valves in fast valving schemes in order to improve the transient stability performance of power systems. A fast valving scheme using parallel valves overcomes many of the limitations of the conventional scheme. The proposed control logic for operating these valves has been applied to a typical single machine infinite bus system. Single as well as multiple stroke operations for controlling the turbine power output have been studied with the new control sequences. Encouraging results have been shown over the conventional schemes of fast valving
multiple stroke operations;single machine infinite bus system;turbine power output control;transient stability performance;control logic;parallel valves operation;single stroke operations;transient stability
train_809
Edison's direct current influenced "Broadway" show lighting
During the early decades of the 20th century, midtown Manhattan in New York City developed an extensive underground direct current (DC) power distribution system. This was a result of the original introduction of direct current by Thomas Edison's pioneering Pearl Street Station in 1882. The availability of DC power in the theater district, led to the perpetuation of an archaic form of stage lighting control through nearly three-quarters of the 20th century. This control device was known as a "resistance dimmer." It was essentially a series-connected rheostat, but it was wound with a special resistance "taper" so as to provide a uniform change in the apparent light output of typical incandescent lamps throughout the travel of its manually operated arm. The development and use of DC powered stage lighting is discussed in this article
series-connected rheostat;manhattan;dc powered stage lighting;resistance dimmer;theater district;apparent light output;broadway show lighting;thomas edison's pearl street station;new york city;stage lighting control;incandescent lamps;underground direct current power distribution system;resistance taper
train_81
A scalable and efficient systolic algorithm for the longest common subsequence problem
A longest common subsequence (LCS) of two strings is a common subsequence of two strings of maximal length. The LCS problem is that of finding an LCS of two given strings and the length of the LCS. This problem has been the subject of much research because its solution can be applied in many areas. In this paper, a scalable and efficient systolic algorithm is presented. For two given strings of length m and n, where m >= n, the algorithm can solve the LCS problem in m+2r-1 (respectively n+2r-1) time steps with r < n/2 (respectively r < m/2) processors. Experimental results show that the algorithm can be faster on multicomputers than all the previous systolic algorithms for the same problem
longest common subsequence problem;parallel algorithms;scalable algorithm;systolic algorithm
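For contrast with the parallel systolic design, the standard sequential dynamic program computes the LCS length in O(mn) time; a minimal sketch using a rolling O(n) row:

```python
def lcs_length(a: str, b: str) -> int:
    """Length of the longest common subsequence of a and b (classic O(m*n) DP)."""
    m, n = len(a), len(b)
    # dp[j] holds the LCS length of a[:i] and b[:j] as i advances row by row
    dp = [0] * (n + 1)
    for i in range(1, m + 1):
        prev_diag = 0  # dp[i-1][j-1] from the previous row
        for j in range(1, n + 1):
            prev_row = dp[j]  # dp[i-1][j], about to be overwritten
            if a[i - 1] == b[j - 1]:
                dp[j] = prev_diag + 1
            else:
                dp[j] = max(dp[j], dp[j - 1])
            prev_diag = prev_row
    return dp[n]
```

The systolic algorithm in the paper parallelizes exactly this recurrence by pipelining the anti-diagonals of the DP table across processors.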
train_810
Oracle's Suite grows up
Once a low-cost Web offering, Oracle's Small Business Suite now carries a price tag to justify VAR interest
resellers;netledger;accounting;oracle small business suite
train_811
Integration, the Web are key this season [tax]
Integration and the Web are driving many of the enhancements planned by tax preparation software vendors for this coming season
accounting packages;software integration;internet;tax packages;gosystem tax rs;netconnection;cch;people's choice;intuit;taxworks;drake;ria;petz;taxsimple;visual tax;atx;cpasoftware
train_812
eLeaders make the Web work
Some companies are making the most of back-office/Web integration. Here are some winners
back-office/web integration;accpac etransact;visual integrator;e-commerce
train_813
On generalized Gaussian quadratures for exponentials and their applications
We introduce new families of Gaussian-type quadratures for weighted integrals of exponential functions and consider their applications to integration and interpolation of bandlimited functions. We use a generalization of a representation theorem due to Caratheodory to derive these quadratures. For each positive measure, the quadratures are parameterized by eigenvalues of the Toeplitz matrix constructed from the trigonometric moments of the measure. For a given accuracy epsilon, selecting an eigenvalue close to epsilon yields an approximate quadrature with that accuracy. To compute its weights and nodes, we present a new fast algorithm. These new quadratures can be used to approximate and integrate bandlimited functions, such as prolate spheroidal wave functions, and essentially bandlimited functions, such as Bessel functions. We also develop, for a given precision, an interpolating basis for bandlimited functions on an interval
bandlimited functions;toeplitz matrix;approximation;integration;generalized gaussian quadratures;bessel functions;weighted integrals;trigonometric moments;prolate spheroidal wave functions;exponential functions;caratheodory representation theorem;interpolation;eigenvalues
train_814
A framework for image deblurring using wavelet packet bases
We show that the average over translations of an operator diagonal in a wavelet packet basis is a convolution. We also show that an operator diagonal in a wavelet packet basis can be decomposed into several operators of the same kind, each of them being better conditioned. We investigate the possibility of using such a convolution to approximate a given convolution (in practice an image blur). Then we use these approximations to deblur images. First, we show that this framework permits us to redefine existing deblurring methods. Then, we show that it permits us to define a new variational method which combines the wavelet packet and the total variation approaches. We argue and show by experiments that this permits us to avoid the drawbacks of both approaches which are, respectively, ringing and staircasing
staircasing;operator diagonal;deconvolution;ringing;convolution;total variation approach;wavelet packet bases;image deblurring
train_815
The canonical dual frame of a wavelet frame
We show that there exist wavelet frames that have nice dual wavelet frames, but for which the canonical dual frame does not consist of wavelets, i.e., cannot be generated by the translates and dilates of a single function
gabor frames;compact support;canonical dual frame;wavelet frame;multiresolution hierarchy
train_816
Accelerating filtering techniques for numeric CSPs
Search algorithms for solving Numeric CSPs (Constraint Satisfaction Problems) make extensive use of filtering techniques. In this paper we show how those filtering techniques can be accelerated by discovering and exploiting some regularities during the filtering process. Two kinds of regularities are discussed, cyclic phenomena in the propagation queue and numeric regularities of the domains of the variables. We also present in this paper an attempt to unify numeric CSP solving methods from two distinct communities, that of CSP in artificial intelligence, and that of interval analysis
numeric csps;csps-solving;extrapolation methods;propagation;interval analysis;artificial intelligence;filtering techniques;search algorithms;constraint satisfaction problems;pruning
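The kind of filtering being accelerated can be shown with the smallest possible example: hull-consistency filtering of one sum constraint over interval domains, iterated to a fixpoint. The constraint and domains below are illustrative, and the sketch shows only the baseline propagation loop, not the paper's acceleration techniques.

```python
def filter_sum(x, y, s):
    """Hull-consistency filtering for the constraint x + y = s over interval domains.

    Each variable's domain is narrowed using the other: x := x intersect (s - y), etc.
    """
    xlo, xhi = x
    ylo, yhi = y
    new_x = (max(xlo, s - yhi), min(xhi, s - ylo))
    new_y = (max(ylo, s - xhi), min(yhi, s - xlo))
    return new_x, new_y

def fixpoint(x, y, s):
    """Re-apply the filter until the domains stop shrinking (a one-constraint queue)."""
    while True:
        nx, ny = filter_sum(x, y, s)
        if (nx, ny) == (x, y):
            return x, y
        x, y = nx, ny
```

With x + y = 10 and both domains [0, 8], one pass already narrows both domains to [2, 8]; the cyclic re-application until quiescence is the behavior whose regularities the paper exploits.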
train_817
Summarization beyond sentence extraction: A probabilistic approach to sentence compression
When humans produce summaries of documents, they do not simply extract sentences and concatenate them. Rather, they create new sentences that are grammatical, that cohere with one another, and that capture the most salient pieces of information in the original document. Given that large collections of text/abstract pairs are available online, it is now possible to envision algorithms that are trained to mimic this process. In this paper, we focus on sentence compression, a simpler version of this larger challenge. We aim to achieve two goals simultaneously: our compressions should be grammatical, and they should retain the most important pieces of information. These two goals can conflict. We devise both a noisy-channel and a decision-tree approach to the problem, and we evaluate results against manual compressions and a simple baseline
sentence compression;grammatical;document summarization;noisy-channel;decision-tree