Dataset fields:
corpus_id: string (7 to 12 chars)
paper_id: string (9 to 16 chars)
title: string (1 to 261 chars)
abstract: string (70 to 4.02k chars)
source: string (1 class)
bibtex: string (208 to 20.9k chars)
citation_key: string (6 to 100 chars)
arxiv-670101
cs/0109033
CLP versus LS on Log-based Reconciliation Problems
<|reference_start|>CLP versus LS on Log-based Reconciliation Problems: Nomadic applications create replicas of shared objects that evolve independently while they are disconnected. When reconnecting, the system has to reconcile the divergent replicas. In the log-based approach to reconciliation, such as in the IceCube system, the input is a common initial state and logs of actions that were performed on each replica. The output is a consistent global schedule that maximises the number of accepted actions. The reconciler merges the logs according to the schedule and replays the operations in the merged log against the initial state, yielding a reconciled common final state. In this paper, we show the NP-completeness of the log-based reconciliation problem and present two programs for solving it. The first is a constraint logic program (CLP) that uses integer constraints to express precedence constraints, boolean constraints to express dependencies between actions, and heuristics to guide the search. The second is a stochastic local search method with a Tabu heuristic (LS) that computes solutions incrementally but does not prove optimality. One difficulty in the LS modeling lies in handling both boolean and integer variables, and in handling an objective function that differs from a max-CSP problem. Preliminary evaluation results indicate better performance for the CLP program, which, on somewhat realistic benchmarks, finds nearly optimal solutions for up to a thousand actions and proves optimality for up to a hundred actions.<|reference_end|>
arxiv
@article{fages2001clp, title={CLP versus LS on Log-based Reconciliation Problems}, author={Francois Fages}, journal={arXiv preprint arXiv:cs/0109033}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109033}, primaryClass={cs.PL} }
fages2001clp
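A minimal brute-force sketch of the log-based reconciliation problem described in the abstract above, assuming a toy encoding: `actions` is a list of labels, `precedes[a]` is the set of actions that must come before `a`, and `requires[a]` is the set of actions that `a` depends on. All names are illustrative, not the paper's; exhaustive search is exponential, consistent with the NP-completeness result, and the CLP and LS programs the paper describes are practical alternatives not reproduced here.

```python
# Hypothetical encoding of log-based reconciliation; brute force only.
from itertools import combinations
from graphlib import TopologicalSorter, CycleError

def reconcile(actions, precedes, requires):
    """Return a schedule for a largest consistent subset of actions."""
    for size in range(len(actions), -1, -1):
        for subset in combinations(actions, size):
            chosen = set(subset)
            # Dependency closure: every prerequisite of an accepted
            # action must itself be accepted (the boolean constraints).
            if any(dep not in chosen
                   for a in chosen for dep in requires.get(a, ())):
                continue
            # Precedence consistency: orderings restricted to the
            # accepted actions must be acyclic (the integer constraints).
            graph = {a: {b for b in precedes.get(a, ()) if b in chosen}
                     for a in chosen}
            try:
                return list(TopologicalSorter(graph).static_order())
            except CycleError:
                continue
    return []

# e.g. reconcile(["a", "b", "c"], {"b": {"a"}}, {"c": {"b"}})
# yields a schedule placing "a" before "b".
```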
arxiv-670102
cs/0109034
Relevant Knowledge First - Reinforcement Learning and Forgetting in Knowledge Based Configuration
<|reference_start|>Relevant Knowledge First - Reinforcement Learning and Forgetting in Knowledge Based Configuration: Various knowledge-based methods have been developed to solve complex configuration tasks in technical domains. However, their applicability often suffers from their low efficiency. One reason is that (parts of) problems have to be solved again and again instead of being "learnt" from preceding processes. Learning processes, however, bring with them the problem of conservatism, and in technical domains innovation is a deciding factor in competition. On the other hand, a certain amount of conservatism is often desired, since uncontrolled innovation is, as a rule, also detrimental. This paper proposes the heuristic RKF (Relevant Knowledge First) for making decisions in configuration processes based on the so-called relevance of objects in a knowledge base. The underlying relevance function has two components, one based on reinforcement learning and the other on forgetting (fading). The relevance of an object increases with its successful use and decreases with age when it is not used. RKF has been developed to speed up the configuration process and to improve the quality of the solutions relative to the reward value given by users.<|reference_end|>
arxiv
@article{kreuz2001relevant, title={Relevant Knowledge First - Reinforcement Learning and Forgetting in Knowledge Based Configuration}, author={Ingo Kreuz, Dieter Roller}, journal={arXiv preprint arXiv:cs/0109034}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109034}, primaryClass={cs.AI cs.LG} }
kreuz2001relevant
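The abstract above only states that relevance grows with successful use and fades with age when unused; below is a hedged sketch of such a two-component relevance function. The reinforcement rate `alpha`, the fading factor `decay`, and the reward signal are illustrative assumptions, not the authors' actual formula.

```python
# Illustrative relevance tracker; parameters are assumptions, not RKF's.
from dataclasses import dataclass, field

@dataclass
class RelevanceTracker:
    alpha: float = 0.3    # assumed reinforcement rate
    decay: float = 0.95   # assumed per-step fading factor
    relevance: dict = field(default_factory=dict)

    def reinforce(self, obj, reward):
        """Move an object's relevance toward the reward of a successful use."""
        r = self.relevance.get(obj, 0.0)
        self.relevance[obj] = r + self.alpha * (reward - r)

    def fade(self, used=()):
        """Age every object that was not used in this step."""
        for obj in self.relevance:
            if obj not in used:
                self.relevance[obj] *= self.decay

    def ranked(self):
        """Objects ordered most-relevant-first, the order RKF would try them in."""
        return sorted(self.relevance, key=self.relevance.get, reverse=True)
```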
arxiv-670103
cs/0109035
Revenge of the Bell Heads: How the Net Heads Lost Control of the Internet
<|reference_start|>Revenge of the Bell Heads: How the Net Heads Lost Control of the Internet: A dichotomy in regulatory treatment and corporate cultures exists between Internet Service Providers (ISPs) and telecommunication carriers. Telephone company executives (Bell Heads) may resent regulation, but they accept their fate and work creatively to exploit anomalies and opportunities to secure a regulation-conferred competitive advantage. Most ISP executives (Net Heads) appear to embrace a libertarian attitude, strongly opposing any government involvement. Despite the clash of cultures, the telecommunications and Internet worlds have merged. Such convergence jeopardizes the ability of Net Heads to avoid some degree of regulation, particularly when they offer services functionally equivalent to what their Bell Head counterparts offer. This paper will assess the regulatory consequences when telecommunication and Internet services converge in the marketplace and in terms of operating technologies. The paper identifies commercial developments in the Internet to support the view that the Internet has become more hierarchical and more like telecommunication networks. The paper concludes that telecommunication carriers will display superior skill in working the regulatory process to their advantage. The paper suggests that Bell Heads will outmaneuver Net Heads particularly when the revenue siphoning effect of Internet-mediated services offsets the revenues generated from ISP leases of telecommunication transmission capacity.<|reference_end|>
arxiv
@article{frieden2001revenge, title={Revenge of the Bell Heads: How the Net Heads Lost Control of the Internet}, author={Rob Frieden}, journal={arXiv preprint arXiv:cs/0109035}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109035}, primaryClass={cs.CY} }
frieden2001revenge
arxiv-670104
cs/0109036
Competition and Price Dispersion in International Long Distance Calling
<|reference_start|>Competition and Price Dispersion in International Long Distance Calling: This paper examines the relationship between changes in telecommunications provider concentration on international long distance routes and changes in prices on those routes. Overall, decreased concentration is associated with significantly lower prices to consumers of long distance services. However, the relationship between concentration and price varies according to the type of long distance plan considered. For the international flagship plans frequently selected by more price-conscious consumers of international long distance, increased competition on a route is associated with lower prices. In contrast, for the basic international plans that are the default selection for consumers, increased competition on a route is actually associated with higher prices. Thus, somewhat surprisingly, price dispersion appears to increase as competition increases.<|reference_end|>
arxiv
@article{ennis2001competition, title={Competition and Price Dispersion in International Long Distance Calling}, author={Sean F. Ennis}, journal={arXiv preprint arXiv:cs/0109036}, year={2001}, number={TPRC-2001-024}, archivePrefix={arXiv}, eprint={cs/0109036}, primaryClass={cs.CY} }
ennis2001competition
arxiv-670105
cs/0109037
Antitrust, Intellectual Property and Standard-Setting Organizations
<|reference_start|>Antitrust, Intellectual Property and Standard-Setting Organizations: Standard-setting organizations (SSOs) regularly encounter situations in which one or more companies claim to own proprietary rights that cover a proposed industry standard. The industry cannot adopt the standard without the permission of the intellectual property owner (or owners). How SSOs respond to those who assert intellectual property rights is critically important. Whether or not private companies retain intellectual property rights in group standards will determine whether a standard is "open" or "closed." It will determine who can sell compliant products, and it may well influence whether the standard adopted in the market is one chosen by a group or one offered by a single company. SSO rules governing intellectual property rights will also affect how standards change as technology improves. Given the importance of SSO rules governing intellectual property rights, there has been surprisingly little treatment of SSOs or their intellectual property rules in the legal literature. My aim in this article is to fill that void. To do so, I have surveyed the intellectual property policies of dozens of SSOs, primarily but not exclusively in the computer networking and telecommunications industries.<|reference_end|>
arxiv
@article{lemley2001antitrust, title={Antitrust, Intellectual Property and Standard-Setting Organizations}, author={Mark A. Lemley}, journal={arXiv preprint arXiv:cs/0109037}, year={2001}, number={TPRC-2001-001}, archivePrefix={arXiv}, eprint={cs/0109037}, primaryClass={cs.CY} }
lemley2001antitrust
arxiv-670106
cs/0109038
Crisis of Public Utility Deregulation and the Unrecognized Welfare State
<|reference_start|>Crisis of Public Utility Deregulation and the Unrecognized Welfare State: Successful achievement of public policies requires satisfaction of conditions affecting political feasibility for policy adoption and maintenance as well as economic viability of the desired activity or enterprise. This paper discusses the difficulties of satisfying these joint constraints given the legacy of the common law doctrines of "just price" and "businesses affected with a public interest." In this regard, it is helpful to view traditional public utility regulation as a form of welfare state regulation, as it suffers from similar political problems arising from policy retrenchment. The retrenchment problems are examined in the context of the electricity crisis in California as well as the passage and implementation of the Telecommunications Act of 1996. As expected, retrenchment from low residential retail rates - the most universalistic benefit for customers - faces the greatest political resistance. The societal trade-offs between monopoly and competition must be reexamined in light of the greater instability and political difficulties under a deregulatory regime.<|reference_end|>
arxiv
@article{cherry2001crisis, title={Crisis of Public Utility Deregulation and the Unrecognized Welfare State}, author={Barbara A. Cherry}, journal={arXiv preprint arXiv:cs/0109038}, year={2001}, number={TPRC-2001-034}, archivePrefix={arXiv}, eprint={cs/0109038}, primaryClass={cs.CY} }
cherry2001crisis
arxiv-670107
cs/0109039
Testing for Mathematical Lineation in Jim Crace's "Quarantine" and T. S. Eliot's "Four Quartets"
<|reference_start|>Testing for Mathematical Lineation in Jim Crace's "Quarantine" and T. S. Eliot's "Four Quartets": The mathematical distinction between prose and verse may be detected in writings that are not apparently lineated, for example in T. S. Eliot's "Burnt Norton", and Jim Crace's "Quarantine". In this paper we offer comments on appropriate statistical methods for such work, and also on the nature of formal innovation in these two texts. Additional remarks are made on the roots of lineation as a metrical form, and on the prose-verse continuum.<|reference_end|>
arxiv
@article{constable2001testing, title={Testing for Mathematical Lineation in Jim Crace's "Quarantine" and T. S. Eliot's "Four Quartets"}, author={John Constable and Hideaki Aoyama}, journal={arXiv preprint arXiv:cs/0109039}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109039}, primaryClass={cs.CL} }
constable2001testing
arxiv-670108
cs/0109040
The Building of BODHI, a Bio-diversity Database System
<|reference_start|>The Building of BODHI, a Bio-diversity Database System: We have recently built a database system called BODHI, intended to store plant bio-diversity information. It is based on an object-oriented modeling approach and is developed completely around public-domain software. The unique feature of BODHI is that it seamlessly integrates diverse types of data, including taxonomic characteristics, spatial distributions, and genetic sequences, thereby spanning the entire range from molecular to organism-level information. A variety of sophisticated indexing strategies are incorporated to efficiently access the various types of data, and a rule-based query processor is employed for optimizing query execution. In this paper, we report on our experiences in building BODHI and on its performance characteristics for a representative set of queries.<|reference_end|>
arxiv
@article{srikanta2001the, title={The Building of BODHI, a Bio-diversity Database System}, author={B. J. Srikanta, Jayant Haritsa and Udaysankar Sen}, journal={arXiv preprint arXiv:cs/0109040}, year={2001}, number={TR-2001-02}, archivePrefix={arXiv}, eprint={cs/0109040}, primaryClass={cs.DB q-bio.PE} }
srikanta2001the
arxiv-670109
cs/0109041
Open Access beyond cable: The case of Interactive TV
<|reference_start|>Open Access beyond cable: The case of Interactive TV: In this paper we analyze the development of interactive TV in the U.S. and Western Europe. We argue that despite the nascent character of the market there are important regulatory issues at stake, as exemplified by the AOL/TW merger and the British Interactive Broadcasting case. Absent rules that provide for non-discriminatory access to network components (including terminal equipment specifications), dominant platform operators are likely to leverage ownership of delivery infrastructure into market power over interactive TV services. While integration between platform operator, service provider and terminal vendor may facilitate the introduction of services in the short-term, the lasting result will be a collection of fragmented "walled gardens" offering limited content and applications. Were interactive TV to develop under such a model, the exciting opportunities for broad-based innovation and extended access to multiple information, entertainment and educational services opened by the new generation of broadcasting technologies would be foregone.<|reference_end|>
arxiv
@article{galperin2001open, title={Open Access beyond cable: The case of Interactive TV}, author={Hernan Galperin, Francois Bar}, journal={arXiv preprint arXiv:cs/0109041}, year={2001}, number={TPRC-2001-039}, archivePrefix={arXiv}, eprint={cs/0109041}, primaryClass={cs.MM} }
galperin2001open
arxiv-670110
cs/0109042
Intelligent Search of Correlated Alarms from Database containing Noise Data
<|reference_start|>Intelligent Search of Correlated Alarms from Database containing Noise Data: Alarm correlation plays an important role in improving the service and reliability of modern telecommunications networks. Most previous research on alarm correlation did not consider the effect of noise data in the database. This paper focuses on a method for discovering alarm correlation rules from a database containing noise data. We first define two parameters, Win_freq and Win_add, as measures of noise data, and then present the Robust_search algorithm to solve the problem. For different sizes of Win_freq and Win_add, experiments with alarm data containing noise show that the Robust_search algorithm discovers more rules as the size of Win_add grows. We also experimentally compare two different interestingness measures, confidence and correlation.<|reference_end|>
arxiv
@article{zheng2001intelligent, title={Intelligent Search of Correlated Alarms from Database containing Noise Data}, author={Qingguo Zheng, Ke Xu, Weifeng Lv, Shilong Ma}, journal={arXiv preprint arXiv:cs/0109042}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109042}, primaryClass={cs.NI cs.AI} }
zheng2001intelligent
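The abstract above names two noise parameters, Win_freq and Win_add, but does not define them or the Robust_search algorithm, so the sketch below is only a guess at the general shape: pairs of alarms are counted inside a base window of width `win_freq` that may be stretched by up to `win_add` time units to absorb noisy, displaced alarms. Every name and the window semantics here are illustrative assumptions, not the paper's method.

```python
# Speculative windowed co-occurrence counting; not the paper's algorithm.
from collections import Counter
from itertools import combinations

def correlated_pairs(events, win_freq, win_add, min_support):
    """events: list of (timestamp, alarm_id) pairs, sorted by timestamp."""
    counts = Counter()
    horizon = win_freq + win_add  # widened window tolerating noise
    for i, (t0, _) in enumerate(events):
        # Alarms falling inside the widened window anchored at event i.
        window = {a for t, a in events[i:] if t - t0 <= horizon}
        for pair in combinations(sorted(window), 2):
            counts[pair] += 1
    return [(pair, n) for pair, n in counts.items() if n >= min_support]
```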
arxiv-670111
cs/0109043
PUC Autonomy and Policy Innovation: Local Telephone Competition in Arkansas and New York
<|reference_start|>PUC Autonomy and Policy Innovation: Local Telephone Competition in Arkansas and New York: In the pre-divestiture era, the regulatory environment in the U.S. was fairly uniform and harmonious with the FCC setting the course and the accommodative state PUCs making corresponding changes in their own policies. The divestiture fractured this monolithic system as it forced the PUCs to respond to new forces unleashed in their own backyards. Soon there was great diversity in the overall regulatory landscape. Within this new environment, there is considerable disparity among the PUCs in terms of their ability to implement new ideas. This paper seeks to understand the structural factors that influence the latitude of regulatory action by PUCs via a comparative study of local telephone competition policy making in Arkansas and New York. The analysis suggests that the presence or absence of countervailing forces determines the relative autonomy the PUCs enjoy and thereby their ability to introduce new ideas into their states.<|reference_end|>
arxiv
@article{lee2001puc, title={PUC Autonomy and Policy Innovation: Local Telephone Competition in Arkansas and New York}, author={Hokyu Lee and Harmeet Sawhney}, journal={arXiv preprint arXiv:cs/0109043}, year={2001}, number={TPRC-2001-026}, archivePrefix={arXiv}, eprint={cs/0109043}, primaryClass={cs.CY} }
lee2001puc
arxiv-670112
cs/0109044
Analyzing ENUM Service and Administration from the Bottom Up: The addressing system for IP telephony and beyond
<|reference_start|>Analyzing ENUM Service and Administration from the Bottom Up: The addressing system for IP telephony and beyond: ENUM creates many new market opportunities and raises several important policy issues related to the implementation and administration of the ENUM database and services. The recent World Telecommunications Policy Forum 2001 dealt with the emergence of ENUM as an important numbering issue for IP telephony. This paper examines important emerging issues of ENUM administration and policy through a bottom-up empirical research approach. We identify potential key ENUM services and estimate the size of the service market opportunities created by the availability of PSTN-IP addressing and mapping mechanisms, particularly in the context of IP telephony. We also analyze the possible administrative models and relationship scenarios among different ENUM players such as Registry(ies), Registrars, Telephone Service Providers, and ENUM Application Service Providers. We then assess the effects of various administrative model architectures of ENUM service by looking at the market opportunities and motivations of the players. From the empirical findings, we draw implications for transactions among different kinds of ENUM service providers. Finally, the results of the model analysis are used to discuss policy issues around ENUM and IP telephony services. Keywords: IP Telephony, ENUM, Internet Policy, Numbering and Addressing System, Service and Market Study, Administration Model, Empirical Market Study.<|reference_end|>
arxiv
@article{hwang2001analyzing, title={Analyzing ENUM Service and Administration from the Bottom Up: The addressing system for IP telephony and beyond}, author={Junseok Hwang, Milton Mueller, Gunyoung Yoon, and Joonmin Kim}, journal={arXiv preprint arXiv:cs/0109044}, year={2001}, number={TPRC-2001-063}, archivePrefix={arXiv}, eprint={cs/0109044}, primaryClass={cs.CY} }
hwang2001analyzing
arxiv-670113
cs/0109045
Product Cycle, Wintelism, and Cross-national Production Networks (CPN) for Developing Countries-- China's Telecom Manufacturing Industry as A Case
<|reference_start|>Product Cycle, Wintelism, and Cross-national Production Networks (CPN) for Developing Countries-- China's Telecom Manufacturing Industry as A Case: Focusing on the telecom manufacturing industry in China, this paper contends that the existing literature needs to be expanded in order to explain the Chinese case. First, product cycle theory could be applied to explain multinational corporations' strategies of importing and localizing their products in China in order to take advantage of lower labor costs and often more significantly to break barriers to the Chinese market. Second, there are no significant indicators pointing to local multinational subsidiaries and indigenous manufacturers serving as a substantial part of the cross-national production networks in the global telecom industry yet, although there are some signs of potential development. Third, the success of "Wintelism" and the maturity of cross-national production networks in the global market have had significant impacts on China's indigenous industry.<|reference_end|>
arxiv
@article{tan2001product, title={Product Cycle, Wintelism, and Cross-national Production Networks (CPN) for Developing Countries-- China's Telecom Manufacturing Industry as A Case}, author={Zixiang Alex Tan}, journal={arXiv preprint arXiv:cs/0109045}, year={2001}, number={TPRC-2001-059}, archivePrefix={arXiv}, eprint={cs/0109045}, primaryClass={cs.CY} }
tan2001product
arxiv-670114
cs/0109046
Internet Radio: A New Engine for Content Diversity?
<|reference_start|>Internet Radio: A New Engine for Content Diversity?: While traditional radio stations are subject to extensive government regulations, Internet radio stations remain largely unregulated. As Internet radio usage has increased certain stakeholders have begun to argue that these Internet radio broadcasters are providing significant and diverse programming to American audiences and that government regulation of spectrum-using radio station ownership may be further relaxed. One of the primary justifications for regulation of ownership has been to protect diversity in broadcasting. This study hypothesizes that Internet radio broadcasting does add diversity to the radio broadcasting industry and that it should be considered as relevant by regulators. This study evaluates the role of Internet radio broadcasters according to five criteria intended to gauge the level of diversity being delivered to listeners online. By measuring the levels of format, channel, ownership, location and language diversity among Internet radio stations, it is possible to draw benchmark lessons about the new medium's ability to provide Americans with diverse broadcasting options. The study finds that Internet radio broadcasters are in fact adding measurable diversity to the radio broadcasting industry. Internet broadcasters are providing audiences with access to an increasing number of stations, owners, formats, and language choices, and it is likely that technologies aiding in the mobility of access as well as broadband evolution will reinforce these findings.<|reference_end|>
arxiv
@article{compaine2001internet, title={Internet Radio: A New Engine for Content Diversity?}, author={Benjamin Compaine and Emma Smith}, journal={arXiv preprint arXiv:cs/0109046}, year={2001}, number={TPRC-2001-2078}, archivePrefix={arXiv}, eprint={cs/0109046}, primaryClass={cs.CY} }
compaine2001internet
arxiv-670115
cs/0109047
Between a rock and a hard place: assessing the application of domestic policy and South Africa's commitments under the WTO'S Basic Telecommunications Agreement
<|reference_start|>Between a rock and a hard place: assessing the application of domestic policy and South Africa's commitments under the WTO'S Basic Telecommunications Agreement: South Africa adopted the GATS Basic Agreement on Telecommunications and the regulatory principles in 1998. Obligations undertaken by South Africa mirrored the framework for the gradual telecommunications reform process that was begun in 1996. In the light of two threatened actions for anti-competitive practices in violation of the Agreement, this paper reviews the nature of the commitments undertaken by South Africa and assesses the country's compliance to date. This paper also seeks to explore the tension that arises between domestic policy reforms and international trade aspirations. It is argued that the dynamic produced through this tension affords domestic governments a mechanism with which to balance the seemingly opposing goals of competition and development. It is further argued that the broad regulatory principles, adopted by all signatories and often criticized for lack of precision, facilitate this fine balancing and affords domestic governments an opportunity to advance sovereign concerns while pursuing international trade ideals.<|reference_end|>
arxiv
@article{cohen2001between, title={Between a rock and a hard place: assessing the application of domestic policy and South Africa's commitments under the WTO'S Basic Telecommunications Agreement}, author={Tracy Cohen}, journal={arXiv preprint arXiv:cs/0109047}, year={2001}, number={TPRC-2001-007}, archivePrefix={arXiv}, eprint={cs/0109047}, primaryClass={cs.CY} }
cohen2001between
arxiv-670116
cs/0109048
Competition and Commons: The Post-Telecom Act Public Interest, in and after the AOLTW Merger
<|reference_start|>Competition and Commons: The Post-Telecom Act Public Interest, in and after the AOLTW Merger: In asserting a competitive market environment as a justification for regulatory forbearance, the Telecommunications Act of 1996 finally articulated a clear benchmark for the FCC's public interest standard, one of the most protean concepts in communications. This seeming clarity has not, however, inhibited intense political conflict over the term. This paper examines public and regulatory debate over the AOL Time Warner merger as an example of the way in which the linkage between competition and commons policy becomes relevant to communications policy, particularly in relation to mass media, and discusses interpretations of the public interest in the current FCC. The paper proposes that the Telecom Act's goal of fostering economic competition among information service providers and the democratic ideal of nurturing public relationships and behaviors can be linked. Competition policy that creates the opportunity for untrammeled interactivity also provides a sine qua non for nurturing the social phenomenon of the commons. The linked concepts of competition and commons could also provide useful ways to interpret the public interest in such policy arenas as spectrum allocation and intellectual property.<|reference_end|>
arxiv
@article{aufderheide2001competition, title={Competition and Commons: The Post-Telecom Act Public Interest, in and after the AOLTW Merger}, author={Patricia Aufderheide}, journal={arXiv preprint arXiv:cs/0109048}, year={2001}, number={TPRC-2001-2030}, archivePrefix={arXiv}, eprint={cs/0109048}, primaryClass={cs.CY} }
aufderheide2001competition
arxiv-670117
cs/0109049
Signing Initiative Petitions Online: Possibilities, Problems and Prospects
<|reference_start|>Signing Initiative Petitions Online: Possibilities, Problems and Prospects: Many people expect the Internet to change American politics, most likely in the direction of increasing direct citizen participation and forcing government officials to respond more quickly to voter concerns. A recent California initiative with these objectives would authorize use of encrypted digital signatures over the Internet to qualify candidates, initiatives, and other ballot measures. Proponents of Internet signature gathering say it will significantly lower the cost of qualifying initiatives and thereby reduce the influence of organized, well-financed interest groups. They also believe it will increase both public participation in the political process and public understanding about specific measures. However, opponents question whether Internet security is adequate to prevent widespread abuse and argue that the measure would create disadvantages for those who lack access to the Internet. Beyond issues of security, cost, and access lie larger questions about the effects of Internet signature gathering on direct democracy. Would it encourage greater and more informed public participation in the political process? Or would it flood voters with ballot measures and generally worsen current problems with the initiative process itself? Because we lack good data on these questions, answers to them today are largely conjectural. We can be fairly sure, however, that Internet petition signing, like Internet voting, will have unintended consequences.<|reference_end|>
arxiv
@article{baer2001signing, title={Signing Initiative Petitions Online: Possibilities, Problems and Prospects}, author={Walter S. Baer}, journal={arXiv preprint arXiv:cs/0109049}, year={2001}, number={TPRC-2001-2054}, archivePrefix={arXiv}, eprint={cs/0109049}, primaryClass={cs.CY} }
baer2001signing
arxiv-670118
cs/0109050
A Framework for Assessing Universal Service Obligations: A Developing Country Perspective
<|reference_start|>A Framework for Assessing Universal Service Obligations: A Developing Country Perspective: A critical element of most national telecom policy objectives is advancing universal service. In a multi-operator context, this is usually operationalized through Universal Service Obligations (USO), by which various operators are mandated to provide a part of their services to rural areas or to high-cost-to-serve customers at "affordable" prices. This paper highlights the various issues in USO from a developing country perspective. The first part of the paper gives an overview of USO practices and issues. The second part reviews the Telecom Regulatory Authority of India's recommendations on USO cost estimation. In the third part, the paper analyzes characteristics of rural exchanges with a view to evolving a framework for assessing USO. This framework is applicable to developing countries, as the study carried out in this paper is in the context of a developing country characterized by low telecom penetration and non-availability of data with regulators.<|reference_end|>
arxiv
@article{jain2001a, title={A Framework for Assessing Universal Service Obligations: A Developing Country Perspective}, author={Rekha S. Jain, Pinaki Das}, journal={arXiv preprint arXiv:cs/0109050}, year={2001}, number={TPRC-2001-045}, archivePrefix={arXiv}, eprint={cs/0109050}, primaryClass={cs.CY} }
jain2001a
arxiv-670119
cs/0109051
Internet TV: Business Models and Program Content
<|reference_start|>Internet TV: Business Models and Program Content: Internet technology should eventually provide important improvements over established media, not only in the efficiency of broadband delivery but also, of particular importance, in the efficiency of the business models that can be used to collect money for programming. I identify five economic characteristics of Internet technology that should lead to these greater efficiencies: (1) lower delivery costs and reduced capacity constraints, (2) more efficient interactivity, (3) more efficient advertising and sponsorship, (4) more efficient direct pricing and bundling, and (5) lower costs of copying and sharing. The most successful Internet TV business models are likely to involve syndication to or from other media, and also international distribution. In the broader context, Internet TV is another syndication outlet by which program suppliers can segment their overall markets and thus support higher production investments. Many innovative and more sharply focused programs will surely prosper on Internet TV, but the attractiveness to audiences of high production value programming will tend to advantage broad appeal programming, such as Hollywood movies. Historical evidence about the performance of cable television and videocassettes is presented to support these points.<|reference_end|>
arxiv
@article{waterman2001internet, title={Internet TV: Business Models and Program Content}, author={David Waterman}, journal={arXiv preprint arXiv:cs/0109051}, year={2001}, number={TPRC-2001-2118}, archivePrefix={arXiv}, eprint={cs/0109051}, primaryClass={cs.CY} }
waterman2001internet
arxiv-670120
cs/0109052
Globalization and Governance in Cyberspace: Mapping the Processes of Emergent Regime Formation in Global Information and Communications Policy
<|reference_start|>Globalization and Governance in Cyberspace: Mapping the Processes of Emergent Regime Formation in Global Information and Communications Policy: This paper develops a theoretical perspective on globalization and the Information Society and combines it with a critical usage of international regime theory as a heuristic for understanding the current historical period of transition from an international telecommunications regime (Cowhey, 1990; 1994) to a new and complex regime aimed at providing governance for the Global Information Infrastructure and Global Information Society (GII/GIS). In analyzing the principles, values, norms, rules, collective decision-making procedures, and enforcement mechanisms of the emergent GII/GIS regime, this paper differentiates between three regime levels: (1) Macro-Regime--global; (2) Mezzo-Regime--regional and sub-regional; and (3) Micro-Regime--national. The paper employs a case-study approach to explore some of the specific national responses (i.e. South Africa) to this regime transition, with an analysis of potential best practices and lessons learned for other emerging economies. Key findings in this paper are: (1) that a range of social, political, economic, and technological factors are eroding the existing international telecommunications regime (e.g., VoIP, call-back, VSATs, accounting rate restructuring, pressure for applications development, and SMMEs); (2) a new regime for global information and communications policy is emerging, but is being driven not by the broad possibilities of the Information Society, but by the more specific interests of global and multi-national corporations related to global e-commerce; (3) numerous strategic responses have been developed at national, subregional, and regional levels to the challenges of this transition in both developed and developing regions; and (4) without a collaborative response, the developing world will be further marginalized by this new regime.<|reference_end|>
arxiv
@article{cogburn2001globalization, title={Globalization and Governance in Cyberspace: Mapping the Processes of Emergent Regime Formation in Global Information and Communications Policy}, author={Derrick L. Cogburn}, journal={arXiv preprint arXiv:cs/0109052}, year={2001}, number={TPRC-2001-2277}, archivePrefix={arXiv}, eprint={cs/0109052}, primaryClass={cs.CY} }
cogburn2001globalization
arxiv-670121
cs/0109053
Price Increases from Online Privacy
<|reference_start|>Price Increases from Online Privacy: Consumers value keeping some information about them private from potential marketers. E-commerce dramatically increases the potential for marketers to accumulate otherwise private information about potential customers. Online marketers claim that this information enables them to better market their products. Policy makers are currently drafting rules to regulate the way in which these marketers can collect, store, and share this information. However, there is little evidence yet either of consumers' valuation of their privacy or of the benefits they might reap through better target marketing. We provide a framework for measuring a portion of the benefits from allowing marketers to make better use of consumer information. Target marketing is likely to reduce consumer search costs, improve consumer product selection decisions, and lower the marketing costs of goods sold. Our model allows us to estimate the value to consumers of only the latter, price reductions from more efficient marketing.<|reference_end|>
arxiv
@article{ward2001price, title={Price Increases from Online Privacy}, author={Michael R. Ward and Yu-Ching Chen}, journal={arXiv preprint arXiv:cs/0109053}, year={2001}, number={TPRC-2001-014}, archivePrefix={arXiv}, eprint={cs/0109053}, primaryClass={cs.CY} }
ward2001price
arxiv-670122
cs/0109054
Monopoly Power on the Web - A Preliminary Investigation of Search Engines
<|reference_start|>Monopoly Power on the Web - A Preliminary Investigation of Search Engines: E-Commerce challenges traditional approaches to assessing monopolistic practices due to the rapid rate of growth, rapid change in technology, difficulty in assessing market share for information products like web sites, and high degree of interconnectivity and alliance formation among corporations. This paper provides a fundamental framework that integrates network and economic perspectives on the search engine market. The findings indicate that (1) despite an increasing number of search engines, barriers to entry seem high, largely due to the exponential growth in the number of web sites, the non-scalability of current search technology, and collective switching costs; and (2) older search engine sites typically tend to have more features to lock in users. Using standard economic indicators (CR4=58% and HHI=1163), the industry looks close to being plagued by anticompetitive practices. But the network-adjusted HHI constructed in this paper has a value of 870, which suggests that there is less cause for concern. All indicators suggest that Yahoo would be a contender; other possible contenders are MSN and Netscape. On the basis of results to date, some search engines keep increasing their audience reach while others do not, and the trend shows that some search engines may come to dominate the search engine market. We suggest conducting research on the coverage performance of search engines and investigating "information search cost" as a performance indicator of search techniques. In addition, we suggest paying attention to any anticompetitive conduct (e.g. product bundling) that may lessen competition and reduce consumer welfare. The combination of network theory and economic theory to study the search engine market is a particularly powerful approach for E-Commerce.<|reference_end|>
arxiv
@article{sheu2001monopoly, title={Monopoly Power on the Web - A Preliminary Investigation of Search Engines}, author={Tair-Rong Sheu, Kathleen Carley}, journal={arXiv preprint arXiv:cs/0109054}, year={2001}, number={TPRC-2001-035}, archivePrefix={arXiv}, eprint={cs/0109054}, primaryClass={cs.CY} }
sheu2001monopoly
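The standard concentration measures quoted in the abstract above (CR4 = 58% and HHI = 1163) are straightforward to reproduce; a minimal sketch follows. The paper's "network adjusted HHI" of 870 reweights shares using network data the abstract does not give, so it is not attempted here, and the shares below are illustrative only.

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared market shares in
    percentage points (10,000 = monopoly; roughly 1,000-1,800 is
    conventionally 'moderately concentrated')."""
    return sum((100 * s) ** 2 for s in shares)

def cr4(shares):
    """Four-firm concentration ratio, in percent."""
    return 100 * sum(sorted(shares, reverse=True)[:4])

# Illustrative shares only, not the paper's data:
shares = [0.25, 0.15, 0.10, 0.08, 0.07] + [0.05] * 7
print(hhi(shares), cr4(shares))  # 1238.0 58.0
```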
arxiv-670123
cs/0109055
Standardization versus Coverage in Wireless Telephone Networks
<|reference_start|>Standardization versus Coverage in Wireless Telephone Networks: The issue of market-based versus mandated standards has been addressed in many settings. In most settings in which network effects are present, compatibility across platforms has been a key determinant of the success or failure of a particular technology. In the case of wireless telecommunications, however, interconnection and the availability of the relevant infrastructure can be a substitute for compatibility. In this paper, we examine the tradeoff between mandated standards and interconnection. We first provide institutional background; we then empirically examine whether, other things being equal, penetration rates were lower (or higher) for countries with multiple incompatible digital standards. We finally discuss the implications of our results for the current debate about 3G standards, in which CDMA2000, which is backed mainly by US firms, competes with WCDMA, which is backed by the European Community.<|reference_end|>
arxiv
@article{gandal2001standardization, title={Standardization versus Coverage in Wireless Telephone Networks}, author={Neil Gandal (Tel Aviv University), David Salant (NERA Economic Consulting), Leonard Waverman (London Business School and NERA Economic Consulting)}, journal={arXiv preprint arXiv:cs/0109055}, year={2001}, number={TPRC-2001-010}, archivePrefix={arXiv}, eprint={cs/0109055}, primaryClass={cs.CY} }
gandal2001standardization
arxiv-670124
cs/0109056
Ideological and Policy Origins of the Internet, 1957-1969
<|reference_start|>Ideological and Policy Origins of the Internet, 1957-1969: This paper examines the ideological and policy consensus that shaped computing research funded by the Information Processing Techniques Office (IPTO) within the Department of Defense's Advanced Research Projects Agency (ARPA). This historical case study of the period between Sputnik and the creation of the ARPANET shows how military, scientific, and academic values shaped the institutions and relations of a foundational period in the creation of the Internet. The paper probes three areas: the ideology of the science policy consensus, the institutional philosophy of IPTO under J. C. R. Licklider, and the ways that this consensus and philosophy shaped IPTO research in the period leading to the creation of the ARPANET. By examining the intellectual, cultural, and institutional details of the consensus that governed IPTO research between 1957 and 1969, we can understand the ways that these values defined the range of possibilities for network computing. The influence of the social values expressed by these actors was decisive: that government had an obligation to support a broad base of scientific research to promote both the public good and the national defense; that IPTO-sponsored computing research would accomplish both military and scientific objectives; and that IPTO could leverage its power within this consensus to create a network to share resources and unite researchers over geographical distance. A greater awareness of the ways that "consensus" worked in this period -- the "pre-history" of the Internet -- provides a richer context for evaluating the unique features of the Internet, such as its open architecture, collegial culture, and standards-based governance.<|reference_end|>
arxiv
@article{russell2001ideological, title={Ideological and Policy Origins of the Internet, 1957-1969}, author={Andrew L. Russell}, journal={arXiv preprint arXiv:cs/0109056}, year={2001}, number={TPRC-2001-087}, archivePrefix={arXiv}, eprint={cs/0109056}, primaryClass={cs.CY} }
russell2001ideological
arxiv-670125
cs/0109057
Do Switching Costs Make Markets More or Less Competitive?: The Case of 800-Number Portability
<|reference_start|>Do Switching Costs Make Markets More or Less Competitive?: The Case of 800-Number Portability: Do switching costs reduce or intensify price competition in markets where firms charge the same price to old and new consumers? Theoretically, the answer could be either "yes" or "no," due to two opposing incentives in firms' pricing decisions. The firm would like to charge a higher price to previous purchasers who are "locked-in" and a lower price to unattached consumers who offer higher future profitability. I demonstrate this ambiguity in an infinite-horizon theoretical model. 800- (toll-free) number portability provides empirical evidence to answer this question. Before portability, a customer had to change numbers to change service providers. This imposed significant switching costs on users, who generally invested heavily to publicize these numbers. In May 1993 a new database made 800-numbers portable. This drop in switching costs and regulations that precluded price discrimination between old and new consumers provide an empirical test of switching costs' effect on price competition. I use contracts for virtual private network (VPN) services to test how AT&T adjusted its prices for toll-free services in response to portability. Preliminarily (awaiting completion of data collection), I find that AT&T reduced margins for VPN contracts containing toll-free services relative to those that did not as the portability date approached. This implies that the switching costs due to non-portability made the market less competitive. These results suggest that, despite toll-free services growing rapidly during this time period, AT&T's incentive to charge a higher price to "locked-in" consumers exceeded its incentive to capture new consumers in the high switching costs era of non-portability.<|reference_end|>
arxiv
@article{viard2001do, title={Do Switching Costs Make Markets More or Less Competitive?: The Case of 800-Number Portability}, author={V. Brian Viard}, journal={arXiv preprint arXiv:cs/0109057}, year={2001}, number={TPRC-2001-085}, archivePrefix={arXiv}, eprint={cs/0109057}, primaryClass={cs.CY} }
viard2001do
arxiv-670126
cs/0109058
Profiling Internet Users' Participation in Social Change Agendas: An application of Q methodology
<|reference_start|>Profiling Internet Users' Participation in Social Change Agendas: An application of Q methodology: New computer-mediated channels of communication are action oriented and have the ability to deliver information and dialogue - moderated and unmoderated - which can facilitate the bringing together of a series of society's stakeholders, opinion leaders and change agents who have the ability to influence social action. However, existing online studies have been limited in explaining Internet users' willingness to participate in social change agendas online. They have relied predominantly on basic demographic descriptors such as age, education, income and access to technology, and have ignored social, psychological and attitudinal variables that may explain online participation and social change. The authors propose a Q methodology research approach to better evaluate Internet users' participation in online social change agendas.<|reference_end|>
arxiv
@article{previte2001profiling, title={Profiling Internet Users' Participation in Social Change Agendas: An application of Q methodology}, author={J. Previte, G. Hearn, S. Dann}, journal={arXiv preprint arXiv:cs/0109058}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109058}, primaryClass={cs.CY} }
previte2001profiling
arxiv-670127
cs/0109059
Bringing the Internet to Schools: US and EU policies
<|reference_start|>Bringing the Internet to Schools: US and EU policies: The Internet is changing rapidly the way people around the world communicate, learn, and work. Yet the tremendous benefits of the Internet are not shared equally by all. One way to close the gap of the "digital divide" is to ensure Internet access to all schools from an early age. While both the USA and EU have embraced the promotion of Internet access to schools, the two have decided to finance it differently. This paper shows that the main costs of Internet access to schools are not communications-related (telecommunications and Internet services) but rather non-communications-related (hardware, educational training, software). This paper goes on to discuss whether the identified costs should be financed in any way by the universal service obligations funded by the telecommunications industry/sector/consumers (sector specific) or a general governmental budget (educational budget).<|reference_end|>
arxiv
@article{kosmidis2001bringing, title={Bringing the Internet to Schools: US and EU policies}, author={Michelle S. Kosmidis}, journal={arXiv preprint arXiv:cs/0109059}, year={2001}, number={TPRC-2001-076}, archivePrefix={arXiv}, eprint={cs/0109059}, primaryClass={cs.CY} }
kosmidis2001bringing
arxiv-670128
cs/0109060
Branching: the Essence of Constraint Solving
<|reference_start|>Branching: the Essence of Constraint Solving: This paper focuses on the branching process for solving any constraint satisfaction problem (CSP). A parametrised schema is proposed that (with suitable instantiations of the parameters) can solve CSPs over both finite and infinite domains. The paper presents a formal specification of the schema and a statement of a number of interesting properties that, subject to certain conditions, are satisfied by any instance of the schema. It is also shown that the operational procedures of many constraint systems (including cooperative systems) satisfy these conditions. Moreover, the schema is also used to solve the same CSP in different ways by means of different instantiations of its parameters.<|reference_end|>
arxiv
@article{fernandez2001branching:, title={Branching: the Essence of Constraint Solving}, author={Antonio J. Fernandez, Patricia M. Hill}, journal={arXiv preprint arXiv:cs/0109060}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109060}, primaryClass={cs.PL} }
fernandez2001branching:
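A minimal finite-domain illustration of the idea in the abstract above: a branching search parametrised by a variable-selection heuristic and a value-ordering heuristic, so that different instantiations yield different solvers. This plain backtracking sketch covers finite domains only; the paper's schema also handles infinite domains and cooperative systems, which are beyond a few lines of Python, and all names here are illustrative.

```python
def solve(domains, consistent, select_var, order_values, assignment=None):
    """Generic branching search. domains: {var: values}; consistent:
    a test on partial assignments; the last two arguments are the
    schema's parameters."""
    assignment = assignment or {}
    unassigned = [v for v in domains if v not in assignment]
    if not unassigned:
        return assignment
    var = select_var(unassigned, domains)      # schema parameter 1
    for value in order_values(var, domains):   # schema parameter 2
        candidate = {**assignment, var: value}
        if consistent(candidate):
            result = solve(domains, consistent, select_var,
                           order_values, candidate)
            if result is not None:
                return result
    return None

# One instantiation: first-fail variable choice with ascending values.
def first_fail(unassigned, domains):
    return min(unassigned, key=lambda v: len(domains[v]))

def ascending(var, domains):
    return sorted(domains[var])
```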
arxiv-670129
cs/0109061
Geography and the Internet: Is the Internet a Substitute or a Complement for Cities?
<|reference_start|>Geography and the Internet: Is the Internet a Substitute or a Complement for Cities?: By combining persons around the world into a single market, the Internet may serve as a substitute for urban agglomeration. That is, the Internet may level the consumption playing field between large, variety-laden and small, variety-starved markets. However, if local content on the Internet is more prevalent in larger markets, then the Internet may be a complement for urban agglomeration. Characterizing the nature of available content using Media Metrix web page visits by about 13,500 households, we document that substantially more online local content is available in larger markets. Combining this with CPS Internet use data, we find statistically significant direct evidence of both complementarity and substitutability: Individuals are more likely to connect in markets with more local online content and, holding local online content constant, are less likely to connect in larger markets. We also find that individuals connect to overcome local isolation: Blacks are more likely to connect, relative to whites, when they comprise a smaller fraction of the local population, making the Internet a substitute for agglomeration of preference minorities within cities, if not cities themselves. On balance we find that the substitution and complementarity effects offset each other, so that the Internet does not promote or discourage agglomeration in larger markets.<|reference_end|>
arxiv
@article{sinai2001geography, title={Geography and the Internet: Is the Internet a Substitute or a Complement for Cities?}, author={Todd Sinai and Joel Waldfogel}, journal={arXiv preprint arXiv:cs/0109061}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0109061}, primaryClass={cs.CY} }
sinai2001geography
arxiv-670130
cs/0109062
India Attempts to Give a Jump-start to its Derailed Telecommunications Liberalization Process
<|reference_start|>India Attempts to Give a Jump-start to its Derailed Telecommunications Liberalization Process: After the 1991 economic policy shifted from a closed economic model to a market-oriented one, the government invited the private sector to participate in reforming the telecom sector. However, the government took a half-hearted approach to overhauling the legal and regulatory regime to suit a competitive market and to framing the 1994 Telecom Policy. Competition was allowed in cellular and basic services. The ministry and the incumbent (DOT) issued licenses to their competitors. Lack of transparency in issuing licenses and unrealistic license fees derailed the reform process and led to wasteful litigation. The courts did not support the regulator and virtually made its role redundant.<|reference_end|>
arxiv
@article{gupta2001india, title={India Attempts to Give a Jump-start to its Derailed Telecommunications Liberalization Process}, author={Rajni Gupta}, journal={arXiv preprint arXiv:cs/0109062}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109062}, primaryClass={cs.CY} }
gupta2001india
arxiv-670131
cs/0109063
Universal service, specific services on generic networks, some logic begins to emerge in the policy area
<|reference_start|>Universal service, specific services on generic networks, some logic begins to emerge in the policy area: It has proved difficult to translate the lessons from the literature on universal service into the policy framework because of political interests and regulatory capture. Neither the USA nor Europe has done a very good job of devising a clean framework, and the WTO agreement is sparing in this area. A number of pressures in the European context have enabled a more systematic approach to emerge that exploits the academic work. They include the need for the European regulatory framework to encompass Eastern European countries where network development and income levels are much lower; the desire to bring the Internet within the universal service regulatory framework; and a willingness to design a framework that covers all communications networks and removes the telecommunications bias, thereby forcing issues of economic neutrality to the fore. The paper systematically goes through a number of key areas and principles of regulation and how they are being designed to deal with a range of national situations. They include defining the scope of universal service and the principles by which it might be modified in the light of technological and economic developments; incorporating latitude for intervention outside this defined scope; defining incentive and designation methods to encourage the efficient supply of elements of universal service obligations; interpreting affordability in the context of price and income levels that diverge considerably; requiring both allocative efficiency and competitive neutrality; and formulating alternative financing methods, including general government financing and value-added-tax-type methods, which can co-exist and provide comparative policy yardsticks.<|reference_end|>
arxiv
@article{cawley2001universal, title={Universal service, specific services on generic networks, some logic begins to emerge in the policy area}, author={Richard Cawley}, journal={arXiv preprint arXiv:cs/0109063}, year={2001}, number={TPRC-2001-046}, archivePrefix={arXiv}, eprint={cs/0109063}, primaryClass={cs.CY} }
cawley2001universal
arxiv-670132
cs/0109064
Commonalities: The REA and High-Speed Rural Internet Access
<|reference_start|>Commonalities: The REA and High-Speed Rural Internet Access: This paper explores commonalities between the creation of the Rural Electrification Administration and the similar dilemma of providing an affordable infrastructure for high-speed Internet access in places where profit incentives do not exist. In the case of the R.E.A., the necessity for an aggressive federal initiative to wire rural America, where the market for electricity had failed, is revisited as the missing incentives are identified and explored. We then examine the incentive-poor similarities between rural electrification and rural high-speed Internet access by examining how consumers currently and prospectively gain access to broadband Internet service. The regulatory environment created by the Telecommunications Act of 1996 and the Federal Communications Commission is considered. Although the FCC is required (Section 254.b.3) to take regulatory measures to ensure comparable and affordable access to the Internet for all Americans, the historical similarities and comparative analysis of rural electrification and high-speed Internet access suggest the goal of universal service is unlikely to be met in the near future. Regulatory disincentives to build such networks are present, driven in part by market realities and in part by competitive restrictions in the Telecommunications Act of 1996. Finally, we pose the question of whether a federal effort equivalent to the R.E.A. is needed to ensure that residents of sparsely populated areas, like their predecessors in the 1930s, are not comparatively disadvantaged in the first decades of the 21st century. The paper concludes with a proposal to accelerate the deployment of broadband infrastructure in rural America.<|reference_end|>
arxiv
@article{malone2001commonalities:, title={Commonalities: The R.E.A. and High-Speed Rural Internet Access}, author={Laurence J. Malone}, journal={arXiv preprint arXiv:cs/0109064}, year={2001}, number={TPRC-2001-2128}, archivePrefix={arXiv}, eprint={cs/0109064}, primaryClass={cs.CY} }
malone2001commonalities:
arxiv-670133
cs/0109065
On the Use of Vickrey Auctions for Spectrum Allocation in Developing Countries
<|reference_start|>On the Use of Vickrey Auctions for Spectrum Allocation in Developing Countries: In this paper, we assess the applicability of auctions based on the Vickrey second-price model for allocating wireless spectrum in developing countries. We first provide an overview of auction models for allocating resources. We then examine the experience of auctioning spectrum in different countries. Based on this examination, we posit some axioms that seem to have to be satisfied when allocating spectrum in most developing countries. In light of these axioms, we provide a critical evaluation of using Vickrey second-price auctions to allocate spectrum in developing countries. We suggest the use of a new auction mechanism, the Vickrey "share auction", which will satisfy many of these axioms.<|reference_end|>
arxiv
@article{anandalingam2001on, title={On the Use of Vickrey Auctions for Spectrum Allocation in Developing Countries}, author={G. Anandalingam}, journal={arXiv preprint arXiv:cs/0109065}, year={2001}, number={TPRC-2001-2146}, archivePrefix={arXiv}, eprint={cs/0109065}, primaryClass={cs.CY} }
anandalingam2001on
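For readers unfamiliar with the mechanism this paper evaluates, here is a minimal sketch of a single-license Vickrey (second-price, sealed-bid) auction: the highest bidder wins but pays the second-highest bid, which makes truthful bidding a dominant strategy. The bidder names and amounts are illustrative, and the paper's proposed "share auction" variant is not modeled here.

```python
def vickrey_outcome(bids):
    """bids: dict mapping bidder -> sealed bid amount.
    Returns (winner, price_paid) under second-price rules."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # Winner pays the second-highest bid (or own bid if unopposed).
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

print(vickrey_outcome({"A": 90, "B": 120, "C": 75}))  # ('B', 90)
```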
arxiv-670134
cs/0109066
CLP Approaches to 2D Angle Placements
<|reference_start|>CLP Approaches to 2D Angle Placements: The paper presents two CLP approaches to 2D angle placements, implemented in CHIP v.5.3. The first is based on the classical (rectangular) cumulative global constraint, the second on the new trapezoidal cumulative global constraint. Both approaches are applied to a specific placement problem presented in the paper.<|reference_end|>
arxiv
@article{szczygiel2001clp, title={CLP Approaches to 2D Angle Placements}, author={Tomasz Szczygiel}, journal={arXiv preprint arXiv:cs/0109066}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109066}, primaryClass={cs.PL} }
szczygiel2001clp
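Since the abstract assumes familiarity with the cumulative global constraint, a small Python sketch of the classical (rectangular) semantics may help: each task holds a constant resource amount over its duration, and total use may never exceed capacity. This only illustrates the constraint's meaning; it is not CHIP v.5.3's actual API, and the trapezoidal variant (time-varying resource use) is not shown.

```python
def cumulative_ok(tasks, capacity):
    """tasks: list of (start, duration, resource) triples.
    True iff aggregate resource use stays within capacity at all times."""
    if not tasks:
        return True
    horizon = max(s + d for s, d, _ in tasks)
    for t in range(horizon):
        used = sum(r for s, d, r in tasks if s <= t < s + d)
        if used > capacity:
            return False
    return True

# Three tasks sharing a resource of capacity 3: a feasible schedule.
print(cumulative_ok([(0, 4, 2), (2, 3, 1), (4, 2, 2)], capacity=3))  # True
```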
arxiv-670135
cs/0109067
Voice over IP in the Local Exchange: A Case Study
<|reference_start|>Voice over IP in the Local Exchange: A Case Study: There have been a small number of cost studies of Voice over IP (VoIP) in the academic literature. Generally, they have examined abstract networks, have not focused on the public switched telephone network, or have not included operating costs. This paper presents the operating cost portion of our ongoing research project comparing circuit-switched and IP network costs for an existing local exchange carrier. We have found that (1) the operating cost differential between IP and circuit switching for this LEC will be small; and (2) a substantial majority of a telco's operating cost lies in customer service and outside plant maintenance, which will be incurred equally in both networks in a pure substitution scenario. Thus, the operating cost difference lies in the actual cost differences of the switching technologies. This appears to be less than 10%-15% of the total operating cost of the network. Hence, even if the cost differences for substitute services were large, the overall impact on the telco's financial performance would be small. But IP has some hidden benefits on the operations side. Most notably, data and voice services could be managed with the same systems infrastructure, meaning that the incremental operations cost of rolling out new services would likely be much lower, since it would all be IP.<|reference_end|>
arxiv
@article{weiss2001voice, title={Voice over IP in the Local Exchange: A Case Study}, author={Martin B.H. Weiss and Hak-Ju Kim}, journal={arXiv preprint arXiv:cs/0109067}, year={2001}, number={TPRC-2001-053}, archivePrefix={arXiv}, eprint={cs/0109067}, primaryClass={cs.CY} }
weiss2001voice
arxiv-670136
cs/0109068
Second-Level Digital Divide: Mapping Differences in People's Online Skills
<|reference_start|>Second-Level Digital Divide: Mapping Differences in People's Online Skills: Much of the existing approach to the digital divide suffers from an important limitation. It is based on a binary classification of Internet use by only considering whether someone is or is not an Internet user. To remedy this shortcoming, this project looks at the differences in people's level of skill with respect to finding information online. Findings suggest that people search for content in a myriad of ways and there is a large variance in how long people take to find various types of information online. Data are collected to see how user demographics, users' social support networks, people's experience with the medium, and their autonomy of use influence their level of user sophistication.<|reference_end|>
arxiv
@article{hargittai2001second-level, title={Second-Level Digital Divide: Mapping Differences in People's Online Skills}, author={Eszter Hargittai}, journal={arXiv preprint arXiv:cs/0109068}, year={2001}, number={TPRC-2001-083}, archivePrefix={arXiv}, eprint={cs/0109068}, primaryClass={cs.CY} }
hargittai2001second-level
arxiv-670137
cs/0109069
United States v Microsoft: A Failure of Antitrust in the New Economy
<|reference_start|>United States v Microsoft: A Failure of Antitrust in the New Economy: This paper analyzes the law and economics of United States v. Microsoft, a landmark case of antitrust intervention in network industries. [abridged]<|reference_end|>
arxiv
@article{economides2001united, title={United States v. Microsoft: A Failure of Antitrust in the New Economy}, author={Nicholas Economides}, journal={arXiv preprint arXiv:cs/0109069}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0109069}, primaryClass={cs.CY} }
economides2001united
arxiv-670138
cs/0109070
Networks Unplugged: Towards A Model of Compatibility Regulation Between Information Platforms
<|reference_start|>Networks Unplugged: Towards A Model of Compatibility Regulation Between Information Platforms: This Article outlines a basic model for regulating interoperability between rival information platforms. In so doing, it insists that antitrust, intellectual property, and telecommunications regulation all must follow the same set of principles to facilitate competition between rival standards where possible, mandating or allowing cooperation only where necessary to facilitate competition within a standard when network-level competition is infeasible. To date, the antitrust regime best approximates the type of model I have in mind, but sound competition policy requires that telecommunications regulation and intellectual property law follow its basic principles as well.<|reference_end|>
arxiv
@article{weiser2001networks, title={Networks Unplugged: Towards A Model of Compatibility Regulation Between Information Platforms}, author={Phil Weiser}, journal={arXiv preprint arXiv:cs/0109070}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0109070}, primaryClass={cs.CY} }
weiser2001networks
arxiv-670139
cs/0109071
The Impact of Incentives in the Telecommunications Act of 1996 on Corporate Strategies
<|reference_start|>The Impact of Incentives in the Telecommunications Act of 1996 on Corporate Strategies: Rules are necessary to provide or shape the incentives of individuals and organizations. This is particularly true when free markets lead to undesirable outcomes. The Telecommunications Act of 1996 attempted to create incentives to foster competition. Ambiguity as well as the timing of the Act has led to delays in the clarification of rules and the rapid obsolescence of the document. The paper presents the strategies that common carriers adopted to try to tilt regulation in their favor, slow the entry of competitors, maintain their market leadership, and expand into other segments. Some of the strategies analyzed include lobbying efforts, court challenges, and lack of cooperation with new entrants.<|reference_end|>
arxiv
@article{garcia-murillo2001the, title={The Impact of Incentives in the Telecommunications Act of 1996 on Corporate Strategies}, author={Martha Garcia-Murillo and Ian MacInnes}, journal={arXiv preprint arXiv:cs/0109071}, year={2001}, number={TPRC-2001-2308968915}, archivePrefix={arXiv}, eprint={cs/0109071}, primaryClass={cs.CY} }
garcia-murillo2001the
arxiv-670140
cs/0109072
Higher-Order Pattern Complement and the Strict Lambda-Calculus
<|reference_start|>Higher-Order Pattern Complement and the Strict Lambda-Calculus: We address the problem of complementing higher-order patterns without repetitions of existential variables. Differently from the first-order case, the complement of a pattern cannot, in general, be described by a pattern, or even by a finite set of patterns. We therefore generalize the simply-typed lambda-calculus to include an internal notion of strict function so that we can directly express that a term must depend on a given variable. We show that, in this more expressive calculus, finite sets of patterns without repeated variables are closed under complement and intersection. Our principal application is the transformational approach to negation in higher-order logic programs.<|reference_end|>
arxiv
@article{momigliano2001higher-order, title={Higher-Order Pattern Complement and the Strict Lambda-Calculus}, author={Alberto Momigliano and Frank Pfenning}, journal={ACM Trans. Comput. Log. 4(4): 493-529 (2003)}, year={2001}, doi={10.1145/937555.937559}, number={University of Leicester Technical Report 2001/22}, archivePrefix={arXiv}, eprint={cs/0109072}, primaryClass={cs.LO cs.PL} }
momigliano2001higher-order
arxiv-670141
cs/0109073
E-Business and SMEs: Preliminary Evidence from Selected Italian Districts
<|reference_start|>E-Business and SMEs: Preliminary Evidence from Selected Italian Districts: The debate on the Information Society shows broad agreement on the assumption that the promised benefits will fully materialize only if the diffusion of ICTs and the Internet involves all the actors of the socio-economic system. Accordingly, special emphasis is put on the participation of small and medium enterprises (SMEs), but also on public administrations (PAs) as promoters and catalysts of private initiatives. As for SMEs, public intervention concerns both the promotion of fully competitive e-markets and the solution of market failures. However, effective and efficient intervention requires specific information on SMEs' approach to e-commerce, information that often depends upon specific sectoral and local conditions and is in most cases still lacking. In order to identify the need and the scope for public intervention, the paper focuses on a distinctive SME-intensive productive environment: the manufacturing industrial district, which traditionally constitutes an example of a successful SME network, characterised by a common industrial culture and intense input-output interactions. The paper presents empirical evidence from the Italian districts of Como (textile industry) and Lumezzane (metalwork industry). The research results show that pro-active entrepreneurs are creatively exploring the opportunities offered by the Internet to promote their businesses. However, it is also clear that the transition to the Internet economy still involves only a small percentage of potential participants, and that institutional actions are needed in order to foster wider participation.<|reference_end|>
arxiv
@article{piscitello2001e-business, title={E-Business and SMEs: Preliminary Evidence from Selected Italian Districts}, author={Lucia Piscitello and Francesca Sgobbi}, journal={arXiv preprint arXiv:cs/0109073}, year={2001}, number={TPRC-2001-050}, archivePrefix={arXiv}, eprint={cs/0109073}, primaryClass={cs.CY} }
piscitello2001e-business
arxiv-670142
cs/0109074
Indicators of Independence in Regulatory Commissions
<|reference_start|>Indicators of Independence in Regulatory Commissions: Independent regulatory commissions such as the Federal Communications Commission (FCC) must produce policies that reflect technical expertise, legal precedent, and stakeholder input. Given these situational imperatives, how does the FCC implement independence in its decision-making? This research explicates some of the underlying rules, resources, and relationships, within the environment in which the agency is embedded, that shape how agency work practices operationalize independence. Research such as this may be helpful in the creation of new regulatory commissions, or in the assessment of existing ones, but only if great attention is paid not only to institutional structure but also to the practice of staff in the agency.<|reference_end|>
arxiv
@article{oberlander2001indicators, title={Indicators of Independence in Regulatory Commissions}, author={Susan Oberlander}, journal={arXiv preprint arXiv:cs/0109074}, year={2001}, number={TPRC-2001-048}, archivePrefix={arXiv}, eprint={cs/0109074}, primaryClass={cs.CY} }
oberlander2001indicators
arxiv-670143
cs/0109075
ICANN and Antitrust
<|reference_start|>ICANN and Antitrust: The Internet Corporation for Assigned Names and Numbers (ICANN) is a private non-profit company which, pursuant to contracts with the US government, acts as the de facto regulator for DNS policy. ICANN decides what TLDs will be made available to users, and which registrars will be permitted to offer those TLDs for sale. In this article we focus on a hitherto-neglected implication of ICANN's assertion that it is a private rather than a public actor: its potential liability under the U.S. antitrust laws, and the liability of those who transact with it. ICANN argues that it is not as closely tied to the government as NSI and IANA were in the days before ICANN was created. If this is correct, it seems likely that ICANN will not benefit from the antitrust immunity those actors enjoyed. Some of ICANN's regulatory actions may restrain competition, e.g. its requirement that applicants for new gTLDs demonstrate that their proposals would not enable competitive (alternate) roots and ICANN's preventing certain types of non-price competition among registrars (requiring the UDRP). ICANN's rule adoption process might be characterized as anticompetitive collusion by existing registrars, who are likely not to be subject to the Noerr-Pennington lobbying exemption. Whether ICANN has in fact violated the antitrust laws depends on whether it is an antitrust state actor, whether the DNS is an essential facility, and on whether it can shelter under precedents that protect standard-setting bodies. If (as seems likely) a private ICANN and those who petition it are subject to antitrust law, everyone involved in the process needs to review their conduct with an eye towards legal liability. ICANN should act very differently with respect to both the UDRP and the competitive roots if it is to avoid restraining trade.<|reference_end|>
arxiv
@article{froomkin2001icann, title={ICANN and Antitrust}, author={A. Michael Froomkin and Mark A. Lemley}, journal={arXiv preprint arXiv:cs/0109075}, year={2001}, number={TPRC-2001-041}, archivePrefix={arXiv}, eprint={cs/0109075}, primaryClass={cs.CY} }
froomkin2001icann
arxiv-670144
cs/0109076
Out of the Loop: Problems in the development of next generation community networks
<|reference_start|>Out of the Loop: Problems in the development of next generation community networks: Drawing on an ongoing longitudinal research study, we discuss problems in the development of five next generation community networking projects in central New York. The projects were funded under a state program to diffuse broadband technologies in economically depressed areas of the state. The networks are technologically complex and entail high costs for subscribers. The political economy of the development process has biased the subscriber base toward the resource rich and away from the resource poor, and toward tried-and-tested uses like Internet and intra-organizational connectivity and away from community-oriented uses. These trends raise troubling questions about network ontology and function, and about the relation between the network and its physical host community. We argue that appropriate social policy and new planning practices are needed to effect the desired change.<|reference_end|>
arxiv
@article{venkatesh2001out, title={Out of the Loop: Problems in the development of next generation community networks}, author={Murali Venkatesh and Dong-Hee Shin}, journal={arXiv preprint arXiv:cs/0109076}, year={2001}, number={TPRC-2001-078}, archivePrefix={arXiv}, eprint={cs/0109076}, primaryClass={cs.CY} }
venkatesh2001out
arxiv-670145
cs/0109077
Coase's Penguin, or Linux and the Nature of the Firm
<|reference_start|>Coase's Penguin, or Linux and the Nature of the Firm: The paper explains why open source software is an instance of a potentially broader phenomenon. Specifically, I suggest that nonproprietary peer-production of information and cultural materials will likely be a ubiquitous phenomenon in a pervasively networked society. I describe a number of such enterprises, at various stages of the information production value chain. These enterprises suggest that incentives to engage in nonproprietary peer production are trivial as long as enough contributors can be organized to contribute. This implies that the limit on the reach of peer production efforts is the modularity, granularity, and cost of integration of a good produced, not its total cost. I also suggest reasons to think that peer-production can have systematic advantages over both property-based markets and corporate managerial hierarchies as a method of organizing information and cultural production in a networked environment, because it is a better mechanism for clearing information about human capital available to work on existing information inputs to produce new outputs, and because it permits larger sets of agents to use larger sets of resources where there are increasing returns to the scale of both the set of agents and the set of resources available for work on projects. As capital costs and communications costs decrease in importance as factors of information production, the relative advantage of peer production in clearing human capital becomes more salient.<|reference_end|>
arxiv
@article{benkler2001coase, title={Coase's Penguin, or Linux and the Nature of the Firm}, author={Yochai Benkler}, journal={arXiv preprint arXiv:cs/0109077}, year={2001}, number={TPRC-2001-019}, archivePrefix={arXiv}, eprint={cs/0109077}, primaryClass={cs.CY} }
benkler2001coase
arxiv-670146
cs/0109078
Internet Attacks: A Policy Framework for Rules of Engagement
<|reference_start|>Internet Attacks: A Policy Framework for Rules of Engagement: Information technology is redefining national security and the use of force by state and nonstate actors. The use of force over the Internet warrants analysis given recent terrorist attacks. At the same time that information technology empowers states and their commercial enterprises, information technology makes infrastructures supported by computer systems increasingly accessible, interdependent, and more vulnerable to malicious attack. The Computer Security Institute and the FBI jointly estimate that financial losses attributed to malicious attack amounted to $378 million in 2000. International law clearly permits a state to respond in self-defense when attacked by another state through the Internet; however, such attacks may not always rise to the scope, duration, and intensity threshold of an armed attack that may justify a use of force in self-defense. This paper presents a policy framework to analyze the rules of engagement for Internet attacks. We describe the state of Internet security, incentives for asymmetric warfare, and the development of international law for conflict management and armed conflict. We focus on options for future rules of engagement specific to Information Warfare. We conclude with four policy recommendations for Internet attack rules of engagement: (1) the U.S. should pursue international definitions of "force" and "armed attack" in the Information Warfare context; (2) the U.S. should pursue international cooperation for the joint investigation and prosecution of Internet attacks; (3) the U.S. must balance offensive opportunities against defensive vulnerabilities; and (4) the U.S. should prepare strategic plans now rather than making policy decisions in real-time during an Internet attack.<|reference_end|>
arxiv
@article{yurcik2001internet, title={Internet Attacks: A Policy Framework for Rules of Engagement}, author={William Yurcik and David Doss}, journal={arXiv preprint arXiv:cs/0109078}, year={2001}, number={TPRC-2001-089}, archivePrefix={arXiv}, eprint={cs/0109078}, primaryClass={cs.CY} }
yurcik2001internet
arxiv-670147
cs/0109079
National Information Infrastructure Development in Canada and the US: Redefining Universal Service and Universal Access in the Age of Techno-Economic Convergence
<|reference_start|>National Information Infrastructure Development in Canada and the US: Redefining Universal Service and Universal Access in the Age of Techno-Economic Convergence: This exploratory and descriptive research compares the policy-making processes and policy recommendations regarding universal service and universal access developed by the U.S. National Information Infrastructure Advisory Council (NIIAC) and the Canadian Information Highway Advisory Council (IHAC) in conjunction with related federal government agencies. Created in 1993 and 1994, respectively, the Councils were charged with "bringing forward" the concepts of universal service and universal access to adjust to the effects of deregulation, new and converged Information and Communication Technologies (ICTs), and neo-liberal economic competition and globalization, which included acknowledging the private sector as the primary creator of the Information Highway. This qualitative study used as its methodology organizational, policy, narrative, and discourse analyses to create a picture of what universal service and universal access were and what they became in the hands of the NIIAC and IHAC. The U.S. had started with a more clearly defined universal service tradition than Canada, and undertook a more complex policy-making process with more experienced personnel. It was also clear that IHAC had in many ways followed the U.S. model and arrived at many recommendations similar to the NIIAC's. Because of the inevitability of technical, economic, and social change related to the Information Highway, no definitive outcome to the Universal Service and Universal Access "story" can be determined. Because the Canadian government did not follow up on some of IHAC's most crucial recommendations, the Canadian Information Highway "story," in particular, has been left less complete than that of the U.S.<|reference_end|>
arxiv
@article{dowding2001national, title={National Information Infrastructure Development in Canada and the U.S.: Redefining Universal Service and Universal Access in the Age of Techno-Economic Convergence}, author={Martin Dowding}, journal={arXiv preprint arXiv:cs/0109079}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0109079}, primaryClass={cs.CY} }
dowding2001national
arxiv-670148
cs/0109080
Lead, Follow, or Go Your Own Way: Empirical Evidence Against Leader-Follower Behavior in Electronic Markets
<|reference_start|>Lead, Follow, or Go Your Own Way: Empirical Evidence Against Leader-Follower Behavior in Electronic Markets: Low search costs in Internet markets can be used by consumers to find low prices, but can also be used by retailers to monitor competitors' prices. This price monitoring can lead to price matching, resulting in dampened price competition and higher prices in some cases. This paper analyzes price data for 316 bestselling, computer, and random book titles gathered from 32 retailers between August 1999 and January 2000. In contrast to previous studies, we find no evidence of leader-follower behavior for the vast majority of retailers we study. Further, the few cases of leader-follower behavior we observe seem to be associated with managerial convenience as opposed to anti-competitive behavior. We offer a methodology that can be used by future academic researchers or government regulators to check for anti-competitive price matching behavior in future time periods or in additional product categories.<|reference_end|>
arxiv
@article{clay2001lead, title={Lead, Follow, or Go Your Own Way: Empirical Evidence Against Leader-Follower Behavior in Electronic Markets}, author={Karen Clay and Michael Smith and Eric Wolff}, journal={arXiv preprint arXiv:cs/0109080}, year={2001}, number={TPRC-2001-047}, archivePrefix={arXiv}, eprint={cs/0109080}, primaryClass={cs.CY} }
clay2001lead
arxiv-670149
cs/0109081
Pricing and Network Externalities in Peer-to-Peer Communications Networks (Draft)
<|reference_start|>Pricing and Network Externalities in Peer-to-Peer Communications Networks (Draft): This paper analyzes the pricing of transit traffic in wireless peer-to-peer networks using the concepts of direct and indirect network externalities. We first establish that without any pricing mechanism, congestion externalities overwhelm other network effects in a wireless data network. We show that peering technology will mitigate the congestion and allow users to take advantage of more of the positive network externalities. However, without pricing, the peering equilibrium breaks down just like a bucket brigade made up of free-riding agents. With pricing and perfect competition, a peering equilibrium is possible and allows many more users on the network at the same time. However, the congestion externality is still a problem, so peering organized through a club may be the best solution.<|reference_end|>
arxiv
@article{chandan2001pricing, title={Pricing and Network Externalities in Peer-to-Peer Communications Networks (Draft)}, author={Sam Chandan and Christiaan Hogendorn}, journal={arXiv preprint arXiv:cs/0109081}, year={2001}, number={TPRC-2001-044}, archivePrefix={arXiv}, eprint={cs/0109081}, primaryClass={cs.CY cs.NI} }
chandan2001pricing
arxiv-670150
cs/0109082
Asymmetric Regulation on Steroids: US Competition Policy and Fiber to the Home
<|reference_start|>Asymmetric Regulation on Steroids: US Competition Policy and Fiber to the Home: Fiber to the Home (FTTH) describes a set of emerging technologies with the potential to affect competition in local access. On one hand, the high cost of deploying fiber to the residence suggests limitations on facilities-based competition among FTTH networks. On the other hand, FTTH opens up new possibilities for service-level competition, defined as the sharing of a single network infrastructure by multiple higher-layer service providers, whether of the same or different services. Yet technology is hardly an exogenous factor that independently shapes future local access competition; the regulatory environment also plays a key role. By shaping expectations about future competitive requirements, current regulations influence network operators' deployment choices among competing FTTH technologies, as well as design choices made by vendors and standards bodies for technologies still under development. The current regulatory approach to FTTH is far from consistent. Network operators likely to deploy FTTH include Incumbent Local Exchange Carriers (ILECs), incumbent cable operators, competitive access providers (including CLECs), independent telephone companies, and municipalities. This paper reviews the rules related to service-level competition that apply to each of these categories. In essence, the paper finds that if current regulatory trends continue, asymmetries in the regulation of service-level competition will be on steroids by the time FTTH starts being more commonly deployed. Current regulatory requirements are either non-existent, or extremely detailed and technology- and service-specific (e.g. UNEs). We argue that neither of these approaches is likely to achieve the desired result for FTTH, given the current state of flux in emerging FTTH technology.<|reference_end|>
arxiv
@article{gillett2001asymmetric, title={Asymmetric Regulation on Steroids: U.S. Competition Policy and Fiber to the Home}, author={Sharon Eisner Gillett and Emy Tseng}, journal={arXiv preprint arXiv:cs/0109082}, year={2001}, number={TPRC-2001-062}, archivePrefix={arXiv}, eprint={cs/0109082}, primaryClass={cs.CY cs.NI} }
gillett2001asymmetric
arxiv-670151
cs/0109083
Electronic Access to Information and the Privacy Paradox: Rethinking 'Practical Obscurity'
<|reference_start|>Electronic Access to Information and the Privacy Paradox: Rethinking 'Practical Obscurity': This article addresses the U.S. Supreme Court's central purpose formulation in Reporters Committee v. Department of Justice under the federal Freedom of Information Act. By examining all lower federal court opinions interpreting Reporters Committee and by analyzing the effects of the Court's opinion on the implementation of the FOIA, the paper finds that the Court's opinion has greatly narrowed the scope of the FOIA and limited the power of EFOIA to democratize electronic information. To assist in remedying the damage to the public interest in freedom of information, the author urges judicial consideration of a Privacy Act case, Tobey v. NRB, that more subtly treats information collected by government about individuals. The paper concludes that the Privacy Act demonstrates clearly that information, particularly computerized databases, cannot be treated categorically for purposes of access.<|reference_end|>
arxiv
@article{davis2001electronic, title={Electronic Access to Information and the Privacy Paradox: Rethinking 'Practical Obscurity'}, author={Charles N. Davis}, journal={arXiv preprint arXiv:cs/0109083}, year={2001}, number={TPRC-2001-096}, archivePrefix={arXiv}, eprint={cs/0109083}, primaryClass={cs.CY} }
davis2001electronic
arxiv-670152
cs/0109084
The Internet and Community Networks: Case Studies of Five US Cities
<|reference_start|>The Internet and Community Networks: Case Studies of Five US Cities: This paper looks at five U.S. cities (Austin, Cleveland, Nashville, Portland, and Washington, DC) and explores strategies being employed by community activists and local governments to create and sustain community networking projects. In some cities, community networking initiatives are relatively mature, while in others they are in early or intermediate stages. The paper looks at several factors that help explain the evolution of community networks in cities: 1) Local government support; 2) Federal support; 3) Degree of community activism, often reflected by public-private partnerships that help support community networks. In addition to these (more or less) measurable elements of local support, the case studies enable description of the different objectives of community networks in different cities. Several community networking projects aim to improve the delivery of government services (e.g., Portland and Cleveland), some have a job-training focus (e.g., Austin, Washington, DC), others are oriented very explicitly toward community building (Nashville, DC), and others toward neighborhood entrepreneurship (Portland and Cleveland). The paper ties the case studies together by asking whether community technology initiatives contribute to social capital in the cities studied.<|reference_end|>
arxiv
@article{horrigan2001the, title={The Internet and Community Networks: Case Studies of Five U.S. Cities}, author={John B. Horrigan}, journal={arXiv preprint arXiv:cs/0109084}, year={2001}, number={TPRC-2001-027}, archivePrefix={arXiv}, eprint={cs/0109084}, primaryClass={cs.DB} }
horrigan2001the
arxiv-670153
cs/0109085
Policy for access: Framing the question
<|reference_start|>Policy for access: Framing the question: Five years after the '96 Telecommunications Act, we still find precious little local facilities-based competition. In response there are calls in Congress and even from the FCC for new legislation to "free the Bells." However, the same ideology drove policy, not just five years ago, but also almost twenty years back with the first modern push for "freedom," namely divestiture. How might we frame the question of policy for local access to engender a more fruitful approach? The starting point for this analysis is the network--not bits and bytes, but the human network. With the human network as starting point, the unit of analysis is the community--specifically, the individual in a tension with community. There are two core ideas. The first takes a behavioral approach to the economics--and the relative share between beneficial chaos and order, in economic affairs, becomes explicit. If the first main idea provides a conceptual base for open source, the second core idea distinguishes open source from open design, i.e., at the information 'frontier' we push forward. The resulting policy frame for access is worked out in the detailed, concrete steps of an extended thought experiment. A small town setting (Concord, Massachusetts) grounds the discussion in the real world. The purpose overall is to stimulate new thinking which may break out of the conundrum where periodic rounds to legislate 'freedom' produce the opposite, recursively. The ultimate aim is better fit between our analytically-driven expectations and economic outcomes.<|reference_end|>
arxiv
@article{allen2001policy, title={Policy for access: Framing the question}, author={David Allen}, journal={arXiv preprint arXiv:cs/0109085}, year={2001}, number={TPRC-2001-008}, archivePrefix={arXiv}, eprint={cs/0109085}, primaryClass={cs.CY} }
allen2001policy
arxiv-670154
cs/0109086
Realspace Sovereigns in Cyberspace: The Case of Domain Names
<|reference_start|>Realspace Sovereigns in Cyberspace: The Case of Domain Names: In this piece, we take up the case of the domain name system as an example of challenges and solutions for realspace sovereigns in cyberspace. First, we analyze the 'in rem' provision of the US Anticybersquatting Consumer Protection Act (ACPA), which purports to expand the scope of the ACPA to encompass disputes with little direct connection with the United States. In reality, there exist no cases of foreign cybersquatting as to which the 'in rem' provision will be both applicable and constitutional. Instead the ACPA 'in rem' provision is notable primarily for its aggressive assertion of jurisdiction, leading us to consider the (often overlooked) role of realspace sovereigns in the regulation of the domain name system. By mapping the logical control over the domain name system -- the distributed hierarchy that is the basis of the system's design -- onto realspace territory, the potential for sovereign regulation of the system becomes apparent. We argue that the regulatory significance of geography, and the essentially arbitrary nature of the present territorial locations of the key components of the domain name system, implies the future segmentation of the domain name system and the resulting dramatic decrease in its value. Accordingly, we argue that realspace sovereigns (and especially the United States) have strong interests in avoiding segmentation, and thus must seek to coordinate the regulation of the system.<|reference_end|>
arxiv
@article{wagner2001realspace, title={Realspace Sovereigns in Cyberspace: The Case of Domain Names}, author={R. Polk Wagner and Catherine T. Struve}, journal={arXiv preprint arXiv:cs/0109086}, year={2001}, number={TPRC-2001-084}, archivePrefix={arXiv}, eprint={cs/0109086}, primaryClass={cs.CY} }
wagner2001realspace
arxiv-670155
cs/0109087
Civic Engagement among Early Internet Adopters: Trend or Phase?
<|reference_start|>Civic Engagement among Early Internet Adopters: Trend or Phase?: This paper brings evidence to bear on the question of the long-term effects of Internet diffusion on civic engagement in geographic communities. It draws on findings from survey data collected in four U.S. towns and cities in fall 2000 where community computer networking is established. The study shows that early adopters of the Internet are more likely to engage in civic activities and to have higher levels of community involvement than later adopters. Further, early adopters are more likely to use the Internet to increase their community involvement and political participation. Later adopters in all four sites show less involvement in their local community and less interest in political activity and information, online or offline. These findings reinforce those of the Kohut (1999) study showing that later adopters are less civic minded and more interested than early adopters in consumer and commercial applications, such as shopping and entertainment. The evidence in these four sites is consistent with earlier findings in Blacksburg, Virginia (Kavanaugh, 2000; Patterson and Kavanaugh, 2001; Kavanaugh and Patterson, 2001) and other studies of early innovation adopters (Rogers, 1983; Kohut, 1999; Valente, 1995, among others). The results reported in this paper lend weight to the argument that increases in civic engagement and community involvement are due primarily to the behavior of early adopters, making such increases a phase, not a trend. As later adopters come on line, use of the Internet for community involvement or civic engagement decreases. In the long term, we can expect that Internet access may have only a modest effect on community involvement and civic engagement in geographic communities.<|reference_end|>
arxiv
@article{kavanaugh2001civic, title={Civic Engagement among Early Internet Adopters: Trend or Phase?}, author={Andrea L. Kavanaugh}, journal={arXiv preprint arXiv:cs/0109087}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109087}, primaryClass={cs.CY} }
kavanaugh2001civic
arxiv-670156
cs/0109088
Value of Usage and Seller's Listing Behavior in Internet Auctions
<|reference_start|>Value of Usage and Seller's Listing Behavior in Internet Auctions: In this paper, we aim to empirically examine the value of website usage and sellers' listing behavior in the two leading Internet auction sites, eBay and Yahoo!Auctions. The descriptive data analysis of the seller's equilibrium listing behavior indicates that a seller's higher expected auction revenue from eBay is correlated with a larger number of potential bidders, measured by website usage per listing. Our estimation results, based on the logarithm specifications of sellers' expected auction revenues and potential bidders' website usage, show that in a median case, (i) a 1 percent increase in unique visitors (page views) per listed item induces a 0.022 (0.007) percent increase in a seller's expected auction revenue; and (ii) a 1 percent increase in sellers' listings induces a 1.99 (4.74) percent increase in unique visitors (page views). Since increased expected auction revenues will induce more listings, we can infer positive feedback effects between the number of listings and website usage. Consequently, Yahoo!Auctions, which has substantially fewer listings, has greater incentives to increase listings via these feedback effects, which are reflected in its fee schedules.<|reference_end|>
arxiv
@article{park2001value, title={Value of Usage and Seller's Listing Behavior in Internet Auctions}, author={Sangin Park}, journal={arXiv preprint arXiv:cs/0109088}, year={2001}, number={TPRC-2001-072}, archivePrefix={arXiv}, eprint={cs/0109088}, primaryClass={cs.CY} }
park2001value
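As a worked reading of the abstract's log-log elasticities (0.022 for unique visitors, 0.007 for page views), the sketch below converts a percentage change in usage into the implied change in expected revenue; the 10% change is an illustrative figure, not one from the paper.

```python
def revenue_multiplier(pct_change, elasticity):
    """In a log-log specification, revenue scales by (1 + change) ** elasticity."""
    return (1 + pct_change) ** elasticity

# A 10% rise in unique visitors per listing implies roughly a 0.21% rise
# in expected auction revenue under the median estimate of 0.022.
print(revenue_multiplier(0.10, 0.022))  # ~1.0021
```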
arxiv-670157
cs/0109089
What Should be Hidden and Open in Computer Security: Lessons from Deception, the Art of War, Law, and Economic Theory
<|reference_start|>What Should be Hidden and Open in Computer Security: Lessons from Deception, the Art of War, Law, and Economic Theory: Imagine a military base. It is defended against possible attack. Do we expect the base to reveal the location of booby traps and other defenses? No. But for many computer applications, a software developer will need to reveal a great deal about the code to get other system owners to trust the code and know how to operate with it. This article examines these conflicting intuitions and develops a theory about what should be open and hidden in computer security. Part I of the paper shows how substantial openness is typical for major computer security topics, such as firewalls, packaged software, and encryption. Part II shows what factors will lead to openness or hiddenness in computer security. Part III presents an economic analysis of the issue of what should be open in computer security. The owner who does not reveal the booby traps is like a monopolist, while the open-source software supplier is in a competitive market. This economic approach allows us to identify possible market failures in how much openness occurs for computer security. Part IV examines the contrasting approaches of Sun Tzu and Clausewitz to the role of hiddenness and deception in military strategy. The computer security, economic, and military strategy approaches thus each show factors relevant to what should be kept hidden in computer security. Part V then applies the theory to a range of current legal and technical issues.<|reference_end|>
arxiv
@article{swire2001what, title={What Should be Hidden and Open in Computer Security: Lessons from Deception, the Art of War, Law, and Economic Theory}, author={Peter P. Swire}, journal={arXiv preprint arXiv:cs/0109089}, year={2001}, number={TPRC-2001-004}, archivePrefix={arXiv}, eprint={cs/0109089}, primaryClass={cs.CR cs.CY} }
swire2001what
arxiv-670158
cs/0109090
Telecommunications and rural economies: Findings from the Appalachian region
<|reference_start|>Telecommunications and rural economies: Findings from the Appalachian region: This research investigates the relationship between telecommunications infrastructure, economic conditions, and federal and state policies and initiatives. It presents a detailed look at the telecommunications environment of the Appalachian region, particularly focusing on broadband technologies. A strong, positive association exists between telecommunications infrastructure and economic status. The effects of federal and state universal service policies are examined, as well as some of the ways states have leveraged their own infrastructure to improve telecommunications capabilities in their region. Other state and local telecommunications-related programs are noted.<|reference_end|>
arxiv
@article{strover2001telecommunications, title={Telecommunications and rural economies: Findings from the Appalachian region}, author={Sharon Strover and Michael Oden and Nobuya Inagaki}, journal={arXiv preprint arXiv:cs/0109090}, year={2001}, number={TPRC-2001-080}, archivePrefix={arXiv}, eprint={cs/0109090}, primaryClass={cs.CY} }
strover2001telecommunications
arxiv-670159
cs/0109091
E PLURIBUS ENUM: Unifying International Telecommunications Networks and Governance
<|reference_start|>E PLURIBUS ENUM: Unifying International Telecommunications Networks and Governance: ENUM effectively bridges the telephone and Internet worlds by placing telephone numbers from the ITU Rec. E.164 public telecommunication numbering plan into the Internet Domain Name System (DNS) as domain names. ENUM potentially presents significant public policy issues at both the domestic and international levels. Ultimately, it should not matter whether ENUM is approached as a telecommunications issue or an Internet issue because: (1) they are becoming the same thing technically, and (2) they engage the same global public interests. For the same reasons as apply to traditional telecommunications, and even to the Internet itself, public oversight of ENUM naming, numbering, and addressing resources is justified both by technical necessity and the interests of consumer protection (particularly personal privacy) and competition at higher service layers. A single, coordinated global DNS domain for at least Tier 0 (the international level) of the ENUM names hierarchy should be designated by public authorities. Many of the technical characteristics and policy considerations relevant at the ENUM Tier 0 and 1 zones are also directly applicable to the Internet's IP address space and DNS root (or Tier 0) zone - key shared elements of the Internet's logical infrastructure. Despite the fundamentally international nature of the Internet's logical infrastructure layer, and the purported privatization of administration of its IP address space and the DNS, Internet governance is not yet truly international. The ENUM policy debate illustrates the need for authoritative international public oversight of public communications network logical infrastructure, including that of traditional telecommunications, the Internet, and ENUM.<|reference_end|>
arxiv
@article{mctaggart2001e, title={E PLURIBUS ENUM: Unifying International Telecommunications Networks and Governance}, author={Craig McTaggart}, journal={arXiv preprint arXiv:cs/0109091}, year={2001}, number={TPRC-2001-064}, archivePrefix={arXiv}, eprint={cs/0109091}, primaryClass={cs.CY cs.NI} }
mctaggart2001e
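As a concrete illustration of the bridging mechanism the abstract describes, the sketch below follows the ENUM convention of RFC 2916: strip the E.164 number to its digits, reverse them, dot-separate them, and append the e164.arpa suffix. The sample telephone number is hypothetical.

```python
def enum_domain(e164_number: str) -> str:
    """Map an ITU E.164 telephone number to its ENUM domain name."""
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

print(enum_domain("+1-202-555-1234"))  # 4.3.2.1.5.5.5.2.0.2.1.e164.arpa
```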
arxiv-670160
cs/0109092
Is the Commercial Mass Media Necessary, or Even Desirable, for Liberal Democracy?
<|reference_start|>Is the Commercial Mass Media Necessary, or Even Desirable, for Liberal Democracy?: Is a commercial mass media, dependent on the market for its sustenance, necessary, or even desirable, for liberal democracy? Yochai Benkler has argued that a decentralized, peer-to-peer system of communications and information is both possible with digital technology and preferable to a system based on commercial mass media. He has contended in fact that the presence of politically powerful, copyright-rich mass media imposes significant barriers to the development of peer-to-peer information-sharing networks. In contrast, I have argued that the commercial mass media play an important, and perhaps even vital, role in liberal democracy by galvanizing public opinion, serving as a watchdog against government and corporate wrongdoing, agenda-setting (which enables public discourse), and serving as a relatively trustworthy source of information. This paper seeks to push the ball forward on this issue. It first isolates and enumerates the contributions that the commercial mass media are said to make towards liberal democracy. It then briefly assesses the extent to which the commercial mass media actually fulfills these constitutive functions. It then asks whether alternative institutions might serve some or all of these functions just as well or better. In so doing, it looks both to the past and the future. First, it examines the political party-supported press that thrived in the United States through much of the 19th century. Second, it examines government-funded mass media. Third, it looks, skeptically, at possibilities for peer-to-peer sharing of information and opinion in the digital network environment. I conclude that, despite the weaknesses of commercial mass media, an information policy suitable to liberal democracy should include a plurality of types of voices, including commercial mass media.<|reference_end|>
arxiv
@article{netanel2001is, title={Is the Commercial Mass Media Necessary, or Even Desirable, for Liberal Democracy?}, author={Neil Netanel}, journal={arXiv preprint arXiv:cs/0109092}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0109092}, primaryClass={cs.CY} }
netanel2001is
arxiv-670161
cs/0109093
Open Access to Monopoly Cable Platforms Versus Direct Access To Competitive International Telecommunications Satellite Facilities: A Study In Contrasts
<|reference_start|>Open Access to Monopoly Cable Platforms Versus Direct Access To Competitive International Telecommunications Satellite Facilities: A Study In Contrasts: In 1999, the FCC authorized direct access to INTELSAT, allowing INTELSAT's U.S. customers and competitors to bypass INTELSAT's U.S. retail affiliate (COMSAT), and to take satellite capacity at wholesale prices directly from INTELSAT. This policy was modeled in many respects on the access and unbundling requirements applicable to domestic incumbent local exchange carriers (ILECs) under the Telecommunications Act of 1996. At the same time, incumbent domestic cable TV system operators have not been required to provide wholesale open access to competitive Internet Service Providers (ISPs) seeking to provide residential broadband Internet service through existing proprietary cable facilities. Yet the policy arguments favoring open access to incumbent domestic cable systems appear to be stronger than those favoring direct access to INTELSAT. For example, it may be fairly debated whether entrenched cable system operators are now positioned to unfairly leverage their dominance in the multichannel video programming distribution (MVPD) market to thwart competition in the broadband ISP market, as some cable open access advocates assert. In contrast, it is clear that no analogous issues of tying were implicated by INTELSAT in 1999, when its position in the international telecommunications market was substantially nondominant and, in any event, it had no new product to tie to its established offerings. Similarly, while it may be debated whether or not a cable plant is a bottleneck facility that gatekeeps broadband Internet for many residential users, it is beyond cavil that INTELSAT in 1999 controlled virtually no such bottleneck facilities.<|reference_end|>
arxiv
@article{katkin2001open, title={Open Access to Monopoly Cable Platforms Versus Direct Access To Competitive International Telecommunications Satellite Facilities: A Study In Contrasts}, author={Ken Katkin}, journal={arXiv preprint arXiv:cs/0109093}, year={2001}, number={TPRC-2001-079}, archivePrefix={arXiv}, eprint={cs/0109093}, primaryClass={cs.CY} }
katkin2001open
arxiv-670162
cs/0109094
Competition and Globalization Brazilian Telecommunications Policy at Crossroads
<|reference_start|>Competition and Globalization Brazilian Telecommunications Policy at Crossroads: The current pattern of competition in the Brazilian telecommunications market was defined by a regulatory reform implemented in the second half of the nineties. The telecommunications regulatory reform discussed in the paper promoted the privatization of the Telebras System and fostered competition in the Brazilian market under the strict supervision of the new regulator, Anatel. Nevertheless, the Brazilian regulatory scheme is at a crossroads. From 2002 on, an open-market approach will be implemented in the telecom arena. We analyse the main aspects of those changes, from the firms' perspective and also in a broader scenario under the influence of the World Trade Organization system.<|reference_end|>
arxiv
@article{piragibe2001competition, title={Competition and Globalization Brazilian Telecommunications Policy at Crossroads}, author={Clelia Piragibe}, journal={arXiv preprint arXiv:cs/0109094}, year={2001}, number={TPRC-2001-090}, archivePrefix={arXiv}, eprint={cs/0109094}, primaryClass={cs.CY} }
piragibe2001competition
arxiv-670163
cs/0109095
Assessing the Effectiveness of Section 271 Five Years After the Telecommunications Act of 1996
<|reference_start|>Assessing the Effectiveness of Section 271 Five Years After the Telecommunications Act of 1996: A major goal of the Telecommunications Act of 1996 is to promote competition in both the local exchange and long distance wireline markets. In section 271 Congress permitted the Bell Operating Companies (BOCs) to enter the long distance market only if they demonstrate to the FCC that they have complied with the market-opening requirements of section 251. This paper examines the logic behind section 271, to determine if it is a reasonable means of achieving increased competition in both the local and long distance markets, given the technical characteristics of the industry and the legal and informational constraints on regulators who must ensure compliance. It also provides an update on the extent of competitive entry in the local exchange market five years after enactment of the Act. In this paper we examine a variety of schemes for ensuring BOC compliance that Congress could have used. Given the characteristics of the industry and the limitations on regulators' ability to observe BOC's efforts, we determine that the use of a prize such as BOC entry into long distance is a superior incentive mechanism. We further determine that conditioning a BOC's long distance entry on its demonstrating compliance with section 251 is a logical method of protecting the long distance market against a BOC discriminating against long distance competitors once it has gained entry. The statistical evidence we look at, using data we have collected on ILEC lines sold to CLECs for POTS services, appears to confirm that section 271 has thus far been effective in ensuring compliance.<|reference_end|>
arxiv
@article{shiman2001assessing, title={Assessing the Effectiveness of Section 271 Five Years After the Telecommunications Act of 1996}, author={Daniel R. Shiman and Jessica Rosenworcel}, journal={arXiv preprint arXiv:cs/0109095}, year={2001}, number={TPRC-2001-057}, archivePrefix={arXiv}, eprint={cs/0109095}, primaryClass={cs.CY} }
shiman2001assessing
arxiv-670164
cs/0109096
CyberCampaigns and Canadian Politics: Still Waiting?
<|reference_start|>CyberCampaigns and Canadian Politics: Still Waiting?: The early election call in the fall of 2000 provided the perfect opportunity to study the impact the Internet has had on election campaigning in Canada. With the explosion of use the Net has seen since the 1997 general election, Canadian federal parties stood at the threshold of a new age in election campaigning. Pundits such as Rheingold (1993) have argued that the Internet will provide citizens with a way to bypass traditional media and gain unmediated access to each party's political message as well as providing a forum for citizens to engage the parties, and each other, in deliberative debate. Through a longitudinal analysis of party web pages and telephone interviews with party staffers, we analyze the role the Internet played in the election campaigns of Canada's federal parties. Our findings indicate that the parties are still focusing on providing online features that talk at voters instead of engaging them in any type of meaningful discourse. Most of these sites were exceptionally similar in their structure and in the type of content they provided. Generally, these sites served as digital archives for campaign material created with other media in mind and, despite the multimedia capabilities of the Internet, these sites tended to be overwhelmingly text-oriented. In line with Stromer-Galley's (2000) discussion of why candidates in the U.S. avoid online interaction, we also argue that little incentive exists to motivate parties to engage in any meaningful interaction with voters online.<|reference_end|>
arxiv
@article{christensen2001cybercampaigns, title={CyberCampaigns and Canadian Politics: Still Waiting?}, author={Tony Christensen and Peter McCormick}, journal={arXiv preprint arXiv:cs/0109096}, year={2001}, number={TPRC-2001-058}, archivePrefix={arXiv}, eprint={cs/0109096}, primaryClass={cs.CY} }
christensen2001cybercampaigns
arxiv-670165
cs/0109097
'Negotiated Liberalization': Stakeholder Politics and Communication Sector Reform in South Africa
<|reference_start|>'Negotiated Liberalization': Stakeholder Politics and Communication Sector Reform in South Africa: The paper examines the South African transition from apartheid to democracy through the lens of the reform of the communication sector. Through a set of participatory stakeholder consultative processes, the institutions of broadcasting, telecommunications, print press, and state information agency underwent reform in a process I refer to as 'negotiated liberalization.'<|reference_end|>
arxiv
@article{horwitz2001negotiated, title={'Negotiated Liberalization': Stakeholder Politics and Communication Sector Reform in South Africa}, author={Robert B. Horwitz}, journal={arXiv preprint arXiv:cs/0109097}, year={2001}, number={TPRC-2001-009}, archivePrefix={arXiv}, eprint={cs/0109097}, primaryClass={cs.CY} }
horwitz2001negotiated
arxiv-670166
cs/0109098
The Influence of Policy Regimes on the Development and Social Implications of Privacy Enhancing Technologies
<|reference_start|>The Influence of Policy Regimes on the Development and Social Implications of Privacy Enhancing Technologies: As privacy issues have gained social salience, entrepreneurs have begun to offer privacy enhancing technologies (PETs) and the U.S. has begun to enact privacy legislation. But "privacy" is an ambiguous notion. In the liberal tradition, it is an individualistic value protecting citizens from intrusion into a realm of autonomy. A feminist critique suggests that the social utility of privacy is to exclude certain issues from the public realm. Sociologists suggest that privacy is about identity management, while political economists suggest that the most salient privacy issue is the use of personal information to normalize and rationalize populations according to the needs of capital. While PETs have been developed for use by individual consumers, recently developers are focusing on the business to business market, where demand is stoked by the existence of new privacy regulations. These new laws tend to operationalize privacy in terms of "personally identifiable information." The new generation of PETs reflect and reify that definition. This, in turn, has implications for the everyday understandings of privacy and the constitution of identity and social life. In particular, this socio-technical practice may strengthen the ability of data holders to rationalize populations and create self-serving social categories. At the same time, they may permit individuals to negotiate these categories outside of panoptic vision. They may also encourage public discussion and awareness of these created social categories.<|reference_end|>
arxiv
@article{phillips2001the, title={The Influence of Policy Regimes on the Development and Social Implications of Privacy Enhancing Technologies}, author={David J. Phillips}, journal={arXiv preprint arXiv:cs/0109098}, year={2001}, number={TPRC-2001-067}, archivePrefix={arXiv}, eprint={cs/0109098}, primaryClass={cs.CY} }
phillips2001the
arxiv-670167
cs/0109099
ICANN as Regulator
<|reference_start|>ICANN as Regulator: This paper tells the story leading to ICANN's selection of seven new Internet top level domains in November 2000. In implementing proposals to expand the name space, ICANN adopted an approach far different from Jon Postel's lightweight proposals. ICANN staff, in setting the ground rules for considering new gTLDs, emphasized that only a few applicants would be allowed in, and imposed strict threshold requirements. Staff determined that the Board should pick TLDs by looking at all relevant aspects of every proposal, and deciding which ones presented the best overall combination of a variety of incommensurable factors. Aspects of the resulting process were predictable: Anyone familiar with the FCC comparative hearing process for broadcast licenses can attest that this sort of ad hoc comparison is necessarily subjective, lending itself to arbitrariness and biased application. Yet the process had advantages that appealed to ICANN decision-makers. The Board members would be free to take their best shots, in a situationally sensitive manner, at advancing the policies they thought important. The approach allowed ICANN to maintain the greatest degree of control. The end result, though, was a process stunning in its arbitrariness, a bad parody of fact-bound, situationally sensitive (rather than rules-based) decision-making.<|reference_end|>
arxiv
@article{weinberg2001icann, title={ICANN as Regulator}, author={Jonathan Weinberg}, journal={arXiv preprint arXiv:cs/0109099}, year={2001}, number={TPRC-2001-012}, archivePrefix={arXiv}, eprint={cs/0109099}, primaryClass={cs.CY} }
weinberg2001icann
arxiv-670168
cs/0109100
Comparison of Wireless Standards-Setting --United States Versus Europe
<|reference_start|>Comparison of Wireless Standards-Setting --United States Versus Europe: When decisions about developing standards are left to individual firms, there are many advantages, including a flexible response to market evolution, accommodation to rapid technology change, and avoidance of costly coordination. However, the market process does not necessarily lead to compatible standards, especially when there is no clear dominant player. Standards institutions generally facilitate better communication among market participants, which may discourage early and/or primitive incompatible standards from emerging in the market. However, expenditures on institutional standards-setting can lead to diminishing returns, or even be counter-productive, because institutional standards usually take longer to produce and are slower to respond to technology development. Government agencies can specify compulsory standards to avoid incompatibility. In addition, governments can tie their policy on standardization closely with their industrial, trade, and/or regulatory policies. However, many exogenous factors handicap a government's effective involvement. Government often fails to respond to the dynamics of technology development and consumer demand, and to pick the "right" technology. This study first discusses why two different models have emerged in the two regions. It then examines outcomes and implications of the two models, including impacts on domestic service deployment, global trade competition, technology innovations, and strategies of multinational corporations. The goal is to conduct an empirical comparison between the two contrasting models.<|reference_end|>
arxiv
@article{tan2001comparison, title={Comparison of Wireless Standards-Setting --United States Versus Europe}, author={Zixiang Alex Tan}, journal={arXiv preprint arXiv:cs/0109100}, year={2001}, number={TPRC-2001-060}, archivePrefix={arXiv}, eprint={cs/0109100}, primaryClass={cs.CY} }
tan2001comparison
arxiv-670169
cs/0109101
Best Effort versus Spectrum Markets: The Cases of European MVNOs and (Ultra)Wideband Unlicensed Services
<|reference_start|>Best Effort versus Spectrum Markets: The Cases of European MVNOs and (Ultra)Wideband Unlicensed Services: This paper compares two models for delivering broadband wireless services: best effort vs. QoS guaranteed services. The 'best effort' services we refer to in this paper are more commonly known as unlicensed wireless services, while the 'Quality of Service guaranteed' services are more commonly referred to as traditional landline telephony, as well as cellular telephone services of either the second or third generation. This paper highlights the differing 'market' versus 'engineering' philosophies implicit in alternative wireless service architectures.<|reference_end|>
arxiv
@article{mcknight2001best, title={Best Effort versus Spectrum Markets: The Cases of European MVNOs and (Ultra)Wideband Unlicensed Services}, author={Lee W. McKnight and Raymond Linsenmayer and William Lehr}, journal={arXiv preprint arXiv:cs/0109101}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109101}, primaryClass={cs.CY cs.NI} }
mcknight2001best
arxiv-670170
cs/0109102
The Next Frontier for Openness: Wireless Communications
<|reference_start|>The Next Frontier for Openness: Wireless Communications: For wireless communications, the FCC has fostered competition rather than openness. This has permitted the emergence of vertically integrated end-to-end providers, creating problems of reduced hardware innovation, software applications, user choice, and content access. To deal with these emerging issues and create multi-level forms of competition, one policy is likely to suffice: a Carterfone for wireless, coupled with more unlicensed spectrum.<|reference_end|>
arxiv
@article{noam2001the, title={The Next Frontier for Openness: Wireless Communications}, author={Eli M. Noam}, journal={arXiv preprint arXiv:cs/0109102}, year={2001}, number={TPRC-2001-094}, archivePrefix={arXiv}, eprint={cs/0109102}, primaryClass={cs.CY} }
noam2001the
arxiv-670171
cs/0109103
Leveraging Software, Advocating Ideology: Free Software and Open Source
<|reference_start|>Leveraging Software, Advocating Ideology: Free Software and Open Source: This paper uses the software program Linux and the discourse around it in order to examine how software programs can be used to articulate and defend social and economic positions. Although I do not use the term "expression" in a strict legal sense, I claim that in order to make policy decisions involving software, it is important to understand how the functionality of software is expressive. Another way to state this is that software programs like Linux are socially meaningful through functionality and talk about functionality. In section I, I review some recent legal scholarship about software and explain why understanding how embedded technical expression works is important given the increasingly large role of software in governance. In section II, I explore software construction as a combination of social and technical practices. In section III, I describe some examples of meaning-making around the software program Linux. In section IV I conclude with a short description of why such analyses are important.<|reference_end|>
arxiv
@article{ratto2001leveraging, title={Leveraging Software, Advocating Ideology: Free Software and Open Source}, author={Matt Ratto}, journal={arXiv preprint arXiv:cs/0109103}, year={2001}, number={TPRC-2001-036}, archivePrefix={arXiv}, eprint={cs/0109103}, primaryClass={cs.CY} }
ratto2001leveraging
arxiv-670172
cs/0109104
Universal Service in times of Reform: Affordability and accessibility of telecommunication services in Latin America
<|reference_start|>Universal Service in times of Reform: Affordability and accessibility of telecommunication services in Latin America: By surveying the universal service policies in six Latin American countries, this study explores the evolution of the concept during the rollout of the telecommunication reform in the last decade. Country profiles and a set of universal service indicators provide a frame for discussing issues of accessibility and affordability of telephone service in the region. This study found that the reconfiguration of national networks fostered by liberalization policies offered risks and opportunities to achieve universal service goals. The diversification of access points and services enhanced users' choices, but price rebalancing and the lack of Universal Service Obligations (USO) targeting groups with special needs depressed demand and threatened to exclude significant parts of the population. The situation requires a reformulation of USO that incorporates all technological solutions existing in the market, together with demand-side factors that account for the urban-rural continuum and for different social and economic strata. This study identifies the emergence of a second generation of USO targeting some of these needs. However, more competition and special tariff plans for the poor still need to be incorporated into the options available in the market.<|reference_end|>
arxiv
@article{fuentes-bautista2001universal, title={Universal Service in times of Reform: Affordability and accessibility of telecommunication services in Latin America}, author={Martha Fuentes-Bautista}, journal={arXiv preprint arXiv:cs/0109104}, year={2001}, number={TPRC-2001-099}, archivePrefix={arXiv}, eprint={cs/0109104}, primaryClass={cs.CY} }
fuentes-bautista2001universal
arxiv-670173
cs/0109105
Standards and Intellectual Property Rights in the Age of Global Communication - A Review of the International Standardization of Third-Generation Mobile System
<|reference_start|>Standards and Intellectual Property Rights in the Age of Global Communication - A Review of the International Standardization of Third-Generation Mobile System: When the European Telecommunications Standards Institute (ETSI) selected a radio access technology based on Wideband Code-Division Multiple Access (WCDMA), sponsored by European telecommunications equipment manufacturers Ericsson and Nokia, for its third-generation wireless communications system, a bitter dispute developed between ETSI and Qualcomm Inc. Qualcomm threatened to withhold its intellectual property on the CDMA technology unless the Europeans agreed to make the radio access technology backward compatible with cdmaOne, Qualcomm's favored version of CDMA. A dispute over intellectual property rights over key CDMA techniques also erupted between Ericsson and Qualcomm, and both filed patent infringement suits in US courts. The dispute halted the standards activity and has troubled operators worldwide as well as the International Telecommunication Union (ITU).<|reference_end|>
arxiv
@article{hjelm2001standards, title={Standards and Intellectual Property Rights in the Age of Global Communication - A Review of the International Standardization of Third-Generation Mobile System}, author={Bjorn Hjelm}, journal={arXiv preprint arXiv:cs/0109105}, year={2001}, number={TPRC-2001-092}, archivePrefix={arXiv}, eprint={cs/0109105}, primaryClass={cs.CY} }
hjelm2001standards
arxiv-670174
cs/0109106
Bigger May Not Be Better: An Empirical Analysis of Optimal Membership Rules in Peer-To-Peer Networks
<|reference_start|>Bigger May Not Be Better: An Empirical Analysis of Optimal Membership Rules in Peer-To-Peer Networks: Peer-to-peer networks will become an increasingly important distribution channel for consumer information goods and may play a role in the distribution of information within corporations. Our research analyzes optimal membership rules for these networks in light of the positive and negative externalities additional users impose on the network. Using a dataset gathered from the six largest OpenNap-based networks, we find that users impose a positive network externality based on the desirability of the content they provide and a negative network externality based on the demands they place on the network. Further, we find that the marginal value of additional users is declining and the marginal cost is increasing in the number of current users. This suggests that multiple small networks may serve user communities more efficiently than single monolithic networks and that network operators may wish to specialize in their content and restrict membership based on capacity constraints and user content desirability.<|reference_end|>
arxiv
@article{asvanund2001bigger, title={Bigger May Not Be Better: An Empirical Analysis of Optimal Membership Rules in Peer-To-Peer Networks}, author={Atip Asvanund and Karen Clay and Ramayya Krishnan and Michael Smith}, journal={arXiv preprint arXiv:cs/0109106}, year={2001}, number={TPRC-2001-049}, archivePrefix={arXiv}, eprint={cs/0109106}, primaryClass={cs.CY} }
asvanund2001bigger
arxiv-670175
cs/0109107
Is Patent Law Technology Specific?
<|reference_start|>Is Patent Law Technology Specific?: Although patent law purports to cover all manner of technologies, we have noticed recent divergence in the standards applied to biotechnology and to software patents: the Federal Circuit has applied a very permissive standard of obviousness in biotechnology, but a highly restrictive disclosure requirement. The opposite holds true for software patents, which seems to us exactly contrary to sound policy for either industry. These patent standards are grounded in the legal fiction of the "person having ordinary skill in the art" or PHOSITA. We discuss the appropriateness of the PHOSITA standard, concluding that it properly lends flexibility to the patent system. We then discuss the difficulty of applying this standard in different industries, offering suggestions as to how it might be modified to avoid the problems seen in biotechnology and software patents.<|reference_end|>
arxiv
@article{burk2001is, title={Is Patent Law Technology Specific?}, author={Dan L. Burk and Mark A. Lemley}, journal={arXiv preprint arXiv:cs/0109107}, year={2001}, number={TPRC-2001-002}, archivePrefix={arXiv}, eprint={cs/0109107}, primaryClass={cs.CY} }
burk2001is
arxiv-670176
cs/0109108
Spectrum auctions, pricing and network expansion in wireless telecommunications
<|reference_start|>Spectrum auctions, pricing and network expansion in wireless telecommunications: This paper examines the effects of licensing conditions, in particular of spectrum fees, on the pricing and diffusion of mobile communications services. Seemingly exorbitant sums paid for 3G licenses in the UK and Germany in 2000, and similarly high fees paid by U.S. carriers in the re-auctioning of PCS licenses early in 2001, raised concerns as to the impacts of the market entry regime on the mobile communications market. The evidence from the GSM and PCS markets reviewed in this paper suggests that market entry fees do indeed influence the subsequent development of the market. We discuss three potential transmission channels by which license fees can influence the price and quantity of service sold in a wireless market: an increase in average cost, an increase in incremental costs, and impacts of sunk costs on the emerging market structure. From this conceptual debate, an empirical model is developed and tested using cross-sectional data for the residential mobile voice market. We utilize a structural equation approach, modeling the supply and demand relationships subject to the constraint that supply equals demand. The results confirm the existence of a positive effect of license fees on the cost of supply. However, we also find that higher market concentration has a positive effect on the overall supply in the market, perhaps supporting a Schumpeterian view that a certain degree of market concentration facilitates efficiency.<|reference_end|>
arxiv
@article{bauer2001spectrum, title={Spectrum auctions, pricing and network expansion in wireless telecommunications}, author={Johannes M. Bauer}, journal={arXiv preprint arXiv:cs/0109108}, year={2001}, number={TPRC-2001-070}, archivePrefix={arXiv}, eprint={cs/0109108}, primaryClass={cs.CY} }
bauer2001spectrum
arxiv-670177
cs/0109109
The Role of Institutions in the Design of Communication Technologies
<|reference_start|>The Role of Institutions in the Design of Communication Technologies: Communication technologies contain embedded values that affect our society's fundamental values, such as privacy, freedom of speech, and the protection of intellectual property. Researchers have shown the design of technologies is not autonomous but shaped by conflicting social groups. Consequently, communication technologies contain different values when designed by different social groups. Continuing in this vein, we show that the institutions where communication technologies are designed and developed are an important source of the values of communication technologies. We use the term code to collectively refer to the hardware and software of communication technologies. Institutions differ in their motivations, structure, and susceptibility to external influences. First, we focus on the political, economic, social, and legal influences during the development of code. The institutional reactions to these influences are embodied in code. Second, we focus on the decision-making issues in the review process for code. This process determines the code's content and affects the dissemination of code through the decision whether to publicly release the code. We found these factors vary by institution. As a result, institutions differ in the values that they incorporate into code or communications technologies.<|reference_end|>
arxiv
@article{shah2001the, title={The Role of Institutions in the Design of Communication Technologies}, author={Rajiv C. Shah and Jay P. Kesan}, journal={arXiv preprint arXiv:cs/0109109}, year={2001}, number={TPRC-2001-086}, archivePrefix={arXiv}, eprint={cs/0109109}, primaryClass={cs.CY} }
shah2001the
arxiv-670178
cs/0109110
Efficient Choice, Inefficient Democracy? The Implications of Cable and Internet Access for Political Knowledge and Voter Turnout
<|reference_start|>Efficient Choice, Inefficient Democracy? The Implications of Cable and Internet Access for Political Knowledge and Voter Turnout: This paper explains why, despite a marked increase in available political information on cable television and the Internet, citizens' levels of political knowledge have, at best, remained stagnant (Delli Carpini & Keeter, 1996). Since the availability of entertainment content has increased too, the effect of new media on knowledge and vote likelihood should be determined by people's relative preferences for entertainment and information. Access to new media should increase knowledge and vote likelihood among people who prefer news. At the same time, it is hypothesized to have a negative effect on knowledge and turnout for people who prefer entertainment content. Hypotheses are tested by building a measure of Relative Entertainment Preference (REP) from existing NES and Pew survey data. Results support the predicted interaction effect of media environment (cable and/or Internet access) and motivation (REP) on political knowledge and turnout. In particular, people who prefer entertainment to news and have access to cable television and the Internet are less knowledgeable and less likely to vote than any other group of people.<|reference_end|>
arxiv
@article{prior2001efficient, title={Efficient Choice, Inefficient Democracy? The Implications of Cable and Internet Access for Political Knowledge and Voter Turnout}, author={Markus Prior}, journal={arXiv preprint arXiv:cs/0109110}, year={2001}, number={TPRC-2001-025}, archivePrefix={arXiv}, eprint={cs/0109110}, primaryClass={cs.CY} }
prior2001efficient
arxiv-670179
cs/0109111
Quality of service monitoring: Performance metrics across proprietary content domains
<|reference_start|>Quality of service monitoring: Performance metrics across proprietary content domains: We propose a quality of service (QoS) monitoring program for broadband access to measure the impact of proprietary network spaces. Our paper surveys other QoS policy initiatives, including those in the airline, and wireless and wireline telephone industries, to situate broadband in the context of other markets undergoing regulatory devolution. We illustrate how network architecture can create impediments to open communications, and how QoS monitoring can detect such effects. We present data from a field test of QoS-monitoring software now in development. We suggest QoS metrics to gauge whether information "walled gardens" represent a real threat for dividing the Internet into proprietary spaces. To demonstrate our proposal, we are placing our software on the computers of a sample of broadband subscribers. The software periodically conducts a battery of tests that assess the quality of connections from the subscriber's computer to various content sites. Any systematic differences in connection quality between affiliated and non-affiliated content sites would warrant research into the behavioral implications of those differences. QoS monitoring is timely because the potential for the Internet to break into a loose network of proprietary content domains appears stronger than ever. Recent court rulings and policy statements suggest a growing trend towards relaxed scrutiny of mergers and the easing or elimination of content ownership rules. This policy environment could lead to a market with a small number of large, vertically integrated network operators, each pushing its proprietary content on subscribers.<|reference_end|>
arxiv
@article{o'donnell2001quality, title={Quality of service monitoring: Performance metrics across proprietary content domains}, author={Shawn O'Donnell and Hugh Carter Donahue and Josephine Ferrigno-Stack}, journal={arXiv preprint arXiv:cs/0109111}, year={2001}, number={TPRC-2001-074}, archivePrefix={arXiv}, eprint={cs/0109111}, primaryClass={cs.CY} }
o'donnell2001quality
arxiv-670180
cs/0109112
Elusive Threats: Security Weaknesses of Commercial Cellular Networks
<|reference_start|>Elusive Threats: Security Weaknesses of Commercial Cellular Networks: Commercial cellular telecommunications networks are routinely used as key means of voice and data transport by both businesses and the Public. Despite advances in encryption and other security measures, these commercial cellular networks remain extremely vulnerable to malicious attacks and the loss of user and data privacy and integrity. Such losses can result not only in mere inconvenience, but in serious corporate and national security breaches. Because of the potentially catastrophic nature of such security breaches, it behooves cellular network operators, distributors, and users to explore these network risk areas and mitigate them to the extent that current resources allow.<|reference_end|>
arxiv
@article{forbes2001elusive, title={Elusive Threats: Security Weaknesses of Commercial Cellular Networks}, author={Scott C. Forbes}, journal={arXiv preprint arXiv:cs/0109112}, year={2001}, archivePrefix={arXiv}, eprint={cs/0109112}, primaryClass={cs.CY} }
forbes2001elusive
arxiv-670181
cs/0109113
Digital Arroyos: An Examination of State Policy and Regulated Market Boundaries in Constructing Rural Internet Access
<|reference_start|>Digital Arroyos: An Examination of State Policy and Regulated Market Boundaries in Constructing Rural Internet Access: This focused study of state-level policy and access patterns contributes to a fuller understanding of how invisible geo-policy barriers work to structure access and define rural communities. Combining both quantitative and qualitative data, this study examines the role of geo-policy barriers in one of the largest and most rural states in the nation. Expanded Area Service (EAS) policies are state policies wherein phone customers can expand their local calling area. Because useful Internet access requires a flat-price connection, EAS policies can play a crucial role in connecting citizens to one another. EAS policies (including Texas') tend to vary along five dimensions (community of interest, customer scope, directionality, pricing mechanism and policy scope). EAS policies that rely on regulated market boundaries for definition can generate gross inequities in rural Internet access. Interviews with Internet Service Providers in a case study of 25 rural communities reveal that LATA and exchange boundaries, along with geographically restricted infrastructure investments, curtail service provision in remote areas. A statistical analysis of 1300 telephone exchanges, including 208 rural telephone exchanges in Texas, reveals that the farther a community lies from a metropolitan area, the less likely it is to have reliable Internet access.<|reference_end|>
arxiv
@article{nicholas2001digital, title={Digital Arroyos: An Examination of State Policy and Regulated Market Boundaries in Constructing Rural Internet Access}, author={Kyle Nicholas}, journal={arXiv preprint arXiv:cs/0109113}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0109113}, primaryClass={cs.CY} }
nicholas2001digital
arxiv-670182
cs/0109114
The Consumer Product Selection Process in an Internet Age: Obstacles to Maximum Effectiveness & Policy Options
<|reference_start|>The Consumer Product Selection Process in an Internet Age: Obstacles to Maximum Effectiveness & Policy Options: Intermediaries, like real estate agents, Consumer Reports, and Zagats, have long helped buyers to identify their most suitable options. Now, the combination of databases and the Internet enables them to serve consumers dramatically more effectively. This article begins by offering a three-part framework for understanding the evolving forms of selection assistance. It then focuses on numerous potential obstacles that could prevent shoppers from enjoying the full benefits of these developing technologies. While concluding that adjustments to business strategies and the enforcement of existing laws can effectively overcome most of these impediments, the article identifies several areas where proactive government action may be desirable, such as to prevent the emergence of anticompetitive entry barriers.<|reference_end|>
arxiv
@article{nadel2001the, title={The Consumer Product Selection Process in an Internet Age: Obstacles to Maximum Effectiveness \& Policy Options}, author={Mark S. Nadel}, journal={Harvard Journal of Law \& Technology, vol. 14, pp. 183-266 (Fall 2000)}, year={2001}, number={TPRC-2001-033}, archivePrefix={arXiv}, eprint={cs/0109114}, primaryClass={cs.CY} }
nadel2001the
arxiv-670183
cs/0109115
Prospects for Improving Competition in Mobile Roaming
<|reference_start|>Prospects for Improving Competition in Mobile Roaming: The ability to make international roaming calls is of increasing importance to customers in Europe. This contrasts with various complaints that retail prices of roaming calls are rigid and excessive. The focus of the paper is on wholesale roaming, which is the prime determinant of retail roaming prices. The paper analyses the structural conditions of wholesale roaming markets that have impaired competitive incentives, namely the high combined market share of the two leading GSM 900 operators combined with second-mover disadvantages for new entrant GSM 1800 operators, and demand externalities. The paper argues that a number of developments are under way that are likely to modify this situation in the future. With the introduction of SIM over-the-air programming, home mobile operators will be able to direct customers to networks with the lowest charges. As dual mode handsets become ubiquitous and as new entrant GSM 1800 operators reach nationwide coverage, second-mover disadvantages will disappear. Given the relatively small roaming volumes that GSM 1800 operators currently provide, they should have an incentive to lower charges in exchange for preferred roaming status. On the demand side of wholesale roaming markets, it will be the larger GSM 900 operators, and in particular those with a pan-European footprint, that will ask for lower charges in exchange for preferred roaming status. This could discriminate against mobile operators in downstream retail markets that do not have a pan-European footprint and that lack the bargaining power. However, arbitrage by roaming brokers, new entry and wider geographical markets on the retail roaming level will work against this.<|reference_end|>
arxiv
@article{stumpf2001prospects, title={Prospects for Improving Competition in Mobile Roaming}, author={Ulrich Stumpf}, journal={arXiv preprint arXiv:cs/0109115}, year={2001}, number={TPRC-2001-022}, archivePrefix={arXiv}, eprint={cs/0109115}, primaryClass={cs.CY} }
stumpf2001prospects
arxiv-670184
cs/0109116
Digital Color Imaging
<|reference_start|>Digital Color Imaging: This paper surveys current technology and research in the area of digital color imaging. In order to establish the background and lay down terminology, fundamental concepts of color perception and measurement are first presented using vector-space notation and terminology. Present-day color recording and reproduction systems are reviewed along with the common mathematical models used for representing these devices. Algorithms for processing color images for display and communication are surveyed, and a forecast of research trends is attempted. An extensive bibliography is provided.<|reference_end|>
arxiv
@article{sharma2001digital, title={Digital Color Imaging}, author={Gaurav Sharma and H. Joel Trussell}, journal={IEEE Trans. Image Proc., vol. 6, no. 7, pp. 901-932, Jul. 1997}, year={2001}, doi={10.1109/83.597268}, archivePrefix={arXiv}, eprint={cs/0109116}, primaryClass={cs.CV cs.GR} }
sharma2001digital
arxiv-670185
cs/0110001
Computer Security: Competing Concepts
<|reference_start|>Computer Security: Competing Concepts: This paper focuses on a tension we discovered in the philosophical part of our multidisciplinary project on values in web-browser security. Our project draws on the methods and perspectives of empirical social science, computer science, and philosophy to identify values embodied in existing web-browser security and also to prescribe changes to existing systems (in particular, Mozilla) so that values relevant to web-browser systems are better served than they presently are. The tension, which we had not seen explicitly addressed in any other work on computer security, emerged when we set out to extract from the concept of security the set of values that ought to guide the shape of web-browser security. We found it impossible to construct an internally consistent set of values until we realized that two robust -- and in places competing -- conceptions of computer security were influencing our thinking. We needed to pry these apart and make a primary commitment to one. One conception of computer security invokes the ordinary meaning of security. According to it, computer security should protect people -- computer users -- against dangers, harms, and threats. Clearly this ordinary conception of security is already informing much of the work and rhetoric surrounding computer security. But another, substantively richer conception also defines the aims and trajectory of computer security -- computer security as an element of national security. Although, like the ordinary conception, this one is also concerned with protection against threats, its primary subject is the state, not the individual. The two conceptions suggest divergent system specifications, not for all mechanisms but for a significant few.<|reference_end|>
arxiv
@article{nissenbaum2001computer, title={Computer Security: Competing Concepts}, author={Helen Nissenbaum and Batya Friedman and Edward Felten}, journal={arXiv preprint arXiv:cs/0110001}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110001}, primaryClass={cs.CY} }
nissenbaum2001computer
arxiv-670186
cs/0110002
Inventing E-Regulation in the US, EU and East Asia: Conflicting Social Visions of the Internet & the Information Society
<|reference_start|>Inventing E-Regulation in the US, EU and East Asia: Conflicting Social Visions of the Internet & the Information Society: This paper attempts to assess the international approach to Internet policy in the context of distinctive socio-political frameworks evolving in the US, the European Union (EU), and East Asia. The comparative review will develop a set of underlying structural models of the Information Society particular to each region, along with an analysis of their defining characteristics in relation to one another. This examination demonstrates how each region, given its regulatory legacy, has elected a different mix of good and bad socio-political choices in public policy for the Internet. Despite the range and diversity of paths to e-regulation suggested in these choices, none adequately addresses the underlying issue of how to promote an innovative society that is open to broad social participation. The paper evaluates principal weaknesses in these regional models of Internet policy and argues the need for re-conceptualizing the cultural, political and economic approach to the new information space of the Internet.<|reference_end|>
arxiv
@article{venturelli2001inventing, title={Inventing E-Regulation in the US, EU and East Asia: Conflicting Social Visions of the Internet & the Information Society}, author={Shalini Venturelli}, journal={arXiv preprint arXiv:cs/0110002}, year={2001}, number={TPRC-2001-040}, archivePrefix={arXiv}, eprint={cs/0110002}, primaryClass={cs.CY} }
venturelli2001inventing
arxiv-670187
cs/0110003
The temporal calculus of conditional objects and conditional events
<|reference_start|>The temporal calculus of conditional objects and conditional events: We consider the problem of defining conditional objects (a|b), which would allow one to regard the conditional probability Pr(a|b) as the probability of a well-defined event rather than as a shorthand for Pr(ab)/Pr(b). The next issue is to define boolean combinations of conditional objects, and possibly also the operator of further conditioning. These questions have been investigated at least since the times of George Boole, leading to a number of formalisms proposed for conditional objects, mostly of a syntactical, proof-theoretic vein. We propose a unifying, semantical approach, in which conditional events are (projections of) Markov chains, definable in the three-valued extension of the past tense fragment of propositional linear time logic, or, equivalently, by three-valued counter-free Moore machines. Thus our conditional objects are indeed stochastic processes, one of the central notions of modern probability theory. Our model fulfills early ideas of Bruno de Finetti and, moreover, as we show in a separate paper, all the previously proposed algebras of conditional events can be isomorphically embedded in our model.<|reference_end|>
arxiv
@article{tyszkiewicz2001the, title={The temporal calculus of conditional objects and conditional events}, author={Jerzy Tyszkiewicz and Arthur Ramer and Achim Hoffmann}, journal={arXiv preprint arXiv:cs/0110003}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110003}, primaryClass={cs.AI cs.LO} }
tyszkiewicz2001the
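To make the identity at the heart of the abstract above concrete, a conditional object (a|b) can be treated as a three-valued quantity: true on outcomes where both a and b hold, false where b holds but a does not, and undefined outside b, so that Pr(a|b) comes out as Pr(ab)/Pr(b). The sketch below is ours, not the paper's Markov-chain construction; it assumes equiprobable outcomes, and the helper names (`conditional`, `prob_of_conditional`) are invented for illustration.

```python
from fractions import Fraction

# Three-valued reading of a conditional object (a|b): true if a and b both
# hold, false if b holds but a does not, undefined (None) outside b.
def conditional(a, b, outcome):
    if not b(outcome):
        return None            # conditioning event fails: undefined
    return a(outcome)

def prob_of_conditional(a, b, outcomes):
    """Pr(a|b) = Pr(ab)/Pr(b): frequency of truth among the defined cases."""
    defined = [w for w in outcomes if conditional(a, b, w) is not None]
    true = [w for w in defined if conditional(a, b, w)]
    return Fraction(len(true), len(defined))

# Fair die: a = "even", b = "greater than 2"; defined on {3,4,5,6}, true on {4,6}.
print(prob_of_conditional(lambda w: w % 2 == 0, lambda w: w > 2, range(1, 7)))  # 1/2
```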
arxiv-670188
cs/0110004
Embedding conditional event algebras into temporal calculus of conditionals
<|reference_start|>Embedding conditional event algebras into temporal calculus of conditionals: In this paper we prove that all the existing conditional event algebras embed into a three-valued extension of temporal logic of discrete past time, which the authors of this paper have proposed in another paper as a general model of conditional events. First of all, we discuss the descriptive incompleteness of these algebras. In this direction, we show that some important notions, like independence of conditional events, cannot be properly addressed in the framework of conditional event algebras, while they can be precisely formulated and analyzed in the temporal setting. We also demonstrate that the embeddings allow one to use Markov chain algorithms (suitable for the temporal calculus) for computing probabilities of complex conditional expressions of the embedded conditional event algebras, and that these algorithms can outperform those previously known.<|reference_end|>
arxiv
@article{tyszkiewicz2001embedding, title={Embedding conditional event algebras into temporal calculus of conditionals}, author={Jerzy Tyszkiewicz and Achim Hoffmann and Arthur Ramer}, journal={arXiv preprint arXiv:cs/0110004}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110004}, primaryClass={cs.AI cs.LO} }
tyszkiewicz2001embedding
arxiv-670189
cs/0110005
Two-way Quantum One-counter Automata
<|reference_start|>Two-way Quantum One-counter Automata: After the first treatments of quantum finite state automata by Moore and Crutchfield and by Kondacs and Watrous, a number of papers have studied the power of quantum finite state automata and their variants. This paper introduces a model of two-way quantum one-counter automata (2Q1CAs), combining the model of two-way quantum finite state automata (2QFAs) by Kondacs and Watrous and the model of one-way quantum one-counter automata (1Q1CAs) by Kravtsev. We give the definition of 2Q1CAs with well-formedness conditions. It is proved that 2Q1CAs are at least as powerful as classical two-way deterministic one-counter automata (2D1CAs), that is, every language L recognizable by 2D1CAs is recognized by 2Q1CAs with no error. It is also shown that several non-context-free languages, including {a^n b^{n^2}} and {a^n b^{2^n}}, are recognizable by 2Q1CAs with bounded error.<|reference_end|>
arxiv
@article{yamasaki2001two-way, title={Two-way Quantum One-counter Automata}, author={Tomohiro Yamasaki and Hirotada Kobayashi and Hiroshi Imai}, journal={arXiv preprint arXiv:cs/0110005}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110005}, primaryClass={cs.CC quant-ph} }
yamasaki2001two-way
arxiv-670190
cs/0110006
Electronic Commerce, Consumer Search and Retailing Cost Reduction
<|reference_start|>Electronic Commerce, Consumer Search and Retailing Cost Reduction: This paper explains four things in a unified way. First, how e-commerce can generate price equilibria where physical shops either compete with virtual shops for consumers with Internet access, or alternatively, sell only to consumers with no Internet access. Second, how these price equilibria might involve price dispersion on-line. Third, why prices may be higher on-line. Fourth, why established firms can, but need not, be more reluctant than newly created firms to adopt e-commerce. For this purpose we develop a model where e-commerce reduces consumers' search costs, involves trade-offs for consumers, and reduces retailing costs.<|reference_end|>
arxiv
@article{pereira2001electronic, title={Electronic Commerce, Consumer Search and Retailing Cost Reduction}, author={Pedro Pereira and Cristina Maz\'on}, journal={arXiv preprint arXiv:cs/0110006}, year={2001}, number={TPRC-2001-XXX}, archivePrefix={arXiv}, eprint={cs/0110006}, primaryClass={cs.HC} }
pereira2001electronic
arxiv-670191
cs/0110007
Variable and Value Ordering When Solving Balanced Academic Curriculum Problems
<|reference_start|>Variable and Value Ordering When Solving Balanced Academic Curriculum Problems: In this paper we present the use of Constraint Programming for solving balanced academic curriculum problems. We discuss the important role that heuristics play when solving a problem using a constraint-based approach. We also show how constraint solving techniques make it possible to solve, very efficiently, combinatorial optimization problems that are too hard for integer programming techniques.<|reference_end|>
arxiv
@article{castro2001variable, title={Variable and Value Ordering When Solving Balanced Academic Curriculum Problems}, author={Carlos Castro and Sebastian Manzano}, journal={Proceedings of 6th Workshop of the ERCIM WG on Constraints (Prague, June 2001)}, year={2001}, number={DI-UTFSM TR 2001/1}, archivePrefix={arXiv}, eprint={cs/0110007}, primaryClass={cs.PL} }
castro2001variable
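The heuristics the abstract above emphasizes can be illustrated with the first-fail principle, a standard variable-ordering rule in constraint programming: always branch on the unassigned variable with the fewest remaining consistent values. The toy sketch below is ours, using invented data and a deliberately simplified model (courses assigned to periods under a credit-load cap); it is not the authors' formulation of the balanced academic curriculum problem.

```python
# Backtracking search with a "first-fail" variable-ordering heuristic:
# branch on the most constrained course (fewest feasible periods) first.
def consistent_values(course, credits, periods, cap, assignment):
    load = {p: sum(credits[c] for c, q in assignment.items() if q == p)
            for p in periods}
    return [p for p in periods if load[p] + credits[course] <= cap]

def solve(credits, periods, cap, assignment=None):
    assignment = assignment or {}
    unassigned = [c for c in credits if c not in assignment]
    if not unassigned:
        return assignment
    # First-fail: pick the course with the fewest remaining consistent periods.
    course = min(unassigned,
                 key=lambda c: len(consistent_values(c, credits, periods,
                                                     cap, assignment)))
    for p in consistent_values(course, credits, periods, cap, assignment):
        result = solve(credits, periods, cap, {**assignment, course: p})
        if result is not None:
            return result
    return None  # dead end: backtrack

credits = {"Calculus": 5, "Physics": 4, "Logic": 3, "Algebra": 4, "Lab": 2}
print(solve(credits, periods=[1, 2, 3], cap=7))
```

Value ordering could be layered on top by sorting `consistent_values`, for instance preferring the currently least-loaded period to keep the curriculum balanced.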
arxiv-670192
cs/0110008
Analyzing Website Choice Using Clickstream Data
<|reference_start|>Analyzing Website Choice Using Clickstream Data: This paper uses clickstream data from Plurimus Corp. (formerly Foveon Corp.) to analyze user choice of Internet portals. It will show that commonly used econometric models for examining grocery scanner data can be applied to clickstream data on advertising-based online markets. Developing a framework to study consumer choices of free websites is an essential step to better understanding user behavior on the Internet. The main data for this study is a clickstream data set consisting of every website visited by 2654 users from December 27 1999 to March 31 2000. Using this data, I construct several variables including search success, time spent searching, and whether a website is an individual's starting page. I also have advertising and media mentions data. This study helps to increase understanding of user behavior on the Internet. It explores some key determinants of website choice and simulates market responses to changes in the online environment. By better understanding website choice, we can better evaluate the implications of policy decisions.<|reference_end|>
arxiv
@article{goldfarb2001analyzing, title={Analyzing Website Choice Using Clickstream Data}, author={Avi Goldfarb}, journal={arXiv preprint arXiv:cs/0110008}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110008}, primaryClass={cs.CY} }
goldfarb2001analyzing
arxiv-670193
cs/0110009
Algorithmic Self-Assembly of DNA Tiles and its Application to Cryptanalysis
<|reference_start|>Algorithmic Self-Assembly of DNA Tiles and its Application to Cryptanalysis: The early promises of DNA computing to deliver a massively parallel architecture well-suited to computationally hard problems have so far been largely unkept. Indeed, it is probably fair to say that only toy problems have been addressed experimentally. Recent experimental developments on algorithmic self-assembly using DNA tiles seem to offer the most promising path toward a potentially useful application of the DNA computing concept. In this paper, we explore new geometries for algorithmic self-assembly, departing from those previously described in the literature. This enables us to carry out mathematical operations like binary multiplication or the cyclic convolution product. We then show how to use the latter operation to implement an attack against the well-known public-key cryptosystem NTRU.<|reference_end|>
arxiv
@article{pelletier2001algorithmic, title={Algorithmic Self-Assembly of DNA Tiles and its Application to Cryptanalysis}, author={Olivier Pelletier and Andre Weimerskirch}, journal={arXiv preprint arXiv:cs/0110009}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110009}, primaryClass={cs.CR} }
pelletier2001algorithmic
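The cyclic convolution product mentioned in the abstract above is polynomial multiplication in Z[x]/(x^N - 1), the ring operation at the core of NTRU. A minimal plain-Python reference follows; it shows the arithmetic the tile assemblies are designed to compute, not the self-assembly encoding itself, and the function name is ours.

```python
# Cyclic convolution product of two coefficient vectors, i.e. polynomial
# multiplication in Z[x]/(x^N - 1) -- the ring operation underlying NTRU.
def cyclic_convolution(f, g, modulus=None):
    n = len(f)
    assert len(g) == n, "operands must have the same length"
    h = [0] * n
    for i in range(n):
        for j in range(n):
            h[(i + j) % n] += f[i] * g[j]   # exponents wrap around mod n
    if modulus is not None:
        h = [c % modulus for c in h]        # NTRU reduces coefficients mod q
    return h

# (1 + x) * (1 + x + x^2) in Z[x]/(x^3 - 1):
print(cyclic_convolution([1, 1, 0], [1, 1, 1]))  # [2, 2, 2]
```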
arxiv-670194
cs/0110010
Pushdown Timed Automata: a Binary Reachability Characterization and Safety Verification
<|reference_start|>Pushdown Timed Automata: a Binary Reachability Characterization and Safety Verification: We consider pushdown timed automata (PTAs) that are timed automata (with dense clocks) augmented with a pushdown stack. A configuration of a PTA includes a control state, dense clock values and a stack word. By using the pattern technique, we give a decidable characterization of the binary reachability (i.e., the set of all pairs of configurations such that one can reach the other) of a PTA. Since a timed automaton can be treated as a PTA without the pushdown stack, we can show that the binary reachability of a timed automaton is definable in the additive theory of reals and integers. The results can be used to verify a class of properties containing linear relations over both dense variables and unbounded discrete variables. These properties previously could not be verified using the classic region technique, nor expressed by timed temporal logics for timed automata or by CTL$^*$ for pushdown systems. The results are also extended to other generalizations of timed automata.<|reference_end|>
arxiv
@article{dang2001pushdown, title={Pushdown Timed Automata: a Binary Reachability Characterization and Safety Verification}, author={Zhe Dang}, journal={arXiv preprint arXiv:cs/0110010}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110010}, primaryClass={cs.LO} }
dang2001pushdown
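As a small illustration of the objects involved in the abstract above: a PTA configuration packages a control state, dense clock values, and a stack word, and binary reachability is a relation on pairs of such configurations. The sketch below is only a data-structure mock-up with invented field names; it implements none of the paper's decidability machinery.

```python
from dataclasses import dataclass
from typing import Tuple

# A PTA configuration as described in the abstract: a control state, dense
# (real-valued) clock values, and a stack word. Field names are ours.
@dataclass(frozen=True)
class Configuration:
    state: str
    clocks: Tuple[float, ...]   # dense clock values
    stack: str                  # stack word

# The verifiable properties mix dense and discrete quantities, e.g. a
# linear relation between a clock increment and the stack-length change.
def sample_property(c1: Configuration, c2: Configuration) -> bool:
    return c2.clocks[0] - c1.clocks[0] <= 2 * (len(c2.stack) - len(c1.stack))

c1 = Configuration("q0", (0.0,), "")
c2 = Configuration("q1", (1.5,), "ab")
print(sample_property(c1, c2))   # True: 1.5 <= 2 * 2
```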
arxiv-670195
cs/0110011
The Minimum Expectation Selection Problem
<|reference_start|>The Minimum Expectation Selection Problem: We define the min-min expectation selection problem (resp. max-min expectation selection problem) to be that of selecting k out of n given discrete probability distributions, to minimize (resp. maximize) the expectation of the minimum value resulting when independent random variables are drawn from the selected distributions. We assume each distribution has finitely many atoms. Let d be the number of distinct values in the support of the distributions. We show that if d is a constant greater than 2, the min-min expectation problem is NP-complete but admits a fully polynomial time approximation scheme. For d an arbitrary integer, it is NP-hard to approximate the min-min expectation problem with any constant approximation factor. The max-min expectation problem is polynomially solvable for constant d; we leave open its complexity for variable d. We also show similar results for binary selection problems in which we must choose one distribution from each of n pairs of distributions.<|reference_end|>
arxiv
@article{eppstein2001the, title={The Minimum Expectation Selection Problem}, author={David Eppstein and George Lueker}, journal={Random Structures and Algorithms 21:278-292, 2002}, year={2001}, doi={10.1002/rsa.10061}, archivePrefix={arXiv}, eprint={cs/0110011}, primaryClass={cs.DS math.PR} }
eppstein2001the
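The objective defined in the abstract above is easy to compute exactly for finite-support distributions: since the draws are independent, P(min >= v) is the product of the individual survival probabilities, and E[min] follows by summing over the distinct support values. The brute-force sketch below is ours and only illustrates the objective; it is exponential in n and is not the paper's approximation scheme.

```python
from itertools import combinations

# Expectation of the minimum of independent draws from discrete
# distributions (each a {value: probability} dict), via survival
# functions: P(min >= v) = prod_i P(X_i >= v).
def expected_min(dists):
    values = sorted({v for d in dists for v in d})
    def surv(v):                       # P(min >= v)
        p = 1.0
        for d in dists:
            p *= sum(q for u, q in d.items() if u >= v)
        return p
    tail = [surv(v) for v in values] + [0.0]
    return sum(v * (tail[i] - tail[i + 1]) for i, v in enumerate(values))

def min_min_selection(dists, k):
    """Pick the k distributions minimizing E[min] by exhaustive search."""
    return min(combinations(dists, k), key=expected_min)

d1 = {1: 0.5, 3: 0.5}
d2 = {2: 1.0}
d3 = {1: 0.1, 4: 0.9}
print(expected_min(min_min_selection([d1, d2, d3], k=2)))  # 1.5, using {d1, d2}
```

Replacing `min` with `max` in `min_min_selection` gives the max-min variant, which maximizes the same objective.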
arxiv-670196
cs/0110012
Proceedings of the 6th Annual Workshop of the ERCIM Working Group on Constraints
<|reference_start|>Proceedings of the 6th Annual Workshop of the ERCIM Working Group on Constraints: Homepage of the workshop proceedings, with links to all individually archived papers<|reference_end|>
arxiv
@article{apt2001proceedings, title={Proceedings of the 6th Annual Workshop of the ERCIM Working Group on Constraints}, author={Krzysztof R. Apt and Roman Bartak and Eric Monfroy and Francesca Rossi and Sebastian Brand}, journal={arXiv preprint arXiv:cs/0110012}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110012}, primaryClass={cs.PL} }
apt2001proceedings
arxiv-670197
cs/0110013
A Proposal for Dynamic Access Lists for TCP/IP Packet Filtering
<|reference_start|>A Proposal for Dynamic Access Lists for TCP/IP Packet Filtering: The use of IP filtering to improve system security is well established and, although limited in what it can achieve, has proved to be efficient and effective. In the design of a security policy there is always a trade-off between usability and security. Restricting access means that legitimate use of the network is prevented; allowing access means illegitimate use may be allowed. Static access lists make finding a balance particularly stark -- we pay the price of decreased security 100% of the time even if the benefit of increased usability is only gained 1% of the time. Dynamic access lists would allow the rules to change for short periods of time, and allow local changes by non-experts. The network administrator can set basic security guidelines which allow certain basic services only. All other services are restricted, but users are able to request temporary exceptions in order to allow additional access to the network. These exceptions are granted depending on the privileges of the user. This paper covers the following topics: (1) a basic introduction to TCP/IP filtering; (2) semantics for dynamic access lists; (3) a proposed protocol for allowing dynamic access; and (4) a method for representing access lists so that dynamic update and look-up can be performed efficiently.<|reference_end|>
arxiv
@article{hazelhurst2001a, title={A Proposal for Dynamic Access Lists for TCP/IP Packet Filtering}, author={Scott Hazelhurst}, journal={arXiv preprint arXiv:cs/0110013}, year={2001}, number={TR-Wits-CS-2001-2}, archivePrefix={arXiv}, eprint={cs/0110013}, primaryClass={cs.NI cs.LO} }
hazelhurst2001a
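The semantics sketched in the abstract above -- a static baseline policy plus user-requested temporary exceptions -- can be mocked up in a few lines. The sketch below is ours: the rule representation, field names, and TTL mechanism are invented for illustration, and it implements neither the paper's protocol nor its efficient look-up representation.

```python
import time

# A toy dynamic access list: static base rules plus temporary exceptions
# that expire after a time-to-live.
class DynamicAccessList:
    def __init__(self, base_rules):
        self.base_rules = base_rules      # list of (predicate, allow) pairs
        self.exceptions = []              # list of (predicate, allow, expiry)

    def grant_exception(self, predicate, allow, ttl_seconds):
        """Temporarily override the static policy for matching packets."""
        self.exceptions.append((predicate, allow, time.time() + ttl_seconds))

    def permits(self, packet):
        now = time.time()
        self.exceptions = [e for e in self.exceptions if e[2] > now]  # purge expired
        for predicate, allow, _ in self.exceptions:   # exceptions take precedence
            if predicate(packet):
                return allow
        for predicate, allow in self.base_rules:      # first matching rule wins
            if predicate(packet):
                return allow
        return False                                  # default deny

acl = DynamicAccessList([(lambda p: p["dport"] == 80, True)])
acl.grant_exception(lambda p: p["dport"] == 22, allow=True, ttl_seconds=3600)
print(acl.permits({"dport": 22}))   # True until the exception expires
```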
arxiv-670198
cs/0110014
The Open Language Archives Community and Asian Language Resources
<|reference_start|>The Open Language Archives Community and Asian Language Resources: The Open Language Archives Community (OLAC) is a new project to build a worldwide system of federated language archives based on the Open Archives Initiative and the Dublin Core Metadata Initiative. This paper aims to disseminate the OLAC vision to the language resources community in Asia, and to show language technologists and linguists how they can document their tools and data in such a way that others can easily discover them. We describe OLAC and the OLAC Metadata Set, then discuss two key issues in the Asian context: language classification and multilingual resource classification.<|reference_end|>
arxiv
@article{bird2001the, title={The Open Language Archives Community and Asian Language Resources}, author={Steven Bird and Gary Simons and Chu-Ren Huang}, journal={Proceedings of the Workshop on Language Resources in Asia, 6th Natural Language Processing Pacific Rim Symposium (NLPRS), Tokyo, November 2001}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110014}, primaryClass={cs.CL cs.DL} }
bird2001the
arxiv-670199
cs/0110015
Richer Syntactic Dependencies for Structured Language Modeling
<|reference_start|>Richer Syntactic Dependencies for Structured Language Modeling: The paper investigates the use of richer syntactic dependencies in the structured language model (SLM). We present two simple methods of enriching the dependencies in the syntactic parse trees used for initializing the SLM. We evaluate the impact of both methods on the perplexity (PPL) and word error rate (WER, N-best rescoring) performance of the SLM. We show that the new model achieves an improvement in PPL and WER over the baseline results reported using the SLM on the UPenn Treebank and Wall Street Journal (WSJ) corpora, respectively.<|reference_end|>
arxiv
@article{chelba2001richer, title={Richer Syntactic Dependencies for Structured Language Modeling}, author={Ciprian Chelba and Peng Xu}, journal={arXiv preprint arXiv:cs/0110015}, year={2001}, archivePrefix={arXiv}, eprint={cs/0110015}, primaryClass={cs.CL} }
chelba2001richer
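The PPL metric used in the abstract above is standard: the exponential of the average negative log-probability the model assigns to the test words. A generic sketch of the metric (ours, not the SLM itself) follows.

```python
import math

# Perplexity of a language model over a test corpus: exp of the average
# negative log-probability per word.
def perplexity(word_probs):
    """word_probs: the model's probability for each token in the test data."""
    assert all(0.0 < p <= 1.0 for p in word_probs)
    avg_neg_logprob = -sum(math.log(p) for p in word_probs) / len(word_probs)
    return math.exp(avg_neg_logprob)

print(perplexity([0.25, 0.5, 0.125, 0.25]))  # 4.0: geometric mean of inverse probs
```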
arxiv-670200
cs/0110016
Limits To Certainty in QoS Pricing and Bandwidth
<|reference_start|>Limits To Certainty in QoS Pricing and Bandwidth: Advanced services require more reliable bandwidth than is currently provided by the Internet Protocol, even with the reliability enhancements provided by TCP. More reliable bandwidth will be provided through QoS (quality of service), as is currently widely discussed. Yet QoS has some implications beyond providing ubiquitous access to advanced Internet services, which are of interest from a policy perspective. In particular, what are the implications for the price of Internet services? Further, how will these changes impact demand and universal service for the Internet? This paper explores the relationship between certainty of bandwidth and certainty of price for Internet services over a statistically shared network and finds that these are mutually exclusive goals.<|reference_end|>
arxiv
@article{gideon2001limits, title={Limits To Certainty in QoS Pricing and Bandwidth}, author={Carolyn Gideon and L. Jean Camp}, journal={arXiv preprint arXiv:cs/0110016}, year={2001}, number={TPRC-2001-2091}, archivePrefix={arXiv}, eprint={cs/0110016}, primaryClass={cs.CY cs.HC} }
gideon2001limits