299443
https://en.wikipedia.org/wiki/John%20Hopcroft
John Hopcroft
John Edward Hopcroft (born October 7, 1939) is an American theoretical computer scientist. His textbooks on the theory of computation (one of which is known as the Cinderella book) and on data structures are regarded as standards in their fields. He is the IBM Professor of Engineering and Applied Mathematics in Computer Science at Cornell University, and the Director of the John Hopcroft Center for Computer Science at Shanghai Jiao Tong University.

Education
He received his bachelor's degree from Seattle University in 1961. He received his master's degree and Ph.D. from Stanford University in 1962 and 1964, respectively. He worked for three years at Princeton University and since then has been at Cornell University. Hopcroft is the grandson of Jacob Nist, founder of the Seattle-Tacoma Box Company.

Career
In addition to his research work, he is well known for his books on algorithms and formal languages coauthored with Jeffrey Ullman and Alfred Aho, regarded as classic texts in the field. In 1986 he received the Turing Award (jointly with Robert Tarjan) "for fundamental achievements in the design and analysis of algorithms and data structures." Along with his work with Tarjan on planar graphs, he is also known for the Hopcroft–Karp algorithm for finding matchings in bipartite graphs. In 1994 he was inducted as a Fellow of the Association for Computing Machinery. In 2005 he received the Harry H. Goode Memorial Award "for fundamental contributions to the study of algorithms and their applications in information processing." In 2008 he received the Karl V. Karlstrom Outstanding Educator Award "for his vision of and impact on computer science, including co-authoring field-defining texts on theory and algorithms, which continue to influence students 40 years later, advising PhD students who themselves are now contributing greatly to computer science, and providing influential leadership in computer science research and education at the national and international level."

Hopcroft was elected a member of the National Academy of Engineering in 1989 for fundamental contributions to computer algorithms and for authorship of outstanding computer science textbooks. In 1992, Hopcroft was nominated to the National Science Board by George H. W. Bush. In 2005, he was awarded an honorary doctorate by the University of Sydney, in Sydney, Australia. In 2009, he received an honorary doctorate from Saint Petersburg State University of Information Technologies, Mechanics and Optics. In 2017, Shanghai Jiao Tong University launched the John Hopcroft Center for Computer Science. In 2020 the Chinese University of Hong Kong, Shenzhen opened the Hopcroft Institute for Advanced Information Sciences and designated him as an Einstein professor. Hopcroft is also the co-recipient (with Jeffrey Ullman) of the 2010 IEEE John von Neumann Medal "for laying the foundations for the fields of automata and language theory and many seminal contributions to theoretical computer science."

Awards
1986. Turing Award
1989. National Academy of Engineering Member
1994. ACM Fellow
2005. Harry H. Goode Memorial Award
2008. Karl Karlstrom Outstanding Educator Award
2010. IEEE John von Neumann Medal
2016. Friendship Award (China)

Selected publications
Books
2017. Foundations of Data Science (with Avrim Blum and Ravindran Kannan).
2001. J.E. Hopcroft, Rajeev Motwani, Jeffrey D. Ullman, Introduction to Automata Theory, Languages, and Computation, Second Edition. Addison-Wesley.
1983. Alfred V. Aho, J.E. Hopcroft, Jeffrey D. Ullman, Data Structures and Algorithms, Addison-Wesley Series in Computer Science and Information Processing.
1974. Alfred V. Aho, J.E. Hopcroft, Jeffrey D. Ullman, The Design and Analysis of Computer Algorithms, Addison-Wesley Series in Computer Science and Information Processing.
1969. Formal Languages and Their Relation to Automata (with Jeffrey D. Ullman). Addison-Wesley, Reading, MA.

References

External links
John E. Hopcroft at Cornell University

American computer scientists 1939 births Living people Fellows of the Association for Computing Machinery Fellows of the Society for Industrial and Applied Mathematics Members of the United States National Academy of Engineering Members of the United States National Academy of Sciences Turing Award laureates Cornell University faculty Stanford University alumni Seattle University alumni 20th-century American engineers 21st-century American engineers 20th-century American scientists 21st-century American scientists Computer science educators American textbook writers American electrical engineers
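The Hopcroft–Karp algorithm mentioned above computes a maximum matching in a bipartite graph in O(E√V) time by repeatedly building a BFS layering from the unmatched left-side vertices and then augmenting along shortest alternating paths found by DFS. The following Python sketch is an independent illustration of that idea, not code taken from Hopcroft's publications:

```python
from collections import deque

def hopcroft_karp(adj, n_left, n_right):
    """Maximum matching in a bipartite graph.
    adj[u] lists the right-side neighbours of left vertex u."""
    match_l = [-1] * n_left   # match_l[u] = right vertex matched to u, or -1
    match_r = [-1] * n_right  # match_r[v] = left vertex matched to v, or -1

    def bfs():
        # Layer the graph starting from all free left vertices.
        dist, q, found = {}, deque(), False
        for u in range(n_left):
            if match_l[u] == -1:
                dist[u] = 0
                q.append(u)
        while q:
            u = q.popleft()
            for v in adj[u]:
                w = match_r[v]
                if w == -1:
                    found = True          # reached a free right vertex
                elif w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        return found, dist

    def dfs(u, dist):
        # Try to extend an alternating path from u along the BFS layers.
        for v in adj[u]:
            w = match_r[v]
            if w == -1 or (dist.get(w) == dist[u] + 1 and dfs(w, dist)):
                match_l[u], match_r[v] = v, u
                return True
        dist[u] = None  # dead end in this phase; prune
        return False

    matching = 0
    while True:
        found, dist = bfs()
        if not found:
            return matching
        for u in range(n_left):
            if match_l[u] == -1 and dfs(u, dist):
                matching += 1
```

For example, hopcroft_karp([[0, 1], [0]], 2, 2) returns 2, matching left vertex 0 to right vertex 1 and left vertex 1 to right vertex 0.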
52479651
https://en.wikipedia.org/wiki/PoolParty%20Semantic%20Suite
PoolParty Semantic Suite
The PoolParty Semantic Suite is a technology platform provided by the Semantic Web Company. The EU-based company was among the early pioneers of the Semantic Web movement. The software supports enterprises in knowledge management, data analytics and content organisation. The product uses standards-based technologies as defined by the W3C, which prevents vendor lock-in. Reference customers include Boehringer Ingelheim, Credit Suisse, the European Commission, REEEP, Wolters Kluwer and the World Bank Group.

History
The PoolParty Semantic Suite commercialises Semantic Web technologies. In 2009, the first release of the PoolParty semantic software entered the market. Since then, the product has evolved from a taxonomy management tool into a feature-rich semantic software platform that enables companies to deploy enterprise knowledge graphs to integrate structured and unstructured data. The product is developing fast owing to a strong R&D focus and is an integral part of multiple EU research projects of the Horizon 2020 initiative.

Product
The PoolParty Semantic Suite is a modular and flexible software package. It comprises nine modules, which can be combined individually depending on the business challenge:
Taxonomy & Thesaurus Management
Text Mining & Entity Extraction
Ontology Management
Concept Tagging
Data Integration
Linked Data Management
Semantic Search
Recommender System
Analytics & Visualization

Content assets are semantically enriched and put into context by being matched against a knowledge graph. This is the foundation for semantic applications such as search or linked data portals.

Technologies
The PoolParty Semantic Suite deploys Semantic Web technologies as promoted by the W3C. The backbone of the information architecture is built by applying SKOS (Simple Knowledge Organization System), ontologies and Linked Data principles. Any data processed within PoolParty is transformed into RDF graphs and can be queried with SPARQL. An essential advantage of this approach is that taxonomy projects developed in PoolParty can be linked with data from virtually any repository. Companies benefit from automatic entity linking between resources from the graph-based semantic layer and knowledge assets in data repositories such as document management systems. Critics argue that the technology has not yet become mainstream because it is relatively complex. As demand and requirements for smarter enterprise applications remain undiminished, however, the Semantic Web technology stack is becoming steadily more attractive to developers.

Awards and recognition
Semantic Web Company's (SWC) information security management system has been certified to the ISO 27001:2013 standard.
Gartner called PoolParty "a representative product" in its 2018 market guide for "Hosted AI Services".
Gartner named PoolParty a Visionary in its 2020 Magic Quadrant for Metadata Management Solutions.
MarkLogic recognised Semantic Web Company's PoolParty with its Partner Excellence in Technology award.
In 2017, 2018, 2019 and 2020 KMWorld named Semantic Web Company one of the 100 companies that matter in knowledge management.
In 2015, 2016 and 2017 KMWorld named PoolParty Semantic Suite a trend-setting product.
In 2016 KMWorld named Semantic Web Company one of the 100 companies that matter in knowledge management.
ZDNet wrote about "What IBM, the Semantic Web Company, and Siemens are doing with semantic technologies" (2016).
SEMANTiCS wrote about "How to develop aligned, quality-centric semantic software - Gold Sponsor Aligned Project".
Dataversity wrote about "PoolParty Releases Enhanced Version of its Semantic Middleware".
KMWorld wrote about "PoolParty Introduces Machine Learning Capabilities to its Platform".
KMWorld wrote about "Why Enterprises should embrace Taxonomies and Knowledge Graphs".
KMWorld wrote about "Knowledge graph management and text analytics from Semantic Web Company".
KMWorld wrote about "Taxonomy 101: The Basics and Getting Started with Taxonomies".

References
Acknowledgement W3C
PoolParty Semantic Suite and Semantic Web technologies
Realizing thesaurus-based use cases with the PoolParty Suite
PoolParty 6.0 brings the Most Complete Semantic Middleware to the Global Market!
PoolParty Enhances Semantic Middleware with Release 6
PoolParty Product Test Review by Europeana
PoolParty Research Papers

External links
Official Product Website
Corporate Website

Java platform software Natural language processing software Semantic Web Ontology editors
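Because SKOS, RDF and SPARQL are open W3C standards, the basic pattern the article describes (modelling a thesaurus as a graph and querying it) can be sketched with any RDF toolkit. The following Python illustration uses the open-source rdflib library as a convenient stand-in; it demonstrates the standards only, not PoolParty's own API, and the example.org URIs are invented for the illustration:

```python
from rdflib import Graph

# A tiny SKOS thesaurus in Turtle: one concept scheme, two concepts.
TURTLE = """
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix ex:   <http://example.org/thesaurus/> .

ex:scheme a skos:ConceptScheme .
ex:semweb a skos:Concept ;
    skos:prefLabel "Semantic Web"@en ;
    skos:inScheme ex:scheme .
ex:rdf a skos:Concept ;
    skos:prefLabel "RDF"@en ;
    skos:broader ex:semweb ;
    skos:inScheme ex:scheme .
"""

g = Graph()
g.parse(data=TURTLE, format="turtle")

# SPARQL: list every concept with its preferred label and optional broader concept.
QUERY = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label ?broader WHERE {
    ?concept a skos:Concept ;
             skos:prefLabel ?label .
    OPTIONAL { ?concept skos:broader ?broader }
}
"""
for concept, label, broader in g.query(QUERY):
    print(concept, label, broader)
```

Because everything is an RDF graph, the same query would work unchanged against any SKOS data, whichever tool produced it; that interchangeability is what the article means by standards preventing vendor lock-in.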
28768148
https://en.wikipedia.org/wiki/Lanner%20Group%20Ltd
Lanner Group Ltd
Lanner Group Ltd is a software company specialising in simulation software, such as discrete event simulation and predictive simulation, headquartered in Henley-in-Arden, Warwickshire. The business develops, markets and supports business process simulation and optimisation systems. The company has subsidiaries in the USA, China, France and Germany, and a distributor network selling the company's products in 20 countries. Lanner Group was formed following a management buyout of AT&T Istel, a spin-off from the operational research department of British Leyland where, in 1978, the world's first visual interactive simulation tool was developed. Lanner Group serves the automotive, aviation, criminal justice, defence and aerospace, financial services and contact centre, food and beverage, health, logistics and supply chain, manufacturing, nuclear, oil and gas, pharmaceutical, and consumer health industries.

Company history
Lanner Group was formed in 1996 after completing a management buyout from AT&T Istel; the company was initially named SEE WHY SOLUTIONS and was incorporated in 1995. Lanner Group's previous owner, AT&T Istel, formerly known as ISTEL, was initially called BL Systems, a spin-out formed in 1979 following a merger of all the computer departments under the then British Leyland umbrella. BL Systems' SEE WHY tool, programmed in Fortran 77 and launched in 1980, was the world's first commercially available visual interactive simulation package and the precursor to Lanner Group's current core software product, WITNESS. WITNESS was the first of the industrial-strength 4GL simulators. The WITNESS system was launched on the IBM PC in 1986 and has been revised frequently since; the latest version of WITNESS was released in summer 2017. The applications of the Fortran 77 / WITNESS interface have been the subject of further academic research and development. Since 1985 the company has supported an academic program for its simulation software; over 100 universities worldwide have been involved since the program began.

Lanner Group has continued to develop the WITNESS platform in parallel with its mainstay product of the same name, and since 2002 has introduced niche simulation packages for police and healthcare organisations, called PRISM and PX-Sim respectively. In 2006 the company unveiled a Java-based simulation engine called L-SIM, which is embedded in Business Process Management (BPM) software. In May of the same year, a technology partnership with BPM solution provider IDS Scheer was announced; the L-SIM product is now the simulation engine of IDS Scheer's ARIS Business Simulator. The company's WITNESS platform technology is thereby embedded into current Oracle, SAP, and IBM BPA products.

From 1996 to 2010 the company's main investor was the private equity company 3i. On 18 March 2010 Lanner Group announced that it had secured a £3 million investment deal with NVM Private Equity, replacing 3i; 3i retains an interest in Lanner Group as a small minority investor.

References

Further reading

External links
Official website
Company history of AT&T Istel Ltd.

British Leyland Simulation software Modeling and simulation
50160038
https://en.wikipedia.org/wiki/Murex%20%28financial%20software%29
Murex (financial software)
Murex is a company that provides financial software for trading, treasury, risk, and post-trade operations for financial markets. Murex was founded in 1986 in Paris by Laurent Néel and Salim Edde, who were soon followed by Salim's three brothers and his brother-in-law. Today, the company employs a multi-national workforce of over 2,200 employees worldwide. Its main office is in Paris, and it has 18 global offices in cities such as New York, London, Dublin, Hong Kong, Beirut, Sydney and Singapore. Murex has customers in 70 countries. Murex's platform, MX.3, is used by banks, asset managers, pension funds and insurance companies. Its clients include UBS, the National Bank of Canada, the Bank of China, OCBC Bank, China Merchants Bank, the National Bank of Kuwait, Banorte and ATB Financial. Maroun Edde is the current Chief Executive Officer. In the Truffle 100 rankings for 2020, Murex was recognised as the third-largest French software publisher, with an announced turnover of 569 million euros.

History
In 2013, National Australia Bank undertook an overhaul of its trading operations by going live on MX.3 to support its FX trading and processing. MX.3 was first adopted in Melbourne and subsequently rolled out to its international operations on a phased basis. In 2014, the Singaporean bank DBS adopted MX.3 in its risk management operations. That same year, Murex partnered with Tullett Prebon, allowing clients to use TPI data for internal model validation. Then, in November, UBS announced that it had chosen Murex's software to replace a large part of its fixed-income platform technology, including the booking of trades, valuation and risk management. In the Truffle 100 rankings for 2016, Murex became the third-largest French software publisher, with an announced turnover of 460 million euros.

In 2017, China Merchants Bank (CMB) went live on Murex's MX.3 trading platform to improve the technology at its Shenzhen and Shanghai operations, following the bank becoming one of the first Chinese financial institutions to join the R3 blockchain consortium. Murex took part in the pilot project for Teen Turn in Dublin, an initiative that seeks to create a talent pipeline for women in science, technology, engineering, and mathematics (STEM). Murex announced its partnership with Microsoft, bringing MX.3 into the cloud with certification on Microsoft Azure. ATB Financial began the first phase of moving its infrastructure and application management into the cloud with the implementation of Murex's MX.3 Software-as-a-Service (SaaS) in August 2017; the final phase of the project was completed in 2020 when ATB's commodities desk went live. In December 2017, Murex collaborated with Amazon Web Services to make MX.3 available on Amazon's cloud platform. MX.3 on the cloud can be used for applications such as development and testing, running production processes, disaster recovery, and accessing cloud-based managed services.

In 2018, Nationwide Building Society went live on MX.3 to replace its legacy system, secure wider funding sources, review pricing rules and implement tighter risk, liquidity and collateral controls. Murex partnered on the project with Sapient Global Markets, who delivered testing automation to fast-track deployments. Italy-based Banca IMI, a subsidiary of Intesa Sanpaolo, migrated its equity derivatives business to MX.3 as part of a technology update for its capital markets business. Bankdata, a Danish banking IT provider, announced it was expanding its use of MX.3 to its remaining member banks following implementations with Jyske Bank and Sydbank; MX.3 replaces legacy systems, providing one platform covering the whole trade lifecycle and centralising risk-calculation, trading and settlement data for real-time processing. The Depository Trust & Clearing Corporation (DTCC) partnered with Murex to support reporting requirements under the Securities Financing Transactions Regulation (SFTR); Murex's SFTR solution links to DTCC's Global Trade Repository to enable simplified reporting and lower implementation costs.

In 2019, Banorte, Mexico's second-largest investment bank, announced it was extending its use of MX.3 to some of its main risk and compliance functions; MX.3 will enable Banorte to automate and digitize its operations, including counterparty risk, derivatives valuation adjustment (XVA) and collateral management. Ping An Bank, a Chinese joint-stock commercial bank, adopted the MX.3 platform in 2019, aiming to close the gap between front and back office and enable its expansion into new areas.

In 2020, the Thai bank Krungsri (Bank of Ayudhya) implemented MX.3 to simplify its existing IT infrastructure and improve regulatory reporting transparency. Also in 2020, Murex facilitated the first overnight indexed swap derivatives transaction based on the new Thai reference rate THOR for Kasikornbank. Ahead of the Libor transition, a global effort to reform interest rate benchmarks, Murex is working with its clients to prepare their IT systems. In 2021, Murex added a physical presence in Cyprus to bolster its regional footprint in the Europe-Middle East-Africa region.

Corporate affairs and culture
Murex is involved in the Teen Turn programme, an initiative that seeks to create a talent pipeline for women in science, technology, engineering, and mathematics (STEM). In 2017 Murex sponsored Teen Turn's pilot project, hosting a two-week work placement at its offices in Dublin, where female employees acted as mentors for local students. In 2018, Murex again sponsored Teen Turn, supporting an event for mentors and women in technology in Dublin. Murex was ranked tenth among companies of 500 to 999 employees in the Les Echos Happy at Work survey of 2018, and was recognized among the top five best employers to work for in France in Glassdoor's 2020 and 2021 Employee Choice Awards.

References

Banking software companies Companies based in Paris Financial software Financial software companies Financial technology companies Software companies of France French brands 1986 establishments in France Companies established in 1986
547803
https://en.wikipedia.org/wiki/Silpakorn%20University
Silpakorn University
Silpakorn University (SU) is a national university in Thailand. The university was founded in Bangkok in 1943 by the Tuscan-born art professor Corrado Feroci, who took the Thai name Silpa Bhirasri when he became a Thai citizen. It began as a fine arts university and now includes many other faculties as well. In 2016, it had 25,210 students.

History
Silpakorn University was originally established as the School of Fine Arts under Thailand's Fine Arts Department in 1933. The school offered only painting and sculpture programs and waived tuition fees for government officials and students. Its creation owes much to the near-lifelong devotion of Professor Silpa Bhirasri, an Italian sculptor (formerly Corrado Feroci) who was commissioned during the reign of King Rama VI to work in the Fine Arts Department. He subsequently enlarged his classes to admit greater numbers of the interested public before setting up the School of Fine Arts. The school gradually developed and was officially accorded a new status and named Silpakorn University on 12 October 1943. Its inaugural faculty was the Faculty of Painting and Sculpture. In 1955, the Faculty of Thai Architecture (later renamed the Faculty of Architecture) was established, and two more faculties were created: the Faculty of Archaeology and the Faculty of Decorative Arts.

In 1966, Silpakorn University diversified the four faculties into sub-specializations to broaden its offerings, but the university's Wang Tha Phra campus proved inadequate. A new campus, Sanam Chandra Palace, was established in Nakhon Pathom Province in the former residential compound of King Rama VI. The first two faculties based on this campus were the Faculty of Arts in 1968 and the Faculty of Education in 1970. Later, three more faculties were created: the Faculty of Science in 1972, the Faculty of Pharmacy in 1986, and the Faculty of Engineering and Industrial Technology in 1992. In 1999, the Faculty of Music was created.

In 1997, Silpakorn extended its reach by establishing a new campus in Phetchaburi Province, named the "Phetchaburi Information Technology Campus". In 2001 and 2002, the Faculty of Animal Sciences and Agricultural Technology and the Faculty of Management Science were established on the Phetchaburi campus. In 2003, the Faculty of Information and Communication Technology (ICT) was established, as well as Silpakorn University International College (SUIC), whose role is to provide an international curriculum in additional fields of study.

Ganesha, one of the Hindu deities, symbolizing arts and crafts, is Silpakorn University's emblem. The "university tree" is the chan tree.

Campuses
Tha Phra Palace
Tha Phra Palace was Silpakorn's first campus. It occupies a small part of the inner city of Bangkok known as Rattanakosin Island. Opposite the Grand Palace and covering an area of 8 rai, the campus was once the palace of Prince Narisara Nuwattiwong. On its west side is the Chao Phraya River. The office of the university president is in Taling Chan District, Bangkok.

Sanam Chandra Palace Campus
Sanam Chandra Palace Campus is on the grounds of Sanam Chandra Palace in Nakhon Pathom, which was once the royal pavilion of King Rama VI of the Chakri dynasty. It occupies 440 rai.

Phetchaburi Information Technology Campus
The 820 rai Phetchaburi Information Technology Campus is in Phetchaburi Province.

Faculties
Faculty of Painting, Sculpture and Graphic Arts
Faculty of Architecture
Faculty of Archaeology
Faculty of Decorative Arts
Faculty of Arts
Faculty of Education
Faculty of Science
Faculty of Pharmacy
Faculty of Engineering and Industrial Technology
Faculty of Music
Faculty of Animal Sciences and Agricultural Technology
Faculty of Management Science
Faculty of Information and Communication Technology
Silpakorn University International College (SUIC)
Graduate School

Notable alumni
Princess Maha Chakri Sirindhorn – Princess of Thailand
Princess Chulabhorn – Princess of Thailand
Princess Siribhachudabhorn – Princess of Thailand
Angkarn Kalayanapong – National Artist of Thailand (Literature), poet
Chalermchai Kositpipat – National Artist of Thailand (Fine art and visual art), founder of the White Temple (Wat Rong Khun)
Thawan Duchanee – National Artist of Thailand (Fine art and visual art)
Chavalit Soemprungsuk – National Artist of Thailand (Visual art), painter

Gallery

See also
Silpakorn University Art Gallery
Education in Thailand
List of universities and colleges in Thailand
Lists of universities and colleges

References

External links
Official website of Silpakorn University
Social News Silpakorn

Universities and colleges in Bangkok Educational institutions established in 1943 Phra Nakhon District 1943 establishments in Thailand Art schools in Thailand
56193615
https://en.wikipedia.org/wiki/IncludeOS
IncludeOS
IncludeOS is a minimal, open-source unikernel operating system for cloud services and IoT. IncludeOS allows users to run C++ applications in the cloud without a conventional operating system: operating system functionality is linked directly into the application, so that the build produces a bootable virtual machine image containing little more than the application itself. IncludeOS applications boot in tens of milliseconds and require only a few megabytes of disk and memory.

Architecture
The minimalist architecture of IncludeOS means that everything runs in a single address space, with no virtual memory separation; consequently, there is no concept of system calls or of a distinct user space.

References

External links
IncludeOS on GitHub
IncludeOS blog
Alfred Bratterud: Deconstructing the OS: The devil's in the side effects, CppCon 2017 presentation
C++ Weekly - Ep 31 - IncludeOS

Computing platforms Free software operating systems Software using the Apache license Software companies of Norway Free software programmed in C++
5818561
https://en.wikipedia.org/wiki/J.%20Anthony%20Hall
J. Anthony Hall
J. Anthony Hall FREng is a leading British software engineer specializing in the use of formal methods, especially the Z notation. Anthony Hall was educated at the University of Oxford, with a BA in chemistry and a DPhil in theoretical chemistry. His subsequent posts have included:
ICI Research Fellow, Department of Theoretical Chemistry, University of Sheffield (1971–1973)
Principal Scientific Officer, British Museum Research Laboratory (1973–1980)
Senior Consultant, Systems Programming Limited (1980–1984)
Principal Consultant, Systems Designers (1984–1986)
Visiting Professor, Carnegie Mellon University (1994)
Principal Consultant, Praxis Critical Systems (1986–2004)

In particular, Hall has worked on software development using formal methods for the UK National Air Traffic Services (NATS). He has been an invited speaker at conferences concerned with formal methods, requirements engineering and software engineering. Since 2004, Hall has been an independent consultant. He has also been a visiting professor at the University of York. Hall was the founding chair of ForTIA, the Formal Techniques Industry Association.

Selected publications
Anthony Hall, Seven Myths of Formal Methods, IEEE Software, September 1990, pp. 11–19.
Anthony Hall and Roderick Chapman, Correctness by Construction: Developing a Commercial Secure System, IEEE Software, January/February 2002, pp. 18–25.

References
Career history

External links
Anthony Hall website

Living people British computer programmers British computer scientists Formal methods people Fellows of the Royal Academy of Engineering Fellows of the British Computer Society Alumni of the University of Oxford Employees of the British Museum Academics of the University of Sheffield British software engineers Year of birth missing (living people)
366243
https://en.wikipedia.org/wiki/Poser%20%28software%29
Poser (software)
Poser (and Poser Pro) is a 3D computer graphics program distributed by Bondware. Poser is optimized for the 3D modeling of human figures. By enabling beginners to produce basic animations and digital images, along with the extensive availability of third-party digital 3D models, it has attained much popularity.

Overview
Poser is a 3D rendering software package for the posing, animating and rendering of 3D poly-mesh human and animal figures. Similar to a virtual photography studio, Poser allows the user to load figures, props, lighting and cameras for both still and animated renderings. Using a subset of the Alias object (OBJ) file format and a text-based markup for content files, Poser comes with a large library of pre-rigged human, animal, robotic, and cartoon figures. The package also includes poses, hairpieces, props, textures, hand gestures and facial expressions. As Poser itself does not allow for original modeling of objects, a large community market of artists has emerged, in which Poser content is created and sold through various third-party channels. Poser is available in multiple languages, including English, Japanese, German and French, and runs on both Microsoft Windows and Mac OS X. While Poser's interface has evolved since the product's introduction in 1995, the current Poser 11 and Poser Pro 11 preserve many of the application's original interface elements, so that legacy users can move to the newest version and navigate it without relearning the program's controls.

Features
Poser includes a library of pre-built, ready-to-use content, including body and hand poses, materials, props, facial expressions, hairpieces, lights, cameras and scenes, and a Reyes-based render engine called Firefly which supports nodes for the creation of complex materials. Furthermore, it provides import of sound, image and video files, motion-capture data and 3D content for the creation of scenes or the addition of new library items. Poser exports content in many 3D formats. Poser is capable of material editing, facial photo matching, dynamic hair, dynamic cloth and new figure rigging. Online content is also available. Python scripting enables third-party developers to create additional features ranging from custom libraries and rendering-engine control panels to metadata editors and utility scripts.

Usage
Poser is a digital stage that gives the user full control over the scene. Poser is used to create original images ranging across human figures, human renderings for medical and industrial design illustrations, editorial illustrations, informational graphics, graphic novel illustrations, comics, and much more. Poser contains many animation capabilities and is regularly employed by broadcast professionals, including animation staff on Fox's Bones, The Colbert Report and Jimmy Kimmel Live!, as well as in industry applications, such as the animated instructions for automated checkout machines at Albertsons, Save-On stores and Wal-Mart, and in at least a few full-length films, including the Star Trek fan film Star Trek: Aurora, The Misty Green Sky, and The Exigency. Poser characters and animations were also used in early computer games, such as the "Desert Rifle" games and "Cake shop" from the Qi and ELEFUN game developers. Standard Poser characters have been extensively used by European and US-based documentary production teams to graphically render the human body or virtual actors in digital scenes.
Humanoids printed in several science and technology magazines around the US are often Poser-rendered and post-worked models. A film animated entirely in Poser, titled The Exigency, took thirteen years to produce and was released on December 14, 2019.

Library
Poser is packaged with ready-to-use 3D content that allows new users to get started without immediately needing to purchase additional content. Items are stored in Poser's drag-and-drop-enabled Library and are organized by type and name, e.g. People/Ryan2. Users can save customized figures or objects into the Library in order to reuse those items at a later point in time. The Library also supports adding additional "Runtimes", which are collections of content that legacy users have assembled from third-party providers. The Library includes a configurable, keyword-based Search function that locates content in the Library or connected Runtimes. Content can also be added to the Library's Favorites for quick access. The Library is set up with categories that each include collections of similar content items:
Character: pre-rigged figures including anatomically accurate humans, mannequins, animals, insects, dinosaurs, cartoon characters, human anatomies such as skeletons and musculature, and mechanical figures such as vehicles
Pose: animated and static poses for humans and animals covering day-to-day activities, dancing, walking, standing and sitting, as well as action and sport poses
Face: full and partial facial expressions
Hair: prop-based transparency-mapped hairpieces, dynamic hairpieces and hair props such as mustaches or sideburns
Hand: hand poses of various types such as action poses and gestures, signals, counting and American Sign Language
Props: primitives such as spheres and cylinders, clothing items grouped by character, scene props, furniture, rooms, vehicles, plants and cartoon elements
Lights: animated or static pre-set lights consisting of spotlights, infinite lights, point lights and diffuse IBL lights
Cameras: animated or static cameras
Materials: simple and complex node-based materials
Scenes: full Poser scenes, including a factory, a crime scene lab and a modern apartment

History
Poser was created by artist and programmer Larry Weinberg as a digital replacement for artists' mannequins. Versions 1.0 and 2.0 were published by Fractal Design. In 1997, Fractal Design was acquired by MetaCreations, and Poser's interface was redesigned by MetaCreations' Phil Clevenger for release as Poser 3 in 1998. This interface has remained the basis for all subsequent versions. In 1999, MetaCreations sold Poser to egi.sys AG, which established the subsidiary Curious Labs, with Larry Weinberg as CEO, to handle Poser development and publication. Curious Labs and Poser were sold to e-frontier in 2003. In November 2007, Smith Micro Software acquired Poser as well as Anime Studio (now called Moho). Smith Micro Software also acquired the English-language distribution rights to Manga Studio (now called Clip Studio Paint) from e-frontier. The latest stable versions of Poser were released in September 2019; Poser 12 is currently in open beta, meaning it can be purchased but may still contain bugs. Poser 11 introduced many new features, including better rigging capabilities.

On July 2, 2009, Smith Micro Software announced the creation of Content Paradise, a new platform for the distribution of assets for use in Poser. On November 9, 2018, Smith Micro Software announced the closure of Content Paradise on December 3, 2018; the content moved to Renderosity. On June 20, 2019, Smith Micro Software announced it had sold its Poser product line to Bondware, Inc., owner of the popular online marketplace Renderosity.com and a longtime Smith Micro resale partner.

Poser figures
Poser's specially designed figures are commonly known as Poser Figures, Poser Models, Poser Content, Digital Actors, or Digital Puppets. Early versions of Poser were bundled with fully clothed humanoid figures specifically designed for the then-current version of Poser. Next, add-on packages of human figures were sold by the manufacturer of Poser. Soon, third-party companies began creating figures which work with Poser. As clothing became separate from the humanoid figure, collections of 3D garments were created for specific models which conform to the shape and pose of the Poser figure. 'Poses' for figures were packaged and sold by the software vendor and by third parties. 'Morphs' allowing customization of body or face shape or other features are also for sale. Skin textures, frequently combined with settings for morph technology, are marketed to allow one base model to be customized into many 'characters'; similar 'texture' packages allow one garment to take on many appearances, an animal to represent different breeds of the same species, or a vehicle to show many colour schemes.

Development of figures
Each major release of Poser has come with a new generation of figures for use with the tool; however, separate figures rapidly became available as the content market developed. Notably, Zygote (later Daz 3D) made a Poser model of a young woman, higher-resolution than Posette, and called her "the Millennium Girl". Poser users often colloquially shortened this name to "Millie". Zygote, disliking this name, officially named her Victoria, which is often colloquially shortened to Vicky. Victoria then became the initial member of a large family of figures which has developed across multiple generations of technology. After merging with Gizmoz in late 2009, Daz 3D released all their Poser figures as free downloads, but withdrew the free versions of their pre-Genesis figures when Genesis was released.

Content market
Because Poser figures are very inexpensive and useful for commercial illustrators, an entire cottage industry has developed to create and market Poser figures and other content.
The market is a combination of several large distributors, who often also develop products, and of individual artists, who often use one or more of the larger distributors to handle the sale of their products. Both the distributors and individual artists are involved in the creation of Poser figures, clothing, poses, morphs, textures and characters.

Figure families
Rather than unconnected single figures, Poser figures are now generally produced as families of models linked by technology generation and creator. Certain add-on products, most often poses and skin textures but including some clothing models, may be usable across more than one model within a family, but in general are not usable across different generations of the same model. A notable example is the Victoria family of figures from Daz 3D, discussed above.

See also
3D modeling
List of 3D modeling software

References

External links

3D graphics software 3D animation software Macintosh graphics software Windows graphics-related software MacOS graphics-related software Anatomical simulation
16286907
https://en.wikipedia.org/wiki/Prefix%20delegation
Prefix delegation
IP networks are divided logically into subnetworks. Computers in the same subnetwork share the same address prefix. For example, in a typical home network with the legacy Internet Protocol version 4, the network prefix would be something like 192.168.1.0/24, as expressed in CIDR notation. With IPv4, home networks commonly use private addresses (defined in RFC 1918) that are not routable on the public Internet, and rely on network address translation to convert to routable addresses when connecting to hosts outside the local network. Business networks typically had manually provisioned subnetwork prefixes.

In IPv6, global addresses are used end-to-end, so even home networks may need to distribute public, routable IP addresses to hosts. Since it would not be practical to provision networks manually at scale, IPv6 networking uses DHCPv6 prefix delegation to assign a network address prefix and to automate the configuration and provisioning of the publicly routable addresses for the network. In a home network, for example, the home router uses the DHCPv6 protocol to request a network prefix from the ISP's DHCPv6 server. Once the prefix is assigned, the ISP routes this network to the customer's home router, and the home router starts advertising the new addresses to hosts on the network, either via SLAAC or using DHCPv6. DHCPv6 prefix delegation is supported by most ISPs who provide native IPv6 to consumers on fixed networks.

Prefix delegation is generally not supported on cellular networks, such as LTE or 5G. Most cellular networks route a fixed /64 prefix to the subscriber. Personal hotspots may still provide IPv6 access to hosts on the network by using a different technique called Proxy Neighbor Discovery, or by using the technique described in RFC 7278. One of the reasons why cellular networks may not yet support prefix delegation is that operators want to use prefixes they can aggregate into a single route. To address this, RFC 6603 defines an optional mechanism, and the related DHCPv6 option, to allow the exclusion of one specific prefix from a delegated prefix set.

See also

Computer networks IPv6
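To make the provisioning step concrete: once a router has received a delegated prefix (say a /56), it carves per-link /64 subnets out of it for the LANs it serves. The arithmetic can be sketched with Python's standard ipaddress module; the prefix value below is from the documentation range reserved by RFC 3849, not anything a real ISP would assign:

```python
import ipaddress

# A prefix delegated via DHCPv6-PD by the ISP (illustrative value:
# 2001:db8::/32 is reserved for documentation by RFC 3849).
delegated = ipaddress.ip_network("2001:db8:abcd:ff00::/56")

# Split the /56 into /64s, one per LAN segment; a /56 yields 2^(64-56) = 256.
lan_prefixes = list(delegated.subnets(new_prefix=64))
print(len(lan_prefixes))   # 256
print(lan_prefixes[0])     # 2001:db8:abcd:ff00::/64 -> e.g. wired LAN
print(lan_prefixes[1])     # 2001:db8:abcd:ff01::/64 -> e.g. guest Wi-Fi

# Each /64 is then advertised on its link (via SLAAC router advertisements
# or stateful DHCPv6), letting hosts form globally routable addresses.
```

This also shows why a fixed /64 from a cellular operator is limiting: a /64 cannot be subdivided further for SLAAC, which is exactly the gap the RFC 7278 technique works around.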
12976092
https://en.wikipedia.org/wiki/P%20A%20College%20of%20Engineering
P A College of Engineering
P. A. College of Engineering (P.A.C.E.) is an engineering college located in Karnataka, India. It is situated at Konaje, 25 km from Mangalore. P.A.C.E. was founded in 1999 by the Kerala-based businessman Dr. P.A. Ibrahim Haji. Approximately 1,450 engineering students graduate each year.

About
The main building on the 60-acre campus was designed by Upalker Sadekar architects of Mumbai. The college was established with a sanctioned intake of 240 students per academic year: 60 students in each of four disciplines (Electronics and Communication, Telecommunications, Computer Science, and Information Science). Since then, three additional disciplines have been added - Mechanical Engineering, Biotechnology, and Civil Engineering - doubling the student intake per academic year to 480 students. The Electronics and Communication, Biotechnology, Chemistry and Computer Science departments each have dedicated research centers.

The trust continued its venture of establishing and running educational institutions. A pre-university college was opened in the academic year 2005-06 with Science and Commerce streams and a combined student strength of 350; one of the reasons for establishing this college was to have a feeder institution for the engineering college. The trust's latest venture is P.A. Polytechnic, established in 2006, which offers diploma education in Electronics & Communication, Electrical & Electronics, Civil Engineering, and Computer Science & Engineering; in 2009 a Mechanical Engineering branch was added. The trust also runs hostels to house students, one for boys and one for girls; approximately 1,200 students are fed each day. The trust plans to continue this development by turning the institutions into autonomous colleges and ultimately into a deemed university.

UG courses
B.E in Biotechnology (intake: 60)
B.E in Civil Engineering (intake: 120)
B.E in Computer Science and Engineering (intake: 120)
B.E in Electrical Engineering (intake: 60)
B.E in Electronics and Communication Engineering (intake: 120)
B.E in Mechanical Engineering (intake: 180)

PG courses
M.Tech in Digital Communication Networks (intake: 18)
M.Tech in Computer Science and Engineering (intake: 18)
M.Tech in Thermal Engineering (intake: 18)
M.Tech in VLSI Design (intake: 18)
M.B.A (intake: 120)

Research programme
Bio-Technology
Chemistry
Electronics and Communication Engineering
Computer Science Engineering
Mechanical Engineering
Mathematics
Management Studies

Research collaborations
Institute of Chemistry, Academia Sinica, Taipei, Taiwan, Republic of China
GenØk – Centre for Biosafety, University of Tromsø, Tromsø, Norway
Universiti Sains Malaysia (USM), Malaysia

Staff from different departments in P.A.C.E. have collaborated with researchers at various institutes and universities in India and across the world, resulting in joint publications. Examples of such institutions are Curtin University of Technology, Perth, Western Australia, and BRAC University.

Research clusters
Drug Discovery (Synthesis and Biological Studies)
VLSI Design and Fuzzy Logic
High Performance Computing and Embedded Systems
Computational Fluid Dynamics
Flow in Jet and Turbo Jet Engines
Biotechnology

Professional society chapters
ISTE Student Chapter
CSI Student Chapter
IEEE Student Chapter
Linux Users Group (LUG)

Awards and certifications
P.A.C.E. is approved by the All India Council for Technical Education (AICTE) and is affiliated with Visvesvaraya Technological University (VTU). The National Board of Accreditation (NBA) accredited the college in 2009. Several research grants have been awarded to the Department of Chemistry. P.A.C.E. has also been certified to ISO 9001:2008.

References

External links

Engineering colleges in Mangalore Affiliates of Visvesvaraya Technological University
43946756
https://en.wikipedia.org/wiki/Tick%20%28software%29
Tick (software)
Tick is time-tracking software operated by Higher Pixels (formerly The Molehill), headquartered in Jacksonville, Florida. It offers online time tracking and reporting services through its website, along with mobile and desktop applications. Tick tracks time based on clients, projects and tasks, either through a timer or through manual entry.

Features
Tick provides time tracking, management and reporting, and exposes an API for third-party developers. Tick features include:
Manual time entry
Timers
Instant budget feedback in the timecard
Viewing and exporting reports
Desktop app for Windows and Mac
Apple Watch app
Mac widget
Mac app
Chrome extension
Native iPhone and Android/Google applications
Integration with Basecamp, Asana, and Trello
SSL encryption to secure data in transit
Full administrator control over permissions and access levels
Zapier support
Invoicing through QuickBooks or FreshBooks

Early SaaS and Ruby adoption
Tick was one of the first software-as-a-service applications to be built on the Ruby on Rails framework.

Company
The software company The Molehill began in 2006 as a lifestyle business. The company created Tick, an online business tool built on the idea that tracking employee time helps companies hit their budgets.

See also
Comparison of time tracking software
Project management software

References

External links

Time-tracking software 2006 software Proprietary software
15670933
https://en.wikipedia.org/wiki/Alsys
Alsys
Alsys SA (founded 1980, merged 1995) was a software development company created to support initial work on the Ada programming language. In July 1995, Alsys merged to become Thomson Software Products (TSP), which merged into Aonix in 1996.

History
Alsys SA, the French parent company, was founded in 1980 by Jean Ichbiah (1940–2007). Also in 1980 the American subsidiary Alsys Inc was formed with Ben Brosgol (from Intermetrics) and Pascal Clève. In 1985 a British subsidiary, Alsys Ltd, was formed with John Barnes as managing director.

During the merger wave of the 1990s, Alsys was repositioned via a series of mergers. In 1991 Alsys was acquired by Thomson-CSF. In November 1992, Thomson-CSF acquired TeleSoft and merged it with Alsys. In July 1995, Thomson-CSF merged two of its subsidiaries, Alsys and MUST Software, a software development corporation based in Norwalk, Connecticut, to form Thomson Software Products (TSP). In November 1996, TSP merged with IDE (Interactive Development Environments, Inc.) to form Aonix. Thomson-CSF (now known as Thales) sold Aonix to Gores Technology Group (GTG) in the late 1990s. Aonix acquired Select Software in 2001. In January 2003, GTG sold the Critical Development Solutions (CDS) division of Aonix, which included the Alsys, TeleSoft and IDE product lines, to a group of French investors. The name Aonix was kept for this new company, while Select Business Solutions was the name given to the part remaining under Gores control. In 2003, Aonix acquired NewMonics of Tucson, Arizona, a supplier of Java-compliant virtual machines for embedded and real-time systems. In January 2010 Aonix merged with Artisan Software Tools to form Atego.

Alsys was one of the few companies whose products exploited the protected mode of the 80286 processor. At the time, most applications were limited to using only 640 KB of memory; with the Alsys Ada compiler, applications could be built using up to 16 MB of memory.

Notes

References
"Parallel Processing in Ada", David Parker, April 1989, webpage: CNavy-12.

External links

Companies established in 1980 Software development Software development process Software companies of France
12833453
https://en.wikipedia.org/wiki/High%20School%20for%20Gifted%20Students%2C%20Hanoi%20National%20University%20of%20Education
High School for Gifted Students, Hanoi National University of Education
The High School for Gifted Students, Hanoi National University of Education (abbrev. HNUE High School; in Vietnamese: Trường THPT chuyên, Đại học Sư phạm Hà Nội, commonly known as Chuyên Sư phạm (CSP)) is a public magnet school in Hanoi, Vietnam. The school was founded in 1966 as a national educational institution to nurture Vietnamese students who excelled at mathematics. HNUE High School is the second-oldest magnet high school in Vietnam and one of the seven national-level high schools for the gifted. The school and HUS High School for Gifted Students are often interchangeably ranked first in the National Science Olympiads for high school students and the National University Entrance Examinations. Its students have won more than 100 medals at the International Science Olympiads. Its alumni include four ministers in the Vietnamese government, leading scientists at top domestic and foreign universities, and notable Vietnamese entrepreneurs.

Foundation and history
During the Vietnam War, aware of the important role of the sciences for the future of the country, a group of Vietnam's leading scientists, including Lê Văn Thiêm, Hoàng Tụy and Tạ Quang Bửu, suggested that the government open selective programs to nurture talented students and to encourage them to pursue science at university and in their professions. As a premier national institution for the training of science teachers, Hanoi National University of Education was selected to organize such a program. On December 24, 1966, at the height of the Vietnam War, the first class for gifted students was inaugurated with 33 mathematically inclined students, chosen from thousands of high school students in North Vietnam, at the evacuation site of the university in Phù Cừ District, Hưng Yên Province. This class was the foundation of HNUE High School.

The history of Hanoi National University of Education High School is divided into three periods:
From 1966 to 1995, the special math classes were under the administration of the Faculty of Mathematics, Hanoi National University of Education.
In 1995, the school expanded to include a specialized stream in Computer Science and was renamed the Specialized School for Maths and Computer Science.
In 2005, the school started offering classes specialized in Literature, Physics, Chemistry and Biology and was named the "High School for Gifted Students of Hanoi National University of Education".

HNUE High School has been honored with many national awards, including the 3rd-class Labor Decoration in 1986, the 2nd-class Labor Decoration in 1996, and most recently the 1st-class Labor Decoration in 2001.

Education
Admissions
During the school's early period, through the 1980s, HNUE High School for Gifted Students did not directly handle the admission process; this work was handled by the Ministry of Education and Training. During this period, students with outstanding abilities in mathematics (only in the northern region, from Nam Dinh province northwards) were nominated by their region to the Ministry of Education and Training before participating in an entrance exam. Students who passed the exam were divided between two schools: the High School for Gifted Students, Hanoi University of Science (HSGS) and HNUE High School for Gifted Students. Since the end of the 1980s, with the dismantling of the state subsidy system, HNUE High School for Gifted Students and HSGS have directly handled their own admission processes.
The school's entrance examinations are held in June and attract students from all over the country. Candidates must take three compulsory papers on the first day of testing (Vietnamese language, Mathematics and English) and, on the second day, one elective paper (from Mathematics, Literature, Physics, Chemistry, Biology and English studies) for their specialization. These exams are highly competitive.

Education model
The students are organized into specialized streams in one of the following subjects: Mathematics, Literature, Physics, Chemistry, Biology, Computer Science and English. Each stream is offered an accelerated curriculum in its subject of specialization. Seminars are held during the school year, at which students can talk with national and international scientists and researchers. Extra-curricular activities include sports, camping and clubs.

Teaching staff
Members of the staff often provide annual training for national Olympiad teams competing internationally, and have been consulted for developing textbooks and curricula for Vietnamese national high school education. In addition, many have received the Excellent Teacher Award from the government for their dedication to education.

Facilities
As many students come from provinces far away from the main campus in Hanoi, the school provides room and board. The school has built a spacious facility with 40 classrooms, two informatics practice rooms, an English classroom, three fully equipped laboratories for Physics, Chemistry and Biology, and a multi-purpose room.

Classrooms
The main building of the school contains 24 classrooms and a multi-media room. Wireless internet access is available across the entire building.

Library
Built in 2001, the library meets the demand for reference and study material from teachers and students. It has hundreds of computers for internet access, along with spacious media rooms.

Laboratories
There are two computer rooms where students practice Computer Science or which they use for study, reference or entertainment. Because of a space shortage, the school does not have lab space for physics, chemistry and biology experiments, but students are given access to facilities on the university's main campus.

Dormitories
The school provides students with dormitories. Residents have access to a canteen and an internet access point.

Stadium
The students use the Hanoi National University of Education stadium for physical education and sports events.

Student life
PTCMedia
PTCMedia is the official media organization of HNUE High School. With the mission of reporting on students' academic life and activities, as well as encouraging student connections by organizing various events, PTCMedia has always been creative and committed to bringing the best to students. Its initial product is PTCTimes, the newspaper of HNUE High School, whose first issue was published in December 2006. There are approximately 60 pages in each issue, and a number of pages are always devoted to difficulties in studying. Each issue has an interview with a teacher or student, and the recreation column deals with music, games, and sport. Besides print media, PTCMedia also produces digital products that show all sides of HNUE High School. Events are an important part of PTCMedia's annual activities, and the organization is behind the success of various events for students of all class years. Fiesta A Cielo is currently the biggest event at HNUE High School, normally lasting two weeks to one month.
This is an opportunity for different classes to compete and have fun in sports, academics, games, and so on. Through its different seasons, Fiesta A Cielo provides a well-rounded environment for students' development, as well as showcasing the strengths of HNUE High School students in all fields. PTCMedia is the oldest club in CSP.

SAGS
SAGS is the abbreviation for "Studying Abroad for Gifted Students", an organization concerned with studying-abroad orientation and English-language development. Founded in 2008, through years of working seriously, enthusiastically and effectively, SAGS has created many helpful and interesting activities:
- Founded an English club in 2009 and has administrated it since.
- Administrating two Facebook pages, "The ySAGS" and "SAGS.CSP", which bring students helpful knowledge, information and video clips about studying abroad and learning English.
- Holding big annual events: the English speaking contest "U-talk" and the English singing contest "Stereo Hearts".
- Hosting many conferences on studying abroad, with university representatives, well-known lecturers and other guests from around the world, such as Ebroad - Studying Abroad Orientation and Developing English through seminars.
- Holding, attending and supporting voluntary events: imFLOW (at the National Institute of Hematology and Blood Transfusion), Youth Day (in Hanoi), and Vitamin Smile (at the National Hospital of Pediatrics).
- Running a Christmas program of free card and gift delivery: "Christmas Hearts transporter".
- Holding an annual prom on Halloween night.
- Co-operating with other organizations to hold big events: LEMON's day (the biggest series of voluntary events by students in Hanoi), The Breakfast (an annual orientation program for first-year students), Puzzles (a debate club of Hanoi students), etc.

MCCM
The school music club, where students who are passionate about music can rehearse and share their joy of music. The club regularly takes part in school performances, as well as holding its own music events.

ECLUB
With the slogan "English Can Lead U Beyond", ECLUB is the one and only English club for students of CSP. The club has two main events in the year: Fight the Krampus and Activate your Energy.

CDT
The abbreviation for "CSP Dance Team".

Movies for Relief
A charity organization and one of the biggest clubs in the school. Its annual activities include Spring Melody, Red Carpet and more.

CSF
CSF is the abbreviation for "CSP Sporting Federation", the first sports organization in the school's history. Founded in 2013, CSF focuses on promoting three main sports - soccer, basketball and badminton - holding both intramural and interscholastic competitions in each. The winning teams represent the school in city tournaments.

C3
C3 stands for "Chuyen Su pham (CSP) Cubing Club".

ASO
ASO is the abbreviation for "Apply Science Organization".

ADaPT
ADaPT is the first information technology club of HNUE High School. It is where students have a chance to take part in assembling high-tech products, gain experience similar to working on startup projects, or simply try out unusual machines. At the same time, ADaPT provides a place to hang out, learn, and exchange knowledge and ideas.

ET Magic
The first magic team, where members can show off their skills in magic tricks.

HE
"HE" is the abbreviation for "History for Everyone", a history club founded in 2016.
CDS CDS is the abbreviation for "Chuyen Su Pham (CSP) Debate Society", the school's first debate club, which aims to spark a wider interest in formal debating within the CSP student community. Founded in 2017, the club gives students the opportunity to develop and use their critical thinking, research, discussion and presentation skills, and prepares promising individuals to compete in local, regional, and national tournaments. It also holds its own debate tournaments and a summer program dedicated to teaching debating skills. CDS has gained popularity in light of the success of its Warm-up Debating Championship. Other extracurricular activities Camping and sightseeing: During the summer vacation, the school organizes short tours for students. These activities may be replaced by camping. Clubs: There are sports clubs, an English club, a physics club, an online mathematics club, and the Readle, a literature appreciation club. The sports clubs, physics club and mathematics club were all founded in 2013-2014. The Readle focuses not only on book discussion but also organises slam poetry sessions. Achievements College admission 100% of HNUE High School students pass the annual National University Entrance Examination and are admitted to universities in Vietnam. The average entrance score of HNUE students is consistently among the highest in the country. After graduation, many students pursue higher education abroad as scholars at top universities worldwide. National Olympiads Since its foundation, the school has taken part in national merit competitions annually and received more than 500 prizes, mainly in Mathematics and Informatics, about 50 of them first prizes. International Olympiads More than 50 students of the school have received awards in international competitions, namely the International Mathematical Olympiad (IMO), the Asian Pacific Mathematics Olympiad (APMO), the International Olympiad in Informatics (IOI), and the International Biology Olympiad. Notably, Vu Ngoc Minh won two gold medals (at the 42nd and 43rd IMO), and Dinh Tien Cuong and Nguyen Trong Canh achieved perfect scores of 42/42 at the 30th and 44th IMO respectively.
International Mathematical Olympiad 16th International Mathematical Olympiad in the German Democratic Republic in 1974 Vu Dinh Hoa - Silver medal Ta Hong Quang - Bronze medal 17th International Mathematical Olympiad in Bulgaria in 1975 Le Dinh Long - Silver medal 18th International Mathematical Olympiad in Austria in 1976 Le Ngoc Minh - Bronze medal 20th International Mathematical Olympiad in Romania in 1978 Vu Kim Tuan - Silver medal Nguyen Thanh Tung - Silver medal Do Duc Thai - Bronze medal 21st International Mathematical Olympiad in the United Kingdom in 1979 Bui Ta Long - Silver medal 24th International Mathematical Olympiad in France in 1983 Tran Tuan Hiep - Silver medal Pham Thanh Phuong - Bronze medal 25th International Mathematical Olympiad in Czechoslovakia in 1984 Do Quang Dai - Silver medal 27th International Mathematical Olympiad in Poland in 1986 Ha Anh Vu - Gold medal Nguyen Phuong Tuan - Silver medal 28th International Mathematical Olympiad in Cuba in 1987 Tran Trong Hung - Silver medal 29th International Mathematical Olympiad in Australia in 1988 Tran Trong Hung - Silver medal 30th International Mathematical Olympiad in Germany in 1989 Dinh Tien Cuong - Gold medal (score 42/42) 31st International Mathematical Olympiad in China in 1990 Le Truong Lan - Bronze medal 32nd International Mathematical Olympiad in Sweden in 1991 Nguyen Viet Anh - Silver medal 33rd International Mathematical Olympiad in Russia in 1992 Nguyen Huu Cuong - Bronze medal 34th International Mathematical Olympiad in Turkey in 1993 Pham Hong Kien - Silver medal Pham Chung Thuy - Bronze medal 35th International Mathematical Olympiad in Hong Kong in 1994 Nguyen Duy Lan - Silver medal 36th International Mathematical Olympiad in Canada in 1995 Nguyen The Phuong - Silver medal 39th International Mathematical Olympiad in Taiwan in 1998 Vu Viet Anh - Gold medal Le Thai Hoang - Bronze medal 40th International Mathematical Olympiad in Romania in 1999 Le Thai Hoang - Gold medal 42nd International Mathematical Olympiad in the US in 2001 Vu Ngoc Minh - Gold medal Tran Khanh Toan - Silver medal 43rd International Mathematical Olympiad in the United Kingdom in 2002 Pham Gia Vinh Anh - Gold medal Vu Ngoc Minh - Gold medal 44th International Mathematical Olympiad in Japan in 2003 Nguyen Trong Canh - Gold medal (score 42/42) 45th International Mathematical Olympiad in Greece in 2004 Nguyen Kim Son - Gold medal Nguyen Duc Thinh - Silver medal Hua Khac Nam - Silver medal 46th International Mathematical Olympiad in Mexico in 2005 Nguyen Nguyen Hung - Bronze medal 49th International Mathematical Olympiad in Spain in 2008 Nguyen Pham Dat - Silver medal Asian Pacific Mathematics Olympiad 10th Asian Pacific Mathematics Olympiad Vu Viet Anh - Bronze medal 11th Asian Pacific Mathematics Olympiad Le Thai Hoang - Gold medal 13th Asian Pacific Mathematics Olympiad Luu Tien Duc - Gold medal 14th Asian Pacific Mathematics Olympiad Vu Hoang Hiep - Gold medal International Olympiad in Informatics 11th International Olympiad in Informatics in Turkey in 1999 Nguyen Hong Son - Silver medal 13th International Olympiad in Informatics in Finland in 2001 Tran Quang Khai - Silver medal 14th International Olympiad in Informatics in South Korea in 2002 Tran Quang Khai - Gold medal 19th International Olympiad in Informatics in Croatia in 2007 Pham Nam Long - Bronze medal (Great Britain team) 23rd International Olympiad in Informatics in Thailand in 2011 Nguyen Hoang Yen - Bronze medal 24th International Olympiad in Informatics in Italy in 2012
Nguyen Viet Dung - Silver medal International Biology Olympiad 21st International Biology Olympiad in Korea in 2010 Vu Thi Ngoc Oanh - Bronze medal 22nd International Biology Olympiad in Taiwan in 2011 Nguyen Trung Kien - Bronze medal 23rd International Biology Olympiad in Singapore in 2012 Nguyen Thi Ngoc Hong - Bronze medal 24th International Biology Olympiad in Switzerland in 2013 Nguyen Thi Phuong Diep - Bronze medal Notable alumni Politics Doan Xuan Hung - Former Deputy Minister of the Ministry of Foreign Affairs of Vietnam and Ambassador to Japan and Germany. Nguyen Huy Dung - Deputy Minister, Ministry of Information and Communications. Le Quoc Thinh - Consul General of Vietnam in Osaka, Japan. Nguyen Hoi Nghia - Former Deputy General Director of Ho Chi Minh City National University (Deputy Minister rank), member of the founding team of the Gifted High School, National University of Ho Chi Minh City. Nguyen Dinh Cong - Deputy General Director, Vietnam Academy of Science and Technology (Deputy Minister rank), former Deputy Director of the Institute of Mathematics of Vietnam. Science Nguyen Lan Viet - Former President of Hanoi Medical University, President of the Vietnam Heart Association, Deputy Director of the Vietnam Heart Institute. Ho Tu Bao - Director of the Knowledge Creation Methodology Laboratory, Japan Advanced Institute of Science and Technology (JAIST), member of the Science Council of the Institute of Advanced Mathematics in Vietnam. Dinh Tien Cuong - Provost's Chair of Mathematics, National University of Singapore, former Professor at Sorbonne University, member of the Science Council of the Vietnam Advanced Mathematical Institute. Do Duc Thai - Chair of the Mathematics Department, Hanoi National University of Education, member of the Science Council of the Institute of Advanced Mathematics. Vũ Kim Tuấn - Distinguished Chair at the University of West Georgia, US. Nguyen Tu Cuong - Professor, Doctor of Science in Mathematics, Institute of Mathematics of Vietnam. Nguyen Dong Yen - Professor, Doctor of Science in Mathematics, Ta Quang Buu Prize 2015. Pham Duc Chinh - Professor, Doctor of Science in Mechanics, Ta Quang Buu Award 2019. Ha Anh Vu - Honeywell International Center, United States. Nguyen Hong Thai - Professor at the Mathematics Institute of the University of Szczecin, Poland. Business Le Dinh Long - Development Director of the Spark Social Enterprise Center, former General Director of the Vietnam International Bank (VIB), former General Director of Hong Leong Bank Vietnam. Nguyen Tu Quang - Founder and CEO of the technology group Bkav. Do Van Minh - CEO of Gemadept Joint Stock Company (HOSE: GMD). Dao Trong Khoa - Vice President of the Vietnam Logistics Association. Ha Thanh Tu - Local Partner, McKinsey & Company. Other Doan Minh Cuong (Course 1) - Former Head of the Division (equivalent to Rector) of the High School, Hanoi Pedagogical University. References External links "Results of Vietnam Teams at International Mathematical Olympiads". High schools in Hanoi High schools for the gifted in Vietnam University-affiliated schools Educational institutions established in 1966 1966 establishments in North Vietnam
55298672
https://en.wikipedia.org/wiki/Stride%20%28software%29
Stride (software)
Stride was a cloud-based team business communication and collaboration tool, launched by Atlassian on 7 September 2017 to replace the cloud-based version of HipChat. Stride software was available to download onto computers running Windows, Mac or Linux, as well as onto Android and iOS smartphones and tablets. Stride's intellectual property was bought by Atlassian's competitor Slack Technologies, and the service was discontinued on February 15, 2019. The features of Stride included chat rooms, one-on-one messaging, file sharing, 5 GB of file storage, group voice and video calling, built-in collaboration tools, and up to 25,000 messages of searchable message history. Premium features included unlimited users, group chat rooms, file sharing and storage, apps, and history retention. The premium version, priced at $3/user/month, also included advanced meeting functionality such as group screen sharing, remote desktop control, and dial-in/dial-out capabilities. Stride offered integrations with Atlassian's other products as well as with third-party applications listed in the Atlassian Marketplace, such as GitHub, Giphy, Stand-Bot and Google Calendar. Stride offered additional features beyond messaging to improve efficiency and productivity. It aimed to reduce collaboration noise by introducing a "focus" mode, and to eliminate the divisions between text chat, voice meetings, and videoconferencing by simplifying transitions between these modes in the same channel. On July 26, 2018, Atlassian announced that HipChat and Stride would be discontinued on February 15, 2019, and that it had reached a deal to sell their intellectual property to Slack. Under the deal, Slack would pay an undisclosed amount over three years to assume the user bases of the services, and Atlassian would take a minority investment in Slack. The companies also announced a commitment to work on integration of Slack with Atlassian services. See also List of collaborative software References External links Web applications Chat websites Instant messaging Atlassian products Business chat software 2013 software Android (operating system) software Collaborative software IOS software Linux software MacOS software Task management software Windows Phone software Windows software
28083018
https://en.wikipedia.org/wiki/Satellite%20router
Satellite router
A satellite router is an indoor unit (IDU) that contains a modulator and a demodulator and is one of the essential components of a VSAT (very-small-aperture terminal). Training Modern VSAT systems are built around a satellite router. Best-practice methods for using a satellite router are covered in VSAT training; the VSAT Installation Manual video presentation shows an example of a satellite router. Satellite Internet access
59819128
https://en.wikipedia.org/wiki/Client%20to%20Authenticator%20Protocol
Client to Authenticator Protocol
The Client to Authenticator Protocol (CTAP) or X.1278 enables a roaming, user-controlled cryptographic authenticator (such as a smartphone or a hardware security key) to interoperate with a client platform such as a laptop. Standard CTAP is complementary to the Web Authentication (WebAuthn) standard published by the World Wide Web Consortium (W3C). WebAuthn and CTAP are the primary outputs of the FIDO2 Project, a joint effort between the FIDO Alliance and the W3C. CTAP is based upon previous work done by the FIDO Alliance, in particular the Universal 2nd Factor (U2F) authentication standard. Specifically, the FIDO U2F 1.2 Proposed Standard (July 11, 2017) became the starting point for the CTAP Proposed Standard, the latest version of which was published on January 30, 2019. The CTAP specification refers to two protocol versions, the CTAP1/U2F protocol and the CTAP2 protocol. An authenticator that implements CTAP2 is called a FIDO2 authenticator (also called a WebAuthn authenticator). If that authenticator implements CTAP1/U2F as well, it is backward compatible with U2F. The protocol uses the CBOR binary data serialization format. The standard was adopted as ITU-T Recommendation X.1278. References External links FIDO Specifications Overview FIDO Specifications Authentication Identification Internet security ITU-T recommendations ITU-T X Series Recommendations
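To make the CBOR framing described above concrete, here is a minimal, non-normative sketch of how a CTAP2 request and a successful response are laid out: a one-byte command code followed by a CBOR-encoded map. It assumes the third-party Python cbor2 package; the command byte 0x04 (authenticatorGetInfo) and the integer response keys (1 = versions, 3 = aaguid) follow the published CTAP2 specification, while the simulated reply is invented for illustration and all transport framing (USB HID, NFC, BLE) is omitted.

# Minimal sketch of CTAP2 message framing: a one-byte command code
# followed by a CBOR-encoded map. Assumes the third-party "cbor2"
# package (pip install cbor2). Transport framing is omitted.
import cbor2

CTAP2_GET_INFO = 0x04          # authenticatorGetInfo command byte
CTAP2_OK = 0x00                # success status byte in responses

def encode_get_info_request() -> bytes:
    # authenticatorGetInfo takes no parameters, so the request frame
    # is just the command byte.
    return bytes([CTAP2_GET_INFO])

def decode_get_info_response(frame: bytes) -> dict:
    # Responses start with a status byte; on success the remainder is
    # a CBOR map keyed by small integers (1 = versions, 3 = aaguid, ...).
    status, payload = frame[0], frame[1:]
    if status != CTAP2_OK:
        raise RuntimeError(f"authenticator error 0x{status:02x}")
    return cbor2.loads(payload)

# Simulated authenticator reply, for illustration only.
fake_reply = bytes([CTAP2_OK]) + cbor2.dumps(
    {1: ["FIDO_2_0", "U2F_V2"], 3: b"\x00" * 16}
)
info = decode_get_info_response(fake_reply)
print(info[1])  # ['FIDO_2_0', 'U2F_V2']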
11511208
https://en.wikipedia.org/wiki/Semmle
Semmle
Semmle Inc is a code-analysis company with offices in San Francisco, Seattle, New York, Oxford, Valencia and Copenhagen. Semmle was acquired by GitHub (itself owned by Microsoft) on 18 September 2019 for an undisclosed amount. Semmle's LGTM technology automates code review, tracks developer contributions, and flags software security issues. The LGTM platform leverages the CodeQL query engine (formerly QL) to perform semantic analysis on software code bases. GitHub aims to integrate Semmle technology to provide continuous vulnerability detection services. In November 2019, use of CodeQL was made free for research and open source. CodeQL either shares a direct pedigree with .QL (dot-que-ell), which derives from the Datalog family tree, or is an evolution of similar technology. SemmleCode is an object-oriented query language for deductive databases developed by Semmle. It is distinguished within this class by its support for recursive queries. Corporate background The company is headquartered in San Francisco, with its development operations based in Blue Boar Court, Alfred Street, central Oxford, England. Semmle's customers include Credit Suisse, NASA and Dell. SemmleCode background Academic SemmleCode builds on academic research on querying the source code of software programs. The first such system was Linton's Omega system, where queries were phrased in QUEL. QUEL did not allow for recursion in queries, making it difficult to inspect hierarchical program structures such as the call graph (see the sketch after the sample query below). The next significant development was therefore the use of logic programming, which does allow such recursive queries, in the XL C++ Browser. The disadvantage of using a full logic programming language, however, is that it is very difficult to attain acceptable efficiency. The CodeQuest system, developed at the University of Oxford, was the first to exploit the observation that Datalog, a very restrictive version of logic programming, is in the sweet spot between expressive power and efficiency. The QL query language is an object-oriented version of Datalog. Industrial The early research work on querying the source of software programs spun off a number of industrial applications; in particular, it became the cornerstone of systems for application intelligence (data mining on the source of software systems) and software renovation. As of 2007, Paris-based CAST was one of the market leaders in that area, and other significant players included BluePhoenix in Herzliya, Israel. SemmleCode differs from these systems in its use of an object-oriented query language, which allows programmers to easily formulate new queries that are particular to their own project. A full account of the academic and industrial developments leading up to the creation of SemmleCode can be found in a paper by Hajiyev et al. Sample query in QL To illustrate the use of QL, consider the well-known rule in object-oriented programming that public fields should be declared final. To find violations of that rule, we should search for fields that are public but not final. In QL, that requirement is expressed as follows:

from Field f
where f.hasModifier("public") and not(f.hasModifier("final"))
select f.getDeclaringType().getPackage(), f.getDeclaringType(), f

Here not only is the offending field f selected, but also the package and type in which its declaration occurs.
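As referenced above, the recursive queries that distinguish QL and its Datalog ancestry from earlier systems such as QUEL amount to computing reachability (a transitive closure) over structures like the call graph. A rough, purely illustrative Python sketch of that idea follows; it is not Semmle code, and the call-graph data is invented for the example.

# Illustrative only: recursive reachability (transitive closure) over
# a toy call graph - the kind of query QUEL could not express but
# Datalog-style languages such as QL handle directly via rules like:
#   reaches(A, B) :- calls(A, B).
#   reaches(A, C) :- calls(A, B), reaches(B, C).
calls = {                      # caller -> callees (invented data)
    "main": ["parse", "report"],
    "parse": ["read_token"],
    "report": ["format", "write"],
    "format": ["write"],
}

def reachable(fn: str) -> set:
    # Depth-first walk computing the fixpoint of the rules above.
    seen = set()
    stack = list(calls.get(fn, []))
    while stack:
        callee = stack.pop()
        if callee not in seen:
            seen.add(callee)
            stack.extend(calls.get(callee, []))
    return seen

print(sorted(reachable("main")))
# ['format', 'parse', 'read_token', 'report', 'write']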
SemmleCode integration with development environments SemmleCode provides a user interface via the Eclipse IDE to query Java code (both source code and bytecode) as well as XML files, and to edit QL queries. This is, however, only one application of the technology that underlies it: QL can be used to query any other type of complex data. Following Semmle's absorption into Microsoft-owned GitHub, the original Eclipse-based workflow has been supplanted by a workflow based around Microsoft's Visual Studio Code. See also List of tools for static code analysis .QL Datalog References Further reading Mark A. Linton. Implementing relational views of programs. In Peter B. Henderson, editor, Software Development Environments (SDE), pages 132–140, 1984. External links Companies based in Oxford Software companies of the United Kingdom Software testing tools Java development tools Static program analysis tools
30816265
https://en.wikipedia.org/wiki/Bus%20monitoring
Bus monitoring
Bus monitoring is a term used in flight testing for capturing data from avionics buses and networks in data-acquisition telemetry systems. Commonly monitored avionics buses include: ARINC standard buses such as ARINC-429, ARINC 573 and ARINC 717; ARINC 629, also known as the Multi-transmitter Data Bus; ARINC 664, also known as Deterministic Ethernet; ARINC 825 Controller Area Network (CAN); Common Airborne Instrumentation Systems (CAIS); Cross Channel Data Link (CCDL) / Motor Controller Data Link (MCDL); Ethernet; Fibre Channel; FireWire (IEEE 1394); IRIG-106 PCM; MIL-STD-1553; RS-232/RS-422/RS-485; STANAG-3910; and the Time-Triggered Protocol (TTP). Typically a bus monitor listens passively on the bus and intercepts a copy of the messages on it; in general a bus monitor never transmits on the monitored bus. Once the bus monitor has intercepted a message, the message is made available to the rest of the data acquisition system for subsequent recording and/or analysis. There are three classes of bus monitor: the parser bus monitor, the snarfer bus monitor and the packetizer bus monitor. Parser bus monitor Parser bus monitoring is also known as coherent monitoring or IRIG-106 Chapter 4 monitoring. Parser bus monitors are suited to applications where the bus is highly active and only a few specific parameters of interest must be extracted. The parser bus monitor uses protocol tracking to identify and classify messages on the bus, and specific parameters can then be extracted from the identified messages of interest. To ensure coherency, whereby all extracted parameters come from the same message instance, the parameters must be triple-buffered with stale and skipped indicators. Optionally, time tags can be added to each parsed message. Snarfer bus monitor Snarfer bus monitoring is also known as FIFO or IRIG-106 Chapter 8 monitoring. Snarfer bus monitors are suited to applications where all messages and traffic on the bus must be captured for processing, analysis, and recording. A snarfer bus monitor captures all messages on the bus, tags them with a timestamp and content identifiers (for example, Command or Status in the case of MIL-STD-1553 buses), and puts them into a FIFO. Packetizer bus monitor Packetizer bus monitors are designed for networked data acquisition systems where the acquired data from the avionics buses is captured and re-packetized in Ethernet frames for transmission to an analysis computer or network recorder. The packetizer bus monitor captures selected messages of interest (parsed) or all messages on the bus (snarfed) and packages each message in the payload of a UDP/IP packet; the application layer contains bus identifiers, sequence numbers and timestamps (see the sketch following the links below). The most popular application-layer protocols used for networked data acquisition systems include the Airbus IENA format and the iNET (integrated Network Enhanced Telemetry) TmNS (Telemetry Network System) format. References External links IRIG XidML ETEP - Airborne Data acquisition systems Curtiss-Wright Controls Avionics & Electronics iNET Ballard Technology Aircraft Interface Devices with monitoring capabilities Data collection
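As a rough illustration of the packetizer flow described above, the following sketch wraps a captured bus message in a small application-layer header (bus identifier, sequence number, timestamp) and sends it as a UDP payload. The header layout, field widths and destination address are invented for the example; real systems use defined formats such as the Airbus IENA or iNET TmNS structures.

# Illustrative packetizer sketch: wrap a captured avionics bus message
# with a toy application-layer header (bus id, sequence number,
# timestamp) and transmit it as a UDP payload. The header layout and
# destination address are invented for this example; production
# systems use defined formats such as Airbus IENA or iNET TmNS.
import socket
import struct
import time

ANALYSIS_HOST = ("192.0.2.10", 5005)   # documentation address, assumed

def packetize(bus_id: int, seq: int, message: bytes) -> bytes:
    # Toy header: 16-bit bus id, 32-bit sequence number, and a
    # 64-bit timestamp in microseconds, all big-endian.
    timestamp_us = int(time.time() * 1_000_000)
    header = struct.pack(">HIQ", bus_id, seq, timestamp_us)
    return header + message

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 0
for captured in (b"\x12\x34\x56\x78", b"\xde\xad\xbe\xef"):  # stand-ins
    sock.sendto(packetize(bus_id=1, seq=seq, message=captured),
                ANALYSIS_HOST)
    seq += 1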
50564088
https://en.wikipedia.org/wiki/Procore
Procore
Procore Technologies is an American construction management software as a service company founded in 2002, with headquarters in Carpinteria, California. History Founder and CEO Craig "Tooey" Courtemanche created the software that became Procore in response to his struggles to manage the construction of his new home in Santa Barbara from his then-home in Silicon Valley. The app he built tracked the activity of the workers onsite. Founded in 2002, the company was originally headquartered in Montecito, California. Steve Zahm, founder of the e-learning company DigitalThink, joined Procore as president in 2004. Procore's revenue in 2012 was $4.8 million; in 2020, it was $400 million. The company initially filed to go public in 2019, with plans to launch the IPO in 2020, but delayed the offering due to the coronavirus pandemic. Procore stock began trading under the stock ticker PCOR on May 20, 2021 at $67 per share. The initial public offering raised $634.5 million, and following the IPO the company was valued at nearly $11 billion. As of May 2021, the company has over 10,000 customers and over 1.6 million users of its products in more than 125 countries. Procore's campus is on a 9-acre oceanfront property in Carpinteria, California. Investors and acquisitions In 2014, Bessemer Venture Partners led a $15 million investment round. In 2015, the company raised an additional $30 million in a round led by Bessemer and Iconiq Capital. In 2015, the Wall Street Journal reported the company to be worth "$500 million post-money." In 2016, the company raised $50 million in a round led by Iconiq, reaching a $1 billion valuation. In 2018, the company raised an additional $75 million, and in 2020, it raised over $150 million. In total, the company raised nearly $500 million from 2007 through its IPO in 2021. In July 2019, Procore acquired the US project management software group Honest Buildings. In October 2020, it acquired the US estimating software provider Esticom. Procore acquired the construction artificial intelligence companies Avata Intelligence in 2020 and INDUS.AI in 2021. Software Procore's cloud-based construction management software allows teams of construction companies, property owners, project managers, contractors, and partners to collaborate on construction projects and share access to documents, planning systems and data, using an Internet-connected device. Data and video can also be streamed into the system via drones. The software includes features such as meeting minutes, drawing markups and document storage for all project-related materials. Procore's offerings also include an app marketplace with 300+ partners, including Box, an enterprise file storage and content management company; Botlink, a joint venture by Packet Digital that allows users to stream both video and data from drones surveying their construction projects; and Dexter + Chaney, an ERP provider. In 2015, software review company Software Advice ranked Procore the #1 most popular construction software, based on the number of users, search traffic, and social media presence. In 2018, Forbes wrote that the Procore app is the most popular software in the US construction industry. It was ranked number 5 on the Forbes Cloud 100 list in 2018, number 6 in 2019, and number 8 in 2020.
References Software companies based in California Construction software Architectural communication Construction documents Software companies of the United States 2002 establishments in California Software companies established in 2002 American companies established in 2002 2021 initial public offerings Companies listed on the New York Stock Exchange Cloud computing providers
1022443
https://en.wikipedia.org/wiki/Andy%20Hopper
Andy Hopper
Sir Andrew Hopper (born 1953) is a British-Polish computer technologist and entrepreneur. He is Treasurer and Vice-President of the Royal Society, Professor of Computer Technology, former Head of the University of Cambridge Department of Computer Science and Technology, and an Honorary Fellow of Trinity Hall, Cambridge and Corpus Christi College, Cambridge. Education Hopper was educated at Quintin Kynaston School in London, after which he studied for a Bachelor of Science degree at Swansea University before going to the University of Cambridge Computer Laboratory and Trinity Hall, Cambridge in 1974 for postgraduate work. Hopper was awarded his PhD in 1978 for research into local area computer communications networks, supervised by David Wheeler. Research and career Hopper's PhD, completed in 1977, was in the field of communications networks, and he worked with Maurice Wilkes on the creation of the Cambridge Ring and its successors. Hopper's research interests include computer networks, multimedia systems, Virtual Network Computing and sentient computing. His most cited paper describes the indoor location system called the Active Badge. He has contributed to a discussion of the privacy challenges relating to surveillance. After more than 20 years at the Cambridge University Computer Laboratory, Hopper was elected Chair of Communications Engineering at the Cambridge University Engineering Department in 1997. He returned to the Computer Laboratory as Professor of Computer Technology and Head of Department in 2004. He is currently the head of the Computer Laboratory's Digital Technology Group. Hopper's research under the title Computing for the Future of the Planet examines the uses of computers for assuring the sustainability of the planet. Hopper has supervised approximately fifty PhD students. Commercial activities In 1978, Hopper co-founded Orbis Ltd to develop networking technologies. In 1978 Hopper worked with Hermann Hauser and Chris Curry, founders of Acorn Computers Ltd; Orbis became a division of Acorn in 1979 and continued to work with the Cambridge Ring. While at Acorn, Hopper contributed to the design of some of the chips for the BBC Micro and helped conceive the project which led to the design of the ARM microprocessor. When Acorn was acquired by Olivetti in 1985, Hauser became vice-president for research at Olivetti, in which role he co-founded the Olivetti Research Laboratory in 1986 with Hopper; Hopper became its managing director. In 1985, after leaving Acorn, Hopper co-founded Qudos, a company producing CAD software and doing chip prototyping. He remained a director until 1989. In 1993, Hopper set up Advanced Telecommunication Modules Ltd with Hermann Hauser. This company went public on the NASDAQ as Virata in 1999 and was acquired by Conexant Systems on 1 March 2004. In 1995, Hopper co-founded Telemedia Systems, now called IPV, and was its chairman until 2003. In 1997, Hopper co-founded Adaptive Broadband Ltd (ABL) to further develop the 'Wireless ATM' project started at ORL in the early 1990s. ABL was bought by California Microwave, Inc in 1998. In January 2000, Hopper co-founded Cambridge Broadband, which was to develop broadband fixed wireless equipment; he was non-executive chairman from 2000 to 2005. In 2002 Hopper was involved in the founding of Ubisense Ltd to further develop the location technologies and sentient computing concepts that grew out of the ORL Active Badge system.
Hopper became a director in 2003 and was chairman between 2006 and 2015, during which time the company made its initial public offering (IPO) in June 2011. In 2002, Hopper co-founded RealVNC and has served as chairman since the company's inception. In 2002, Hopper co-founded Level 5 Networks and was a director until 2008, just after it merged with Solarflare Inc. From 2005 until 2009, Hopper was chairman of Adventiq, a joint venture between Adder and RealVNC, developing a VNC-based system-on-a-chip. In 2013 Hopper co-founded TxtEz, a company looking to commoditise B2C communication in Africa. Hopper was an advisor to Hauser's venture capital firm Amadeus Capital Partners from 2001 until 2005. He was also an advisor to the Cambridge Gateway Fund from 2001 until 2006. Awards and honours Hopper is a Fellow of the Institution of Engineering and Technology (FIET) and was a Trustee from 2003 until 2006, and again between 2009 and 2013. In 2004, Hopper was awarded the Mountbatten Medal of the IET (then the IEE). He served as president of the IET between 2012 and 2013. Hopper was elected a Fellow of the Royal Academy of Engineering in 1996 and awarded its Silver Medal in 2003. He was a member of the Council of the Royal Academy of Engineering from 2007 to 2010. In 2013, he was part of the RealVNC team to receive the MacRobert Award. In 1999, Hopper gave the Royal Society's Clifford Paterson Lecture on progress and research in the communications industry, published under the title Sentient Computing, and was awarded the society's bronze medal for achievement. In May 2006, he was elected a Fellow of the Royal Society. He was a member of the Council of the Royal Society between 2009 and 2011. In 2017 Hopper became Treasurer and Vice-President of the Royal Society and was awarded the Bakerian Lecture and Prize. In the 2007 New Year Honours, Hopper was made a CBE for services to the computer industry. In 2004, Hopper was awarded the Association for Computing Machinery's SIGMOBILE Outstanding Contribution Award, and in 2016 its Test-of-Time Award for the Active Badge paper. In July 2005, Hopper was awarded an Honorary Fellowship of Swansea University. In 2010 Hopper was awarded an Honorary Degree from Queen's University Belfast. In 2011 Hopper was elected as a member of the Council and Trustee of the University of Cambridge and a member of the Finance Committee. Hopper serves on several academic advisory boards. In 2005, he was appointed to the Advisory Board of the Institute of Electronics, Communications and Information Technology at Queen's University Belfast. In 2008 he joined the Advisory Board of the Department of Computer Science, University of Oxford. In 2011 he was appointed a member of the Advisory Board of the School of Computer and Communication Sciences at the École Polytechnique Fédérale de Lausanne. He was knighted in the 2021 Birthday Honours for services to computer technology. Personal life Hopper married Alison Gail Smith, Professor of Plant Biochemistry at the University of Cambridge, in 1988. They have two children, William and Merrill. He is a qualified pilot with over 6,000 hours logged, including a round-the-world flight, and his house near Cambridge has an airstrip from which he flies his six-seater Cessna light aircraft.
References 1953 births Living people Members of the University of Cambridge Computer Laboratory British computer scientists Acorn Computers Alumni of Swansea University Fellows of Corpus Christi College, Cambridge Fellows of the Royal Society Fellows of the Royal Academy of Engineering Commanders of the Order of the British Empire Fellows of the Institution of Engineering and Technology Polish emigrants to the United Kingdom Businesspeople from Warsaw British businesspeople Alumni of Trinity Hall, Cambridge People from Little Shelford Knights Bachelor
400753
https://en.wikipedia.org/wiki/E-gold
E-gold
E-gold was a digital gold currency operated by Gold & Silver Reserve Inc. (G&SR) under e-gold Ltd. that allowed users to open an account on its web site denominated in grams of gold (or other precious metals) and to make instant transfers of value ("spends") to other e-gold accounts. The e-gold system was launched online in 1996 and had grown to five million accounts by 2009, when transfers were suspended due to legal issues. At its peak in 2006, e-gold was processing more than US$2 billion worth of transactions per year, on a monetary base of only US$71 million worth of gold (~3.5 metric tonnes), indicating a high monetary turnover (velocity) of about 28 times per year (for comparison, annual velocity is about 6 for M1 and less than 1.6 for M2). e-gold Ltd. was incorporated in Nevis, Saint Kitts and Nevis, with operations conducted out of Florida, USA. Beginnings E-gold was founded by oncologist Douglas Jackson and attorney Barry Downey in 1996. The pair originally backed the service's accounts with gold coins stored in a bank safe deposit box in Melbourne, Florida. By 1998, G&SR (the system operator) was an Affiliate Member of NACHA and a Full Member of NACHA's The Internet Council. The company was launched two years before PayPal but did not manifest exponential growth until 2000. By 2004, there were over a million accounts. It was the first successful digital currency system to gain a widespread user base and merchant adoption, noted on July 13, 1999 in the Financial Times as “the only electronic currency that has achieved critical mass on the web”. It was also the first non-credit-card payment service provider to offer an application programming interface (API) enabling other services and e-commerce transactions to be built on top of it. After an initial demonstration of an e-gold Spend via Palm Pilot in February 1999, e-gold introduced support for wireless mobile payments. E-gold was used by both individuals and merchants for services including metals trading, online merchants, online auctions, online casinos, political organizations, and non-profit organizations. From 1996 through 1999, currency exchange services referred to as “InExchange” and “OutExchange” were directly supported on the e-gold platform. This arrangement exposed the system’s operator, G&SR, to the financial risks attendant to the provision of exchange services. It also tended to inhibit third parties from offering exchange services on an independent, competitive basis. In 2000, the system was restructured to effect a separation of currency exchange activities from the core functions of e-metal issuance and settlement of transfers. G&SR devolved ownership and responsibility for these core functions to e-gold Ltd., a newly formed offshore company organized for that express purpose. G&SR itself, now a customer of e-gold, continued to offer exchange services under the newly created OmniPay brand. Beginning in spring 2000, there was a proliferation of independent exchange services, marking the first emergence of an industry providing exchange between conventional national currencies and a privately issued brand of money. By 2001, several dozen companies and individuals from around the world were offering third-party exchange services between national currencies and e-gold, further extending e-gold's international user base. E-gold, which allowed transactions as small as one ten-thousandth of a gram of gold, was also the world's only successful micropayment system.
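As a worked illustration of the arithmetic behind these figures - the turnover (velocity) ratio and the gram-denominated micropayments - consider the sketch below. The gold price used is an assumed round figure for illustration, not a historical quote.

# Worked arithmetic for the figures above; the gold price is an
# assumed illustrative value, not a historical quote.
from decimal import Decimal

# Velocity: annual transaction volume divided by the monetary base.
annual_volume = Decimal("2e9")         # ~US$2 billion of spends per year
monetary_base = Decimal("71e6")        # ~US$71 million of gold in reserve
print(f"velocity = {annual_volume / monetary_base:.1f} turns/year")  # 28.2

# Micropayments: spends could be as small as 1/10,000 of a gram of gold.
gold_usd_per_gram = Decimal("15.00")   # assumed price for illustration
smallest_spend_grams = Decimal("0.0001")
print(f"smallest spend = ${smallest_spend_grams * gold_usd_per_gram:.6f}")
# $0.001500 - far below the practical minimum for card payments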
The company's payment statistics were published live and showed hundreds of thousands of micro-transactions were being made daily by computer programs using the API. From its inception in 1996, e-gold pioneered the decoupling of the numeraire for specifying a payment (Spend) instruction from the native unit of account of the settlement currency. For example, while AUG® (the trademarked designation for e-gold) was denominated in grams and decimal fractions (or in troy ounces since, as weight units, both are related by fixed arithmetic ratio), a Spend Instruction might be specified as "Pay [recipient account] 10 USD worth of e-gold". Calculation of the actual quantity to convey was made using a table of reference exchange rates maintained by the company, reflecting current actual exchange rates published by exchange providers. By the early 2000s (decade), the capability of immediate settlement, as implemented by e-gold, was recognized as key to the emergence of systems for peer-to-peer transfers of digital rights such as “smart contracts”. Governance E-gold was unique at the time in that they created the "e-gold Special Purpose Trust" which held title to the physical bullion on behalf of the users. They also created a real-time statistical reports page that showed the total holdings of each metal in the trust account, list of gold bars with serial numbers, the total number of accounts, as well as the total number and value of transactions in the previous 24 hours. This transparency enabled many observations to be made about how e-gold was being used. Criminal abuse E-gold's early success may have contributed to its demise. E-gold's store of value and large user base made it an early target of financial malware and phishing scams by increasingly organized criminal syndicates. The technique was refined with attacks against the digital gold systems like e-gold and later used to attack other financial institutions starting in 2003. Hackers Failing to prospectively verify the identity of account holders, e-gold began to suffer from an increasing rate of criminal activity mainly perpetrated by Russian and Ukrainian hackers against its users. In addition to phishing, the attackers made widespread use of flaws in the Microsoft Windows operating systems and Internet Explorer web browser to collect account details from millions of computers to compromise e-gold accounts. Jackson's theory was that e-gold is a book entry system with account histories, making it simple to conduct an investigation to track down and identify users who had engaged in illicit activity after the fact. While the public perception was that e-gold accounts were anonymous, e-gold accounts were pseudonymous, allowing the creator of the account to use any name or label he wished to use. However, account and transaction records—even failed log-in attempts—were permanently recorded, enabling linkage of seemingly unrelated accounts secretly under unified control. The data mining this enabled, combined with inputs from independent exchange services, enabled law enforcement to identify numerous criminal users of the service. Fraud Various fraud artists from Western countries were also able to take advantage of the e-gold system as a means of funding their schemes, enabling for the first time in history, international Ponzi schemes. Perpetrators of auction fraud on eBay would sell fake or non-existent items on the site. 
These criminal syndicates preferred their victims to pay in e-gold because it was the fastest and easiest way for them to move the funds overseas. The increase in online crime linked to e-gold led to complaints to government authorities by defrauded account holders, who often did not understand the difference between e-gold and the fraudulent person or company that encouraged them to open an e-gold account and wire money to fund it. Systemic problems As an online transaction system with exchange agents worldwide, e-gold enabled criminals and hackers in Romania to move money quickly and easily from victims in America back to the country from which the attacks originated. Several of the cyber crime gangs that plagued and used e-gold were based in Râmnicu Vâlcea, Romania. E-gold was unknowingly part of a larger systemic problem with the banking system. The banking and credit system in the United States was not designed for a digital environment, and was therefore fundamentally insecure and highly vulnerable to identity theft and check fraud, as well as trust-based attacks such as phishing. The willingness of credit card companies to allow people to apply for a card without being identified in person enabled rapid growth of identity theft. (Ironically, not verifying the identities of account holders would be one of the main criticisms raised against e-gold.) Cybercrime There were early reports that e-gold had actively helped to catch and collar cyber criminals, such as the one who stole Cisco Systems' firewall code and offered it for sale to be paid in e-gold. In June 2007, Jackson claimed to have "aided 300 investigations and reported 3,000 suspected child pornography buyers to the National Center for Missing and Exploited Children". Goldmoney, and then federal law enforcement agencies, began to characterize e-gold as the payment system of choice for criminals, terrorists and child pornographers. Criminal prosecution Changing definition of a money transmitter The USA Patriot Act, passed in the wake of the September 11 attacks more than five years after e-gold had been launched, made it a federal crime to operate a money transmitter business without a state money transmitter license in any state that required such a license. At the time, a money transmitter was, in most states, defined as a business that cashed checks or accepted cash remittances to send from one person to another person across international borders, such as Western Union or MoneyGram. For example, prior to 2010, California regulated money transmitters under the "Transmission of Money Abroad Law". One of e-gold's competitors, the e-Bullion company, applied for a money transmitter license from the State of California in 2002, but was informed by the State of California that its business, which dealt in gold accounts, did not fall under the state's definition of a money transmitter. In 2005, G&SR requested that the IRS SB/SE Division conduct a BSA (Bank Secrecy Act) compliance examination in order to clarify what regulations, if any, e-gold fell under. The United States Treasury issued a report on January 11, 2006 titled U.S. Money Laundering Threat Assessment, which G&SR believed was evidence favorable to its legal case, as explained in its January 20, 2006 letter, apparently confirming that e-gold accounts were excluded from the definition of "currency" under the United States Congress and Code of Federal Regulations definitions. However, in its actions from 2006 to 2008, the U.S.
Treasury Department in conjunction with the United States Department of Justice stretched the definition of money transmitter in the USA Patriot Act to include any system that allows transfer of any kind of value from one person to another, not merely national currency or cash. Using this new interpretation they then proceeded to prosecute the USA-based gold systems, e-gold (and later e-Bullion) under the USA Patriot Act for not having money transmitter licenses, even though these companies had previously been cooperating with regulatory authorities and told they did not fall under the definition of money transmitter. The charge of not having a money transmitter license was eventually dropped against e-bullion. Several years later FINCEN further expanded this definition to apply to foreign companies allowing US persons to open accounts, which forced the Jersey based Goldmoney.com to suspend the ability to transfer value from one holder to another in December 2011. A November 2013 article in Financial Times noted that "For several years, Mr Jackson had hoped to resurrect e-gold himself, but it became clear he would not be able to obtain the money transmitter licenses required in most US states." Allegations against e-gold Banks suffer from the same problems with criminal activity, phishing, ponzi schemes and money laundering on a much larger scale, and some of the biggest banks have even knowingly participated in money laundering. e-gold's status as a controversial alternative currency system made it an attractive target. While e-gold had begun implementing stronger controls against abuse by users of the system by 2005, and was actively combating the use of its system for child pornography as a founding member of the Financial Coalition Against Child Pornography, the Justice Department indicted the e-gold directors on four counts of violating money laundering regulations and knowingly allowing a transaction to purchase child pornography. The government action against e-gold was a case of first impression. As noted by the prosecutor, “Digital currencies are on the forefront of international fund transfers. E-gold is the most prominent digital currency out there. It has the attention of the entire digital currency world. That world is a bit of a wild west right now. People are looking for what are the rules and what are the consequences.” Resolution The case against e-gold was brought under Title 18 USC section 1960 in UNITED STATES OF AMERICA v. E-GOLD, LTD, District of Columbia court. e-gold filed a motion to dismiss the case on the grounds that they did not fit the definition of a money transmitter. The court ruled against e-gold, stating that "a business can clearly engage in money transmitting without limiting its transactions to cash or currency and would commit a crime if it did so without being licensed." This ruling enshrined in case law the Treasury Department's expansion of the definition of a money transmitter to include any system by which stored value of any kind may be transferred from one person to another, even if the stored value is neither cash, nor national currency. After vigorously contesting the charges for a year, in July 2008 the company and its three directors entered into a plea agreement. Dr. Jackson pleaded guilty to "operation of an unlicensed money transmitting business" and "conspiracy to engage in money laundering". 
The agreement detailed actions required to bring the companies into compliance with laws and regulations governing operation of a Money Transmitting Business. Concurrently, the companies agreed to a consent order of forfeiture, dropping their action to recover funds previously seized by the government. Sentencing was scheduled to occur 120 days following entry of the Plea Agreements in order to afford a 90-day interval to implement compliance requirements. A status report detailing progress with regard to mandated compliance measures was filed November 8, 2008. In November 2008, Gold & Silver Reserve CEO Douglas Jackson was sentenced to 300 hours of community service, a $200 fine, and three years of supervision, including six months of electronically monitored home detention. He had faced a maximum sentence of 20 years in prison and a $500,000 fine. Commenting on her substantial deviation from Federal Sentencing Guidelines (in the direction of leniency), Judge Rosemary Collyer, having already noted “no doubt that Dr. Jackson has respect for the law” and that “the intent was not there to engage in illegal conduct”, determined: “there is no reason to shut down e-Gold and G&SR, and every reason to have them come into legal compliance”. Jackson's lawyer claimed Jackson was spared the heavier fine because he was deeply in debt - the judge said "Dr. Jackson has suffered, will continue to suffer, and may never be successful with e-Gold". Reid Jackson, Douglas Jackson's brother, and e-Gold director Barry Downey were each sentenced to three years of probation and 300 hours of community service, and ordered to pay a $2,500 fine and a $100 assessment. Suspension of service and e-gold Value Access Plan (VAP) The 2007 e-gold indictment was accompanied by seizures (and forced redemption) of the e-gold balances of multiple exchange providers, resulting in an almost overnight decline in the amount of e-gold in circulation (and gold reserves) from 3.5 to 2.6 metric tonnes. [No exchanger except G&SR was charged with any crime and the seized value was subsequently returned to them.] Additionally, the government filed a Post-Indictment Restraining Order (PIRO) which prohibited redemption of e-gold for gold bullion without the approval of the prosecutor. The primary purpose of the PIRO was to prevent dispersion of assets (the gold reserves) which the government had been unable to seize due to the custodial arrangements whereby the e-gold Bullion Reserve Special Purpose Trust held title to the gold. The combination of adverse publicity and disrupted exchange markets led to a precipitous decline in e-gold usage and demand. Whereas under normal circumstances a decrease in demand would have resulted in a decrease in circulation (without impacting exchange rates), this combination led to e-gold Users being unable to exchange their e-gold for conventional money and discouraged any potential recipient from accepting payment in e-gold. In 2008, the Plea Agreement detailed requirements for e-gold to resume operation as a regulated financial institution. While e-gold had already complied with the majority of requirements by the time of sentencing, it was discovered that the guilty Plea itself effectively precluded the companies (or any company controlled by the e-gold directors) from being licenseable in any US state. In accordance with the Plea, e-gold suspended all remaining Spend activity, in effect locking up all e-gold account balances. 
The challenge was then how to restore customer access to the value in their e-gold accounts. Lacking licenses as a money transmitting business, any plan to liquidate the system and distribute value to customers would risk being construed as an additional violation of operating without required licenses. In 2009, the e-gold directors approached the US government with a proposal whereby the government might serve as middleman for disbursing the value due to e-gold customers. Following a year of negotiation, the e-gold VAP was approved calling for monetization of reserves and a claims mechanism, under the authority and oversight of Judge Hollander. The VAP protocol entailed the companies consenting to a voluntary seizure action of the aggregate e-gold. The companies were then responsible for “monetizing” the value, that is, redeeming the e-gold, liquidating the bullion released from reserves and turning over the proceeds to the Secret Service. Due to a fortuitous drop in USD value relative to e-gold, the net realized monetization rate for VAP was $1583 per troy ounce, over twice the maximum e-gold exchange rate during the interval Spend activity was curtailed and then suspended c. 2007-2009. Altogether, G&SR turned over more than $92.8 million to the Secret Service in 2012. Some of the e-metal in e-Gold accounts was criminally derived, but much of the e-metal was owned by innocent account holders. The court ordered Rust Consulting, a private company in Maryland, to process refunds to account holders following validation of their identity by e-gold. The balance of unclaimed funds will be forfeited to the US government. A three-month window was set from June 3, 2013 to October 1, 2013 for e-gold account holders to submit a claim on their funds, then extended to December 31, 2013. Aftermath After the e-gold and e-Bullion cases, California (2010) and several other states amended their regulations to follow the federal precedent to define all digital value transfer systems as money transmitters. However, California's 2010 law is worded as to define a range of Internet startup companies, such as the room booking service Airbnb, as "money transmitters". E-gold was an early pioneer of Internet payments. The company was the first successful online payment system which pioneered many of the systems and techniques of e-commerce, including making payments over an SSL encrypted connection, and offering an API to enable other websites to build services using e-gold's transaction system. Though e-gold was ultimately shut down by the US government, the federal judge on the case ruled that the founders of e-gold "had no intent to commit illegal activity." After the resolution of the criminal case, the directors of e-gold Ltd vowed to continue operations following the new Federal know your customer guidelines. E-gold's failure was ultimately due to their inability to provide a system of reliable user identification and the failure to provide a workable dispute resolution system to identify and cut off illegal and abusive activity in their user community. Other transaction systems such as Webmoney.ru and Goldmoney.com learned from e-gold's mistakes and were able to successfully field similar systems with low rates of abuse by addressing these deficiencies. While PayPal has done a better job of addressing abuse than e-gold did, they now contend with the same kind of Internet fraud that took down e-gold. 
Financial cryptographers have observed that Bitcoin has repeated the same fundamental errors that e-gold made, and that despite its decentralized nature the cyber crime-wave might bring Bitcoin to a similar ending. According to GoldMoney's website, BitGold announced the acquisition of GoldMoney on May 22, 2015. Petition to vacate 2008 convictions On July 2, 2020, e-gold Ltd, Gold & Silver Reserve, Inc, Douglas Jackson, Barry Downey, and Reid Jackson filed a writ of coram nobis petition in the United States District Court for the District of Columbia seeking to vacate, with prejudice, their 2008 convictions. The petitioners allege that the government intentionally withheld exculpatory evidence from its Brady disclosure that would have resulted in the defendants declining to enter into plea agreements for any of the counts on the indictment and should have informed the court’s ruling on the defendants’ motion to dismiss the money transmitter specific counts on the indictment against them. Specifically, the petitioners allege that a business substantially similar to that of E-gold Ltd was advised by the District of Columbia Department of Insurance, Securities, and Banking that it did not require a money transmitting license pursuant to D.C. law, causing  the petitioners to realize that the government may have misrepresented the applicability of state money transmitting laws to e-gold. The petitioners further allege that this resulted in public records requests which yielded evidence that the government had instructed the Florida Office of Financial Regulation to refrain from issuing or publishing its legal opinion that neither of the companies were money transmitters per Florida law during the pendency of the grand jury investigation of the companies. See also Bitcoin Digital currency Digital currency exchanger Gold as an investment Liberty Reserve PayPal Private currency WebMoney References External links Digital gold currencies
3139614
https://en.wikipedia.org/wiki/Olympus%20m%3Arobe
Olympus m:robe
Olympus m:robe was a product line of MP3 players that were produced by Olympus Corporation between 2004-2005. The name m:robe is a contraction of Music wardROBE. Olympus has ended production of the entire m:robe line. On October 13, 2004, Olympus released two MP3 players: the 5GB MR-100 with monochrome display and the 20GB MR-500i with colour display and built-in camera. The MR-100’s release price was $249.99 (USD), and the MR-500i’s release price was $499.99 (USD). The later MR-F10, MR-F20, and MR-F30 players with colour screens, drag-and-drop file transferring, and FM tuning and recording were only released in Asia. MR:100 The MR:100 was a 5GB MP3 player. It had a bright red on deep blood red 1.25 x 1 inch display which was 160 pixels wide and 128 pixels high. It also had a Synaptic touch pad for the controls that glowed red where the sensitivity pads are. The entire front was covered in a single piece of plastic, and when the m:robe was turned off or idling, the front turned all black. It had a metal back painted in Pearl White, although in Japan limited numbers were also available with a Pearl Pink or Lagoon Blue back. The MR-100 used the same PortalPlayer PP5020 CPU as the iPod mini, and used a very similar interface. The player was compatible with MP3 and WMA files. It included an AC adapter, a USB cable, a dock, a pair of earbuds, a headphone extension wire, a CD-ROM with the player's software, and documentation. Additional accessories included a red or black rubber sleeve and a wired remote that plugs into the MR:100. Rockbox became fully functional on the m:robe 100 series and allowed many features not possible in the original firmware. MR-500i The MR-500i was a 20GB MP3 player/camera combo. It had a 3.7 inch 640x480 (VGA) 262,144 colour LCD touch screen. The player had the ability to play music and take photos, as well as a “remix” mode where the user could choose a song to play with a selection of photos. It came with a wired remote that allowed the user to change between songs, view the song being played, and change the volume. The remote was also available as an optional accessory for the MR:100. The back of the player was Pearl White like the original MR-100. The MR-500i supported MP3 and WMA audio files. It came with an AC adapter, a USB cable, a dock, a pair of earbuds, the m:robe remote, a CD-ROM with the player's software, and documentation. MR-Fxx series The MR-Fxx series consisted of the MR-F10, MR-F20, and MR-F30. They were only available in Asia. They were all available with 512MB and 1GB flash memory, except the MR-F20 which only came in 512MB. They all had color screens with varying resolutions and color depth. The MR-F10 and MR-F20 had 65,536-colour organic electroluminescent displays, and the MR-F30 had a 262,144-color organic electroluminescent display. The MR-F10 was available in pearl white or gloss black, and came with an earbud neckstrap. The MR-F20 also had a pearl white back and a piece of plastic on the front, with black behind it and recessed touch buttons (similar to the 3G iPod) that glowed red. The MR-F30 was available in white and black. The MR-F20 and MR-F30 had built-in FM tuners with aircheck functions for recording radio broadcasts. The MR-F30 and MR-F20 had a built-in voice recorders. A Drag and Drop interface could be used instead of m:trip. m:trip The device's main software for transferring media between the player and a computer was called m:trip, and has been heavily criticised by m:robe users. 
The software initially ran only on Windows 2000 and Windows XP (Home and Professional), with no support for Macintosh or Linux; Linux support was added later. m:trip required users to "sync" to upload music, and syncing an m:robe from a different computer deleted all of the music on the player, as an anti-piracy measure. In addition, the program was accused of having numerous syncing bugs. m:trip 2.0 (2.0.0.9) was released on July 5, 2006 (in Japan only) and made improvements to the software, including a Remix-Cube function, automatic updating and importing of music, and a new service named Olio. An English patch for version 2.0 was made by independent developers, but it was rendered obsolete when Olympus released version 2.1, which added a new save-folder option for remix cubes and made minor changes and additions to the remix function and the program itself. Various m:trip replacements have been released by independent developers; their aim is to provide a simple, unintrusive alternative to m:trip without any of its multimedia capabilities.

RM-13 remote and cradle
The MR-100 could be used with an LCD remote (RM-13) and a data sync/charge cradle. The RM-13 had a "heart" button that let the user add songs to a favourites list, and it displayed the track time, title, battery status, and track number. The languages supported on the RM-13 were English, Korean, and Japanese, although other languages could still be displayed. The RM-13 had a green backlight and also featured mode and digital volume buttons.

End of production
On November 9, 2005, Olympus announced that it was stopping production of the m:robes, freezing development of the next generation of m:robe products to put more resources into its digital camera and audio recording research, development, marketing and sales. Many users attributed the line's failure to a lack of marketing: the only marketing in the United States was two Super Bowl ads. Many users had also anticipated the discontinuation, given the drop in the m:robes' prices shortly before the announcement and the fact that the MR-100 was being sold at Radio Shack for $100 USD after rebates. Only a few accessories were ever released for the m:robes.

See also
Rockbox (alternative, open source firmware for the m:robe 100)

References

Digital audio players
Japanese brands
Consumer electronics brands
18287609
https://en.wikipedia.org/wiki/FreshBooks
FreshBooks
FreshBooks is accounting software operated by 2ndSite Inc., aimed primarily at small and medium-sized businesses. It follows a web-based software-as-a-service (SaaS) model and can be accessed through a desktop or mobile device. The company was founded in 2003 and is based in Toronto, Canada.

History
FreshBooks was founded in 2003 by Mike McDerment, Levi Cooperman, and Joe Sawada in Toronto, Ontario. McDerment incorporated a second company, BillSpring, in January 2015 to work on new product development; it was rolled back into FreshBooks as an updated interface in 2016. Initially, FreshBooks functioned as an electronic invoicing program targeting IT professionals. The initial release of FreshBooks is now referred to as "FreshBooks Classic". FreshBooks Classic's front-end application was built in PHP, and the backend services were built in Python.

Product
FreshBooks offers a subscription-based product that includes invoicing, accounts payable, expense tracking, time tracking, retainers, fixed asset depreciation, purchase orders, payroll integrations, double-entry accounting, and industry-standard business and management reporting. All financial data is stored in the cloud on a single unified ledger, allowing users to access the same set of books regardless of location, on desktop and mobile. FreshBooks offers a free API that enables customers and third-party software vendors to integrate external applications with FreshBooks. It also supports multiple tax rates and currencies, and incorporates a payroll feature and a projects feature. The software is priced on a pay-per-use recurring monthly fee. FreshBooks supports country-specific tax calculation in Canada, the United States and Britain: GST and HST in Canada, sales taxes in the United States, and Making Tax Digital (MTD) in the UK.

Operations
FreshBooks has its headquarters in Toronto, Canada, with operations in North America, Europe and Australia. Founder Mike McDerment was the chief executive officer of the company from 2003 until 2021, when he stepped down and was replaced by Don Epperson but remained executive chair. Epperson had previously joined FreshBooks as executive director in 2019.

Funding
FreshBooks was initially self-funded. In 2014, the company raised a Series A venture investment of $30 million led by the venture capital firm Oak Investment Partners, with participation by Georgian Partners and Atlas Venture. In 2017, FreshBooks announced that it had raised another $43 million in funding from Accomplice, Georgian Partners and Oak Investment Partners. On August 10, 2021, FreshBooks announced that it had secured $80.75 million in Series E funding and $50 million in debt financing, reaching a valuation of more than $1 billion.

See also
Comparison of accounting software
Comparison of time tracking software

References

Accounting software
Web applications
Cloud applications
17671142
https://en.wikipedia.org/wiki/SCSI%20RDMA%20Protocol
SCSI RDMA Protocol
In computing, the SCSI RDMA Protocol (SRP) is a protocol that allows one computer to access SCSI devices attached to another computer via remote direct memory access (RDMA). SRP is also known as the SCSI Remote Protocol. The use of RDMA makes higher throughput and lower latency possible than is generally achievable through, for example, the TCP/IP communication protocol. Some network adapters accelerate RDMA in hardware, such as InfiniBand host channel adapters (HCAs) and Ethernet network adapters with RDMA over Converged Ethernet (RoCE) or iWARP support, while some vendors, such as Vcinity, implement RDMA in software for WAN transport. Though the SRP protocol has been designed to use RDMA networks efficiently, it is also possible to implement it over networks that do not support RDMA. SRP was published as an ANSI standard (ANSI INCITS 365-2002) in 2002 and renewed in 2007 and 2019. As with the iSCSI Extensions for RDMA (iSER) communication protocol, there is the notion of a target (a system that stores the data) and an initiator (a client accessing the target), with the target initiating data transfers. In other words, when an initiator writes data to a target, the target executes an RDMA read to fetch the data from the initiator, and when an initiator issues a SCSI read command, the target sends an RDMA write to the initiator.
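The following short, runnable Python simulation is an illustrative sketch of this target-driven data movement only; all class and function names here are invented for the illustration, and real implementations live in kernel drivers such as ib_srp and target stacks such as SCST.

    # Schematic simulation of SRP data flow: the initiator ships only a
    # command plus a buffer descriptor; the target moves the data itself.
    class Memory:
        """Stands in for an RDMA-registered buffer on the initiator."""
        def __init__(self, size):
            self.data = bytearray(size)

    def rdma_read(buf):
        # Target pulls data out of initiator memory (SCSI WRITE path).
        return bytes(buf.data)

    def rdma_write(buf, payload):
        # Target pushes data into initiator memory (SCSI READ path).
        buf.data[:len(payload)] = payload

    class Target:
        def __init__(self):
            self.disk = {}  # logical block address -> bytes

        def on_srp_cmd(self, op, lba, buf):
            if op == "WRITE":
                self.disk[lba] = rdma_read(buf)   # SCSI write => RDMA read
            elif op == "READ":
                rdma_write(buf, self.disk.get(lba, b"\x00" * len(buf.data)))
            return "SRP_RSP: GOOD"                # completion response unit

    target = Target()
    buf = Memory(4)
    buf.data[:] = b"DATA"
    print(target.on_srp_cmd("WRITE", lba=0, buf=buf))           # target pulls
    out = Memory(4)
    print(target.on_srp_cmd("READ", lba=0, buf=out), out.data)  # target pushes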
While the SRP protocol is easier to implement than the iSER protocol, iSER offers more management functionality, e.g. the target discovery infrastructure enabled by the iSCSI protocol. In order to use the SRP protocol, an SRP initiator implementation, an SRP target implementation, and networking hardware supported by both are needed.

The following software SRP initiator implementations exist:
Linux SRP initiator, available since November 2005 (kernel version 2.6.15).
Windows SRP initiator, available through the winOFED InfiniBand stack.
VMware SRP initiator, available since January 2008 through Mellanox's OFED drivers for VMware Infrastructure 3 and vSphere 4.
Solaris 10 SRP initiator, available through Sun's download page.
Solaris 11 and OpenSolaris SRP initiator, integrated as a component of project COMSTAR.
The IBM POWER virtual SCSI client driver for Linux (ibmvscsi), available since January 2008 (kernel version 2.6.24). Virtual SCSI allows client logical partitions to access I/O devices (disk, CD, and tape) that are owned by another logical partition.

The following SRP target implementations exist:
The SCST SRP target implementation, a mature SRP target available since 2008 via both SCST and OFED.
Linux LIO SRP target, available since January 2012 (kernel version 3.3), based on the SCST SRP target.
The IBM POWER virtual SCSI target driver (ibmvstgt), available since January 2008 (kernel version 2.6.24).
DataDirect Networks' (DDN) disk subsystems such as the S2A9900 and SFA10000, which use the SRP target implementation in the disk subsystem's controllers to present LUNs to servers (the servers act as SRP initiators).
IBM's FlashSystem.
The Solaris COMSTAR target, available since early 2009 in OpenSolaris and Solaris 11.

Bandwidth and latency of storage targets supporting the SRP or the iSER protocol should be similar. On Linux, there are two SRP and two iSER storage target implementations that run inside the kernel (SCST and LIO) and an iSER storage target implementation that runs in user space (STGT). Measurements have shown that the SCST SRP target has lower latency and higher bandwidth than the STGT iSER target. This is probably because the RDMA communication overhead is lower for a component implemented in the Linux kernel than for a user-space Linux process, and not because of protocol differences.

See also
iSCSI Extensions for RDMA (iSER)

References

Computer networking
SCSI
5742771
https://en.wikipedia.org/wiki/Ripper%20%28video%20game%29
Ripper (video game)
Ripper is a 1996 interactive movie point-and-click adventure game developed and published by Take-Two Interactive for MS-DOS and Macintosh. The cast includes Christopher Walken, Paul Giamatti, Karen Allen, Burgess Meredith (in his final performance before his death the following year), David Patrick Kelly, Ossie Davis, and John Rhys-Davies. It also uses the Blue Öyster Cult song "(Don't Fear) The Reaper". The villain of the game is chosen at random from the four main characters; a limited number of the clues and puzzles, plus a single line of dialogue in the ending, change according to the villain's identity. In 1996, home ports for the Saturn and PlayStation were announced but never shipped. Ripper is the second of the three full-motion video-based adventure games developed by Take-Two, the other two being Hell: A Cyberpunk Thriller and Black Dahlia.

Plot
Ripper takes place in New York City in the year 2040. It opens with the investigation of the recent murder of Renee Stein, the third victim of a serial killer known as "The Ripper", so named largely because his modus operandi resembles that of Jack the Ripper. The player assumes the role of Jake Quinlan, a reporter for the Virtual Herald, to whom The Ripper sends messages detailing his murders (an act attributed to Jack the Ripper, although no letters have been proven to come from him). Along with the police (whose investigation is headed by Detective Vincent Magnotta), Quinlan is seeking The Ripper's true identity. After investigating Renee Stein's murder, Quinlan receives a message from The Ripper, who warns Quinlan that his girlfriend, Catherine Powell, will be the next victim, as she has gotten too close to discovering his identity. Quinlan manages to find Powell still alive, but in a coma "deeper than anyone thought possible." Cybersurgeon Clare Burton at the Meta-Cognition Center of the Tribeca Center Hospital manages to retrieve a distorted image of Powell's attacker, but requires additional information from Quinlan to make it clearer. (This is also a reference to Jack the Ripper, as the police hypothesized that they might be able to get an image of the killer from the retinas of the victims.) Quinlan provides this information by investigating what Powell had uncovered in her own investigation, and homes in on three possible suspects for The Ripper's murders. In order to transmit this information into Powell's brain directly, he enlists the help of Joey Falconetti, a hacker who specializes in interfacing directly with the human brain. Quinlan's investigation leads him to discover that all of The Ripper's victims and all of those associated with the investigation of The Ripper (except Quinlan himself) were involved with an old gaming group known as the Web Runners, who played a game based on the Jack the Ripper mystery. The last session of this game somehow caused one of the players to die in real life; the player who died was Catherine Powell's mother. Assistance from a pathologist named Vic Farley reveals that The Ripper's murders were committed by planting a code into a victim's brain while in cyberspace that causes the victim's internal body pressure to rise to the point of explosion, which Farley experiences immediately after providing his explanation. Quinlan also finds a cyberspace weapon developed by a murdered cyberarchitect named Hamilton Wofford, designed specifically to kill The Ripper inside a virtual recreation of the historic Whitechapel district of London, where the Jack the Ripper murders took place.
After assembling the weapon and gathering the necessary protection from The Ripper's weapon, Quinlan enters cyberspace, kills The Ripper, and escapes the virtual Whitechapel in time to avoid its destruction. The Ripper can be one of four possible suspects: Joey Falconetti, Clare Burton, Vincent Magnotta, or Catherine Powell. With each play-through, certain clues and the actual identity of The Ripper vary, though the bulk of the story is unchanged, and clues indicating the guilt of all four suspects appear regardless of who the killer is. For instance, Catherine Powell experiences mysterious surges in brain wave activity that coincide with all of The Ripper's murders regardless of whether or not she actually is The Ripper, and no alternative explanation for these surges is provided. However, the changes in the game's story and puzzles are limited to the game's third act, after Farley's death.

Cast
Christopher Walken as Detective Vince Magnotta; a violent police officer with a romantic interest in Clare Burton and a vendetta against Falconetti. When Catherine's mother died during their last Ripper game, Magnotta arrested Falconetti for the murder. Magnotta is convinced that Falconetti is the Ripper, but his obsession may simply be a ruse to cover the fact that Magnotta himself is the Ripper. Magnotta may also be eager to frame Falconetti as the Ripper so that he can collect a large bounty for solving the case. Over the course of the investigation, Magnotta becomes increasingly hostile towards Quinlan, believing him to be the Ripper due to the circumstances of Quinlan's encounters with the Ripper and his victims.
Burgess Meredith as Hamilton Wofford / Covington Wofford; two brothers who live on the outskirts of New York City. Hamilton was a cyber-architect who built virtual worlds based on historic locations and was hired by the Ripper to build a replica of Whitechapel. Hamilton is murdered prior to the start of the game, but Quinlan eventually finds an AI created by Hamilton that provides him with critical information. Covington, in his brother's absence, is a recluse and seems to have gradually lost his grip on sanity.
Karen Allen as Doctor Clare Burton; a brilliant but seemingly cold and distant doctor who specializes in the human brain. Burton is the object of affection of both Magnotta and Falconetti, both of whom have made her life chaotic. Burton arouses suspicion of being the Ripper when she appears to stall in treating Catherine, the only surviving Ripper victim. It is later revealed that Burton invented the weapon the Ripper uses in his murders. Burton claims that the weapon was stolen, but whether this is true depends on whether Burton is the Ripper or not.
David Patrick Kelly as Joey Falconetti; a brilliant, though seemingly violent and disturbed, computer hacker. Falconetti immediately draws suspicion of being the Ripper with his knife collecting and his fascination with the Ripper himself. Falconetti was married to Clare Burton, but the two divorced after Magnotta arrested Falconetti. Joey now has a vendetta against Magnotta and Burton, though he still has feelings for the latter.
Scott Cohen as Jake Quinlan; an investigative reporter who is investigating and writing about the Ripper murders. Quinlan regularly receives letters and messages from the Ripper, and is forced into chasing the Ripper when his assistant and lover, Catherine Powell, is attacked by the Ripper.
Quinlan is not well versed in the workings of cyberspace; thus the player must learn how the technology of the game world works as Quinlan learns it himself.
Ossie Davis as Ben Dodds; the editor of the Virtual Herald, in whom Quinlan confides.
John Rhys-Davies as Vigo Haman; a mobster who has valuable information regarding Clare Burton.
Tahnee Welch as Catherine Powell; Quinlan's co-worker and lover. Powell begins investigating the Ripper behind Quinlan's back, hoping to take the story and launch her own career as a reporter. Powell is comatose for most of the game, but it is discovered late in the game that her brain activity spikes on each day the Ripper kills, making Catherine a viable suspect. The suspicion that Catherine may be the Ripper is heightened when it is discovered that her mother was killed in the last Ripper Web Runners game, and that Catherine was aware of this fact.
Jimmie Walker as Soap Beatty; a computer hacker who is one of Catherine's primary sources in her investigation of the Ripper. He gives the player much insight into the computer technology of the game.
Steven Randazzo as Sgt. Lou Brannon; a rare non-corrupt cop who gives the player information about Magnotta's suspicious activities.
Peter Boyden as Vic Farley; a friendly pathologist who is trying to figure out how the Ripper commits his murders.
Paul Giamatti as Doctor Bud Cable; a doctor tending to Catherine Powell who keeps the player up to date on Catherine's condition while Doctor Burton stalls in treating her.
MacIntyre Dixon as Gambit Nelson; a cyberspace entrepreneur who gives the player critical information about Falconetti.
Lianna Pai as Kashi Yamamoto; a current Web Runner who gives the player information about the Runners' history.
David Thornton as Twig; an assistant who maintains Falconetti's computer hacking equipment.
Kira Arne as Vivien Santiago; the hospital receptionist, who flirts with Quinlan and informs him of suspicious activities at the hospital.
William Seymour as Bob Eppels; a pathologist who replaces Farley after Farley is fired. Suspiciously, Eppels seems to be under instruction not to give Quinlan any information, though he slips up and tells Quinlan secrets about the hospital.
Richard Bright as Dr. Karl Stasiak; a forensic photographer who is a trusted source of Quinlan's. Unfortunately, Karl is frightened by the Ripper's attacks and leaves New York, depriving Quinlan of a trusted source in the police department.
Phyllis Bash as Prof. Lillian Bech; a professor who sheds some light on Clare Burton's past and first informs Quinlan of the Web Runners' existence.

Development
Considerable effort was focused on the game's full-motion video sequences. Paying the game's slew of big-name actors cost nearly 25% of the game's entire budget, and Phil Parmet was brought on to direct the video segments. Writer/lead designer F. J. Lennon commented, "The whole industry wants to crucify FMV, people claim FMV doesn't belong in game, but if it's done professionally, I think it can work." The game engine was created from scratch and can switch resolution between 640x480 and 320x200 on the fly. Ripper was in development for two years, with roughly a quarter of its budget going to pay the actors, and was launched in February 1996.

Reception
Take 2 announced shipments of 160,000 copies to retailers during the game's debut week, and called it "our biggest game to date".
According to Take 2, the game sold over 150,000 units by the end of October 1996 and earned 28.7% of all company revenue during that fiscal year, which totalled $12.5 million; the company's income in that period was $349,074. Arinn Dembo of CNET Gamecenter wrote, "[S]low sales, unfortunately, quickly knocked [Ripper] off retail shelves." The game received an average score of 71.50% at GameRankings, based on an aggregate of four reviews. A reviewer for Next Generation commented, "One minute the game believes it's a graphic adventure, the next it's a movie, and the next it's a puzzle game. If any one of these aspects would be perfected, it could be a gamer's delight. As it stands, the game is mediocre in each category." He specifically criticized that the characters "are so overdone it's just plain funny" and that the first-person sequences cannot be bypassed, forcing the player to watch the same graphics every time they backtrack. Jeff Sengstack of NewMedia magazine wrote that Ripper "meets, even exceeds, its pre-release hype", and summarized it as "an engaging horror mystery with immense depth", though he found fault with the video compression and difficulty.

References

External links
Official site (archived)

1996 video games
Cancelled PlayStation (console) games
Cancelled Sega Saturn games
Cyberpunk video games
Detective video games
DOS games
Fiction set in 2040
Full motion video based games
Interactive movie video games
Classic Mac OS games
Point-and-click adventure games
Single-player video games
Take-Two Interactive games
Video games about Jack the Ripper
Video games about virtual reality
Video games developed in the United States
Video games set in the 2040s
Video games with alternate endings
31253847
https://en.wikipedia.org/wiki/Dragomir%20R.%20Radev
Dragomir R. Radev
Dragomir R. Radev is a Yale University professor of computer science working on natural language processing and information retrieval. He previously served as a computer science professor at the University of Michigan and an adjunct computer science professor at Columbia University. Radev serves as a member of the advisory board of Lawyaw. He currently works in the fields of open-domain question answering, multi-document summarization, and the application of NLP in bioinformatics, social network analysis, and political science. Radev received his PhD in computer science from Columbia University in 1999. He has served as the secretary of the Association for Computational Linguistics (ACL) since 2006 and is an associate editor of the Journal of Artificial Intelligence Research (JAIR).

Awards
As NACLO founder, Radev shared the Linguistic Society of America's 2011 Linguistics, Language and the Public Award. He is a co-winner of the Gosnell Prize (2006). In 2015 he was named a fellow of the Association for Computing Machinery "for contributions to natural language processing and computational linguistics."

IOL
Radev has served as coach of the US national team at the International Linguistics Olympiad (IOL), leading it to several gold medals.

Books
Puzzles in Logic, Languages and Computation (2013)
Mihalcea and Radev (2011). Graph-based methods for NLP and IR

Selected Papers
SIGIR 1995. Generating summaries of multiple news articles
ANLP 1997. Building a generation knowledge source using internet-accessible newswire
Computational Linguistics 1998. Generating natural language summaries from multiple on-line sources
ACL 1998. Learning correlations between linguistic indicators and semantic constraints: Reuse of context dependent descriptions of entities
ANLP 2000. Ranking suspected answers to natural language questions using predictive annotation
CIKM 2001. Mining the web for answers to natural language questions
AAAI 2002. Towards CST-enhanced summarization
ACL 2003. Evaluation challenges in large-scale multi-document summarization: the Mead project
Information Processing and Management 2004. Centroid-based summarization of multiple documents
Journal of Artificial Intelligence Research 2004. LexRank: Graph-based lexical centrality as salience in text summarization
Journal of the American Association of Information Science and Technology 2005. Probabilistic question answering on the web
Communications of the ACM 2005. NewsInEssence: summarizing online news topics
EMNLP 2007. Semi-supervised classification for extracting protein interaction sentences using dependency parsing
Bioinformatics 2008. Identifying gene-disease associations using centrality on a literature mined gene-interaction network
IEEE Intelligent Systems 2008. Natural language processing and the web
NAACL 2009. Generating surveys of scientific paradigms
Nucleic Acids Research 2009. Michigan molecular interactions r2: from interacting proteins to pathways
Journal of the American Association of Information Science and Technology 2009. Visual overviews for discovering key papers and influences across research fronts
KDD 2010. DivRank: the interplay of prestige and diversity in information networks
American Journal of Political Science 2010. How to Analyze Political Attention with Minimal Assumptions and Costs
arXiv 2011. The effect of linguistic constraints on the large scale organization of language
Journal of Biomedical Semantics 2011. Mining of vaccine-associated IFN-gamma gene interaction networks using the vaccine ontology

External links
Team USA Brings Home the Linguistics Gold
Dragomir Radev, Co-Founders Recognized as NACLO Receives Linguistics, Language and the Public Award
Dragomir Radev Coaches US Linguistics Team to Multiple Wins
Dragomir Radev Honored as ACM Distinguished Scientist
Prof. Dragomir Radev Receives Gosnell Prize

References

Year of birth missing (living people)
Living people
Columbia School of Engineering and Applied Science alumni
American computer scientists
University of Michigan faculty
Natural language processing
Information retrieval researchers
Fellows of the Association for Computing Machinery
Fellows of the Association for the Advancement of Artificial Intelligence
Natural language processing researchers
Data miners
48736239
https://en.wikipedia.org/wiki/Privacy%20concerns%20regarding%20Google
Privacy concerns regarding Google
Google's changes to its privacy policy on March 16, 2012 enabled the company to share data across a wide variety of services. These embedded services include millions of third-party websites that use AdSense and Analytics. The policy was widely criticized for creating an environment that discourages Internet innovation by making Internet users more fearful and wary of what they put online.

Around December 2009, after privacy concerns were raised, Google's CEO Eric Schmidt declared: "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place. If you really need that kind of privacy, the reality is that search engines—including Google—do retain this information for some time and it's important, for example, that we are all subject in the United States to the Patriot Act and it is possible that all that information could be made available to the authorities."

Privacy International has raised concerns regarding the dangers and privacy implications of having a centrally located, widely popular data warehouse of millions of Internet users' searches, and how, under controversial existing U.S. law, Google can be forced to hand over all such information to the U.S. government. In its 2007 Consultation Report, Privacy International ranked Google as "Hostile to Privacy", its lowest rating, making Google the only company on the list to receive that ranking.

At the Techonomy conference in 2010, Eric Schmidt predicted that "true transparency and no anonymity" is the path to take for the Internet: "In a world of asynchronous threats it is too dangerous for there not to be some way to identify you. We need a [verified] name service for people. Governments will demand it." He also said that, "If I look at enough of your messaging and your location, and use artificial intelligence, we can predict where you are going to go. Show us 14 photos of yourself and we can identify who you are. You think you don't have 14 photos of yourself on the internet? You've got Facebook photos!"

Shona Ghosh, a journalist for Business Insider, noted that an increasing digital resistance movement against Google has grown. A major hub for critics of Google to organize their abstention from Google products is the subreddit r/degoogle.
The Electronic Frontier Foundation (EFF), a nonprofit organization that deals with civil liberties, has raised concerns regarding privacy issues pertaining to student data after conducting a survey which showed that a majority of parents, students and teachers are concerned that student privacy is being breached. According to the EFF, the Federal Trade Commission has ignored complaints from the public that Google has been harvesting student data and search results, even after holding talks with the Department of Education in 2018. Google has also blocked W3C privacy proposals using its veto power.

Potential for data disclosure

Data leaks
On March 10, 2009, Google reported that a bug in Google Docs had allowed unintended access to some private documents. PCWorld estimated that 0.05% of all documents stored via the service were affected by the bug, which Google stated had been fixed.

Cookies
Google places one or more cookies on each user's computer, which it uses to track the person's web browsing across a large number of unrelated websites and to track their search history. If a user is logged into a Google service, Google also uses the cookies to record which Google Account is accessing each website and performing each search. Originally the cookie did not expire until 2038, although it could be manually deleted by the user or refused by setting a browser preference. As of 2007, Google's cookie expired in two years but renewed itself whenever a Google service was used. As of 2011, Google said that it anonymizes the IP address data that it collects after nine months, and the association between cookies and web accesses after 18 months. As of 2016, Google's privacy policy does not promise anything about whether or when its records of users' web browsing or searching are deleted.

The non-profit group Public Information Research launched Google Watch, a website advertised as "a look at Google's monopoly, algorithms, and privacy issues." The site raised questions relating to Google's storage of cookies, which in 2007 had a life span of more than 32 years and incorporated a unique ID that enabled creation of a user data log. Google faced criticism with its release of Google Buzz, Google's version of social networking, where Gmail users had their contact lists automatically made public unless they opted out. Google shares this information with law enforcement and other government agencies upon receiving a request; the majority of these requests do not involve review or approval by any court or judge.

Tracking
Google is suspected of collecting and aggregating data about Internet users through the various tools it provides to developers, such as Google Analytics, Google Play Services, reCAPTCHA, Google Fonts, and Google APIs. This could enable Google to determine a user's route through the Internet by tracking the IP address being used through successive sites (cross-domain web tracking). Combined with other information made available through Google's widely used APIs, this might allow Google to build a fairly complete profile of a web user linked to an IP address or identity. This kind of data is invaluable for marketing agencies, and for Google itself to increase the efficiency of its own marketing and advertising activities. Google encourages developers to use its tools and to communicate end-user IP addresses to Google: "Developers are also encouraged to make use of the userip parameter to supply the IP address of the end-user on whose behalf you are making the API request. Doing so will help distinguish this legitimate server-side traffic from traffic which doesn't come from an end-user."
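As an illustrative sketch only, a server-side application forwarding the end-user's IP address through such a userip parameter might look like the following Python fragment; the endpoint URL, API key, and exact parameter set are placeholders, not a documented Google API.

    import requests

    def proxy_search(query, end_user_ip):
        # Hypothetical server-side call: the application, not the end-user,
        # contacts the API, but it forwards the end-user's IP address so the
        # provider can distinguish real users behind the server.
        return requests.get(
            "https://api.example-search-provider.com/search",  # placeholder URL
            params={
                "key": "YOUR_API_KEY",   # placeholder credential
                "q": query,
                "userip": end_user_ip,   # the quoted userip parameter
            },
            timeout=10,
        )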
ReCAPTCHA uses the google.com domain instead of one specific to reCAPTCHA. This allows Google to receive any cookies it has already set for the user, effectively bypassing restrictions on setting third-party cookies and allowing traffic correlation with all of Google's other services, which most users use. ReCAPTCHA collects enough information that it could reliably de-anonymize many users who simply wish to prove that they are not robots. Google operates so many sites and services that it is difficult for users to track where their information might be viewed online.

Following continuous backlash over aggressive tracking and unknown data retention periods, Google has tried to appeal to the growing number of privacy-conscious people. At Google I/O 2019, it announced plans to limit the data retention period for some of its services, starting with Web and App Activity. Users can select a retention period of between 3 and 18 months within the Google Account dashboard; the retention limit is disabled by default.

Gmail
Steve Ballmer, Liz Figueroa, Mark Rasch, and the editors of Google Watch believe the processing of email message content by Google's Gmail service goes beyond proper use. Google Inc. claims that mail sent to or from Gmail is never read by a human being other than the account holder, and that content read by computers is only used to improve the relevance of advertisements and block spam emails. The privacy policies of other popular email services, like Outlook.com and Yahoo, allow users' personal information to be collected and utilized for advertising purposes. In 2004, thirty-one privacy and civil liberties organizations wrote a letter calling upon Google to suspend its Gmail service until the privacy issues were adequately addressed. The letter also called upon Google to clarify its written information policies regarding data retention and data sharing among its business units. The organizations voiced their concerns about Google's plan to scan the text of all incoming messages for the purposes of ad placement, noting that the scanning of confidential email for inserting third-party ad content violates the implicit trust of an email service provider.

In 2013, Microsoft launched an advertising campaign attacking Google for scanning email messages, arguing that most Gmail users are not aware that Google monitors their personal messages to deliver targeted ads. Microsoft claimed that its email service Outlook does not scan the contents of messages, and a Microsoft spokesperson called the issue of privacy "Google's kryptonite." Other concerns include the unlimited period for data retention that Google's policies allow, and the potential for unintended secondary uses of the information Gmail collects and stores. A court filing uncovered by the advocacy group Consumer Watchdog in August 2013 revealed that Google had stated that no "reasonable expectation" exists among Gmail users in regard to the assured confidentiality of their emails. According to the British newspaper The Guardian, "Google's court filing was referring to users of other email providers who email Gmail users and not to the Gmail users themselves". Google also responded to a lawsuit filed in May 2013, and a Google spokesperson stated to the media on August 15, 2013 that the corporation takes the privacy and security concerns of Gmail users "very seriously."
A federal judge declined to dismiss a lawsuit brought by Gmail users who objected to Google analyzing the content of their messages for commercial purposes such as targeted advertising. In 2017, Google stopped using Gmail content to personalize ads.

CIA and NSA ties
In February 2010, Google was reported to be working on an agreement with the National Security Agency (NSA) to investigate recent attacks against its network. While the deal did not give the NSA access to Google's data on users' searches or e-mail communications and accounts, and Google was not sharing proprietary data with the agency, privacy and civil rights advocates were concerned.

In October 2004, Google acquired Keyhole, a 3D mapping company. In February 2004, before its acquisition by Google, Keyhole received an investment from In-Q-Tel, the CIA's investment arm. And in July 2010 it was reported that the investment arms of both the CIA (In-Q-Tel) and Google (Google Ventures) were investing in Recorded Future, a company specializing in predictive analytics: monitoring the web in real time and using that information to predict the future. Private corporations have been using similar systems since the 1990s, but the involvement of Google and the CIA, with their large data stores, raised privacy concerns.

In 2011, a federal district court judge in the United States turned down a Freedom of Information Act request submitted by the Electronic Privacy Information Center, and in May 2012 a Court of Appeals upheld the ruling. The request attempted to disclose NSA records regarding the 2010 cyber-attack on Google users in China. The NSA stated that revealing such information would make US Government information systems vulnerable to attack, and refused to confirm or deny the existence of the records or of any relationship between the NSA and Google.

Leaked NSA documents obtained by The Guardian and The Washington Post in June 2013 included Google on the list of companies that cooperate with the NSA's PRISM surveillance program, which authorizes the government to secretly access data of non-US citizens hosted by American companies without a warrant. Following the leak, government officials acknowledged the existence of the program. According to the leaked documents, the NSA had direct access to those companies' servers, and the amount of data collected through the program had been growing fast in the years prior to the leak. Google has denied the existence of any "government backdoor".

Government requests
Google has been criticized both for disclosing too much information to governments too quickly and for not disclosing information that governments need to enforce their laws. In April 2010, Google for the first time released details about how often countries around the world ask it to hand over user data or to censor information, with online tools making the updated data available to everyone. Between July and December 2009, Brazil topped the list for user data requests with 3,663, while the US made 3,580, the UK 1,166, and India 1,061. Brazil also made the largest number of requests to remove content, with 291, followed by Germany with 188, India with 142, and the US with 123. Google, which had stopped offering search services in China a month before the data was released, said it could not release information on requests from the Chinese government because such information is regarded as a state secret.
Google's chief legal officer said, "The vast majority of these requests are valid and the information needed is for legitimate criminal investigations or for the removal of child pornography". On March 20, 2019, the U.S. Supreme Court reviewed an $8.5 million settlement that Google had reached to resolve a lawsuit claiming it had invaded users' privacy.

Google Chrome
In 2008, Consumer Watchdog produced a video showing how Google Chrome records what a user types into the web address field and sends that information to Google servers to populate search suggestions. The video includes discussion of the potential privacy implications of this feature.

Incognito browsing mode
Google Chrome includes a private browsing feature called "incognito browsing mode" that prevents the browser from permanently storing any browsing or download history information or cookies. Using incognito mode prevents tracking by the browser itself; however, the individual websites visited can still track and store information about visits. In particular, any searches performed while signed into a Google account will be saved as part of the account's web history. In addition, other programs invoked from within Chrome, such as those used to stream media files, may still record history information even when incognito mode is being used. Furthermore, a limitation of Apple's iOS 7 platform allows some information from incognito browser windows to leak to regular Chrome browser windows. There are concerns that these limitations may have led Chrome users to believe that incognito mode provides more privacy protection than it actually does.

Street View
Google's online map service, Street View, has been accused of taking pictures that look too far into people's private homes and come too close to people on the street when they do not know they are being photographed.

Wi-Fi networks information collection
From 2006 to 2010, Google Street View camera cars collected about 600 gigabytes of data from users of unencrypted public and private Wi-Fi networks in more than 30 countries. No disclosures or privacy policy were given to those affected, nor to the owners of the Wi-Fi stations. Google apologized and said that it was "acutely aware that we failed badly here" in terms of privacy protection, that it had not been aware of the problem until an inquiry from German regulators was received, that the private data was collected inadvertently, and that none of the private data was used in Google's search engine or other services. A representative of Consumer Watchdog replied, "Once again, Google has demonstrated a lack of concern for privacy. Its computer engineers run amok, push the envelope and gather whatever data they can until their fingers are caught in the cookie jar." In a sign that legal penalties may result, Google said it would not destroy the data until permitted by regulators.

The Street View data collection prompted several lawsuits in the United States, which were consolidated into one case before a California federal court. Google's motion to have the case dismissed, arguing that the Wi-Fi communications it captured were "readily accessible to the general public" and therefore not a violation of federal wiretapping laws, was rejected in June 2011 by the U.S. District Court for the Northern District of California and, upon appeal, in September 2013 by the U.S. Court of Appeals for the Ninth Circuit. The ruling is viewed as a major legal setback for Google and allows the case to move back to the lower court for trial. Google no longer collects Wi-Fi data via Street View, instead relying on Android devices' Wi-Fi positioning system; however, the company has suggested a unified approach for opting out of Wi-Fi-based positioning systems, proposing that the word "nomap" be appended to a wireless access point's SSID to exclude it from Google's WPS database.
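As a toy Python illustration of this opt-out convention (assuming the underscore-suffix form "_nomap" that Google's documentation describes; the SSID below is made up):

    def opt_out_ssid(ssid):
        # Append the "_nomap" suffix unless the SSID already carries it.
        return ssid if ssid.endswith("_nomap") else ssid + "_nomap"

    print(opt_out_ssid("HomeNetwork"))  # -> HomeNetwork_nomap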
YouTube
On July 14, 2008, Viacom agreed to a compromise protecting YouTube users' personal data in its $1 billion copyright lawsuit. Google agreed to anonymize user information and internet protocol addresses from its YouTube subsidiary before handing the data over to Viacom. The privacy deal also applied to other litigants, including the FA Premier League, the Rodgers & Hammerstein organization and the Scottish Premier League. The deal did not, however, extend anonymity to employees, because Viacom wished to prove that Google staff were aware of the uploading of illegal material to the site. The parties therefore agreed to meet further on the matter lest the data be made available to the court. YouTube was later forced to pay $170 million and implement new privacy systems following an FTC complaint about the platform's enforcement of the Children's Online Privacy Protection Act.

Google Buzz
On February 9, 2010, Google launched Google Buzz, Google's microblogging service. Anyone with a Gmail account was automatically added as a contact to pre-existing Gmail contacts, and had to opt out if they did not wish to participate. The launch of Google Buzz as an "opt-out" social network immediately drew criticism for violating user privacy because it automatically allowed Gmail users' contacts to view their other contacts. In 2011, the United States Federal Trade Commission initiated a proceeding against Google, alleging that certain personal information of Gmail users was shared without permission through the Google Buzz social network.

Real names, Google+, and Nymwars
Google Plus (G+) was launched in late June 2011 and gained 20 million members in just a few weeks. At the time of launch, the site's user content and conduct policy stated, "To help fight spam and prevent fake profiles, use the name your friends, family or co-workers usually call you." Starting in July 2011, Google began enforcing this policy by suspending the accounts of those who used pseudonyms. Starting in August 2011, Google provided a four-day grace period before enforcing the real-name policy and suspending accounts, allowing members time to change their pen names to their real names. The policy extended to new accounts for all Google services, including Gmail and YouTube, although accounts existing before the new policy were not required to be updated. In late January 2012, Google began allowing members to use nicknames, maiden names, and other "established" names in addition to their common or real names.

According to Google, the real-name policy makes Google more like the real world: people can find each other more easily, as in a phone book, and the policy protects children and young adults from cyber-bullying, as bullies hide behind pen names. There is considerable use of search engines for "people-searching", attempting to find information on persons by performing a search of their name. A number of high-profile commentators have publicly criticized Google's policies, including technologists Jamie Zawinski, Kevin Marks, and Robert Scoble and organizations such as the Electronic Frontier Foundation.
Criticisms have been wide-ranging, for example:
The policy is not like the real world, because real names and personal information are not known to everyone in the off-line world.
The policy fails to acknowledge long-standing Internet culture and conventions.
Using real names online can disadvantage or endanger some individuals, such as victims of violence or harassment, and the policy prevents users from protecting themselves by hiding their identity. For example, a person who reports a human rights violation or crime and posts it on YouTube can no longer do so anonymously. The dangers include possible hate crimes, retaliation against whistle-blowers, executions of rebels, religious persecution, and revenge against victims or witnesses of crimes.
Using a pseudonym is different from anonymity, and a pseudonym used consistently denotes an "authentic personality".
Google's arguments fail to address the financial gain represented by connecting personal data to real-world identities.
Google has inconsistently enforced its policy, especially by making exceptions for celebrities using pseudonyms and mononyms.
The policy as stated is insufficient for preventing spam.
The policy may run afoul of legal constraints such as the German "Telemediengesetz" federal law, which makes anonymous access to online services a legal requirement.
The policy does not prevent trolls. It is up to social media to encourage the growth of healthy social norms, and forcefully telling people how they must behave cannot be efficient.

Do Not Track
In April 2011, Google was criticized for not signing onto the Do Not Track feature for Chrome that was being incorporated into most other modern web browsers, including Firefox, Internet Explorer, Safari, and Opera. Critics pointed out that a patent Google was granted in April 2011, covering greatly enhanced user tracking through web advertising, would provide much more detailed information on user behavior, and that Do Not Track would hurt Google's ability to exploit this. Software reviewer Kurt Bakke of Conceivably Tech also weighed in, and Mozilla developer Asa Dotzler noted: "It seems pretty obvious to me that the Chrome team is bowing to pressure from Google's advertising business and that's a real shame. I had hoped they'd demonstrate a bit more independence than that." At the time of the criticisms, Google argued that the technology was useless, as advertisers are not required to obey the user's tracking preferences and it remained unclear what constitutes tracking (as opposed to storing statistical data or user preferences). As an alternative, Google continued to offer an extension called "Keep My Opt-Outs" that permanently prevents advertising companies from installing cookies on the user's computer. The reaction to this extension was mixed. Paul Thurrott of Windows IT Pro called the extension "much, much closer to what I've been asking for—i.e. something that just works and doesn't require the user to figure anything out—than the IE or Firefox solutions", while lamenting that the extension was not included as part of the browser itself. In February 2012, Google announced that Chrome would incorporate a Do Not Track feature by the end of 2012, and it was implemented in early November 2012.
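On the wire, Do Not Track amounts to a single advisory request header, "DNT: 1", which websites may honor or ignore; the following minimal Python sketch (the URL is a placeholder) shows a request carrying it:

    import requests

    # The DNT header merely expresses a preference; enforcement is left
    # entirely to the receiving site.
    response = requests.get("https://example.com/",
                            headers={"DNT": "1"},
                            timeout=10)
    print(response.status_code)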
In a related study, "Don't Let Google Know I'm Lonely" (2016), Pól Mac Aonghusa and Douglas J. Leith presented strong evidence that the two search giants, Google and Bing, achieve very high accuracy in inferring users' sensitive interests from their queries. Focusing specifically on users' financial and sexual preferences, they concluded that "For Google, 100% of user sessions on a sensitive topic reject the hypothesis that no learning of the sensitive topic by the search engine has taken place and so are identified as sensitive. For Bing, the corresponding detection rate is 91%."

Scroogle
Scroogle, named after the fictional character Ebenezer Scrooge, was a web service that allowed users to perform Google searches anonymously. It focused heavily on searcher privacy by blocking Google cookies and not saving log files. The service was launched in 2003 by Google critic Daniel Brandt, who was concerned about Google collecting personal information on its users. Scroogle offered a web interface and browser plugins for Firefox, Google Chrome, and Internet Explorer that allowed users to run Google searches anonymously. The service scraped Google search results, removing ads and sponsored links; only the raw search results were returned, meaning features such as page preview were not available. For added security, Scroogle gave users the option of having all communication between their computer and the search page SSL-encrypted. Although Scroogle's activities technically violated Google's terms of service, Google generally tolerated its existence, whitelisting the site on multiple occasions. After 2005, the service grew rapidly before running into a series of problems starting in 2010. In February 2012, the service was permanently shut down by its creator due to a combination of throttling of search requests by Google and a denial-of-service attack by an unknown person or group. Before its demise, Scroogle handled around 350,000 queries daily and ranked among the top 4,000 sites worldwide in web traffic, and in the top 2,500 for the United States, Canada, the United Kingdom, Australia, and other countries.

Privacy and data protection cases and issues by country

European Union
European Union (EU) data protection officials (the Article 29 working party, who advise the EU on privacy policy) have written to Google asking the company to justify its policy of keeping information on individuals' internet searches for up to two years. The letter questioned whether Google has "fulfilled all the necessary requirements" of the EU laws concerning data protection. On May 31, 2007, Google agreed that its privacy policy was vague and said that it was constantly working at making it clearer to users. After Google merged its different privacy policies into a single one in March 2012, the working group of all European Union data protection authorities assessed that it failed to comply with the EU legal framework. Several countries then opened cases to investigate possible breaches of their privacy rules. Google has also been implicated in Google Spain v AEPD and Mario Costeja González, a case before the Audiencia Nacional (Spain's national court) and the European Court of Justice, which required Google to comply with European privacy laws (i.e., the Data Protection Directive) and to allow users to be forgotten when operating in the European Union.

Czech Republic
Starting in 2010, after more than five months of unsuccessful negotiations with Google, the Czech Office for Personal Data Protection prevented Street View from taking pictures of new locations. The Office described Google's program as taking pictures "beyond the extent of the ordinary sight from a street" and claimed that it "disproportionately invaded citizens' privacy."
Google resumed Street View in the Czech Republic in 2012 after agreeing to a number of limitations similar to concessions Google has made in other countries.

France
In January 2014, the French data protection authority CNIL sanctioned Google, requiring it to pay its highest fine and to display on its search engine website a banner referring to the decision. Google complied but planned to appeal to the supreme court of administrative justice, the Conseil d'État. A number of French and German companies came together to form a group called the Open Internet Project, seeking a ban on Google's manipulative favoring of its own services and content over those of others.

Germany
In May 2010, Google was unable to meet a deadline set by Hamburg's data protection supervisor to hand over data illegally collected from unsecured home wireless networks. Google added, "We hope, given more time, to be able to resolve this difficult issue." The data was turned over to German, French, and Spanish authorities in early June 2010. In November 2010, vandals in Germany targeted houses that had opted out of Google's Street View. In April 2011, Google announced that it would not expand its Street View program in Germany, but that what had already been photographed (around 20 cities' worth of pictures) would remain available. This decision came despite an earlier Berlin State Supreme Court ruling that Google's Street View program was legal. In September 2014, a top official in Germany called for Google to be broken up as publishers fought in court over compensation for the snippets of text that appear with Google News updates. The chief executive of Axel Springer, a German publishing giant, expressed fears over Google's growing influence in the country.

Italy
In February 2010, in a case brought by Vividown, an Italian advocacy group for people with Down syndrome, and by the father of a disabled boy who had been filmed being bullied by several classmates, three Google executives were handed six-month suspended sentences for breach of the Italian Personal Data Protection Code in relation to the video, which had been uploaded to Google Video in 2006. In December 2012, these convictions and sentences were overturned on appeal.

Norway
The Data Inspectorate of Norway (Norway is not a member of the EU) investigated Google (and others) and stated that the 18- to 24-month period for retaining data proposed by Google was too long.

United Kingdom
On 27 March 2015, the Court of Appeal ruled that British persons have the right to sue Google in the UK for misuse of private information.

United States
In early 2005, the United States Department of Justice filed a motion in federal court to force Google to comply with a subpoena for "the text of each search string entered onto Google's search engine over a two-month period (absent any information identifying the person who entered such query)." Google fought the subpoena due to concerns about users' privacy. In March 2006, the court ruled partially in Google's favor, recognizing the privacy implications of turning over search terms and refusing to grant access. In April 2008, a Pittsburgh couple, Aaron and Christine Boring, sued Google for "invasion of privacy", claiming that Street View made a photo of their home available online and that this diminished the value of their house, which had been purchased for its privacy. They lost their case in a Pennsylvania court.
"While it is easy to imagine that many whose property appears on Google's virtual maps resent the privacy implications, it is hard to believe that any other than the most exquisitely sensitive would suffer shame or humiliation," Judge Hay ruled; the Boring family was paid one dollar by Google for the incident. In May 2010, a U.S. District Court in Portland, Oregon ordered Google to hand over two copies of wireless data that the company's Street View program collected as it photographed neighborhoods. In 2012 and 2013, Google reached two settlements over tracking people online without their knowledge after bypassing privacy settings in Apple’s Safari browser. The first was a settlement in August 2012 for $22.5 million with the Federal Trade Commission—the largest civil penalty the FTC has ever obtained for a violation of a Commission order. The second was a November 2013 settlement for $17 million with 37 states and the District of Columbia. In addition to the fines, Google agreed to avoid using software that overrides a browser's cookie-blocking settings, to avoid omitting or misrepresenting information to individuals about how they use Google products or control the ads they see, to maintain for five years a web page explaining what cookies are and how to control them, and to ensure that the cookies tied to Safari browsers expire. In both settlements Google denied any wrongdoing, but said it discontinued circumventing the settings early in 2012, after the practice was publicly reported, and stopped tracking Safari users and showing them personalized ads. In September 2019, Google was fined $170 million by the Federal Trade Commission in New York for violation of COPPA regulations on its YouTube platform. YouTube was required to add a feature to allow video uploaders to flag videos made for users under 13 years old. Adverts were prohibited in videos flagged as such. YouTube was also required to ask for parents' permission before collecting personal information from children. DoubleClick ads combined with other Google services In the summer of 2016, Google quietly dropped its ban on personally-identifiable info in its DoubleClick ad service. Google's privacy policy was changed to state it "may" combine web-browsing records obtained through DoubleClick with what the company learns from the use of other Google services. While new users were automatically opted in, existing users were asked if they wanted to opt in, and it remains possible to opt out in the Activity controls of the My Account page for a Google account. ProPublica stated that "The practical result of the change is that the DoubleClick ads that follow people around on the web may now be customized to them based on your name and other information Google knows about you. It also means that Google could now, if it wished to, build a complete portrait of a user by name, based on everything they write in email, every website they visit and the searches they conduct." Google contacted ProPublica to correct the fact that it didn't "currently" use Gmail keywords to target web ads. See also Don't be evil Google litigation Googlization History of Google Criticism of Facebook Criticism of Microsoft Criticism of Yahoo! 
Distributed search engine
Internet privacy
Privacy concerns with social networking services

References

External links
"Google's Email Service 'Gmail' Sacrifices Privacy for Extra Storage Space", Privacy Rights Clearinghouse, April 2, 2004
"Privacy Group Flunks Google", Lisa Vaas, eWeek.com, June 12, 2007
"Who's afraid of Google?", The Economist, August 30, 2007
"Googlist Realism: The Google-China saga and the free-information regimes as a new site of cultural imperialism and moral tensions", Tricia Wang, Cultural Bytes, June 29, 2010
"How to Remove Your Google Search History Before Google's New Privacy Policy Takes Effect", Eva Galperin, Electronic Frontier Foundation, February 21, 2012

Google
Google privacy concerns
Criticism of Google
Internet privacy
Privacy controversies and disputes
2052739
https://en.wikipedia.org/wiki/File%20system%20API
File system API
A file system API is an application programming interface through which a utility or user program requests services of a file system. An operating system may provide abstractions for accessing different file systems transparently. Some file system APIs may also include interfaces for maintenance operations, such as creating or initializing a file system, verifying the file system for integrity, and defragmentation. Each operating system includes the APIs needed for the file systems it supports. Microsoft Windows has file system APIs for NTFS and several FAT file systems. Linux systems can include APIs for ext2, ext3, ReiserFS, and Btrfs, to name a few.

History
Some early operating systems were capable of handling only tape and disk file systems. These provided the most basic of interfaces with:
Write, read and position
More coordination, such as device allocation and deallocation, required the addition of:
Open and close
As file systems provided more services, more interfaces were defined:
Metadata management
File system maintenance
As additional file system types, hierarchy structure and supported media increased, additional features needed some specialized functions:
Directory management
Data structure management
Record management
Non-data operations
Multi-user systems required APIs for:
Sharing
Restricting access
Encryption

API overviews

Write, read and position
Writing user data to a file system is provided for use directly by the user program or the run-time library. The run-time library for some programming languages may provide type conversion, formatting and blocking. Some file systems provide identification of records by key and may include re-writing an existing record; this operation is sometimes called PUT or PUTX (if the record exists). Reading user data, sometimes called GET, may include a direction (forward or reverse) or, in the case of a keyed file system, a specific key. As with writing, run-time libraries may intercede for the user program. Positioning includes adjusting the location of the next record. This may include skipping forward or reverse as well as positioning to the beginning or end of the file.

Open and close
The open API may be explicitly requested or implicitly invoked upon the issuance of the first operation by a process on an object. It may cause the mounting of removable media, establishing a connection to another host, and validating the location and accessibility of the object. It updates system structures to indicate that the object is in use. Usual requirements for requesting access to a file system object include:
The object which is to be accessed (file, directory, media and location)
The intended type of operations to be performed after the open (reads, updates, deletions)
Additional information may be necessary, for example:
a password
a declaration that other processes may access the same object while the opening process is using the object (sharing); this may depend on the intent of the other process
in contrast, a declaration that no other process may access the object regardless of the other processes' intent (exclusive use)
These are requested via a programming language library, which may provide coordination among modules in the process in addition to forwarding the request to the file system.
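As a concrete illustration, the POSIX flavor of these interfaces expresses the object, the intended operations, and creation details through the arguments of open(), with write(), read(), and lseek() covering the write, read, and position operations described above. A minimal sketch in C (the file name is just an example):

```c
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    /* The pathname identifies the object; the flags declare the intent
       (read and write) and ask the file system to create the file if it
       does not exist; the mode sets the initial permission metadata. */
    int fd = open("example.dat", O_RDWR | O_CREAT, 0644);
    if (fd == -1) {                        /* the open can fail for many reasons */
        fprintf(stderr, "open: %s\n", strerror(errno));
        return 1;
    }

    const char msg[] = "hello, file system";
    if (write(fd, msg, sizeof msg) == -1)  /* write user data */
        fprintf(stderr, "write: %s\n", strerror(errno));

    lseek(fd, 0, SEEK_SET);                /* position: back to the start */

    char buf[sizeof msg];
    ssize_t n = read(fd, buf, sizeof buf); /* read the data back */
    if (n > 0)
        printf("read %zd bytes: %s\n", n, buf);

    if (close(fd) == -1)                   /* close can also fail, e.g. on write-back */
        fprintf(stderr, "close: %s\n", strerror(errno));
    return 0;
}
```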
It must be expected that something may go wrong during the processing of the open. The object or intent may be improperly specified (the name may include an unacceptable character or the intent may be unrecognized). The process may be prohibited from accessing the object (it may be accessible only by a group or a specific user). The file system may be unable to create or update structures required to coordinate activities among users. In the case of a new (or replacement) object, there may not be sufficient capacity on the media. Depending on the programming language, additional specifications in the open may establish the modules to handle these conditions. Some libraries specify a library module to the file system, permitting analysis should the opening program be unable to perform any meaningful action as a result of a failure. For example, if the failure is on the attempt to open the necessary input file, the only action may be to report the failure and abort the program. Some languages simply return a code indicating the type of failure, which must always be checked by the program, which decides what to report and whether it can continue. Close may cause dismounting or ejecting of removable media and updating of library and file system structures to indicate that the object is no longer in use. The minimal specification for the close references the object. Additionally, some file systems allow specifying a disposition of the object, which may indicate that the object is to be discarded and no longer be part of the file system. Similar to the open, it must be expected that something may go wrong. The specification of the object may be incorrect. There may not be sufficient capacity on the media to save any data being buffered or to output a structure indicating that the object was successfully updated. A device error may occur on the media where the object is stored while writing buffered data, the completion structure or metadata related to the object (for example, last access time). A specification to release the object may be inconsistent with other processes still using the object. Considerations for handling a failure are similar to those of the open.

Metadata management
Information about the data in a file is called metadata. Some of the metadata is maintained by the file system, for example the last-modification date (and various other dates depending on the file system), the location of the beginning of the file, the size of the file, and whether the file system backup utility has saved the current version of the file. These items cannot usually be altered by a user program. Additional metadata supported by some file systems may include the owner of the file, the group to which the file belongs, as well as permissions and/or access control (i.e., what accesses and updates various users or groups may perform), and whether the file is normally visible when the directory is listed. These items are usually modifiable by file system utilities, which may be executed by the owner. Some applications store more metadata. For images the metadata may include the camera model and settings used to take the photo. For audio files, the metadata may include the album, the artist who made the recording, and comments about the recording, which may be specific to a particular copy of the file (i.e., different copies of the same recording may have different comments, as updated by the owner of the file). Documents may include items like checked-by, approved-by, etc.
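On POSIX systems, much of this file-system-maintained metadata is exposed through the stat() interface. A short sketch in C that prints a few of the fields mentioned above (the file name is again just an example):

```c
#include <stdio.h>
#include <sys/stat.h>
#include <time.h>

int main(void)
{
    struct stat st;
    if (stat("example.dat", &st) == -1) {   /* look up the file's metadata */
        perror("stat");
        return 1;
    }
    /* Items maintained by the file system itself: */
    printf("size:          %lld bytes\n", (long long)st.st_size);
    printf("owner/group:   uid %ld / gid %ld\n",
           (long)st.st_uid, (long)st.st_gid);
    printf("permissions:   %o\n", (unsigned)(st.st_mode & 0777));
    printf("last modified: %s", ctime(&st.st_mtime)); /* ctime() adds a newline */
    return 0;
}
```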
Directory management
Renaming a file, moving a file (or a subdirectory) from one directory to another, and deleting a file are examples of the operations provided by the file system for the management of directories. Metadata operations, such as permitting or restricting access to a directory by various users or groups of users, are usually included.

Filesystem maintenance
As a filesystem is used, directories, files and records may be added, deleted or modified. This usually causes inefficiencies in the underlying data structures: logically sequential blocks may become distributed across the media in a way that causes excessive repositioning, and partially used or even empty blocks may remain included in linked structures. Incomplete structures or other inconsistencies may be caused by device or media errors, inadequate time between detection of impending loss of power and actual power loss, improper system shutdown or media removal, and, on very rare occasions, file system coding errors. Specialized routines in the file system are included to optimize or repair these structures. They are not usually invoked by the user directly but triggered within the file system itself. Internal counters of the number of levels of structures or the number of inserted objects may be compared against thresholds. Exceeding these may cause user access to a specific structure to be suspended (usually to the displeasure of the user or users affected), or the routines may be started as low-priority asynchronous tasks, or they may be deferred to a time of low user activity. Sometimes these routines are invoked or scheduled by the system manager, as in the case of defragmentation.

Kernel-level API
The API is "kernel-level" when the kernel not only provides the interfaces for filesystem developers but is also the space in which the filesystem code resides. This differs from the older scheme in that the kernel uses its own facilities to talk with the filesystem driver and vice versa, as opposed to the kernel handling the filesystem layout and the filesystem directly accessing the hardware. It is not the cleanest scheme, but it avoids the major rewrites that the old scheme required. With modular kernels it allows adding filesystems as any kernel module, even third-party ones. With non-modular kernels, however, it requires the kernel to be recompiled with the new filesystem code (and in closed-source kernels, this makes third-party filesystems impossible). Unixes and Unix-like systems such as Linux have used this modular scheme. There is a variation of this scheme, used in MS-DOS (DOS 4.0 onward) and compatibles, to support CD-ROM and network file systems. Instead of adding code to the kernel, as in the old scheme, or using kernel facilities as in the kernel-based scheme, it traps all calls to a file and identifies whether each should be redirected to the kernel's equivalent function or handled by the specific filesystem driver, and the filesystem driver "directly" accesses the disk contents using low-level BIOS functions.

Driver-based API
The API is "driver-based" when the kernel provides facilities but the file system code resides totally external to the kernel (not even as a module of a modular kernel). It is a cleaner scheme, as the filesystem code is totally independent; it allows filesystems to be created for closed-source kernels and online filesystem additions or removals from the system. Examples of this scheme are the respective IFSs of Windows NT and OS/2.
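In both the kernel-level and driver-based schemes, the operating system ultimately routes each generic request through a table of filesystem-specific callbacks, in the spirit of Linux's VFS or the NT and OS/2 IFSs. The following C sketch shows the general shape of such a dispatch table; all names and signatures here are illustrative, not any real kernel's API:

```c
#include <stdio.h>
#include <string.h>

/* One implementation of this table is registered per filesystem type;
   the OS calls through it without knowing the concrete filesystem. */
struct fs_operations {
    int  (*open)(const char *path);
    long (*read)(int handle, void *buf, unsigned long len);
    int  (*close)(int handle);
};

/* Stub "driver" standing in for a concrete filesystem (ext2, NTFS, ...). */
static int myfs_open(const char *path) {
    printf("myfs: open %s\n", path);
    return 3;                                   /* pretend handle */
}
static long myfs_read(int handle, void *buf, unsigned long len) {
    const char data[] = "block data";           /* pretend disk contents */
    unsigned long n = len < sizeof data ? len : sizeof data;
    memcpy(buf, data, n);
    return (long)n;
}
static int myfs_close(int handle) {
    printf("myfs: close handle %d\n", handle);
    return 0;
}

static const struct fs_operations myfs_ops = {
    myfs_open, myfs_read, myfs_close
};

/* The "kernel" side: a generic read that dispatches through the table. */
static long generic_read(const struct fs_operations *ops) {
    char buf[32];
    int fd = ops->open("/mnt/demo/file");
    long n = ops->read(fd, buf, sizeof buf);
    ops->close(fd);
    return n;
}

int main(void) {
    return generic_read(&myfs_ops) > 0 ? 0 : 1;
}
```

Porting a filesystem between APIs, as described under interoperability below, then largely amounts to wrapping one such table in another.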
Mixed kernel-driver-based API
In this API all filesystems are in the kernel, as in kernel-based APIs, but they are automatically trapped by another, driver-based API provided by the OS. This scheme was used in Windows 3.1 for providing a FAT filesystem driver in 32-bit protected mode, and cached (VFAT), that bypassed the DOS FAT driver in the kernel (MSDOS.SYS) completely, and later in the Windows 9x series (95, 98 and Me) for VFAT, the ISO 9660 filesystem driver (along with Joliet), network shares, and third-party filesystem drivers. It also added the LFN API to the original DOS APIs: IFS drivers can not only intercept the already existing DOS file APIs but also add new ones from within the 32-bit protected-mode executable. However, that API was never completely documented, and third parties found themselves in a "make-it-by-yourself" scenario even worse than with kernel-based APIs.

User space API
The API is in user space when the filesystem does not directly use kernel facilities but accesses disks using high-level operating system functions and provides functions in a library that a series of utilities use to access the filesystem. This is useful for handling disk images. The advantage is that a filesystem can be made portable between operating systems, as the high-level operating system functions it uses can be as common as ANSI C, but the disadvantage is that the API is unique to each application that implements one. Examples of this scheme are hfsutils and adflib.

Interoperability between file system APIs
As all filesystems (at least the disk-based ones) need equivalent functions provided by the kernel, it is possible to easily port filesystem code from one API to another, even if they are of different types. For example, the ext2 driver for OS/2 is simply a wrapper from Linux's VFS to OS/2's IFS around Linux's kernel-based ext2 code, and the HFS driver for OS/2 is a port of hfsutils to OS/2's IFS. There also exists a project that uses a Windows NT IFS driver to make NTFS work under Linux.

See also
Comparison of file systems
File system
Filename extension
Filing Open Service Interface Definition (OSID)
Installable File System (IFS)
List of file systems
Virtual file system

References

Sources
Windows NT File System Internals: A Developer's Guide, by Rajeev Nagar. O'Reilly.
Inside Windows NT File System, by Helen Custer. Microsoft Press.
UNIX Filesystems: Evolution, Design, and Implementation, by Steve D. Pate. Wiley.
Inside Windows NT, by Helen Custer. Microsoft Press.

External links
Filesystem Specifications and Technical Whitepapers
A Tour of the Linux VFS
Microsoft's IFSKit
hfsutils
adflib
A FileSystem Abstraction System for Go

Application programming interfaces
Computer file systems
4913435
https://en.wikipedia.org/wiki/Mark%20Sanchez
Mark Sanchez
Mark Travis John Sanchez (born November 11, 1986) is a former American football quarterback who played in the National Football League (NFL) for 10 seasons. He played college football at the University of Southern California (USC) and was drafted by the New York Jets in the first round (fifth overall) of the 2009 NFL Draft. He is currently a color analyst for NFL coverage on Fox and Fox Sports 1.

A backup quarterback during his first three years at USC, Sanchez rose to prominence in 2007 due to injuries suffered by starting quarterback John David Booty; he also became popular within the community due to his Mexican-American heritage. He was named the starter in 2008, led USC to a 12–1 record, and won the Rose Bowl against Penn State. Although USC coach Pete Carroll and many scouts considered him too inexperienced, Sanchez entered the 2009 NFL Draft and was selected by the Jets in the first round. Despite a subpar first season, Sanchez led the Jets to the AFC Championship Game, a losing effort to the Indianapolis Colts, becoming the fourth rookie quarterback in NFL history to win his first playoff game and the second to win two playoff games. In his second season, Sanchez again led the Jets to the AFC Championship Game, losing to the Pittsburgh Steelers; he joined Ben Roethlisberger as the only two quarterbacks in NFL history to reach the conference championship in their first two seasons in the league. The next two seasons were a regression for both the team and Sanchez as they failed to reach the playoffs, and he was eventually replaced towards the end of the 2012 season by Greg McElroy. Sanchez suffered a season-ending shoulder injury during the preseason in 2013; he was released after the season concluded and was subsequently signed by the Philadelphia Eagles. When Eagles starter Nick Foles went down with an injury, Sanchez started the second half of the season and set career highs in completion percentage and passer rating. Nevertheless, Sanchez was unable to reestablish himself as a starter and spent one season each as a backup for the Dallas Cowboys, Chicago Bears, and Washington Redskins before retiring after the 2018 season.

Early life
Mark Sanchez was born in Long Beach, California, to Nick Sr. and Olga Sanchez. When Mark was four, his parents divorced; Mark and his brothers, Nick Jr. and Brandon, stayed with their father, but their mother remained involved in their upbringing. Mark initially lived in Whittier and Pico Rivera; when he was six, his father moved with the children to Rancho Santa Margarita, a predominantly white city in Orange County. Mark's father remarried and raised the boys strictly, seeking to influence them to become leaders. Throughout Mark's childhood and teenage years, his father had him combine athletic and mental training: Mark would have to dribble a basketball without looking at it while reciting multiplication tables, or practice baseball swings in a batting cage while answering questions about the periodic table, combined drills that his father hoped would develop the quick thinking and self-confidence to guide Mark in all areas of life, not simply sports. By the time Sanchez entered the eighth grade, he had developed an interest in football but was unsure of what position to play. His father consulted coaches Bill Cunerty, who formerly coached at Saddleback College, and Bob Johnson, the head coach at Mission Viejo High School.
Both coaches stated Mark could be a quarterback if he applied himself and was open to learning the intricacies of the position. Nick Sr. trained Mark during sessions in their backyard or at the park. Mark, who was attending Santa Margarita High School, joined the football team. On his first pass attempt as a sophomore, Mark threw a 55-yard touchdown. Prior to his junior year of high school, Sanchez transferred to Mission Viejo, where Johnson, recognized as a "quarterback guru" for having trained professionals like Carson Palmer, was head coach. Under Johnson's tutelage, Mark felt he would have a better opportunity to develop as a player. Johnson tutored Mark on the complexities of the position, and in two seasons with the team Mark led the Diablos to a 27–1 record, culminating with the California Interscholastic Federation Division II championship in 2004. Sanchez was named football player of the year by several major college recruiting services and was considered the top quarterback in the nation upon the conclusion of his high school football career in 2005. In July 2004, Sanchez announced his commitment to the University of Southern California.

College career

2005 season
Having been named the nation's top quarterback coming out of high school, Sanchez was well regarded upon his arrival at USC. With upperclassmen Matt Leinart and John David Booty returning, Sanchez did not play during his freshman year in 2005, opting to redshirt to preserve a year of eligibility. During this time, he participated as the quarterback of USC's scout team, earning the Trojans' Service Team Offensive Player of the Year Award.

2006 season
In April 2006, Sanchez was arrested after a female USC student accused him of sexual assault. He was released from jail the following day and suspended. On June 3, 2006, the Los Angeles County District Attorney's office announced no charges would be filed, and Sanchez was reinstated, though he was disciplined by the football team for underage drinking and using false identification on the night he was arrested. At the outset of the 2006 season, Sanchez competed for the starting quarterback position; when Booty, a junior, suffered severe back spasms caused by a pre-existing back condition that required surgery, Sanchez was promoted to run the first-team offense during the spring as Booty recovered. Coaches stated Booty would be considered the starting quarterback when he returned for fall practice. During the 2006 season, Sanchez saw limited playing time in games against Arkansas, Stanford and Oregon. Through those three games, Sanchez completed 3 of his 7 pass attempts for 63 yards and one interception. He also saw additional action against Arizona, Michigan, and Notre Dame, but he did not attempt a pass in those contests.

2007 season
In fall practice before USC's 2007 season, Sanchez broke his right thumb, missing the first game against Idaho; he returned the following week and served as the primary backup to Booty. Sanchez earned limited playing time in wins against Nebraska and Washington State. Sanchez was named the starting quarterback by head coach Pete Carroll against Arizona after Booty suffered a broken finger during a 24–23 loss to Stanford. On October 13, Sanchez led USC to a 20–13 victory, overcoming a wavering performance during the first half of the game in which he threw two interceptions as Arizona went on to tie the game heading into halftime.
During the second half, Sanchez was more proficient passing the ball and finished the game having completed 19 of his 31 passes for 130 yards with one touchdown and two interceptions. With Booty still recovering, USC elected to start Sanchez for a second consecutive week against Notre Dame; he made significant improvements, completing 21 of his 38 passes for 235 yards and four touchdowns in a 38–0 victory. On October 27, Sanchez started for the final time in place of the injured Booty, an away game against Oregon at Autzen Stadium. USC lost, 24–17; Sanchez had two passes intercepted by Oregon safety Matthew Harper in the second half. The first interception led to a fourth-quarter touchdown that gave Oregon a 14-point lead; the second ended USC's final chance for a comeback. In spite of the myriad mistakes committed by his teammates in addition to his own, Sanchez publicly accepted blame for the loss. The following week, against Oregon State, Booty returned as USC's starting quarterback, with Sanchez resuming his position as Booty's backup. Sanchez did not play in subsequent games and finished the season with 695 yards and seven touchdowns with five interceptions.

2008 season
Sanchez entered spring practice ahead of the 2008 season as the front-runner to take over the starting quarterback position, but faced strong competition from redshirt freshman Aaron Corp and Mitch Mustain, a transfer from Arkansas, where he had been the starting quarterback; Mustain, like Sanchez, had been named the top quarterback in the nation upon the conclusion of his high school career, in 2006. By the end of spring practice, Carroll announced Sanchez would be the starting quarterback heading into the fall. During the first week of fall camp, Sanchez dislocated his left kneecap during warm-ups prior to practice; trainers immediately put the kneecap back into place. After missing nearly three weeks, Sanchez was cleared to play in the opener against Virginia. Before the opener, Sanchez was contacted by USC's previous three quarterbacks—Carson Palmer, Leinart and Booty—who wished him well and offered general advice. In the opener at Virginia, Sanchez threw for a career-best 338 yards, completing 26 of his 35 passes for three touchdowns and one interception. The Davey O'Brien Foundation named him the O'Brien Quarterback of the Week, and his performance garnered early Heisman discussion. The Trojans suffered a stunning 27–21 loss against Oregon State on September 25. By season's end, the Trojans' lone loss was enough to remove them from contention for the BCS National Title, and instead they played in the Rose Bowl against Penn State. The Trojans defeated the Nittany Lions 38–24. Sanchez won the 2009 Rose Bowl Most Valuable Player award for his performance on offense; his 413 passing yards ranked second in the history of the Rose Bowl and fourth in Trojan history. With Sanchez starting all thirteen games, the Trojans ended the season 12–1, ranked number two in the Coaches' Poll and number three in the AP Poll. Sanchez finished the season with 3,207 passing yards, 34 touchdowns (the second most in Trojan history, behind Leinart), and 10 interceptions. Upon the conclusion of the Rose Bowl, Sanchez stated it would be "hard to say goodbye to [USC]. I don't think I can do it."
However, with the subsequent announcement that other NFL-caliber quarterbacks, such as Sam Bradford, Tim Tebow, and Colt McCoy, had decided to stay in school, rumors arose that Sanchez would use the opportunity to enter the 2009 NFL Draft. On January 15, Sanchez announced his plans to forgo his final year of college eligibility and enter the 2009 NFL Draft, although he continued as a USC student and completed work on his degree in the spring of 2009 while preparing for the draft. Sanchez became the first USC quarterback to leave school early for the NFL since Todd Marinovich did so after the 1990 season. During the press conference, head coach Pete Carroll made it clear that he did not agree with Sanchez's decision and advised him of the low success rate of quarterbacks who left college early. Despite the public disagreement, the two remained close afterward.

Statistics

Professional career

Pre-draft
Sanchez hired his older brother, business litigator Nick Sanchez, to be his agent alongside David Dunn, who represented Carson Palmer. Sanchez received an invitation to the 2009 Scouting Combine, where his performance was well received. He was ranked as one of the top two quarterbacks, behind fellow junior Matthew Stafford of the University of Georgia. In the final days leading up to the draft, several NFL teams expressed serious interest in Sanchez, including the Seattle Seahawks (fourth overall selection), Cleveland Browns (fifth overall selection), Washington Redskins (13th overall selection), and New York Jets (17th overall selection).

New York Jets
The New York Jets drafted Sanchez in the first round with the fifth overall selection in the 2009 NFL Draft, making him the first quarterback selected by the Jets in the first round since Chad Pennington went 18th overall in the 2000 NFL Draft. To select Sanchez, the Jets traded their first and second round selections and three players, Kenyon Coleman, Abram Elam and Brett Ratliff, to the Cleveland Browns. At the time, the selection was lauded as good value for the team and for Sanchez. Sanchez reached an agreement with the team on June 10, 2009, signing a five-year, $50 million contract with $28 million guaranteed, the largest contract the Jets had ever signed a player to in franchise history.

2009 season
Heading into his rookie training camp, Sanchez was listed as the second quarterback behind veteran Kellen Clemens. Jets head coach Rex Ryan viewed the camp as an opportunity for both quarterbacks to compete to determine the eventual starter for the 2009 season. On August 26, 2009, Sanchez was named the starter, becoming the first rookie quarterback to start the season for the team since Dick Jamieson in 1960. Sanchez started his first regular season NFL game against the Houston Texans on September 13, 2009, throwing his first touchdown pass, a 30-yard strike to wide receiver Chansi Stuckey. Sanchez and the Jets won the game 24–7, with Sanchez throwing for 272 yards, a touchdown, and an interception. He was named the Pepsi Rookie of the Week for his performance, the first of three consecutive Rookie of the Week awards. He played his first home game a week later versus the New England Patriots, a 16–9 victory; it was also his first AFC East division game and his first rivalry game. It was the Jets' first home victory over New England since 2000.
With a 24–17 victory over the Tennessee Titans in Week 3, Sanchez became the first rookie quarterback in NFL history to start and win his first three games of an NFL season. However, his performance began to regress: he had a pass intercepted for a 99-yard touchdown return and fumbled another attempted pass in the end zone for another touchdown as the Jets fell to the New Orleans Saints in Week 4. These two plays were enough to spoil an otherwise strong outing from the Jets' defensive unit as the team dropped to a 3–1 record. Following the loss to New Orleans, Sanchez received criticism in a 16–13 overtime loss to the Buffalo Bills in Week 6 when he threw five interceptions against a lowly Bills defense that previously had only four interceptions all season long. The Jets ended their losing streak in a 38–0 victory against the Oakland Raiders in Week 7; however, Sanchez was criticized after he was seen eating a hot dog on the Jets' bench in the fourth quarter. In the team's second meeting against the Bills on December 3, 2009, Sanchez suffered a sprained posterior cruciate ligament in the third quarter, prompting the veteran Clemens to take his place. Though there were no setbacks to the injury, head coach Ryan benched Sanchez the following game against the Tampa Bay Buccaneers for precautionary reasons, much to Sanchez's dismay. At 7–7, the Jets had a chance to secure a playoff berth if they won the remainder of their games. One such game was against the Indianapolis Colts, who had won 23 consecutive regular season games. Sanchez and the Jets engineered a comeback win following Colts head coach Jim Caldwell's controversial decision to rest the team's starters in the third quarter with a five-point lead. The following week, on January 3, 2010, Sanchez led the team into the playoffs despite a subpar effort, completing 8 of 16 passes for 63 yards, en route to a 37–0 victory over the Cincinnati Bengals, who rested their starters as the team had already clinched the AFC North division title and a playoff berth. The manner of the two wins, which gave the Jets their first playoff berth since 2006, caused many to claim the team had "backed into the playoffs". Sanchez completed his rookie season with 2,444 yards, 12 touchdowns, and 20 interceptions.

2009 postseason
In the Wild Card Round on January 9, 2010 at Paul Brown Stadium, Sanchez led the Jets to another victory over the Bengals, 24–14, completing 12 of his 15 passes for 182 yards and a touchdown with a passer rating of 139.4. Sanchez became the fourth rookie quarterback in NFL history to win his first postseason contest, and the second to do so on the road; the others were Shaun King (1999 Bucs), Ben Roethlisberger (2004 Steelers), and Joe Flacco (2008 Ravens). On January 17, 2010, Sanchez, with the help of running back and fellow rookie Shonn Greene, defeated the heavily favored San Diego Chargers 17–14 to reach the Jets' third AFC Championship appearance in franchise history. Sanchez became only the second rookie quarterback to win two consecutive playoff games, after Joe Flacco. In a rematch of their regular season meeting with the Colts, Sanchez performed well in the first half; however, the offense succumbed to the Colts' defense in the second half, and the Jets gave up an 11-point lead, losing by a score of 30–17.
Sanchez was named to the Sporting News All-Rookie team for his performance during the season, becoming the first Jets quarterback to receive the honor.

2010 season
On February 17, 2010, Sanchez underwent successful surgery to repair the patella ligament in his left knee that he had originally injured while playing for USC. Sanchez was expected to miss early workouts and return in time for training camp; however, he made a quick recovery and participated in team drills during Organized Team Activities (OTAs). The Jets opened the 2010 season with a 5–1 record despite a subpar passing game, as Sanchez struggled to throw the football accurately. Sanchez recorded his first career 300-yard passing game in a win over the Detroit Lions on November 7, 2010. At 10–4, the Jets faced the Chicago Bears on December 26, 2010 with a chance to clinch a playoff berth. Though Sanchez had injured his shoulder in a victory over Pittsburgh the previous week, he started the game, completing 24 of his 37 passes with a touchdown and an interception. However, the Jets were unable to contain the Bears' offense and lost the game 38–34 after a comeback drive was halted when Sanchez was intercepted. Due to a loss by the Jacksonville Jaguars that same day, the Jets nevertheless clinched the playoff berth. Sanchez finished the season with 3,291 yards, 17 touchdowns, and 13 interceptions.

2010 postseason
The Jets finished the season with an 11–5 record and entered the wild card round facing the Indianapolis Colts in a rematch of their previous encounter in the AFC Championship. Although Sanchez had a subpar performance, completing 18 of his 31 passes while throwing an interception, he led the team on a comeback drive in the final minutes of the game, culminating in kicker Nick Folk's game-winning field goal as time expired. The Jets went on to face the New England Patriots in the divisional round and upset the heavily favored Patriots, 28–21, as Sanchez completed 16 of his 25 passes for 194 yards and three touchdowns. With the win, Sanchez tied Len Dawson, Roger Staubach, Jake Delhomme, and Joe Flacco for the second most postseason road victories by a quarterback in NFL history. The team traveled to the AFC Championship for a second consecutive season to face the Pittsburgh Steelers on January 23, 2011. Trailing 24–3 at halftime, the team engineered a comeback following a heartfelt speech by Sanchez at the break. Although the Jets' defense did not allow Pittsburgh to score in the second half, the team fell short as their final offensive drive was stymied by the Steelers' defense, and the Jets lost 24–19.

2011 season
Prior to the outset of the 2011 season, head coach Rex Ryan named Sanchez a team captain. The Jets opened the season with a 2–3 record, leading to discontent within the clubhouse. The team had begun to stray from its philosophy of consistently running the ball and began to pass more often; however, the offense struggled with this adjustment. Wide receivers Plaxico Burress, Santonio Holmes, and Derrick Mason approached coach Ryan to question offensive coordinator Brian Schottenheimer's system, as Holmes and Mason averaged only three catches per game and Burress only 2.5 catches through four games. Additionally, Sanchez drew criticism for his difficulty throwing the ball effectively to his receivers.
The struggles culminated with Holmes getting into a heated argument with another teammate in the huddle during the final regular season game against the Miami Dolphins. Holmes was benched in the fourth quarter while Sanchez threw three interceptions in the Jets' loss, which eliminated the team from playoff contention for the first time in Sanchez's career. Statistically, Sanchez's numbers through his third year were similar to Eli Manning's at the same point in his career; however, there were concerns that Sanchez was simply an ineffective quarterback and therefore expendable. During the offseason, Sanchez was criticized by anonymous teammates for his poor work ethic and his inability to improve; these claims were publicly refuted by other teammates. Despite the questions surrounding his job security, the Jets agreed to a three-year contract extension with Sanchez on March 9, 2012, shortly before New York acquired Tim Tebow. The contract included $20 million in guarantees.

2012 season
The Jets opened their 2012 season against the Buffalo Bills with Sanchez completing 19 of his 27 passes for 266 yards, 3 touchdowns and an interception in a 48–28 rout of the Bills. In the subsequent four games, Sanchez became the first quarterback since Stoney Case in 1999 to complete under 50% of his passes in four straight contests as the Jets fell to a 2–3 record. This led to fierce criticism from the media and fans and prompted calls for Sanchez to be benched in favor of Tim Tebow. Sanchez snapped this streak against the Indianapolis Colts on October 14, 2012, completing 11 of his 18 passes for 82 yards and 2 touchdowns in a 35–9 victory. The following week, the Jets faced their division rivals, the Patriots. Despite a second-half New England lead, the game was tied at the end of regulation, forcing overtime. Following a Patriots field goal, Sanchez had the ball knocked out of his hand by linebacker Rob Ninkovich, who recovered the fumble and sealed the Patriots' victory. This was Sanchez's best overall performance to that point in the season, as he completed a career-best 68% of his passes in a game with 40 or more attempts. Sanchez's struggles continued in the following two games against Miami and Seattle; in Seattle, Sanchez completed 9 of 22 passes for 124 yards while throwing his fourth red zone interception of the year, and it was the fifth game of the season in which he completed under 50% of his pass attempts. The Jets snapped their losing streak in a 27–13 victory over the St. Louis Rams in which Sanchez completed 15 of his 20 passes for 178 yards and a touchdown. However, his struggles continued in a rematch against New England on Thanksgiving night. Despite completing 26 of 36 pass attempts for 301 yards with a touchdown and an interception, the Jets lost to the Patriots 49–19 and fell to 4–7 as Sanchez turned the ball over twice, with each turnover leading to a Patriots touchdown. The Jets surrendered 21 points within a 53-second span on 3 turnovers. The play that earned Sanchez the most criticism was a second-quarter fumble in which he ran into the backside of lineman Brandon Moore and the Patriots scored on the resulting fumble; the play, which became widely known as the "butt fumble", was mocked in the media. Sanchez started the next week but was benched in the third quarter of the Jets' contest against the Arizona Cardinals on December 2, 2012 in favor of third-string backup Greg McElroy. Prior to being benched by Rex Ryan, Sanchez had thrown three interceptions.
McElroy threw a touchdown to tight end Jeff Cumberland for the team's only points in a 7–6 victory over Arizona. Ryan reinstated Sanchez as the starting quarterback the following Wednesday after seeking out multiple opinions within the organization. Sanchez returned to complete 12 of his 19 passes for 111 yards against the Jacksonville Jaguars, fumbling once, which led to a Jaguars field goal. The Jets won 17–10. In a must-win game against the Tennessee Titans to remain in playoff contention, Sanchez struggled; he completed 13 of his 28 passes for 131 yards while throwing four interceptions and fumbling the ball in Titans territory in the closing minutes of the Jets' 14–10 defeat. A day later, Ryan named McElroy the starter. Sanchez started the Jets' final game against the Buffalo Bills after McElroy revealed he had been experiencing concussion symptoms in the preceding days. The Jets were defeated 28–9, with Sanchez completing 17 of his 35 passes for 205 yards with two turnovers.

2013 season
Sanchez suffered a shoulder injury on August 24, 2013 in the Jets' third preseason game, against the New York Giants, after being tackled by nose tackle Marvin Austin. The Jets, who had drafted rookie quarterback Geno Smith in the 2013 NFL Draft, named Smith the starter on September 4 with Sanchez still rehabilitating his injury. On September 14, 2013, Sanchez was placed on injured reserve with a designation to return. After he underwent shoulder surgery on October 8, 2013, it was announced he would miss the rest of the season. After much speculation regarding his future in New York, the Jets released Sanchez on March 21, 2014, the same day they signed Michael Vick, the former Atlanta Falcons and Philadelphia Eagles quarterback.

Philadelphia Eagles

2014 season
Sanchez signed a one-year, $2.25 million contract with the Philadelphia Eagles on March 29, 2014. After spending seven full games as Nick Foles's backup, Sanchez filled in for an injured Foles in a Week 9 game against the Houston Texans. Throwing for 202 yards, 2 touchdowns, and 2 interceptions, Sanchez led the Eagles to a 31–21 win. After the game, Eagles head coach Chip Kelly praised Sanchez, saying, "He's a hell of a quarterback and we're excited that we got him." Foles was later confirmed to be out for 6 to 8 weeks with a broken collarbone, meaning Sanchez would take over as the team's quarterback. On November 10, 2014, Sanchez started his first game for the Eagles and led them in a 45–21 rout of the visiting Carolina Panthers on Monday Night Football. Even though it was his first start at quarterback since December 2012, Sanchez notched two touchdowns and threw for 332 yards, the fourth-highest total of his NFL career. The victory also marked the first time he had ever thrown for more than 265 yards without an interception. Sanchez followed this with a 53–20 loss against Green Bay, in which he threw for 346 yards and 2 touchdowns but was also intercepted twice and lost 2 fumbles. Sanchez came back with 2 consecutive wins: a 43–24 rout of the Titans, in which he threw for 300 yards for a third consecutive start, and a 33–10 win against the Cowboys on Thanksgiving, in which he logged 207 yards and a touchdown as well as 28 rushing yards and a score. With the Eagles at a 9–3 record and his record as a starter at 3–1, the team looked poised to win the NFC East, but after 3 consecutive losses and playoff elimination, he finished the 2014 season with a 4–4 record as the Eagles' starting quarterback.
In 9 games and 8 starts, he threw for 2,418 yards, 14 touchdowns, and 11 interceptions with a 64.0 completion percentage, while rushing for 87 yards and 1 touchdown, and posted a career-high 88.4 passer rating.

2015 season
The Eagles re-signed Sanchez to a two-year, $16 million contract on March 8, 2015, but despite his play in the 2014 season, he remained the backup quarterback as Nick Foles was traded for Sam Bradford. Over his first several games, Bradford threw more interceptions than touchdowns and often had a low completion percentage, but head coach Chip Kelly refused to bench Bradford in favor of Sanchez. Ironically, Sanchez would get his first playing time when Bradford was playing his best football of the season. On November 15, 2015, Sanchez came into a Week 10 game against the Miami Dolphins in relief of injured starter Sam Bradford with the Eagles trailing 20–16 late in the 3rd quarter. Sanchez finished 14 of 23 for 156 yards and an interception. The interception was costly, as it came in the end zone with the Eagles in field goal range trailing 20–19, which proved to be the final score of the game. Sanchez was announced as the starter for the Week 11 game against the Tampa Bay Buccaneers after it was revealed that Bradford had suffered a separated shoulder and a concussion. Sanchez's opening drive was excellent; he completed all of his passes and capped the drive with a 39-yard touchdown pass to Josh Huff to make the score 7–0. Sanchez continued to play well for the majority of the 1st half and threw a second touchdown pass, to Darren Sproles, but the Buccaneers' offense had already scored 21 points. After that drive, however, Sanchez started to crumble, throwing an interception near the end of the second quarter. Sanchez finished the game with 3 interceptions, 1 of which was returned for a touchdown, in a humiliating 45–17 blowout. His 2 touchdown passes accounted for the only points scored by the offense. Sanchez once again put up the offense's only points in another humiliating blowout, this time a 45–14 loss to the Detroit Lions. Sanchez played relatively well, avoiding a 3-and-out on the first drive with a 5-yard scramble and managing to tie the game 7–7 with a touchdown pass to Brent Celek, but a defensive meltdown and a lack of offensive momentum left the Eagles hopeless, and Sanchez fell to 0–2 as a starter on the season. He finished 19 of 27 for 199 yards with 2 touchdowns and no interceptions, which translated into a passer rating of 116.1. Despite the above-average passer rating, Sanchez went back to the bench the next week due to his winless record as a starter, and he did not throw another pass for the rest of the season.

Denver Broncos
On March 11, 2016, Sanchez was traded to the Denver Broncos for a 2017 conditional draft pick. Prior to his arrival, Peyton Manning had retired and Brock Osweiler had signed with the Houston Texans, opening up a competition for the Broncos' starting quarterback job between Sanchez, second-year player Trevor Siemian, and rookie Paxton Lynch. On August 29, Broncos head coach Gary Kubiak named Siemian the starting quarterback for the 2016 season. On September 3, 2016, Sanchez was released by the team.
Dallas Cowboys
On September 3, 2016, hours after the Broncos released him, Sanchez signed a one-year, $2 million contract with the Dallas Cowboys, who were looking for a veteran presence behind rookie Dak Prescott, who had been named the starter at quarterback while Tony Romo recovered from a vertebral compression fracture suffered during the first quarter of the Cowboys' Week 3 preseason game against the Seattle Seahawks. In Week 17, with the Cowboys having already clinched the NFC's #1 seed, Sanchez split time with starter Prescott and backup Romo. Prescott played the first two series of the game, Romo played the third, and Sanchez played the rest of the game. Sanchez completed 9 of 17 passes for 85 yards with two interceptions as the Cowboys lost to the Philadelphia Eagles by a score of 27–13.

Chicago Bears
On March 23, 2017, Sanchez signed a one-year contract with the Chicago Bears. The Bears had noted the "positive influence" Sanchez had as a mentor to rookie quarterback Dak Prescott and looked for him to play a similar role in the development of rookie Mitchell Trubisky. On May 30, 2017, Sanchez suffered a minor injury to his left knee and missed the remainder of OTAs and the minicamp. In response, the Bears rescinded their waiver request on Connor Shaw. Sanchez made the Bears' final roster third on the quarterback depth chart behind Trubisky and Mike Glennon, but did not play at all in 2017. On April 13, 2018, Sanchez, while still a free agent, was suspended for four games for violating the NFL's performance-enhancing drug policy. Sanchez tested positive for performance-enhancing drugs and cited "unknowing supplement contamination" in his statement to the media following the suspension.

Washington Redskins
On November 19, 2018, Sanchez signed with the Washington Redskins to serve as the backup to Colt McCoy after starter Alex Smith suffered a season-ending leg injury. Sanchez made his first appearance with the Redskins in relief of an injured McCoy in a 28–13 loss to his former team, the Philadelphia Eagles. He completed 13 of 21 attempts for 100 yards and an interception. Sanchez became the starter after McCoy fractured his fibula in the game. In Week 14, Sanchez was benched at halftime of a 40–16 loss to the New York Giants in favor of Josh Johnson, after throwing two interceptions and for only 38 yards in the first half. The next day, the Redskins named Johnson their starter for the Week 15 game against the Jacksonville Jaguars.

Retirement
Sanchez announced his retirement on July 23, 2019, and subsequently took a position with ESPN's college football coverage.

NFL career statistics

Regular season

Postseason

Television and film career
On October 15, 2020, Sanchez was revealed to have participated in season four of The Masked Singer, performing as the "Baby Alien", the show's first costume to incorporate a puppet. As Nick Cannon had a hard time figuring out how to remove Baby Alien's mask upon elimination, he received help from the Men in Black to do the job.

Player profile
Early in his career, Sanchez was praised for his ability to maintain composure in the pocket amidst defensive pressure and focus on finding an open receiver to extend the team's offensive series. In his first four years, Sanchez had ten fourth-quarter comebacks and twelve game-winning drives. These characteristics were highlighted by Bill Parcells and Sam Wyche and garnered comparisons to Ben Roethlisberger.
Sanchez was also noted for his proficiency in short passing situations and his competitive nature; in December 2010, following dismal performances, Rex Ryan threatened to reduce Sanchez's repetitions with the first-team offense during practice, which infuriated Sanchez. In December 2012, following a series of poor performances that eventually led to his benching, a panel of former NFL quarterbacks was questioned about Sanchez's attributes. It was unanimously agreed that his arm strength was "good enough" to succeed in the league and that he could be effective while mobile in the pocket. His regression was mainly attributed to his poor accuracy, a byproduct of an indecisive mentality once the ball was snapped, and to the fact that the Jets did little to surround Sanchez with the talent to overcome his shortcomings. Ron Jaworski commented that Sanchez had lost his confidence, which contributed to his decline. While Sanchez embraced a playboy lifestyle, drawing comparisons to former Jet Joe Namath, he was praised by Brian Schottenheimer for his ability to work with various personalities and build relationships with teammates. After undergoing knee surgery following his rookie season, Sanchez established "Jets West" in 2010, an annual off-season camp located in his home state of California, where for one week he hosted workouts and offered classroom review sessions for his skill-position teammates on offense. During the NFL labor dispute, Sanchez managed to organize private workouts with over forty of his teammates.

Personal life
Sanchez is an avid fan of musical theatre. He was a presenter at the 2010 Tony Awards, where he introduced a number from the Broadway musical Memphis. Sanchez has been involved in multiple charities, including the Juvenile Diabetes Research Foundation, to help raise awareness of Type 1 diabetes, and Sam's Club's Giving Made Simple, which helps raise awareness about childhood obesity and how families can prevent it. Sanchez has also worked with the Teddy Atlas Foundation, through which he met Aiden Binkley, a terminally ill 11-year-old struck with rhabdomyosarcoma. Sanchez developed a bond with Binkley, and the two remained close friends until Binkley's death in December 2010. Sanchez's father is a fire captain for the Orange County Fire Authority and a member of the national urban search and rescue team. In college, Nick Sr. played quarterback for East Los Angeles College and was later a sergeant in the United States Army. Mark's two older brothers both played college football: Nick Jr. attended Yale University, where he played quarterback, while Brandon attended DePauw University, where he played on the offensive line. Nick Jr. went on to attend the USC Law School and is a business attorney; Brandon became a mortgage broker. Sanchez dated model Hilary Rhoda, now the wife of Sean Avery, for several years; the two appeared in a GQ photo shoot. In June 2016, a lawsuit filed by the U.S. Securities and Exchange Commission alleged that Sanchez's broker had conducted a "Ponzi-like scheme" which defrauded Sanchez, as well as Roy Oswalt and Jake Peavy, out of $30 million.

Mexican-American identity
When Sanchez rose to prominence at USC, he found himself a symbol of Mexican-American identity and a role model for younger generations. Sanchez was placed on center stage in Los Angeles, home to more than 4.6 million Hispanics, the majority of whom are of Mexican descent.
While there had been previous successful Mexican-American quarterbacks, such as Tom Flores, Jim Plunkett, Joe Kapp, Jeff Garcia, Tony Romo, and Marc Bulger, Sanchez, unlike most of his predecessors, is a third-generation American of full Mexican descent, and none had been embraced to the extent he was. USC fans began playing up Sanchez's ethnicity by wearing items such as sarapes, lucha libre masks, and homemade "¡Viva Sanchez!" T-shirts. His rise to fame within the Mexican-American community was compared to that of boxer Oscar De La Hoya and baseball pitcher Fernando Valenzuela. While starting for an injured John David Booty in 2007, Sanchez wore a custom mouthguard that featured the colors of the Mexican flag in honor of his heritage. It became a prominent issue after a nationally televised game against Notre Dame. The mouthpiece became a symbol for two opposing viewpoints: for Mexican-Americans, it was a symbol of unity, Sanchez accepting his heritage; for critics, the gesture symbolized a radical political statement. Sanchez, who was born and raised in the United States, received hate mail urging him to return to Mexico. Sanchez responded to the controversy stating, "It's not a Mexican power thing or anything like that. It's just a little bit of pride in our heritage. Hopefully, it inspires somebody and it's all for the best." Overwhelmed by the attention and shying away from politics, Sanchez stopped wearing the mouthpiece but began participating in other efforts to help the Hispanic community. Sanchez, who knew how to speak some Spanish but was not fluent, began to take Spanish lessons during his junior year at USC so he could hold conversations without the use of a translator. The USC band played "El Matador" whenever Sanchez came onto the field. Sanchez participated in a fundraiser to help provide school supplies to first-graders in the city of Long Beach and the South Bay region, and helped Mayor Antonio Villaraigosa give holiday gifts to impoverished families. By the end of his USC career, he had been hailed as a role model for Hispanic youth. Sanchez serves as the Ambassador to the Inner-City Games Los Angeles, an after-school program that provides "at-risk youth" with positive, alternative activities. ESPN Radio came to an agreement with the Jets to broadcast all of the team's regular season games in 2011 on 710 ESPN Radio in Los Angeles, an agreement that came about due to Sanchez's continued popularity in California.
Career highlights Awards and honors NFL records First rookie quarterback to win his first three starts Third-most postseason road victories by an NFL quarterback: 4 (tied with Jake Delhomme, Len Dawson, and Roger Staubach) Most playoff victories by a rookie quarterback: 2 (tied with Joe Flacco) Most consecutive conference championship game appearances to begin a career: 2 (tied with Ben Roethlisberger) New York Jets franchise records Most career postseason victories by a quarterback: 4 Longest touchdown pass in a playoff game (2009): 80 yards Most game-winning drives in a single season (2010): 6 Most regular season wins by a starting quarterback in 16 starts (2010): 11 (tied with Ken O'Brien in 1985) Philadelphia Eagles franchise records Most pass completions, game: 37 (tied with Sam Bradford) References External links Denver Broncos biography Philadelphia Eagles biography New York Jets biography USC Trojans biography 1986 births Living people American football quarterbacks New York Jets players Philadelphia Eagles players Denver Broncos players Dallas Cowboys players Chicago Bears players Washington Redskins players Sportspeople from Long Beach, California Sportspeople from Mission Viejo, California Sportspeople from Orange County, California Players of American football from Long Beach, California USC Trojans football players American sportspeople of Mexican descent Sportspeople from Somerset County, New Jersey People from Rancho Santa Margarita, California National Football League announcers
26501820
https://en.wikipedia.org/wiki/Ron%20Stillwell
Ron Stillwell
Ronald Roy Stillwell (December 3, 1939 – January 25, 2016) was an American Major League Baseball player who played parts of two seasons for the Washington Senators. A shortstop, he batted and threw right-handed. Stillwell also helped build the baseball programs at Thousand Oaks High School, Moorpark College and California Lutheran University. As a player, he played shortstop at the University of Southern California (USC) and captained its 1961 national championship team. A week after graduating, he signed a contract with Major League Baseball's Washington Senators. His MLB career was limited to fourteen games. Born in Los Angeles, Stillwell attended John Burroughs High School in Burbank, California and the University of Southern California, where he co-captained the national champion 1961 USC Trojans varsity baseball team. He was signed by the Senators as an amateur free agent during the 1961 season, the inaugural season of that incarnation of the Senators, and made his big league debut on July 3 against the Boston Red Sox at Griffith Stadium. Starting at shortstop in back-to-back games, both Washington victories, he collected one hit in eight total at bats, a double off Don Schwall. That was Stillwell's only MLB extra-base hit in 38 at bats and 42 plate appearances. He notched three runs batted in. Stillwell retired after five professional seasons in 1965. He became a teacher, and was a baseball coach at Thousand Oaks High School, California Lutheran University and Moorpark College. He died of cancer on January 25, 2016. His son, Kurt, had a nine-season MLB career. Early life Stillwell attended John Burroughs High School in Burbank, where he was student body president and played both basketball and baseball. He graduated in 1957. His high school career included being selected to the All-Foothill League team as a shortstop in baseball and as a guard in basketball. Playing career He graduated with a BA from the University of Southern California (USC) in 1961, the same university where he received his MA in 1967. During his time as a Trojan at USC, he was awarded the Wills Hunter Award for the best GPA among USC athletes. He was an all-conference selection at shortstop and captained the 1961 NCAA Championship team. A week after graduation, at age 21, he signed a contract with the Washington Senators. Stillwell played parts of two seasons with the Washington Senators. His career was cut short by a collision during a game. Coaching career Thousand Oaks High School He began a 25-year coaching career in the Conejo Valley School District in 1965, coaching varsity baseball, freshman basketball and cross country at Thousand Oaks High School. He had been hired in 1964, when the school was still part of the Oxnard Union High School District. He taught at Thousand Oaks High for 33 years, coaching for 25, and taught physical education there for over twenty-five years. California Lutheran University While still a teacher at Thousand Oaks High School, Stillwell was hired by Robert Shoup and became the head baseball coach at California Lutheran University in 1972, where he remained until 1978. He had a record of 139–100–1 (.581) at Cal Lutheran and was named the 1976 NAIA Coach of the Year. Moorpark College Stillwell was a walk-on coach at Moorpark College from 1985 to 1989, resigning as baseball coach in his fifth season in 1989. Personal life Ron and his wife Jan had three children: Scott, Rod and Kurt.
Kurt played nine seasons in Major League Baseball, his best coming in 1988, when he was selected to the American League All-Star Team. He had been the second overall pick in the 1983 amateur draft, selected by the Cincinnati Reds. Stillwell's younger son, Rod, named after USC baseball coach Rod Dedeaux, played college ball at Arkansas and advanced to the College World Series in 1989. He was drafted by the Kansas City Royals. References External links 1939 births 2016 deaths Baseball coaches from California Baseball players from Los Angeles Cal Lutheran Kingsmen baseball coaches Charlotte Hornets (baseball) players Deaths from cancer in California Denver Bears players Major League Baseball infielders Richmond Virginians (minor league) players Sportspeople from Ventura County, California Syracuse Chiefs players USC Trojans baseball players Washington Senators (1961–1971) players York White Roses players
21885162
https://en.wikipedia.org/wiki/LG%20CNS
LG CNS
LG CNS (Korean: 엘지 씨엔에스) is a subsidiary of LG Corporation founded in 1987 that provides information technology services including consulting, system integration, network integration, business process outsourcing, and information technology outsourcing. Originally, LG CNS focused only on computer engineering, such as designing, developing and operating computer network systems for LG Group. The firm later expanded its target customers from LG Group to other private organizations and governments. LG CNS also focuses on global markets, running worldwide development centers and overseas subsidiaries. Currently, LG CNS "is Korea's largest IT service provider and has implemented a number of large-scale public IT infrastructure projects and played a major role in the Korean government's e-Korea initiative." Globalization The former CEO of LG CNS, Shin Chae-chol, said "The Korean market constitutes a small percentage of the total global market. I think that unfettered expansion and cutthroat competition in the domestic medical market is meaningless. We plan to expand the scope of overseas projects centering on seven overseas branches in China, Southeast Asia, and the United States." In line with the CEO's expectation, LG CNS has expanded with overseas subsidiaries and development centers in China, India, the United States, the Netherlands, Indonesia, Brazil, Singapore and Japan. Overseas sales from LG CNS subsidiaries exceeded 200 billion won in 2007, and the company set a goal of 230 billion won in overseas sales for 2008. CSR (Corporate Social Responsibility) LG CNS has also carried out CSR activities since its establishment. The LG CNS IT Dream Project is an annual event designed to encourage students in welfare institutions by sparking their interest in IT. The company also regularly holds an in-house bazaar to help handicapped children and women. Furthermore, "from 1995 to 2008, LG CNS has helped 628 sight-impaired people to have eyesight recovery operations." Beyond these events, LG CNS has carried out many other CSR-related activities. See also Electronic Data Systems (EDS) - the company that participated in the STM joint venture, the root company of LG CNS List of Korean companies Notes and references External links Official website LG Corporation Information technology companies of South Korea Technology companies established in 1987 South Korean brands South Korean companies established in 1987 Companies based in Seoul
44614066
https://en.wikipedia.org/wiki/Interactive%20Brokers
Interactive Brokers
Interactive Brokers LLC (IB) is an American multinational brokerage firm. It operates the largest electronic trading platform in the U.S. by number of daily average revenue trades. The company brokers stocks, options, futures, EFPs, futures options, forex, bonds, and funds. The company is headquartered in Greenwich, Connecticut and has offices in four cities. It is the largest subsidiary of the brokerage group Interactive Brokers Group, Inc., which was founded by Chairman Thomas Peterffy, an early innovator in computer-assisted trading. IB is regulated by the U.S. Securities and Exchange Commission, the Financial Industry Regulatory Authority, the New York Stock Exchange, the Commodity Futures Trading Commission, National Futures Association, Chicago Mercantile Exchange and other self-regulatory organizations. The company is a provider of fully disclosed, omnibus, and non-disclosed broker accounts and provides correspondent clearing services to 200 introducing brokers worldwide. The company serves 607,000 client brokerage accounts, with US$128.4 billion in customer equity. Interactive Brokers Group owns 40 percent of the futures exchange OneChicago, and is an equity partner and founder of the Boston Options Exchange. The original organization was first created as a market maker in 1977 under the name T.P. & Co., and was renamed Timber Hill Inc. in 1982. It became the first to use fair value pricing sheets on an exchange trading floor in 1979, and the first to use handheld computers for trading, in 1983. In 1987, Peterffy also created the first fully automated algorithmic trading system, to automatically create and submit orders to a market. Between 1993 and 1994, the corporate group Interactive Brokers Group was created, and the subsidiary Interactive Brokers LLC was created to control its electronic brokerage, and to keep it separate from Timber Hill, which conducts market making. In 2014, Interactive Brokers became the first online broker to offer direct access to IEX, a private forum for trading securities. Currently about 16.6 percent of the company is publicly held, while the remainder is held by employees and their affiliates; Thomas Peterffy is the largest shareholder. History In 1977, Thomas Peterffy left his job designing commodity trading software for Mocatta Metals, and bought a seat on the American Stock Exchange (AMEX) as an individual market maker. The following year, he formed his first company, named T.P. & Co., to expand trading activities to several members under badge number 549. At the time, trading used an open outcry system; Peterffy developed algorithms to determine the best prices for options and used those on the trading floor, and thus the firm became the first to use daily printed fair value pricing sheets. In 1979, the company expanded to employ four traders, three of whom were AMEX members. In 1982, Peterffy renamed T.P. & Co. to Timber Hill Inc.; he named it after a road to a favorite retreat, one of his properties on Hutchin Hill Road in Woodstock, New York. By 1983, Peterffy was sending orders to the floor from his upstairs office; he devised a system to read the data from a Quotron machine by measuring the electric pulses in the wire and decoding them. The data would be then sent through Peterffy's trading algorithms, and then Peterffy would call down the trades. 
After pressure to become a true market maker and keep constant bids and offers, Peterffy knew that he would need his employees to pay close attention to market movements, and that handheld computers would help. At the time, the AMEX didn't permit computers on the trading floor, so Peterffy had an assistant deliver market information from his office in the World Trade Center. In November 1983 he convinced the exchange to allow computer use on the floor. In 1983, Peterffy sought to computerize the options market, and he first targeted the Chicago Board Options Exchange (CBOE). At the time, brokers still used fair value pricing sheets, which were by then updated once or twice a day. In 1983, Timber Hill created the first handheld computers used for trading. As Peterffy explained in a 2016 interview, the battery-powered units had touch screens on which the user entered a stock price, and the units would produce recommended option prices, track positions, and continually reprice options on stocks. However, he immediately encountered opposition from the heads of the exchange. When he first brought the device to the exchange floor, an exchange committee told him it was too big. When he made the device smaller, the committee stated that no analytic devices were allowed on the exchange floor. Effectively blocked from using the CBOE, he sought to use his devices in other exchanges. Also in 1983, Timber Hill expanded to 12 employees and began trading on the Philadelphia Stock Exchange. In 1984, Timber Hill began coding a computerized stock index futures and options trading system and, in February 1985, Timber Hill's system and network were brought online. The system was designed to centrally price and manage risk on a portfolio of equity derivatives traded in multiple locations around the country. In 1985, Peterffy introduced his computer system to the New York Stock Exchange (NYSE), which allowed it. However, the exchange only allowed it to be used at trading booths several yards away from where transactions were executed. Peterffy responded by designing a code system for his traders to read colored bars emitted in patterns from the video displays of computers in the booths. This caused the exchange and other members to be suspicious of insider trading, which convinced Timber Hill to distribute instructions throughout the exchange describing how to read the displays. In response, the exchange required the company to turn the screens away from the trading floor, which prompted Peterffy to hire a clerk to communicate with the traders via hand signals. Eventually computers were allowed on the trading floor. Timber Hill joined the Options Clearing Corporation in 1984, the New York Futures Exchange in 1985, and the Pacific Stock Exchange and the options division of the NYSE the following year. Also in 1985, the firm joined and began trading on the Chicago Mercantile Exchange, the Chicago Board of Trade and the Chicago Board Options Exchange. In 1986, the company moved its headquarters to the World Trade Center to control activity at multiple exchanges. Peterffy again hired workers to sprint from his offices to the exchanges with updated handheld devices, which he later superseded with phone lines carrying data to computers at the exchanges. Peterffy later built miniature radio transmitters into the handhelds and the exchange computers to allow data to flow to them automatically.
In 1987, Timber Hill joined the National Securities Clearing Corporation and the Depository Trust Company (now merged as the Depository Trust & Clearing Corporation). By 1987, Timber Hill had 67 employees and had become self-clearing in equities. In 1987, the CBOE was about to close down its S&P 500 options market due to the options not attracting sufficient trader interest. Because of this, Peterffy pledged that Timber Hill would make tight markets in the product for a year if the exchange would allow the traders to use handheld computers on the trading floor. The exchange agreed, and more traders were attracted by the change in pricing; today S&P 500 options are the most actively traded index options in the U.S. In 1990, Timber Hill Deutschland GmbH was incorporated in Germany, and shortly thereafter began trading equity derivatives at the Deutsche Terminbörse (DTB), marking the first time that Timber Hill used one of its trading systems on a fully automated exchange. In 1992, Timber Hill began trading at the Swiss Options and Financial Futures Exchange, which merged with DTB in 1998 to become Eurex. At that time, Timber Hill had 142 employees. While Peterffy was trading on the Nasdaq in 1987, he created the first fully automated algorithmic trading system. It consisted of an IBM computer that would pull data from a Nasdaq terminal connected to it and carry out trades on a fully automated basis. The machine, for which Peterffy wrote the software, worked faster than a trader could. Upon inspection, the Nasdaq banned direct interface with the terminal, and required trades to be typed in manually. Peterffy and his team designed a system with a camera to read the terminal, a computer to decode the visual data, and mechanical fingers to type in the trade orders, which was then accepted by the Nasdaq. 1993 to 2000 Interactive Brokers Inc. was incorporated in 1993 as a U.S. broker-dealer, to provide technology developed by Timber Hill for electronic network and trade execution services to customers. In 1994, Timber Hill Europe began trading at the European Options Exchange, the OM Exchange and the London International Financial Futures and Options Exchange. Also in 1994, Timber Hill Deutschland became a member of the Belgium Futures and Options Exchange, IB became a member of the New York Stock Exchange, and the Timber Hill Group LLC was formed as a holding company of Timber Hill and IB's operations. In 1995, Timber Hill France S.A. was incorporated and began making markets at the Marché des Options Négociables de Paris (a subsidiary of Euronext Paris) and the Marché à Terme International de France futures exchange. Also in 1995, Timber Hill Hong Kong began market making at the Hong Kong Futures Exchange and IB created its primary trading platform Trader Workstation and executed its first trades for public customers. In 1996, Timber Hill Securities Hong Kong Limited was incorporated and began trading at the Hong Kong Stock Exchange. In 1997, Timber Hill Australia Pty Limited was incorporated in Australia, and Timber Hill Europe began trading in Norway and became a member of the Austrian Derivatives Exchange. By 1997, Timber Hill had 284 employees. In 1998, Timber Hill Canada Company was formed, and IB began to clear online trades for retail customers connected directly to Globex to trade S&P futures. In 1999, IB introduced a smart order routing linkage for multiple-listed equity options and began to clear trades for its customer stocks and equity derivatives trades. 
Also in 1999, Goldman Sachs attempted to purchase the company and was turned away. In 2000, Interactive Brokers (U.K.) Limited was formed and Timber Hill became a Primary Market Maker on the International Securities Exchange (ISE). 2001 to present In 2001, the corporate name of the Timber Hill Group LLC was changed to Interactive Brokers Group LLC, which at the time handled 200,000 trades per day. In 2002, Interactive Brokers, along with the Bourse de Montréal and the Boston Stock Exchange, created the Boston Options Exchange. Also in 2002, IB introduced Mobile Trader and an application programming interface for customers and developers to integrate their mobile phone systems with the IB trading system. In 2002, Timber Hill became the major market maker for the newly introduced U.S. single-stock futures. In 2003, Interactive Brokers expanded its trade execution and clearing services to include Belgian index options and futures, Canadian stocks, equity/index options and futures, Dutch index options and futures, German equity options, Italian index options and futures, Japanese index options and futures, and U.K. equity options. In 2004, IB introduced direct market access to its customers on the Frankfurt and Stuttgart exchanges. In the same year, IB upgraded its account management system and Trader Workstation, adding real-time charts, scanners, fundamental analytics, and the BookTrader and OptionTrader tools to the platform. In 2005, IB released its forex trading platform IdealPro (now Ideal FX). In 2006, the IB Options Intelligence Report was launched to report on unusual concentrations of trading interest and changing levels of uncertainty in the options markets. Also in that year, IBG took stakes in OneChicago, the ISE Stock Exchange, and the CBOE Stock Exchange. In 2006, Interactive Brokers started offering penny-priced options. On May 3, 2007, IBG held its initial public offering (IPO) through the Nasdaq and sold 40 million shares at $30.01 per share. It was run as a Dutch auction handled by WR Hambrecht (which had handled Google's IPO similarly in 2004) and HSBC; it was the second-largest U.S. IPO that year and the largest brokerage IPO since 2005. The shares sold represented approximately 10 percent of the interest in IBG LLC. Also in 2007, a real-time Portfolio Margin platform was introduced for customers trading multiple asset classes, providing increased leverage with real-time risk management, and the company introduced exchanges for physicals for customers to exchange stocks and futures at a market-determined rate. In 2008, the company released Risk Navigator, a real-time market risk management platform. Also in 2008, several trading algorithms were introduced to the Trader Workstation. Among these is the Accumulate-Distribute Algo, which allows traders to divide large orders into small non-uniform increments and release them at random intervals over time to achieve better prices for large-volume orders. In 2009, IB launched iTWS, a mobile trading app based on IB's Trader Workstation; it also released the Portfolio Analyst tool. In 2011, the company introduced several new services, including the Interactive Brokers Information System, the Hedge Fund Capital Introduction Program, and the Stock Yield Enhancement Program. In 2011, Interactive Brokers also became the largest online U.S. broker as measured by daily average revenue trades.
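The accumulate/distribute idea lends itself to a brief illustration. The Python sketch below is not Interactive Brokers' code; the function name, slice-size rule, and timing jitter are invented purely to show the general pattern of breaking a parent order into small non-uniform child orders released at randomized intervals.

```python
import random

def split_order(total_qty: int, n_slices: int, avg_wait_s: float):
    """Toy accumulate/distribute-style schedule: split one large order into
    non-uniform child orders released at randomized intervals.  All numeric
    choices here are illustrative assumptions, not IB's actual algorithm."""
    # n_slices - 1 random cut points yield non-uniform slice sizes
    cuts = sorted(random.sample(range(1, total_qty), n_slices - 1))
    sizes = [b - a for a, b in zip([0] + cuts, cuts + [total_qty])]
    # pair each slice with a delay jittered around the average wait
    return [(size, random.uniform(0.5, 1.5) * avg_wait_s) for size in sizes]

# Example: work 10,000 shares as 8 child orders, roughly 30 s apart
for qty, delay in split_order(10_000, 8, 30.0):
    print(f"release {qty} shares after {delay:.1f} s")
```

Randomizing both size and timing makes the parent order harder for other participants to detect, which is the stated aim of such algorithms.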
During the Occupy Wall Street protests of 2011–2012, IB ran a series of television commercials with the catchphrase "Join the 1%", which were seen as a controversial criticism of the protests. In 2012, IB began offering money manager accounts and opened the fully electronic Money Manager Marketplace. IB also released the TWS Mosaic trading interface and the Tax Optimizer for managing capital gains and losses. In 2013, IB released the Probability Lab tool and Traders' Insight, a service that provides daily commentary by Interactive Brokers traders and third-party contributors. Also in 2013, IB integrated its trading notification tool (called IB FYI) into the TWS. The tool keeps customers informed of upcoming announcements that could impact their account, and a customer can set it to automatically act to exercise options early if the action is projected to be beneficial for the customer. An IB FYI also can act to automatically suspend a customer's orders before the announcement of major economic events that influence the market. On April 3, 2014, Interactive Brokers became the first online broker to offer direct access to IEX, a private electronic communication network for trading securities, which was subsequently registered as an exchange. In 2015, IB created the service Investors' Marketplace, which allows customers to find investors and other service providers in the financial industry. IB also gained clients through Scottrade that year; Scottrade had previously offered complex option trading through its platform OptionsFirst, and began offering trading through IB's platform. In March 2016, IB released a companion app to iTWS for the Apple Watch. In May 2017, IB announced the sale of the market making business conducted by its Timber Hill subsidiary, including its market making software, to New York-based Two Sigma Securities. In 2020, the customer base grew to one million users. During the GameStop short squeeze, Interactive Brokers briefly restricted trading of several stocks, along with other brokerages. In September 2021, the firm announced that clients could trade cryptocurrencies. Since September 13, 2021, Interactive Brokers' clients have been able to trade and custody Bitcoin, Ethereum, Litecoin and Bitcoin Cash alongside the already tradable stocks, options, futures, bonds, mutual funds and exchange-traded funds. The company charges commissions of 0.12 to 0.18% of the traded amount. Operations Interactive Brokers is the largest electronic brokerage firm in the US by number of daily average revenue trades, and is the leading forex broker. Interactive Brokers also targets commodity trading advisors, making it the fifth-largest prime broker servicing them. IB is regulated by the Securities and Exchange Commission, the Financial Industry Regulatory Authority, the New York Stock Exchange, the Financial Conduct Authority and other regulators and self-regulatory organizations. It provides correspondent clearing services to 200 introducing brokers worldwide. The company serves 720,000 client brokerage accounts, with $170.1 billion in customer equity. Interactive Brokers Group has $75 million in tangible assets, including $24 million in computer equipment. Currently about 17.3 percent of the company is publicly held, while the remainder is held by employees; Thomas Peterffy is the largest shareholder.
Peterffy has described the company as similar to Charles Schwab Corporation or TD Ameritrade, but specializing in providing brokerage services to larger customers and charging low transaction costs. He has also described the company's focus on building technology over pursuing high sales, with technology often used to automate systems in order to serve customers at low cost. The company can afford to focus on automation and acquiring customers over short-term financial results, as 82.7% of the company is held by employees. It has offered direct market access to Australian contracts for difference since 2008. Mobile transactions account for about 10% of the company's retail orders. Investors can open accounts online with no minimum required, though maintenance fees are sometimes charged. New customers are directed towards Traders' Academy, the company's education resource, which includes a series of videos on IB's products and trading tools. Employees Interactive Brokers Group has 11 directors, including Thomas Peterffy, chairman of the board of directors, who as the controlling shareholder is able to elect board members. As of 2016, the company has 1,649 employees, 1,365 of whom hold company stock. Interactive Brokers employs computer programmers and IT workers; programmers outnumber other employees five to one. As of 2015, approximately nine percent of employees work in legal or regulatory compliance departments. Among the company's directors is Lawrence E. Harris, a professor at the University of Southern California's Marshall School of Business and a former chief economist of the Securities and Exchange Commission. Among its former directors are Hans Stoll, founder and director of the Financial Markets Research Center at Vanderbilt University, an author and former president of the American Finance Association, and Ivers Riley, former chairman of the International Securities Exchange, CEO of the Hong Kong Futures Exchange, and chief developer of SPDR funds. Locations Interactive Brokers maintains its headquarters in downtown Greenwich, Connecticut. Traders and programmers work at stations with several monitors and more mounted overhead, while network engineers staff an area around the clock, six days a week. The company also has offices in Budapest, Chicago, Dublin, Hong Kong, London, Luxembourg, Montreal, Mumbai, San Francisco, Singapore, Sydney, Tokyo, Toronto, West Palm Beach, and Zug. More than half of the company's customers reside outside the United States, in approximately 200 countries. Media The first chapter of Christopher Steiner's 2012 book Automate This: How Algorithms Came to Rule Our World describes Thomas Peterffy's development of Interactive Brokers and the technologies that have led to the modern automated market. Four chapters of Scott Patterson's Dark Pools: The Rise of the Machine Traders and the Rigging of the U.S. Stock Market also detail Peterffy and his company.
See also Financial innovation Notes References External links 1978 establishments in New York (state) American companies established in 1978 Companies based in Fairfield County, Connecticut Financial services companies established in 1978 Financial derivative trading companies Financial services companies of the United States Greenwich, Connecticut Investment management companies of the United States Multinational companies headquartered in the United States Online brokerages Electronic trading platforms Online financial services companies of the United States 2007 initial public offerings Companies listed on the Nasdaq
61101793
https://en.wikipedia.org/wiki/Teamfight%20Tactics
Teamfight Tactics
Teamfight Tactics (TFT) is an auto battler game developed and published by Riot Games. The game is a spinoff of League of Legends and is based on Dota Auto Chess, in which players compete online against seven other opponents by building a team to be the last one standing. The game was released as a League of Legends game mode for Microsoft Windows and macOS in June 2019 and as a standalone game for Android and iOS in March 2020, featuring cross-platform play between them. Gameplay Based on Dota Auto Chess, a mod for Dota 2, the game centers on eight players who construct teams to fight one another and be the last player standing. The battlefield consists of hexagons, and players can strategically place units on the hexagons on their side of the game board between rounds. During each round, a short battle automatically commences, with two players matched randomly for that round, or else paired against computer-controlled enemies. In the rounds against computer-controlled enemies, each enemy has a chance to drop gold, units, or items that the player can use. Health lost from losing a round is calculated from a set amount of damage per round plus the number of the opponent's units still alive. The game consists of stages and rounds. Each stage consists of seven rounds, with the exception of stage 1, which consists of three rounds against computer-controlled enemies. On round four of each stage, there is a feature called the "Carousel", in which players have access to a free rotation of units with random items equipped. During these shared rounds, the two players with the lowest health choose their units first, followed by the next two players with the lowest health, and so on. If players have equal health points, the game randomly chooses the order. On the last round of each stage, players face computer-controlled enemies. Players accumulate gold during rounds and can save it to build interest, which further increases their income per round. Players can also gain additional income per round by either winning or losing multiple rounds in a row. With this gold, they can either reroll the five units automatically offered to them in their shop at the start of each round or purchase experience points to increase their level. The higher a player's level, the more units they can place on the board (a limit that can also be augmented by certain items), and the higher the average rarity of units in the shop. Each unit can be upgraded when additional copies of the same unit are acquired from the shop or Shared Draft. Upgrading a champion increases their maximum health and attack damage. With some exceptions, units have both a health bar and a mana bar. Taking damage from enemy attacks or abilities lowers a unit's health but increases its mana. When a unit's health reaches zero, it is removed from the round. When a unit's mana bar is full, it casts a unique ability. Some units may start the round with a percentage of their mana bar full, but units generally start the round with no mana. Synergies are activated by a team composition that makes use of one or multiple units with the same trait. Each unit has two or three traits, and an effective combination of units will activate synergies that benefit the player. Synergies usually fall into three categories: effects that strengthen allies, effects that weaken enemies, and miscellaneous effects.
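As a rough illustration of the economy described above, the sketch below computes a single round of income. The specific constants (base income, interest rate and cap, streak bonus) are assumptions chosen for illustration; actual values vary between sets and patches.

```python
def round_income(banked_gold: int, streak: int) -> int:
    """Illustrative per-round income: base pay, interest on banked gold,
    and a bonus for win or loss streaks (direction does not matter).
    All constants are assumed values, not Riot's live tuning."""
    base = 5
    interest = min(banked_gold // 10, 5)  # e.g. 1 gold per 10 banked, capped
    streak_bonus = min(abs(streak), 3)    # longer streaks pay more, capped
    return base + interest + streak_bonus

# A player banking 34 gold on a four-game loss streak: 5 + 3 + 3 = 11 gold
print(round_income(34, -4))
```

The key strategic tension this models is that spending gold on rerolls or experience sacrifices future interest income, while hoarding gold sacrifices immediate board strength.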
Each "set" of Teamfight Tactics corresponds to a unique unit pool, collection of synergies, and usable items. Teamfight Tactics periodically updates its unit roster. Every three months there is a partial rotation, referred by Riot Games developers as a mid-set update, rotating out traits and units which are problematic. As Teamfight Tactics is a game mode of League of Legends, its patch numbering follows the same as its parent game rather than being labelled differently. Development and release Teamfight Tactics was based on Dota Auto Chess, which in turn was inspired by Mahjong, where players pick up tiles and discard tiles in order to complete a hand by forming a pair and sets such as a sequence, or three or four identical tiles, while preventing other players to complete a hand. The game was released within the League of Legends client for Microsoft Windows and macOS on June 26, 2019, and as a standalone app for Android and iOS on March 19, 2020. By September 2019, the game had over 33million monthly players with 1.72 billion hours of accumulated game time. Teamfight Tactics has its own store separate from League of Legends. The player's controllable avatar, called a "Little Legend", can be customized by buying new ones from the store. Those can be upgraded by buying from loot boxes called "Little Legend eggs" or through star shards that can either be bought in the store or earned from the season pass that lasts for the duration of the set. Skins for a player's board, on which the game is played, are purchasable without the use of loot boxes. Teamfight Tactics is being supported by Riot Games post-launch, with regular balance updates to keep the game fair and entertaining, as well as Little Legend egg drops. The game also updates the game in a big way with "sets". Sets give players more incentive to play the game, changing synergies and introducing new ones, rotating various League of Legends champions into the roster, as well as dropping new season passes. Besides Normal and Ranked game modes, Riot released a faster-paced game mode named "Hyper Roll" in April 2021, streamlining mechanics to decrease game time. References External links Android (operating system) games Auto battler video games Esports games Free-to-play video games IOS games 2019 video games MacOS games Video games developed in the United States Windows games League of Legends Riot Games games
8839363
https://en.wikipedia.org/wiki/Costa%20Panayi
Costa Panayi
Costa Panayi is a former computer game programmer active during the 1980s. He founded Vortex Software with Paul Canter, publishing games for the ZX Spectrum, Commodore 64 and Amstrad CPC. He is of Greek Cypriot descent and studied engineering at the University of Salford. After graduation, he worked as a mechanical engineer for British Aerospace, and moved into games programming from his hobby interests. In 1982, he learned Sinclair BASIC and subsequently formed his company Vortex Software along with Luke Andrews and Mark Haigh-Hutchinson. They wrote a variety of games, including Gunlaw, the Android series, and Tornado Low Level for the ZX Spectrum. His games, in which he developed original 3D interfaces, achieved critical success; Tornado Low Level and Highway Encounter, for example, appeared in the "Your Sinclair official top 100". In 1995, he was working as a design consultant for Fisher-Price. List of games ZX81 Othello (1981) Word Mastermind (1981) Pontoon (1981) Crash (1981) Cosmos (1982), Abbex Electronics Astral Convoy (1982), Vortex Software Gunlaw (1983), Vortex Software Android One: The Reactor Run (1983), Vortex Software Android Two (1983), Vortex Software Tornado Low Level (1984), Vortex Software Cyclone (1985), Vortex Software Highway Encounter (1985), Vortex Software Revolution (1986), U.S. Gold Deflektor (1987), Gremlin Graphics Hostile All Terrain Encounter (1989), Gremlin Graphics References World of Spectrum entry. Interview from Sinclair User issue 32. External links History of Vortex Software by Mark Haigh-Hutchinson 1957 births Living people People from Stretford Video game programmers British people of Greek Cypriot descent Alumni of the University of Salford British mechanical engineers
12084924
https://en.wikipedia.org/wiki/IcyOwl
IcyOwl
IcyOwl is an open-source C++ integrated development environment (IDE) distributed under the GNU General Public License. It is written in C++ using the wxWidgets API. IcyOwl is cross-platform, running on operating systems such as Microsoft Windows and Linux; a Mac OS port is not yet available. It uses doxygen to generate its class tree, but supports only one compiler, GCC/MinGW. It is currently undergoing major code changes, as the developer encountered problems implementing the class browser using doxygen. Features Uses MinGW on Windows Uses GCC on Linux Syntax highlighting Code completion Class browser Project manager Auto-generation of Makefiles See also C++ List of integrated development environments Comparison of integrated development environments wxWidgets Scintilla SourceForge External links IcyOwl Homepage IcyOwl Homepage (Alternate SourceForge address) Integrated development environments Free integrated development environments Linux integrated development environments Software that uses wxWidgets
962908
https://en.wikipedia.org/wiki/Change%20control
Change control
Within quality management systems (QMS) and information technology (IT) systems, change control is a process, either formal or informal, used to ensure that changes to a product or system are introduced in a controlled and coordinated manner. It reduces the possibility that unnecessary changes will be introduced to a system without forethought, introducing faults or undoing changes made by other users of the software. The goals of a change control procedure usually include minimal disruption to services, reduction in back-out activities, and cost-effective utilization of the resources involved in implementing change. Change control is used in various industries, including IT, software development, the pharmaceutical industry, the medical device industry, and other engineering/manufacturing industries. For the IT and software industries, change control is a major aspect of the broader discipline of change management. Typical examples from the computer and network environments are patches to software products, installation of new operating systems, upgrades to network routing tables, or changes to the electrical power systems supporting such infrastructure. Certain portions of ITIL cover change control. The process There is considerable overlap and confusion between change management, configuration management and change control. The definition below is not yet integrated with definitions of the others. Change control can be described as a set of six steps: Plan / Scope Assess / Analyze Review / Approval Build / Test Implement Close Plan / Scope Consider the primary and ancillary details of the proposed change. This should include aspects such as identifying the change, its owner(s), how it will be communicated and executed, how success will be verified, the change's estimated importance, its added value, its conformity to business and industry standards, and its target date for completion. Assess / Analyze Impact and risk assessment is the next vital step. When executed, will the proposed plan cause something to go wrong? Will related systems be impacted by the proposed change? Even minor details should be considered during this phase. Afterwards, a risk category should ideally be assigned to the proposed change: high, moderate, or low risk. A high-risk change requires many additional steps such as management approval and stakeholder notification, whereas a low-risk change may only require project manager approval and minimal documentation. If not addressed in the plan/scope, the need for a backout plan should be raised, particularly for high-risk changes with significant worst-case scenarios. Review / Approval Whether it is a change controller, change control board, steering committee, or project manager, a review and approval process is typically required. The plan/scope and impact/risk assessments are considered in the context of business goals, requirements, and resources. If, for example, the change request is deemed to address a low-severity, low-impact issue that requires significant resources to correct, the request may be given low priority or shelved altogether. In cases where a high-impact change is requested without a strong plan, the review/approval entity may request a full business case for further analysis. Build / Test If the change control request is approved to move forward, the delivery team will execute the solution through a small-scale development process in test or development environments.
This allows the delivery team an opportunity to design and make incremental changes, with unit and/or regression testing. Little testing and validation may be needed for low-risk changes, though major changes will require significant testing before implementation. The team will then seek approval and request a time and date to carry out the implementation phase. In rare cases where the solution cannot be tested, special consideration should be given to the change/implementation window. Implement In most cases a dedicated implementation team with the technical expertise to move a change along quickly is used to implement the change. The team should implement the change not only according to the approved plan but also according to organizational standards, industry standards, and quality management standards. The implementation process may also require additional staff responsibilities outside the implementation team, including stakeholders who may be asked to assist with troubleshooting. Following implementation, the team may also carry out a post-implementation review, which would take place at another stakeholder meeting or during project closing procedures. Close The closing process can be one of the more difficult and important phases of change control. Three primary tasks at this end phase include determining that the project is actually complete, evaluating "the project plan in the context of project completion," and providing tangible proof of project success. If, despite best efforts, something went wrong during the change control process, a post-mortem on what happened will need to be run, with the intent of applying lessons learned to future changes. Regulatory environment In industries regulated under Good Manufacturing Practice, the topic is frequently encountered by practitioners. Various industry guidances and commentaries are available to help people understand this concept. As a common practice, the activity is usually directed by one or more standard operating procedures (SOPs). From the information technology perspective for clinical trials, it has been guided by another U.S. Food and Drug Administration document. See also Change request Change order Engineering Change Order Documentation Identifier Version control Changelog Living Document Specification (technical standard) Standardization Scope Management References Information technology management Project management Software project management
17875139
https://en.wikipedia.org/wiki/LaserSoft%20Imaging
LaserSoft Imaging
LaserSoft Imaging AG is a software developer that designs software such as SilverFast for scanners and large format printers. The company's headquarters are in Kiel, Germany, north of Hamburg, with another office in Sarasota, Florida, United States. History 1986 - 1990 LaserSoft Imaging was founded in spring 1986 by the physicist Karl-Heinz Zahorsky, who remains the company's president today. LaserSoft Imaging became an early adopter of color and image processing on the Macintosh. It was the first company to distribute video digitizers such as Pixelogic's ProViz, as well as Truvel's TrueScan, the first professional color scanner for the Macintosh, which was first shown at the Hannover trade fair CeBIT in 1988, to which LaserSoft Imaging had been invited by Apple Computer. 1990 - 2000 Karl-Heinz Zahorsky published many articles about image processing in professional journals. In 1990, Hell Graphics of Kiel, then the world's leading drum scanner developer, hired him as an independent consultant to keep up with rapidly advancing desktop technologies. In 1991 LaserSoft Imaging became a registered GmbH and moved into a large prepress house to help set up color reproduction on the desktop and to link Chromacom systems to Macintosh computers. With RipLink, LaserSoft Imaging presented a system to link major prepress systems, such as Hell, Scitex and Crosfield, to the Macintosh. The distribution of Leaf scanners opened the chapter of high-end scanning on the desktop. LaserSoft Imaging consulted for Leaf, Canon, Sony, Seiko and others on high-end desktop color. LaserSoft Imaging was also involved in the development of Photone Prepress, after the product had gone through its infant stages. In 1994 LaserSoft Imaging started the development of SilverFast, which was first presented in its early stages at CeBIT 1995. Version 2.0 was demonstrated at CeBIT 1996, version 3.0 was released in December 1996, and version 4.0 in 1998. The second office opened in Sarasota, Florida in 1997. 2000 - 2010 LaserSoft Imaging released SilverFast version 5.0 in 2000, version 6.0 in 2002, version 6.5 in 2006 and version 6.6 in 2008. LaserSoft Imaging became a public corporation (Aktiengesellschaft) in 2001; the founder is the sole shareholder. Today, LaserSoft Imaging has implemented scan software for over 300 scanners on Mac OS 9, Mac OS X and Microsoft Windows 98, 2000, XP, and Vista. In the beginning LaserSoft Imaging's software was distributed mainly in Germany, but SilverFast is now distributed globally through distributors as well as worldwide bundle agreements with manufacturers such as Canon, HP, Seiko Epson, Cruse, Leica Camera AG, Microtek, Nikon, Pacific Image Electronics, Pentacon GmbH, PFU/Quato, Plustek, Samsung, and Umax. The company also supports high-end scanners such as the Heidelberg (Linotype) Topaz, Tango, Nexscan, and Chromagraph (3300 and 3400). LaserSoft Imaging's SilverFast was awarded "best color management software of the year 2008" by the European Digital Press Association (EDP). The award was given for improving the dynamic range of most scanners with its Multi-Exposure feature and for automatic creation of ICC profiles. The Fogra institute came to a similar conclusion after testing LaserSoft Imaging's IT8 targets and SilverFast's colour management system in 2009, attesting full conformity to the strict ISO norm 12641. The HDRi raw data format was also developed in 2009.
By adding the raw data of the infrared channel to the usual raw image data, the format retains all readable image information for later post-processing. Products The company's products can be grouped into the following categories: SilverFast SE / SE Plus / Ai Studio (scanner software) SilverFast HDR / HDR Studio (64-bit / 48-bit image processing software) SilverFast Archive Suite / Archive Suite SE SilverFast PrintTao 8 (software for large format printers) SRDx Photoshop Plug-in (dust and scratch removal) LaserSoft Imaging also manufactures reflective and transparent IT8 targets (ISO 12641-1) and, since 2019, Advanced IT8 targets (ISO 12641-2). The Resolution Target (USAF 1951) is a tool to determine the maximum resolution of a particular scanner. Literature Taz Tally, Ph.D. SilverFast - The Official Guide. Sybex, 2003. Taz Tally, Ph.D. SilverFast - Das offizielle Buch. Mitp-Verlag, 2004. Sascha Steinhoff: Scanning negatives and slides: digitizing your photographic archive. Rocky Nook, 2007. References Software companies of Germany Privately held companies of Germany Companies based in Kiel
69367020
https://en.wikipedia.org/wiki/Gregg%20Rothermel
Gregg Rothermel
Gregg Rothermel is an American computer scientist, software engineer and academic. He is Professor and Head of the Department of Computer Science at North Carolina State University. Rothermel's research has focused on software engineering and program analysis, with a particular emphasis on applying program analysis techniques to problems in software maintenance and testing, end-user software engineering, and empirical studies. He co-founded the ESQuaReD Laboratory at the University of Nebraska-Lincoln and the Software-Artifact Infrastructure Repository (SIR). He also co-founded the EUSES Consortium, a group of researchers who, with the National Science Foundation's support, have led end-user software engineering research. Rothermel is an IEEE Fellow and an ACM Distinguished Scientist, and serves as an Associate Editor for IEEE Transactions on Software Engineering. Education Rothermel earned a B.A. in Philosophy from Reed College in 1983, and an M.S. in Computer Science from the State University of New York at Albany in 1986. He then enrolled at Clemson University and received his Ph.D. in Computer Science, under the supervision of Mary Jean Harrold, in 1996. Career Rothermel began his academic career as a Teaching Assistant in Computer Science at the State University of New York at Albany in 1985. In 1991, he joined Clemson University as a Teaching Assistant in the Department of Computer Science, becoming a Research Assistant the following year. He then held a brief appointment at the Ohio State University as a Senior Research Associate in the Department of Computer and Information Science through 1996, before joining Oregon State University as an Assistant Professor of Computer Science. In 2001, he was promoted to Associate Professor at Oregon State. From 2004 through 2018, he served on the faculty of the Department of Computer Science and Engineering at the University of Nebraska–Lincoln as Professor and Jensen Chair of Software Engineering. Since 2018, he has served as Professor and Head of the Department of Computer Science at North Carolina State University. Research Rothermel has published over 230 articles, has been cited over 24,000 times, and has a Google Scholar h-index of 75. He holds two U.S. patents. Rothermel has received recognition for his pioneering contributions to developing and empirically evaluating regression testing techniques. He was among the first researchers to propose and empirically study test case prioritization techniques. He also published a paper highlighting issues with regression test selection techniques, and used these issues as the basis for a framework within which to evaluate such techniques. Rothermel has also worked on "end-user software engineering". This research is concerned with enabling non-professional programmers to create more dependable systems while developing programs such as spreadsheets, web macros, and web mashups. In 2001, he introduced and explored a methodology to adapt data flow adequacy criteria and coverage monitoring to the task of testing spreadsheets.
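To make the flavor of this work concrete, here is a minimal sketch of one classic idea from the regression-testing literature Rothermel helped establish: "additional" greedy coverage-based test case prioritization, which repeatedly schedules the test that covers the most not-yet-covered statements. This is a generic illustration of the technique, not code from his papers; the function and test names are invented.

```python
def prioritize_additional(coverage: dict[str, set[int]]) -> list[str]:
    """Order tests so each successive test covers the most statements not
    yet covered by earlier tests; ties are broken by total coverage, then
    by insertion order.  `coverage` maps a test name to the ids of the
    statements it exercises."""
    remaining = dict(coverage)
    covered: set[int] = set()
    order: list[str] = []
    while remaining:
        best = max(remaining,
                   key=lambda t: (len(remaining[t] - covered), len(remaining[t])))
        order.append(best)
        covered |= remaining.pop(best)
    return order

suite = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {4, 5, 6, 7}, "t4": {1, 7}}
print(prioritize_additional(suite))  # ['t3', 't1', 't2', 't4']
```

The intuition, which the empirical studies cited below evaluate, is that running high-coverage tests earlier tends to reveal regressions sooner when a test run is cut short.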
Awards and honors 1997-2001 - CAREER Award, National Science Foundation 1999 - Engelbrecht Young Faculty Award, Oregon State University College of Engineering 2003 - Research Award, Oregon State University College of Engineering 2013 - Excellence in Science and Technology Award, University at Albany 2013 - ACM Distinguished Scientist, Association for Computing Machinery 2016 - Fellow, Institute of Electrical and Electronics Engineers (IEEE) 2020 - Retrospective Impact Paper Award Keynote, ISSTA Bibliography Rothermel, G., & Harrold, M. J. (1996). Analyzing regression test selection techniques. IEEE Transactions on Software Engineering (TSE), 22(8), 529-551. Rothermel, G., & Harrold, M. J. (1997). A safe, efficient regression test selection technique. ACM Transactions on Software Engineering and Methodology (TOSEM), 6(2), 173-210. Rothermel, G., Untch, R. H., Chu, C., & Harrold, M. J. (2001). Prioritizing test cases for regression testing. IEEE Transactions on Software Engineering (TSE), 27(10), 929-948. Elbaum, S., Malishevsky, A. G., & Rothermel, G. (2002). Test case prioritization: A family of empirical studies. IEEE Transactions on Software Engineering (TSE), 28(2), 159-182. Do, H., Elbaum, S., & Rothermel, G. (2005). Supporting controlled experimentation with testing techniques: An infrastructure and its potential impact. Empirical Software Engineering, 10(4), 405-435. Elbaum, S., Rothermel, G., & Penix, J. (2014). Techniques for improving regression testing in continuous integration development environments. Proceedings of the 22nd ACM SIGSOFT International Symposium on Foundations of Software Engineering, November 2014, 235-245. References Living people Reed College alumni University at Albany, SUNY alumni Clemson University alumni American computer scientists American software engineers North Carolina State University faculty Year of birth missing (living people)
35006257
https://en.wikipedia.org/wiki/Scratchbox2
Scratchbox2
Scratchbox2 (sbox2 or sb2) is a cross-compilation toolkit designed to make embedded Linux application development easier. It also provides a full set of tools to integrate and cross-compile an entire Linux distribution. In the Linux world, when building software, many parameters are auto-detected from the host system (such as installed libraries and system configuration), for example through autotools "./configure" scripts. When one wants to build for an embedded target (cross-compilation), most of the detected parameters are incorrect, since the host configuration is not the same as the embedded target configuration. Without Scratchbox2, one has to set many parameters manually and "hack" the "configure" process to generate code for the embedded target. Scratchbox2 allows one to set up a "virtual" environment that tricks the autotools and executables into thinking that they are running directly on the embedded target with its configuration. Moreover, Scratchbox2 provides a technology called CPU transparency that goes further: executables built for either the host CPU or the target CPU can be executed directly on the host, with sbox2 handling CPU emulation when needed to run a program compiled for the target CPU. A build process can therefore mix programs built for different CPU architectures, which is especially useful when a build requires building program X in order to use it to build program Y (for example, building a lexer that will be used to generate code for a specific package). Projects using Scratchbox2 Tizen Maemo MeeGo / Mer WIDK webOS Internals Development Kit Raspberry Pi (used to build binaries for the Raspberry Pi alphaboard) Sailfish OS External links Official Scratchbox2 website and wiki Scratchbox2 for Debian presentation by Riku Voipio Embedded Linux Build automation
827224
https://en.wikipedia.org/wiki/CipherSaber
CipherSaber
CipherSaber is a simple symmetric encryption protocol based on the RC4 stream cipher. Its goals are both technical and political: it gives reasonably strong protection of message confidentiality, yet it's designed to be simple enough that even novice programmers can memorize the algorithm and implement it from scratch. According to the designer, a CipherSaber version in the QBASIC programming language takes just sixteen lines of code. Its political aspect is that because it's so simple, it can be reimplemented anywhere at any time, and so it provides a way for users to communicate privately even if government or other controls make distribution of normal cryptographic software completely impossible. History and purpose CipherSaber was invented by Arnold Reinhold to keep strong cryptography in the hands of the public. Many governments have implemented legal restrictions on who can use cryptography, and many more have proposed them. By publicizing details of a secure yet easy-to-program encryption algorithm, Reinhold hopes to keep encryption technology accessible to everyone. Unlike programs like PGP which are distributed as convenient-to-use prewritten software, Reinhold publishes CipherSaber only as a specification. The specification is intended to be so simple that even a beginning programmer can implement it easily. As the CipherSaber web site explains: In George Lucas's Star Wars trilogy, Jedi Knights were expected to make their own light sabers. The message was clear: a warrior confronted by a powerful empire bent on totalitarian control must be self-reliant. As we face a real threat of a ban on the distribution of strong cryptography, in the United States and possibly world-wide, we should emulate the Jedi masters by learning how to build strong cryptography programs all by ourselves. If this can be done, strong cryptography will become impossible to suppress. The web site has a graphics file that displays as a "CipherKnight" certificate; however, that file is encrypted using CipherSaber with a known key published alongside the file. Users can view the graphic (and optionally print it out for framing) by first writing their own CipherSaber implementation to decrypt the file. By writing their own implementation and performing a few other small tasks, the user becomes a CipherKnight and the decrypted certificate attests to their knighthood. So, rather than providing a ready-made tool, CipherSaber's designer hopes to help computer users understand that they're capable of making their own strong cryptography programs without having to rely on professional developers or the permission of the government. Technical description In the original version of CipherSaber (now called CipherSaber-1 or CS1), each encrypted message begins with a random ten-byte initialization vector (IV). This IV is appended to the CipherSaber key to form the input to the RC4 key setup algorithm. The message, XORed with the RC4 keystream, immediately follows. The Fluhrer, Mantin and Shamir attack on RC4 has rendered CipherSaber-1 vulnerable if a large number (>1000) of messages are sent with the same CipherSaber key. To address this, the CipherSaber designer defined a modified protocol (called CipherSaber-2) in which the RC4 key setup loop is repeated multiple times (20 is recommended). In addition to agreeing on a secret key, parties communicating with CipherSaber-2 must agree on how many times to repeat this loop.
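The protocol is compact enough to sketch in full. The following Python is an unofficial illustration of the scheme as described above, not Reinhold's reference implementation: RC4 with the random ten-byte IV appended to the key, and the key-setup loop repeated setup_loops times (1 for CipherSaber-1; a value such as 20 for CipherSaber-2).

```python
import os

def rc4_keystream(key: bytes, n: int, setup_loops: int = 1) -> bytes:
    """RC4 key scheduling (repeated setup_loops times, per CipherSaber-2)
    followed by n bytes of keystream output."""
    S = list(range(256))
    j = 0
    for _ in range(setup_loops):          # CS1: once; CS2: e.g. 20 times
        for i in range(256):
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
    i = j = 0
    out = bytearray()
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def encrypt(key: bytes, plaintext: bytes, setup_loops: int = 1) -> bytes:
    iv = os.urandom(10)                   # random ten-byte IV, sent in the clear
    ks = rc4_keystream(key + iv, len(plaintext), setup_loops)
    return iv + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, ciphertext: bytes, setup_loops: int = 1) -> bytes:
    iv, body = ciphertext[:10], ciphertext[10:]
    ks = rc4_keystream(key + iv, len(body), setup_loops)
    return bytes(c ^ k for c, k in zip(body, ks))

key = b"my secret passphrase"
ct = encrypt(key, b"attack at dawn", setup_loops=20)   # CipherSaber-2 style
assert decrypt(key, ct, setup_loops=20) == b"attack at dawn"
```

Decryption is the same XOR against the regenerated keystream, which is also why, as discussed below, an attacker who can guess part of the plaintext can recover keystream bytes and splice in a forged message of the same length.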
The ciphertext output is a binary byte stream that is designed to be "indistinguishable from random noise". For use with communications systems that can accept only ASCII data, the author recommends encoding the byte stream as hexadecimal digits. This is less efficient than, for example, base64 MIME encoding, but it is much simpler to program, in keeping with CipherSaber's goal of maximal ease of implementation. Security and usability CipherSaber is strong enough and usable enough to make its political point effectively. However, it falls markedly short of the security and convenience one would normally ask of such a cryptosystem. While CipherKnights can use CipherSaber to exchange occasional messages with each other reasonably securely, either for fun or in times of great distress, CipherSaber strips cryptography to its bare essentials and does not offer enough features to be suitable for wide deployment and routine daily use. CipherSaber's author in fact asks users to download and install PGP as one of the steps of becoming a CipherKnight. CipherSaber can be seen as a last-resort fallback system to use if programs like PGP are banned. Some, but not all, of CipherSaber's sacrifices and shortcomings are unique to RC4. CipherSaber provides no message authentication. This vulnerability, shared by all pure stream ciphers, is straightforward to exploit. For example, an attacker who knows that the message contains "Meet Jane and me tomorrow at 3:30 pm" at a particular point can recover the keystream at that point from the ciphertext and plaintext. The attacker can then replace the original content with any other content of exactly the same length, such as "3:30 meeting is cancelled, stay home", by encrypting it with the recovered keystream, without knowing the encryption key (the sketch following this section works through exactly this substitution). Like most ciphers in use for bulk data transfer today, CipherSaber is a symmetric-key cipher. Thus, each pair of communicating users must somehow securely agree on an encryption key, and each user must securely store the encryption keys of those they are to communicate with. Agreeing on encryption keys when the only communications channels available are insecure is the classic chicken-and-egg problem solved by public key cryptography as provided by PGP-like programs. Avoiding the need for secure symmetric key agreements between every pair of users is of considerable convenience and generally improves security. A protocol typically used to achieve good efficiency and convenience is to use a public key cipher such as RSA for key exchange, then a symmetric-key cipher such as CipherSaber for bulk data transfer using the negotiated key. The short RC4 key setup used in CipherSaber-1 is broken: RC4's original key scheduling is now known to be too weak to protect a large number of ciphertexts encrypted using the same key. CipherSaber-2 modifies CipherSaber-1's key setup procedure by repeating it multiple times in the hope of improving its security (the result is equivalent to using conventional RC4 starting with a key that has been preprocessed by a complex algorithm). While this procedure is believed to close the RC4 key scheduling vulnerability, its effectiveness has not been proven. Like any chosen-key cipher, both versions of CipherSaber are vulnerable to dictionary attack if the chosen key (which would normally be a password or passphrase) does not have sufficient entropy. Symmetric-key cryptography implementations usually include a facility for generating random keys when high security is required.
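The keystream-recovery substitution described above can be made concrete with a short continuation of the earlier sketch (again illustrative Python, reusing the hypothetical encrypt and decrypt helpers defined there; the two messages from the example happen to be exactly the same length):

```python
key = b"passphrase"
intercepted = encrypt(key, b"Meet Jane and me tomorrow at 3:30 pm")

# Knowing the plaintext at this position, the attacker XORs it against the
# ciphertext to recover the keystream -- no knowledge of the key is needed.
iv, ct = intercepted[:10], intercepted[10:]
keystream = bytes(c ^ p for c, p in zip(ct, b"Meet Jane and me tomorrow at 3:30 pm"))

# Re-encrypting a same-length forgery with the recovered keystream yields a
# ciphertext that the recipient decrypts without noticing anything is wrong.
forgery = b"3:30 meeting is cancelled, stay home"
forged_message = iv + bytes(k ^ p for k, p in zip(keystream, forgery))

assert decrypt(key, forged_message) == forgery
```

A message authentication code over the ciphertext would defeat this substitution, but that is precisely the kind of feature CipherSaber omits for the sake of simplicity.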
The CipherSaber site recommends generating high-entropy random passphrases using Diceware. Like most other cryptosystems (including PGP), CipherSaber makes no provisions at all to prevent attackers from detecting (as opposed to decrypting) the encrypted messages. This is a potentially serious problem in some situations for which CipherSaber was designed: if the government has banned the distribution of cryptographic software, it will probably also want to pounce on anyone whom it finds sending encrypted messages. See traffic analysis and steganography for more about these issues and their countermeasures. References Further reading Cryptographic software Encryption debate
2108379
https://en.wikipedia.org/wiki/NAF%20%28non-profit%20organization%29
NAF (non-profit organization)
NAF (formerly known as National Academy Foundation) is an industry-sponsored nonprofit with a national network of public-private partnerships that support career academies within traditional high schools. Each academy focuses on a theme that addresses the anticipated future needs of local industry and the community it serves, in five major "college prep plus" fields of study that encourage and facilitate college preparation and technical training on career paths: finance, hospitality and tourism, information technology (IT), engineering, and health sciences. In 2019, the NFL awarded a $2 million grant to eight social justice organizations, including NAF, for "reduc[ing] barriers to opportunity." The program is designed to build a work-ready future workforce by emphasizing STEM-related, industry-specific curricula in the classroom and work-based learning experiences, including summer internships. NAF has created career academies in 620 high schools in high-need communities in the contiguous United States and its territories since 1980. In one high-profile example, it partnered with United Technologies in 2020, launching two $3 million engineering academies in high schools in Aguadilla, Puerto Rico. During the height of the pandemic in 2020, corporate partner Verizon created a virtual internship program to accommodate social distancing protocols for participants. Numerous studies of the NAF model have concluded that "sustained, quality employer involvement in education is possible," and that its programming helps provide equitable opportunities for minority students from "low-socioeconomic and high-risk backgrounds." Other research also credits the work-study model with promoting successful equity and inclusion. Program Characterized as "schools within schools," NAF academies serve a small community of students who are "organized as a cohort over their four years of high school". Academy teachers are typically skilled in both academics and the technical knowledge of the field on which the academy is focused. They are supported with NAF professional development, training, and curricula that integrate "core subject area content, career-themed content or technical education under a specified theme" ... based on NAF's input, that of CTE (Career Technical Education), and the local labor markets. Teachers meet often to coordinate the curriculum, take care of administrative details, and are involved outside the classroom with local businesses and sponsors "to make learning relevant with real-world career support to build strong connections between school and work." The academies are typically run and taught by the same teachers for a number of semesters. Summer internships of about six to eight weeks are a focal point of the academy programs, and usually pay the students for their work. During internships, the students spend some time training, often report to a school staff supervisor, and sometimes have a workplace mentor. Seniors in the program combine work-based learning with corresponding curricular activities to learn more about the industry, and to "explore careers, plan for college, and develop their social and interpersonal skills." As a result of the program, one Massachusetts student "complete[d] a 120-hour internship and pass[ed] a college-accredited accounting course while in high school"; she then "took an early college program" and arrived at college with "a 27-credit head start on her business administration degree (with a concentration in accounting)."
History In 1980, the first Academy of Finance opened in John Dewey High School in Brooklyn, New York, after the New York City Board of Education accepted Sanford I. Weill's proposal to address the disconnect between the need for skilled workforce talent and the lack of opportunity for young people in New York City by creating a public-private partnership in high schools that exposed young people to genuine career skills. As he later explained to the U.S. House Committee on Ways and Means when testifying on the benefits of the program: You saw young people playing in the street, young people without having a clue of what life was about, and how they can become part of the system. That was the beginning of the idea that maybe the private sector should get together with the public sector and see if we can create a high-school level program that can expose young people for a career in the financial services industry. Weill's initial program focused on finance, which was his specialty. In 1987, NAF launched a Hospitality and Tourism theme with the opening of two pilot Academies, one in Miami, Florida and another in Richmond Hill, New York, with support from the American Express Foundation. In 2000, NAF piloted a third theme, opening Academies of Information Technology in 12 high schools across the country with support from Lucent, AT&T Corporation, United Technologies, GTE/Verizon, Oracle, Computer Associates and Compaq. In 2007, NAF launched its fourth academy theme, the Academy of Engineering, as a collaboration between NAF, Project Lead The Way (PLTW), and the National Action Council for Minorities in Engineering, Inc. (NACME) to provide underrepresented students with the knowledge and skills needed to succeed in STEM careers. A fifth theme, Health Sciences, launched in 2011–2012 as the Academy of Health Sciences to help prepare young people for careers in health. Outcomes There have been significant reports and statistics on the outcomes of students from NAF's Career Academies. Milton Chen, author of Education Nation and former executive director for the George Lucas Educational Foundation, sums up the most recent reports: Graduates of NAF institutes earn on average 11% more per year in the eight years after graduation than non-academy students. More than 90% of NAF students graduate high school, even though academies are located in urban areas where the average high school graduation rate is 50%. More than 50% of NAF's high school graduates earn bachelor's degrees in four years, while the national average is only 32%. In 2002, the first Career Academies outside the U.S. were set up in the United Kingdom by Career Academies UK, affiliated with NAF. Governing academies Academy of Engineering Academy of Finance Academy of Hospitality and Tourism Academy of Information Technology Academy of Health Sciences See also National Academy Foundation School – a high school in Baltimore, Maryland based on NAF's school model. References External links Career Academies UK Non-profit organizations based in the United States Educational foundations in the United States Non-profit organizations based in New York City Schools by association High schools and secondary schools Apprenticeship Vocational education Schools programs
19071449
https://en.wikipedia.org/wiki/ISMACryp
ISMACryp
The ISMA Encryption and Authentication, Version 1.1 specification (or ISMACryp) specifies encryption and message authentication services for MPEG-4 over RTP streams. It was defined by the Internet Streaming Media Alliance and published on September 15, 2006. The ISMA Encryption and Authentication, Version 2.0 specification covers content encryption, message authentication (integrity) services, an RTP payload format, and a file format for pre-encrypted content for ISMA 1.0, ISMA 2.0 and, more generally, any media that can be stored as an elementary stream in an ISO base media file format (ISO/IEC 14496-12). That specification was published on November 15, 2007. The ISMACryp specification defined extensions to the ISO base media file format, which were registered by the registration authority for code-points in "MP4 Family" files. The ISMACryp 2.0 specification, in an informative "Annex F", provides guidelines on how ISMACryp can be used together with the key and rights management system of OMA DRM v2 (Open Mobile Alliance DRM). The Packetized OMA DRM Content Format is largely based on the ISMACryp format. There are two alternatives to ISMACryp, SRTP and IPsec, that can also be used to provide service and content protection. The difference between the three is the level at which encryption is done: whereas ISMACryp encrypts MPEG-4 access units (which are in the RTP payload), SRTP encrypts the whole RTP payload, and IPsec encrypts packets at the network (IP) layer. References External links ISMA Technical Specifications ISMA Encryption and Authentication Version 1.1 DVB-H handheld video content protection with ISMA Encryption OpenIPMP - open source software for DRM including ISMACryp Cryptographic algorithms Television technology
31555280
https://en.wikipedia.org/wiki/Matthias%20Kalle%20Dalheimer
Matthias Kalle Dalheimer
Matthias Kalle Dalheimer (born May 27, 1970) is a published author and software consultant from Sweden. He ported the StarOffice office suite to Linux and was one of KDE's first contributors. In August 2002 he was elected president of KDE e.V. Career Dalheimer founded Klarälvdalens Datakonsult AB, the first technical solution provider of Qt Development Frameworks, where he is the current President & CEO and Head of Development. Books Matthias Kalle Dalheimer has written several books on the subject of Qt and Linux: "Running Linux, A Distribution-Neutral Guide for Servers and Desktops", by Matthias Kalle Dalheimer and Matt Welsh. Publisher: O'Reilly Media (print). Reviewed in Free Software Magazine, March 2, 2006. "Programming with Qt, Writing Portable GUI Applications on Unix and Win32", by Matthias Kalle Dalheimer. Publisher: O'Reilly Media (print). References External links http://www.wizards-of-os.org/archiv/sprecher/d_f/kalle_dalheimer.html Kalle Dalheimer's website KDE Living people 1970 births
227000
https://en.wikipedia.org/wiki/Leidos
Leidos
Leidos, formerly known as Science Applications International Corporation (SAIC), is an American defense, aviation, information technology, and biomedical research company headquartered in Reston, Virginia, that provides scientific, engineering, systems integration, and technical services. In August 2016, Leidos merged with Lockheed Martin's Information Systems & Global Solutions (IS&GS) business to create the defense industry's largest IT services provider. The Leidos-Lockheed Martin merger is one of the biggest transactions thus far in the consolidation of the defense sector. Leidos works extensively with the United States Department of Defense, the United States Department of Homeland Security, and the United States Intelligence Community, including the NSA, as well as other U.S. government civil agencies and selected commercial markets. History As SAIC The company was founded by J. Robert "Bob" Beyster in 1969 in the La Jolla neighborhood of San Diego, California, as Science Applications Incorporated (SAI). Beyster, a former scientist for the Westinghouse Atomic Power Division and Los Alamos National Laboratory who became chairman of the Accelerator Physics Department of General Atomics in 1957, raised the money to start SAI by selling stock he had received from General Atomics, combined with funds raised from the early employees who bought stock in the young enterprise. Initially the company's focus was on projects for the U.S. government related to nuclear power and weapons effects study programs. The company was renamed Science Applications International Corporation (SAIC) as it expanded its operations. Major projects during Beyster's tenure included work on radiation therapy for the Los Alamos National Laboratory; technical support and management assistance for the development of the cruise missile in the 1970s; the cleanups of the Three Mile Island Nuclear Generating Station after its major accident and of the contaminated community of Love Canal; design and performance evaluation of the Stars & Stripes 87, the winning ship of the 1987 America's Cup; and the design of the first luggage inspection machine to pass new Federal Aviation Administration tests following the terrorist bombing of Pan American Flight 103 over Lockerbie, Scotland. Contrary to traditional business models, Beyster originally designed SAIC as an employee-owned company. This shared ownership was accompanied by shared responsibility and freedom in business development, and allowed SAIC to attract and retain highly educated and motivated employees who helped the company grow and diversify. After Beyster's retirement in 2003, SAIC conducted an initial public offering of common stock on October 17, 2006. The offering of 86,250,000 shares of common stock was priced at $15.00 per share. The underwriters, Bear Stearns and Morgan Stanley, exercised overallotment options for an additional 11.25 million shares. The IPO raised US$1.245 billion. Even then, employee shares retained a privileged status, having ten times the voting power per share over common stock. In September 2009 SAIC relocated its corporate headquarters to its existing facilities in Tysons Corner in unincorporated Fairfax County, Virginia, near McLean. In 2012 SAIC was ordered to pay $550 million to the City of New York for overbilling the city over a period of seven years on the CityTime contract.
In 2014 Gerard Denault, SAIC's CityTime program manager, and his government contact were sentenced to 20 years in prison for fraud and bribery related to that contract. As Leidos In August 2012, SAIC announced its plans to split into two publicly traded companies. The company spun off about a third of its business, forming an approximately $4 billion-per-year service company focused on government services, including systems engineering, technical assistance, financial analysis, and program office support. The remaining part became a $7 billion-per-year IT company specializing in technology for the national security, health, and engineering sectors. The smaller company was led by Tony Moraco, who beforehand was leading SAIC's Intelligence, Surveillance and Reconnaissance group, and the bigger one was led by John P. Jumper. The split allowed both companies to pursue business that they could not have pursued as a single company because of conflicts of interest. In February 2013, it was announced that the smaller spin-off company would get the name "Science Applications International Corporation" and stay in the current headquarters, while the larger company would change its name to Leidos (created by clipping the word kaleidoscope) and would move its headquarters to Reston. The split was structured so that SAIC changed its name to Leidos and then spun off the new SAIC as a separate publicly traded company: on September 27, 2013, the renamed Leidos spun off a new and independent $4 billion government services and information technology company that retained the Science Applications International Corporation name. Leidos is thus the legal successor of the original SAIC and retains SAIC's pre-2013 stock price and corporate filing history. Before the split, Leidos employed 39,600 people and reported $11.17 billion in revenue and $525 million net income for its fiscal year ended January 31, 2013, making it number 240 on the Fortune 500 list. In 2014, Leidos reported US$5.06 billion in revenue. In August 2016, the deal to merge with the entirety of Lockheed Martin's Information Systems & Global Solutions (IS&GS) business came to a close, more than doubling the size of Leidos and its portfolio, and positioning the company as the global defense industry's largest enterprise in the federal technology sector. As of February 2019, the company has 32,000 employees. In 2018, Leidos reported US$10.19 billion in revenue. It ranked 311 on the 2019 Fortune 500 list. In January 2020, Leidos purchased defense contractor Dynetics for approximately $1.65 billion. In May 2020 it purchased the Security Detection and Automation Systems division of L3Harris (notable for providing the detection screeners that all airport travelers pass through when flying). Structure Leidos has four central divisions: Civil, Health, Advanced Solutions, and Defense & Intelligence. The Civil Division focuses on integrating aviation systems, securing transportation measures, modernizing IT infrastructure, and engineering energy efficiently. The Health Division focuses on optimizing medical enterprises, securing private medical data, and improving collection and data entry methods. The Advanced Solutions Division is centered around data analysis, integrating advanced defense and intelligence systems, and increasing surveillance and reconnaissance efficiency.
The Defense & Intelligence Division focuses on providing air service systems, geospatial analysis, cybersecurity, intelligence analysis, and supporting operations efforts. Management CEO: Roger Krone After more than 30 years of Beyster's leadership, Kenneth C. Dahlberg was named the CEO of SAIC in November 2003. In May 2005, the company changed its external tagline from An Employee-Owned Company to From Science to Solutions. The third CEO was Walt Havenstein, who pushed for tighter integration of the company's historically autonomous divisions, which led to lower profit and revenue. The strategy was reversed by the fourth CEO, retired Air Force general John P. Jumper, appointed in 2012. On July 1, 2014, Leidos announced that Roger Krone would become its CEO on July 14, 2014. As of 2019, Krone is the Chairman and CEO of Leidos. Headquarters In January 2018, Leidos announced it would move within Reston, VA, a quarter mile from 11951 Freedom Drive to 1750 Presidents Street. The new building was completed in early 2020. Operations The Defense Intelligence Agency (DIA) transitioned a Remote Viewing Program to SAIC in 1991, where it was renamed the Stargate Project. In March 2001, SAIC defined the concept for the NSA Trailblazer Project. In 2002, the NSA contracted SAIC for $280 million to produce a "technology demonstration platform" for the agency's project, a "Digital Network Intelligence" system to analyze data carried on computer networks. Other project participants included Boeing, Computer Sciences Corporation, and Booz Allen Hamilton. According to science news site PhysOrg.com, Trailblazer was a continuation of the earlier ThinThread program. In 2005, NSA director Michael Hayden told a Senate hearing that the Trailblazer program was several hundred million dollars over budget and years behind schedule. In fiscal year 2003, SAIC did more than $2.6 billion in business with the United States Department of Defense, making it the ninth-largest defense contractor in the United States. Other large contracts included a bid for information technology for the 2004 Olympics in Greece. From 2001 to 2005, SAIC was the primary contractor for the FBI's unsuccessful Virtual Case File project. During fiscal year 2012 (the latest figure available), SAIC had more than doubled its business with the DoD to $5,988,489,000, and was the 4th-largest defense contractor on the annual list of the top 100. Leidos ranked 292 on the 2017 Fortune 500 list. Subsidiaries Dynetics, Inc., a wholly owned subsidiary of Leidos since January 2020. Leidos Biomedical Research, Inc., formerly SAIC - Frederick, a wholly owned subsidiary of Leidos, manages the Frederick National Laboratory for Cancer Research. Gibbs & Cox, a wholly owned subsidiary of Leidos since May 7, 2021. MEDPROTECT, LLC, supports US government health-payer organizations Reveal, develops dual-energy X-ray computed tomography systems for explosives detection at airports and similar facilities CloudShield Technologies, a wholly owned subsidiary, specializing in cyber-security Varec, Inc., liquid petroleum asset management company Leidos Health Leidos Canada, formerly SAIC Canada, wholly owned subsidiary, works with the Canadian government. Leidos Australia (Leidos Pty Ltd), wholly owned subsidiary, specializing in document technologies and cyber-security. Produces TeraText software. Leidos UK (Leidos Innovations UK Ltd, Leidos Europe Ltd, Leidos Supply Ltd & Leidos Ltd), wholly owned subsidiary, specializing in managed IT services and the development of bespoke products.
Produces, supports & maintains the Chroma Airport Suite, and is also responsible for the MOD's supply chain. Leidos Engineering, LLC, formerly SAIC Energy, Environment & Infrastructure LLC, assembles the legacy engineering capabilities of Benham Investment Holdings, LLC, R. W. Beck Group, Inc., and Patrick Energy Services. QTC Management, Inc., acquired by merging with Lockheed Martin IS&GS. Systems Made Simple (SMS), acquired by merging with Lockheed Martin IS&GS. Former subsidiaries AMSEC LLC, a business partnership between SAIC and Northrop Grumman subsidiary Newport News Shipbuilding, divested on July 13, 2007. Network Solutions was acquired by SAIC in 1995, and subsequently was acquired by VeriSign, Inc. for $21 billion. Leidos Cyber, Inc., formerly Lockheed Martin Industrial Defender, acquired by merging with Lockheed Martin IS&GS, was sold to Capgemini in 2018. Controversies As SAIC Then-SAIC had, as part of its management and on its board of directors, many well-known ex-government personnel, including Melvin Laird, Secretary of Defense in the Nixon administration; William Perry, Secretary of Defense for Bill Clinton; John M. Deutch, Director of Central Intelligence under President Clinton; Admiral Bobby Ray Inman, who served in various capacities in the National Security Agency (NSA) and Central Intelligence Agency (CIA) for the Ford, Carter and Reagan administrations; and David Kay, who led the search for weapons of mass destruction after the 1991 Gulf War and served under the Bush administration after the 2003 invasion of Iraq. In 2012, 26 out of 35 SAIC Inc. lobbyists had previously held government jobs. In June 2001, the Federal Bureau of Investigation (FBI) paid SAIC $122 million to create a Virtual Case File (VCF) software system to speed up the sharing of information among agents. But the FBI abandoned VCF when it failed to function adequately. Robert Mueller, FBI Director, testified to a congressional committee, "When SAIC delivered the first product in December 2003 we immediately identified a number of deficiencies – 17 at the outset. That soon cascaded to 50 or more and ultimately to 400 problems with that software ... We were indeed disappointed." In 2005, then-SAIC executive vice president Arnold L. Punaro claimed that the company had "fully conformed to the contract we have and gave the taxpayers real value for their money." He blamed the FBI for the initial problems, saying the agency had a parade of program managers and demanded too many design changes. He stated that during the 15 months that SAIC worked on the program, 19 different government managers were involved and 36 contract modifications were ordered. "There were an average of 1.3 changes every day from the FBI, for a total of 399 changes during the period," Punaro said. During the 2011–2012 election cycle, then-SAIC was among the top eight contributors to federal candidates, parties, and outside groups, giving $1,209,611, according to information from the Federal Election Commission. The top candidate recipient was Barack Obama. As Leidos In a heavily redacted report dated January 3, 2018, the Inspector General for the Department of Defense determined that a supervisor at Leidos made "inappropriate sexual and racial comments to" a female contractor, and that when she complained of a hostile work environment, Leidos retaliated by excluding her from further work on an additional contract.
The report found that Leidos's claim that the contract employee “exhibited poor performance throughout her employment" lacked supporting evidence. It recommended that U.S. Secretary of Defense Jim Mattis “consider appropriate action against Leidos” such as “compensatory damages, including back pay, employee benefits and other terms and conditions of employment” that the contractor would have received under the additional contract. In 2018, Leidos donated to the Senate campaign of Cindy Hyde-Smith. However, after a video was released showing Hyde-Smith speaking fondly of participating in "public hangings", Leidos said the company would never have made the donation if it had known about the comment. During Hyde-Smith's 2020 re-election bid, Leidos again donated to her. See also Top 100 Contractors of the U.S. federal government References Further reading External links Defense companies of the United States Companies listed on the New York Stock Exchange Engineering companies of the United States Companies based in San Diego American companies established in 1969 Consulting firms established in 1969 Technology companies established in 1969 1969 establishments in California Companies based in Reston, Virginia Information technology consulting firms of the United States International information technology consulting firms 2006 initial public offerings
997497
https://en.wikipedia.org/wiki/Black%20hat%20%28computer%20security%29
Black hat (computer security)
A black hat hacker (or black-hat hacker) is a hacker who violates computer security for their own personal profit or out of malice. Origin The term's origin is often attributed to hacker culture theorist Richard Stallman (though he denies coining it) to contrast the exploitative hacker with the white hat hacker, who hacks protectively by drawing attention to vulnerabilities in computer systems that require repair. The black/white hat terminology originates in the Western genre of popular American culture, in which black and white hats denote villainous and heroic cowboys respectively, mirroring the contrast between good and evil. Black hat hackers are the stereotypical illegal hacking groups often portrayed in popular culture, and are "the epitome of all that the public fears in a computer criminal". Black hat hackers break into secure networks and systems with the motive of destroying, modifying, or stealing sensitive data, or of making the networks unusable for authorized users. Unlike white hat hackers, black hat hackers have no common, standardized code or internal regulation, although some forms of organization do exist: black hat call centers and malicious software resellers and vendors. See also Security hacker References Hacking (computer security) Cybercrime Hat, black
372478
https://en.wikipedia.org/wiki/Video%20game%20industry
Video game industry
The video game industry is the industry involved in the development, marketing, and monetization of video games. It encompasses dozens of job disciplines, and its component parts employ thousands of people worldwide. The video game industry has grown from focused markets to the mainstream in recent years. As of July 2018, video games generated sales of US$134.9 billion annually worldwide. In the US, the industry took in about US$9.5 billion in 2007, $11.7 billion in 2008, and $25.1 billion in 2010, according to the ESA annual report. Modern personal computers owe many advances and innovations to the game industry: sound cards, graphics cards and 3D graphic accelerators, faster CPUs, and dedicated co-processors like PhysX are a few of the more notable improvements. Sound cards, for example, were originally developed to add digital-quality sound to games and were only later improved for the music industry. Graphics cards were originally developed to provide more screen colors, and later to support graphical user interfaces (GUIs) and games, which drove the need for higher resolutions and 3D acceleration. Industry overview Size In 2017 in the United States, which represented about a third of the global video game market, the Entertainment Software Association estimated that there were over 2,300 development companies and over 525 publishing companies (including those involved in hardware and software manufacturing, service providers, and distributors). These companies in total have nearly 66,000 directly employed workers. When including indirect employment, such as a developer using the services of a graphics design package from a different firm, the total number of employees involved in the video game industry rises to over 220,000. Value chain Traditionally, the video game industry has had six connected layers in its value chain based on the retail distribution of games: Game development, representing programmers, designers and artists, as well as their leadership, with support of middleware and other development tools. Publishing, which typically includes both funding the development of a video game and providing the marketing and advertising for the game. Distribution, whether through retail or digital channels. Distribution typically includes manufacturing and duplication of game media and packaging for retail games. Retailer, the storefront where the game is sold. Customers and consumers, the purchasers and players of video games. Hardware/platform manufacturers, which can own and place limitations on content for the platform they have made, requiring developers or publishers to pay a license fee to publish games on that system. As games have transitioned from the retail to a more digital market, parts of this value chain have become redundant. For example, the distributor's function may be absorbed by the publisher or the retailer, or even, as in the case of indie games, by the developers themselves. Roles Ben Sawyer of Digitalmill observes that the development side of the industry is made up of six connected and distinctive layers: Capital and publishing layer: involved in paying for development of new titles and seeking returns through licensing of the titles. Product and talent layer: includes developers, designers and artists, who may be working under individual contracts or as part of in-house development teams.
Production and tools layer: generates content production tools, game development middleware, customizable game engines, and production management tools. Distribution layer: or the "publishing" industry, involved in generating and marketing catalogs of games for retail and online distribution. Hardware (or Virtual Machine or Software Platform) layer: or the providers of the underlying platform, which may be console-based, accessed through online media, or accessed through mobile devices such as smartphones. This layer now includes network infrastructure and non-hardware platforms such as virtual machines (e.g. Java or Flash), software platforms such as browsers, and even social platforms such as Facebook. End-users layer: or the users/players of the games. The game industry employs those experienced in other traditional businesses, but some have experience tailored to the game industry. Some of the disciplines specific to the game industry include: game programmer, game designer, level designer, game producer, game artist and game tester. Most of these professionals are employed by video game developers or video game publishers. However, many hobbyists also produce computer games and sell them commercially. Game developers and publishers sometimes employ those with extensive or long-term experience within the modding communities. History 1940s–1960s Prior to the 1970s, there was no significant commercial aspect of the video game industry, but many advances in computing would set the stage for the birth of the industry. Many early publicly available interactive computer-based game machines used or other mechanisms to mimic a display; while technically not "video games", they had elements of interactivity between the player and the machine. Some examples of these included the 1940 "Nimatron", an electromagnetic relay-based Nim-playing device designed by Edward Condon and built by Westinghouse Electric for the New York World's Fair; Bertie the Brain, an arcade game of tic-tac-toe built by Josef Kates for the 1950 Canadian National Exhibition; and Nimrod, created by engineering firm Ferranti for the 1951 Festival of Britain. The development of the cathode ray tube—the core technology behind televisions—enabled several of the first true video games. In 1947, Thomas T. Goldsmith Jr. and Estle Ray Mann filed a patent for a "cathode ray tube amusement device". Their game, which uses a cathode ray tube hooked to an oscilloscope display, challenges players to fire a gun at a target. Between the 1950s and 1960s, with mainframe computers becoming available on college campuses, students and others started to develop games that could be played at terminals that accessed the mainframe. One of the first known examples is Spacewar!, developed by Harvard and MIT employees Martin Graetz, Steve Russell, and Wayne Wiitanen. The introduction of easy-to-program languages like BASIC for mainframes allowed simpler games to be developed. The arcade video game industry grew out of the pre-existing arcade game industry, which was previously dominated by electro-mechanical games (EM games). Following the arrival of Sega's EM game Periscope (1966), the arcade industry was experiencing a "technological renaissance" driven by "audio-visual" EM novelty games, establishing the arcades as a healthy environment for the introduction of commercial video games in the early 1970s.
In the late 1960s, a college student named Nolan Bushnell had a part-time job at an arcade, where he became familiar with EM games such as Chicago Coin's racing game Speedway (1969), watching customers play and helping to maintain the machinery while learning how it worked and developing his understanding of how the game business operated. 1970s In 1971, the first commercial arcade video game, Computer Space, was released. The following year, Atari, Inc. released the first commercially successful video game, Pong, the original arcade version of which sold over 19,000 arcade cabinets. That same year saw the introduction of video games to the home market with the release of the early video game console, the Magnavox Odyssey. However, both the arcade and home markets would be dominated by Pong clones, which flooded the market and led to the video game crash of 1977. The crash eventually came to an end with the success of Taito's Space Invaders, released in 1978, sparking a renaissance for the video game industry and paving the way for the golden age of video arcade games. The game's success led arcade machines to become prevalent in mainstream locations such as shopping malls, traditional storefronts, restaurants and convenience stores during the golden age. Space Invaders would go on to sell over 360,000 arcade cabinets worldwide and, by 1982, generate a revenue of $2 billion in quarters, equivalent to $4.6 billion in 2011. Soon after, Space Invaders was licensed for the Atari VCS (later known as the Atari 2600), becoming the first "killer app" and quadrupling the console's sales. The success of the Atari 2600 in turn revived the home video game market during the second generation of consoles, up until the video game crash of 1983. By the end of the 1970s, the personal computer game industry began forming from a hobby culture. 1980s The early 1980s saw the golden age of video arcade games reach its zenith. The total sales of arcade video game machines in North America increased significantly during this period, from $50 million in 1978 to $900 million by 1981, with the arcade video game industry's revenue in North America tripling to $2.8 billion in 1980. By 1981, the arcade video game industry was generating an annual revenue of $5 billion in North America, equivalent to $12.3 billion in 2011. In 1982, the arcade video game industry reached its peak, generating $8 billion in quarters, equivalent to over $18.5 billion in 2011, surpassing the annual gross revenue of both pop music ($4 billion) and Hollywood films ($3 billion) combined at that time. This was also nearly twice as much revenue as the $3.8 billion generated by the home video game industry that same year; the arcade and home markets combined added up to a total revenue of $11.8 billion for the video game industry in 1982, equivalent to over $27.3 billion in 2011. The arcade video game industry would continue to generate an annual revenue of $5 billion in quarters through to 1985. The most successful game of this era was Namco's Pac-Man, released in 1980, which would go on to sell over 350,000 cabinets and, within a year, generate a revenue of more than $1 billion in quarters; in total, Pac-Man is estimated to have grossed over 10 billion quarters ($2.5 billion) during the 20th century, equivalent to over $3.4 billion in 2011. The early part of the decade saw the rise of 8-bit home computing and home-made games, especially in Europe (with the ZX Spectrum and Commodore 64) and Asia (with the NEC PC-88 and MSX).
This time also saw the rise of video game journalism, which later expanded to include covermounted cassettes and CDs. In 1983, the North American industry crashed due to the production of too many badly developed games (quantity over quality). The industry would eventually be revitalized by the release of the Nintendo Entertainment System, which resulted in the home console market being dominated by Japanese companies such as Nintendo, while a professional European video game industry also began taking shape with companies such as Ocean Software and Gremlin Interactive. The latter part of the decade saw the rise of the Game Boy handheld system. In 1987, Nintendo lost a legal challenge against Blockbuster Entertainment, which enabled game rentals in the same way as movies. 1990s The 1990s saw advancements in game related technology. Among the significant advancements were: The "3D Revolution", where 3D polygon graphics became the de facto standard for video game visual presentation, initially in the arcades during the early 1990s, and then on home systems with 3D consoles and PC graphics cards in the mid-1990s. The widespread adoption of CD-based storage and software distribution Continuing advancement of CPU speed and sophistication Widespread adoption of GUI-based operating systems, such as AmigaOS, Microsoft Windows and Mac OS Miniaturisation of hardware, with handheld game consoles and mobile phones, which enabled mobile gaming The emergence of the internet, which in the latter part of the decade enabled online co-operative play and competitive gaming Aside from technology, in the early part of the decade, licensed games became more popular, as did video game sequels. The arcades experienced a renaissance in the early 1990s following the release of Street Fighter II (1991), which led to a number of other popular fighting games such as Fatal Fury (1991) and Mortal Kombat (1992). The arcade resurgence was further driven by increasing realism, with the "3D Revolution" from 2D and pseudo-3D graphics to true real-time 3D polygon graphics, following the release of titles such as Virtua Racing (1992) and Virtua Fighter (1993). In the late 1990s, there was a transition away from arcades to home systems. Up until about 1996–1997, arcade video games represented the largest sector of the global video game industry, before arcades declined and the console market surpassed arcade video games for the first time around 1997–1998. Arcade systems such as the Sega Model 3 remained more technologically advanced than home systems in the late 1990s, but the gap between arcade and home systems began narrowing. The video game industry generated worldwide sales of $19.8 billion in 1993 (equivalent to $31 billion in 2011), $20.8 billion in 1994 (equivalent to $32 billion in 2011), and an estimated $30 billion in 1998 (equivalent to $41.5 billion in 2011). In the United States alone, in 1994, arcades were generating $7 billion in quarters (equivalent to $11 billion in 2011) while home console game sales were generating revenues of $6 billion (equivalent to $9 billion in 2011). Combined, this was nearly two and a half times the $5 billion revenue generated by movies in the United States at the time. 2000s In the 2000s, the video game industry became a juggernaut of development; profit still drove technological advancement, which was then used by other industry sectors.
Technologies such as smartphones, virtual reality and augmented reality became major drivers for game hardware and gameplay development. Though maturing, the video game industry was still very volatile, with third-party video game developers quickly cropping up and, just as quickly, going out of business. Nevertheless, many casual games and indie games were developed and became popular and successful, such as Braid and Limbo. Game development for mobile phones (such as iOS and Android devices) and social networking sites emerged. For example, Zynga, a Facebook game developer, raised in excess of $300 million. 2010s Though not the main driving force, indie games continued to have a significant impact on the industry, with sales of some titles such as Spelunky, Fez, Don't Starve, Castle Crashers, and Minecraft exceeding millions of dollars and over a million users. The 2010s saw a larger shift to casual and mobile gaming; in 2016, the mobile video game market was estimated to have taken $38 billion in revenue, compared to $6 billion for the console market and $33 billion for personal computer gaming. Games centered on virtual reality and augmented reality equipment also arose during this decade. As of 2014, newer game companies arose that vertically integrate live operations and publishing, such as crowdfunding and other direct-to-consumer efforts, rather than relying on traditional publishers, and some of these have grown to a substantial size. Spurred by some initial events in the late 2000s, eSports, centered on professional players in organized competitions and leagues for prize money, grew greatly over this decade, drawing hundreds of millions of viewers, reaching nearly $500 million in revenue by 2016, and expected to break $1 billion by 2019. 2020s While a new generation of home consoles, the Xbox Series X/S and PlayStation 5, was planned in 2020, the video game industry was affected by the COVID-19 pandemic, which had a worldwide impact starting in March 2020 due to forced stay-at-home orders by governmental regulations. While there were impacts on the video game industry similar to those on other industries, such as the cancellation of in-person trade shows, conventions and esports events, and the delay of many games into late 2020, 2021, or beyond, the industry was also one of the few to actually thrive, as people stuck at home used video games as a means to overcome social distancing. The market had a 20% year-to-year growth from 2019, reaching over in global revenue in both hardware and software for 2020. Simple-to-learn games with high social interaction found high popularity during the COVID-19 pandemic, including Animal Crossing: New Horizons, Fall Guys and Among Us. As the pandemic wore on from 2020 into 2021, the industry was impacted by a secondary effect of COVID-19: the impact of the global semiconductor chip shortage on hardware manufacturing. All three console vendors, Nintendo, Microsoft, and Sony, were affected by the availability of core components, and for the latter two, this made the launch of their new consoles difficult to manage, with only limited supplies available at launch. The chip supply shortage also affected personal computer gamers, coupled with demand for computer parts to be used in cryptocurrency mining, which artificially raised prices and made it difficult to purchase newer components. Economics Early on, development costs were minimal, and video games could be quite profitable.
Games developed by a single programmer, or by a small team of programmers and artists, could sell hundreds of thousands of copies each. Many of these games only took a few months to create, so developers could release multiple titles per year. Thus, publishers could often be generous with benefits, such as royalties on the games sold. Many early game publishers started from this economic climate, such as Origin Systems, Sierra Entertainment, Capcom, Activision and Electronic Arts. As computing and graphics power increased, so too did the size of development teams, as larger staffs were needed to address the ever-increasing technical and design complexities. The larger teams consist of programmers, artists, game designers, and producers. Their salaries can range anywhere from $50,000 to $120,000, generating large labor costs for firms producing video games, which can often take between one and three years to develop. Budgets now typically reach millions of dollars despite the growing popularity of middleware and pre-built game engines. In addition to growing development costs, marketing budgets have grown dramatically, sometimes amounting to two to three times the cost of development. The game development team has to select a profitable and suitable method to sell or earn money from the finished game. Traditionally, the monetization method has been to sell hard copies in retail stores. Now some developers are turning to alternative production and distribution methods, such as online distribution, to reduce costs and increase revenue. In the 2010s, the video game industry had a major impact on the economy through the sales of major systems and games such as Call of Duty: Black Ops, which took in over US$650 million in sales in the game's first five days, setting a five-day global record for a movie, book or video game. The game's income was more than that of the opening weekend of Spider-Man 3 and of the previous record holder for a video game, Halo 3. Many individuals have also benefited from the economic success of video games, including Hiroshi Yamauchi, the former chairman of Nintendo and Japan's third richest man. By 2014, the global video game market was valued at over $93 billion. The industry-wide adoption of high-definition graphics during the seventh generation of consoles greatly increased development teams' sizes and reduced the number of high-budget, high-quality titles under development. In 2013, Richard Hilleman of Electronic Arts estimated that only 25 developers were working on such titles for the eighth console generation, compared to 125 at the same point in the seventh-generation console cycle seven or eight years earlier. By 2018, the United States video game industry had matched that of the United States film industry on the basis of revenue, with both industries having made around that year. Retail The games industry's shift from brick and mortar retail to digital downloads led to a severe sales decline at video game retailers such as GameStop, following other media retailers superseded by Internet delivery, such as Blockbuster, Tower Records, and Virgin Megastores. GameStop diversified its services by purchasing chains that repair wireless devices and expanding its trade-in program, through which customers trade used games for credit towards new games. The company also began to produce its own merchandise and games. In Britain, the games retailer Game revamped its stores so customers would spend time playing games there, and built a gaming arena for events and tournaments.
The shift to digital marketplaces, especially for smartphones, led to an influx of inexpensive and disposable titles, as well as lower engagement among gamers who otherwise purchased new games at retail. Customers also shifted away from the tradition of buying games on their first day of release. Publishers often funded trade-in deals to encourage consumers to purchase new games. Trade-in customers at the Australian retailer Game would purchase twice as many games per year as non-trade-in customers. The sale of pre-owned games kept retailers in business and composed about a third of Game's revenue. Retailers also saved on the UK's value-added tax, which only taxed the retailer's profit on pre-owned games, rather than the full sale on regular games. The former trade-in retail executives behind the trade-in price comparison site Trade In Detectives estimated that the United Kingdom's trade-in industry was about a third of the size of its new games business. They figured that sites such as eBay, which convert used games into cash, compose about a quarter of the UK's trade-in market, but do not keep the credit within the industry. While consumers might appear to receive better offers on these sites, the sites also take about 15 percent of the selling price in fees. Alternatively, some retailers will match the trade-in values offered by their competitors. Microsoft's original plan for the Xbox One attempted to translate trade-in deals to the digital marketplace, with a database of product licenses that shops would be able to resell with publisher permission, though the plan was poorly received and ultimately dropped. Practices Video game industry practices are similar to those of other entertainment industries (e.g., the music recording industry), but the video game industry in particular has been accused of treating its development talent poorly. This promotes independent development, as developers leave to form new companies and projects. In some notable cases, these new companies grow large and impersonal, having adopted the business practices of their forebears, and ultimately perpetuate the cycle. However, unlike the music industry, where modern technology has allowed a fully professional product to be created extremely inexpensively by an independent musician, modern games require increasing amounts of manpower and equipment. This dynamic makes publishers, who fund the developers, much more important than in the music industry. Breakaways In the video game industry, it is common for developers to leave their current studio and start their own. A particularly famous case is the "original" independent developer Activision, founded by former Atari developers. Activision grew to become the world's second-largest game publisher. In the meantime, many of the original developers left to work on other projects. For example, founder Alan Miller left Activision to start another video game development company, Accolade (now Atari née Infogrames). Activision was popular among developers for giving them credit in the packaging and title screens for their games, while Atari disallowed this practice. As the video game industry took off in the mid-1980s, many developers faced the more distressing problem of working with fly-by-night or unscrupulous publishers that would either fold unexpectedly or run off with the game profits. Piracy The industry claims software piracy to be a big problem and takes measures to counter it.
Digital rights management has proved to be the most unpopular anti-piracy measure with gamers. The most popular and effective strategy to counter piracy is to change the business model to freemium, where gamers pay for their in-game needs or services. Strong server-side security is required for this, to properly distinguish authentic transactions from hacked transactions. Creative control On various Internet forums, some gamers have expressed disapproval of publishers having creative control, since publishers are more apt to follow short-term market trends than to invest in risky but potentially lucrative ideas. On the other hand, publishers may know better than developers what consumers want. The relationship between video game developers and publishers parallels the relationship between recording artists and record labels in many ways. But unlike the music industry, which saw flat or declining sales in the early 2000s, the video game industry continues to grow. In the computer games industry, it is easier to create a startup, resulting in many successful companies. The console games industry is a more closed one, and a game developer must have up to three licenses from the console manufacturer: A license to develop games for the console The publisher must have a license to publish games for the console A separate license for each game In addition, the developer must usually buy development systems from the console manufacturer in order to even develop a game for consideration, as well as obtain concept approval for the game from the console manufacturer. Therefore, the developer normally has to have a publishing deal in place before starting development on a game project, but in order to secure a publishing deal, the developer must have a track record of console development, something which few startups will have. Alternatives An alternative method for publishing video games is to self-publish using the shareware or open source model over the Internet. Gaming conventions Gaming conventions are an important showcase of the industry. The major annual video game conventions include Gamescom in Cologne (Germany), E3 in Los Angeles (USA), the Penny Arcade Expo, and others. Regional distribution As with other forms of media, video games have often been released in different world regions at different times. The practice has been used where localization is not done in parallel with the rest of development or where the game must be encoded differently, as in PAL vs. NTSC. It has also been used to provide price discrimination in different markets or to focus limited marketing resources. Developers may also stagger digital releases so as not to overwhelm the servers hosting the game. International practices The video game industry had its primary roots in the United States following the introduction of arcade games and console systems, with Japan soon following. With the introduction of the personal computer, Western Europe also became a major center for video game development. Since then, the industry has primarily been led by companies in North America, Europe, and Japan, but other regions, including Australia/New Zealand and East Asian countries such as China and South Korea, have become significant sectors for the industry. World trends International video game revenue was estimated at $81.5 billion in 2014, more than double the revenue of the international film industry in 2013. In 2015, it was estimated at .
The largest nations by estimated video game revenues in 2016 were China ($24.4B), the United States ($23.5B) and Japan ($12.4B). The largest regions in 2015 were Asia-Pacific ($43.1B), North America ($23.8B), and Western Europe ($15.6B). In 2018, the global video games market was valued at around $134.9bn. Largest markets According to market research firm Newzoo, the following countries are the largest video game markets by annual revenue, . In general, spending on gaming tends to increase with increases in nominal GDP. However, gaming is relatively more popular in East Asia, and relatively less popular in India. North America Canada Canada has the third largest video game industry in terms of employment numbers. The video game industry has also been booming in Montreal since 1997, coinciding with the opening of Ubisoft Montreal. Recently, the city has attracted world-leading game developer and publisher studios such as Ubisoft, EA, Eidos Interactive, Artificial Mind and Movement, BioWare, Warner Bros. Interactive Entertainment and Strategy First, mainly because video game jobs have been heavily subsidized by the provincial government. Every year, this industry generates billions of dollars and thousands of jobs in the Montreal area. Vancouver has also developed a particularly large cluster of video game developers, the largest of which, Electronic Arts, employs over two thousand people. The Assassin's Creed and Tom Clancy series have both been produced in Canada and have achieved worldwide success. For consumers, the largest video games convention in Canada is the Enthusiast Gaming Live Expo (EGLX). United States The video game industry got its start in the United States in the early 1970s with the creation of arcade games like Pong and the first home console, the Magnavox Odyssey. Several factors, including loss of publishing control, a flooded market, and competition from personal computers, led to the 1983 video game crash in the U.S., affecting both arcades and home game systems. Nintendo's introduction of the Nintendo Entertainment System helped to revitalize the industry, but until Microsoft's introduction of the Xbox in the early 2000s, the hardware side was dominated by mostly Japanese-developed systems. Instead, much of the industry's growth in the U.S. was in game development, implementing new game technologies and gameplay concepts, as well as creating the large-scale publisher model used by companies like Electronic Arts to support marketing and distribution of games. The United States has the largest video games presence in the world in terms of total industry employees. In 2017, the U.S. game industry as a whole was worth US$18.4 billion and consisted of roughly 2457 companies that together employed roughly 220,000 people. U.S. video game revenue is forecast to reach $230 billion by 2022, making it the largest video game market in the world. Over 150 million Americans play video games, with an average age of 35 and a gender breakdown of 59 percent male and 41 percent female. American gamers are more likely to vote than non-gamers, feel that the economy is the most important political issue, and lean conservative; however, party affiliation is split fairly evenly, with 38% identifying as Democrats, 38% as Republicans, and 24% as Independents. Europe Germany Germany has the largest video games market in Europe, with revenues of $4.1 billion forecast for 2017.
The annual Gamescom in Cologne is Europe's largest video game expo. One of the earliest internationally successful video game companies was Gütersloh-based Rainbow Arts (founded in 1984), which was responsible for publishing the popular Turrican series of games. The Anno and The Settlers series have been globally popular strategy game franchises since the 1990s. The Gothic series, SpellForce and Risen are established RPG franchises. The X series by Egosoft is a best-selling space simulation franchise. The FIFA Manager series was also developed in Germany. The German action game Spec Ops: The Line (2012) was commercially successful and received largely positive reviews. One of the most famed titles to come out of Germany is Far Cry (2004) by Frankfurt-based Crytek, which later produced the top-selling Crysis and its sequels. Other well-known current and former developers from Germany include Ascaron, Blue Byte, Deck13, Phenomic, Piranha Bytes, Radon, Related, Spellbound and Yager Development. Publishers include Deep Silver (Koch Media), dtp entertainment, Kalypso and Nintendo Europe. Bigpoint Games, Gameforge, Goodgame Studios and Wooga are among the world's leading browser game and social network game developers/distributors. United Kingdom The UK industry is the third largest in the world in terms of developer success and sales of hardware and software, but fourth behind Canada in terms of people employed. The size of the UK game industry is comparable to that of its film or music industries. Like most European countries, the UK entered the video game industry through personal computers rather than video game consoles. Low-cost computers like the ZX Spectrum and Amiga 500 led to numerous "bedroom coders" who would make and sell games through mail order or to distributors that helped to mass-produce them. Coupled with quirky British humour, the "Britsoft" wave of popular titles led to a number of influential people and studios in the 1990s. As game programming became more complex and costly in the early 2000s, more traditional studio structures arose to support both personal computers and consoles, with several studios that, in some form or another, remain highly regarded and influential in the present. In recent years some of these studios have become defunct or been purchased by larger companies, such as LittleBigPlanet developer Media Molecule and Codemasters. The country is home to some of the world's most successful video game franchises, such as Tomb Raider, Grand Theft Auto, Fable, Colin McRae Dirt and Total War. The country went without tax relief for game development until March 21, 2012, when the British government introduced tax relief for UK developers; without it, there were fears that much of the UK's development talent would move overseas for better returns, along with the parent companies of certain video game developers that pay for having games developed in the UK. The industry trade body TIGA estimates that the relief will increase the games development sector's contribution to UK GDP by £283 million, generate £172 million in new and protected tax receipts to HM Treasury, and could cost just £96 million over five years. Before the tax relief was introduced there was a fear that the UK games industry could fall behind other leading game industries around the world, such as those of France and Canada; Canada overtook the UK in terms of industry job numbers in 2010.
Asia China China was initially not a major factor in the global video game market, due to economic factors, governmental oversight, and a black market for foreign products. The government initiated a ban on video game consoles in 2000 that lasted through 2014, during which China's video game market grew around personal computer games, particularly subscription-based and microtransaction-based ones that were amenable to use in PC cafes, and later mobile games. Media publishers like Tencent and NetEase focused on these types of games, growing successfully during the 2010s to become leading international companies. As of 2015, China's video game market revenue exceeds that of the United States, and it is the largest country by both revenue and number of players. China is also the largest contributor to esports in both revenue and the number of professional players from the country. The industry, like most media in China, is tightly controlled by the government, with strong restrictions on what content may be in games, and incorporation of anti-addiction measures to limit playtime. It is home to Asia Game Show, the largest game convention in the world by attendance. Japan The Japanese video game industry is markedly different from the industry in North America, Europe and Australia. Japan initially trailed the United States in entering the video game sector as its companies followed trends set by their American partners, but started to pioneer their own ideas soon after. Several Japanese-developed arcade games, such as Space Invaders, helped to usher in the golden age of arcade video games from 1978 to 1982. The 1983 video game crash that affected the North American market had only small, short-term effects in Japan, as most companies involved in the business were well established and could weather the disruption. Nintendo took the opportunity to push the Nintendo Entertainment System, a rebranding of its Famicom system, into Western markets after the crash, implementing technical and business practices to avoid the factors that created the 1983 crash while also securing its control over what games were published for the system. Japan became the dominant home for consoles and console games through the early 2000s, challenged only by the incorporation of large publishers in the West and the Xbox line of consoles from Microsoft. Nintendo, along with companies like Sega, Sony Interactive Entertainment, and Capcom, is a dominant leader in the Japanese video game industry. Nintendo itself is recognized for having created some of the best-selling and positively reviewed video game series, such as Mario, Donkey Kong, The Legend of Zelda, Metroid and Pokémon. In recent years, consoles and arcade games have both been overtaken by downloadable free-to-play games on the PC and mobile platforms. South Korea The video game industry in South Korea generally followed the same early trends as the Japanese market, but players started focusing on massively multiplayer online games (MMOs) and other games that could be played at PC bangs (Internet cafes). South Korea was one of the first major regions involved in esports in the 1990s and 2000s, and today a large number of professional esports players originate from South Korea. India Video gaming in India is an emerging market; strong growth in online gaming has made India one of the top gaming markets in the world.
Over the past few decades, the Indian gaming industry has gone from close to nonexistent in the 1990s to one of the top markets globally in the late 2010s. In 2019, the online gaming market in India was estimated at with an estimated 300 million gamers, a 41.6% increase from 2018. As of 2021, it is one of the top five mobile gaming markets in the world. The industry is projected to reach 510 million gamers by 2022. Others Africa The video game industry is still in its infancy throughout the African continent, but due to the continent's young population and increasing technological literacy, the sector is growing rapidly. African countries such as South Africa, Nigeria, and Kenya have been making rapid advances in mobile game development, both within their countries and internationally, but due to limited funding and a market overcrowded with Western games, success has thus far been minimal. Australia and New Zealand Australia and New Zealand have an active video game industry, with several standalone developers as well as additional studios from other major developers across the globe. Conventions, trade shows, and conferences Gaming conventions are an important showcase of the industry. These typically provide the means for developers and publishers to demonstrate their games directly to video game players and consumers and obtain feedback. New games are frequently introduced during these events. Examples of such conventions include the annual Gamescom in Cologne and the numerous PAX events. Some publishers, developers and technology producers also have their own regular conventions, with BlizzCon, QuakeCon, Nvision and the X shows being prominent examples. National trade groups that support their local video game industry often hold trade shows aimed at developers and publishers, allowing them to interact more directly with the video game media, and with retailers and distributors, to plan future sales of products. The largest such trade show is E3 in Los Angeles, California, held by the Entertainment Software Association. Other similar trade shows include Tokyo Game Show (Japan), Brasil Game Show (Brazil), EB Games Expo (Australia), KRI (Russia), ChinaJoy (China) and the annual Game Developers Conference. The development of video games is also a topic of academic and professional interest, leading to a number of conferences for developers to share their knowledge with others. Two of the major professional conferences are the Game Developers Conference (GDC), which holds multiple events through the year but with its main annual conference held in March in San Francisco, and the D.I.C.E. Summit, run by the Academy of Interactive Arts & Sciences in February of each year in Las Vegas, Nevada. Media coverage and archiving Coverage of the video game industry started with several magazines covering the topic, but as the Internet became widely available to support new media, much of the dedicated coverage of the industry has transitioned to dedicated websites, including Gamasutra, IGN, Eurogamer, Polygon and GameSpot. More recently, social media influencers, video game players who create online videos or stream themselves playing games through services like Twitch, have also become a significant source for coverage of video game news from the consumer point of view.
Another facet of tracking the history of the video game industry is video game preservation, a process that is complicated due to game hardware technology that can become obsolete, dependencies on decommissioned online servers, and issues over intellectual property that legally restrict preservation efforts. Much of the industry's history prior to the 1983 crash has been lost, as companies affected by the crash simply threw material away, leaving little to recover today. Awareness of video game preservation has improved in the 21st century, and several groups and museums have been established to collect and preserve hardware and software for the industry. Recognition within the industry The video game industry has a number of annual award ceremonies, commonly associated with the above conventions, trade shows, and conferences, as well as standalone award shows. Many of the dedicated video game journalism websites also have their own sets of awards. Most commonly, these ceremonies are capped by the top prize, the "Game of the Year". Trends Players become fourth-party developers, allowing for more open source models of game design, development and engineering. Players also create modifications (mods), which in some cases become just as popular as the original game for which they were created. An example of this is the game Counter-Strike, which began as a mod of the video game Half-Life and eventually became a very successful, published game in its own right. While this "community of modifiers" may only add up to approximately 1% of a particular game's user base, the number of those involved will grow as more games offer modifying opportunities (such as by releasing source code) and the video game user base swells. According to Ben Sawyer, as many as 600,000 established online game community developers existed as of 2012. This effectively added a new component to the game industry value chain, and if it continues to mature, it will become integrated into the overall industry. The industry has seen a shift towards games with multiplayer facilities. A larger percentage of games on all types of platforms include some type of competitive online multiplayer capability. In addition, the industry is experiencing further significant change driven by convergence, with technology and player comfort being the two primary reasons for this wave of industry convergence. Video games and related content can now be accessed and played on a variety of media, including: cable television, dedicated consoles, handheld devices and smartphones, through social networking sites or through an ISP, through a game developer's website, and online through a game console and/or home or office personal computer. In fact, 12% of U.S. households already make regular use of game consoles for accessing video content provided by online services such as Hulu and Netflix. In 2012, for the first time, entertainment usage passed multiplayer game usage on Xbox, meaning that users spent more time with online video and music services and applications than playing multiplayer games. This rapid industry convergence has caused the distinction between video game consoles and personal computers to disappear. A game console with high-speed microprocessors attached to a television set is, for all intents and purposes, a computer and monitor. As this distinction has diminished, players' willingness to play and access content on different platforms has increased.
The growing video gamer demographic accounts for this trend, as former president of the Entertainment Software Association Douglas Lowenstein explained at the 10th E3 expo, "Looking ahead, a child born in 1995, E3's inaugural year, will be 19 years old in 2014. And according to Census Bureau data, by the year 2020, there will be 174 million Americans between the ages of 5 and 44. That's 174 million Americans who will have grown up with PlayStations, Xboxes, and GameCubes from their early childhood and teenage years...What this means is that the average gamer will be both older and, given their lifetime familiarity with playing interactive games, more sophisticated and discriminating about the games they play." Evidence of the increasing player willingness to play video games across a variety of media and different platforms can be seen in the rise of casual gaming on smartphones, tablets, and social networking sites as 92% of all smartphone and tablet owners play games at least once a week, 45% play daily, and industry estimates predict that, by 2016, one-third of all global mobile video game revenue will come from tablets alone. Apple's App Store alone has more than 90,000 game apps, a growth of 1,400% since it went online. In addition, game revenues for iOS and Android mobile devices now exceed those of both Nintendo and Sony handheld video game systems combined. See also List of video games List of video game websites Hollywood and the video game industry References Further reading External links comp.games.development.industry (Google Groups) International Game Developers Association Playing the Game: The Economics of the Computer Game Industry (Cambridge University Press) Sloperama: Game Biz Advice (Tom Sloper) Economics of the arts and literature Entertainment industry Mass media industry Industries (economics)
17112319
https://en.wikipedia.org/wiki/Hyland%20Software
Hyland Software
Hyland Software is the developer of the enterprise content management (ECM) and process management software suite called OnBase. Applications of the suite are used in healthcare, financial institutions, insurance, government, higher education and manufacturing. The firm has its headquarters in Westlake, Ohio, and offices in Lincoln, Nebraska; Irvine, California; Charlotte, North Carolina; São Paulo, Brazil; London, England; Tokyo, Japan; Andover, Massachusetts; Melbourne, Australia; Kolkata, India; Sydney, Australia; Berlin, Germany; Olathe, Kansas; Bloomington, Minnesota; Salt Lake City, Utah; Phoenix, Arizona; and Tampa, Florida. Corporate History Founding The company was founded in 1991 by Packy Hyland Jr. He met with members of the Necedah Bank to discuss its data processing and how electronic information technology could reduce printing costs by storing daily reports directly to optical disk. Packy Hyland created the first version of OnBase for the Necedah Bank, which became Hyland Software's first customer. Because OnBase was originally created for a bank, a majority of Hyland Software's customers were in the banking industry until recently, when healthcare providers began to find value in ECM technologies. Acquisition History
September 1, 2006: Matrix Imaging, a private enterprise content management company in Bloomfield Hills, Michigan, specializing in the higher education sector
July 1, 2008: Liberty Information Management Systems (IMS), a private Costa Mesa, California-based enterprise content management company. Hyland acquired the company to gain a larger customer and partner base. The California office has been maintained, with a move from Costa Mesa to Irvine.
July 1, 2009: Valco Data Systems, a private, Salem, New Hampshire-based healthcare software and software integration company. Valco's software was noted for being strongly integrated with software from MEDITECH.
March 1, 2010: eWebHealth, a private, Reading, Massachusetts-based provider of hosted medical records workflow. Their specific expertise is in the areas of coding and revenue cycle workflow.
September 1, 2010: Hershey Systems, a private, Santa Fe Springs, California-based maker of Singularity, a document management system marketed to higher education institutions
September 24, 2010: Computer Systems Company, Inc. (dba The CSC Group), a private, Strongsville, Ohio-based provider of healthcare software and document conversion services
August 29, 2012: SIRE Technologies, Inc., a private, Salt Lake City, Utah-based software developer focused on software for county and local governments
December 2012: Enterprise Consulting Partners (ECP), a private, Reston, VA-based software company focused on workflow automation and document management solutions that support business processes such as invoice automation, billing, contract management, purchase requisitioning and human resources.
February 28, 2013: AnyDoc Software, a Tampa, Florida-based software developer focused on automated document, data capture and classification
June 1, 2014: CAYLX, a private, Belrose, Australia-based software company
October 13, 2015: LawLogix, a Phoenix, Arizona-based software company specializing in cloud-based immigration and compliance software
May 2016: AcroSoft ECM, a private, Indianapolis, IN-based software company
July 7, 2017: the Perceptive business unit from Lexmark International, Inc., including the products Perceptive Content (formerly ImageNow), Perceptive Capture (formerly Brainware), Acuo VNA, PACSGEAR, Claron, Nolij, Saperion, Pallas Athena, ISYS and Twistage
October 22, 2020: Hyland announced the acquisition of Alfresco Software
April 8, 2021: Hyland announced the acquisition of Nuxeo
Major Industries
- Healthcare
- Government
- Higher Education
- Commercial
- Financial Services
- Insurance
- Manufacturing
Key people Packy Hyland, Jr. is the founder of Hyland Software and developed the first version of OnBase for The Necedah Bank in Wisconsin in 1991. He served as CEO and President until 2001, when he was succeeded by his brother, A.J. Hyland, who retired in 2013 and was in turn succeeded by Bill Priemer, formerly the firm's Chief Operating Officer. Miguel Zubizarreta joined the company in 1992 and served as CTO; he was responsible for the architecture and product development direction of the OnBase product line until his retirement in 2016. Chris Hyland joined the company in 1992, served as CFO, and was set to retire in 2020. Product Hyland Software's OnBase product integrates document management, business process automation and records management. Industry analysts such as Forrester Research cite the product's foundational ECM functionality, like imaging and archiving capabilities, as its strengths. The OnBase product also offers integrations with Microsoft, SAP, Oracle Corporation and Lawson to gain more value from existing technologies. OnBase is written in .NET and JavaScript. OnBase was named 2015 Best in KLAS for Document Management and Imaging. Services The OnBase Cloud Hyland offers a Software-as-a-Service (SaaS) application of OnBase software known as the OnBase Cloud. This service is a cloud-based version of Hyland's traditional OnBase product offering; applications are hosted at a data center and accessed over a secure Internet connection. Alfresco Cloud Hyland also offers a Platform as a Service (PaaS) cloud-hosted platform of Alfresco Content Services, with customisation via Alfresco Module Packages (AMPs) and Alfresco Developer Framework (ADF) applications. Recognition In 2014, 2015, and 2016 the company was ranked on Fortune's 100 Best Companies to Work For list, rising to position 48 in 2016. References External links Hyland, creator of OnBase official website ECM 101 Research Guide Gartner Magic Quadrant Update, Document Imaging Talk, November 22, 2010 Content management systems Document management systems Software companies based in Ohio Companies based in Cleveland Software companies established in 1991 1991 establishments in Ohio Records management technology Software companies of the United States 2007 mergers and acquisitions
32272986
https://en.wikipedia.org/wiki/TeaMp0isoN
TeaMp0isoN
TeaMp0isoN was a computer security research group consisting of 3 to 5 core members. The group gained notoriety in 2011/2012 for its blackhat hacking activities, which included attacks on the United Nations, NASA, NATO, Facebook, Minecraft Pocket Edition Forums, and several other large corporations and government entities. TeaMp0isoN disbanded in 2012 following the arrests of some of its core members, "TriCk" and "MLT". English Defence League TeaMp0isoN released several documents pertaining to the English Defence League (EDL), leaking information which included personal details of several high-ranking EDL members. In addition, TeaMp0isoN went on to deface EDL's official website. Facebook In January 2011, unauthorized status updates were posted on Mark Zuckerberg's and French President Nicolas Sarkozy's accounts on the social-networking site Facebook. On 25 January, a spokesperson for Facebook acknowledged the bug in their system and said it had been fixed. Later that week The Daily Beast reported that "TriCk", a member of TeaMp0isoN, along with members of a group known as "ZHC", said they had exploited a bug in the web site on the previous New Year's Eve, allowing them to post unauthorized status updates and to temporarily block newsfeeds to a list of 130 pages. A spokeswoman for one of the targeted groups, the English Defence League, confirmed that they were targeted and their pages critical of Islam were indeed hacked. Members of Facebook's security team said that, after being contacted on the matter by The Daily Beast, they had found no evidence of malicious activity in their logs. Tony Blair address book leak In June 2011, the group published what appeared to be the address book and other private data of former British Prime Minister Tony Blair on Pastebin. According to TeaMp0isoN, the data was obtained originally in December 2010. Blair's spokesman said the data was not obtained from Blair directly, but from the personal email account of his former staff. TeaMp0isoN responded to this, commenting "Blairs sheep are lying about how we got the info, we got into the webmail server via a private exploit & we wiped the logs so Good luck". BlackBerry During the 2011 England riots it was believed that the BlackBerry Messenger service was used by looters for collaboration. TeaMp0isoN defaced the official BlackBerry blog in response to Research In Motion (RIM), the maker of the BlackBerry, promising to co-operate with the United Kingdom police and government. TeaMp0isoN released a statement saying, "We are all for the rioters that are engaging in attacks on the police and government." Government leaks In July 2011, TeaMp0isoN released eight court cases against Sarah Palin, claiming they had intentions to do the same with Barack Obama. On 8 August 2011, TeaMp0isoN released the hashed administrator passwords for a website hosted under NASA's domain, after exploiting a publicly known vulnerability. In November 2011, TeaMp0isoN released a list of email addresses and passwords that were reportedly obtained via an SQL injection vulnerability in the United Kingdom's Ministry of Defence. The Ministry of Defence is responsible for controlling Britain's defence policies and is also the headquarters of the British Armed Forces. In December 2011, TeaMp0isoN leaked the account data of 13 million South Korean online game subscribers. In April 2012, TeaMp0isoN targeted MI6 (the UK's Secret Intelligence Service).
The group created a script that allowed them to repeatedly flood the anti-terrorism hotline with computer-generated calls, before calling up the hotline themselves in order to mock officers. The officers then warned them that they would be traced and reported to the FBI. TeaMp0isoN then reportedly wiretapped the MI6 agents, recording a conversation between officers and posting the leaked conversation on YouTube. On 3 April 2012, TeaMp0isoN gained access to a NATO web server, before leaking data obtained from the server and defacing the index page of the site. Operation Censor This TeaMp0isoN joined forces with the hacker collective Anonymous to announce OpCensorThis, an operation intended to protest against censorship. The operation received considerable media attention, and music artists such as Lyricist Jinn and Tabanacle created a music video in order to raise awareness of the operation. TeaMp0isoN then went on to deface several sites in support of OpCensorThis, the most significant being the United Nations Development Programme and the British tabloid newspaper, the Daily Mail. Operation Robin Hood In response to the Occupy Movement, an online announcement claimed that TeaMp0isoN had joined Anonymous to launch Operation Robin Hood, intending to hack into websites, obtain credit cards and make donations to activist organizations while the banks would have to refund the hacked accounts. The video stated: "Operation Robin Hood will take credit cards and donate to the 99% as well as various charities around the globe. The banks will be forced to reimburse the people their money back", while encouraging people to "move your accounts into secure credit unions". As part of Operation Robin Hood, TeaMp0isoN leaked over 26,000 Israeli credit card details, obtained via vulnerabilities in the Israeli banks One and CityNet. TeaMp0isoN went on to publish the credit card details and passport scans of well-known rapper Sean Combs (also known as P-Diddy). TeaMp0isoN then used his credit card to donate money to charity and to order pizzas for those who requested them via Twitter. P-Diddy launched an internal investigation to attempt to track down TeaMp0isoN, reportedly hiring a team of private detectives. Operation Retaliation Following the arrest of founding TeaMp0isoN member "TriCk", the group announced Operation Retaliation, which began with reported DDoS attacks against MI6, before attacks took place against, among others, the Japanese electronics multinational Panasonic, the Australian Government, and the World Health Organization. In addition, Consternation Security and Doxbin were also reported to have been hacked. United Nations In November 2011, TeaMp0isoN released more than 128 usernames and login details, which they said were obtained from the United Nations Development Programme. According to a spokeswoman for the UNDP, the data was extracted from "an old server which contains old data". TeaMp0isoN disputed this statement, releasing server logs and other evidence to suggest that the server was still in fact actively being used by the United Nations. In April 2012, TeaMp0isoN hacked the United Nations again, this time targeting the UN's World Health Organisation and leaking a list of usernames and hashed passwords, including administrator credentials. Possible arrests On 10 April 2012, the group created a script to call the British Anti-Terrorism Hotline with hoax calls continuously for a 24-hour period to protest the extradition of terrorist suspects to the United States.
On 12 April, police arrested two teenagers, aged 16 and 17, over the incident, under suspicion of violating the Malicious Communications Act 1988 and the Computer Misuse Act. On 9 May 2012, alleged TeaMp0isoN member and spokesperson "MLT" was arrested by officers from Scotland Yard on suspicion of offences under the Computer Misuse Act, relating to the attacks on the Anti-Terrorist Hotline and other offences. Activities in 2015 In 2015, TeaMp0isoN returned and no longer appeared to be committing any illegal activities. Posting from their official Twitter account, they identified and disclosed vulnerabilities in Google, Amazon, eBay, Harvard University, NOAA, Comcast, Time Warner Cable, Western Union, the United Nations, the London Stock Exchange, Autodesk and several other large systems. TeaMp0isoN also released several zero-day exploits, including one that affected the memorial sites of Malcolm X and Marilyn Monroe, and one that affected a commonly used WordPress plugin used by a large number of websites. In addition to this, their website and forums returned alongside their newly launched IRC network, and it appeared they also had plans for a wargaming website allowing penetration testers to hone their skills within a legal and ethical environment. In April 2015, TeaMp0isoN identified and disclosed vulnerabilities in many major universities including Harvard University, Stanford University, Princeton University, the University of Texas, and the University of California, among others. The majority of the vulnerabilities found were via SQL injection flaws. Also at this time, TeaMp0isoN identified a zero-day SQL injection vulnerability, resulting in many sites being compromised, including Crime Stoppers sites in Waterloo, Ontario, Peel and other Canadian cities and districts. In May 2015, TeaMp0isoN member "KMS" targeted the Minecraft Pocket Edition Forum, seemingly infiltrating its database and leaking a list of over 16,000 usernames and passwords. Activities in 2016 Activities in 2016 indicated that the group had returned as a mix between a black hat and a white hat group. They disclosed vulnerabilities in the United States Department of Education, UCLA, and various other institutions. In February/March 2016, the group breached both a UN agency and one of America's largest Internet service providers. During mid-February, TeaMp0isoN breached the United Nations World Tourism Organization and defaced its forum index. During late February, TeaMp0isoN breached the Time Warner Cable Business Class Managed Security Services Portal. Their (since suspended) Twitter feed indicated that they had gained access to the backend ticket system as well as the details of 4,191 users. Links to ISIS TeaMp0isoN member "TriCk" is believed to be Junaid Hussain, a black hat hacker who was arrested for doxing Tony Blair by publishing his personal information. He fled the UK while on police bail and reportedly joined ISIL. It is believed that Hussain became a prominent ISIL propagandist, using social media to recruit soldiers to join ISIL, and was behind several high-profile attacks under the group name "CyberCaliphate". Hussain is also believed to have links to Jihadi John. Hussain has also been suspected of cooperating with other ISIL members to unmask individuals who report to rebel media groups, and of doxing U.S. soldiers and their families. Hussain was a prominent target on the Pentagon's Disposition Matrix due to his influence overseas. On 26 August 2015, U.S.
officials said they had a "high level of confidence" that Hussain was killed in a drone strike in Syria. See also Hacktivism References Internet activists Internet vigilantism Hacker groups Organizations established in 2009
976641
https://en.wikipedia.org/wiki/ALT%20Linux
ALT Linux
ALT Linux is a set of Russian operating systems based on the RPM Package Manager (RPM) and built on a Linux kernel and the Sisyphus package repository. ALT Linux has been developed collectively by the ALT Linux Team developer community and ALT Linux Ltd. History The ALT Linux Team arose from the merger of the IPLabs Linux Team and the Linux community of the Institute of Logic, Cognitive Science and Development of Personality. The latter cooperated with the Mandrake Linux and SUSE Linux teams to improve localization (specifically Cyrillic script), producing a Linux-Mandrake Russian Edition (RE). Mandrake and Mandrake RE became different distributions, and thus the decision was made to create a separate project. The name ALT was coined, which is a recursive acronym meaning ALT Linux Team. The split led to the creation of the Sisyphus package repository, which is the unstable branch of ALT Linux development. In 2007, the Sisyphus repository won a CNews award in the Information Security nomination. Releases Version history Linux-Mandrake Linux-Mandrake 7.0 Russian Edition, released in the beginning of 2000, was the first de facto independent distribution of the IPLabs Linux Team. It kept the name Mandrake with permission from the Mandrake developers. Linux-Mandrake Spring 2001, released several months later, was the second IPLabs Linux Team release. ALT Linux 1.0 In the summer of 2001, the ALT Linux Team was formed and the ALT Linux name was established. The first ALT Linux release was ALT Linux Junior 1.0, released in the summer of 2001, followed by the updated ALT Linux Junior 1.1 in the autumn of 2001. Junior distributions were 1CD releases. ALT Linux 2.* ALT Linux Master 2.0, released in May 2002, was a 4CD all-purpose Linux distribution targeted at software developers and system administrators. ALT Linux Junior 2.0 was released in the summer of 2002 as a 1CD desktop/workstation-oriented release. ALT Linux 3.0 ALT Linux Compact 3.0 was released during autumn 2005, and consisted of 1CD/1DVD installable versions along with a LiveCD (TravelCD 3.0). There were several subsequent OEM updates counting up to 3.0.5. ALT Linux 4.0 These series changed the official naming somewhat to be ALT Linux 4.0 $flavour. Server was released in June 2007 (1CD+1DVD per platform; i586 and x86_64); Office Server quickly followed (1CD; i586 and x86_64); Desktop Personal in August 2007 (1DVD, LiveCD, Rescue CD; i586; KDE3); Lite in December 2007 (installation CD, live CD and 2CD with addons; i586; Xfce4); Terminal in December 2007 (joint release with Media Magic Ltd, 1DVD; i586; KDE3, low client RAM requirements). There was also a more conservative school 4.0 branch maintained for the Russian schools pilot project, and several distributions specifically tailored for schools were released using that as a base. ALT Linux 4.1 Desktop was released in October 2008 (1CD/1DVD; i586 and x86_64; KDE3); Children in December 2008 (LiveCD; i586); Skif in December 2008 (1CD; x86_64; HPC); School Server in February 2009 (1CD; i586). ALT Linux 5.x The 5.0 branch was canceled, mainly due to turbulent X.org development at the time (and was subsequently archived); the 5.1 community branch was created along with the more conservative p5 branch later in 2009.
Somewhat confusingly, distributions based on the p5 branch were numbered as ALT Linux 5.0: Ark (client+server suite, 1DVD+1CD per platform; i586 and x86_64); School Suite – mostly i586, also including docs, video lessons and free software for Windows (3DVD): Server (1DVD; i586 and x86_64); Terminal (1DVD; KDE3); Master (1DVD/flash; KDE4); Junior (1DVD/flash; GNOME2); Lite (2CD; Xfce4); New Lite (1CD/1DVD/flash; LXDE); KDesktop (1DVD; i586 and x86_64; KDE4); Simply Linux 5.0 (1CD/flash/LiveCD; i586; Xfce4). Lite A small single-CD distribution for older/low-memory computers, with Xfce as the default desktop. Available in normal and Live CD versions. It was largely superseded by the LXDE-based New Lite. Compact Compact is a series of ALT Linux distributions tailored for beginner users. It is mostly used on workstations, home computers, and notebooks. It includes additional means for easy configuration, many office and multimedia applications, and some games. Compact was also a popular choice for OEM whitelabeling, i.e., creating a specific edition for various hardware vendors to bundle with their hardware. Server ALT Linux Server is a hardened server single-CD distribution. It is certified by the Federal Service for Technical and Export Control of Russia in the following categories:
by the level of monitoring for non-declared features – level 4
by the class of protection from unauthorized access to information – class 5
Terminal ALT Linux Terminal is a terminal server distribution based on ALT Linux Desktop and ALTSP5, a friendly/merging fork of the Linux Terminal Server Project (LTSP) which is usable on older hardware acting as thin and diskless clients (16 MB RAM is enough, while stock LTSP5 usually requires ≥ 64 MB RAM). It was also adapted for the Russian School Education National Project free software package. References External links Community website (in English) Sisyphus package repository, on which ALT Linux is based Reviews ALT Linux 5 Ark desktop review Customizing ALT Linux 5 Ark desktop Simply Linux 5 review Customizing Simply Linux 5 How to install Cairo-Dock on Simply Linux 5 RPM-based Linux distributions X86-64 Linux distributions Linux distributions Russian-language Linux distributions
2929496
https://en.wikipedia.org/wiki/Family%20Tree%20Maker
Family Tree Maker
Family Tree Maker is genealogy software for Windows and Mac that allows the researcher to keep track of information collected during research and to create reports, charts, and books containing that information. The software was originally developed by Kenneth Hess of Banner Blue Software, which was purchased by Broderbund in 1995. It passed through the hands of The Learning Company, Mattel, and others before coming under its current ownership. A redesigned Family Tree Maker 2008 was released on August 14, 2007. The 2009 version of the program corrected some of the errors and omissions of its predecessor, and introduced a few new features. Family Tree Maker 2010 claimed to further enhance the radical redesign and to be more powerful and feature-packed, with faster navigation and quicker load times. A version for the Mac was released in 1997, but due to low market demand it was discontinued for over a decade. A new version of Family Tree Maker for Mac was released on November 4, 2010. Family Tree Maker Version 16 was awarded a Codie award in the "Best Consumer Productivity Solution" category in 2006. On December 8, 2015, Ancestry.com announced that it would discontinue Family Tree Maker. The announcement was met by fierce protest from Family Tree Maker users. On February 2, 2016, Ancestry.com announced that Software MacKiev, the company that had developed the Mac version of the software for more than six years, would acquire the Family Tree Maker brand and take over the development and publishing of the Mac and Windows editions. At the same time as this announcement, Software MacKiev promised free updates for owners of the then-current versions. While the MacKiev dot-one versions were put on sale within two months, they were not officially released as free updates to the current versions until December 30, 2016. At that time, connectivity with Ancestry was also extended until its replacement could be rolled out. Ancestry connectivity, including TreeSync, the ability to synchronize a Family Tree Maker tree with one at Ancestry, was turned off on March 29, 2017, with its replacement, called FamilySync, to be turned on March 31. The news about FamilySync contained a surprise: it would only be available in Family Tree Maker 2017, the next iteration of Family Tree Maker, which was to be released coincident with the deployment of FamilySync. Users of Family Tree Maker 2 and 3 for Mac and 2012 and 2014 for Windows would no longer have any connectivity to Ancestry. If they wanted such connectivity, they would have to pay to upgrade to Family Tree Maker 2017, which was officially released on July 16, 2017, 107 days later than planned. The core functionality and user interface of Family Tree Maker 2017 have changed little since 2010. Software MacKiev touted four major improvements: FamilySearch integration, FamilySync, Color Coding, and Photo Darkroom. FamilySearch integration provides potential matches to the FamilySearch.org Family Tree, but not to its record collections. FamilySync is a replacement for Ancestry.com's TreeSync feature; it provides potential matches to family trees, indexes, and records at Ancestry.com. It was necessitated by Ancestry.com retiring their TreeSync application programming interface (API). While the old API was used exclusively by Ancestry.com, which also owned Family Tree Maker, the new API is open to other software developers to use. Color Coding allows users to assign up to four different colors to a person and their ancestors.
Photo Darkroom can darken faded black and white photos. FTM version history FTM merger history
1984: Banner Blue Software founded by Ken Hess: "As the founder and president of Banner Blue Software from 1984 to 1996, I sold over two million copies of Family Tree Maker."
May 1997: Brøderbund Software acquired Parsons Technology from Intuit (which included the marketing rights to Family Origins for Windows).
August 1998: Brøderbund Software acquired by The Learning Company (which included Family Tree Creator through an acquisition of Mindscape/IMSI). v5 published.
Late 1998: The Learning Company acquired Palladium Interactive (which included Ultimate Family Tree).
May 1999: The Learning Company was acquired by Mattel Incorporated (maker of "Barbie"). v6 published.
November 1999: A&E Television Networks, Hearst Interactive Media, Mattel, and private equity firms formed Genealogy.com, LLC.
April 2000: v7.5 published.
February 2001: A&E TV acquired Genealogy.com.
Late 2001: Genealogy.com acquired the GenForum message board site, which it had been hosting for a few years.
June 2002: Genealogy.com acquired the Generations PC product line from Sierra Home.
April 2003: Genealogy.com acquired by MyFamily.com.
December 2006: MyFamily.com, Inc. changed its name to The Generations Network.
March 2016: Software MacKiev purchased FTM software from Ancestry.com.
References External links Family Tree Maker Windows-only genealogy software
3003616
https://en.wikipedia.org/wiki/Freedom%20Toaster
Freedom Toaster
A Freedom Toaster is a public kiosk that will burn copies of free software onto user-provided CDs and DVDs. History The original Freedom Toaster project was sponsored by Mark Shuttleworth's Shuttleworth Foundation and built in Perl by Hamish Whittal. It consisted of a number of CD burning facilities (in kiosk form), where members of the public were able to burn copies of free and open-source software onto self-supplied blank CD media. The project was started as one solution to overcome the difficulty of obtaining Linux and other free and open-source software in South Africa, where the restrictive telecommunications environment makes downloading large software files prohibitively expensive. There are currently Freedom Toasters at the following locations: Aberystwyth University, Bloemfontein, Cape Town (2), Diepkloof, Durban, East London, Grahamstown, Johannesburg (5), Knysna, Namibia, Pietermaritzburg, Port Elizabeth, Port Shepstone, Potchefstroom, Pretoria (3), Seneca College in Toronto, Stellenbosch, Stockholm, Trivandrum. One will soon be available in Michigan City, Indiana. Functions A Freedom Toaster kiosk is placed at a school, library, shopping center or another publicly accessible location. Users bring blank optical discs to the kiosk and select the software that they would like. The kiosk then burns the selected software onto the users' media. The name derives from this function: "Freedom" refers to the free and open source software provided, and "Toaster" is a term for an optical disc burner. Purpose Freedom Toaster kiosks provide a way for computer users in economically disadvantaged regions and areas with limited or no Internet access to get software. By providing this service, the people behind the Freedom Toaster hope to address the issue of the digital divide. Availability The Freedom Toaster is mostly available in South Africa and is currently supported by the Shuttleworth Foundation. The Foundation is attempting to get others to adopt the idea by providing the tools to help create, support and maintain their own Freedom Toasters. It has also provided seed funding to Brett Simpson of Breadbin Interactive to create a sustainable business model with the idea. The initiative has also been taken up independently by a company in India, Zyxware Technologies, which has partnered with the Free Software Users Group, Thiruvananthapuram to promote Freedom Toasters as a viable method to spread free software in India. References External links Freedom Toaster – The South African website Seneca Freedom Toaster Washington-Lee High School Freedom Toaster Freedom Toaster Map Freedom Toaster in India Free software projects
8213829
https://en.wikipedia.org/wiki/Rocky%20Seto
Rocky Seto
Haruki Rocky Seto (born March 12, 1976), often referred to as Rocky Seto, is a former American football coach; he last served as the assistant head coach for the NFL's Seattle Seahawks. In 2017, Seto announced that he was leaving the coaching industry to become a full-time pastor. He is currently serving as the Senior Pastor of Evergreen Baptist Church of San Gabriel Valley. Early years Seto was born in Los Angeles, California; he is a Japanese-American Nisei, the son of Issei parents. His father runs a gardening business; Seto grew up going to USC football games, and both father and son were fans of the Trojans. Seto attended Arcadia High School, where he played numerous positions; he described himself as an "average player". College career Seto began his college playing career at Mount San Antonio College, a junior college in the Los Angeles area. He chose the college so he could play for head coach Bill Fisk, who had been an All-American at USC. Seto was a fullback and defensive end during the 1995 and 1996 seasons, but mostly played on special teams. In 1997, he transferred to the University of Southern California, hoping to walk on to the football team. Although he was initially told he would be able to walk on, he stopped getting mail from the program. Concerned, Seto staged an "accidental" meeting with head coach John Robinson, who sorted out his situation, allowing him to walk on. Seto was a reserve linebacker for the Trojans in 1997, seeing action on the scout team. In 1998, new head coach Paul Hackett awarded him an athletic scholarship, and he was later awarded USC's Black Shirt (scout team) Defensive Player of the Year Award for that season. Seto received an associate degree in general studies from Mt. San Antonio Junior College in 1997, a bachelor's degree in exercise science from USC in 1999, and a master's degree in public administration from USC in 2001. Once he had gained his bachelor's degree, Seto initially planned to attend graduate school at USC to become a physical therapist. Although he had already placed his deposit, he found out about the possibility of a volunteer assistant position with the football program and opted to enter coaching. Coaching career After playing for the Trojans, Seto joined the coaching staff in 1999 as a volunteer assistant under then-head coach Paul Hackett, working with the defense and special teams. In 2000, he served as an administrative graduate assistant, and with the arrival of head coach Pete Carroll in 2001, he became a graduate assistant involved with the defense, working with the general defense in 2001 and safeties in 2002. In 2003, he became a full coach, in charge of safeties, and from 2004 to 2005 he coached linebackers. From 2006 to 2010, he coached the USC secondary. In 2008, former college teammate Kris Richard joined the staff as a graduate assistant. In 2006, Seto turned down a job to coach the secondary of the NFL's Buffalo Bills. When USC offensive coordinator Steve Sarkisian departed to take the head coaching position at Washington in late 2008, Sarkisian offered Seto the defensive coordinator position there. Seto opted to stay at USC and continue coaching the secondary, along with a raise and the additional title of assistant head coach for defense. On January 7, 2009, Carroll promoted Seto to USC defensive coordinator. Seto was not retained when Lane Kiffin became the head coach at USC. He was replaced by Kiffin's father, Monte. In 2010, Seto joined Pete Carroll's coaching staff for the NFL's Seattle Seahawks.
In 2015, the Seahawks announced that Seto had been promoted to assistant head coach/defense. In January 2017, Seto announced that he was leaving his position with the Seahawks to join the Baptist ministry. He now serves as a pastor in La Puente, California. Personal life Seto is named after boxer Rocky Marciano; his brothers are named after Sonny Jurgensen and Johnny Bench. His nickname is "Rock". Seto married Sharla (née Chiang), who played soccer for USC and was on the Women of Troy's 1998 Pac-10 championship squad; she was originally from Seattle. They have two daughters (Kaylani and Mia) and two sons (Troy and Timothy). Seto is a devout Christian and considers his church community an important aspect of his life. He was featured in an episode of Trinity Broadcasting Network's "More Than Conquerors" magazine show, which profiles Christian sports figures and shares their testimonies. Seto has participated in an exchange program with Waseda University in Tokyo, Japan, to help teach American football coaching and playing strategy. Seto currently serves as the Senior Pastor of Evergreen Baptist Church of San Gabriel Valley. References External links Official Rocky Seto websites - shouldertackling.com and thegreatesttreasure.org USC Athletic Department Biography 1976 births Living people Mt. SAC Mounties football players Seattle Seahawks coaches USC Trojans football players USC Trojans football coaches American sportspeople of Japanese descent USC Sol Price School of Public Policy alumni
2418207
https://en.wikipedia.org/wiki/A.%20David%20Thackeray
A. David Thackeray
Andrew David Thackeray (19 June 1910 – 21 February 1978) was an astronomer trained at Cambridge University. He served as director of the Radcliffe Observatory for 23 years. Career Thackeray went to school at Eton College, where he observed meteors for the British Astronomical Association. He went on to study mathematics at King's College, Cambridge. He received a PhD in theoretical stellar spectroscopy in 1937 from the Solar Physics Laboratory in Cambridge. During his studies he worked at the Mount Wilson Observatory in California from 1934 to 1936. He was Assistant Director of the Solar Physics Observatory at Cambridge Observatory from 1937 to 1948. He was then director of the Radcliffe Observatory, Pretoria, from 1951 until it was merged with the Royal Observatory, Cape of Good Hope, in 1974 to form the South African Astronomical Observatory. He became an honorary professor of the University of Cape Town and, a few days before his death, an Associate of the Royal Astronomical Society. Research He specialized in stellar spectroscopy. At a conference of the International Astronomical Union in Rome in 1952, he presented results of studies of variable stars in the Magellanic Clouds, indicating that the perceived age and size of the universe had to be doubled. He was the discoverer of Thackeray's Globules in 1950. Personal life He was born on 19 June 1910 in Chelsea, London. His father was the classical scholar Henry St. John Thackeray. He died in an accident on 21 February 1978. He was a nephew of Mary Ackworth Evershed. Published works References External links A. David Thackeray 20th-century British astronomers South African astronomers 1910 births 1978 deaths People educated at Eton College Alumni of King's College, Cambridge
8055239
https://en.wikipedia.org/wiki/Record%20restoration
Record restoration
Record restoration, a particular kind of audio restoration, is the process of converting the analog signal stored on gramophone records (either 78 rpm shellac, or 45 and 33⅓ rpm vinyl) into digital audio files that can then be edited with computer software and eventually stored on a hard drive, recorded to digital tape, or burned to a CD or DVD. The process may be divided into several separate steps performed in the following order: cleaning the record, to prevent unwanted audio artifacts from being introduced in the capture that would necessitate correction in the digital domain (e.g. transient surface noise caused by dirt), and to prevent unnecessary wear and damage to the stylus used in playback; transcription of the record to another format on another medium (generally a digital format such as a WAV file on a computer); processing the raw sound file with software in order to remove transient noise resulting from record surface damage (clicks, pops, and crackle caused by surface scratches and wear); using software to adjust the volume and equalization; processing the audio with digital and analog techniques to reduce surface/wideband noise; and saving the file in the desired format (WAV, MP3, FLAC, etc.). Information on these steps is available from various websites and from the help files of the software employed in the process. Record cleaning The first step involves cleaning the playing surface of the records (unless they have been stored in archival, dust-free conditions since they were last cleaned). This can involve anything from turntable-based, vacuum-equipped, professional cleaning machines that use proprietary chemical formulations and cost four figures, to improvised methods involving home-made equipment and/or cleaning solutions consisting of isopropyl alcohol, distilled water (unpurified tap water should not be used, as it will probably leave limescale deposits on the record surface) and a surfactant to aid drying. Isopropyl alcohol should only be used to clean vinyl records: it will cause permanent damage to shellac, master and one-time recordings (acetate, wax and lacquer). Hardware The second step involves transcription of a record using a suitable turntable and a suitable cartridge-stylus combination. More often than not, a magnetic cartridge and stylus combination is used because of its superior sound characteristics and signal-to-noise ratio over other pickup systems. The output of a magnetic cartridge is of a very low volume (typically ≈5 mV), so the signal must be amplified with a preamplifier to bring it up to line level before being routed into the line-in jack of a computer's sound card. Sound cards made specifically for digitally recording vinyl (as well as those designed for DJing with timecode vinyl) have phono preamplifiers built in, eliminating the need for two separate devices. Three main types of phono preamplifier exist for the process of record restoration and playback: those that apply RIAA equalization (RIAA de-emphasis) on playback to counteract the equalization used when the recording was originally made, which are generally not suitable for 78 rpm records and early microgroove recordings; those that include a switchable frequency turnover filter to match the various turnover frequencies used by the many record manufacturers between 1925 and c. 1960; and those that apply no equalization (also called "flat" phono preamplifiers), which require audio software to apply the correct equalization to the digital recording during the restoration process.
As such, this type of preamplifier is suitable for all record formats, regardless of the equalization employed by the mastering process. Regardless of the preamplifier employed, one must ensure that the output volume is not set too high when recording through the sound card, or digital clipping may result. A low average volume can easily be corrected later during editing (although with some loss in dynamic range); however, too low a volume setting can result in a greater amount of noise (especially the inherent sound-card or system noise) relative to the usable audio, and this noise will become prominent when the audio is normalized. Ideally, the VU meter should not exceed around -2 or -3 dB, to allow for some signal headroom. However, some clipping due to transients caused by scratches or cracked records is usually acceptable, since these are extremely narrow and do not usually cause any audible difference. One must also be sure that all equipment is grounded appropriately together, or subtle hums will likely result from the formation of ground loops. Similarly, the computer should have sufficient power and memory to record an entire record without any "drop-outs" (tiny gaps in the audio stream lasting just a fraction of a second). Software The software used to process the resulting digital files ranges in price from thousands of dollars to freeware. Some of these applications are simple, and some are very complex. Many are general-purpose waveform editors that also happen to include record restoration features or plugins, and others are dedicated to the sole purpose of record restoration. Moreover, some applications are designed for fast, easy processing with the push of a few buttons, and others require a time-consuming but perhaps more exact manual approach to editing out damage. Most applications present a waveform display, but a few are basically noise and click-pop filters that provide no visual display at all. All record restoration applications for Windows work directly upon WAV files, but a few will also directly open files in other formats, such as MP3. Record restoration software normally handles two different categories of noise separately. First, there is the constant background noise that runs through the entire recording: the sound the stylus makes in the groove when no music is playing, plus whatever subtle drones are generated by the electronics involved (such as turntable rumble or 50/60 Hz mains hum). In addition to band-stop filters (also known as "notch filters"), low-pass filters, and high-pass filters for filtering out hum and noise, many applications allow the user to take a "noiseprint" of a small section of waveform when the stylus is tracking but no music is playing; the filtering is then tailored to this noiseprint. Second, there are the transient bursts of damage, mostly clicks and pops, caused by scratches or record defects, and crackle caused by many minute defects grouped close together. The software must filter this kind of click-pop damage conservatively, because a click or a pop can look very much like a legitimate percussive effect, such as a light snare drum rim-shot. If the automatic filtering software is getting every last click, chances are good that it is also filtering some percussion instruments. After an automatic click filtering pass, it is reasonable to expect a few clicks to be left over, and these must be removed manually by isolating them one by one in the waveform (a simplified sketch of this detect-and-repair approach appears below).
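The core of automatic click repair can be illustrated with a short sketch. The code below is not taken from any particular restoration package; it is a minimal illustration, assuming the transfer is already in memory as a NumPy array of samples scaled to the range -1.0 to 1.0, with the detection threshold and repair width being arbitrary illustrative values:

    import numpy as np

    def find_clicks(samples, threshold=0.3):
        # A click from a scratch appears as a sample-to-sample jump far
        # steeper than any musical transient; flag jumps above threshold.
        jumps = np.abs(np.diff(samples))
        return np.where(jumps > threshold)[0]

    def repair_clicks(samples, positions, width=8):
        # Linear interpolation: replace each flagged region with a straight
        # line drawn between the undamaged samples on either side.
        fixed = samples.copy()
        for pos in positions:
            lo = max(pos - width, 0)
            hi = min(pos + width, len(fixed) - 1)
            fixed[lo:hi + 1] = np.linspace(fixed[lo], fixed[hi], hi - lo + 1)
        return fixed

    def normalize(samples, peak=0.98):
        # Peak normalization, as described in the next section: scale the
        # waveform so the loudest sample sits just below digital full scale.
        return samples * (peak / np.max(np.abs(samples)))

    # 'samples' is the captured transfer, assumed already loaded.
    restored = normalize(repair_clicks(samples, find_clicks(samples)))

Real packages use far more careful detection, matched to the statistics of crackle, and interpolate from the surrounding waveform shape rather than drawing a straight line, but the structure is the same: detect outliers, then resynthesize the damaged span.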
These residual clicks may then be corrected by attenuation (reducing or muting the volume of the anomaly); interpolation (replacing the waveform "spike" with a less offensive section, either a straight line, i.e. linear interpolation, or a calculated facsimile deduced from what the wave looks like on either side); substitution (replacing a damaged waveform segment with a similar section from elsewhere); channel substitution (where damage occurring in only one channel of a stereo waveform is replaced by a similar good segment from the other channel); and simple deletion, which is usually not noticeable for small samples. Some applications also have a "pencil tool" with which one can actually redraw the waveform. Volume and equalization After the noise, clicks, and pops have been removed, one may adjust the volume. This is usually done by a process called audio normalization, whereby the loudest tone in a track is amplified right up to some specified point, usually the point of digital clipping, and the rest of the waveform is amplified accordingly. In another form of amplification called "hard limiting", the loudest passages are attenuated drastically after they hit a certain limit, while the quieter passages are amplified. The result is a compressed waveform that sounds considerably louder, though it may not be what the original recording engineers intended. In all of these volume adjustments, one should respect the original dynamics of a piece, and the variation in dynamics among different tracks on the same LP. In addition to adjusting the volume, at this point one may wish to adjust the frequency profile of a piece with the "graphic equalizer" that is normally supplied with a wave editor. Some might feel that a track needs a slight treble boost or reduction, or a big boost in the bass department; one should satisfy one's own perception of what sounds best for any particular track. An application usually lets the user "preview" a piece before applying the equalization effects. Export and save After all this is done, the file (or files) is ready to export (or save) in whatever form the user desires. Almost all wave editing applications have the default ability to save files in WAV form, and some can also save files as MP3, FLAC, or in other formats. Many CD-R burning applications can then take these files and burn them onto a blank recordable disc in a form that can be played on a common CD player (using the standard CD-DA format). Preservation Each medium, including digital media, has benefits and drawbacks, and over the long term vinyl records may even have advantages over digital media. Due to the nature of the medium, playback of "hard" records, e.g. LPs, causes gradual degradation of the recording. CDs, however, can also degrade due to "CD rot" and other abnormalities. Whether CDs' shelf life matches that of vinyl, which can last through years of playback, has been disputed. CDs can also have shortcomings such as skips and clicks, due to problems with the laser reading the discs. On the other hand, a vinyl record will play under almost any circumstance, because it is an analog medium. The recordings are best preserved by transferring them onto more stable media and playing the records as rarely as possible. They need to be stored on edge, and do best under environmental conditions that most humans would find comfortable. The medium needs to be kept clean, but alcohol should only be used on PVC or optical media, not on 78s.
The equipment for playback of certain formats (e.g. 16 and 78 rpm) is manufactured only in small quantities, leading to increased difficulty in finding equipment to play the recordings. (This "gradual degradation" is more noticeable on some discs than on others; it is possible for an eighty-year-old record to sound like new and for a brand-new disc to be marred by pops and ticks. How the records are handled and the equipment on which they are played, as well as the manufacturing process and the quality of the original vinyl, have a considerable impact on their wear.) Where old disc recordings are considered to be of artistic or historic interest, record companies or archivists play back the disc on suitable equipment and record the result, typically onto a digital format which can be copied and converted without any further damage to the recording. For example, Nimbus Records uses a specially built horn record player to transfer 78s. However, anyone can do this using a standard record player with a suitable pickup, a phono preamplifier and a typical personal computer. Once a recording has been digitized, it can be manipulated with software to restore and, hopefully, improve the sound, for example by removing the effects of scratches. It can also be easily converted to other digital formats such as DVD-A, CD and MP3. As an alternative to playback with a stylus, a recording can be read optically, processed with software that calculates the velocity at which the stylus would move through the mapped grooves, and converted to a digital recording format. This does no further damage to the disc and generally produces a better sound than normal playback. This technique also has the potential to allow for reconstruction of damaged or broken discs. With regard to inner sleeves, plastic polyethylene is purported to be better than the common paper sleeve and less bulky than the poly-lined paper variety. Paper sleeves deteriorate over time, leave dusty fibers, and produce static that attracts dust. 100% poly sleeves produce less static (thereby attracting less dust), are archival, and are thinner by nature, so they minimize pressure on the LP jacket seams. References External links The Wave Corrector Tutorial A Technical Overview Preserving vinyl records digitally Site has some relevant video content. Converting gramophone and analogue recordings to mp3. A practical step-by-step guide. Transferring LP's to CDR Popular open source software used to transfer analogue to digital Audio storage
32990232
https://en.wikipedia.org/wiki/Mitchell%20Waite
Mitchell Waite
Mitchell Waite is an American computer programmer, author and publisher of programming books and the mobile app iBird, a birding app first published by the Mitch Waite Group in 2008 as iBird Pro for the Apple iPhone. His first book, "Projects in Sight, Sound and Sensation", was published in 1974. He studied nuclear physics at Sonoma State University during 1971–1975. His career began in 1977 when he met Steve Jobs at the Homebrew Computer Club at Stanford University. Jobs introduced the Apple I to the group, and Waite was one of the first to purchase the single-board personal computer at the Byte Shop in San Rafael. By this point Waite had published several computer books and was working on a book about computer graphics. Jobs found out about an elaborate weather station Waite had running on his houseboat in Greenbrae, California, attached to the Apple I, and invited himself up to see it. When Jobs arrived, he spent the entire time bragging about the new Apple II he had developed with Steve Wozniak; it was, he claimed, ten times better than the Apple I, and he invited Waite to come to Cupertino to see it. Waite met Jobs at the new offices of Apple Computer. Impressed with his books, Jobs offered Waite a job as Apple's head of documentation, along with stock options. Waite accepted and asked if he could come in later in the morning, since he lived in Marin County. Jobs told him that if he worked at Apple he had to be there 24/7. Waite balked, and told Jobs he could not live in Cupertino's asphalt jungle. Jobs was infuriated, called him a bozo, and said he was blowing a once-in-a-lifetime opportunity. Waite told Jobs he felt he would be successful on his own one day as a writer or publisher, but would still love to do something else to help Apple. Jobs told him to meet with Mike Markkula, and Markkula assigned Waite the job of writing a magazine article comparing the Apple II to the Commodore 64, picking the Commodore apart with a stiletto knife. After that, Waite decided he was not destined to work in the computer industry but rather wanted to create new kinds of books, and so he worked even harder at being a writer. After writing five books on his own, he began working with more of his friends in the hobby world. He enlisted his old College of Marin professors to help him write books about programming languages and soon had a reputation as one of the movers and shakers in the burgeoning computer book industry. He established the Mitch Waite Group in 1977, which has published more than 130 titles in the computer programming field. The company was later sold to Simon & Schuster. Waite also created the website WhatBird and later developed iBird, a bird field guide app for iOS and Android.
iBird apps: http://www.ibird.com Books Books written by Mitchell Waite include: CP/M Bible Soul of CP/M MS-DOS Bible C Primer Plus BASIC Programming Primer Unix Primer Plus Pascal Primer BASIC Programming Primer for PC Bluebook of Assembly Language DOS Primer for PC Pascal Primer Assembly Language Primer for PC Turbo C++ Bible The Unix Papers C: Step by Step Supercharging C with Assembly Language Inside the 80286 Framework from the Ground Up Master C: Let the PC Teach You C Object Oriented Programming in Turbo C++ C++ Primer Plus Master C++: Let the PC Teach You C Visual Basic How To Windows API Bible Workout C Windows Programming Primer Plus Object Oriented Programming in Microsoft C++ Mitch Waite Group iBird (various versions for iOS and Android) References American computer programmers Living people Sonoma State University alumni People from Greenbrae, California 1946 births
159271
https://en.wikipedia.org/wiki/ElcomSoft
ElcomSoft
ElcomSoft is a privately owned software company headquartered in Moscow, Russia. Since its establishment in 1990, the company has been working on computer security programs, with the main focus on password and system recovery software. The DMCA case On July 16, 2001, Dmitry Sklyarov, a Russian citizen employed by ElcomSoft who was at the time visiting the United States for DEF CON, was arrested and charged with violating the United States DMCA law by writing ElcomSoft's Advanced eBook Processor software. He was later released on bail and allowed to return to Russia, and the charges against him were dropped. The charges against ElcomSoft were not, and a court case ensued, attracting much public attention and protest. On December 17, 2002, ElcomSoft was found not guilty of all four charges under the DMCA. Thunder Tables Thunder Tables is the company's own technology, developed to guarantee the recovery of Microsoft Word and Microsoft Excel documents protected with 40-bit encryption. The technology first appeared in 2007 and employs the time–memory tradeoff method to build pre-computed hash tables, which open the corresponding files in a matter of seconds instead of days. These tables take up around 4 GB. So far, the technology is used in two password recovery programs: Advanced Office Password Breaker and Advanced PDF Password Recovery. Cracking Wi-Fi passwords with GPUs In 2009 ElcomSoft released a tool (Elcomsoft Wireless Security Auditor, or EWSA) that takes captured WPA/WPA2 hashes and uses brute-force methods to guess the password associated with a wireless network. The brute-force attack is carried out by testing candidate passwords against the known SSID of a network whose WPA/WPA2 hash has been captured. The passwords that are tested are generated from a dictionary using various mutation methods, including case mutation (password, PASSWORD, PassWOrD, etc.), year mutation (password, password1992, password67, etc.), and many other mutations to try to guess the correct password (a simplified sketch of such a mutation engine appears below). The advantages of such methods over traditional ones, such as rainbow tables, are numerous. Rainbow tables, being very large because of the number of SSID/password combinations stored, take a long time to traverse, cannot hold large numbers of passwords per SSID, and rely on the SSID being a common one for which the rainbow table already lists hashes (common ones include linksys, belkin54g, etc.). EWSA, however, uses a relatively small dictionary file (a few megabytes versus dozens of gigabytes for common rainbow tables) and creates the passwords on the fly as needed. Rainbow tables are tested against a captured WPA/WPA2 hash via a computer's processor, with relatively low numbers of simultaneous processes possible. EWSA, however, can use a computer's processor(s), with up to 32 logical cores, and up to 8 GPUs, all with many CUDA cores (NVIDIA) or Stream Processors (ATI). Vulnerability in Canon authentication software On November 30, 2010, Elcomsoft announced that the encryption system used by Canon cameras to ensure that pictures and Exif metadata have not been altered was flawed and could not be fixed. On that same day, Dmitry Sklyarov gave a presentation at the Confidence 2.0 conference in Prague demonstrating the flaws. Among other examples, he showed an image of an astronaut planting a flag of the Soviet Union on the Moon; all the images passed Canon's authenticity verification.
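The mutation scheme described in the Wi-Fi section above can be sketched in a few lines. The code below is illustrative only and is not ElcomSoft's implementation; the mutation rules and limits are arbitrary examples. In a real attack, each generated candidate would then be run through the WPA key-derivation function (PBKDF2 with the SSID as salt) and compared against the captured handshake data.

    from itertools import product

    def case_mutations(word, limit=50):
        # Yield upper/lower-case variants (password, PASSWORD, PassWOrD, ...),
        # capped so that long words do not explode combinatorially.
        choices = [(c.lower(), c.upper()) for c in word]
        for count, combo in enumerate(product(*choices)):
            if count >= limit:
                return
            yield ''.join(combo)

    def year_mutations(word, start=1950, end=2025):
        # Yield variants with a year appended (password1992, ...).
        for year in range(start, end + 1):
            yield '%s%d' % (word, year)

    def candidates(dictionary_words):
        # Generate candidate passphrases on the fly, instead of storing the
        # huge precomputed per-SSID tables that rainbow-table tools require.
        for word in dictionary_words:
            yield word
            yield from case_mutations(word)
            yield from year_mutations(word)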
Nude celebrity photo leak In 2014, an attacker used the Elcomsoft Phone Password Breaker to guess celebrity Jennifer Lawrence's password and obtain nude photos. Wired said of Apple's cloud services, "...cloud services might be about as secure as leaving your front door key under the mat." References Companies established in 1990 Computer law Cryptography law Software companies of Russia Computer security software companies Companies based in Moscow
30746908
https://en.wikipedia.org/wiki/Satellite%20%28software%29
Satellite (software)
In computing, Red Hat Satellite is a systems-management product by the company Red Hat which allows system administrators to deploy and manage Red Hat Enterprise Linux (RHEL) hosts. A Satellite server registers with Red Hat Subscription Management, mirrors all relevant software such as security errata and bug fixes, and provides this, together with locally added software and configuration, to the attached servers. The managed hosts register against the local Satellite server and access the provided resources (software packages, patches, configuration, etc.), while also reporting their current health state back to the Satellite. As of March 2017: the latest version is Red Hat Satellite 6, based on Foreman; this article focuses on Red Hat Satellite 6. The previous version was Red Hat Satellite 5; based on Spacewalk, it is still in widespread use despite being in the sunset of its lifecycle. Architecture Red Hat Satellite Server The Red Hat Satellite Server enables planning and management of the content life cycle and the configuration of Capsule Servers and hosts through a GUI, a CLI (Hammer), or a RESTful API (a small example of API use appears below). Capsule Servers Capsule Servers mirror content from the Satellite Server to establish content sources in different geographical locations; they are analogous to the Red Hat Satellite 5 Proxy Server. Managed client systems In addition to fully supported managed hosts, Red Hat Satellite 6 also has some deployment and management capability on certain other hosts, though Red Hat support for these is limited. Connection to Red Hat Customer Portal and external content sources Satellite generally operates in "connected" mode, registering directly with the RHN and downloading relevant software into Satellite's software channels. The organisation's hosts then register against the local Satellite server, instead of directly against Red Hat Network. For secure deployments, Satellite can operate in a "disconnected" mode, where updates are downloaded directly from Red Hat via an Internet-connected machine and then uploaded into Satellite or a local offline RHN proxy. Both modes allow the organisation to control which versions of software it makes available to its hosts, as well as making additional software available within the local network. Red Hat Satellite 6 components Major modules Provision Satellite offers numerous methods for deploying hosts, including simple kickstart, bare-metal install and re-imaging. Current versions of Satellite support kickstart using Cobbler as an underlying framework; PXE boot and Koan are methods that can be used to implement bare-metal installs and re-imaging of hosts. Manage Satellite assists in remotely managing hosts in several areas: software, operational management, and configuration. The three main mechanisms for managing hosts are software channels, configuration channels, and activation keys. Monitor Satellite can provide monitoring of software and systems via probes. These probes periodically explore the target host and send alerts if the probes do not get the correct replies, or if the replies fall outside a specified range. History and lifecycle A primary purpose of earlier versions of Satellite was to allow organizations to utilize the benefits of Red Hat Network (RHN) without having to provide public Internet access to their servers or other client systems. Later versions of the tool have developed increased functionality.
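Satellite 6's RESTful API follows that of its Foreman upstream. As a rough illustration only (the endpoint, field names, and certificate handling are assumptions based on Foreman's conventions and should be checked against the documentation for the installed version), listing the hosts registered to a Satellite server might look like this:

    import requests

    SATELLITE = 'https://satellite.example.com'   # hypothetical server name

    def list_hosts(user, password):
        # Query the Foreman-style hosts endpoint and print each registered
        # host's name; 'results' is the list Foreman wraps answers in.
        resp = requests.get(SATELLITE + '/api/v2/hosts',
                            auth=(user, password),
                            headers={'Accept': 'application/json'},
                            verify=True)  # or the path to the Satellite CA bundle
        resp.raise_for_status()
        for host in resp.json()['results']:
            print(host['name'])

    list_hosts('admin', 'changeme')   # illustrative credentials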
Future of Red Hat Satellite 6 The lifecycle of Red Hat Satellite 6 is recorded at the Red Hat Satellite and Proxy Server Life Cycle page, which is updated as required, with future dates given in good faith. As of August 2019, Red Hat had not indicated any date for the end of support. Red Hat Satellite 5 For Red Hat Satellite version 5, the Satellite application was implemented by a toolset named Project Spacewalk. Red Hat announced in June 2008 that Project Spacewalk was to be made open source under the GPLv2 license; Satellite 5.3 was the first version to be based on upstream Spacewalk code. According to the Spacewalk FAQ issued in 2015, after the release of Red Hat Satellite 6: Red Hat formally released Spacewalk as open source (GPLv2) in June 2008, and would continue to sponsor and support Spacewalk as the upstream of Red Hat Satellite 5, though that participation was anticipated to diminish as Red Hat Satellite 5 entered the final phases of its lifecycle. Spacewalk is not, and can never be, the upstream for Red Hat Satellite 6, released in September 2014, because Satellite 6 is a ground-up rebuild with a different toolset. Future of Red Hat Satellite 5 The lifecycle of Red Hat Satellite 5 is recorded at the Red Hat Satellite and Proxy Server Life Cycle page, which is updated as required, with future dates given in good faith. As of March 2017, Red Hat indicated that Red Hat Satellite 5 was in the final Production 3 phase; the then-current releases, 5.6 and 5.7, would remain supported through January 2019; and a further minor release, 5.8, would be the only release supported in a supplementary Extended Life Phase from February 2019 through to end of life in May 2020. Satellite minor release 5.8 was available in beta. See also Landscape (software) References External links Red Hat Satellite Information Knowledge Redhat Satellite Product Documentation Red Hat Satellite Release Dates RHN The Foreman - official website Red Hat software Remote administration software Provisioning Systems management
1336512
https://en.wikipedia.org/wiki/PC%20game
PC game
A personal computer game, also known as a PC game or computer game, is a type of video game played on a personal computer (PC) rather than a video game console or arcade machine. Its defining characteristics include more diverse and user-determined gaming hardware and software, and generally greater capacity in input, processing, video and audio output. The uncoordinated nature of the PC game market, and now its lack of physical media, make precisely assessing its size difficult. In 2018, the global PC games market was valued at about $27.7 billion. Home computer games became popular following the video game crash of 1983, leading to the era of the "bedroom coder". In the 1990s, PC games lost mass-market traction to console games, before enjoying a resurgence in the mid-2000s through digital distribution on services such as Steam and GOG.com. Newzoo reports that the PC gaming sector is the third-largest category (and estimated to be in decline) across all platforms, with the console sector second-largest and the mobile/smartphone gaming sector the biggest. 2.2 billion video gamers generate US$101.1 billion in revenue, excluding hardware costs. "Digital game revenues will account for $94.4 billion or 87% of the global market. Mobile is the most lucrative segment, with smartphone and tablet gaming growing 19% year on year to $46.1 billion, claiming 42% of the market. In 2020, mobile gaming will represent just more than half of the total games market. [...] China expected to generate $27.5 billion, or one-quarter of all revenues in 2017." PC gaming is considered synonymous (by Newzoo and others) with IBM Personal Computer compatible systems, while mobile computers – smartphones and tablets, such as those running Android or iOS – are also personal computers in the general sense. The APAC region was estimated to generate $46.6 billion in 2016, or 47% of total global video game revenues (note, not only "PC" games). China alone accounts for half of APAC's revenues (at $24.4 billion), cementing its place as the largest video game market in the world, ahead of the US's anticipated market size of $23.5 billion. China is expected to have 53% of its video game revenues come from mobile gaming in 2017 (46% in 2016). History Mainframes and minicomputers Bertie the Brain was one of the first game-playing machines developed. It was built in 1950 by Josef Kates. It measured more than four meters tall, and was displayed at the Canadian National Exhibition that year. Although personal computers only became popular with the development of the microprocessor and microcomputer, computer gaming on mainframes and minicomputers had previously already existed. OXO, an adaptation of tic-tac-toe for the EDSAC, debuted in 1952. Another pioneering computer game was developed in 1961, when MIT students Martin Graetz and Alan Kotok, with fellow MIT student Steve Russell, developed Spacewar! on a PDP-1 computer used for statistical calculations. The first generation of computer games were often text-based adventures or interactive fiction, in which the player communicated with the computer by entering commands through a keyboard (a minimal illustration of this style of interaction appears below). An early text adventure, Adventure, was developed for the PDP-10 by Will Crowther in 1976, and expanded by Don Woods in 1977. By the 1980s, personal computers had become powerful enough to run games like Adventure, but by this time, graphics were beginning to become an important factor in games.
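The keyboard-driven interaction of these early text adventures can be sketched in a few lines. This is a generic, minimal example, not code from Adventure or any historical title; the rooms and the two-word verb-noun parser are invented purely for illustration.

    # A tiny two-room adventure with a verb-noun parser, in the spirit of
    # the command-driven games described above.
    rooms = {
        'valley': {'desc': 'You are in a forest valley. A cave lies north.',
                   'north': 'cave'},
        'cave': {'desc': 'You are in a dark cave. Daylight is to the south.',
                 'south': 'valley'},
    }

    location = 'valley'
    while True:
        print(rooms[location]['desc'])
        words = input('> ').strip().lower().split()   # e.g. "go north"
        if words == ['quit']:
            break
        if len(words) == 2 and words[0] == 'go' and words[1] in rooms[location]:
            location = rooms[location][words[1]]
        else:
            print("I don't understand that.")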
Later games combined textual commands with basic graphics, as seen in the SSI Gold Box games such as Pool of Radiance, or in The Bard's Tale, for example. Early personal computer games By the late 1970s to early 1980s, games were developed and distributed through hobbyist groups and gaming magazines, such as Creative Computing and later Computer Gaming World. These publications provided game code that could be typed into a computer and played, encouraging readers to submit their own software to competitions. Players could modify the BASIC source code of even commercial games. Microchess was one of the first games for microcomputers that was sold to the public. First sold in 1977, Microchess eventually sold over 50,000 copies on cassette tape. As with second-generation video game consoles of the time, early home computer game companies capitalized on successful arcade games with ports or clones of popular arcade games. By 1982, the top-selling games for the Atari 400 were ports of Frogger and Centipede, while the top-selling game for the Texas Instruments TI-99/4A was the Space Invaders clone TI Invaders. That same year, Pac-Man was ported to the Atari 800, while Donkey Kong was licensed for the Coleco Adam. In late 1981, Atari attempted to take legal action against unauthorized clones, particularly Pac-Man clones, despite some of these predating Atari's exclusive rights to the home versions of Namco's game. Industry crash and aftermath As the video game market became flooded with poor-quality cartridge games created by numerous companies attempting to enter the market, and as overproduced high-profile releases such as the Atari 2600 adaptations of Pac-Man and E.T. grossly underperformed, the popularity of personal computers for education rose dramatically. In 1983, consumer interest in console video games dwindled to historical lows, as interest in games on personal computers rose. The effects of the crash were largely limited to the console market, as established companies such as Atari posted record losses over subsequent years. Conversely, the home computer market boomed, as sales of low-cost color computers such as the Commodore 64 rose to record highs and developers such as Electronic Arts benefited from increasing interest in the platform. Growth of home computer games The North American console market experienced a resurgence in the United States with the release of the Nintendo Entertainment System (NES). In Europe, computer gaming continued to boom for many years after. Computers such as the ZX Spectrum and BBC Micro were successful in the European market, where the NES did not fare as well despite its monopoly in Japan and North America. The only 8-bit console to have any success in Europe was the Sega Master System. Meanwhile, in Japan, both consoles and computers became major industries, with the console market dominated by Nintendo and the computer market dominated by NEC's PC-88 (1981) and PC-98 (1982). A key difference between Western and Japanese computers at the time was the display resolution, with Japanese systems using a higher resolution of 640x400 to accommodate Japanese text, which in turn affected video game design and allowed more detailed graphics. Japanese computers also used Yamaha's FM synthesis sound boards from the early 1980s (a sketch of the technique appears below).
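FM synthesis, the technique behind these Yamaha sound chips and the later AdLib and Sound Blaster cards discussed below, generates a rich spectrum from just two oscillators: a carrier sine wave whose phase is modulated by a second, "modulator" sine wave. A minimal software sketch follows; the frequencies and modulation index are arbitrary illustrative values, and real FM chips additionally shape these parameters over time with envelopes.

    import numpy as np
    import wave

    RATE = 44100                                    # samples per second
    t = np.linspace(0, 1.0, RATE, endpoint=False)   # one second of time points

    carrier_hz, modulator_hz, index = 440.0, 220.0, 5.0   # illustrative values
    tone = np.sin(2 * np.pi * carrier_hz * t
                  + index * np.sin(2 * np.pi * modulator_hz * t))

    # Write the tone out as a 16-bit mono WAV file.
    pcm = (tone * 32767).astype(np.int16)
    with wave.open('fm_tone.wav', 'wb') as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(RATE)
        f.writeframes(pcm.tobytes())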
To enhance the immersive experience of their unrealistic graphics and electronic sound, early PC games included extras such as the peril-sensitive sunglasses that shipped with The Hitchhiker's Guide to the Galaxy or the science fiction novella included with Elite. These extras gradually became less common, but many games were still sold in the traditional oversized boxes that used to hold the extra "feelies". Today, such extras are usually found only in Special Edition versions of games, such as Battlechests from Blizzard. During the 16-bit era, the Commodore Amiga and Atari ST became popular in Europe, while the PC-98, Sharp X68000, and FM Towns became popular in Japan. The Amiga, X68000 and FM Towns were capable of producing near arcade-quality hardware sprite graphics and sound when they were first released in the mid-to-late 1980s. Growth of IBM PC compatible games Among the launch titles for the IBM Personal Computer (PC) in 1981 was Microsoft Adventure, which IBM described as bringing "players into a fantasy world of caves and treasures". BYTE that year stated that the computer's speed and sophistication made it "an excellent gaming device", and IBM and others sold games like Microsoft Flight Simulator. The PC's CGA graphics and speaker sound were poor, however, and most customers bought the powerful but expensive computer for business. One ComputerLand owner estimated in 1983 that a quarter of corporate executives with computers "have a game hidden somewhere in their drawers", and InfoWorld in 1984 reported that "in offices all over America (more than anyone realizes) executives and managers are playing games on their computers", but software companies found selling games for the PC difficult; an observer said that year that Flight Simulator had sold hundreds of thousands of copies because customers with corporate PCs could claim that it was a "simulation". From mid-1985, however, what Compute! described as a "wave" of inexpensive IBM PC clones from American and Asian companies, such as the Tandy 1000, caused prices to decline; by the end of 1986, the equivalent of a $1600 real IBM PC with 256K RAM and two disk drives cost as little as $600, lower than the price of the Apple IIc. Consumers began purchasing DOS computers for the home in large numbers. Though clones were often purchased to do work in the evenings and on weekends, their popularity caused consumer-software companies to increase the number of IBM-compatible products, including those developed specifically for the PC rather than ported from other computers. Bing Gordon of Electronic Arts reported that customers used computers for games more than one fifth of the time whether purchased for work or a hobby, with many who purchased computers for other reasons finding PC games "a pretty satisfying experience". By 1987, the PC market was growing so quickly that the formerly business-only computer had become the largest, fastest-growing, and most important platform for computer game companies. DOS computers dominated the home, supplanting Commodore and Apple. More than a third of games sold in North America were for the PC, twice as many as those for the Apple II and even outselling those for the Commodore 64. With the EGA video card, an inexpensive clone had better graphics and more memory for games than the Commodore or Apple, and the Tandy 1000's enhanced graphics, sound, and built-in joystick ports made it the best platform for IBM PC-compatible games before the VGA era.
By 1988, the enormous popularity of the Nintendo Entertainment System had greatly affected the computer-game industry. A Koei executive claimed that "Nintendo's success has destroyed the [computer] software entertainment market". A Mindscape executive agreed, saying that "Unfortunately, its effect has been extremely negative. Without question, Nintendo's success has eroded software sales. There's been a much greater falling off of disk sales than anyone anticipated." A third attributed the end of growth in sales of the Commodore 64 to the console, and Trip Hawkins called Nintendo "the last hurrah of the 8-bit world". Experts were unsure whether it affected 16-bit computer games, but Hawkins, in 1990, nonetheless had to deny rumors that Electronic Arts would withdraw from computers and only produce console games. By 1993, ASCII Entertainment reported at a Software Publishers Association conference that the market for console games ($5.9 billion in revenue) was 12 times that of the computer-game market ($430 million). However, computer games did not disappear. By 1989, Computer Gaming World reported that "the industry is moving toward heavy use of VGA graphics". While some games were advertised with VGA support at the start of the year, they usually supported EGA graphics through VGA cards. By the end of 1989, however, most publishers had moved to supporting at least 320x200 MCGA, a subset of VGA. VGA gave the PC graphics that outmatched the Amiga's. Increasing adoption of the computer mouse, driven partially by adventure games such as the highly successful King's Quest series, and high-resolution bitmap displays allowed the industry to include increasingly high-quality graphical interfaces in new releases. Further improvements to game artwork and audio were made possible with the introduction of FM synthesis sound. Yamaha began manufacturing FM synthesis boards for computers in the early-to-mid-1980s, and by 1985, the NEC PC-88 and the FM-7 had built-in FM sound. The first PC sound cards, such as AdLib's Music Synthesizer Card, appeared in 1987. These cards allowed IBM PC compatible computers to produce complex sounds using FM synthesis, where they had previously been limited to simple tones and beeps. However, the rise of the Creative Labs Sound Blaster card, released in 1989, which featured much higher sound quality due to the inclusion of a PCM channel and digital signal processor, led AdLib to file for bankruptcy by 1992. Also in 1989, the FM Towns computer included built-in PCM sound, in addition to a CD-ROM drive and 24-bit color graphics. By 1990, DOS accounted for 65% of the computer-game market, with the Amiga at 10%; all other computers, including the Apple Macintosh, were below 10% and declining. Although both Apple and IBM tried to avoid customers associating their products with "game machines", the latter acknowledged that the VGA, audio, and joystick options for its PS/1 computer were popular. In 1991, id Software produced an early first-person shooter, Hovertank 3D, the first in the company's line of highly influential games in the genre. Several other companies also produced early first-person shooters, such as Arsys Software's Star Cruiser, which featured fully 3D polygonal graphics in 1988, and Accolade's Day of the Viper in 1989. Id Software went on to develop Wolfenstein 3D in 1992, which helped popularize the genre and kick-started what would become one of the highest-selling genres in modern times.
The game was originally distributed through the shareware distribution model, allowing players to try a limited part of the game for free but requiring payment to play the rest, and it represented one of the first uses of texture-mapped graphics in a popular game, along with Ultima Underworld. In December 1992, Computer Gaming World reported that DOS accounted for 82% of computer-game sales in 1991, compared to Macintosh's 8% and Amiga's 5%. In response to a reader's challenge to find a DOS game that played better than the Amiga version, the magazine cited Wing Commander and Civilization, and added that "The heavy MS-DOS emphasis in CGW merely reflects the realities of the market". A self-reported Computer Gaming World survey in April 1993 similarly found that 91% of readers primarily used IBM PCs and compatibles for gaming, compared to 6% for Amiga, 3% for Macintosh, and 1% for Atari ST, while a Software Publishers Association study found that 74% of personal computers were IBMs or compatibles, 10% Macintosh, 7% Apple II, and 8% other; 51% of the IBMs or compatibles had 386 or faster CPUs. By 1992, DOS games such as Links 386 Pro supported Super VGA graphics. While the leading Sega and Nintendo console systems kept their CPU speeds at 3–7 MHz, the 486 PC processor ran much faster, allowing it to perform many more calculations per second. The 1993 release of Doom on the PC was a breakthrough in 3D graphics, and was soon ported to various game consoles in a general shift toward greater realism. Computer Gaming World reiterated in 1994, "we have to advise readers who want a machine that will play most of the games to purchase high-end MS-DOS machines". By 1993, PC floppy disk games had a sales volume equivalent to about one-quarter that of console game ROM cartridge sales. A hit PC game typically sold about 250,000 disks at the time, while a hit console game typically sold several times as many cartridges. By spring 1994, an estimated 24 million US homes (27% of households) had a personal computer. 48% played games on their computer; 40% had a 486 CPU or higher; 35% had CD-ROM drives; and 20% had a sound card. Another survey found that an estimated 2.46 million multimedia computers had internal CD-ROM drives by the end of 1993, an increase of almost 2,000%. Computer Gaming World reported in April 1994 that some software publishers planned to distribute only on CD from 1995. CD-ROM had much larger storage capacity than floppies, helped reduce software piracy, and was less expensive to produce. Chris Crawford warned that it was "a data-intensive technology, not a process-intensive one", tempting developers to emphasize the quantity of digital assets like art and music over the quality of gameplay; Computer Gaming World wrote in 1993 that "publishers may be losing their focus". While many companies used the additional storage to release poor-quality shovelware collections of older software, or "enhanced" versions of existing ones (often with what the magazine mocked as "amateur acting" in the added audio and video), new games such as Myst included many more assets for a richer game experience. Many companies sold "multimedia upgrade kits" that bundled CD drives, sound cards, and software during the mid-1990s, but device drivers for the new peripherals further depleted scarce RAM. By 1993, PC games required much more memory than other software, often consuming all of conventional memory, while device drivers could go into upper memory with DOS memory managers (an example configuration appears below).
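A typical DOS 5/6-era setup looked something like the following; the driver names and paths are illustrative examples only, not a universal recipe. HIMEM.SYS provides access to extended memory, EMM386 provides expanded-memory emulation and upper memory blocks, and DEVICEHIGH/LH push device drivers and resident programs out of the 640K of conventional memory that games needed.

    REM -- CONFIG.SYS (illustrative) --
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE RAM
    DOS=HIGH,UMB
    DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001
    FILES=30

    REM -- AUTOEXEC.BAT (illustrative) --
    LH C:\DOS\MSCDEX.EXE /D:MSCD001
    LH C:\MOUSE\MOUSE.COM

Because EMM386's expanded-memory emulation conflicted with some games while others required it, players often ended up maintaining several alternative boot configurations, which is exactly the burden described below.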
Players found modifying CONFIG.SYS and AUTOEXEC.BAT files for memory management cumbersome and confusing, and each game needed a different configuration. (The game Les Manley in: Lost in L.A. satirizes this by depicting two beautiful women who exhaust the hero in bed by asking him to explain, yet again, the difference between extended and expanded memory.) Computer Gaming World provided technical assistance to its writers to help install games for review, and published sample configuration files. The magazine advised non-technical gamers to purchase commercial memory managers like QEMM and 386MAX, and criticized nonstandard software like Origin Systems's "infamous late and unlamented Voodoo Memory Manager", which used unreal mode. Contemporary PC gaming By 1996, the growing popularity of Microsoft Windows simplified device driver and memory management. The success of 3D console titles such as Super Mario 64 and Tomb Raider increased interest in hardware-accelerated 3D graphics on PCs, and soon resulted in attempts to produce affordable solutions with the ATI Rage, Matrox Mystique, S3 ViRGE, and Rendition Vérité. As 3D graphics libraries such as DirectX and OpenGL matured and knocked proprietary interfaces out of the market, these platforms gained greater acceptance, particularly with their demonstrated benefits in games such as Unreal. However, major changes to the Microsoft Windows operating system, by then the market leader, made many older DOS-based games unplayable on Windows NT and, later, Windows XP (without using an emulator, such as DOSBox). Faster graphics accelerators and improving CPU technology resulted in increasing levels of realism in computer games. During this time, the improvements introduced with products such as ATI's Radeon R300 and Nvidia's GeForce 6 Series allowed developers to increase the complexity of modern game engines. PC gaming currently tends strongly toward improvements in 3D graphics. Unlike the generally accepted push for improved graphical performance, the use of physics engines in computer games has been a matter of debate since the announcement and 2005 release of the AGEIA PhysX PPU (later acquired by Nvidia), ostensibly competing with middleware such as the Havok physics engine. Issues such as the difficulty of ensuring consistent experiences for all players, and the uncertain benefit of first-generation PhysX cards in games such as Tom Clancy's Ghost Recon Advanced Warfighter and City of Villains, prompted arguments over the value of such technology. Similarly, many game publishers began to experiment with new forms of marketing. Chief among these alternative strategies is episodic gaming, an adaptation of the older concept of expansion packs, in which game content is provided in smaller quantities but for a proportionally lower price. Titles such as Half-Life 2: Episode One took advantage of the idea, with mixed results arising from concerns about the amount of content provided for the price. Foreign PC gaming Polish During the 1980s, the cheap and talented workforce of the Polish People's Republic began producing video games, with the Warsaw company Karen, founded by the enterprising emigrant Lucjan Wencel, developing many hits that were released in the United States.
The 1991 strategy game "Solidarność" by Przemysław Rokita, in which players led a trade union to political victory, was the symbolic beginning of a new trend in which interactive works applied video game conventions to local Polish culture and history, and through a distorting mirror portrayed the Eastern Bloc, local villages, and the mentality of citizens. Developers in this age struggled with minimal profits, after-hours work, harsh working conditions, older computers, and an ignorance of foreign languages and sentiments. The country saw its own text-based games – e.g. Mózgprocesor (1989), arcade games – e.g. Robbo (1989), a football manager – Polish League (1995), a Doom clone – Cytadela (1995), and a The Settlers clone – Polanie (1995); however, the adventure game genre was the "most significant species in the 90s", a genre which was finally cracked with Tajemnica Statuetki. Tajemnica Statuetki was the first commercially released Polish adventure game, one of the first Polish and Polish-language video games ever, and Adrian Chmielarz's first game developed from start to finish – the first officially sold program that he wrote. It is sometimes erroneously considered the first Polish computer game, a distinction held by Witold Podgórski's 1961 mainframe game Marienbad, inspired by a Chinese puzzle called "Nim" and released on the Odra 1003. (Meanwhile, Polygamia writes that 1986's text-based Puszka Pandory is the first game written by a Pole, sold in Poland, and reviewed in the Polish press.) Despite this, Onet wrote in 2013 about a common misconception that the game marks the point where the history of digital entertainment in Poland begins. Platform characteristics Fidelity In high-end PC gaming, a PC will generally have far more processing resources at its disposal than other gaming systems. Game developers can use this to improve the visual fidelity of their game relative to other platforms, but even if they do not, games running on PC are likely to benefit from higher screen resolution, higher framerate, and anti-aliasing. Increased draw distance is also common in open world games. Better hardware also increases the potential fidelity of a PC game's rules and simulation. PC games often support more players or NPCs than equivalents on other platforms, and game designs which depend on the simulation of large numbers of tokens (e.g. Guild Wars 2, World of Warcraft) are rarely seen anywhere else. The PC also supports greater input fidelity thanks to its compatibility with a wide array of peripherals. The most common forms of input are the mouse/keyboard combination and gamepads, though touchscreens and motion controllers are also available. The mouse in particular lends players of first-person shooter and real-time strategy games on PC great speed and accuracy. Openness The defining characteristic of the PC platform is the absence of centralized control; all other gaming platforms (except Android devices, to an extent) are owned and administered by a single group. The advantages of openness include: Reduced software cost Prices are kept down by competition and the absence of platform-holder fees. Games and services are cheaper at every level, and many are free. Increased flexibility PC games decades old can be played on modern systems, through emulation software if need be. Conversely, newer games can often be run on older systems by reducing the games' fidelity and/or scale.
Increased innovation One does not need to ask for permission to release or update a PC game or to modify an existing one, and the platform's hardware and software are constantly evolving. These factors make the PC the centre of both hardware and software innovation. By comparison, closed platforms tend to remain much the same throughout their lifespan. There are also disadvantages, including: Increased complexity A PC is a general-purpose tool. Its inner workings are exposed to the owner, and misconfiguration can create enormous problems. Hardware compatibility issues are also possible. Game development is complicated by the wide variety of hardware configurations; developers may be forced to limit their design to run with sub-optimum PC hardware in order to reach a larger PC market, or to add a range of graphical and other settings to adjust for playability on individual machines, requiring increased development, test, and customer support resources. Increased hardware cost PC components are generally sold individually for profit (even if one buys a pre-built machine), whereas the hardware of closed platforms is mass-produced as a single unit and often sold at a smaller profit, or even a loss (with the intention of making a profit instead on online service fees and developer kits). Reduced security It is difficult, and in most situations ultimately impossible, to control the way in which PC hardware and software are used. This leads to far more software piracy and cheating than closed platforms suffer from. Modifications The openness of the PC platform allows players to edit or modify their games and distribute the results over the Internet as "mods". A healthy mod community greatly increases a game's longevity, and the most popular mods have driven purchases of their parent game to record heights. It is common for professional developers to release the tools they use to create their games (and sometimes even source code) in order to encourage modding, but if a game is popular enough, mods generally arise even without official support. Mods can compete with official downloadable content, however, or even redistribute it outright, and their ability to extend the lifespan of a game can work against its developers' plans for regular sequels. As game technology has become more complex, it has also become harder to distribute development tools to the public. Modding has a different connotation on consoles, which are typically restricted much more heavily. As publicly released development tools are rare, console mods usually refer to hardware alterations designed to remove restrictions. Dominant software Although the PC platform is almost completely decentralized at a hardware level, there are two dominant software forces: the Microsoft Windows operating system and the Steam distribution service. Microsoft introduced an operating environment named Windows on November 20, 1985, as an add-on to DOS in response to the growing interest in graphical user interfaces (GUIs). Microsoft Windows came to dominate the world's personal computer market with over 90% market share, overtaking Mac OS, which had been introduced in 1984. Valve does not release sales figures for its Steam service; instead, it provides the data only to companies with games on Steam, which cannot release it without permission because of a non-disclosure agreement with Valve.
However, Stardock, the previous owner of the competing platform Impulse, estimated that, as of 2009, Steam had a 70% share of the digital distribution market for video games. In early 2011, Forbes reported that Steam sales constituted 50–70% of the $4 billion market for downloaded PC games and that Steam offered game producers gross margins of 70% of purchase price, compared with 30% at retail. In 2011, Steam served over 780 petabytes of information, double what it had delivered in 2010. Digital distribution services PC games are sold predominantly through the Internet, with buyers downloading their new purchase directly to their computer. This approach allows smaller independent developers to compete with large publisher-backed games and avoids the speed and capacity limits of the optical discs on which most other gaming platforms rely. Valve released the Steam platform for Windows computers in 2003 as a means to distribute Valve-developed video games such as Half-Life 2. It was later released for Mac OS X in 2010 and for Linux in 2012. By 2011, it controlled 70% of the market for downloadable PC games, with a userbase of about 40 million accounts. Origin, a new version of the Electronic Arts online store, was released in 2011 in order to compete with Steam and other digital distribution platforms on the PC. The period from 2004 onward saw the rise of many digital distribution services on PC, such as Amazon Digital Services, GameStop, GFWL, EA Store, Direct2Drive, GOG.com, and GamersGate. Digital distribution also slashes the cost of circulation, eliminates stock shortages, allows games to be released worldwide at no additional cost, and allows niche audiences to be reached with ease. However, most digital distribution systems create ownership and customer-rights issues by storing access rights on distributor-owned computers. Games confer with these computers over the Internet before launching. This raises the prospect of purchases being lost if the distributor goes out of business or chooses to lock the buyer's account, and prevents resale (the ethics of which are a matter of debate). PC gaming technology Hardware Modern computer games place great demands on the computer's hardware, often requiring a fast central processing unit (CPU) to function properly. CPU manufacturers historically relied mainly on increasing clock rates to improve the performance of their processors, but had begun to move steadily towards multi-core CPUs by 2005. These processors allow the computer to simultaneously process multiple tasks, called threads, allowing the use of more complex graphics, artificial intelligence and in-game physics. Similarly, 3D games often rely on a powerful graphics processing unit (GPU), which accelerates the process of drawing complex scenes in real time. GPUs may be an integrated part of the computer's motherboard, the most common solution in laptops, or come packaged with a discrete graphics card with a supply of dedicated video RAM, connected to the motherboard through either an AGP or PCI-Express port. It is also possible to use multiple GPUs in a single computer, using technologies such as Nvidia's Scalable Link Interface and ATI's CrossFire. Sound cards are also available to provide improved audio in computer games. These cards offer improved 3D audio and audio enhancement that is generally not available with integrated alternatives, at the cost of marginally lower overall performance.
The Creative Labs Sound Blaster line was for many years the de facto standard for sound cards, although its popularity dwindled as PC audio became a commodity on modern motherboards. Physics processing units (PPUs), such as the Nvidia PhysX (formerly AGEIA PhysX) card, are also available to accelerate physics simulations in modern computer games. PPUs allow the computer to process more complex interactions among objects than is achievable using only the CPU, potentially allowing players a much greater degree of control over the world in games designed to use the card. Virtually all personal computers use a keyboard and mouse for user input, but there are exceptions. During the 1990s, before the keyboard and mouse combination had become the method of choice for PC gaming input, there were other peripherals such as the Mad Catz Panther XL, the First-Person Gaming Assassin 3D, and the Mad Catz Panther, which combined a trackball for looking/aiming with a joystick for movement. Other common gaming peripherals are a headset for faster communication in online games, joysticks for flight simulators, steering wheels for driving games and gamepads for console-style games. Software Computer games also rely on third-party software such as an operating system (OS), device drivers, libraries and more to run. Today, the vast majority of computer games are designed to run on the Microsoft Windows family of operating systems. Whereas earlier games written for DOS would include code to communicate directly with hardware, today application programming interfaces (APIs) provide an interface between the game and the OS, simplifying game design. Microsoft's DirectX is an API that is widely used by today's computer games to communicate with sound and graphics hardware. OpenGL, a cross-platform API for graphics rendering, is also used. The version of the installed graphics driver can often affect game performance and gameplay. In late 2013, AMD announced Mantle, a low-level API for certain models of AMD graphics cards, allowing for greater performance compared to software-level APIs such as DirectX, as well as simplifying porting to and from the PlayStation 4 and Xbox One consoles, which are both built upon AMD hardware. It is not unusual for a game company to use a third-party game engine, or third-party libraries for a game's AI or physics. Multiplayer Local area network gaming Multiplayer gaming was largely limited to local area networks (LANs) before cost-effective broadband Internet access became available, due to their typically higher bandwidth and lower latency than the dial-up services of the time. These advantages allowed more players to join any given computer game, and LAN gaming has persisted because of the higher latency of most Internet connections and the costs associated with broadband Internet. LAN gaming typically requires two or more personal computers, a router and sufficient networking cables to connect every computer on the network. Additionally, each computer must have its own copy (or spawn copy) of the game in order to play. Optionally, any LAN may include an external connection to the Internet. Online games Online multiplayer games have achieved popularity largely as a result of increasing broadband adoption among consumers.
Affordable high-bandwidth Internet connections allow large numbers of players to play together, and thus have found particular use in massively multiplayer online role-playing games and persistent online games such as Tanarus and World War II Online. Although it is possible to participate in online computer games using dial-up modems, broadband Internet connections are generally considered necessary in order to reduce the latency or "lag" between players. Such connections require a broadband-compatible modem connected to the personal computer through a network interface card (generally integrated onto the computer's motherboard), optionally separated by a router. Online games require a virtual environment, generally called a "game server". These servers interconnect gamers, allowing real-time and often fast-paced action. To meet this need, game server providers (GSPs) have become increasingly popular over the last half decade. While not required for all gamers, these servers provide a unique, fully customizable "home" (with additional modifications, settings, and so on), giving gamers the experience they desire. Today there are over 510,000 game servers hosted in North America alone. Emulation Emulation software, used to run software without the original hardware, is popular for its ability to play legacy video games without the platform for which they were designed. Operating system emulators include DOSBox, a DOS emulator which allows playing games originally developed for that operating system and thus not compatible with a modern-day OS. Console emulators such as Nestopia and MAME are relatively commonplace, although the complexity of modern consoles such as the Xbox or PlayStation makes them far more difficult to emulate, even for the original manufacturers. The most technically advanced consoles that can currently be successfully emulated for commercial games on PC are the PlayStation 2, using PCSX2, and the Nintendo Wii U, using the Cemu emulator. A PlayStation 3 emulator named RPCS3 is in development, although it can currently only run small homebrew games and certain old arcade titles that were originally ported to the PS3 from older platforms. Most emulation software mimics a particular hardware architecture, often to an extremely high degree of accuracy (a minimal fetch-decode-execute sketch appears at the end of this article). This is particularly the case with classic home computers such as the Commodore 64, whose software often depends on highly sophisticated low-level programming tricks invented by game programmers and the demoscene. Controversy PC games have long been a source of controversy, largely due to the depictions of violence that have become commonly associated with video games in general. The debate surrounds the influence of objectionable content on the social development of minors, with organizations such as the American Psychological Association concluding that video game violence increases children's aggression, a concern that prompted a further investigation by the Centers for Disease Control in September 2006. Industry groups have responded by noting the responsibility of parents in governing their children's activities, while attempts in the United States to control the sale of objectionable games have generally been found unconstitutional. Video game addiction is another cultural aspect of gaming to draw criticism, as it can have a negative influence on health and on social relations.
The problem of addiction and its health risks seem to have grown with the rise of massively multiplayer online role playing games (MMORPGs). Alongside the social and health problems associated with computer game addiction, similar worries have grown about the effect of computer games on education. Computer games museums There are several computer games museums around the world. In 2011, a computer game museum opened in Berlin; it documents computer games from the 1970s until today. The Museum of Art and Digital Entertainment, in Oakland, California, also exhibits PC games in its general collection. The Video Game Museum in Rome is dedicated to the preservation of videogames, and includes PC games in its collection. The Computer History Museum in Mountain View, California, holds a collection of PC games, and allows visitors to play Spacewar!, the first computer game, on a restored original DEC PDP-1. See also Game studies Handheld game console Video game console List of PC games Mobile game Split screen (video games) PC Master Race Microsoft Windows References External links Computer game museum in Berlin Video game platforms Personal computing Video game terminology
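The Emulation section above describes software that mimics a hardware architecture. The following Python sketch shows the core idea, a fetch-decode-execute interpreter loop, for a made-up three-instruction machine; the opcodes and guest program are invented for illustration and do not correspond to any real console or emulator.

```python
# Minimal interpreter loop for a hypothetical 3-instruction machine,
# illustrating how an emulator steps through a guest program one
# instruction at a time. Real emulators such as DOSBox or PCSX2 also
# model memory maps, timing and peripherals.

LOAD, ADD, HALT = 0, 1, 2  # invented opcodes for this sketch

def run(program):
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        opcode, operand = program[pc]  # fetch
        pc += 1
        if opcode == LOAD:             # decode and execute
            acc = operand
        elif opcode == ADD:
            acc += operand
        elif opcode == HALT:
            return acc

# Guest program: load 2, add 3, halt; prints 5.
print(run([(LOAD, 2), (ADD, 3), (HALT, 0)]))
```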
152693
https://en.wikipedia.org/wiki/Line%20editor
Line editor
In computing, a line editor is a text editor in which each editing command applies to one or more complete lines of text designated by the user. Line editors predate screen-based text editors and originated in an era when a computer operator typically interacted with a teleprinter (essentially a printer with a keyboard), with no video display and no ability to move a cursor interactively within a document. Line editors were also a feature of many home computers, avoiding the need for a more memory-intensive full-screen editor. Line editors are limited to typewriter-keyboard, text-oriented input and output methods. Most edits are made a line at a time. Typing, editing, and document display do not occur simultaneously. Typically, typing does not enter text directly into the document. Instead, users modify the document text by entering commands on a text-only terminal. Commands and text, and corresponding output from the editor, scroll up from the bottom of the screen in the order that they are entered or printed to the screen. Although the commands typically indicate the line(s) they modify, displaying the edited text within the context of larger portions of the document requires a separate command. Line editors keep a reference to the "current line" to which the entered commands usually are applied. In contrast, modern screen-based editors allow the user to interactively and directly navigate, select, and modify portions of the document. Generally, line numbers or a search-based context (especially when making changes within lines) are used to specify which part of the document is to be edited or displayed. Early line editors included Colossal Typewriter, Expensive Typewriter and QED. All three pre-dated the advent of UNIX; the first two ran on DEC PDP-1s, while the latter was a Unisys product. Numerous line editors are included with UNIX and Linux: ed is considered the standard UNIX editor, while ex extends it and has more features, and sed was written for pattern-based text editing as part of a shell script. GNU Readline is a line editor implemented as a library that is incorporated in many programs, such as Bash. For the first 10 years of the IBM PC, the only editor provided in DOS was the Edlin line editor. Line editors are still used non-interactively in shell scripts and when dealing with failing operating systems. Update systems such as patch traditionally used diff data converted into a script of ed commands; a minimal sketch of this line-addressed command style appears below. They are also used in many MUD systems, though many people edit text on their own computer using a MUD's download and upload features. See also ed Edlin GNU Readline QED
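As a rough illustration of the line-addressed command model described above, the following Python sketch implements a toy editor whose commands loosely mirror ed's print, delete, change and append operations. It illustrates the concept only, not the behaviour of any particular editor; the command set and sample document are invented.

```python
# Toy line editor: each command addresses a complete line by number,
# in the style of classic line editors such as ed. Invented for
# illustration; real editors also track a "current line" that commands
# default to, renumber lines after edits, and support search addresses.

def apply_commands(lines, script):
    for addr, cmd, text in script:
        if cmd == "p":                    # print a line for context
            print(f"{addr}\t{lines[addr - 1]}")
        elif cmd == "d":                  # delete the addressed line
            del lines[addr - 1]
        elif cmd == "c":                  # change (replace) the addressed line
            lines[addr - 1] = text
        elif cmd == "a":                  # append new text after the line
            lines.insert(addr, text)
    return lines

doc = ["The quick brown fox", "jumps ovr the lazy dog", "THE END"]
edits = [(2, "p", None),                       # show line 2 before editing
         (2, "c", "jumps over the lazy dog"),  # fix the typo on line 2
         (3, "d", None)]                       # delete line 3
for line in apply_commands(doc, edits):
    print(line)
```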
4113528
https://en.wikipedia.org/wiki/ConnNet
ConnNet
ConnNet was a packet-switched data network operated by the Southern New England Telephone Company serving the U.S. state of Connecticut. ConnNet was the nation's first local public packet-switching network when it was launched on March 11, 1985. Users could access services such as Dow Jones News Retrieval, CompuServe, Dialcom, GEnie, Delphi, Eaasy Sabre, NewsNet, PeopleLink, the National Library of Medicine, and BIX. ConnNet could also be used to access other national and international packet networks, such as Tymnet and ACCUNET. Large companies also connected their mainframe computers to ConnNet, allowing employees access to the mainframes from home. The network is no longer in operation. Hardware The X.25 network was based on hardware from Databit, Inc., consisting of three EDX-P Network Nodes that performed switching and were located in Hartford, New Haven and Stamford. Databit also supplied 23 ANP 2520 Advanced Network Processors, each of which provided the system with a point of presence, a network control center and modems. Customers would order leased-line connections into the network for host computers running at 4,800 to 56,000 bits per second (bit/s). Terminals would connect over a leased line at 1,200 to 9,600 bit/s synchronous or 300 to 2,400 bit/s asynchronous, or using dial-up connections at 300 to 1,200 bit/s. The connection to Tymnet was established over an X.75-based 9,600 bit/s analog link from the ConnNet Hartford node to Tymnet's Bloomfield node. See also Southern New England Telephone (SNET) References Southern New England Telephone (Mar 13, 1985). "SNET offers its Connecticut customers the first local packet switched data network in the nation." Press release. ConnNet Online Help. Accessed Jan 7, 1991. AT&T (Jan 29, 1986). Untitled. Press release. SNET / Packet/PC (Nov 12, 1987). "PC users can link to IBM mainframes with Packet/PC software and SNET's ConnNet." Press release. Scully, Sharon (June 2, 1986). "Protocol Conversion; SNET heralds services". Network World, p. 4. Databit (May 27, 1986). "Databit announces point-of-sale terminal application with Southern New England Telephone." Press release. Strauss, Paul R. (Jan 1, 1987). "Feature 1986: Information networking's quiet watershed year in review". Data Communications, p. 169. 1985 establishments in Connecticut Communications in Connecticut History of the Internet
34827496
https://en.wikipedia.org/wiki/XS%20Software
XS Software
XS Software JSCo is a Bulgarian worldwide producer, developer and publisher of cross-platform multiplayer online games. XS Software's headquarters are located in Sofia, Bulgaria. History The company was founded in 2004 by entrepreneur Hristo Tenchev, who is also the CEO and the creator of the first Bulgarian online browser game, Bulfleet. After the success of the game, the team started working on more projects and the company continued to grow. In 2005, the first version of the second title, Khan Wars, was released in Bulgaria, and in 2008 the internationalization of Khan Wars began. The game was first released in Poland with the support of one of the biggest horizontal portals, Wirtualna Polska. The company continued to grow by attracting more international investment, and in 2009, four new titles were released. Khan Wars was translated into 25 languages and released in more than 50 countries worldwide. In 2010, XS Software JSCo started the internationalization of all their games. The company translated its best-performing games into up to 40 languages, and they were subsequently released in 80 countries. Games Games developed by XS Software Khan Wars Lady Popular Nemexia (game) Andromeda5 Games published by XS Software Sofia Wars Taern Tropicalla Botva HeroZero Awards and nominations 2011, Galaxy News: Khan Wars named Best Online Game 2011 in Iran. 2011, Forbes Magazine, Forbes Business Awards: XS Software received the Employee of the Year Award. 2011, BHRMDA Annual HR Awards: in the category "Organizational architecture and design", XS Software won an award for its XS LEGO idea. See also Massively multiplayer online game MMORPG Browser game Cooperative gameplay Multiplayer online game Online game Spawn installation References Video game companies established in 2004 Video game companies of Bulgaria Browser-based game websites Mobile game companies Video game development companies Bulgarian companies established in 2004
488401
https://en.wikipedia.org/wiki/Sound%20design
Sound design
Sound design is the art and practice of creating sound tracks for a variety of needs. It involves specifying, acquiring or creating auditory elements using audio production techniques and tools. It is employed in a variety of disciplines including filmmaking, television production, video game development, theatre, sound recording and reproduction, live performance, sound art, post-production, radio, new media and musical instrument development. Sound design commonly involves performing (see e.g. foley) and editing of previously composed or recorded audio, such as sound effects and dialogue, for the purposes of the medium, but it can also involve creating sounds from scratch through synthesizers. A sound designer is one who practices sound design. History The use of sound to evoke emotion, reflect mood and underscore actions in plays and dances began in prehistoric times. At its earliest, it was used in religious practices for healing or recreation. In ancient Japan, theatrical events called kagura were performed in Shinto shrines with music and dance. Plays were performed in medieval times in a form of theatre called Commedia dell'arte, which used music and sound effects to enhance performances. The use of music and sound in the Elizabethan Theatre followed, in which music and sound effects were produced off stage using devices such as bells, whistles, and horns. Cues would be written in the script for music and sound effects to be played at the appropriate time. Italian composer Luigi Russolo built mechanical sound-making devices, called "intonarumori", for futurist theatrical and music performances starting around 1913. These devices were meant to simulate natural and man-made sounds, such as trains and bombs. Russolo's treatise, The Art of Noises, is one of the earliest written documents on the use of abstract noise in the theatre. After his death, his intonarumori were used in more conventional theatre performances to create realistic sound effects. Recorded sound Possibly the first use of recorded sound in the theatre was a phonograph playing a baby's cry in a London theatre in 1890. Sixteen years later, Herbert Beerbohm Tree used recordings in his London production of Stephen Phillips' tragedy Nero. The event is marked in the Theatre Magazine (1906) with two photographs; one showing a musician blowing a bugle into a large horn attached to a disc recorder, the other an actor recording the agonizing shrieks and groans of the tortured martyrs. The article states: "these sounds are all realistically reproduced by the gramophone". As cited by Bertolt Brecht, a play about Rasputin written in 1927 by Alexej Tolstoi and directed by Erwin Piscator included a recording of Lenin's voice. Whilst the term "sound designer" was not in use at this time, a number of stage managers specialised as "effects men", creating and performing offstage sound effects using a mix of vocal mimicry, mechanical and electrical contraptions and gramophone records. A great deal of care and attention was paid to the construction and performance of these effects, both naturalistic and abstract. Over the course of the twentieth century, the use of recorded sound effects began to take over from live sound effects, though often it was the stage manager's duty to find the sound effects and an electrician played the recordings during performances.
Between 1980 and 1988, Charlie Richmond, USITT's first Sound Design Commissioner, oversaw efforts of their Sound Design Commission to define the duties, responsibilities, standards and procedures which might normally be expected of a theatre sound designer in North America. This subject is still regularly discussed by that group, but during that time substantial conclusions were drawn, and he wrote a document which, although now somewhat dated, provides a succinct record of what was expected at that time. It was subsequently provided to both the ADC and David Goodman at the Florida USA local when they were both planning to represent sound designers in the 1990s. Digital technology MIDI and digital audio technology have contributed to the evolution of sound production techniques in the 1980s and 1990s. Digital audio workstations (DAWs) and the variety of digital signal processing algorithms applied in them allow more complex sound tracks, with more tracks as well as auditory effects, to be realized. Features such as unlimited undo and sample-level editing allow fine control over the sound tracks. In theatre sound, features of computerized theatre sound design systems have also been recognized as being essential for live show control systems at Walt Disney World and, as a result, Disney utilized systems of that type to control many facilities at their Disney-MGM Studios theme park, which opened in 1989. These features were incorporated into the MIDI Show Control (MSC) specification, an open communications protocol used to interact with diverse devices. The first show to fully utilize the MSC specification was the Magic Kingdom Parade at Walt Disney World's Magic Kingdom in September 1991. The rise of interest in game audio has also brought more advanced interactive audio tools that are accessible without a background in computer programming. Some such software tools (termed "implementation tools" or "audio engines") feature a workflow similar to that of more conventional DAW programs and can also allow the sound production personnel to undertake some of the more creative interactive sound tasks (considered part of sound design for computer applications) that previously would have required a computer programmer. Interactive applications have also given rise to a plethora of techniques in "dynamic audio", which loosely means sound that is "parametrically" adjusted during the run-time of the program. This allows for broader expression in sounds, more similar to that in films, because this way the sound designer can, for example, create footstep sounds that vary in a believable and non-repeating way and that also correspond to what is seen in the picture. The digital audio workstation cannot directly "communicate" with game engines, because a game's events often occur in an unpredictable order, whereas traditional digital audio workstations as well as so-called linear media (TV, film, etc.) have everything occur in the same order every time the production is run. Games in particular have also brought in dynamic or adaptive mixing. The World Wide Web has greatly enhanced the ability of sound designers to acquire source material quickly, easily and cheaply. Nowadays, a designer can preview and download crisper, more "believable" sounds, as opposed to toiling through time- and budget-draining "shot-in-the-dark" searches through record stores, libraries and "the grapevine" for (often) inferior recordings.
In addition, software innovation has enabled sound designers to take more of a DIY ("do-it-yourself") approach. From the comfort of their home, and at any hour, they can simply use a computer, speakers and headphones rather than renting (or buying) costly equipment or studio space and time for editing and mixing. This provides for faster creation and negotiation with the director. Applications Film In motion picture production, a Sound Editor/Designer is a member of a film crew responsible for the entirety or some specific parts of a film's sound track. In the American film industry, the title Sound Designer is not controlled by any professional organization, unlike titles such as Director or Screenwriter. The terms sound design and sound designer began to be used in the motion picture industry in 1969. At that time, the title of Sound Designer was first granted to Walter Murch by Francis Ford Coppola in recognition of Murch's contributions to the film The Rain People. The original meaning of the title Sound Designer, as established by Coppola and Murch, was "an individual ultimately responsible for all aspects of a film's audio track, from the dialogue and sound effects recording to the re-recording (mix) of the final track". The term sound designer has replaced monikers like supervising sound editor or re-recording mixer for what was essentially the same position: the head designer of the final sound track. Editors and mixers like Murray Spivack (King Kong), George Groves (The Jazz Singer), James G. Stewart (Citizen Kane), and Carl Faulkner (Journey to the Center of the Earth) served in this capacity during Hollywood's studio era, and are generally considered to be sound designers by a different name. The advantage of calling oneself a sound designer beginning in later decades was two-fold. It strategically allowed a single person to work as both an editor and mixer on a film without running into issues pertaining to the jurisdictions of editors and mixers, as outlined by their respective unions. Additionally, it was a rhetorical move that legitimised the field of post-production sound at a time when studios were downsizing their sound departments, and when producers were routinely skimping on budgets and salaries for sound editors and mixers. In so doing, it allowed those who called themselves sound designers to compete for contract work and to negotiate higher salaries. The position of Sound Designer therefore emerged in a manner similar to that of Production Designer, which was created in the 1930s when William Cameron Menzies made revolutionary contributions to the craft of art direction in the making of Gone with the Wind. The audio production team is a principal part of the production staff, with creative output comparable to that of the film editor and director of photography. Several factors have led to the promotion of audio production to this level, when previously it was considered subordinate to other parts of film: Cinema sound systems became capable of high-fidelity reproduction, particularly after the adoption of Dolby Stereo. Before stereo soundtracks, film sound was of such low fidelity that only the dialogue and occasional sound effects were practical. These sound systems were originally devised as gimmicks to increase theater attendance, but their widespread implementation created a content vacuum that had to be filled by competent professionals.
Dolby's immersive Dolby Atmos format, introduced in 2012, provides the sound team with 128 tracks of audio that can be assigned to a 7.1.2 bed that utilizes two overhead channels, leaving 118 tracks for audio objects that can be positioned around the theater independent of the sound bed. Object positions are informed by metadata that places them based on X, Y, Z coordinates and the number of speakers available in the room. This immersive sound format expands creative opportunities for the use of sound beyond what was achievable with older 5.1 and 7.1 surround sound systems. The greater dynamic range of the new systems, coupled with the ability to produce sounds at the sides, behind, or above the audience, provided the audio post-production team with new opportunities for creative expression in film sound. Some directors were interested in realizing the new potential of the medium. A new generation of filmmakers, the so-called "Easy Riders and Raging Bulls" (Martin Scorsese, Steven Spielberg, George Lucas, and others), were aware of the creative potential of sound and wanted to use it. Filmmakers were inspired by the popular music of the era. Concept albums of groups such as Pink Floyd and The Beatles suggested new modes of storytelling and creative techniques that could be adapted to motion pictures. New filmmakers made their early films outside the Hollywood establishment, away from the influence of film labor unions and the then rapidly dissipating studio system. The contemporary title of sound designer can be compared with the more traditional title of supervising sound editor; many sound designers use both titles interchangeably. The role of supervising sound editor, or sound supervisor, developed in parallel with the role of sound designer. The demand for more sophisticated soundtracks was felt both inside and outside Hollywood, and the supervising sound editor became the head of a large sound department, with a staff of dozens of sound editors, required to realize a complete sound job with a fast turnaround. Theatre Sound design, as a distinct discipline, is one of the youngest fields in stagecraft, second only to the use of projection and other multimedia displays, although the ideas and techniques of sound design have been around almost since theatre started. Dan Dugan, working with three stereo tape decks routed to ten loudspeaker zones during the 1968–69 season of the American Conservatory Theater (ACT) in San Francisco, was the first person to be called a sound designer. A theatre sound designer is responsible for everything the audience hears in the performance space, including music, sound effects, sonic textures, and soundscapes. These elements are created by the sound designer or sourced from other sound professionals, such as a composer in the case of music. Pre-recorded music must be licensed from a legal entity that represents the artist's work. This can be the artist themselves, a publisher, record label, performing rights organization or music licensing company. The theatre sound designer is also in charge of choosing and installing the sound system: speakers, sound desks, interfaces and converters, playout/cueing software, microphones, radio mics, foldback, cables, computers, and outboard equipment such as FX units and dynamics processors. Modern audio technology has enabled theatre sound designers to produce flexible, complex, and inexpensive designs that can be easily integrated into live performance.
The influence of film and television on playwriting means that plays are increasingly written with shorter scenes, a structure that is difficult to achieve with scenery but easily conveyed with sound. The development of film sound design has given writers and directors higher expectations and greater knowledge of sound design. Consequently, theatre sound design is widespread, and accomplished sound designers commonly establish long-term collaborations with directors. Musicals Sound design for musicals often focuses on the design and implementation of a sound reinforcement system that will fulfil the needs of the production. If a sound system is already installed in the performance venue, it is the sound designer's job to tune the system for the best use for a particular production. Sound system tuning employs various methods including equalization, delay, volume, speaker and microphone placement, and in some cases, the addition of new equipment. In conjunction with the director and musical director, if any, the sound reinforcement designer determines the use and placement of microphones for actors and musicians. The sound reinforcement designer ensures that the performance can be heard and understood by everyone in the audience, regardless of the shape, size or acoustics of the venue, and that performers can hear everything needed to enable them to do their jobs. While sound design for a musical largely focuses on the artistic merits of sound reinforcement, many musicals, such as Into the Woods, also require significant sound scores (see sound design for plays). Sound reinforcement design was recognized by the American Theatre Wing's Tony Awards with the Tony Award for Best Sound Design of a Musical until the 2014-15 season; the award was later reinstated in the 2017-18 season. Plays Sound design for plays often involves the selection of music and sounds (the sound score) for a production based on intimate familiarity with the play, and the design, installation, calibration and utilization of the sound system that reproduces the sound score. The sound designer for a play and the production's director work together to decide the themes and emotions to be explored. Based on this, the sound designer for plays, in collaboration with the director and possibly the composer, decides upon the sounds that will be used to create the desired moods. In some productions, the sound designer might also be hired to compose music for the play. The sound designer and the director usually work together to "spot" the cues in the play (i.e., decide when and where sound will be used). Some productions might use music only during scene changes, whilst others might use sound effects. Likewise, a scene might be underscored with music, sound effects or abstract sounds that exist somewhere between the two. Some sound designers are accomplished composers, writing and producing music for productions as well as designing sound. Many sound designs for plays also require significant sound reinforcement (see sound design for musicals). Sound design for plays was recognized by the American Theatre Wing's Tony Awards with the Tony Award for Best Sound Design of a Play until the 2014-15 season; the award was later reinstated in the 2017-18 season. Professional organizations Theatrical Sound Designers and Composers Association (TSDCA) The Association of Sound Designers is a trade association representing theatre sound designers in the UK.
United Scenic Artists (USA) Local USA829, which is integrated within IATSE, represents theatrical sound designers in the United States. Theatrical sound designers in English Canada are represented by the Associated Designers of Canada (ADC), and in Québec by l'Association des professionnels des arts du Québec (APASQ). Music In the contemporary music business, especially in the production of rock music, ambient music, progressive rock, and similar genres, the record producer and recording engineer play important roles in the creation of the overall sound (or soundscape) of a recording, and less often, of a live performance. A record producer is responsible for extracting the best performance possible from the musicians and for making both musical and technical decisions about the instrumental timbres, arrangements, and so on. On some projects, particularly more electronic ones, artists and producers in more conventional genres have sourced additional help from artists, often credited as "sound designers", to contribute specific auditory effects, ambiences, etc. to the production. These people are usually more versed in, for example, electronic music composition and synthesizers than the other musicians on board. In the application of electroacoustic techniques (e.g. binaural sound) and sound synthesis for contemporary music or film music, a sound designer (often also an electronic musician) sometimes refers to an artist who works alongside a composer to realize the more electronic aspects of a musical production. This is because there sometimes exists a difference in interests between composers and electronic musicians or sound designers: the latter specialise in electronic music techniques, such as sequencing and synthesis, while the former are more experienced in writing music in a variety of genres. Since electronic music is quite broad in its techniques, and often separate from techniques applied in other genres, this kind of collaboration can be seen as natural and beneficial. Notable examples of (recognized) sound design in music are the contributions of Michael Brook to the U2 album The Joshua Tree, George Massenburg to the Jennifer Warnes album Famous Blue Raincoat, Chris Thomas to the Pink Floyd album The Dark Side of the Moon, and Brian Eno to the Paul Simon album Surprise. In 1974, Suzanne Ciani started her own production company, Ciani/Musica, Inc., which became the #1 sound design music house in New York. Fashion In fashion shows, the sound designer often works with the artistic director to create an atmosphere fitting the theme of a collection, commercial campaign or event. Computer applications and other applications Sound is widely used in a variety of human–computer interfaces, in computer games and video games. There are a few extra requirements for sound production for computer applications, including reusability, interactivity and low memory and CPU usage. Because most computational resources are usually devoted to graphics, audio production should account for computational limits on sound playback, using audio compression or voice-allocation systems. Sound design for video games requires proficient knowledge of audio recording and editing using a digital audio workstation, and an understanding of game audio integration using audio engine software, audio authoring tools, or middleware to integrate audio into the game engine. Audio middleware is a third-party toolset that sits between the game engine and the audio hardware.
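As a rough illustration of the "dynamic audio" idea described earlier, in which a sound such as a footstep is varied parametrically at run-time so it never repeats exactly, the following Python sketch randomizes sample choice, pitch and volume per trigger. The class, parameters and file names are invented for illustration and do not correspond to any particular engine or middleware API.

```python
import random

# Minimal sketch of parametric sound variation: each footstep picks a
# different sample and applies small random pitch/volume offsets, so
# repeated triggers sound believable rather than identical.

class Footsteps:
    def __init__(self, samples):
        self.samples = samples      # pool of recorded footstep samples
        self.last = None            # avoid repeating the previous sample

    def trigger(self, surface="stone", speed=1.0):
        # Pick any sample except the one used last time.
        choices = [s for s in self.samples if s != self.last]
        sample = random.choice(choices)
        self.last = sample
        # Parametric variation: small random pitch and volume offsets,
        # scaled by the character's movement speed from the game engine.
        pitch = 1.0 + random.uniform(-0.05, 0.05)
        volume = min(1.0, 0.8 * speed + random.uniform(-0.1, 0.1))
        return {"sample": sample, "surface": surface,
                "pitch": round(pitch, 3), "volume": round(volume, 3)}

steps = Footsteps(["step_01.wav", "step_02.wav", "step_03.wav"])
for _ in range(4):                  # an engine would call this per footstep
    print(steps.trigger(speed=1.2))
```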
Interactivity with computer sound can involve using a variety of playback systems or logic, using tools that allow the production of interactive sound (e.g. Max/MSP or Wwise). Implementation might require software or electrical engineering of the systems that modify sound or process user input. In interactive applications, a sound designer often collaborates with an engineer (e.g. a sound programmer) who is concerned with designing the playback systems and their efficiency. Awards Sound designers have been recognized by awards organizations for some time, and new awards have emerged more recently in response to advances in sound design technology and quality. The Motion Picture Sound Editors and the Academy of Motion Picture Arts and Sciences recognize the finest or most aesthetic sound design for a film with the Golden Reel Awards for Sound Editing in the film, broadcast, and game industries, and the Academy Award for Best Sound, respectively. In 2021, the 93rd Academy Awards merged Best Sound Editing and Best Sound Mixing into one general Best Sound category. In 2007, the Tony Award for Best Sound Design was created to honor the best sound design in American theatre on Broadway. North American theatrical award organizations that recognize sound designers include: Dora Mavor Moore Awards Drama Desk Awards Helen Hayes Awards Obie Awards Joseph Jefferson Awards Major British award organizations include the Olivier Awards. The Tony Awards retired the awards for sound design as of the 2014-2015 season, then reinstated the categories in the 2017-18 season. See also Audio engineering Berberian Sound Studio Crash box Director of audiography List of sound designers Musique concrète IEZA Framework – a framework for conceptual game sound design References External links FilmSound.org: A Learning Space dedicated to the Art of Sound Design Kai's Theater Sound Hand Book Association of Sound Designers sounDesign: online publication about Sound Communication Sound production Film sound production Stagecraft Stage crew Theatrical occupations Theatrical sound production Design
7206721
https://en.wikipedia.org/wiki/Pumping%20%28computer%20systems%29
Pumping (computer systems)
Pumping, when referring to computer systems, is an informal term for transmitting a data signal more than once per clock signal. Overview Early types of system memory (RAM), such as SDRAM, transmitted data only on the rising edge of the clock signal. With the advent of double data rate synchronous dynamic RAM, or DDR SDRAM, data was transmitted on both the rising and falling edges. Quad pumping, meanwhile, has long been used for the front-side bus (FSB) of a computer system. This works by transmitting data at the rising edge, peak, falling edge, and trough of each clock cycle. Intel computer systems (and others) use this technology to reach effective FSB speeds of 1600 MT/s (million transfers per second), even though the FSB clock speed is only 400 MHz (cycles per second). A phase-locked loop in the CPU then multiplies the FSB clock by a factor in order to get the CPU speed. Example: A Core 2 Duo E6600 processor is listed as 2.4 GHz with a 1066 MHz FSB. The FSB is known to be quad-pumped, so its clock frequency is 1066/4 = 266 MHz. Therefore, the CPU multiplier is 2400/266, or 9×. The DDR2 RAM that it is compatible with is known to be double-pumped and to have an input/output bus running at twice the true FSB frequency (effectively transferring data 4 times per clock cycle), so to run the system synchronously (see front-side bus) the appropriate type of RAM is quadruple 266 MHz, or DDR2-1066 (PC2-8400 or PC2-8500, depending on the manufacturer's labeling). References Computer memory
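The example arithmetic above can be restated in a few lines of Python; this is only a sketch of the calculation already given in the article, and the helper function name is illustrative.

```python
# Worked example of the bus arithmetic above, using the Core 2 Duo
# E6600 figures: a quad-pumped 1066 MT/s FSB and a 2.4 GHz core clock.

def base_clock_mhz(effective_mt_s, pump_factor):
    """Base clock = effective transfer rate divided by transfers per cycle."""
    return effective_mt_s / pump_factor

fsb_clock = base_clock_mhz(1066, 4)    # quad-pumped: 1066/4 = 266.5 ~ 266 MHz
multiplier = round(2400 / fsb_clock)   # 2400 MHz core clock -> 9x multiplier
print(f"FSB clock ~ {fsb_clock:.0f} MHz, CPU multiplier {multiplier}x")
```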
23657850
https://en.wikipedia.org/wiki/Haley%20Anderson
Haley Anderson
Haley Danita Anderson (born November 20, 1991) is an American competitive swimmer and Olympic silver medalist. She placed second in the 10-kilometer open water event at the 2012 Summer Olympics. Personal Anderson's older sister, Alyssa, was a swimmer at Arizona. Both sisters competed at the 2009 World Aquatics Championships. Anderson attended the University of Southern California, where she swam for the USC Trojans swimming and diving team in National Collegiate Athletic Association (NCAA) competition from 2010 to 2013. Career At the 2009 Junior Pan Pacific Championships, Anderson placed first in the 800-meter and 1,500-meter freestyle events. At the 2009 USA Nationals and World Championship Trials, Anderson placed second in the 800-meter freestyle in 8:31.66, earning a place to compete at the 2009 World Aquatics Championships in Rome. At the World Championships, Anderson placed 28th in the 800-meter freestyle (8:45.91) and ninth in the 1,500-meter freestyle (16:20.62). In June 2012, Anderson qualified for the 2012 Summer Olympics by placing first in the 10-kilometer open water event at the FINA Olympic Marathon Swim Qualifier in Setubal. Anderson later competed at the 2012 United States Olympic Trials in the hope of also competing in the pool, but narrowly missed the team by finishing third in the 800-meter freestyle. She also competed in the 400-meter individual medley and placed eighth in the final. At the 2012 Olympics in London, Anderson earned a silver medal by placing second in the 10-kilometer marathon event, finishing four-tenths (0.40) of a second behind the winner, Éva Risztov of Hungary, over the 6.2 miles of the event. Her sister Alyssa earned a gold medal at the 2012 Olympics as a member of the winning U.S. team in the 4x200-meter freestyle relay. At the 15th FINA World Championships in Barcelona in 2013, Anderson won the gold medal in the 5-kilometer open water competition. At the AT&T Winter Nationals in Federal Way, Washington, Anderson won both the women's 800-meter freestyle and the women's 200-meter butterfly. Personal bests (long course) See also List of Olympic medalists in swimming (women) List of University of Southern California people USC Trojans References External links 1991 births Living people American female freestyle swimmers Olympic silver medalists for the United States in swimming Sportspeople from Santa Clara, California Swimmers at the 2012 Summer Olympics Swimmers at the 2016 Summer Olympics Swimmers at the 2020 Summer Olympics USC Trojans women's swimmers Medalists at the 2012 Summer Olympics Female long-distance swimmers World Aquatics Championships medalists in open water swimming Universiade medalists in swimming Universiade gold medalists for the United States Universiade silver medalists for the United States Medalists at the 2011 Summer Universiade
1127936
https://en.wikipedia.org/wiki/Australian%20High%20Tech%20Crime%20Centre
Australian High Tech Crime Centre
The Australian High Tech Crime Centre (AHTCC) is hosted by the Australian Federal Police (AFP) at their headquarters in Canberra. Under the auspices of the AFP, the AHTCC is party to the formal Joint Operating Arrangement established between the AFP, the Australian Security Intelligence Organisation and the Computer Network Vulnerability Team of the Australian Signals Directorate. The AHTCC is an Australia-wide policing initiative to coordinate the efforts of Australian law enforcement in combating serious, complex and multi-jurisdictional Internet-based crimes, particularly those beyond the capability of individual police agencies in Australia. Other roles include protecting the information infrastructure of Australia, and providing information to other law enforcement agencies to help combat online crime. Technological advancement and greater internet accessibility have driven a growth in cybercriminality, and the AFP established the AHTCC to prevent such crimes from occurring in the digital space. State and community police work in cooperation with the AFP to combat cybercrime. Overview The AHTCC was established in July 2003 to investigate online fraud and related matters, and is currently directed by Kevin Zuccato; it continues to fight cybercrime today. Commonwealth laws set out the cybercrime regulations in Australia. Cybercrime is defined by the AFP as "crimes directed at computers or other information communications technologies (ICTs) (such as computer intrusions and denial of service attacks), and crimes where computers or ICTs are an integral part of an offence (such as online fraud)". Digital attacks are often conducted by cyber adversaries: individuals or groups who specialise in conducting cyberattacks, online crime and malicious activity. Their activities can destroy online businesses and networks and compromise public or private data. Malicious cyber activity is a risk for organisations and governments, including, but not limited to, businesses that hold public data. In 1991, the World Wide Web was developed and shared, allowing crime to occur through the internet as well as offline. Online interactions and user relationships have heightened the ability of cybercrime to manifest and spread. The AHTCC's main role is to 'discover levels of online criminal activity' and to undertake necessary measures to prevent or combat digital crime (Platypus, 2009, p. 7). The AHTCC has partnered with large corporations and works alongside Australia's policing system to effectively manage web-based criminal schemes. The AFP conducts initiatives such as publishing reports that detail how the public should prevent cybercrime; these reports educate businesses on the possible threats of online transactions, customer communications, and other digital tools. The AHTCC's website has a section for victims of digital offences to report incidents and request help from the federal police. Notably, the AHTCC has mitigated online criminality and maintained effective cybercrime prevention. The AHTCC deals with different groups of cybercrime, including, but not limited to, hackers, viruses, scammers, identity thieves, and online terrorists.
In 2008, the AFP launched High Tech Crime Operations (HTCO) to investigate child sex crimes and digital acts of child exploitation. The exploitation of children online continues to be an issue globally. The AFP has several organisations that deal with online criminality. The Australian Cyber Security Centre (ACSC), which is operated by the Australian Government, specialises in digital safety and security and works closely with the AHTCC to fight cybercrime. The ACSC allows the public and private sectors to join forces in resolving cyber threats and possible attacks. The AFP also works with the Defence Signals Directorate (DSD) and the Australian Security Intelligence Organisation (ASIO) to effectively manage cyber-attacks that would harm Australia's business environment and public sphere. History The AHTCC was established in 2003 as a response to the rapid rise in digital usage and technological capabilities. The growth in Internet potential provided new capacities for criminals to engage in online fraudulent activities. Cybercrime has existed since 1978, when the internet was first established and dispersed. Security systems have realised the need for continuous data moderation and the monitoring of user activity online and offline. With the rise of e-commerce transactions, a "dark web" black market has emerged that targets internet users. In 2003, at the beginning of the AHTCC, the centre had a budget of $4 million per annum and was operated by 13 staff. Currently, the AFP has 258 staff working in technology and innovation and 319 in the security sector, out of a total of 6,695 staff across the board. The AHTCC and the AFP have adapted their work in line with the rise in web use. In 1991, the police became able to make video and audio records of interviews, as a new wave of digital records emerged. To this day, the AFP is extremely present online, focusing on detecting criminals before they know they have been detected (AFP, 2020). Cybercrime prevention is an expensive exercise for the AFP. As of 2013, there were an estimated 10 billion devices connected to the internet, and with this increasing digital usage come more targeted attacks and criminal possibilities. The security software company Symantec Corporation estimates that, in 2010, cybercrime cost US$388 billion in losses globally (Weber, 2014, p. 54). The Cybercrime Act was introduced by the Australian Government in 2001, outlining prohibited online offences. Statistically, cybercrime is significantly impacting Australians: in 2015, 1 in 4 Australians reported being victims of identity theft (Veda Report, 2015). In 2015, Australia suffered a significant data breach from overseas: an Australian insurance company was the target of a cyberattack in which criminals gained access to prohibited government systems and compromised the finances of notable businesses, damaging Australia's security systems. The AFP resolved the breach. The AFP is working closely with Australians to avoid cybercrime. "Veda's research from 2015 found that 95% of Australians were taking some kind of active precaution to protect their identity" (Veda Report, 2015). Australia's Approach Australia follows the Commonwealth cybercrime regime, a legislative system that formed the Cybercrime Act 2001, an Act that covers computer crimes, traditional offline offences committed online, digital data misuse and other internet criminality.
This Act has since been amended; in 2012 it was updated by the Cybercrime Amendment Act, expanding its definition of cyber offences as the range of online criminal possibilities broadened. However, the current cybercrime policies in Australia do not acknowledge 'computer fraud' as a criminal offence under Commonwealth law. 'Given the borderless nature of cybercrime, it is an oversight to not have a national approach to computer related fraud' (Weber, p. 68). Roles and Functions Cybercrime takes numerous forms, including identity theft, digital scams, hacking, online fraud, and phishing. It is the role of the AHTCC, in conjunction with the AFP and several other government bodies, to prevent and reduce criminal behaviour online. The AFP is a partner in the Virtual Global Taskforce (VGT), in which major countries and organisations join forces in combating cybercrime; together, they tackle international digital offences. Cybercrime is not just personal account hacking: it can involve breaching the security of government corporations and networks, having a broader national impact. The Australian Government established the Australian Cybercrime Online Reporting Network (ACORN) to allow victims of cybercrime, and likewise organisations, to report such occurrences. In 2015, ACORN recorded 39,491 cybercrime incidents, and similarly, "Australian Federal Police (AFP) Commander David McLean reported that, in one month alone, over 3,500 people had contacted police about perceived cyber-crime" (Broadhurst, 2017). There is widespread dissemination of pornographic content on the internet. User anonymity is exploited to conduct this behaviour, as tracing the criminal behind the crime is difficult. Some estimates suggest that 20,000 images of child pornography are posted online each week (Broadhurst, 2017). Importantly, the AHTCC is taking action against child abuse material, which has continued to circulate on the internet. The Australian police are focusing closely on child predators, targeting criminals who are a threat to society, to keep the online community safe and free from malicious and dangerous offences. Users who access and choose to view illegal content online are considered, in the eyes of Australian law, as guilty as those who publish the footage, and thus are not free from prosecution themselves. Educating the public about these offences is part of the Australian Government's security initiatives. Technologies are helping the AFP combat cybercrime. In April 2016, the Australian Prime Minister allocated A$230 million to fund cybercrime and security measures under AHTCC operations. Funding is important for the safety of businesses and individuals. Notable Cases The Australian Federal Police established the AHTCC to effectively control and manage cybercrime, and the AFP has thus far been successful in this field. According to the AFP, 'over the past two years, more than 300 Australians have been arrested and charged in relation to the online sexual exploitation of children'; concurrently, children at risk from criminals were freed from danger and educated (AFP, 2020). A large part of cybercrime is pedophilia. Organisations of international pedophiles use digital devices and platforms to exploit children, through strategies such as webcams and online mechanisms such as anonymity. The AFP's budget is extensive, covering a vast number of cybercrime cases and solutions.
In 2014, an online Russian crime syndicate hacked Australian companies and accessed 500,000 Australians' financial details. The AHTCC arrested members of the gang, seven of whom were in Victoria and thirteen in Spain. The AHTCC was responsible for effectively combating this fraudulent activity, preventing a major crime from damaging the Australian economy and having a broader national impact. The AHTCC is capable of managing domestic and international cybercrime and works alongside large nations to develop plans of combat. Recently, the AHTCC was highly successful in a case known as 'Operation Carpo', in which a Western Australian man was prosecuted for holding 56,000 credit card details, 53,000 usernames and passwords and 110,000 domain names. The success of this case prevented several other organised cybercrimes in Australia (AFP). The Future Australia is facing an increase in cybercriminality because new technologies can be defeated and maliciously targeted. The Australian Government continues to develop and improve its digital infrastructure and security, but loopholes are discovered by crime groups. Australia's wealth and high use of digital devices attract criminality towards its organisations and businesses. Further, there is a lack of information reported about cybercrime, which poses difficulty for government bodies, such as the AHTCC, in investigating and resolving the problem. As technology continues to dominate the marketplace, the AHTCC will be vitally needed. The AFP will always require investigatory teams such as the AHTCC to mitigate cybercriminality. The AFP declares that "Our ongoing commitment is to stay at least one step ahead of the criminals who are involved in any form of online crime" (AFP, 2020). The Government, in 2016, introduced the Cyber Security Strategy, worth more than $230 million, to improve and maintain nationwide digital security. One of the initiatives of this program was to increase the number of cybersecurity workers and to equip more professionals with the necessary tools and skills to uphold these qualifications. Institutions were set up to offer cybersecurity courses and to ensure Australia remains at the front line of digital crime prevention and technological advancement. The director of the AHTCC, Kevin Zuccato, works with other countries to establish cybersecurity on a global scale. The AHTCC is part of the Strategic Alliance Cyber Crime Working Group, which focuses on transnational cybercrime and on building strong global relationships between countries and governments to combat worldwide cyber-attacks together. The Australian Government has also implemented numerous measures and strategies to prevent cybercrime, including, but not limited to, risk-reduction protocols, multi-stakeholder approaches to internet governance, internet neutrality, support for human rights online, improved connectivity, technology innovation and digital-ready workforces. All cyber defenders must work closely together to minimise cybercriminality. To do so, Australians and their businesses must report any security-breach incidents to the Australian Federal Police, who will provide advice and assistance in preventing such crimes from occurring in the future. Further, the AFP requires the reporting of cybercrime offences, as it provides the information needed to curb criminal behaviour online.
See also Crime in Australia Law enforcement in Australia Australian Federal Police Australian Government Virtual Global Taskforce Australian Cyber Security Centre Australian Security Intelligence Organisation References External links Australian High Tech Crime Centre Australian Federal Police Australian Cyber Security Centre Australian Security Intelligence Organisation Law enforcement in Australia Federal law enforcement agencies of Australia Cybercrime
12915464
https://en.wikipedia.org/wiki/Nelson%20Ford
Nelson Ford
Nelson Ford (born 1946) is one of the founders of shareware software distribution; a founder of HAL-PC (the Houston Area League of PC Users, a PC user group which grew to over 10,000 members) and of the Association of Shareware Professionals; the founder of the Public (software) Library, the largest commercial library of public domain and shareware software; and the founder of the first major order-processing service for shareware programmers. In 1984, through his shareware column in Softalk-PC magazine, he was responsible for standardizing the use of the term shareware for free-trial software. He wrote several shareware games: CardShark Hearts, CardShark Spades, and CardShark Bridge Tutor. Nelson Ford was inducted into the Shareware Hall of Fame in August 2001. Background Nelson Ford was born in 1946 in San Antonio, Texas, USA. He served four years in the United States Marine Corps, including 19 months in Vietnam. He graduated from the University of Texas at Austin with a BBA in Accounting and moved to Houston, Texas, where he met and married Kay Hightower Ford. He became a Certified Public Accountant in Texas and worked for Daniel Industries, Inc. and Pennzoil Company before forming Public (software) Library. He and Kay are retired and living in Hot Springs Village, Arkansas. HAL-PC In 1979, Nelson Ford got his first personal computer, a Radio Shack Model II. In 1980, he got one of the first IBM PCs available in Texas and shortly after that helped start the Houston Area League of PC Users, which became the largest PC user group in the country with well over 10,000 members. User groups in other major cities were normally run by the few people who had started the group, which tended to limit their growth. As president of HAL-PC, Nelson Ford established a system of special interest groups (SIGs) and had the leaders of the SIGs also serve as HAL-PC's board of directors. With this source of new leadership for HAL-PC and term limits for officers, HAL-PC was assured that no one person or small group of people would run the group into the ground. The SIGs also ensured that all members could find something in HAL to match their particular computer interests. Association of Shareware Professionals In 1985, Nelson Ford began working on a conference of shareware programmers, bulletin board system operators, and shareware disk distributors with the goal of creating an industry trade organization. As a result of the attendance and hard work of such industry leaders as Jim "Button" Knopf and Bob Wallace and many others, the Association of Shareware Professionals was created in 1987. Nelson Ford served on the first Board of Directors. After more than 25 years, the ASP is still a very active, important organization for shareware professionals. In 2001, Nelson Ford was inducted into the ASP Hall of Fame. Public (software) Library One of Nelson Ford's interests in the HAL-PC user group was swapping public domain and shareware software with other members. He eventually created a large, organized library of programs, and his group made copies for other members for a disk fee. In 1984, Nelson Ford wrote a column named The Public Library in Softalk-PC magazine. When people around the world were not able to get programs discussed in the column because they lacked economical access to bulletin board systems, they wrote to Nelson Ford asking for copies, which he also made for a disk fee. This service quickly snowballed into a full-time job, resulting in the creation of Public (software) Library (PsL) to provide the service. 
Nelson's wife, Kay Ford, ran the operation of PsL while Nelson ran the technical side. Programmers who learned of the service sent their software to PsL to be added to the library, eventually at the rate of hundreds of programs a month. PsL hired technicians to test, review and write up programs in PsL's monthly magazine and annual catalog, which grew to well over 1000 pages. Eventually, as the volume of software increased and CD-ROM drives in PCs became common, most of PsL's shareware distribution shifted to CD-ROMs, where many hundreds of programs could be put on each monthly CD-ROM. During the boom years of shareware disk distribution, new vendors were popping up all the time. As the CD-ROM and the Internet took over, these disk vendors died out, thus leaving PsL the first (1980) and most likely the very last (1997) company to distribute shareware on diskettes. Shareware order processing In the 1980s and early 1990s, virtually no shareware authors had the ability to accept credit card orders at all, much less via live operators at toll-free numbers, the way most people are accustomed to making such orders. Authors could accept only cash or checks mailed directly to them and thus missed out on potential sales. In the late 1980s, PsL initiated an order-processing service for shareware authors in which live operators took orders over the phone at toll-free numbers. This was not an easy service to provide, as banks were reluctant to give credit card merchant accounts to mail- or phone-order businesses. PsL had to change banks several times and one time lost money when a bank went bankrupt. But eventually things settled down, and some programmers began receiving from thousands to tens of thousands of dollars a month in orders, substantially more than they would have received accepting payment only by cash or check. Originally, PsL provided its service to programmers at its cost, but as hundreds of programmers signed up, economies of scale actually made the service profitable. With the eventual spread of the Internet, PsL added Internet order processing to its services. With the growth of the Internet, by 1998, the distribution of shareware by disk and CD-ROM was beginning to wane while order processing was booming, and with 13+ years of 100-hour work weeks taking their toll on Nelson and Kay Ford, they sold PsL to Digital River, Inc., an NYSE-listed, online-order-processing company, and retired. The Shareware name Many terms were being used for freely distributable software in the early 1980s. In his column on such software in Softalk-PC magazine, Nelson Ford held a contest to come up with a standard name. The most popular name was shareware, and that name was adopted for generic use. (See the Association of Shareware Professionals web site.) CardShark games In the mid-1980s, Nelson Ford wrote CardShark Hearts, for playing Hearts against three computer opponents, CardShark Spades, for playing Spades against the computer, and CardShark Bridge Tutor, for learning to play Contract Bridge. References Publications with articles about Nelson Ford. Article quoting Nelson Ford about changes in shareware. 1946 births Living people People from San Antonio United States Marines University of Texas at Austin alumni American accountants
1426032
https://en.wikipedia.org/wiki/Spaceship%20Earth%20%28Epcot%29
Spaceship Earth (Epcot)
Spaceship Earth is a dark ride attraction at the Epcot theme park at the Walt Disney World Resort in Bay Lake, Florida. The geodesic sphere in which the attraction is housed has served as the symbolic structure of Epcot since the park opened in 1982. The 15-minute ride takes guests on a time machine-themed experience, demonstrating how advancements in human communication have helped to create the future one step at a time. Riding in Omnimover-type vehicles along a track that spirals up and down the geodesic sphere, passengers are taken through scenes depicting important breakthroughs in communication throughout history: from the development of early language through cave paintings, to the use of hieroglyphs, to the invention of the alphabet, to the creation of the printing press, to today's modern communication advancements, including telecommunication and mass communication. Since its 1982 opening, the ride has been updated three times: in 1986, 1994, and 2008. On February 25, 2020, the Disney Parks Blog announced that Spaceship Earth would be closing for refurbishment on May 26, 2020. However, its refurbishment is currently on indefinite hold due to the COVID-19 pandemic. Structure The structure is similar in texture to the United States pavilion from Expo 67 in Montreal but, unlike that structure, Spaceship Earth is a complete sphere, supported by three pairs of legs. The architectural design was conceived by Wallace Floyd Design Group. The structural designs of both the Expo 67 pavilion and Spaceship Earth were completed by Simpson Gumpertz & Heger Inc. of Boston, Massachusetts. Geometrically, Spaceship Earth is derived from the Class 2 geodesic polyhedron with frequency of division equal to 8. Each face of the polyhedron is divided into three isosceles triangles to form each point. In theory, there are 11,520 total isosceles triangles forming 3,840 points. In reality, some of those triangles are partially or fully nonexistent due to supports and doors; there are actually only 11,324 silvered facets, with 954 partial or full flat triangular panels (this arithmetic is restated in the short sketch below). The appearance of being a monolithic sphere is an architectural goal that was achieved through a structural trick. Spaceship Earth is in fact two structural domes. Six legs are supported on pile groups that are driven up to 160 feet into Central Florida's soft earth. Those legs support a steel box-shaped ring at the sphere's perimeter, at about 30 degrees south latitude in Earth terms. The upper structural dome sits on this ring. A grid of trusses inside the ring supports two helical structures of the ride and show system. Below the ring, a second dome is hung from the bottom, completing the spherical shape. The ring and trusses form a table-like structure which separates the upper dome from the lower. Supported by, and about three feet off, the structural domes is a cladding sphere to which the shiny Alucobond panels and drainage system are mounted. The cladding was designed so that when it rains, no water pours off the sides onto the ground. All water is collected through one-inch gaps in the facets into a gutter system, and the water is channeled into the World Showcase Lagoon. History Design and construction The structure was designed with the help of science fiction writer Ray Bradbury, who also helped write the original storyline for the attraction. The term "Spaceship Earth" was popularized by Buckminster Fuller, who also popularized the geodesic dome. Construction took 26 months. 
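As a quick check of the facet arithmetic above, here is a minimal Python sketch; the numbers are the ones quoted in this article, and the variable names are purely illustrative:

points = 3840                     # each subdivided face of the polyhedron forms one "point"
triangles_per_point = 3           # every point is built from three isosceles triangles
theoretical_facets = points * triangles_per_point
print(theoretical_facets)         # 11520 triangles in theory

silvered_facets = 11324           # facets actually present on the cladding
flat_panels = 954                 # partial or full flat triangular panels
print(theoretical_facets - silvered_facets)  # 196 triangles absent due to supports and doors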
Extending upwards from the table are "quadropod" structures, which support smaller beams that form the shell of the steel skeleton. Pipes stand the aluminum skin panels away from the skeleton and provide space for utilities. A small service car is parked in the interstitial space between the structural and cladding surfaces, and it can carry a prone technician down the sides to access repair locations. The shop fabrication of the steel (done in nearby Tampa, Florida) was an early instance of computer-aided drafting and materials processing. Spaceship Earth was originally sponsored by the Bell System from 1982 until 1984, when the Bell System was broken up into smaller companies and its former parent company, AT&T, continued as an independent company. AT&T sponsored Spaceship Earth from 1984 until 2004. From 2005 until 2017, the German company Siemens was the sponsor of Spaceship Earth. As of 2021, the ride has no sponsor. The private sponsor lounge, located on the second floor above Project Tomorrow, is currently used for special events. Dedication During Epcot Center's opening ceremony, William Ellinghaus, then president of AT&T, dedicated Spaceship Earth, stating: "Now as you will soon see, Spaceship Earth's theme is communications, civilization and communications from Stone Age to Information Age, and I therefore think it is very fitting that we dedicate Spaceship Earth to all of the people who have advanced communications, arts, and sciences, and in so doing have demonstrated that communications is truly the beginning of understanding." Ride updates From October 1, 1982, to May 25, 1986, the attraction experience began with the ride vehicles moving up into the structure through a lighted tunnel, enhanced by a fog machine, and then ascending on a spiraling track up through dark spaces featuring a series of lighted historic vignettes. The attraction featured actor Vic Perrin as the narrator, along with a very simple and quiet orchestral composition throughout. The theme of "communication through the ages" is presented in chronological order in settings peopled with Audio-Animatronics figures. Cavemen are seen telling stories using wall carvings; Egyptians work on hieroglyphics and papyrus as a pharaoh inspects the final result. A Greek theater presents actors declaiming Oedipus Rex. Charioteers carry messages from a Roman court, and Jewish and Islamic scholars discuss texts. With typical Disney whimsy, a monk is seen having fallen asleep on a manuscript he was inscribing. Michelangelo, overhead, paints the ceiling of the Sistine Chapel, and Gutenberg mans his printing press. Suggesting the rush of 20th-century technology, subsequent scenes meld together as the circumference of the ride track narrows. A newsboy hawks papers, a movie marquee and film clips represent motion pictures, and radio and television make their appearance. On the right side of the track, the vehicles then pass a wall with angled windows looking out into the stars, a glass wall with a mainframe computer blinking behind it, and a woman observing technical readouts on various screens. On the left side of the track, a man and a woman are seen working at a network operations center with a data map of the United States. As the vehicles reach the large space at the apex of the ride system, guests see, on the planetarium ceiling of the sphere, a projection of "our spaceship earth", and then they pass under a large lighted space station with two astronauts working on satellites and a woman sitting in the station operating controls. 
The Omnimover vehicles then revolve 180 degrees and pass under the woman through the station's "hangar", so that the passengers lie backward facing the "sky" as they begin their descent on a relatively straight track, passing monitors showing a variety of events and activities. The ride stops intermittently as wheelchairs are loaded or unloaded. From May 26 to 28, 1986, the attraction was given a slight remodel. This second version of the attraction started off with the lighted tunnel enhanced by twinkling lights, meant to depict stars, with the fog machine removed. News journalist Walter Cronkite was the new narrator, reading from an updated script. Two new scenes were added before the network operations center, on the left side of the track, featuring a woman working in a "paperless office" and a boy at a computer in his bedroom (the right side, featuring a woman observing technical readouts, remained the same). A theme song called "Tomorrow's Child" was composed for the ending of the attraction, which was redesigned with projected images of children on screens to fit the theme of "Tomorrow's Child". Between August 15 and November 23, 1994, the attraction was closed to receive a major remodel. This third version of the attraction kept the lighted tunnel as it was in 1986, and it maintained the majority of the scenes depicted in the beginning and middle of the attraction. Three 1980s scenes toward the end of the attraction were removed: a computer in a boy's bedroom, a woman's office, and a network operations center. These were replaced with a single scene depicting a boy and girl using the Internet to communicate between America and Asia. The ride received an updated script narrated by Jeremy Irons. A new orchestral composition, based upon Bach's Sinfonia No. 2 in C Minor, was composed for the entire attraction. The ending was completely redone, with the updating of the projected Earth and removal of the Space Station scene from the planetarium (the Space Station astronauts subsequently turned up in Space Mountain's post-show, where they were used until 2009), as well as the replacement of the 1982 and 1986 ending scenes with miniature architectural settings connected by color-changing fiber-optic cables and arrays of blinking lights representing electronic communication pathways. Wand and update In celebration of the year 2000, a 25-story "magic wand" held by a representation of Mickey Mouse's hand was built next to the sphere. Inspiration for it came from the Sorcerer's Apprentice sequence of Fantasia (although Mickey did not actually use a magic wand in that sequence). At the top of the structure was a large cut-out of the number 2000. This structure was constructed to have a lifetime of about 10 years, and it was left standing after the Millennium Celebration ended. In 2001, the number 2000 was replaced with the word "Epcot" in a script font that differed from the park's logotype. On July 5, 2007, Epcot Vice President Jim MacPhee announced that Spaceship Earth would be restored to its original appearance, and that the "magic wand" structure would be removed in time for the park's 25th anniversary on October 1, 2007. It was rumored that Siemens AG, the new sponsor of Spaceship Earth, requested that the wand be removed as it did not fit their corporate image. The attraction was closed on July 9, 2007, and by October 1, the wand structure, stars, and their supports were gone, replaced by palm trees and other plants. Components of the structure were later auctioned on eBay. 
The closure also saw the ride's fourth update, which included new scenes and modifications to existing scenes, some new costumes, lighting, and props, a new musical score by Bruce Broughton, new narration by Judi Dench, and a new interactive ending. New scenes showed a Greek classroom, mainframe computers, and the creation of the personal computer. The attraction opened for "soft launch" previews starting in December 2007. After some last-minute adjustments in January, the ride had its official re-opening on February 15, 2008. The "time machine" vehicles now have an interactive screen where riders can choose their vision of the future, an idea resembling one used in the now-defunct Horizons attraction. At the beginning of the ride, a camera takes riders' pictures (using facial recognition technology) which are used at the end of the ride to conduct an interactive experience about the future of technology, featuring the riders' faces on animated characters, with narration by Cam Clarke. Visitors are also asked where on our "Spaceship Earth" they live; this is used in the post-show area, where a map of the world is displayed with the riders' faces shown where they live. On June 30, 2017, Siemens, a long-time sponsor, announced they would end their sponsorship of the attraction, as well as of the fireworks show, IllumiNations: Reflections of Earth. The last official day of Siemens sponsorship was October 10, 2017. World Celebration update On August 25, 2019, it was announced that, as part of a multi-year renovation of Epcot, Future World would become three new areas: World Celebration, World Discovery and World Nature. Spaceship Earth (under the name Spaceship Earth: Our Shared Story) would become a part of World Celebration and would be updated with a new narrative about the human experience and the art of storytelling. An ethereal "story light" would guide guests as they travel through the attraction. The attraction's new exit would be Dreamers Point, an elevated area which would present a panoramic view of the park and would feature a lush garden with a "wishing tree", an interactive fountain, and a statue of Walt Disney. On July 15, 2020, when EPCOT officially reopened, both the "Spaceship Earth: Our Shared Story" and Mary Poppins attraction sequences were removed from the EPCOT Experience. When asked for comment, a spokesperson for Disney said, "As with most businesses during this period, we are further evaluating long-term project plans. The decision was made to postpone development of the 'Mary Poppins'-inspired attraction and Spaceship Earth at this time." Ride experience As the ride was built on an Omnimover system, there are no triggered ride events. Rather, a narration plays as the show scenes and music run on a loop. The script, originally penned by Ray Bradbury, has since been updated to meet contemporary technological trends. The current narrator is Judi Dench, who is accompanied by an orchestral score by Bruce Broughton. Show scenes The ride begins with the time-machine vehicles ascending into a dark tunnel with twinkling stars all around. An adventurous orchestral theme starts to play, and the score then shifts to the theme's ostinato, a leitmotif that comes to represent digital interference. On touchscreens in the vehicle, guests select their language and hometown, and then have their picture taken by a passing camera. As the vehicle arrives at the first story of the structure, it begins a slow curve. 
A large film screen is stretched along the inside of the sphere, depicting early humans fighting for survival against a woolly mammoth without any form of communication or language. As the screen dims behind them, guests enter a cavern populated by audio-animatronic early humans, who represent the development of early language through cave paintings. The drawings on the walls come to life and begin to dance as the car continues onward. The score modulates, presenting the theme in a Phrygian mode and implying a Middle Eastern atmosphere. Guests are brought through a heated diorama of the Egyptians, who invented a system of portable communication using hieroglyphs recorded on papyrus, as opposed to cave paintings that could not be transported as humans migrated. Phoenician merchants are seen carrying goods to faraway lands. The narration explains how each civilization is trying to communicate but cannot understand the others due to the language barrier. The Phoenicians, who trade with all of them, create a simple common alphabet, so that trade and communication become easier. Turning a corner, riders see a lesson in mathematics being taught in a piazza in an ancient Greek city, in a sequence that attempts to show how math gave birth to the high-tech life we enjoy today. Shifting to ancient Rome, a night scene including a traveler in a chariot delivering news depicts language as a tool for cultural unification, carried along the vast network of roads that stretched across Europe, ultimately all leading to Rome. Suddenly, the scene takes a dark turn as crashes are heard and the smell of burning wood fills the air. The fall of Rome to invading mercenary armies also brought the destruction of the bulk of the world's recorded knowledge, including the loss of scrolls at the Library of Alexandria. But the narration gives hope as the vehicle reaches the next level, where Jewish and Islamic scholars of the Middle Ages are seen preserving recorded information and continuing to progress in science. Winding through exotic fabrics and drapery, guests arrive at a monastery where biblical manuscripts are being copied by hand. The composition shifts to a hallelujah chorus, sung to the melody of the piece's exposition. Gutenberg is seen working the first movable-type printing press, allowing information to travel freely across the globe. The European Renaissance is portrayed with animatronics playing rich, polyphonic secular motets, sculpting a figure of a woman, and painting a still life of fruit. The scene ends as the car passes under a scaffold, where Michelangelo is seen painting the ceiling of the Sistine Chapel. The time machines transition to a post-Civil War North. Guests witness syndicated news reports informing the planet of current events with amazing efficiency. Loud, industrial-sized printing presses show the incredible influence of the machine as an advancement in mass communication. As guests pass the clanging sounds of the press, the score's theme is presented again, this time as an uptempo ragtime piano. Seen next is a romanticized version of the 20th-century communications revolution: after passing telegraphs, radio, telephones, and movies, riders see the 1969 television broadcast of Apollo 11 landing on the Moon, featuring Walter Cronkite. Riders hear Neil Armstrong say his most famous line, "That's one small step for man, one giant leap for mankind," as the vehicles pass by the TV. 
Language had progressed to such an extent that it was no longer spoken solely by humans, but by machines as well. Guests turn a corner and find themselves in a large mainframe computer as they ascend the final hill. At the top, a slow descent starts, progressing through a garage in California, where a man is seen building one of the first home computers. The score becomes suddenly percussive and dramatic as guests fly through a tunnel with computer code projected onto the walls. At a crescendo, the car makes its final turn into the cupola of Spaceship Earth. The top of the structure is, in fact, a planetarium studded with stars and a large projection of a rotating Earth. Before the Omnimover vehicles start to move down the long descent to the unloading area, they rotate 180 degrees clockwise, and guests ride the end of the attraction backwards. The final scene has been redone multiple times, most recently to remove the animatronic scenes. The remainder of the ride moves past a seemingly infinite number of stars and into a realm of glowing triangles. The guests can then use the touchscreens in their Omnimover vehicle to answer questions to create a possible depiction of their future, which uses the pictures taken at the beginning of the ride. Guests are then invited to visit Project Tomorrow as they exit the ride cars. Post-show Earth Station The original post show for Spaceship Earth was called Earth Station. It lasted from 1982 until 1994. It was a wide-open exhibit space that included: EPCOT Center Guest Relations Seven large rear-projector screens mounted on the walls of the exhibit space toward the ceiling that displayed visual previews of various EPCOT Center attractions. WorldKey Information: Interactive kiosks that offered previews of various EPCOT Center attractions. Guests could also talk to a live cast member via two-way closed-circuit video, or make a restaurant reservation while in the park. Global Neighborhood When AT&T renewed their sponsorship in 1994, they redesigned the exhibit space for Earth Station into the Global Neighborhood. The original Global Neighborhood lasted from 1994 until 1999. In 1999, the exhibit space was updated to become the New Global Neighborhood for the Millennium Celebration. The exhibit space closed in 2004 after AT&T left as sponsor. Project Tomorrow: Inventing the Wonders of the Future AT&T's departure as sponsor in 2004 caused the exhibit to close. Siemens AG, the newest sponsor of Spaceship Earth, having signed on in 2005, created a new exhibit space called Project Tomorrow: Inventing the Wonders of the Future. The new exhibit space once again uses the entire area that Earth Station had once used. It houses interactive exhibits featuring various Siemens AG technology. These interactive displays and games allow guests to see the future of medicine, transportation and energy management. The space opened with two games, with two new games added in December 2007 and January 2008. After Siemens dropped their sponsorships, all signs mentioning them were removed; however, the name stayed the same. Project Tomorrow's current attractions are: An illuminated globe that shows the hometown of all Spaceship Earth visitors for the day. Body Builder – a 3-D game that challenges guests to reconstruct a human body. Features the voice of Wallace Shawn as Dr. Bones. Super Driver – a driving simulation video game featuring vehicle accident and avoidance systems. It simulates what is supposed to be the future of driving. 
You drive a "smart-car" and try to stop the city from being destroyed. Power City – a large, digital "shuffleboard-style" game that has guests racing around the board to power their city. InnerVision – a coordination and reaction-time game with elements similar to Simon and Dance Dance Revolution. VIP Lounge A VIP lounge, operated by the pavilion sponsor, exists above the post-show area of Spaceship Earth. Employees of the current sponsoring company and their guests can relax in the lounge while visiting Epcot. The sponsor can also hold receptions in the space as well as conduct workshops and business presentations. When Spaceship Earth was without sponsorship from 2004 to 2005, the room was utilized for private events such as weddings and conventions. The layout is small and curved in shape, with one wall consisting of large windows through which visitors can look out onto the park. When Siemens AG took over as sponsor, the lounge was given the name "Base21." In 2012, the name was dropped and it became known simply as the "Siemens VIP Center." In August 2017, Siemens vacated the lounge, and Disney took it over. Timeline October 1, 1982: Spaceship Earth opens with the opening of EPCOT Center, sponsored by the Bell System. The narrator is Vic Perrin. May 26, 1986: Attraction reopens from first major renovation. AT&T is now the sponsor, having signed on in 1984. New narration by Walter Cronkite. Finale music changed to Tomorrow's Child. August 15, 1994: Closes for second major renovation. "Home computer", "Office Computer", "Network Operations Center", and "Space Station" scenes removed. New final scenes installed in place of the old ones. Earth Station closes. Tomorrow's Child ending removed. November 23, 1994: Attraction reopens. New ride narration by Jeremy Irons. New ride score by Edo Guidotti. The Global Neighborhood replaces Earth Station. September 29, 1999: The Mickey Mouse arm holding a wand is dedicated with "2000" over Spaceship Earth. November 24, 1999: The Global Neighborhood is replaced with The New Global Neighborhood, a new exhibit space serving as a hands-on playground for Spaceship Earth's post show. May 2001: The Mickey Mouse arm holding a wand is changed to say "Epcot" over Spaceship Earth. January 1, 2004: AT&T Corporation sponsorship ends. April 2004: The New Global Neighborhood is removed and the area is boarded up. AT&T references removed. November 2005: It is announced that Siemens AG will sponsor Spaceship Earth for twelve years. April 11, 2007: Major changes coming to Spaceship Earth are announced. April 25, 2007: The new exhibit space in Spaceship Earth's post show, called Project Tomorrow: Inventing the Wonders of the Future, opens. July 5, 2007: Epcot Vice President Jim MacPhee announces the removal of the wand structure in time for the park's 25th anniversary on October 1, 2007. July 9, 2007: Closes for a fourth renovation. Removal of the wand structure begins. August 24, 2007: Removal of the wand structure completed. December 2007: Guest previews of fourth edition begin. February 15, 2008: Fourth edition opens to the general public. New narration by Dame Judi Dench. March 4, 2008: Spaceship Earth is rededicated. October 1, 2012: Spaceship Earth and Epcot celebrate their 30th anniversary. June 30, 2017: Siemens announces the end of their Disney sponsorships, including Spaceship Earth. October 1, 2017: Spaceship Earth and Epcot celebrate their 35th anniversary. October 10, 2017: Official last day of the Siemens sponsorship. 
June 20, 2020: Large-scale refurbishment postponed indefinitely. Narrators Vic Perrin: October 1, 1982 – May 25, 1986 Walter Cronkite: May 29, 1986 – August 15, 1994 Jeremy Irons: November 23, 1994 – July 9, 2007 Judi Dench: February 15, 2008 – present See also Epcot attraction and entertainment history References Further reading External links Walt Disney World Resort - Spaceship Earth Intercot's Spaceship Earth page AT&T Archive video of the opening of Spaceship Earth 1982 establishments in Florida Amusement rides introduced in 1982 Audio-Animatronic attractions Buildings and structures completed in 1982 Dark rides Epcot Future World (Epcot) World Celebration Geodesic domes Omnimover attractions Ray Bradbury Walt Disney Parks and Resorts attractions Walt Disney Parks and Resorts icons Siemens
34226513
https://en.wikipedia.org/wiki/Spyder%20%28software%29
Spyder (software)
Spyder is an open-source cross-platform integrated development environment (IDE) for scientific programming in the Python language. Spyder integrates with a number of prominent packages in the scientific Python stack, including NumPy, SciPy, Matplotlib, pandas, IPython, SymPy and Cython, as well as other open-source software. It is released under the MIT license. Initially created and developed by Pierre Raybaut in 2009, Spyder has been maintained and continuously improved since 2012 by a team of scientific Python developers and the community. Spyder is extensible with first-party and third-party plugins, includes support for interactive tools for data inspection, and embeds Python-specific code quality assurance and introspection instruments, such as Pyflakes, Pylint and Rope. It is available cross-platform through Anaconda, on Windows, on macOS through MacPorts, and on major Linux distributions such as Arch Linux, Debian, Fedora, Gentoo Linux, openSUSE and Ubuntu. Spyder uses Qt for its GUI and is designed to use either the PyQt or PySide Python bindings. QtPy, a thin abstraction layer developed by the Spyder project and later adopted by multiple other packages, provides the flexibility to use either backend (a short usage sketch appears at the end of this article). Features Features include: An editor with syntax highlighting, introspection, code completion Support for multiple IPython consoles The ability to explore and edit variables from a GUI A Help pane able to retrieve and render rich text documentation on functions, classes and methods automatically or on-demand A debugger linked to IPdb, for step-by-step execution Static code analysis, powered by Pylint A run-time Profiler, to benchmark code Project support, allowing work on multiple development efforts simultaneously A built-in file explorer, for interacting with the filesystem and managing projects A "Find in Files" feature, allowing full regular expression search over a specified scope An online help browser, allowing users to search and view Python and package documentation inside the IDE A history log, recording every user command entered in each console An internal console, allowing for introspection and control over Spyder's own operation Plugins Available plugins include: Spyder-Unittest, which integrates the popular unit testing frameworks Pytest, Unittest and Nose with Spyder Spyder-Notebook, allowing the viewing and editing of Jupyter Notebooks within the IDE (install using conda: conda install spyder-notebook -c spyder-ide, or using pip: pip install spyder-notebook) Spyder-Reports, enabling use of literate programming techniques in Python Spyder-Terminal, adding the ability to open, control and manage cross-platform system shells within Spyder (install using conda: conda install spyder-terminal -c spyder-ide, or using pip: pip install spyder-terminal) Spyder-Vim, containing commands and shortcuts emulating the Vim text editor Spyder-AutoPEP8, which can automatically conform code to the standard PEP 8 code style Spyder-Line-Profiler and Spyder-Memory-Profiler, extending the built-in profiling functionality to include testing an individual line, and measuring memory usage See also List of integrated development environments for Python programming language References External links Documentation Free integrated development environments Free integrated development environments for Python Python (programming language) development tools Free mathematics software Free science software Python (programming language) software Software using the MIT license
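To illustrate the QtPy abstraction layer described in the article above, here is a minimal hedged sketch. It assumes the qtpy package and at least one Qt binding (for example PyQt5) are installed; QT_API is the environment variable QtPy reads to select its backend, and the label text is arbitrary:

import os
os.environ.setdefault("QT_API", "pyqt5")  # assumption: PyQt5 is installed; "pyside2" also works

# The import path is identical no matter which binding QtPy selects.
from qtpy.QtWidgets import QApplication, QLabel

app = QApplication([])
label = QLabel("Hello from QtPy")  # a trivial widget, just to show the shared API
label.show()
app.exec_()

Because application code imports only from qtpy, switching between PyQt and PySide requires no source changes, which is the flexibility the article attributes to the layer.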
33932700
https://en.wikipedia.org/wiki/StarWind%20Software
StarWind Software
StarWind Software, Inc. is a privately held Beverly, Massachusetts-based computer software and hardware appliance company specializing in storage virtualization and software-defined storage. History StarWind Software began in 2008 as a spin-off from Rocket Division Software, Ltd. (founded in 2003), with a round A investment from the venture capital firm ABRT. In 2009, it began providing early adopters with initially free software-defined storage offerings, including its V2V (virtual-to-virtual) image converter and iSCSI SAN software. In 2013, hard drive manufacturer Western Digital began integrating StarWind's iSCSI engine with some of the company's Network Attached Storage (NAS) appliances. In mid-April 2014, StarWind Software closed a round B investment from Almaz Capital and AVentures Capital. In August 2015, StarWind announced a combined software-hardware product called HyperConverged Appliance. In April 2016, StarWind was selected by research firm Gartner as one of its 2016 "Cool Vendors for Compute Platforms". In February 2020, StarWind's hyperconverged infrastructure (HCI) software, StarWind VSAN, set performance benchmarks on off-the-shelf commodity hardware. In December 2020, StarWind was named to Gartner's Magic Quadrant for HCI software. Products StarWind develops standards-based storage virtualization and management software that will run on any x86 platform. Its software-defined storage software supports building iSCSI, iSER, NVM Express over Fabrics (NVMe-oF), and NFSv3/v4 and SMB3 NAS using commodity hardware. Its products include: StarWind HyperConverged Appliance - the company's HyperConverged Appliance bundles StarWind's VSAN software with third-party server and storage hardware, along with storage management software and hypervisors such as VMware's or Microsoft's Hyper-V. Virtual SAN (VSAN) software - HCI software which allows customers to set up and operate a storage area network supporting clustering and multi-host access on any standard 64-bit or 32-bit Windows server. The software acts as the storage back end for virtualized servers such as VMware vSphere server and Microsoft Hyper-V, and also supports standard server applications requiring network storage, such as Microsoft Exchange or SQL Server. The open source NVMe SPDK for Windows Server is used to support the NVMe-oF uplink protocol, together with iSCSI and iSER. Free NAS & SAN - software for converting commodity servers into iSCSI and NFS/SMB3 targets. V2V Converter - a free converter allowing back-and-forth conversion of virtual storage files between VMware's open format Virtual Machine Disk (VMDK) and Virtual Hard Disk (VHD), the format for Microsoft's hypervisor (virtual machine system), Hyper-V. P2V Migrator - a free tool for converting physical servers to various virtual machine formats. StarWind VTL - a virtual tape library that can be used to replace tape drives with less expensive Serial ATA (SATA) drives, and to offload storage to the public cloud for long-term storage. StarWind iSCSI Accelerator – a driver that improves CPU utilization on multi-core CPUs when used with the Microsoft iSCSI Initiator. StarWind NVMe-oF Initiator – Windows software used to initiate the NVM Express open logical-device specification. It maps the NVMe driver over an RDMA or TCP network to make remote NVMe drives appear to be attached to a physical server. Operations StarWind is headquartered in Beverly, Massachusetts. 
References External links Software companies based in Massachusetts Software companies established in 2008 American companies established in 2008 2008 establishments in the United States 2008 establishments in Massachusetts Companies based in Essex County, Massachusetts Beverly, Massachusetts Privately held companies based in Massachusetts Software companies of the United States
32442603
https://en.wikipedia.org/wiki/Lifeboat%20Associates
Lifeboat Associates
Lifeboat Associates was a New York City company that was one of the largest microcomputer software distributors in the late 1970s and early 1980s. Lifeboat acted as an independent software broker, marketing software to major hardware vendors such as Xerox, HP and Altos. As such, Lifeboat Associates was instrumental in the founding of Autodesk and also financed the creation of PC Magazine. Overview Lifeboat was founded in 1976 or 1977 by Larry Alkoff and Tony Gold. By mid-1981 the company had same-name affiliates in England, Switzerland, France, Germany, Japan and Oakland, California. PC Magazine in 1982 wrote that Lifeboat "has published and marketed more CP/M application programs on more 8-bit machines than anyone in the world", and in 1983 InfoWorld said that Lifeboat was the largest publisher of microcomputer software in the world. Lifeboat Associates successfully combined many roles, including publisher and distributor, and actively solicited authors for software products that met its standards. The company distributed T/Maker (written by Peter Roizen), one of the first spreadsheet programs designed for the personal computer user, which went a step beyond the similar VisiCalc program by offering text-processing capability, and The Boss Financial Accounting System (written by John Burns), a $2495 package for CP/M users that was one of the first accounting programs for microcomputers. In addition, Lifeboat Associates started collecting and distributing user-written "free" software, initially for the CP/M operating system. One of the first such programs was XMODEM, which allowed reliable communication via modem and phone line (a short sketch of its framing appears at the end of this article). In June 1986, Voyager Software Corp acquired Lifeboat Associates. Later in 1986, Programmer's Paradise was started by Voyager Software as a catalog marketer of technical software. In 1988, Voyager acquired Corsoft Inc., a corporate reseller founded in 1983, and combined it with the operations of the Programmer's Paradise catalog and Lifeboat Associates, both of which marketed technical software for microcomputers. In May 1995, Voyager Software Corp. changed its name to "Programmers Paradise, Inc." and consolidated its U.S. catalog and software publishing operations in a new subsidiary, Programmers Paradise Catalogs, Inc., and its wholesale distribution operations in a new subsidiary, Lifeboat Distribution, Inc. In July 1995, Programmer's Paradise completed an initial public offering of its common stock. Programmer's Paradise, Inc. changed its name to Wayside Technology Group, Inc. in August 2006. Products T/Maker (Table Maker) – one of the first spreadsheet programs designed for the personal computer user The Boss – Financial Accounting System Software Bus-80, also known as SB-80 – a version of CP/M-80 for 8080/Z80 8-bit computers Software Bus-86, also known as SB-86 – a version of MS-DOS for x86 16-bit computers See also Software Bus References External links Defunct software companies of the United States Companies established in the 1970s 1970s establishments in the United States
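The reliability XMODEM provided, mentioned in the overview above, came from simple block framing and per-block checksums. The following Python sketch is an illustrative reconstruction of the classic 128-byte-block, additive-checksum variant, not Ward Christensen's original code; the function name is invented for the example:

SOH = 0x01  # start-of-header byte that opens every XMODEM block
PAD = 0x1A  # SUB character used to pad a short final block

def xmodem_block(block_num, payload):
    # Classic framing: SOH, the block number, its complement (255 - n),
    # 128 data bytes, then a single additive checksum byte.
    data = payload.ljust(128, bytes([PAD]))
    header = bytes([SOH, block_num & 0xFF, 0xFF - (block_num & 0xFF)])
    checksum = sum(data) & 0xFF  # the receiver recomputes this and ACKs or NAKs the block
    return header + data + bytes([checksum])

print(len(xmodem_block(1, b"hello")))  # 132 bytes on the wire per block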
10439152
https://en.wikipedia.org/wiki/Daemon%20%28novel%20series%29
Daemon (novel series)
Daemon and Freedom™ comprise a two-part novel by the author Daniel Suarez about a distributed, persistent computer application, the Daemon, that begins to change the real world after the original programmer's death. Daemon (2006) paperback; (2009) hardcover re-release Freedom™ (2010) Plot Upon publication of the obituary for Matthew A. Sobol, a brilliant computer programmer and CTO of CyberStorm Entertainment, a daemon is activated. Sobol, dying of brain cancer, was fearful for humanity and began to envision a new world order. The Daemon becomes his tool to achieve that vision. The Daemon's first mission is to kill two programmers, Joseph Pavlos and Chopra Singh, who worked for CyberStorm Entertainment and unknowingly helped in the creation of the Daemon. The program secretly takes over hundreds of companies and provides financial and computing resources for recruiting real-world agents and creating AutoM8s (computer-controlled driverless cars, used as transport and occasionally as weapons), Razorbacks (sword-wielding robotic riderless motorcycles, specifically designed as weapons) and other devices. The Daemon also creates a secondary online web service, hidden from the general public, dubbed the Darknet, which allows Daemon operatives to exchange information freely. The Daemon implements a kind of government by algorithm inside the community of its recruited operatives. What follows is a series of interlocking stories following the main characters: Detective Pete Sebeck is called in to investigate the death of Pavlos. However, when a connection is made between the two programmers and CyberStorm, the FBI takes over, led by Agent Decker. Because Sebeck was the first authority figure in the investigation, the Daemon selects him against his will to serve it, and frames him for its creation, presenting it as a multi-million-dollar scheme and a hoax. The US government, though it knows the truth, fast-tracks Sebeck's trial and executes him eight months later. Sebeck makes peace with his wife, who loves him despite the fact that he is having an affair, but his son Chris remains estranged, and Sebeck proclaims his innocence while dying from lethal injection. However, Sebeck later awakens to learn that the Daemon faked his death and assigned him the task of proving that humanity deserves its freedom from the Daemon. Joined by a fellow operative named Laney Price, Sebeck vanishes into America. Jon Ross, a Russian hacker and identity thief, is questioned by the FBI and brought into the investigation by Sebeck. Unfortunately, traditional investigation methods are useless against Sobol's Daemon program. Ross eventually deduces that the Daemon can anticipate their every move, seemingly one step ahead of anyone who tries to interfere with its operation. Even after being named in the Daemon hoax (and put on the FBI's most wanted list), Ross willingly helps the US government to stop the program. Assigned to the NSA's anti-Daemon task force with Agent Philips, he is a firsthand witness to Loki's attack on the installation and barely survives the massacre that follows. With his immunity deal rescinded, he vanishes underground, intent on destroying the Daemon on his own. Agent Roy "Tripwire" Merritt, a decorated FBI agent, is brought in to secure Sobol's property when several FBI agents and police officers are killed by an automated Hummer that attacks anyone who approaches. 
A longtime military officer and expert in hostage situations, he realizes that Sobol's estate is a death trap and a red herring, but fear of the Daemon forces his hand and his team is ordered to secure the site regardless. His team is quickly killed, and he remains the lone survivor, infiltrating the house and accidentally triggering a bomb, which levels the property. Blamed for the failure, he is relieved of duty but is later brought onto the anti-Daemon task force by the Major. When Loki is revealed to have infiltrated the building, Roy pursues him against orders. Fearful of the publicity that the chase will generate, the Major kills Roy himself. Despite being an enemy of the Daemon, Roy becomes a folk hero of the Darknet, known as "The Burning Man" by the Darknet users, who respect him for his tenacity. NSA Agent Natalie Philips is a genius workaholic government cryptographer. Philips joins the investigation shortly after the FBI is called in. Eventually, she is placed in charge of the anti-Daemon task force, but she finds plenty of interference from the Major. She is attracted to Jon Ross (the attraction is mutual), but she quickly states that national security will take precedence and their relationship will remain professional. Philips objects to the killing of Sebeck to protect infected corporate systems from the Daemon's wrath. One of a handful of survivors of Loki's attack, Philips is blamed for the failure and relieved of her duties. Brian Gragg, aka "Loki Stormbringer", is a sociopathic loner and avid gamer. He makes a living through identity theft and other cyber crimes. After running afoul of some hackers from the Philippines, he allows his partner in crime, Jason Heider, to be killed in his place. Needing to lie low, Loki is recruited by the Daemon after outthinking a hidden game level in one of Sobol's games. Loki is the first Daemon operative and quickly becomes one of the most powerful. His behavior, though useful to the Daemon, is hated and feared even by other Darknet members. His first major act is to infiltrate the anti-Daemon task force. When found out, he quickly triggers an attack which leaves most of the people and agents there dead. Pursued by Roy Merritt as he escapes, he witnesses the Major executing Roy and vows to kill the Major for betraying his own man. The Major, unnamed throughout the series, is introduced as a secret DOD liaison assigned to the Daemon task force. Soon, everyone who encounters him realizes that his history is checkered, and that his loyalty remains with the military-industrial complex now under attack by the Daemon. When Loki massacres the task force, he quickly contains the situation by destroying all evidence (including leveling the building) and personally executing Roy Merritt, fearful that Merritt's pursuit of Loki will attract too much attention. Realizing that they have underestimated the Daemon and its network, the Major retreats and prepares to wage a secret war against the Daemon and its agents. Anji Anderson is a recently fired reporter whose good looks have hindered her career for years. Having been relegated to fluff pieces and put on the air to be pretty, she is quickly recruited as a Daemon operative; her job is to investigate stories that benefit the Daemon and help push its propaganda. Her main effect on the story is to help frame Sebeck. She eventually becomes the spokesperson for the Daemon. 
Charles Mosely is a former drug dealer and convicted killer recruited by the Daemon, which helps him escape prison by having him transferred first to minimum security and then released altogether. With a new identity, he travels to a Daemon-controlled office, where he is interrogated by the Daemon's AI and deemed acceptable to serve. He eventually becomes a security operative, assigned jobs such as executing criminals and participating in a massive worldwide assassination of spammers who corrupt the internet. Mosely's only request is for the Daemon to locate his missing son, Ray, and protect him. Ray is found and sent to live with Daemon agents, who will raise and educate him in a safe, family-like setting. Dutton purchase On June 25, 2008, the Dutton imprint of the Penguin Group purchased Daemon and the rights to the sequel Freedom™ from Verdugo Press. Film adaptation Walter F. Parkes, who produced the 1983 film WarGames, had optioned the film rights to Daemon with Paramount Pictures, but the rights likely reverted to Suarez on 8 December 2012. References External links Insightful interview with Daniel Suarez, titled "Understanding the Daemon" Novels' website An interview with Daniel Suarez on the BookBanter podcast Review of both Daemon and Freedom™ by political theorist Kevin Carson. Joi Ito's Review of Daemon Daemons: IT Keepsakes; Jim Rapoza, eWeek A Book Review: Don Donzal, Editor-in-Chief, The Ethical Hacker Network 2006 American novels Techno-thriller novels Novels about computing 2000s science fiction novels American science fiction novels Novels about the Internet Transhumanist books Postcyberpunk novels Mixed reality Fictional software Malware in fiction Cybernetted society in fiction Hive minds in fiction Massively multiplayer online role-playing games in fiction Novels about death Detective novels Fiction about parasites Novels about cryptography Novels about mass surveillance Utopian fiction Government by algorithm in fiction
41676232
https://en.wikipedia.org/wiki/COS%20%28operating%20system%29
COS (operating system)
COS (China Operating System) is a Linux kernel-based mobile operating system developed in China, mainly targeting mobile devices, tablets and set-top boxes. It is being developed by the Institute of Software at the Chinese Academy of Sciences (ISCAS) together with Shanghai Liantong Network Communications Technology to compete with foreign operating systems such as iOS and Android. The operating system is based on Linux, but the platform is closed source. Security, and the risk of back doors in devices from foreign vendors, are among the main motivations for COS. Android had almost 90% of the smartphone market when COS was introduced, with Apple holding most of the remaining share. COS looks very similar to Android and has its own application portal, much like the Android Market and the iOS App Store. Licensing The Linux kernel is licensed under GPLv2, while the COS framework is closed source and licensed by the originators. Core OS The COS Core operating system is based on the Linux kernel and can be seen as a Linux distribution, in the same way as the Android operating system. API According to official statements, COS supports HTML5-based applications and Java-based applications. See also Kylin Linux Deepin Zorin OS StartOS Comparison of mobile operating systems List of free and open source Android applications References External links Demonstration video Mobile Linux ARM Linux distributions Linux distributions
19768790
https://en.wikipedia.org/wiki/Robot%20Operating%20System
Robot Operating System
Robot Operating System (ROS or ros) is an open-source robotics middleware suite. Although ROS is not an operating system but a collection of software frameworks for robot software development, it provides services designed for a heterogeneous computer cluster such as hardware abstraction, low-level device control, implementation of commonly used functionality, message-passing between processes, and package management. Running sets of ROS-based processes are represented in a graph architecture where processing takes place in nodes that may receive, post and multiplex sensor data, control, state, planning, actuator, and other messages (a minimal publisher node illustrating this model appears at the end of this article). Despite the importance of reactivity and low latency in robot control, ROS itself is not a real-time OS (RTOS). It is possible, however, to integrate ROS with real-time code. The lack of support for real-time systems has been addressed in the creation of ROS 2, a major revision of the ROS API which will take advantage of modern libraries and technologies for core ROS functionality and add support for real-time code and embedded hardware. Software in the ROS ecosystem can be separated into three groups: language- and platform-independent tools used for building and distributing ROS-based software; ROS client library implementations such as roscpp, rospy, and roslisp; packages containing application-related code which uses one or more ROS client libraries. Both the language-independent tools and the main client libraries (C++, Python, and Lisp) are released under the terms of the BSD license, and as such are open-source software and free for both commercial and research use. The majority of other packages are licensed under a variety of open-source licenses. These other packages implement commonly used functionality and applications such as hardware drivers, robot models, datatypes, planning, perception, simultaneous localization and mapping, simulation tools, and other algorithms. The main ROS client libraries are geared toward a Unix-like system, primarily because of their dependence on large collections of open-source software dependencies. For these client libraries, Ubuntu Linux is listed as "Supported" while other variants such as Fedora Linux, macOS, and Microsoft Windows are designated "experimental" and are supported by the community. The native Java ROS client library, rosjava, however, does not share these limitations and has enabled ROS-based software to be written for the Android OS. rosjava has also enabled ROS to be integrated into an officially supported MATLAB toolbox which can be used on Linux, macOS, and Microsoft Windows. A JavaScript client library, roslibjs, has also been developed which enables integration of software into a ROS system via any standards-compliant web browser. History Early days at Stanford (2007 and earlier) Sometime before 2007, the first pieces of what eventually would become ROS were beginning to come together at Stanford University. Eric Berger and Keenan Wyrobek, PhD students working in Kenneth Salisbury's robotics laboratory at Stanford, were leading the Personal Robotics Program. While working on robots to do manipulation tasks in human environments, the two students noticed that many of their colleagues were held back by the diverse nature of robotics: an excellent software developer might not have the hardware knowledge required, and someone developing state-of-the-art path planning might not know how to do the computer vision required. 
In an attempt to remedy this situation, the two students set out to make a baseline system that would provide a starting place for others in academia to build upon. In the words of Eric Berger, "something that didn't suck, in all of those different dimensions". In their first steps towards this unifying system, the two built the PR1 as a hardware prototype and began to work on software from it, borrowing the best practices from other early open-source robotic software frameworks, particularly switchyard, a system that Morgan Quigley, another Stanford PhD student, had been working on in support of STAIR (STanford Artificial Intelligence Robot) at the Stanford Artificial Intelligence Laboratory. Early funding of US$50,000 was provided by Joanna Hoffman and Alain Rossmann, which supported the development of the PR1. While seeking funding for further development, Eric Berger and Keenan Wyrobek met Scott Hassan, the founder of Willow Garage, a technology incubator which was working on an autonomous SUV and a solar autonomous boat. Hassan shared Berger and Wyrobek's vision of a "Linux for robotics", and invited them to come and work at Willow Garage. Willow Garage was started in January 2007, and the first commit of ROS code was made to SourceForge on 7 November 2007.

Willow Garage (2007–2013)

Willow Garage began developing the PR2 robot as a follow-up to the PR1, and ROS as the software to run it. Groups from more than twenty institutions made contributions to ROS, both to the core software and to the growing number of packages which worked with ROS to form a greater software ecosystem. The fact that people outside of Willow were contributing to ROS (particularly from Stanford's STAIR project) meant that ROS was a multi-robot platform from the beginning. While Willow Garage had originally had other projects in progress, they were scrapped in favor of the Personal Robotics Program, focused on producing the PR2 as a research platform for academia and ROS as the open-source robotics stack that would underlie both academic research and tech startups, much like the LAMP stack did for web-based startups.

In December 2008, Willow Garage met the first of its three internal milestones: continuous navigation of the PR2 over a period of two days and a distance of pi kilometers. Soon after, an early version of ROS (0.4 Mango Tango) was released, followed by the first RVIZ documentation and the first paper on ROS. In early summer, the second internal milestone (having the PR2 navigate the office, open doors, and plug itself in) was reached. This was followed in August by the initiation of the ROS.org website. Early tutorials on ROS were posted in December, preparing for the release of ROS 1.0 in January 2010. This was the third milestone: producing extensive documentation and tutorials for the enormous capabilities that Willow Garage's engineers had developed over the preceding three years.

Following this, Willow Garage achieved one of its longest-held goals: giving away 10 PR2 robots to worthy academic institutions. This had long been a goal of the founders, as they felt that the PR2 could kick-start robotics research around the world. They ended up awarding eleven PR2s to different institutions, including University of Freiburg (Germany), Bosch, Georgia Tech, KU Leuven (Belgium), MIT, Stanford, TU Munich (Germany), UC Berkeley, U Penn, USC, and University of Tokyo (Japan).
This, combined with Willow Garage's highly successful internship program (run from 2008 to 2010 by Melonee Wise), helped to spread the word about ROS throughout the robotics world. The first official ROS distribution release, ROS Box Turtle, was released on 2 March 2010, marking the first time that ROS was officially distributed with a set of versioned packages for public use. These developments led to the first drone running ROS, the first autonomous car running ROS, and the adaptation of ROS for Lego Mindstorms. With the PR2 Beta program well underway, the PR2 robot was officially released for commercial purchase on 9 September 2010.

2011 was a banner year for ROS, with the launch of ROS Answers, a Q&A forum for ROS users, on 15 February; the introduction of the highly successful TurtleBot robot kit on 18 April; and the total number of ROS repositories passing 100 on 5 May. Willow Garage began 2012 by creating the Open Source Robotics Foundation (OSRF) in April. The OSRF was immediately awarded a software contract by the Defense Advanced Research Projects Agency (DARPA). Later that year, the first ROSCon was held in St. Paul, Minnesota, the first book on ROS, ROS By Example, was published, and Baxter, the first commercial robot to run ROS, was announced by Rethink Robotics. Soon after passing its fifth anniversary in November, ROS began running on every continent on 3 December 2012.

In February 2013, the OSRF became the primary software maintainer for ROS, foreshadowing the announcement in August that Willow Garage would be absorbed into Suitable Technologies, another company started by its founder. At this point, ROS had released seven major versions (up to ROS Groovy) and had users all over the globe. This chapter of ROS development was finalized when Clearpath Robotics took over support responsibilities for the PR2 in early 2014.

OSRF and Open Robotics (2013–present)

In the years since OSRF took over primary development of ROS, a new version has been released every year, while interest in ROS continues to grow. ROSCons have occurred every year since 2012, co-located with either ICRA or IROS, two flagship robotics conferences. Meetups of ROS developers have been organized in a variety of countries, a number of ROS books have been published, and many educational programs initiated. On 1 September 2014, NASA announced the first robot to run ROS in space: Robonaut 2, on the International Space Station. In 2017, the OSRF changed its name to Open Robotics. Tech giants Amazon and Microsoft began to take an interest in ROS during this time, with Microsoft porting core ROS to Windows in September 2018, followed by Amazon Web Services releasing RoboMaker in November 2018.

Perhaps the most important development of the OSRF/Open Robotics years thus far (not to discount the explosion of robot platforms which began to support ROS or the enormous improvements in each ROS version) was the proposal of ROS 2, a significant API change to ROS which is intended to support real-time programming, a wider variety of computing environments, and more modern technology. ROS 2 was announced at ROSCon 2014; the first commits to the repository were made in February 2015, followed by alpha releases in August 2015. The first distribution release of ROS 2, Ardent Apalone, was released on 8 December 2017, ushering in a new era of next-generation ROS development.
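To give a flavor of the revised API, here is a minimal sketch of a ROS 2 publisher node written against the rclpy client library (it assumes a ROS 2 installation providing rclpy and std_msgs; the node name "talker" and topic name "chatter" are illustrative):

```python
# Minimal ROS 2 publisher sketch using rclpy (node/topic names are illustrative).
import rclpy
from std_msgs.msg import String

def main():
    rclpy.init()                                  # initialize the client library
    node = rclpy.create_node('talker')            # create a node named "talker"
    publisher = node.create_publisher(String, 'chatter', 10)  # queue depth 10

    def on_timer():
        msg = String()
        msg.data = 'hello'
        publisher.publish(msg)                    # send one message on the topic

    node.create_timer(0.5, on_timer)              # fire the callback every 0.5 s
    rclpy.spin(node)                              # process callbacks until shutdown
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

A matching subscriber would call create_subscription with the same message type and topic name; unlike ROS 1, no central master is involved, as ROS 2 nodes discover each other through the underlying DDS middleware.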
Design

Philosophy

ROS was designed with open source in mind, intending that users would be able to choose the configuration of tools and libraries which interacted with the core of ROS, so that users could shift their software stacks to fit their robot and application area. As such, very little is actually core to ROS beyond the general structure within which programs must exist and communicate. In one sense, ROS is the underlying plumbing behind nodes and message passing. However, in reality, ROS is not only that plumbing, but a rich and mature set of tools, a wide-ranging set of robot-agnostic capabilities provided by packages, and a greater ecosystem of additions to ROS.

Computation graph model

ROS processes are represented as nodes in a graph structure, connected by edges called topics. ROS nodes can pass messages to one another through topics, make service calls to other nodes, provide a service for other nodes, or set or retrieve shared data from a communal database called the parameter server. A process called the ROS master makes all of this possible by registering nodes to itself, setting up node-to-node communication for topics, and controlling parameter server updates. Messages and service calls do not pass through the master; rather, the master sets up peer-to-peer communication between all node processes after they register themselves with the master. This decentralized architecture lends itself well to robots, which often comprise several networked computers and may communicate with off-board computers for heavy computation or commands.

Nodes

A node represents a single process running in the ROS graph. Every node has a name, which it registers with the ROS master before it can take any other actions. Nodes with the same name can exist under different namespaces, or a node can be defined as anonymous, in which case it will randomly generate an additional identifier to add to its given name. Nodes are at the center of ROS programming, as most ROS client code is in the form of a ROS node which takes actions based on information received from other nodes, sends information to other nodes, or sends and receives requests for actions to and from other nodes.

Topics

Topics are named buses over which nodes send and receive messages. Topic names must also be unique within their namespace. To send messages to a topic, a node must publish to said topic, while to receive messages it must subscribe. The publish/subscribe model is anonymous: no node knows which nodes are sending or receiving on a topic, only that it is sending/receiving on that topic. The types of messages passed on a topic vary widely and can be user-defined. The content of these messages can be sensor data, motor control commands, state information, actuator commands, or anything else.

Services

A node may also advertise services. A service represents an action that a node can take which will have a single result. As such, services are often used for actions which have a defined beginning and end, such as capturing a single-frame image, rather than for continuous streams such as velocity commands to a wheel motor or odometer data from a wheel encoder. Nodes advertise services and call services from one another.

Parameter server

The parameter server is a database shared between nodes which allows for communal access to static or semi-static information.
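As a concrete illustration of nodes, topics, and the parameter server, here is a minimal sketch of a ROS 1 node using the rospy client library (it assumes a running ROS master and the std_msgs package; the names "talker", "chatter", and the private parameter "~rate" are illustrative):

```python
# Minimal ROS 1 node sketch using rospy (names are illustrative).
import rospy
from std_msgs.msg import String

rospy.init_node('talker')                     # register the node with the ROS master
pub = rospy.Publisher('chatter', String, queue_size=10)  # advertise a topic
rate_hz = rospy.get_param('~rate', 1.0)       # read a value from the parameter server
rate = rospy.Rate(rate_hz)

while not rospy.is_shutdown():
    pub.publish(String(data='hello'))         # publish a message on the topic
    rate.sleep()
```

Because the publishing rate is read from the parameter server rather than hard-coded, it can be changed at launch time without touching the code, which is exactly the kind of semi-static information the parameter server is meant to hold.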
Data which does not change frequently, and as such will be infrequently accessed, such as the distance between two fixed points in the environment or the weight of the robot, are good candidates for storage in the parameter server.

Tools

ROS's core functionality is augmented by a variety of tools which allow developers to visualize and record data, easily navigate the ROS package structures, and create scripts automating complex configuration and setup processes. The addition of these tools greatly increases the capabilities of systems using ROS by simplifying and providing solutions to a number of common robotics development problems. These tools are provided in packages like any other algorithm, but rather than providing implementations of hardware drivers or algorithms for various robotic tasks, these packages provide task- and robot-agnostic tools which come with the core of most modern ROS installations.

rviz

rviz is a three-dimensional visualizer used to visualize robots, the environments they work in, and sensor data. It is a highly configurable tool, with many different types of visualizations and plugins.

rosbag

rosbag is a command line tool used to record and play back ROS message data. It uses a file format called bags, which log ROS messages by listening to topics and recording messages as they come in. Playing messages back from a bag is largely the same as having the original nodes which produced the data in the ROS computation graph, making bags a useful tool for recording data to be used in later development. While rosbag is a command line only tool, rqt_bag provides a GUI interface to it.

catkin

catkin is the ROS build system, having replaced rosbuild as of ROS Groovy. catkin is based on CMake, and is similarly cross-platform, open-source, and language-independent.

rosbash

The rosbash package provides a suite of tools which augment the functionality of the bash shell. These tools include rosls, roscd, and roscp, which replicate the functionalities of ls, cd, and cp respectively. The ROS versions of these tools allow users to use package names in place of the file path where the package is located. The package also adds tab-completion to most ROS utilities, and includes rosed, which edits a given file with the chosen default text editor, as well as rosrun, which runs executables in ROS packages. rosbash also supports the same functionalities for zsh and tcsh, to a lesser extent.

roslaunch

roslaunch is a tool used to launch multiple ROS nodes both locally and remotely, as well as to set parameters on the ROS parameter server. roslaunch configuration files, which are written in XML, can easily automate a complex startup and configuration process into a single command. roslaunch scripts can include other roslaunch scripts, launch nodes on specific machines, and even restart processes which die during execution.

Packages of note

ROS contains many open-source implementations of common robotics functionality and algorithms. These open-source implementations are organized into packages. Many packages are included as part of ROS distributions, while others may be developed by individuals and distributed through code-sharing sites such as GitHub. Some packages of note include:

Systems and tools

actionlib provides a standardized interface for interfacing with preemptable tasks.
nodelet provides a way to run multiple algorithms in a single process.
rosbridge provides a JSON API to ROS functionalities for non-ROS programs.

Mapping and localization

slam_toolbox provides a full 2D SLAM and localization system.
gmapping provides a wrapper for OpenSlam's Gmapping algorithm for simultaneous localization and mapping.
cartographer provides real-time 2D and 3D SLAM algorithms developed at Google.
amcl provides an implementation of adaptive Monte Carlo localization.

Navigation

navigation provides the capability of navigating a mobile robot in a planar environment.

Perception

vision_opencv is a meta-package which provides packages for integrating ROS with OpenCV.

Coordinate frame representation

tf provided a system for representing, tracking and transforming coordinate frames until ROS Hydro, when it was deprecated in favor of tf2. tf2 is the second generation of the tf library, and provides the same capabilities for ROS versions after Hydro.

Simulation

gazebo_ros_pkgs is a meta-package which provides packages for integrating ROS with the Gazebo simulator.
stage provides an interface for the 2D Stage simulator.

Versions and releases

ROS releases may be incompatible with other releases and are often referred to by code name rather than version number. ROS currently releases a version every year in May, following the release of Ubuntu LTS versions. ROS 2 currently releases a new version every six months (in December and July). These releases are supported for a single year. There are currently two active major versions seeing releases: ROS 1 and ROS 2. Alongside these, the ROS-Industrial (ROS-I) derivative project has existed since at least 2012.

ROS-Industrial

ROS-Industrial is an open-source project (BSD (legacy)/Apache 2.0 (preferred) license) that extends the advanced capabilities of ROS to manufacturing automation and robotics. In the industrial environment, there are two different approaches to programming a robot: either through an external proprietary controller, typically implemented using ROS, or via the robot's respective native programming language. ROS can therefore be seen as the software-based approach to programming industrial robots, in contrast to the classic robot-controller-based approach.

The ROS-Industrial repository includes interfaces for common industrial manipulators, grippers, sensors, and device networks. It also provides software libraries for automatic 2D/3D sensor calibration, process path/motion planning, applications like Scan-N-Plan, developer tools like the Qt Creator ROS Plugin, and training curriculum that is specific to the needs of manufacturers. ROS-I is supported by an international consortium of industry and research members. The project began as a collaborative endeavor between Yaskawa Motoman Robotics, Southwest Research Institute, and Willow Garage to support the use of ROS for manufacturing automation, with the GitHub repository being founded in January 2012 by Shaun Edwards (SwRI).

Currently, the consortium is divided into three groups: the ROS-Industrial Consortium Americas (led by SwRI and located in San Antonio, Texas), the ROS-Industrial Consortium Europe (led by Fraunhofer IPA and located in Stuttgart, Germany) and the ROS-Industrial Consortium Asia Pacific (led by the Advanced Remanufacturing and Technology Centre (ARTC) and Nanyang Technological University (NTU) and located in Singapore). The consortia support the global ROS-Industrial community by conducting ROS-I training, providing technical support and setting the future roadmap for ROS-I, as well as conducting pre-competitive joint industry projects to develop new ROS-I capabilities.

ROS-compatible robots and hardware

Robots

ABB, Adept, Fanuc, Motoman, and Universal Robots are supported by ROS-Industrial.
Baxter at Rethink Robotics, Inc.
CK-9: robotics development kit by Centauri Robotics, supports ROS.
HERB: developed at Carnegie Mellon University in Intel's personal robotics program
Husky A200: robot developed (and integrated into ROS) by Clearpath Robotics
PR1: personal robot developed in Ken Salisbury's lab at Stanford
PR2: personal robot being developed at Willow Garage
Raven II Surgical Robotic Research Platform
Shadow Robot Hand: a fully dexterous humanoid hand
STAIR I and II: robots developed in Andrew Ng's lab at Stanford
SummitXL: mobile robot developed by Robotnik, an engineering company specializing in mobile robots, robotic arms, and industrial solutions with ROS architecture
Nao humanoid: University of Freiburg's Humanoid Robots Lab developed a ROS integration for the Nao humanoid based on an initial port by Brown University
UBR1: developed by Unbounded Robotics, a spin-off of Willow Garage
ROSbot: autonomous robot platform by Husarion
Webots: robot simulator integrating a complete ROS programming interface
GoPiGo3: Raspberry Pi-based educational robot, supports ROS

SBCs and hardware

BeagleBoard: the robotics lab of the Katholieke Universiteit Leuven, Belgium, has ported ROS to the BeagleBoard.
Sitara ARM processors: support for the ROS package is included as part of the official Linux SDK.
Raspberry Pi: image of Ubuntu MATE with ROS by Ubiquity Robotics; installation guide for Raspbian.

See also

Open hardware
Robotics middleware
Open-source software
List of free and open-source software packages

References

Notes

STAIR: The STanford Artificial Intelligence Robot project, Andrew Y. Ng, Stephen Gould, Morgan Quigley, Ashutosh Saxena, Eric Berger. Snowbird, 2008.

Related projects

RT middleware – robot middleware standard/implementations; RT-component is discussed/defined by the Object Management Group.
48866655
https://en.wikipedia.org/wiki/Alter/Ego
Alter/Ego
Alter/Ego (アルター・エゴ) is a free real-time vocal synthesizer created by Plogue.

About

Alter/Ego is a text-to-speech synthesizer which aims to create more modern vocals, drawing on post-1990s speech synthesis research. It is offered as a free plug-in and is used in music making to produce singing vocals. It operates in a similar manner to Chipspeech and is capable of running different speech engines. Vocals are clean-cut though robotic-sounding, and the software is well suited to vocal experimentation.

There are currently only two released vocals for the software. The released vocals are purchased separately, and come as files that need to be extracted, as they lack installers. Plogue have received many vocal requests from individuals since the release of the software; however, they are limited by their small development team and workload. In January 2016 it was announced that there were six new vocals in production. No more vocals are due after Leora and Marie Ork's final two vocal updates. Plogue have since moved on to other adaptations of the engine. One such adaptation is the ability for the engine to detect Microsoft text-to-speech voices and load them into the engine. UTAU has also been experimented with. Plogue noted that the engine was designed to have user-made vocal support from the beginning, though this was yet to be implemented due to a lack of support for it.

In 2017, production of new voicebanks ceased, with Marie Ork "Clear" and Leora confirmed as the last voicebanks produced for the software. The halt came in the wake of a scandal involving the creator of the voicebanks "Vera" and "Nata". Though the engine has been updated since 2017 (it is identical to the Chipspeech engine), overall development of Alter/Ego has ceased.

Characters

Daisy: the very first vocal added to the software. Daisy is a lonely time traveler and the estranged lover of the Chipspeech vocal Dandy 704. Daisy was offered as a free separate download, allowing her to be imported into both Alter/Ego and Chipspeech, and acted as the default vocal. She was retired and replaced with Bones.
Tera Eleki (エレキテラ): revealed on September 7, 2015, as a Japanese-only vocal. She has since been cancelled.
ALYS: a female vocal released on March 10, 2016. She sings in French and Japanese, and has a Live Polyphonic Choir mode. She was developed by VoxWave.
Bones: a male vocal that sings/talks in English and sings in Japanese, first unveiled in January 2016 and released on October 31, 2016. He replaced Daisy as the default vocal for the software. A Chinese vocal for him is currently being worked on.
Marie Ork: a death metal female vocalist, released on December 1, 2016, who sings in English. She was released with two vocals, "Growl" and "Space"; "Talk" and "Clean" vocals were later added.
LEORA: the second French female vocal developed by VoxWave. She was created as a complement to ALYS, with English and French voicebanks.
NATA: an English vocal developed by Vocallective. She is an adult female and was released on January 11, 2017. Though she started out as official, Plogue has since dropped support of her.
VERA: a female English prototype developed by Azureflux of Vocallective without Plogue's authorization; the project has since been cancelled.

Reception

As noted by BPB, Alter/Ego is praised for being a powerful tool by the standards of free software.
The review noted, however, that it has a steep learning curve, while highlighting how easy it was to get the synthesizer to sing lyrics, and called the product "fun" to work with overall. Later that December, the software was awarded second place in BPB's list of the top 50 free instruments. Computer Music magazine also covered the synthesizer in its December 2015 issue.
25904910
https://en.wikipedia.org/wiki/Cognition%20Network%20Technology
Cognition Network Technology
Cognition Network Technology (CNT), also known as Definiens Cognition Network Technology, is an object-based image analysis method developed by Nobel laureate Gerd Binnig together with a team of researchers at Definiens AG in Munich, Germany. It extracts information from images using a hierarchy of image objects (groups of pixels), as opposed to traditional pixel-processing methods. To emulate the human mind's cognitive powers, Definiens used patented image segmentation and classification processes, and developed a method to render knowledge in a semantic network. CNT examines pixels not in isolation, but in context. It builds up a picture iteratively, recognizing groups of pixels as objects. It uses the color, shape, texture and size of objects, as well as their context and relationships, to draw conclusions and inferences, similar to a human analyst.

Company

In 1994, Professor Gerd Binnig founded Definiens. CNT first became available with the launch of the eCognition software in May 2000. In June 2010, Trimble Navigation Ltd (NASDAQ: TRMB) acquired Definiens' business assets in the earth sciences markets, including the eCognition software, and also licensed Definiens' patented CNT. In 2014, Definiens was acquired by MedImmune, the global biologics research and development arm of AstraZeneca, for an initial consideration of $150 million.

Software

Definiens Tissue Studio

Definiens Tissue Studio is a digital pathology image analysis software application based on CNT. Its intended use is biomarker translational research in formalin-fixed, paraffin-embedded tissue samples which have been treated with immunohistochemical staining assays or hematoxylin and eosin (H&E). The central concept behind Definiens Tissue Studio is a user interface that facilitates machine learning from example digital histopathology images, in order to derive an image analysis solution suitable for the measurement of biomarkers and/or histological features within pre-defined regions of interest, on a cell-by-cell basis and within sub-cellular compartments. The derived image analysis solution is then automatically applied to subsequent digital images in order to objectively measure defined sets of multiparametric image features. These data sets are used to further understand the underlying biological processes that drive cancer and other diseases. Image processing and data analysis are performed either on a local desktop workstation or on a server grid.

eCognition

The eCognition suite offers three components which can be used stand-alone or in combination to solve image analysis tasks. eCognition Developer is a development environment for object-based image analysis. It is used in earth sciences to develop rule sets (or applications) for the analysis of remote sensing data. eCognition Architect enables non-technical users to configure, calibrate and execute image analysis workflows created in eCognition Developer. eCognition Server provides a processing environment for batch execution of image analysis jobs.

eCognition software is utilized in numerous remote sensing and geospatial application scenarios and environments, using a variety of data types:

Generic: rapid mapping, change detection, object recognition
By environment: diverse land-cover mapping, urban analysis (e.g. impervious surface area analysis for taxation, property assessment for insurance, inventory of green infrastructure), forestry (e.g.
biomass measurement, species identification, fire-scar measurement), agriculture (e.g. regional planning, precision farming, crisis response), and marine and riparian (e.g. ecosystem evaluation, disaster management, harbor monitoring)
Other: defense, security, atmosphere and climate

The online eCognition community was launched in July 2009 and had 2,813 members as of July 9, 2010. Membership is distributed globally, and user conferences are held regularly, the last having taken place in November 2009 in Munich, Germany. The bi-annual GEOBIA (Geographic Object-Based Image Analysis) conference is heavily attended by eCognition users, with the majority of presentations based on eCognition software.
274947
https://en.wikipedia.org/wiki/WxWidgets
WxWidgets
wxWidgets (formerly wxWindows) is a widget toolkit and tools library for creating graphical user interfaces (GUIs) for cross-platform applications. wxWidgets enables a program's GUI code to compile and run on several computer platforms with minimal or no code changes. A wide choice of compilers and other tools to use with wxWidgets facilitates development of sophisticated applications. wxWidgets supports a comprehensive range of popular operating systems and graphical libraries, both proprietary and free, and is widely deployed in prominent organizations (see below).

The project was started under the name wxWindows in 1992 by Julian Smart at the University of Edinburgh, and was renamed wxWidgets in 2004 in response to a trademark claim by Microsoft UK. It is free and open-source software, distributed under the terms of the wxWidgets Licence, which accommodates both GPL and proprietary software development.

Portability and deployment

wxWidgets covers systems such as Microsoft Windows, Mac OS (Carbon and Cocoa), iOS (Cocoa Touch), Linux/Unix (X11, Motif, and GTK), OpenVMS, OS/2 and AmigaOS. A version for embedded systems is under development.

wxWidgets is used across many industry sectors, most notably by Xerox, Advanced Micro Devices (AMD), Lockheed Martin, NASA and the Center for Naval Analyses. It is also used in the public sector and education by, for example, Dartmouth Medical School, the National Human Genome Research Institute, the National Center for Biotechnology Information, and many others. wxWidgets is also used in many open-source projects and by individual developers.

History

wxWidgets (initially wxWindows) was started in 1992 by Julian Smart at the University of Edinburgh. He attained an honours degree in computational science from the University of St Andrews in 1986, and is still a core developer. On February 20, 2004, the developers of wxWindows announced that the project was changing its name to wxWidgets, as a result of Microsoft requesting Julian Smart to respect Microsoft's United Kingdom trademark of the term Windows. Major release versions were 2.4 on 6 January 2003, 2.6 on 21 April 2005 and 2.8.0 on 14 December 2006. Version 3.0 was released on 11 November 2013. wxWidgets has participated in the Google Summer of Code since 2006.

License

wxWidgets is distributed under the custom-made wxWindows Licence, similar to the GNU Lesser General Public License (LGPL) but with an exception stating that derived works in binary form may be distributed on the user's own terms. This licence is a free software licence approved by the FSF, making wxWidgets free software. It has also been approved by the Open Source Initiative (OSI).

Official support

Supported platforms

wxWidgets is supported on the following platforms:

Windows – wxMSW (32/64-bit, Windows XP up to Windows 10)
Linux/Unix – wxGTK, wxX11, wxMotif
Mac OS – wxMac (Mac OS X 10.3 using Carbon, Mac OS X 10.5 using Cocoa), wxOSX (32/64-bit, Mac OS X 10.7 or later)
OS/2 – wxOS2; wxWidgets for GTK or Motif can also be compiled on OS/2
Embedded platforms – wxEmbedded

External ports

Amiga – wxWidgets-AOS: AmigaOS port (work in progress)

Supported compilers

wxWidgets is officially confirmed to work properly with a wide range of C++ compilers.

Programming language bindings

The wxWidgets library is implemented in C++, with bindings available for many commonly used programming languages.
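For example, through the wxPython binding, a minimal wxWidgets program can be written in a few lines of Python (a sketch assuming the third-party wxPython package; the window title and label text are arbitrary):

```python
# Minimal wxPython sketch: an application object, a top-level frame,
# and one native widget inside it.
import wx

app = wx.App()                                    # initialize the toolkit
frame = wx.Frame(None, title="Hello wxWidgets")   # a native top-level window
wx.StaticText(frame, label="Hello, world!")       # a native text label widget
frame.Show()                                      # make the window visible
app.MainLoop()                                    # enter the event loop
```

The same structure, an application object owning a top-level frame that contains native widgets, carries over to the C++ API and the other language bindings.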
wxWidgets is best described as a native-mode toolkit, as it provides a thin abstraction over a platform's native widgets, in contrast to toolkits that emulate the display of widgets using graphic primitives. Calling a native widget on the target platform results in a more native-looking interface than toolkits such as Swing (for Java), as well as offering performance and other benefits. The toolkit is also not restricted to GUI development, having an inter-process communication layer, socket networking functionality, and more.

RAD tools and IDEs for wxWidgets

There are many Rapid Application Development (RAD) and Integrated Development Environment (IDE) tools available. Notable tools include:

Code::Blocks (via the wxSmith plugin)
CodeLite (via the wxCrafter plugin)
wxFormBuilder

Applications built using wxWidgets

Notable applications that use wxWidgets:

0 A.D. – a FOSS video game similar to Age of Empires
Amaya – web authoring tool
aMule – peer-to-peer file sharing application
ActivePresenter – screen recorder, video editor & e-learning application
Audacity – cross-platform sound editor
BitTorrent – peer-to-peer file sharing application
Berkeley Open Infrastructure for Network Computing – an open-source middleware system
Code::Blocks – C/C++ IDE
CodeLite – simple C++ editor (collection of free tools, implemented by plugins)
FileZilla – FTP client
GrandOrgue – virtual pipe organ simulator
Guayadeque Music Player – a music player with database
Hollywood – uses wxWidgets in its RapaGUI plugin
KiCad – a free software suite for electronic design automation (EDA)
RapidSVN – Subversion client
RocketCake – WYSIWYG responsive website builder
TortoiseCVS – CVS client

Criticism

Several well-known, large applications have switched from wxWidgets to Qt, citing problems with wxWidgets:

Dolphin (emulator)
MKVToolNix
VLC media player

See also

FLTK – a light, cross-platform, non-native widget toolkit
FOX toolkit – a fast, open-source, cross-platform widget toolkit
GTK – the GIMP toolkit, a widget toolkit used by GNOME applications
gtkmm – C++ version of GTK
JUCE – an extensive cross-platform widget toolkit
IUP – a multi-platform toolkit for building native graphical user interfaces
Qt (toolkit) – an application framework used by KDE applications
Ultimate++ – a C++ cross-platform development framework
Widget toolkit
List of widget toolkits
411760
https://en.wikipedia.org/wiki/OpenRISC
OpenRISC
OpenRISC is a project to develop a series of open-source hardware central processing units (CPUs) based on established reduced instruction set computer (RISC) principles. It includes an instruction set architecture (ISA) released under an open-source license. It is the original flagship project of the OpenCores community.

The first (and only) architectural description is for the OpenRISC 1000 ("OR1k"), describing a family of 32-bit and 64-bit processors with optional floating-point arithmetic and vector processing support. The OpenRISC 1200 implementation of this specification was designed by Damjan Lampret in 2000, written in the Verilog hardware description language (HDL). The later mor1kx implementation, which has some advantages compared to the OR1200, was designed by Julius Baxter and is also written in Verilog. Additionally, software simulators exist that implement the OR1k specification. The hardware design was released under the GNU Lesser General Public License (LGPL), while the models and firmware were released under the GNU General Public License (GPL).

A reference system-on-a-chip (SoC) implementation based on the OpenRISC 1200 was developed, named the OpenRISC Reference Platform System-on-Chip (ORPSoC). Several groups have demonstrated ORPSoC and other OR1200-based designs running on field-programmable gate arrays (FPGAs), and there have been several commercial derivatives produced. Later SoC designs, also based on an OpenRISC 1000 CPU implementation, are FuseSoC, minSoC, OpTiMSoC and MiSoC.

Instruction set

The instruction set is a reasonably simple, MIPS-like traditional RISC using a 3-operand load-store architecture, with 16 or 32 general-purpose registers and a fixed 32-bit instruction length. The instruction set is mostly identical between the 32- and 64-bit versions of the specification, the main differences being the register width (32 or 64 bits) and the page table layout. The OpenRISC specification includes all features common to modern desktop and server processors: a supervisor mode and virtual memory system, optional read, write, and execute control for memory pages, and instructions for synchronizing and interrupt handling between multiple processors. Another notable feature is a rich set of single instruction, multiple data (SIMD) instructions intended for digital signal processing.

Implementations

Most implementations are on field-programmable gate arrays (FPGAs), which give the possibility to iterate on the design at the cost of performance. By 2018, the OpenRISC 1000 was considered stable, so ORSoC (owner of OpenCores) began a crowdfunding project to build a cost-efficient application-specific integrated circuit (ASIC) for improved performance, a move for which it faced criticism from the community. The project did not reach its goal, and to date no open-source ASIC has been produced.

Commercial implementations

Several commercial organizations have developed derivatives of the OpenRISC 1000 architecture, including the ORC32-1208 from ORSoC and the BA12, BA14, and BA22 from Beyond Semiconductor. Dynalith Systems provides the iNCITE FPGA prototyping board, which can run both the OpenRISC 1000 and the BA12. Flextronics (Flex) and Jennic Limited manufactured the OpenRISC as part of application-specific integrated circuits. Samsung uses the OpenRISC 1000 in its DTV system-on-chips (SDP83 B-Series, SDP92 C-Series, SDP1001/SDP1002 D-Series, SDP1103/SDP1106 E-Series).
Allwinner Technology is reported to use an OpenRISC core in its AR100 power controller, which forms part of the A31 ARM-based SoC. Cadence Design Systems has begun using OpenRISC as a reference architecture in documenting tool chain flows (for example the UVM reference flow, now contributed to Accellera). TechEdSat, the first NASA OpenRISC-architecture Linux computer, was launched in July 2012 and deployed in October 2012 to the International Space Station, with hardware provided, built, and tested by ÅAC Microtec and ÅAC Microtec North America.

Academic and non-commercial use

Being open source, OpenRISC has proved popular in academic and hobbyist circles. For example, Stefan Wallentowitz and his team at the Institute for Integrated Systems at the Technische Universität München have used OpenRISC in research into multi-core processor architectures. The Open Source Hardware User Group (OSHUG) in the UK has twice run sessions on OpenRISC, while hobbyist Sven-Åke Andersson has written a comprehensive blog on OpenRISC for beginners, which attracted the interest of Electronic Engineering Times (EE Times). Sebastian Macke has implemented jor1k, an OpenRISC 1000 emulator in JavaScript, running Linux with X Window System and Wayland support.

Toolchain support

The OpenRISC community has ported the GNU toolchain to OpenRISC to support development in the programming languages C and C++. Using this toolchain, the newlib, uClibc, musl (as of release 1.1.4), and glibc libraries have been ported to the processor. Dynalith provides OpenIDEA, a graphical integrated development environment (IDE) based on this toolchain. A project to port LLVM to the OpenRISC 1000 architecture began in early 2012. GCC 9 was released with OpenRISC support.

The OR1k project provides an instruction set simulator, or1ksim. The flagship implementation, the OR1200, is a register-transfer level (RTL) model in Verilog HDL, from which a SystemC-based cycle-accurate model can be built in ORPSoC. A high-speed model of the OpenRISC 1200 is also available through the Open Virtual Platforms (OVP) initiative (see OVPsim), set up by Imperas.

Operating system support

Linux support

The mainline Linux kernel gained support for OpenRISC in version 3.1. The implementation merged in this release is for the 32-bit OpenRISC 1000 family (or1k); a formerly separate out-of-tree port for the OpenRISC 1000 architecture has been superseded by the mainline port.

RTOS support

Several real-time operating systems (RTOS) have been ported to OpenRISC, including RTEMS, FreeRTOS, and eCos.

QEMU support

Since version 1.2, QEMU supports emulating OpenRISC platforms.

See also

Amber (processor core) – ARM-compatible OpenCores project
Free and Open Source Silicon Foundation
OpenRISC 1200
OVPsim, Open Virtual Platforms
OpenSPARC
LEON
LatticeMico32
RISC-V

External links

Open Source Semiconductor Core Licensing, 25 Harvard Journal of Law & Technology 131 (2011) – article analyzing the law, technology and business of open-source semiconductor cores
Beyond Semiconductor – commercial fabless semiconductor company founded by the developers of OpenRISC
jor1k – OpenRISC 1000 emulator in JavaScript
4674178
https://en.wikipedia.org/wiki/List%20of%20Java%20virtual%20machines
List of Java virtual machines
This article provides non-exhaustive lists of Java SE Java virtual machines (JVMs). It does not include every Java ME vendor. Note that Java EE runs on the standard Java SE JVM, but that some vendors specialize in providing a modified JVM optimized for Java EE applications. Much Java development work takes place on Windows, Solaris, Linux, and FreeBSD, primarily with the Oracle JVMs. Note the further complication of different 32-bit/64-bit varieties. The primary reference Java VM implementation is HotSpot, produced by Oracle Corporation and many other big and medium-sized companies (e.g. IBM, Red Hat, Microsoft, Azul, SAP).

Free and open source implementations

Active

Azul Platform Core — an OpenJDK build supported by Azul Systems, compliant with the Java SE 17, 16, 15, 13, 11, 8, 7, and 6 standards.
Codename One — uses the open-source ParparVM.
Eclipse OpenJ9 — open-source from IBM J9, for Windows, AIX, Linux (x86, Power, and Z), macOS, MVS, OS/400, Pocket PC, z/OS.
GraalVM — based on HotSpot/OpenJDK, it has a polyglot feature to transparently mix and match supported languages.
HotSpot — the open-source Java VM implementation by Oracle.
Jikes RVM (Jikes Research Virtual Machine) — research project. PPC and IA-32. Supports Apache Harmony and GNU Classpath libraries. Eclipse Public License.
leJOS — robotics suite, a firmware replacement for Lego Mindstorms programmable bricks; provides a Java programming environment for the Lego Mindstorms RCX and NXT robots.
Maxine — meta-circular open-source research VM from Oracle Labs and the University of Manchester.

Inactive

Apache Harmony — supports several architectures and systems. Discontinued November 2011. Apache License 2.0.
GCJ — the GCC Java compiler, which compiles either to bytecode or to native machine code. As of GCC 7, gcj and the associated libjava runtime library have been removed from GCC.
IKVM.NET — Java for Mono and the Microsoft .NET Framework. Uses OpenJDK. Zlib License.
JamVM — developed to be an extremely small virtual machine. Uses GNU Classpath and OpenJDK. Supports several architectures. GPL. Last update 2014.
JOP — hardware implementation of the JVM. GPL 3.
Juice — Java ME experimental JVM developed to run on the NUXI operating system.
Jupiter — uses the Boehm garbage collector and GNU Classpath. GPL. Unmaintained.
Kaffe — uses GNU Classpath. GPL. 1.1.9 released on February 26, 2008.
Mika VM — intended for use in embedded devices. Cross-platform. BSD-style licence.
NanoVM — developed to run on the Atmel AVR ATmega8 used in the Asuro robot; can be ported to other AVR-based systems.
SableVM — first free-software JVM to support JVMDI and JDWP. Makes use of GNU Classpath. LGPL. Version 1.13 released on March 30, 2007.
Squawk virtual machine — a Java ME VM for embedded systems and small devices. Cross-platform. GPL.
SuperWaba — Java-like virtual machine for portable devices. GPL. Discontinued, succeeded by TotalCross.
TakaTuka — for wireless sensor network devices. GPL.
TinyVM.
VMKit of LLVM.
Wonka VM — developed to run on Acunia's ARM-based hardware. Some code drawn from GNU Classpath. BSD-style licence. No longer under active development; superseded by Mika VM.

Java operating systems

Some JVMs are intended to run without an underlying OS.

JX — Java operating system that focuses on a flexible and robust operating system architecture, developed as an open-source system by the University of Erlangen. GPL.
Version 0.1.1 was released on October 10, 2007.
JavaOS — original project from Sun Microsystems.

Proprietary implementations

Active

Azul Platform Prime — a fully compliant, high-performance Java virtual machine based on OpenJDK that uses Azul Systems's C4 garbage collector and Falcon JIT compiler.
JamaicaVM (aicas) — a hard real-time Java VM for embedded systems.

Inactive

Excelsior JET — a licensed Java SE implementation with an AOT compiler, for Windows, OS X, and Linux on Intel x86, and Linux on 32-bit ARM.
Jinitiator — developed by Oracle before they purchased Sun; designed to improve support for Oracle Forms in web sites.
JRockit (originally from Appeal Virtual Machines) — acquired by Oracle; for Linux, Windows and Solaris.
Mac OS Runtime for Java (MRJ).
Microsoft Java Virtual Machine — discontinued in 2001.

Lesser-known proprietary Java virtual machines

Blackdown Java — a licensed port to Linux of the reference SunSoft implementation. It was discontinued in 2007, after OpenJDK became available.
Sun CVM — CVM originally standing for "Compact Java Virtual Machine".
Gemstone — modified for Java EE features (application DBMS).
Intent (Tao Group).
PreonVM — a Java VM for embedded systems and small, resource-constrained devices.

See also

Comparison of Java virtual machines
Free Java implementations
Java processor
Dalvik virtual machine
49868268
https://en.wikipedia.org/wiki/Athena%20Fund
Athena Fund
Athena Fund is a non-profit organization established in 2006. Its mission is to empower teachers in Israel by providing them with tools for self-fulfillment and professional advancement. The Fund was founded by business leaders under the direction of Uri Ben-Ari (CEO of UBA Ventures and former executive VP of Ness Technologies).

Introduced in 2007, Athena Fund's flagship initiative is the "Digital Toolbox for Every Teacher in Israel" (formerly "Laptop Computer for Every Teacher in Israel") program. Launched with the support of Israel's Prime Minister's Office, the program has been implemented in collaboration with the Israeli Ministry of Education, the Israel Teachers Union, and Bank Massad, an Israeli bank that specializes in providing banking services to teachers. Since 2012, Athena Fund has launched three additional programs in Israel: "Digital Toolbox for Every Kindergarten Teacher" ("Laptop Computer for Every Kindergarten Teacher", 2012); "Digital Toolbox for Every STEM Teacher" ("Tablet for Every Science Teacher", 2014); and "Digital Toolbox for Every Special Education Teacher" ("iPad for Every Special-Education Teacher", 2015).

Laptop computer for every teacher in Israel

The mission of the Laptop Computer for Every Teacher in Israel program (launched in 2007) is to empower teachers, enhance their access to the digital world, and raise their social status. The goal of the program is to provide a laptop computer and 120 hours of professional training to every elementary, middle and high school teacher in Israel by year-end 2018. Each participating teacher receives a laptop computer with a protective case, a full set of Microsoft software, a 3-year warranty, 24/7 support, and a 120-hour professional training course.

Laptop for every kindergarten teacher

The Laptop for Every Kindergarten Teacher program (launched in 2012) aims to enhance teachers' status and increase access to the digital environment. The program trains teachers to use computers for different purposes and helps bring innovative teaching methods to kindergarten classrooms. Kindergarten computer use helps develop independent learning and promotes children's emotional, social and cognitive development. Each participating teacher receives a laptop computer with a protective case, a full set of Microsoft software, a 3-year warranty, 24/7 support, and a 120-hour professional training course.

Tablet for every science teacher

The Tablet for Every Science Teacher program (launched in 2014) aims to empower science and technology teachers and help them make each class more enjoyable and productive through innovative technological means of teaching. Each participating teacher receives a GlobiMate tablet from Globisens, which comes with multiple built-in sensors and a microscope, along with a 1-year warranty, 24/7 support, and a 120-hour professional training course.

iPad for every special-education teacher in Israel

The iPad for Every Special-Education Teacher in Israel program (launched in 2015) aims to enable special-education students to engage in meaningful learning, strengthening the students' enjoyment and motivation and improving interpersonal communication. Each participating teacher receives an iPad tablet, a 3-year warranty, 24/7 support, and a 120-hour professional training course.
Parents of teens with special needs say that the iPad enables the children, even those who had never before communicated with their parents and their environment, to interact with their surroundings in a most exciting manner.

Program participants

As of December 31, 2015, the Laptop Computer for Every Teacher in Israel program had provided laptops and professional training to over 12,000 teachers in 503 cities and small communities, across 1,104 schools and kindergartens. The Laptop for Every Kindergarten Teacher program (launched 2012) has provided laptops and professional training to 486 kindergarten teachers in Israel. The einstein Tablet+ for Every Science Teacher program (launched 2014) has provided tablets and professional training to 856 science and technology teachers in Israel. The iPad for Every Special-Education Teacher program (launched 2015) has provided tablet computers and professional training to 247 special-education teachers in Israel.

Sponsorship

The Laptop Computer for Every Teacher in Israel program, Athena Fund's first program, was launched in 2007 with the sponsorship of the Prime Minister's Office. Today, Athena Fund's programs are implemented with the cooperation of the Israeli Ministry of Education, the Fund for Professional Advancement of the Israel Teachers Union, and Bank Massad, which specializes in providing banking services to teachers.

Program advisory board

The program is supported by an advisory board of Israeli business, political and academic leaders, including:

Eliezer Shkedi, former CEO of El Al Israel Airlines and former commander-in-chief of the Israeli Air Force
Moshe Lichtman, former president of Microsoft Israel R&D Center and Microsoft corporate VP
Udi Shani, former general manager of Israel's Ministry of Defense and former major general, IDF
Raviv Zoller, CEO of IDI-Direct Insurance
Peli Peled, editor-in-chief and joint CEO of People and Computers
Prof. Nava Ben-Zvi, president of Hadassah Academic College and the Bloomfield Science Museum Jerusalem
Amram Mitzna, former major general, IDF
Prof. Moshe Bar Niv (Burnovski), former provost and senior vice president for academic affairs and development, Interdisciplinary Center (IDC) Herzliya

Program results

The results of a survey conducted among 300 participating teachers in the Negev (southern Israel) in June 2008 indicated a substantial improvement in teacher and student performance. 75% of the teachers reported that, as a result of using a laptop computer, their teaching effectiveness improved and student interest increased. Moreover, 35% of the teachers noted a significant decrease in classroom disciplinary problems. In another survey, conducted in 2011 in various schools, 99% of the participants reported an improvement in their status in the classroom, and classroom disruptions decreased by 35%. In a survey conducted in 2014 in youth villages, 61% of teachers said that the laptop greatly helped improve teaching processes, and 62% said that it greatly helped update the teacher's knowledge in work-related areas. 47% said that using the laptop in the classroom greatly contributed to more efficient use of classroom teaching time and that participation in the program greatly helped implement new teaching methods in the classroom. 48% stated that students' interest in lesson topics greatly increased as a result of laptop use.
40982
https://en.wikipedia.org/wiki/Customer-premises%20equipment
Customer-premises equipment
In telecommunications, customer-premises equipment or customer-provided equipment (CPE) is any terminal and associated equipment located at a subscriber's premises and connected with a carrier's telecommunication circuit at the demarcation point ("demarc"). The demarc is a point established in a building or complex to separate customer equipment from the equipment located in either the distribution infrastructure or central office of the communications service provider.

CPE generally refers to devices such as telephones, routers, network switches, residential gateways (RG), set-top boxes, fixed mobile convergence products, home networking adapters and Internet access gateways that enable consumers to access providers' communication services and distribute them in a residence or enterprise with a local area network (LAN). CPE can be active equipment, such as the devices mentioned above, or passive equipment such as analogue telephone adapters (ATA) or xDSL splitters. The term includes key telephone systems and most private branch exchanges. Excluded from the CPE category are overvoltage protection equipment and pay telephones. Other types of materials that are necessary for the delivery of the telecommunication service but are not defined as equipment, such as manuals, cable packages, and cable adapters, are instead referred to as CPE peripherals. CPE can refer to devices purchased by the subscriber or to those provided by the operator or service provider.

History

The two phrases, "customer-premises equipment" and "customer-provided equipment", reflect the history of this equipment. Under the Bell System monopoly in the United States (following the Communications Act of 1934), the Bell System owned the telephones, and one could not attach privately owned or supplied devices to the network or to the station apparatus. Telephones were located on customers' premises; hence, customer-premises equipment. In the U.S. Federal Communications Commission (FCC) proceeding known as the Second Computer Inquiry, the FCC ruled that telecommunications carriers could no longer bundle CPE with telecommunications service, uncoupling the market power of the telecommunications service monopoly from the CPE market and creating a competitive CPE market. With the gradual breakup of the Bell monopoly, starting with Hush-A-Phone v. United States (1956), which allowed some non-Bell-owned equipment to be connected to the network (a process called interconnection), equipment on customers' premises became increasingly owned by customers. Indeed, subscribers were eventually permitted to purchase telephones; hence, customer-provided equipment.

In the pay-TV industry, many operators and service providers offer subscribers a set-top box with which to receive video services, in return for a monthly fee. As offerings have evolved to include multiple services (voice and data), operators have increasingly given consumers the opportunity to rent or buy additional devices, such as access modems, internet gateways and video extenders, that enable them to access multiple services and distribute them to a range of consumer electronics devices in the home.

Technology evolution

Hybrid devices

The growth of multiple-service operators offering triple- or quad-play services required the development of hybrid CPE to make it easy for subscribers to access voice, video and data services. The development of this technology was led by pay-TV operators looking for a way to deliver video services via both traditional broadcast and broadband IP networks.
Spain's Telefonica was the first operator to launch a hybrid broadcast and broadband TV service, in 2003, with its Movistar TV DTT/IPTV offering, while Polish satellite operator 'n' was the first to offer its subscribers a three-way hybrid (or "tri-brid") broadcast and broadband TV service, launched in 2009.

Set-back box

The term set-back box is used in the digital TV industry to describe a piece of consumer hardware that enables viewers to access both linear broadcast and internet-based video content, plus a range of interactive services like electronic programme guides (EPG), pay-per-view (PPV) and video on demand (VOD), as well as internet browsing, and to view them on a large-screen television set. Unlike standard set-top boxes, which sit on top of or below the TV, a set-back box has a smaller form factor so it can be mounted on the rear of a flat-panel TV, hiding it from view.

Residential gateway

A residential gateway is a networking device used to connect devices in the home to the Internet or another wide area network (WAN). It is an umbrella term covering multi-function networking appliances used in homes, which may combine a DSL modem or cable modem, a network switch, a consumer-grade router, and a wireless access point. In the past, such functions were provided by separate devices, but in recent years technological convergence has enabled multiple functions to be merged into a single device. One of the first home gateway devices to be launched was selected by Telecom Italia to enable the operator to offer triple-play services in 2002. Along with a SIP VoIP handset for making voice calls, it enabled subscribers to access voice, video and data services over a 10 Mbit/s symmetrical ADSL or fiber connection.

Virtual gateway

The virtual gateway concept enables consumers to access video and data services and distribute them around their homes using software rather than hardware. The first virtual gateway was introduced in 2010 by Advanced Digital Broadcast at the IBC exhibition in Amsterdam. The ADB virtual gateway uses software that resides within the middleware and is based on open standards, including DLNA home networking and the DTCP-IP standard, to ensure that all content, including paid-for encrypted content like pay-TV services, can only be accessed by secure CE devices.

Broadband

A subscriber unit, or SU, is a broadband radio installed at a business or residential location to connect to an access point and send/receive high-speed data, wired or wirelessly. Devices commonly referred to as subscriber units include cable modems, access gateways, home networking adapters and mobile phones.

WAN

CPE may also refer to any device that terminates a WAN circuit, such as an ISDN, E-carrier/T-carrier, DSL, or metro Ethernet circuit. This includes any customer-owned hardware at the customer's site: routers, firewalls, network switches, PBXs, VoIP gateways, and sometimes CSU/DSUs and modems.

Application areas

Connected home
Pay TV
Over-the-top video services
Broadband
Voice over IP
Fixed–mobile convergence (FMC)

Other uses

Cellular carriers may sometimes internally refer to cellular phones a customer has purchased without a subsidy, or from a third party, as "customer-provided equipment". It is also notable that the fully qualified domain name and the PTR record of DSL and cable lines connected to a residence will often contain 'cpe'.

See also

Demarcation point
Interconnection
On-premises wiring
Terminal equipment
TR-069
Sender Policy Framework
Sender Policy Framework (SPF) is an email authentication method designed to detect forged sender addresses during the delivery of email. SPF alone, though, is limited to detecting a forged sender claim in the envelope of the email, which is used when the mail gets bounced. Only in combination with DMARC can it be used to detect the forging of the visible sender in emails (email spoofing), a technique often used in phishing and email spam.

SPF allows the receiving mail server to check during mail delivery that mail claiming to come from a specific domain was submitted by an IP address authorized by that domain's administrators. The list of authorized sending hosts and IP addresses for a domain is published in the DNS records for that domain. Sender Policy Framework is defined in RFC 7208, dated April 2014, as a "proposed standard".

History
The first public mention of the concept was in 2000 but went mostly unnoticed. No mention was made of the concept again until a first attempt at an SPF-like specification was published in 2002 on the IETF "namedroppers" mailing list by Dana Valerie Reese, who was unaware of the 2000 mention of the idea. The very next day, Paul Vixie posted his own SPF-like specification on the same list. These posts ignited a lot of interest and led to the forming of the IETF Anti-Spam Research Group (ASRG) and its mailing list, where the SPF idea was further developed. Among the proposals submitted to the ASRG were "Reverse MX" (RMX) by Hadmut Danisch and "Designated Mailer Protocol" (DMP) by Gordon Fecyk.

In June 2003, Meng Weng Wong merged the RMX and DMP specifications and solicited suggestions from others. Over the next six months a large number of changes were made, and a large community had started working on SPF. Originally SPF stood for Sender Permitted From and was sometimes also called SMTP+SPF, but its name was changed to Sender Policy Framework in February 2004.

In early 2004, the IETF created the MARID working group and tried to use SPF and Microsoft's Caller ID proposal as the basis for what is now known as Sender ID, but this collapsed due to technical and licensing conflicts. The SPF community returned to the original "classic" version of SPF. In July 2005, this version of the specification was approved by the IESG as an IETF experiment, inviting the community to observe SPF during the two years following publication. On April 28, 2006, the SPF RFC was published as experimental RFC 4408. In April 2014, the IETF published SPF in RFC 7208 as a "proposed standard".

Principles of operation
The Simple Mail Transfer Protocol permits any computer to send email claiming to be from any source address. This is exploited by spammers and scammers, who often use forged email addresses, making it more difficult to trace a message back to its source and easy for spammers to hide their identity in order to avoid responsibility. It is also used in phishing techniques, where users can be duped into disclosing private information in response to an email purportedly sent by an organization such as a bank.

SPF allows the owner of an Internet domain to specify which computers are authorized to send mail with envelope-from addresses in that domain, using Domain Name System (DNS) records. Receivers verifying the SPF information in TXT records may reject messages from unauthorized sources before receiving the body of the message.
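A published policy is ordinary DNS data that anyone can query. As a rough illustration (not part of the SPF specification itself), the following Python sketch, assuming the third-party dnspython package and a placeholder domain name, retrieves the SPF string from a domain's TXT records:

```python
# Minimal sketch: fetch a domain's published SPF policy from its TXT records.
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

def fetch_spf(domain):
    """Return the domain's SPF policy string, or None if it publishes none."""
    try:
        answers = dns.resolver.resolve(domain, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None
    for rdata in answers:
        # A TXT record may be split into several character-strings; join them.
        txt = b"".join(rdata.strings).decode("ascii", errors="replace")
        if txt.lower().startswith("v=spf1"):
            return txt
    return None

print(fetch_spf("example.org"))   # e.g. "v=spf1 -all"
```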
The principles of operation are thus similar to those of DNS-based blackhole lists (DNSBL), except that SPF uses the authority delegation scheme of the Domain Name System. Current practice requires the use of TXT records, just as early implementations did. For a while a new record type (SPF, type 99) was registered and made available in common DNS software; the use of TXT records for SPF was intended as a transitional mechanism at the time. The experimental RFC, RFC 4408, section 3.1.1, suggested that "an SPF-compliant domain name SHOULD have SPF records of both RR types". The proposed standard, RFC 7208, says "use of alternative DNS RR types was supported in SPF's experimental phase but has been discontinued".

The envelope-from address is transmitted at the beginning of the SMTP dialog. If the server rejects the domain, the unauthorized client should receive a rejection message, and if that client was a relaying message transfer agent (MTA), a bounce message to the original envelope-from address may be generated. If the server accepts the domain, and subsequently also accepts the recipients and the body of the message, it should insert a Return-Path field in the message header in order to save the envelope-from address. While the address in the Return-Path often matches other originator addresses in the mail header, such as the header-from, this is not necessarily the case, and SPF does not prevent forgery of these other addresses, such as the sender header.

Spammers can send email with an SPF PASS result if they have an account in a domain with a sender policy, or abuse a compromised system in this domain. However, doing so makes the spammer easier to trace.

The main benefit of SPF is to the owners of email addresses that are forged in the Return-Path. They receive large numbers of unsolicited error messages and other auto-replies. If such receivers use SPF to specify their legitimate source IP addresses and indicate a FAIL result for all other addresses, receivers checking SPF can reject forgeries, thus reducing or eliminating the amount of backscatter.

SPF has potential advantages beyond helping identify unwanted mail. In particular, if a sender provides SPF information, then receivers can use SPF PASS results in combination with an allow list to identify known reliable senders. Scenarios like compromised systems and shared sending mailers limit this use.

Reasons to implement
If a domain publishes an SPF record, spammers and phishers are less likely to forge emails pretending to be from that domain, because the forged emails are more likely to be caught in spam filters which check the SPF record. Therefore, an SPF-protected domain is less attractive to spammers and phishers. Because an SPF-protected domain is less attractive as a spoofed address, it is less likely to be denylisted by spam filters, and so ultimately the legitimate email from the domain is more likely to get through.

FAIL and forwarding
SPF breaks plain message forwarding. When a domain publishes an SPF FAIL policy, legitimate messages sent to receivers forwarding their mail to third parties may be rejected and/or bounced if all of the following occur:
The forwarder does not rewrite the Return-Path, unlike mailing lists.
The next hop does not allowlist the forwarder.
This hop checks SPF. This is a necessary and obvious feature of SPF – checks behind the "border" MTA (MX) of the receiver cannot work directly.
Publishers of SPF FAIL policies must accept the risk of their legitimate emails being rejected or bounced.
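Before committing to a FAIL policy, a domain owner can preview how a receiver would judge its mail. A minimal sketch of a receiver-style check, assuming the third-party pyspf package (the IP address, envelope-from and HELO name below are invented examples):

```python
# Sketch of an SPF check as a receiving MTA might run it at MAIL FROM time.
# Assumes the third-party pyspf package (pip install pyspf); values are examples.
import spf

result, explanation = spf.check2(
    i="192.0.2.10",           # IP address of the connecting SMTP client
    s="alice@example.com",    # envelope-from (MAIL FROM) address
    h="mail.example.com",     # HELO/EHLO host name
)

if result == "fail":
    print("reject:", explanation)           # e.g. answer 550 at the SMTP level
elif result in ("temperror", "permerror"):
    print("defer or treat with caution:", result)
else:                                        # pass, none, neutral, softfail
    print("accept (result=%s)" % result)
```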
Publishers should test (e.g., with a SOFTFAIL policy) until they are satisfied with the results. See below for a list of alternatives to plain message forwarding.

HELO tests
For an empty Return-Path, as used in error messages and other auto-replies, an SPF check of the HELO identity is mandatory. With a bogus HELO identity the result NONE would not help, but for valid host names SPF also protects the HELO identity. This SPF feature was always supported as an option for receivers, and later SPF drafts, including the final specification, recommend always checking the HELO. This allows receivers to allowlist sending mailers based on a HELO PASS, or to reject all mails after a HELO FAIL. It can also be used in reputation systems (any allow or deny list is a simple case of a reputation system).

Implementation
Compliance with SPF consists of three loosely related tasks:
Publishing a policy: Domains and hosts identify the machines authorized to send email on their behalf. They do this by adding additional records to their existing DNS information: every domain name or host that has an A record or MX record should have an SPF record specifying the policy if it is used either in an email address or as a HELO/EHLO argument. Hosts which do not send mail should have an SPF record published which indicates such ("v=spf1 -all").
Checking and using SPF information: Receivers use ordinary DNS queries, which are typically cached to enhance performance. Receivers then interpret the SPF information as specified and act upon the result.
Revising mail forwarding: Plain mail forwarding is not allowed by SPF. The alternatives are:
Remailing (i.e., replacing the original sender with one belonging to the local domain)
Refusing (i.e., answering 551 User not local; please try <[email protected]>)
Allowlisting on the target server, so that it will not refuse a forwarded message
Sender Rewriting Scheme, a more complicated mechanism that handles routing non-delivery notifications to the original sender

Thus, the key issue in SPF is the specification for the new DNS information that domains set and receivers use. The records laid out below are in typical DNS syntax, for example:
"v=spf1 ip4:192.0.2.0/24 ip4:198.51.100.123 a -all"
"v=" defines the version of SPF used. The following words provide mechanisms to use to determine if a domain is eligible to send mail. The "ip4" and "a" specify the systems permitted to send messages for the given domain. The "-all" at the end specifies that, if the previous mechanisms did not match, the message should be rejected.

Mechanisms
Eight mechanisms are defined:
ALL – matches always; used for a default result like -all for all IPs not matched by prior mechanisms.
A – if the domain name has an address record (A or AAAA) that can be resolved to the sender's address, it will match.
IP4 – if the sender is in a given IPv4 address range, match.
IP6 – if the sender is in a given IPv6 address range, match.
MX – if the domain name has an MX record resolving to the sender's address, it will match (i.e., the mail comes from one of the domain's incoming mail servers).
PTR – if the domain name (PTR record) for the client's address is in the given domain and that domain name resolves to the client's address (forward-confirmed reverse DNS), match. This mechanism is discouraged and should be avoided if possible.
EXISTS – if the given domain name resolves to any address, match (no matter the address). This is rarely used. Along with the SPF macro language it offers more complex matches, like DNSBL queries.
INCLUDE – references the policy of another domain. If that domain's policy passes, this mechanism passes. However, if the included policy fails, processing continues. To fully delegate to another domain's policy, the redirect extension must be used.

Qualifiers
Each mechanism can be combined with one of four qualifiers:
+ for a PASS result. This can be omitted; e.g., +mx is the same as mx.
? for a NEUTRAL result, interpreted like NONE (no policy).
~ (tilde) for SOFTFAIL, a debugging aid between NEUTRAL and FAIL. Typically, messages that return a SOFTFAIL are accepted but tagged.
- (minus) for FAIL; the mail should be rejected (see below).

Modifiers
The modifiers allow for future extensions to the framework. To date only the two modifiers defined in RFC 4408 have been widely deployed:
exp=some.example.com gives the name of a domain with a DNS TXT record (interpreted using SPF's macro language) to get an explanation for FAIL results – typically a URL which is added to the SMTP error code. This feature is rarely used.
redirect=some.example.com can be used instead of the ALL mechanism to link to the policy record of another domain.
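To make the record grammar above concrete, here is a rough Python sketch, not a conforming RFC 7208 evaluator (it does not resolve include: or redirect= targets), that splits a policy into qualifiers, mechanisms and modifiers and counts the terms subject to the DNS-lookup limits discussed in the next section:

```python
# Illustrative only: split an SPF record into qualifiers, mechanisms, and
# modifiers. A real evaluator (RFC 7208) must also resolve include:/redirect=.
QUALIFIERS = {"+": "pass", "?": "neutral", "~": "softfail", "-": "fail"}
DNS_QUERYING = {"a", "mx", "ptr", "exists", "include", "redirect"}

def parse_spf(record):
    terms, dns_lookups = [], 0
    for token in record.split()[1:]:              # skip the leading "v=spf1"
        if "=" in token:                          # modifier, e.g. redirect=, exp=
            name, _, value = token.partition("=")
            terms.append(("modifier", name, value, None))
            dns_lookups += name in DNS_QUERYING   # redirect= counts, exp= does not
        else:                                     # mechanism with optional qualifier
            qualifier = QUALIFIERS.get(token[0])
            if qualifier:
                token = token[1:]
            name = token.split(":")[0].split("/")[0].lower()
            terms.append(("mechanism", name, token, qualifier or "pass"))
            dns_lookups += name in DNS_QUERYING   # ip4, ip6, all are exempt
    return terms, dns_lookups

terms, lookups = parse_spf("v=spf1 ip4:192.0.2.0/24 ip4:198.51.100.123 a -all")
print(lookups)   # 1 -- only the "a" mechanism needs a DNS query
for term in terms:
    print(term)
```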
The redirect modifier is easier to understand than the somewhat similar INCLUDE mechanism.

Error handling
As soon as SPF implementations detect syntax errors in a sender policy, they must abort the evaluation with result PERMERROR. Skipping erroneous mechanisms cannot work as expected; therefore include:bad.example and redirect=bad.example also cause a PERMERROR. Another safeguard is the maximum of ten mechanisms querying DNS, i.e. any mechanism except IP4, IP6, and ALL. Implementations can abort the evaluation with result TEMPERROR when it takes too long or a DNS query times out, or they can continue, pretending that the query returned no data – which is called a "void lookup". However, they must return PERMERROR if the policy directly or indirectly needs more than ten queries for mechanisms. In addition, they should return PERMERROR as soon as more than two "void lookups" have been encountered. Any redirect= also counts towards this processing limit.

A typical SPF HELO policy v=spf1 a mx ip4:192.0.2.0 -all may execute four or more DNS queries: (1) TXT record (the SPF type was obsoleted by RFC 7208), (2) A or AAAA for mechanism a, (3) MX record and (4+) A or AAAA for each MX name, for mechanism mx. Except for the first one, all of those queries count towards the limit of 10. In addition, if, for example, the sender has an IPv6 address, while its name and its two MX names have only IPv4 addresses, then the evaluation of the first two mechanisms already results in more than two void lookups and hence PERMERROR. Note that mechanisms ip4, ip6 and all need no DNS lookup.

Issues
DNS SPF records
To enable rapid testing and deployment, initial versions of SPF checked for its setting in the DNS TXT record of the sending domain, even though this record was traditionally supposed to be free-form text with no semantics attached. Although in July 2005 IANA assigned a specific resource record type 99 to SPF, its uptake was never high, and having two mechanisms was confusing for users. In 2014 the use of this record was discontinued after the SPFbis working group concluded that "...significant migration to the SPF RR type in the foreseeable future was very unlikely and that the best solution for resolving this interoperability issue was to drop support for the SPF RR type."

Header limitations
As SPF increasingly prevents spammers from spoofing the envelope-from address, many have moved to only spoof the address in the From field of the mail header, which is actually displayed to the recipient rather than only processed by the recipient's message transfer agent (MTA). SPF (or DKIM) can be used together with DMARC, though, to also check the From field of the mail header. This is called "identifier alignment". Custom proprietary implementations are required to protect against such display-name spoofing and cannot utilize SPF.

Deployment
Anti-spam software such as SpamAssassin version 3.0.0 and ASSP implement SPF. Many mail transfer agents (MTAs) support SPF directly, such as Courier, CommuniGate Pro, Wildcat, MDaemon, and Microsoft Exchange, or have patches or plug-ins available that support SPF, including Postfix, Sendmail, Exim, qmail, and Qpsmtpd. As of 2017, more than eight million domains publish SPF FAIL -all policies. In a survey published in 2007, 5% of the .com and .net domains had some kind of SPF policy. In 2009, a continuous survey run at Nokia Research reported that 51% of the tested domains specified an SPF policy. These results can include trivial policies like v=spf1 ?all.
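What makes a policy like v=spf1 ?all "trivial" can be seen from the qualifier on its final all term. A hedged Python sketch (the classification labels are illustrative, not terminology from the surveys cited above):

```python
# Rough sketch: classify an SPF policy by the qualifier on its final "all"
# term. "v=spf1 ?all" is trivial in the sense that it asserts nothing.
def classify_policy(record):
    last = record.split()[-1].lower()
    return {
        "-all": "enforcing (FAIL for unlisted senders)",
        "~all": "soft (SOFTFAIL, usually accept-and-tag)",
        "?all": "trivial (NEUTRAL, asserts nothing)",
        "+all": "permissive (PASS for everyone)",
        "all":  "permissive (PASS for everyone)",
    }.get(last, "open-ended (no explicit all term)")

print(classify_policy("v=spf1 ?all"))                    # trivial
print(classify_policy("v=spf1 ip4:192.0.2.0/24 -all"))   # enforcing
```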
In April 2007, BITS, a division of the Financial Services Roundtable, published email security recommendations for its members, including SPF deployment. In 2008, the Messaging Anti-Abuse Working Group (MAAWG) published a paper about email authentication covering SPF, Sender ID, and DomainKeys Identified Mail (DKIM). In its "Sender Best Communication Practices" the MAAWG stated: "At the very least, senders should incorporate SPF records for their mailing domains". In 2015, the MAAWG revised its paper about email authentication, covering SPF, DomainKeys Identified Mail (DKIM), and DMARC. In the revised "Sender Best Communication Practices" the MAAWG stated: "Authentication supports transparency by further identifying the sender(s) of a message, while also contributing to the reduction or elimination of spoofed and forged addresses".

See also
DomainKeys Identified Mail (DKIM)
Author Domain Signing Practices
DMARC

References

External links
IETF RFC 4408: Sender Policy Framework (SPF) for Authorizing Use of Domains in Email, Version 1, EXPERIMENTAL (2006)
IETF RFC 6652: Sender Policy Framework (SPF) Authentication Failure Reporting Using the Abuse Reporting Format, PROPOSED STANDARD (2012)
IETF RFC 7208: Sender Policy Framework (SPF) for Authorizing Use of Domains in Email, Version 1, PROPOSED STANDARD (2014)
libspf2 – an open-source implementation of the SPF protocol (2010)
https://senderpolicyframework.com/ and https://www.spfwizard.net/ (tools for the generation, validation and configuration of SPF records)
Distributed Denial of Secrets
Distributed Denial of Secrets, abbreviated DDoSecrets, is a non-profit whistleblower site for news leaks founded in 2018. Sometimes referred to as a successor to WikiLeaks, it is best known for its June 2020 publication of a large collection of internal police documents, known as BlueLeaks. The group has also published data on Russian oligarchs, fascist groups, shell companies, tax havens and banking in the Caymans, as well as hosting data scraped from Parler in January 2021 and from the February 2021 Gab leak. The group is also known for publishing emails from military officials, City Hall in Chicago and the Washington D.C. Metropolitan Police Department. As of January 2021, the site hosts dozens of terabytes of data. The site is a frequent source for other news outlets. The site's leaks have resulted in or contributed to multiple government investigations, including the second impeachment of President Donald J. Trump. History Distributed Denial of Secrets was founded by Emma Best, an American national security reporter known for filing prolific freedom of information requests, and another member of the group known as The Architect. According to Best, The Architect, who they already knew, approached them and expressed their desire to see a new platform for leaked and hacked materials, along with other relevant datasets. The Architect provided the initial technical expertise for the project. At its public launch in December 2018, the site held more than 1 terabyte of data from many of the highest-profile leaks. The site originally considered making all of the data public, but after feedback made some of it available only to journalists and researchers. Best has served as a public face of the group, which lists its members. In February 2019, they told Columbia Journalism Review there were fewer than 20 people working on the project. In April 2021, their website listed 10 members and advisors. In December 2019, Distributed Denial of Secrets announced their collaboration with the Organized Crime and Corruption Reporting Project. In May 2020, DDoSecrets partnered with European Investigative Collaborations and the Henri-Nannen-Journalistenschule journalism school. In June 2020, the DDoSecrets Twitter account was suspended in response to BlueLeaks, citing a breach of their policies against "distribution of hacked material" in a move that was criticized as setting a "dangerous precedent." In December 2020, the group announced their affiliation with Harvard University's Institute for Quantitative Social Science. Response DDoSecrets and the people behind the project have been described by Wired as a "transparency collective of data activists" and a successor to WikiLeaks, by the Congressional Research Service, Organized Crime and Corruption Reporting Project, Human Rights Watch and The Nation as a "transparency collective", by The Hill as a "leaktivist collective", by Columbia Journalism Review as a "journalist collective", by Brookings Institution as "a WikiLeaks-style journalist collective," by the New York Times as a "watchdog group", and Business Insider as a "freedom-of-information advocacy group", as an "alternative to WikiLeaks" by Columbia Journalism Review, Krebs On Security, ZDNet, and Forbes, and as "the most influential leaking organization on the internet" by VICE News." Government response In 2019, the Congressional Research Service recognized Distributed Denial of Secrets as a transparency collective. In 2020, the U.S. 
counterintelligence strategy described leaktivists and public disclosure organizations like Distributed Denial of Secrets as “significant threats,” alongside five countries, three terrorist groups, and “transnational criminal organizations.” A June 2020 bulletin created by the Department of Homeland Security's Office of Intelligence and Analysis described them as a "criminal hacker group". Elements of the report were challenged as inaccurate by media such as The Verge. The next month, the Internal Revenue Service (IRS) recognized the group as a 501(c)(3) non-profit. Publications Russian leaks Russian Ministry of the Interior In December 2019, DDoSecrets listed a leak from Russia's Ministry of Internal Affairs, portions of which detailed the deployment of Russian troops to Ukraine at a time when the Kremlin was denying a military presence there. Some material from that leak was published in 2014, about half of it wasn't, and WikiLeaks reportedly rejected a request to host the files two years later, at a time when Julian Assange was focused on exposing Democratic Party documents passed to WikiLeaks by Kremlin hackers. Dark Side of the Kremlin In January 2019, DDoSecrets published hundreds of gigabytes of hacked Russian documents and emails from pro-Kremlin journalists, oligarchs, and militias. The New York Times called the release "a symbolic counterstrike against Russia's dissemination of hacked emails to influence the American presidential election in 2016." According to the Times, the documents shed light on the Russian invasion of Ukraine as well as ties between the Kremlin and the Russian Orthodox Church, the business dealings of oligarchs and much more. According to an internal bulletin issued by the Department of Homeland Security, the "hack-and-leak activity" was conducted by DDoSecrets, though reporting by The Daily Beast identified several independent hacktivists responsible for the hacks. Bankers Box series The Bankers Boxes are a series of releases from DDoSecrets related to banking, finance and corporate ownership. Rossi + MPS In September 2019, DDoSecrets published the investigation file for the death of David Rossi, an executive of the world's oldest bank Banca Monte dei Paschi di Siena, who died under suspicious circumstances while the bank was embroiled in a scandal. Cayman Islands In November 2019, DDoSecrets published over 2 terabytes of data from the Cayman Island National Bank and Trust, dubbed the Sherwood files. The files were provided by the hacktivist known as Phineas Fisher, who was previously responsible for the hack and subsequent release of Gamma Group and Hacking Team documents and emails. The files included lists of the bank's politically exposed clients and was used for studies of how elites use offshore banking. The leak led to at least one government investigation. #29 Leaks In December 2019, DDoSecrets published #29 Leaks in partnership with the Organized Crime and Corruption Reporting Project. The hundreds of gigabytes of data in #29 Leaks included emails, documents, faxes, and recordings of phone calls. The leak was compared to the Panama Papers and the Paradise Papers and came from Formations House, which registered and operated companies for clients who included organized crime, state owned oil companies, and fraudulent banks. The leak led to at least one government investigation. Corporate registries In 2019 and 2020, DDoSecrets published corporate registries for the Cook Islands and the Bahamas. 
DDoSecrets partnered with European Investigative Collaborations and the German Henri-Nannen-Journalistenschule journalism school in an unprecedented project named Tax Evader Radar to review and research a dataset containing almost one million documents from the Bahamas company registry. The project exposed the offshore holdings of prominent Germans, the activities of ExxonMobil, as well as the DeVos and Prince families. The leak included files which ICIJ reviewed as part of Bahamas Leaks but did not make available to the public. PacoLeaks and MilicoLeaks In December 2019, DDoSecrets re-published the first tranche of PacoLeaks, data from Chilean police hacked by Anonymous as part of ongoing protests, after it was censored before publishing the second tranche. Soon after, they published emails hacked from the Chilean military, dubbed MilicoLeaks. MilicoLeaks included details on Chilean army intelligence, including operations, finance and international relations. Project Whispers In April 2020, DDoSecrets published millions of neo-nazi and far-right chat messages in a searchable database called Whispers. The leaked chats showed threats of violence and attempts to sway the 2018 United States midterm elections. BlueLeaks On June 19, 2020, DDoSecrets released BlueLeaks, which consisted of 269 gigabytes of internal U.S. law enforcement data obtained from fusion centers by the hacker collective Anonymous. DDoSecrets called it the "largest published hack of American law enforcement agencies." The editor for The Intercept described BlueLeaks as the law enforcement equivalent to the Pentagon Papers. Some of the group's servers were located in Germany, and German authorities seized those servers at the request of the United States. Twitter and other social media companies cooperated with police by suspending the group's accounts and making their past posts inaccessible. Twitter cited its terms of service, which explicitly bars the distributing of "content obtained through hacking that contains private information, may put people in harm or danger, or contains trade secrets." However, Emma Best, one of the group's founders, called Twitter's actions "heavy-handed", as they suspended users whose tweets had linked to archives where leaked material could be found, they also suspended users whose tweets merely mentioned the leak. On July 9, Reddit banned /r/BlueLeaks, a community created to discuss BlueLeaks, claiming they had posted personal information. There is a federal investigation relating to BlueLeaks. Various Freedom of Information Act requests filed about BlueLeaks and DDoSecrets were rejected due to an ongoing federal investigation. Homeland Security Investigations has questioned at least one person, seeking information about BlueLeaks and DDoSecrets. As a result of BlueLeaks, there were calls in 2020 to defund fusion centers and in 2021 Maine began holding legislative hearings about it. Findings During the George Floyd protests, law enforcement agencies monitored protesters' communications over social media and messaging apps. Reports leaked found that the police were aware of the potential for their surveillance to violate the Constitution. They distributed documents to police filled with rumors and warnings that the protests would become violent, sparking fear among police officers. The documents also show a much broader trend of surveillance. They show details about the data that police can obtain from social media sites including Facebook, Twitter, TikTok, Reddit and Tumblr, among others. 
Fusion centers also collect and distribute detailed data from automatic license plate readers. Surveys from law enforcement training programs reveal that some instructors were prejudiced and unprofessional. Classes taught biased, outdated, and incorrect content. Some contain sexual content unrelated to the class, and there was one report of an instructor admitting to lying in court frequently. In Maine, legislators took interest in BlueLeaks thanks to details about the Maine Information and Analysis Center, which is under investigation. The leaks showed the fusion center was spying on and keeping records on people who had been legally protesting or had been "suspicious" but committed no crime. Documents also contain reports about other countries from the Department of Homeland Security, U.S. Department of State and other agencies. Officials discussed cyber attacks from Iran and concerns about further attacks in early 2020. Another report discusses possible Chinese espionage at natural gas facilities. Homeland Security also discussed Russian interference with American elections, attempts to hack the 2020 census, and manipulation of social media discussion. Google's CyberCrime Investigation Group On August 21, The Guardian reported, based on the leaked documents, the existence of Google's "CyberCrime Investigation Group" (CIG). The group focused on voluntarily forwarding detailed information of Google, YouTube and Gmail users, among other products, to members of the Northern California Regional Intelligence, a counter-terrorist fusion center, for content threatening violence or otherwise expressing extremist views, often associated with the far right. The company has also been said to report users who appeared to be in mental distress, indicating suicidal thoughts or intent to commit self-harm. One way Google identified its users in order to report them to law enforcement was by cross-referencing different Gmail accounts that eventually led them to a single Android phone. In some cases the company did not ban the users they reported to the authorities, and some were said to still have accounts on YouTube, Gmail and other services. Gab Chat In early 2020, Gab, a social network known for its far-right userbase, launched encrypted text messaging service Gab Chat in beta. In late June 2020, hackers leaked a May 26 law enforcement bulletin that was distributed by DDoSecrets as part of BlueLeaks. The bulletin was created by the Central Florida Intelligence Exchange Fusion Center, who speculated that Gab Chat's encryption and privacy features for private chatting, such as the service automatically deleting text messages after 30 days of them being sent, could entice white supremacists to use the platform instead of Discord, a platform on which white supremacist groups have been frequently infiltrated by anti-fascists. AssangeLeaks In July 2020, DDoSecrets released secret files on the United States' case against Julian Assange. Giving ransomware leaks to journalists In January 2021, DDoSecrets began making data published by ransomware hackers available to journalists. The initial release contained over 750,000 files from industries including pharmaceuticals, manufacturing, finance, software, retail, real estate, and oil and gas. In June 2021, DDoSecrets released 73,500 emails, accounting files, contracts, and around 19 GB of other business documents from the pipeline firm LineStar Integrity Services. 
The same month, 200 gigabytes from the Presque Isle Police Department were posted online, including 15,000 emails as well as police reports and witness statements from the 1970s to the present. DDoSecrets mirrored the files and gave them to journalists, but did not repost them publicly, citing privacy concerns.

Perceptics
The group pointed to their earlier publication of the Perceptics breach as an example of the importance of ransomware leaks. The breach revealed that the security firm had lobbied Congress to downplay privacy and security concerns, provided extensive favors to politicians, and crafted some of the Republican Party's demands on border security.

Jones Day (Chicago emails)
In April 2021, DDoSecrets published a cache of emails from Chicago City Hall, which Mayor Lightfoot refused to answer questions about. The emails revealed that the city's handling of fatal shootings by police officers violates state law and a federal consent decree. The emails also exposed the Mayor's secret lobbying for qualified immunity, a secret drone program funded with off-the-books cash, and the city's problems with police chases and the George Floyd protests. The emails also revealed that the Mayor's office was blindsided by CPD's use of facial recognition and Clearview AI.

Metropolitan Police Department
In May 2021, DDoSecrets republished the leak of Washington D.C.'s Metropolitan Police Department, including over 90,000 emails. According to DDoSecrets co-founder Emma Best, the documents gave "a unique opportunity to examine how these systems of policing are built, how they’re deployed, and an opportunity to perform an authoritative study on how, when and why the system is deployed differently against different groups." Among other things, the files revealed details of surveillance of right-wing extremists and the response to the 2021 United States Capitol attack.

Parler
In January 2021, DDoSecrets made the scraped Parler videos available to journalists. Videos scraped from Parler were used as evidence during the second impeachment trial of Donald Trump.

Myanmar releases
Myanmar Financials
In February 2021, DDoSecrets gave journalists financial documents from the Directorate of Investment and Company Administration (DICA) showing Google was indirectly supporting the Myanmar coup by allowing Gmail addresses and Google-run blogs to be used to run companies owned and operated by Myanmar's military and coup leaders. After the public release of the 330-gigabyte leak, Google disabled the blog. A Google spokesperson told Insider, "In this case, we have terminated accounts as a result of President Biden's Executive Order of 11 February 2021 concerning Myanmar." Justice For Myanmar called the release the "biggest leak in Myanmar history."

Myanmar Investments
In March 2021, DDoSecrets published an additional 156 GB of data which had been hacked from the Myanmar Investment Commission. The release included entries of the Investments Management System and proposals and permits, many of which are labelled "secret" or "confidential". As a result, Justice For Myanmar added 26 companies to its list of business associates of the Myanmar military. The leak also revealed how millions of dollars allegedly flowed from Mytel subscribers into the pockets of Myanmar military generals, and how their families profited from the military, the coup itself and the internet blackouts. The leak also led to allegations of profiteering, which resulted in policy changes that cost Myanmar generals millions of dollars.
The data also revealed that Thai state-owned companies were funding the Myanmar junta. GabLeaks On February 28, DDoSecrets revealed "GabLeaks", a collection of more than 70 gigabytes of data from Gab, including more than 40 million posts, passwords, private messages, and other leaked information. The data was given to the group by a hacktivist self-identifying as "JaXpArO and My Little Anonymous Revival Project", who retrieved the data from Gab's back-end databases to expose the platform's largely right-wing userbase. DDoSecrets co-founder Emma Best called GabLeaks "another gold mine of research for people looking at militias, neo-Nazis, the far right, QAnon and everything surrounding January 6." The group said that they would not release the data publicly due to the data containing a large amount of private and sensitive information and will instead share the data with select journalists, social scientists, and researchers. Andy Greenberg from Wired confirmed that the data "does appear to contain Gab users' individual and group profiles—their descriptions and privacy settings—public and private posts, and passwords". In response, Gab CEO Andrew Torba acknowledged the data breach, said that his Gab account had been "compromised", and that "the entire company is all hands investigating what happened and working to trace and patch the problem". Torba also used a transphobic slur to insult the hackers "attacking" Gab and referred to them as "demon hackers." On March 1, he revealed in a post on Gab's blog that the company had received a ransom demand of $500,000 in Bitcoin for the data, and wrote in response that they would not be paying it. Also on March 1, Torba said in a Gab post that "I want to make clear that we have zero tolerance for any threats of violence including against the wicked people who are attacking Gab. We need to pray for these people. I am." Dan Goodin reported in Ars Technica on March 2 that Gab's chief technology officer (CTO), Fosco Marotto, had in February introduced a SQL vulnerability that may have led to the data breach, and that Gab had subsequently scrubbed the commit from Git history. The company had previously open sourced Gab's source code in a Git repository which included all historical commits; on March 1, they took the repository offline and replaced it with a zipfile. On March 8, JaXpArO again compromised verified accounts on Gab, posting a message to their feeds addressed to Torba, which said the service had been "fully compromised" the previous week and accused him of lying to Gab's users. Gab briefly went offline again the same day, and the company wrote on Twitter that they had taken their site offline "to investigate a security breach". Torba posted a statement in response to the attack, claiming that "The attacker who stole data from Gab harvested OAuth2 bearer tokens during their initial attack" and that "Though their ability to harvest new tokens was patched, we did not clear all tokens related to the original attack. By reusing these old tokens, the attacker was able to post 177 statuses in an 8-minute period today." In May 2021, The Intercept used GabLeaks in its coverage and fundraising. Former Intercept reporter Glenn Greenwald criticized the publication for exploiting what he called an invasion of free speech and privacy, which he said contrasted with The Intercept's origins during the Snowden leaks. 
GiveSendGo In April 2021, Distributed Denial of Secrets made donor information from the Christian crowdfunding site GiveSendGo available to journalists and researchers. The information identified previously anonymous high-dollar donors to far-right actors including members of the Proud Boys, designated as a terrorist group in Canada, many of whose fundraising efforts were directly related to the 2021 United States Capitol attack. The platform had previously been criticized for its refusal to restrict use by far right extremists. It was later reported that police officers and public officials in the United States had donated to Kyle Rittenhouse. The executive officer for internal affairs for Norfolk Police Department was fired for the comments he made with his donation to Rittenhouse. In May 2021, USA Today used the GiveSendGo data to report that nearly $100,000 was raised for the Proud Boys on GiveSendGo from people of Chinese descent in the days before the 2021 Capitol attack. In June 2021, USA Today used the GiveSendGo data to report that a member of the Koch family had anonymously donated to a crowdfunding campaign supporting the election fraud conspiracy theories. In February 2022, after many anonymous donors supported the 2022 Freedom Convoy, DDOS started distributing a hacked list of donors' personal information from GiveSendGo to journalists and researchers. See also Anonymous (hacker group) Bahamas Leaks BlueLeaks Cryptome Global surveillance disclosures (2013–present) Organized Crime and Corruption Reporting Project International Consortium of Investigative Journalists Panama Papers Paradise Papers Offshore Leaks WikiLeaks References External links Classified documents Freedom of speech Freedom of expression Internet leaks Investigative journalism National security News leaks Online archives Online organizations Organizations established in 2018 Transparency (behavior) WikiLeaks Whistleblowers Whistleblowing Whistleblower support organizations Open government
Advance-fee scam
An advance-fee scam is a form of fraud and is one of the most common types of confidence tricks. The scam typically involves promising the victim a significant share of a large sum of money, in return for a small up-front payment, which the fraudster claims will be used to obtain the large sum. If a victim makes the payment, the fraudster either invents a series of further fees for the victim to pay or simply disappears.

The Federal Bureau of Investigation (FBI) states that "An advance fee scheme occurs when the victim pays money to someone in anticipation of receiving something of greater value—such as a loan, contract, investment, or gift—and then receives little or nothing in return."

There are many variations of this type of scam, including the Nigerian prince scam, also known as a 419 scam. The number "419" refers to the section of the Nigerian Criminal Code dealing with fraud and the charges and penalties for such offenders. The scam has been used with fax and traditional mail, and is now prevalent in online communications like emails. Other variations include the Spanish Prisoner scam and the black money scam. Other nations known to have a high incidence of advance-fee fraud include Ivory Coast, Togo, South Africa, the Netherlands, Spain, Poland and Jamaica.

History
The modern scam is similar to the Spanish Prisoner scam, which dates back to the late 18th century. In that con, businessmen were contacted by an individual allegedly trying to smuggle someone connected to a wealthy family out of a prison in Spain. In exchange for assistance, the scammer promised to share money with the victim, who was asked for a small amount of money to bribe prison guards. One variant of the scam may date back to the 18th or 19th centuries, as a very similar letter, entitled "The Letter from Jerusalem", appears in the memoirs of Eugène François Vidocq, a former French criminal and private investigator. Another variant of the scam, dating back to ca. 1830, appears very similar to emails today: "Sir, you will doubtlessly be astonished to be receiving a letter from a person unknown to you, who is about to ask a favour from you...", and goes on to talk of a casket containing 16,000 francs in gold and the diamonds of a late marchioness.

The modern-day transnational scam can be traced back to Germany in 1922 and became popular during the 1980s. There are many variants of the template letter. One of these, sent via postal mail, was addressed to a woman's husband and inquired about his health. It then asked what to do with profits from a $24.6 million investment, and ended with a telephone number. Other official-looking letters were sent from a writer who said he was a director of the state-owned Nigerian National Petroleum Corporation. He said he wanted to transfer $20 million to the recipient's bank account – money that was budgeted but was never spent. In exchange for transferring the funds out of Nigeria, the recipient would keep 30% of the total. To get the process started, the scammer asked for a few sheets of the company's letterhead, bank account numbers, and other personal information. Yet other variants have involved mention of a Nigerian prince or other member of a royal family seeking to transfer large sums of money out of the country – thus, these scams are sometimes called "Nigerian prince emails".

The spread of email and email-harvesting software significantly lowered the cost of sending scam letters by using the Internet in lieu of international post.
While Nigeria is most often the nation referred to in these scams, they may originate in other nations as well. For example in 2007, the head of the Economic and Financial Crimes Commission stated that scam emails more frequently originated in African countries or in Eastern Europe. Within the European Union, there is a high incidence of advance-fee fraud in the Netherlands and Spain. According to Cormac Herley, a Microsoft researcher, "By sending an email that repels all but the most gullible, the scammer gets the most promising marks to self-select." Nevertheless, Nigeria has earned a reputation for being at the center of email scammers, and the number 419 refers to the article of the Nigerian Criminal Code (part of Chapter 38: "Obtaining property by false pretenses; Cheating") dealing with fraud. Modern variations have arisen and subsequently became fairly popular among Nigerian youth circles. Such variants include “sugar daddy/sugar momma” schemes involving advance fees, money “flipping” involving mobile payment apps, and related scams. They refer to their targets as Maga or Mugu, slang developed from a Yoruba word meaning 'easy target' or 'fool' and referring to gullible people in general. Some scammers have accomplices in the United States and abroad who move in to finish the deal once the initial contact has been made. Implementation This scam usually begins with the perpetrator contacting the victim via email, instant messaging, or social media using a fake email address or a fake social media account. The fraudster then makes an offer that would allegedly result in a large payoff for the victim. An email subject line may say something like "From the desk of barrister [Name]", "Your assistance is needed", and so on. The details vary, but the usual story is that a person, often a government or bank employee, knows of a large amount of unclaimed money or gold that they cannot access directly, usually because they have no right to it. Such people, who may be real but impersonated people or fictitious characters played by the con artist, could include, for example, the wife or son of a deposed African leader who has amassed a stolen fortune, a bank employee who knows of a terminally ill wealthy person with no relatives, or a wealthy foreigner who deposited money in the bank just before dying in a plane crash (leaving no will or known next of kin), a US soldier who has stumbled upon a hidden cache of gold in Iraq, a business being audited by the government, a disgruntled worker or corrupt government official who has embezzled funds, a refugee, and similar characters. The money could be in the form of gold bullion, gold dust, money in a bank account, blood diamonds, a series of checks or bank drafts, and so forth. The sums involved are usually in the millions of dollars, and the investor is promised a large share, typically ten to forty percent, in return for assisting the fraudster to retrieve or expatriate the money. Although the vast majority of recipients do not respond to these emails, a very small percentage do, enough to make the fraud worthwhile, as many millions of messages can be sent daily. To help persuade the victim to agree to the deal, the scammer often sends one or more false documents that bear official government stamps, and seals. 419 scammers often mention false addresses and use photographs taken from the Internet or from magazines to falsely represent themselves. Often a photograph used by a scammer is not a picture of any person involved in the scheme. 
Multiple "people" may write or be involved in schemes as they continue, but they are often fictitious; in many cases, one person controls many fictitious personae all used in scams. Once the victim's confidence has been gained, the scammer then introduces a delay or monetary hurdle that prevents the deal from occurring as planned, such as "To transmit the money, we need to bribe a bank official. Could you help us with a loan?" or "For you to be a party to the transaction, you must have holdings at a Nigerian bank of $100,000 or more" or similar. This is the money being stolen from the victim; the victim willingly transfers the money, usually through some irreversible channel such as a wire transfer, and the scammer receives and pockets it. Often but not always, delays and additional costs are added by the fraudster, always keeping the promise of an imminent large transfer alive, convincing the victim that the money the victim is currently paying would be covered several times over by the payoff. The implication that these payments will be used for white-collar crime, such as bribery, and even that the money they are being promised is being stolen from a government or royal/wealthy family, often prevents the victim from telling others about the "transaction", as it would involve admitting that they intended to be complicit in an international crime. Sometimes psychological pressure is added by claiming that the Nigerian side, to pay certain fees, had to sell belongings and borrow money on a house or by comparing the salary scale and living conditions in Africa to those in the West. Much of the time, however, the needed psychological pressure is self-applied: once the victims have provided money toward the payoff, they feel they have a vested interest in seeing the "deal" through. Some victims even believe they can cheat the other party, and walk away with all the money instead of just the percentage they were promised. The essential fact in all advance-fee fraud operations is the promised money transfer to the victim never happens because the money does not exist. The perpetrators rely on the fact that by the time the victim realizes this (often only after being confronted by a third party who has noticed the transactions or conversation and recognized the scam), the victim may have sent thousands of dollars of their own money. Sometimes thousands more that has been borrowed or stolen to the scammer via an untraceable and/or irreversible means such as wire transfer. The scammer disappears, and the victim is left on the hook for the money sent to the scammer. During the course of many schemes, scammers ask victims to supply bank account information. Usually this is a "test" devised by the scammer to gauge the victim's gullibility; the bank account information isn't used directly by the scammer, because a fraudulent withdrawal from the account is more easily detected, reversed, and traced. Scammers instead usually request that payments be made using a wire transfer service like Western Union and MoneyGram. The reason given by the scammer usually relates to the speed at which the payment can be received and processed, allowing quick release of the supposed payoff. The real reason for using such money-sending services is that such wire transfers are irreversible and often untraceable. Further, these services are ideal because identification beyond knowledge of the details of the transaction is often not required, making receipt of such funds almost or entirely completely anonymous. 
However, bank account information obtained by scammers is sometimes sold in bulk to other fraudsters who wait a few months for the victim to repair the damage caused by the initial scam before raiding any accounts that the victim didn't close. Telephone numbers used by scammers tend to come from burner phones. In Ivory Coast, a scammer may purchase an inexpensive mobile phone and a pre-paid SIM card without submitting any identifying information. If the scammers believe they are being traced, they discard their mobile phones and purchase new ones. The spam emails used in these scams are often sent from Internet cafés equipped with satellite internet connection. Recipient addresses and email content are copied and pasted into a webmail interface using a stand-alone storage medium, such as a memory card. Certain areas of Lagos, such as Festac, contain many cyber cafés that serve scammers; cyber cafés often seal their doors outside hours, such as from 10:30pm to 7:00am, so that scammers inside may work without fear of discovery. Nigeria also contains many businesses that provide false documents used in scams. After a scam involving a forged signature of Nigerian President Olusegun Obasanjo in summer 2005, Nigerian authorities raided a market in the Oluwole section of Lagos. There, police seized thousands of Nigerian and non-Nigerian passports, 10,000 blank British Airways boarding passes,10,000 United States Postal money orders, customs documents, false university certificates, 500 printing plates, and 500 computers. The "success rate" of the scammers is also hard to gauge, since they are operating illegally and do not keep track of specific numbers. One individual estimated he sent 500 emails per day and received about seven replies, citing that when he received a reply, he was 70 percent certain he would get the money. If tens of thousands of emails are sent every day by thousands of individuals, it doesn't take a very high success rate to be worthwhile. Countermeasures In recent years, efforts have been made by governments, internet companies, and individuals to combat scammers involved in advance-fee fraud and 419 scams. In 2004, the Nigerian government formed the Economic and Financial Crimes Commission (EFCC) to combat economic and financial crimes, such as advanced-fee fraud. In 2009, Nigeria's EFCC announced that they have adopted smart technology developed by Microsoft to track down fraudulent emails. They hoped to have the service, dubbed "Eagle Claw", running at full capacity to warn a quarter of a million potential victims. Some individuals participate in a practice known as scam baiting, in which they pose as potential targets and engage the scammers in lengthy dialogue so as to waste the scammer's time and decrease the time they have available for actual victims. Common elements Irreversible money transfers A central element of advance-fee fraud is that the transaction from the victim to the scammer must be untraceable and irreversible. Otherwise, the victim, once they become aware of the scam, could successfully retrieve their money and alert officials who could track the accounts used by the scammer. Wire transfers via Western Union and MoneyGram are ideal for this purpose. International wire transfers cannot be cancelled or reversed, and the person receiving the money cannot be tracked. Other non-cancellable forms of payment include postal money orders and cashier's checks, but wire transfer via Western Union or MoneyGram is more common. 
Anonymous communication Since the scammer's operations must be untraceable to avoid identification, and because the scammer is often impersonating someone else, any communication between the scammer and his victim must be done through channels that hide the scammer's true identity. The following options in particular are widely used. Web-based email Because many free email services do not require valid identifying information and also allow communication with many victims in a short span of time, they are the preferred method of communication for scammers. Some services go so far as to mask the sender's source IP address (Gmail being a common choice), making the scammer's country of origin more difficult to trace. While Gmail does indeed strip headers from emails, it is possible to trace an IP address from such an email. Scammers can create as many accounts as they wish and often have several at a time. In addition, if email providers are alerted to the scammer's activities and suspend the account, it is a trivial matter for the scammer to simply create a new account to resume scamming. Email hijacking/friend scams Some fraudsters hijack existing email accounts and use them for advance-fee fraud purposes. For instance, with social engineering, the fraudster impersonates associates, friends, or family members of the legitimate account owner in an attempt to defraud them. A variety of techniques such as phishing, keyloggers, and computer viruses are used to gain login information for the email address. Fax transmissions Facsimile machines are commonly used tools of business whenever a client requires a hard copy of a document. They can also be simulated using web services and made untraceable by the use of prepaid phones connected to mobile fax machines or by use of a public fax machine such as one owned by a document processing business like FedEx Office/Kinko's. Thus, scammers posing as business entities often use fax transmissions as an anonymous form of communication. This is more expensive, as the prepaid phone and fax equipment cost more than email, but to a skeptical victim, it can be more believable. SMS messages Abusing SMS bulk senders such as WASPs, scammers subscribe to these services using fraudulent registration details and paying either via cash or with stolen credit card details. They then send out masses of unsolicited SMS messages to victims stating they have won a competition, lottery, reward, or an event and that they have to contact somebody to claim their prize. Typically the details of the party to be contacted will be an equally untraceable email address or a virtual telephone number. These messages may be sent over a weekend when the staff at the service providers are not working, enabling the scammer to be able to abuse the services for a whole weekend. Even when traceable, they give out long and winding procedures for procuring the reward (real or unreal) and that too with the impending huge cost of transportation and tax or duty charges. The origin of such SMS messages is often from fake websites/addresses. A contemporary (mid-2011) innovation is the use of a Premium Rate 'call back' number (instead of a website or email) in the SMS. On calling the number, the victim is first reassured that 'they are a winner' and then subjected to a long series of instructions on how to collect their 'winnings'. During the message, there will be frequent instructions to 'ring back in the event of problems'. 
The call is always 'cut off' just before the victim has the chance to note all the details. Some victims call back multiple times in an effort to collect all the details. The scammer thus makes their money out of the fees charged for the calls. Telecommunications relay services Many scams use telephone calls to convince the victim that the person on the other end of the deal is a real, truthful person. The scammer, possibly impersonating a person of a nationality or gender other than their own, would arouse suspicion by telephoning the victim. In these cases, scammers use TRS, a US federally funded relay service where an operator or a text/speech translation program acts as an intermediary between someone using an ordinary telephone and a deaf caller using TDD or other teleprinter device. The scammer may claim they are deaf, and that they must use a relay service. The victim, possibly drawn in by sympathy for a disabled caller, might be more susceptible to the fraud. FCC regulations and confidentiality laws require operators to relay calls verbatim and adhere to a strict code of confidentiality and ethics. Thus, no relay operator may judge the legality and legitimacy of a relay call and must relay it without interference. This means the relay operator may not warn victims, even when they suspect the call is a scam. MCI said about one percent of their IP Relay calls in 2004 were scams. Tracking phone-based relay services is relatively easy, so scammers tend to prefer Internet Protocol-based relay services such as IP Relay. In a common strategy, they bind their overseas IP address to a router or server located on US soil, allowing them to use US-based relay service providers without interference. TRS is sometimes used to relay credit card information to make a fraudulent purchase with a stolen credit card. In many cases however, it is simply a means for the con artist to further lure the victim into the scam. Invitation to visit the country Sometimes, victims are invited to a country to meet government officials, an associate of the scammer, or the scammer themselves. Some victims who travel are instead held for ransom. Scammers may tell a victim that they do not need a visa or that the scammers will provide one. If the victim does this, the scammers have the power to extort money from the victim. Sometimes victims are ransomed, kidnapped, or murdered. According to a 1995 U.S. State Department report, over fifteen persons were murdered between 1992 and 1995 in Nigeria after following through on advance-fee frauds. In 1999 Norwegian millionaire Kjetil Moe was lured to South Africa by scammers and was murdered. George Makronalli was lured to South Africa and was killed in 2004. Variants There are many variations on the most common stories, and also many variations on the way the scam works. Some of the more commonly seen variants involve employment scams, lottery scams, online sales and rentals, and romance scams. Many scams involve online sales, such as those advertised on websites such as Craigslist and eBay, or property rental. This article cannot list every known and future type of advanced fee fraud or 419 scheme; only some major types are described. Additional examples may be available in the external links section at the end of this article. Employment scams This scam targets people who have posted their résumés on e.g. job sites. The scammer sends a letter with a falsified company logo. 
The job offer usually indicates exceptional salary and benefits, states that the victim needs a "work permit" to work in the country, and includes the address of a (fake) "government official" to contact. The "government official" then proceeds to fleece the victim by extracting fees from the unsuspecting user for the work permit and other supposed requirements. A variant of the job scam recruits freelancers seeking work, such as editing or translation, then requires some advance payment before assignments are offered. Many legitimate (or at least fully registered) companies work on a similar basis, using this method as their primary source of earnings. Some modelling and escort agencies tell applicants that they have a number of clients lined up, but that they require some sort of prior "registration fee", usually paid by an untraceable method, e.g. by Western Union transfer; once the fee is paid, the applicant is informed that the client has cancelled, and is not contacted again. The scammer contacts the victim to interest them in a "work-at-home" opportunity, or asks them to cash a check or money order that for some reason cannot be redeemed locally. In one cover story, the perpetrator of the scam wishes the victim to work as a "mystery shopper", evaluating the service provided by MoneyGram or Western Union locations within major retailers such as Wal-Mart. The scammer sends the victim a forged or stolen check or money order as described above; the victim deposits it—banks will often credit an account with the value of a check not obviously false—and sends the money to the scammer via wire transfer. Later the check is not honoured and the bank debits the victim's account. Schemes based solely on check cashing usually offer only a small part of the check's total amount, with the assurance that many more checks will follow; if the victim buys into the scam and cashes all the checks, the scammer can steal a great deal in a very short time.

Bogus job offers
More sophisticated scams advertise jobs with real companies and offer lucrative salaries and conditions, with the fraudsters pretending to be recruitment agents. A bogus telephone or online interview may take place, and after some time the applicant is informed that the job is theirs. To secure the job they are instructed to send money for their work visa or travel costs to the agent, or to a bogus travel agent who works on the scammer's behalf. No matter the variation, these scams always involve the job seeker sending the fraudsters or their agent money, credit card, or bank account details. A newer form of employment scam has arisen in which users are sent a bogus job offer, but are not asked to give financial information. Instead, their personal information is harvested during the application process and then sold to third parties for a profit, or used for identity theft. Another form of employment scam involves inviting people to a fake "interview" where they are told about the benefits of the company. The attendees are then made to attend a conference where a scammer will use elaborate manipulation techniques to convince them to purchase products, in a manner similar to the catalog merchant business model, as a hiring requisite. Quite often, the company lacks any physical catalog to help them sell the products (e.g. jewelry). When "given" the job, the individual is then asked to promote the scam job offer on their own. They are also made to work for the company unpaid as a form of "training".
Similar scams involve making alleged job candidates pay money upfront in person for training materials or services, with the claim that upon successful completion they will be offered a guaranteed job, which never materializes.

Lottery scam
The lottery scam involves fake notices of lottery wins, although the intended victim has not entered the lottery. The "winner" is usually asked to send sensitive information such as name, residential address, occupation/position, and lottery number to a free email account, which is often untraceable. In addition to harvesting this information, the scammer then notifies the victim that releasing the funds requires some small fee (insurance, registration, or shipping). Once the victim sends the fee, the scammer invents another fee. The fake check technique described above is also used: fake or stolen checks, representing part payment of the winnings, are sent, and then a fee, smaller than the amount received, is requested. The bank receiving the bad check eventually reclaims the funds from the victim. In 2004, a variant of the lottery scam appeared in the United States: a scammer phones a victim purporting to speak on behalf of the government about a grant they qualify for, subject to an advance fee of typically US$250. Typical lottery scams address the person as some variation of "Lucky Winner". This is a red flag: if someone had entered an actual lottery and won, the organization would know their name and would not simply address them as a lucky winner.

Online sales and rentals
Many scams involve the purchase of goods and services via classified advertisements, especially on sites like Craigslist, eBay, or Gumtree. These typically involve the scammer contacting the seller of a particular good or service via telephone or email expressing interest in the item. They will typically then send a fake check written for an amount greater than the asking price, asking the seller to send the difference to an alternate address, usually by money order or Western Union. A seller eager to sell a particular product may not wait for the check to clear, and when the bad check bounces, the funds wired have already been lost. Some scammers advertise phony academic conferences in exotic or international locations, complete with fake websites, scheduled agendas, and advertising experts in a particular field who will be presenting there. They offer to pay the airfare of the participants, but not the hotel accommodations, and extract money from the victims when they attempt to reserve accommodations in a non-existent hotel. Sometimes, an inexpensive rental property is advertised by a fake landlord, who is typically out of state (or the country) and asks for the rent and/or deposit to be wired to them. Or the con artist finds a property, pretends to be the owner, lists it online, and communicates with the would-be renter to obtain a cash deposit. The scammer may also be the renter, in which case they pretend to be a foreign student and contact a landlord seeking accommodation. They usually state they are not yet in the country and wish to secure accommodations prior to arriving. Once the terms are negotiated, a forged check is forwarded for a greater amount than negotiated, and the fraudster asks the landlord to wire some of the money back.
Pet scams
This is a variation of the online sales scam in which high-value, scarce pets are advertised as bait on online advertising websites with little real seller verification, such as Craigslist, Gumtree, and JunkMail. The pet may be advertised either as for sale or as up for adoption. Typically the pet is advertised on online advertising pages complete with photographs taken from various sources such as real advertisements, blogs, or wherever else an image can be stolen. Upon the potential victim contacting the scammer, the scammer responds by asking for details pertaining to the potential victim's circumstances and location, under the pretense of ensuring that the pet would have a suitable home. By determining the location of the victim, the scammer ensures he is far enough from the victim that the buyer cannot physically view the pet. If questioned about the location initially claimed in the advertisement, the scammer will claim that work circumstances forced him to relocate. This forces a situation whereby all communication is via email, telephone (normally untraceable numbers), or SMS. Once the victim decides to adopt or purchase the pet, a courier has to be used, which is in reality part of the scam. For an adopted pet, the victim is typically expected to pay some fee such as insurance, food, or shipping. Payment is via MoneyGram, Western Union, or the bank accounts of money mules, themselves victims who have been duped into work-from-home scams. Numerous problems are encountered in the courier phase of the scam: the crate is too small, and the victim has the option of either purchasing a crate with air conditioning or renting one, while also paying a deposit, typically called a caution or cautionary fee. The victim may also have to pay for insurance if such fees have not yet been paid. If the victim pays these fees, the pet may become sick and a veterinarian's assistance will be sought, for which the victim has to repay the courier. Additionally, the victim may be asked to pay for a health certificate needed to transport the pet, and for kennel fees during the recuperation period. The further the scam progresses, the more the fictitious fees resemble those of typical 419 scams. It is not uncommon to see customs or similar fees being claimed if such charges fit into the scam plot. Numerous scam websites may be used for this scam, which has been linked to classical 419 scams in that the fictitious couriers used are also used in other types of 419 scams, such as lotto scams.

Romance scam
One of the variants is the romance scam, a money-for-romance angle. The con artist approaches the victim on an online dating service, an instant messenger, or a social networking site. The scammer claims an interest in the victim and posts pictures of an attractive person. The scammer uses this communication to gain confidence, then asks for money. A very common example of romance scams is found in the fraudulent activities of the so-called Yahoo boys in Nigeria. The con artist may claim to be interested in meeting the victim but needs cash to book a plane, buy a bus ticket, rent a hotel room, pay for personal-travel costs such as gasoline or a vehicle rental, or to cover other expenses. In other cases, they claim they are trapped in a foreign country and need assistance to return, to escape imprisonment by corrupt local officials, to pay for medical expenses due to an illness contracted abroad, and so on.
The scammer may also use the confidence gained by the romance angle to introduce some variant of the original Nigerian Letter scheme, such as saying they need to get money or valuables out of the country and offering to share the wealth, making the request for help in leaving the country even more attractive to the victim. Scams often involve meeting someone on an online match-making service. The scammer initiates contact with their target, who is out of the area, and requests money for transportation fare. Scammers will typically ask for money to be sent via a money order or wire transfer, citing the need to travel or medical or business costs. When a victim travels to a meeting, it can have deadly consequences, as in the case of Jette Jacobs, 67, from Australia. Jacobs traveled to South Africa to supposedly marry her scammer, Jesse Orowo Omokoh, 28, after having sent more than $90,000 to him over a three-year period. Her body was discovered on February 9, 2013, under mysterious circumstances, two days after meeting up with Omokoh, who then fled back to Nigeria. After questioning in Nigeria, Omokoh was arrested. He was found to have had 32 fake online identities. He was charged only with fraud, never with murder, owing to the inability to prove he had a hand in Jacobs's death.

Mobile tower installation fraud
This variant of advance-fee fraud is widespread in India and Pakistan. The fraudster uses Internet classified websites and print media to lure the public into paying for the installation of a mobile phone tower on their property, with the promise of huge rental returns. The fraudster also creates fake websites to appear legitimate. The victims part with their money piecemeal, paying the fraudster for the Government Service Tax, government clearance charges, bank charges, transportation charges, survey fees, and so on. The Indian government issues public notices in the media to spread awareness among the public and warn them against mobile tower fraudsters.

Other scams
Other scams involve unclaimed property, also called "bona vacantia" in the United Kingdom. In England and Wales (other than the Duchy of Lancaster and the Duchy of Cornwall), this property is administered by the Bona Vacantia Division of the Treasury Solicitor's Department. Fraudulent emails and letters claiming to be from this department have been reported, informing the recipient they are the beneficiary of a legacy but requiring the payment of a fee before sending more information or releasing the money. In the United States, messages falsely claim to be from the National Association of Unclaimed Property Administrators (NAUPA), a real organization, but one that does not and cannot itself make payments. In one variant of 419 fraud, an alleged hitman writes to someone explaining he has been hired to kill them. He tells them he knows the allegations against them are false, and asks for money so the target can receive evidence of the person who ordered the hit. Another variant of advance-fee fraud is known as a pigeon drop. This is a confidence trick in which the mark, or "pigeon", is persuaded to give up a sum of money in order to secure the rights to a larger sum of money or a more valuable object. In reality, the scammers make off with the money and the mark is left with nothing.
In the process, the stranger (actually a confidence trickster) puts his money with the mark's money (in an envelope, briefcase, or bag) which the mark is then apparently entrusted with; it is actually switched for a bag full of newspaper or other worthless material. Through various theatrics, the mark is given the opportunity to leave with the money without the stranger realizing. In reality, the mark would be fleeing from his own money, which the con man still has (or has handed off to an accomplice). Some scammers will go after the victims of previous scams, a practice known as a reloading scam. For example, they may contact a victim saying they can track and apprehend the scammer and recover the money lost by the victim, for a price. Or they may say a fund has been set up by the Nigerian government to compensate victims of 419 fraud, and all that is required is proof of the loss, personal information, and a processing and handling fee. The recovery scammers obtain lists of victims by buying them from the original scammers.

Consequences
Estimates of the total losses due to the scam are uncertain and vary widely, since many people may be too embarrassed to admit that they were gullible enough to be scammed and so do not report the crime. A United States government report in 2006 indicated that Americans lost $198.4 million to Internet fraud that year, averaging a loss of $5,100 per incident. That same year, a report in the United Kingdom claimed that these scams cost the economy £150 million per year, with the average victim losing £31,000. In addition to the financial cost, many victims also suffer a severe emotional and psychological cost, such as losing their ability to trust people. One man from Cambridgeshire, UK, burnt himself to death with petrol after realizing that the $1.2 million "internet lottery" that he had won was actually a scam. In 2007, a Chinese student at the University of Nottingham killed herself after she discovered that she had fallen for a similar lottery scam. Other victims lose wealth and friends, become estranged from family members, deceive partners, get divorced, or commit criminal offenses in the process of either fulfilling their "obligations" to the scammers or obtaining more money. In 2008, an Oregon woman lost $400,000 to a Nigerian advance-fee fraud scam, after an email told her she had inherited money from her long-lost grandfather. Her curiosity was piqued because she actually had a grandfather with whom her family had lost touch, and whose initials matched those given in the email. She sent hundreds of thousands of dollars over a period of more than two years, despite her family, bank staff, and law enforcement officials all urging her to stop. The elderly are particularly susceptible to online scams such as this, as they typically come from a generation that was more trusting and are often too proud to report the fraud. They may also be concerned that relatives might see it as a sign of declining mental capacity, and they are afraid to lose their independence. Victims can be enticed to borrow or embezzle money to pay the advance fees, believing that they will shortly be paid a much larger sum and be able to refund what they misappropriated. Crimes committed by victims include credit-card fraud, check kiting, and embezzlement. San Diego-based businessman James Adler lost over $5 million in a Nigeria-based advance-fee scam.
While a court affirmed that various Nigerian government officials (including a governor of the Central Bank of Nigeria) were directly or indirectly involved, and that Nigerian government officials could be sued in U.S. courts under the "commercial activity" exception to the Foreign Sovereign Immunities Act, Adler was unable to get his money back due to the doctrine of unclean hands, because he had knowingly entered into a contract that was illegal. Some 419 scams involve even more serious crimes, such as kidnapping or murder. One such case, from 2008, involved Osamai Hitomi, a Japanese businessman who was lured to Johannesburg, South Africa, and kidnapped on September 26, 2008. The kidnappers took him to Alberton, south of Johannesburg, and demanded a $5 million ransom from his family. Seven people were ultimately arrested. In July 2001, Joseph Raca, a former mayor of Northampton, UK, was kidnapped by scammers in Johannesburg, South Africa, who demanded a ransom of £20,000. The captors released Raca after they became nervous. One 419 scam that ended in murder occurred in February 2003, when Jiří Pasovský, a 72-year-old scam victim from the Czech Republic, shot and killed 50-year-old Michael Lekara Wayid, an official at the Nigerian embassy in Prague, and injured another person, after the Nigerian Consul General explained he could not return the $600,000 that Pasovský had lost to a Nigerian scammer. The international nature of the crime, combined with the fact that many victims do not want to admit that they bought into an illegal activity, has made tracking down and apprehending these criminals difficult. Furthermore, the government of Nigeria has been slow to take action, leading some investigators to believe that some Nigerian government officials are involved in some of these scams. The Nigerian government's establishment of the Economic and Financial Crimes Commission (EFCC) in 2004 helped to some degree, although problems with corruption remain. A notable case which the EFCC pursued was that of Emmanuel Nwude, who was convicted of defrauding the director of a Brazilian bank, Banco Noroeste, of $242 million, which ultimately led to the bank's collapse. The Special Anti-Robbery Squad (SARS), a division of the Nigerian Police Force, has also lately been apprehending suspected fraud perpetrators. Despite the obstacles, there have been some recent successes in apprehending and prosecuting these criminals. In 2004, fifty-two suspects were arrested in Amsterdam in an extensive raid, after which almost no 419 emails were reported being sent by local internet service providers. In November 2004, Australian authorities apprehended Nick Marinellis of Sydney, the self-proclaimed head of Australian 419ers, who later boasted that he had "220 African brothers worldwide" and that he was "the Australian headquarters for those scams". In 2008, US authorities in Olympia, Washington, sentenced Edna Fiedler to two years in prison and five years of supervised probation for her involvement in a $1 million Nigerian check scam. She had an accomplice in Lagos, Nigeria, who shipped her up to $1.1 million worth of counterfeit checks and money orders with instructions on where to ship them.

In popular culture
Due to the increased use of 419 scams on the Internet, the scheme has been used as a plot device in many films, television shows, and books.
A song, "I Go Chop Your Dollar", performed by Nkem Owoh, also became internationally known as an anthem for 419 scammers using the phrases "419 is just a game, I am the winner, you are the loser". Other appearances in popular media include: The 2016 short story The Nigerian Prince - When The Scammer Becomes The Scammed by L. Toshua Parker follows the true story of a U.S. college student and hacker in 2000 who targeted Nigerian 419 scammers and stole millions back from them. In "A Thief in Ni-Moya", a 1981 novella from Robert Silverberg's Majipoor series, a young woman is swindled out of her savings under the pretense of fees required to inherit a large estate. The novel I Do Not Come To You By Chance by Nigerian author Adaobi Tricia Nwaubani explores the phenomenon. The 2006 direct-to-DVD kid flick EZ Money features an instance of this scam as its central premise. In the 2007 Futurama straight-to-DVD film Bender's Big Score, Professor Farnsworth falls for a lottery scam, giving away his personal details on the Internet after believing he has won the Spanish national lottery. Later, Nixon's Head falls for a "sweepstakes" letter by the same scammers, while Zoidberg is taken by an advance-fee fraud, thinking he is next of kin to a Nigerian Prince. In series 6, episode 3 of the BBC television series The Real Hustle, the hustlers demonstrated the 419 Scam to the hidden cameras in the "High Stakes" episodes of the show. In the HBO comedy series Flight of the Conchords episode "The New Cup", the band's manager, Murray, uses the band's emergency funds for what appears to be a 419 scam—an investment offer made by a Mr. Nigel Soladu, who had e-mailed him from Nigeria. However, it turns out that Nigel Soladu is a real Nigerian businessman and the investment offer is legitimate, although Murray notes that, despite Mr. Soladu having e-mailed many people for an investment, only he had taken him up on it. The band receives a 1000% profit, which they use to get bailed out of jail. The Residents included a song called "My Nigerian Friend" in their 2008 multimedia production The Bunny Boy. In the pilot episode, "The Nigerian Job", of Leverage, the group uses the reputation of the Nigerian Scam to con a deceitful businessman. The 2012 novel 419 by Will Ferguson is the story of a daughter looking for the persons she believes responsible for her father's death due to suicide following a 419 scam. A follow-up to earlier novels about con men and frauds (Generica and Spanish Fly), 419 won the 2012 Giller Prize, Canada's most distinguished literary award. In the video game Warframe, Nef Anyo ran an advance-fee fraud during Operation False Profit, where players attempted to reverse the scam and steal credits from him in order to bankrupt him and prevent his creation of a robotic army. MC Frontalot's song "Message No. 419" is about a 419 scam. See also Sakawa, fraud in Ghana with African traditional rituals Scambaiting Nigerian organized crime Pigeon drop References Further reading Daly, Samuel Fury Childs (2020). A History of the Republic of Biafra: Law, Crime, and the Nigerian Civil War. Cambridge: Cambridge University Press. External links US FBI Internet Crime Complaint Center Canadian Cybercrime Europol Cybercrime Spamming Social engineering (computer security) Non-sufficient funds Deception Fraud Property crimes Confidence tricks Crime in Nigeria
Network Crack Program Hacker Group
The Network Crack Program Hacker Group (NCPH Group) is a Chinese hacker group based out of Zigong in Sichuan Province. While the group first gained notoriety after hacking 40% of the hacker association websites in China, their attacks grew in sophistication and notoriety through 2006 and received international media attention in early 2007. iDefense linked the GinWui rootkit, developed by their leader Tan Dailin (Wicked Rose), with attacks on the US Department of Defense in May and June 2006. iDefense also linked the group with many of the 35 zero-day proof-of-concept codes used in attacks over a period of 90 days during the summer of 2006. They are also known for the remote-network-control programs they offer for download. Wicked Rose announced in a blog post that the group is paid for its work, but the group's sponsor is unknown.

Members
The group had four core members in 2006 (Wicked Rose, KuNgBim, Charles, and Rodag), with approximately 10 members in total. The group's current membership is unknown.

Wicked Rose
Wicked Rose, also known as Meigui (玫瑰), is the pseudonym of the Chinese hacker Tan Dailin. He was first noted as a hacker during the "patriotic" attacks of 2001. In 2005, Wicked Rose was contracted by the Sichuan Military Command Communication Department, which instructed him to participate in the Chengdu Military Command Network Attack/Defense Competition. After winning the local competition, he received a month of intense training in simulating attacks, designing hacking tools, and drafting network-infiltration strategies. He and his team represented the Sichuan Military Command in a competition with other provinces, which they went on to win. Wicked Rose is also credited with the development of the GinWui rootkit used in attacks on the US Department of Defense in 2006. As the group's leader, he is responsible for managing relationships with sponsors and paying NCPH members for their work. In April 2009, he was arrested after committing distributed denial of service attacks on Hackbase, HackerXFiles, and 3800hk, possibly for the purpose of blackmail. The organizations attacked collected information on the attack and turned it over to the public security department, which conducted an investigation and shut down his website. Hackbase reported that Wicked Rose was arrested and faced up to seven and a half years in prison.

Controversy
The group expelled the hacker WZT, a coding expert within the group, on 20 May 2006. Although the cause is unknown, the ejection came soon after the zero-day attacks were publicly disclosed.

Associates
A former NCPH member is associated with the Chinese hacker Li0n, the founder of the Honker Union of China (HUC). Wicked Rose credits the Chinese hacker WHG, also known as "fig" and an expert in malicious code, as one of the developers of the GinWui rootkit. Security firms researching Wicked Rose's activities have connected him with the Chinese hacker group Evil Security Team.

Activities
The group is known for the remote-network-control programs it offers for free on its website and for the exploitation of zero-day vulnerabilities in Microsoft Office suite products. After its founding in 2004, the group earned a reputation among hacking groups by hacking 40% of the hacker association websites in China.

GinWui Rootkit
Wicked Rose is the creator of the GinWui rootkit. His code and support posts are on Chinese hacker message boards, and were also available from the NCPH blog.
Security researchers discovered the rootkit on 18 May 2006, when attackers utilized it in attacks on the US and Japan. Attackers introduced it to the US in an attack against a Department of Defense entity, and used two different versions of the rootkit in attacks during May and June 2006. According to F-Secure, GinWui is "a fully featured backdoor with rootkit characteristics." It is distributed through Word documents. The backdoor GinWui creates gives the controlling hacker control over certain processes of the compromised computer, including the ability to create, read, write, delete, and search for files and directories; access and modify the Registry; manipulate services; start and kill processes; get information about the infected computer; and lock, restart, or shut down Windows, among other activities. According to Information Systems Security, the rootkit also obtains kernel-level access to "...trap several functions and modify information passed to the user."

Microsoft Office Exploits
iDefense links NCPH with many of the 35 zero-day and proof-of-concept codes used in attacks against Microsoft Office products over a period of 90 days during the summer of 2006, due to the use of malware developed by Wicked Rose that was not available in the public domain at the time. The group graduated from their early attacks exploiting only Microsoft Word, and by the end of 2006 they were also using PowerPoint and Excel in attacks. NCPH utilizes these exploits in spear phishing attacks.

Spear Phishing
On his blog, Wicked Rose discussed his preference for spear phishing attacks. First, during the collection phase, information is gathered from open sources or from the employee databases and mailboxes of a company's system. He may also analyze user IDs, which allows him to track and understand their activities. Finally, he conducts the attack using the information collected, and someone is likely to open the infected document. Spear phishing attacks attributed to NCPH increased in sophistication over time. While their phishing attacks at the beginning of 2006 targeted large numbers of employees, one attack attributed to the group later that year targeted a single individual in a US oil company using socially engineered emails and infected PowerPoint documents.

Sponsorship
After winning the military network attack/defense competition, the group obtained a sponsor who paid them 2000 RMB per month. iDefense believes their sponsor is likely the People's Liberation Army (PLA) but has no definitive evidence to support this claim. After the 2006 attacks took place, their sponsor increased their pay to 5000 RMB. The group's current sponsor is unknown.

Media coverage
Time reporter Simon Elegant interviewed eight members of the group in December 2007 as part of an article on Chinese government cyber operations against the US government. During the interview the members referred to each other using code names. Security firm iDefense has published reports on the group and their exploits and devoted a webinar to the group, their capabilities, and their relationships with other Chinese hackers. Scott Henderson, a Chinese linguistics and Chinese hacker expert, has also devoted several blog posts to the group and their ongoing activities.

Blogging
All four core members of the group have blogged about their activities at one point or another. The group's blog, NCPH.net, also offered network-infiltration programs for download.
Scott Henderson describes Wicked Rose's early blog posts as "the most revealing and damning thing I have ever seen a Chinese hacker write." After the interview with the Time reporter, Wicked Rose took down both the group's blog and his own. In July 2008 the group's blog returned, but with modified content. Wicked Rose also began blogging again, saying he had been busy during the time the blog was down, but that his new job allowed him more time to blog. Chinese officials removed both blogs after his arrest in April 2009. Rodag also blogs, but his most recent post is from August 2008; it concerns IE vulnerabilities that attackers can use to exploit a user's desktop.

See also
Wicked Rose
Honker Union

External links
Link to the iDefense Webcast (Internet Explorer only)
The Dark Visitor
Enemies At The Firewall
Computer graphics
Computer graphics deals with generating images with the aid of computers. Today, computer graphics is a core technology in digital photography, film, video games, cell phone and computer displays, and many specialized applications. A great deal of specialized hardware and software has been developed, with the displays of most devices being driven by computer graphics hardware. It is a vast and recently developed area of computer science. The phrase was coined in 1960 by computer graphics researchers Verne Hudson and William Fetter of Boeing. It is often abbreviated as CG, or typically in the context of film as computer generated imagery (CGI). The non-artistic aspects of computer graphics are the subject of computer science research. Some topics in computer graphics include user interface design, sprite graphics, rendering, ray tracing, geometry processing, computer animation, vector graphics, 3D modeling, shaders, GPU design, implicit surfaces, visualization, scientific computing, image processing, computational photography, scientific visualization, computational geometry and computer vision, among others. The overall methodology depends heavily on the underlying sciences of geometry, optics, physics, and perception. Computer graphics is responsible for displaying art and image data effectively and meaningfully to the consumer. It is also used for processing image data received from the physical world, such as photo and video content. Computer graphics development has had a significant impact on many types of media and has revolutionized animation, movies, advertising, and video games in general.

Overview
The term computer graphics has been used in a broad sense to describe "almost everything on computers that is not text or sound". Typically, the term computer graphics refers to several different things:
the representation and manipulation of image data by a computer
the various technologies used to create and manipulate images
methods for digitally synthesizing and manipulating visual content (see study of computer graphics)
Today, computer graphics is widespread. Such imagery is found in and on television, newspapers, weather reports, and in a variety of medical investigations and surgical procedures. A well-constructed graph can present complex statistics in a form that is easier to understand and interpret. In the media "such graphs are used to illustrate papers, reports, theses", and other presentation material. Many tools have been developed to visualize data. Computer-generated imagery can be categorized into several different types: two dimensional (2D), three dimensional (3D), and animated graphics. As technology has improved, 3D computer graphics have become more common, but 2D computer graphics are still widely used. Computer graphics has emerged as a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Over the past decade, other specialized fields have developed, like information visualization and scientific visualization, the latter more concerned with "the visualization of three dimensional phenomena (architectural, meteorological, medical, biological, etc.), where the emphasis is on realistic renderings of volumes, surfaces, illumination sources, and so forth, perhaps with a dynamic (time) component".

History
The precursor sciences to the development of modern computer graphics were the advances in electrical engineering, electronics, and television that took place during the first half of the twentieth century.
Screens could display art since the Lumière brothers' use of mattes to create special effects for the earliest films dating from 1895, but such displays were limited and not interactive. The first cathode ray tube, the Braun tube, was invented in 1897 – it in turn would permit the oscilloscope and the military control panel – the more direct precursors of the field, as they provided the first two-dimensional electronic displays that responded to programmatic or user input. Nevertheless, computer graphics remained relatively unknown as a discipline until the 1950s and the post-World War II period – during which time the discipline emerged from a combination of both pure university and laboratory academic research into more advanced computers and the United States military's further development of technologies like radar, advanced aviation, and rocketry developed during the war. New kinds of displays were needed to process the wealth of information resulting from such projects, leading to the development of computer graphics as a discipline.

1950s
Early projects like the Whirlwind and SAGE projects introduced the CRT as a viable display and interaction interface and introduced the light pen as an input device. Douglas T. Ross of the Whirlwind SAGE system performed a personal experiment in which he wrote a small program that captured the movement of his finger and displayed its vector (his traced name) on a display scope. One of the first interactive video games to feature recognizable, interactive graphics – Tennis for Two – was created for an oscilloscope by William Higinbotham to entertain visitors in 1958 at Brookhaven National Laboratory and simulated a tennis match. In 1959, Douglas T. Ross innovated again while working at MIT on transforming mathematical statements into computer-generated 3D machine tool vectors, by taking the opportunity to create a display scope image of a Disney cartoon character. Electronics pioneer Hewlett-Packard went public in 1957 after incorporating the decade prior, and established strong ties with Stanford University through its founders, who were alumni. This began the decades-long transformation of the southern San Francisco Bay Area into the world's leading computer technology hub – now known as Silicon Valley. The field of computer graphics developed with the emergence of computer graphics hardware. Further advances in computing led to greater advancements in interactive computer graphics. In 1959, the TX-2 computer was developed at MIT's Lincoln Laboratory. The TX-2 integrated a number of new man-machine interfaces. A light pen could be used to draw sketches on the computer using Ivan Sutherland's revolutionary Sketchpad software. Using a light pen, Sketchpad allowed one to draw simple shapes on the computer screen, save them, and even recall them later. The light pen itself had a small photoelectric cell in its tip. This cell emitted an electronic pulse whenever it was placed in front of a computer screen and the screen's electron gun fired directly at it. By simply timing the electronic pulse with the current location of the electron gun, it was easy to pinpoint exactly where the pen was on the screen at any given moment. Once that was determined, the computer could then draw a cursor at that location. Sutherland seemed to find the perfect solution for many of the graphics problems he faced. Even today, many standards of computer graphics interfaces got their start with this early Sketchpad program.
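The pen-locating principle described above reduces to simple arithmetic on the raster timing: the beam sweeps the screen at a known rate, so the instant the photocell fires identifies the scanline and the position along it. A minimal sketch in Python, with illustrative, assumed timing constants (these are hypothetical values, not those of the TX-2 or any real display):

    # A minimal sketch of light-pen position recovery from pulse timing.
    # The timing constants below are illustrative assumptions only.
    LINE_PERIOD_US = 64.0     # assumed time for the beam to scan one line
    PIXELS_PER_LINE = 512     # assumed horizontal resolution
    PIXEL_PERIOD_US = LINE_PERIOD_US / PIXELS_PER_LINE

    def pen_position(pulse_time_us, frame_start_us):
        """Map the photocell pulse time to the beam's (x, y) screen position."""
        elapsed = pulse_time_us - frame_start_us        # time since the frame began
        line, offset = divmod(elapsed, LINE_PERIOD_US)  # full lines scanned, remainder
        return int(offset / PIXEL_PERIOD_US), int(line)

    # Example: a pulse 1000 microseconds into the frame
    x, y = pen_position(1000.0, 0.0)  # -> (320, 15) with the assumed timings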
One example of Sketchpad's legacy is in drawing constraints. If one wants to draw a square, for example, one does not have to worry about drawing four lines perfectly to form the edges of the box. One can simply specify that one wants to draw a box, and then specify the location and size of the box. The software will then construct a perfect box, with the right dimensions and at the right location. Another example is that Sutherland's software modeled objects – not just a picture of objects. In other words, with a model of a car, one could change the size of the tires without affecting the rest of the car. It could stretch the body of the car without deforming the tires.

1960s
The phrase "computer graphics" itself was coined in 1960 by William Fetter, a graphic designer for Boeing, although Fetter himself said that the term was actually given to him by Verne Hudson of the Wichita Division of Boeing. In 1961 another student at MIT, Steve Russell, created another important title in the history of video games, Spacewar! Written for the DEC PDP-1, Spacewar was an instant success, and copies started flowing to other PDP-1 owners; eventually DEC got a copy. The engineers at DEC used it as a diagnostic program on every new PDP-1 before shipping it. The sales force picked up on this quickly enough and, when installing new units, would run the "world's first video game" for their new customers. (Higinbotham's Tennis for Two had beaten Spacewar by almost three years, but it was almost unknown outside of a research or academic setting.) At around the same time (1961–1962), at the University of Cambridge, Elizabeth Waldram wrote code to display radio-astronomy maps on a cathode ray tube. E. E. Zajac, a scientist at Bell Telephone Laboratories (BTL), created a film called "Simulation of a Two-Gyro Gravity Attitude Control System" in 1963. In this computer-generated film, Zajac showed how the attitude of a satellite could be altered as it orbits the Earth. He created the animation on an IBM 7090 mainframe computer. Also at BTL, Ken Knowlton, Frank Sinden, Ruth A. Weiss and Michael Noll started working in the computer graphics field. Sinden created a film called Force, Mass and Motion illustrating Newton's laws of motion in operation. Around the same time, other scientists were creating computer graphics to illustrate their research. At Lawrence Radiation Laboratory, Nelson Max created the films Flow of a Viscous Fluid and Propagation of Shock Waves in a Solid Form. Boeing Aircraft created a film called Vibration of an Aircraft. Also sometime in the early 1960s, automobiles would provide a boost through the early work of Pierre Bézier at Renault, who used Paul de Casteljau's curves – now called Bézier curves after Bézier's work in the field – to develop 3D modeling techniques for Renault car bodies. These curves would form the foundation for much curve-modeling work in the field, as curves – unlike polygons – are mathematically complex entities to draw and model well. It was not long before major corporations started taking an interest in computer graphics. TRW, Lockheed-Georgia, General Electric and Sperry Rand are among the many companies that were getting started in computer graphics by the mid-1960s. IBM was quick to respond to this interest by releasing the IBM 2250 graphics terminal, the first commercially available graphics computer.
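De Casteljau's construction, on which the Bézier curves discussed above rest, evaluates a point on the curve by repeated linear interpolation of the control points. A minimal generic sketch in Python (an illustration of the algorithm, not Renault's historical code):

    def lerp(p, q, t):
        """Linear interpolation between 2D points p and q."""
        return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

    def de_casteljau(control_points, t):
        """Evaluate a Bezier curve at parameter t by repeated interpolation."""
        pts = list(control_points)
        while len(pts) > 1:
            # Each pass replaces n points with n-1 interpolated points.
            pts = [lerp(p, q, t) for p, q in zip(pts, pts[1:])]
        return pts[0]

    # Example: the midpoint of a quadratic Bezier arc
    print(de_casteljau([(0, 0), (1, 2), (2, 0)], 0.5))  # -> (1.0, 1.0)

The same construction works for any number of control points, which is one reason it became the foundation of so much later curve-modeling work.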
Ralph Baer, a supervising engineer at Sanders Associates, came up with a home video game in 1966 that was later licensed to Magnavox and called the Odyssey. While very simplistic, and requiring fairly inexpensive electronic parts, it allowed the player to move points of light around on a screen. It was the first consumer computer graphics product. David C. Evans was director of engineering at Bendix Corporation's computer division from 1953 to 1962, after which he worked for the next five years as a visiting professor at Berkeley. There he continued his interest in computers and how they interfaced with people. In 1966, the University of Utah recruited Evans to form a computer science program, and computer graphics quickly became his primary interest. This new department would become the world's primary research center for computer graphics through the 1970s. Also in 1966, Ivan Sutherland continued to innovate at MIT when he invented the first computer-controlled head-mounted display (HMD). It displayed two separate wireframe images, one for each eye. This allowed the viewer to see the computer scene in stereoscopic 3D. The heavy hardware required for supporting the display and tracker was called the Sword of Damocles because of the potential danger if it were to fall upon the wearer. After receiving his Ph.D. from MIT, Sutherland became Director of Information Processing at ARPA (Advanced Research Projects Agency), and later became a professor at Harvard. In 1967 Sutherland was recruited by Evans to join the computer science program at the University of Utah – a development which would turn that department into one of the most important research centers in graphics for nearly a decade thereafter, eventually producing some of the most important pioneers in the field. There Sutherland perfected his HMD; twenty years later, NASA would re-discover his techniques in their virtual reality research. At Utah, Sutherland and Evans were highly sought after as consultants by large companies, but they were frustrated at the lack of graphics hardware available at the time, so they started formulating a plan to start their own company. In 1968, Dave Evans and Ivan Sutherland founded the first computer graphics hardware company, Evans & Sutherland. While Sutherland originally wanted the company to be located in Cambridge, Massachusetts, Salt Lake City was instead chosen due to its proximity to the professors' research group at the University of Utah. Also in 1968, Arthur Appel described the first ray casting algorithm, the first of a class of ray tracing-based rendering algorithms that have since become fundamental in achieving photorealism in graphics by modeling the paths that rays of light take from a light source, to surfaces in a scene, and into the camera. In 1969, the ACM initiated a Special Interest Group on Graphics (SIGGRAPH), which organizes conferences, graphics standards, and publications within the field of computer graphics. By 1973, the first annual SIGGRAPH conference was held, which has become one of the focuses of the organization. SIGGRAPH has grown in size and importance as the field of computer graphics has expanded over time.
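At its core, the ray casting idea Appel described above is a visibility query: send a ray from the eye through each pixel and find the nearest surface it strikes. The following is a minimal sketch for a single sphere, offered as a generic illustration of the technique rather than Appel's original formulation:

    import math

    def hit_sphere(origin, direction, center, radius):
        """Distance along a unit-length ray to a sphere, or None if missed."""
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2.0 * sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - radius * radius
        disc = b * b - 4.0 * c        # quadratic discriminant (a == 1 for unit rays)
        if disc < 0:
            return None               # the ray misses the sphere
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0 else None   # nearest intersection in front of the eye

    # One ray per pixel: shade by hit/miss to print a tiny silhouette image.
    eye = (0.0, 0.0, 0.0)
    for y in range(-2, 3):
        row = ""
        for x in range(-2, 3):
            d = (x * 0.2, y * 0.2, 1.0)
            n = math.sqrt(sum(v * v for v in d))
            d = tuple(v / n for v in d)   # normalize the ray direction
            row += "#" if hit_sphere(eye, d, (0, 0, 5), 1.0) else "."
        print(row)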
1970s
Subsequently, a number of breakthroughs in the field – particularly important early breakthroughs in the transformation of graphics from utilitarian to realistic – occurred at the University of Utah in the 1970s, which had hired Ivan Sutherland. He was paired with David C. Evans to teach an advanced computer graphics class, which contributed a great deal of founding research to the field and taught several students who would go on to found several of the industry's most important companies – namely Pixar, Silicon Graphics, and Adobe Systems. Tom Stockham led the image processing group at UU, which worked closely with the computer graphics lab. One of these students was Edwin Catmull. Catmull had just come from The Boeing Company and had been working on his degree in physics. Growing up on Disney, Catmull loved animation, yet quickly discovered that he did not have the talent for drawing. Now Catmull (along with many others) saw computers as the natural progression of animation, and they wanted to be part of the revolution. The first computer animation that Catmull saw was his own. He created an animation of his hand opening and closing. He also pioneered texture mapping to paint textures on three-dimensional models in 1974, now considered one of the fundamental techniques in 3D modeling. It became one of his goals to produce a feature-length motion picture using computer graphics – a goal he would achieve two decades later after his founding role in Pixar. In the same class, Fred Parke created an animation of his wife's face. The two animations were included in the 1976 feature film Futureworld. As the UU computer graphics laboratory was attracting people from all over, John Warnock was another of those early pioneers; he later founded Adobe Systems and created a revolution in the publishing world with his PostScript page description language, and Adobe would go on later to create the industry-standard photo editing software in Adobe Photoshop and a prominent movie industry special effects program in Adobe After Effects. James Clark was also there; he later founded Silicon Graphics, a maker of advanced rendering systems that would dominate the field of high-end graphics until the early 1990s. A major advance in 3D computer graphics was created at UU by these early pioneers – hidden surface determination. In order to draw a representation of a 3D object on the screen, the computer must determine which surfaces are "behind" the object from the viewer's perspective, and thus should be "hidden" when the computer creates (or renders) the image. The 3D Core Graphics System (or Core) was the first graphical standard to be developed. A group of 25 experts of the ACM Special Interest Group SIGGRAPH developed this "conceptual framework". The specifications were published in 1977, and it became a foundation for many future developments in the field. Also in the 1970s, Henri Gouraud, Jim Blinn and Bui Tuong Phong contributed to the foundations of shading in CGI via the development of the Gouraud shading and Blinn–Phong shading models, allowing graphics to move beyond a "flat" look to a look more accurately portraying depth. Jim Blinn also innovated further in 1978 by introducing bump mapping, a technique for simulating uneven surfaces and the predecessor to many more advanced kinds of mapping used today. The modern video game arcade as it is known today was born in the 1970s, with the first arcade games using real-time 2D sprite graphics. Pong in 1972 was one of the first hit arcade cabinet games. Speed Race in 1974 featured sprites moving along a vertically scrolling road.
Gun Fight in 1975 featured human-looking animated characters, while Space Invaders in 1978 featured a large number of animated figures on screen; both used a specialized barrel shifter circuit made from discrete chips to help their Intel 8080 microprocessor animate their framebuffer graphics.

1980s
The 1980s began to see the modernization and commercialization of computer graphics. As the home computer proliferated, a subject which had previously been an academics-only discipline was adopted by a much larger audience, and the number of computer graphics developers increased significantly. In the early 1980s, metal–oxide–semiconductor (MOS) very-large-scale integration (VLSI) technology led to the availability of 16-bit central processing unit (CPU) microprocessors and the first graphics processing unit (GPU) chips, which began to revolutionize computer graphics, enabling high-resolution graphics for computer graphics terminals as well as personal computer (PC) systems. NEC's µPD7220 was the first GPU, fabricated on a fully integrated NMOS VLSI chip. It supported up to 1024x1024 resolution, and laid the foundations for the emerging PC graphics market. It was used in a number of graphics cards, and was licensed for clones such as the Intel 82720, the first of Intel's graphics processing units. MOS memory also became cheaper in the early 1980s, enabling the development of affordable framebuffer memory, notably video RAM (VRAM), introduced by Texas Instruments (TI) in the mid-1980s. In 1984, Hitachi released the ARTC HD63484, the first complementary MOS (CMOS) GPU. It was capable of displaying high-resolution graphics in color mode and up to 4K resolution in monochrome mode, and it was used in a number of graphics cards and terminals during the late 1980s. In 1986, TI introduced the TMS34010, the first fully programmable MOS graphics processor. Computer graphics terminals during this decade became increasingly intelligent, semi-standalone and standalone workstations. Graphics and application processing were increasingly migrated to the intelligence in the workstation, rather than continuing to rely on central mainframe and mini-computers. Typical of the early move to high-resolution computer graphics intelligent workstations for the computer-aided engineering market were the Orca 1000, 2000 and 3000 workstations, developed by Orcatech of Ottawa, a spin-off from Bell-Northern Research, and led by David Pearson, an early workstation pioneer. The Orca 3000 was based on the 16-bit Motorola 68000 microprocessor and AMD bit-slice processors, and had Unix as its operating system. It was targeted squarely at the sophisticated end of the design engineering sector. Artists and graphic designers began to see the personal computer, particularly the Commodore Amiga and Macintosh, as a serious design tool, one that could save time and draw more accurately than other methods. The Macintosh remains a highly popular tool for computer graphics among graphic design studios and businesses. Modern computers, dating from the 1980s, often use graphical user interfaces (GUI) to present data and information with symbols, icons and pictures, rather than text. Graphics are one of the five key elements of multimedia technology. In the field of realistic rendering, Japan's Osaka University developed the LINKS-1 Computer Graphics System, a supercomputer that used up to 257 Zilog Z8001 microprocessors, in 1982, for the purpose of rendering realistic 3D computer graphics.
According to the Information Processing Society of Japan: "The core of 3D image rendering is calculating the luminance of each pixel making up a rendered surface from the given viewpoint, light source, and object position. The LINKS-1 system was developed to realize an image rendering methodology in which each pixel could be parallel processed independently using ray tracing. By developing a new software methodology specifically for high-speed image rendering, LINKS-1 was able to rapidly render highly realistic images. It was used to create the world's first 3D planetarium-like video of the entire heavens that was made completely with computer graphics. The video was presented at the Fujitsu pavilion at the 1985 International Exposition in Tsukuba." The LINKS-1 was the world's most powerful computer as of 1984. Also in the field of realistic rendering, the general rendering equation of David Immel and James Kajiya was developed in 1986 – an important step towards implementing global illumination, which is necessary to pursue photorealism in computer graphics. The continuing popularity of Star Wars and other science fiction franchises was relevant in cinematic CGI at this time, as Lucasfilm and Industrial Light & Magic became known as the "go-to" house by many other studios for top-notch computer graphics in film. Important advances in chroma keying ("bluescreening", etc.) were made for the later films of the original trilogy. Two other pieces of video would also outlast the era as historically relevant: Dire Straits' iconic, near-fully-CGI video for their song "Money for Nothing" in 1985, which popularized CGI among music fans of that era, and a scene from Young Sherlock Holmes the same year featuring the first fully CGI character in a feature movie (an animated stained-glass knight). In 1988, the first shaders – small programs designed specifically to do shading as a separate algorithm – were developed by Pixar, which had already spun off from Industrial Light & Magic as a separate entity – though the public would not see the results of such technological progress until the next decade. In the late 1980s, Silicon Graphics (SGI) computers were used to create some of the first fully computer-generated short films at Pixar, and Silicon Graphics machines were considered a high-water mark for the field during the decade. The 1980s is also called the golden era of video games; millions-selling systems from Atari, Nintendo and Sega, among other companies, exposed computer graphics for the first time to a new, young, and impressionable audience – as did MS-DOS-based personal computers, Apple IIs, Macs, and Amigas, all of which also allowed users to program their own games if skilled enough. For the arcades, advances were made in commercial, real-time 3D graphics. In 1988, the first dedicated real-time 3D graphics boards were introduced for arcades, with the Namco System 21 and Taito Air System. On the professional side, Evans & Sutherland and SGI developed 3D raster graphics hardware that directly influenced the later single-chip graphics processing unit (GPU), a technology where a separate and very powerful chip is used in parallel processing with a CPU to optimize graphics. The decade also saw computer graphics applied to many additional professional markets, including location-based entertainment and education with the E&S Digistar, vehicle design, vehicle simulation, and chemistry.
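In the form most commonly written today, following Kajiya's 1986 paper, the rendering equation mentioned above states that the light leaving a surface point is the light it emits plus the light it reflects:

\[
L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i
\]

where \(L_o\) is the outgoing radiance at point \(x\) in direction \(\omega_o\), \(L_e\) is emitted radiance, \(f_r\) is the bidirectional reflectance distribution function (BRDF), \(L_i\) is incoming radiance from direction \(\omega_i\), \(n\) is the surface normal, and the integral runs over the hemisphere \(\Omega\) above the surface. Global illumination algorithms are, in essence, strategies for approximating this integral.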
1990s
The 1990s' overwhelming note was the emergence of 3D modeling on a mass scale and an impressive rise in the quality of CGI generally. Home computers became able to take on rendering tasks that previously had been limited to workstations costing thousands of dollars; as 3D modelers became available for home systems, the popularity of Silicon Graphics workstations declined, and powerful Microsoft Windows and Apple Macintosh machines running Autodesk products like 3D Studio or other home rendering software ascended in importance. By the end of the decade, the GPU would begin its rise to the prominence it still enjoys today. The field began to see the first rendered graphics that could truly pass as photorealistic to the untrained eye (though they could not yet do so to a trained CGI artist), and 3D graphics became far more popular in gaming, multimedia, and animation. At the end of the 1980s and the beginning of the nineties, the very first computer graphics TV series were created in France: La Vie des bêtes by studio Mac Guff Ligne (1988), Les Fables Géométriques (1989–1991) by studio Fantôme, and Quarxs, the first HDTV computer graphics series, by Maurice Benayoun and François Schuiten (studio Z-A production, 1990–1993). In film, Pixar began its serious commercial rise in this era under Edwin Catmull, with its first major film release, in 1995 – Toy Story – a critical and commercial success of nine-figure magnitude. The studio to invent the programmable shader would go on to have many animated hits, and its work on prerendered video animation is still considered an industry leader and research trail breaker. In video games, in 1992, Virtua Racing, running on the Sega Model 1 arcade system board, laid the foundations for fully 3D racing games and popularized real-time 3D polygonal graphics among a wider audience in the video game industry. The Sega Model 2 in 1993 and Sega Model 3 in 1996 subsequently pushed the boundaries of commercial, real-time 3D graphics. Back on the PC, Wolfenstein 3D, Doom and Quake, three of the first massively popular 3D first-person shooter games, were released by id Software to critical and popular acclaim during this decade, using a rendering engine innovated primarily by John Carmack. The Sony PlayStation, Sega Saturn, and Nintendo 64, among other consoles, sold in the millions and popularized 3D graphics for home gamers. Certain late-1990s first-generation 3D titles became seen as influential in popularizing 3D graphics among console users, such as the platform games Super Mario 64 and The Legend of Zelda: Ocarina of Time, and early 3D fighting games like Virtua Fighter, Battle Arena Toshinden, and Tekken. Technology and algorithms for rendering continued to improve greatly. In 1996, Krishnamurthy and Levoy invented normal mapping – an improvement on Jim Blinn's bump mapping. 1999 saw Nvidia release the seminal GeForce 256, the first home video card billed as a graphics processing unit or GPU, which in its own words contained "integrated transform, lighting, triangle setup/clipping, and rendering engines". By the end of the decade, computers adopted common frameworks for graphics processing such as DirectX and OpenGL. Since then, computer graphics have only become more detailed and realistic, due to more powerful graphics hardware and 3D modeling software. AMD also became a leading developer of graphics boards in this decade, creating a "duopoly" in the field which exists to this day.
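Normal mapping, mentioned above, stores a perturbed unit normal per texel and substitutes it for the geometric normal at shading time, so a flat polygon appears finely detailed under lighting. A simplified sketch of the decode-and-shade step (illustrative only; production engines transform the fetched normal through a full tangent-space basis):

    import math

    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def decode_normal(rgb):
        """Map an 8-bit normal-map texel from [0, 255] to a unit vector in [-1, 1]."""
        return normalize(tuple(c / 127.5 - 1.0 for c in rgb))

    def shade(texel_rgb, light_dir):
        """Lambertian intensity using the normal fetched from the map."""
        n = decode_normal(texel_rgb)
        l = normalize(light_dir)
        return max(0.0, sum(a * b for a, b in zip(n, l)))  # clamp backfacing light

    # A flat-facing texel (128, 128, 255) decodes to roughly (0, 0, 1):
    print(shade((128, 128, 255), (0.0, 0.0, 1.0)))  # -> ~1.0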
Video games and CGI cinema had spread the reach of computer graphics to the mainstream by the late 1990s and continued to do so at an accelerated pace in the 2000s. CGI was also adopted en masse for television advertisements in the late 1990s and 2000s, and so became familiar to a massive audience.

The continued rise and increasing sophistication of the graphics processing unit were crucial to this decade, and 3D rendering capabilities became a standard feature as 3D-graphics GPUs came to be considered a necessity for desktop computer makers to offer. The Nvidia GeForce line of graphics cards dominated the market in the early decade, with occasional significant competing presence from ATI. As the decade progressed, even low-end machines usually contained a 3D-capable GPU of some kind, as Nvidia and AMD both introduced low-priced chipsets and continued to dominate the market. Shaders, which had been introduced in the 1980s to perform specialized processing on the GPU, would by the end of the decade become supported on most consumer hardware, speeding up graphics considerably and allowing for greatly improved texture and shading in computer graphics via the widespread adoption of normal mapping, bump mapping, and a variety of other techniques allowing the simulation of a great amount of detail.

Computer graphics used in films and video games gradually began to be realistic to the point of entering the uncanny valley. CGI movies proliferated, with traditional animated cartoon films like Ice Age and Madagascar as well as numerous Pixar offerings like Finding Nemo dominating the box office in this field. Final Fantasy: The Spirits Within, released in 2001, was the first fully computer-generated feature film to use photorealistic CGI characters and be fully made with motion capture. The film was not a box-office success, however. Some commentators have suggested this may be partly because the lead CGI characters had facial features which fell into the "uncanny valley". Other animated films like The Polar Express drew attention at this time as well. Star Wars also resurfaced with its prequel trilogy, and its effects continued to set a bar for CGI in film.

In video games, the Sony PlayStation 2 and 3, the Microsoft Xbox line of consoles, and offerings from Nintendo such as the GameCube maintained a large following, as did the Windows PC. Marquee CGI-heavy titles like the series of Grand Theft Auto, Assassin's Creed, Final Fantasy, BioShock, Kingdom Hearts, Mirror's Edge and dozens of others continued to approach photorealism, grow the video game industry, and impress, until that industry's revenues became comparable to those of movies. Microsoft made a decision to expose DirectX more easily to the independent developer world with the XNA program, but it was not a success. DirectX itself remained a commercial success, however. OpenGL continued to mature as well, and it and DirectX improved greatly; the second-generation shader languages HLSL and GLSL began to be popular in this decade.

In scientific computing, the GPGPU technique for passing large amounts of data bidirectionally between a GPU and CPU was invented, speeding up analysis in many kinds of bioinformatics and molecular biology experiments. The technique has also been used for Bitcoin mining and has applications in computer vision.
2010s
In the 2010s, CGI has been nearly ubiquitous in video, pre-rendered graphics are nearly scientifically photorealistic, and real-time graphics on a suitably high-end system may simulate photorealism to the untrained eye.

Texture mapping has matured into a multistage process with many layers; generally, it is not uncommon to implement texture mapping, bump mapping or normal mapping, isosurfaces, lighting maps including specular highlights and reflection techniques, and shadow volumes in one rendering engine using shaders, which are maturing considerably. Shaders are now very nearly a necessity for advanced work in the field, providing considerable complexity in manipulating pixels, vertices, and textures on a per-element basis, and countless possible effects. Their shader languages, HLSL and GLSL, are active fields of research and development. Physically based rendering (PBR), which implements many maps and performs advanced calculation to simulate real optic light flow, is an active research area as well, along with advanced areas like ambient occlusion, subsurface scattering, Rayleigh scattering, photon mapping, and many others. Experiments into the processing power required to provide graphics in real time at ultra-high-resolution modes like 4K Ultra HD are beginning, though such modes remain beyond the reach of all but the highest-end hardware.

In cinema, most animated movies are CGI now; a great many animated CGI films are made per year, but few, if any, attempt photorealism, due to continuing fears of the uncanny valley. Most are 3D cartoons.

In video games, the Microsoft Xbox One, Sony PlayStation 4, and Nintendo Switch currently dominate the home space and are all capable of highly advanced 3D graphics; the Windows PC is still one of the most active gaming platforms as well.

Image types

Two-dimensional
2D computer graphics are the computer-based generation of digital images – mostly from two-dimensional models, such as digital images, and by techniques specific to them. 2D computer graphics are mainly used in applications that were originally developed upon traditional printing and drawing technologies, such as typography. In those applications, the two-dimensional image is not just a representation of a real-world object but an independent artifact with added semantic value; two-dimensional models are therefore preferred, because they give more direct control of the image than 3D computer graphics, whose approach is more akin to photography than to typography.

Pixel art
A large form of digital art, pixel art is created through the use of raster graphics software, where images are edited at the pixel level. Graphics in most old (or relatively limited) computer and video games, graphing calculator games, and many mobile phone games are mostly pixel art.

Sprite graphics
A sprite is a two-dimensional image or animation that is integrated into a larger scene. Initially including just graphical objects handled separately from the memory bitmap of a video display, this now includes various manners of graphical overlays. Originally, sprites were a method of integrating unrelated bitmaps so that they appeared to be part of the normal bitmap on a screen, such as creating an animated character that can be moved on a screen without altering the data defining the overall screen. Such sprites can be created by either electronic circuitry or software.
In circuitry, a hardware sprite is a hardware construct that employs custom DMA channels to integrate visual elements with the main screen by superimposing two discrete video sources. Software can simulate this through specialized rendering methods.

Vector graphics
Vector graphics formats are complementary to raster graphics. Raster graphics is the representation of images as an array of pixels and is typically used for the representation of photographic images. Vector graphics consists of encoding information about the shapes and colors that comprise the image, which can allow for more flexibility in rendering. There are instances when working with vector tools and formats is best practice, and instances when working with raster tools and formats is best practice. There are also times when both formats come together. An understanding of the advantages and limitations of each technology, and of the relationship between them, is most likely to result in efficient and effective use of tools.

Three-dimensional
3D graphics, compared to 2D graphics, are graphics that use a three-dimensional representation of geometric data, stored in the computer for the purposes of performing calculations and rendering images, whether for later display or for real-time viewing. Despite these differences, 3D computer graphics rely on many of the same algorithms as 2D computer graphics, including raster graphics (as in 2D) in the final rendered display. In computer graphics software, the distinction between 2D and 3D is occasionally blurred; 2D applications may use 3D techniques to achieve effects such as lighting, and primarily 3D applications may use 2D rendering techniques.

3D computer graphics are often equated with 3D models. Apart from the rendering, the model is contained within the graphical data file. However, there are differences: a 3D model is the representation of any 3D object, and until visually displayed, a model is not graphic. Due to 3D printing, 3D models are not confined to virtual space. 3D rendering is how a model can be displayed; a model can also be used in non-graphical computer simulations and calculations.

Computer animation
Computer animation is the art of creating moving images via the use of computers. It is a subfield of computer graphics and animation. Increasingly it is created by means of 3D computer graphics, though 2D computer graphics are still widely used for stylistic, low-bandwidth, and faster real-time rendering needs. Sometimes the target of the animation is the computer itself, but sometimes the target is another medium, such as film. It is also referred to as CGI (computer-generated imagery or computer-generated imaging), especially when used in films.

Virtual entities may contain and be controlled by assorted attributes, such as transform values (location, orientation, and scale) stored in an object's transformation matrix. Animation is the change of an attribute over time. Multiple methods of achieving animation exist; the rudimentary form is based on the creation and editing of keyframes, each storing a value at a given time, per attribute to be animated. The 2D/3D graphics software interpolates between keyframes, creating an editable curve of a value mapped over time, which results in animation.
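The keyframe mechanism can be made concrete with a minimal sketch in Python (illustrative only; the Channel class and all names here are hypothetical, not taken from any particular animation package). It stores (time, value) keyframes for a single attribute and linearly interpolates between them when sampled, a simple stand-in for the editable spline curves that production software offers:

    # Minimal keyframe-interpolation sketch (illustrative only).
    from bisect import bisect_right

    class Channel:
        """An animation curve for one attribute, e.g. an object's x position."""
        def __init__(self, keyframes):
            self.keys = sorted(keyframes)    # list of (time, value) pairs

        def sample(self, t):
            times = [k[0] for k in self.keys]
            if t <= times[0]:                # clamp before the first key
                return self.keys[0][1]
            if t >= times[-1]:               # clamp after the last key
                return self.keys[-1][1]
            i = bisect_right(times, t)
            (t0, v0), (t1, v1) = self.keys[i - 1], self.keys[i]
            alpha = (t - t0) / (t1 - t0)     # fraction of the way to the next key
            return v0 + alpha * (v1 - v0)    # linear interpolation

    # Animate an x position from 0 to 10 and back over two seconds,
    # then sample it at 24 frames per second, as a film renderer would.
    x_pos = Channel([(0.0, 0.0), (1.0, 10.0), (2.0, 0.0)])
    frames = [x_pos.sample(f / 24.0) for f in range(49)]
    print(frames[0], frames[24], frames[48])   # 0.0 10.0 0.0

Sampling the curve once per frame, as in the last lines, is the evaluation step performed for every animated attribute; real packages replace the linear blend with spline interpolation but keep the same keyframes-per-attribute structure.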
Other methods of animation include procedural and expression-based techniques: the former consolidates related elements of animated entities into sets of attributes, useful for creating particle effects and crowd simulations; the latter allows an evaluated result returned from a user-defined logical expression, coupled with mathematics, to automate animation in a predictable way (convenient for controlling bone behavior beyond what a hierarchy offers in a skeletal system set-up).

To create the illusion of movement, an image is displayed on the computer screen and then quickly replaced by a new image that is similar to the previous image but shifted slightly. This technique is identical to the way the illusion of movement is achieved in television and motion pictures.

Concepts and principles
Images are typically created by devices such as cameras, mirrors, lenses, telescopes, microscopes, etc. Digital images include both vector images and raster images, but raster images are more commonly used.

Pixel
In digital imaging, a pixel (or picture element) is a single point in a raster image. Pixels are placed on a regular 2-dimensional grid and are often represented using dots or squares. Each pixel is a sample of an original image, where more samples typically provide a more accurate representation of the original. The intensity of each pixel is variable; in color systems, each pixel typically has three components, such as red, green, and blue.

Graphics are visual presentations on a surface, such as a computer screen. Examples are photographs, drawings, graphic designs, maps, engineering drawings, or other images. Graphics often combine text and illustration. Graphic design may consist of the deliberate selection, creation, or arrangement of typography alone, as in a brochure, flier, poster, web site, or book without any other element. Clarity or effective communication may be the objective, association with other cultural elements may be sought, or merely the creation of a distinctive style.

Primitives
Primitives are basic units which a graphics system may combine to create more complex images or models. Examples would be sprites and character maps in 2D video games, geometric primitives in CAD, or polygons or triangles in 3D rendering. Primitives may be supported in hardware for efficient rendering, or may be the building blocks provided by a graphics application.

Rendering
Rendering is the generation of a 2D image from a 3D model by means of computer programs. A scene file contains objects in a strictly defined language or data structure; it would contain geometry, viewpoint, texture, lighting, and shading information as a description of the virtual scene. The data contained in the scene file is then passed to a rendering program to be processed and output to a digital image or raster graphics image file. The rendering program is usually built into the computer graphics software, though others are available as plug-ins or entirely separate programs. The term "rendering" may be by analogy with an "artist's rendering" of a scene. Although the technical details of rendering methods vary, the general challenges to overcome in producing a 2D image from a 3D representation stored in a scene file are outlined as the graphics pipeline, along a rendering device such as a GPU. A GPU is a device able to assist the CPU in calculations. If a scene is to look relatively realistic and predictable under virtual lighting, the rendering software should solve the rendering equation.
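In one common form, the rendering equation (introduced in 1986 by Immel et al. and by Kajiya, as noted in the history above) can be written in LaTeX notation as:

    L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, \mathrm{d}\omega_i

Here L_o is the radiance leaving the surface point x in direction ω_o, L_e is the radiance the surface itself emits, f_r is the bidirectional reflectance distribution function (BRDF) describing the material, L_i is the radiance arriving from direction ω_i, n is the surface normal, and the integral gathers incoming light over the hemisphere Ω of directions above the surface.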
The rendering equation does not account for all lighting phenomena, but is a general lighting model for computer-generated imagery. "Rendering" is also used to describe the process of calculating effects in a video-editing file to produce the final video output.

3D projection
3D projection is a method of mapping three-dimensional points to a two-dimensional plane. As most current methods for displaying graphical data are based on planar two-dimensional media, the use of this type of projection is widespread. This method is used in most real-time 3D applications and typically uses rasterization to produce the final image.

Ray tracing
Ray tracing is a technique from the family of image-order algorithms for generating an image by tracing the path of light through pixels in an image plane. The technique is capable of producing a high degree of photorealism, usually higher than that of typical scanline rendering methods, but at a greater computational cost.

Shading
Shading refers to depicting depth in 3D models or illustrations by varying levels of darkness. It is a process used in drawing for depicting levels of darkness on paper by applying media more densely or with a darker shade for darker areas, and less densely or with a lighter shade for lighter areas. There are various techniques of shading, including cross-hatching, where perpendicular lines of varying closeness are drawn in a grid pattern to shade an area. The closer the lines are together, the darker the area appears; likewise, the farther apart the lines are, the lighter the area appears. The term has recently been generalized to refer to the application of shaders.

Texture mapping
Texture mapping is a method for adding detail, surface texture, or colour to a computer-generated graphic or 3D model. Its application to 3D graphics was pioneered by Edwin Catmull in 1974. A texture map is applied (mapped) to the surface of a shape, or polygon; this process is akin to applying patterned paper to a plain white box. Multitexturing is the use of more than one texture at a time on a polygon. Procedural textures (created by adjusting parameters of an underlying algorithm that produces an output texture) and bitmap textures (created in an image-editing application or imported from a digital camera) are, generally speaking, common methods of implementing texture definition on 3D models in computer graphics software. Intended placement of textures onto a model's surface often requires a technique known as UV mapping (arbitrary, manual layout of texture coordinates) for polygon surfaces, while non-uniform rational B-spline (NURBS) surfaces have their own intrinsic parameterization that is used as texture coordinates. Texture mapping as a discipline also encompasses techniques for creating normal maps and bump maps that correspond to a texture to simulate height, specular maps to help simulate shine and light reflections, and environment mapping to simulate mirror-like reflectivity, also called gloss.

Anti-aliasing
Rendering resolution-independent entities (such as 3D models) for viewing on a raster (pixel-based) device such as a liquid-crystal display or CRT television inevitably causes aliasing artifacts, mostly along geometric edges and the boundaries of texture details; these artifacts are informally called "jaggies". Anti-aliasing methods rectify such problems, resulting in imagery more pleasing to the viewer, but can be somewhat computationally expensive.
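One of the simplest such methods, uniform supersampling (discussed further below), can be sketched in a few lines of Python; this is a minimal illustration rather than production code, and the shade function is a hypothetical stand-in for a real renderer's per-point color evaluation:

    # Uniform supersampling sketch (illustrative only).
    def shade(x, y):
        # A hard-edged diagonal: white above the line y = x, black below.
        # Sampled only once per pixel, this edge would come out jagged.
        return 1.0 if y > x else 0.0

    def render(width, height, samples_per_axis=4):
        n = samples_per_axis
        image = []
        for py in range(height):
            row = []
            for px in range(width):
                total = 0.0
                # Average n*n samples spread evenly inside the pixel's footprint.
                for sy in range(n):
                    for sx in range(n):
                        total += shade(px + (sx + 0.5) / n, py + (sy + 0.5) / n)
                row.append(total / (n * n))
            image.append(row)
        return image

    img = render(8, 8)
    print(img[3][3])   # 0.375: the edge crosses this pixel, so it comes out grey

Pixels the edge passes through receive intermediate grey values instead of a hard black-to-white step, which is what softens the "jaggies"; the cost grows quickly, since quadrupling samples_per_axis multiplies the shading work sixteen-fold, which is the performance-versus-quality trade-off discussed next.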
Various anti-aliasing algorithms (such as supersampling) can be employed and then customized to trade rendering performance against the quality of the resultant imagery; a graphics artist should consider this trade-off if anti-aliasing methods are to be used. A pre-anti-aliased bitmap texture being displayed on a screen (or screen location) at a resolution different from the resolution of the texture itself (such as a textured model in the distance from the virtual camera) will exhibit aliasing artifacts, while any procedurally defined texture will always show aliasing artifacts, as such textures are resolution-independent; techniques such as mipmapping and texture filtering help to solve texture-related aliasing problems.

Volume rendering
Volume rendering is a technique used to display a 2D projection of a 3D discretely sampled data set. A typical 3D data set is a group of 2D slice images acquired by a CT or MRI scanner. Usually these are acquired in a regular pattern (e.g., one slice every millimeter) and usually have a regular number of image pixels in a regular pattern. This is an example of a regular volumetric grid, with each volume element, or voxel, represented by a single value that is obtained by sampling the immediate area surrounding the voxel.

3D modeling
3D modeling is the process of developing a mathematical, wireframe representation of any three-dimensional object, called a "3D model", via specialized software. Models may be created automatically or manually; the manual modeling process of preparing geometric data for 3D computer graphics is similar to plastic arts such as sculpting. 3D models may be created using multiple approaches: use of NURBS to generate accurate and smooth surface patches, polygonal mesh modeling (manipulation of faceted geometry), or polygonal mesh subdivision (advanced tessellation of polygons, resulting in smooth surfaces similar to NURBS models). A 3D model can be displayed as a two-dimensional image through a process called 3D rendering, used in a computer simulation of physical phenomena, or animated directly for other purposes. The model can also be physically created using 3D printing devices.

Pioneers in computer graphics

Charles Csuri
Charles Csuri is a pioneer in computer animation and digital fine art who created the first computer art in 1964. Csuri was recognized by the Smithsonian as the father of digital art and computer animation, and as a pioneer of computer animation by the Museum of Modern Art (MoMA) and ACM SIGGRAPH.

Donald P. Greenberg
Donald P. Greenberg is a leading innovator in computer graphics. Greenberg has authored hundreds of articles and served as a teacher and mentor to many prominent computer graphics artists, animators, and researchers such as Robert L. Cook, Marc Levoy, Brian A. Barsky, and Wayne Lytle. Many of his former students have won Academy Awards for technical achievements, and several have won the SIGGRAPH Achievement Award. Greenberg was the founding director of the NSF Center for Computer Graphics and Scientific Visualization.

A. Michael Noll
Noll was one of the first researchers to use a digital computer to create artistic patterns and to formalize the use of random processes in the creation of visual arts. He began creating digital art in 1962, making him one of the earliest digital artists. In 1965, Noll, along with Frieder Nake and Georg Nees, was among the first to publicly exhibit their computer art.
During April 1965, the Howard Wise Gallery exhibited Noll's computer art along with random-dot patterns by Bela Julesz.

Other pioneers
Pierre Bézier
Jim Blinn
Jack Bresenham
John Carmack
Paul de Casteljau
Ed Catmull
Frank Crow
James D. Foley
William Fetter
Henry Fuchs
Henri Gouraud
Charles Loop
Nadia Magnenat Thalmann
Benoit Mandelbrot
Martin Newell
Fred Parke
Bui Tuong Phong
David Pearson
Steve Russell
Daniel J. Sandin
Alvy Ray Smith
Bob Sproull
Ivan Sutherland
Daniel Thalmann
Andries van Dam
John Warnock
J. Turner Whitted
Lance Williams
Jim Kajiya

Organizations
SIGGRAPH
GDC
Bell Telephone Laboratories
United States Armed Forces, particularly the Whirlwind computer and SAGE Project
Boeing
IBM
Renault
The computer science department of the University of Utah
Lucasfilm and Industrial Light & Magic
Autodesk
Adobe Systems
Pixar
Silicon Graphics, Khronos Group & OpenGL
The DirectX division at Microsoft
Nvidia
AMD

Study of computer graphics
The study of computer graphics is a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Although the term often refers to three-dimensional computer graphics, it also encompasses two-dimensional graphics and image processing. As an academic discipline, computer graphics studies the manipulation of visual and geometric information using computational techniques. It focuses on the mathematical and computational foundations of image generation and processing rather than purely aesthetic issues. Computer graphics is often differentiated from the field of visualization, although the two fields have many similarities.

Applications
Computer graphics may be used in the following areas:
Computational biology
Computational photography
Computational physics
Computer-aided design
Computer simulation
Design
Digital art
Education
Graphic design
Infographics
Information visualization
Rational drug design
Scientific visualization
Special effects for cinema
Video games
Virtual reality
Web design

See also
Computer representation of surfaces
Glossary of computer graphics

Notes

References

Further reading
L. Ammeraal and K. Zhang (2007). Computer Graphics for Java Programmers, Second Edition. John Wiley & Sons.
David Rogers (1998). Procedural Elements for Computer Graphics. McGraw-Hill.
James D. Foley, Andries van Dam, Steven K. Feiner and John F. Hughes (1995). Computer Graphics: Principles and Practice. Addison-Wesley.
Donald Hearn and M. Pauline Baker (1994). Computer Graphics. Prentice-Hall.
Francis S. Hill (2001). Computer Graphics. Prentice Hall.
John Lewell (1985). Computer Graphics: A Survey of Current Techniques and Applications. Van Nostrand Reinhold.
Jeffrey J. McConnell (2006). Computer Graphics: Theory Into Practice. Jones & Bartlett Publishers.
R. D. Parslow, R. W. Prowse, Richard Elliot Green (1969). Computer Graphics: Techniques and Applications.
Peter Shirley et al. (2005). Fundamentals of Computer Graphics. A.K. Peters, Ltd.
M. Slater, A. Steed, Y. Chrysanthou (2002). Computer Graphics and Virtual Environments: From Realism to Real-Time. Addison-Wesley.
Wolfgang Höhl (2008). Interactive Environments with Open-Source Software. Springer Wien New York.

External links
A Critical History of Computer Graphics and Animation
History of Computer Graphics series of articles
Computer Graphics research at UC Berkeley
Thomas Dreher: History of Computer Art, chap. IV.2 Computer Animation
History of Computer Graphics on RUS
419357
https://en.wikipedia.org/wiki/Creative%20NOMAD
Creative NOMAD
The NOMAD was a range of digital audio players designed and sold by Creative Technology Limited, and later discontinued in 2004. Subsequent players now fall exclusively under the MuVo and ZEN brands. The NOMAD series consisted of two distinct brands:
NOMAD (and later NOMAD MuVo) - players that use flash memory. This brand eventually became the MuVo line.
NOMAD Jukebox - players that use microdrives. This brand evolved into the ZEN line.

NOMAD and NOMAD MuVo
These models appear as a USB mass storage device to the operating system, so the device can be accessed like any other removable disk, a floppy disk for example. Older MuVo devices and all Jukebox models use a custom protocol named PDE (Portable Digital Entertainment, a Creative internal device designation) that requires the installation of drivers before the device can be recognised by the operating system. Creative's foray into the MP3 player market began with the Creative NOMAD, a rebranded Samsung Electronics Yepp YP-D40 player with 64 megabytes of solid-state memory.

IEEE 1284 parallel port connection
Creative NOMAD

USB 1.1 connection
Creative NOMAD II - included FM radio and 64 MB of memory via a bundled SmartMedia card; no internal memory.
Creative NOMAD IIc - same appearance as the NOMAD II, but with no FM radio and 64 MB or 128 MB of internal memory.
Creative NOMAD II MG
Creative NOMAD MuVo
Creative NOMAD MuVo NX

USB 2.0 connection
Creative NOMAD MuVo2
Creative NOMAD MuVo2 X-Trainer
Creative NOMAD MuVo2 FM
Creative NOMAD MuVo USB 2.0
Creative NOMAD MuVo TX
Creative NOMAD MuVo TX (Second Edition)
Creative NOMAD MuVo TX FM
Creative NOMAD MuVo Micro V200
Creative NOMAD MuVo Micro V100
Creative NOMAD MuVo Micro N200

NOMAD Jukebox Zen
Later NOMAD Jukeboxes used Creative's own firmware. Most players use a Texas Instruments TMS320DA25x (an ARM core plus digital signal processor) as their CPU and support some version of Creative's environmental audio extensions (EAX). The NOMAD Jukebox beat Apple Computer's hard-drive music player, the iPod, to market by about a year.

The NOMAD Jukeboxes have varied in their use of connections. The Jukebox 3 and Jukebox Zen were unusual in their use of the older USB 1.1 standard despite their predecessor, the NOMAD Jukebox 2, having used the newer USB 2.0 standard. Part of the reason for this was the inclusion of a FireWire connection, which is of comparable speed to USB 2.0.

USB 1.1 connection
Creative NOMAD Jukebox (Creative Digital Audio Player in Europe)
Creative NOMAD Jukebox 3 (also features a FireWire connection)
Creative NOMAD Jukebox Zen (also features a FireWire connection)

USB 2.0 connection
Creative NOMAD Jukebox 2
Creative NOMAD Jukebox Zen USB 2.0
Creative NOMAD Jukebox Zen NX
Creative NOMAD Jukebox Zen Xtra

A variant of the NOMAD Jukebox was also sold as an OEM product by Dell under the name Dell Digital Jukebox (Dell DJ), a USB 2.0 device. The Second Generation Dell DJ and Dell Pocket DJ 5 are also OEM products from Creative. Future versions in the Creative ZEN line exclusively use Microsoft's Media Transfer Protocol (also known as PlaysForSure), and some legacy devices have been supplied with firmware upgrades to support MTP. The first NOMAD player and the first NOMAD Jukebox use proprietary protocols, neither PDE nor MTP.

Related software
Besides the NOMAD Explorer or MediaSource programs included with the devices, there are other programs which can be used to manage the player and to transfer data.
Bundled software
Creative NOMAD Explorer - software included with older NOMAD models and used to transfer music and data to the device. It has since been replaced by Creative MediaSource.
Creative MediaSource - a fully featured audio player for Microsoft Windows that also manages NOMAD devices and can be used to transfer media to the device or to synchronise playlists with it - a feature that was unavailable in the earlier NOMAD Explorer software.

Free software
For early models:
The NOMAD Manager Project - a program to control the original NOMAD player under Linux
The Nomad II Linux USB driver - for using the Nomad II under Linux

For NOMAD Jukeboxes:
libnjb - the driver underlying most of the following programs
Amarok - a music player for Linux
Banshee - a music player for Linux
Gnomad - a Jukebox manager for Linux
Neutrino - another Jukebox manager for Linux
XNJB - a Jukebox manager for Mac OS X
Nomadsync - a Jukebox synchronization tool for both Microsoft Windows and Linux
Creative Nomad Jukebox KIO::Slave - an integration driver for KDE
JBHTTP - a webserver interface to Jukeboxes, similar to what Notmad Explorer (see below) did for Microsoft Windows
Apple's iTunes is also capable of controlling the Nomad Jukebox.

Proprietary software
Notmad Explorer by Red Chair Software - a (now defunct) jukebox management program for Microsoft Windows. The program was notable for fully integrating NOMAD devices into Windows Explorer, providing a web-based interface to the device, and providing search capabilities using a built-in SQL database.

See also
Creative Technology Limited
Creative ZEN
Creative MuVo

External links
Linux and digital audio players
We Test Drive the Creative Nomad Jukebox - November 2000 MP3 Newswire review of Creative's Nomad Jukebox
Sharky Extreme Nomad Review October 2000 - Nomad Jukebox Review

Digital audio players
Creative Technology products
Singaporean brands
Consumer electronics brands
Jukebox-style media players
14887180
https://en.wikipedia.org/wiki/Weidner%20Communications
Weidner Communications
Weidner Communications Inc. was founded by Stephen Weidner in 1977 and marketed the Weidner Multi-Lingual Word Processing System.

History
In conjunction with its introduction to the market, the Weidner Multi-Lingual Word Processing System was first reported on in 1978, in the Wall Street Journal as "Quadrupling Translation Volume" and in the Deseret News as "halving translation costs and of increasing output by at least 400 percent." The new technology was demonstrated to translation experts on September 12, 1978 at Brigham Young University in Provo, Utah. Among them were Thomas Bauman and Leland Wright of the American Translators Association, who had arrived in Provo on September 11, 1978 to view the demonstration. After attending it, Thomas Bauman said, "I've never been so converted to anything so fast in my life." He subsequently invited Wydner to the annual meeting of the American Translators Association that October, where the Weidner machine translation system was hailed as a hoped-for breakthrough in machine translation. (Geoffrey Kingscott, 1992)

The Weidner Multi-Lingual Word Processing System is based on the research and work of Bruce Wydner, as demonstrated in his copyrighted textbook The Fastest Way To Learn Spanish Is To See IT! (Learn to read Spanish in 24 hours), ©1971 and 1975. This work formed the basis of the Weidner system, was programmed for processing human languages on the low-cost computers of the late 1970s, and lives on in machine translation and word processing software today.

The Weidner Engine works by mapping the approximately 460,000 words of the English dictionary (and likewise for other target languages) to 10,000 "root" words or thoughts (an interlingual lexicon). Inflected and conjugated word changes and endings are automatically separated from the root words by a parsing engine, then associated with a specific word type by language rules based on the sense of sight. Each word (or expression) is parsed, compared against the spell-check lexicon, and mapped to the interlingual lexicon for subsequent translation into the target language. If the target language is the same as the input language, the result of applying the language rules is a word-processed document in the original language. Tools included an aid for spelling and alternate word look-up.

Translation experts at the Commission of the European Communities said that this (to them) "new translation system" of Bruce Wydner "renewed" their "hope" for machine translation that would lead them to "Better Translation for Better Communication." (G. Van Slype, 1983)

Translation Associates and Eyring Research Institute
The company responsible for the production of the Weidner Multi-Lingual Word Processing System included Bruce Wydner and his friends (Warren Davidson, Dale Miller, and Lowell Randall), who formed the Inns of the Temple, Inc., a 501(c)(3) corporation, doing business as Translation Associates.
Bruce Wydner (who legally changed the spelling of his last name so that it would be pronounced properly), representing his company, made an exclusive marketing contract with his brother Stephen Weidner; the contract restricted the transfer of any development rights to his brother's company. Bruce Wydner also made a 15-year non-compete, non-circumvent contract with Eyring Research Institute in Provo, Utah to engage its computing services in the creation of the Weidner Multi-Lingual Word Processor and to obtain the programming skills of Eyring's bilingual programmer employee Bruce Bastian. Wydner paid Eyring Research Institute $25.00 per hour for programming time, of which $5.00 per hour went to pay Bastian.

Machine translation
In 1982 Stephen Weidner began to have financial problems over a research and development tax shelter he had created; as a result, Weidner Communications Inc. suffered, and disputes over Weidner's assets were taken to court. In 1984 Stephen Weidner's original company was purchased by Bravis International, one of Japan's largest translation companies, as part of a settlement of the court-ordered liquidation of Weidner Communications' assets, but Weidner Communications Inc. still maintained offices in Chicago and in Paris. During the mid-1980s Weidner Communications, Inc. (WCC) was the largest translation company by sales volume in the United States. (Margaret M. Perscheid, 1985) Bravis later sold Wydner's technology to Intergraph Corporation of Alabama, which in turn sold it to Transparent Language, Inc. of New Hampshire.

Bruce Wydner, the principal agent for the Inns of the Temple, Inc., which retained the research and development rights to the Weidner Multi-Lingual Word Processor, separated himself from his brother in early 1979 and no longer supplied any updated software developments. Stephen Weidner had offended his brother by having Eyring Research Institute send its bilingual employee to remove Wydner's intellectual property from his home, property which Wydner claimed was stolen from him. (Wydner vs Novell, WordPerfect, Ashton, Bastan, et al., 2003)

SDL International, Enterprise Translation Server
The Weidner Engine is the basis of the website freetranslation.com. The original Weidner Engine was bought in 2001 by SDL International of London, England.

Lionbridge, iTranslator
A copy of the Weidner Multi-Lingual Word Processing software was requested by the German government for the Siemens Corporation of Germany in September 1980 and was nicknamed the Siemens-Weidner Engine (originally English-German). This revolutionary multi-lingual word processing engine became foundational in the development of the Metal MT project, according to John White of the Siemens Corporation. (The Deseret News, Friday, Aug. 22, 1980.) After the Metal MT development, rights to the Siemens-Weidner Engine were sold to a Belgian company, Lernout & Hauspie. The Siemens copy of the Weidner Multi-Lingual Word Processing software has since been acquired, through the purchase of the assets of Lernout & Hauspie, by Bowne Global Solutions, Inc., which was later acquired by Lionbridge Technologies, Inc.; it is demonstrated in their iTranslator software.

Word processing

Microsoft Word
Lernout & Hauspie (Lernout & Hauspie Speech Products N.V.) sold a copy of Wydner's language technology, the Siemens-Weidner Engine, to the Microsoft Corporation to be used in Microsoft Word.
WordPerfect
Eyring Research Institute was a development bed for Bruce Bastian (co-founder of WordPerfect), who was one of the original programming helpers for Bruce Wydner in the production of the original Weidner Spanish-English Multi-Lingual Word Processor, a foundation of the WordPerfect monolingual word processor, produced first for English and then for Spanish. (Utah Weekly, 2003)

Ronald G. Hansen, the president of the Eyring Research Institute, reportedly asked Bruce Wydner the following in 1978: "Bruce Bastian says that this Multilingual Wordprocessor of yours has a lot more uses than just translating languages. He says that it could be used to produce monolingual word processors and wants to know if you will let him do that." (Utah Weekly, 2003)

Alan Ashton said that "Bruce Bastian did all of the formatting of the Word Processor Program, the main part of the Program that makes it work so well." That format was expressed in the 1989 WordPerfect users manual as: "If you want to compose the Rules to process all of the words in a language, you must start with the Rules to process the most-used words."

Intelligent Systems Technology (ARPA and SISTO)
Eyring Research Institute was instrumental in helping the U.S. Air Force Missile Directorate at Hill Air Force Base near Ogden, Utah produce, in top military secrecy, the intelligent systems technology software that was foundational to the later-named Reagan "Star Wars" program. After the ALPAC Report in 1966, President David O. McKay of the LDS Church apparently approached one US government operation that continued seeking technological advances in human language technology, the Missile Directorate of the US Air Force at Hill Air Force Base, to fund the transfer, in top military secrecy, of any such advances from the BYU Linguistics Department's project to ERI facilities. The aim was to take any human language technology developed there and, through collaboration with the US Defense Department's Advanced Research Projects Agency (ARPA) and its Software and Intelligent Systems Technology Office (SISTO), turn it into missile guidance software that would be superior to any producible by the Soviet Union. (Cleo Harmon, 1999)

References

Sources
Natural Language Computing: The Commercial Applications, Tim Johnson, Ovum Ltd, London, 1985
A Survey of the Translation Market, Present and Future, prepared for the Commission of the European Communities, Directorate-General Information Market and Innovation, by Bureau Marcel van Dijk, Brussels, and PA Conseiller de Direction, Paris; authors G. Van Slype (Bureau Marcel van Dijk), J. F. Guinet (PA), F. Seitz (PA), E. Benegam (PACTEL); ECSC, EEC, EAEC, Luxembourg, EUR 7720EN, 1983
A Lunch with Bruce Wydner, Geoffrey Kingscott, Language International, John Benjamins Publishing Co., Amsterdam, The Netherlands, 4/4, April 1992, http://www.mt-archive.info/jnl/LangInt-1992-Wydner.pdf
The Life of Frank Carlyle Harmon (1905–1997), compiled by his wife Cleo Harmon, edited by Bliss J. Hansen, Family Footprints, 1999, p. 150, ASIN B000I8VR9C
California Firm to Unveil a Computer That Processes Words for Translators, Richard A. Shaffer, Wall Street Journal, October 24, 1978
Provo Researchers Help Perfect a Computer-Translator, Arnold Irving, The Deseret News, October 31, 1978
Germans Visit Utah to See Language Translation Unit, Richard Nash, The Deseret News, August 21, 1980
Machine Translation: Its History, Current Status, and Future Prospects, Jonathan Slocum, Siemens Communications Systems, Inc., Linguistics Research Center, University of Texas, Austin, Texas, 1984, http://acl.ldc.upenn.edu/P/P84/P84-1116.pdf
The Fastest Way to Learn Spanish Is to See IT!, Spanish New Learning Center, Hawkes Publishing Inc., 1975
Wydner vs Novell, WordPerfect, Ashton, Bastan, et al., 2003
Twenty Years of Translating and the Computer, John Hutchins, 1998, http://www.hutchinsweb.me.uk/Aslib-1998.pdf
Practical Experience of Machine Translation, Veronica Lawson, North Holland Publishing Company, Amsterdam, The Netherlands, 1982
Machine Translation Today: The State of the Art, Margaret King, Edinburgh University Press, Edinburgh, Scotland, 1984
Machine Translation: Past, Present, Future, W. J. Hutchins, Ellis Horwood Limited, Chichester, England, 1986
Machine Translation, Ian Pigott, Commission of the European Communities, Luxembourg, XIII-84 IP, November 1991
Language Software and Technology, report by Michael Quinlan, President of Transparent Language, to the LDS Church, New Hampshire, www.transparent.com, March 8, 2000
Computer-Aided Translation at WCC, Margaret M. Perscheid, CALICO Journal, Volume 3, Number 1, https://calico.org/a-273-ComputerAided%20Translation%20At%20WCC.html
Analyse des Systems zur computergestützten Übersetzung Weidner – Version Französisch-Englisch 2.5 [Analysis of the Weidner computer-assisted translation system, French-English version 2.5], http://www.dialog-translations.com/bilder/Diplomarbeit%20Hans%20Christian%20von%20Steuber.pdf
Michael G. Hundt: Working with the Weidner Machine-Aided Translation System, in Veronica Lawson (ed.): Translating and the Computer 4 – Practical Experience with Machine Translation, London, 1982
Trial of the Weidner Computer-Assisted Translation System, Translation Bureau Canada, Project No. 5-5462, 1985
Wydner Invention Fulfills "Prophecy" of LDS "Mormon" Church Presidents, US-Oregon Observer staff, special to The Utah Weekly, Thursday, March 27, 2003, Vol. 2, Num. 4
WCC's Translation Bureau, Henrietta Pons, in Veronica Lawson (ed.), 1982
Ulla Magnusson-Murray: Operational Experience of a Machine Translation Service, in Veronica Lawson (ed.): Translating and the Computer 5 – Tools for the Trade, London, 1983, pp. 171–180; Tim Johnson ibid.: 283–286

Machine translation
26377830
https://en.wikipedia.org/wiki/Maptek
Maptek
Maptek is an Adelaide, Australia-based company that provides 3D modelling, spatial analysis and design technology to the global mining industry. Founded by Bob Johnson in 1981 as K. Robert Johnson and Associates, the company operated as KRJA Systems for 10 years before changing its name to Maptek in 1992. Johnson graduated in 1969 from the University of NSW as a geologist and completed his doctorate in 1972. Since 1976 he has pursued a commercial career using computers for mining. Johnson is a Fellow of the Australasian Institute of Mining and Metallurgy and a member of the American Society of Mining Engineers.

Products
Vulcan: The first generation of Vulcan was released in 1984 on a Fortran platform, with further versions released over the years. Vulcan is a general mine planning software package that provides modular 3D visualisation software for geological modelling and mine planning. It is used by mining engineers, geologists and mine surveyors. Applications include 3D geological mapping and modelling, mine design, mine planning, geotechnical analysis, mine scheduling and optimisation, and mine rehabilitation.
PointStudio (formerly I-Site Studio): PointStudio is a laser scanning technology comprising hardware and software. It is used for surveying large geographical areas and for geotechnical analysis over highwalls. Maptek's laser scanners weigh 12–14 kg (including battery) and incorporate an optional inbuilt telescope and camera, as well as a handheld tablet for operating the scanner. In February 2009 Leica Geosystems' Spatial Solutions Division selected Maptek to supply laser scanners and software to be sold under the Leica Geosystems brand. Different versions of the studio software were released from 2000 to 2018.
MineSuite: Released in 2001, MineSuite is a fleet management, production and reporting package developed for the global mining industry. MineSuite collects and reports data from the real-time monitoring of production systems and equipment in open pit and underground mines, processing and production plants. The MineSuite system is now sold by MINLOG, a specialist company 50% owned by Maptek.
BlastLogic: Released in 2011, BlastLogic provides a centralised record of all operational blast data (local or via cloud). Access to data is immediate and universal across users, simplifying and accelerating routine tasks.
Eureka: Released in 2012, Eureka provides a single integrated platform for viewing and analysing exploration project data. The software allows users to take spatially located data and put it into context to better understand the inter-connecting relationships between disparate information. Viewing the data at different scales allows users to see the big picture as well as analyse local areas of interest. Eureka displays aerial photography, terrain maps, historical plans and GIS data.
PerfectDig: Released in 2013, PerfectDig displays enhanced photographs of the mine environment coloured with 3D design conformance information. Results can be viewed using any web browser, without the need for specialised software. In 2015, PerfectDig won the South Australian Industrial & Resources iAward of the Australian Information Industry Association.
Sentry: Released in 2014, Sentry is a flexible surface change detection system. The system combines I-Site laser scan data with sophisticated software to track and analyse movement over time.
Evolution: Released in 2015, Evolution is an enterprise-level scheduling solution which optimises net present value using grade cut-off techniques, a proven method for maximising project value. It was previously known as Evorelution, offered by the Orelogy Group.

Locations
Maptek has five offices in Australia – Adelaide, Brisbane, Newcastle, Perth and Sydney – and international distributors in Belo Horizonte (Brazil), Distrito Federal (Mexico), Hermosillo (Mexico), Denver (USA), Vancouver and Montreal (Canada), Edinburgh (UK), Johannesburg (RSA), Lima (Peru) and Viña del Mar (Chile).

References

Software companies of Australia
Geology software
2850681
https://en.wikipedia.org/wiki/Aaron%20Swartz
Aaron Swartz
Aaron Hillel Swartz (November 8, 1986 – January 11, 2013) was an American computer programmer, entrepreneur, writer, political organizer, and Internet hacktivist. He was involved in the development of the web feed format RSS, the Markdown publishing format, the organization Creative Commons, and the website framework web.py, and joined the social news site Reddit six months after its founding. He was given the title of co-founder of Reddit by Y Combinator owner Paul Graham after the formation of Not a Bug, Inc. (a merger of Swartz's project Infogami and Redbrick Solutions, a company run by Alexis Ohanian and Steve Huffman).

Swartz's work also focused on civic awareness and activism. He helped launch the Progressive Change Campaign Committee in 2009 to learn more about effective online activism. In 2010, he became a research fellow at Harvard University's Safra Research Lab on Institutional Corruption, directed by Lawrence Lessig. He founded the online group Demand Progress, known for its campaign against the Stop Online Piracy Act.

In 2011, Swartz was arrested by Massachusetts Institute of Technology (MIT) police on state breaking-and-entering charges, after connecting a computer to the MIT network in an unmarked and unlocked closet and setting it to download academic journal articles systematically from JSTOR using a guest user account issued to him by MIT. Federal prosecutors, led by Carmen Ortiz, later charged him with two counts of wire fraud and eleven violations of the Computer Fraud and Abuse Act, carrying a cumulative maximum penalty of $1 million in fines, 35 years in prison, asset forfeiture, restitution, and supervised release. Swartz declined a plea bargain under which he would have served six months in federal prison. Two days after the prosecution rejected a counter-offer by Swartz, he was found dead by suicide in his Brooklyn apartment. In 2013, Swartz was inducted posthumously into the Internet Hall of Fame.

Early life
Aaron Swartz was born in Highland Park, 25 miles north of Chicago, the child of a Jewish family. He was the eldest child of Susan and Robert Swartz and brother to Noah and Ben Swartz. He was an atheist. His father founded the software firm Mark Williams Company. At an early age, Swartz immersed himself in the study of computers, programming, the Internet, and Internet culture. He attended North Shore Country Day School, a small private school near Chicago, until 9th grade, when he left high school and enrolled in courses at Lake Forest College. In 1999, at age 12, he created the website The Info Network, a user-generated encyclopedia. The site won the ArsDigita Prize, given to young people who create "useful, educational, and collaborative" noncommercial websites, and led to early recognition of Swartz's nascent talent in coding. At age 14, he became a member of the working group that authored the RSS 1.0 web syndication specification. In 2005, he enrolled at Stanford University but left the school after his first year.

Entrepreneurship
During Swartz's first year at Stanford, he applied to Y Combinator's first Summer Founders Program, proposing to work on a startup called Infogami, a flexible content management system designed to create rich and visually interesting websites, or a form of wiki for structured data. After working on it with co-founder Simon Carstensen over the summer of 2005, Swartz opted not to return to Stanford, choosing instead to continue to develop and seek funding for Infogami.
As part of his work on Infogami, Swartz created the web.py web application framework because he was unhappy with other available systems in the Python programming language. In early fall of 2005, he worked with his fellow co-founders of another nascent Y Combinator firm, Reddit, to rewrite its Lisp codebase using Python and web.py. Although Infogami's platform was abandoned after Not a Bug was acquired, Infogami's software was used to support the Internet Archive's Open Library project, and the web.py framework was used as the basis for many other projects by Swartz and many others.

When Infogami failed to find further funding, Y Combinator organizers suggested Infogami merge with Reddit, which it did in November 2005, creating a new firm, Not a Bug, devoted to promoting both products. As a result, Swartz was given the title of co-founder of Reddit. Although both projects initially struggled, Reddit made large gains in popularity in 2005–2006. In October 2006, based largely on Reddit's success, Not a Bug was acquired by Condé Nast Publications, owner of Wired magazine. Swartz moved with his company to San Francisco to continue to work on Reddit for Wired. He found corporate office life uncongenial and ultimately was asked to resign from the company.

In September 2007, he joined Infogami co-founder Simon Carstensen to launch a new firm, Jottit, in another attempt to create a markdown-driven content management system in Python.

Activism
In 2008, Swartz founded Watchdog.net, "the good government site with teeth," to aggregate and visualize data about politicians. That year, he wrote a widely circulated Guerilla Open Access Manifesto. On December 27, 2010, he filed a Freedom of Information Act (FOIA) request to learn about the treatment of Chelsea Manning, alleged source for WikiLeaks.

PACER
In 2008, Swartz downloaded about 2.7 million federal court documents stored in the PACER (Public Access to Court Electronic Records) database managed by the Administrative Office of the United States Courts. The Huffington Post characterized his actions this way: "Swartz downloaded public court documents from the PACER system in an effort to make them available outside of the expensive service. The move drew the attention of the FBI, which ultimately decided not to press charges as the documents were, in fact, public."

PACER was charging 8 cents per page for information that Carl Malamud, who founded the nonprofit group Public.Resource.Org, contended should be free, because federal documents are not covered by copyright. The fees were "plowed back to the courts to finance technology, but the system [ran] a budget surplus of some $150 million, according to court reports," reported The New York Times. PACER used technology that was "designed in the bygone days of screechy telephone modems ... putting the nation's legal system behind a wall of cash and kludge." Malamud appealed to fellow activists, urging them to visit one of 17 libraries conducting a free trial of the PACER system, download court documents, and send them to him for public distribution.

After reading Malamud's call for action, Swartz used a Perl computer script running on Amazon cloud servers to download the documents, using credentials belonging to a Sacramento library. From September 4 to 20, 2008, the script accessed documents and uploaded them to a cloud computing service, and Swartz released them to Malamud's organization. On September 29, 2008, the U.S. Government Printing Office (GPO) suspended the free trial, "pending an evaluation" of the program.
Swartz's actions were subsequently investigated by the FBI. The case was closed after two months with no charges filed. Swartz learned the details of the investigation after filing a FOIA request with the FBI, and described their response as the "usual mess of confusions that shows the FBI's lack of sense of humor." PACER still charges per page, but customers using Firefox have the option of saving the documents for free public access with a plug-in called RECAP.

At a 2013 memorial for Swartz, Malamud recalled their work with PACER. They brought millions of U.S. District Court records out from behind PACER's "pay wall", he said, and found them full of privacy violations, including medical records and the names of minor children and confidential informants. A more detailed account of his collaboration with Swartz on the PACER project appears in an essay on Malamud's website.

Writing in Ars Technica, Timothy Lee, who later made use of the documents obtained by Swartz as a co-creator of RECAP, offered some insight into discrepancies in reports on how much data Swartz downloaded: "In a back-of-the-envelope calculation a few days before the offsite crawl was shut down, Swartz guessed he got around 25 percent of the documents in PACER." The New York Times similarly reported that Swartz had downloaded "an estimated 20 percent of the entire database". Based on the fact that Swartz downloaded 2.7 million documents while PACER, at the time, contained 500 million, Lee concluded that Swartz downloaded less than 1% of the database (2.7 million of 500 million is about 0.54 percent).

Progressive Change Campaign Committee
In 2009, wanting to learn about effective activism, Swartz helped launch the Progressive Change Campaign Committee. He wrote in his blog: "I spend my days experimenting with new ways to get progressive policies enacted and progressive politicians elected." He led the first activism event of his career with the Progressive Change Campaign Committee, delivering thousands of "Honor Kennedy" petition signatures to Massachusetts legislators, asking them to fulfill former Senator Ted Kennedy's last wish by appointing a senator to vote for healthcare reform.

Demand Progress
In 2010, Swartz co-founded Demand Progress, a political advocacy group that organizes people online to "take action by contacting Congress and other leaders, funding pressure tactics, and spreading the word" about civil liberties, government reform, and other issues.

During the 2010–11 academic year, Swartz conducted research studies on political corruption as a Lab Fellow in Harvard University's Edmond J. Safra Research Lab on Institutional Corruption. Author Cory Doctorow, in his novel Homeland, "drew on advice from Swartz in setting out how his protagonist could use the information now available about voters to create a grass-roots anti-establishment political campaign." In an afterword to the novel, Swartz wrote: "These political hacktivist tools can be used by anyone motivated and talented enough.... Now it's up to you to change the system. ... Let me know if I can help."

Opposition to the Stop Online Piracy Act (SOPA)
Swartz was involved in the campaign to prevent passage of the Stop Online Piracy Act (SOPA), which sought to combat Internet copyright violations but was criticized on the basis that it would make it easier for the U.S. government to shut down web sites accused of violating copyright and would place intolerable burdens on Internet providers.
After the bill's defeat, Swartz was the keynote speaker at the F2C:Freedom to Connect 2012 event in Washington, D.C., on May 21, 2012. In his speech, "How We Stopped SOPA", he said: "We won this fight because everyone made themselves the hero of their own story. Everyone took it as their job to save this crucial freedom." He was referring to a series of protests against the bill by numerous websites, described by the Electronic Frontier Foundation as the biggest protest in Internet history, with over 115,000 sites posting their opposition. Swartz also spoke on the topic at an event organized by ThoughtWorks.

Wikipedia
Swartz had participated in Wikipedia since August 2003 under the username AaronSw. In 2006, he ran unsuccessfully for the Wikimedia Foundation's Board of Trustees. Also in 2006, Swartz wrote an analysis of how Wikipedia articles are written and concluded that the bulk of its content came from tens of thousands of occasional contributors, or "outsiders", each of whom made few other contributions to the site, while a core group of 500 to 1,000 regular editors tended to correct spelling and other formatting errors. He said: "The formatters aid the contributors, not the other way around." His conclusions, based on the analysis of edit histories of several randomly selected articles, contradicted the opinion of Wikipedia co-founder Jimmy Wales, who believed the core group of regular editors provided most of the content while thousands of others contributed to formatting issues. Swartz came to his conclusions by counting the number of characters editors added to particular articles, while Wales counted the total number of edits.

United States v. Aaron Swartz
According to state and federal authorities, Swartz used JSTOR, a digital repository, to download a large number of academic journal articles through MIT's computer network over the course of a few weeks in late 2010 and early 2011. Visitors to MIT's "open campus" were authorized to access JSTOR through its network; Swartz, as a research fellow at Harvard University, also had a JSTOR account.

The download
On September 25, 2010, the IP address 18.55.6.215, part of the MIT network, began sending hundreds of PDF download requests per minute to the JSTOR website, enough to slow the site's performance. This prompted a block of the IP address. In the morning, another IP address, also from within the MIT network, began sending more PDF download requests, resulting in a temporary block, at the firewall level, of all MIT servers in the entire 18.0.0.0/8 range. A JSTOR employee emailed MIT about the downloads on September 29, 2010.

According to authorities, Swartz downloaded the documents through a laptop connected to a networking switch in a controlled-access wiring closet at MIT. The closet's door was kept unlocked, according to press reports. When the laptop was discovered, a video camera was placed in the room to record Swartz, and his computer was left untouched. Recording was stopped once Swartz was identified. Rather than pursue a civil lawsuit against him, JSTOR reached a settlement with him in June 2011 under which he surrendered the downloaded data. On July 30, 2013, JSTOR released 300 partially redacted documents that had been used as incriminating evidence against Swartz, originally sent to the United States Attorney's Office in response to subpoenas in the case United States v. Aaron Swartz.
Arrest and prosecution
On the night of January 6, 2011, Swartz was arrested near the Harvard campus by MIT police and a Secret Service agent, and arraigned in Cambridge District Court on two state charges of breaking and entering with intent to commit a felony. On July 11, 2011, he was indicted by a federal grand jury on charges of wire fraud, computer fraud, unlawfully obtaining information from a protected computer, and recklessly damaging a protected computer. On November 17, 2011, Swartz was indicted by a Middlesex County Superior Court grand jury on state charges of breaking and entering with intent, grand larceny, and unauthorized access to a computer network. On December 16, 2011, state prosecutors filed a notice that they were dropping the two original charges, and the charges listed in the November 17, 2011 indictment were dropped on March 8, 2012. According to a spokesperson for the Middlesex County prosecutor, this was done to avoid impeding a federal prosecution headed by Stephen P. Heymann, supported by evidence provided by Secret Service agent Michael S. Pickett. On September 12, 2012, federal prosecutors filed a superseding indictment adding nine more felony counts, increasing Swartz's maximum criminal exposure to 50 years of imprisonment and $1 million in fines. During plea negotiations with Swartz's attorneys, the prosecutors offered to recommend a sentence of six months in a low-security prison if Swartz pled guilty to 13 federal crimes. Swartz and his lead attorney rejected the deal, opting instead for a trial where prosecutors would be forced to justify their pursuit of him. The federal prosecution was characterized by numerous critics (such as former Nixon White House counsel John Dean) as an "overcharging" 13-count indictment and an "overzealous", "Nixonian" prosecution for alleged computer crimes, brought by then U.S. Attorney for Massachusetts Carmen Ortiz. Swartz died by suicide on January 11, 2013. After his death, federal prosecutors dropped the charges. On December 4, 2013, as the result of a Freedom of Information Act suit by the investigations editor of Wired magazine, several documents related to the case were released by the Secret Service, including a video of Swartz entering the MIT network closet.

Death, funeral, and memorial gatherings

Death
On the evening of January 11, 2013, Swartz's girlfriend, Taren Stinebrickner-Kauffman, found him dead in his Brooklyn apartment. A spokeswoman for New York's Medical Examiner reported that he had hanged himself. No suicide note was found. Swartz's family and his partner created a memorial website on which they issued a statement, saying: "He used his prodigious skills as a programmer and technologist not to enrich himself but to make the Internet and the world a fairer, better place". Days before Swartz's funeral, Lawrence Lessig eulogized his friend and sometime client in an essay, "Prosecutor as Bully". He decried the disproportionality of Swartz's prosecution and said, "The question this government needs to answer is why it was so necessary that Aaron Swartz be labeled a 'felon'. For in the 18 months of negotiations, that was what he was not willing to accept." Cory Doctorow wrote, "Aaron had an unbeatable combination of political insight, technical skill, and intelligence about people and issues. I think he could have revolutionized American (and worldwide) politics. His legacy may still yet do so."
Funeral and memorial gatherings
Swartz's funeral services were held on January 15, 2013, at Central Avenue Synagogue in Highland Park, Illinois. Tim Berners-Lee, creator of the World Wide Web, delivered a eulogy. The same day, The Wall Street Journal published a story based in part on an interview with Stinebrickner-Kauffman. She told the Journal that Swartz lacked the money to pay for a trial and "it was too hard for him to ... make that part of his life go public" by asking for help. He was also distressed, she said, because two of his friends had just been subpoenaed and because he no longer believed that MIT would try to stop the prosecution. Several memorials followed soon afterward. On January 19, hundreds attended a memorial at the Cooper Union; speakers included Stinebrickner-Kauffman, open source advocate Doc Searls, Creative Commons' Glenn Otis Brown, journalist Quinn Norton, Roy Singham of ThoughtWorks, and David Segal of Demand Progress. On January 24, there was a memorial at the Internet Archive headquarters in San Francisco, with speakers including Stinebrickner-Kauffman, Alex Stamos, Brewster Kahle, and Carl Malamud. On February 4, a memorial was held in the Cannon House Office Building on Capitol Hill; speakers at this memorial included Senator Ron Wyden and Representatives Darrell Issa, Alan Grayson, and Jared Polis, and other lawmakers in attendance included Senator Elizabeth Warren and Representatives Zoe Lofgren and Jan Schakowsky. A memorial also took place on March 12 at the MIT Media Lab. Swartz's family recommended GiveWell for donations in his memory; Swartz had admired and collaborated with the organization, and it was the sole beneficiary of his will.

Response

US Department of Justice
In an official statement on January 16, 2013, Carmen M. Ortiz, then US Attorney for the District of Massachusetts, said: "As a parent and a sister, I can only imagine the pain felt by the family and friends of Aaron Swartz, [...] I must, however, make clear that this office's conduct was appropriate in bringing and handling this case."

Family response
On January 12, 2013, Swartz's family and partner issued a statement criticizing the prosecutors and MIT. Speaking at his son's funeral on January 15, Robert Swartz said, "Aaron was killed by the government, and MIT betrayed all of its basic principles." Tom Dolan, husband of U.S. Attorney for Massachusetts Carmen Ortiz, whose office prosecuted Swartz's case, replied with criticism of the Swartz family: "Truly incredible that in their own son's obit they blame others for his death and make no mention of the 6-month offer." This comment drew criticism in turn; Esquire writer Charlie Pierce replied, "the glibness with which her husband and her defenders toss off a 'mere' six months in federal prison, low-security or not, is a further indication that something is seriously out of whack with the way our prosecutors think these days."

MIT
MIT maintains an open-campus policy along with an "open network." Two days after Swartz's death, MIT President L. Rafael Reif commissioned professor Hal Abelson to lead an analysis of MIT's options and decisions relating to Swartz's "legal struggles." To help guide the fact-finding stage of the review, MIT created a website where community members could suggest questions and issues for the review to address. Swartz's attorneys requested that all pretrial discovery documents be made public, a move which MIT opposed.
Swartz's allies criticized MIT for its opposition to releasing the evidence without redactions. On July 26, 2013, the Abelson panel submitted a 182-page report to MIT President L. Rafael Reif, who authorized its public release on July 30. The panel reported that MIT had not supported charges against Swartz, and it cleared the institution of wrongdoing. However, the report also noted that despite MIT's advocacy for open access culture at the institutional level and beyond, the university never extended that support to Swartz. The report revealed, for example, that while MIT considered the possibility of issuing a public statement about its position on the case, such a statement never materialized.

Press
The Huffington Post reported that "Ortiz has faced significant backlash for pursuing the case against Swartz, including a petition to the White House to have her fired." Other news outlets reported similarly. Reuters news agency called Swartz "an online icon" who "help[ed] to make a virtual mountain of information freely available to the public, including an estimated 19 million pages of federal court documents." The Associated Press (AP) reported that Swartz's case "highlights society's uncertain, evolving view of how to treat people who break into computer systems and share data not to enrich themselves, but to make it available to others," and that JSTOR's lawyer, former U.S. Attorney for the Southern District of New York Mary Jo White, had asked the lead prosecutor to drop the charges. As discussed by editor Hrag Vartanian in Hyperallergic, the Brooklyn, New York muralist BAMN ("By Any Means Necessary") created a mural of Swartz. "Swartz was an amazing human being who fought tirelessly for our right to a free and open Internet," the artist explained. "He was much more than just the 'Reddit guy'." Speaking on April 17, 2013, Yuval Noah Harari described Swartz as "the first martyr of the Freedom of Information movement". According to Harari, however, Swartz's stance reflected not so much a belief in the freedom of persons or of speech as the growing conviction among the younger generation that, above all else, information should be free. Swartz's legacy has been credited with strengthening the open access to scholarship movement. In Illinois, his home state, Swartz's influence led state university faculties to adopt policies in favor of open access. James Fallows, writing in The Atlantic, offered a short eulogy.

Internet

Hacks
On January 13, 2013, members of Anonymous hacked two websites on the MIT domain, replacing them with tributes to Swartz that called on members of the Internet community to use his death as a rallying point for the open access movement. The banner included a list of demands for improvements in the U.S. copyright system, along with Swartz's Guerilla Open Access Manifesto. On the night of January 18, 2013, MIT's e-mail system was taken offline for ten hours. On January 22, e-mail sent to MIT was redirected by hackers Aush0k and TibitXimer to the Korea Advanced Institute of Science & Technology. All other traffic to MIT was redirected to a computer at Harvard University that was publishing a statement headed "R.I.P Aaron Swartz," with text from a 2009 posting by Swartz, accompanied by a chiptune version of "The Star-Spangled Banner". MIT regained full control after about seven hours. In the early hours of January 26, 2013, the U.S. Sentencing Commission website, USSC.gov, was hacked by Anonymous.
The home page was replaced with an embedded YouTube video, Anonymous Operation Last Resort. The video statement said Swartz "faced an impossible choice". A week before the first anniversary of his death, a hacker downloaded "hundreds of thousands" of scientific-journal articles from a Swiss publisher's website and republished them on the open Web in Swartz's honor.

Petition to the White House
After Swartz's death, more than 50,000 people signed an online petition to the White House calling for the removal of Ortiz "for overreach in the case of Aaron Swartz." A similar petition was submitted calling for prosecutor Stephen Heymann's firing. In January 2015, two years after Swartz's death, the White House declined both petitions.

Commemorations
On August 3, 2013, Swartz was posthumously inducted into the Internet Hall of Fame. A hackathon was held in Swartz's memory around the date of his birthday in 2013. Over the weekend of November 8–10, 2013, inspired by Swartz's work and life, a second annual hackathon was held in at least 16 cities around the world. Preliminary topics worked on at the 2013 Aaron Swartz Hackathon were privacy and software tools, transparency, activism, access, legal fixes, and a low-cost book scanner. In January 2014, Lawrence Lessig led a walk across New Hampshire in honor of Swartz, rallying for campaign finance reform. Online tributes included the memorial site Remember Aaron Swartz ("Aaron Swartz made our world more free. Be Free, Internet. Thank you, Aaron, for what you gave us." – public.resource.org), as well as remembrances by Mark Bernstein, Henry Farrell ("Remembering Aaron Swartz"), and Quinn Norton ("My Aaron Swartz, whom I loved"). In 2017, the Turkish-Dutch artist Ahmet Öğüt commemorated Swartz with a work entitled "Information Power to The People", depicting his bust.

Legacy

Open Access
A long-time supporter of open access, Swartz set out his position in his Guerilla Open Access Manifesto. Supporters of Swartz responded to news of his death with an effort called #PDFTribute to promote Open Access. On January 12, Eva Vivalt, a development economist at the World Bank, began posting her academic articles online using the hashtag #pdftribute as a tribute to Swartz, and other scholars posted links to their works. Swartz's story brought the topic of open access to scientific publications to wider audiences, and in the wake of his death many institutions and public figures campaigned for open access to scientific knowledge, including calls for more open access to scholarly data (e.g., open science data). The Think Computer Foundation and the Center for Information Technology Policy (CITP) at Princeton University announced scholarships awarded in memory of Aaron Swartz. In 2013, Swartz was posthumously awarded the American Library Association's James Madison Award for being an "outspoken advocate for public participation in government and unrestricted access to peer-reviewed scholarly articles." In March, the editor and editorial board of the Journal of Library Administration resigned en masse, citing a dispute with the journal's publisher, Routledge. One board member wrote of a "crisis of conscience about publishing in a journal that was not open access" after the death of Aaron Swartz. In 2002, Swartz had stated that when he died, he wanted all the contents of his hard drives made publicly available.

Congress
Several members of the U.S.
House of Representatives – Republican Darrell Issa and Democrats Jared Polis and Zoe Lofgren, all on the House Judiciary Committee – raised questions about the government's handling of the case. Calling the charges against him "ridiculous and trumped up," Polis said Swartz was a "martyr" whose death illustrated the need for Congress to limit the discretion of federal prosecutors. Issa spoke at a memorial for Swartz on Capitol Hill, and Massachusetts Democratic Senator Elizabeth Warren issued a statement saying "[Aaron's] advocacy for Internet freedom, social justice, and Wall Street reform demonstrated ... the power of his ideas ..." In a letter to Attorney General Eric Holder, Texas Republican Senator John Cornyn asked, "On what basis did the U.S. Attorney for the District of Massachusetts conclude that her office's conduct was 'appropriate'?" and "Was the prosecution of Mr. Swartz in any way retaliation for his exercise of his rights as a citizen under the Freedom of Information Act?"

Congressional investigations
Issa, who chaired the House Committee on Oversight and Government Reform, announced that he would investigate the Justice Department's actions in prosecuting Swartz. In a statement to The Huffington Post, he praised Swartz's work toward "open government and free access to the people." Issa's investigation garnered some bipartisan support. On January 28, 2013, Issa and ranking committee member Elijah Cummings published a letter to U.S. Attorney General Holder, questioning why federal prosecutors had filed the superseding indictment. On February 20, WBUR reported that Ortiz was expected to testify at an upcoming Oversight Committee hearing about her office's handling of the Swartz case. On February 22, Associate Deputy Attorney General Steven Reich conducted a briefing for congressional staffers involved in the investigation. They were told that Swartz's Guerilla Open Access Manifesto played a role in prosecutorial decision-making. Congressional staffers left the briefing believing that prosecutors thought Swartz had to be convicted of a felony carrying at least a short prison sentence in order to justify having filed the case against him in the first place. Excoriating the Department of Justice as the "Department of Vengeance", Stinebrickner-Kauffman told the Guardian that the DOJ had erred in relying on Swartz's Guerilla Open Access Manifesto as an accurate indication of his beliefs by 2010. "He was no longer a single issue activist," she said. "He was into lots of things, from healthcare, to climate change to money in politics." On March 6, Holder testified before the Senate Judiciary Committee that the case was "a good use of prosecutorial discretion." Stinebrickner-Kauffman issued a statement in reply, repeating and amplifying her claims of prosecutorial misconduct. Public documents, she wrote, reveal that prosecutor Stephen Heymann "instructed the Secret Service to seize and hold evidence without a warrant... lied to the judge about that fact in written briefs... [and] withheld exculpatory evidence... for over a year," violating his legal and ethical obligations to turn such evidence over to the defense. On March 22, Senator Al Franken wrote Holder a letter expressing concerns, writing that "charging a young man like Mr. Swartz with federal offenses punishable by over 35 years of federal imprisonment seems remarkably aggressive – particularly when it appears that one of the principal aggrieved parties ... did not support a criminal prosecution."
Amendment to Computer Fraud and Abuse Act
In 2013, Rep. Zoe Lofgren (D-Calif.) introduced a bill, Aaron's Law, to exclude terms of service violations from the 1986 Computer Fraud and Abuse Act and from the wire fraud statute. Lawrence Lessig wrote of the bill, "this is a critically important change.... The CFAA was the hook for the government's bullying.... This law would remove that hook. In a single line: no longer would it be a felony to breach a contract." Professor Orin Kerr, a specialist in the nexus between computer law and criminal law, wrote that he had been arguing for precisely this sort of reform of the Act for years. The ACLU, too, has called for reform of the CFAA to "remove the dangerously broad criminalization of online activity," and the EFF has mounted a campaign for these reforms. Lessig's inaugural chair lecture as Furman Professor of Law and Leadership was entitled Aaron's Laws: Law and Justice in a Digital Age; he dedicated the lecture to Swartz. The Aaron's Law bill stalled in committee; Brian Knappenberger alleged this was due to Oracle Corporation's financial interest in maintaining the status quo.

Fair Access to Science and Technology Research Act
The Fair Access to Science and Technology Research Act (FASTR) is a bill that would mandate earlier public release of taxpayer-funded research. FASTR has been described as "The Other Aaron's Law." Senator Ron Wyden (D-Ore.) and Senator John Cornyn (R-Tex.) introduced the Senate version in 2013 and again in 2015, while the bill was introduced to the House by Reps. Zoe Lofgren (D-Calif.), Mike Doyle (D-Pa.), and Kevin Yoder (R-Kans.). Senator Wyden wrote of the bill, "the FASTR act provides that access to taxpayer funded research should never be hidden behind a paywall." While the legislation did not pass, it helped to prompt some movement toward more open access on the part of the US administration. Shortly after the bill's original introduction, the Office of Science and Technology Policy directed "each Federal agency with over $100 million in annual conduct of research and development expenditures to develop a plan to support increased public access to the results of research funded by the Federal Government."

Media
Swartz has been featured in various works of art and has posthumously received dedications from numerous artists. In 2013, Kenneth Goldsmith dedicated his "Printing out the Internet" exhibition to Swartz. Several biographical films about Swartz have also been made.

The Internet's Own Boy: The Story of Aaron Swartz
On January 11, 2014, marking the first anniversary of his death, a preview was released of The Internet's Own Boy: The Story of Aaron Swartz, a documentary about Swartz, the NSA, and SOPA. The film officially premiered at the January 2014 Sundance Film Festival. Democracy Now! covered the release of the documentary, as well as Swartz's life and legal case, in an extended interview with director Brian Knappenberger, Swartz's father and brother, and his attorney. The documentary was released under a Creative Commons license and debuted in theaters and on-demand in June 2014. Mashable called the documentary "a powerful homage to Aaron Swartz"; its debut at Sundance received a standing ovation. Mashable wrote, "With the help of experts, The Internet's Own Boy makes a clear argument: Swartz unjustly became a victim of the rights and freedoms for which he stood."
The Hollywood Reporter described it as a "heartbreaking" story of a "tech wunderkind persecuted by the US government", and a must-see "for anyone who knows enough to care about the way laws govern information transfer in the digital age".

Killswitch
In October 2014, Killswitch, a documentary film featuring Aaron Swartz, as well as Lawrence Lessig, Tim Wu, and Edward Snowden, received its world premiere at the Woodstock Film Festival, where it won the award for Best Editing. The film focuses on Swartz's role in advocating for internet freedoms. In February 2015, Killswitch was invited to screen at the Capitol Visitor Center in Washington, D.C. by Congressman Alan Grayson. The event was held on the eve of the Federal Communications Commission's historic decision on Net Neutrality. Congressman Grayson, Lawrence Lessig, and Free Press CEO Craig Aaron spoke about Swartz and his fight on behalf of a free and open Internet at the event. Congressman Grayson stated that Killswitch is "one of the most honest accounts of the battle to control the Internet – and access to information itself." Richard von Busack of the Metro Silicon Valley wrote of Killswitch, "Some of the most lapidary use of found footage this side of The Atomic Café". Fred Swegles of the Orange County Register remarked, "Anyone who values unfettered access to online information is apt to be captivated by Killswitch, a gripping and fast-paced documentary." Kathy Gill of GeekWire asserted that "Killswitch is much more than a dry recitation of technical history. Director Ali Akbarzadeh, producer Jeff Horn, and writer Chris Dollar created a human-centered story. A large part of that connection comes from Lessig and his relationship with Swartz."

Other films
Patriot of the Web is an independent biographical film about Aaron Swartz, written and directed by Darius Burke, with actor Shawn Mcclintock playing Swartz. The film had a limited video-on-demand release in December 2017 on Reelhouse and in January 2018 on Pivotshare, and was released on YouTube on September 15, 2019. Another biographical film about Swartz, Think Aaron, is being developed by HBO Films.

Works

Specifications
Markdown: Swartz was a major contributor to John Gruber's Markdown, a lightweight markup language for generating HTML, and was the author of its html2text translator. The syntax for Markdown was influenced by Swartz's earlier atx language (2002), which today is primarily remembered for its header syntax, known as atx-style headers, in which a heading is written as a line prefixed with one or more hash marks (for example, "# Title" for a top-level heading). Markdown itself remains in widespread use, with websites such as Reddit and GitHub using it.
RDF/XML at W3C: In 2001, Swartz joined the RDFCore working group at the World Wide Web Consortium (W3C), where he authored RFC 3870, Application/RDF+XML Media Type Registration. The document described a new media type, "RDF/XML", designed to support the Semantic Web.

Software
DeadDrop: In 2011–2012, Swartz, Kevin Poulsen, and James Dolan designed and implemented DeadDrop, a system that allows anonymous informants to send electronic documents without fear of disclosure. In May 2013, the first instance of the software was launched by The New Yorker under the name Strongbox. The Freedom of the Press Foundation has since taken over development of the software, which has been renamed SecureDrop.
Tor2web: In 2008, Swartz worked with Virgil Griffith to design and implement Tor2web, an HTTP proxy for Tor hidden services. The proxy was designed to provide easy access to Tor from a basic web browser.
The software is now maintained by Giovanni Pellerano within the GlobaLeaks project.

Notes
Swartz has been identified as a cofounder of Reddit, but the title is a source of controversy. With the merger of Infogami and Reddit, Swartz became a co-owner and director of parent company Not A Bug, Inc., along with Reddit cofounders Steve Huffman and Alexis Ohanian. Swartz has been referred to as "cofounder" in the press and by investor Paul Graham (who recommended the merger); Ohanian describes him as "co-owner".
The MIT network administration office told MIT police that "approximately 70 gigabytes of data had been downloaded, 98% of which was from JSTOR."
The first federal indictment alleged "approximately 4.8 million articles", "1.7 million" of which "were made available by independent publishers for purchase through JSTOR's Publisher Sales Service." The subsequent DOJ press release alleged "over four million articles". The superseding indictment removed the estimates and instead characterized the amount as "a major portion of the total archive in which JSTOR had invested."

See also
Alexandra Elbakyan
List of Wikipedia people
Sci-Hub
Shadow library

References

External links
Github.com/aaronsw (Aaron Swartz)
Remembrances (2013– ), with obituary and official statement from family and partner
The Internet's Own Boy: The Story of Aaron Swartz, The Documentary Network, June 29, 2014, a film by Brian Knappenberger – Luminant Media
The Aaron Swartz Collection at Internet Archive (2013– ) (podcasts, e-mail correspondence, other materials)
Posting about Swartz as Wikipedia contributor (2013), at The Wikipedian
Case Docket: US v. Swartz
Report to the President: MIT and the Prosecution of Aaron Swartz
JSTOR Evidence in United States vs. Aaron Swartz – a collection of documents and events from JSTOR's perspective, including hundreds of emails and other documents they provided the government concerning the case
Federal law enforcement documents about Aaron Swartz, released under the Freedom of Information Act

Further reading
Biography of Swartz.
Poulsen, Kevin. "MIT Moves to Intervene in Release of Aaron Swartz's Secret Service File." Wired. July 18, 2013.

Documentary
Brian Knappenberger (Producer and Director), The Internet's Own Boy: The Story of Aaron Swartz. Participant Media: 2014. Via The Internet Archive, www.archive.org/ Run time: 105 minutes.
Ali Akbarzadeh (Director), Killswitch: The Battle to Control the Internet. Akorn Entertainment: 2014.

1986 births 2013 deaths 2013 suicides 20th-century American businesspeople Activists from Illinois American activists American computer programmers Jewish American atheists 20th-century American Jews American technology writers American Wikimedians Articles containing video clips Businesspeople from New York City Businesspeople in information technology Copyright activists Internet activists Lake Forest College alumni North Shore Country Day School alumni Open access activists Open content activists People associated with computer security People charged with computer fraud People from Highland Park, Illinois Reddit people Stanford University alumni Suicides by hanging in New York (state) Wikipedia people Writers from Chicago 21st-century American Jews
62450375
https://en.wikipedia.org/wiki/Timothy%20M.%20Pinkston
Timothy M. Pinkston
Timothy M. Pinkston is an American computer engineer, researcher, educator, and administrator whose work is focused in the area of computer architecture. He holds the George Pfleger Chair in Electrical and Computer Engineering and is a Professor of Electrical and Computer Engineering at the University of Southern California (USC). He also serves in an administrative role as Vice Dean for Faculty Affairs at the USC Viterbi School of Engineering. Pinkston's computer architecture research focuses on the design of interconnection networks for many-core and multiprocessor computer systems. His research contributions span formal theory, methods, and techniques for abating interconnection network routing inefficiencies and preventing deadlock. He has contributed to the development of solutions to network deadlocking phenomena, including routing-induced, protocol (message)-induced, and reconfiguration-induced deadlocks. He has also developed energy-, resource-, and performance-efficient network-on-chip (NoC) designs. In 2009, Pinkston became an IEEE Fellow (Institute of Electrical and Electronics Engineers) "for contributions to design and analysis of interconnection networks and routing algorithms." In 2019, Pinkston became an ACM Fellow (Association for Computing Machinery) "for contributions to interconnection network routing algorithms and architectures, and leadership in expanding computing research." Pinkston is the first African American to become a tenured faculty member with a primary appointment in engineering, and the first African American to hold a decanal administrative faculty position in engineering, in USC's history.

Education
Pinkston earned a Bachelor of Science in Electrical Engineering in 1985 from Ohio State University. He went on to earn an M.S. in Electrical Engineering in 1986 and a Ph.D. in Electrical Engineering in 1993, both from Stanford University. The title of his Ph.D. thesis is The GLORI Strategy for Multiprocessors: Integrating Optics into the Interconnect Architecture.

Career
Before embarking on a professorial career in academia, Pinkston was a Member of Technical Staff at AT&T Bell Laboratories, a Research Intern at IBM T. J. Watson Research Laboratories, and a Hughes Doctoral Fellow and Research Staff member at Hughes Research Laboratories (HRL). In 1993, Pinkston joined the University of Southern California as an Assistant Professor and was promoted to Associate Professor in 1999 and to full Professor in 2003. From 2003 to 2005, he served as the Director of the Computer Engineering Division of Electrical Engineering-Systems at USC. In 2009, Pinkston was appointed Senior Associate Dean of Engineering of the USC Viterbi School of Engineering and, in 2011, became the Vice Dean for Faculty Affairs in the Viterbi School. In 2017, Pinkston was named holder of the Louise L. Dunn Endowed Professorship in Engineering, and in 2019, he was named holder of the George Pfleger Chair in Electrical and Computer Engineering. At USC, Pinkston founded the Superior Multiprocessor Architecture (SMART) Interconnects Group, which investigates high-performance communication architectures for parallel computer systems—interconnection networks, adaptive and reconfigurable routing algorithms, router design and implementation, and energy- and resource-efficient NoCs. Pinkston was the lead co-author of "Interconnection Networks", a chapter appearing as Appendix E in the 4th edition and as Appendix F in the 5th and 6th editions of the textbook Computer Architecture: A Quantitative Approach.
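To make the deadlock problem in this research area concrete, the sketch below illustrates dimension-order (XY) routing on a 2D mesh, a textbook deadlock-avoidance discipline: packets travel fully in the X dimension before turning into Y, which forbids Y-to-X turns and keeps the channel-dependency graph acyclic. This is a generic teaching example written in Python, not an implementation of Pinkston's own algorithms; the function name and coordinate scheme are invented for illustration.

```python
def xy_route(src, dst):
    """Dimension-order (XY) routing on a 2D mesh network-on-chip.

    Routing fully along X before turning to Y forbids Y-to-X turns,
    so no cyclic channel dependency (and hence no routing-induced
    deadlock) can form. Coordinates are (x, y) router positions;
    the function returns the hop-by-hop path.
    """
    x, y = src
    dx, dy = dst
    path = [(x, y)]
    while x != dx:                       # first resolve the X offset
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                       # then resolve the Y offset
        y += 1 if dy > y else -1
        path.append((x, y))
    return path

print(xy_route((0, 0), (2, 1)))
# [(0, 0), (1, 0), (2, 0), (2, 1)]
```

The price of this simplicity is lost path diversity; adaptive routing schemes recover that flexibility, which is where the deadlock-recovery and reconfiguration questions studied by Pinkston's group arise.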
He served as the founding Lead Program Director of the National Science Foundation's Expeditions in Computing program in 2007–2008. Before that, he served two years as NSF's CISE CCF Program Director for the Computer Systems Architecture area and co-established the Multicore Chip Design and Architecture (MCDA) program, co-funded by SRC. Pinkston served as an Associate Editor of IEEE Transactions on Parallel and Distributed Systems (TPDS) from 1999 to 2002, as a member of the Executive Committee of the IEEE Technical Committee on Computer Architecture (TCCA) from 2010 to 2015, and as a founding member of the SIGARCH/SIGMICRO Committee to Aid Reporting on Discrimination and Harassment Policy Violations (CARES) since 2018.

Research
In collaboration with his SMART group members, Pinkston conducted deadlock characterization studies that revealed how infrequently, and under what conditions, deadlocks can form and be resolved in interconnection networks, giving credence to deadlock recovery-based routing as a viable alternative to deadlock avoidance-based routing. He and his collaborators investigated deadlock-free routing techniques that improve understanding of various approaches to resolving potential deadlocks, including regressive, deflective, and progressive recovery routing algorithms and architectures. Pinkston, with his collaborators, developed a general theory for designing routing algorithms applicable to recovery-based as well as avoidance-based (preventative) approaches, and developed a theoretical framework and design methodology for deadlock-free dynamic reconfiguration of routing algorithms—to tolerate network faults, hot-swapping, and other changes in interconnectivity that can cause reconfiguration-induced deadlocks—with minimal packet loss, high throughput, and improved resiliency. Pinkston also led the development of design methodologies and router architectures for energy-, resource-, and performance-efficient on-chip networks (NoCs). With SMART group members, he was among the first to explore architectural support for effectively applying power-saving techniques, such as power gating, to NoCs to reduce static power consumption in computer systems.

Philanthropy
With an endowment gift from Pinkston, The Ohio State University has established the Pinkston Family Achievement Award Fund, which annually awards scholarships to students in the Lambda Psi minority engineering honorary who are performing at the highest academic levels, as well as to a Minority Engineering Program (MEP) student with the most-improved performance. It also supports Ohio State's Academic Coaching in Engineering (ACE) Program, which offers tutoring and study strategy instruction to MEP students in OSU's College of Engineering.

Awards and honors
1984 - GEM Fellowship Award
1989 - Hughes Doctoral Fellowship Award
1994 - NSF Minority Research Initiation Award
1996 - NSF CAREER Award
2003 - ACM Recognition of Service Award
2005 - Distinguished Alumnus Award from the College of Engineering and the Minority Engineering Program (MEP), The Ohio State University
2009 - Fellow, IEEE
2018 - ACM Recognition of Service Award
2018 - IEEE Computer Society Recognition of Service Award
2019 - Fellow, ACM

Selected publications
"An efficient, fully adaptive deadlock recovery scheme: DISHA," K. V. Anjan and T. M. Pinkston, in Proceedings of the 22nd ACM/IEEE Annual International Symposium on Computer Architecture (ISCA), pp. 201–210, 1995.
"On deadlocks in interconnection networks," S. Warnakulasuriya and T. M. Pinkston, in Proceedings of the 24th ACM/IEEE Annual International Symposium on Computer Architecture (ISCA), pp. 38–49, 1997.
"A general theory for deadlock-free adaptive routing using a mixed set of resources," J. Duato and T. M. Pinkston, IEEE Transactions on Parallel and Distributed Systems, 12(12), pp. 1219–1235, 2001.
"A methodology for designing efficient on-chip interconnects on well-behaved communication patterns," W. H. Ho and T. M. Pinkston, in Proceedings of the 9th IEEE International Symposium on High-Performance Computer Architecture (HPCA), pp. 377–388, 2003.
"A Progressive Approach to Handling Message-Dependent Deadlocks in Parallel Computer Systems," Y. H. Song and T. M. Pinkston, in IEEE Transactions on Parallel and Distributed Systems, 14(3), pp. 259–275, 2003.
"Deadlock-free Dynamic Reconfiguration Schemes for Increased Network Dependability," T. M. Pinkston, R. Pang, and J. Duato, in IEEE Transactions on Parallel and Distributed Systems, 14(8), pp. 780–794, 2003.
"A Theory for Deadlock-free Dynamic Reconfiguration of Interconnection Networks: Part I," J. Duato, O. Lysne, R. Pang, and T. M. Pinkston, in IEEE Transactions on Parallel and Distributed Systems, 16(5), pp. 412–427, 2005.
"Characterizing the Cell EIB on-chip network," T. W. Ainsworth and T. M. Pinkston, in IEEE Micro, Special Issue on On-Chip Interconnects for Multicores, IEEE Computer Society, 27(5), pp. 6–14, 2007.
"A Lightweight Fault-Tolerant Mechanism for Network-on-Chip," M. Koibuchi, H. Matsutani, H. Amano, and T. M. Pinkston, in Proceedings of the 2nd ACM/IEEE International Symposium on Networks-on-Chip (NOCS), pp. 13–22, 2008.
"Critical Bubble Scheme: An efficient implementation of globally-aware network flow control," L. Chen, R. Wang, and T. M. Pinkston, in Proceedings of the 25th IEEE International Parallel and Distributed Processing Symposium (IPDPS), pp. 592–603, 2011.
"NoRD: Node-Router Decoupling for effective power-gating of on-chip routers," L. Chen and T. M. Pinkston, in Proceedings of the 45th Annual ACM/IEEE International Symposium on Microarchitecture (MICRO), pp. 270–281, 2012.
"Interconnection Networks," T. M. Pinkston and J. Duato, in Computer Architecture: A Quantitative Approach, by John L. Hennessy and David A. Patterson, Elsevier Publishers; Appendix E, pp. 1–114 in the 4th edition, September 2006; Appendix F, pp. 1–117 in the 5th edition, September 2011; and Appendix F, pp. 1–117 in the 6th edition, September 2017.

References

External links
Pinkston's page at University of Southern California
Pinkston on Google Scholar

Living people American computer scientists University of Southern California faculty Ohio State University College of Engineering alumni Stanford University School of Engineering alumni 1964 births
20159106
https://en.wikipedia.org/wiki/The%20Work%20Number
The Work Number
The Work Number is a user-paid verification-of-employment database created by TALX Corporation. TALX was acquired by Equifax Inc. in February 2007. Employers can purchase data on a prospective employee, including confirmation of an individual's employment records and income for verification purposes. The fee for this information is revealed only after the requester answers several personal questions. The service is used by over 50,000 organizations to verify employment data. Organizations that use The Work Number include Fannie Mae, Hilton Hotels, Rent-A-Center, the United States Postal Service, Domino's Pizza, the University of Pennsylvania, and the University of Missouri System. Founded in 1995, The Work Number holds over 225 million employment records. The Work Number is an example of the outsourcing of a Human Resources department function.

Data collected
The Work Number collects and archives weekly salary information. It also collects length of employment, job titles, "location information", and "other kinds of human resources-related information, such as health care provider, whether someone has dental insurance and if they've ever filed an unemployment claim."

Proponents

Time reducing
The service reduces the amount of time required for Human Resource departments to respond to employment verification requests.

Access control
Employees of a company or organization using The Work Number's services receive an account that is set up for them on the website. Current and, presumably, former employees can log on to The Work Number at any time. Employees cannot control access to their records by any entity or person who knows their Social Security number. If an employee wishes for a requestor to see his or her salary history, the employee logs on and obtains a 6-digit code, which he or she passes on to the requestor. Without that particular 6-digit code, the requestor cannot view the employee's salary history. Equifax sells data to third parties. Companies including "mortgage, auto and other financial services credit grantors" may request pay rate information similar to a credit report, and "debt/collection agencies may request employment information" to verify someone's place of employment.

Nature of records
The system reports employment data, such as length of employment and job title. Proponents argue this can reduce the risk of legal liability over the subjective content of personal references. The Work Number report does not include performance reviews.

Additional HR services
The Work Number, if set up for this service by the employer, may provide duplicate copies of W-2 forms through the employee's online portal.

Fees and Waivers
The Work Number generally charges for verification data. Fees are waived for federal, state, or county social service departments. The reports are sent by fax and may take a few days to arrive. Certain expedited or advanced services may have fees attached. To qualify for reduced or waived fees, the agency must register with The Work Number using an official fax number. Agencies that can take advantage of this service include eligibility programs, public housing, child support enforcement, and other public assistance agencies. A "batch service" for multiple requests is also available.

Criticisms

2013 sale of sensitive personal information
In January 2013, The Work Number was criticized for selling access to people's ostensibly private data, especially salary data, to third parties without the informed consent of the subject.
Organizations affected included Columbia University, and third parties included debt collection companies. The director of policy and advocacy at the Privacy Rights Clearinghouse stated, "I think [this] is something that would be offensive to many people. One typically considers salary information to be shared by your employer just with IRS."

2017 exposure of Americans' salary data
On October 8, 2017, Brian Krebs reported that The Work Number exposed the salary histories of employees of tens of thousands of U.S. companies to anyone in possession of an employee's Social Security number and date of birth. For roughly half the U.S. population, both of the latter pieces of data are known to be in the possession of criminals, following Equifax's May–July 2017 security breach.

Identity theft concerns
Breaking into sites that hold "sensitive" information has become a big business for hackers. This raises questions about whether services such as The Work Number have been, or will become, targets, and makes them an ongoing concern for internet security officials.

Profit motive
The Work Number charges a fee to the requesting party for each employment verification. Requestors can choose a "pay per use" plan or can select a package which includes a certain number of verifications per month. Because a fee (up to $45.95 per verification) is required, this increases the financial cost of verifying an individual's employment, and there is a risk this cost could be passed on to the applicant.

Mandatory usage policies
Some organizations make use of The Work Number mandatory, as the only way for employees or verifiers to receive information about a staffer's employment. This limits the availability of personalized, subjective, or qualitative references. Critics argue that the verification system can be reductive of a prospective employee's potential.

Inaccurate or out-of-date information
Consumers have notified the Privacy Rights Clearinghouse that "the data in its database is inaccurate" and that "when they try to use the information for employment verification, their titles are outdated or otherwise misrepresent their work history."

See also
Equifax
Human resources
Outsourcing
TALX

References

External links
Database of companies that use The Work Number

Economic databases Business process outsourcing companies
47825616
https://en.wikipedia.org/wiki/J.D.%20Kleinke
J.D. Kleinke
J.D. Kleinke (born 1962) is an author and health care industry leader. He has written extensively about the economics, politics, and culture of the US health care system, including two books on the subject, Bleeding Edge (1998) and Oxymorons (2001), and the medical novel Catching Babies (2011), which is currently in development as a TV series. He has also published two novels about the relationship between landscape, adventure sports, and spirituality in the American West, Dudeville (2017) and That Golden Shore (2021). His work has appeared in The New York Times, The Wall Street Journal, Barron's, Forbes, Freeskiier, The Surfers Journal, Acoustic Guitar, Health Affairs, JAMA, and other publications. During his health care career, he has been involved in the formation, management, and governance of numerous health care information organizations, including Health Grades, Truven Health Analytics, RIMS/Trizetto, Omnimedix Institute, Mount Tabor, and Context Matters. He remains active as a mentor to health care technology start-ups and growth companies, including Wildflower Health and Omada Health. Since the early 1990s, Kleinke has been outspoken on the real impact of changes to the health care system (e.g., managed care, Obamacare, physician payment incentives, computerization), and on the effects of such changes on patient care, clinician professional development, health care organization strategy, and public health. He has served as a Resident Scholar of the American Enterprise Institute, a member of the Editorial Board of Health Affairs, and a frequent contributor to The Wall Street Journal and The Huffington Post.

Early life and education
Kleinke attended the University of Maryland, where he graduated with a Bachelor of Science in 1989. He later attended the Carey Business School at Johns Hopkins University, where he graduated with an M.S.B. in 1997.

Career

Writing and advocacy
Kleinke's earliest written work involved critical economic analysis of the first generation of managed care. His first book, Bleeding Edge: The Business of Health Care in the New Century, published in 1998, was a harsh critique of the economic conflict generated by the imposition of untested managed care methods on what was at the time an antiquated, fragmented health care financing and delivery system. His second book, Oxymorons: The Myth of a US Health Care System, published in 2001, described in detail a health care system rebuilt around consumer choice, increasing patient cost-sharing, mandated coverage, and exchange-based health plan selection–the cornerstones of what would become the Affordable Care Act, or "Obamacare," in 2010. Kleinke's third book, Catching Babies, published in 2011, is a medical novel about the training of obstetrician-gynecologists and the culture of childbirth in the US. From the early 1990s, Kleinke was one of the earliest advocates for the measurement of health care quality, the quantification-based accountability of health care providers, and the computerization of American medicine. He published several journal-length articles on these interrelated subjects in the peer-reviewed policy journal Health Affairs from the mid-1990s through 2005.
This work–in particular "Dot-Gov: Market Failure and the Creation of a National Health Information Technology System"–has been widely cited by subsequent researchers and health information technology advocates. Kleinke's work on these subjects was used by policymakers to formulate legislation mandating and funding the adoption of electronic medical records by US health care providers, culminating with the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009. Kleinke was an early supporter of the Affordable Care Act, President Barack Obama's health reform law, based on his public analyses of the principles of market economics, consumer choice, and insurer competition built into the law's structure–ideas he argued were denied or overlooked by the President's political opponents. As a Resident Fellow at the conservative think tank the American Enterprise Institute, Kleinke was the first to publish in the national media on the conservative origins of the health care law. Kleinke's article on these origins in The New York Times, "The Conservative Case for Obamacare"–published five weeks before the re-election of President Obama–generated significant controversy, and his departure from AEI followed three months later. Kleinke's health care writings have appeared in The Wall Street Journal, The New York Times, Barron's, Health Affairs, JAMA, the British Medical Journal, Modern Healthcare, Managed Healthcare, and Forbes.

Business career
Kleinke's health care writing and advocacy were informed by three decades of working at the forefront of the health information industry. This industry was in its infancy in the early 1990s when Kleinke and fellow students, alumni, and faculty mentors from The Johns Hopkins University established HCIA, the company that would become Solucient and is now known as Truven Health Analytics. While a graduate student at Hopkins, Kleinke served as Director of Corporate Programs at Sheppard Pratt Health Systems in Towson, Maryland, at the time the largest private psychiatric hospital in the US. While at Sheppard Pratt, Kleinke developed and managed the nation's first provider-based, managed mental health care system. In that early management role, Kleinke recognized the importance of data, information, and analysis to the future of the health care system, at a time when nearly every element of the system was paper-based, archaically organized, and grossly inefficient. Kleinke and his colleagues at HCIA and other start-ups in the early 1990s pioneered the emerging field of "health care informatics", the application of quantitative analytical techniques to large sets of data on medical care. (This movement was part of what would come to be known across all industries as "Big Data".) Kleinke helped develop HCIA from a niche health care data analysis firm into a major provider of sophisticated information products and analytical services to health care systems, managed care organizations, and pharmaceutical companies across the U.S. and Europe. Today, its successor organization, Truven Health Analytics, is a health care information company with revenues estimated at more than $500 million per year. After five years and a successful IPO, Kleinke left HCIA in 1998 and joined the boards of other health care information companies, most notably Health Grades, which democratized the use of health care informatics by applying its methods to consumer health care provider selection.
Kleinke served on Health Grades' Board of Directors until 2008, including several years as Executive Vice Chairman, during the period when it too was a publicly traded company. In 2004, Kleinke founded and led the Omnimedix Institute, a 501(c)(3) charitable organization that created, built, and promoted information technologies for giving patients and their families safe and secure access to, and control over, their own medical data. With funding from private foundations, public corporations, and the federal government, Omnimedix developed and deployed open and ubiquitous health information technologies designed to increase patient access to the health care system, to data on their own medical care, and to information about the best care available for their medical condition. In 2007, Kleinke co-founded and led Mount Tabor, a health care information technology development company established to plan, design, build, test, and launch systems for the transformation and movement of electronic medical information. Mount Tabor provided business strategy and technology integration services for health care companies, technology providers, and government entities creating health information products, systems, and services. Mount Tabor helped build, launch, and test the Google Health personal health information platform, and supported the implementation of the Microsoft HealthVault personal health information platform. Kleinke currently serves as an investor in and mentor to Wildflower Health, a provider of mobile apps that help patients and families manage pregnancies and early childhood health on behalf of commercial and Medicaid health plans, state Medicaid programs, and employers; and Context Matters, a provider of drug reimbursement and clinical information products, and related drug data management services, to pharmaceutical companies and financial institutions. He also serves on the Board of Primary Care Progress, a non-profit organization that provides management education and leadership training services to US physicians.

Music performances and writings
Kleinke is a musician and plays guitar, mandolin, bass, and banjo in folk, bluegrass, and sacred music settings. He is the founder and leader of Nashir Neshama, a Jewish kirtan group based in Portland, and provides musical accompaniment for kirtan events and yoga classes. In the late 1980s, Kleinke wrote musician profiles and album reviews for Acoustic Guitar, Bluegrass Unlimited, Frets, Banjo Newsletter, and the Old Time Herald.

Bibliography

Select publications
2013, Combat Medicine's 'Golden Hour' in Iraq, One Decade Later, The Huffington Post
2012, The Conservative Case for Obamacare, The New York Times
2012, The Myth of Runaway Health Spending, The Wall Street Journal

References

External links
J.D. Kleinke official website

1962 births Writers from Albany, New York American non-fiction writers Johns Hopkins Carey Business School alumni University of Maryland, College Park alumni Living people
298763
https://en.wikipedia.org/wiki/Richard%20M.%20Karp
Richard M. Karp
Richard Manning Karp (born January 3, 1935) is an American computer scientist and computational theorist at the University of California, Berkeley. He is most notable for his research in the theory of algorithms, for which he received the Turing Award in 1985, the Benjamin Franklin Medal in Computer and Cognitive Science in 2004, and the Kyoto Prize in 2008. Karp was elected a member of the National Academy of Engineering (1992) for major contributions to the theory and application of NP-completeness, constructing efficient combinatorial algorithms, and applying probabilistic methods in computer science.

Biography
Born to parents Abraham and Rose Karp in Boston, Massachusetts, Karp has three younger siblings: Robert, David, and Carolyn. His family was Jewish, and he grew up in a small apartment in a then mostly Jewish neighborhood of Dorchester in Boston. Both his parents were Harvard graduates (his mother eventually obtained her Harvard degree at age 57 after taking evening courses), while his father had ambitions to go to medical school after Harvard but became a mathematics teacher, as he could not afford the medical school fees. Karp attended Harvard University, where he received his bachelor's degree in 1955, his master's degree in 1956, and his Ph.D. in applied mathematics in 1959. He then started working at IBM's Thomas J. Watson Research Center. In 1968, he became Professor of Computer Science, Mathematics, and Operations Research at the University of California, Berkeley. Karp was the first associate chair of the Computer Science Division within the Department of Electrical Engineering and Computer Science. Apart from a four-year period as a professor at the University of Washington, he has remained at Berkeley. From 1988 to 1995, and again from 1999 to the present, he has also been a Research Scientist at the International Computer Science Institute in Berkeley, where he currently leads the Algorithms Group. Karp was awarded the National Medal of Science, and was the recipient of the Harvey Prize of the Technion and the 2004 Benjamin Franklin Medal in Computer and Cognitive Science for his insights into computational complexity. In 1994 he was inducted as a Fellow of the Association for Computing Machinery, and he was elected to the 2002 class of Fellows of the Institute for Operations Research and the Management Sciences. He is the recipient of several honorary degrees and a member of the U.S. National Academy of Sciences, the American Academy of Arts and Sciences, and the American Philosophical Society. In 2012, Karp became the founding director of the Simons Institute for the Theory of Computing at the University of California, Berkeley.

Work
Karp has made many important discoveries in computer science, combinatorial algorithms, and operations research; his major current research interests include bioinformatics. In 1971 he co-developed with Jack Edmonds the Edmonds–Karp algorithm for solving the maximum flow problem on networks, and in 1972 he published a landmark paper in complexity theory, "Reducibility Among Combinatorial Problems", in which he proved 21 problems to be NP-complete. In 1973 he and John Hopcroft published the Hopcroft–Karp algorithm, the fastest known method for finding maximum cardinality matchings in bipartite graphs. In 1980, along with Richard J. Lipton, Karp proved the Karp–Lipton theorem, which states that if SAT can be solved by Boolean circuits with a polynomial number of logic gates, then the polynomial hierarchy collapses to its second level.
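The Edmonds–Karp algorithm mentioned above refines the Ford–Fulkerson method by always augmenting along a shortest path, found with breadth-first search, which bounds the number of augmentations by O(VE). A minimal Python sketch follows; the dict-of-dicts graph representation and the example network are illustrative choices, not drawn from any particular source.

```python
from collections import deque

def edmonds_karp(capacity, source, sink):
    """Maximum flow via shortest augmenting paths (Edmonds-Karp).

    `capacity` maps u -> {v: residual capacity of edge u->v};
    it is modified in place to hold the final residual graph.
    """
    # Make sure every edge has a reverse edge in the residual graph.
    for u in list(capacity):
        for v in list(capacity[u]):
            capacity.setdefault(v, {}).setdefault(u, 0)

    max_flow = 0
    while True:
        # BFS finds the shortest augmenting path from source to sink.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in capacity[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            break  # no augmenting path left: the flow is maximal

        # Compute the bottleneck capacity along the path ...
        bottleneck = float('inf')
        v = sink
        while parent[v] is not None:
            bottleneck = min(bottleneck, capacity[parent[v]][v])
            v = parent[v]
        # ... then push that much flow, updating residual capacities.
        v = sink
        while parent[v] is not None:
            u = parent[v]
            capacity[u][v] -= bottleneck
            capacity[v][u] += bottleneck
            v = u
        max_flow += bottleneck
    return max_flow

# Example: a small network with maximum s-t flow of 5.
graph = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}}
print(edmonds_karp(graph, 's', 't'))  # -> 5
```

Choosing the shortest path at every step is what distinguishes Edmonds–Karp from generic Ford–Fulkerson: it guarantees termination in polynomial time even with irrational or adversarial capacities.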
In 1987 he co-developed with Michael O. Rabin the Rabin–Karp string search algorithm. Turing Award His citation for the (1985) Turing Award was as follows: References External links ACM Crossroads magazine interview/bio of Richard Karp Karp's Home Page at Berkeley Biography of Richard Karp from the Institute for Operations Research and the Management Sciences American computer scientists American operations researchers 1935 births Living people Theoretical computer scientists Fellows of the Association for Computing Machinery Fellows of the Society for Industrial and Applied Mathematics Fellows of the Institute for Operations Research and the Management Sciences Kyoto laureates in Advanced Technology Members of the United States National Academy of Engineering Members of the United States National Academy of Sciences John von Neumann Theory Prize winners Jewish scientists Jewish American scientists National Medal of Science laureates Turing Award laureates UC Berkeley College of Engineering faculty Members of the French Academy of Sciences People from Boston 20th-century American engineers 21st-century American engineers 20th-century American mathematicians 21st-century American mathematicians 20th-century American scientists 21st-century American scientists Harvard School of Engineering and Applied Sciences alumni Members of the American Philosophical Society
54022970
https://en.wikipedia.org/wiki/PyTorch
PyTorch
PyTorch is an open source machine learning framework based on the Torch library, used for applications such as computer vision and natural language processing, primarily developed by Facebook's AI Research lab (FAIR). It is free and open-source software released under the Modified BSD license. Although the Python interface is more polished and the primary focus of development, PyTorch also has a C++ interface. A number of pieces of deep learning software are built on top of PyTorch, including Tesla Autopilot, Uber's Pyro, Hugging Face's Transformers, PyTorch Lightning, and Catalyst. PyTorch provides two high-level features: tensor computing (like NumPy) with strong acceleration via graphics processing units (GPUs), and deep neural networks built on a tape-based automatic differentiation system. History Facebook operated both PyTorch and Convolutional Architecture for Fast Feature Embedding (Caffe2), but the models defined by the two frameworks were mutually incompatible. The Open Neural Network Exchange (ONNX) project was created by Facebook and Microsoft in September 2017 for converting models between frameworks. Caffe2 was merged into PyTorch at the end of March 2018. PyTorch tensors PyTorch defines a class called Tensor (torch.Tensor) to store and operate on homogeneous multidimensional rectangular arrays of numbers. PyTorch Tensors are similar to NumPy arrays, but they can also be stored and operated on by a CUDA-capable Nvidia GPU. PyTorch supports various sub-types of Tensors. Note that the term "tensor" here does not carry the same meaning as in mathematics or physics; the meaning of the word in those areas is only tangentially related to the one in machine learning. In mathematics, a tensor is a certain kind of object in linear algebra, while in physics the term "tensor" usually refers to what mathematicians call a tensor field. Modules Autograd module PyTorch uses a method called automatic differentiation. A recorder records which operations have been performed, and then replays them backward to compute the gradients. This method is especially powerful when building neural networks, as it saves time on each epoch by calculating the differentiation of the parameters during the forward pass. Optim module torch.optim is a module that implements various optimization algorithms used for building neural networks. Most of the commonly used methods are already supported, so there is no need to build them from scratch. nn module PyTorch autograd makes it easy to define computational graphs and take gradients, but raw autograd can be a bit too low-level for defining complex neural networks. This is where the nn module can help.
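As an illustration of how autograd, torch.optim, and torch.nn fit together, here is a minimal training-step sketch; the layer sizes, learning rate, and random data are arbitrary placeholders chosen for illustration, not values from the PyTorch documentation:

import torch
import torch.nn as nn
import torch.optim as optim

# A small fully connected network assembled from nn building blocks
model = nn.Sequential(
    nn.Linear(3, 8),
    nn.ReLU(),
    nn.Linear(8, 1),
)
loss_fn = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)  # optim supplies the update rule

x = torch.randn(16, 3)  # a batch of 16 random input vectors
y = torch.randn(16, 1)  # matching random targets

optimizer.zero_grad()        # clear gradients from any previous step
loss = loss_fn(model(x), y)  # forward pass; autograd records the operations
loss.backward()              # replay the recorded operations backward for gradients
optimizer.step()             # apply the gradient update to the parameters
print(loss.item())           # scalar loss value for this step

Because gradients accumulate by default, zero_grad() is called before each backward pass.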
Example The following program shows the functionality of the library with a simple example:

import torch
dtype = torch.float
device = torch.device("cpu")  # This executes all calculations on the CPU
# device = torch.device("cuda:0")  # This executes all calculations on the GPU

# Creation of a tensor and filling of a tensor with random numbers
a = torch.randn(2, 3, device=device, dtype=dtype)
print(a)  # Output of tensor A
# Output: tensor([[-1.1884,  0.8498, -1.7129],
#                 [-0.8816,  0.1944,  0.5847]])

# Creation of a tensor and filling of a tensor with random numbers
b = torch.randn(2, 3, device=device, dtype=dtype)
print(b)  # Output of tensor B
# Output: tensor([[ 0.7178, -0.8453, -1.3403],
#                 [ 1.3262,  1.1512, -1.7070]])

print(a*b)  # Output of an element-wise multiplication of the two tensors
# Output: tensor([[-0.8530, -0.7183,  2.2958],
#                 [-1.1692,  0.2238, -0.9981]])

print(a.sum())  # Output of the sum of all elements in tensor A
# Output: tensor(-2.1540)

print(a[1,2])  # Output of the element in the third column of the second row
# Output: tensor(0.5847)

print(a.min())  # Output of the minimum value in tensor A
# Output: tensor(-1.7129)

See also Comparison of deep learning software Differentiable programming DeepSpeed Torch (machine learning) References External links Applied machine learning Data mining and machine learning software Deep learning Facebook software Free science software Free software programmed in C Free software programmed in Python Open-source artificial intelligence Python (programming language) scientific libraries Software using the BSD license
58379044
https://en.wikipedia.org/wiki/Vanity%20Fair%20%282018%20TV%20series%29
Vanity Fair (2018 TV series)
Vanity Fair is a 2018 historical drama miniseries based on the 1848 novel of the same name by William Makepeace Thackeray. It was produced by Mammoth Screen and distributed by ITV and Amazon Studios. The series stars Olivia Cooke as Becky Sharp, Tom Bateman as Captain Rawdon Crawley, and Michael Palin as the author William Makepeace Thackeray. Cast Main Olivia Cooke as Becky Sharp, the daughter of a French opera singer and an artist father. Sharp is a cynical social climber who uses her charms to fascinate and seduce upper-class men. Claudia Jessie as Amelia Sedley, a good-natured, naive young girl of a wealthy London family, who is Becky's friend from Miss Pinkerton's academy and invites Becky to stay in her London home following their graduation from the academy. Tom Bateman as Rawdon Crawley, an empty-headed cavalry officer, younger of the two Crawley sons and favourite of their Aunt Matilda, until he marries Sharp, a woman of a far lower class. Johnny Flynn as William Dobbin, colonel of the City Light Horse regiment and the best friend of George Osborne, who feels unrequited love for Amelia. Charlie Rowe as George Osborne, son of merchant John Osborne and childhood sweetheart, later husband, of Amelia, who defies his father to marry his love. Simon Russell Beale as Mr. John Sedley, Amelia and Jos's father and Louisa's husband, who goes bankrupt. Anthony Head as Lord Steyne, a rich and powerful marquis who is attracted to Becky. Martin Clunes as Sir Pitt Crawley, a crude and profligate baronet who hired Becky as governess to his daughters before seeking to marry her, and then discovering she has become secretly engaged to his second son, Rawdon. Frances de la Tour as Lady Matilda Crawley, the wealthy aunt of the Crawley sons. Michael Palin as William Makepeace Thackeray, the author of Vanity Fair and narrator of the series. Robert Pugh as Mr. John Osborne, George's father, who forbids him from marrying Amelia. Suranne Jones as Miss Pinkerton, snobbish and cold-hearted headmistress of the academy which Amelia and Becky used to attend. David Fynn as Jos Sedley, Collector in India and Amelia's brother, who has an initial attraction to Becky. Claire Skinner as Mrs. Louisa Sedley, Amelia and Jos's mother and John's wife. Mathew Baynton as Bute Crawley, Rawdon's Christian brother. Sian Clifford as Martha Crawley, Bute's spouse. Felicity Montagu as Arabella Briggs, servant to Lady Matilda, and later Becky. Monica Dolan as Mrs. Peggy O'Dowd Recurring Ellie Kendrick as Jane Osborne Elizabeth Berrington as Lady Bareacres Sally Phillips as Lady Steyne Richard Dixon as General Tuffo Peter Wight as Mr Raggles Patrick FitzSymons as Major Michael O'Dowd Episodes Production A cottage on the Chevening House Estate, Sevenoaks, in Kent featured as Rawdon Crawley's cottage. Squerryes Court, Sevenoaks, was used for the interiors of Miss Pinkerton's school. A scene on the promenade, featuring soldiers and horses, was shot outside the Royal Hotel in Deal, Kent. Further filming took place at Chatham Historic Dockyard, where various London street scenes were shot outside the Ropery, an embarkation to France was shot on Anchor Wharf, and the interior of the Commissioner's House was also used. Critical reception The series was met with a positive response from critics for its sets and Olivia Cooke's performance. On the review aggregation website Rotten Tomatoes, the series holds an 88% approval rating, with an average rating of 7.08 out of 10, based on 33 reviews. The website's critical consensus reads, "Olivia Cooke's brilliant portrayal of the feisty and scheming Becky Sharp in Vanity Fair makes this adaptation of Thackeray's classic novel more relatable for a 21st century audience." On Metacritic, the series has a weighted average score of 66 out of 100, based on 7 critics, indicating "generally favorable reviews". Writing after the conclusion of the series about its significantly lower viewing figures in comparison to the BBC One "ratings juggernaut" Bodyguard, Ben Dowell of the Radio Times praised Cooke's performance, writing that "of all the TV Beckys down the ages – Joyce Redman, Susan Hampshire, Eve Matheson, Natasha Little, not to mention Reese Witherspoon in the 2004 film – Cooke is definitely one of the best we’ve ever had." Newsday's Verne Gay was more critical of the show, calling it both "faithful and faithless" to the book and concluding that the series "can occasionally feel like a homework assignment." Matthew Gilbert, writing for The Boston Globe, was more positive, stating that "If you’re a fan of these adaptations...I think you’ll find something pleasing in this “Vanity Fair” — not heroes and heroines stirring about waiting for their happy endings, of course, but something far more scandalous and universal." References External links 2018 British television series debuts 2018 British television series endings 2010s British drama television series 2010s British television miniseries Television series set in the 19th century Films based on Vanity Fair (novel) ITV television dramas Amazon Studios films Television series by ITV Studios Television series by Mammoth Screen English-language television shows
46916778
https://en.wikipedia.org/wiki/Evercam
Evercam
Evercam is a free, open-source, closed-circuit television software application designed to be run as SaaS. Evercam originated in 2010 as a proprietary VSaaS application developed by the Irish company Camba.tv Ltd, which published the source code to the public under the Affero General Public License (AGPL) in April 2015. Uses of the software include security, supply chain monitoring, time-lapse photography, and enterprise resource planning integrations. Evercam earns revenue by providing the software as a service, a business model that is increasingly common amongst open-source companies. Feature List Over 3,000 public domain cameras Web API described with Swagger See also Closed-circuit television (CCTV) Closed-circuit television camera IP camera Motion Open Source Surveillance Software Zoneminder Open Source Surveillance Software References External links Evercam official website API Evangelist review Programmable Web Review Techcrunch discussion Irish Independent Commentary Surveillance Software using the GNU AGPL license
64787379
https://en.wikipedia.org/wiki/Lady%20Mary%20Pelham%20%281811%20ship%29
Lady Mary Pelham (1811 ship)
Lady Mary Pelham was launched in 1811 as a packet based in Falmouth, Cornwall, for the Post Office Packet Service. She repelled attacks by privateers in 1812 and 1813, the latter being a notable and controversial engagement with an American privateer. Another American privateer captured her in February 1815 in the West Indies. New owners retained her name, and between 1815 and at least 1824 she continued to sail to the Continent and South America. Packet Lloyd's Register (LR) started carrying the Falmouth packets in 1812, and that is when Lady Mary Pelham first appeared in it. James A. Stevens was appointed captain on 4 March 1811. On 14 October 1812 Lady Mary Pelham Packet repelled an attack off Cape Pallas by a privateer of 14 guns and 75 men. The privateer had earlier captured a vessel of 10 guns returning to Gibraltar from Cagliari. Lady Mary Pelham arrived at Falmouth on 5 December, having sailed from Malta on 7 November and Gibraltar on the 29th. On 2 November 1813 Lady Mary Pelham, acting commander Perring (or Pering), and Montague, John A. Norway, master, encountered the American privateer Globe, Captain Richard Moon, off Teneriffe. During the engagement, Captain Norway, the surgeon, and two seamen were killed on Montague; 11 seamen were wounded. Lady Mary Pelham had two men wounded, one of them being Perring. There were conflicting accounts of the engagement, one denigrating Perring as a lawyer whose sole experience had been sailing a yacht, and dismissing Lady Mary Pelham's contribution to the engagement as too little, too late. The matter came up in Parliament, where documents were tabled showing that a second court of inquiry had exonerated Captain Perring and acknowledged that Lady Mary Pelham's intervention had saved Montague from capture and had eventually succeeded in driving Globe off. After the engagement Globe put into Grand Canary in a highly damaged state. She had had 33 men killed, 19 wounded, and five captured in attempts to board Montague. Captain James Graham assumed command of Lady Mary Pelham on 21 June 1814. Capture Captain Graham sailed from Falmouth on 20 November 1814 and arrived at Suriname in January 1815. Lady Mary Pelham sailed from Suriname to Barbados, and then to Antigua, leaving Antigua on 1 February. The Baltimore privateer Kemp, Jose Joaquim Almeida, master, captured Lady Mary Pelham on 9 February 1815. Graham and seven of his men had been wounded, and two men killed, before she struck. Kemp had one man killed and three wounded in the 40-minute action. Kemp was armed with six guns and had a crew of 135 men; Lady Mary Pelham was armed with 10 guns and had a complement of 42 men, including five passengers. Almeida sold his crew $632.75 worth of clothes taken from Lady Mary Pelham. The sum then became part of the prize account. Kemp sent her into Wilmington, North Carolina, where she was libeled on 31 March 1815 and condemned. American merchantman Lady Mary Pelham was sold in Wilmington, with new owners retaining the name. Lady Mary Pelham, Sanders, master, a packet brig from Wilmington, North Carolina, discharged at Gibraltar on 5 June. Captain Sanders also sailed her between the US and Buenos Aires. In 1818, with Gillander, master, she was reported to have come into New York from Havana. On 14 April 1818 the French frigate Néréide ran down and sank Noma (Numa), of Baltimore, returning there in ballast from Amsterdam. The master and the crew were taken aboard the frigate, which took them to Bordeaux. Lady Mary Pelham, Schouyler, brought the mate and steward into New York from Bordeaux. Néréide's commander was capitaine de vaisseau Boutouillic de La Villegonan, and the incident occurred above the Azores. Néréide had been sailing from Martinique to Brest, France, via Guadeloupe. On 14 June two armed vessels flying Spanish colours, believed to be from Havana on their way to Corunna, fired on Lady Mary Pelham. Captain Schouyler, believing that the only way to account for such behavior was that war had been declared between Spain and America, struck. He went aboard one of the vessels and the other sent an officer aboard Lady Mary Pelham. After it was established that no state of war existed, the Spaniards released her. She arrived at New York on 22 July. On 24 January 1824, Lady Mary Pelham, of New York, Langdon, master, put into Charleston. She was 23 days out of Campeche, on her way to Gibraltar. She resumed her voyage. Later that year she was reported to be at Buenos Aires. That is the last mention of her in the press. Notes, citations, and references Notes Citations References 1811 ships Age of Sail merchant ships of England Falmouth Packets Captured ships Age of Sail merchant ships of the United States
15202473
https://en.wikipedia.org/wiki/Julia%20Misbehaves
Julia Misbehaves
Julia Misbehaves is a 1948 American romantic comedy film starring Greer Garson and Walter Pidgeon as a married couple who are separated by the man's snobbish family. They meet again many years later, when the daughter whom the man has raised, played by Elizabeth Taylor, invites her mother to her wedding. The film also features Peter Lawford and Cesar Romero. This adaptation of Margery Sharp's 1937 novel The Nutmeg Tree, which was also the basis of the 1940 Broadway play Lady in Waiting, was director Jack Conway's final film. Plot In 1936 London, mature showgirl Julia Packett leads a precarious life. She pretends to be contemplating suicide in order to finagle some money out of a male friend to pay her bills. Then she receives a wedding invitation from her daughter Susan. As a young woman, Julia had married wealthy William Packett. However, after 14 months of marriage, his disapproving mother broke them up. Julia returned to show business but left her infant daughter with William so that the child could be raised in a safe environment. On the boat trip to France, Julia meets and falls for muscular acrobat Fred Ghenoccio, and when in Paris, she performs with his troupe with great success. Later, Fred proposes to her as her train pulls away from the station. Julia reaches her destination penniless, so following her usual methods, she convinces a stranger, Colonel Willowbrook, to give her money, supposedly for an evening gown and other clothing. However, she sneaks away before Willowbrook tries to become better acquainted with her. Her mother-in-law is less than pleased to see her, but Julia manages to see Susan, who insists that Julia stay. As time goes by, William's love for Julia resurfaces. Julia observes that Susan has strong feelings for lovestruck painter Ritchie Lorgan, though he is not her fiancé. Though Susan claims to be merely annoyed, Julia sees that Susan loves Ritchie and successfully brings the two together. Julia remains skeptical of William's newfound love, unable to forget the past. Complications arise when Fred shows up to claim her. However, when William encounters his old friend Colonel Willowbrook, he learns of Julia's affair with Fred. William persuades Willowbrook to pretend not to know him and to interrupt their breakfast. The revelation of Julia's questionable method of raising funds sends Fred packing. Eventually, Susan takes Julia's suggestion and elopes with Ritchie. When William chases after them, followed by Julia, they discover that they have been tricked into going to the wrong place. Following Susan's instructions, servants drive away their cars, leaving them stranded for 48 hours in their isolated honeymoon cabin. Julia tries to walk away in a rainstorm, but ends up in the mud. When William comes to her rescue, he ends up sprawled in the muck as well, leaving them both laughing at their predicament. Cast Greer Garson as Julia Packett Walter Pidgeon as William Sylvester Packett Peter Lawford as Ritchie Lorgan Elizabeth Taylor as Susan Packett Cesar Romero as Fred Ghenoccio Lucile Watson as Mrs. Packett Nigel Bruce as Colonel Bruce "Bunny" Willowbrook Mary Boland as Ma Ghenoccio Reginald Owen as Benjamin Hawkins, Julia's friend Henry Stephenson as Lord Pennystone, Susan's future father-in-law Aubrey Mather as the Vicar Ian Wolfe as Hobson, the butler Fritz Feld as Pepito Phyllis Morris as Daisy Veda Ann Borg as Louise Harry Allen as bill collector (uncredited) Cast notes Elizabeth Taylor turned 16 during the filming of Julia Misbehaves and also received her first onscreen kiss, from Peter Lawford. Taylor had a crush on Lawford and pursued him, but he had been warned that she was off-limits and told her that there was no chance of a romance between them. Taylor stayed in bed for days until a visit from Lawford smoothed things out, and they remained friends. During filming, Lawford introduced Greer Garson to E. E. "Buddy" Fogelson, an oil and cattle millionaire from Texas, whom she married the next year. Julia Misbehaves was the fourth of six films in which Walter Pidgeon and Greer Garson co-starred. Production Julia Misbehaves began with the working titles The Nutmeg Tree (the title of the 1937 novel by Margery Sharp upon which the film was based) and Speak to Me of Love. The screenplay was originally to have been written by James Hilton, and the film would have starred Gracie Fields. Announced in April 1941, the project was postponed later in the year because Fields was unavailable. In 1946, the project was revived, with Greer Garson in the lead role and with Everett Riskin as the producer, replacing Dore Schary, who had replaced Sidney Franklin. Box office The film earned $2,948,000 in the U.S. and Canada and $1,549,000 overseas, resulting in a profit of $298,000. Critical reception A New York Times review commented that Garson was "out of her element" in the film, although a Variety reviewer said that she "...acquits herself like a lady out to prove she can be hoydenish when necessary. She proves it and audiences will like the new Garson." References Notes External links 1948 films 1948 romantic comedy films American films American black-and-white films American romantic comedy films English-language films Films scored by Adolph Deutsch Films based on British novels Films based on works by Margery Sharp Films based on romance novels Films directed by Jack Conway Metro-Goldwyn-Mayer films Films with screenplays by William Ludwig Films set in London
4330717
https://en.wikipedia.org/wiki/Emulation%20on%20the%20Amiga
Emulation on the Amiga
The Amiga computer can be used to emulate several other computer platforms, including legacy platforms such as the Commodore 64, and its contemporary rivals such as the IBM PC and the Macintosh. MS-DOS on Amiga via Sidecar or Bridgeboard MS-DOS compatibility was a major issue during the early years of the Amiga's lifespan in order to promote the machine as a serious business computer. In order to run the MS-DOS operating system, Commodore released the Sidecar for the Amiga 1000, basically an 8088 board in a closed case that connected to the side of the Amiga. Clever programming (a library named Janus, after the two-faced Roman god of doorways) made it possible to run PC software in an Amiga window without use of emulation. At the introduction of the Sidecar, the crowd was stunned to see the MS-DOS version of Microsoft Flight Simulator running at full speed in an Amiga window on the Workbench. Later the Sidecar was implemented as an expansion card named "Bridgeboard" for Amiga 2000+ models. Bridgeboard cards appeared in variants with processors up to the 486. The Bridgeboard card and the Janus library made the use of PC expansion cards and hard disk/floppy disk drives possible. The Bridgeboard card was manufactured by Commodore; later, third-party cards such as the KCS Powerboard also appeared for the Amiga 500 and Amiga 600 expansion slot. Eventually, full software emulators, such as PC-Task and PCx, allowed Amigas to run MS-DOS programs, including Microsoft Windows, without additional hardware, at the cost of speed and compatibility. The KCS PowerPC board The Dutch company Kolff Computer Supplies built a similar expansion for the A500. It was later improved so it could emulate VGA. It did not multitask, however. Amiga Transformer When Commodore introduced the Amiga 1000 in July 1985 it also unexpectedly announced a software-based IBM PC emulator for it. The company demonstrated the emulator by booting IBM PC DOS and running Lotus 1-2-3. Some who attended the demonstration were skeptical that the emulator, while impressive technically, could run with acceptable performance. The application, called Transformer, was indeed extremely slow; the 'Landmark' benchmark rated it as a 300 kHz 286, far slower than the 4.7 MHz of IBM's oldest and slowest PC. In addition, it would only run on Amigas using the 68000 microprocessor, and would not run if the Amiga had more than 512K of RAM. PCTask PCTask is a software PC emulator that emulates Intel PC hardware with an 8088 processor and CGA graphics modes. The latest version (4.4) was capable of emulating an 80386 clocked at 12 MHz, and its features include support for up to 16 MiB of RAM (15 MB extended) under MS-DOS, up to two floppy drives, and two hard drives. The emulator could make use of hardfile devices, and it could handle multiple hard disk files and hard disk partitions. It supported high-density floppies, and CD-ROM if the Amiga hardware had those devices mounted. The graphics modes available were MDA, CGA, EGA, VGA and SVGA, emulating Hercules graphics cards with 512 KiB to 2 MiB of RAM, with up to 256 colors on Amiga AGA machines, and it could make use of Amiga graphics boards (e.g. Cybergraphics, EGS Spectrum, Picasso). Parallel, serial and PC speaker emulation were also provided, as was mouse support, including serial mouse emulation. If the Amiga hardware was fast enough (68060 or PowerPC) and had enough RAM, it was also possible to run multiple PCTask processes on the same machine and to run MS-DOS applications in an Amiga window on a public screen (e.g. on the Amiga Workbench GUI). PCTask could also transfer files between the Amiga side and the emulated MS-DOS machine, and it could make use of GoldenGate bridge cards, which allow an Amiga equipped with expansion slots to take complete control of its otherwise inactive ISA slots and use PC ISA cards. The latest version (4.4) could even run Microsoft Windows up to Windows 95. PCTask has an 8088/80286/80486 JITM (Just in Time Machine) capable of mapping all instructions of these processors, but it requires an extra 4 megabytes of RAM to activate this feature. PCTask has been re-released as freeware by its author. Mac OS on Amiga Also introduced for the Amiga were two products, A-Max (both internal and external models) and the Emplant expansion card. Both allowed the Amiga to emulate an Apple Macintosh and run the classic Mac OS. This required an Apple Macintosh ROM image, or actual ROMs in the case of A-Max, which needed to be obtained from a real Macintosh. The user needed to own the real Macintosh or Mac ROMs to legally run the emulator. In 1988 the first Apple Mac emulator, A-Max, was released as an external device for any Amiga. It needed Mac ROMs to function, and could read Mac disks when used with a Mac floppy drive (Amiga floppy drives are unable to read Mac disks; unlike Amiga disks, Mac floppy disks spin at variable speeds, much like CD-ROM drives). It was not a particularly elegant solution, but it did provide an affordable and usable Mac experience. ReadySoft, makers of A-Max, followed up with A-Max II in the early 1990s. A-Max II was contained on a Zorro-compatible card and allowed the user, again using actual Mac ROMs, to emulate a color Macintosh. In fact, an Amiga 3000 emulating a Mac via A-Max II was significantly faster than the first consumer color Mac, the LC. Over time, full software virtualization became available, but a ROM image was still necessary. Examples of such virtualization software include ShapeShifter (not to be confused with the third-party preference pane ShapeShifter), later superseded by Basilisk II (both by the same programmer who conceived SheepShaver, Christian Bauer), Fusion, and iFusion (the latter ran classic Mac OS by using a PowerPC "coprocessor" accelerator card). These virtual machines provide equal or faster speed than a Macintosh with the same processor, especially with respect to the m68k series, because real Macs run in MMU trap mode, hampering performance. Also, immediately after the 68k to PowerPC transition in 1994, there was a dearth of native PowerPC Mac software: Amiga computers with 68060 CPUs running ShapeShifter or Fusion were able to run 68k Macintosh code faster than real Macs. Although Amigas were very successful at emulating Macintoshes, they were never considered Macintosh clones, as they could not use Mac OS as a primary operating system. Modern Amigas like the AmigaOne and Pegasos can emulate Macintosh machines by using Basilisk II or Mac-on-Linux. 8-bit Commodore computers Various Commodore 64 emulators were produced for the Amiga. In 1988 Compute! reviewed ReadySoft's The 64 Emulator and Software Insight Systems' GO-64 and reported mixed results with both. Although the magazine used copies of the genuine 64 ROMs, it found that some software, such as SpeedScript, did not run, and both emulators' performance was inferior to that of the real computer. Others included MagiC64 and A64. Amigas have their own versions of the VICE and Frodo software emulators. VICE emulates the 8-bit machines made by Commodore, including the C64, C128, PET, and VIC-20. Atari ST Atari ST emulation on the Amiga is comparatively easy because the two machines share the same model of processor (the 68000) and have broadly similar hardware characteristics. Various software-based Atari ST emulators were produced for the Amiga in the past, such as Amtari and the Medusa emulator. AmigaOS 4 and MorphOS can emulate the Atari ST and Atari STE platforms by using Hatari, a free software emulator released under the GPL. Amiga emulation PowerPC-equipped computers running AmigaOS 4 can run UAE to emulate a Motorola 68000-equipped Amiga. Original Kickstart 3.1 ROM images are included with AmigaOS 4.1 Update 4. See also References Amiga Amiga emulation software Macintosh platform emulators
31321225
https://en.wikipedia.org/wiki/Wizcom%20Technologies
Wizcom Technologies
Wizcom Technologies Ltd. is a multinational company which is the largest producer of portable hand-held scanning translators. The company produces portable electronic pen-shaped scanners, which are capable of scanning printed text and immediately translating it, word for word, into other languages and displaying the translated text on an LCD screen, or of scanning text and keeping it in memory for later transfer to word processing software on a computer. The company was founded in 1995 in Jerusalem. During the 2000s the company relocated its headquarters to Marlborough, Massachusetts, in the U.S. Products Quicktionary 2 Premium Also known as the SuperPen. A portable hand-held scanning translator which is capable of scanning printed text and providing instantaneous word-by-word translation. The product includes, at no extra cost, over 25 downloadable language dictionaries, covering European languages such as French, Spanish, Portuguese and Italian as well as various languages which use different character sets, such as Arabic, Chinese, Hebrew and Russian. The product can store over 1,000 pages of printed text, which can be transferred to word processing software on a computer. Quicktionary TS This is the third generation of Wizcom scanning translators, launched at the end of 2007. Compared with its predecessors, this translator can be operated via a touch screen, which is intended to considerably simplify spelling out text that the scanner cannot read. Infoscan A portable hand-held scanner which is used only for the purpose of storing printed text in its memory to be later transferred to word processing software on a computer. It has no translation function, and its price is well below that of the Quicktionary 2 Premium and the Quicktionary II Expert. ReadingPen A portable hand-held scanning translator which is used as an aid for students of a second language and for people with reading difficulties (such as dyslexia). This product enables users to scan text, hear it spoken aloud, and obtain immediate definitions and correct pronunciation. Quicktionary II Expert A portable hand-held scanning translator which was specifically produced for the German market. It is configured with six dictionaries: German-English, English-German, German-French, French-German, German-Spanish and Spanish-German. In contrast to the Quicktionary 2 Premium, it is not possible to install additional dictionaries on this device or to transfer scanned text from this device to word processing software on a computer. Quicktionary II Genius A portable hand-held scanning translator which was specifically produced for the Japanese market. It is configured with the English-Japanese Genius dictionary. In contrast to the Quicktionary 2 Premium, it is not possible to install additional dictionaries on this device or to transfer scanned text from this device to word processing software on a computer. Quicktionary II Multi A portable hand-held scanning translator which was specifically produced for the Israeli market. It is configured with five dictionaries: English-Hebrew, English-Russian, English-Arabic, English-French and French-English. In contrast to the Quicktionary 2 Premium, it is not possible to install additional dictionaries on this device or to transfer scanned text from this device to word processing software on a computer.
See also Electronic dictionary External links Stock information from the Frankfurt Stock Exchange Deutsche Börse stock information Companies based in Massachusetts Computer hardware companies Companies established in 1995 Electronics companies of Israel Companies based in Jerusalem Israeli inventions