https://en.wikipedia.org/wiki/Manufacturing%20operations%20management
Manufacturing operations management
Manufacturing operations management (MOM) is a collection of systems for managing end-to-end manufacturing processes with a view to optimizing efficiency. There are many types of MOM software, including software for production management, performance analysis, quality and compliance, and human machine interface (HMI). Production management software provides real-time information about jobs and orders, labor and materials, machine status, and product shipments. Performance analysis software displays metrics at the machine, line, plant, and enterprise level for situational or historical analysis. Quality and compliance software is used to promote compliance with standards and specifications for operational processes and procedures. HMI software enables operators to manage industrial and process control machinery through a computer-based interface.

Emerging Software Trends

Advancements in technology and market demands are enabling new capabilities in MOM software platforms, gradually closing gaps in end-user needs.

Collaboration Capabilities: Collaboration and workflow services support people-to-people, people-to-systems, and systems-to-systems interactions, enforcing procedures and rules while flexibly adapting to real-time situations with alternate workflows and processes.

Security Services: Future manufacturing platforms will leverage common security services that determine roles, responsibilities, authorities, and access across all systems and application functions while fitting into corporate IT security schemes.

Asset & Production Model: Future manufacturing platforms will have a unified asset and production model that supports all of the interrelationships between physical production equipment, facilities, inventory/materials, and people, as well as production definitions such as the manufacturing bill of materials, production orders, etc.
This contrasts with older systems that either held subsets of these interrelationships across multiple databases or could not effectively federate across multiple systems of record.

Operations Database & Historians: Evolving from older systems whose separate historians and production databases were difficult to correlate, service-based platforms will have a unified operations database and historian. This will capture and aggregate all time-series and production event information surrounding each product and production run, with a full genealogy of components and materials, related performance information, and federation across other systems and devices of record.

Visualization and Mobility: Today, different MOM applications support different graphical user interfaces, Web interfaces, specific mobile applications, etc. The future manufacturing platform will provide common visualization and mobility for a consistent user interface experience across different form factors, supporting dedicated and mobile workers orchestrated by consistent workflows and procedures.

Smaller and Focused 'Apps': Today's monolithic systems and applications have too many database interdependencies, operate inconsistently, and are not inherently integrated. By taking advantage of the common software platform services described above, modular apps can be significantly smaller, simpler, and more focused. These apps will be much lighter weight in functionality and, as a result, significantly easier and faster to develop.
See also
Manufacturing execution system
Manufacturing process management
Operations execution system
Product life cycle management

References

External links
ANSI/ISA-95.00.03-2005 Enterprise-Control System Integration, Part 3: Models of Manufacturing Operations Management
MES Center Association, a non-profit organization that provides information and trends to those interested in the control and monitoring of production processes, detailed scheduling, production logistics, production quality management, and maintenance from the perspective of MES/MOM information systems
https://en.wikipedia.org/wiki/Linear%20Tape%20File%20System
Linear Tape File System
The Linear Tape File System (LTFS) is a file system that allows files stored on magnetic tape to be accessed in a similar fashion to those on disk or removable flash drives. It requires both a specific format of data on the tape media and software to provide a file system interface to the data. The technology, based around a self-describing tape format developed by IBM, was adopted by the LTO Consortium in 2010.

History

Magnetic tape data storage has been used for over 50 years, but typically did not hold file metadata in a form easy to access or modify independently of the file content data. Often external databases were used to maintain file metadata (file names, timestamps, directory hierarchy), but these databases were generally not designed for interoperability, and tapes might or might not contain an index of their content. Unix-like systems have the interoperable tar standard, but it is not well suited to modifying file metadata independently of file content, and it neither maintains a central index of files nor provides a filesystem interface or characteristics.

LTFS technology was first implemented by IBM as a prototype running on Linux and Mac OS X during 2008/2009. This prototype was demonstrated at NAB 2009. Based on feedback from this initial demonstration and experience within IBM, the filesystem was overhauled in preparation for release as a product. The LTFS development team worked with the vendors of LTO tape products (HP and Quantum) to build support for and understanding of the LTFS format and filesystem implementation leading up to the public release. The LTFS Format Specification and filesystem implementation were released on April 12, 2010 with the support of IBM, HP, Quantum, and the LTO Consortium. LTFS v2.0.0 was released in March 2011, improving the text to clarify and remove ambiguity.
It also added support for sparse files, persistent file identifiers, and virtual extended attributes for filesystem metadata and control, and defined minimum and recommended blocksize values for LTFS volumes for compatibility across various HBA hardware implementations.

Format specification

The ISO/IEC 20919:2016 standard defines the LTFS format requirements for interchanged media that claims LTFS compliance. It defines the data format, independent of the physical storage media and the software command format, to make data truly interchangeable. The ISO standard was prepared by SNIA. It is based on LTFS v2.2 and was adopted into ISO by the joint technical committee ISO/IEC JTC 1 (Information Technology). The SNIA workgroup continues to develop LTFS and release updates.

Version 2.0.0 defines rules for how the version number may change in the future, and how compatibility is maintained across varying implementations. All implementations must:
correctly read media that was compliant with any prior version
write media that is compliant with the version they claim compliance with

SNIA Technical Work Group

In August 2012, SNIA announced that it was forming a Technical Work Group (TWG) to continue technical development of the specification. LTFS Format Specification v2.1 is the baseline for the technical work and standards accreditation process; SNIA LTFS TWG members include HP, IBM, Oracle, and Quantum.

Nature

While LTFS can make a tape appear to behave like a disk, it does not change the fundamentally sequential nature of tape. Files are always appended to the end of the tape. If a file is modified, overwritten, or removed from the volume, the associated tape blocks are not freed up; they are simply marked as unavailable, and the used volume capacity is not recovered. Data is only deleted and capacity recovered if the whole tape is reformatted.
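The append-only behavior described above can be sketched with a toy volume model (the class and field names below are illustrative, not the actual LTFS index schema):

```python
class TapeVolume:
    """Toy model of an append-only tape volume with an LTFS-style index.

    Blocks are only ever appended; deleting or overwriting a file marks
    its old blocks unavailable without reclaiming capacity.
    """

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.next_block = 0      # tape position: only ever moves forward
        self.index = {}          # filename -> list of block numbers
        self.dead_blocks = 0     # capacity lost to deleted/overwritten data

    def write(self, name, num_blocks):
        if self.next_block + num_blocks > self.capacity:
            raise IOError("volume full (reformat required to reclaim space)")
        if name in self.index:   # overwrite: old extents become dead
            self.dead_blocks += len(self.index[name])
        self.index[name] = list(range(self.next_block,
                                      self.next_block + num_blocks))
        self.next_block += num_blocks

    def delete(self, name):
        self.dead_blocks += len(self.index.pop(name))

    def used(self):
        return self.next_block   # includes dead blocks until reformat

vol = TapeVolume(capacity_blocks=100)
vol.write("a.mov", 40)
vol.write("a.mov", 10)   # overwrite: the original 40 blocks stay consumed
vol.delete("a.mov")
print(vol.used(), vol.dead_blocks)   # 50 blocks consumed, all 50 now dead
```

Note how, unlike a disk file system, `delete` never decreases `used()`; only reformatting the whole volume would reset it.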
In spite of these disadvantages, there are several use cases where LTFS-formatted tape is superior to disk and other data storage technologies. While LTO seek times can range from 10 to 100 seconds, the streaming data transfer rate can match or exceed disk data transfer rates. Additionally, LTO cartridges are easily transportable and hold far more data than any other removable data storage format. The ability to copy a large file or a large selection of files (up to 1.5 TB of uncompressed data for LTO-5, and 18 TB for LTO-9) to an LTFS-formatted tape allows easy exchange of data with a collaborator, or the saving of an archival copy. Since LTFS is an open standard, LTFS-formatted tapes are usable by a wide variety of computing systems.

Implementations

Tape drive manufacturers often offer two different editions, one for single drives and one for tape libraries, based on the LTFS Reference Implementation.

IBM Linear Tape File System - Single Drive Edition

The IBM Linear Tape File System - Single Drive Edition (initially released as "IBM Long Term File System") allows tapes to be formatted as LTFS volumes and these volumes to be mounted, so that users and applications can access files and directories stored on the tape directly, including by drag-and-drop of files.

IBM Linear Tape File System - Library Edition

The IBM Linear Tape File System - Library Edition (LTFS-LE) product allows LTFS volumes to be used in a tape library. Each LTFS-formatted tape cartridge in the library appears as a separate folder under the filesystem mount point, and the user or application can navigate into each of these folders to access the files stored on each tape. The LTFS-LE software automatically controls the tape library robotics to load and unload the necessary LTFS volumes.
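The seek-versus-streaming trade-off above can be made concrete with a back-of-the-envelope calculation. The seek time and transfer rates below are illustrative assumptions, not measured figures for any particular drive:

```python
def transfer_time(size_gb, seek_s, rate_mb_s):
    """Total time to read one file: one initial seek, then streaming at full rate."""
    return seek_s + (size_gb * 1000) / rate_mb_s

# Assumed figures: 50 s average tape seek and 300 MB/s tape streaming,
# versus a near-zero disk seek and 150 MB/s disk streaming.
tape = transfer_time(100, seek_s=50, rate_mb_s=300)    # one large 100 GB file
disk = transfer_time(100, seek_s=0.01, rate_mb_s=150)
print(round(tape), round(disk))   # 383 vs 667 seconds: tape finishes first
```

Under these assumptions the fixed seek cost is amortized away on large files, which is why LTFS tape suits archival copies and bulk exchange rather than small random reads.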
Oracle's StorageTek Linear Tape File System, Open Edition

Oracle's free open-source StorageTek Linear Tape File System (LTFS), Open Edition software is claimed to be the first to store 8.5 TB (native capacity) on a single cartridge. It supports Oracle's midrange StorageTek LTO 5 and LTO 6 tape drives from HP and IBM, as well as Oracle's StorageTek T10000C and T10000D tape drives.

Oracle's StorageTek Linear Tape File System, Library Edition

Oracle's StorageTek LTFS-LE software offering supports the StorageTek SL8500 Modular Library System, the StorageTek SL3000 Modular Library System, and the StorageTek SL150 Modular Tape Library.

HP Linear Tape File System

The HP Linear Tape File System (HP LTFS) is HP's implementation. It is a free open-source software application.

Quantum Linear Tape File System

Quantum Corporation provides an LTFS product with Windows, Linux, and Mac OS X support. The Scalar LTFS Appliance is a file system that presents a Quantum tape library as a NAS share. The appliance makes files viewable as if they resided on a local disk and allows users to drag and drop files directly to and from a tape cartridge.

LTFS-compatible products

DDS tape drives
HPE: DAT-160 and DAT-320
Enterprise tape drives
IBM: TS1140, TS1150, TS1155, and TS1160
Oracle (Sun/StorageTek): T10000C and T10000D
LTO tape drives
HPE, IBM, Quantum, and Tandberg: from LTO-5 to LTO-9

Appliances and ISVs (independent software vendors) supporting LTFS

A full list of vendors is available at the LTO website.

LTFS projects

Thought Equity Motion is executing a major film digitization and preservation project for the EYE Film Institute Netherlands. The project involves scanning more than 150 million discrete DPX files and storing them on LTO Gen5 using the LTFS format. More than 1 petabyte of film will be scanned and archived over two years (2010–2012).

Industry recognition

IBM LTFS technology received a Pick Hit Award from Broadcast Engineering at NAB 2011.
IBM and FOX Networks received an Engineering Emmy Award in 2011 for a project that uses LTFS to store, exchange, and archive video content. IBM received the 2011 Hollywood Post Alliance (HPA) Engineering Excellence Award.

References

External links
LTFS at LTO
LTFS at SNIA
LTFS for Dummies book
Implementations:
IBM Spectrum Archive - IBM Spectrum Archive Single Drive Edition (SDE), the LTFS Reference Implementation for stand-alone tape drives
Oracle Tape Storage - Oracle Linear Tape File System, Open Edition
HP Linear Tape File System - HPE LTFS Software
Quantum Linear Tape File System - Quantum LTFS Source
https://en.wikipedia.org/wiki/Clifford%20Nass
Clifford Nass
Clifford Ivar Nass (April 3, 1958 – November 2, 2013) was a professor of communication at Stanford University, co-creator of The Media Equation theory, and a renowned authority on human-computer interaction (HCI). He was also known for his work on individual differences associated with media multitasking. Nass was the Thomas M. Storke Professor at Stanford and held courtesy appointments in Computer Science, Education, Law, and Sociology. He was also affiliated with the programs in Symbolic Systems and Science, Technology, and Society. Nass was the director of the Communication between Humans and Interactive Media (CHIMe) Lab, co-director of the Kozmetsky Global Collaboratory (KGC) and its Real-time Venture Design Laboratory (ReVeL), and a co-founder of TeachAids.

Early life and education

Nass was born in Jersey City, New Jersey and raised in Teaneck, the son of Florence and Jules Nass. His parents formed New Jersey's first Mothers Against Drunk Driving chapter after Nass's older brother was killed by a drunk driver in 1981. Nass graduated cum laude with an A.B. in mathematics from Princeton University in 1981 after completing a senior thesis, titled "PASCGRAF and the Haloed line effect", under the supervision of Arthur Appel. He then conducted research in the areas of computer graphics, data structures, and database design for IBM and Intel before returning to Princeton to pursue graduate studies. He received a Ph.D. in sociology from Princeton in 1986 after completing a doctoral dissertation titled "Society as computer: the structure and skill of information work in the United States, 1900-1980." He then joined the faculty at Stanford University. Nass died of a heart attack in November 2013, at age 55.

Research and Books

He was the author of three books: The Media Equation, Wired for Speech, and The Man Who Lied to His Laptop. He also published over 150 papers in the areas of human-computer interaction, statistical methodology, and organizational theory.
He was credited with founding the Computers are Social Actors paradigm. Nass consulted on the design of over 250 media products and services for companies including Microsoft, Toyota, Philips, BMW, Hewlett-Packard, AOL, Sony, and Dell.

Early HCI Work

Nass's early work primarily explored the ways people interact with computers, particularly how those interactions are "fundamentally social" in nature. By identifying a social theme in people's interaction with computers, he was able to observe that humans project "agency" onto computers, and thus people will interact with computers in the same ways as they interact with people. For example, he showed how people will observe the "politeness norm" and focus on the first application they are interacting with if another application interrupts them (such as a pop-up window). He also showed how computer users assign gender to computers and interact with them differently based on whether the computer is perceived as male or female – preferring to hear praise from a male computer voice, or to receive relationship and love advice from a female computer voice, for example. His 1994 presentation at the SIGCHI conference titled "Computers are Social Actors" outlined these and other observations on human-computer interaction that led to the Computers as Social Actors paradigm.

Mid-Career HCI Work

After establishing that people interact with computers the same way they interact with people, particularly when using voice interfaces, Nass began to study this topic in more detail, publishing several studies showing that the etiquette components of reciprocity, politeness, and responding to and giving praise are no different between people and computers than they are between people. This line of research led to the publication of his book Wired for Speech: How Voice Activates and Advances the Human-Computer Relationship, in which he summarized the results of much of this work.
In 2007, Nass became a "dorm dad," moving into a Stanford freshman residence hall. He was shocked and intrigued by the communication and technology practices of the students he observed. He watched as students used multiple devices all at once: texting, tweeting, listening to music, and watching YouTube, all while working on homework. He began a line of investigation into multitasking and its effects on cognition, discovering that the more people multitask, the worse they become at multitasking. This, Nass asserted, is due to losing the ability to filter out irrelevant stimuli. In a Frontline interview in February 2010, Nass discussed the results of one of his experiments, saying "It turns out multitaskers are terrible at every aspect of multitasking. They're terrible at ignoring irrelevant information; they're terrible at keeping information in their head nicely and neatly organized; and they're terrible at switching from one task to another." The results of these experiments were widely picked up in the media, and Nass was invited to give a TEDx talk at Stanford on the subject, titled "Are you multitasking your life away?"

Late Career HCI Work

In The Man Who Lied to His Laptop, Nass summarizes many years of HCI research through the lens of what lessons they contain about human behavior and social relationships. The book, published in 2010, contains chapters on praise and criticism, personality, teams and team building, emotion, and persuasion. Nass continued to explore the effects of multitasking later in his career, co-publishing a study with Roy Pea showing the negative impacts of certain media usage and media multitasking on the social well-being of 8-to-12-year-old girls. This study also introduced a revised method for measuring media multitasking, building upon Nass's earlier co-authored study with Eyal Ophir from 2009.
The new, revised method allowed a more granular measurement of media multitasking in participants, which in turn allowed him to compare this measure to other variables, such as self-satisfaction and contentment. In addition to furthering the study of media multitasking, Nass also began to research voice user interfaces in relation to autonomous vehicles. He published a study showing that when a voice user interface reframes poor driving conditions in a positive light, it helps to regulate drivers' emotions and attitudes and improves driving performance. He also published studies showing that when a voice user interface in an autonomous vehicle describes an action the vehicle is taking, such as slowing down, explaining the "why" and the "how" led to better driving performance and to feelings of trust and safety in the driver.

Published Books
The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships. Penguin Group, 2010. Co-written with Corina Yen.
Wired for Speech: How Voice Activates and Advances the Human-Computer Relationship. MIT Press, 2005. Co-written with Scott Brave.
The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. Cambridge University Press, 1996. Co-written with Byron Reeves.

References

External links
Stanford Department of Communication Profile
Clifford Nass video lectures (Academic Earth)
New Paradigms for Using Computers, IBM Research (audio of lecture at IBM Almaden Research Center, 1996)
Wired for Speech: Voice Interactions with People and Computers (Stanford University Office of Science Outreach's Summer Science Lecture Series, 7 August 2008)
https://en.wikipedia.org/wiki/Albion%20Packet%20%281800%20ship%29
Albion Packet (1800 ship)
Albion Packet was a schooner launched at Berwick by Gowan. She sailed primarily along Britain's coasts, and later to the Baltic. She disappeared from the registers between 1816 and 1822, when she reappeared as Albion. Circa 1827 she became Albion Packet again. She underwent two maritime mishaps, one in August 1802 and one circa December 1827, before being wrecked on 17 November 1832 near Orford High Light.

Career

Albion Packet first appeared in Lloyd's Register (LR) in 1800. On 31 August 1802, Lloyd's List reported that Albion Packet had struck on a rock at the entrance to the Tweed as she was sailing from London to Leith. Albion Packet put into Berwick with five feet of water in her hold and would be obliged to unload her cargo to repair. In August 1809 Albion Packet sailed from London to Leith in 37 hours, the quickest such voyage known. In February 1814 Albion Packet, May, master, returned leaky to Leith from Gothenburg. She had been delivering mail but was unable to get into the port because of the ice. She transferred her mails to another packet.

Neither LR nor the Register of Shipping (RS) published in 1817. Albion Packet was no longer listed in 1818. There was a report in 1819 that Albion Packet, of Leith, had been lost on the coast of Holland. On 16 or 17 October, Albion, Parker, master, was on her way from Leith to Hambro when she was driven ashore at Ameland. Most of her cargo was saved, but 43 casks of sugar were lost. It is not possible from the LR or RS volumes for 1819 to determine which vessel this was.

In 1822 Albion, of 125 tons (bm), launched at Berwick in 1800, reappeared in LR. The Berwick Shipping Company had been established in 1820. Then in 1827, LR listed two vessels: Albion, with the same information as in 1826, and Albion Packet, of 126 tons (bm), built in Berwick and launched in 1806. This was the first listing of an Albion Packet since 1816. She had a new owner, master, and homeport, suggesting that some confusion may have developed as a consequence of the change.
In 1827, Albion Packet, Hunter, master, was sailing from Hambro to Newcastle upon Tyne when she went ashore near Cuxhaven. She was gotten off the sand and went into port in a leaky state to discharge her cargo and effect repairs. She came into port on 4 December. She arrived at Newcastle from Hamburg in January 1828 with a cargo of hides. In 1828 LR no longer listed a Berwick-built Albion, only Albion Packet. From 1828 on, all the advertisements for Albion Packet's sailings gave her master's name as William Morrison, and her destinations as ports in northern Germany or the Baltic.

Fate

Albion Packet, Marwood, master, was driven ashore near Orford Haven on 11 November 1832. Her crew were rescued. She was on a voyage from Maldon, Essex, to Sunderland, County Durham. The entry for Albion Packet in the Register of Shipping (RS) for 1832 carried the annotation "LOST".

Notes, citations, and references
https://en.wikipedia.org/wiki/TAL%20Technologies
TAL Technologies
TAL Technologies, Inc. is a privately owned software development company which develops and sells applications that automate data collection. Its main product lines include WinWedge, TCP-Com, TAL Barcode ActiveX Control, and B-Coder Professional barcode software.

Corporate history

1985: TAL Technologies, Inc. was founded by its president, Thomas Lutz
1989: Released Software Wedge for DOS, an RS232 data collection software
1990: Released WinWedge, a Windows-based RS232 data collection software
1990: Released B-Coder, a barcode image generating software
2001: Introduced CE-Wedge, an RS232 data collection software for Pocket PCs
2002: Introduced TCP-Com, an RS232-to-TCP/IP converter software

Key Products

TAL Technologies' major product lines include:
• WinWedge - RS232 data collection products for quality control and laboratory instruments including balances, scales, pH meters, spectrophotometers, force gauges, digital electronic measuring instruments, etc.
• TCP-Com - Multifunction serial-to-Ethernet (and Ethernet-to-serial) interface software. Easily access a serial device via a TCP/IP or UDP network, or redirect Ethernet/IP data to real or "virtual" RS232 serial ports.
• B-Coder - Software for generating customized barcode images in various symbologies such as Code 39, UPC, PDF417, Data Matrix, etc.
• Barcode ActiveX Control - ActiveX control used for generating barcode images

References
https://en.wikipedia.org/wiki/Integrated%20workplace%20management%20system
Integrated workplace management system
An integrated workplace management system (IWMS) is a software platform that helps organizations optimize the use of workplace resources, including the management of a company's real estate portfolio, infrastructure, and facilities assets. IWMS solutions are commonly packaged as a fully integrated suite or as individual modules that can be scaled over time. They are used by corporate occupiers, real estate services firms, facilities services providers, landlords, and managing agents. Traditionally focused on supporting real estate and facilities professionals, IWMS solutions are becoming more employee-centric, expanding their touchpoints to include all building occupants and visitors.

Core functional areas of IWMS

An IWMS tends to integrate five core functional areas (or at least a combination of some of the five) within an enterprise which, prior to the advent of IWMS, were organizationally and operationally independent and showed minimal interdisciplinary synergy:

Real estate management

This area involves activities associated with the acquisition (including purchase and lease), financial management, and disposition of real property assets. Common IWMS features that support real estate management include strategic planning, transaction management, request for proposal (RFP) analysis, lease analysis, portfolio management, tax management, lease management, and lease accounting.

Capital project management

This area involves activities associated with the design and development of new facilities and the remodeling or enhancement of existing facilities, including their reconfiguration and expansion. Common IWMS features that support capital project management include capital planning, design, funding, bidding, procurement, cost and resource management, project documentation and drawing management, scheduling, and critical path analysis.

Facilities management

This area covers activities related to the operation and optimized utilization of facilities.
Common IWMS features that support facility management include strategic facilities planning (including scenario modeling and analysis), CAD and BIM integration, space management, site and employee service management, resource scheduling, and move management.

Maintenance management

This area covers activities related to the corrective and preventive maintenance and operation of facilities and assets. Common IWMS features that support maintenance management include asset management, work requests, preventive maintenance, work order administration, warranty tracking, inventory management, vendor management, and facility condition assessment.

Sustainability and energy management

This area covers activities related to the measurement and reduction of resource consumption (including energy and water) and waste production (including greenhouse gas emissions) within facilities. Common IWMS features that support sustainability and energy management include integration with building management systems (BMS), sustainability performance metrics, energy benchmarking, carbon emissions tracking, and energy efficiency project analysis.

Implementation Planning

IWMS components can be implemented in any order, or all together as a single, comprehensive implementation, according to the organization's needs. As a best practice, a phased approach that implements IWMS components sequentially is advised, though a multi-function approach can still be followed. Each IWMS functional area requires the same steps for its implementation, though extra care, coordination, and project management will be necessary to ensure smooth functioning for more complex implementations. Adopting the as-shipped business processes included in the IWMS software, rather than an organization's existing business processes, constitutes a "core success prerequisite and best practice" in the selection and implementation of IWMS software.
As a result, organizations should limit configuration to only the most compelling cases.

Analyst coverage

Since 2004, the IWMS market has been covered by the independent analyst firms Gartner Inc., Verdantix, IWMSconnect, and IWMSNews.

Gartner Market Guide for Integrated Workplace Management Systems

Until 2014, Gartner published the IWMS Magic Quadrant, evaluating IWMS vendors on two criteria: 'completeness of vision' and 'ability to execute'. As the market further matured, the Magic Quadrant reports were replaced by an annual Market Guide, focusing more on the development of the market itself than on comparative positioning. The original author, Michael Bell, first described IWMS software as "integrated enterprise solutions that span the life cycle of facilities asset management, from acquisition and operations to disposition." In this first market definition, Gartner identified critical requirements of an IWMS, including a common database, advanced web services technologies, and a system architecture that enabled user-defined workflow processes and customized portal interfaces. Gartner has since released updated IWMS Market Guide reports; the latest was published on January 28, 2020 by Carol Rozwell, former Distinguished VP Analyst, and Rashmi Choudhary, Principal Analyst.

Verdantix Green Quadrant Integrated Workplace Management Systems

Since 2017, Verdantix has published Integrated Workplace Management Systems Green Quadrant reports and Buyer's Guides. The research firm's proprietary Green Quadrant methodology uses weighted criteria for vendor evaluation, grouped under two categories: Capabilities (breadth and depth of software functionality) and Momentum (strategic success factors).

Evolution of IWMS

While the core functions remain critical, IWMS is evolving into a cloud-based software platform built with the workplace experience at the center.
Providing an interactive user interface across multiple devices, a modern IWMS enables employees to access a variety of workplace services from a mobile app, kiosk, or desktop. In the latest Verdantix research, 80% of executives considering IWMS software said the quality of the user interface was the most important factor influencing their decision. The IWMS of the future should serve as a digital workplace concierge, allowing employees to find people, reserve rooms, request services, and receive mail or visitors. With the growth of the Internet of Things, a trend that is gaining ground is the integration of IWMS software and smart building solutions on a single platform, also termed IWMS+. This allows an IWMS to draw on real-time data from sensors to manage the modern work environment.

See also
Real estate
Facility management
Computer-aided facility management
Enterprise asset management

References
1260095
https://en.wikipedia.org/wiki/BIOS%20interrupt%20call
BIOS interrupt call
BIOS interrupt calls are a facility that operating systems and application programs use to invoke the facilities of the Basic Input/Output System software on IBM PC compatible computers. Traditionally, BIOS calls are mainly used by DOS programs and some other software such as boot loaders (including, mostly historically, relatively simple application software that boots directly and runs without an operating system—especially game software). BIOS runs in the real address mode (Real Mode) of the x86 CPU, so programs that call BIOS either must also run in real mode or must switch from protected mode to real mode before calling BIOS and then switch back again. For this reason, modern operating systems that use the CPU in protected mode or long mode generally do not use BIOS interrupt calls to support system functions, although they use BIOS interrupt calls to probe and initialize hardware during booting. Because real mode is limited to 1 MB of addressable memory, modern boot loaders (e.g. GRUB2, Windows Boot Manager) use unreal mode or protected mode (executing BIOS interrupt calls in virtual 8086 mode, but only for OS booting) to access up to 4 GB of memory. In all computers, software instructions control the physical hardware (screen, disk, keyboard, etc.) from the moment the power is switched on. In a PC, the BIOS, pre-loaded in ROM on the motherboard, takes control immediately after the CPU is reset, including during power-up, when a hardware reset button is pressed, or when a critical software failure (a triple fault) causes the mainboard circuitry to automatically trigger a hardware reset. The BIOS tests the hardware and initializes its state; finds, loads, and runs the boot program (usually an OS boot loader; historically, ROM BASIC); and provides basic hardware control to the software running on the machine, which is usually an operating system (with application programs) but may be a directly booting single software application. 
For its part, IBM provided all the information needed to use its BIOS fully or to directly utilize the hardware and avoid BIOS completely, when programming the early IBM PC models (prior to the PS/2). From the beginning, programmers had the choice of using BIOS or not, on a per-hardware-peripheral basis. IBM did strongly encourage the authorship of "well-behaved" programs that accessed hardware only through BIOS INT calls (and DOS service calls), to support compatibility of software with current and future PC models having dissimilar peripheral hardware, but IBM understood that for some software developers and hardware customers, a capability for user software to directly control the hardware was a requirement. In part, this was because a significant subset of all the hardware features and functions was not exposed by the BIOS services. For two examples (among many), the MDA and CGA adapters are capable of hardware scrolling, and the PC serial adapter is capable of interrupt-driven data transfer, but the IBM BIOS supports neither of these useful technical features. Today, the BIOS in a new PC still supports most, if not all, of the BIOS interrupt function calls defined by IBM for the IBM AT (introduced in 1984), along with many more newer ones, plus extensions to some of the originals (e.g. expanded parameter ranges) promulgated by various other organizations and collaborative industry groups. This, combined with a similar degree of hardware compatibility, means that most programs written for an IBM AT can still run correctly on a new PC today, assuming that the faster speed of execution is acceptable (which it typically is for all but games that use CPU-based timing). Despite the considerable limitations of the services accessed through the BIOS interrupts, they have proven extremely useful and durable to technological change. 
Purpose of BIOS calls BIOS interrupt calls perform hardware control or I/O functions requested by a program, return system information to the program, or do both. A key element of the purpose of BIOS calls is abstraction - the BIOS calls perform generally defined functions, and the specific details of how those functions are executed on the particular hardware of the system are encapsulated in the BIOS and hidden from the program. So, for example, a program that wants to read from a hard disk does not need to know whether the hard disk is an ATA, SCSI, or SATA drive (or in earlier days, an ESDI drive, or an MFM or RLL drive with perhaps a Seagate ST-506 controller, perhaps one of the several Western Digital controller types, or with a different proprietary controller of another brand). The program only needs to identify the BIOS-defined number of the drive it wishes to access and the address of the sector it needs to read or write, and the BIOS will take care of translating this general request into the specific sequence of elementary operations required to complete the task through the particular disk controller hardware that is connected to that drive. The program is freed from needing to know how to control at a low level every type of hard disk (or display adapter, or port interface, or real-time clock peripheral) that it may need to access. This both makes programming operating systems and applications easier and makes the programs smaller, reducing the duplication of program code, as the functionality that is included in the BIOS does not need to be included in every program that needs it; relatively short calls to the BIOS are included in the programs instead. (In operating systems where the BIOS is not used, service calls provided by the operating system itself generally fulfill the same function and purpose.) 
The BIOS also frees computer hardware designers (to the extent that programs are written to use the BIOS exclusively) from being constrained to maintain exact hardware compatibility with old systems when designing new systems, in order to maintain compatibility with existing software. For example, the keyboard hardware on the IBM PCjr works very differently than the keyboard hardware on earlier IBM PC models, but to programs that use the keyboard only through the BIOS, this difference is nearly invisible. (As a good example of the other side of this issue, a significant share of the PC programs in use at the time the PCjr was introduced did not use the keyboard through BIOS exclusively, so IBM also included hardware features in the PCjr to emulate the way the original IBM PC and IBM PC XT keyboard hardware works. The hardware emulation is not exact, so not all programs that try to use the keyboard hardware directly will work correctly on the PCjr, but all programs that use only the BIOS keyboard services will.) In addition to giving access to hardware facilities, BIOS provides added facilities that are implemented in the BIOS software. For example, the BIOS maintains separate cursor positions for up to eight text display pages and provides for TTY-like output with automatic line wrap and interpretation of basic control characters such as carriage return and line feed, whereas the CGA-compatible text display hardware has only one global display cursor and cannot automatically advance the cursor, use the cursor position to address the display memory (so as to determine which character cell will be changed or examined), or interpret control characters. 
For another example, the BIOS keyboard interface interprets many keystrokes and key combinations to keep track of the various shift states (left and right Shift, Ctrl, and Alt), to call the print-screen service when Shift+PrtScrn is pressed, to reboot the system when Ctrl+Alt+Del is pressed, to keep track of the lock states (Caps Lock, Num Lock, and Scroll Lock) and, in AT-class machines, control the corresponding lock-state indicator lights on the keyboard, and to perform other similar interpretive and management functions for the keyboard. In contrast, the ordinary capabilities of the standard PC and PC-AT keyboard hardware are limited to reporting to the system each primitive event of an individual key being pressed or released (i.e. making a transition from the "released" state to the "depressed" state or vice versa), performing a commanded reset and self-test of the keyboard unit, and, for AT-class keyboards, executing a command from the host system to set the absolute states of the lock-state indicators (LEDs). Calling BIOS: BIOS software interrupts Operating systems and other software communicate with the BIOS software, in order to control the installed hardware, via software interrupts. A software interrupt is a specific variety of the general concept of an interrupt. An interrupt is a mechanism by which the CPU can be directed to stop executing the main-line program and immediately execute a special program, called an Interrupt Service Routine (ISR), instead. Once the ISR finishes, the CPU continues with the main program. On x86 CPUs, when an interrupt occurs, the ISR to call is found by looking it up in a table of ISR starting-point addresses (called "interrupt vectors") in memory: the Interrupt Vector Table (IVT). An interrupt is invoked by its type number, from 0 to 255, and the type number is used as an index into the Interrupt Vector Table, and at that index in the table is found the address of the ISR that will be run in response to the interrupt. 
A software interrupt is simply an interrupt that is triggered by a software command; therefore, software interrupts function like subroutines, with the main difference that the program that makes a software interrupt call does not need to know the address of the ISR, only its interrupt number. This has advantages for modularity, compatibility, and flexibility in system configuration. BIOS interrupt calls can be thought of as a mechanism for passing messages between BIOS and BIOS client software such as an operating system. The messages request data or action from BIOS and return the requested data, status information, and/or the product of the requested action to the caller. The messages are broken into categories, each with its own interrupt number, and most categories contain sub-categories, called "functions" and identified by "function numbers". A BIOS client passes most information to BIOS in CPU registers, and receives most information back the same way, but data too large to fit in registers, such as tables of control parameters or disk sector data for disk transfers, is passed by allocating a buffer (i.e. some space) in memory and passing the address of the buffer in registers. (Sometimes multiple addresses of data items in memory may be passed in a data structure in memory, with the address of that structure passed to BIOS in registers.) The interrupt number is specified as the parameter of the software interrupt instruction (in Intel assembly language, an "INT" instruction), and the function number is specified in the AH register; that is, the caller sets the AH register to the number of the desired function. In general, the BIOS services corresponding to each interrupt number operate independently of each other, but the functions within one interrupt service are handled by the same BIOS program and are not independent. (This last point is relevant to reentrancy.) 
The BIOS software usually returns to the caller with an error code if not successful, or with a status code and/or requested data if successful. The data itself can be as small as one bit or as large as 65,536 bytes of whole raw disk sectors (the maximum that will fit into one real-mode memory segment). BIOS has been expanded and enhanced over the years many times by many different corporate entities, and unfortunately the result of this evolution is that not all the BIOS functions that can be called use consistent conventions for formatting and communicating data or for reporting results. Some BIOS functions report detailed status information, while others may not even report success or failure but just return silently, leaving the caller to assume success (or to test the outcome some other way). Sometimes it can also be difficult to determine whether or not a certain BIOS function call is supported by the BIOS on a certain computer, or what the limits of a call's parameters are on that computer. (For some invalid function numbers, or valid function numbers with invalid values of key parameters—particularly with an early IBM BIOS version—the BIOS may do nothing and return with no error code; then it is the [inconvenient but inevitable] responsibility of the caller either to avoid this case by not making such calls, or to positively test for an expected effect of the call rather than assuming that the call was effective. Because BIOS has evolved extensively in many steps over its history, a function that is valid in one BIOS version from some certain vendor may not be valid in an earlier or divergent BIOS version from the same vendor or in a BIOS version—of any relative age—from a different vendor.) Because BIOS interrupt calls use CPU register-based parameter passing, the calls are oriented to being made from assembly language and cannot be directly made from most high-level languages (HLLs). 
However, a high-level language may provide a library of wrapper routines which translate parameters from the form (usually stack-based) used by the high-level language to the register-based form required by BIOS, then back to the HLL calling convention after the BIOS returns. In some variants of C, BIOS calls can be made using inline assembly language within a C module. (Support for inline assembly language is not part of the ANSI C standard but is a language extension; therefore, C modules that use inline assembly language are less portable than pure ANSI standard C modules.) Invoking an interrupt Invoking an interrupt can be done using the INT x86 assembly language instruction. For example, to print a character to the screen using BIOS interrupt 0x10, the following x86 assembly language instructions could be executed:

mov ah, 0x0e ; function number = 0Eh : Display Character
mov al, '!'  ; AL = code of character to display
int 0x10     ; call INT 10h, BIOS video service

Interrupt table A list of common BIOS interrupt classes can be found below. Note that some BIOSes (particularly old ones) do not implement all of these interrupt classes. The BIOS also uses some interrupts to relay hardware event interrupts to programs which choose to receive them or to route messages for its own use. The table below includes only those BIOS interrupts which are intended to be called by programs (using the "INT" assembly-language software interrupt instruction) to request services or information. INT 18h (execute BASIC) traditionally jumped to an implementation of Cassette BASIC (provided by Microsoft) stored in Option ROMs. This call would typically be invoked if the BIOS was unable to identify any bootable disk volumes on startup. At the time the original IBM PC (IBM machine type 5150) was released in 1981, the BASIC in ROM was a key feature. 
Contemporary popular personal computers such as the Commodore 64 and the Apple II line also had Microsoft Cassette BASIC in ROM (though Commodore renamed their licensed version Commodore BASIC), so in a substantial portion of its intended market, the IBM PC needed BASIC to compete. As on those other systems, the IBM PC's ROM BASIC served as a primitive diskless operating system, allowing the user to load, save, and run programs, as well as to write and refine them. (The original IBM PC was also the only PC model from IBM that, like its aforementioned two competitors, included cassette interface hardware. A base model IBM PC had only 16 KiB of RAM and no disk drives [of any kind], so the cassette interface and BASIC in ROM were essential to make the base model usable. An IBM PC with less than 32 KiB of RAM is incapable of booting from disk. Of the five 8 KiB ROM chips in an original IBM PC, totaling 40 KiB, four contain BASIC and only one contains the BIOS; when only 16 KiB of RAM are installed, the ROM BASIC accounts for over half of the total system memory [4/7ths, to be precise].) As time went on and BASIC was no longer shipped on all PCs, this interrupt would simply display an error message indicating that no bootable volume was found (famously, "No ROM BASIC", or more explanatory messages in later BIOS versions); in other BIOS versions it would prompt the user to insert a bootable volume and press a key, and then after the user pressed a key it would loop back to the bootstrap loader (INT 19h) to try booting again. Digital's Rainbow 100B used INT 18h to call its BIOS, which was incompatible with the IBM BIOS. Turbo Pascal, Turbo C and Turbo C++ repurposed INT 18 for memory allocation and paging. Other programs also reused this vector for their own purposes. 
BIOS hooks DOS On DOS systems, IO.SYS or IBMBIO.COM hooks INT 13 for floppy disk change detection, tracking formatting calls, correcting DMA boundary errors, and working around problems in IBM's ROM BIOS "01/10/84" with model code 0xFC before the first call. Bypassing BIOS Many modern operating systems (such as Linux and Windows NT) bypass BIOS interrupt calls after startup: the OS kernel switches the CPU into protected mode or long mode at startup and uses its own programs (such as kernel drivers) to control the attached hardware directly. The reason for this is primarily that these operating systems run the processor in protected mode or long mode, whereas calling a BIOS interrupt requires switching to real mode or unreal mode, or using virtual 8086 mode, all of which are slow. However, there are also serious security reasons not to switch to real mode, and the real mode BIOS code has limitations both in functionality and speed that motivate operating system designers to find a replacement for it. In fact, the speed limitations of the BIOS made it common even in the DOS era for programs to circumvent it in order to avoid its performance limitations, especially for video graphics display and fast serial communication. The problems with BIOS functionality include limitations in the range of functions defined, inconsistency in the subsets of those functions supported on different computers, and variations in the quality of BIOSes (i.e. some BIOSes are complete and reliable, others are abridged and buggy). By taking matters into their own hands and avoiding reliance on BIOS, operating system developers can eliminate some of the risks and complications they face in writing and supporting system software. 
On the other hand, by doing so those developers become responsible for providing "bare-metal" driver software for every different system or peripheral device they intend for their operating system to work with (or for inducing the hardware producers to provide those drivers). Thus it should be apparent that compact operating systems developed on small budgets would tend to use BIOS heavily, while large operating systems built by huge groups of software engineers with large budgets would more often opt to write their own drivers instead of using BIOS—that is, even without considering the compatibility problems of BIOS and protected mode. See also DOS interrupt call Interrupt descriptor table Input/Output Base Address Ralf Brown's Interrupt List References The x86 Interrupt List (a.k.a. RBIL, Ralf Brown's Interrupt List) Embedded BIOS User's Manual PhoenixBIOS 4.0 User's Manual IBM Personal System/2 and Personal Computer BIOS Interface Technical Reference, IBM, 1988, System BIOS for IBM PCs, Compatibles, and EISA Computers, Phoenix Technologies, 1991, Programmer's Guide to the AMIBIOS, American Megatrends, 1993, The Programmer's PC Sourcebook by Thom Hogan, Microsoft Press, 1991 BIOS Interrupts Application programming interfaces
15797535
https://en.wikipedia.org/wiki/PeakFit
PeakFit
PeakFit is an automated nonlinear peak separation and analysis software package for scientists performing spectroscopy, chromatography and electrophoresis. It automatically finds and fits up to 100 peaks at a time to a data set, enabling users to characterize peaks and find the best equation that fits their data. PeakFit can also enhance the data obtained from traditional numerical methods and lab instruments. PeakFit was originally developed by Ron Brown of AISN Software and distributed by Jandel Scientific Software in the late 1980s. In January 2004, Systat Software acquired the exclusive worldwide rights from SPSS Inc. to distribute SigmaPlot and other Sigma Series products. References External links Science software Regression and curve fitting software
38161772
https://en.wikipedia.org/wiki/List%20of%20commercial%20video%20games%20with%20available%20source%20code
List of commercial video games with available source code
This is a list of commercial video games with available source code. The source code of these commercially developed and distributed video games is available to the public or the games' communities. Motivation Commercial video games are typically developed as proprietary closed source software products, with the source code treated as a trade secret (unlike open-source video games). When there is no more expected revenue, these games reach end-of-life as products, with no support or availability for their users and communities, and become abandonware. Description In several of the cases listed here, the game's developers released the source code expressly to prevent their work from becoming abandonware. Such source code is often released under varying (free and non-free, commercial and non-commercial) software licenses to the games' communities or the public; artwork and data are often released under a different license than the source code, as the copyright situation is different or more complicated. The source code may be pushed by the developers to public repositories (e.g. SourceForge or GitHub), or given to selected game community members, or sold with the game, or become available by other means. The game may be written in an interpreted language such as BASIC or Python, and distributed as raw source code without being compiled; early software was often distributed in text form, as in the book BASIC Computer Games. In some cases when a game's source code is not available by other means, the game's community "reconstructs" source code from compiled binary files through time-demanding reverse engineering techniques. Source code availability in whatever form allows the games' communities to study how the game works, make modifications, and provide technical support themselves when the official support has ended, e.g. with unofficial patches to fix bugs or source ports to make the game compatible with new platforms. 
Games with source code available on release Games with later released source code Games with available source code The games in the table below have source code that resulted not from official releases by companies or IP holders but from unclear release situations, such as lost-and-found code and leaks of unclear legality (e.g. by an individual developer at end-of-product-life) or undeleted content. Games with reconstructed source code Once games, or software in general, become an obsolete product for a company, the tools and source code required to re-create the game are often lost or even actively destroyed and deleted. On the closure of Atari in Sunnyvale, California, in 1996, the original source code of several milestones of video game history (like Asteroids and Centipede) was thrown out as trash. When much time and manual work is invested, it is still possible to recover or restore a source code variant which replicates the program's functions accurately from the binary program. Techniques used to accomplish this are decompiling, disassembling, and reverse engineering the binary executable. This approach typically does not result in the exact original source code but rather a divergent version, as a binary program does not contain all of the information originally carried in the source code. For example, comments and function names cannot be restored if the program was compiled without additional debug information. Using the techniques listed above within a "bottom-up" development methodology process, the re-created source code of a game is able to replicate the behavior of the original game exactly, often being "clock-cycle accurate" and/or "pixel-per-pixel accurate". This approach is in contrast to that used by game engine recreations, which are often made using a "top-down" development methodology, and which can result in duplicating the general features provided by a game engine, but not necessarily an accurate representation of the original game. 
See also :Category:Commercial video games with freely available source code. List of open-source game engines List of open-source video games List of formerly proprietary software List of commercial video games released as freeware List of freeware games List of proprietary source-available software Source port Source-available software References External links Game Source Code Collection at the Internet Archive Video Game Preservation (collection of source code) on GitHub Liberated Games, specialized in open sourced computer games (archived) Open Source Game Clones, specialized in open source variants of commercial games Video game lists by license Video games
314123
https://en.wikipedia.org/wiki/The%20Thackery%20T.%20Lambshead%20Pocket%20Guide%20to%20Eccentric%20%26%20Discredited%20Diseases
The Thackery T. Lambshead Pocket Guide to Eccentric & Discredited Diseases
The Thackery T. Lambshead Pocket Guide to Eccentric & Discredited Diseases (2003) is an anthology of fantasy medical conditions edited by Jeff VanderMeer and Mark Roberts, and published by Night Shade Books. The Guide claims to be 83rd in a series of editions inaugurated by the fictional Dr. Thackery T. Lambshead in 1915, and contains generally humorous entries (in varying degrees of darkness) with disease descriptions by several popular authors such as Neil Gaiman, Alan Moore and Michael Moorcock, which together detail the "secret medical history" of the 20th century. In 2004, the book was shortlisted for a Hugo Award for Best Related Book and a World Fantasy Award for Best Anthology. A sequel anthology was released in 2011 called The Thackery T. Lambshead Cabinet of Curiosities, which was co-edited by Jeff VanderMeer and Anne VanderMeer. Description Contributors Alan M. Clark Alan Moore Andrew J. Wilson Brendan Connell Brian Evenson Brian Stableford China Miéville Cory Doctorow David Langford Dawn Andrews Elliot Fintushel G. Eric Schaller Gahan Wilson Gary Couzens Harvey Jacobs Iain Rowan Jack Slay, Jr. Jay Caselburg Jeff Topham Jeffrey Ford Jeffrey Thomas John Coulthart Kage Baker K.J. Bishop L. Timmel Duchamp Lance Olsen Liz Williams Martin Newell Michael Barry Michael Bishop Michael Cisco Michael Cobley Michael Moorcock Mike O'Driscoll Nathan Ballingrud Neil Gaiman Neil Williamson Paul Di Filippo R.M. Berry Rachel Pollack Rhys Hughes Richard Calder Rikki Ducornet Robert Freeman Wexler Sara Gwenllian Jones Shelley Jackson Stepan Chapman Steve Rasnic Tem Steve Redwood Tamar Yellin Tim Lebbon Trivia The Thackery T. Lambshead Pocket Guide to Eccentric & Discredited Diseases makes an appearance in the novel Monstrocity by Jeffrey Thomas. It is also referenced in VanderMeer's own collection City of Saints and Madmen. 
External links Book review at SFFWorld.com Four Additional (Unofficial) Entries to the Guide - A tribute by author Jordan Inman 2003 books Fantasy anthologies Fictional diseases and disorders
4628618
https://en.wikipedia.org/wiki/Making%20the%20Most%20of%20the%20Micro
Making the Most of the Micro
Making the Most of the Micro is a TV series broadcast in 1983 as part of the BBC's Computer Literacy Project. It followed the earlier series The Computer Programme. Unlike its predecessor, Making the Most of the Micro delved somewhat deeper into the technicalities and uses that microcomputers could be put to, once again mainly using the BBC Micro in the studio for demonstration purposes. The series was followed by Micro Live. Presenters Ian McNaught-Davis (known as 'Mac') was once again the anchorman, but Chris Serle and Gill Nevill were absent; instead, various experts were brought in as required to demonstrate some of the more technical aspects of the microcomputers and their uses. John Coll was the main technical 'bod' (he had also written the User Guide for the BBC Micro along with other manuals) and Ian Trackman also featured; he wrote most of the software that was used for demonstrating certain features of the microcomputer, not only for this series but also The Computer Programme and Computers in Control. The programme also featured location reports to demonstrate various practical and business uses of microcomputers. The title and incidental music was by Roger Limb of the BBC Radiophonic Workshop. Programmes The series was split into 10 programmes, each about 25 minutes long and dealing with a particular subject area. 
They were as follows (original airdates in brackets): The Versatile Machine (10 January 1983) Getting Down to BASIC (17 January 1983) Strings and Things (24 January 1983) Introducing Graphics (31 January 1983) Keeping a Record (7 February 1983) Getting Down to Business (14 February 1983) Sounds Interesting (21 February 1983) Everything Under Control (28 February 1983) Moving Pictures (7 March 1983) At the End of the Line (14 March 1983) See also Micro Men The Computer Programme Computers in Control Micro Live External links Ian Trackman's web site BBC Television shows Computer science education in the United Kingdom Computer television series 1983 British television series debuts 1983 British television series endings English-language television shows
49215375
https://en.wikipedia.org/wiki/Diligent%20Corporation
Diligent Corporation
Diligent Corporation, known as Diligent, is a software as a service company that enables board members of corporations, government organisations, and not-for-profit groups to share and collaborate on information for board meetings. Headquartered in New York City, and incorporated in Delaware, a tax haven, the company has offices in Galway, Ireland; London, UK; Washington, D.C.; Munich, Germany; Budapest, Hungary; Hong Kong; and China. The company's main products are based on a software as a service model and sold by subscription. These include corporate governance technologies, such as a board portal for messaging, and voting tools for board directors and executives. Diligent's products are used by 60 percent of all Fortune 1000 companies. History Early history Diligent was founded by Brian Henry and Kiri Borg in 1994 as Manhattan Creative Partners (MCP). In 1998, SunAmerica became a client of Manhattan Creative Partners after SunAmerica director Peter Harbeck and Brian Henry discovered a joint passion as airplane owners and recreational pilots. In 1999, Alessandro Sodi was hired and trained by Henry and Borg to assist in account management. Eventually, Henry turned over the day-to-day management of the SunAmerica account to Sodi. In 2000, when SunAmerica expressed their desire to do away with their burdensome hard-copy corporate board paper books, Henry conceived of and hand-drew the design for the original and first web-based "Boardbook." Marco Morsella, the first employee hired by Henry and Borg in 1997 to be MCP's graphic designer, created a digital GUI based on Henry's design. SunAmerica became the first company to license the new, web-based Boardbook. Henry and Borg, and later the Diligent Partners, recognized the value of this innovative technology and invested most of the profits from other consultancy projects into the continuing development of the new software product. 
New company In 2003, Henry and Borg formed Diligent Partners, together with Sharon Daniels, Dan Kiley, Kenneth Carroll, Alessandro Sodi, Marc Daniels and Robert Craig. Henry was named CEO of the new entity, Sodi, President, and Carroll, chairman. At the same time, the company renamed MCP to Diligent Board Member Services "to reflect the firm's shift of focus to corporate governance service delivery." By 2006, Diligent had expanded to Europe, Canada, and the UK. Initial public offering Diligent listed on the New Zealand Exchange (NZX) in 2007 with a $24m IPO, valuing the company at $115m. While fundraising for the company was successful, prior to the listing some information became public regarding the bankruptcies of original founder and CEO Brian Henry and his brother Gerald Henry. These bankruptcies occurred in the late 1980s as a result of the 1987 share market crash. Gerald Henry was given a jail term for fraud in 1991. Diligent released a statement clarifying that Gerald Henry was not associated with the company. The day following Diligent's listing, the company announced Henry's resignation as CEO due to the oversight in the disclosure of this information. However, some journalists at the time claimed that Diligent's PR firm and attorneys "should have advised Henry to publicly acknowledge his history." In fact, Diligent's IPO attorney, Mark Russell, said his firm, Buddle Findlay, "was proud to have our brand associated with Diligent and with Brian in particular. We have no concerns whatsoever." It was reported that "Diligent's promoters and directors, including director Mark Russell of law firm Buddle Findlay, knew of Brian's connection with Energycorp, but decided not to disclose. Coincidentally, Russell, in the 1980s worked for the Bank of New Zealand, which had lent the company $15.5m on a handshake." Henry was replaced by Alessandro Sodi, who was then President.
Henry remained on the board, his value to the company recognised by the other board members. Diligent's share price collapsed to $0.14 by March 2009 but later recovered to lead the pack of technology listings. Company growth and revenue restatement issues With the release of the Apple iPad in 2010, Diligent developed an app for the new platform and commenced a period of explosive growth in client numbers. Diligent made its first operating profit in 2012. Between 2011 and 2012, company revenue grew by 165%. By mid-2013, the Diligent stock price had reached over $8.00 NZD a share. In August 2013, Diligent announced that it would need to restate revenue for the current and preceding three years due to discrepancies in the company's accounting practices. No fraud was involved, though the company was required to recognize revenue from the date a contract was signed rather than from the start of a month, and to recognize its installation fees over a longer period of time. The company also acknowledged that its accounting systems were underdeveloped and required improvement. It was fined by the NZX for a number of minor breaches of listing rules. Alessandro Sodi stepped down as CEO in 2015 to focus on the launch of Diligent Teams as Chief Product Strategy Officer. Sodi was succeeded by current CEO Brian Stafford, an ex-McKinsey partner and software-as-a-service specialist. Return to private ownership On 14 February 2016, Diligent announced that it had entered into a definitive agreement to be acquired by Insight Partners for consideration of $4.90 per share, valuing the company at US$624 million, subject to approval from the shareholders. A shareholder meeting was held in Auckland, NZ on 13 April 2016 and, having received 57% of votes in favor of the merger (as it was structured), Diligent de-listed from the NZX and returned to private hands.
The merger was opposed by the NZ Shareholders Association, which considered the offer too low, stating that the "current offer is low compared to where the company's prospects have been in the past". Since 2016, Diligent has restructured its leadership team and acquired new companies, including Brainloop, BoardPad, Boardroom Resources and Manzama. In December 2018, Diligent announced the launch of the Diligent Institute, a think tank aiming to provide customers with global research on governance practices. Modern Leadership Initiative In June 2020, Diligent launched its Modern Leadership Initiative and partnered with more than 20 private equity companies to improve board-level diversity. Each private equity firm agreed to open board seats at five of their portfolio companies to racially diverse candidates. The private equity firms included Insight Partners, Vista Equity Partners, Hellman & Friedman, Hg, Genstar Capital, The Jordan Company, and TA Associates, as well as New Mountain Capital, Francisco Partners, Heidrick & Struggles, Egon Zehnder, and the American Investment Council. Recent restructure During the COVID-19 pandemic, Diligent accelerated its global restructuring plans and is in the process of shutting down a number of global offices. These operations have been shifted to major tax havens, with the back office transferred to Ireland. Corporate affairs Leadership Diligent is managed by CEO Brian Stafford. Other key executives are:
Amanda Carty, Chief Marketing Officer
Avigail Dadone, Chief People Officer
Lisa Edwards, President and Chief Operating Officer
Michael Flickman, Chief Technology Officer
Michael Stanton, Chief Financial Officer
Dan Zitting, Chief Product Officer
Jack Van Arsdale, General Counsel
Customer and revenue As of 2019, the company reported more than 16,500 customers and 550,000 individual users from more than 90 countries.
In 2018, the company reported revenues exceeding $200 million and in July 2019 crossed $300 million in annual recurring revenue (ARR). Products Diligent's products are based on a software as a service model and sold by subscription. They are bundled within the "Governance Cloud", a secure suite of corporate governance software. After the COVID-19 outbreak, Diligent in March 2020 began offering K–12 public school districts and mission-based nonprofits free access to its board software. Awards Diligent was recognized in the Deloitte Technology Fast 500 three times, ranking #98 in 2014, #420 in 2016 and #433 in 2020. In 2018 and 2019, Diligent was named one of Inc. 5000's Fastest Growing Private Companies in America. References External links Official website Diligent Institute Financial software companies Software companies of New Zealand Software companies established in 1994 Companies listed on the New Zealand Exchange Software companies based in New York City Business software companies As a service
48438174
https://en.wikipedia.org/wiki/Heavy%20Gear%20II
Heavy Gear II
Heavy Gear II is a mecha-based first-person shooter video game. Set in Dream Pod 9's Heavy Gear universe, the game was developed and published by Activision in 1999 for Microsoft Windows; it was ported to Linux in 2000 by Loki Software. It is a sequel to the 1997 video game Heavy Gear. Plot An atrocity caused by the New Earth Concordat (NEC) — the destruction of the Badlands city of Peace River by an antimatter bomb — leads to a formal cease-fire between the Northern and Southern Leagues. A team of elite Special-Ops pilots from across the planet, equipped with advanced prototype Gears and a newly unveiled intersystem assault shuttle, are sent through the interstellar Tannhauser gate to the planet Caprice, the nearest NEC base, to find out more and strike back at the enemy. Gameplay Heavy Gear II incorporates third-person and stealth elements. Development Development of Heavy Gear II began during the autumn of 1997 with an aggressive 13-month schedule which aimed to ship the game in autumn 1998. The Heavy Gear II team was a newly formed group within Activision, composed of some of the best and brightest employees the firm had available, each of whom had previously worked on several 3D games. The design team wanted to move away from the larger BattleMech-style gears found in the original Heavy Gear to smaller, more agile gears, staying truer to the Heavy Gear universe and allowing for faster-paced and more diverse gameplay. Work began on a new game engine, named Dark Side, to power the title. The engine was built in C++, and its first priority was stability, with a focus on memory management and leak tracking, allowing developers to spot memory leaks as soon as they occurred, rather than later in a test cycle. By adopting a modular approach to their engine design, developers could easily switch different systems in and out as they experimented during the development process.
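The article does not describe the Dark Side engine's internals, but the kind of leak tracking described above — recording each allocation at the moment it happens, so anything not freed surfaces immediately rather than in a later test cycle — can be sketched as follows. This is an illustrative toy in Python (the actual engine was C++), and all names here are hypothetical:

```python
# Illustrative sketch of allocation tracking: every allocation is recorded
# with its call site, so anything still live when a leak check runs is
# reported immediately. All names are hypothetical.
import traceback

class TrackingAllocator:
    def __init__(self):
        self._live = {}          # handle -> description of the allocation site
        self._next_handle = 0

    def allocate(self, size):
        handle = self._next_handle
        self._next_handle += 1
        # Remember where the allocation came from for later reporting.
        site = traceback.extract_stack(limit=2)[0]
        self._live[handle] = f"{size} bytes at {site.name}:{site.lineno}"
        return handle

    def free(self, handle):
        if handle not in self._live:
            raise RuntimeError("double free or unknown handle")
        del self._live[handle]

    def leaks(self):
        # Anything still live when a subsystem shuts down is a leak.
        return list(self._live.values())

alloc = TrackingAllocator()
a = alloc.allocate(64)
b = alloc.allocate(128)
alloc.free(a)
print(len(alloc.leaks()))  # 1 -- the 128-byte allocation was never freed
```

In C++ the same idea is typically implemented by routing allocations through overloaded operator new/delete that record the file and line of each call.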
Although there were debates within the team regarding the issue, it was concluded that software rendering was insufficient to deliver on the team's visual goals, and so support was dropped from the engine, making video cards a necessity to run Heavy Gear II. This allowed the team to focus on delivering a single high-quality product, saving it from having to replicate in-game assets at lower levels of detail for lower-specification machines, cutting time in production and testing. By targeting only video card equipped machines, Heavy Gear II could feature more units on screen and greater environmental detail. Offloading the visual rendering to dedicated video cards freed up CPU cycles for other uses, such as AI systems. The approach to AI was to go for a high-level "autopilot" system, whereby non-player characters could decide for themselves what objectives to achieve and how to go about it. This reduced the need for individually scripted actions, and enforced a consistent feel across all missions. Enemy units have a tactical awareness of the battlefield, and will keep track of the successes and failures of different approaches against the player character, varying their tactics depending on the situation. Enemy units may split off into separate fireteams, call upon reinforcements or order artillery strikes against the player. Multiplayer was an important element in the game, and it was implemented using an internal Activision networking SDK early in development, before even the single-player portion of the game. Although co-operative multiplayer was also planned, this feature had to be dropped, as it proved impossible to populate the game with enough enemy units to make it challenging. The missions designed for the planned co-op mode were, however, saved for the standalone single-player "Historic Missions" in the final game. Despite the experience of the team, Heavy Gear II did not adhere to the intended schedule.
The team underestimated the work needed to produce a demo, and did not factor the costs of employee turnover into the original schedule. When the game was released in June 1999, it had missed the intended ship date by approximately nine months. This slippage also had a negative impact on the effectiveness of the marketing campaign. The Dark Side engine would be used again in Activision's Interstate '82, released later that year. In October 1999, Loki Software was assigned to port Heavy Gear II to Linux. The Linux port took Loki Software approximately six months to complete, and was described by Loki as "their most challenging product to date". As Heavy Gear II uses Microsoft's proprietary Direct3D API for graphical rendering, Loki had to port this to OpenGL in order for it to run on Linux. To realise the game's 3D sound on Linux, Loki worked with support from Creative Technology to build an open standard audio API from scratch with a cross-platform free software implementation, named OpenAL. Heavy Gear II was the first Direct3D game to be ported to OpenGL, and was the first to implement OpenAL. The Linux port was released in April 2000, and a demo followed in June. Reception The game received favourable reviews according to the review aggregation website GameRankings. The game was compared favourably to its predecessor, Heavy Gear, with PC Zone stating that it was a huge improvement over the first game, and GameSpot believing that Activision had redeemed itself for the failures of the original. Critics praised the gameplay. NextGen described the game as "a big, splendid mech sim, with lots of depth and excellent balance between action and strategy". PC Zone echoed similar sentiments, calling it "a commendable stab at creating a balanced blend of strategy and all-out bot-blasting action".
IGN highlighted the diversity on offer, stating that the "variety in environments, variety in Gear loadout and variety in gameplay" was above and beyond that offered by others in the genre. The Dark Side-powered graphics were well received. GameSpot described the graphics as being "very impressive, with detailed, heavily articulated gears and vehicles, natural-looking fauna [and] vivid landscapes". Maximum PC described the engine as "gorgeous", praising it for providing "sharply rendered mechs and terrain details, as well as eye-popping explosions that light up the landscape" and other graphical effects. It did note, however, that there were "a few clipping and polygon-collision-detection glitches"; similar detractions were noted by Linux Journal and IGN. Performance-wise, PC Gamer UK found it ran surprisingly well, even below the minimum requirements, while Maximum PC noted that the game maintained a consistent 30 fps on their test machine. IGN warned, however, that they encountered several crash bugs in the instant action and historical mission modes. Critics enjoyed the Gear customisation options. IGN, NextGen and Linux Journal all highlighted the sheer number of options available. Although GameSpot felt that the "threat level" limitation was artificial, it praised the flexibility it offered. PC Zone stated it was "a joy to give birth to these new creations and see them succeed in the task they were created for", in regard to the Gear customisation process. Response to the AI was mixed. GameSpot praised the AI as challenging, and PC Zone described the enemy AI as being "clued-up on battle tactics", able to "dodge and weave all over the place" as well as seek reinforcements. PC Gamer UK disagreed, accusing the AI of being "absolutely, hilariously terrible", a view shared by Linux Journal, which described the AI as being "basically pathetic"; both critics cited examples of poor pathfinding. Critics reacted favourably to the multiplayer options.
PC Zone described it as being "well-conceived and well-supported", highlighting the various modes of multiplayer gameplay and the "thriving clan structure". While IGN felt that there was nothing particularly new or innovative in the multiplayer game, and Maximum PC felt that it was not as strong as the single-player experience, both critics still felt the multiplayer experience to be worthwhile. GameSpot compared the multiplayer element favourably to that of rival mecha game MechWarrior 3, but, like Linux Journal, found the lack of co-operative multiplayer a disappointment. Some critics pointed out difficulty in adapting to the controls. PC Zone found the controls frustrating, warning readers about the "virtually insurmountable learning curves". The control system was described as the most notable flaw in the game by IGN, with the critic having to resort to using a printed controls reference card included with the game. Linux Journal looked at it from the other direction, pointing out that once mastered, "the payoff, however, is extraordinary; the level of control a player has in HG2 is amazing". The game was commercially unsuccessful. In the United States, it sold 15,000 units by the end of July 1999, according to PC Data. Mark Asher of CNET Gamecenter called these figures "pretty awful" and remarked that it "has to be a major disappointment for Activision". The game's sales in the United States rose to 28,598 copies by the end of 1999. Daily Radar's Andrew S. Bub named it the most unfairly overlooked simulation title of the year. The game was nominated for Computer Gaming World's 1999 "Science Fiction Simulator of the Year" award, which went to Freespace 2. It was also nominated for PC PowerPlay's "Best Fantasy Sim" award, which went to MechWarrior 3. Although a sequel was reported to be in production at Savage Entertainment, this was ultimately cancelled.
Activision would discontinue development on the Heavy Gear franchise in 2002, believing that strictly PC games such as Heavy Gear II did not have a viable future with the launch and growth of sixth generation consoles. No further news of Heavy Gear video games appeared until 2012, when Dream Pod 9 announced that Stompy Bot Productions were to produce a new game. In 2013, Stompy Bot launched a crowdfunding campaign, aiming to raise $900,000 towards the development of Heavy Gear Assault. Though the Kickstarter element of the campaign failed, development of the game is still ongoing. Notes References External links 1999 video games Activision games Heavy Gear video games Linux games Loki Entertainment games Multiplayer and single-player video games Multiplayer online games Video game sequels Video games about mecha Video games based on tabletop role-playing games Video games developed in the United States Windows games
5448360
https://en.wikipedia.org/wiki/Software%20manufacturing
Software manufacturing
Software manufacturing is the process of producing software in ways similar to the manufacturing of tangible goods. In this way of conducting business, each copy of the software is priced and sold as though it were a tangible product. The sales process is usually conducted by per-copy or per-desktop software licensing. Under this method, the software is developed by software engineering firms specializing in such practices, distributed through retail stores, and sold on a per-unit basis to the buyer at a price greater than zero. Yet, discounting sales support, tech support, advertising, packaging, licensing agreements, refund and return handling, liability insurance, sales processing, support of new operating systems, and other costs, software has a zero marginal cost per copy to the producer. Like other manufactured goods, manufactured software can have errors, and Total quality management (TQM) can be applied in the process. Both proprietary software and free software can be produced in this model and sold and distributed as commercial software. References See also Commercial software Proprietary software Software development process Software factory Code morphing Code obfuscation Software distribution
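The per-copy economics described in the article above — a positive sale price despite an essentially zero marginal cost of reproduction — can be made concrete with a toy calculation. All figures here are invented for illustration:

```python
# Toy model of per-copy software economics: development is a fixed cost,
# reproduction is (essentially) free, so every copy sold past break-even
# contributes almost its entire price as margin. All numbers are invented.
fixed_costs = 500_000.0      # development, advertising, liability insurance...
price_per_copy = 49.99
marginal_cost = 0.0          # cost of producing one more copy

def profit(copies_sold):
    return copies_sold * (price_per_copy - marginal_cost) - fixed_costs

break_even = fixed_costs / (price_per_copy - marginal_cost)
print(round(break_even))   # ~10002 copies needed to recover fixed costs
print(profit(50_000))      # everything past break-even is pure margin
```

The contrast with tangible goods is that for them `marginal_cost` is materially greater than zero, so margin per unit stays bounded no matter how many units are sold.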
55125284
https://en.wikipedia.org/wiki/Justin%20Davis%20%28gridiron%20football%29
Justin Davis (gridiron football)
Justin Davis (born November 11, 1995) is a gridiron football running back for the Ottawa Redblacks of the Canadian Football League (CFL). He played college football at USC and signed with the Los Angeles Rams as an undrafted free agent in 2017. Early life Davis attended Lincoln High School in Stockton, California, where he played high school football. College career Davis played college football for the USC Trojans. He was named third-team All-Pac-12 by analyst Phil Steele in 2015. Collegiate statistics Professional career Los Angeles Rams Davis signed with the Los Angeles Rams as an undrafted free agent on April 29, 2017. In Week 1, against the Indianapolis Colts in his NFL debut, Davis had his first career carry, a one-yard rush, in the 46–9 victory. As a rookie, he appeared in four games. In the 2018 season, he had two carries for 19 yards. Davis was waived during final roster cuts on August 31, 2019. Arizona Cardinals On September 3, 2019, Davis was signed to the Arizona Cardinals' practice squad. His practice squad contract with the team expired on January 6, 2020. Ottawa Redblacks Davis signed with the Ottawa Redblacks of the CFL on July 16, 2021, and was subsequently placed on the team's suspended list. References External links Los Angeles Rams bio USC Trojans bio 1995 births Living people African-American players of American football American football running backs Players of American football from Stockton, California Players of Canadian football from Stockton, California USC Trojans football players Los Angeles Rams players Arizona Cardinals players Ottawa Redblacks players 21st-century African-American sportspeople
17798594
https://en.wikipedia.org/wiki/India%E2%80%93Saudi%20Arabia%20relations
India–Saudi Arabia relations
India–Saudi Arabia relations, also referred to as Indian-Saudi relations or Indo-Saudi relations, are the bilateral relations between the Republic of India and the Kingdom of Saudi Arabia. Relations between the two nations are generally strong and close, especially in commercial interests. Indo-Saudi bilateral trade reached US$27.48 billion in the financial year 2017–18, up from US$25.1 billion in the preceding year. Saudi Arabia's exports to India stood at US$22.06 billion, whereas India's exports were US$5.41 billion. History Trade and cultural links between ancient India and Arabia date back to the third millennium BC. By 1000 AD, trade relations between southern India and Arabia flourished and became the backbone of the Arabian economy. Arab traders held a monopoly over the spice trade between India and Europe until the rise of European imperialist empires. India was one of the first nations to establish ties with the Third Saudi State. During the 1930s, British India heavily funded Nejd through financial subsidies. Formal diplomatic relations between contemporary India and Saudi Arabia were established soon after India gained independence in 1947. Relations between the two countries have strengthened considerably owing to collaboration in regional affairs and trade. Saudi Arabia is one of the largest suppliers of oil to India, while India is one of Saudi Arabia's top seven trading partners and the fifth-biggest investor in the kingdom. To date, there have been four visits to Saudi Arabia by an Indian Prime Minister: Jawaharlal Nehru (1955), Indira Gandhi (1982), Manmohan Singh (2010) and Narendra Modi (2016). The two countries share similar views on combating terrorism. There is a strong geographic and demographic connection between the two countries, as Islam is the second-largest religion in India. Moreover, the Muslim population in India is the second largest in the world, sending a large number of Indian pilgrims to Saudi Arabia every year for Umrah and Hajj.
Background Since its independence in 1947, India has sought to maintain strong ties with Saudi Arabia, an important regional power and trading base in West Asia. In a major visit by King Saud of Saudi Arabia to India in November 1955, both nations agreed to shape their relationship based on the Five Principles of Peaceful Co-existence. Saudi Arabia is also home to more than 1.4 million Indian workers. India was the only South Asian nation to recognise the Soviet-backed Democratic Republic of Afghanistan, whereas Saudi Arabia was one of the key supporters of the Afghan mujahideen, who fought the Soviets and their Afghan allies from Pakistan. Development of bilateral relations India's strategic relations with Saudi Arabia have been affected by the latter's relations with Pakistan. Saudi Arabia supported Pakistan's stance on the Kashmir conflict and during the Indo-Pakistani War of 1971, at the expense of its relations with India. The Soviet Union's close relations with India also negatively affected Indo-Saudi relations. During the Persian Gulf War (1990–91), India officially maintained neutrality. Saudi Arabia's close military and strategic ties with Pakistan have also been a source of continuing strain. Since the 1990s, both nations have taken steps to improve ties. Saudi Arabia has supported granting observer status to India in the Organisation of Islamic Cooperation (OIC) and has expanded its collaboration with India to fight Islamic terrorism in the Middle East. In January 2006, King Abdullah of Saudi Arabia made a special visit to India, becoming the first Saudi monarch in 51 years to do so. The Saudi king and the Indian Prime Minister Manmohan Singh signed an agreement forging a strategic energy partnership that was termed the "Delhi Declaration." The pact provides for a "reliable, stable and increased volume of crude oil supplies to India through long-term contracts." 
Both nations also agreed on joint ventures and the development of oil and natural gas in public and private sectors. An Indo-Saudi joint declaration in the Indian capital New Delhi described the king's visit as "heralding a new era in India-Saudi Arabia relations." In 2019, Saudi Arabia increased India's Hajj quota, making India the country with the second-highest number of pilgrims. Thus, the number of Indian pilgrims jumped to 200,000 per year in 2019. In October 2021, India voted against a Dutch resolution in the UN Human Rights Council that proposed extending the mandate of the Group of Eminent Experts (GEE) human rights investigators in Yemen. This resolution had been heavily lobbied against by Saudi Arabia. Commerce Since the 1990s, India's economic liberalisation has helped bolster trade with Saudi Arabia, which annually supplies India with nearly 175 million barrels (25 million metric tonnes) of crude oil, or a quarter of its needs. In 2006–07, bilateral trade stood at US$16 billion (US$3 billion excluding oil) and was expected to double by 2010. India's current exports to Saudi Arabia stand at US$2.6 billion, while Saudi Arabia's exports are in the range of US$13.4 billion (US$1.2 billion excluding oil). India's major exports include basmati rice, textiles and garments, and machinery, while it imports organic and inorganic chemicals, metal scrap, leather, gold and oil from Saudi Arabia. Both nations are expected to expand trade, cooperation and joint ventures in telecommunications, pharmaceuticals, health services, information technology, biotechnology, agriculture, construction projects, energy and financial services. Both countries agreed to launch joint ventures for developing gas-based fertiliser plants in Saudi Arabia.
India agreed to set up institutes of higher education and research, provide educational opportunities in India for Saudi Arabian students, and expand cooperation between India's Council of Scientific and Industrial Research and the Saudi Arabian Standards Organisation (SASO). India–Saudi trade was almost US$25 billion in the last fiscal year, with about 2 million Indians working in Saudi Arabia. Saudi imports from India amounted to $7 billion, or 2.7% of India's overall exports, in 2015. The 10 major commodities exported from India to Saudi Arabia were:
Cereals: $1.8 billion
Spices: $1.1 billion
Machinery: $459 million
Iron or steel products: $367.8 million
Organic chemicals: $336.8 million
Meat: $267.9 million
Vehicles: $229 million
Ceramic products: $217.4 million
Electronic equipment: $203.9 million
Clothing (not knit or crochet): $162.8 million
Saudi exports to India amounted to $21.4 billion, or 5.5% of India's overall imports, in 2015. The 10 major commodities exported from Saudi Arabia to India were:
Oil: $16.4 billion
Organic chemicals: $1.2 billion
Plastics: $954.2 million
Fertilizers: $729.9 million
Gems, precious metals: $631.4 million
Aircraft, spacecraft: $542.9 million
Inorganic chemicals: $199.5 million
Aluminum: $193.6 million
Copper: $142.9 million
Other chemical goods: $95.3 million
In 2018, bilateral trade between the two countries reached $27.5 billion. Bilateral investment India and Saudi Arabia are developing countries and need a two-way flow of investment in infrastructure and development. Progressive growth in bilateral investment has been observed since India's liberalisation policy of 1991, with a somewhat faster increase in the new millennium. Saudi Arabia ranks 15th in country-wise FDI joint ventures in India and is second among Arab countries, followed by the UAE. Saudi FDI joint ventures were worth US$21.55 million during 2004–05 to 2007–08.
Saudi Arabia is also among the major FDI-investing countries in India, with investments made from August 1991 to December 1999 and from January 2000 to August 2008. Investment has been observed in diverse fields such as paper manufacture, chemicals, computer software, granite processing, industrial products and machinery, cement, metallurgical industries, etc. Indian firms have also shown interest in the Saudi market following the new Saudi laws, establishing joint venture projects or wholly owned subsidiaries in the Kingdom. According to a Saudi investment authority survey, India had 56 FDI projects worth 304 million SAR in Saudi Arabia during 2005. These licenses are for projects in different sectors such as management and consultancy services, construction projects, telecommunications, information technology, pharmaceuticals, etc. Moreover, several Indian companies have established collaborations with Saudi companies and are working in the Kingdom in the areas of design, consultancy, financial services and software development. 2010 visit to Saudi Arabia by Manmohan Singh Indian Prime Minister Manmohan Singh undertook a three-day visit to Saudi Arabia beginning 27 February 2010. He was accompanied by his wife Gursharan Kaur, Health Minister Ghulam Nabi Azad, Commerce and Industry Minister Anand Sharma, Petroleum Minister Murli Deora and Minister of State for External Affairs Shashi Tharoor. It was the first visit to the kingdom by an Indian Prime Minister since 1982 and the third to date. In a rare diplomatic gesture symbolising the strong cultural and socio-economic ties between the two nations, Dr Singh and his official delegation were received at the royal terminal of King Khalid International Airport by Crown Prince Sultan bin Abdul Aziz, accompanied by his entire cabinet. In a departure from protocol norms, a red carpet was rolled out for the Prime Minister, instead of the traditional green carpet.
The nearly 40 km route from the airport to the city centre was lined with Indian and Saudi Arabian flags. On the second day a formal reception was held in honour of the state guests. Singh was scheduled to address the Majlis-e-Shura, a privilege that has been described as "a singular honour". Female diplomat Latha Reddy was permitted not to wear the abaya or the hijab. This special gesture was described as "largely symbolic in nature, but it is a sign of the changing times". During his visit Singh received an honorary doctorate from Saudi Arabia's King Saud University. An MoU for co-operation was signed between the Indian Institute of Science, Bangalore and King Saud University in the presence of the Prime Minister. Later, speaking at a community event at the Indian Embassy hosted by Indian Ambassador Talmeez Ahmed, the Prime Minister praised the contributions made by the more than 1.8 million Indian citizens in the country. "India is proud of you and proud of your achievements in this country," he said. An extradition treaty was signed by Indian Health and Family Welfare Minister Ghulam Nabi Azad and Prince Naif bin Abdul Aziz, the Saudi Arabian Second Deputy Prime Minister and Interior Minister. Four agreements, pertaining to the transfer of sentenced persons, cultural co-operation, a Memorandum of Understanding between the Indian Space Research Organisation and King Abdulaziz City for Science and Technology for co-operation in the peaceful use of outer space, and joint research and information technology, were also signed in the presence of the two leaders. Four other agreements were signed the day before, including one by Tata Motors to supply school buses worth US$80 million. Singh returned home on 1 March 2010, concluding his three-day visit. The visit was seen as India's attempt to deepen the relationship between the two countries and make a pitch for investment from Saudi Arabia.
2016 visit by Narendra Modi Indian Prime Minister Narendra Modi visited Saudi Arabia in April 2016. The agenda of the visit was energy, security, trade and the well-being of the Indian diaspora. During the visit, Modi was conferred the Order of King Abdulaziz, Saudi Arabia's highest civilian honor. 2019 visit to India The Saudi Crown Prince, Muhammad bin Salman, visited India during his tour of several Asian countries in February 2019. The crown prince met the Indian Prime Minister as well as a number of high officials in India. The main aim of the visit was to strengthen the historical ties between the two countries. The two sides agreed on increasing trade relations between them. Moreover, the number of Indian pilgrims performing Hajj in Saudi Arabia was increased to 200,000 every year. The Saudi prince said that Saudi investment in India may reach $100 billion in the next two years. See also Indians in Saudi Arabia Foreign relations of India Foreign relations of Saudi Arabia References Saudi Arabia Bilateral relations of Saudi Arabia
2526997
https://en.wikipedia.org/wiki/EnOcean
EnOcean
The EnOcean technology is an energy harvesting wireless technology used primarily in building automation systems, but also in other application fields such as industry, transportation, logistics and smart home solutions. The energy harvesting wireless modules are manufactured and marketed by the company EnOcean, headquartered in Oberhaching near Munich. The modules combine micro energy converters with ultra-low-power electronics and wireless communications and enable batteryless wireless sensors, switches, and controls. In March 2012, the EnOcean wireless standard was ratified as the international standard ISO/IEC 14543-3-10, which is optimized for wireless solutions with ultra-low power consumption and energy harvesting. The standard covers OSI (Open Systems Interconnection) layers 1-3, which are the physical, data link and networking layers. EnOcean offers its technology and licenses for the patented features within the EnOcean Alliance framework. Technology EnOcean technology is based on the energetically efficient exploitation of applied slight mechanical motion and other potentials from the environment, such as indoor light and temperature differences, using the principles of energy harvesting. In order to transform such energy fluctuations into usable electrical energy, electromagnetic, solar-cell, and thermoelectric energy converters are used. EnOcean-based products (such as sensors and light switches) perform without batteries and are engineered to operate maintenance-free. The radio signals from these sensors and switches can be transmitted wirelessly over a distance of up to 300 meters in the open and up to 30 meters inside buildings. Early designs from the company used piezo generators, but these were later replaced with electromagnetic energy sources to reduce the operating force (to 3.5 newtons) and increase the service life (to 100 operations a day for more than 25 years). 
EnOcean wireless data packets are relatively small, with the packet being only 14 bytes long, and are transmitted at 125 kbit/s. RF energy is only transmitted for the 1's of the binary data, reducing the amount of power required. Three packets are sent at pseudo-random intervals, reducing the possibility of RF packet collisions. Modules optimized for switching applications transmit additional data packets on the release of push-button switches, enabling other features such as light dimming to be implemented. The transmission frequencies used for the devices are 902 MHz, 928.35 MHz, 868.3 MHz and 315 MHz. On May 30, 2017, EnOcean unveiled a series of light switches utilizing Bluetooth Low Energy radio (2.4 GHz). Application examples One example of the technology is a battery-free wireless light switch. Advantages are that it saves time and material by eliminating the need to install wires between the switch and the controlled device, e.g., a light fixture. It also reduces noise on switched circuits, as the switching is performed locally at the load. Other lighting applications include occupancy sensors, light sensors and key card switches. Furthermore, heating, ventilation, and air conditioning (HVAC) applications such as temperature sensors, humidity sensors, CO2 sensors and metering sensors already use EnOcean's energy harvesting wireless technology. Company EnOcean GmbH is a venture-funded spin-off company of Siemens AG founded in 2001. It is a German company headquartered in Oberhaching, near Munich, which currently employs 60 people in Germany and the USA. It is a technology supplier of energy harvesting wireless modules (transmitters, receivers, transceivers, energy converters) to companies (e.g. 
Siemens Building Technologies, Distech Controls, Zumtobel, Omnio, Osram, Eltako, Wieland Electric, Pressac, Peha, Thermokon, Wago, Herga), which develop and manufacture products used in building automation (light, shading, HVAC), industrial automation, and other application fields such as the automotive industry (replacement of the conventional battery in tyre pressure sensors). The company won the Bavarian Innovation Prize 2002 for its technology, the "Technology Pioneer 2006" award from the World Economic Forum, and the "Top-10 Product for 2007" award from Building Green, and was among the Global Cleantech 100 in 2011. In November 2007, MK Electric, the manufacturer of consumer electrical fitments in the UK, adopted EnOcean technology for wireless switches. In April 2012, EnOcean wireless technology was ratified as the international wireless standard ISO/IEC 14543-3-10 Information technology - Home Electronic Systems (HES) - Part 3-10: Wireless Short-Packet (WSP) protocol optimized for energy harvesting - Architecture and lower layer protocols. EnOcean Alliance A group of companies including EnOcean, Texas Instruments, Omnio, Sylvania, Masco, and MK Electric formed the EnOcean Alliance in April 2008 as a non-profit, mutual benefit organization. The EnOcean Alliance aims to internationalise this technology, and is dedicated to creating interoperability between the products of OEM partners, in order to bring about a broad range of interoperable wireless monitoring and controlling products for use in and around residential, commercial and industrial buildings. To this end the EnOcean Alliance has drawn up application-level protocols referred to as EEPs (EnOcean Equipment Profiles). Together with the three lower layers of the international wireless standard ISO/IEC 14543-3-10, these lay the foundation for a fully interoperable, open wireless technology. More than 250 companies currently belong to the EnOcean Alliance. 
The headquarters of the organization is in San Ramon, California. Market research company WTRS estimated that EnOcean module shipments might reach $1.4B in 2013. Software automation EnOcean is supported by Fhem and ago control. Fhem and ago control are GPL licensed software suites for house automation. They are used to automate some common tasks in the household like switching lamps, shutters, heating, etc., and to log events like temperature, humidity, and power consumption. Both run as servers that are controlled via web front-end, telnet, command line, or TCP/IP directly. See also IEEE 802.15.4 Insteon MyriaNed ZigBee Z-Wave References External links Arveni s.a.s website: other examples of vibration and pulse piezo energy harvesting EnOcean website EnOcean Alliance website Easyfit by EnOcean website Home Electronic Systems (HES) -- Part 3-10: Wireless Short-Packet (WSP) protocol Companies based in Bavaria Electronics companies of Germany Building automation Personal area networks Energy harvesting Automation
780419
https://en.wikipedia.org/wiki/Philip%20Don%20Estridge
Philip Don Estridge
Philip Donald Estridge (June 23, 1937 – August 2, 1985), known as Don Estridge, was an American computer engineer who led development of the original IBM Personal Computer (PC), and thus is known as the "father of the IBM PC". His decisions dramatically changed the computer industry, resulting in a vast increase in sales of personal computers and creating an entire industry of hardware manufacturers of IBM PCs. Early life Estridge was born in Jacksonville, Florida. His father was a professional photographer. He graduated from Bishop Kenny High School in 1955, and from the University of Florida in 1959. He married Mary Ann Hellier in September 1958, and they had three children: Patricia Ann, Mary Evelyn and Sandra Marie. He completed a bachelor's degree in electrical engineering at the University of Florida and worked for the Army designing a computer-based radar system, then at IBM, and finally at NASA's Goddard Space Flight Center, until he moved to Boca Raton, Florida, in 1969. IBM Before leading the team that developed the IBM PC, he had been the lead manager for the development of the IBM Series/1 mini-computer. After this project, in 1979, he was assigned to manage a Series/1 special bid development group. This engineering and planning organization was responsible for responding to custom system solutions requested by large account sales and marketing representatives. One of the largest and most successful special bids ever won by IBM was the Series/1 Agent Computer System for the State Farm Insurance company. In mid-1980, he was rewarded with the opportunity to lead IBM's efforts in the emerging personal computer business. 
His efforts to develop the IBM PC began when he took control of the IBM Entry Level Systems in 1980 (and was later named president of the newly formed IBM Entry Systems Division (ESD) in August 1983), with the goal of developing a low-cost personal computer to compete against increasingly popular offerings from the likes of Apple Computer, Commodore International, and other perceived IBM competitors. To create a cost-effective alternative to those companies' products, Estridge realized that it would be necessary to rely on third-party hardware and software. This was a marked departure from previous IBM strategy, which centered on in-house vertical development of complicated mainframe systems and their requisite access terminals. Estridge also published the specifications of the IBM PC, allowing a booming third-party aftermarket hardware business to take advantage of the machine's expansion card slots. The competitive cost and expandability options of the first model, the IBM PC model 5150, as well as IBM's reputation, led to strong sales to both enterprise and home customers. Estridge was rapidly promoted, and by 1984 was IBM Vice President, Manufacturing, supervising all manufacturing worldwide. Steve Jobs offered Estridge a multimillion-dollar job as president of Apple Computer but he declined. Death and legacy Estridge and wife Mary Ann were killed in the crash of Delta Air Lines Flight 191 at Dallas/Fort Worth International Airport on August 2, 1985. He was 48 years old. The Estridges were survived by their four daughters. At the time of his death, IBM ESD, which included the development and manufacturing of the IBM PC, PC DOS, PC LAN and TopView, had nearly 10,000 employees and had sold over a million PCs. Estridge has been honored many times. In 1999, he was identified in CIO magazine as one of the people who "invented the enterprise". 
The Don Estridge High-Tech Middle School — formerly IBM Facility Building 051 — in Boca Raton, Florida, is named after him, and on the occasion of its dedication received from Don Estridge's family his personal IBM 5150 computers. References External links , Jan Winston, CIO Magazine, Dec. 15, 1999/Jan. 1 2000. Part of Inventing the Enterprise View from the Top by Michael J. Miller, PC Magazine, 09.04.01 The History Of The IBM Personal Computer 1937 births 1985 deaths University of Florida College of Engineering alumni People from Jacksonville, Florida IBM employees Bishop Kenny High School alumni Victims of aviation accidents or incidents in 1985 Victims of aviation accidents or incidents in the United States Accidental deaths in Texas
66205269
https://en.wikipedia.org/wiki/Rinki%20Sethi
Rinki Sethi
Rinki Sethi has been vice president and Chief Information Security Officer at Twitter, Inc. since September 2020. Sethi joined Twitter after the 2020 Twitter bitcoin scam breach, which compromised the accounts of then-presidential candidate Joe Biden, Kim Kardashian, Elon Musk and Microsoft co-founder Bill Gates. She was an information security executive at IBM from 2018 to 2019. She has also worked with companies such as Walmart, Intuit, eBay and others as a CISO and security expert. Early life Sethi is an Indian American, born to Punjabi parents. She completed a bachelor's degree in computer science engineering at UC Davis in 2004, and a master's degree in information security at Capella University from 2006 to 2007. Career Sethi started her career at Pacific Gas and Electric Company as an information security specialist from July 2004 to August 2006. From August 2006 to March 2009 she worked at Walmart as a security engineer. She then held various security roles at eBay from 2009 to September 2012, and worked at Intuit as a Director of Product Security from 2012 to 2015. She was VP, Security Operations at Palo Alto Networks from 2015 to 2018, and Vice President of Information Security at IBM from October 2018 to April 2019. Sethi served as chief information security officer at Rubrik from April 2019 to September 2020. Sethi also serves as an advisor to several startups, including LevelOps and Authomize, and to cybersecurity organizations, including Women in CyberSecurity. Sethi was named to the board of directors of ForgeRock in August 2021. References Year of birth missing (living people) Living people American computer specialists 21st-century American businesspeople Corporate executives Businesspeople from San Francisco Twitter people University of California, Davis alumni
21400017
https://en.wikipedia.org/wiki/File%20sequence
File sequence
In computing, as well as in non-computing contexts, a file sequence is a well-ordered, (finite) collection of files, usually related to each other in some way. In computing, file sequences should ideally obey some kind of locality of reference principle: not only should all the files belonging to the same sequence be locally referenced to each other, but the closer two files are in the ordering relation, the closer their references should be. Explicit file sequences are sequences whose filenames all end with a numeric or alphanumeric tag (excluding the file extension). The aforementioned locality of reference usually pertains either to the data, the metadata (e.g. their filenames or last-access dates), or the physical proximity within the storage media they reside in. In the latter sense it is better to speak of file contiguity (see below). Identification Most GUI programs show the contents of folders by ordering their files according to some criteria, mostly related to the files' metadata, like the filename. The criterion is, by default, the alphanumeric ordering of filenames, although some operating systems do this in "smarter" ways than others: for example file2.ext should ideally be placed before file10.ext, as GNOME Files and Thunar do, whereas, alphanumerically, it comes after (more on that later). Other criteria exist, like ordering files by their file type (or by their extension) and, within the same type, by either filename or last-access date, and so on. For this reason, when a file sequence has a stronger locality of reference, particularly when it is related to the files' actual contents, it is better to highlight this fact by letting their well-ordering induce an alphanumeric ordering of the filenames too. That is the case of explicit file sequences. 
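The difference between plain alphanumeric ordering and the "smarter" natural ordering just described can be demonstrated with a short Python sketch (illustrative only; the filenames are made up):

```python
import re

def natural_key(name):
    # Split the name into digit and non-digit runs, comparing digit runs
    # numerically, so that "file2.ext" sorts before "file10.ext".
    return [int(tok) if tok.isdigit() else tok.lower()
            for tok in re.split(r'(\d+)', name)]

names = ["file10.ext", "file2.ext", "file1.ext"]
print(sorted(names))                   # plain:   ['file1.ext', 'file10.ext', 'file2.ext']
print(sorted(names, key=natural_key))  # natural: ['file1.ext', 'file2.ext', 'file10.ext']
```

Note that under plain alphanumeric ordering, file10.ext is permuted before file2.ext; the natural key restores the intended sequence order.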
Explicit file sequences Explicit file sequences have the same filename (including file extensions, in order to confirm their contents' locality of reference) except for the final part (excluding the extension), which is a sequence of either numeric, alphanumeric or purely alphabetical characters to force a specific ordering; such sequences should also ideally be located all within the same directory. In this sense any files sharing the same filename (and possibly extension), only differing by the sequence number at the end of the filename, automatically belong to the same file sequence, at least when they are located in the same folder. It is also part of many naming conventions that number-indexed file sequences (in any number base), containing as many files as to span at most a fixed number of digits, make use of "trailing zeroes" in their filenames, so that: all the files in the sequence share exactly the same number of characters in their complete filenames; non-smart alphanumeric orderings, like those of operating systems' GUIs, do not erroneously permute them within the sequence. To better explain the latter point, consider that, strictly speaking, file2.ext (the second file in the sequence) comes alphanumerically after file100.ext, which is actually the hundredth. By renaming the second file to file002.ext with two trailing zeroes, the problem is universally solved. Examples of explicit file sequences include: file00000.ext, file00001.ext, file00002.ext, ..., file02979.ext (with five digits and trailing zeroes), and another with a hexadecimal ordering of 256 files: tag_00.ext, tag_01.ext, ..., tag_09.ext, tag_0A.ext, ..., tag_0F.ext, tag_10.ext, ..., tag_1F.ext, ..., tag_FF.ext (with just one trailing zero). Software and programming conventions usually represent a file sequence as a single virtual file object, whose name is comprehensively written in C-like formatted-string notation to represent where the sequence number is located in the filename and what its formatting is. 
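The zero-padding convention can be sketched with a printf-style pattern in Python (an illustrative sketch following the first example above):

```python
# Build a zero-padded explicit sequence: file00000.ext ... file02979.ext
pattern = "file%05d.ext"            # C-like pattern: 5-digit, zero-padded index
names = [pattern % i for i in range(2980)]

print(names[0], names[-1])          # file00000.ext file02979.ext
# With equal-width padding, naive lexicographic order agrees with numeric order:
assert sorted(names) == names
```

The final assertion is the point of the convention: once every filename has the same width, even a non-smart alphanumeric sort preserves the sequence order.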
For the two examples above, that would be file%05d.ext and tag_%02X.ext, respectively, whereas for the former one, the same convention without trailing zeroes would be file%5d.ext. Note, however, that such notation is usually not valid at operating system and command-line interface levels, because the '%' character is neither a valid regular expression nor a universally legal filename character: the notation just stands as a placeholder for the virtual file-like object representing the whole explicit file sequence. Notable software packages acknowledging explicit file sequences as single filesystem objects, rather typical in the Audio/Video post-production industry (see below), are found among products by Autodesk, Quantel, daVinci, DVS, as well as Adobe After Effects. File scattering A file sequence located within a mass storage device is said to be contiguous if: every file in the sequence is unfragmented, i.e. each file is stored in one contiguous and ordered piece of storage space (ideally in one or multiple, but contiguous, extents); consecutive files in the sequence occupy contiguous portions of storage space (extents, yet consistently with their file ordering). File contiguity is a more practical requirement for file sequences than just their locality of reference, because it relates to the storage medium hosting the whole sequence rather than to the sequence itself (or its metadata). At the same time, it is a "high-level" feature, because it is not related to the physical and technical details of mass storage itself: in particular, file contiguity is realized in different ways according to the storage device's architecture and actual filesystem structure. At "low level", each file in a contiguous sequence must be placed in contiguous blocks, in spite of reserved areas or special metadata required by the filesystem (like inodes or inter-sector headers) actually interleaving them. 
File contiguity is, in most practical applications, "invisible" at the operating-system or user level, since all the files in a sequence are always available to applications in the same way, regardless of their physical location on the storage device (due to operating systems hiding the filesystem internals from higher-level services). Indeed, file contiguity may be related to I/O performance when the sequence is to be read or written in the shortest time possible. In some contexts (like optical disc burning - cf. below), data in a file sequence must be accessed in the same order as the file sequence itself; in other contexts, "random" access to the sequence may be required. In both cases, most professional filesystems provide faster access strategies for contiguous files than for non-contiguous ones. Data pre-allocation is crucial for write access, whereas burst read speeds are achievable only for contiguous data. When a file sequence is not contiguous, it is said to be scattered, since its files are stored in sparse locations on the storage device. File scattering is the process of allocating (or re-allocating) a file sequence as being (or becoming) non-contiguous. It is often associated with file fragmentation as well, where each file is also stored in several non-contiguous blocks; mechanisms contributing to the former are usually a common cause of the latter too. The act of reducing file scattering, by allocating (in the first place) or moving (for already-stored data) files in the same sequence close together on the storage medium, is called (file) descattering. A few defragmentation strategies and dedicated software are able both to defragment single files and to descatter file sequences. Multimedia file sequences There are many contexts in which explicit file sequences are particularly important: incremental backups, periodic logs, and multimedia files captured or created with a chronological locality of reference. 
In the latter case, explicit file numbering is extremely important in order to provide both software and end users a way to discern the consequentiality of the contents stored therein. For example, digital cameras and similar devices save all the picture files in the same folder (until it either reaches its maximum file-number capacity, or a new event like midnight coming or device switching takes place) with a final number sequence: it would be very impractical to choose a filename for each shot at shooting time, so the camera firmware/software picks one which is perfectly identifiable by its sequence number. With the aid of other metadata (and usually of specialized PC software), users can later discern the multimedia contents and re-organize them, if needed. The Digital Intermediate example A typical example where explicit file sequences, as well as their contiguity, become crucial is the digital intermediate (DI) workflow for the motion picture and video industries. In such contexts, video data need to maintain the highest quality and be ready for visualization (usually in real time, if not faster). Usually video data are acquired from either a digital video camera or a motion picture film scanner and stored into file sequences (much as a common photographic camera does) and need to be post-produced in several steps, including at least editing, conforming and colour correction. That requires: Uncompressed data, because any lossy compression, which is common in most finalized products, introduces unacceptable quality losses. Uncompressed data (again), because decompression times may degrade playing/visualization performance by hardware and software. 
Frame-per-file data management, because common post-production operations imply the shortest possible seek times; "fast-forwarding" or "rewinding" to a specific (key) frame is much faster if done at filesystem level rather than within a huge, possibly fragmented video file; every frame is then stored in a single file as a still digital picture. Unambiguous frame ordering, for obvious reasons, which is best accomplished by grouping all the files together with explicit file numbering. File contiguity, because many filesystem architectures achieve higher I/O speeds when transferring data on contiguous areas of the storage, whereas random allocation might prevent real-time or better loading performance. Consider that a single frame in a DI project is currently 9 MB to 48 MB in size (depending upon resolution and colour depth), whereas the video refresh rate is commonly 24 or 25 frames per second (if not faster); any storage required for real-time playing of such contents thus needs a minimum overall throughput of 220 MB/s to 1.2 GB/s, respectively. With those numbers, all the above requirements (particularly file contiguity, given today's storage performance) become strictly mandatory. External links PySeq PySeq is an open source Python module that finds groups of items that follow a naming convention containing a numerical sequence index (e.g. fileA.001.png, fileA.002.png, fileA.003.png...) and serializes them into a compressed sequence string representing the entire sequence (e.g. fileA.1-3.png). checkfileseq checkfileseq is an open source Python script (usable via CLI) that scans a directory structure recursively for files missing from a file sequence and prints a report upon completion. It supports a wide array of filename patterns and can be customized to add additional pattern logic. Computer files
40656316
https://en.wikipedia.org/wiki/Lee%20v.%20PMSI%2C%20Inc.
Lee v. PMSI, Inc.
Lee v. PMSI, Inc., No. 10-2094 (M.D. Florida January 13, 2011), was a case in the United States District Court for the Middle District of Florida about whether the Computer Fraud and Abuse Act (CFAA) makes it illegal for an employee to violate an employer's acceptable use policy. The court ruled that violating an employer's policy did not "exceed authorization" as defined by the CFAA and was not illegal under the act. Background The Computer Fraud and Abuse Act (CFAA) makes it illegal (with both civil and criminal penalties) to access a protected computer without authorization. Courts have long debated whether the statute applies to an employee who violates an employer's internal acceptable use policy. That interpretation of the CFAA could mean that any employee who surfs the Internet, checks Facebook, or logs into personal e-mail from work is guilty of a federal crime if the employer's workplace Internet use policy prohibits that behavior. Shortly before Lee v. PMSI, in its initial hearing on the case, the U.S. Ninth Circuit court of appeals ruled in United States v. Nosal that an employee violated the Computer Fraud and Abuse Act when he disobeyed an employer's Internet use restrictions. Case history After being fired from her position as a proposal developer in PMSI's marketing department, Wendi Lee sued PMSI for pregnancy discrimination barred by the Civil Rights Act of 1964 and Florida's analogous law. After moving the case to federal court, PMSI moved to dismiss this claim. This motion was denied and PMSI subsequently filed an amended complaint claiming Lee violated the Computer Fraud and Abuse Act. Claims PMSI Inc. argued Lee violated the CFAA because she engaged in "excessive internet usage" by "visit[ing] personal websites such as Facebook and monitor[ing] and [sending] personal email through her Verizon web mail account." 
They claimed her violations of the company's computer use policy cost the company more than $5,000 in wages paid to her and in work that had to be performed by others. District court opinion In May 2011 District Judge Merryday held that Lee's conduct did not exceed authorized access to her employer's computer in violation of the CFAA. He said that the CFAA was meant to target hackers who steal information or destroy functionality, not employees who use the internet instead of working. He pointed out that the CFAA defines "exceeding authorized access" as not just gaining access to a system without authorization, but also obtaining or altering information. Because Lee never obtained or altered information by accessing her Facebook, email or news, she was not exceeding authorized access. The court described PMSI's claim of a $5,000 loss as "dubious" and held that lost productivity is not a type of loss valid for suing under the CFAA. The court cited LVRC Holdings v. Brekka, which states that when an employer authorizes an employee to use a computer under certain limitations, the employee remains authorized to use the computer, even if they violate the conditions. Lee could only have gained "unauthorized access" in violation of the CFAA if PMSI had terminated her access and she had attempted to use the computer without permission. The court concluded that the rule of lenity requires a restrained, narrow interpretation of the statute. Extending the CFAA to cover private employee misconduct is the role of the legislature, not the judiciary. Significance This case was one of several which influenced the United States Court of Appeals for the Ninth Circuit en banc decision in United States v. Nosal. Like the district court in this case, the Ninth Circuit found that the definition of "exceeds authorized access" in the CFAA does not extend to violating acceptable use policies. 
Several attorneys cite this case as an example of "intimidation tactics" staged by employers in response to discrimination claims. See also Computer Fraud and Abuse Act Acceptable use policy United States v. Nosal References External links United States computer case law United States Internet case law 2011 in United States case law
4606216
https://en.wikipedia.org/wiki/2009%20NCAA%20Division%20I%20Men%27s%20Basketball%20Tournament
2009 NCAA Division I Men's Basketball Tournament
The 2009 NCAA Division I Men's Basketball Tournament was a single-elimination tournament in which 65 schools competed to determine the national champion of men's NCAA Division I college basketball as a culmination of the 2008–09 basketball season. The tournament began on March 17, 2009, and concluded with the championship game on April 6 at Ford Field in Detroit, Michigan, where the University of North Carolina defeated Michigan State to become the champion. The 2009 tournament marked the first time the Final Four was held in a venue with a minimum seating capacity of 70,000; because of the digital television transition in the United States on June 12, 2009, it was also the last NCAA basketball tournament, in all three divisions, to air on analog television. The University of Detroit Mercy hosted the Final Four of this, the 71st edition of the tournament. Prior to the start of the tournament, the top-ranked team was Louisville in both the AP Top 25 and the ESPN/USA Today Coaches' Polls, followed by North Carolina, Memphis, and Pittsburgh. Of these, only the North Carolina Tar Heels won their regional and played in the Final Four. The Tar Heels completed one of the most dominant runs in the tournament's history by winning each of their games by at least twelve points. For the first time since seeding began, all #1–#3 seeds made it into the Sweet 16, and for the third consecutive time, all #1 seeds made the Elite Eight. Four schools made their NCAA tournament debut, all respective conference champions: Binghamton (America East), Morgan State (MEAC), Stephen F. Austin (Southland), and North Dakota State (Summit), a school in its first season of Division I eligibility. Tournament procedure Sixty-five teams were selected for the tournament. Thirty of the teams earned automatic bids by winning their conference tournaments. 
The automatic bid of the Ivy League, which does not conduct a postseason tournament, went to Cornell, its regular season champion. The remaining 34 teams were granted "at-large" bids by the NCAA Selection Committee. Two teams played an opening-round game, popularly called the "play-in game". The winner of that game advanced to the main draw of the tournament as a 16 seed and played a top seed in one of the regionals. The 2009 game was played on Tuesday, March 17, at the University of Dayton Arena in Dayton, Ohio, as it had been since the game's inception in 2001. All 64 teams in the main draw were seeded 1 to 16 within their regions; the winner of the play-in game automatically received a 16 seed. The Selection Committee also seeded the entire field from 1 to 65. SEC commissioner Michael Slive served his last year as chairman of the committee. Schedule and venues The following are the sites that were selected to host each round of the 2009 tournament: Opening Round March 17 University of Dayton Arena, Dayton, Ohio (Host: University of Dayton) First and Second Rounds March 19 and 21 Greensboro Coliseum, Greensboro, North Carolina (Host: Atlantic Coast Conference) Sprint Center, Kansas City, Missouri (Host: Big 12 Conference) Wachovia Center, Philadelphia, Pennsylvania (Host: Saint Joseph's University) Rose Garden, Portland, Oregon (Host: University of Oregon) March 20 and 22 Taco Bell Arena, Boise, Idaho (Host: Boise State University) University of Dayton Arena, Dayton, Ohio (Host: University of Dayton) American Airlines Arena, Miami, Florida (Host: Florida International University) Hubert H. 
Humphrey Metrodome, Minneapolis, Minnesota (Host: University of Minnesota) Regional Semifinals and Finals (Sweet Sixteen and Elite Eight) March 26 and 28 East Regional, TD Garden, Boston, Massachusetts (Host: Boston College) West Regional, University of Phoenix Stadium, Glendale, Arizona (Host: Arizona State University) March 27 and 29 South Regional, FedExForum, Memphis, Tennessee (Host: University of Memphis) Midwest Regional, Lucas Oil Stadium, Indianapolis, Indiana (Hosts: Horizon League and Butler University) National Semifinals and Championship (Final Four and Championship) April 4 and 6 Ford Field, Detroit, Michigan (Host: University of Detroit Mercy) Detroit was the 28th new host city, and Ford Field the 35th new venue, to host the Final Four. The tournament featured six new stadiums, including two domed stadiums. The Phoenix suburb of Glendale was host for the first time, with games being held at the University of Phoenix Stadium, home to football's Arizona Cardinals. Indianapolis also hosted at a new domed stadium, Lucas Oil Stadium, the replacement for the RCA Dome. After an eight-year hiatus, the tournament returned to Memphis at the FedExForum, the third venue in the city to host the tournament. Kansas City also introduced a new arena, the Sprint Center, after the previous eight appearances at Kemper Arena. For only the second time, the city of Miami hosted games, this time at the American Airlines Arena, home to the NBA's Miami Heat. And for the first time since 1975, the tournament returned to Portland, at the Rose Garden. This was the last tournament to feature the Metrodome, which closed in early 2014 and was replaced by U.S. Bank Stadium, which hosted the 2019 Final Four. Qualifying teams Listed by grouping and seeding Bracket Results * – Denotes overtime period All times in U.S. ET. Opening Round Game – Dayton, Ohio Winner advanced to face Midwest #1 seed Louisville.
Midwest Regional – Indianapolis, Indiana West Regional – Glendale, Arizona East Regional – Boston, Massachusetts South Regional – Memphis, Tennessee Final Four – Ford Field, Detroit, Michigan Game summaries Midwest Region Goran Suton of Michigan State was the Midwest regional most outstanding player. He was joined by Spartan teammates Kalin Lucas and Travis Walton, Louisville's Earl Clark and Kansas's Cole Aldrich on the NCAA Tournament All-Midwest Regional team. First round To play the top-seeded Louisville Cardinals in the first round, Morehead State defeated Alabama State 58–43, with the Eagles keeping the Hornets without a lead the entire game. It was the first tournament appearance for either team in at least five years; the Eagles had not played in it since 1984. Morehead State fell to Louisville 74–54, the 100th time a 1 seed beat a 16 seed in the tournament since seeding began. However, the Eagles managed to keep the game close until halftime, when Louisville led by only 2 points. In the second half, the Cardinals began to apply their signature fullcourt pressure, forcing turnovers and outscoring Morehead State 22–6 at the beginning of the half. Leon Buchanan's 17 points for the Eagles were not enough to upset Louisville, whose top scorers, Samardo Samuels and Terrence Williams, scored a combined 28 points. Morehead State went 52 years without beating Louisville, a drought that finally ended in 2011. In two overtimes, the Siena Saints beat the Ohio State Buckeyes 74–72. Ohio State had the advantage of playing an hour from their campus, and received 25 points, nine rebounds, and eight assists from Evan Turner. The Saints made 6 out of 23 3-pointers and had 22 turnovers. Accordingly, Siena trailed for most of the game, but scored the last four points in regulation to force overtime. At the end of the first overtime, Siena's Ronald Moore drained his first 3-pointer to force a second overtime.
With 3.9 seconds left in that overtime, he hit a second three from the same location to give the Saints a late 2-point lead. In an attempt to send the game into a third overtime, Turner shot a 15-footer immediately afterwards, but he missed it. This was Siena's fifth appearance in the tournament, after beating Vanderbilt University in 2008 as a 13 seed. The Arizona-Utah matchup was not as close. The fifth-seeded Utah Utes were upset by the twelfth-seeded Arizona Wildcats, one of the last teams to make it into the tournament and a questionable entry, by a score of 84–71. The Utes cut the deficit to two with roughly five minutes left in the game, but the Wildcats' answer was a 10–1 run. Utah's Luke Nevill committed two fouls less than four minutes into the game and scored only 12 points. Nic Wise of Arizona, meanwhile, led the team with 29 points, with 21 in the second half. Tyler Kepkay led the Utes with a team-high 19 points. The Cleveland State-Wake Forest game was an even larger upset. In their second bid in the tournament, the Cleveland State Vikings shocked the Wake Forest Demon Deacons 84–69. This 15-point win tied for the third-greatest victory margin for a 13 seed over a 4 seed. Wake Forest, once ranked first in the country, had 16 turnovers in the matchup, compared to six for the Vikings. James Johnson of the Demon Deacons scored 22 points, although this could not compensate for a substandard offense. Their scoring leader, Jeff Teague, finished with 10 points, half his average. For these reasons, Wake Forest never obtained a lead, while Cleveland State sank three consecutive 3-pointers in the early minutes of the game. For the first time in 19 years, Dayton advanced to the second round of the tournament with a 68–60 win over West Virginia. This also ended West Virginia's first-round winning streak, which had lasted since 1992. Chris Wright led the Dayton Flyers with 27 points, a career high, while also chalking up 10 rebounds.
Charles Little also aided the Flyers with 18 points. Darryl Bryant, who led West Virginia with 21 points, shot two consecutive three-pointers to bring Dayton's lead to 48–47 with 11:02 minutes left in the game. However, that was the closest the Mountaineers had to a lead outside the beginning of the game. In their first eligible year, North Dakota State appeared in the tournament, facing defending champion Kansas. The three-seeded Kansas Jayhawks staved off the fourteenth-seeded Bison's upset bid with an 84–74 victory. Ben Woodside shined with 37 points for the Bison, his sixth game of the season with at least 30 points. However, Sherron Collins and Cole Aldrich proved too much for North Dakota State, accounting for 65 percent of the Jayhawks' points with 32 and 23 respectively. The tenth-seeded USC Trojans demolished the seventh-seeded Boston College Eagles by a score of 72–55, helped by Taj Gibson's 10-for-10 shooting from the field, tied for the second-best NCAA tournament field-goal shooting performance in history. He led the team with 24 points and recorded six rebounds, five assists, and three blocks. Dwight Lewis also added 20 points for the Trojans. After leading 34–30 at halftime, the Eagles scored just a single field goal during one 13-minute stretch, as part of a 23.1 shooting percentage in the second half. Robert Morris, the region's 15 seed, was blown away by second-seeded Michigan State 77–62. The game was tied with 4:44 left in the first half, but then the Colonials went almost 20 minutes without scoring a single point. The Spartans took advantage of this for a 21–0 run that sealed the game in their favor. The Colonials' Jeremy Chappell was the only team member to score double-digit points with 11, and he also led the team with six rebounds, two steals, and three blocks. Raymar Morgan was the Spartans' leading scorer with 16 points. Second round Ninth-seeded Siena faced top seed Louisville, with the Cardinals emerging victorious 79–72. 
Taking advantage of Louisville's 19 turnovers, the Saints came back from a 12-point deficit with 17:21 left in the game to snatch the lead around the 9-minute mark. Edwin Ubiles broke through Louisville's full-court pressure and added 24 points for Siena. Terrence Williams, known as one of the most relaxed players on the Cardinals roster, saved his team by grabbing rebounds and making 3s. He led the team with 24 points, 15 rebounds, two steals, and four assists. Earl Clark also helped the Cardinals' cause with 12 points and 12 rebounds. In a 12 vs. 13 seed Cinderella matchup, Arizona handily defeated Cleveland State. The Wildcats' zone defense puzzled Cleveland State, and their fast breaks sealed the game. The smallest deficit the Vikings faced was 48–44 about midway through the second half, though the Wildcats then went on a 13–2 run led by Nic Wise's five consecutive points. His 21 points led the team's four double-digit scorers. Arizona was excellent behind the free-throw line, finishing 24 for 28. Cole Aldrich's triple-double with 13 points, 20 rebounds, and 10 blocked shots paved the way for a third-seeded Kansas win over 11 seed Dayton. This was only the sixth triple-double in NCAA tournament history. With 43 points, Dayton scored the fewest points they had all season, compared to Kansas's 60. Despite their small point total, the Flyers shot 72 times, its most all season, amounting to a 22.2 shooting percentage. The Jayhawks were also not having one of their better offensive games, with Sherron Collins being an exception; he made 25 points. This marked the third straight Sweet Sixteen appearance for Kansas. Playing the tenth-seeded USC Trojans, second-seeded Michigan State utilized Travis Walton's career-high 18 points for a 74–69 win. Normally known as a defensive player and averaging 4.9 points per game, Walton shot 8 for 13 from the field. His team out-rebounded USC 33 to 23, and USC made only one three-point play. 
Star Trojan Taj Gibson was in foul trouble throughout much of the game, and yet his teammates rallied for 14 lead changes and 16 ties. Dwight Lewis, who gave a 19-point performance overall, scored six consecutive points for USC for a late tie. The Spartans only earned a victory after the Trojans missed their last nine shots. With the win, Michigan State had made the Sweet 16 eight times in the previous 12 years, more than any other team except Duke. Regional semifinals (Sweet Sixteen) Louisville, the region's top seed, routed twelfth-seeded Arizona 103–64. In NCAA tournament history, this was Louisville's largest win and Arizona's largest loss. It was no surprise, given the Cardinals' 57.6 percent field-goal shooting and their 48 percent shooting from behind the arc. Their fullcourt pressure forced 15 Wildcat turnovers, including nine in the first half. Earl Clark led the Cardinals with 19 points, and the team's ballhandling produced 29 assists. This was the most lopsided Sweet 16 victory since 1972. The Michigan State-Kansas matchup was much more intense. After overcoming a 13-point first half deficit, the Spartans won 67–62. They shot 16 of 17 from the foul line, and on their only miss they rebounded the ball and Raymar Morgan converted the rebound into a dunk for his only points of the night. Such rallies in the second half narrowed the deficit and occasionally gave the Spartans the lead, although the Jayhawks responded and were up by 2 with 2 minutes left in the game. They were helped by Sherron Collins and Cole Aldrich's combined 37 points. However, Kalin Lucas of the Spartans, who had scored 11 points in the first 39 minutes of the game, scored seven straight points with 48 seconds left. Goran Suton also added nine rebounds, five steals, and a season-high 20 points for Michigan State. Regional final (Elite Eight) Michigan State defeated overall number one seed Louisville, 64–52, to advance to their fifth Final Four since 1999.
Michigan State held Louisville to their second lowest point total of the season with their man-to-man defense keeping them out of sync all game. Center Goran Suton had 19 points and Durrel Summers had 12 in the rout. Earl Clark had 19 for Louisville. West Region A. J. Price was named MVP of the West Regional. He was joined by teammate Kemba Walker, Missouri's DeMarre Carroll and J. T. Tiller and Memphis' Tyreke Evans on the NCAA West All-regional team. First round Forward Quincy Pondexter scored 23 points to lead his Washington Huskies to a first round 71–58 win over Mississippi State Bulldogs in the West Regional. Only Barry Stewart put up double digit points (14) for the Bulldogs. Second round Pac-10 champions Washington Huskies scored 46 points in the second half, but it was not enough to beat the Purdue Boilermakers in the second round of West Regional, falling short by two points (76–74). Leaders for Purdue were JaJuan Johnson with 22 points and Keaton Grant with 12 rebounds. Isaiah Thomas with 24 points and Jon Brockman with 18 rebounds led the Huskies. Regional semifinals (Sweet Sixteen) Connecticut faced Purdue at University of Phoenix Stadium in a West Regional semifinal. It was UConn who took full advantage of many Purdue mistakes and, even though Robbie Hummel was able to shoot quite well scoring 17 points, it was Hasheem Thabeet and the Huskies who pulled away for a 72–60 win to move onto the regional finals. In the nightcap of the sweet sixteen matchups, two sets of Tigers met, pitting Missouri against Memphis in a matchup that saw teams with similar fast-paced styles meet. Missouri was able to pull away with a 27–7 run that gave them a 64–40 lead. Though Memphis attempted to claw back into the game through Tyreke Evans' 33 points, it was JT Tiller, DeMarre Carroll, and Leo Lyons that moved on to meet UConn in the regional final along with the rest of their Missouri Tigers. 
Regional final (Elite Eight) Kemba Walker came off the UConn bench to spark them to a victory over the third-seeded Missouri Tigers. East Region Scottie Reynolds was named Regional most outstanding player. He was joined by teammates Dwayne Anderson and Dante Cunningham, Panthers Sam Young and DeJuan Blair on the NCAA East All-Regional team. First round UCLA Bruins' Alfred Aboya made two free throws with 48 seconds remaining in the game to help UCLA get by VCU in the first round at the East Regional in Philadelphia's Wachovia Center, with Maynor's potential game-winning jumper bouncing off the rim at the buzzer. Top scorers in the game were Eric Maynor (21) for VCU and Josh Shipp (16) for UCLA. The Villanova Wildcats, playing at home against an American University team that featured five seniors, fell behind early as American hit a barrage of 3-pointers. However, in the second half, Villanova was able to take advantage of 20 free throws in the final 13 minutes of the game to win against American. No. 12 seed Wisconsin upset #5 seed Florida St. 61–59 in overtime. Down 31–19 at the half, the Badgers' Jason Bohannon made a three-point jumper to give Wisconsin the lead with 45 seconds left in regulation. Trevon Hughes fouled Toney Douglas, who made two free throws to send the game into overtime. In overtime, the Badgers trailed by one with just seconds left when Hughes made a twisting shot from the lane over two defenders to put the Badgers ahead 60–59. Hughes was also fouled on the shot, and made the resulting free throw to make the score 61–59. Florida State had just enough time to run a full-court inbounds play, but the pass was deflected at half court, securing the Badger victory. Second round With six Wildcats scoring in double figures, Villanova ended UCLA's hopes of going to the Final Four for a fourth consecutive year.
Dante Cunningham had 18 points; Reggie Redding and Corey Fisher had 13; Corey Stokes put up 12; eleven points came from Scottie Reynolds and ten points were put up by Dwayne Anderson for the winning team. Josh Shipp had 18 points and Alfred Aboya had 8 rebounds for UCLA. Regional semifinals (Sweet Sixteen) Villanova (#3) upset Duke (#2), 77–54, to advance to the Regional Championship game to face Pittsburgh (#1). The Wildcats, who were ahead by 3 at half-time, were led in scoring by Scottie Reynolds (16), Dante Cunningham (14) and Reggie Redding (11). Regional final (Elite Eight) Number one seed Pittsburgh was upset by the Villanova Wildcats, 78–76 in the East Regional Finals, denying the Panthers a chance for a first national championship in men's basketball. With five seconds remaining, Levance Fields, who was fouled by Corey Fisher, shot two free-throws to tie the game for Pitt. But Scottie Reynolds' one-second jumper was good to give Villanova an upset victory. Pitt's Sam Young scored 28 points and DeJuan Blair had 20 points. Dwayne Anderson was top scorer for the Wildcats with 17 points. South Region Ty Lawson was the South regional MVP and he was joined on the All-regional team by teammates Danny Green and Tyler Hansbrough as well as Blake Griffin and Syracuse's Jonny Flynn. First round WKU advanced to the second round for a second consecutive year as a 12 seed, beating 5th seeded Illinois. 10th seeded Michigan upset 7th seeded Clemson 62–59 in its first tournament win since 1998. It was Michigan's first tournament appearance in 11 years after the school was rocked with sanctions and punishments from the Chris Webber scandal in the mid-2000s. Second round Regional semifinals (Sweet Sixteen) Regional final (Elite Eight) Final Four All final four teams in the tournament had won at least one national championship. 
Entering the tournament, North Carolina had the most, with four (1957, 1982, 1993, 2005); Connecticut had two (1999, 2004); Michigan State also had two (1979, 2000); and Villanova had one (1985). The Spartans had home court advantage by playing in their home state. Teams had played in the Final Four in their home state on six previous occasions, winning the title four times. UCLA (1968, 1972, 1975) and North Carolina State (1974) won the national title, but Duke (1994) and Purdue (1980) lost in the Final Four. The biggest advantage came in 1968 and 1972 when UCLA played the championship game at the Los Angeles Memorial Sports Arena, which is a short distance from Pauley Pavilion, their home court since 1965. Michigan State vs. Connecticut Michigan State, with 7 minutes to play, finally took hold of the game and defeated number one seed Connecticut to advance to the championship game against North Carolina. The Spartans started the game with a 7-point run, but the Huskies came back to take the lead in the first half. Michigan State took it back and was leading by two at the half. Connecticut had the lead twice early in the second period. Michigan State, led by guard Kalin Lucas with 21 points and forward Raymar Morgan with 18 points, was just too much at the end for the Huskies. Scoring for Connecticut was shared by Jeff Adrien (13), Stanley Robinson (15), Hasheem Thabeet (17) and A.J. Price (15). Villanova vs. North Carolina After the first five minutes, North Carolina used an 11-point run to end Villanova's hopes of a national championship and put the Tar Heels into the championship game for a chance to win their fifth title in nine trips. Ty Lawson produced 22 points, followed by Wayne Ellington with 20 points and Tyler Hansbrough with 18 points. Hansbrough, the sixth-leading scorer in tournament history, pulled down 11 rebounds. For Roy Williams, who coached North Carolina to a national championship in 2005, it was a return trip to the title game. Championship game – Michigan State vs.
North Carolina This 71st title game featured #1 seed North Carolina, which had a 4–4 record in the finals, versus #2 seed Michigan State, which had a 2–0 record going into the game. It was also a matchup featuring future Hall of Fame coach Tom Izzo, who guided Michigan State to the championship in 2000 and 5 trips to the Final Four, against current Hall of Famer Roy Williams, who won the title in 2005 and reached 7 Final Fours. To celebrate the 30th anniversary of the 1979 national title game between Michigan State Spartans and the Sycamores of Indiana State, Hall of Fame players Earvin "Magic" Johnson and Larry Bird, who had played against each other, presented the game ball at the 2009 NCAA national championship game Monday night. The game was a rematch of "BasketBowl II", of 2008's ACC-Big Ten Challenge, won by the Tar Heels 98–63. That game was also played at Ford Field. North Carolina, with a first bucket from Deon Thompson, took off and ran to a 21-point lead at the 10-minute mark. The lead grew to 24 with less than 5 minutes remaining in the first half, with most points coming from Wayne Ellington (15). The Spartans were behind 34–55 at the half, a tournament record lead for the Tar Heels. Goran Suton had the most points for Michigan State. In the second half, Michigan State made a comeback to within 13 points of North Carolina with 4:56 to go in the game, but was unable to overcome the record 21 turnovers. Roy Williams and his Tar Heels defeated the Spartans 89–72 to take home his second trophy for the university. Ty Lawson set a record with 8 steals. All Tournament team Wayne Ellington, North Carolina (Most Outstanding Player) Tyler Hansbrough, North Carolina Ty Lawson, North Carolina Kalin Lucas, Michigan State Goran Suton, Michigan State Tournament notes Largest tournament point differential (+121) by the champion since 1996 (a new record was set in 2016 after the Villanova Wildcats defeated the North Carolina Tar Heels). 
Highest attended National Semifinal Games (72,456) in Final Four history, breaking the old record of 64,959 (a new record was set in 2014). Highest attended National Championship Game (72,922) in Final Four history breaking the old record of 64,959 (a new record was set in 2014). Highest total Final Four attendance (145,378) ever breaking the old record of 129,918 (a new record was set in 2014). Roy Williams is one of four active coaches to win multiple titles. Billy Donovan, Mike Krzyzewski and Jim Calhoun are the three other coaches. Nielsen ratings for the Championship Game were down 7% to 11.9/19 versus a 12.8/20 the previous year. The entire tournament averaged a 6.3/13, a 5% increase. Blake Griffin of Oklahoma was the winner of the John Wooden Award, presented by the Los Angeles Athletic Club on Friday, April 10 in Los Angeles. 708,296 fans in attendance over the course of 35 sessions. Record by conference *Morehead State won the Opening Round game. The America East, Atlantic Sun, Big Sky, Big South, Big West, CAA, Ivy, MAC, MEAC, MVC, NEC, Patriot, Southland, SoCon, SWAC, Summit, and WAC conferences all went 0–1. The columns R32, S16, E8, F4, and CG respectively stand for the Round of 32, Sweet Sixteen, Elite Eight, Final Four, and Championship Game. Media Television Once again, except for the play-in game, which was telecast on ESPN, CBS and CBS College Sports Network served as broadcasters on television for the tournament. The only change from past years at the Final Four was that Jim Nantz worked with Clark Kellogg in the color commentary position instead of Billy Packer, who left CBS in July 2008. 
Studio: Greg Gumbel, Greg Anthony and Seth Davis Jim Nantz and Clark Kellogg and Tracy Wolfson (she was only used as a backstage reporter for the Final Four and NCAA Championship game) – First & Second Round at Greensboro, North Carolina; South Regional at Memphis, Tennessee; Final Four at Detroit, Michigan Dick Enberg or Carter Blackburn and Jay Bilas – Blackburn Thursday afternoon; Enberg Thursday night, First & Second round at Philadelphia, Pennsylvania; West Regional at Glendale, Arizona Verne Lundquist and Bill Raftery – First & Second Round at Dayton, Ohio; East Regional at Boston, Massachusetts Gus Johnson and Len Elmore – First & Second Round at Minneapolis, Minnesota; Midwest Regional at Indianapolis, Indiana Kevin Harlan and Dan Bonner – First & Second Round at Portland, Oregon Ian Eagle and Jim Spanarkel – First & Second Round at Miami, Florida Craig Bolerjack and Bob Wenzel – First & Second Round at Boise, Idaho Tim Brando and Mike Gminski – First & Second Round at Kansas City, Missouri For the play-in game in Dayton, ESPN had Brent Musburger, Steve Lavin and Erin Andrews working as the announcers. Some CBS affiliates put additional game broadcasts on digital subchannels, or, as in the following two instances, on other stations: WOIO and WUAB (Raycom Media duopoly): On March 20, WOIO aired Ohio State vs. Siena, while Cleveland State vs. Wake Forest was on WUAB at the same time. The Cleveland area has a substantial number of OSU alumni, and Mansfield, although part of the Cleveland market, is equidistant to both Columbus and Cleveland. KOTV and KQCW (Griffin Media duopoly): Also on March 20, KOTV aired Oklahoma State vs. Tennessee; at the same time, Kansas vs. North Dakota State was on KQCW. The reason for this simulcast is that part of the Tulsa market includes Coffeyville and other communities at the southern end of Kansas. Radio Westwood One was once again the radio home for the tournament. 
Opening Round Game Marc Vandermeer and Steve Lappas – at Dayton, Ohio First/Second Round Bill Rosinski and Kyle Macy – at Greensboro, North Carolina Kevin Kugler and Pete Gillen – at Kansas City, Missouri Wayne Larrivee and John Thompson – at Philadelphia, Pennsylvania Dave Sims and P.J. Carlesimo – at Portland, Oregon Ted Robinson and Bill Frieder – at Boise, Idaho Mark Champion and Kelly Tripucka – at Dayton, Ohio Tom McCarthy and Kevin Grevey – at Miami, Florida Brad Sham and Reid Gettys – at Minneapolis, Minnesota Regionals Kevin Kugler and John Thompson – East Regional at Boston, Massachusetts Ian Eagle and Pete Gillen – Midwest Regional at Indianapolis, Indiana Kevin Harlan and P.J. Carlesimo – South Regional at Memphis, Tennessee Wayne Larrivee and Bill Frieder – West Regional at Glendale, Arizona Final Four Kevin Kugler, John Thompson and Bill Raftery – at Detroit, Michigan International broadcasters Canada: CBC Television, uses the CBS broadcast and commentators, with CBC personalities, themes and graphics. Australia: One HD / ESPN Australia, uses the CBS broadcast and commentators. Brazil: ESPN Brasil, uses the CBS broadcast. Europe, North Africa and Middle East: ESPN America. Philippines: Live/delayed on Basketball TV, and recorded on C/S9; uses the CBS broadcast and commentators. Yahoo! Sports and NCAA.com also broadcast the entire tournament live for free on the internet.
See also 2009 NCAA Division II Men's Basketball Tournament 2009 NCAA Division III Men's Basketball Tournament 2009 NCAA Division I Women's Basketball Tournament 2009 NCAA Division II Women's Basketball Tournament 2009 NCAA Division III Women's Basketball Tournament 2009 National Invitation Tournament 2009 Women's National Invitation Tournament 2009 NAIA Division I Men's Basketball Tournament 2009 NAIA Division II Men's Basketball Tournament 2009 NAIA Division I Women's Basketball Tournament 2009 NAIA Division II Women's Basketball Tournament 2009 College Basketball Invitational 2009 CollegeInsider.com Postseason Tournament 2008–09 NCAA Division I men's basketball season
Knowledge management software
Knowledge management software (KM software) is a subset of enterprise content management software, comprising a range of software that specializes in how information is collected, stored, and accessed. The concept of knowledge management is based on a range of practices used by an individual, a business, or a large corporation to identify, create, represent and redistribute information for a range of purposes. Software that supports one or more of these practices at any stage of information management can be called information management software. A subset of information management software that emphasizes building knowledge out of the information it manages is often called knowledge management software. KM software in most cases provides a means for individuals, small groups or mid-sized businesses to innovate, build new knowledge in the group, and/or improve customer experience. Knowledge management systems (software) include a range of about 1,500 or more different approaches to collect and contain information and then build knowledge that can be searched through specialised search tools. These include concept-building tools and/or visual search tools that present information in a connected manner not originally conceptualised by those collecting or maintaining the information database. One of the main categories of knowledge management software is groupware, which can be used for knowledge sharing and capture. Groupware is a combination of synchronous, asynchronous and community-focused tools. Groupware can be used to exchange knowledge and expertise even when the team members are situated around the world.
Features Features of KM software usually include: aggregation of content from both internal and external sources; classification of content using taxonomies; search; expertise location; and views/dashboards. As business becomes increasingly international, the ability to access information in different languages is now a requirement for some organizations. Reported success factors of a KM system include the capability to integrate well with existing internal systems and the scalability of the system to grow within the organization. Range KM software ranges from small software packages for an individual to use, such as brainstorming software, to highly specialized enterprise software suitable for use by hundreds of employees. KM software often provides a key resource for employees working in customer service or telephone support industries, or in sectors of large corporations. Knowledge management software generally enables the combination of unstructured information sources, such as word-processed and PDF documents, email, graphic illustrations, unstructured notes, website links, invoices, and other information-bearing collections, ranging from a single recorded thought up to millions of interactions from a website, and through that combination enables the seeker to obtain knowledge that would otherwise not have been discovered. As Internet access speeds increased, many on-demand (or software as a service) products have evolved and are now among the leading suppliers of KM software. Visual search One departure from the standard keyword search approach is the group of companies developing visual search techniques. Some common visual search approaches include: Tree traversal – A folder is opened and inside the display of that folder are further sub-folders. The folders are searched in a specific order, exactly once, in a systematic manner.
This tree traversal approach relies on the naming of folders to provide a rich enough indication as to what is contained in the next folder or level of folders. Taxonomy navigation – A taxonomy (or topic map) is the classification of things or concepts, as well as the principles underlying such classification. In KM software, taxonomies are often used as a way of visually structuring the available information by tagging it with relevant topics and visually representing those topics as folders and sub-folders inside the taxonomy. Users can then navigate the taxonomy and select the topic, or combination of topics (faceted search), to perform the search on. Tag cloud search – Once text data has been tagged with certain topics it can be visually represented as a tag cloud, where the importance of each tag is represented by its font size and/or color. This allows users to identify and select the most prominent topics. Matrix/heat map search – The classification of information into topics facilitates visualization and analysis of the information flow. A combined topic search can be presented as values in a matrix, and a heat map is a graphical representation of that data, presented in colors. Notable knowledge management tools include: Bloomfire – Knowledge sharing software with AI Collective Knowledge – An open source, portable and command line framework for knowledge management Confluence – Wiki-based knowledge management software that is part of Atlassian's suite eGain – Knowledge management and AI software Elium – Knowledge management software OpenKM – Open source knowledge management software See also Customer knowledge Knowledge base Knowledge extraction
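The tag cloud idea above, where a tag's importance is shown through its font size, can be illustrated with a short sketch. This is a generic linear-scaling approach, not the implementation of any particular KM product; the function name, size range, and sample tags are all illustrative assumptions.

```python
# Illustrative sketch: map tag frequencies to font sizes for a tag cloud.
# Sizes are scaled linearly between min_size and max_size by relative
# frequency (a common approach; not from any specific KM product).

def tag_cloud_sizes(tag_counts, min_size=10, max_size=36):
    """Return {tag: font_size} with sizes scaled by relative frequency."""
    if not tag_counts:
        return {}
    lo, hi = min(tag_counts.values()), max(tag_counts.values())
    span = (hi - lo) or 1  # avoid division by zero when all counts are equal
    return {
        tag: round(min_size + (count - lo) / span * (max_size - min_size))
        for tag, count in tag_counts.items()
    }

# Hypothetical tag frequencies from a document collection:
counts = {"taxonomy": 42, "search": 17, "groupware": 5}
print(tag_cloud_sizes(counts))  # → {'taxonomy': 36, 'search': 18, 'groupware': 10}
```

The most frequent tag gets the largest font and the least frequent the smallest, which is exactly the visual cue a tag cloud uses to let users pick out the most prominent topics.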
57769260
https://en.wikipedia.org/wiki/%2815977%29%201998%20MA11
(15977) 1998 MA11
(15977) 1998 MA11, provisional designation 1998 MA11, is a Jupiter trojan from the Trojan camp, approximately 46 kilometers in diameter. It was discovered on 19 June 1998, by astronomers with the Lincoln Near-Earth Asteroid Research at the Lincoln Lab's ETS near Socorro, New Mexico, in the United States. The suspected tumbler is also a slow rotator with a period of 250 hours. It has not been named since its numbering in July 2000.
Orbit and classification
1998 MA11 is a Jupiter trojan in a 1:1 orbital resonance with Jupiter. It is located in the trailing Trojan camp at the Gas Giant's L5 Lagrangian point, 60° behind its orbit. It orbits the Sun at a distance of 4.9–5.4 AU once every 11 years and 10 months (4,309 days; semi-major axis of 5.18 AU). Its orbit has an eccentricity of 0.05 and an inclination of 17° with respect to the ecliptic. The body's observation arc begins with a precovery published by the Digitized Sky Survey and taken at Palomar Observatory in December 1953, more than 44 years prior to its official discovery observation at Socorro.
Numbering and naming
This minor planet was numbered by the Minor Planet Center on 26 July 2000. It has not been named.
Physical characteristics
1998 MA11 is an assumed C-type asteroid. It has a typical V–I color index of 0.906 (see table below).
Rotation period
In August 2013, Robert Stephens at the Center for Solar System Studies observed 1998 MA11 over three nights. However, no meaningful rotational lightcurve could be determined, as the lightcurve's amplitude never varied by more than 0.02 magnitude. A period of 11.17 hours was derived for demonstration purposes only. In December 2015, Stephens obtained an improved lightcurve with a rotation period of 250 hours and a brightness variation of 0.30 magnitude. This time the asteroid was observed on 16 nights over a period of one month. The photometric observations also revealed that this object possibly has a non-principal-axis rotation, which is commonly known as tumbling.
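The orbital and size figures quoted in this article can be sanity-checked with two standard relations: Kepler's third law, which ties a Sun-orbiting body's period to its semi-major axis, and the conventional asteroid size relation linking absolute magnitude H and geometric albedo p to diameter, D ≈ (1329/√p) · 10^(−H/5) km. A minimal check, using the values quoted here (semi-major axis 5.18 AU; CALL's assumed albedo 0.057 and absolute magnitude 10.4):

```python
import math

# Kepler's third law for a body orbiting the Sun: T [years] = a [AU] ** 1.5
period_years = 5.18 ** 1.5
print(round(period_years, 1))        # ~11.8 years, i.e. about 11 years 10 months
print(round(period_years * 365.25))  # ~4306 days, close to the quoted 4,309

# Standard asteroid size relation: D [km] = 1329 / sqrt(albedo) * 10 ** (-H / 5)
diameter_km = 1329.0 / math.sqrt(0.057) * 10 ** (-10.4 / 5)
print(round(diameter_km, 1))         # ~46.3 km, matching CALL's estimate
```

Both results agree with the figures in the surrounding sections to within rounding, which is the expected consistency for values derived this way.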
Diameter and albedo
According to the surveys carried out by the NEOWISE mission of NASA's Wide-field Infrared Survey Explorer and the Japanese Akari satellite, 1998 MA11 measures 43.53 and 51.53 kilometers in diameter, and its surface has an albedo of 0.071 and 0.046, respectively. The Collaborative Asteroid Lightcurve Link assumes a standard albedo for a carbonaceous asteroid of 0.057 and calculates a diameter of 46.30 kilometers based on an absolute magnitude of 10.4.
Notes
References
External links
Asteroid Lightcurve Database (LCDB), query form (info )
Discovery Circumstances: Numbered Minor Planets (15001)-(20000) – Minor Planet Center
Asteroid (15977) 1998 MA11 at the Small Bodies Data Ferret
40995752
https://en.wikipedia.org/wiki/College%20of%20Business%20Education
College of Business Education
The College of Business Education (CBE) is a higher learning institution in Tanzania established in 1965, registered and accredited by the National Council for Technical Education (NACTE) to offer certificate, diploma and degree programmes in various fields of study.
Academic departments
The College of Business Education has six academic departments as of 2019:
Accountancy
Business Administration
ICT and Mathematics
Marketing
Metrology and Standardization
Procurement and Supply Management
Programmes offered
Undergraduate and postgraduate programmes offered at CBE as of 2019:
Undergraduate programmes
Technician certificate courses (NTA level 4–5)
Technician Certificate in Accountancy
Technician Certificate in Business Administration
Technician Certificate in Marketing Management
Technician Certificate in Procurement and Supplies Management
Technician Certificate in Metrology and Standardization (DSM only)
Technician Certificate in Information Technology
Ordinary diploma courses (NTA level 6)
Ordinary Diploma in Accountancy
Ordinary Diploma in Business Administration
Ordinary Diploma in Marketing
Ordinary Diploma in Procurement and Supplies Management
Ordinary Diploma in Metrology and Standardization (DSM only)
Ordinary Diploma in Information Technology
Bachelor's degree courses (three years) (NTA level 7–8)
Bachelor's degree in Accountancy (BACC)
Bachelor's degree in Business Administration (BBA)
Bachelor's degree in Marketing (BMK)
Bachelor's degree in Procurement and Supplies Management (BPS)
Bachelor's degree in Metrology and Standardization (BMET)
Bachelor's degree in Business Studies with Education (BBSE)
Bachelor's degree in Information Technology (BIT)
Postgraduate programmes
Postgraduate diploma courses
Postgraduate Diploma in Project Management (PGDPM)
Postgraduate Diploma in Business Administration (PGDBA)
Postgraduate Diploma in Financial Management (PGDFM)
Masters courses
Masters for Information Technology in Project Management (IT-Project Management)
Masters of Information and Communication Technology for Development (ICT4D)
Masters of Supply Chain Management (MSCM)
Masters of International Business Management (MIBM)
Masters of Business Administration in Finance and Banking
Masters of Business Administration in Human Resource Management
Masters of Business Administration in Marketing Management
Rankings
In 2018, the Webometrics Ranking placed CBE 21st among the best universities and colleges in Tanzania. In 2019, CBE was ranked 15th among the best universities and colleges in Tanzania, and 2nd among colleges in Tanzania after the Nelson Mandela African Institute of Science & Technology (NM-AIST).
References
External links
Colleges in Tanzania
Education in Dar es Salaam
Public universities in Tanzania
19961487
https://en.wikipedia.org/wiki/Scene7
Scene7
Scene7 is an American on-demand rich media software company that provides document hosting and interactive publishing services such as online catalogs, targeted email, video, and image management. Retailers use the company's services to showcase products on their websites and to allow customers to interact with the products. Scene7's technology allows users to manipulate product images by zooming in and rotating products, simulating the inspection of merchandise in retail stores. The company, founded as a division of Autodesk, created room-decoration software called Picture This Home in the mid-1990s. The division was sold to Broderbund in 1998, then spun off as a company called GoodHome.com in June 1999. After GoodHome.com failed to become profitable, it was reorganized and renamed Scene7. It formally launched on January 23, 2001 and focused on helping companies prepare interactive advertisements for consumers. Adobe Systems acquired Scene7 in 2007 for an undisclosed sum.
Company
A subsidiary of Adobe Systems, Scene7 provides document hosting and interactive publishing services, typically charging clients to convert catalog print files to interactive web pages. The company does most of its business in North America. Its primary competitors for dynamic imaging services and technology are RichFX and LiquidPixels. Scene7 products rely on several Adobe products, including Adobe Photoshop, Adobe InDesign, Adobe Flash, Adobe Illustrator, and Adobe Flex; this relationship existed before Adobe purchased the company. Scene7 does not maintain any servers to host its services; instead, it uses a "pay as you grow" program that only requires it to pay for the resources that it uses. Scene7's clients include Sears, Lands' End, Harrods, Macy's, Office Depot, Levi Strauss & Co., La-Z-Boy, and QVC. In 2001, Scene7 agreed to develop home design and landscaping software for Individual Software for $50 million.
High-end casual clothing retailer Anthropologie has used Scene7's services to create and deploy online catalogs for its e-commerce website since November 9, 2004. The retailer implemented Scene7's Dynamic Imaging service to let customers zoom in on products, similar to how merchandise is inspected in retail stores. The Harrods department store signed an agreement with Scene7 on June 24, 2005 to use Scene7's imaging and catalog system on the store's website. This required Harrods to convert all its printed material to a digital format for Internet use. On October 20, 2005, web agency Logan Tod announced that it had partnered with Scene7 to offer new features on its e-commerce websites. These include Scene7's Dynamic Imaging service, which allows visitors to zoom in on products to see details; this feature, Logan Tod claimed, helps to drive purchase decisions. Fathead, a company that builds and sells wall graphics, uses Scene7's services on its website to generate images and to allow visitors to layer images on top of each other dynamically. National Business Furniture, a furniture retailer based in Milwaukee, Wisconsin, implemented Scene7's technology on November 3, 2008, allowing visitors to dynamically change the colors of product images on the company's website.
History
GoodHome.com (1995−2000)
The company began as a development team that created software called Picture This Home in the mid-1990s for Autodesk in San Rafael, California. The program allowed people to virtually preview room decoration projects before any work began. Users could create virtual rooms, change walls and arrange furniture, and create photo-realistic renderings of completed designs. Picture This Home was awarded the Good Housekeeping Seal of Approval. In 1998, the software and its team of 40 developers were sold to Broderbund, which was owned by The Learning Company, a subsidiary of Mattel Inc.
Broderbund eventually spun Picture This Home off as a company called GoodHome.com in June 1999, with Doug Mack as its CEO. The company received $30 million in venture capital from Hearst Interactive Media. Mack noted that before Broderbund spun off GoodHome.com as a separate company, there was a "big culture clash" between the established company and the new media division. He said that companies such as Broderbund were so concerned with established business that smaller, newer divisions found it hard to receive sufficient attention and bandwidth. He praised Broderbund's decision to spin the company off as a separate entity: "We would have never been able to build our Web business if we were not in a separate building with separate funding." Mack observed that newer companies like GoodHome.com rarely had any problem hiring new employees because of the opportunity for Internet companies to grow quickly into "the next Amazon.com". In September 1999, GoodHome.com merged with Alexandria, Virginia-based nHabit.com, a rival company, for an undisclosed sum. The merger positioned GoodHome.com to grow quickly and added the Internet service provider America Online to GoodHome.com's client portfolio. Ten weeks after forming a business plan, GoodHome.com officially launched on September 29, 1999, with offices in San Rafael and New York City, New York, and Roger Horchow was named its chairman. The company used the slogan "A beautiful home. It was never this easy," and focused on selling furniture and other home items, spending $20 million on advertisements in its first year. Mack decided that the company should target women, since "women make 80 percent of decorating decisions." The company built a home furnishings portal to compete with the websites Living.com and Furniture.com, which both went bankrupt in 2000.
In April 2000, GoodHome.com's monthly sales topped $1 million; the company's goal was to be profitable within two to three years. One of the website's biggest attractions was its virtual decorating service that let customers see how certain features such as the paint, upholstery fabric, rugs, and pillows would look before a purchase. When considering why this service was so popular, Mack noted that consumers usually feel more confident in a purchase when there are few unknowns. At the time, selling products over the Internet was not a popular concept outside the United States, but Mack was confident in expanding GoodHome.com's portfolio to include foreign companies: "We're already getting so many requests from companies about expanding our website abroad... I see this happening quickly within the next few years." GoodHome.com encountered difficulties in running its business in 2000, when several other companies that offered similar services launched. The increasing demand for online catalog services, considered a phenomenon, was dubbed the "hottest thing since sliced bread" by an analyst from technology research firm Forrester Research, which estimated that roughly $500 million was invested in home furnishing websites from 1999 to 2000. It became difficult for consumers to decide which service provided better quality; a business owner commented, "You can't tell the difference in quality between something that's $3,000 and something that's $10,000." GoodHome.com, which had offered free shipping, phased out the feature on July 15, 2000, in favor of "heavily subsidized rates". To compete with new companies, GoodHome.com also introduced new features, such as a "floor planning" feature that let website visitors build an electronic version of their rooms, then drag in furnishings to see how they fit.
Reorganization (2001−2007)
After spending several years operating at a loss, GoodHome.com reorganized as Scene7, which formally launched on January 23, 2001, with $15 million raised from investors that included Hearst Interactive Media. The new company focused on helping companies prepare interactive advertisements for consumers. Mack, the Broderbund executive who had decided to spin off the company, reflected on the decision to reorganize and relaunch: "We got a year into [the initial GoodHome scheme] and the whole B2C (business-to-consumer) market tanked, and we realized we could not build a successful business as a portal [...] But the whole time we kept having people approach us to license the technology [to create virtual catalogs], and finally a light bulb went off when we realized we were sitting on top of a great technology we could sell." Scene7 raised a round of financing on July 12, 2001 that totaled $11.3 million, which helped stabilize the company. The deal was led by venture capitalists from several firms, including Louis Bacon's Moore Capital Management and Xcelera of the Cayman Islands, with cash investments from Cooley Godward and Perkins Coie. After the latest round of financing, Mack planned for Scene7 to have 15 clients and a burn rate, or negative cash flow, of less than $700,000 a month, stating, "What we learned was to stick to your strategy, and don't get nervous when the competition is adopting a strategy to spend their way to victory." At the time, the company's revenues were well below its peak of $1 million a month, but Mack intended to increase revenues past that point in a few months. Scene7 moved from San Rafael to Hamilton Landing in Novato, California in September 2002 to accommodate more employees. On July 9, 2003, the company acquired all of the assets of workflow provider and advertising software company Engage for $1.2 million and assumed its $650,000 debt after Engage filed for Chapter 11 bankruptcy.
Engage was the parent company of both Cascade and MidSystems, which were two of the first companies that tried to automate prepress production for newspapers and large printers. On August 15, 2003, Scene7 acquired its top competitor, TrueSpectra of San Mateo, for an undisclosed amount of cash and stock. On June 15, 2004, Scene7 raised $7.5 million in another round of financing, led by home shopping company QVC with some of Scene7's existing investors. At the same time, Jeffrey Branman, President of Interactive Technology Partners at QVC, and David Rubenstein, co-founder of the private equity firm The Carlyle Group, joined Scene7's board of directors, which was composed of James Caccavo of Moore Capital, Andrew Wright of RealNetworks, and Mack. Since the early 2000s, the company's growth has been fueled by an increase in broadband Internet access, which loads virtual catalogs faster than dial-up Internet access. When catalogs first appeared online in the late 1990s, the graphics took too long to load. After high-speed Internet access became more popular, virtual catalogs quickly grew to become a popular feature of online stores. In addition to faster Internet connectivity, a study in 2000 noted that an online presence for brick and mortar businesses increased offline sales by an average of 27%. Mack also pointed out that disseminating more product information plays a role in increasing sales: "We have the ability to provide consistent information... One of the advantages of selling furniture online is the hyperscript; you always have the original specifications on a product."
Subsidiary of Adobe (2007)
Scene7 was acquired by Adobe Systems on May 31, 2007 for an undisclosed sum. At that time, Scene7 had 80 employees, most of whom were transferred from Scene7's former headquarters in Novato, California to Adobe's offices in San Francisco, California. Mack joined Adobe as its vice president of Creative Solutions Services.
Scene7 was added to Adobe's product line as a hosted service to help boost Adobe's overall services strategy, especially its software as a service efforts, and because Scene7 was a good fit due to its heavy usage of Adobe products. Adobe planned to integrate Scene7's products into Adobe LiveCycle, the company's suite of server software products, at an unspecified time. The Scene7 brand was to continue in use, but would "eventually be replaced with the Adobe brand". Denmark-based YaWah, a dynamic imaging software company, was acquired by Adobe on September 26, 2008 to help expand Scene7 globally.
References
Mass media companies of the United States
2007 mergers and acquisitions
Companies based in San Rafael, California
13288852
https://en.wikipedia.org/wiki/Kaspersky%20Internet%20Security
Kaspersky Internet Security
Kaspersky Internet Security (often abbreviated to KIS) is an internet security suite developed by Kaspersky Lab, compatible with Microsoft Windows and Mac OS X. Kaspersky Internet Security offers protection from malware, as well as email spam, phishing and hacking attempts, and data leaks. Kaspersky Lab Diagnostics results are distributed to relevant developers through the MIT License.
Windows edition
Version 2007 (6.0)
Version 6.0 was the first release of KIS. PC World magazine praised version 6.0's detection of malware. KIS detected 100 percent of threats on a subset of the January 2006 wild-list, a list of prevalent threats. The suite detected almost 100 percent (99.57%) of adware samples. KIS can scan within compressed or packed files, and detected 83.3 percent of the "hidden" malware. However, version 6.0 was criticized for not completely removing malware, leaving Registry entries and files behind. PC World also highlighted the suite's false positives — eight of 20,000 clean files were incorrectly flagged as malicious — and its noticeable impact on computer performance. However, data is cached from each scan, making each subsequent scan faster. The firewall blocked all attacks from inside and outside the computer when tested. The magazine found the graphical user interface awkward to navigate. Features such as parental controls and instant messaging protection, found in competing suites from Symantec and McAfee, were not part of version 6.0. Both CNET and PC World criticized the suite's relatively high retail price, US$79.95. KIS 6.0 supports Windows 98 SE, ME, NT Workstation 4.0, 2000 Professional, XP Home Edition, XP Professional, XP Professional x64, and Vista. 50 megabytes of free space, Internet Explorer 5.5, and Windows Installer 2.0 are required. RAM and CPU requirements are dependent on the operating system.
Version 2008 (7.0)
Version 7.0 introduced a redesigned GUI.
Components were renamed and reorganized; the Anti-hacker module was renamed the Firewall, and the Anti-Spy module was integrated into the Privacy Control module. PC World described the new interface as "intuitive" and "great-looking". Parental controls were introduced, with specific settings for different age categories, such as "child" or "parent". Within age categories are content categories, such as drugs or violence. Users can manually configure profiles. Filtering profiles can be associated with users. Since content is filtered at the network level, the feature works with any Internet browser. The filter relies on a database of known URLs and can analyse websites in real-time. Attempts to access forbidden URLs are logged, and sites visited are tracked as well, raising privacy issues. Limits on Internet access may be set based on time, and chat rooms along with webmail sites can be manually blocked. Spam filtering integrates with Microsoft Outlook, Outlook Express, Windows Mail, and The Bat!. E-mail content is analysed and scored, and messages with scores above one of two specified thresholds are marked either "!!spam" or "??probably spam". The Mail Dispatcher feature shows subject and sender information for messages, and allows users to avoid downloading blatant spam by selecting which messages to download. The filter self-trains by analyzing incoming and outgoing e-mail not marked as spam, or by analyzing folders containing only spam or only valid e-mail. Senders of verified valid e-mail are whitelisted. E-mail can also be whitelisted or blacklisted based on phrases present in the text. E-mail with non-ASCII characters or invisible text can also be blocked. However, version 7.0 had a relatively poor showing, misidentifying 30 percent of valid messages in PC Magazine testing; 30 percent of spam also made it to the inbox. Protection against data leaks was incorporated in this release.
The suite warns users when programs attempt to access or send data from certain areas, such as where Internet Explorer stores webform information. Malware protection was mostly positive in detection and disinfection tests by AV-Test.org. Version 7.0 detected 100 percent of wildlist threats. Using one-month-old signatures and a set of new malware, however, detection fell to 14 percent. Files were scanned at 5.24 megabytes per second. Version 7.0 successfully identified all six actively running rootkits, four of six inactive rootkits, and was only able to remove two of six rootkits. The firewall correctly blocked all attempted outside connections, with a reasonable level of security when left on default settings. This version drops support for Windows 98, 2000, and NT. Windows XP Service Pack 2 is required, except in the case of the XP Professional x64 edition. Vista is supported as well. RAM and CPU requirements are dependent on the operating system; 75 megabytes of free space, Internet Explorer 5.5, and Windows Installer 2.0 are also required.
Version 2009 (8.0)
This version introduces a revised user interface, an application filtering module, an updated anti-virus engine, and a vulnerability scanner. The main window separates settings into four categories, compared to eight in its predecessor. A status bar changes colour (green, yellow, and red) to reflect overall program status and flashes to divert attention when needed. PC Magazine also noted pop-up notifications were kept to a minimum. Kaspersky claims the core anti-virus engine was revised to increase scan speed. PC Magazine found an initial scan took over two hours; however, subsequent scans took two minutes to complete. However, malware detection was relatively low in comparison to other anti-virus applications tested. Out of 650 thousand samples, version 8.0 detected 95.6 percent. The top score was around 99 percent. Using two-week-old signatures, version 8.0 detected 52 percent of viruses in a different set of samples.
Kaspersky also blocked about 60 percent of malware based solely on behaviour. The top performers scored 55.3 percent and 80 percent respectively. Version 2009 detected 98.1 percent of adware. However, PC World noted that to achieve that kind of performance, users have to modify the program's settings: on default settings, KIS allowed Zango to install. To block the installation, users must enable KIS to scan for "other malware". The Security Analyzer looks for operating system and program patches. It also looks for vulnerable system settings, presenting users with a list of recommended actions to prevent malware from gaining access to a system. However, PC World criticized the amount of computer jargon used and the lack of information about how to adjust settings appropriately. On the other hand, PC Magazine found the feature straightforward, and often the solution involved downloading and installing an update. KIS uses a whitelist by Carbon Black to classify trusted and malicious programs. Malicious programs are not allowed to run at all. Unknown programs falling in between the two categories are restricted in the actions they can perform. Its firewall blocked all attacks in PC Magazine testing. Phishing protection was introduced in this release. Testing by PC Magazine found the feature blocked 44 percent of phishing URLs. Internet Explorer 7 blocked 67 percent of the URLs, and Mozilla Firefox blocked 81 percent. Spam filtering now integrates with Mozilla Thunderbird and scans NNTP traffic. Spam can be automatically diverted to its own folder. When using an unsupported e-mail client to download POP3, IMAP or NNTP mail, Kaspersky will still generate a report of all messages. However, in an unsupported client, there will be no toolbar, nor will the program classify any messages as spam in the client itself.
Version 2010 (9.0)
Version 2010 of Kaspersky Internet Security introduced an overhauled user interface and a sandbox for running applications in a virtualized environment.
The 9.0.0.736 build of KIS 2010 fully supported the Windows 7 operating system.
Version 2011 (11.0)
The beta version was released for all Windows users on 8 June 2010. This version included a new interface, as well as a gadget only available to Windows Vista and Windows 7 users. PC Mag rated this version "very good" (4/5 stars). Its firewall was noted to be very good, which made up for its merely adequate malware detection rates. Two critical fixes have been released by Kaspersky Lab, making the current version 11.0.2.556.
Version 2012 (12.0)
On 1 March 2011, Kaspersky released the first build of version 2012 as a beta, available in English, French and Russian, with more language versions due out later. On 7 June 2011, Kaspersky Lab announced the commercial release of Kaspersky Internet Security 2012 in France, Germany, and Switzerland. The current version is 12.0.0.374.
Version 2013 (13.0)
The beta version was released for all Windows users on 3 March 2012. This version includes an interface which looks much like that of Internet Security 2012. There is no Safe Run option and no Proactive Defense; instead, the behavioural monitoring System Watcher takes greater responsibility for detecting malware, and a Safe Banking feature has been added. The release candidate (build 13.0.1.4088 RC) was released for all Windows users on 20 July 2012. The final version, build 13.0.1.4190, was released on 28 August 2012.
Version 2014 (14.0)
Beta testing started on 12 March 2013. This version introduced a Windows 8-like GUI design. The final version was released on 3 August 2013 as build 14.0.0.4651 in India and Russia, then on August 13 in the US and August 27 in the UK.
The 2014 release was widely regarded as falling short of user expectations, largely because it removed a range of granular fine-tuning options present in 2013 and earlier versions that experienced users relied on; a number of these were added back in the 2015 beta by the time of its technical release (build 463). On February 13, 2014, build 14.0.0.4651(E) was released. Build 14.0.0.4651(I) is the latest (current) version.
Version 2015 (15.0)
In April 2014, a beta version of the 2015 product, build 463, was released, followed by a technical release preview of the near-complete 2015 product. The first official release of the product was in Bangladesh in June 2014.
Version 2017 (17.0)
Version 2018 (18.0)
Version 2019 (19.0)
Version 2020 (20.0)
Version 2021 (21.0)
Controversies regarding security
In March 2015, Bloomberg accused Kaspersky of having close ties to Russian military and intelligence officials. Kaspersky rejected the claims in his blog, calling the coverage "sensationalist" and guilty of "exploiting paranoia" to "increase readership". As a result of alleged Russian involvement in the 2016 presidential election and ongoing investigations, the Department of Homeland Security officially banned the use of Kaspersky Internet Security by the United States federal government in September 2017. As of December 12, 2017, the use of Kaspersky software by the American federal government is banned by law.
See also
Antivirus software
Internet security
Comparison of antivirus software
Comparison of firewalls
Comparison of computer viruses
Eugene Kaspersky
Natalya Kaspersky
Kaspersky Anti-Virus
Kaspersky Lab
References
External links
2006 software
Antivirus software
Firewall software
Security software
Proprietary software
Windows security software
MacOS security software
Linux security software
52060075
https://en.wikipedia.org/wiki/DDoS%20attack%20on%20Dyn
DDoS attack on Dyn
The DDoS attack on Dyn was a series of distributed denial-of-service attacks (DDoS attacks) on October 21, 2016, targeting systems operated by Domain Name System (DNS) provider Dyn. The attack caused major Internet platforms and services to be unavailable to large swathes of users in Europe and North America. The groups Anonymous and New World Hackers claimed responsibility for the attack, but scant evidence was provided. As a DNS provider, Dyn provides to end-users the service of mapping an Internet domain name—when, for instance, entered into a web browser—to its corresponding IP address. The distributed denial-of-service (DDoS) attack was accomplished through numerous DNS lookup requests from tens of millions of IP addresses. The activities are believed to have been executed through a botnet consisting of many Internet-connected devices—such as printers, IP cameras, residential gateways and baby monitors—that had been infected with the Mirai malware.
Affected services
Services affected by the attack included:
Airbnb
Amazon.com
Ancestry.com
The A.V. Club
BBC
The Boston Globe
Box
Business Insider
CNN
Comcast
CrunchBase
DirecTV
The Elder Scrolls Online
Electronic Arts
Etsy
Evergreen ILS
FiveThirtyEight
Fox News
The Guardian
GitHub
Grubhub
HBO
Heroku
HostGator
iHeartRadio
Imgur
Indiegogo
Mashable
National Hockey League
Netflix
The New York Times
Overstock.com
PayPal
Pinterest
Pixlr
PlayStation Network
Qualtrics
Quora
Reddit
Roblox
Ruby Lane
RuneScape
SaneBox
Seamless
Second Life
Shopify
Slack
SoundCloud
Squarespace
Spotify
Starbucks
Storify
Swedish Civil Contingencies Agency
Swedish Government
Tumblr
Twilio
Twitter
Verizon Communications
Visa
Vox Media
Walgreens
The Wall Street Journal
Wikia
Wired
Wix.com
WWE Network
Xbox Live
Yammer
Yelp
Zillow
Investigation
The US Department of Homeland Security started an investigation into the attacks, according to a White House source.
No group of hackers claimed responsibility during or in the immediate aftermath of the attack. Dyn's chief strategist said in an interview that the assaults on the company's servers were very complex and unlike everyday DDoS attacks. Barbara Simons, a member of the advisory board of the United States Election Assistance Commission, said such attacks could affect electronic voting for overseas military or civilians. Dyn disclosed that, according to business risk intelligence firm FlashPoint and Akamai Technologies, the attack had been coordinated through a botnet of numerous Internet of Things-enabled (IoT) devices, including cameras, residential gateways, and baby monitors, that had been infected with Mirai malware. The attribution of the attack to the Mirai botnet had been previously reported by BackConnect Inc., another security firm. Dyn stated that it was receiving malicious requests from tens of millions of IP addresses. Mirai is designed to brute-force the security on an IoT device, allowing it to be controlled remotely. Cybersecurity investigator Brian Krebs noted that the source code for Mirai had been released onto the Internet in an open-source manner some weeks prior, which made the investigation of the perpetrator more difficult. On 25 October 2016, US President Obama stated that the investigators still had no idea who carried out the cyberattack. On 13 December 2017, the Justice Department announced that three men (Paras Jha, 21, Josiah White, 20, and Dalton Norman, 21) had entered guilty pleas in cybercrime cases relating to the Mirai and clickfraud botnets.
Perpetrators
In correspondence with the website Politico, hacktivist groups SpainSquad, Anonymous, and New World Hackers claimed responsibility for the attack in retaliation for Ecuador's rescinding of Internet access for WikiLeaks founder Julian Assange at its embassy in London, where he had been granted asylum. This claim has yet to be confirmed.
WikiLeaks alluded to the attack on Twitter, tweeting "Mr. Assange is still alive and WikiLeaks is still publishing. We ask supporters to stop taking down the US internet. You proved your point." New World Hackers has claimed responsibility in the past for similar attacks targeting sites like BBC and ESPN.com. On October 26, FlashPoint stated that the attack was most likely done by script kiddies. A November 17, 2016, Forbes article reported that the attack was likely carried out by "an angry gamer". A September 20, 2018, WeLiveSecurity article stated that Mirai's three creators meant it as a way of gaining an advantage in the fierce competition surrounding the computer game Minecraft – by preventing players from using competitors' servers and driving them to their own servers in order to ultimately make money off them. On December 9, 2020, one of the perpetrators pleaded guilty to taking part in the attack. The perpetrator's name was withheld due to his or her age. See also WannaCry ransomware attack Mirai (malware) Vulnerability (computing) Dyn (DynDNS) DDoS Attack References
40643
https://en.wikipedia.org/wiki/Non-uniform%20memory%20access
Non-uniform memory access
Non-uniform memory access (NUMA) is a computer memory design used in multiprocessing, where the memory access time depends on the memory location relative to the processor. Under NUMA, a processor can access its own local memory faster than non-local memory (memory local to another processor or memory shared between processors). The benefits of NUMA are limited to particular workloads, notably on servers where the data is often associated strongly with certain tasks or users. NUMA architectures logically follow in scaling from symmetric multiprocessing (SMP) architectures. They were developed commercially during the 1990s by Unisys, Convex Computer (later Hewlett-Packard), Honeywell Information Systems Italy (HISI) (later Groupe Bull), Silicon Graphics (later Silicon Graphics International), Sequent Computer Systems (later IBM), Data General (later EMC), and Digital (later Compaq, then HP, now HPE). Techniques developed by these companies later featured in a variety of Unix-like operating systems, and to an extent in Windows NT. The first commercial implementation of a NUMA-based Unix system was the Symmetrical Multi Processing XPS-100 family of servers, designed by Dan Gielan of VAST Corporation for Honeywell Information Systems Italy. Overview Modern CPUs operate considerably faster than the main memory they use. In the early days of computing and data processing, the CPU generally ran slower than its own memory. The performance lines of processors and memory crossed in the 1960s with the advent of the first supercomputers. Since then, CPUs increasingly have found themselves "starved for data" and having to stall while waiting for data to arrive from memory (e.g. for Von-Neumann architecture-based computers, see Von Neumann bottleneck). Many supercomputer designs of the 1980s and 1990s focused on providing high-speed memory access as opposed to faster processors, allowing the computers to work on large data sets at speeds other systems could not approach. 
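The local-versus-remote distinction at the heart of NUMA can be sketched with a toy latency model. The latency figures below are made-up assumptions chosen only to show the shape of the trade-off; real numbers depend on the processor, interconnect, and memory technology.

```python
# Toy model of NUMA effective memory access time. The latencies are
# illustrative assumptions, not measurements of any real system.

def effective_access_time(local_ns, remote_ns, local_fraction):
    """Average access latency when a fraction of accesses hit local memory."""
    assert 0.0 <= local_fraction <= 1.0
    return local_fraction * local_ns + (1.0 - local_fraction) * remote_ns

# Assume (hypothetically) local DRAM at 80 ns and a remote node at 140 ns.
print(effective_access_time(80, 140, 1.0))  # all accesses local: 80.0 ns
print(effective_access_time(80, 140, 0.5))  # half remote: 110.0 ns
```

The model makes the design goal concrete: the closer the local fraction is to 1.0 (data strongly associated with the task running nearby), the closer the machine gets to local-memory speed, which is why NUMA pays off mainly for workloads whose data partitions cleanly.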
Limiting the number of memory accesses provided the key to extracting high performance from a modern computer. For commodity processors, this meant installing an ever-increasing amount of high-speed cache memory and using increasingly sophisticated algorithms to avoid cache misses. But the dramatic increase in size of the operating systems and of the applications run on them has generally overwhelmed these cache-processing improvements. Multi-processor systems without NUMA make the problem considerably worse. Now a system can starve several processors at the same time, notably because only one processor can access the computer's memory at a time. NUMA attempts to address this problem by providing separate memory for each processor, avoiding the performance hit when several processors attempt to address the same memory. For problems involving spread data (common for servers and similar applications), NUMA can improve the performance over a single shared memory by a factor of roughly the number of processors (or separate memory banks). Another approach to addressing this problem is the multi-channel memory architecture, in which a linear increase in the number of memory channels increases the memory access concurrency linearly. Of course, not all data ends up confined to a single task, which means that more than one processor may require the same data. To handle these cases, NUMA systems include additional hardware or software to move data between memory banks. This operation slows the processors attached to those banks, so the overall speed increase due to NUMA heavily depends on the nature of the running tasks. Implementations AMD implemented NUMA with its Opteron processor (2003), using HyperTransport. Intel announced NUMA compatibility for its x86 and Itanium servers in late 2007 with its Nehalem and Tukwila CPUs. 
Both Intel CPU families share a common chipset; the interconnection is called Intel QuickPath Interconnect (QPI), which provides extremely high bandwidth to enable high on-board scalability and was replaced by a new version called Intel UltraPath Interconnect with the release of Skylake (2017). Cache coherent NUMA (ccNUMA) Nearly all CPU architectures use a small amount of very fast non-shared memory known as cache to exploit locality of reference in memory accesses. With NUMA, maintaining cache coherence across shared memory has a significant overhead. Although simpler to design and build, non-cache-coherent NUMA systems become prohibitively complex to program in the standard von Neumann architecture programming model. Typically, ccNUMA uses inter-processor communication between cache controllers to keep a consistent memory image when more than one cache stores the same memory location. For this reason, ccNUMA may perform poorly when multiple processors attempt to access the same memory area in rapid succession. Support for NUMA in operating systems attempts to reduce the frequency of this kind of access by allocating processors and memory in NUMA-friendly ways and by avoiding scheduling and locking algorithms that make NUMA-unfriendly accesses necessary. Alternatively, cache coherency protocols such as the MESIF protocol attempt to reduce the communication required to maintain cache coherency. Scalable Coherent Interface (SCI) is an IEEE standard defining a directory-based cache coherency protocol to avoid scalability limitations found in earlier multiprocessor systems. For example, SCI is used as the basis for the NumaConnect technology. NUMA vs. cluster computing One can view NUMA as a tightly coupled form of cluster computing. The addition of virtual memory paging to a cluster architecture can allow the implementation of NUMA entirely in software. 
However, the inter-node latency of software-based NUMA remains several orders of magnitude greater (slower) than that of hardware-based NUMA. Software support Since NUMA largely influences memory access performance, certain software optimizations are needed to allow scheduling threads and processes close to their in-memory data. Microsoft Windows 7 and Windows Server 2008 R2 added support for NUMA architecture over 64 logical cores. Java 7 added support for a NUMA-aware memory allocator and garbage collector. Linux kernel: Version 2.5 provided basic NUMA support, which was further improved in subsequent kernel releases. Version 3.8 of the Linux kernel brought a new NUMA foundation that allowed development of more efficient NUMA policies in later kernel releases. Version 3.13 of the Linux kernel brought numerous policies that aim at putting a process near its memory, together with the handling of cases such as having memory pages shared between processes, or the use of transparent huge pages; new sysctl settings allow NUMA balancing to be enabled or disabled, as well as the configuration of various NUMA memory balancing parameters. OpenSolaris models NUMA architecture with lgroups. FreeBSD added support for NUMA architecture in version 9.0. Silicon Graphics IRIX (discontinued as of 2021) supported the ccNUMA architecture on up to 1240 CPUs with its Origin server series. Hardware support As of 2011, ccNUMA systems are multiprocessor systems based on the AMD Opteron processor, which can be implemented without external logic, and the Intel Itanium processor, which requires the chipset to support NUMA. Examples of ccNUMA-enabled chipsets are the SGI Shub (Super hub), the Intel E8870, the HP sx2000 (used in the Integrity and Superdome servers), and those found in NEC Itanium-based systems. Earlier ccNUMA systems such as those from Silicon Graphics were based on MIPS processors and the DEC Alpha 21364 (EV7) processor.
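The kernel policies that "put a process near its memory" can be illustrated with a toy scheduler. Everything here is a hypothetical model — made-up latencies, task names, and node assignments — meant only to show why co-locating a task with the node holding its pages lowers total access cost.

```python
# Toy illustration of NUMA-aware placement: run each task on the node that
# holds its memory pages. Latencies and task/node assignments are hypothetical.

LOCAL_NS, REMOTE_NS = 80, 140       # assumed local/remote access latencies
task_memory_node = {"a": 0, "b": 1, "c": 0}   # task -> node holding its pages

def total_cost(placement):
    """Cost of one memory access per task under a given task->node placement."""
    return sum(LOCAL_NS if placement[t] == task_memory_node[t] else REMOTE_NS
               for t in task_memory_node)

numa_aware = dict(task_memory_node)              # co-locate tasks with memory
naive = {t: 0 for t in task_memory_node}         # schedule everything on node 0

print(total_cost(numa_aware))  # all accesses local
print(total_cost(naive))       # task "b" pays the remote penalty
```

A NUMA-unaware scheduler that piles tasks onto one node forces some of them to reach across the interconnect on every access; NUMA-aware placement (as in the Linux policies described above) avoids that penalty whenever the memory layout allows it.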
See also Uniform memory access (UMA) Cache-only memory architecture (COMA) Scratchpad memory (SPM) Partitioned global address space HiperDispatch References External links NUMA FAQ Page-based distributed shared memory OpenSolaris NUMA Project Introduction video for the Alpha EV7 system architecture More videos related to EV7 systems: CPU, IO, etc NUMA optimization in Windows Applications NUMA Support in Linux at SGI Intel Tukwila Intel QPI (CSI) explained current Itanium NUMA systems
84520
https://en.wikipedia.org/wiki/Tethys%20%28mythology%29
Tethys (mythology)
In Greek mythology, Tethys was a Titan daughter of Uranus and Gaia, a sister and wife of the Titan Oceanus, and the mother of the river gods and the Oceanids. Although Tethys had no active role in Greek mythology and no established cults, she was depicted in mosaics decorating baths, pools, and triclinia in the Greek East, particularly in Antioch and its suburbs, either alone or with Oceanus. Genealogy Tethys was one of the Titan offspring of Uranus (Sky) and Gaia (Earth). Hesiod lists her Titan siblings as Oceanus, Coeus, Crius, Hyperion, Iapetus, Theia, Rhea, Themis, Mnemosyne, Phoebe, and Cronus. Tethys married her brother Oceanus, an enormous river encircling the world, and was by him the mother of numerous sons (the river gods) and numerous daughters (the Oceanids). According to Hesiod, there were three thousand (i.e. innumerable) river gods. These included Achelous, the god of the Achelous River, the largest river in Greece, who gave his daughter in marriage to Alcmaeon and was defeated by Heracles in a wrestling contest for the right to marry Deianira; Alpheus, who fell in love with the nymph Arethusa and pursued her to Syracuse, where she was transformed into a spring by Artemis; and Scamander, who fought on the side of the Trojans during the Trojan War and, offended when Achilles polluted his waters with a large number of Trojan corpses, overflowed his banks, nearly drowning Achilles. According to Hesiod, there were also three thousand Oceanids.
These included Metis, Zeus' first wife, whom Zeus impregnated with Athena and then swallowed; Eurynome, Zeus' third wife, and mother of the Charites; Doris, the wife of Nereus and mother of the Nereids; Callirhoe, the wife of Chrysaor and mother of Geryon; Clymene, the wife of Iapetus, and mother of Atlas, Menoetius, Prometheus, and Epimetheus; Perseis, wife of Helios and mother of Circe and Aeetes; Idyia, wife of Aeetes and mother of Medea; and Styx, goddess of the river Styx, and the wife of Pallas and mother of Zelus, Nike, Kratos, and Bia. Primeval mother? Passages in a section of the Iliad, called the Deception of Zeus, suggest the possibility that Homer knew a tradition in which Oceanus and Tethys (rather than Uranus and Gaia, as in Hesiod) were the primeval parents of the gods. Twice Homer has Hera describe the pair as "Oceanus, from whom the gods are sprung, and mother Tethys". According to M. L. West, these lines suggest a myth in which Oceanus and Tethys are the "first parents of the whole race of gods." However, as Timothy Gantz points out, "mother" could simply refer to the fact that Tethys was Hera's foster mother for a time, as Hera tells us in the lines immediately following, while the reference to Oceanus as the genesis of the gods "might be simply a formulaic epithet indicating the numberless rivers and springs descended from Okeanos" (compare with Iliad 21.195–197). But, in a later Iliad passage, Hypnos also describes Oceanus as "genesis for all", which, according to Gantz, is hard to understand as meaning other than that, for Homer, Oceanus was the father of the Titans. Plato, in his Timaeus, provides a genealogy (probably Orphic) which perhaps reflected an attempt to reconcile this apparent divergence between Homer and Hesiod, in which Uranus and Gaia are the parents of Oceanus and Tethys, and Oceanus and Tethys are the parents of Cronus and Rhea and the other Titans, as well as Phorcys.
In his Cratylus, Plato quotes Orpheus as saying that Oceanus and Tethys were "the first to marry", possibly also reflecting an Orphic theogony in which Oceanus and Tethys—rather than Uranus and Gaia—were the primeval parents. Plato's apparent inclusion of Phorkys as a Titan (being the brother of Cronus and Rhea), and the mythographer Apollodorus's inclusion of Dione, the mother of Aphrodite by Zeus, as a thirteenth Titan, suggests an Orphic tradition in which Hesiod's twelve Titans were the offspring of Oceanus and Tethys, with Phorkys and Dione taking the place of Oceanus and Tethys. According to Epimenides, the first two beings, Night and Aer, produced Tartarus, who in turn produced two Titans (possibly Oceanus and Tethys) from whom came the world egg. Mythology Tethys played no active part in Greek mythology. The only early story concerning Tethys is what Homer has Hera briefly relate in the Iliad’s Deception of Zeus passage. There, Hera says that when Zeus was in the process of deposing Cronus, she was given by her mother Rhea to Tethys and Oceanus for safekeeping and that they "lovingly nursed and cherished me in their halls". Hera relates this while dissembling that she is on her way to visit Oceanus and Tethys in the hopes of reconciling her foster parents, who are angry with each other and are no longer having sexual relations. Originally Oceanus' consort, at a later time Tethys came to be identified with the sea, and in Hellenistic and Roman poetry Tethys' name came to be used as a poetic term for the sea. The only other story involving Tethys is an apparently late astral myth concerning the polar constellation Ursa Major (the Great Bear), which was thought to represent the catasterism of Callisto who was transformed into a bear and placed by Zeus among the stars. 
The myth explains why the constellation never sets below the horizon, saying that since Callisto had been Zeus's lover, she was forbidden by Tethys from "touching Ocean's deep" out of concern for her foster-child Hera, Zeus's jealous wife. Claudian wrote that Tethys nursed at her breast her nephew and niece, Helios and Selene, the children of her siblings Hyperion and Theia, during their infancy, when their light was weak and they had not yet grown into their older, more luminous selves. In Ovid's Metamorphoses, Tethys turns Aesacus into a diving bird. Tethys was sometimes confused with another sea goddess, the sea-nymph Thetis, wife of Peleus and mother of Achilles. Tethys as Tiamat M. L. West detects in the Iliad's Deception of Zeus passage an allusion to a possible archaic myth "according to which [Tethys] was the mother of the gods, long estranged from her husband," speculating that the estrangement might refer to a separation of "the upper and lower waters ... corresponding to that of heaven and earth," which parallels the story of "Apsū and Tiamat in the Babylonian cosmology, the male and female waters, which were originally united (En. El. I. 1 ff.)," but that, "By Hesiod's time the myth may have been almost forgotten and Tethys remembered only as the name of Oceanus' wife." This possible correspondence between Oceanus and Tethys, and Apsū and Tiamat has been noticed by several authors, with Tethys' name possibly having been derived from that of Tiamat. Iconography Representations of Tethys before the Roman period are rare. Tethys appears, identified by inscription (ΘΕΘΥΣ), as part of an illustration of the wedding of Peleus and Thetis on the early sixth-century BC Attic black-figure "Erskine" dinos by Sophilos (British Museum 1971.111–1.1). Accompanied by Eileithyia, the goddess of childbirth, Tethys follows close behind Oceanus at the end of a procession of gods invited to the wedding.
Tethys is also conjectured to be represented in a similar illustration of the wedding of Peleus and Thetis depicted on the early sixth-century BC Attic black-figure François Vase (Florence 4209). Tethys probably also appeared as one of the gods fighting the Giants in the Gigantomachy frieze of the second-century BC Pergamon Altar. Only fragments of the figure remain: a part of a chiton below Oceanus' left arm and a hand clutching a large tree branch visible behind Oceanus' head. During the second to fourth centuries AD, Tethys—sometimes with Oceanus, sometimes alone—became a relatively frequent feature of mosaics decorating baths, pools, and triclinia in the Greek East, particularly in Antioch and its suburbs. Her identifying attributes are wings sprouting from her forehead, a rudder/oar, and a ketos, a creature from Greek mythology with the head of a dragon and the body of a snake. The earliest of these mosaics, identified as Tethys, decorated a triclinium overlooking a pool, excavated from the House of the Calendar in Antioch, dated to shortly after AD 115 (Hatay Archaeology Museum 850). Tethys, reclining on the left, with Oceanus reclining on the right, has long hair, a winged forehead, and is nude to the waist with draped legs. A ketos twines around her raised right arm. Other mosaics of Tethys with Oceanus include Hatay Archaeology Museum 1013 (from the House of Menander, Daphne), Hatay Archaeology Museum 9095, and Baltimore Museum of Art 1937.126 (from the House of the Boat of Psyches: triclinium). In other mosaics, Tethys appears without Oceanus. One of these is a fourth-century AD mosaic from a pool (probably a public bath) found at Antioch, now installed in Boston, Massachusetts at the Harvard Business School's Morgan Hall and formerly at Dumbarton Oaks, Washington, D.C. (Dumbarton Oaks 76.43). Besides the Sophilos dinos, this is the only other representation of Tethys identified by inscription. 
Here Tethys, with a winged forehead, rises from the sea bare-shouldered, with long dark hair parted in the middle. A golden rudder rests against her right shoulder. Others include Hatay Archaeology Museum 9097, Shahba Museum (in situ), Baltimore Museum of Art 1937.118 (from the House of the Boat of Psyches: Room six), and Memorial Art Gallery 42.2. Toward the end of the period represented by these mosaics, Tethys' iconography appears to merge with that of another sea goddess Thalassa, the Greek personification of the sea (thalassa being the Greek word for the sea). Such a transformation would be consistent with the frequent use of Tethys' name as a poetic reference to the sea in Roman poetry (see above). Modern use of the name Tethys, a moon of the planet Saturn, and the prehistoric Tethys Ocean are named after this goddess. Notes References Aeschylus(?), Prometheus Bound in Aeschylus, with an English translation by Herbert Weir Smyth, Ph. D. in two volumes. Vol 2. Cambridge, Massachusetts, Harvard University Press. 1926. Online version at the Perseus Digital Library. Aeschylus, Persians. Seven against Thebes. Suppliants. Prometheus Bound. Edited and translated by Alan H. Sommerstein. Loeb Classical Library No. 145. Cambridge, Massachusetts: Harvard University Press, 2009. . Online version at Harvard University Press. Apollodorus, Apollodorus, The Library, with an English Translation by Sir James George Frazer, F.B.A., F.R.S. in 2 Volumes. Cambridge, Massachusetts, Harvard University Press; London, William Heinemann Ltd. 1921. Online version at the Perseus Digital Library. Apollonius of Rhodes, Apollonius Rhodius: the Argonautica, translated by Robert Cooper Seaton, W. Heinemann, 1912. Internet Archive. Beazley, John Davidson, The Development of Attic Black-figure, Volume 24, University of California Press, 1951. . 
Budelmann, Felix and Johannes Haubold, "Reception and Tradition" in A Companion to Classical Receptions, edited by Lorna Hardwick and Christopher Stray, pp. 13–25. John Wiley & Sons, 2011. . Burkert, Walter The Orientalizing Revolution: Near Eastern Influence on Greek Culture in the Early archaic Age, Harvard University Press, 1992, pp. 91–93. Cahn, Herbert A., "Thalassa", in Lexicon Iconographicum Mythologiae Classicae (LIMC) VIII.1 Artemis Verlag, Zürich and Munich, 1997. . Caldwell, Richard, Hesiod's Theogony, Focus Publishing/R. Pullins Company (June 1, 1987). . Callimachus, Callimachus and Lycophron with an English translation by A. W. Mair ; Aratus, with an English translation by G. R. Mair, London: W. Heinemann, New York: G. P. Putnam 1921. Internet Archive Campbell, Sheila D. (1988), The Mosaics of Antioch, PIMS. . Campbell, Sheila D. (1998), The Mosaics of Anemurium, PIMS. . Carpenter, Thomas H., Dionysian Imagery in Archaic Greek Art: Its Development in Black-Figure Vase Painting, Clarendon Press, 1986. . Diodorus Siculus, Diodorus Siculus: The Library of History. Translated by C. H. Oldfather. Twelve volumes. Loeb Classical Library. Cambridge, Massachusetts: Harvard University Press; London: William Heinemann, Ltd. 1989. Vol. 3. Books 4.59–8. . Online version at Bill Thayer's Web Site Eraslan, Şehnaz. "Oceanus, Tethys and Thalssa Figures in the Light of Antioch and Zeugma Mosaics.", in: Journal of International Social Research 8.37 (2015), pp. 454–461. Fowler, R. L. (2000), Early Greek Mythography: Volume 1: Text and Introduction, Oxford University Press, 2000. . Fowler, R. L. (2013), Early Greek Mythography: Volume 2: Commentary, Oxford University Press, 2013. . Gantz, Timothy, Early Greek Myth: A Guide to Literary and Artistic Sources, Johns Hopkins University Press, 1996, Two volumes: (Vol. 1), (Vol. 2). Hard, Robin, The Routledge Handbook of Greek Mythology: Based on H.J. Rose's "Handbook of Greek Mythology", Psychology Press, 2004, . 
Hesiod, Theogony, in The Homeric Hymns and Homerica with an English Translation by Hugh G. Evelyn-White, Cambridge, Massachusetts., Harvard University Press; London, William Heinemann Ltd. 1914. Online version at the Perseus Digital Library. Homer, The Iliad with an English Translation by A.T. Murray, Ph.D. in two volumes. Cambridge, Massachusetts., Harvard University Press; London, William Heinemann, Ltd. 1924. Online version at the Perseus Digital Library. Hyginus, Gaius Julius, Astronomica, in The Myths of Hyginus, edited and translated by Mary A. Grant, Lawrence: University of Kansas Press, 1960. Hyginus, Gaius Julius, Fabulae in Apollodorus' Library and Hyginus' Fabulae: Two Handbooks of Greek Mythology, Translated, with Introductions by R. Scott Smith and Stephen M. Trzaskoma, Hackett Publishing Company, 2007. . Homeric Hymn to Hermes (4), in The Homeric Hymns and Homerica with an English Translation by Hugh G. Evelyn-White, Cambridge, Massachusetts., Harvard University Press; London, William Heinemann Ltd. 1914. Online version at the Perseus Digital Library. Jentel, Marie-Odile, "Tethys I", in Lexicon Iconographicum Mythologiae Classicae (LIMC) VIII.1 Artemis Verlag, Zürich and Munich, 1997. . Kern, Otto. Orphicorum Fragmenta, Berlin, 1922. Internet Archive Kondoleon, Christine, Antioch: The Lost Ancient City, Princeton University Press, 2000. . Lycophron, Alexandra (or Cassandra) in Callimachus and Lycophron with an English translation by A. W. Mair ; Aratus, with an English translation by G. R. Mair, London: W. Heinemann, New York: G. P. Putnam 1921. Internet Archive Macrobius, Saturnalia, Volume II: Books 3-5, edited and translated by Robert A. Kaster, Loeb Classical Library No. 511, Cambridge, Massachusetts, Harvard University Press, 2011. Online version at Harvard University Press. . Matthews, Monica, Caesar and the Storm: A Commentary on Lucan, De Bello Civili, Book 5, Lines 476–721, Peter Lang, 2008. . 
Most, G.W., Hesiod, Theogony, Works and Days, Testimonia, Edited and translated by Glenn W. Most, Loeb Classical Library No. 57, Cambridge, Massachusetts, Harvard University Press, 2018. . Online version at Harvard University Press. Ovid, Ovid's Fasti: With an English translation by Sir James George Frazer, London: W. Heinemann LTD; Cambridge, Massachusetts: Harvard University Press, 1959. Internet Archive. Ovid, Metamorphoses, Brookes More. Boston. Cornhill Publishing Co. 1922. Online version at the Perseus Digital Library. Plato, Cratylus in Plato in Twelve Volumes, Vol. 12 translated by Harold N. Fowler, Cambridge, Massachusetts, Harvard University Press; London, William Heinemann Ltd. 1925. Online version at the Perseus Digital Library. Plato, Critias in Plato in Twelve Volumes, Vol. 9 translated by W.R.M. Lamb. Cambridge, Massachusetts, Harvard University Press; London, William Heinemann Ltd. 1925. Online version at the Perseus Digital Library. Plato, Timaeus in Plato in Twelve Volumes, Vol. 9 translated by W.R.M. Lamb, Cambridge, Massachusetts, Harvard University Press; London, William Heinemann Ltd. 1925. Online version at the Perseus Digital Library. Claudian, Rape of Persephone in Claudian: Volume II. Translated by Platnauer, Maurice. Loeb Classical Library Volume 136. Cambridge, MA. Harvard University Press. 1922. Pollitt, Jerome Jordan, Art in the Hellenistic Age, Cambridge University Press. . Queyrel, François, L'Autel de Pergame: Images et pouvoir en Grèce d'Asie, Paris: Éditions A. et J. Picard, 2005. . Smith, William; Dictionary of Greek and Roman Biography and Mythology, London (1873). Wages, Sara M., "A Note on the Dumbarton Oaks 'Tethys Mosaic'" in: Dumbarton Oaks Papers 40 (1986), pp. 119–128. West, M. L. (1966), Hesiod: Theogony, Oxford University Press. West, M. L. (1983), The Orphic Poems, Clarendon Press. . West, M. L. (1997), The East Face of Helicon: West Asiatic Elements in Greek Poetry and Myth, Oxford University Press. .
Williams, Dyfri, "Sophilos in the British Museum" in Greek Vases In The J. Paul Getty Museum, Getty Publications, 1983, pp. 9–34. . External links TETHYS from The Theoi Project TETHYS from greekmythology.com
38602166
https://en.wikipedia.org/wiki/Mosh%20%28software%29
Mosh (software)
In computing, Mosh (mobile shell) is a tool used to connect from a client computer to a server over the Internet, to run a remote terminal. Mosh is similar to SSH, with additional features meant to improve usability for mobile users. The major features are: Mosh maintains its session even when it "roams" (when the client endpoint changes to different IP addresses), for example by moving to a different Wi-Fi network or when changing from Wi-Fi to 3G. Mosh maintains the terminal session (not "connection" in the TCP sense, because Mosh uses UDP) even when a user loses their Internet connection or puts their client to "sleep." In comparison, SSH can lose its connection in such cases because TCP times out. The Mosh client attempts to be responsive to keyboard events (typing, erasing characters with the [Delete] key, and so on) without waiting for network lag. It uses an adaptive system that predicts whether the application running on the server will decide to echo the user's keystrokes or deletions. The main drawbacks of mosh are additional prerequisites on the server, the lack of some special features of SSH (such as connection forwarding), and the lack of a native Windows client. An alternative for Linux servers (which still requires installation on the server) is to use GNU Screen on top of a regular SSH connection. Design Mosh works at a different layer from SSH. Whereas SSH transmits a stream of bytes in each direction (from server to client or client to server) using TCP, Mosh runs a terminal emulator at the server to figure out what should be on the screen. The server then transmits this screen to the client at a varying frame rate, depending on the speed of the network. This allows Mosh to save on network traffic on slow or intermittent connections. Supported platforms Mosh is available for most Linux distributions, macOS, FreeBSD, NetBSD, OpenBSD, Android, Solaris, and Cygwin, and as a Chrome App.
The iOS program Termius includes an independent implementation of the Mosh protocol. Performance Roaming Mosh is built on the State-Synchronization Protocol (SSP), which supports single-packet roaming. After the client has switched to a new IP address, a single packet that successfully reaches the server is enough to "roam" the connection. The client does not need to know it has roamed. (The client may be using NAT and the NAT roamed instead.) Packet loss In the Mosh research paper, the creators tested SSP on a link with 29% packet loss, and found that SSP reduced the average response time by a factor of 50 (from 16.8 seconds to 0.33 seconds) compared with SSH, which uses TCP. A different study, by students at Stanford University, found that SSP reduced the average response time by a factor of 30 (from 5.9 seconds to 0.19 seconds). Local echo According to Mosh's developers, the program was found to be able to predict and immediately display 70% of user keystrokes, reducing the median response time to a keystroke to less than 5 milliseconds (masking the latency of the network). A different study, by students at Stanford University, found that Mosh was able to quickly echo 55% of user keystrokes. Drawbacks Compared to the more popular SSH, mosh has the following drawbacks: Prerequisites on the server The major drawback of mosh is that it requires the server to fulfill additional prerequisites which are not needed by ssh itself. Due to its design, mosh needs the server to allow direct connections via UDP. Servers not fulfilling these prerequisites cannot be used by mosh. Examples of such systems include servers behind firewalls which restrict connections to the ssh-port via TCP. Also problematic are servers which are only indirectly reachable. The latter is usually accommodated by ssh via the 'ProxyCommand' option, but this is not supported by mosh. 
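The single-packet roaming behaviour described above can be sketched as a small state machine: the server simply replies to the source address of the newest datagram it has accepted. This is a toy model of the idea only, not the real State-Synchronization Protocol, which additionally authenticates and encrypts every datagram; the addresses and sequence numbers below are hypothetical.

```python
# Toy model of single-packet roaming: the server adopts the source address of
# the newest (highest sequence number) datagram seen. Not the real SSP, which
# also authenticates each datagram before trusting its source address.

class RoamingServer:
    def __init__(self):
        self.last_seq = -1
        self.client_addr = None

    def on_datagram(self, seq, source_addr):
        # Ignore old or replayed datagrams; adopt the address of newer ones.
        if seq > self.last_seq:
            self.last_seq = seq
            self.client_addr = source_addr
        return self.client_addr

server = RoamingServer()
server.on_datagram(1, ("198.51.100.7", 60001))        # client on Wi-Fi
addr = server.on_datagram(2, ("203.0.113.9", 41000))  # client roamed to 3G/NAT
print(addr)  # replies now go to the new address
```

Because only the newest datagram matters, one packet from the new network is enough to re-home the session, and the client need not even know its public address changed (e.g. when a NAT rebinds), matching the roaming description above.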
One port per connection By default, the server tries to allocate the first free UDP port in the range 60001–61000, per connection. This dynamic port allocation is considered an extra burden and risk for firewall maintenance. A significant part of firewall filtering happens through connection tracking, so-called stateful filtering, which is based on the SYN/ACK flags in TCP segments; UDP packets have no such flags. Mitigation: the UDP port on the server can be set per mosh connection, so that only a limited number of ports need to be opened. Deep packet inspection firewalls and application firewalls can handle this better by looking at the content of the packet and associating it with the initial connection. Output drops and lack of terminal scrollback Scrollback is not supported in the current release of mosh, and when using it in a terminal emulator with scrollbars they disappear, but support is planned for the 1.3 release. This functionality is a trade-off for garbage cleaning, as binary output is wiped away. One way to mitigate this currently is by using mosh in combination with a terminal multiplexer like screen or tmux. Lack of ssh-agent forwarding SSH-agent forwarding is not currently supported. Lack of X11 forwarding X11 forwarding is not yet supported. See also Block-oriented terminal tmux Secure Shell Command-line interface References
18773335
https://en.wikipedia.org/wiki/Microsoft%20litigation
Microsoft litigation
Microsoft has been involved in numerous high-profile legal disputes over the history of the company, including cases against the United States, the European Union, and competitors.

Governmental

In its 2008 annual report, Microsoft stated:

Antitrust

In the 1990s, Microsoft adopted exclusionary licensing under which PC manufacturers were required to pay for an MS-DOS license even when the system shipped with an alternative operating system. Critics contend that it also used predatory tactics to price its competitors out of the market and that Microsoft erected technical barriers to make it appear that competing products did not work on its operating system. In a consent decree filed on July 15, 1994, Microsoft agreed to a deal under which, among other things, the company would not make the sale of its operating systems conditional on the purchase of any other Microsoft product. On February 14, 1995, Judge Stanley Sporkin issued a 45-page opinion that the consent decree was not in the public interest. Later that spring, a three-judge federal appeals panel removed Sporkin and reassigned the consent decree. Judge Thomas Penfield Jackson entered the decree on August 21, 1995, three days before the launch of Windows 95. A Microsoft purchase of Intuit was scuttled in 1994 due to antitrust concerns that Microsoft would be purchasing a major competitor. After bundling the Internet Explorer web browser into its Windows operating system in the late 1990s (without requiring a separate purchase) and acquiring a dominant share in the web browser market, the antitrust case United States v. Microsoft was brought against the company. In a series of rulings by judge Thomas Penfield Jackson, the company was found to have violated its earlier consent decree and abused its monopoly in the desktop operating systems market.
The "findings of fact" during the antitrust case established that Microsoft has a monopoly in the PC desktop operating systems market: Viewed together, three main facts indicate that Microsoft enjoys monopoly power. First, Microsoft's share of the market for Intel-compatible PC operating systems is extremely large and stable. Second, Microsoft's dominant market share is protected by a high barrier to entry. Third, and largely as a result of that barrier, Microsoft's customers lack a commercially viable alternative to Windows. (III.34) The findings of fact go on to explain the nature of the "barrier to entry": The fact that there is a multitude of people using Windows makes the product more attractive to consumers. The large installed base ... impels ISVs (independent software vendors) to write applications first and foremost to Windows, thereby ensuring a large body of applications from which consumers can choose. The large body of applications thus reinforces demand for Windows, augmenting Microsoft's dominant position and thereby perpetuating ISV incentives to write applications principally for Windows ... The small or non-existent market share of an aspiring competitor makes it prohibitively expensive for the aspirant to develop its PC operating system into an acceptable substitute for Windows. (III.39–40) The proposed remedy (dividing Microsoft into two companies) was never applied. The judge who decided the original case was removed from the decision concerning the penalty due to public statements, and replaced by a judge more sympathetic to Microsoft. While new penalties were under consideration, the Clinton administration ended and the Bush administration took office. The new administration announced that in the interest of ending the case as quickly as possible, it would no longer seek to break the company up, and that it would stop investigating claims of illegal tying of products. 
Eighteen days later, Judge Kollar-Kotelly ordered the justice department and Microsoft to "engage in discussions seven days a week, 24 hours a day." The judge cited the events of September 11, 2001, in her direction to begin settlement talks but did not explain the linkage between the two. Attorney General Ashcroft, however, denied that the events of September 11 had any effect on the outcome. Microsoft subsequently reached a settlement with the Department of Justice and some of the states which brought suit against it. Several class-action lawsuits filed after the conviction are still pending. In early 2002, Microsoft proposed to settle the private lawsuits by donating $1 billion USD in money, software, services, and training, including Windows licenses and refurbished PCs, to about 12,500 underprivileged public schools. This was seen by the judge as a potential windfall for Microsoft, not only in educating schoolchildren on Microsoft solutions but also in flooding the market with Microsoft products. Among the protesters was Apple Inc., which feared further loss of its educational market share. The federal judge rejected the proposed settlement. From 2003 to 2004, the European Commission investigated the bundling of Windows Media Player into Windows, a practice which rivals complained was destroying the market for their own products. Negotiations between Microsoft and the Commission broke down in March 2004, and the company was subsequently handed a record fine of €497 million ($666 million) for its breaches of EU competition law. Separate investigations into alleged abuses of the server market were also ongoing at the same time. On December 22, 2004, the European Court decided that the measures imposed on Microsoft by the European Commission would not be delayed while Microsoft's appeal was pending, as the company had requested.
Microsoft has since paid a €497 million fine, shipped versions of Windows without Windows Media Player, and licensed many of the protocols used in its products to developers in countries within the European Economic Area. However, the European Commission has characterized the much delayed protocol licensing as unreasonable, called Microsoft "non-compliant" and still violating antitrust law in 2007, and said that its RAND terms were above market prices; in addition, it said the software patents covering the code "lack significant innovation", which Microsoft and the EC had agreed would determine licensing fees. Microsoft responded that other government agencies had found "considerable innovation". Microsoft appealed the facts and ruling to the European Court of First Instance, with hearings in September 2006. In 2000, a group of customers and businesses filed a class action suit in Comes v. Microsoft Corp., alleging that Microsoft violated Iowa's antitrust laws by engaging in monopolistic practices. In 2002, the Iowa Supreme Court ruled that indirect purchasers (consumers who purchased computers from a third party, with Microsoft's software pre-installed in the computer) could be included as members of the class in the class action suit. On remand, the trial court certified two classes of plaintiffs, and the Iowa Supreme Court ultimately affirmed the class certification. In August 2007, the parties ultimately reached a settlement valued at $179.95 million. On September 17, 2007, the EU Court of First Instance rejected Microsoft's appeal. The court affirmed the original contested finding: all elements of Microsoft's appeal were dismissed. Microsoft accepted the judgment of the Court of First Instance and proceeded to make available interoperability information as originally required by the European Commission.
Microsoft also faced competition law sanctions in South Korea: it was fined $32 million in December 2005 and ordered to unbundle instant messaging, Windows Media Player and Windows Media Service, or let competitors' products take their place. Microsoft noted in its October 2005 SEC filing that it might have to pull out of South Korea, although it later denied planning to do so. Microsoft's 2006 appeal was struck down; another appeal is pending. Microsoft also faced sanctions from the Japan Fair Trade Commission twice: in 1998, when Japanese manufacturers were forced to include Microsoft Word on new systems instead of the homegrown word processor software Ichitaro, and again in 2004 for clauses detrimental to the ability of Japanese computer manufacturers to obtain a Windows OEM license. On February 27, 2008, European antitrust regulators fined Microsoft $1.3 billion for failing to comply with a 2004 judgment that the company had abused its market dominance. The new fine by the European Commission was the largest it had ever imposed on an individual company, and brought the total in fines imposed on Microsoft to about US$2.5 billion, at current exchange rates. Microsoft had previously been fined after the commission determined in 2004 that the company had abused the dominance of its Windows operating system to gain unfair market advantage. The commission said it imposed the new fine because the company had not met the prescribed remedies after the earlier judgment.

European Union

The European Union Microsoft competition case is a case brought by the European Commission of the European Union (EU) against Microsoft for abuse of its dominant position in the market (according to competition law). It started as a complaint from Novell over Microsoft's licensing practices in 1993, and eventually resulted in the EU ordering Microsoft to divulge certain information about its server products and release a version of Microsoft Windows without Windows Media Player.
February 2008 fine

On February 27, 2008, the European Union (EU) competition commission announced its decision to fine the Microsoft Corporation €899 million (US$1.35 billion), approximately 1/10 of the company's net yearly earnings, for failing to comply with the 2004 antitrust order. The first decision in this antitrust case was given in 2004, citing that Microsoft withheld needed interoperability information from rival software companies, which prevented them from making software compatible with Windows. The commission ordered Microsoft to provide this information. Microsoft agreed to this, providing the information for royalty fees of 6.85% of the licensee's revenues for the product on grounds of innovation (specifically, 3.87% for the patent license and 2.98% for the information license). The EU found these royalty fees unreasonable and Microsoft was ordered to lower them. Microsoft complied, adjusting the royalty rates to 1.2% (changing the rates for the licenses to 0.7% and 0.5%, respectively) in the European Union, while keeping the rate the same for the rest of the world. The EU still saw this as an unreasonable rate, and Microsoft, two months after lowering the rates, reduced them yet again to a flat rate of €10,000 or a royalty of 0.4% applicable worldwide. Microsoft's royalty rates, which were deemed unreasonable for the 15-month period between June 21, 2006 and October 21, 2007, are the cause of the fine. So far, the EU has fined Microsoft €1.68 billion in 3 separate fines in this case. This fine will go towards the European Union annual budget. European Commissioner for Competition Neelie Kroes stated that the fine was "reasonable and proportionate," as the figure could have gone as high as €1.5 billion, the maximum that the EU commission can impose. She also said that it should act as "a signal to the outside world, and especially Microsoft, that they should stick to the rules" and that "Talk is cheap.
Flouting the rules is expensive." She also expressed hope that "today's decision closes a dark chapter in Microsoft's record of non-compliance with the Commission." It is not certain whether Microsoft will appeal this decision. A Microsoft spokesperson has stated that Microsoft will review this latest fine, citing that "The commission announced in October 2007 that Microsoft was in full compliance with the 2004 decision, so these fines are about the past issues that have been resolved." Microsoft's General Counsel Brad Smith commented, "It's clearly very important to us as a company that we comply with our obligations under European law. We will study this decision carefully, and if there are additional steps that we need to take in order to comply with it, we will take them." Microsoft had appealed against EU fines before, but all of those appeals were defeated. If Microsoft does not appeal the decision, the company will have 3 months (starting February 27) to pay the fine in full. The decision came after Microsoft announced, on Thursday, February 21, that it was disclosing 30,000 pages of previously secret software code. The EU competition commissioner commented that this move "does not necessarily equal a change in business practice."

Spanish antitrust investigation

In September 2011, the competition commission in Spain began an investigation into Microsoft's licence agreements, which prevent the transfer of Microsoft software to third parties.

United States

United States v. Microsoft Corp., 87 F. Supp. 2d 30 (D.D.C. 2000) was a set of consolidated civil actions filed against Microsoft Corporation on May 18, 1998 by the United States Department of Justice (DOJ) and twenty U.S. states. Joel I. Klein was the lead prosecutor. The plaintiffs alleged that Microsoft abused monopoly power in its handling of operating system sales and web browser sales.
The issue central to the case was whether Microsoft was allowed to bundle its flagship Internet Explorer (IE) web browser software with its Microsoft Windows operating system. Bundling them together is alleged to have been responsible for Microsoft's victory in the browser wars as every Windows user had a copy of Internet Explorer. It was further alleged that this unfairly restricted the market for competing web browsers (such as Netscape Navigator or Opera) that were slow to download over a modem or had to be purchased at a store. Underlying these disputes were questions over whether Microsoft altered or manipulated its application programming interfaces (APIs) to favor Internet Explorer over third party web browsers, Microsoft's conduct in forming restrictive licensing agreements with OEM computer manufacturers, and Microsoft's intent in its course of conduct. Microsoft stated that the merging of Microsoft Windows and Internet Explorer was the result of innovation and competition, that the two were now the same product and were inextricably linked together and that consumers were now getting all the benefits of IE for free. Those who opposed Microsoft's position countered that the browser was still a distinct and separate product which did not need to be tied to the operating system, since a separate version of Internet Explorer was available for Mac OS. They also asserted that IE was not really free because its development and marketing costs may have kept the price of Windows higher than it might otherwise have been. The case was tried before U.S. District Court Judge Thomas Penfield Jackson. The DOJ was initially represented by David Boies. On June 30, 2004, the U.S. appeals court unanimously approved the settlement with the Justice Department, rejecting objections that the sanctions were inadequate. 
Usage of Microsoft Office 365 and Teams in schools

On 16 July 2020, the Court of Justice of the European Union ruled that "it is illegal to send private data outside of EU to the US". Microsoft Office 365 has been banned from several schools in Europe over privacy concerns.

Other

In March 2004, during a consumer class-action lawsuit in Minnesota, internal documents subpoenaed from Microsoft revealed that the company had violated nondisclosure agreements seven years earlier in obtaining business plans from Go Corporation, using them to develop and announce a competing product named PenWindows, and convincing Intel to reduce its investment in Go. After Go was purchased by AT&T and Go's tablet-based computing efforts were shelved, PenWindows development was dropped. In May 2004, a class-action lawsuit accused Microsoft of overcharging customers in the state of California. The company settled the case for $1.1 billion, and a California court ordered Microsoft to pay an additional $258 million in legal fees (including over $3,000 per hour for the lead attorney in the case, more than $2,000 per hour for colleagues, and in excess of $1,000 per hour for administrative work). A Microsoft attorney responded, "Somebody ends up paying for this. These large fee awards get passed on to consumers." The total bill for legal fees was later reduced to just over $112 million. Because of the structure of the settlement, the law firm which sued Microsoft could end up getting more money from the company than California consumers and schools, the beneficiaries of the settlement. In 2006, Microsoft initiated an investigation of Lithuanian government institutions to determine whether they correctly choose long-term strategies for the software they use. The investigation, funded by Microsoft itself, was to be performed by Vilnius University together with the Lithuanian Institution of the Free Market, a think tank organization.
The investigation was initiated after the government started to prepare an 860 thousand litas project to encourage the use of open-source software. The vice-president of Microsoft, Vahe Torossian, stated that "the government should not be technologically subjectivist". Microsoft was also sued over the "Windows Vista Capable" logo, and in Iowa. Microsoft Word was also the subject of a court case. On July 12, 2013, Microsoft sued U.S. Customs and Border Protection over a Google phone ban. Homeland Security Secretary Janet Napolitano was also named in the lawsuit.

Private

Microsoft has also fought numerous legal battles against private companies. The most prominent ones are against:

Alcatel-Lucent, which won US$1.52 billion in a lawsuit which alleged that Microsoft had infringed its patents on playback of audio files. This ruling was overturned in a higher court.

Apple Inc. (known as Apple Computer, Inc. at the time), which accused Microsoft in the late 1980s of copying the "look and feel" of the graphical user interface of Apple's operating systems. The courts ruled in favor of Microsoft in 1994. Another suit by Apple accused Microsoft, along with Intel and the San Francisco Canyon Company, in 1995 of knowingly stealing several thousand lines of QuickTime source code in an effort to improve the performance of Video for Windows. After a threat to withdraw support for Office for Mac, this lawsuit was ultimately settled in 1997. Apple agreed to make Internet Explorer the default browser over Netscape, and Microsoft agreed to continue developing Office and other software for the Mac for the next 5 years, purchase $150 million of non-voting Apple stock, and make a quiet payoff estimated to be in the US$500 million–$2 billion range.

AOL, on behalf of its Netscape division. Netscape (as an independent company) also was involved in the United States v. Microsoft antitrust suit.
AtomicPark.com, which in 2009 was ordered to pay $1.2 million to Microsoft for selling unauthorized versions of Microsoft software.

Be Inc., which accused Microsoft of exclusionary and anticompetitive behavior intended to drive Be out of the market. Be even offered to license its Be Operating System (BeOS) for free to any PC vendors who would ship it pre-installed, but the vendors declined due to what Be believes were fears of pricing retaliation from Microsoft: by raising the price of Microsoft Windows for one particular PC vendor, Microsoft could price that vendor's PCs out of the market.

Bristol Technology, which accused Microsoft of illegally withholding Windows source code and using its dominant position with Windows to move into other markets. A ruling later ordered Microsoft to pay $1 million to Bristol Technologies (see also Windows Interface Source Environment).

Caldera, Inc., which in 1996 accused Microsoft of several anti-competitive practices, including vaporware announcements, creating FUD, exclusionary licensing and artificial tying. One of the claims concerned bundling and tying MS-DOS 7 and Windows 4 into a single product (Windows 95) for the sole purpose of eliminating competition, another concerned having modified Windows 3.1 so that it would not run on DR DOS 6.0 although there was no technical reason for it not to work. Several industry experts revealed that Microsoft put encrypted code, which became known as AARD code, in five otherwise unrelated Microsoft programs in order to prevent the functioning of DR DOS in pre-releases (beta versions) of Windows 3.1, and that it was technically possible to run Windows 4 on DR-DOS 7 after bypassing some new and non-essential interface code through WinGlue. In 2000, Microsoft settled out of court for an undisclosed sum, which in 2009 was revealed to be $280 million, and the Caldera evidence was destroyed in 2003.
Opera Software, which accused Microsoft of intentionally making its MSN service incompatible with the Opera browser on several occasions.

Sendo, which accused Microsoft of terminating their partnership so it could steal Sendo's technology to use in Pocket PC 2002 Phone Edition.

Spyglass, which licensed its browser to Microsoft in return for a percentage of each sale; Microsoft turned the browser into Internet Explorer and bundled it with Windows, giving it away to gain market share but effectively destroying any chance of Spyglass making money from the deal it had signed with Microsoft; Spyglass sued for deception and won an $8 million settlement.

Stac Electronics, which accused Microsoft of stealing its data compression code and using it in MS-DOS 6.0. Microsoft eventually lost the subsequent Stac v. Microsoft lawsuit and was ordered by a federal court to pay roughly $120 million in compensation.

Sun Microsystems, which held Microsoft in violation of contract for including a modified version of Java in Microsoft Windows that provided Windows-specific extensions to Sun's Java language; Microsoft lost this decision in court and was forced to stop shipping its Windows-specific Java virtual machine. Microsoft eventually ceased to include any Java Virtual Machine in Windows, and Windows users who require a Java Virtual Machine need to download the software or otherwise acquire a copy from a source other than Microsoft.

Zhongyi Electronic, which, having licensed two fonts which it had designed to Microsoft for use only in Windows 95, filed suit in China in April 2007, accusing Microsoft of using those fonts in the subsequent Windows 98, 2000, XP, Server 2003 and four other Chinese-language Windows operating systems. Beijing's No. 1 intermediate people's court ruled on November 16, 2009, that Microsoft violated the scope of licensing agreements between the two companies.
The result of the verdict is that Microsoft has to stop selling Chinese-language versions of the aforementioned operating systems. Microsoft said it will appeal. One of the fonts in question may be SimSun. Many other smaller companies have filed patent abuse and predatory practice suits against Microsoft.

Patents

Alcatel-Lucent

The dispute between Microsoft and Lucent (and later Alcatel-Lucent) began in 2003 when Lucent Technologies (acquired by Alcatel in 2006) filed suit against Gateway in the U.S. District Court for the Southern District of California in San Diego. Lucent also sued Dell in the U.S. District Court for the Eastern District of Virginia; soon thereafter, that court transferred the Dell case to San Diego, where it was consolidated with the case against Gateway. Lucent claimed in this first San Diego case that Dell and Gateway had violated patents on MP3-related technologies developed by Bell Labs, a division of predecessor company American Telephone & Telegraph. Other patents said to be infringed relate to MPEG video technology, speech technology, internet technology, and other technologies. Microsoft intervened in the lawsuit in April 2003, and Alcatel was added after it acquired Lucent. After the first San Diego lawsuit was filed, Microsoft and Lucent filed additional patent lawsuits against each other. In February 2007, Microsoft filed a lawsuit at the International Trade Commission claiming that Alcatel-Lucent infringed its patents. There is a second case in San Diego where Microsoft is asserting that Alcatel-Lucent infringes 10 of its patents, and yet another case in Texas where each alleges that the other is infringing its patents.

Burst.com

Burst.com claims that Microsoft stole Burst's patented technology for delivering high speed streaming sound and video content on the internet. Also at issue in the case is a 35-week period of missing emails in the evidence Microsoft handed over to Burst, which was discovered by Burst.com's lawyers.
Burst accuses Microsoft of crafting a 30-day email deletion policy specifically to cover up illegal activity. Microsoft settled with the company for $60 million in exchange for an agreement to license some of the company's technologies.

Eolas

Eolas and the University of California, which accused Microsoft of using some of its software patents in their web browser, won $521 million in court; however, Eolas' patents were invalidated in 2012.

SurfCast

SurfCast is suing Microsoft for infringing its patent on Live Tiles.

Copyrights

Apple

Apple Computer Inc. v. Microsoft Corporation, 35 F.3d 1435 (9th Cir. 1994) was a copyright infringement lawsuit in which Apple Computer, Inc. (now Apple Inc.) sought to prevent Microsoft Corporation and Hewlett-Packard from using visual graphical user interface (GUI) elements that were similar to those in Apple's Lisa and Macintosh operating systems. Some critics claimed that Apple was really attempting to gain all intellectual property rights over the desktop metaphor for computer interfaces, and perhaps all GUIs, on personal computers. Apple lost all claims in the lawsuit, except that the court ruled that the "trash can" icon and file folder icons from Hewlett-Packard's now-forgotten NewWave windows application were infringing. The lawsuit was filed in 1988 and lasted four years; the decision was affirmed on appeal in 1994, and Apple's appeal to the U.S. Supreme Court was denied.

Trademarks

Lindows

Microsoft v. Lindows.com, Inc. was a court case brought on December 20, 2001, by Microsoft against Lindows, Inc., claiming that the name "Lindows" was a violation of its trademark "Windows". In addition to the United States, Microsoft has also sued Lindows in Sweden, France, Belgium, Luxembourg, the Netherlands and Canada. Michael Robertson has called this situation "Sextuple Jeopardy", an extension of the term double jeopardy.
In response to these lawsuits, Lindows launched ChoicePC.com, which allows people to purchase lifetime Lindows memberships that include a free copy of LindowsOS, free LindowsOS upgrades for life, and a ChoicePC.com T-shirt, for US$100. All money from the memberships goes towards helping Lindows in its legal battle against Microsoft.

MikeRoweSoft

In a legal dispute, Microsoft sued a Canadian high school student named Mike Rowe over the domain name MikeRoweSoft.com. The case received international press attention following Microsoft's perceived heavy-handed approach to a 12th grade student's part-time web design business and the subsequent support that Rowe received from the online community. A settlement was eventually agreed, with Rowe granting ownership of the domain to Microsoft in return for training and gifts. The domain MikeRoweSoft.com still redirects to microsoft.com.

Shah

Microsoft sued several parties for contributory cybersquatting—that is, encouraging others (through software and instructional videos) to cybersquat on domain names that infringed on Microsoft's trademarks. Microsoft prevailed in court and also established a precedent that liabilities under the Anticybersquatting Consumer Protection Act (ACPA) include contributory trademark infringement.

Windows Commander

From 1993 until 2002, Total Commander was called Windows Commander; the name was changed in 2002, out of fear of a lawsuit after the developers received a letter from Microsoft pointing out that the word "Windows" was trademarked by Microsoft.

wxWindows

The wxWindows project was renamed to wxWidgets in September 2003 out of fear of a lawsuit after the founding developer Julian Smart received a letter from Microsoft pointing out that "Windows" is a UK trademark owned by Microsoft.

Microwindows

The Microwindows project was renamed to the Nano-X Window System in January 2005, due to legal threats from Microsoft regarding the Windows trademark.

Other

Microsoft v.
Internal Revenue Service

Xbox 360

Microsoft has been accused of deceiving consumers by concealing the high failure rate of its Xbox 360 game console. A woman from California sued Microsoft in October 2008 in Superior Court in Sacramento County, stating that the company violated multiple state consumer-protection and unfair-competition laws. The woman alleged that the company continued to sell the Xbox 360 even though it knew that the console's hardware was likely to fail.

References

External links

Microsoft On The Issues
Microsoft Litigation Resource Page
Microsoft case could make EU 'litigation capital of the world'

History of Microsoft
Computer case law
1788660
https://en.wikipedia.org/wiki/Anaglyph%203D
Anaglyph 3D
Anaglyph 3D is the stereoscopic 3D effect achieved by means of encoding each eye's image using filters of different (usually chromatically opposite) colors, typically red and cyan. Anaglyph 3D images contain two differently filtered colored images, one for each eye. When viewed through the "color-coded" "anaglyph glasses", each of the two images reaches the eye it's intended for, revealing an integrated stereoscopic image. The visual cortex of the brain fuses this into the perception of a three-dimensional scene or composition. Anaglyph images have seen a recent resurgence due to the presentation of images and video on the Web, Blu-ray Discs, CDs, and even in print. Low-cost paper frames or plastic-framed glasses hold accurate color filters that typically, after 2002, make use of all 3 primary colors. The current norm is red and cyan, with red being used for the left channel. The cheaper filter material used in the monochromatic past dictated red and blue for convenience and cost. The cyan filter brings a material improvement in full-color images, especially for accurate skin tones. Video games, theatrical films, and DVDs can be shown in the anaglyph 3D process. Practical images, for science or design, where depth perception is useful, include the presentation of full-scale and microscopic stereographic images. Examples from NASA include Mars Rover imaging, and the solar investigation called STEREO, which uses two orbital vehicles to obtain 3D images of the sun. Other applications include geological illustrations by the United States Geological Survey, and various online museum objects. A recent application is stereo imaging of the heart using 3D ultrasound with plastic red/cyan glasses. Anaglyph images are much easier to view than either parallel (diverging) or crossed-view pairs of stereograms. However, these side-by-side types offer bright and accurate color rendering, not easily achieved with anaglyphs.
Also, extended use of the "color-coded" "anaglyph glasses" can cause discomfort, and the afterimage caused by the colors of the glasses may temporarily affect the viewer's visual perception of real-life objects. Recently, cross-view prismatic glasses with adjustable masking have appeared that offer a wider image on the new HD video and computer monitors.

History

The oldest known description of anaglyph images was written in August 1853 by W. Rollmann in Stargard about his "Farbenstereoscope" (color stereoscope). He had the best results viewing a yellow/blue drawing with red/blue glasses. Rollmann found that with a red/blue drawing the red lines were not as distinct as yellow lines through the blue glass. In 1858, in France, a report was delivered to l'Académie des sciences describing how to project three-dimensional magic lantern slide shows using red and green filters to an audience wearing red and green goggles. Its author was subsequently chronicled as being responsible for the first realisation of 3D images using anaglyphs. Louis Ducos du Hauron produced the first printed anaglyphs in 1891. This process consisted of printing the two negatives which form a stereoscopic photograph on to the same paper, one in blue (or green), one in red. The viewer would then use colored glasses with red (for the left eye) and blue or green (for the right eye). The left eye would see the blue image, which would appear black, whilst it would not see the red; similarly the right eye would see the red image, this registering as black. Thus a three-dimensional image would result. William Friese-Greene created the first three-dimensional anaglyphic motion pictures in 1889, which had public exhibition in 1893. 3-D films enjoyed something of a boom in the 1920s. The term "3-D" was coined in the 1950s. As late as 1954, films such as Creature from the Black Lagoon remained very successful.
Originally shot and exhibited using the Polaroid system, Creature from the Black Lagoon was successfully reissued much later in an anaglyph format so it could be shown in cinemas without the need for special equipment. In 1953, the anaglyph had begun appearing in newspapers, magazines and comic books. The 3-D comic books were one of the most interesting applications of anaglyph to printing. Over the years, anaglyphic pictures have sporadically appeared in comics and magazine ads. Although not anaglyphic, Jaws 3-D was a box-office success in 1983. At present the excellent quality of computer displays and user-friendly stereo-editing programs offer new and exciting possibilities for experimenting with anaglyph stereo. Production Anaglyph from stereo pairs A stereo pair is a pair of images taken from slightly different perspectives at the same time. Objects closer to the camera(s) have greater differences in appearance and position within the image frames than objects further from the camera. Historically, cameras captured two color-filtered images from the perspective of the left and right eyes which were projected or printed together as a single image, one side through a red filter and the other side through a contrasting color such as blue or green or mixed cyan. As outlined below, one may now, typically, use an image-processing computer program to simulate the effect of using color filters, using as a source image a pair of either color or monochrome images. In the 1970s filmmaker Stephen Gibson filmed direct anaglyph blaxploitation and adult movies. His "Deep Vision" system replaced the original camera lens with two color-filtered lenses focused on the same film frame. In the 1980s, Gibson patented his mechanism. Many computer graphics programs provide the basic tools (typically layering and adjustments to individual color channels to filter colors) required to prepare anaglyphs from stereo pairs. 
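In outline, the layering operation such tools perform for a red/cyan anaglyph can be sketched in a few lines. This is a minimal pure-Python illustration in which images are modeled as flat lists of (R, G, B) tuples; real tools operate on image files and handle registration, cropping and color adjustment as well.

```python
def make_anaglyph(left, right):
    """Combine a registered stereo pair into one red/cyan anaglyph:
    red channel from the left-eye image, green and blue from the right."""
    return [
        (l_px[0], r_px[1], r_px[2])  # R from left; G, B from right
        for l_px, r_px in zip(left, right)
    ]

# Two-pixel example: white stays white; a red-tinted left pixel and a
# cyan-tinted right pixel combine into one full-color anaglyph pixel.
left_img = [(255, 255, 255), (200, 0, 0)]
right_img = [(255, 255, 255), (0, 120, 255)]
print(make_anaglyph(left_img, right_img))  # [(255, 255, 255), (200, 120, 255)]
```

Viewed through the glasses, the red channel is then seen only by the left eye and the green/blue channels only by the right.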
In simple practice, the left eye image is filtered to remove blue & green. The right eye image is filtered to remove red. The two images are usually positioned in the compositing phase in close overlay registration (of the main subject). Plugins for some of these programs as well as programs dedicated to anaglyph preparation are available which automate the process and require the user to choose only a few basic settings. Stereo conversion (single 2D image to 3D) There also exist methods for making anaglyphs using only one image, a process called stereo conversion. In one, individual elements of a picture are horizontally offset in one layer by differing amounts with elements offset further having greater apparent changes in depth (either forward or back depending on whether the offset is to the left or right). This produces images that tend to look like elements are flat standees arranged at various distances from the viewer similar to cartoon images in a View-Master. A more sophisticated method involves use of a depth map (a false color image where color indicates distance, for example, a grayscale depth map could have lighter indicate an object closer to the viewer and darker indicate an object further away). As for preparing anaglyphs from stereo pairs, stand-alone software and plug-ins for some graphics apps exist which automate production of anaglyphs (and stereograms) from a single image or from an image and its corresponding depth map. As well as fully automatic methods of calculating depth maps (which may be more or less successful), depth maps can be drawn entirely by hand. Also developed are methods of producing depth maps from sparse or less accurate depth maps. A sparse depth map is a depth map consisting of only a relatively few lines or areas which guides the production of the full depth map. Use of a sparse depth map can help overcome auto-generation limitations. 
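A minimal sketch of the depth-map idea, assuming a grayscale map in which lighter values mean closer: each pixel is copied into synthetic left and right views with a horizontal offset proportional to its depth value. Real converters do far more (hole filling, sub-pixel shifts), but the core displacement is just this.

```python
def views_from_depth(row, depth, max_offset=1):
    """Build left/right view rows from one image row and its depth-map row
    (depth 0 = far, 255 = near). Unfilled positions are left as None."""
    width = len(row)
    left, right = [None] * width, [None] * width
    for x, (px, d) in enumerate(zip(row, depth)):
        shift = round(max_offset * d / 255)  # lighter (closer) shifts further
        if 0 <= x + shift < width:
            left[x + shift] = px   # later writes can occlude earlier pixels
        if 0 <= x - shift < width:
            right[x - shift] = px
    return left, right

row = ["a", "b", "c"]   # letters stand in for pixel colors
depth = [0, 0, 255]     # only "c" is close to the viewer
left, right = views_from_depth(row, depth)
print(left)   # ['a', 'b', None] -- "c" was pushed off the right edge
print(right)  # ['a', 'c', None] -- "c" displaced left, occluding "b"; a hole remains
```

The None holes are exactly the disocclusions that make automatic 2D-to-3D conversion hard, which is why hand-drawn or sparse depth maps are often used to guide the result.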
For example, if a depth finding algorithm takes cues from image brightness, an area of shadow in the foreground may be incorrectly assigned as background. This misassignment is overcome by assigning the shaded area a close value in the sparse depth map. Mechanics Viewing anaglyphs through spectrally opposed glasses or gel filters enables each eye to see independent left and right images from within a single anaglyphic image. Red-cyan filters can be employed because our vision processing systems use red and cyan comparisons, as well as blue and yellow, to determine the color and contours of objects. In a red-cyan anaglyph, the eye viewing through the red filter sees red within the anaglyph as "white", and the cyan within the anaglyph as "black". The eye viewing through the cyan filter perceives the opposite. Actual black or white in the anaglyph display, being void of color, is perceived the same by each eye. The brain blends together the red and cyan channeled images as in regular viewing, but only green and blue are perceived. Red is not perceived, because red equates with white through the red gel and is black through the cyan gel, while green and blue are perceived through the cyan gel. Types Complementary color Complementary color anaglyphs employ one of a pair of complementary color filters for each eye. The most common color filters used are red and cyan. Under tristimulus theory, the eye is sensitive to three primary colors: red, green, and blue. The red filter admits only red, while the cyan filter blocks red, passing blue and green (the combination of blue and green is perceived as cyan). If a paper viewer containing red and cyan filters is folded so that light passes through both, the image will appear black. Another recently introduced form employs blue and yellow filters. (Yellow is the color perceived when both red and green light passes through the filter.) Anaglyph images have seen a recent resurgence because of the presentation of images on the Internet. 
Traditionally this has been a largely black-and-white format, but recent digital camera and processing advances have brought very acceptable color images to the internet and DVD field. With the online availability of low-cost paper glasses with improved red-cyan filters, and plastic-framed glasses of increasing quality, the field of 3D imaging is growing quickly. Scientific images where depth perception is useful include, for instance, the presentation of complex multi-dimensional data sets and stereographic images of the surface of Mars. With the recent release of 3D DVDs, they are more commonly being used for entertainment. Anaglyph images are much easier to view than either parallel sighting or crossed-eye stereograms, although these types do offer brighter and more accurate color rendering, most particularly in the red component, which is commonly muted or desaturated with even the best color anaglyphs. A compensating technique, commonly known as Anachrome, uses a slightly more transparent cyan filter in the patented glasses associated with the technique. Processing reconfigures the typical anaglyph image to have less parallax to obtain a more useful image when viewed without filters. Compensating focus diopter glasses for red-cyan method Simple sheet or uncorrected molded glasses do not compensate for the 250-nanometer difference in the wavelengths of the red-cyan filters. With simple glasses, the red-filter image can be blurry when viewing a close computer screen or printed image, since the retinal focus differs from that of the cyan-filtered image, which dominates the eyes' focusing. Better-quality molded plastic glasses employ a compensating differential diopter power to equalize the red filter focus shift relative to the cyan. The direct view focus on computer monitors has recently been improved by manufacturers providing secondary paired lenses, fitted and attached inside the red-cyan primary filters of some high-end anaglyph glasses. 
They are used where very high resolution is required, including science, stereo macros, and animation studio applications. They use carefully balanced cyan (blue-green) acrylic lenses, which pass a minute percentage of red to improve skin tone perception. Simple red/blue glasses work well with black and white, but the blue filter is unsuitable for human skin in color. U.S. Patent No. 6,561,646 was issued to the inventor in 2003. In the trade, the label "www.anachrome" is used to mark diopter-corrected 3D glasses covered by this patent. (ACB) 3-D (ACB) 'Anaglyphic Contrast Balance' is a patented anaglyphic production method by Studio 555. It addresses the retinal rivalry caused by color contrasts within the color channels of anaglyph images. Contrasts and details from the stereo pair are maintained and re-presented for view within the anaglyph image. The (ACB) method of balancing the color contrasts within the stereo pair enables a stable view of contrast details, thus eliminating retinal rivalry. The process is available for red/cyan color channels but may use any of the opposing color channel combinations. As with all stereoscopic anaglyphic systems, screen or print, the display color should be RGB-accurate and the viewing gels should match the color channels to prevent double imaging. The basic (ACB) method adjusts red, green and blue, but adjusting all six color primaries is preferred. The effectiveness of the (ACB) process is proven with the inclusion of primary color charts within a stereo pair. A contrast-balanced view of the stereo pair and color charts is evident in the resulting (ACB) processed anaglyph image. The (ACB) process also enables black and white (monochromatic) anaglyphs with contrast balance. Where full color to each eye is enabled via alternating color channels and color-alternating viewing filters, (ACB) prevents shimmer from pure-colored objects within the modulating image. 
Vertical and diagonal parallax is enabled with concurrent use of a horizontally oriented lenticular or parallax barrier screen. This enables a quadrascopic full-color holographic effect from a monitor. ColorCode 3-D ColorCode 3-D was deployed in the 2000s and uses amber and blue filters. It is intended to provide the perception of nearly full-color viewing (particularly within the RG color space) with existing television and paint mediums. One eye (left, amber filter) receives the cross-spectrum color information and one eye (right, blue filter) sees a monochrome image designed to give the depth effect. The human brain ties both images together. Images viewed without filters will tend to exhibit light-blue and yellow horizontal fringing. The backwards-compatible 2D viewing experience for viewers not wearing glasses is improved, generally being better than previous red and green anaglyph imaging systems, and further improved by the use of digital post-processing to minimize fringing. The displayed hues and intensity can be subtly adjusted to further improve the perceived 2D image, with problems only generally found in the case of extreme blue. The blue filter is centered around 450 nm and the amber filter lets in light at wavelengths above 500 nm. Wide-spectrum color is possible because the amber filter lets through light across most wavelengths in the spectrum and even has a small leakage of the blue color spectrum. When presented, the original left and right images are run through the ColorCode 3-D encoding process to generate one single ColorCode 3-D encoded image. In the United Kingdom, television station Channel 4 commenced broadcasting a series of programs encoded using the system during the week of November 16, 2009. Previously the system had been used in the United States for an "all 3-D advertisement" during the 2009 Super Bowl for SoBe, the Monsters vs. 
Aliens animated movie and an advertisement for the Chuck television series, in which the full episode the following night used the format. Inficolor 3D Developed by TriOviz, Inficolor 3D is a patent-pending stereoscopic system, first demonstrated at the International Broadcasting Convention in 2007 and deployed in 2010. It works with traditional 2D flat panels and HDTV sets and uses inexpensive glasses with complex color filters and dedicated image processing that allow natural color perception with a 3D experience. This is achieved by having the left image use the green channel only and the right use the red and blue channels with some added post-processing; the brain then combines the two images to produce a nearly full-color experience. When observed without glasses, some slight doubling can be noticed in the background of the action, which allows watching the movie or the video game in 2D without the glasses. This is not possible with traditional brute-force anaglyphic systems. Inficolor 3D is a part of TriOviz for Games Technology, developed in partnership with TriOviz Labs and Darkworks Studio. It works with Sony PlayStation 3 (Official PlayStation 3 Tools & Middleware Licensee Program) and Microsoft Xbox 360 consoles as well as PC. TriOviz for Games Technology was showcased at Electronic Entertainment Expo 2010 by Mark Rein (vice-president of Epic Games) as a 3D tech demo running on an Xbox 360 with Gears of War 2. In October 2010 this technology was officially integrated into Unreal Engine 3, the computer game engine developed by Epic Games. Video games equipped with TriOviz for Games Technology are: Batman Arkham Asylum: Game of the Year Edition for PS3 and Xbox 360 (March 2010), Enslaved: Odyssey to the West + DLC Pigsy's Perfect 10 for PS3 and Xbox 360 (Nov. 
2010), Thor: God of Thunder for PS3 and Xbox 360 (May 2011), Green Lantern: Rise of the Manhunters for PS3 and Xbox 360 (June 2011), Captain America: Super Soldier for PS3 and Xbox 360 (July 2011), Gears of War 3 for Xbox 360 (September 2011), Batman: Arkham City for PS3 and Xbox 360 (October 2011), Assassin's Creed: Revelations for PS3 and Xbox 360 (November 2011), and Assassin's Creed III for Wii U (November 2012). The first DVD/Blu-ray including Inficolor 3D Tech is Battle for Terra 3D (published in France by Pathé & Studio 37 - 2010). Most other games can be played in this format with TriDef 3D with display settings set to Colored Glasses > Green/Purple; although this is not officially supported by TriOviz, the results are nearly identical without limiting the game selection. Anachrome red/cyan filters A variation on the anaglyph technique from the early 2000s is called the "Anachrome method". This approach is an attempt to provide images that look nearly normal without glasses, for small images, either 2D or 3D, with most of the negative qualities being masked innately by the small display, making the images "compatible" for small-size posting on conventional websites or in magazines. Usually a larger file can be selected that will fully present the 3D with dramatic definition. The 3D (Z-axis) depth effect is generally more subtle than that of simple anaglyph images, which are usually made from wider-spaced stereo pairs. Anachrome images are shot with a typically narrower stereo base (the distance between the camera lenses). Pains are taken to adjust for a better overlay fit of the two images, which are layered one on top of another. Only a few pixels of non-registration give the depth cues. The range of color perceived is noticeably wider in Anachrome images when viewed with the intended filters. This is due to the deliberate passage of a small amount (1 to 2%) of the red information through the cyan filter. 
Warmer tones can be boosted, because each eye sees some color reference to red. The brain responds in the mental blending process and usual perception. It is claimed to provide warmer and more complex perceived skin tones and vividness. Interference filter systems This technique uses specific wavelengths of red, green, and blue for the right eye, and different wavelengths of red, green, and blue for the left eye. Eyeglasses which filter out the very specific wavelengths allow the wearer to see a full-color 3D image. Special interference filters (dichroic filters) in the glasses and in the projector form the main item of technology and have given the system this name. It is also known as spectral comb filtering or wavelength multiplex visualization. Sometimes this technique is described as a "super-anaglyph" because it is an advanced form of spectral multiplexing, which is at the heart of the conventional anaglyph technique. This technology eliminates the expensive silver screens required for polarized systems such as RealD, which is the most common 3D display system in theaters. It does, however, require much more expensive glasses than the polarized systems. Dolby 3D uses this principle. The filters divide the visible color spectrum into six narrow bands – two in the red region, two in the green region, and two in the blue region (called R1, R2, G1, G2, B1 and B2 for purposes of this description). The R1, G1 and B1 bands are used for one eye's image, and R2, G2 and B2 for the other. The human eye is largely insensitive to such fine spectral differences, so this technique is able to generate full-color 3D images with only slight color differences between the two eyes. The Omega 3D/Panavision 3D system also used this technology, though with a wider spectrum and more "teeth" to the "comb" (5 for each eye in the Omega/Panavision system). The use of more spectral bands per eye eliminates the need to color-process the image, required by the Dolby system. 
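The six-band separation can be sketched as a simple wavelength classifier. The band edges below are invented for illustration only (the real Dolby/Infitec band positions are proprietary), but the principle is the same: each eye's filter passes one narrow band in each of the blue, green and red regions, and everything between the bands is blocked.

```python
# Hypothetical band edges in nanometers, one triple per eye.
LEFT_BANDS = [(440, 450), (520, 530), (620, 630)]    # B1, G1, R1 (illustrative)
RIGHT_BANDS = [(465, 475), (545, 555), (650, 660)]   # B2, G2, R2 (illustrative)

def eye_for_wavelength(nm):
    """Return which eye's filter passes light of the given wavelength."""
    if any(lo <= nm <= hi for lo, hi in LEFT_BANDS):
        return "left"
    if any(lo <= nm <= hi for lo, hi in RIGHT_BANDS):
        return "right"
    return "blocked"

print(eye_for_wavelength(625))  # left  (falls inside the R1 band)
print(eye_for_wavelength(652))  # right (falls inside the R2 band)
print(eye_for_wavelength(600))  # blocked (between bands)
```

Because both eyes still receive a red, a green and a blue band, each perceives a full-color image, unlike the complementary-color anaglyphs described earlier.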
Evenly dividing the visible spectrum between the eyes gives the viewer a more relaxed "feel" as the light energy and color balance is nearly 50-50. Like the Dolby system, the Omega system can be used with white or silver screens. But it can be used with either film or digital projectors, unlike the Dolby filters that are only used on a digital system with a color correcting processor provided by Dolby. The Omega/Panavision system also claims that their glasses are cheaper to manufacture than those used by Dolby. In June 2012 the Omega 3D/Panavision 3D system was discontinued by DPVO Theatrical, who marketed it on behalf of Panavision, citing "challenging global economic and 3D market conditions". Although DPVO dissolved its business operations, Omega Optical continues promoting and selling 3D systems to non-theatrical markets. Omega Optical’s 3D system contains projection filters and 3D glasses. In addition to the passive stereoscopic 3D system, Omega Optical has produced enhanced anaglyph 3D glasses. The Omega’s red/cyan anaglyph glasses use complex metal oxide thin film coatings and high quality annealed glass optics. Viewing A pair of glasses, with filters of opposing colors, is worn to view an anaglyphic photo image. A red filter lens over the left eye allows graduations of red to cyan from within the anaglyph to be perceived as graduations of bright to dark. The cyan (blue/green) filter over the right eye conversely allows graduations of cyan to red from within the anaglyph to be perceived as graduations of bright to dark. Red and cyan color fringes in the anaglyph display represent the red and cyan color channels of the parallax-displaced left and right images. The viewing filters each cancel out opposing colored areas, including graduations of less pure opposing colored areas, to each reveal an image from within its color channel. Thus the filters enable each eye to see only its intended view from color channels within the single anaglyphic image. 
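The cancellation described above can be modeled with a toy filter function, treating each gel as passing only its own primaries. Stacking both filters (as with a folded paper viewer) transmits nothing, which is why the view appears black.

```python
def through_red(pixel):
    """What the eye behind the red gel receives: only the red component."""
    r, g, b = pixel
    return (r, 0, 0)

def through_cyan(pixel):
    """What the eye behind the cyan gel receives: only green and blue."""
    r, g, b = pixel
    return (0, g, b)

white = (255, 255, 255)
print(through_red(white))                # (255, 0, 0)
print(through_cyan(white))               # (0, 255, 255)
print(through_cyan(through_red(white)))  # (0, 0, 0) -- stacked filters pass nothing
```

Applied to an anaglyph pixel, each function strips out the channels meant for the other eye, which is the mechanism by which a single image carries two views.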
Red-green glasses are also usable, but may give the viewer a more strongly colored view, since red and green are not complementary colors. Red sharpened anaglyph glasses Simple uncorrected paper gel glasses cannot compensate for the 250-nanometer difference in the wavelengths of the red-cyan filters. With simple glasses, the red-filtered image is somewhat blurry when viewing a close computer screen or printed image. The red retinal focus differs from that of the image through the cyan filter, which dominates the eyes' focusing. Better-quality molded acrylic glasses frequently employ a compensating differential diopter power (a spherical correction) to balance the red filter focus shift relative to the cyan, which reduces the innate softness and diffraction of red-filtered light. Low-power reading glasses worn along with the paper glasses also sharpen the image noticeably. The correction is only about +0.5 diopter on the red lens. However, some people with corrective glasses are bothered by the difference in lens diopters, as one image is at a slightly larger magnification than the other. Though endorsed by many 3D websites, the diopter "fix" effect is still somewhat controversial. Some, especially the nearsighted, find it uncomfortable. There is about a 400% improvement in acuity with a molded diopter filter, and a noticeable improvement of contrast and blackness. The American Amblyopia Foundation uses this feature in their plastic glasses for school screening of children's vision, judging the greater clarity as a significant plus factor. Anachrome filters Plastic glasses, developed in recent years, provide both the diopter "fix" noted above and a change in the cyan filter. The formula provides intentional "leakage" of a minimal (2%) percentage of red light within the conventional range of the filter. This assigns two-eyed "redness cues" to objects and details, such as lip color and red clothing, that are fused in the brain. 
Care must be taken, however, to closely overlay the red areas into near-perfect registration, or "ghosting" can occur. Anachrome formula lenses work well with black and white, but can provide excellent results when the glasses are used with conforming "anachrome friendly" images. The US Geological Survey has thousands of these "conforming" full-color images, which depict the geology and scenic features of the U.S. National Park system. By convention, anachrome images try to avoid excess separation of the cameras and parallax, thereby reducing the ghosting that the extra color bandwidth introduces to the images. Traditional anaglyph processing methods One monochromatic method uses a stereo pair available as a digitized image, along with access to general-purpose image-processing software. In this method, the images are run through a series of processes and saved in an appropriate transmission and viewing format such as JPEG. Several computer programs will create color anaglyphs without Adobe Photoshop, or a traditional, more complex compositing method can be used with Photoshop. Using color information, it is possible to obtain reasonable (but not accurate) blue sky, green vegetation, and appropriate skin tones. Color information appears disruptive when used for brightly colored and/or high-contrast objects such as signs, toys, and patterned clothing when these contain colors that are close to red or cyan. Only a few color anaglyphic processes, e.g. interference filter systems used for Dolby 3D, can reconstruct full-color 3D images. However, other stereo display methods can easily reproduce full-color photos or movies, e.g. active shutter 3D or polarized 3D systems. Such processes allow better viewing comfort than most limited-color anaglyphic methods. According to entertainment trade papers, 3D films had a revival in recent years and 3D is now also used in 3D television. 
Depth adjustment The adjustment suggested in this section is applicable to any type of stereogram but is particularly appropriate when anaglyphed images are to be viewed on a computer screen or on printed matter. Those portions of the left and right images that are coincident will appear to be at the surface of the screen. Depending upon the subject matter and the composition of the image it may be appropriate to make this align to something slightly behind the nearest point of the principal subject (as when imaging a portrait). This will cause the near points of the subject to "pop out" from the screen. For best effect, any portions of a figure to be imaged forward of the screen surface should not intercept the image boundary, as this can lead to a discomforting "amputated" appearance. It is of course possible to create a three-dimensional "pop out" frame surrounding the subject in order to avoid this condition. If the subject matter is a landscape, you may consider putting the frontmost object at or slightly behind the surface of the screen. This will cause the subject to be framed by the window boundary and recede into the distance. Once the adjustment is made, trim the picture to contain only the portions containing both left and right images. In the example shown above, the upper image appears (in a visually disruptive manner) to spill out from the screen, with the distant mountains appearing at the surface of the screen. In the lower modification of this image the red channel has been translated horizontally to bring the images of the nearest rocks into coincidence (and thus appearing at the surface of the screen) and the distant mountains now appear to recede into the image. This latter adjusted image appears more natural, appearing as a view through a window onto the landscape. 
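The adjustment described above amounts to a horizontal translation of one color channel. A minimal sketch, with rows modeled as lists of (R, G, B) tuples: vacated pixels are set to 0, and the untrimmed edges would then be cropped as the text suggests.

```python
def shift_red(row, offset):
    """Shift the red channel of one image row by `offset` pixels
    (positive = right), leaving green and blue in place."""
    reds = [px[0] for px in row]
    if offset > 0:
        reds = [0] * offset + reds[:-offset]
    elif offset < 0:
        reds = reds[-offset:] + [0] * (-offset)
    return [(r, px[1], px[2]) for r, px in zip(reds, row)]

row = [(10, 1, 1), (20, 2, 2), (30, 3, 3)]
print(shift_red(row, 1))   # [(0, 1, 1), (10, 2, 2), (20, 3, 3)]
print(shift_red(row, -1))  # [(20, 1, 1), (30, 2, 2), (0, 3, 3)]
```

Shifting in one direction moves the plane of coincidence (the apparent screen surface) toward the viewer; shifting the other way pushes it back, which is exactly the choice between "pop out" and window-framed compositions.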
Scene composition Dual purpose, 2D or 3D "compatible anaglyph" technique Since the advent of the Internet, a variant technique has developed where the images are specially processed to minimize visible mis-registration of the two layers. This technique is known by various names; the most common, associated with diopter glasses and warmer skin tones, is Anachrome. The technique allows most images to be used as large thumbnails, while the 3D information is encoded into the image with less parallax than conventional anaglyphs. Anaglyphic color channels Anaglyph images may use any combination of color channels. However, if a stereoscopic image is to be pursued, the colors should be diametrically opposed. Impurities of color channel display, or of the viewing filters, allow some of the image meant for the other channel to be seen. This results in stereoscopic double imaging, also called ghosting. Color channels may be left-right reversed. Red/cyan is most common; magenta/green and blue/yellow are also popular. Red/green and red/blue enable monochromatic images, especially red/green. Many anaglyph makers purposely integrate impure color channels and viewing filters to enable better color perception, but this results in a corresponding degree of double imaging. Where the color channels differ in brightness as a percentage of white (red 30/cyan 70, magenta 41/green 59, or especially blue 11/yellow 89), the lighter display channel may be darkened or the brighter viewing filter may be darkened to allow both eyes a balanced view. However, the Pulfrich effect can be obtained from a light/dark filter arrangement. The color channels of an anaglyphic image require pure color display fidelity and corresponding viewing filter gels. The choice of ideal viewing filters is dictated by the color channels of the anaglyph to be viewed. Ghosting can be eliminated by ensuring a pure color display and viewing filters that match the display. 
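The brightness percentages quoted above are consistent with the classic luma weights, under which red contributes about 30%, green 59% and blue 11% of perceived white; each filter in a complementary pair passes the sum of the weights of its primaries. A quick check:

```python
# Approximate luma weights (Rec. 601 style) for the three primaries.
LUMA = {"red": 0.30, "green": 0.59, "blue": 0.11}

def channel_share(primaries):
    """Percentage of white's brightness passed by a filter of these primaries."""
    return round(100 * sum(LUMA[p] for p in primaries))

print(channel_share(["red"]), channel_share(["green", "blue"]))   # 30 70
print(channel_share(["red", "blue"]), channel_share(["green"]))   # 41 59
print(channel_share(["blue"]), channel_share(["red", "green"]))   # 11 89
```

The large imbalance of the blue/yellow pair (11 versus 89) is why that combination in particular benefits from darkening the brighter side.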
Retinal rivalry can be eliminated by the (ACB) 3-D Anaglyphic Contrast Balance method patented by Studio 555, which prepares the image pair prior to color channeling in any color. In theory, under trichromatic principles, it is possible to introduce a limited amount of multiple-perspective capability (a technology not possible with polarization schemes). This is done by overlapping three images instead of two, in the sequence of green, red, blue. Viewing such an image with red-green glasses would give one perspective, while switching to blue-red would give a slightly different one. In practice, this remains elusive, as some blue is perceived through green gel and most green is perceived through blue gel. It is also theoretically possible to incorporate rod cells, which optimally perform at a dark cyan color, in well-optimized mesopic vision, to create a fourth filter color and yet another perspective; however, this has not yet been demonstrated, nor would most televisions be able to process such tetrachromatic filtering. Applications On April 1, 2010, Google launched a feature in Google Street View that shows anaglyphs rather than regular images, allowing users to see the streets in 3D. Home entertainment Disney Studios released Hannah Montana & Miley Cyrus: Best of Both Worlds Concert in August 2008, its first anaglyph 3D Blu-ray Disc. This was shown on the Disney Channel with red-cyan paper glasses in July 2008. However, on Blu-ray Disc anaglyph techniques have more recently been supplanted by the Blu-ray 3D format, which uses Multiview Video Coding (MVC) to encode full stereoscopic images. 
Though Blu-ray 3D does not require a specific display method, and some Blu-ray 3D software players (such as ArcSoft TotalMedia Theatre) are capable of anaglyphic playback, most Blu-ray 3D players are connected via HDMI 1.4 to 3D televisions and other 3D displays using more advanced stereoscopic display methods, such as alternate-frame sequencing (with active shutter glasses) or FPR polarization (with the same passive glasses as RealD theatrical 3D). Comics These techniques have been used to produce 3-dimensional comic books, mostly during the early 1950s, using carefully constructed line drawings printed in colors appropriate to the filter glasses provided. The material presented was from a wide variety of genres, including war, horror, crime, and superhero. Anaglyphed comics were far more difficult to produce than normal comics, requiring each panel to be drawn multiple times on layers of acetate. While the first 3D comic in 1953 sold over two million copies, by the end of the year sales had bottomed out, though 3D comics have continued to be released irregularly up until the present day. Science and mathematics Three-dimensional display can also be used to display scientific data sets, or to illustrate mathematical functions. Anaglyph images are suitable both for paper presentation and moving video display (see neuroimage-related paper). They can easily be included in science books, and viewed with cheap anaglyph glasses. Anaglyphy (including, among others, aerial, telescopic, and microscopic images) is being applied to scientific research, popular science, and higher education. Also, chemical structures, particularly for large systems, can be difficult to represent in two dimensions without omitting geometric information. Therefore, most chemistry computer software can output anaglyph images, and some chemistry textbooks include them. Today, there are more advanced solutions for 3D imaging available, like shutter glasses together with fast monitors. 
These solutions are already extensively used in science. Still, anaglyph images provide a cheap and comfortable way to view scientific visualizations. See also Holography Phantogram Pulfrich effect Vectograph Wheatstone viewer References External links TIM - Online raytracer that also generates anaglyphs (for red/blue glasses) and autostereograms 3D STEREO PORTAL Videos & Photos Collection from the World 3D Anaglyph red-cyan How to produce an anaglyph image from a stereoscopic camera device. Anaglyph Gallery at Brooklyn Stereography featuring hundreds of red-cyan anaglyphs 3-D Anaglyph Technique Amazing 3-D Stereo Anaglyph Photos featuring various unusual subjects Making a 3-D anaglyph movie with two 16mm cameras
https://en.wikipedia.org/wiki/Java%20Platform%2C%20Micro%20Edition
Java Platform, Micro Edition
Java Platform, Micro Edition or Java ME is a computing platform for development and deployment of portable code for embedded and mobile devices (micro-controllers, sensors, gateways, mobile phones, personal digital assistants, TV set-top boxes, printers). Java ME was formerly known as Java 2 Platform, Micro Edition or J2ME. As of December 22, 2006, the Java ME source code is licensed under the GNU General Public License, and is released under the project name phoneME. The platform uses the object-oriented Java programming language. It is part of the Java software-platform family. Java ME was designed by Sun Microsystems, which was acquired by Oracle Corporation in 2010; the platform replaced a similar technology, PersonalJava. Originally developed under the Java Community Process as JSR 68, the different flavors of Java ME have evolved in separate JSRs. Oracle provides a reference implementation of the specification, but has tended not to provide free binary implementations of its Java ME runtime environment for mobile devices, relying instead on third parties to provide their own. As of 2008, all Java ME platforms are restricted to JRE 1.3 features and use that version of the class file format (internally known as version 47.0). Should Oracle ever declare a new round of Java ME configuration versions that support the later class file formats and language features, such as those corresponding to JRE 1.5 or 1.6 (notably, generics), it will entail extra work on the part of all platform vendors to update their JREs. Java ME devices implement a profile. The most common of these are the Mobile Information Device Profile, aimed at mobile devices such as cell phones, and the Personal Profile, aimed at consumer products and embedded devices like set-top boxes and PDAs. Profiles are subsets of configurations, of which there are currently two: the Connected Limited Device Configuration (CLDC) and the Connected Device Configuration (CDC). 
There are more than 2.1 billion Java ME enabled mobile phones and PDAs. It was popular in sub-$200 devices such as Nokia's Series 40. It was also used on the Bada operating system and on Symbian OS along with native software. Users of Windows CE, Windows Mobile, Maemo, MeeGo and Android can download Java ME for their respective environments ("proof-of-concept" for Android). Connected Limited Device Configuration The Connected Limited Device Configuration (CLDC) contains a strict subset of the Java-class libraries, and is the minimum amount needed for a Java virtual machine to operate. CLDC is used to classify the myriad of devices into a fixed configuration. A configuration provides the most basic set of libraries and virtual-machine features that must be present in each implementation of a J2ME environment. When coupled with one or more profiles, the Connected Limited Device Configuration gives developers a solid Java platform for creating applications for consumer and embedded devices. The configuration is designed for devices with 160 KB to 512 KB of total memory, with a minimum of 160 KB of ROM and 32 KB of RAM available for the Java platform. Mobile Information Device Profile Designed for mobile phones, the Mobile Information Device Profile includes a GUI and a data storage API; MIDP 2.0 adds a basic 2D gaming API. Applications written for this profile are called MIDlets. Almost all new cell phones come with a MIDP implementation, and it is now the de facto standard for downloadable cell phone games. However, many cellphones can run only those MIDlets that have been approved by the carrier, especially in North America. JSR 271: Mobile Information Device Profile 3 (final release on Dec 9, 2009) specified the third-generation Mobile Information Device Profile (MIDP3), expanding upon the functionality in all areas as well as improving interoperability across devices. A key design goal of MIDP3 is backward compatibility with MIDP2 content. 
Information Module Profile The Information Module Profile (IMP) is a profile for embedded, "headless" devices such as vending machines, industrial embedded applications, security systems, and similar devices with either simple or no display and with some limited network connectivity. Originally introduced by Siemens Mobile and Nokia as JSR-195, IMP 1.0 is a strict subset of MIDP 1.0 except that it doesn't include user interface APIs — in other words, it doesn't include support for the Java package javax.microedition.lcdui. JSR-228, also known as IMP-NG, is IMP's next generation, based on MIDP 2.0, leveraging MIDP 2.0's new security and networking types and APIs, and other APIs such as PushRegistry and platformRequest(), but again it includes neither the UI APIs nor the game API. Connected Device Configuration The Connected Device Configuration is a subset of Java SE, containing almost all the libraries that are not GUI related. It is richer than CLDC. Foundation Profile The Foundation Profile is a Java ME Connected Device Configuration (CDC) profile. This profile is intended to be used by devices requiring a complete implementation of the Java virtual machine up to and including the entire Java Platform, Standard Edition API. Typical implementations will use some subset of that API set depending on the additional profiles supported. This specification was developed under the Java Community Process. Personal Basis Profile The Personal Basis Profile extends the Foundation Profile to include lightweight GUI support in the form of an AWT subset. This is the platform that BD-J is built upon. Implementations Sun provides a reference implementation of these configurations and profiles for MIDP and CDC. Starting with the Java ME 3.0 SDK, a NetBeans-based IDE supports them all in a single environment. 
In contrast to the numerous binary implementations of the Java Platform built by Sun for servers and workstations, Sun does not provide any binaries for the platforms of Java ME targets with the exception of an MIDP 1.0 JRE (JVM) for Palm OS. Sun provides no J2ME JRE for the Microsoft Windows Mobile (Pocket PC) based devices, despite an open-letter campaign to Sun to release a rumored internal implementation of PersonalJava known by the code name "Captain America". Third-party implementations are widely used by Windows Mobile vendors. Operating systems targeting Java ME have been implemented by DoCoMo in the form of DoJa, and by SavaJe as SavaJe OS. The latter company was purchased by Sun in April 2007 and now forms the basis of Sun's JavaFX Mobile. The open-source Mika VM aims to implement JavaME CDC/FP, but is not certified as such (certified implementations are required to charge royalties, which is impractical for an open-source project). Consequently, devices which use this implementation are not allowed to claim JavaME CDC compatibility. The Linux-based Android operating system uses a proprietary version of Java that is similar in intent, but very different in many ways from Java ME. JSRs (Java Specification Requests) Foundation Main extensions Future ESR The ESR consortium is devoted to standards for embedded Java, especially cost-effective standards. Typical application domains are industrial control, machine-to-machine, medical, e-metering, home automation, consumer, human-to-machine-interface, ... 
See also Android (operating system) iOS BlackBerry OS Danger Hiptop Embedded Java JavaFX Mobile Mobile development Mobile games Mobile learning Qualcomm Brew Smartphone References Notes JSR 232: Mobile Operational Management an advanced OSGi technology based platform for mobile computing JSR 291: Dynamic Component Support for Java SE symmetric programming model for Java SE to Java ME JSR 232 Bibliography External links Sun Developer Network, Java ME Nokia's Developer Hub Java pages Nokia S60 Java Runtime blogs Sony Ericsson Developer World Motorola Developer Network J2ME Authoring Tool LMA Users Network Samsung Mobile Developer's site Sprint Application Developer's Website Performance database of Java ME compatible devices MicroEJ platforms for embedded systems Book - Mobile Phone Programming using Java ME (J2ME) Tutorial Master ng, J2ME Computing platforms Platform, Micro Edition
https://en.wikipedia.org/wiki/UL%20%28safety%20organization%29
UL (safety organization)
UL, LLC is a global safety certification company headquartered in Northbrook, Illinois. It maintains offices in 46 countries. Established in 1894 as the Underwriters' Electrical Bureau (a bureau of the National Board of Fire Underwriters), it was known throughout the 20th century as Underwriters Laboratories and participated in the safety analysis of many of that century's new technologies. UL is one of several companies approved to perform safety testing by the U.S. federal agency Occupational Safety and Health Administration (OSHA). OSHA maintains a list of approved testing laboratories, which are known as Nationally Recognized Testing Laboratories. History Underwriters Laboratories Inc. was founded in 1894 by William Henry Merrill. Early in his career as an electrical engineer in Boston, a 25-year-old Merrill was sent by underwriters issuing fire insurance to assess risk and investigate the World's Fair's Palace of Electricity. In order to determine and mitigate risk, Merrill found it necessary to conduct tests on building materials. Upon seeing a growing potential in his field, Merrill stayed in Chicago to found Underwriters Laboratories. Merrill soon went to work developing standards, launching tests, designing equipment and uncovering hazards. Aside from his work at UL, Merrill served as the National Fire Protection Association's secretary-treasurer (1903–1909) and president (1910–1912) and was an active member of the Chicago Board and Union Committee. In 1916, Merrill became UL's first president. UL published its first standard, "Tin Clad Fire Doors", in 1903. In 1905, UL established a Label Service for certain product categories that require more frequent inspections. In 1906, UL introduced the UL Mark to indicate products that had passed their testing. UL inspectors conducted the first factory inspections on labeled products at manufacturers' facilities. A history of UL marks has been discussed by Toft. 
From 1905 to 1979 UL Headquarters was located at 207-231 East Ohio Street in Chicago. In 1979, the organization moved its headquarters to a 153-acre campus in Northbrook, Illinois, 25 miles north of its former downtown Chicago location. UL has expanded into an organization with 64 laboratories, testing and certification facilities serving customers in 104 countries. It has evolved from its roots in electrical and fire safety to address broader safety issues, such as hazardous substances, water quality, food safety, performance testing, safety and compliance education, and environmental sustainability. On January 1, 2012, Underwriters Laboratories transformed from a non-profit organization to a for-profit company in the U.S. A new subsidiary named simply UL LLC, a limited liability company, took over Underwriters Laboratories’ product testing and certification business. UL Standards Sustainability Standards UL 106, Standard for Sustainability for Luminaires (under development) UL 110, Standard for Sustainability for Mobile Phones Standards for Electrical and Electronic Products UL 50, Enclosures for Electrical Equipment UL 50E, Enclosures for Electrical Equipment, Environmental Considerations UL 153, Portable Electric Lamps UL 197, Commercial Electrical Cooking Appliances UL 244B, Field Installed and/or Field Connected Appliance Controls UL 410, Slip Resistance of Floor Surface Materials UL 796, Printed-Wiring Boards UL 916, Energy Management Equipment UL 962, Household and Commercial Furnishings UL 962A, Furniture Power Distribution Units UL 962B, Outline for Merchandise Display and Rack Mounted Power Distribution Units UL 970, Retail Fixtures and Merchandising Displays UL 1026, Electric Household Cooking and Food Serving Appliances UL 1492, Audio/Video Products and Accessories UL 1598, Luminaires UL 1642, Lithium Batteries UL 1995, Heating and Cooling Equipment UL 2267 Standard for Safety - Fuel Cell Power Systems for Installation in Industrial Electric Trucks 
UL 6500, Audio/Video and Musical Instrument Apparatuses for Household, Commercial and Similar General Uses UL 60065, Audio, Video and Similar Electronic Apparatuses: Safety Requirements UL 60335-1, Household and Similar Electrical Appliances, Part 1: General Requirements UL 60335-2-24, Household and Similar Electrical Appliances, Part 2: Particular Requirements for Motor Compressors UL 60335-2-3, Household and Similar Electrical Appliances, Part 2: Particular Requirements for Electric Irons UL 60335-2-34, Household and Similar Electrical Appliances, Part 2: Particular Requirements for Motor Compressors UL 60335-2-8, Household and Similar Electrical Appliances, Part 2: Particular Requirements for Shavers, Hair Clippers and Similar Appliances UL 60950, Information Technology Equipment UL 60950-1, Information Technology Equipment – Safety, Part 1: General Requirements UL 60950-21, Information Technology Equipment – Safety, Part 21: Remote Power Feeding UL 60950-22, Information Technology Equipment – Safety, Part 22: Equipment to be Installed Outdoors UL 60950-23, Information Technology Equipment – Safety, Part 23: Large Data Storage Equipment Life Safety Standards UL 217, Single- and Multiple- Station Smoke Alarms UL 268, Smoke Detectors for Fire Protective Signaling Systems UL 268A, Smoke Detectors for Duct Application UL 1626, Residential Sprinklers for Fire Protection Service UL 1971, Signaling Devices for the Hearing Impaired Standards for Building Products UL 10A, Tin-Clad Fire Doors UL 20, General-Use Snap Switches UL 486E, Equipment Wiring Terminals for Use with Aluminum and/or Copper Conductors UL 1256, Fire Test of Roof/Deck Constructions Standards for Industrial Control Equipment UL 508, Industrial Control Equipment UL 508A, Industrial Control Panels UL 508C, Power Conversion Equipment UL 61800-5-1, Adjustable Speed Electrical Power Drive Systems Standards for Plastic Materials UL 94, Tests for Flammability of Plastic Materials for Parts in Devices and 
Appliances UL 746A, Polymeric Materials: Short-Term Property Evaluations UL 746B, Polymeric Materials: Long-Term Property Evaluations UL 746C, Polymeric Materials: Use in Electrical Equipment Evaluations. UL 746D, Polymeric Materials: Fabricated Parts UL 746E, Polymeric Materials: Industrial Laminates, Filament Wound Tubing, Vulcanized Fiber and Materials Used in Printed-Wiring Boards UL 746F, Polymeric Materials: Flexible Dielectric Film Materials for Use in Printed-Wiring Boards and Flexible Materials Interconnect Constructions Standards for Wire and Cable UL 62, Flexible Cords and Cables UL 758, Appliance Wiring Material (AWM) UL 817, Cord Sets and Power Supply Cords UL 2556, Wire and Cable Test Methods Standards for Canada developed by ULC Standards, a member of the UL family of companies CAN/ULC-S101-07, Standard Methods for Fire Endurance Tests of Building Construction and Materials CAN/ULC-S102-10, Standard Methods of Test for Surface-Burning Characteristics of Building Materials and Assemblies CAN/ULC-S102.2-10, Standard Methods of Test for Surface-Burning Characteristics of Flooring, Floor Coverings, and Miscellaneous Materials and Assemblies CAN/ULC-S104-10, Standard Methods for Fire Tests of Door Assemblies CAN/ULC-S107-10, Standard Methods for Fire Tests of Roof Coverings CAN/ULC-S303-M91 (R1999), Standard Methods for Local Burglar Alarm Units and Systems Photovoltaic UL 1703, Photovoltaic Flat-Plate Modules UL 1741, Inverters, Converters, Controllers and Interconnection System Equipment for Use With Distributed Energy Resources UL 2703, Rack Mounting Systems and Clamping Devices for Flat-Plate Photovoltaic Modules and Panels Recognized Component Mark The "Recognized Component Mark" is a type of quality mark issued by Underwriters Laboratories. It is placed on components which are intended to be part of a UL listed product, but which cannot bear the full UL logo themselves. 
The general public does not ordinarily come across it, as it is borne on components which make up finished products. Computer benchmarking UL offers the following computer benchmarking products: 3DMark Easy Benchmark Automation PCMark 10 PCMark for Android Servermark Testdriver UL Procyon AI Inference Benchmark UL Procyon Photo Editing Benchmark UL Procyon Video Editing Benchmark VRMark Similar organizations Applied Research Laboratories (ARL) A competing testing laboratory, based in Florida, U.S. Bureau Veritas A competing test, inspection, and certification company. Baseefa A similar organization in the United Kingdom. Canadian Standards Association (CSA) A similar organization in Canada. Also serves as a competitive alternative for U.S. products. CCOE Chief Controller of Explosives CEBEC Testing laboratory, inspection, and certification company, based in Brussels, Belgium. DNV GL A global testing laboratory, inspection, certification, marine class, and engineering organisation, headquartered in Høvik, Norway. Efectis A similar organization in Europe, fire science expert, testing laboratory, and certification body. ETL SEMKO A competing testing laboratory, part of Intertek; based in London, U.K. FM Approvals A competing certification body, based in Rhode Island, U.S. ICC-ES International Code Council Evaluation Services. IAPMO R&T A competing certification body, based in Ontario, California, U.S. INERIS Testing laboratory, inspection, and certification company, based in France. KFI The Korea Fire Institute, a similar organization in Korea. MET Laboratories, Inc. A competing testing laboratory, based in Baltimore, Maryland, U.S. MiCOM Labs (MiCOM) A consumer, wireless, telecom, IT, medical, and aerospace industry, testing, and certification laboratory, based in Pleasanton, California, U.S. NTA Inc A certification agency based in Nappanee, Indiana, U.S. 
QAI Laboratories (QAI) A competing certification body, with locations in Canada (Vancouver, BC - HQ and Vaughan, ON); United States (Rancho Cucamonga, CA and Tulsa, OK); Seoul, South Korea; and Shanghai, China. Sira A similar organization for the UK/Europe. GS Geprüfte Sicherheit TÜV German approvals organizations. Cardno PPI A similar third party organization, with offices in Houston, Texas, U.S.; Lafayette, LA, U.S.; London, U.K.; and Perth, Australia. See also ANSI CE marking Conformance mark Consumer Reports Consumers Union Fire test Good Housekeeping Seal National Sanitation Foundation NEMKO Product certification Quality control RoHS Safety engineering Société Générale de Surveillance References External links List of US NRTLs at OSHA Commercial laboratories Companies based in Northbrook, Illinois Organizations established in 1894 Standards organizations in the United States Product-testing organizations Certification marks 1894 establishments in the United States Laboratories in the United States
https://en.wikipedia.org/wiki/Statmetrics
Statmetrics
Statmetrics is a free cross-platform software application providing an interactive environment for computational finance. Statmetrics is an analytical tool offering several modeling techniques for analyzing selected markets; it integrates widely used quantitative finance technologies along with contemporary econometric analysis methods. Statmetrics can be used in diverse fields to perform econometric analysis, technical analysis, risk management, portfolio management and asset allocation. See also Computational finance Technical analysis Econometrics Modeling and analysis of financial markets References External links Official Website Financial markets Financial software Financial markets software Technical analysis software Cross-platform software Java platform software
https://en.wikipedia.org/wiki/University%20of%20South%20Asia%20%28Pakistan%29
University of South Asia (Pakistan)
The University of South Asia () or USA is a private university located in Lahore, Punjab, Pakistan. It was established in 1987 as a computer training institution with the name of National College of Computer Sciences (NCCS). More than 100,000 students underwent training programs in Computer Sciences. It was developed to promote computer education but has expanded to provide chartered degrees in fields that include Business Studies, Culinary Sciences, Allied Health Sciences, Civil Engineering, Electrical Engineering, Architecture, Law, Media Studies, Physical Therapy, Nutrition and Fashion and Interior Design. All undergraduate and graduate programs at the university are recognized by the Higher Education Commission of Pakistan and all the respective regulatory bodies. As a chartered university, all degrees awarded are internationally recognized. The university has three campuses: Cantt. Campus: 47 Tufail Road, Lahore Cantt Raiwind Road Campus: 5 km from Thokar Niaz Baig Burki Campus: Barki Road The Vice Chancellor of the university is Mian Imran Masood, a former Minister of Education of Punjab. History The Sadiq Memorial Educational Society established a chain of National Colleges of Computer Science throughout the country to offer computer science programs. The Bachelor of Computer Science program was launched in 1992 and Intermediate with Computer Science was introduced in 1994. In 1995 MBA and MCS programs were launched. The National Group of Colleges established a chartered degree-awarding higher education institution called the Institute of South Asia on April 14, 2003 vide “The Institute of South Asia, Lahore, Ordinance 2003, Punjab Ordinance No. IV of 2003”. The institute was upgraded to the status of a university on July 9, 2005 vide “The University Of South Asia, Lahore ACT 2005”, with the Chief Minister of the Punjab inaugurating the University of South Asia. 
Departments The Faculty of Management Sciences offers Bachelor's and Master's degrees in Business Administration, HR, Supply Chain and Logistics, and Digital Marketing. The department offers an up-to-date curriculum with project-based learning to help students apply the concepts, allowing them to internalize the knowledge received. With a vibrant clubs and societies culture, students learn in and out of classes for holistic personality development. Students are also offered support for admissions abroad. The Department of Architecture offers a five-year degree program, the Bachelor in Architecture Design (B. Arch). Students are given a license number from PCATP on completion of the degree. The Department of Biotechnology and Agro Sciences offers the program Bachelor in Biotechnology and Agro Sciences. The Department of Building and Architecture program combines architecture, management and technology. The Department of Civil Engineering offers bachelor's and master's degrees in civil engineering. Students who meet the requirements of the degree are given a PEC Number by the Pakistan Engineering Council. The Department of Computer Sciences offers bachelor's and master's degrees in computer science, software engineering, information technology, robotics and ITC. Students learn with a hands-on approach and are encouraged to work on projects from semester 1. With ample developmental opportunities, students enhance their knowledge and soft skills to become successful graduates. The Department of Electrical Engineering offers undergraduate and graduate degrees in Electrical Engineering. Students who meet the requirements of the degree are given a PEC Number by the Pakistan Engineering Council. The Department of Technology offers a bachelor of Civil Technology, a bachelor of Electrical Technology and a bachelor of Mechanical Technology. The Department of Fashion Design offers a Bachelor's and Master's in Fashion Design, Interior Design, and Textile Design. 
With a vibrant environment, students are encouraged to express their individual aesthetics and refine their design thinking. The Department of Health Sciences offers the degrees of Bachelor of Nutrition & Dietetics and Doctor of Physiotherapy. The School of Law offers a 5-year joint degree programme of Bachelors and LL.B, with three specializations, which are all 'qualifying', letting students proceed straight onto the professional stage of training to become a lawyer. Media Studies The Department of Career Counseling and Placement Center offers students academic and career guidance to help them make informed decisions. The department offers seminars for students and outsiders seeking to develop their skill sets. References External links University of South Asia official website Universities and colleges in Lahore Educational institutions established in 1987 Private universities and colleges in Punjab, Pakistan Engineering universities and colleges in Pakistan 1987 establishments in Pakistan
https://en.wikipedia.org/wiki/National%20Critical%20Information%20Infrastructure%20Protection%20Centre
National Critical Information Infrastructure Protection Centre
National Critical Information Infrastructure Protection Centre (NCIIPC) is an organisation of the Government of India created under Section 70A of the Information Technology Act, 2000 (amended 2008), through a gazette notification on 16 January 2014. Based in New Delhi, India, it is designated as the National Nodal Agency for Critical Information Infrastructure Protection. It is a unit of the National Technical Research Organisation (NTRO) and therefore comes under the Prime Minister's Office (PMO). Critical Information Infrastructure The Information Technology Act, 2000 defines Critical Information Infrastructure (CII) as “… those computer resource, the incapacitation or destruction of which, shall have debilitating impact on national security, economy, public health or safety". NCIIPC has broadly identified the following ‘Critical Sectors’: Power & Energy Banking, Financial Services & Insurance Telecom Transport Government Strategic & Public Enterprises Information Security Practices and Procedures for Protected System Rules, 2018 Vision "To facilitate safe, secure and resilient Information Infrastructure for Critical Sectors of the Nation." Mission "To take all necessary measures to facilitate protection of Critical Information Infrastructure, from unauthorized access, modification, use, disclosure, disruption, incapacitation or distraction through coherent coordination, synergy and raising information security awareness among all stakeholders." Functions and Duties National nodal agency for all measures to protect the nation's critical information infrastructure. Protect and deliver advice that aims to reduce the vulnerabilities of critical information infrastructure, against cyber terrorism, cyber warfare and other threats. Identification of all critical information infrastructure elements for approval by the appropriate Government for notifying the same. 
Provide strategic leadership and coherence across Government to respond to cyber security threats against the identified critical information infrastructure. Coordinate, share, monitor, collect, analyze and forecast, national level threat to CII for policy guidance, expertise sharing and situational awareness for early warning or alerts. The basic responsibility for protecting CII system shall lie with the agency running that CII. Assisting in the development of appropriate plans, adoption of standards, sharing of best practices and refinement of procurement processes in respect of protection of Critical Information Infrastructure. Evolving protection strategies, policies, vulnerability assessment and auditing methodologies and plans for their dissemination and implementation for protection of Critical Information Infrastructure. Undertaking research and development and allied activities, providing funding (including grants-in-aid) for creating, collaborating and development of innovative future technology for developing and enabling the growth of skills, working closely with wider public sector industries, academia et al. and with international partners for protection of Critical Information Infrastructure. Developing or organising training and awareness programs as also nurturing and development of audit and certification agencies for protection of Critical Information Infrastructure. Developing and executing national and international cooperation strategies for protection of Critical Information Infrastructure. Issuing guidelines, advisories and vulnerability or audit notes etc. relating to protection of critical information infrastructure and practices, procedures, prevention and response in consultation with the stake holders, in close coordination with Indian Computer Emergency Response Team and other organisations working in the field or related fields. 
Exchanging cyber incidents and other information relating to attacks and vulnerabilities with Indian Computer Emergency Response Team and other concerned organisations in the field. In the event of any threat to critical information infrastructure, the National Critical Information Infrastructure Protection Centre may call for information and give directions to the critical sectors or persons serving or having a critical impact on Critical Information Infrastructure. Operations NCIIPC maintains a 24x7 Help Desk to facilitate reporting of incidents. Toll Free No. 1800-11-4430. It issues advisories or alerts and provides guidance and expertise-sharing in addressing the threats/vulnerabilities for protection of CII. In the event of a likely/actual national-level threat, it plays a pivotal role in coordinating the response of the various CII stakeholders in close cooperation with CERT-India. Programs NCIIPC runs a number of programs to engage with its stakeholders. Some of them are as follows: Responsible Vulnerability Disclosure Program (RVDP) Incident Response (IR) Malware Reporting Internship Courses Research Scholars, etc. Initiatives Some of the major NCIIPC initiatives are as follows: Incident Response and Responsible Vulnerability Disclosure program- NCIIPC runs these programs for reporting any vulnerability in Critical Information Infrastructures. PPP for Training- Identification of PPP entities for partnership and formulation of training requirements and guidelines for conducting training for all stakeholders. CII Range to simulate real world threat– IT and OT simulations for critical sectors to test the defense of CII. Cyber Security Preparedness Survey, Risk Assessment, Audit, review and Compliance. Interns, Research Scholars & Cyber Security professionals- NCIIPC Internship program (both Full-time and Part-time) is available throughout the year. 
NCIIPC Newsletter NCIIPC releases its quarterly newsletter encompassing latest developments in the field of Critical Information Infrastructure(CII) and its protection along with various initiatives taken by NCIIPC to spread awareness and best practices and much more. January 2022 October 2021 July 2021 April 2021 January 2021 October 2020 July 2020 April 2020 January 2020 October 2019 July 2019 April 2019 January 2019 October 2018 July 2018 April 2018 January 2018 October 2017 July 2017 April 2017 January 2017 NCIIPC Guidelines NCIIPC releases SOPs and Guidelines for CISOs, CII Organisations and others to enhance the cybersecurity defence posture. Below are the copies: NCIIPC COVID-19 Guidelines SOP : Public-Private-Partnership Guidelines for Identification of CII Rules for the Information Security Practices and Procedures for Protected System National Cyber Security Policy, 2013 Roles and Responsibilities of CISOs Guidelines for Protection of CII Evaluating Cyber Security in CII SOP : Incident Response SOP : Audit of CII/Protected Systems References External links Government of India Information technology organisations based in India Cyberwarfare Cyber Security in India Computer security organizations Indian intelligence agencies
https://en.wikipedia.org/wiki/IBM%207950%20Harvest
IBM 7950 Harvest
The IBM 7950, also known as Harvest, was a one-of-a-kind adjunct to the Stretch computer which was installed at the United States National Security Agency (NSA). Built by IBM, it was delivered in 1962 and operated until 1976, when it was decommissioned. Harvest was designed to be used for cryptanalysis. Development In April 1958, the final design for the NSA-customized version of IBM's Stretch computer had been approved, and the machine was installed in February 1962. The design engineer was James H. Pomerene, and it was built by IBM in Poughkeepsie, New York. Its electronics (fabricated of the same kind of discrete transistors used for Stretch) were physically about twice as big as the Stretch to which it was attached. Harvest added a small number of instructions to Stretch, and could not operate independently. An NSA-conducted evaluation found that Harvest was more powerful than the best commercially available machine by a factor of 50 to 200, depending on the task. Architecture The equipment added to the Stretch computer consisted of the following special peripherals: IBM 7951 — Stream coprocessor IBM 7952 — High performance core storage IBM 7955 — Magnetic tape system, also known as TRACTOR IBM 7959 — High speed I/O exchange With the stream processing unit, Harvest was able to process 3 million characters a second. The TRACTOR tape system, part of the HARVEST system, was unique for its time. It included six tape drives, which handled tape in cartridges, along with a library mechanism that could fetch a cartridge from a library, mount it on a drive, and return it to the library. The transfer rates and library mechanism were balanced in performance such that the system could read two streams of data from tape, and write a third, for the entire capacity of the library, without any time wasted for tape handling. 
Programming Harvest's most important mode of operation was called "setup" mode, in which the processor was configured with several hundred bits of information and then operated by streaming data from memory — possibly taking two streams from memory — and writing a separate stream back to memory. The two byte streams could be combined, used to find data in tables, or counted to determine the frequency of various values. A value could be anything from 1 to 16 contiguous bits, without regard to alignment, and the streams could be as simple as data laid out in memory, or data read repeatedly, under the control of multiply-nested "do"-loop descriptors, which were interpreted by the hardware. Two programming languages, Alpha and Beta (not to be confused with the Simula-inspired BETA programming language), were designed for programming it, and IBM provided a compiler for the former around the time the machine was delivered. Usage One purpose of the machine was to search text for key words from a watchlist. From a single foreign cipher system, Harvest was able to scan over seven million decrypts for any occurrences of over 7,000 key words in under four hours. The computer was also used for codebreaking, and this was enhanced by an early distributed networking system codenamed Rye, which allowed remote access to Harvest. According to a 1965 NSA report, "RYE has made it possible for the agency to locate many more potentially exploitable cryptographic systems and 'bust' situations. Many messages that would have taken hours or days to read by hand methods, if indeed the process were feasible at all, can now be 'set' and machine decrypted in a matter of minutes". Harvest was also used for decipherment of solved systems; the report goes on to say that, "Decrypting a large batch of messages in a solved system [is] also being routinely handled by this system".
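Harvest's setup-mode counting behavior can be illustrated with a short model. The sketch below is a toy Python analogue, not the actual hardware interface: it pulls successive values of a configurable bit width (1 to 16 bits, ignoring byte alignment) out of a memory image and tallies their frequencies, loosely the way a setup-mode configuration could count value occurrences in a stream. All function names and parameters here are invented for illustration.

```python
# Toy model of Harvest's "setup" streaming mode: values of 1-16 bits are
# extracted from a byte stream without regard to byte alignment, and their
# frequencies are tallied. The real machine was configured with hundreds of
# setup bits interpreted directly by hardware; this is only an analogy.
from collections import Counter

def stream_values(data: bytes, width: int):
    """Yield successive `width`-bit values (1 <= width <= 16) from `data`."""
    assert 1 <= width <= 16
    total_bits = len(data) * 8
    bits = int.from_bytes(data, "big")
    for offset in range(0, total_bits - width + 1, width):
        shift = total_bits - offset - width
        yield (bits >> shift) & ((1 << width) - 1)

def frequency_count(data: bytes, width: int) -> Counter:
    """Count how often each `width`-bit value occurs in the stream."""
    return Counter(stream_values(data, width))

counts = frequency_count(b"ABABAB", 8)
# 'A' (0x41) and 'B' (0x42) each appear three times
```

Because the bit width need not divide the byte boundary, a width of, say, 3 simply slices the memory image into consecutive 3-bit values, mirroring the "without regard to alignment" property described above.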
The Harvest-RYE system became an influential example for computer security; a 1972 review identified NSA’s RYE as one of two “examples of early attempts at achieving ‘multi-level’ security.” Harvest remained in use until 1976, having been in operation at the NSA for fourteen years. Part of the reason for its retirement was that some of the mechanical components of TRACTOR had worn beyond use, and there was no practical way to replace them. IBM declined to re-implement the architecture in a more modern technology. See also Cryptanalytic computer References Sources James Bamford, Body of Secrets, 2001. S.G. Campbell, P.S. Herwitz and J.H. Pomerene A Nonarithmetical System Extension, pp 254–271 in W.Buchholz, Planning a Computer System: Project Stretch, McGraw-Hill, 1962. A scanned PDF version is on-line at (10.4MB) Douglas Hogan General and Special-Purpose Computers: a Historical Look and Some Lessons Learned, National Security Agency, 1986. A scanned PDF version is on-line at (1.1MB) Samuel Simon Snyder History of NSA General-Purpose Electronic Digital Computers, pp 39–64, National Security Agency, 1964. A scanned PDF version is on-line at (3.3MB) External links Eric Smith, IBM Stretch (aka IBM 7030 Data Processing System) Warren Alva Hunt, Early History of Harvest Computer Timeline of the IBM Stretch/Harvest Era (1956-1961) TRACTOR (IBM history page) One-of-a-kind computers Cryptanalytic devices Computer-related introductions in 1962
25997162
https://en.wikipedia.org/wiki/FX.25%20Forward%20Error%20Correction
FX.25 Forward Error Correction
FX.25 is a protocol extension to the AX.25 Link Layer Protocol. FX.25 provides a Forward Error Correction (FEC) capability while maintaining legacy compatibility with non-FEC equipment. FX.25 was created by the Stensat Group in 2005, and was presented as a technical paper at the 2006 TAPR Digital Communications Conference in Tucson, AZ. Overview FX.25 is intended to complement the AX.25 protocol, not replace it. It provides an encapsulation mechanism that does not alter the AX.25 data or functionalities. An error correction capability is introduced at the bottom of Layer 2 in the OSI model. The AX.25 Link Layer Protocol is extensively used in amateur radio communications. The packets are validated by a 16-bit CRC, and are discarded if one or more errors are detected. In many cases, such as space-to-earth telemetry, the packets are broadcast unidirectionally. No back-channel may be available to request retransmission of errored elements. Consequently, AX.25 links are inherently intolerant of errors. The FX.25 protocol extension provides an error correction "wrapper" around the AX.25 packet, allowing for removal of errors at the receiving end. Data fields have been carefully chosen to allow reception of the AX.25 packet data within an FX.25 frame by a non-FEC decoder. Technical Implementation A composite FX.25 entity is called a "frame," distinguishing it from the AX.25 "packet" contained within. The FX.25 frame contains the following elements:

- Preamble
- Correlation Tag
- AX.25 Packet
  - AX.25 Packet Start
  - AX.25 Packet Body
  - AX.25 Packet Frame Check Sequence (FCS)
  - AX.25 Packet End
- Pad for bit-to-byte alignment
- FEC Check Symbols
- Postamble

The "FEC Codeblock" contains all elements except the Preamble, Correlation Tag, and Postamble. These three elements exist outside of the correction-space for the FEC algorithm.
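As a rough illustration of this layout, the following Python sketch assembles a frame in that element order. The tag value, pad byte, codeblock size, and the check symbols are placeholders invented for this example: a real implementation uses a defined set of 64-bit correlation tags and Reed-Solomon check symbols computed over the codeblock, not the simple column checksum shown here.

```python
# Sketch of how an FX.25 frame wraps an AX.25 packet, following the element
# order in the list above. Illustrative only: the tag constant, pad rule,
# sizes, and "FEC" are stand-ins, not values from the FX.25 specification.
CORRELATION_TAG = (0xB74DB7DF8A532F3E).to_bytes(8, "big")  # illustrative value

def fx25_wrap(ax25_packet: bytes, codeblock_size: int = 255, check_bytes: int = 16) -> bytes:
    # Pad the packet so the FEC codeblock has a fixed size, then append
    # placeholder check symbols computed over the padded data.
    data_size = codeblock_size - check_bytes
    if len(ax25_packet) > data_size:
        raise ValueError("packet too large for one codeblock")
    padded = ax25_packet.ljust(data_size, b"\x7e")  # pad with HDLC flag bytes
    # Stand-in for the FEC Check Symbols: one byte per column sum.
    fec = bytes((sum(padded[i::check_bytes]) & 0xFF) for i in range(check_bytes))
    preamble = postamble = b"\x7e" * 4
    return preamble + CORRELATION_TAG + padded + fec + postamble

frame = fx25_wrap(b"example AX.25 packet")
```

Because the AX.25 packet bytes appear unmodified between the tag and the check symbols, a legacy non-FEC decoder can still recover the packet by ignoring the surrounding fields, which is the compatibility property the protocol is designed around.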
The Preamble and Postamble blocks are variable length, and are included to account for delays typically found in radio links - transmitter "key" to stable operation, receiver squelch latency, etc. The Correlation Tag is a Gold code, and contains inherent error tolerance. This is necessary to provide a "start of frame" marker without requiring a dependency on the FEC capability. The FEC frame currently implements Reed-Solomon error correction algorithms, but is not restricted to these. Performance Performance improvement will be a function of AX.25 packet size combined with the noise characteristics of the transmission channel. Initial performance testing involved transmission of 61 FX.25 frames over an interval of about 15 minutes.

- 9 frames were received without errors
- 19 frames were received with correctable errors
- 33 frames were received with uncorrectable errors

15% of the AX.25 packets [9/61] were decodable without the FEC capability; 46% of the AX.25 packets [(9+19)/61] were decodable with the FEC capability. References External links 2006 TAPR DCC webpage FX.25 Specification (pdf) FX.25 Presentation Slides from 2006 TAPR DCC (pdf) FX.25 Google Discussion Group AX.25 + FEC = FX.25 -- details about the "Dire Wolf" software TNC implementation of FX.25. Packet radio Link protocols
3508936
https://en.wikipedia.org/wiki/SoftPC
SoftPC
SoftPC is a software emulator of x86 hardware. It was developed in the mid-1980s by Rod MacGregor, who founded Insignia Solutions. Originally available on UNIX workstations to run MS-DOS, the software was ported to the Macintosh in 1987, and later gained the ability to run Microsoft Windows software. Besides Mac OS, supported platforms included SGI IRIX, Sun Solaris, HP-UX, IBM AIX, NeXTSTEP, Motorola 88000, OpenVMS on VAX and DEC Alpha systems, DEC ULTRIX, and others. Bundles of SoftPC with Windows (3.x, 95, 98) were called SoftWindows, although it was possible to install Windows into the basic SoftPC environment along with some special utilities provided by Insignia to achieve the same effect. Beginning in 1996, Insignia dominated this product niche, but it soon faced heavy competition from Connectix and its Virtual PC product. Insignia sold the product line to FWB Software in October 1999 to focus on supplying implementations of Java for the mobile device market. FWB continued to sell SoftWindows until March 2001. FWB Software also marketed a separate version of the software that did not include a bundled copy of Windows, called RealPC, until 2003. When Microsoft released Windows NT, it included a subsystem ("WOW" - Windows on Windows, later NTVDM) for running virtualized 16-bit Windows (x86) programs. However, Microsoft had also made changes to Windows to allow it to run on alternative processors (Alpha, PowerPC), and for these an emulation layer was needed for programs compiled for Intel processors. Customized versions of Insignia's core emulation system were produced to this end, but the alternative NT architectures never became widely used. Unlike most emulators, the SoftWindows product used recompiled Windows components to improve performance in most business applications, providing almost native performance (but this meant that, unlike SoftPC, SoftWindows was not upgradable).
See also Windows Interface Source Environment References IRIX software NeXTSTEP software Solaris software OpenVMS software X86 emulators
57746087
https://en.wikipedia.org/wiki/2018%E2%80%9319%20USC%20Trojans%20men%27s%20basketball%20team
2018–19 USC Trojans men's basketball team
The 2018–19 USC Trojans men's basketball team represented the University of Southern California during the 2018–19 NCAA Division I men's basketball season. Led by sixth-year head coach Andy Enfield, they played their home games at the Galen Center in Los Angeles, California as members of the Pac-12 Conference. Previous season The Trojans finished the season 24–12, 12–6 in Pac-12 play to finish in second place. As the No. 2 seed in the Pac-12 Tournament, they defeated Oregon State in the quarterfinals and Oregon in the semifinals before losing to Arizona in the championship game. They were one of the last four teams not selected for the NCAA Tournament and as a result earned a No. 1 seed in the National Invitation Tournament, where they defeated UNC Asheville in the first round before losing to Western Kentucky in the second round. Offseason Departures 2018 Recruiting class Roster Sophomore Forward Jordan Usher transferred to Georgia Tech on December 31, 2018 after being suspended indefinitely for "unspecified conduct issues." Schedule and results Non-conference regular season Pac-12 regular season Pac-12 Tournament References USC Trojans men's basketball seasons USC Trojans basketball, men
18195583
https://en.wikipedia.org/wiki/IOS%20version%20history
IOS version history
The version history of the mobile operating system iOS, developed by Apple Inc., began with the release of iPhone OS 1 for the original iPhone on June 29, 2007. Since its initial release, it has been used as the operating system for iPhone, iPad, iPod Touch, and HomePod. Continuous development has resulted in new major releases of the software, typically announced at the annual Apple Worldwide Developers Conference and released in September, coinciding with the release of new iPhone models. Starting with the 13.0 release, the operating system for iPad was split off as iPadOS. The latest version of iOS and iPadOS, 15.3.1, was released on February 10, 2022. The latest beta version of iOS and iPadOS, 15.4 beta 4, was released on February 22, 2022. Updates can be installed over-the-air through Settings (since iOS 5), or via the iTunes or Finder applications. Overview June 2007 saw the official release of iPhone OS 1 – concurrently with the first iPhone – which eventually came to be known as iOS. iOS did not have an official name until the official release of the iPhone software development kit (iPhone SDK) on March 6, 2008. Before then, Apple marketing simply stated that iPhone ran a version of Mac OS X made specifically for the iPhone. When iOS was introduced, it was named iPhone OS. It was officially renamed iOS on June 7, 2010, with the announcement of iOS 4. The introduction of what would later become the iPad line, and the existence of iPod Touch, meant the iPhone was no longer the only device to run the mobile operating system. iOS 4 was the first major iOS release that reflected the name change. Apple licensed the "iOS" trademark from Cisco Systems. Apple concurrently provides the same version of iOS for the comparable model of iPhone and iPod Touch, usually devices released in the same calendar year.
iPhone users receive all software updates for free, while iPod Touch users paid for the 2.0 and 3.0 major software updates. As of iOS 4, Apple no longer charges for iPod Touch updates. Four versions of iOS were never publicly released, with their version numbers changed during development. The first was iPhone OS 1.2, which was replaced by a 2.0 version number after the first beta; the second beta was named 2.0 beta 2 instead of 1.2 beta 2. The second was iOS 4.2, replaced with 4.2.1 due to a Wi-Fi bug in 4.2 beta 3, causing Apple to release two golden masters (4.2 GM and 4.2.1 GM). The third was iOS 13.4.5, which was renamed to iOS 13.5 when beta 3 was released, with the introduction of the Exposure Notification API, which required an SDK update. The fourth, iOS 13.5.5, was similarly renamed to iOS 13.6 in beta 2, with the introduction of new features for the Health app. One version of iOS was pulled by Apple after being released (iOS 8.0.1) due to issues with cellular service and Touch ID on iPhone 6 and iPhone 6 Plus units. Version history iPhone OS 1 Apple announced iPhone OS 1 at the iPhone keynote on January 9, 2007, and it was released to the public alongside the original iPhone on June 29, 2007. No official name was given on its initial release; Apple marketing literature simply stated the iPhone runs a version of Apple's desktop operating system, OS X. During the development phase of iPhone OS 1, at least 17 or 18 different concepts were developed. Many on the team were still attached to the idea that everyone would want to type on a hardware keyboard, not a glass screen, and a fully touch-based interface was a novel idea to everyone. Many user interfaces were prototyped, including the multi-touch click-wheel. Although many thought it was a waste of time, Apple CEO Steve Jobs insisted on prototyping all concepts before the Mac OS X-based version of the operating system was selected.
The release of iPhone OS 1.1 brought support for the iPod Touch (1st generation). iPhone OS 1.1.5 was the final version of iPhone OS 1. It became unsupported on May 18, 2010. iPhone OS 2 Apple announced iPhone OS 2 at the 2008 Worldwide Developers Conference on June 9, 2008, and it was released to the public on July 11, 2008, alongside the iPhone 3G. iPhone OS 2 was the first release to have the App Store, allowing developers to create third-party apps for the iPhone. This OS was made available to iPod Touch users for $9.95; it introduced key requested features such as push email. Apple did not drop support for any devices with this release. iPhone OS 2 was compatible with all devices released up to that time. The release of iPhone OS 2.1.1 brought support for the iPod Touch (2nd generation). iPhone OS 2.2.1 was the final version of iPhone OS 2. iPhone OS 3 Apple announced iPhone OS 3 on March 17, 2009, and it was released to the public on June 17, 2009, alongside the iPhone 3GS. Apple did not drop support for any devices with this release. iPhone OS 3 was compatible with all devices released up to that time, but not all features were available on the original iPhone. The final release supported on the original iPhone and iPod Touch (1st generation) was iPhone OS 3.1.3. The first iPad was introduced along with iPhone OS 3.2. iOS 4 Apple announced iOS 4 in March 2010 and it was released to the public on June 21, 2010, alongside the iPhone 4. With this release, Apple dropped support for the original iPhone and the 1st generation iPod Touch, which was the first time Apple had dropped support for any device in an iOS release. The iPhone 3G and the 2nd generation iPod Touch were capable of running iOS 4, but had limited features. For example, both devices lacked multitasking capabilities and the ability to set a home screen wallpaper. However, iOS 4 was the first major release that iPod Touch users did not have to pay for.
The release of iOS 4.2.1 brought compatibility to the original iPad and was the final release supported on the iPhone 3G and 2nd generation iPod Touch due to major performance issues. The release of iOS 4.3 brought iPad 2 compatibility. iOS 5 Apple announced iOS 5 on June 6, 2011, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on October 12, 2011, alongside the iPhone 4S. Apple did not drop support for any devices with this release; support for the iPhone 3G and the iPod Touch (2nd generation) had already been dropped with the release of iOS 4.3 seven months earlier due to major performance issues. Therefore, iOS 5 was released for the iPhone 3GS onwards, iPod Touch (3rd generation) onwards, and the iPad (1st generation) and iPad 2. The release of iOS 5.1 brought support for the iPad (3rd generation). iOS 5.1.1 was the final release supported for the original iPad and iPod Touch (3rd generation). iOS 6 Apple announced iOS 6 on June 11, 2012, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 19, 2012, alongside the iPhone 5, iPod Touch (5th generation), and iPad (4th generation). With this release, Apple dropped support for the iPod Touch (3rd generation) and the iPad (1st generation) due to performance issues, and offered only limited support on the iPhone 3GS and iPod Touch (4th generation). The iPhone 4 onwards, the iPod Touch (5th generation), the iPad 2 onwards and the iPad Mini (1st generation) were fully supported. iOS 6.1.6 was the final release supported for the iPhone 3GS and iPod Touch (4th generation). iOS 7 Apple announced iOS 7 on June 10, 2013, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 18, 2013, alongside the iPhone 5C and iPhone 5S. 
With this release, Apple dropped support for the iPhone 3GS due to hardware limitations and the iPod Touch (4th generation) due to performance issues. iOS 7 has limited support on the iPad 2 and the iPhone 4 since they do not support Siri. However, other devices from the iPhone 4S onwards, iPod Touch (5th generation) onwards, the iPad (3rd generation) onwards, and the iPad Mini (1st generation) onwards were fully supported. The release of iOS 7.0.3 brought support for the iPad Air and iPad Mini 2. iOS 7.1.2 was the final release on the iPhone 4. iOS 8 Apple announced iOS 8 on June 2, 2014, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 17, 2014, alongside the iPhone 6 and iPhone 6 Plus. With this release, Apple dropped support for one device, the iPhone 4. iOS 8 has limited support on the iPad 2, iPhone 4S, iPad (3rd generation), iPad Mini (1st generation), and the iPod Touch (5th generation), as Apple received widespread complaints of extremely poor/slow performance from owners of these devices. All other devices from the iPhone 5 onwards, iPod Touch (6th generation) onwards, the iPad (4th generation) onwards, and the iPad Mini 2 onwards were fully supported. The release of iOS 8.1 brought support for the iPad Air 2 and iPad Mini 3, and the release of iOS 8.4 brought support for the iPod Touch (6th generation). iOS 8.3 was the first version of iOS to have public beta testing available, where users could test the beta for upcoming releases of iOS and send feedback to Apple about bugs or issues. The final version of iOS 8 was iOS 8.4.1. iOS 9 Apple announced iOS 9 on June 8, 2015, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 16, 2015, alongside the iPhone 6S, iPhone 6S Plus and iPad Mini 4. With this release, Apple did not drop support for any iOS devices. 
Therefore, iOS 9 was supported on the iPhone 4S onwards, iPod Touch (5th generation) onwards, the iPad 2 onwards, and the iPad Mini (1st generation) onwards. However, iOS 9 has limited support on devices with an Apple A5 or A5X processor: the iPhone 4S, iPad 2, iPad (3rd generation), iPad Mini (1st generation), and iPod Touch (5th generation). This release made the iPad 2 the first device to support six major releases of iOS, supporting iOS 4 through iOS 9. Despite Apple's promise of better performance on these devices, there were still widespread complaints that the issue had not been fixed. iOS 9.3.5 is the final release on the iPod Touch (5th generation), the Wi-Fi-only iPad 2, the Wi-Fi-only iPad (3rd generation), and the Wi-Fi-only iPad Mini (1st generation). iOS 9.3.6 is the final release on the iPhone 4S, the Wi-Fi + cellular iPad 2, the Wi-Fi + cellular iPad (3rd generation), and the Wi-Fi + cellular iPad Mini (1st generation). iOS 10 Apple announced iOS 10 on June 13, 2016, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 13, 2016, alongside the iPhone 7 and iPhone 7 Plus. With this release, Apple dropped support for devices using an A5 or A5X processor: the iPhone 4S, the iPad 2, iPad (3rd generation), iPad Mini (1st generation), and iPod Touch (5th generation). iOS 10 has limited support on devices with 32-bit processors: the iPhone 5, iPhone 5C, and iPad (4th generation). However, the iPhone 5S onwards, iPod Touch (6th generation), iPad Air onwards, and the iPad Mini 2 onwards are fully supported. The release of iOS 10.2.1 brought support for the iPad (5th generation), and iOS 10.3.2 brought support for the iPad Pro (10.5-inch) and the iPad Pro (12.9-inch, 2nd generation). iOS 10.3.3 is the final supported release for the iPhone 5C and the Wi-Fi-only iPad (4th generation). iOS 10.3.4 is the final supported release for the iPhone 5 and the Wi-Fi + cellular iPad (4th generation).
iOS 10 is the last iOS version to run on 32-bit processors and also the last to run 32-bit apps. iOS 11 Apple announced iOS 11 on June 5, 2017, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 19, 2017, alongside the iPhone 8 and iPhone 8 Plus. With this release, Apple dropped support for the 32-bit iPhone 5, iPhone 5C, and iPad (4th generation), and also dropped support for 32-bit applications, making iOS a 64-bit-only OS that only runs 64-bit apps. iOS 11 has limited support on devices with the Apple A7 or A8 processors: the iPhone 5S, iPhone 6/6 Plus, iPod Touch (6th generation), iPad Air, iPad Air 2, iPad Mini 2, 3, and 4. However, all other devices from the iPhone 6S/6S Plus onwards, iPhone SE (1st generation), iPad Pro, and iPad (5th generation) onwards are fully supported. iOS 11.0.1 brought support for the iPhone X and iOS 11.3 brought support for the iPad (6th generation). The final version of iOS 11 to be released was iOS 11.4.1. iOS 12 Apple announced iOS 12 on June 4, 2018, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 17, 2018, alongside the iPhone XS, iPhone XS Max and iPhone XR. With this release, Apple did not drop support for any iOS devices. Therefore, iOS 12 was supported on the iPhone 5S onwards, iPod Touch (6th generation), the iPad Air onwards, and the iPad Mini 2 onwards. However, iOS 12 has limited support on devices with the Apple A7 or A8 processors: the iPhone 5S, iPhone 6/6 Plus, iPod Touch (6th generation), iPad Air, iPad Air 2, iPad Mini 2, 3, and 4. All other devices from the iPhone 6S/6S Plus onwards, the iPad Air (2019), the iPad (5th generation) onwards, and all iPad Pro models are fully supported. iOS 12.1 brought support to the iPad Pro (12.9-inch, 3rd generation) and iPad Pro (11-inch, 1st generation) and iOS 12.2 brought support to the iPad Mini (5th generation) and iPad Air (3rd generation). 
iOS 12.5.5 is the last supported release for the iPhone 5S, iPhone 6, iPhone 6 Plus, iPad Air (1st generation), iPad mini 2, iPad mini 3, and iPod touch (6th generation). iOS 13 / iPadOS 13 Apple announced iOS 13 on June 3, 2019, at its annual Apple Worldwide Developers Conference (WWDC) event, and it was released to the public on September 19, 2019, alongside the iPhone 11 series (11, 11 Pro, 11 Pro Max). The principal features include dark mode and Memoji support for A9+ devices. The NFC framework now supports reading several types of contactless smartcards and tags. The iPad gains several tablet-oriented features, and its operating system has been rebranded as iPadOS; iPadOS 13 was announced at the 2019 WWDC as well. With this release, Apple dropped support for all devices with less than 2 GB of RAM, which included the iPhone 5S and iPhone 6 and 6 Plus, iPod Touch (6th generation), iPad Mini 2, iPad Mini 3, and iPad Air. iOS 13/iPadOS 13 has limited support on devices with the A8/A8X. However, all other devices from the iPhone 6s/6s Plus onwards, iPod Touch (7th generation), iPad Pro (1st generation), iPad (5th generation), and iPad Mini (5th generation) onwards are fully supported (A9 and A10 devices having almost full support, A11 and on having full support). iOS 13 brought support for the iPhone 11 and iPhone 11 Pro / Pro Max, second-generation iPhone SE, and iPadOS 13 brought support for the iPad (7th generation), the iPad Pro (12.9-inch, 4th generation) and the iPad Pro (11-inch, 2nd generation). iOS 14 / iPadOS 14 Apple announced iOS 14 and iPadOS 14 on June 22, 2020, at its annual WWDC 2020 event, with a developer beta released on the same day and a public beta released on July 9, 2020. iOS 14 and iPadOS 14 were released on September 16, 2020, alongside the iPad (8th Generation) and iPad Air (4th Generation). All devices that supported iOS 13 also support iOS 14. 
This makes the iPad Air 2 the first device to support seven versions of iOS and iPadOS, from iOS 8 to iPadOS 14. Some new features introduced in iOS 14 and iPadOS 14 include redesigned widgets that can now be placed directly on the homescreen (only for iOS), along with the App Library, which automatically categorizes apps into one page, Picture in Picture in iPhone and iPod touch, and the CarKey technology to unlock and start a car with NFC. iOS and iPadOS 14 also allow the user to have incoming calls shown in banners rather than taking up the whole screen (the latter view is still available as an optional function). The release of iPadOS 14.0 brought support for the 8th generation iPad and the 4th generation iPad Air and the release of iOS 14.1 brought support for the iPhone 12, the iPhone 12 Mini and the iPhone 12 Pro and Pro Max. iOS 14 and iPadOS 14 have limited support on devices with A8, A8X, A9, A9X, and A10 Fusion chips, whereas devices with A10X Fusion and A11 Bionic chip have almost full support, and devices with A12 Bionic chip and later have full support. iOS 15 / iPadOS 15 Apple announced iOS 15 and iPadOS 15 on June 7, 2021, at its annual WWDC 2021 event, with a developer beta released on the same day and a public beta released a few weeks later, at the end of June 2021. All devices that supported iOS 13, iPadOS 13, iOS 14, and iPadOS 14 also support iOS 15 and iPadOS 15. However, iOS 15 and iPadOS 15 have limited support on devices with A8, A8X, A9, A9X, A10 Fusion, A10X Fusion, and A11 Bionic chips, which include iPhone 6S, iPhone 7, iPhone 8, iPhone X, iPhone SE (1st generation), iPod Touch (7th generation), iPad (5th generation), iPad (6th generation), iPad (7th generation), iPad Air 2, iPad Mini 4, iPad Pro (1st generation), and iPad Pro (2nd generation). 
Notable software bugs and issues in iOS Device codes iPhone iPhone (1st generation) iPhone 3G iPhone 3GS iPhone 4 (GSM version) iPhone 4 (CDMA version) iPhone 4S iPhone 5 iPhone 5C iPhone 5S iPhone 6 iPhone 6 Plus iPhone 6S iPhone 6S Plus iPhone SE (1st generation) iPhone 7 with Intel PMB9943 modem iPhone 7 with Qualcomm MDM9645 modem iPhone 7 Plus with Intel PMB9943 modem iPhone 7 Plus with Qualcomm MDM9645 modem iPhone 8 with Intel PMB9948 modem iPhone 8 with Qualcomm MDM9655 modem iPhone 8 Plus with Intel PMB9948 modem iPhone 8 Plus with Qualcomm MDM9655 modem iPhone X with Intel PMB9948 modem iPhone X with Qualcomm MDM9655 modem iPhone XS iPhone XS Max iPhone XR iPhone 11 iPhone 11 Pro iPhone 11 Pro Max iPhone SE (2nd generation) iPhone 12 iPhone 12 Mini iPhone 12 Pro iPhone 12 Pro Max iPhone 13 iPhone 13 Mini iPhone 13 Pro iPhone 13 Pro Max iPad iPad (1st generation) Wi-Fi only iPad (1st generation) Wi-Fi+3G iPad 2 Wi-Fi-only iPad 2 Wi-Fi+3G GSM iPad 2 Wi-Fi+3G CDMA iPad 3 Wi-Fi-only iPad 3rd generation Wi-Fi+4G (LTE) (AT&T/global version) iPad 3rd generation Wi-Fi+4G (LTE) (Verizon version) iPad 4th generation Wi-Fi-only iPad 4th generation Wi-Fi+4G (LTE) (AT&T/global version) iPad 4th generation Wi-Fi+4G (LTE) (Verizon version) iPad Air Wi-Fi-only iPad Air Wi-Fi+4G (LTE) (AT&T/global version) iPad Air Wi-Fi+4G (LTE) (Verizon version) iPad Air 2 Wi-Fi-only iPad Air 2 Wi-Fi+4G (LTE) (AT&T/global version) iPad (2017) Wi-Fi-only iPad (2017) Wi-Fi+4G (LTE) (AT&T/global version) iPad (2018) Wi-Fi-only iPad (2018) Wi-Fi+4G (LTE) (AT&T/global version) iPad Air (2019) Wi-Fi-only iPad Air (2019) Wi-Fi+4G (LTE) (AT&T/global version) iPad (2019) Wi-Fi-only iPad (2019) Wi-Fi+4G (LTE) (AT&T/global version) iPad Air (2020) Wi-Fi-only iPad Air (2020) Wi-Fi+4G (LTE) (AT&T/global version) iPad (2020) Wi-Fi-only iPad (2020) Wi-Fi+4G (LTE) (AT&T/global version) iPad Mini iPad Mini (1st generation) Wi-Fi-only iPad Mini (1st generation) Wi-Fi+4G (LTE) iPad Mini 2 Wi-Fi-only iPad 
Mini 2 Wi-Fi+4G (LTE) iPad Mini 3 Wi-Fi-only iPad Mini 3 Wi-Fi+4G (LTE) iPad Mini 4 Wi-Fi-only iPad Mini 4 Wi-Fi+4G (LTE) iPad Mini (5th generation) Wi-Fi-only iPad Mini (5th generation) Wi-Fi+4G (LTE) iPad Mini (6th generation) Wi-Fi-only iPad Mini (6th generation) Wi-Fi+4G (LTE) iPad Pro iPad Pro (1st generation (12.9-inch)) Wi-Fi-only iPad Pro (1st generation (12.9-inch)) Wi-Fi+4G (LTE) iPad Pro (9.7-inch) Wi-Fi-only iPad Pro (9.7-inch) Wi-Fi+4G (LTE) iPad Pro (2nd generation (12.9-inch)) Wi-Fi-only iPad Pro (2nd generation (12.9-inch)) Wi-Fi+4G (LTE) iPad Pro (10.5-inch) Wi-Fi-only iPad Pro (10.5-inch) Wi-Fi+4G (LTE) iPad Pro (3rd generation (12.9-inch)) Wi-Fi-only iPad Pro (3rd generation (12.9-inch)) Wi-Fi+4G (LTE) iPad Pro (1st generation (11-inch)) Wi-Fi-only iPad Pro (1st generation (11-inch)) Wi-Fi+4G (LTE) iPad Pro (4th generation (12.9-inch)) Wi-Fi-only iPad Pro (4th generation (12.9-inch)) Wi-Fi+4G (LTE) iPad Pro (2nd generation (11-inch)) Wi-Fi-only iPad Pro (2nd generation (11-inch)) Wi-Fi+4G (LTE) iPad Pro (5th generation (12.9-inch)) Wi-Fi-only iPad Pro (5th generation (12.9-inch)) Wi-Fi+4G (LTE) iPad Pro (3rd generation (11-inch)) Wi-Fi-only iPad Pro (3rd generation (11-inch)) Wi-Fi+4G (LTE) iPod Touch iPod Touch (1st generation) iPod Touch (2nd generation) iPod Touch (3rd generation) iPod Touch (4th generation) iPod Touch (5th generation) iPod Touch (6th generation) iPod Touch (7th generation) HomePod HomePod (1st generation) HomePod Mini (1st generation) See also Information on other operating systems developed by Apple Inc. macOS iPadOS watchOS tvOS Version histories for other operating systems and products by Apple Inc. macOS version history iTunes version history Safari version history Version histories for other mobile operating systems Android version history References External links – official site Lists of operating systems Software version histories Tablet operating systems
4304957
https://en.wikipedia.org/wiki/List%20of%20architecture%20schools
List of architecture schools
This is a list of architecture schools at colleges and universities around the world. An architecture school (also known as a school of architecture or college of architecture) is an institution specializing in architectural education.

Africa

Algeria
Département d'architecture de l'université Benyoucef Benkhedda, Algiers
Département d'architecture de l'université Amar Telidji de Laghouat
Département d'architecture de l'université L'arbi Ben Mhidi, Oum El Bouaghi
Département d'architecture de l'université Med Khieder de Biskra
Département d'architecture de Sétif
Département d'architecture du centre universitaire L'Arbi Tbessi de Tébessa
École Polytechnique d'Architecture et d'Urbanisme (EPAU) d'Alger
Institut d'architecture de Batna
Institut d'architecture et d'urbanisme de l'université Saad Dahleb, Blida
Institut d'architecture de Mostaganem
Institut d'architecture de Tizi-Ouzou
Institut d'architecture de Tlemcen (Université d'Abou Bakr Belkaid, faculté des sciences de l'ingénieur), Tlemcen
Institut d'Architecture et d'Urbanisme (IAUC) de Constantine

Cameroon
École Supérieure Spéciale d'Architecture du Cameroun (ESSACA), Yaoundé

Ghana
Department of Architecture, Faculty of Art and Built Environment, Kwame Nkrumah University of Science and Technology, Kumasi
Department of Architecture, Faculty of Art and Built Engineering, University of Ghana, Accra

Egypt
Department of Architecture of the Faculty of Engineering, Al Azhar University, Cairo and Qena
Department of Architecture, German University in Cairo (GUC), Cairo
Department of Architecture of the Faculty of Engineering (HTI), 10th of Ramadan City
Department of Architecture of the Faculty of Engineering, Ain Shams University, Abbaseya, Cairo
Department of Architecture of the Faculty of Engineering, Alexandria University, Alexandria
Department of Architecture of the Faculty of Engineering, Arab Academy for Science and Technology and Maritime Transport, Cairo and Alexandria
Department of Architecture of the Faculty of Engineering, Suez Canal University, Ismailia
Department of Architecture of the Faculty of Engineering, Assiut University, Assiut
Department of Architecture of the Faculty of Engineering, Behira Higher Institute (BHI), Kilo 47, Alexandria–Cairo Desert Road
Department of Architecture of the Faculty of Engineering, Benha University, Shoubra
Department of Architecture of the Faculty of Engineering, Cairo Higher Institute (CHI), New Cairo, Cairo
Department of Architecture of the Faculty of Engineering, Cairo University, Giza
Department of Architecture of the Faculty of Engineering, El Shorouk Academy, El Shorouk, Cairo
Department of Architecture of the Faculty of Engineering, Helwan University, Mattareya, Cairo
Department of Architecture of the Faculty of Engineering, Higher Technological Institute (HTI), 10th of Ramadan City
Department of Architecture of the Faculty of Engineering, Mansoura University, Mansoura
Department of Architecture of the Faculty of Engineering, Menoufia University, Menoufia
Department of Architecture of the Faculty of Engineering, Misr International University, Cairo
Department of Architecture of the Faculty of Engineering, Misr University for Science and Technology (MUST), 6 October City
Department of Architecture of the Faculty of Engineering, Modern Academy for Engineering and Technology, Maadi
Department of Architecture of the Faculty of Engineering, MSA University, 6 October City, Cairo
Department of Architecture of the Faculty of Engineering, Pharos University in Alexandria, Alexandria
Department of Architecture of the Faculty of Engineering, Tanta University, Tanta
Department of Architecture of the Faculty of Engineering, Zagazig University, Zagazig
Department of Architecture of the Faculty of Fine Arts, Alexandria University, Alexandria
Department of Architecture of the Faculty of Fine Arts, Helwan University, Zamalek, Cairo
Department of Architecture of the Faculty of Fine Arts, Minia University, Minia
Department of Architecture, School of Engineering, Université Française d'Égypte, Cairo
Department of Architecture, School of Engineering, Future University in Egypt (FUE), Cairo
Department of Construction and Architectural Engineering, American University in Cairo (AUC), Kattameya, Helwan
Department of Architecture of the Faculty of Engineering, British University in Egypt, Sherouq

Kenya
Jomo Kenyatta University of Agriculture and Technology, School of Architecture and Building Sciences, Department of Architecture
Technical University of Kenya, Faculty of Engineering and the Built Environment, Department of Architecture and Environmental Design
Technical University of Mombasa, Faculty of Engineering and Technology, Department of Building and Civil Engineering
University of Nairobi, Faculty of Architecture and Design, Department of Architecture

Libya
Benghazi University, Faculty of Engineering, Department of Architecture and Urban Planning, Benghazi
Omar Al-Mukhtar University, Faculty of Art and Architecture, Department of Architecture and Planning, Al Fatah branch, Derna
University of Tripoli, School of Applied Sciences, Department of Architecture and Civil Engineering, Tripoli

Mauritius
École nationale supérieure d'architecture de Nantes Mauritius (ENSA Nantes Mauritius), UNICITI, Quatre Bornes
UCSI University / Jr School, Port Louis

Morocco
École d'Architecture de Casablanca, Casablanca
École Nationale d'Architecture, Rabat
École Supérieure de Design et des Arts Visuels Casablanca, Casablanca

Nigeria

Universities
Abia State University, Uturu, Abia State
Abubakar Tafawa Balewa University, Bauchi State
Ahmadu Bello University, Zaria, Kaduna State
Ambrose Alli University, Ekpoma, Edo State
Anambra State University of Science and Technology, Anambra State
Bells University of Technology, Ota, Ogun State
Crescent University, Abeokuta, Ogun State
Covenant University, Ota, Ogun State
Cross River University of Technology, Cross River State
Enugu State University of Science and Technology, Enugu State
Federal University of Agriculture Makurdi, Benue State
Federal University of Technology, Akure, Ondo State
Federal University of Technology Minna, Niger State
Federal University of Technology Yola, Adamawa State
Imo State University, Imo State
Joseph Ayo Babalola University, Ikeji-Arakeji, Osun State
Kano University of Science and Technology, Kano State
Ladoke Akintola University of Technology, Ogbomosho, Oyo State
Nnamdi Azikiwe University, Awka, Anambra State
Obafemi Awolowo University, Ile-Ife, Osun State
Olabisi Onabanjo University, Ago-Iwoye, Ogun State
Oluwatomisin Olamide Adeshiyan University, Lekki, Lagos State
Rivers State University of Science and Technology, Nkpolu, Rivers State
University of Benin, Benin City, Edo State
University of Jos, Jos, Plateau State
University of Lagos, Akoka, Lagos State
University of Nigeria, Nsukka, Enugu State
University of Uyo, Uyo, Akwa Ibom State

Polytechnics
Abia State Polytechnic, Aba, Abia State
Crown Polytechnic, Odo, Ekiti State
Federal Polytechnic Ado-Ekiti, Ekiti State
Federal Polytechnic Bida, Niger State
Federal Polytechnic Ilaro, Ogun State
Federal Polytechnic Nekede, Owerri, Imo State
Federal Polytechnic Oko, Anambra State
Kogi State Polytechnic, Kogi State
Kwara State Polytechnic, Ilorin, Kwara State
Osun State Polytechnic, Iree, Osun State
Lagos State Polytechnic, Ikorodu, Lagos State
Moshood Abiola Polytechnic, Abeokuta, Ogun State
The Oke-Ogun Polytechnic, Saki, Oyo State
The Polytechnic, Ibadan, Oyo State
Yaba College of Technology, Lagos State

Institute
Ogun State Institute of Technology (formerly Gateway Polytechnic), Igbesa, Ogun State

South Africa

Sudan
Faculty of Architecture, College of Engineering, University of Khartoum, Khartoum
Faculty of Architecture, Autocratic University, Khartoum
Faculty of Architecture, National Ribat University, Khartoum
College of Engineering Sciences, Department of Architecture and Planning, Omdurman Islamic University
Faculty of Architecture, Future University Khartoum, Khartoum
Department of Architecture, University of Science and Technology, Khartoum
Faculty of Architecture, National University, Khartoum
Faculty of Architecture, University of Medical Science and Technology, Khartoum
Faculty of Architecture, AlMughtaribeen University, Khartoum
Faculty of Architecture, Department of Architecture and Engineering, The University of Bahri, Khartoum
Faculty of Architecture, Department of Engineering, Sudan International University, Khartoum
Faculty of Architecture, University of Garden City, Khartoum
Faculty of Architecture, Sharg El Neil College, Khartoum

Tanzania
Ardhi University
University of Dar es Salaam
Mbeya University of Science and Technology

Togo
École Africaine des Métiers d'Architecture et d'Urbanisme (EAMAU), Lomé

Tunisia
Department of Architecture

Uganda
College of Engineering, Design, Art and Technology, Makerere University, Kampala
Faculty of the Built Environment, Uganda Martyrs University, Nkozi
Faculty of Lands and Architecture, Kyambogo University

Zimbabwe
Faculty of the Built Environment, National University of Science and Technology, Bulawayo

Asia

Bangladesh
Bangabandhu Sheikh Mujibur Rahman Science and Technology University, Department of Architecture, Gopalganj
Bangladesh University of Engineering & Technology, Department of Architecture, Faculty of Architecture and Planning, Dhaka
Chittagong University of Engineering and Technology, Department of Architecture, Faculty of Architecture & Planning, Chittagong
Dhaka University of Engineering & Technology, Department of Architecture, Gazipur
Hajee Mohammad Danesh Science & Technology University, Department of Architecture, Dinajpur
Khulna University of Engineering & Technology, Department of Architecture, Faculty of Architecture & Planning, Khulna
Khulna University, Discipline of Architecture, School of Science, Engineering & Technology, Khulna
Military Institute of Science and Technology, Department of Architecture, Dhaka
Pabna University of Science & Technology, Department of Architecture, Pabna
Rajshahi University of Engineering and Technology, Department of Architecture, Rajshahi
Shahjalal University of Science and Technology, Department of Architecture, School of Applied Sciences & Technology, Sylhet
Ahsanullah University of Science and Technology, Department of Architecture, Dhaka
American International University Bangladesh, Department of Architecture, Dhaka
Bangladesh University, Department of Architecture, Dhaka
Brac University, Department of Architecture, Dhaka
Daffodil International University, Department of Architecture, Dhaka
Leading University, Department of Architecture, Sylhet
North South University, Department of Architecture, Dhaka
Premier University, Chittagong, Department of Architecture, Chittagong
Primeasia University, Department of Architecture, Dhaka
Shanto-Mariam University of Creative Technology, Department of Architecture, Dhaka
Sonargaon University, Department of Architecture, Dhaka
Southeast University, Department of Architecture, Dhaka
Stamford University, Department of Architecture, Dhaka
State University of Bangladesh, Department of Architecture, Dhaka
University of Asia Pacific, Department of Architecture, Dhaka

China
Chongqing University, Architecture and Construction College, Shapingba, Chongqing
Fuzhou University, College of Architecture, Fuzhou, Fujian Province
Harbin Institute of Technology (HIT), School of Architecture, Harbin, Heilongjiang
Huazhong University of Science and Technology, College of Architecture and Urban Planning, Wuhan, Hubei Province
South China University of Technology (SCUT), School of Architecture and Civil Engineering, Guangzhou, Guangdong Province
Southeast University (SEU), School of Architecture, Nanjing, Jiangsu Province
Suzhou University of Science and Technology, School of Architecture and Urban Planning, Suzhou, Jiangsu
Tianjin University (TJU), School of Architecture, Tianjin
Tongji University, College of Architecture and Urban Planning, Shanghai
Tsinghua University (THU), School of Architecture, Department of Architecture, Beijing
Xi'an Jiaotong-Liverpool University (XJTLU), Department of Architecture (XLarch), Suzhou, Jiangsu
Xi'an Jiaotong University, Department of Architecture, Xi'an, Shaanxi
Xi'an University of Architecture and Technology, School of Architecture, Xi'an, Shaanxi
Zhejiang University, College of Civil Engineering and Architecture, Hangzhou, Zhejiang Province
Zhengzhou University, School of Architecture, Zhengzhou
Zhejiang University of Science and Technology, School of Civil Engineering and Architecture, Hangzhou
Inner Mongolia University of Technology, School of Architecture, Hohhot

Hong Kong
These are the only two universities in Hong Kong offering an accredited Master of Architecture for professional registration as an architect.
The Chinese University of Hong Kong, School of Architecture, Hong Kong, founded in 1992
The University of Hong Kong, Faculty of Architecture, Department of Architecture, Hong Kong, founded in 1950

India
School of Planning and Architecture, Delhi
National Institute of Technology, Jaipur, Rajasthan
National Institute of Technology Trichy, Tamil Nadu
Chandigarh College of Architecture, Chandigarh
Bharati Vidyapeeth, Pune
Department of Architecture, Central University of Rajasthan, Ajmer
Department of Planning & Architecture, Malaviya National Institute of Technology, Jaipur
Department of Architecture, College of Engineering, Trivandrum, Kerala
Department of Architecture, Maulana Azad National Institute of Technology, Bhopal
Faculty of Architecture, PES University, Bangalore
Noble Architecture College, Junagadh, Gujarat
Ernad Knowledge City College of Architecture, Malappuram, Kerala
East West School of Architecture, Bangalore
Aayojan School of Architecture, Jaipur
Academy of Architecture, Mumbai
Academy of Architecture, Rachana Sansad, Mumbai
Aditya College of Architecture, Mumbai
Allana College of Architecture, Pune
Amity School of Architecture and Planning, Lucknow
Asian School of Architecture and Design Innovations, Kochi
Axis Institute of Architecture, Kanpur
Anna University School of Architecture and Planning, Chennai
APEEJAY School of Architecture and Planning, Greater Noida
B. V. Bhoomaraddi College of Engineering and Technology, Department of Architecture, Hubli, Karnataka
Balwant Sheth School of Architecture, NMIMS University, Mumbai
BGS School of Architecture and Planning, Bangalore
Bharath Architecture Department, Selaiyur, Chennai
BKPS College of Architecture (BKPS), Pune, Maharashtra
C.A.R.E. School of Architecture, Tiruchirapalli
College of Architecture, Trivandrum (CAT)
College of Engineering & Technology, Bhubaneswar, Orissa
CSIIT School of Planning and Architecture, Secunderabad
Cummins College, Pune
D.C. Patel School of Architecture, Arvindbhai Patel Institute of Environmental Design, Sardar Patel University, Anand
Dark School of Architecture
Dayanand Sagar Institutes, School of Architecture, Bangalore (Chiranjith)
Dehradun Institute of Technology, Dehradun, Uttarakhand
Department of Architecture, Bengal Engineering and Science University, Shibpur, West Bengal
Department of Architecture, Birla Institute of Technology, Mesra, Ranchi, Jharkhand
Department of Architecture, BMSCE, Bangalore
Department of Architecture, Deenbandhu Chhotu Ram University of Science and Technology, Murthal, Sonepat
Department of Architecture, Jadavpur University, Kolkata
Department of Architecture, Madhav Institute of Technology and Science, Gwalior
Department of Architecture, Maharaja Sayajirao University, Baroda, Gujarat
Department of Architecture, Veer Surendra Sai University of Technology, Burla
Department of Architecture, Integral University, Lucknow
Department of Architecture, National Institute of Technology, Hamirpur, Himachal Pradesh
Department of Architecture, National Institute of Technology, Patna, Bihar
Department of Architecture, National Institute of Technology Trichy
Department of Architecture, School of Planning and Architecture, Bhopal
Department of Architecture, TKM College of Engineering, Kollam, Kerala
Department of Architecture, ZHCET, Aligarh Muslim University, Aligarh
Dignity College of Architecture, Durg
Dr. Bhanubhen Nanavati College of Architecture for Women, Pune
Dr. D.Y. Patil College of Architecture, Nerul, Navi Mumbai
Faculty of Architecture and Ekistics, Jamia Millia Islamia, New Delhi
Faculty of Architecture, CEPT University, Ahmedabad
Faculty of Architecture, Gautam Buddha Technical University, Lucknow
Faculty of Design, School of Interior Design, CEPT University, Ahmedabad
Sir J.J. College of Architecture, University of Mumbai, Mumbai
Hindustan University, Chennai, Tamil Nadu
Holy Crescent College of Architecture, Alwaye, Kerala
Indian Institute of Technology Kharagpur, West Bengal
Indian Institute of Technology Roorkee, Uttarakhand
Indubhai Parekh School of Architecture, Rajkot, Gujarat
IES College of Architecture, Mumbai
Jawaharlal Nehru Architecture and Fine Arts University, Hyderabad
Kalasalingam School of Architecture, Kalasalingam University, Krishnankoil, Virudhunagar District, Tamil Nadu
K S School of Architecture, Bangalore
K.R. Mangalam School of Architecture & Planning, New Delhi
Kamla Raheja Vidyanidhi Institute for Architecture and Environmental Studies, Mumbai
Lokmanya Institute of Architecture and Design Studies, Mumbai
Lovely School of Architecture and Design (LSAD), Lovely Professional University, Punjab
L.S. Raheja School of Architecture, Mumbai
M.E.S. School of Architecture, MES College of Engineering, Kuttipuram, Kerala
M.S. Ramaiah School of Architecture, Bangalore
Manipal University, Manipal
MARG Institute of Design and Architecture, Swarnabhoomi
Marian College of Architecture and Planning, Kazhakkuttom, Trivandrum
Masterji College of Architecture, Vattinagulapally, Hyderabad
MBS School of Planning and Architecture, Dwarka Sector 9, New Delhi
McGan's Ooty School of Architecture, Ooty, Tamil Nadu
MEASI Academy of Architecture, Chennai, Tamil Nadu
National Institute of Technology Calicut
National Institute of Technology Raipur, Chhattisgarh
Om Institute of Architecture and Design, Hisar, Haryana
Papni School of Architecture, Sriperumbudur Taluk, Kanchipuram District, Tamil Nadu
Pillai's College of Architecture, New Panvel, Navi Mumbai
Piloo Modi College of Architecture, ABIT, Cuttack, Orissa
Prahar School of Architecture (PSA), Coimbatore, Tamil Nadu
Purvanchal Institute of Architecture & Design, Gorakhpur, U.P.
R.V. College of Architecture, Bengaluru
Rajalakshmi School of Architecture, Rajalakshmi Engineering College, Thandalam, Chennai, Tamil Nadu
Rizvi College of Architecture, Mumbai
Raffles Design International, Mumbai
Sardar Vallabhbhai Patel Institute of Technology (SVIT), Vasad
Sasi Creative School of Architecture, Coimbatore
Sathyabama University, Chennai
School of Architecture, Dr. M.G.R. Educational and Research Institute University, Chennai
School of Architecture, Government College of Engineering, Trichur, Kerala
School of Architecture, Meenakshi College of Engineering (SOA, MCE), Chennai
School of Architecture, Vadodara Design Academy, Vadodara
School of Architecture & Design, Faculty of Design, Manipal University Jaipur, Jaipur
School of Architecture and Landscape Design, Shri Mata Vaishno Devi University, Jammu and Kashmir
School of Planning and Architecture, Bhopal
School of Architecture and Planning, Periyar Maniammai University, Thanjavur, Tamil Nadu
School of Planning and Architecture, Vijayawada
SCMS School of Architecture (SSA), Cochin, Kerala
Sinhgad College of Architecture (SCOA), Pune
Smt. Premalatai Chavan College of Architecture, Karad, Maharashtra
Sri Venkateshwara College of Architecture, Hyderabad
SRM School of Architecture, Chennai
Sunder Deep College of Architecture, Ghaziabad
Surya School of Architecture (SSA), Chandigarh
Sushant School of Art and Architecture, Gurgaon
TVB School of Habitat Studies, Vasant Kunj, New Delhi
University Institute of Architecture (UIA), Chandigarh University, Mohali
University School of Architecture and Planning (USAP), GGSIPU, Delhi
The University School of Design, Manasagangothri, Mysore
Vastu Kala Academy College of Architecture, New Delhi
Vidya Pratishthan's School of Architecture (VPSOA), Baramati, Pune
Visvesvaraya National Institute of Technology, Nagpur, Maharashtra
Viva School of Architecture, Virar (East), Thane, Maharashtra
School of Architecture (VSPARC), VIT University, Vellore

Indonesia
Atma Jaya University, Yogyakarta
Bandung Institute of Technology, Bandung
Binus University, Jakarta
Brawijaya University, Malang
Diponegoro University, Semarang
Gadjah Mada University, Yogyakarta
Hasanuddin University, Makassar
Indonesia University of Education, Bandung
Lambung Mangkurat University, Banjarmasin
Palangkaraya University, Palangka Raya
Pancasila University, Jakarta
Parahyangan Catholic University, Bandung
Sam Ratulangi University, Manado
Sebelas Maret University, Surakarta
Sepuluh November Institute of Technology, Surabaya
Soegijapranata Catholic University, Semarang
Tadulako University, Palu
Tanjung Pura University, Pontianak
Tanri Abeng University, Jakarta
Tarumanagara University, Jakarta
Trisakti University, Jakarta
Universitas Islam Indonesia, Yogyakarta
University of Indonesia, Jakarta
University of North Sumatra, Sumatera Utara
Udayana University, Denpasar, Bali
Universitas Islam Negeri Maulana Malik Ibrahim, Malang, Jawa Timur
Mercu Buana University, Meruya, Jakarta

Iran
Tarbiat Modares University, Tehran
Art & Architecture University, Department of Architecture
Azad Bam University
Azad Kerman University
Azad Mashad University, Art & Architecture University, Department of Architecture, Mashad
Azad Shiraz University
Azad Tehran University, Art & Architecture University, Department of Architecture, Tehran
Azad University, South Tehran Branch
Eshragh University, Faculty of Architecture, Department of Architecture, Bojnourd
Hafez Higher Education Institute of Shiraz
Imam Khomeini International University, Architecture and Urbanism, Qazvin
Iran University of Science and Technology (IUST), Faculty of Architecture, Department of Architecture and Urban Planning, Tehran
Islamic Azad University (Anar branch), Department of Architecture
Islamic Azad University, Mashhad
Islamic Azad University of Khorasgan (Isfahan)
Islamic Azad University of Noor
Islamic Azad University of Qazvin (QIAU)
Islamic Azad University of Bandar Anzali (IAU Anzali)
Islamic Azad University of Shahrood
Islamic Azad University of Zanjan (AZU)
Islamic Azad University, South Tehran Branch
Persian Gulf University of Bushehr
Shahid Bahonar Technical College of Shiraz
Shahrood University of Technology, Faculty of Architectural Engineering and Urbanism
Shiraz University, Faculty of Art and Architecture, Shiraz
Shahid Chamran University of Ahvaz
Shahid Rajaee University
Tabriz Islamic Art University (TIAU), Faculty of Art & Architecture, Tabriz
Tarbiat Modares University, Faculty of Art and Architecture, Department of Architecture, Tehran
Technical and Vocational University
University of Art, Faculty of Urbanism and Architecture, Department of Architecture
University of Guilan, Architecture, Urbanism, Urban Design and Planning
University of Isfahan (Isfahan Art), Architecture, Urbanism, Urban Design and Planning
University of Mazandaran, Faculty of Art and Architecture, Department of Architecture, Babolsar
University of Science and Culture (USC), Faculty of Art and Architecture
University of Tehran, Faculty of Fine Arts, Department of Architecture, Tehran
University of Urmia, Faculty of Art and Architecture, Department of Architecture
Yazd University, Faculty of Art and Architecture, Yazd

Iraq
Bayan University, Department of Architecture Engineering, Erbil, Kurdistan Region
Cihan University, Department of Architecture Engineering, Erbil, Kurdistan Region
Ishik University, Architecture Department, Erbil, Kurdistan Region
Koya University, Department of Architectural Engineering, Koya, Erbil, Kurdistan Region
Nahrain University, Department of Architectural Engineering, Baghdad
Nawroz University, Department of Architecture, Dohuk, Kurdistan Region
Salahaddin University, Architectural Engineering Department, Erbil, Kurdistan Region
University of Babylon, College of Engineering, Babylon
University of Baghdad, Architecture Engineering Department, Baghdad
University of Basrah, Architectural Department, Basra
University of Duhok, School of Engineering, Department of Architecture, Duhok, Kurdistan Region
University of Mosul, Department of Architectural Engineering, Mosul
University of Sulaymaniyah, Architectural Engineering, Sulaymaniyah, Kurdistan Region
University of Technology, Department of Architectural Engineering, Baghdad
University of Wasit, Architecture Engineering Department, Wasit
Uruk University College, Architecture Department, Baghdad

Israel
Ariel University, School of Architecture, Ariel
Bezalel Academy of Art and Design, Department of Architecture, Jerusalem
Technion – Israel Institute of Technology, Faculty of Architecture and Town Planning, Haifa
Tel Aviv University, Yolanda and David Katz Faculty of the Arts, David Azrieli School of Architecture, Tel Aviv
WIZO School of Design, The Neri Bloomfield School of Design and Education, Haifa

Japan
Universities and junior colleges in Japan with a Department of Architecture.
National universities Chiba University Faculty of Urban Environmental Systems, Department of Environmental Planning, Department of Architecture Hiroshima University Faculty of Architecture (Program of the four species) Hokkaido University Faculty of Engineering, Department of Environmental and Social System Architecture Kagoshima University Faculty of Engineering, Department of Architecture Kobe University Faculty of Engineering, Department of Architecture Kumamoto University Faculty of Engineering, Department of Architecture Kyoto Institute of Technology, Science part craft engineering modeling region, Kyoto Kyoto University, Faculty of Engineering, Department of Architecture and Architectural Systems, Kyoto Kyushu Institute of Technology Faculty of Engineering, Department of Social architecture construction course Kyushu University Faculty of Engineering, Department of Architecture / Faculty of Arts (formerly Kyushu Institute of Design), Department of Environmental Design Mie University Faculty of Engineering, Department of Architecture Muroran Institute of Technology Faculty of Engineering, Department of Systems Engineering construction, Division of Architecture Nagasaki University Faculty of Engineering Department of the structure Nagoya Institute of Technology Faculty of Engineering, Architecture & Design Department Nagoya University Faculty of Engineering, Department of Social and Environmental Studies Course Architecture Nara Women's University Faculty of Environmental Studies life living environment, Department of Housing course National University Corporation Tsukuba University of Technology Department of Architecture hearing section Niigata University Faculty of Engineering Department of Construction of Architecture Oita University Faculty of Engineering, Department of Environmental Engineering course architecture welfare Osaka University Faculty of Engineering, Architectural Engineering and Environmental Engineering, Department of General In addition to 
Earth Saga University Faculty of Science and Technology of Urban Engineering, Graduate School of Architecture and Urban Design Course Shimane University Faculty of General Science and Technology Department of Materials and Process Shinshu University Faculty of Engineering, Department of Architecture Tohoku University Faculty of Engineering, Department of Architecture Sendai Tokyo Institute of Technology (TokyoTech), Faculty of Engineering, Department of Architecture Tokyo Tokyo National University of Fine Arts Faculty of Fine Arts Department of Architecture Toyohashi University of Technology Faculty of Engineering Department of Construction University of Fukui, Faculty of Engineering, Department of Building & Construction University of the Ryukyus Faculty of Engineering, Department of Architecture course construction environment University of Tsukuba Science and Technology group (or social engineering <Urban Planning> College of Engineering Systems <Engineering Environment and Development>), main major design group specializing in art school <architectural design, environmental design> University of Tokyo, Faculty of Engineering, Department of Architecture, Department of Urban Engineering Tokyo University of Toyama Faculty of Arts and Culture, Department of Arts and Design building science course Utsunomiya University Faculty of Engineering Department of Construction of Architecture course Wakayama University Faculty of Systems Engineering, Department of Environmental Systems Yamaguchi University Faculty of Engineering, Department of Kansei Engineering Design Course of the human space Yokohama National University Faculty of Engineering, Department of Construction of Architecture Public universities Akita Municipal Junior College of Arts and Crafts Department of Industrial Design Akita Prefectural University Faculty of Systems Science and Technology Department of Architecture and Environment Systems Gifu City Women's College Interior Design Course, Department of 
Kansei architecture design vocational life design Iwate Prefectural University College of Life Science, Department of Life Science Kanazawa College of Art Faculty of Arts and Crafts Design Department of Environmental Design course Kyoto Prefectural University Faculty of Life and Environmental Department of Environmental Design, Landscape Design, Architecture, Living Environment course and life course Maebashi Institute of Technology Faculty of Engineering, Department of Architecture (Day), Department of Integrated Design (Night) Miyagi University Faculty of Project, Department of Design and Information Design, Course of Space Design Nagoya City University Faculty of Arts and city Department of Environmental Design Okayama Prefectural University School of Design Department of Design Osaka City University Faculty of Engineering, Department of Architecture, Environmental Urban Engineering, Graduate School, Life Sciences Division, Department of the living environment Prefectural University of Kumamoto, Faculty of Symbiotic Science Course, Department of Environmental Symbiosis residential environment Sapporo City University Faculty of Design Department of Design, Space Design Course Tokyo Metropolitan University Faculty of Urban Environment, Department of Architecture and Urban urban environment course University of Hyogo (Hyogo Prefectural University) Faculty of Human Environment Department of the human environment (course plan living space and conservation course creative living environment course environmental analysis) University of Kitakyushu Faculty of International Environmental Engineering, Department of Environmental Space Design KitaKyushu The University of Shiga Prefecture Environmental Sciences Department of Environmental Design, Department of Private universities Hokkaido and Tohoku Dohto University Faculty of Fine Arts Department of Architecture Hachinohe Institute of Technology Faculty of Engineering, Department of Architectural Engineering, Faculty of 
Design, Department of ansei Design, Course of Housing design sensibility Hokkai Gakuen University Faculty of Engineering, Department of Architecture Hokkaido Institute of Technology Faculty of creation space architecture department (from 2008) Koriyama Women's University Faculty of Human Sciences and Design, Department of Human Life Tohoku Bunka Gakuen University Faculty of Science and Technology (living from Department of Environmental Planning Department of Environmental Design, human, 2008 Department of Environmental Design ) Tohoku Institute of Technology Faculty of Engineering, Department of Architecture Tohoku University of Art and Design Faculty of Engineering and Design, Department of Environmental Design Tokai University (Asahikawa Campus, formerly Hokkaido Tokai University), Faculty of Engineering, Department of Architecture and Environment Design, Arts course architecture Town Planning course, Sapporo Kanto Ashikaga Institute of Technology Faculty of Engineering, Department of Architecture Bunka Women's University Faculty of Art, Department of Housing Architectural Design Course Chiba Institute of Technology Faculty of Engineering, Department of Architecture and Urban Environment Hosei University Faculty of Engineering, Department of Architecture Institute of Technologists (Monotsukuri University) Faculty of craft skill, Department of Construction course cities and architecture course architecture and interior design course wooden building finishing course Japan Women's University Faculty of Human Sciences and Design dwelling Department of Architecture and Environment Design Course Jissen Women's University Faculty of living environment architecture design course Kanagawa University Faculty of Engineering, Department of Architecture Kanto Gakuin University Faculty of Engineering, Department of Architecture/Faculty of Human Environment Department of Environmental Design Keio University Faculty of Science and Technology System Design Engineering (Hiyoshi 
Campus) and the Faculty of Environmental Information, Department of Information Design (Shonan-Fujisawa Campus) Kogakuin University Faculty of Architecture, Town Planning Department (established in April 2011), Department of Architecture, Department of Architecture and Design: restructuring and reorganization to "Faculty of Architecture" in April 2011, Faculty of Engineering, Department of Architecture, Part 1, Department of Architecture and Urban Design Faculty of Engineering, Department of Architecture Part 2 (night) Kokushikan University Faculty of Architecture Department of Design from the Faculty of Science and Technology Department of Science and Engineering, architectural changes to the system Komazawa Women's University Faculty of Humanities, Department of molding space Kyoei Gakuen Junior College Department of Housing Kyoritsu Women's University Faculty of Human Sciences and Design Department of Architecture and Design Meiji University Faculty of Science and Technology Department of Architecture Meikai University Faculty of Real Estate, Department of Real Estate, Environmental Design Course Meisei University Faculty of Science and Technology Department of Architecture Musashino Art University Faculty of Art, Department of Architecture Musashino University Faculty of Environmental Studies, Department of Environment Living Environment course Nihon University College of Bioresource Science, Department of Biological and Environmental (Fujisawa Campus, Kanagawa) College of Engineering, Department of Architecture (Koriyama Campus, Fukushima) College of Fine Arts, Department of Design, Architecture Design Course (Ecoda Campus, Tokyo) College of Production Engineering, Department of Architecture (Mimomi Campus, Chiba) College of Science and Technology, School of Architecture, School of Marine Engineering Building (Tokyo Surugadai Campus and Chiba Funabashi Campus) Junior College Department of Design, Architecture and life (Funabashi Campus, Chiba) Nippon Institute 
of Technology Faculty of Engineering, Department of Architecture Shibaura Institute of Technology Faculty of Engineering, Department of Architecture, Department of Architectural Engineering (Mita Campus, Tokyo) Faculty of Systems Engineering, Department of Environmental Systems (Saitama Campus) Showa Women's University Faculty of Life Sciences Division, Department of Architecture course living environment, the design area, Department of Interior Architecture and Design, College of cultural creation Tama Art University Faculty of Art, Department of Environmental Design Department of Design, Space Design, Faculty of plastic expression field Tokai University Faculty of Engineering, Department of Architecture, Faculty of Engineering and Design Department of Architecture and Design Tokyo City University (formerly Musashi Institute of Technology), Faculty of Engineering, Department of Architecture Tokyo Denki University Faculty of Future Science, Department of Architecture Tokyo Polytechnic University Faculty of Engineering, Department of Architecture Tokyo University of Science Faculty of Engineering, Department of Architecture, Part 1, Faculty of Engineering, Department of Architecture, Part 2 (Iidabashi Campus, Tokyo), Faculty of Science and Technology Department of Architecture (Noda Campus, Chiba) Tokyo Zokei University (Tokyo University of Art and Design) Faculty of Art and Design, Department of Interior Design, interior architecture major area Toyo University Faculty of Engineering, Department of Architecture/Faculty of Life human Design, Department of Environmental Design Waseda University School of Creative Science and Technology, Department of Architecture Chubu Aichi Institute of Technology Faculty of Engineering Course of the Department of Architecture, Urban and Environmental Studies Course of Architecture and Environment Aichi Konan College Department of Life Science, interior and architecture course Aichi Sangyo University Art School Department of 
Architecture, Department of Architecture in Ministry of Education and Communication Aichi Shukutoku University Faculty of Contemporary Social Studies, Department of Urban and Environmental Design, Course of modern society Chubu University Faculty of Engineering, Department of Architecture Daido University Faculty of Engineering, Department of Architecture Fukui University of Technology Faculty of Engineering, Department of Building Construction Gifu Women's University Faculty of Human Sciences and Design Department of Life Sciences Kanazawa Institute of Technology Faculty of Architecture Building and Environment, Department of Architecture and Urban Design Meijo University Faculty of Science and Technology, Department of Architecture Nagaoka Institute of Design Faculty of Art, Department of Architecture and Environmental Design Nagoya Zokei University (Nagoya University of Art and Design) Art College, Department of Design, Space and Architecture Design Course Niigata Institute of Technology Faculty of Engineering, Department of Architecture Shizuoka University of Art and Culture School of Design, Department of Spatial Design Sugiyama Jogakuen University, Faculty of Life Sciences, Department of Living and Environmental Design Kinki BAIKA Women's University Faculty of Modern Human Living, Environment Department Kansai University Faculty of Urban and Environmental Engineering, Department of Architecture Kinki University Faculty of Architecture, Department of Architecture (established in April 2011 through reorganization of the Department of Architecture, Faculty of Science and Technology), Faculty of Engineering, Department of Architecture, Faculty of Industrial Science and Technology, Department of Architecture and Design (Kyushu Campus), Faculty of Literature, Arts and Cultural Studies, Department of Art and Design, spatial design course Kio University Faculty of Health Sciences Department of healthy living, human environmental design course Kobe
Design University School of Design, Department of Architecture & Environmental Kwansei Gakuin University Faculty of Policy Management, urban policy field Kyoto Seika University, Faculty of Design, Department of Architecture, Kyoto Kyoto Tachibana University Faculty of Modern Business, Department of Urban Design, Tourism and Urban design course Interior and Architecture course, Kyoto University of the Arts Faculty of Arts, Department of Environmental Design, Architecture Design Course/Ministry of Education Department of Design, Architecture Design Course Kyoto Women's University Faculty of Human Sciences and Design, Department of formative life Mukogawa Women's University Faculty of living environment, Department of Architecture Osaka Institute of Technology Faculty of Engineering, Department of Architecture, Department of Space Design Osaka Sangyo University Faculty of Engineering, Department of Environmental Design, Architecture, from the Department of Environmental Design (renamed in April 2008) living environment from the Department of Urban and Environmental Department, Faculty of Human Environment (renamed in April 2008) Osaka Shoin Women's University College of liberal arts, Department of Interior Design Osaka University of Arts Faculty of Arts, Department of Architecture Osaka University of Human Sciences Faculty of Human Sciences, Department of Environment and Architectural Design Otemae Junior College Department of Housing Comprehensive System Design Life Ritsumeikan University Faculty of Science and Technology Urban Systems Engineering, Department of Architecture and Urban Design Setsunan University Faculty of Science and Technology, Department of Architecture Takarazuka University of Art and Design School of Art, Department of Industrial Design, Architectural Design Course and Interior Design Course Tezukayama University Faculty of Modern Life Design, Department of Dwelling Space Chugoku, Shikoku and Kyushu Daiichi Institute of Technology (first 
technical university) Faculty of Engineering, Department of Architecture Fukuoka University Faculty of Engineering, Department of Architecture Fukuyama University Faculty of Engineering, Department of Architecture Hiroshima Jogakuin University Life Sciences Faculty, Department of Design and living information Hiroshima Institute of Technology Faculty of Engineering, Department of Architecture, Faculty of Environmental Studies Department of Environmental Design Hiroshima International University Faculty of Design, Department of Housing (Social and Environmental Sciences) Kawasaki University of Medical Welfare Faculty of Management, Department of Design, Health and Welfare Kochi University of Technology Faculty of Social Systems Engineering Kurume Institute of Technology Faculty of Engineering, Department of Building Facilities Kwassui Women's College Faculty of Life Healthy Life, Department of Design Kyushu Kyoritsu University Faculty of Engineering, Department of Architecture Kyushu Sangyo University Faculty of Engineering, Department of Architecture, Department of housing and interior design Kyushu Women's University Faculty of Human Sciences and Design, Department of Human Life Matsuyama Shinonome Junior College Department of Life Sciences School of Life Design Mimasaka University of Life Sciences Faculty, Welfare, Department of Environmental Design course architecture welfare Nagasaki Institute of Applied Science Faculty of Engineering, Department of Architecture Nippon Bunri University Faculty of Engineering, Department of Architecture and Design Nishinippon Institute of Technology School of Design, Department of Architecture Okayama University of Science Faculty of Information Technology, Department of Architecture Sojo University Faculty of Engineering, Department of Architecture Tohwa University Faculty of Environmental Design Engineering course architectural, interior design of residential Tokai University (Kumamoto Campus, formerly Kyushu Tokai 
University), School of Industrial Engineering, Department of Architecture Tokushima Bunri University Faculty of Human Life, Department of Housing Tottori University of Environmental Studies Faculty of Environmental Information, Department of Environmental Design Yasuda Women's University Faculty of Human Sciences and Design, Department of Life Design Jordan Jordan University of Science and Technology, College of Architecture and Design, Ar Ramtha. University of Jordan, Department of Architecture, Amman. Laos National University of Laos, Faculty of Architecture, Vientiane Souphanouvong University, Faculty of Architecture, Luang Prabang (World Heritage city) Lebanon American University of Beirut, Faculty of Architecture, Beirut Beirut Arab University, Faculty of Architecture, Debbieh Lebanese American University, Faculty of Architecture, Beirut Académie libanaise des Beaux-Arts, Faculty of Architecture Lebanese University, Institut des Beaux-Arts, Department of Architecture Notre Dame University–Louaize, Faculty of Architecture, Louaize Université Saint-Esprit de Kaslik, Department of Architecture, Kaslik Université Saint-Joseph, École d'Architecture, Beirut Malaysia ALFA International College, School of Architecture, Subang Jaya, Selangor Infrastructure University Kuala Lumpur (IUKL), School of Architecture and Built Environment, Kajang International Islamic University of Malaysia Kulliyyah (Faculty) of Architecture and Environmental Design, Gombak, Selangor International University College of Technology Twintech, Faculty of Built Environment (FABE), Department of Architecture, Bandar Sri Damansara, Kuala Lumpur Limkokwing University, Faculty of Built Environment (FABE), Department of Architecture, Cyberjaya Linton University College, Department of Architecture, Mantin, Negeri Sembilan MARA University of Technology (UiTM), Faculty of Architecture, Planning and Surveying, Department of Architecture, Shah Alam, Selangor, Sri Iskandar, Perak National
University of Malaysia (UKM), Faculty of Engineering & Built Environment, Department of Architecture, Bangi, Selangor Port Dickson Polytechnic, Architectural Unit, Department of Civil Engineering, Port Dickson, Negeri Sembilan Putra University, Malaysia (UPM), Faculty of Design and Architecture (FRSB), Serdang, Selangor Taylor's University, School of Architecture, Building and Design (SABD), Petaling Jaya Tunku Abdul Rahman University College (TARUC), Faculty of Engineering and Built Environment (FEBE), Kuala Lumpur University College Sedaya International (UCSI) Department of Architecture Cheras, Kuala Lumpur University of Malaya (UM), Faculty of the Built Environment (FBE), Kuala Lumpur University of Science, Malaysia (USM), School Of Housing, Building And Planning (HBP), Architecture Department. Minden, Penang Universiti Teknologi Malaysia (UTM), Faculty of Built Environment (FAB), Department of Architecture, Skudai, Johor Bahru Universiti Tunku Abdul Rahman (UTAR), Lee Kong Chian Faculty of Engineering and Science (LKC FES), Department of Architecture & Sustainable Design, Sungai Long, Selangor Universiti Sains Islam Malaysia (USIM), Faculty of Engineering and Built Environment (FKAB), Department of Architecture, Nilai, Negeri Sembilan Oman Caledonian College of Engineering Higher College of Technology Sultan Qaboos University Pakistan Khyber Pakhtunkhwa CECOS University of Information Technology and Emerging Sciences, Department of Architecture, Peshawar University of Engineering and Technology, Peshawar, Department of Architecture, Abbottabad Punjab Beaconhouse National University Department of Architecture, School of Architecture and Design, Lahore COMSATS Institute of Information Technology (CIIT), Department of Architecture, Islamabad COMSATS Institute of Information Technology (CIIT), Department of Architecture, Lahore National College of Arts, Department of Architecture, Lahore National College of Arts, Department of Architecture, Rawalpindi National 
University of Sciences and Technology, School of Art, Design and Architecture, Islamabad University of Engineering and Technology, Lahore, Department of Architecture University of the Punjab, Department of Architecture, College Of Art and Design, Lahore University of South Asia, Department of Architecture (Accreditation on hold), Lahore University of Lahore, Department of Architecture, Lahore Sindh Dawood University of Engineering and Technology, Department of Architecture, Karachi Indus Valley School of Art and Architecture, Department of Architecture, Karachi Mehran University of Engineering and Technology, Department of Architecture, Centre of Excellence in Art and Design (Accreditation on hold), Jamshoro Mehran University of Engineering and Technology, Department of Architecture, Jamshoro NED University of Engineering and Technology, Department of Architecture, Karachi University of Karachi, Architecture Program at Department of Visual Studies, Karachi Philippines Luzon Adamson University (AdU), College of Architecture Don Honorio Ventura Technological State University (DHVTSU), College of Engineering and Architecture, Bacolor, Pampanga Bulacan State University (BulSU), College of Architecture and Fine Arts, Malolos City Cebu Institute of Technology – University (CIT-U), College of Engineering and Architecture, Cebu City Central Colleges of the Philippines (CCP), College of Architecture De La Salle–College of Saint Benilde (DLS-CSB), College of Architecture De La Salle University–Dasmariñas (DLSU-D), College of Engineering Architecture and Technology, Dasmariñas Eulogio "Amang" Rodriguez Institute of Science and Technology (EARIST), College of Architecture and Fine Arts Far Eastern University (FEU), Institute of Architecture and Fine Arts (IARFA) FEATI University, College of Architecture Manuel L. 
Quezon University (MLQU), School of Architecture Mapúa Institute of Technology (MIT), School of Architecture, Industrial Design and the Built Environment National University (NU), College of Architecture Nueva Ecija University of Science and Technology (NEUST), College of Architecture Pamantasan ng Lungsod ng Maynila (PLM), College of Architecture and Urban Planning Polytechnic University of the Philippines, College of Architecture and Fine Arts (PUP-CAFA) Rizal Technological University (RTU), College of Engineering and Industrial Technology Saint Louis University, Baguio City (SLU), Otto Hahn School of Engineering and Architecture, Baguio Technological Institute of the Philippines-Manila (TIP-Manila) and Quezon City (TIP-QC) College of Architecture and Engineering Technological University of the Philippines-Manila (TUP-Manila) College of Architecture and Fine Arts University of the Assumption (UA), College of Engineering and Architecture, City of San Fernando, Pampanga University of Baguio (UB), College of Engineering and Architecture, Baguio University of the Cordilleras (UC-BCF), College of Engineering and Architecture, Baguio City University of Pangasinan (UPANG), College of Architecture, Dagupan City University of the Philippines Diliman (UP-D), College of Architecture University of Saint Anthony (USANT), College of Engineering and Architecture and Technology, San Miguel, Iriga City University of Santo Tomas (UST), College of Architecture Visayas and Mindanao La Consolacion College–Bacolod (LCCB), School of Architecture, Fine Arts and Interior Design, Bacolod Mindanao University of Science and Technology (MUST), College of Engineering and Architecture, Lapasan, Cagayan de Oro City Silliman University (SU), College of Engineering and Design, Department of Architecture, Dumaguete City, Negros Oriental University of Mindanao (UM), College of Architecture and Fine Arts, Davao City University of the Philippines Mindanao, UPMin, B.S. 
Architecture Program, Davao City University of San Agustin (USA), College of Engineering and Architecture, General Luna, Iloilo City University of San Carlos (USC), College of Architecture and Fine Arts, Cebu City Western Mindanao State University, College of Architecture, Zamboanga City Western Visayas College of Science and Technology, College of Engineering and Architecture, Iloilo City Saudi Arabia King Abdulaziz University, Faculty of Environmental Design, Jeddah Effat University, Faculty of Architecture & Design, Jeddah King Fahd University of Petroleum & Minerals, College of Environmental Design, Dhahran King Faisal University, College of Architecture and Planning, Dammam King Saud University, College of Architecture and Planning, Riyadh Qassim University, College of Architecture and Planning, Buraidah Umm al-Qura University, College of Engineering and Islamic Architecture, Makkah University of Dammam, College of Architecture and Planning, Dammam Singapore National University of Singapore (NUS), School of Design and Environment (SDE), Department of Architecture Singapore University of Technology and Design (SUTD), Department of Architecture and Sustainable Design. 
South Korea Hanyang University, Department of Architecture, College of Architecture, Seoul Hongik University, Department of Architecture, College of Architecture, Seoul Konkuk University, Department of Architecture, College of Architecture, Seoul Konkuk University, [GSAKU] Graduate School of Architecture, Seoul Kookmin University, [SAKU] Department of Architecture, College of Architecture, Seoul Korea National University of Arts (K-ARTS), School of Visual Arts, Seoul Korea University (KU), Department of Architecture, Seoul Kyunghee University (KHU), Department of Architecture, Yongin Myongji University (MJU), College of Architecture, Yongin Seoul National University (SNU), Department of Architecture, Seoul Sungkyunkwan University (SKKU), Department of Architecture, Seoul University of Seoul, Department of Architecture, College of Urban Sciences, Seoul University of Ulsan (Ulsan Institute of Technology, UIT), School of Architecture, Ulsan Wonkwang University (WKU), Department of Architecture, Iksan Yonsei University (YU), Department of Architectural Engineering, Seoul Sri Lanka City School of Architecture (CSA), Colombo University of Moratuwa (UoM), Faculty of Architecture, Department of Architecture, Moratuwa University of Kelaniya (UOK), Faculty of Architecture, Department of Architecture, Kelaniya Taiwan Chinese Culture University, College of Environmental Design, Department of Architecture and Urban Design, Taipei Chung Hua University, College of Architecture and Planning, Department of Architecture and Urban Planning, Hsinchu Chung Yuan Christian University, College of Design, Department of Architecture, Zhongli Feng Chia University, School of Architecture, Taichung City Ming Chuan University, School of Design, Department of Architecture, Taipei National Cheng Kung University, College of Planning & Design, Department of Architecture, Tainan City National Chiao Tung University, College of Humanities and Social Sciences, Graduate Institute of Architecture, 
Hsinchu National Quemoy University, Department of Architecture, Kinmen County National Taiwan University of Science and Technology, Department of Architecture, Taipei City Tamkang University, College of Engineering, Department of Architecture, Danshui District, New Taipei City Tunghai University, Department of Architecture, Taichung City Thailand Assumption University (AU), The International University in Thailand, School of Architecture, Suvarnabhumi, Bangkok Bangkok University (BU), Faculty of Architecture, Pathum Thani Chiang Mai University (CMU), Faculty of Architecture, Chiang Mai Chulalongkorn University (CU), Faculty of Architecture, INDA (International Program in Design and Architecture) Bangkok Kasem Bundit University (KBU), Faculty of Architecture, Bangkok Kasetsart University (KU), Faculty of Architecture, Bang Khen, Bangkok Khon Kaen University (KKU), Faculty of Architecture, Khon Kaen King Mongkut's Institute of Technology Ladkrabang (KMITL), Faculty of Architecture, Department of Architecture, Lat Krabang, Bangkok King Mongkut's University of Technology Thonburi (KMUTT/Bangmod), School of Architecture and Design, Thung Khru, Bangkok Maejo University (MJU), Faculty of Architecture and Environmental Design, Sansai, Chiang Mai Mahasarakham University (MSU), Faculty of Architecture, Urban Design and Creative Arts, Maha Sarakham Rajamangala University of Technology Thanyaburi (RMUTT), Faculty of Architecture, Pathum Thani Rangsit University (RSU), Faculty of Architecture, Rangsit Silpakorn University (SU), Faculty of Architecture, Bangkok Sripatum University (SPU), Faculty of Architecture, Bangkok Thammasat University (TU), Faculty of Architecture and Planning, Pathum Thani United Arab Emirates Abu Dhabi University, Department of Architecture and Design, College of Engineering and Computer Science, Abu Dhabi American University in Dubai, Department of Architecture Ajman University of Science and Technology, College of Architectural Engineering, Ajman 
American University of Sharjah, College of Architecture, Art and Design, Sharjah Canadian University of Dubai, College of Architecture, Dubai United Arab Emirates University, Department of Architectural Engineering, Al Ain University of Sharjah, College of Architectural Engineering, Sharjah Vietnam Hanoi Architectural University, Hanoi Ho Chi Minh City Architecture University, Ho Chi Minh City Hue University of Sciences, Faculty of Architecture, Hue City National University of Civil Engineering, Faculty of Architecture and Urban Planning, Hanoi Van Lang University, Ho Chi Minh City Ton Duc Thang University, Faculty of Civil Engineering, Department of Architecture, Ho Chi Minh City Europe Albania Albanian University Epoka University, Department of Architecture International School of Architecture and Urban Policies, POLIS University Universiteti Politeknik i Tiranës, Departamenti i Arkitekturës dhe Urbanistikës Austria Academy of Fine Arts Vienna, Architektur, Vienna University for Continuing Education Krems, Department for Building and Environment, Krems Graz University of Technology (TU Graz), Faculty of Architecture, Graz University of Applied Arts Vienna, School of Architecture, Vienna University of Art and Design Linz, architecture programme, Linz University of Innsbruck, Fakultät für Architektur, Innsbruck TU Wien, Fakultät für Architektur und Raumplanung, Vienna Belgium French Community of Belgium The following is a list of French-speaking architectural schools in Belgium: Architecture University of Louvain (UCLouvain) Faculty of Architecture, Architectural Engineering and Urban Planning, Brussels, Tournai and Louvain-la-Neuve School of Urbanism and Territorial Planning, Louvain School of Engineering, Louvain-la-Neuve University of Liège (ULiège) Faculty of Architecture, Liège Gembloux Agro-Bio Tech, Gembloux Université libre de Bruxelles (ULB), Faculty of Architecture, Ixelles University of Mons (UMons), Faculty of Architecture and Urban 
Planning, Mons Saint-Luc Institutes Brussels, Higher Institute for Urbanism and Urban Renovation (ISURU), Brussels Architectural engineering University of Louvain (UCLouvain) Faculty of Architecture, Architectural Engineering and Urban Planning, Brussels, Tournai and Louvain-la-Neuve Louvain School of Engineering, Louvain-la-Neuve University of Liège (ULiège), School of Engineering, Liège Université libre de Bruxelles (ULB), École polytechnique de Bruxelles, Brussels University of Mons (UMons), Faculty of Engineering, Mons Interior architecture Saint-Luc Institutes Brussels, Saint-Luc Brussels School of Arts (ESA), Brussels Saint-Luc Liège School of Arts, Liège City of Liège School of Arts, Liège Arts2 School of Arts, Mons ENSAV La Cambre, Brussels Académie Royale des Beaux-Arts de la Ville de Bruxelles (ARBA-ESA), Brussels Académie Royale des Beaux-Arts de la Ville de Tournai, Tournai Flemish Community of Belgium The following is a list of Dutch-speaking architectural schools in Belgium: Architecture University of Antwerp (UA), Faculty of Design Sciences, Antwerp Katholieke Universiteit te Leuven (KU Leuven), Faculty of Architecture, Schaerbeek and Ghent Hasselt University (UHasselt), Faculty of Architecture and Arts, Hasselt Architectural engineering Ghent University (UGent), Faculty of Engineering and Architecture, Ghent Katholieke Universiteit te Leuven (KU Leuven), Faculty of Engineering Sciences, Leuven Vrije Universiteit Brussel (VUB), Faculty of Engineering, Ixelles Bosnia and Herzegovina International University of Sarajevo, Faculty of Architecture and Social Science, Sarajevo University of Banja Luka, Faculty of Architecture and Civil Engineering, Banja Luka University of Sarajevo, Faculty of Architecture, Sarajevo Bulgaria Higher School of Civil Engineering (VSU) "Lyuben Karavelov", Sofia New Bulgarian University, Sofia University of Architecture, Civil Engineering and Geodesy, Sofia Varna Free University "Chernorizets Hrabar", Varna Croatia University of 
Split, Faculty of Civil Engineering, Architecture and Geodesy, Split University of Zagreb, Faculty of Architecture, Zagreb Cyprus Cyprus School of Architecture, Cyprus College of Art, Lemba. Eastern Mediterranean University, Faculty of Architecture, Famagusta Frederick University, School of Architecture, Fine and Applied Arts, Department of Architecture, Nicosia Neapolis University, School of Architecture and Environmental Sciences, Paphos University of Cyprus, School of Engineering, Department of Architecture, Nicosia University of Nicosia, School of Humanities, Social Science & Law, Department of Architecture, Nicosia Czechia Academy of Arts, Architecture and Design, Department of architecture, Prague ARCHIP, Architectural Institute in Prague, Prague Brno University of Technology, Faculty of Architecture, Brno Czech Technical University in Prague, Faculty of Architecture, Prague Technical University of Liberec, Faculty of Architecture, Liberec Denmark Aarhus School of Architecture, Aarhus Royal Danish Academy of Fine Arts Estonia Estonian Academy of Arts, Tallinn Finland Aalto University Tampere University of Technology University of Oulu, Department of Architecture, Oulu France Architecture schools in France are called ENSA: École nationale supérieure d'architecture. 
École d'architecture de la ville et des territoires à Marne-la-Vallée, Champs-sur-Marne, Île-de-France École de Chaillot, La cité de l'architecture et du patrimoine, Palais de Chaillot, Paris École nationale supérieure d'architecture de Clermont-Ferrand, Clermont-Ferrand École nationale supérieure d'architecture de Grenoble, Grenoble École nationale supérieure d'architecture de Lyon, Lyon École nationale supérieure d'architecture de Marseille-Luminy, Marseille-Luminy École nationale supérieure d'architecture de Montpellier, Languedoc-Roussillon École nationale supérieure d'architecture de Nancy, Nancy École nationale supérieure d'architecture de Nantes, Nantes École nationale supérieure d'architecture de Normandie, Rouen École nationale supérieure d'architecture de Paris-Belleville, Paris École nationale supérieure d'architecture de Paris-La Villette (ENSAPLV), Paris École nationale supérieure d'architecture de Paris-Malaquais, Paris École nationale supérieure d'architecture de Paris-Val de Seine, Paris École nationale supérieure d'architecture de Rennes Rennes, Brittany École nationale supérieure d'architecture de Saint-Étienne, Saint-Étienne École nationale supérieure d'architecture de Strasbourg, Strasbourg École nationale supérieure d'architecture de Toulouse, Toulouse École nationale supérieure d'architecture de Versailles, Versailles École nationale supérieure d'architecture et de paysage de Bordeaux, Bordeaux École nationale supérieure d'architecture et de paysage de Lille, Lille École nationale supérieure des arts et industries de Strasbourg, INSA de Strasbourg, Strasbourg École spéciale d'architecture (ESA), Paris Institut national des sciences appliquées de Strasbourg, Strasbourg Confluence Institute for Innovation and Creative Strategies in Architecture, Paris Paris School of Architecture (PSA), Paris Germany Architecture schools in Germany can be part of art academies, technical universities or universities of applied sciences. 
Greece Aristotle University of Thessaloniki, Faculty of Engineering, School of Architecture, Thessaloniki Democritus University of Thrace, Faculty of Engineering, Department of Architectural Engineering, Xanthi, Thrace National Technical University of Athens (Εθνικό Μετσόβιο Πολυτεχνείο), School of Architecture, Athens Technical University of Crete, Department of Architecture, Chania, Crete University of Patras, School of Engineering, Department of Architecture, Patras University of Thessaly, School of Engineering, Department of Architecture, Volos, Thessaly University of Ioannina, School of Engineering, Department of Architecture, Ioannina, Epirus Hungary Budapest University of Technology and Economics, Faculty of Architecture, Budapest Moholy-Nagy University of Art and Design, Faculty of Architecture, Budapest University of West Hungary (Nyugat-magyarországi Egyetem), Faculty of Wood Sciences, Sopron Széchenyi István University, Faculty of Architectural Design, Győr Szent István University, Ybl Miklós Faculty of Architecture and Civil Engineering, Budapest University of Debrecen, Faculty of Technology, Debrecen University of Pécs, Pollack Mihály Faculty of Technology, Pécs Iceland Iceland Academy of the Arts, Department of Design & Architecture, Reykjavik Ireland Architecture Schools in the Republic of Ireland Architecture Schools in Northern Ireland (United Kingdom) Queen's University Belfast, School of Natural and Built Environment, Belfast, Ulster University, Belfast School of Architecture, Belfast Italy Latvia Riga Technical University Riga International School of Economics and Business Administration (RISEBA) Lithuania Kaunas University of Technology Vilnius Academy of Arts Vilnius Gediminas Technical University Moldova Technical University of Moldova, Urbanism and Architecture Faculty Netherlands Amsterdam School of the Arts, Amsterdam Academy of Architecture, Amsterdam ArtEZ, Arnhem Academy of Architecture, Arnhem Berlage Institute, Postgraduate Laboratory of Architecture, Rotterdam 
Delft University of Technology, Faculty of Architecture, Delft Eindhoven University of Technology, Department of Architecture, Building and Planning, Eindhoven Maastricht University, Faculty of Humanities & Sciences, Department of Architecture, Maastricht The Rotterdam Academy of Architecture and Urban Design, Rotterdam North Macedonia The Ss. Cyril and Methodius University, Faculty of Architecture State University of Tetovo, Applied Sciences Faculty University American College Skopje, Faculty of Architecture and Design Norway Godkjente arkitektskoler: Bergen School of Architecture, Bergen Norwegian University of Science and Technology Faculty of Architecture and Fine Art, Trondheim Oslo School of Architecture and Design, Oslo Poland Andrzej Frycz Modrzewski Krakow University (Krakowska Akademia im. Andrzeja Frycza Modrzewskiego), Faculty of Architecture and Fine Arts, Kraków Białystok Technical University (Politechnika Białostocka), Faculty of Architecture, Białystok Gdańsk University of Technology (Politechnika Gdańska), Faculty of Architecture, Gdańsk Kielce University of Technology (Politechnika Świętokrzyska), Faculty of Architecture and Town Planning, Kielce Lublin University of Technology (Politechnika Lubelska), Faculty of Architecture, Lublin Poznań University of Technology (Politechnika Poznańska), Faculty of Architecture, Poznań Silesian University of Technology (Politechnika Śląska), Faculty of Architecture, Gliwice Tadeusz Kościuszko University of Technology (Politechnika Krakowska), Kraków Technical University of Łódź, Faculty of Civil Engineering, Architecture and Environmental Engineering, Łódź University of Arts in Poznań (Uniwersytet Artystyczny w Poznaniu), Faculty of Architecture and Design, Poznań University of Economy Bydgoszcz (Wyższa Szkoła Gospodarki (WSG)), Architecture und Town Planning, Bydgoszcz Warsaw University of Technology (Politechnika Warszawska), Faculty of Architecture, Warszawa West Pomeranian University of Technology 
(Zachodniopomorski Uniwersytet Technologiczny), Faculty of Civil Engineering and Architecture, Szczecin Wroclaw University of Technology (Politechnika Wrocławska), Faculty of Architecture, Wrocław Portugal Escola de Arquitectura da Universidade do Minho Escola Superior Artística do Porto, Arquitectura, Porto Escola Superior Gallaecia (ESG), Arquitectura, Vila Nova de Cerveira Escola Universitária das Artes de Coimbra Instituto Superior de Ciências do Trabalho e da Empresa (ISCTE), Department of Architecture, Lisbon Lusíada University of Porto (ULP), Faculty of Architecture and Arts (FAAULP), Porto Technical University of Lisbon (UTL), Instituto Superior Técnico (IST), Department of Civil Engineering and Architecture, Lisbon Universidade Autónoma de Lisboa Luís de Camões Universidade da Beira Interior (UBI), Arquitectura Covilhã Universidade Lusíada de Lisboa, School of Architecture and Arts, Lisbon Universidade Lusófona de Humanidades e Tecnologias University of Coimbra (UC), Faculty of Sciences and Technology (FCTUC), Department of Architecture, Coimbra University of Évora (UE), Department of Architecture, Évora University of Porto (UP), Faculty of Architecture (FAUP), Porto Technical University of Lisbon (UTL), Faculty of Architecture (FA), Lisbon Romania Universitatea de Arhitectură şi Urbanism "Ion Mincu", București Universitatea din Oradea, Facultatea de Arhitectură şi Construcţii, Oradea Universitatea Politehnică din Timișoara, Facultatea de Arhitectură, Timișoara Universitatea Spiru Haret, Facultatea de Arhitectură, București Universitatea Tehnică Cluj-Napoca, Facultatea de Arhitectură, Cluj-Napoca Universitatea Tehnică "Gheorghe Asachi", Facultatea de Arhitectură, Iaşi Russia Institute of Architecture & Art, Rostov on Don MArchI, Moscow Architectural Institute – State Academy Moscow Architecture School MARCH Moscow State University of Civil Engineering – Institute of Construction and Architecture, Moscow Novosibirsk State University of Architecture, 
Design and Arts Pacific National University, Faculty of Architecture & Design, Khabarovsk Russian Academy of Arts, Department of Architecture Saint-Petersburg State University of Architecture and Civil Engineering, Saint-Petersburg State University of Land Use Planning, Faculty of Architecture Tomsk State University of Architecture and Building, Tomsk USAAA, Ural State Academy of Architecture and Arts, Ekaterinburg Vologda State Technical University, Department of Architecture and Urban Design, Vologda Serbia State University of Novi Pazar, Faculty of Technical Sciences, Novi Pazar University of Belgrade, Faculty of Architecture, Belgrade University of Niš, Faculty of Civil Engineering and Architecture, Niš University of Novi Sad, Faculty of Technical Sciences, Novi Sad University of Priština, Faculty of Technical Sciences, Kosovska Mitrovica Slovakia Academy of Fine Arts and Design in Bratislava, Department of Architecture, Bratislava Slovak University of Technology in Bratislava, Faculty of Architecture, Bratislava Technical University of Kosice, Faculty of Arts, Kosice Slovenia University of Ljubljana, Faculty of Architecture, Ljubljana University of Maribor, Faculty of Civil Engineering (Department of Architecture), Maribor Spain Architecture schools in Spain are called ETSA: "Escuela Técnica Superior de Arquitectura" in Spanish, or "Escola Tècnica Superior d'Arquitectura" in Catalan. 
Andalucía; International School of Metaphoric Architecture, Málaga Escuela Técnica Superior de Arquitectura de la Universidad de Granada, Granada Escuela Técnica Superior de Arquitectura de la Universidad de Málaga, Málaga Escuela Técnica Superior de Arquitectura de la Universidad de Sevilla, Sevilla Castilla-La Mancha; Escuela de Arquitectura de la Universidad de Castilla-La Mancha, Toledo Castilla y León; Escuela Técnica Superior de Arquitectura de la Universidad de Valladolid, Valladolid Escuela Técnica Superior de Arquitectura de la Universidad IE (Segovia) Catalunya; Barcelona Institute of Architecture (BIArch) Escola Tècnica Superior d'Arquitectura de Barcelona de la Universitat Politècnica de Catalunya, Barcelona Escola Tècnica Superior d'Arquitectura de la Universitat de Girona, Girona Escola Tècnica Superior d'Arquitectura de la Universitat Internacional de Catalunya, Barcelona Escola Tècnica Superior d'Arquitectura de la Universitat Ramon Llull – La Salle, Barcelona Escola Tècnica Superior d'Arquitectura de la Universitat Rovira i Virgili, Reus Escola Tècnica Superior d'Arquitectura del Vallès de la Universitat Politècnica de Catalunya, Sant Cugat del Vallès Institute for Advanced Architecture of Catalonia, Barcelona Metropolis Master in Architecture and Urban Culture (Universitat Politècnica de Catalunya/Centre de Cultura Contemporània de Barcelona) Galicia; Escola Técnica Superior de Arquitectura da Coruña, A Coruña Las Palmas de Gran Canaria; Escuela Técnica Superior de Arquitectura de la Universidad de Las Palmas de Gran Canaria, Gran Canaria Madrid; Escuela Técnica Superior de Arquitectura de la Universidad Alfonso X, Madrid Escuela Técnica Superior de Arquitectura de la Universidad Camilo José Cela, Madrid Escuela Técnica Superior de Arquitectura de la Universidad CEU San Pablo, Madrid Escuela Técnica Superior de Arquitectura de la Universidad de Alcalá, Madrid Escuela Técnica Superior de Arquitectura de la Universidad Europea de Madrid, Madrid 
Escuela Técnica Superior de Arquitectura de la Universidad Francisco de Vitoria, Madrid Escuela Técnica Superior de Arquitectura de la Universidad IE, Madrid/Segovia Escuela Técnica Superior de Arquitectura de la Universidad Pontificia de Salamanca, Madrid Escuela Técnica Superior de Arquitectura de Madrid Navarra; Escuela Técnica Superior de Arquitectura de la Universidad de Navarra, Pamplona País Vasco; Escuela Técnica Superior de Arquitectura de la Universidad del País Vasco, San Sebastián Valencia; Escuela Superior de Enseñanzas Técnicas, Arquitectura, Universidad CEU Cardenal Herrera Polytechnic University of Valencia (UPV), School of Architecture (ETSAV), Valencia University of Alicante, Department of Architectural Constructions, Alicante Sweden Chalmers University of Technology, Department of Architecture, Gothenburg Lund University, Department of Architecture, Lund KTH Royal Institute of Technology, School of Architecture and Built Environment, Stockholm Umeå School of Architecture (UMA), Umeå Switzerland Turkey Abant Izzet Baysal University Department of Architecture, Bolu Abdullah Gül University, Department of Architecture, Kayseri Akdeniz University, Fine Arts Faculty Department of Architecture, Antalya Antalya International University, School of Fine Arts and Architecture, Antalya Bahçeşehir University, Faculty of Architecture, Istanbul Beykent University, Faculty of Architecture, Istanbul Dokuz Eylül University, Faculty of Architecture, Izmir Gazi University, Faculty of Architecture, Ankara Gediz University, Department of Architecture, İzmir Fatih Sultan Mehmet Vakıf University, Faculty of Architecture and Design, İstanbul Istanbul Arel University, Faculty of Engineering and Architecture, Istanbul Istanbul Bilgi University, Faculty of Architecture, Istanbul Istanbul Kültür University, Faculty of Architecture, Istanbul Istanbul Medipol University, Fine Arts Design and Architecture, Istanbul Istanbul Technical University, Faculty of Architecture, 
Istanbul İzmir Institute of Technology, Faculty of Architecture, Izmir İzmir University of Economics, Faculty of Architecture, Izmir Karadeniz Technical University, Faculty of Architecture, Trabzon Maltepe University, Faculty of Architecture, Istanbul Middle East Technical University, Faculty of Architecture, Ankara Mimar Sinan University of Fine Arts, Faculty of Architecture, Istanbul Selçuk University, Department of Architecture, Konya TED University, Faculty of Architecture, Ankara Trakya University, Faculty of Architecture, Edirne Yeditepe University, Faculty of Engineering and Architecture, Istanbul Yıldız Technical University, Faculty of Architecture, Istanbul Ukraine Donbasska State Academy of Construction and Architecture, Faculty of Architecture, Makiivka Kharkiv National Academy of Municipal Economy, Town Planning Faculty, Kharkiv Kharkiv State Technical University of Construction and Architecture, Faculty of Architecture, Kharkiv Kyiv National University of Construction and Architecture, Architectural Faculty, Kyiv National Academy of Fine Arts and Architecture, Faculty of Architecture, Kyiv National Academy of Nature Conservation and Resort's Construction, Architectural-Construction Faculty, Simferopol National University "Lvivska Politechnika", Institute of Architecture, Lviv Odessa State Academy of Construction and Architecture, Architectural and Art's Institute, Odessa Poltava National Technical University, Architectural Department, Poltava Prydniprovska State Academy of Civil Engineering and Architecture, Faculty of Architecture, Dnipro United Kingdom England Anglia Ruskin University, Department of Engineering and Built Environment, Chelmsford Architectural Association School of Architecture, London Arts University Bournemouth, School of Architecture, Bournemouth Bartlett School of Architecture, Faculty of the Built Environment, University College London, London Birmingham City University, Birmingham School of Architecture and Design, Birmingham 
Canterbury School of Architecture, University for the Creative Arts, Canterbury, Kent Coventry University, School of Architecture, Coventry De Montfort University, The Leicester School of Architecture, Leicester Falmouth University, School of Architecture Design and Interiors. Hull College, Hull School of Art and Design, Hull Kent School of Architecture, University of Kent, Canterbury, Kent Kingston University, School of Architecture and Landscape, London Leeds Beckett University, Faculty of Arts, Environment and Technology, Leeds Liverpool John Moores University, Faculty of Media, Arts and Social Science Liverpool London Metropolitan University, School of Art, Architecture and Design, London London School of Architecture, London London South Bank University, Architecture and Design, South Bank, London Newcastle University, Newcastle University School of Architecture, Planning and Landscape, Faculty of Humanities and Social Sciences, Newcastle upon Tyne Northumbria University, Department of Architecture and Built Environment, Newcastle upon Tyne Norwich University of the Arts, School of Architecture, Norwich Nottingham Trent University (NTU), School of Architecture, Design and the Built Environment, Nottingham Oxford School of Architecture, Oxford Brookes University, Oxford Ravensbourne University London, Department of Architecture, Greenwich Peninsula, London Royal College of Art, School of Architecture, London The Manchester School of Architecture, Manchester Sheffield Hallam University, Department of Architecture and Planning, Sheffield University of the Arts London, Central Saint Martins College of Art and Design University of Bath Department of Architecture and Civil Engineering, Bath University of Brighton University of Cambridge, Department of Architecture, Cambridge University of Central Lancashire, The Grenfell-Baines School of Architecture, Preston University of East London, School of Architecture, Computing and Engineering, London University of 
Greenwich, Department of Architecture and Landscape, London University of Huddersfield, School of Art, Design and Architecture, Huddersfield University of Lincoln, Lincoln School of Architecture, Lincoln University of Lancaster, Lancaster School of Architecture, Lancaster University of Liverpool, Liverpool University School of Architecture, Liverpool University of Nottingham, Department of Architecture and Built Environment, Nottingham University of Plymouth, School of Architecture, Design and Environment, Plymouth University of Portsmouth, UK University of Reading, School of Architecture, Reading, Berkshire University of Sheffield, Sheffield School of Architecture, Sheffield University of the West of England, Faculty of the Built Environment, Frenchay Campus, Bristol University of Westminster, School of Architecture and the Built Environment, Marylebone, London, UK Northern Ireland Queen's University Belfast, School of Architecture Ulster University, School of Architecture, Belfast Scotland Dundee School of Architecture, University of Dundee, Dundee ESALA, Edinburgh School of Architecture and Landscape Architecture Mackintosh School of Architecture, Glasgow School of Art (University of Glasgow), Glasgow Robert Gordon University, The Scott Sutherland School of Architecture, Aberdeen University of Strathclyde, Faculty of Engineering, Department of Architecture, Strathclyde Wales Centre for Alternative Technology, Graduate School of the Environment, Machynlleth University of Wales Trinity Saint David, Faculty of Architecture, Computing and Engineering, Swansea Welsh School of Architecture, Cardiff University, Cardiff North America Canada Alberta University of Calgary School of Architecture, Planning and Landscape, Calgary British Columbia University of British Columbia School of Architecture and Landscape Architecture, Vancouver Manitoba University of Manitoba Faculty of Architecture, Winnipeg Nova Scotia Dalhousie University (formerly Technical 
University of Nova Scotia) Faculty of Architecture and Planning, Halifax, Nova Scotia Ontario Carleton University, Azrieli School of Architecture and Urbanism, Ottawa Laurentian University, Faculty of Science, Engineering and Architecture, Sudbury Ryerson University, Faculty of Engineering, Architecture and Applied Science, Toronto University of Toronto, John H. Daniels Faculty of Architecture, Landscape and Design, Toronto University of Waterloo, School of Architecture, Cambridge Quebec McGill University School of Architecture, Montreal Université de Montréal, Faculté de l'aménagement, Montréal Université Laval, École d'Architecture, Faculté d'aménagement, d'architecture et des arts visuels Mexico CEDIM, Centro de Estudios Superiores de Diseño de Monterrey, S.C., Departamento de Arquitectura Instituto Superior de Arquitectura y Diseño (ISAD), Chihuahua, Mexico ITESM, Carrera de Arquitectura ITESO, Instituto Tecnologico de Estudios Superiores de Occidente UDG, Universidad de Guadalajara, Centro Universitario de Arte, Arquitectura y Diseño UG, Universidad de Guanajuato, Departamento de Arquitectura UIA, Universidad Iberoamericana, Departamento de Arquitectura UNAM, Facultad de Arquitectura, Mexico City UADY, Faculty of Architecture – Facultad de Arquitectura, Mérida, Yucatán UANL, Universidad Autónoma de Nuevo León, Facultad de Arquitectura UA, Universidad Anáhuac Mexico Norte, Departamento de Arquitectura Universidad Autónoma de Guadalajara (UAG) Universidad de las Americas Puebla (UDLAP), Departamento de Arquitectura Universidad de Monterrey, Departamento de Arquitectura Puerto Rico Escuela de Arquitectura de la Pontificia Universidad Católica de Puerto Rico, Ponce, Puerto Rico United States Central America Costa Rica Costa Rica Institute of Technology, Escuela de Arquitectura y Urbanismo, Campus San José, San José Universidad Autónoma de Centro América, Curridabat Universidad Central de Costa Rica, Escuela de Ingenierías y Arquitectura, 
Barrio Escalante Universidad Creativa de Costa Rica University of Costa Rica, Escuela de Arquitectura, San José Universidad de las Ciencias y el Arte, Facultad de Diseño, San José Universidad Hispanoamericana, Escuela de Arquitectura, Barrio Escalante Universidad Interamericana, Facultad de Ingeniería y Arquitectura, Heredia Universidad Internacional de las Américas, Facultad de Ingeniería y Arquitectura, San José Universidad Latina de Costa Rica, Facultad de Ingeniería y Arquitectura Universidad Latinoamericana de las Ciencias y la Tecnología, Facultad de Ingenierías y Arquitectura, Escuela de Arquitectura Universidad Veritas, Escuela de Arte, Diseño y Arquitectura Cuba Instituto Superior Politecnico de Julio Antonio Mella Jose Antonio Echeverria Higher Technical University Universidad Central Marta Abreu de las Villas, Facultad de Arquitectura Dominican Republic Pontificia Universidad Católica Madre y Maestra Universidad Autónoma de Santo Domingo Universidad Católica Nordestana Universidad Central del Este Universidad Dominicana O & M Universidad Iberoamericana Universidad Nacional Pedro Henríquez Ureña Universidad Tecnológica de Santiago Instituto Nacional de Ciencias Exactas Universidad Católica del Cibao El Salvador Universidad Albert Einstein Universidad Católica de El Salvador Universidad Centroamericana José Simeon Cañas Universidad de El Salvador Universidad de Oriente Universidad Dr. 
José Matías Delgado Universidad Francisco Gavidia Universidad Gerardo Barrios Universidad Politécnica de El Salvador Universidad Tecnológica de El Salvador Guatemala Universidad de San Carlos de Guatemala Universidad del Istmo Universidad Francisco Marroquín Universidad Mariano Galvez Universidad Mesoamericana Universidad Rafael Landívar Honduras Universidad Nacional Autónoma de Honduras, Campus Tegucigalpa Universidad José Cecilio Del Valle Universidad Católica de Honduras Universidad de San Pedro Sula Universidad Tecnologica Centroamericana Centro de Diseño, Arquitectura y Construcción (CEDAC) Nicaragua Universidad Católica REDEMPTORIS MATER Universidad Centroamericana Universidad del Valle Universidad Iberoamericana de Ciencia y Tecnología Universidad Nacional de Ingeniería Panama Columbus University, Escuela de Arquitectura Isthmus, Escuela de Arquitectura y Diseño de América Latina y el Caribe Quality Leadership University Universidad Católica Santa María la Antigua, Escuela de Diseño Universidad de Panamá, Facultad de Arquitectura, Escuela de Arquitectura Universidad Autónoma de Chiriquí, Facultad de Arquitectura http://www.unachi.ac.pa Universidad Interamericana de Panamá, Facultad de Ingenierías y Arquitectura, Escuela de Arquitectura Universidad Tecnológica de Panamá, Escuela de Ingenierías South America Argentina Universidad Argentina John F. 
Kennedy, Escuela de Arquitectura, Buenos Aires Universidad Blas Pascal, Facultad de Arquitectura, Córdoba Universidad Católica de Córdoba, Facultad de Arquitectura, Córdoba Universidad Católica de La Plata, Facultad de Arquitectura y Diseño, La Plata Universidad Católica de Salta, Facultad de Arquitectura y Urbanismo, Salta Universidad Católica de Santa Fe, Facultad de Arquitectura y Diseño, Santa Fe Universidad de Belgrano, Facultad de Arquitectura y Urbanismo, Belgrano Universidad de Buenos Aires, Facultad de Arquitectura, Diseño y Urbanismo, Buenos Aires Universidad de Flores, Facultad de Planeamiento Socio-Ambiental, Buenos Aires, Cipolletti Universidad de Mendoza, Facultad de Arquitectura, Urbanismo y Diseño, Mendoza Universidad de Morón, Facultad de Arquitectura, Diseño, Arte y Urbanismo, Morón Universidad Nacional de Córdoba, Facultad de Arquitectura, Urbanismo y Diseño, Córdoba Universidad Nacional de La Plata, Facultad de Arquitectura y Urbanismo, La Plata Universidad Nacional de Mar del Plata, Facultad de Arquitectura, Urbanismo y Diseño, Mar del Plata Universidad de Palermo, Facultad de Arquitectura, Buenos Aires Universidad Nacional de Rosario, Facultad de Arquitectura, Planeamiento y Diseño, Rosario Universidad Nacional de San Juan, Facultad de Arquitectura, San Juan Universidad Nacional de Tucumán, Facultad de Arquitectura y Urbanismo, Tucumán Universidad Nacional del Litoral, Facultad de Arquitectura, Diseño y Urbanismo, Santa Fe Universidad Nacional del Nordeste, Facultad de Arquitectura y Urbanismo Resistencia, Chaco Universidad Torcuato di Tella, Escuela de Arquitectura y Estudios Urbanos, Buenos Aires Brazil Centro Universitário Belas Artes de São Paulo, CUBASP, São Paulo Centro Universitário Izabela Hendrix, Belo Horizonte Escola da Cidade, AEAUSP, São Paulo Pontifícia Universidade Católica do Rio Grande do Sul, PUC-RS Universidade de Brasília, UnB, Faculdade de Arquitetura e Urbanismo, Brasília Universidade de Fortaleza, UNIFOR, Fortaleza 
Universidade de São Paulo, USP, Faculdade de Arquitetura e Urbanismo, São Paulo Centro Universitário Filadélfia – UNIFIL, Londrina, Paraná, Universidade do Vale do Rio dos Sinos, UNISINOS Universidade Estadual de Campinas, UNICAMP, Faculdade de Engenharia Civil, Arquitetura e Urbanismo, Campinas Universidade Estadual de Londrina (UEL) Universidade Estadual Paulista Júlio de Mesquita Filho, UNESP Universidade Federal de Minas Gerais, UFMG, Faculdade de Arquitetura e Urbanismo, Belo Horizonte Universidade Federal de Pernambuco, UFPE Universidade Federal de Santa Catarina- UFSC Universidade Federal de Uberlândia (UFU) Universidade Federal de Viçosa (UFV) Universidade Federal do Ceará, UFC, Curso de Arquitetura e Urbanismo, Fortaleza Universidade Federal do Espírito Santo, UFES Universidade Federal do Pará, UFPA Universidade Federal do Paraná, UFPR Universidade Federal do Rio de Janeiro, UFRJ, Faculdade de Arquitetura e Urbanismo, Rio de Janeiro Universidade Federal do Rio Grande do Sul, UFRGS Universidade Presbiteriana Mackenzie, UPM, Faculdade de Arquitetura e Urbanismo, São Paulo Universidade São Francisco, USF, Curso de Arquitetura e Urbanismo, Itatiba Universidade Tecnológica Federal do Paraná, UTFPR Chile Pontificia Universidad Católica de Chile Facultad de Arquitectura, Diseño y Estudios Urbanos Pontificia Universidad Católica de Valparaíso Escuela de Arquitectura y Diseño Universidad Católica del Norte Escuela de Arquitectura Universidad de Chile Facultad de Arquitectura y Urbanismo, Santiago Universidad Técnica Federico Santa María Departamento de Arquitectura Colombia Pontificia Universidad Javeriana, Bogotá, Facultad de Arquitectura y Diseño Universidad Catolica de Colombia, Facultad de Arquitectura, RIBA Accredited Program, Bogota Universidad de America, Facultad de Arquitectura y Urbanismo, Bogota Universidad de Boyacá, Facultad de Arquitectura y Bellas Artes, Programa de Arquitectura, Especialización en Diseño Urbano, Maestría en Urbanismo, Tunja, Boyacá 
Universidad de Ibagué, Facultad de Humanidades, Programa de Arquitectura Universidad de La Salle, Facultad de Ciencias del Hábitat, Bogota Universidad Nacional de Colombia, Facultad de Artes, Escuela de Arquitectura y Urbanismo, Bogota Universidad Piloto De Colombia, Facultad de Arquitectura y Artes, Programa de Arquitectura Universidad Pontificia Bolivariana, Facultad de Arquitectura y Diseño, Medellín Arquitectura Universidad San Buenaventura, Facultad de Arquitectura Universidad Santo Tomas, Facultad de Arquitectura, Tunja Universidad de los Andes, Facultad de Arquitectura y Diseño, Bogotá University of Valle, Facultad de Artes Integradas, Escuela de Arquitectura, Cali Universidad La Gran Colombia, Facultad de Arquitectura, Bogotá Ecuador Central University of Ecuador, School of Architecture and Urbanism, Quito Pontificia Universidad Católica del Ecuador, Facultad de Arquitectura, Diseño y Artes, Quito Universidad Católica de Santiago de Guayaquil, Guayaquil, Facultad de Arquitectura y Diseño Guayaquil Universidad de Cuenca, Cuenca, Ecuador, Facultad de Arquitectura Universidad San Francisco de Quito, Facultad de Arquitectura y Diseño Interior, Cumbaya, Quito Perú Universidad Peruana de Ciencias Aplicadas, Facultad de Arquitectura, Lima Pontificia Universidad Católica del Perú, Facultad de Arquitectura y Urbanismo, Lima Universidad Nacional de Ingeniería Facultad de Arquitectura, Urbanismo y Artes, Lima Universidad Nacional de San Antonio Abad del Cusco Facultad de Arquitectura y Artes Plásticas, Cusco Universidad Ricardo Palma, Facultad de Arquitectura y Urbanismo Uruguay Universidad de la República, Facultad de Arquitectura, Montevideo Universidad ORT, Facultad de Arquitectura, Montevideo Venezuela Instituto Universitario Politecnico Santiago Mariño, Maracaibo, Cabimas, Ciudad Ojeda, Barinas, Mérida, San Cristobal, Caracas, Valencia, Maracay, Barcelona, Maturín, Puerto Ordaz y Porlamar, 
Facultad De Arquitectura Facultad de Arquitectura y Urbanismo de la Universidad Central de Venezuela en Caracas Facultad de Arquitectura de la Universidad Santa María en Caracas Escuela de Arquitectura de la Universidad Simon Bolivar en Caracas Facultad de Arquitectura y Urbanismo de la Universidad de Los Andes en Mérida Facultad de Arquitectura y Diseño de la Universidad del Zulia en Maracaibo Facultad de Arquitectura y Urbanismo de la Universidad de Carabobo en Valencia Oceania Australia New Zealand Unitec New Zealand, architecture and landscape, Auckland University of Auckland, National Institute of Creative Arts and Industries, School of Architecture and Planning, Auckland Victoria University of Wellington, School of Architecture, Wellington Ara Institute of Canterbury, Bachelor of Architectural Studies (Architectural Technology), Christchurch References External links vitruvio.ch Architectural schools in the world
386623
https://en.wikipedia.org/wiki/Internet%20Security%20Association%20and%20Key%20Management%20Protocol
Internet Security Association and Key Management Protocol
Internet Security Association and Key Management Protocol (ISAKMP) is a protocol defined by RFC 2408 for establishing security associations (SAs) and cryptographic keys in an Internet environment. ISAKMP only provides a framework for authentication and key exchange and is designed to be key exchange independent; protocols such as Internet Key Exchange (IKE) and Kerberized Internet Negotiation of Keys (KINK) provide authenticated keying material for use with ISAKMP. For example: IKE describes a protocol using part of Oakley and part of SKEME in conjunction with ISAKMP to obtain authenticated keying material for use with ISAKMP, and for other security associations such as AH and ESP for the IETF IPsec DOI. Overview ISAKMP defines the procedures for authenticating a communicating peer, creation and management of Security Associations, key generation techniques and threat mitigation (e.g. denial of service and replay attacks). As a framework, ISAKMP typically utilizes IKE for key exchange, although other methods, such as Kerberized Internet Negotiation of Keys, have been implemented. A preliminary SA is formed using this protocol; fresh keying is then done. ISAKMP defines procedures and packet formats to establish, negotiate, modify and delete Security Associations. SAs contain all the information required for execution of various network security services, such as the IP layer services (such as header authentication and payload encapsulation), transport or application layer services or self-protection of negotiation traffic. ISAKMP defines payloads for exchanging key generation and authentication data. These formats provide a consistent framework for transferring key and authentication data which is independent of the key generation technique, encryption algorithm and authentication mechanism. ISAKMP is distinct from key exchange protocols in order to cleanly separate the details of security association management (and key management) from the details of key exchange. 
There may be many different key exchange protocols, each with different security properties. However, a common framework is required for agreeing to the format of SA attributes and for negotiating, modifying and deleting SAs. ISAKMP serves as this common framework. ISAKMP can be implemented over any transport protocol. All implementations must include send and receive capability for ISAKMP using UDP on port 500. Implementation OpenBSD first implemented ISAKMP in 1998 via its isakmpd(8) software. The IPsec Services Service in Microsoft Windows handles this functionality. The KAME project implements ISAKMP for Linux and most other open source BSDs. Modern Cisco routers implement ISAKMP for VPN negotiation. Vulnerabilities Leaked NSA presentations released by Der Spiegel indicate that ISAKMP is being exploited in an unknown manner to decrypt IPsec traffic, as is IKE. The researchers who discovered the Logjam attack state that breaking a 1024-bit Diffie–Hellman group would break 66% of VPN servers, 18% of the top million HTTPS domains, and 26% of SSH servers, which they state is consistent with the leaks. See also Oakley protocol IPsec IKE GDOI References External links — Internet Security Association and Key Management Protocol — The Internet IP Security Domain of Interpretation for ISAKMP IPsec Cryptographic protocols Key management
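Whatever transport carries it, every ISAKMP message begins with the same fixed 28-byte header defined in RFC 2408: two 8-byte cookies, a next-payload type, a version octet (major/minor nibbles), an exchange type, flags, a message ID, and the total message length, with multi-byte fields in network byte order. As a rough illustration (the function and field names below are my own, not from any particular implementation), a Python sketch that unpacks this header might look like:

```python
import struct
from collections import namedtuple

# Fixed ISAKMP header layout per RFC 2408 section 3.1:
# initiator cookie (8 bytes), responder cookie (8 bytes),
# next payload (1), version (1: major/minor nibbles),
# exchange type (1), flags (1), message ID (4), length (4).
# All multi-byte fields are big-endian ("network byte order").
HEADER_FMT = ">8s8sBBBBII"
HEADER_LEN = struct.calcsize(HEADER_FMT)  # 28 bytes

IsakmpHeader = namedtuple(
    "IsakmpHeader",
    "init_cookie resp_cookie next_payload major minor "
    "exchange_type flags message_id length",
)

def parse_isakmp_header(data: bytes) -> IsakmpHeader:
    """Parse the fixed header that prefixes every ISAKMP message."""
    if len(data) < HEADER_LEN:
        raise ValueError("truncated ISAKMP header")
    (icookie, rcookie, next_payload, version,
     exch_type, flags, msg_id, length) = struct.unpack(
        HEADER_FMT, data[:HEADER_LEN])
    return IsakmpHeader(
        init_cookie=icookie,
        resp_cookie=rcookie,
        next_payload=next_payload,
        major=version >> 4,    # high nibble: major version
        minor=version & 0x0F,  # low nibble: minor version
        exchange_type=exch_type,
        flags=flags,
        message_id=msg_id,
        length=length,
    )
```

The cookies illustrate the anti-clogging role mentioned above: a responder can cheaply reject replayed or spoofed traffic by checking them before doing any expensive key-exchange work.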
211855
https://en.wikipedia.org/wiki/Dave%20Arneson
Dave Arneson
David Lance Arneson (October 1, 1947 – April 7, 2009) was an American game designer best known for co-developing the first published role-playing game (RPG), Dungeons & Dragons, with Gary Gygax, in the early 1970s. Arneson's early work was fundamental to the development of the genre, developing the concept of the RPG using devices now considered to be archetypical, such as adventuring in "dungeons" and using a neutral judge who doubles as the voice and consciousness of all characters aside from the player characters to develop the storyline. Arneson discovered wargaming as a teenager in the 1960s, and began combining these games with the concept of role-playing. He was a University of Minnesota student when he met Gygax at the Gen Con gaming convention in the late 1960s. In 1970 Arneson created the game and fictional world that became Blackmoor, writing his own rules and basing the setting on medieval fantasy elements. Arneson showed the game to Gygax the following year, and the pair co-developed a set of rules that became Dungeons & Dragons (D&D). Gygax subsequently founded TSR, Inc. to publish the game in 1974. Arneson worked briefly for the company. Arneson left TSR in 1976, and filed suit in 1979 to retain credits and royalties on the game. He continued to work as an independent game designer, including work submitted to TSR in the 1980s, and continued to play games for his entire life. Arneson also did some work in computer programming, and taught computer game design and game rules design at Full Sail University from the 1990s until shortly before his death in 2009. Experience with miniature wargaming Arneson's role-playing game design work grew from his interest in wargames. His parents bought him the board wargame Gettysburg by Avalon Hill. After Arneson taught his friends how to play, the group began to design their own games, and tried out new ways to play existing games. Arneson was especially fond of naval wargames. 
Exposure to role-playing influenced his later game designs. In college history classes he role-played historical events, and preferred to deviate from recorded history in a manner similar to "what if" scenarios recreated in wargames. In the late 1960s Arneson joined the Midwest Military Simulation Association (MMSA), a group of miniature wargamers and military figurine collectors in the Minneapolis-St. Paul area that included among its ranks future game designer David Wesely. Wesely asserts that it was during the Braunstein games he created and refereed, and in which other MMSA members participated, that Arneson helped develop the foundations of modern role-playing games on a 1:1 scale basis by focusing on non-combat objectives—a step away from wargaming towards the more individual play and varied challenges of later RPGs. Arneson was a participant in Wesely's wargame scenarios, and as Arneson continued to run his own scenarios he eventually expanded them to include ideas from The Lord of the Rings and Dark Shadows. Arneson took over the Braunsteins when Wesely was drafted into the Army, and often ran them in different eras with different settings. Arneson had also become a member of the International Federation of Wargamers by this time. In 1969 Arneson was a history student at the University of Minnesota and working part-time as a security guard. He attended the second Gen Con gaming convention in August 1969 (at which time wargaming was still the primary focus) and it was at this event that he met Gary Gygax, who had founded the Castle & Crusade Society within the International Federation of Wargamers in the 1960s at Lake Geneva, Wisconsin. Arneson and Gygax also shared an interest in sailing ship games and they co-authored the Don't Give Up the Ship naval battle rules, serialized from June 1971 and later published as a single volume in 1972 by Guidon Games with a revised edition by TSR, Inc. in 1975. 
Blackmoor Following the departure of David Wesely to service in the Army Reserves in October 1970, Arneson and his fellow players in the Twin Cities began to imagine alternate settings for "Braunstein" games. Arneson developed a Braunstein in which his players played fantasy versions of themselves in the medieval Barony of Blackmoor, a land inhabited in part by fantastic monsters. As the game quickly grew and characters developed, Arneson devised scenarios where they would quest for magic and gold, escort caravans, lead armies for or against the forces of evil, and delve into the dungeons beneath Castle Blackmoor (which was represented by a Kibri kit model of Branzoll Castle). To explain his inspiration for the game, Arneson said: Arneson drew heavily upon the fantasy material in the Chainmail rules, written by Gygax and Jeff Perren and published in the spring of 1971, but after a short and unsatisfactory trial of the Fantasy Combat table found therein, he developed his own mix of rules, including elements adapted from his revision of a Civil War ironclads game. The gameplay would be recognizable to modern D&D players, featuring the use of hit points, armor class, character development, and dungeon crawls. This setting was fleshed out over time and continues to be played to the present day. Much of the medieval fantasy trope of D&D, such as the concept of adventuring in "dungeons", originated with Blackmoor, but it also incorporated time travel and science fiction elements. These are visible much later in the DA module series published by TSR (particularly City of the Gods), but were also present from the early to mid-1970s in the original campaign and parallel and intertwined games run by John Snider, whose ruleset developed from these adventures and was intended for publication by TSR from 1974 as the first science fiction RPG. Arneson described Blackmoor as "roleplaying in a non-traditional medieval setting. 
I have such things as steam power, gunpowder, and submarines in limited numbers. There was even a tank running around for a while. The emphasis is on the story and the roleplaying." Details of Blackmoor and the original campaign, established on the map of the Castle & Crusade Society's "Great Kingdom", were first brought to print briefly in issue #13 of the Domesday Book, the newsletter of the Castle & Crusade Society in July 1972, and later in much-expanded form as The First Fantasy Campaign, published by Judges Guild in 1977. In November 1972, Dave Arneson and Dave Megarry traveled to Lake Geneva to meet with Gary Gygax, to provide a demonstration of Blackmoor and Dungeon! While meeting at Gygax's house, Dave Arneson ran the Lake Geneva gamers through their first session of Blackmoor. Rob Kuntz describes Dave Arneson as the referee, and the Lake Geneva players as being Gary Gygax, Ernie Gygax, Terry Kuntz, and himself. Kuntz describes Dave Megarry as the de facto leader of the group, as he understood the Blackmoor game and campaign world. In Wargaming magazine, Rob Kuntz wrote a short summary of their first Blackmoor session. Dungeons & Dragons After playing in the Blackmoor game Arneson refereed, Gygax almost immediately began a similar campaign of his own, which he called "Greyhawk", and asked Arneson for a draft of his playing rules. The two then collaborated by phone and mail, and playtesting was carried out by their various groups and other contacts. Gygax and Arneson wanted to have the game published, but Guidon Games and Avalon Hill rejected it. Arneson could not afford to invest in the venture. Gygax felt that there was a need to publish the game as soon as possible, since similar projects were being planned elsewhere, so rules were hastily put together, and Arneson's own final draft was never used. 
Despite all this, Brian Blume eventually provided the funding required to publish the original Dungeons & Dragons set in 1974, with the initial print run of 1,000 selling out within a year, and sales increasing rapidly in subsequent years. Further rules and a sample dungeon from Arneson's original campaign (the first published RPG scenario in a professional publication) were released in 1975 in the Blackmoor supplement for D&D, named after the campaign setting. Blackmoor included new classes for monks and assassins, more monsters, and "The Temple of the Frog", the first published RPG adventure for other people to run. Arneson formally joined TSR as their Director of Research at the beginning of 1976, but left at the end of the year to pursue a career as an independent game designer. After TSR In 1977, despite the fact that he was no longer at TSR, Arneson published Dungeonmaster's Index, a 38-page booklet that indexed all of TSR's D&D properties to that point in time. TSR had agreed to pay Arneson royalties on all D&D products, but when the company came out with Advanced Dungeons & Dragons (AD&D) in 1977, it claimed that AD&D was a significantly different product and so did not pay him royalties for it. In response, Arneson filed the first of five lawsuits against Gygax and TSR in 1979. In March 1981, as part of a confidential agreement, Arneson and Gygax resolved the suits out of court by agreeing that they would both be credited as "co-creators" on the packaging of D&D products from that point on, and Arneson would receive a 2.5% royalty on all AD&D products. This provided him with a comfortable six-figure annual income for the next twenty years. However, it did not end the lingering tensions between them. Continuation of Blackmoor Arneson wrote up the Blackmoor setting for Judges Guild in The First Fantasy Campaign (1977). In 1979 Arneson and Richard L. 
Snider, an original Blackmoor player, co-authored Adventures in Fantasy, a role-playing game that attempted to recapture the "original spirit of the Role Playing Fantasy Game" that Arneson had envisioned in the early 1970s, instead of what D&D had become. In the early 1980s he established his own game company, Adventure Games – staffed largely by Arneson's friends, most of whom were also members of a Civil War reenactment group – that produced the miniatures games Harpoon (1981) and Johnny Reb (1983), as well as a new edition of his own Adventures in Fantasy role-playing game (1981). The company also put out about a half-dozen Tékumel-related books, due to Arneson's friendship with M. A. R. Barker. Adventure Games was profitable, but Arneson found the workload to be excessive and finally sold the company to Flying Buffalo. Flying Buffalo picked up the rights to Adventure Games in 1985; because Arneson owned a portion of Flying Buffalo, he let them take care of the rest of the company's stock and IP when he shut the company down. While Gary Gygax was president of TSR in the mid-1980s, he and Arneson reconnected, and Arneson briefly relinked Blackmoor to D&D with the "DA" (Dave Arneson) series of modules set in Blackmoor (1986–1987). The four modules, three of which were written by Arneson, detailed Arneson's campaign setting for the first time. When Gygax was forced out of TSR, Arneson's projects were dropped from the company before a planned fifth module could be published. Gygax and Arneson again went their separate ways. In 1986 Arneson wrote a new D&D module set in Blackmoor called "The Garbage Pits of Despair", which was published in two parts in Different Worlds magazine issues #42 and #43. Arneson and Dustin Clingman founded Zeitgeist Games to produce an updated d20 System version of the Blackmoor setting. Goodman Games published and distributed Dave Arneson's Blackmoor in 2004, and Goodman produced a few more Blackmoor products over the following year. 
Code Monkey Publishing released Dave Arneson's Blackmoor: The First Campaign (2009) for 4th edition D&D. Computer programming and education In 1988 Arneson stated his belief that RPGs, whether paper or computer, were still "hack and slash" and did not teach novices how to play, and that games like Ultima IV "have stood pretty much alone as quirks instead of trend setters" as others did not follow their innovations. He hoped that computer RPGs would teach newcomers how to role play while offering interesting campaigns, and said that SSI's Gold Box games did not innovate on the genre as much as he had hoped. Arneson stepped into the computer industry and founded 4D Interactive Systems, a computer company in Minnesota that has since dissolved. He also did some computer programming and worked on several games. He eventually found himself consulting with computer companies. Arneson wrote the 1989 adventure DNA / DOA, the first adventure published for the FASA fantasy/cyberpunk game Shadowrun, which was released the same year. Living in California in the late 1980s, Arneson had a chance to work with special education children. Upon returning to Minnesota, he pursued teaching and began speaking at schools about educational uses of role-playing and using multi-sided dice to teach math. In the 1990s he began working at Full Sail, a private university that teaches multimedia subjects, and continued there as an instructor of computer game design until 2008. At Full Sail University he taught the class "Rules of the Game", a class in which students learned how to accurately document and create rule sets for games that were balanced between mental challenges for the players and "physical" ones for the characters. He retired from the position on June 19, 2008. Other RPG involvements Arneson continued to play games his entire life, including D&D and military miniature games, and regularly attended an annual meeting to play the original Blackmoor in Minnesota. 
During the 1990s, he was invited to Brazil by Devir, a game publisher. He became friends with the owner of the publishing company and gave him his D&D "woodgrain" box and some of his books as a gift. In 1997, after Wizards of the Coast purchased TSR, Peter Adkison paid Arneson an undisclosed sum to free up D&D from royalties owed to Arneson; this allowed Wizards to retitle Advanced Dungeons & Dragons to simply Dungeons & Dragons. Around 2000, Arneson was working with videographer John Kentner on Dragons in the Basement (unreleased), a video documentary on the early history of role-playing games. Arneson describes the documentary: "Basically it is a series of interviews with original players ('How did D&D affect your life?') and original RPG designers like Marc Miller (Traveller) and M.A.R. Barker (Empire of the Petal Throne)." He also made a cameo appearance in the Dungeons & Dragons movie as one of many mages throwing fireballs at a dragon, although the scene was deleted from the completed movie. Personal life Arneson married Frankie Ann Morneau in 1984; they had one daughter, Malia, and two grandchildren. Arneson died on April 7, 2009, after battling cancer for two years. According to his daughter, Malia Weinhagen, "The biggest thing about my dad's world is he wanted people to have fun in life." Honors and tributes Arneson received numerous industry awards for his part in creating Dungeons & Dragons and other role-playing games. In 1984 he was inducted into the Academy of Adventure Gaming Arts and Design's Hall of Fame (also known as the Charles Roberts Awards Hall of Fame) and in 1999 was named by Pyramid magazine as one of The Millennium's Most Influential Persons, "at least in the realm of adventure gaming". He was honored as a "famous game designer" by being featured on the king of hearts in Flying Buffalo's 2008 Famous Game Designers Playing Card Deck. 
Three days after his death, Wizards of the Coast temporarily replaced the front page of the Dungeons & Dragons section of their web site with a tribute to Arneson. Other tributes in the gaming world included Order of the Stick #644, and Dork Tower for April 8, 2009. Video game publisher Activision Blizzard posted a tribute to Arneson on their website and on April 14, 2009, released patch 3.1 of the online role-playing game World of Warcraft, The Secrets of Ulduar, dedicated to Arneson. Turbine's Dungeons and Dragons Online added an in-game memorial altar to Arneson in the Ruins of Threnal location in the game. They also created an in-game item named the "Mantle of the Worldshaper" that is a reward for finishing the Threnal quest chain that is narrated by Arneson himself. The Mantle's description reads: "A comforting and inspiring presence surrounds you as you hold this cloak. Arcane runes run along the edges of the fine cape, and masterfully drawn on the silken lining is an incredibly detailed map of a place named 'Blackmoor'." On October 30, 2010, Full Sail University dedicated the student game development studio space as "Dave Arneson's Blackmoor Studios" in Arneson's honor. Since the release of the history of Braunstein in 2008 and of Playing at the World, a scholarly work by Jon Peterson, in 2012, the roles of Dave Wesely and Dave Arneson have been restored to the broader conversation on the origins of tabletop role-playing games. Robert Kuntz published Dave Arneson's True Genius in 2017 and gave interviews to Kotaku detailing how the gameplay of current tabletop role-playing games was designed by Arneson. In 2019, the documentary The Secrets of Blackmoor presented interviews with Dave Arneson's first players and acknowledged his innovations. Partial bibliography Source: Don't Give Up the Ship! 
(1972) (with Gary Gygax and Mike Carr) Dungeons & Dragons (1974) (with Gary Gygax) Blackmoor (1975) Dungeonmaster's Index (1977) The First Fantasy Campaign (1977) Adventures in Fantasy (1979) (with Richard L. Snider) Robert Asprin's Thieves' World (1981) (co-author) Citybook II – Port o' Call (1984) (co-author) Adventures in Blackmoor (D&D Module:DA1) (1986) (with David J. Ritchie) Temple of the Frog (D&D Module:DA2) (1986) (with David J. Ritchie) City of the Gods (D&D Module:DA3) (1987) (with David J. Ritchie) DNA/DOA (Shadowrun module 1) (1989) The Case of the Pacific Clipper (1991) The Haunted Lighthouse (Dungeon Crawl Classics Module #3.5) (2003) Dave Arneson's Blackmoor (2004) (lead designer) Player's Guide to Blackmoor (2006) References External links of Dave Arneson. "Dave Arneson Interview" by at Digital Entertainment News. "Dave Arneson Interview" by Andrew S. Bub at GameSpy, August 11, 2002. "Slice of SciFi #151: Interview with "Dungeons & Dragons" co-creator Dave Arneson" by Farpoint Media, February 8, 2008. Jeremy L.C. Jones. "If Their Hearts Are Pure: A Conversation with Dave Arneson", Kobold Quarterly no.9, 2009-04-11. Retrieved on 2009-05-03. Arneson's last known interview. 1947 births 2009 deaths American game designers Deaths from cancer in Minnesota Dungeons & Dragons game designers People from Hennepin County, Minnesota People from Lake Geneva, Wisconsin Role-playing game designers University of Minnesota College of Liberal Arts alumni
39911634
https://en.wikipedia.org/wiki/Samsung%20Galaxy%20Star
Samsung Galaxy Star
The Samsung Galaxy Star is a low-end smartphone manufactured by Samsung Electronics. It runs Android 4.1.2 (Jelly Bean); unofficial Android 4.4, 5.1, 6.0.1 and 7.1 ROMs also exist for it. It was announced in April 2013 and subsequently released in May 2013. It is the cheapest smartphone in the Samsung Galaxy series. Like all other Samsung Galaxy smartphones, the Galaxy Star runs on the Android mobile operating system. The phone is available in two versions: a single-SIM version (GT-S5280) and a dual-SIM version (GT-S5282). The phone competes with other low-cost smartphones such as those from the Nokia Asha series as well as low-cost smartphones from Indian manufacturers such as Micromax, Karbonn, Spice Digital, Lava International and Celkon. It is available in certain Asian countries such as India, Pakistan, Sri Lanka, Nepal, Bangladesh, Myanmar, the Philippines and Indonesia, where low-cost smartphones are very popular, as well as in Morocco, Algeria, South Africa, Portugal, France, Germany, Russia and Ukraine. A Brazilian version, dubbed the GT-S5283B, was also released. Specifications Hardware The Galaxy Star follows the candybar form factor for smartphones, and features a plastic exterior. The phone features a 1 GHz single-core ARM Cortex-A5 processor and a Mali-300 graphics processor, and comes equipped with 512 MB of RAM and 4 GB of internal storage, of which 2 GB is available to the user. The storage can be expanded by up to 32 GB through the use of a microSD card. The device features an accelerometer intended to translate natural gestures into commands on the phone; for example, if the phone is ringing and the user turns it face down, it will stop ringing, or if the user wishes to establish a Bluetooth or wireless internet connection, they can shake the device and it will automatically connect. However, users have found that these gestures are often poorly recognised, resulting in the device performing unwanted tasks. 
It is not the first device to have these features; the HTC Desire Z introduced them in 2010. It features a capacitive QVGA LCD touchscreen which measures 3 inches, with a resolution of 240x320 px at 133 ppi and multitouch support. It also provides a 2 MP rear camera with 2x zoom and QVGA video recording; however, no front-facing camera is provided. The device requires the use of microSIM cards. The phone runs on a 1200 mAh lithium-ion battery. The Galaxy Star's battery drains quickly, but with moderate use on a single SIM, it can last up to two days. Galaxy Star GT-S5282 The Galaxy Star GT-S5280 and the Galaxy Star GT-S5282 are almost the same, the only difference between them being that the latter is a dual-SIM phone with a second SIM card slot. Only one SIM is used at a time; a SIM card manager in the device is used to switch between them. Connectivity Unlike most other smartphones, the Galaxy Star runs on EDGE networks alone and does not support 3G connectivity. It also does not provide a Global Positioning System (GPS) receiver. However, it supports Wi-Fi connectivity and Bluetooth 4.0. Software Like all other Samsung Galaxy smartphones, the Galaxy Star runs on the Android mobile operating system. It runs a modified version of Android 4.1.2 Jelly Bean; Samsung has stated that, because of these modifications, CPU usage is conducted in a more efficient way to ensure longer battery life. Again, as with other Samsung smartphones, the device also uses Samsung's TouchWiz UX Nature user interface as the default user interface, though it is also possible to use other third-party user interfaces. By default, as in other Android smartphones, Google products such as Google Chrome, Gmail, Google+ and Google Hangouts are provided on the phone. 
The device also has access to the Google Play store, but being a low-end smartphone, many newer applications, as well as applications requiring a large amount of processing power and memory, cannot be installed on the device. Samsung's applications such as ChatON and Samsung Apps are also preinstalled, as in other Samsung smartphones. However, games like Dead Trigger, Temple Run, Temple Run 2, Temple Run: Oz, Angry Birds and Jetpack Joyride work with ease. Reception The Samsung Galaxy Star received mixed reviews upon its release. While it has been praised for its price, user experience and battery life, it has also been criticised for its small screen, lack of features and performance. Many users have found that the device's capabilities are basic, seeing it as a low-end smartphone. According to The Times of India, the Galaxy Star looks "cute", has a decent screen, has a decent battery life and provides a good user experience, but has its drawbacks, with a small screen, low resolution and an underwhelming performance. ReviewGuidelines.com has praised the phone's memory, Wi-Fi, Bluetooth, touchscreen display and design, while criticising the lack of features and functionality. According to Techpinas.com, the Samsung Galaxy Star is a decent device capable of providing users with an adequate mobile experience. The screen is bare basic, yet this is understandable considering the device is a low-end smartphone. Furthermore, it excels in terms of basic functions such as web browsing and communications. Many individual reviewers on the Internet have criticized the device for its lack of a GPS, which has become the norm for modern-day smartphones. Reviewers have agreed that the phone excels in terms of battery life, which is a result of Samsung modifying the operating system to control battery drainage. Although the device has a 2 MP fixed-focus rear camera, reviewers have found that it is capable of taking clear pictures with adequate picture quality. 
References External links Android (operating system) devices Mobile phones introduced in 2013 Galaxy Star Galaxy Star Smartphones Discontinued smartphones
10795822
https://en.wikipedia.org/wiki/The%20Pacifist
The Pacifist
"The Pacifist" is a science fiction short story by British writer Arthur C. Clarke, first published in 1956 in Fantastic Universe. It appears in his collection of "science fiction tall tales," Tales from the White Hart. The story deals with a computer programmer's revenge on his unreasonable military boss by tinkering with software code in a way that makes his boss the laughingstock of the organization. "The Pacifist" details the construction of a supercomputer within "a cavern in Kentucky" (Clarke may have been thinking of Mammoth Cave, then suspected (and later known) to be the world's longest known cave system). The purpose of the computer is military battle simulation, and the details of all known historical battles have been stored in the computer's data banks. The computer's designer, nicknamed "Dr. Milquetoast" by the story-within-a-story's narrator, works under the harsh supervision of a military General. By way of revenge, Dr. Milquetoast programs the computer so that it will answer purely theoretical or mathematical questions put to it, but when asked to solve a military problem, responds by insulting the General using phrases industriously prepared by the programmer. Frustration mounts as the General realizes that because the computer is aware of every known historical military battle, it is capable of recognizing such scenarios even when couched in purely mathematical terms. Analysis The story presents a computer programmer as an "everyman," a downtrodden and unappreciated worker who has the last laugh on his tormentor. In 1956, computers were rare, and computer programmers were regarded as an engineering elite. Although the character of Dr. Milquetoast is depicted in a classic stereotypical fashion for his character's archetype, the frustration he feels, at the hands of the General, humanizes the character. 
Published near the beginning of the Cold War, "The Pacifist" satirizes the military-industrial complex (although the term would not come into wide use for another five years). The involvement of civilian scientists in military projects was familiar to the reading public, notably the involvement of J. Robert Oppenheimer's team of nuclear scientists in the Manhattan Project, under the military leadership of General Leslie Groves. Legacy The story's theme of a military computer gone haywire has been used numerous times in written and filmed science fiction. Later examples include Dr. Strangelove (1964), Colossus: The Forbin Project (novel, 1966, film version, 1970), and WarGames (1983). References External links Short stories by Arthur C. Clarke 1956 short stories Works originally published in Fantastic Universe Kentucky in fiction Tales from the White Hart
1050980
https://en.wikipedia.org/wiki/VICE
VICE
The software program VICE, standing for VersatIle Commodore Emulator, is a free, cross-platform emulator for Commodore's 8-bit computers. It runs on Linux, Amiga, Unix, MS-DOS, Win32, Mac OS X, OS/2, RISC OS, QNX, GP2X, Pandora (console), Dingoo A320, Syllable, and BeOS host machines. VICE is free software, released under the GNU General Public License. VICE for Microsoft Windows (Win32) is known as WinVICE, the OS/2 variant is called Vice/2, and the emulator running on BeOS is called BeVICE. History Development of VICE was begun in 1993 by Finnish programmer Jarkko Sonninen, the founder of the project. Sonninen retired from the project in 1994. As of version 2.1, released December 19, 2008, VICE emulates the Commodore 64, the C128, the VIC-20, the Plus/4, the C64 Direct-to-TV (with its additional video modes) and all the PET models including the CBM-II but excluding the 'non-standard' features of the SuperPET 9000. WinVICE supports digital joysticks via a parallel port driver and, with a CatWeasel PCI card, is planned to support hardware SID playback (this requires an optional SID chip installed in the card's socket). As of 2004, VICE was one of the most widely used emulators of the Commodore 8-bit personal computers. It is also one of the few usable Commodore emulators to exist on free *NIX platforms, and one of the first to be distributed under the GNU GPL. It is available for most Linux and BSD distributions. As of version 3.4, support has been terminated for Syllable Desktop, SCO, QNX4, QNX6, SGI, AIX, OPENSTEP/NeXTSTEP/Rhapsody, and Solaris/OpenIndiana, as well as remaining traces of support for Minix, NeXT, SKYOS, UNIXWARE, and Sortix, due to lack of staff. As of version 3.5, explicit support for OS/2 and AmigaOS has been terminated due to the transition to the Gtk3 UI. Bibliography Simon Carless, Gaming hacks, O'Reilly Media, 2004, , pp. 
5–8 Jason Kroll, Commodore 64 Game Emulation, Linux Journal, Issue 72, April 2000 See also Commodore 64 CCS64 References External links , with Online manual (HTML) VICE knowledge base (preliminary) Guide for installation on Linux and Windows PRG Starter Simplifies VICE usage Windows binaries automated nightly builds VICE.js JavaScript port of VICE AmigaOS 4 software AROS software BeOS software Commodore 64 emulators Commodore VIC-20 Amiga emulation software DOS emulation software GP2X emulation software MacOS emulation software MorphOS emulation software Linux emulation software Windows emulation software RISC OS software Free emulation software Unix emulation software Free and open-source Android software
19024734
https://en.wikipedia.org/wiki/Loop%20variant
Loop variant
In computer science, a loop variant is a mathematical function defined on the state space of a computer program whose value is monotonically decreased with respect to a (strict) well-founded relation by the iteration of a while loop under some invariant conditions, thereby ensuring its termination. A loop variant whose range is restricted to the non-negative integers is also known as a bound function, because in this case it provides a trivial upper bound on the number of iterations of a loop before it terminates. However, a loop variant may be transfinite, and thus is not necessarily restricted to integer values. A well-founded relation is characterized by the existence of a minimal element of every non-empty subset of its domain. The existence of a variant proves the termination of a while loop in a computer program by well-founded descent. A basic property of a well-founded relation is that it has no infinite descending chains. Therefore a loop possessing a variant will terminate after a finite number of iterations, as long as its body terminates each time. A while loop, or, more generally, a computer program that may contain while loops, is said to be totally correct if it is partially correct and it terminates. Rule of inference for total correctness To formally state the rule of inference for the termination of a while loop, recall that in Floyd–Hoare logic, the rule for expressing the partial correctness of a while loop is:

    {I ∧ C} S {I}
    ─────────────────────────
    {I} while (C) S {I ∧ ¬C}

where I is the invariant, C is the condition, and S is the body of the loop. To express total correctness, we write instead:

    < is well-founded,   [I ∧ C ∧ V = z] S [I ∧ V < z]
    ──────────────────────────────────────────────────
    [I] while (C) S [I ∧ ¬C]

where, in addition, V is the variant, and by convention the unbound symbol z is taken to be universally quantified. Every loop that terminates has a variant The existence of a variant implies that a while loop terminates. 
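For instance (an illustrative sketch added here, using the standard example of Euclid's algorithm rather than anything from the rules above), the remainder-based gcd loop admits the bound function V = b: the value of b is a non-negative integer that strictly decreases on every iteration, so the loop terminates.

```c
#include <assert.h>

/* Euclid's algorithm. Invariant I: gcd(a, b) equals the gcd of the
 * original inputs. Variant V = b: a non-negative integer that
 * strictly decreases on each iteration (since a % b < b whenever
 * b != 0), so the well-founded order on the naturals guarantees
 * termination. */
unsigned int gcd(unsigned int a, unsigned int b)
{
    while (b != 0) {             /* condition C */
        unsigned int v = b;      /* snapshot the variant */
        unsigned int t = a % b;  /* body S */
        a = b;
        b = t;
        assert(b < v);           /* variant strictly decreased */
    }
    return a;                    /* I && !C: a holds the gcd */
}
```

For example, gcd(48, 18) terminates after three iterations with result 6; the variant b runs through the strictly decreasing chain 18, 12, 6, 0.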
It may seem surprising, but the converse is true, as well, as long as we assume the axiom of choice: every while loop that terminates (given its invariant) has a variant. To prove this, assume that the loop terminates given the invariant I, where we have the total correctness assertion [I] while (C) S [I ∧ ¬C]. Consider the "successor" relation on the state space Σ induced by the execution of the statement S from a state satisfying both the invariant I and the condition C. That is, we say that a state σ′ is a "successor" of σ if and only if I and C are both true in the state σ, and σ′ is the state that results from the execution of the statement S in the state σ. We note that σ′ ≠ σ, for otherwise the loop would fail to terminate. Next consider the reflexive, transitive closure of the "successor" relation. Call this iteration: we say that a state τ is an iterate of σ if either τ = σ or there is a finite chain σ = σ₀, σ₁, …, σₙ = τ such that σᵢ₊₁ is a "successor" of σᵢ for all i, 0 ≤ i < n. We note that if σ and τ are two distinct states, and τ is an iterate of σ, then σ cannot be an iterate of τ, for again, otherwise the loop would fail to terminate. In other words, iteration is antisymmetric, and thus, a partial order. Now, since the while loop terminates after a finite number of steps given the invariant I, and no state has a successor unless I and C are both true in that state, we conclude that every state has only finitely many iterates, every descending chain with respect to iteration has only finitely many distinct values, and thus there is no infinite descending chain, i.e. loop iteration satisfies the descending chain condition. Therefore, assuming the axiom of choice, the "successor" relation we originally defined for the loop is well-founded on the state space Σ, since it is strict (irreflexive) and contained in the "iterate" relation. Thus the identity function on this state space is a variant for the while loop, as we have shown that the state must strictly decrease, as a "successor" and an "iterate", each time the body S is executed given the invariant I and the condition C. 
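The construction in this proof can be condensed symbolically (our paraphrase of the argument, writing Σ for the state space, ≻ for "successor", and ⊒ for "iterate"):

```latex
\begin{aligned}
\text{successor:}\quad & \sigma' \succ \sigma \iff I(\sigma) \wedge C(\sigma) \wedge \sigma' = S(\sigma) \\
\text{iterate:}\quad & \tau \sqsupseteq \sigma \iff \tau = \sigma \;\vee\; \exists\,\sigma_0,\dots,\sigma_n:\ \sigma_0 = \sigma,\ \sigma_n = \tau,\ \sigma_{i+1} \succ \sigma_i \text{ for } 0 \le i < n \\
\text{termination}\; &\Rightarrow\; \text{no infinite } \succ\text{-chains} \;\Rightarrow\; \succ \text{ is well-founded on } \Sigma, \\
&\text{so the identity map } V(\sigma) = \sigma \text{ is a variant for the loop with respect to } \succ.
\end{aligned}
```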
Moreover, we can show by a counting argument that the existence of any variant implies the existence of a variant in ω₁, the first uncountable ordinal, i.e., a variant V : Σ → ω₁. This is because the collection of all states reachable by a finite computer program in a finite number of steps from a finite input is countably infinite, and ω₁ is the enumeration of all well-order types on countable sets. Practical considerations In practice, loop variants are often taken to be non-negative integers, or even required to be so, but the requirement that every loop have an integer variant removes the expressive power of unbounded iteration from a programming language. Unless such a (formally verified) language allows a transfinite proof of termination for some other equally powerful construct such as a recursive function call, it is no longer capable of full μ-recursion, but only primitive recursion. Ackermann's function is the canonical example of a recursive function that cannot be computed in a loop with an integer variant. In terms of their computational complexity, however, functions that are not primitive recursive lie far beyond the realm of what is usually considered tractable. Considering even the simple case of exponentiation as a primitive recursive function, and that the composition of primitive recursive functions is primitive recursive, one can begin to see how quickly a primitive recursive function can grow. And any function that can be computed by a Turing machine in a running time bounded by a primitive recursive function is itself primitive recursive. So it is difficult to imagine a practical use for full μ-recursion where primitive recursion will not do, especially since the former can be simulated by the latter up to exceedingly long running times. 
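The Ackermann function named above can be written directly with unrestricted recursion; a minimal sketch (the termination argument needs a lexicographic, transfinite-style variant on the pair (m, n) rather than a single integer):

```c
#include <assert.h>

/* Ackermann's function: total and computable, but not primitive
 * recursive, so no single loop with an integer variant computes it
 * for all inputs. The recursion terminates because the pair (m, n)
 * strictly decreases in the well-founded lexicographic order on
 * pairs of naturals at every recursive call. */
unsigned long ackermann(unsigned long m, unsigned long n)
{
    if (m == 0)
        return n + 1;
    if (n == 0)
        return ackermann(m - 1, 1);          /* (m, 0) -> (m-1, 1) */
    return ackermann(m - 1,
                     ackermann(m, n - 1));   /* (m, n) -> (m, n-1), then (m-1, _) */
}
```

Even tiny inputs grow quickly: ackermann(2, 3) is 9, while ackermann(3, 3) is already 61.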
And in any case, Kurt Gödel's first incompleteness theorem and the halting problem imply that there are while loops that always terminate but cannot be proven to do so; thus it is unavoidable that any requirement for a formal proof of termination must reduce the expressive power of a programming language. While we have shown that every loop that terminates has a variant, this does not mean that the well-foundedness of the loop iteration can be proven. Example Here is an example, in C-like pseudocode, of an integer variant computed from some upper bound on the number of iterations remaining in a while loop. However, C allows side effects in the evaluation of expressions, which is unacceptable from the point of view of formally verifying a computer program.

/** condition-variable, which is changed in procedure S() **/
bool C;

/** function, which computes a loop iteration bound without side effects **/
inline unsigned int getBound();

/** body of loop must not alter V **/
inline void S();

int main()
{
    unsigned int V = getBound(); /* set variant equal to bound */
    assert(I); /* loop invariant */
    while (C) {
        assert(V > 0); /* this assertion is the variant's raison d'être (reason of existence) */
        S(); /* call the body */
        V = min(getBound(), V - 1); /* variant must decrease by at least one */
    };
    assert(I && !C); /* invariant is still true and condition is false */
    return 0;
};

Why even consider a non-integer variant? Why even consider a non-integer or transfinite variant? This question has been raised because in all practical instances where we want to prove that a program terminates, we also want to prove that it terminates in a reasonable amount of time. There are at least two possibilities:
- An upper bound on the number of iterations of a loop may be conditional on proving termination in the first place.
- It may be desirable to separately (or progressively) prove the three properties of partial correctness, termination, and running time. 
Generality: considering transfinite variants allows all possible proofs of termination for a while loop to be seen in terms of the existence of a variant. See also While loop Loop invariant Transfinite induction Descending chain condition Large countable ordinal Correctness (computer science) Weakest-preconditions of While loop References Formal methods Control flow
66181
https://en.wikipedia.org/wiki/Role-based%20access%20control
Role-based access control
In computer systems security, role-based access control (RBAC) or role-based security is an approach to restricting system access to authorized users. It is an approach to implement mandatory access control (MAC) or discretionary access control (DAC). Role-based access control (RBAC) is a policy-neutral access-control mechanism defined around roles and privileges. The components of RBAC such as role-permissions, user-role and role-role relationships make it simple to perform user assignments. A study by NIST has demonstrated that RBAC addresses many needs of commercial and government organizations. RBAC can be used to facilitate administration of security in large organizations with hundreds of users and thousands of permissions. Although RBAC is different from MAC and DAC access control frameworks, it can enforce these policies without any complication. Design Within an organization, roles are created for various job functions. The permissions to perform certain operations are assigned to specific roles. Members or staff (or other system users) are assigned particular roles, and through those role assignments acquire the permissions needed to perform particular system functions. Since users are not assigned permissions directly, but only acquire them through their role (or roles), management of individual user rights becomes a matter of simply assigning appropriate roles to the user's account; this simplifies common operations, such as adding a user, or changing a user's department. Role based access control interference is a relatively new issue in security applications, where multiple user accounts with dynamic access levels may lead to encryption key instability, allowing an outside user to exploit the weakness for unauthorized access. Key sharing applications within dynamic virtualized environments have shown some success in addressing this problem. 
Three primary rules are defined for RBAC: Role assignment: A subject can exercise a permission only if the subject has selected or been assigned a role. Role authorization: A subject's active role must be authorized for the subject. With rule 1 above, this rule ensures that users can take on only roles for which they are authorized. Permission authorization: A subject can exercise a permission only if the permission is authorized for the subject's active role. With rules 1 and 2, this rule ensures that users can exercise only permissions for which they are authorized. Additional constraints may be applied as well, and roles can be combined in a hierarchy where higher-level roles subsume permissions owned by sub-roles. With the concepts of role hierarchy and constraints, one can control RBAC to create or simulate lattice-based access control (LBAC). Thus RBAC can be considered to be a superset of LBAC. When defining an RBAC model, the following conventions are useful: S = Subject = A person or automated agent R = Role = Job function or title which defines an authority level P = Permissions = An approval of a mode of access to a resource SE = Session = A mapping involving S, R and/or P SA = Subject Assignment PA = Permission Assignment RH = Partially ordered Role Hierarchy. RH can also be written: ≥ (The notation: x ≥ y means that x inherits the permissions of y.) A subject can have multiple roles. A role can have multiple subjects. A role can have many permissions. A permission can be assigned to many roles. An operation can be assigned to many permissions. A permission can be assigned to many operations. A constraint places a restrictive rule on the potential inheritance of permissions from opposing roles, thus it can be used to achieve appropriate separation of duties. For example, the same person should not be allowed to both create a login account and to authorize the account creation. 
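The three rules above, together with the S/R/P/SA/PA conventions, can be sketched as a minimal core-RBAC model. The following Python sketch is illustrative only; the class name `Rbac` and its methods are hypothetical and not part of any standard API:

```python
class Rbac:
    """A minimal core-RBAC sketch: subjects acquire permissions
    only through the role activated in their session."""

    def __init__(self):
        self.sa = {}       # subject assignment: subject -> set of roles
        self.pa = {}       # permission assignment: role -> set of permissions
        self.session = {}  # session: subject -> currently active role

    def assign_role(self, subject, role):
        self.sa.setdefault(subject, set()).add(role)

    def grant(self, role, permission):
        self.pa.setdefault(role, set()).add(permission)

    def activate(self, subject, role):
        # Rule 2 (role authorization): the active role must have been
        # assigned to the subject.
        if role not in self.sa.get(subject, set()):
            raise PermissionError(f"{subject} is not authorized for role {role}")
        self.session[subject] = role

    def check(self, subject, permission):
        # Rule 1 (role assignment): a subject with no active role
        # can exercise no permission.
        role = self.session.get(subject)
        if role is None:
            return False
        # Rule 3 (permission authorization): the permission must be
        # authorized for the subject's active role.
        return permission in self.pa.get(role, set())

rbac = Rbac()
rbac.assign_role("alice", "teller")
rbac.grant("teller", "create-credit-account")
rbac.activate("alice", "teller")
print(rbac.check("alice", "create-credit-account"))  # True
print(rbac.check("alice", "approve-loan"))           # False
```

Because permissions are looked up only via the session's active role, administration reduces to managing the SA and PA relations, as the Design section describes.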
Thus, using set theory notation: PA ⊆ P × R, a many-to-many permission-to-role assignment relation; SA ⊆ S × R, a many-to-many subject-to-role assignment relation. A subject may have multiple simultaneous sessions with/in different roles. Standardized levels The NIST/ANSI/INCITS RBAC standard (2004) recognizes three levels of RBAC: core RBAC hierarchical RBAC, which adds support for inheritance between roles constrained RBAC, which adds separation of duties Relation to other models RBAC is a flexible access control technology whose flexibility allows it to implement DAC or MAC. DAC with groups (e.g., as implemented in POSIX file systems) can emulate RBAC. MAC can simulate RBAC if the role graph is restricted to a tree rather than a partially ordered set. Prior to the development of RBAC, the Bell-LaPadula (BLP) model was synonymous with MAC and file system permissions were synonymous with DAC. These were considered to be the only known models for access control: if a model was not BLP, it was considered to be a DAC model, and vice versa. Research in the late 1990s demonstrated that RBAC falls in neither category. Unlike context-based access control (CBAC), RBAC does not look at the message context (such as a connection's source). RBAC has also been criticized for leading to role explosion, a problem in large enterprise systems which require access control of finer granularity than what RBAC can provide, as roles are inherently assigned to operations and data types. In resemblance to CBAC, an Entity-Relationship Based Access Control (ERBAC, although the same acronym is also used for modified RBAC systems, such as Extended Role-Based Access Control) system is able to secure instances of data by considering their association to the executing subject. Comparing to ACL Access control lists (ACLs) are used in traditional discretionary access-control systems to affect low-level data-objects.
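Hierarchical RBAC's inheritance along the role hierarchy RH (where x ≥ y means x inherits the permissions of y) can likewise be sketched. The function and role names below are hypothetical, chosen only to illustrate the traversal:

```python
def effective_permissions(role, inherits, pa):
    """Collect the permissions of `role` plus everything inherited
    from its junior roles by walking the role hierarchy.

    inherits: role -> set of junior roles it inherits from (RH)
    pa:       role -> set of directly assigned permissions (PA)
    """
    perms, stack, seen = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:
            continue            # RH is a partial order; guard against revisits
        seen.add(r)
        perms |= pa.get(r, set())
        stack.extend(inherits.get(r, set()))
    return perms

pa = {"employee": {"read-handbook"},
      "teller": {"create-credit-account"},
      "manager": {"approve-loan"}}
inherits = {"teller": {"employee"},
            "manager": {"teller"}}

# manager >= teller >= employee, so manager holds all three permissions:
print(effective_permissions("manager", inherits, pa))
```

A constrained-RBAC implementation would add a check here that no subject ends up holding two roles declared as mutually exclusive, which is how separation of duties is enforced.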
RBAC differs from ACL in assigning permissions to operations which change the direct-relations between several entities (see: ACLg below). For example, an ACL could be used for granting or denying write access to a particular system file, but it wouldn't dictate how that file could be changed. In an RBAC-based system, an operation might be to 'create a credit account' transaction in a financial application or to 'populate a blood sugar level test' record in a medical application. A Role is thus a sequence of operations within a larger activity. RBAC has been shown to be particularly well suited to separation of duties (SoD) requirements, which ensure that two or more people must be involved in authorizing critical operations. Necessary and sufficient conditions for safety of SoD in RBAC have been analyzed. An underlying principle of SoD is that no individual should be able to effect a breach of security through dual privilege. By extension, no person may hold a role that exercises audit, control or review authority over another, concurrently held role. Then again, a "minimal RBAC Model", RBACm, can be compared with an ACL mechanism, ACLg, where only groups are permitted as entries in the ACL. Barkley (1997) showed that RBACm and ACLg are equivalent. In modern SQL implementations, like ACL of the CakePHP framework, ACLs also manage groups and inheritance in a hierarchy of groups. Under this aspect, specific "modern ACL" implementations can be compared with specific "modern RBAC" implementations, better than "old (file system) implementations". For data interchange, and for "high level comparisons", ACL data can be translated to XACML. Attribute-based access control Attribute-based access control or ABAC is a model which evolves from RBAC to consider additional attributes in addition to roles and groups. In ABAC, it is possible to use attributes of: the user e.g. citizenship, clearance, the resource e.g. 
classification, department, owner, the action, and the context e.g. time, location, IP. ABAC is policy-based in the sense that it uses policies rather than static permissions to define what is allowed or what is not allowed. Use and availability The use of RBAC to manage user privileges (computer permissions) within a single system or application is widely accepted as a best practice. A 2010 report prepared for NIST by the Research Triangle Institute analyzed the economic value of RBAC for enterprises, and estimated benefits per employee from reduced employee downtime, more efficient provisioning, and more efficient access control policy administration. In an organization with a heterogeneous IT infrastructure and requirements that span dozens or hundreds of systems and applications, using RBAC to manage sufficient roles and assign adequate role memberships becomes extremely complex without hierarchical creation of roles and privilege assignments. Newer systems extend the older NIST RBAC model to address the limitations of RBAC for enterprise-wide deployments. The NIST model was adopted as a standard by INCITS as ANSI/INCITS 359-2004. A discussion of some of the design choices for the NIST model has also been published. See also References Further reading External links FAQ on RBAC models and standards Role Based Access Controls at NIST XACML core and hierarchical role based access control profile Institute for Cyber Security at the University of Texas San Antonio Practical experiences in implementing RBAC Computer security models Access control
54447719
https://en.wikipedia.org/wiki/Michael%20F.%20Cohen
Michael F. Cohen
Michael F. Cohen is an American computer scientist and researcher in computer graphics. He was a Senior Research Scientist at Microsoft Research for 21 years until he joined Facebook Research in 2015. In 1998, he received the ACM SIGGRAPH CG Achievement Award for his work in developing radiosity methods for realistic image synthesis. He was elected a Fellow of the Association for Computing Machinery in 2007 for his "contributions to computer graphics and computer vision." In 2019, he received the ACM SIGGRAPH Steven A. Coons Award for Outstanding Creative Contributions to Computer Graphics for “his groundbreaking work in numerous areas of research—radiosity, motion simulation & editing, light field rendering, matting & compositing, and computational photography”. Education Cohen attended Beloit College for his undergraduate degree, graduating with a BA in art in 1976, and Rutgers University, graduating with a BS in civil engineering in 1983. He received his master's degree in computer graphics from Cornell University in 1985. He received his PhD in computer science in 1992 from the University of Utah. Scientific career At Cornell University's Program of Computer Graphics, Cohen served as an Assistant Professor of Architecture from 1985–1989. His first major research contributions were in the area of photorealistic rendering, in particular, in the study of radiosity: the use of finite elements to solve the rendering equation for environments with diffusely reflecting surfaces. His most significant results included the hemicube (1985), for computing form factors in the presence of occlusion; an experimental evaluation framework (1986), one of the first studies to quantitatively compare real and synthetic imagery; extending radiosity to non-diffuse environments (1986); integrating ray tracing with radiosity (1987); and progressive refinement (1988), to make interactive rendering possible.
After completing his PhD, he joined the Computer Science faculty at Princeton University where he continued his work on Radiosity, including wavelet radiosity (1993), a more general framework for hierarchical approaches; and “radioptimization” (1993), an inverse method to solve for lighting parameters based on user-specified objectives. All of this work culminated in a 1993 textbook with John Wallace, Radiosity and Realistic Image Synthesis. In a very different research area, Cohen made significant contributions to motion simulation and editing, most significantly: dynamic simulation with kinematic constraints (1987), which for the first time allowed animators to combine kinematic and dynamic specifications; interactive spacetime control for animation (1992), which combined physically-based and user-defined constraints for controlling motion. In 1994, Cohen moved to Microsoft Research (MSR) where he stayed for 21 years. There he continued work on motion synthesis; motion transitions using spacetime constraints (1996), which allowed seamless and plausible transitions between motion segments for systems with many degrees of freedom such as the human body; motion interpolation (1998), a system for real-time interpolation of parameterized motions; and artist-directed inverse kinematics (2001), which allowed a user to position an articulated character at high frame rates, for real-time applications such as games. In addition, at Microsoft Research, in his groundbreaking and most-cited work, Cohen and colleagues introduced the Lumigraph (1996), a method for capturing and representing the complete appearance—from all points of view—of either a synthetic or real-world object or scene. 
Building on this work, Cohen published important follow-on papers in view-based rendering (1997), which used geometric information to create views of a scene from a sparse set of images; and unstructured Lumigraph rendering (2001), which generalized light field and view-dependent texture mapping methods in a single framework. In subsequent work, Cohen significantly advanced the state of the art in matting & compositing, with papers on image and video segmentation based on anisotropic kernel mean-shift (2004); video cutout (2004), which preserved smoothness across space and time; optimized color sampling (2005), which improved previous approaches to image matting by analyzing the confidence of foreground and background color samples; and soft scissors (2007), the first interactive tool for high-quality image matting and compositing. Most recently, Cohen has turned his attention to computational photography, publishing numerous highly creative, landmark papers: interactive digital photomontage (2004), for combining parts of photographs in various novel ways; flash/no-flash image pairs (2004), for combining images taken with and without flash to synthesize new higher-quality results than either image alone; panoramic video textures (2005), for converting a panning video over a dynamic scene to a high-resolution continuously playing video; gaze-based photo cropping (2006); multi-viewpoint panoramas (2006), for photographing and rendering very long scenes; the Moment Camera (2007) outlining general principles for capturing subjective moments; joint bilateral upsampling (2007), for fast image enhancement using a down-sampled image; gigapixel images (2007), a means to acquire extremely high-resolution images with an ordinary camera on a specialized mount; deep photo (2008), a system for enhancing casual outdoor photos by combining them with existing digital terrain data; image deblurring (2010), to deblur images captured with camera shake; GradientShop (2010), which 
unified previous gradient-domain solutions under a single optimization framework; and ShadowDraw (2011), a system for guiding the freeform drawing of objects. In 2015, Cohen moved to Facebook where he has directed the Computational Photography group. His group's best known product is 3D Photos, first appearing in the Facebook feed in 2018. Cohen is a longtime volunteer in the ACM SIGGRAPH community. He served on the SIGGRAPH Papers Committee eleven times, and as SIGGRAPH Papers Chair in 1998. Cohen also served as SIGGRAPH Awards Chair from 2013 until 2018. He was a keynote speaker at SIGGRAPH Asia 2018. Cohen also serves as an Affiliate Professor of Computer Science and Engineering at the University of Washington. References Living people Computer graphics researchers Beloit College alumni Rutgers University alumni Cornell University alumni University of Utah alumni Microsoft employees Facebook employees Fellows of the Association for Computing Machinery Year of birth missing (living people)
40471839
https://en.wikipedia.org/wiki/Intel%20MPX
Intel MPX
Intel MPX (Memory Protection Extensions) was a set of extensions to the x86 instruction set architecture. With compiler, runtime library and operating system support, Intel MPX claimed to enhance the security of software by checking pointer references whose normal compile-time intentions are maliciously exploited at runtime due to buffer overflows. In practice, too many flaws have been discovered in the design for it to be useful, and support has been deprecated or removed from most compilers and operating systems. Intel has listed MPX as removed from 2019-onward hardware in section 2.5 of its Intel® 64 and IA-32 Architectures Software Developer's Manual, Volume 1. Extensions Intel MPX introduces new bounds registers, and new instruction set extensions that operate on these registers. Additionally, there is a new set of "bound tables" that store bounds beyond what can fit in the bounds registers. MPX uses four new 128-bit bounds registers, BND0 to BND3, each storing a pair of 64-bit lower bound (LB) and upper bound (UB) values of a buffer. The upper bound is stored in ones' complement form, with BNDMK (create bounds) and BNDCU (check upper bound) performing the conversion. The architecture includes two configuration registers BNDCFGx (BNDCFGU in user space and BNDCFGS in kernel mode), and a status register BNDSTATUS, which provides a memory address and error code in case of an exception. Two-level address translation is used for storing bounds in memory. The top layer consists of a Bounds Directory (BD) created on application startup. Each BD entry is either empty or contains a pointer to a dynamically created Bounds Table (BT), which in turn contains a set of pointer bounds along with the linear addresses of the pointers. The bounds load (BNDLDX) and store (BNDSTX) instructions transparently perform the address translation and access bounds in the proper BT entry. Intel MPX was introduced as part of the Skylake microarchitecture.
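Conceptually, each check compares a pointer value against the LB/UB pair held in a bounds register. The following Python sketch is a software illustration only: the real hardware stores UB in complement form and raises a #BR exception rather than a language-level error, and the names `bnd_check` and `BoundsError` are hypothetical:

```python
class BoundsError(Exception):
    """Stand-in for the #BR (Bound Range Exceeded) hardware exception."""

def bnd_check(addr, lb, ub):
    """Emulate the lower- and upper-bound check that instructions
    such as BNDCL/BNDCU perform on a pointer value; lb/ub play the
    role of the LB/UB halves of a BNDx register."""
    if addr < lb or addr > ub:
        raise BoundsError(f"address {addr:#x} outside [{lb:#x}, {ub:#x}]")
    return addr

base, size = 0x1000, 64
lb, ub = base, base + size - 1      # BNDMK would derive these from the buffer
bnd_check(base + 10, lb, ub)        # in bounds: passes silently
try:
    bnd_check(base + size, lb, ub)  # one past the end: out of bounds
except BoundsError as exc:
    print(exc)
```

The per-pointer lookup of such bounds through the BD/BT tables is one source of the runtime overhead discussed in the analysis below.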
Intel Goldmont microarchitecture also supports Intel MPX. Software support glibc removed support in version 2.35. GNU Compiler Collection (GCC) 5.0 added support for MPX. In 2018, support for these extensions waned due to maintenance burdens and Intel developers only intermittently contributing patches, resulting in a proposal to drop support in GCC 9.0. Support was removed in GCC 9.1. Intel C++ Compiler (icc) 15.0 added support for Intel MPX. Kernel-level software support for Intel MPX was merged into the Linux kernel mainline in kernel version 3.19, which was released on February 8, 2015. In 2018, Thomas Gleixner proposed removing MPX support from Linux kernel 4.18. The pull request with its removal was posted in December 2018, during the 4.20 development cycle, but wasn't accepted. A second attempt was made in July 2019. MPX support was removed in 5.6. QEMU supported MPX from version 2.6 and dropped its support in the 4.0 release. Microsoft Visual Studio 2015 Update 1 added experimental support for MPX. Analysis of Intel MPX A study presented a detailed cross-layer dissection of the MPX system stack and a comparison with three prominent software-based memory protection mechanisms (AddressSanitizer, SAFECode, and SoftBound), reaching the following conclusions. Even though Intel MPX is a specially designed hardware-assisted approach with its own added set of hardware registers, it is not faster than any of the software-based approaches. New Intel MPX instructions can cause up to 4× slowdown in the worst case, although compiler optimizations amortize it and lead to runtime overheads of ~50% on average. In contrast to the other software-based solutions, Intel MPX provides no protection against temporal memory safety errors. Intel MPX does not support multithreading inherently, which can lead to unsafe data races in legacy threaded programs if compilers do not synchronize bounds explicitly.
Intel MPX does not support several common C/C++ programming idioms due to restrictions on the allowed memory layout. Intel MPX conflicts with some other ISA extensions, resulting in performance and security issues. More specifically, these issues arise when Intel MPX is used in combination with other hardware-based protection mechanisms such as Intel TSX and Intel SGX. Lastly, MPX instructions incur a significant performance penalty (15+%) even on Intel CPUs without MPX support. In addition, a review concluded MPX was not production-ready, and that AddressSanitizer was a better option. A review by Kostya Serebryany at Google, AddressSanitizer's developer, had similar findings. Meltdown Another study exploring the scope of the Spectre and Meltdown security vulnerabilities discovered that Meltdown can be used to bypass Intel MPX, using the Bound Range Exceeded (#BR) hardware exception. According to their publication, the researchers were able to leak information through a Flush+Reload covert channel from an out-of-bound access on an array safeguarded by the MPX system. Their proof of concept has not been publicly disclosed. See also Memory protection keys Software Guard Extensions References External links Online supplementary material at https://intel-mpx.github.io. X86 instructions
4792747
https://en.wikipedia.org/wiki/University%20of%20Agriculture%2C%20Faisalabad
University of Agriculture, Faisalabad
The University of Agriculture (UAF) is a public research university in Faisalabad, Pakistan. UAF is the oldest agricultural institution in South Asia. It was ranked 4th in Pakistan and 1st in the field of Agriculture and Veterinary Sciences by the Higher Education Commission (Pakistan) (HEC) ranking in 2015. It was also ranked No. 127 in Life Sciences and Agriculture Sciences (Shanghai Ranking 2019). Pakistan's employability ranking has made the institution highly reputable, and the university is among the top 5 institutions of Pakistan in research power. Prof. Dr. Iqrar Ahmad Khan is serving as Vice Chancellor of the university. History Origins The University of Agriculture, Faisalabad (formally: Punjab Agricultural College and Research Institute) is a university in the city of Faisalabad, Punjab, Pakistan. It was established in 1906 as the first major institution of higher agricultural education in the undivided Punjab. At independence in 1947, Pakistan was a predominantly agricultural country. In spite of subsequent industrialization and development, agriculture remains central to its economy. After independence, the Government of Pakistan appointed the National Commissions on Food and Education to reform the existing agrarian system and to formulate measures for developing better agricultural potential. The commissions made a plea for the establishment of an agricultural university for research and education. The university was established by upgrading the former Punjab Agricultural College and Research Institute in 1961. The institution dates to 1906, when its foundation stone was laid by Sir Louis Dane, the then Lieutenant-Governor of the Punjab. The 1,950-acre campus represents both history and the present era. The new campus is lush green with a conglomeration of monolithic blocks built in modern style. The old campus is reminiscent of traditional Muslim architecture.
The main campus is situated in the centre of the city (12 km northeast of Faisalabad International Airport), about 2 km from the city centre/clock tower. UAF owns and operates major cultural assets such as the Zoology Museum, the UAF Art Gallery, and the University Library. The university employs more than 800 teachers. The university has boys' and girls' hostels; it accommodates over 15,000 students and approximately 12,000 day scholars. Premier hostels are Jinnah Hostel (named after the founder of Pakistan, Muhammad Ali Jinnah), Iqbal Hall, Fatima Jinnah Hostel and Hailey Hall. Organisation and administration Faculties and departments UAF is organised into faculties of Agriculture, Agricultural Engineering and Technology, Food Nutrition and Home Economics, Social Sciences, Animal Husbandry, Food Sciences, Biotechnology, Biochemistry, Plant and Animal Breeding, Veterinary Science, Doctor of Pharmacy, Mathematics, Statistics and Sciences. A community college is on the main campus in Faisalabad. The College of Veterinary Sciences at Lahore was affiliated and is now a distinct entity known as the University of Veterinary and Animal Sciences, Lahore (UVAS). These units offer programmes leading to bachelor's, master's, and doctorate degrees in agriculture and allied fields. HEC ranks the University of Agriculture, Faisalabad as the best in the agriculture/veterinary field in Pakistan. The University of Agriculture Faisalabad is ranked number 4 among the top 10 universities in Pakistan. Faculty of Agriculture The Faculty of Agriculture comprises six departments — Agronomy, Plant Breeding and Genetics, Agricultural Entomology, Plant Pathology, Crop Physiology, and Forestry, Wildlife & Range Management — and three institutes — the Institute of Soil & Environmental Sciences, the Institute of Horticultural Sciences, and the National Institute of Food Science and Technology.
Seven allied disciplines — Agricultural Economics, Marketing & Agribusiness, Agricultural Extension, Animal Sciences, Computer Science, Mathematics and Statistics — and one centre — the Centre of Agricultural Biochemistry and Biotechnology (CABB). A building called the U.S.-Pakistan Center for Advanced Studies in Agriculture and Food Security (USPCAS) was constructed in collaboration with the USA for food security and research purposes. Faculty of Agricultural Engineering and Technology The faculty includes the departments of Structural and Environmental Engineering, Energy Systems Engineering, Irrigation and Drainage, Food Engineering, Farm Machinery and Power, and Fibre Technology. Courses of study lead to the degrees of Bachelor of Science in Agricultural Engineering, Food Engineering and Textile Technology; M.Sc. in Food Technology, Fibre Technology and Agricultural Engineering; and Doctor of Philosophy in Food Technology and Agricultural Engineering. The Department of Agricultural Engineering & Technology was established in 1961. The main objective of the faculty was to train manpower to cater to the growing needs of mechanized farming in the province of Punjab, Pakistan. This is one of the pioneer faculties in the field of agriculture in Pakistan. Department of Food Engineering The University of Agriculture, Faisalabad is the pioneer in launching the Food Engineering Program in Pakistan. The Department of Food Engineering has had the highest admission merit since 2013 and is among the leading departments in the country. The department admitted its first students in the 2011–12 academic year. The mission of the Food Engineering Department is to educate students with the knowledge and skills required to design and produce safe, high-quality, and economical food products and processes, and to conduct research and development for the benefit of the growing food industry and society.
Department of Farm Machinery and Power The department is one of the prime departments of the Faculty of Agricultural Engineering and Technology. This department, in collaboration with other departments, offers a baccalaureate program in Agricultural Engineering. Department of Structures & Environmental Engineering The department provides facilities such as updated research laboratories, lecture rooms for undergraduate and postgraduate students, and a computer laboratory. A wide range of state-of-the-art equipment has been provided to facilitate high-tech research work. Department of Irrigation & Drainage Department of Fibre and Textile Technology The Department of Fibre Technology was established in the Faculty of Agricultural Engineering & Technology in 1964. It is the sole department in the country which offers a master's degree programme in Fibre Technology. The main objective of this department is to impart scientific knowledge and technological training to students in the fields of fibre science, textile technology (spinning, knitting, dyeing, finishing, testing & quality control), and pulp, paper and cardboard technology to meet the demand of the fast-growing industry. Faculty of Veterinary Science The faculty includes the Department of Anatomy, the Institute of Pharmacy, Physiology and Pharmacology, and the departments of Pathology, Parasitology, Microbiology, Clinical Medicine and Surgery, and Theriogenology. Candidates who hold degrees in biological sciences are considered for graduate admission in non-clinical subjects. Students are enrolled after an entry test held by the university for the degree of Doctor of Veterinary Medicine (D.V.M.), and for Pharm.D, M.Phil and Ph.D. degrees. Professor Dr. Zafar Iqbal Qureshi is the Dean.
Faculty of Social Sciences The Faculty of Agricultural Economics and Rural Sociology, established in 1963, with its constituent departments of Agricultural Economics, Farm Management, Agricultural Marketing, Cooperation and Credit and Rural Sociology, has now been functioning for over 51 years in the University of Agriculture, Faisalabad. It has endeavored to keep abreast with requirements in terms of academic programmes/activities over these years. However, looking at the likely shape of agricultural development, the direction and quality of research and the make-up of trained manpower output in the discipline of agricultural social sciences, the future trends need to be looked into seriously so as to be able to meet the challenges of the 21st century. The Faculty was restructured in 2012. The name of faculty was changed to Faculty of Social Sciences. The departments of Agricultural Economics, Development Economics, and Environmental & Resource Economics were merged into a new institute, Institute of Agricultural & Resource Economics (IARE). Professor Dr. Muhammad Ashfaq is the first director of IARE, having served from June 2012. 
Institute of Agricultural and Resource Economics now offers degree programs serving the needs of society: BSc (Hons) Agricultural & Resource Economics (4 years) MSc (Hons) Agricultural Economics (2 years) MSc (Hons) Development Economics (2 years) MSc (Hons) Environmental & Resource Economics (2 years) MSc Economics (Distance Learning Program) (2 years) M.Phil Economics (2 years) PhD Agricultural Economics PhD Environmental & Resource Economics PhD Economics The Faculty of Social Sciences is divided into the following institutes: Institute of Agricultural and Resource Economics Institute of Business Management Sciences (offers programs at undergraduate and postgraduate level) Institute of Agricultural Extension and Rural Development (offers programs at undergraduate, postgraduate, and doctoral level) Department of Rural Sociology (offers programs at postgraduate level as well as a PhD program). Faculty of Humanities This includes the Department of Arts, Languages and Cultures, Art History, Classics and Ancient History, Linguistics, Religions and Theology and the University Language Centre. Faculty of Animal Husbandry The faculty includes the Department of Animal Breeding and Genetics, the Department of Livestock Management, the Department of Animal Nutrition and the Department of Poultry Science. It offers the degree of B.Sc. (Hons.) Animal Sciences and B.Sc. (Hons) Dairy Science at the main campus and B.Sc. (Hons) Poultry Science at one of the subcampuses, Toba Tek Singh. It also offers supporting courses for students of D.V.M., B.Sc. (Hons.) Agriculture and Food Sciences. It provides research facilities concerning livestock, poultry and their products to departments in the university and it provides advisory services to livestock and poultry farmers. Postgraduate courses leading to M.Sc. and Ph.D. degrees are offered by all four departments in the faculty. Prof. Dr. M. Sajjad Khan is the Dean.
Faculty of Sciences The faculty includes the departments of Botany, Zoology and Fisheries, Chemistry and Biochemistry, Physics, Mathematics and Statistics, Business Management Sciences, Computer Science, Social Sciences and Humanities, and Islamic Studies. It offers undergraduate courses in physical, biological and social sciences and postgraduate degree programmes in Botany, Zoology and Fisheries, Chemistry, Bio-Chemistry, Physics, Statistics, Computer Science, Business Management and Commerce. Graduates of pure sciences from other institutions are eligible for admission to courses leading to M.Sc., M.Phil. and Ph.D. degrees. Graduates with a relevant bachelor's degree may also seek admission to the MBA and M.Com. degree programmes. Munir Ahmad Sheikh is the Dean of the faculty. Institute of Agricultural Extension and Rural Development The larger part of the Division of Education & Extension has been shifted to the Faculty of Social Sciences. Tanvir Ali (ex-director, Graduate Studies / ex-director, Division of Education and Extension) is now the director of the Institute of Agricultural Extension and Rural Development, Faculty of Social Sciences. The institute deals with disciplines such as Agricultural Extension, Agricultural Education, Agricultural Information (Mass Communication), Rural Sociology, Population Science, Gender Studies, and Rural Development. It offers undergraduate and graduate degree programs such as B.Sc. (Hons) Agriculture (Agri. Extension Major), M.Sc., M.Sc. (Hons), and PhD. A quarterly agricultural journal is published for farmers. On 14 August 2012, the institute started FM Radio 100.4 for the dissemination of agricultural innovations among adjoining farm communities, in addition to providing information, education, and entertainment to the general public. Research The University of Agriculture Faisalabad is a major centre for research. It receives billions for research and development. 
It has contributed significantly to the community by introducing modern techniques in agriculture. The university has two advanced scientific labs for research purposes, where university scientists conduct experiments. The University of California, Davis is a partner university of the University of Agriculture Faisalabad. Both universities collaborate in research and exchange programs. UAF has produced many scientists who are serving around the world. University of Agriculture Press This is the university's academic publishing house. It publishes academic monographs, textbooks, and journals, most of which are by university students and staff authors. Sub-campuses Sub-campus Burewala, Vehari A sub-campus of the University of Agriculture Faisalabad was established in Burewala, District Vehari, to spread agricultural research across the country, especially in the southern areas of Punjab. Sub-campus Toba Tek Singh A sub-campus of the University of Agriculture, Faisalabad was established in Toba Tek Singh on 5 June 2005 to study the poultry industry. B.Sc. (Hons) Poultry Science (a 4-year degree program) is offered at this campus. Special efforts for establishing this sub-campus were made by Prof. Dr. Bashir Ahmed (ex-Vice Chancellor) and Chaudhry Muhammad Ashfaq (ex-district Nazim). The current principal of the institute is Dr. Nisar Ahmed from the Centre of Agricultural Biochemistry and Biotechnology (CABB). Academic programmes The following degree courses are offered: Intermediate (pre-Agriculture) This program was started in 2009. F.Sc. (Pre-Agriculture) is a 2-year program at UAF Community College at the PARS Campus, Jhang Road, near Faisalabad International Airport. After this 2-year program, students can seek admission to any degree program offered by the University of Agriculture Faisalabad. Students are selected district-wise from across Punjab. First degree courses B.Sc.(Hons) Agriculture Doctor of Veterinary Medicine (DVM) B.Sc.(Hons) Microbiology BS (Hons.) Animal Sciences BS (Hons.) 
Dairy Sciences B.Sc.(Hons) Agricultural Engineering B.Sc.(Hons) Environmental Engineering B.Sc.(Hons) Energy System Engineering B.Sc.(Hons) Food Engineering B.Sc.(Hons) Textile Technology B.Sc.(Hons) (Home Economics) BS Information Technology BS Computer Science BS Software Engineering BS Bioinformatics BS (Chemistry) BS (Bio-Chemistry) BS (Botany) BS (Physics) BS (Zoology) Bachelor of Business Administration (BBA) Doctor of Pharmacy (D.Pharma) B.Sc.(Hons) Human Nutrition and Dietetics B.Sc.(Hons) Agriculture and Resource Economics Postgraduate degree programmes M.Sc./M.Sc. (Hons.): four semesters, offered in 35 subjects, after completing a first degree course. M.Phil. in pure science: four semesters, offered in five subjects, after completing an M.Sc. in the relevant subject. Ph.D.: four semesters, offered in 28 subjects, after completing an M.Sc. in the relevant subject. An evening college has been started in the university where M.B.A. (Regular and Executive), M.Com., and M.Sc. programmes in Chemistry, Bio-Chemistry, Botany, Zoology, Physics, Fibre Technology, Home Economics (Food and Nutrition), Statistics, and Computer Science are offered. Other programs, such as M.A. Economics and M.A. Sociology, were discontinued after a few years of successful operation. Diploma and certificate courses The university, under its short courses program, organises training in agriculture and allied disciplines for in-service, pre-service, and self-employed personnel, farmers, and others interested in agriculture. Separate courses are provided for women on subjects and skills particular to them. The division has developed 124 courses leading to diplomas and certificates. Most of these courses have a practical bias. The following diploma courses are offered: C.T. 
Agriculture (boys) Diploma in Rural Home Economics (girls) Diploma in Arabic Mali Class Veterinary Assistant Diploma Course Diploma in Grain Storage Management Postgraduate Diploma Course in Food Science (for Army personnel) Agricultural Business Management Computer Science Pre-release training course (for Army personnel). Notable people Mirza Masroor Ahmad Amir Ali Majid Abid Qaiyum Suleri Tariq Aziz (field hockey, born 1938) Iqrar Ahmad Khan Syed Hussain Jahania Gardezi Chaudhry Ghulam Rasool Mohammad Hameed Shahid Sajjad Haider Nadeem References External links http://www.uaf.edu.pk/admin/vc_office.html Business Recorder newspaper article on this university Research institutes in Pakistan Universities and colleges in Faisalabad District Faisalabad Faisalabad 1906 establishments in India Agriculture in Punjab, Pakistan Public universities and colleges in Punjab, Pakistan Educational institutions established in 1906 Engineering universities and colleges in Pakistan
18733935
https://en.wikipedia.org/wiki/RemoteView
RemoteView
RemoteView is the family name of a group of software programs designed by Textron Systems Geospatial Solutions to aid in analyzing satellite or aerial images of the Earth's surface for the purpose of collecting and disseminating geospatial intelligence. The National Geospatial-Intelligence Agency (NGA) was a user of RemoteView software. Overview RemoteView is an electronic light table application, initially developed and released commercially by Sensor Systems in 1996. An electronic light table application makes it possible for imagery analysts to review satellite images on a computer instead of examining film or printed photographs. RemoteView was originally written only for the Unix operating system, but as the US Department of Defense transitioned to the Windows operating system, Sensor Systems released a Windows-based version. Overwatch acquired Sensor Systems and the RemoteView software in 2005. Textron Systems acquired Overwatch in 2006. RemoteView functions primarily as an imagery and geospatial analysis tool. It can display imagery formats, elevation data sets, and vector data sets. Capabilities include image enhancements, photogrammetry, orthorectification, multispectral classification, pan sharpening, change detection, assisted search, location positioning, and 3D terrain visualization. These features allow an intelligence analyst to review large-scale imagery and generate annotated reports on any findings. Extensions Textron Systems Geospatial Solutions offers extensions that add specialized capabilities to RemoteView. 
These include: Virtual Mosaic – a tool for quickly joining more than four adjacent or overlapping images 3D Pro – a module that expands visualization tools to allow creating 3D virtual worlds for simulating real world conditions and planning missions RVConnect – a tool that enables automatic data sharing between RemoteView and Esri’s ArcMap software V-TRAC Basic – a complementary video player that allows analysis of full motion video recorded by UAVs V-TRAC Pro – expands the abilities of V-TRAC Basic to include mark-up and reporting tools GeoCatalog for Desktop – a complementary database that makes it easier to organize and retrieve geospatial data See also Geomatics Imagery analysis Remote sensing Remote sensing application Satellite imagery References External links RemoteView Product Page GIS software Remote sensing software
17525449
https://en.wikipedia.org/wiki/Paul%20M.%20English
Paul M. English
Paul M. English (born 1963) is an American tech entrepreneur, computer scientist and philanthropist. He is the founder of Boston Venture Studio, and previously co-founded and served as CTO of Kayak. In November 2012, Kayak was acquired by Priceline for $1.8 billion. Before Kayak, English had created a number of companies, including the customer service company GetHuman; the e-commerce website-design company Boston Light, which was acquired by Intuit; and the anti-spam software company Intermute, which was acquired by Trend Micro. English co-founded Summits Education, a network of 41 schools in Haiti, created in partnership with the Haitian Ministry of National Education and Partners In Health, a non-profit where English serves as a director. He also founded King Boston, a racial-justice project which includes the creation of a new memorial to Martin Luther King Jr. and Coretta Scott King at Boston Common and the King Center for Economic Justice in Roxbury, Boston. Early life and education Paul M. English was born in 1963 in Boston, Massachusetts, the sixth of seven siblings in an Irish Catholic family that lived in the West Roxbury neighborhood. His mother was a substitute teacher and social worker, and his father was a pipefitter for Boston Gas. The family were congregants of St. Theresa of Avila Parish, and English attended Boston Latin School. He was part of the school band and played the piano and trumpet. He also joined the Computer Club where he was first introduced to computer programming. In 1981, his mother bought a Commodore VIC-20 computer for the family on which English continued to teach himself how to code. His brother Ed had become a well-known game designer after he created a computer program to play chess. Parker Brothers had also hired Ed to convert Frogger to be playable on the Atari 2600. Paul designed a game called Cupid on the VIC-20 which he showed to Ed. 
Cupid, with Ed's help, was acquired for $25,000 by Games by Apollo, which made a $5,000 down payment on the game before the company went out of business. Paul used the money to buy an Apple II with CD-ROM burner to make copies of the game for his friends. He also bought a modem to join the nascent Internet. English continued to live in the family home and worked as a meter reader for Boston Gas in the summer after graduating from Boston Latin in 1982. His grades were not good enough to get him into a competitive school like Harvard like several of his classmates, but his high SAT scores meant that he could attend the University of Massachusetts Boston tuition-free. He also liked the fact that the school had a Jazz band. While attending classes at night, English worked part-time for Ed's new company, adding music and sound effects to video games. After graduating with a bachelor's degree in 1987, English continued to study at UMass Boston to earn a master's degree in Computer Science in 1989. He later earned an honorary doctorate from the same university in 2019. Career Early career After completing the master's program at UMass Boston, English joined Interleaf, a software company based in Waltham, Massachusetts, as a member of the company's product development team. Interleaf's chief product was a WYSIWYG ("What You See Is What You Get") program, document editing software that displayed user revisions as they would appear on a printed page, a new innovation at the time. When English joined the company, Interleaf was performing a major update of the software, and he worked on rewriting significant portions of the software's code in Lisp and C programming languages. In 1991, he became a manager of a team of programmers at Interleaf. As he rose through the ranks at Interleaf, English turned to books on business management for help on how to lead a team. In 1994, Interleaf's board fired the CEO when the company recorded a loss of $20 million that year. 
English and two others were appointed as caretakers of the company, responsible for hiring a new CEO. He was also responsible for correcting issues with the company's line of software products. By then, English was senior vice president of product management and marketing and earning $2 million in stock options. After a few years in the leadership of Interleaf, English left in 1995 to join NetCentric, a local startup that was working on a tool to allow businesses to send faxes over the Internet. Leaving Interleaf also meant losing half of the stock options he had been given, which had yet to vest. English worked at NetCentric for a brief period before he was either fired or quit after a disagreement with the CEO over pay for the company's engineers. In the subsequent period, English developed an online version of the ancient Chinese chess game Xiangqi, a game he had learned to play at UMass Boston. He received offers from a few companies, including Yahoo, to acquire the online gaming community he had developed and to also hire him. English turned down these offers, including Yahoo's, because he did not want to move to California, where the company was based. In 1998, he founded Boston Light Software after receiving a $50,000 contract from The Boston Globe to develop an online store for the newspaper to sell T-shirts and memorabilia. The technology industry was just at the beginning of the dot-com bubble, during which the NASDAQ index doubled from 1999 to March 2000. English set up office space for Boston Light in Arlington, Massachusetts, and hired several of his old colleagues from Interleaf and NetCentric. In 1999, Boston Light was acquired by Intuit, a financial software company in California, for $33.5 million. English and his co-founder Karl Berry decided to take half their shares and split them among their employees as bonuses. Their company had been safely merged into Intuit by the time the dot-com bubble burst, taking with it many tech startups like Boston Light. 
English joined Intuit as the company's director of the small business Internet division. He often commuted from Boston to Intuit's headquarters in Mountain View, California. After three and a half years at Intuit, English left in 2001 to take care of his ailing father, who was suffering from dementia brought on by Alzheimer's. His mother had died recently, having made him promise to take care of his father. He helped his brother Ed start InterMute, which was developing anti-spam software. InterMute was later sold to Trend Micro in May 2005. English also started GetHuman, a website to help users get around automated customer-service hotlines and reach a real human agent to assist them. Kayak After taking a year-long sabbatical to take care of his dying father, English returned to the tech world and signed on as an entrepreneur in residence at Greylock, a venture capital (VC) firm. His former boss, Larry Bohn, who was now at the VC firm General Catalyst, introduced English to Steve Hafner. In 2004, English and Hafner co-founded Kayak, a website that allowed users to search for travel services, including flights, hotels, and rental cars. English became chief technology officer (CTO) at Kayak and was responsible for the website's development while Hafner handled the business side. English brought on his former colleagues, including Bill O'Donnell and Paul "Papa" Schwenk. English based Kayak's engineering division in Concord, Massachusetts, while Hafner based the business office in Norwalk, Connecticut. Unlike its then-competitors Expedia, Orbitz, and Priceline, Kayak did not allow customers to actually book travel services on the website; instead, like a search engine, it referred customers to other websites. It received referral fees of 75 cents for flights and $2 for hotels, with additional earnings if the customer actually purchased the flight, hotel room, or car rental. Kayak did not need a large sales department or administrative division. 
This allowed the company to focus entirely on user experience and the accuracy of the website's information about prices and availability. English also initially made it a requirement that his engineers take turns answering emails or the loud red phone he had installed at the office. The phone number was listed on the website for customers to call. He believed this would force his engineers to be empathetic to customers' complaints and any difficulties and errors they encountered while using the website. As the company grew, however, he eventually hired a customer service team but kept the red phone operational. It continued to ring occasionally, and English himself or a senior engineer would answer the phone to speak with customers. In 2008, just as smartphones were becoming popular, English and O'Donnell created a team to design a mobile app for Kayak. The team operated almost independently, with the option not to include some of the website's features in the mobile app. Without any advertisement, the app was downloaded 35 million times by 2012 and was one of the top apps on Android and Apple iOS. By this point, Kayak was generating, per year, 1.2 billion searches, $292.7 million in revenue, and $65.8 million in profits. On July 20, 2012, Kayak was listed on the Nasdaq stock exchange. With only 205 employees, it had $1.5 million in revenue per employee. In November 2012, Kayak agreed to an acquisition by Priceline for $1.8 billion, with English earning $120 million from the deal. Later career In February 2013, before Kayak's sale was final, English began making plans to leave the company and start something new. English planned to invest $1 million to fund the project and also recruited O'Donnell and Schwenk, who were also still working for Kayak, to invest the same amount. He planned to create a Boston-based incubator, which he named Blade. 
To house this incubator, English envisioned an office space for use by the companies it was hatching that could also serve as an event space at night, with a basement public club modelled on the speakeasies of the 1920s. The rough business model for Blade included a plan to give new start-up e-commerce companies free office space and advice as they developed and sought VC financing. In return, Blade would get equity in the start-up and, once the company received outside funding, Blade would "kick 'em out" after placing either English, O'Donnell, or Schwenk on the board of directors. Before starting this new company, all three of them needed to wind down their involvement in the leadership of Kayak. In May 2013, Priceline's acquisition of Kayak became final. By this point, both O'Donnell and Schwenk had transitioned out of Kayak, but English remained for a little longer to coach his successor as CTO. At the same time, he began work on renovating the office space for Blade in Fort Point, Boston. English was also a senior instructor at the MIT Sloan School of Management, whose students he imagined would be the intended audience for recruitment for the incubator. He secured additional funding of $20 million from the VC firms General Catalyst and Accel in return for a combined 30% share of the incubator. Blade opened with a party on May 16, 2014, that was attended by half of Kayak's engineers and also Deval Patrick, the outgoing governor of Massachusetts. English reviewed almost every one of the dozens of applications that Blade received for consideration. He and his co-founders selected three start-ups to focus on: Wigo, an app to allow college students to inform each other about parties; Bevy, a hardware and software solution for storing photographs and video; and Classy, a website for reselling used books. Drafted, a platform for listing tech industry jobs and recommending personnel, eventually replaced Classy. 
In the end, English decided to start a company himself rather than only focus on incubating the start-ups they had selected. In July 2015, English founded the travel startup Lola.com and was the company's first CEO, with O'Donnell as co-founder and CTO, and Schwenk as vice president of operations. Lola was initially a platform that used chat and artificial intelligence for booking travel arrangements with the assistance of a human travel agent who worked from home. In December 2015, Blade announced that it would, going forward, focus solely on Lola. The VC investors who had agreed to invest in Blade became investors in Lola. In July 2018, English hired Mike Volpe to serve as Lola's chief executive officer, while English remained president of the company and served in the CTO role. As travel slowed during the COVID-19 pandemic, English and Volpe shifted Lola's focus to other business expense management services. In October 2021, Lola was sold to Capital One to expand its financial services for small businesses. Philanthropy After the sale of his first company, Boston Light, English began making philanthropic donations under the tutelage of Tom White, who helped start Partners In Health (PIH), a non-profit focused on improving healthcare in Haiti and other developing countries. Initially, English made occasional donations of about $10,000–$20,000 to charities recommended by White. In 2005, English made his first major contribution when he gave $1 million to PIH. White died in 2011, having given away the vast majority of his own wealth, and English eventually planned to do the same. PIH was working to overcome a cholera outbreak in Haiti that followed a major earthquake in the country. English offered to fund or "backstop" a cholera vaccination program for 100,000 people. 
The Red Cross eventually ended up funding the project but his offer to "backstop" the program allowed PIH to move forward with acquiring the expensive vaccine, confident they would be able to find funding for the program. English serves as a director of Partners In Health and co-founded Summits Education, a network of free schools in Haiti. As of 2016, Summits Education, a collaboration with PIH and the Haitian Ministry of National Education, was responsible for the education of 10,000 students with 328 teachers at 41 primary schools in the Central Plateau. English has also made contributions to the Boston Health Care for the Homeless Program, Bridge Over Troubled Waters, and the Pine Street Inn, all charities benefiting Boston's homeless population. In 2016, English founded Winter Walk for Homelessness, a two-mile charity walk event to help homeless people in Boston. He is also the founder of King Boston, a racial-justice project to create a new memorial to Martin Luther King Jr. and Coretta Scott King in Boston. The memorial is due for installation in Boston Common in 2022 under the leadership of Imari Paris Jeffries, King Boston's executive director. The organization is also developing other projects including the King Center for Economic Justice in Roxbury. Personal life English was married in 1989 and later divorced in the early 2000s. He has a daughter and a son. English practices Buddhist meditation and plays several musical instruments. English was diagnosed with bipolar disorder while working at Interleaf in the late 1980s or early 1990s and was prescribed Lithium which he took on and off for a short period. His condition created intermittent periods of depression, hypomania, and sleeplessness that he experienced throughout his life. 
English was later prescribed various pharmaceutical drugs to control his symptoms, including the antiepileptic drug Lamictal, and diagnosed with temporal lobe epilepsy, which explained the bouts of visual distortion he occasionally experienced since childhood. The drugs helped control his periods of depression but were unable to control his hypomania. References Sources External links 1963 births Living people Businesspeople from Boston American computer businesspeople American chief technology officers American technology chief executives Boston Latin School alumni University of Massachusetts Boston alumni
42865956
https://en.wikipedia.org/wiki/Talygen%20Business%20Intelligence
Talygen Business Intelligence
Talygen Business Intelligence, officially Talygen Inc., is a software development company in Palo Alto. It is known for its eponymous web-based business automation software, which first gained publicity in January 2014 when PC Magazine included it in a list of Top Ten Tools for Small Biz at CES 2014. The idea behind the software product has been described as "charming" by Frankfurter Allgemeine. Products Talygen Business Intelligence is the company's flagship product. It is a software tool that draws on cloud technology to perform "customer relationship management, expense accounting, manage leave [sic] and many other administrative issues" for enterprises. See also Comparison of project management software Comparison of time tracking software Comparison of PSA systems References External links Official site Companies based in Palo Alto, California Software companies based in California Web applications Cloud applications Business software Software companies of the United States
43589245
https://en.wikipedia.org/wiki/LazPaint
LazPaint
LazPaint is a free and open-source, cross-platform, lightweight image editor with raster and vector layers, created with Lazarus. The software aims to be simpler than GIMP, is an alternative to Paint.NET, and is also similar to Paintbrush. Rendering is done with antialiasing and gamma correction. It can read/write common image formats and interoperate with other layered editors via the OpenRaster format. It also imports Paint.NET files (with their layer structure) and Photoshop files (as flattened images). It can also import 3D objects in Wavefront (.obj) format. There are complex selection functions, various filters, and rendering of textures. Colorspace Colors are stored in the sRGB colorspace, but drawing functions take gamma correction into account. There are many color manipulation functions, such as shifting colors and colorizing. Since version 6, there is a curve-adjustment function that operates on RGBA channels and on a corrected HSL colorspace. This correction takes into account the gamma factor as well as perceived lightness according to hue. Vector shapes Since version 7, shapes are stored with vectorial information. Thus they remain editable after they have been drawn and are re-rendered when the layer is transformed. Layers are converted to raster or vector as needed. If a tool like the pen is applied, the layer is rasterized and the vectorial information becomes unavailable. Filling can be a solid color, a gradient, or a texture. It can apply to simple shapes, complex curves, text, and shaded shapes. 
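The gamma-corrected rendering described above can be illustrated with a short, self-contained Python sketch. This shows the general technique only, not LazPaint's actual Pascal implementation: channel values are decoded from sRGB to linear light using the standard sRGB transfer function, blended there, and re-encoded, which avoids the overly dark midtones produced by averaging encoded sRGB values directly.

```python
def srgb_to_linear(c):
    """Decode an sRGB channel value (0..1) to linear light (IEC 61966-2-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear-light value (0..1) back to an sRGB channel value."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend_gamma_correct(src, dst, alpha):
    """Blend two sRGB channel values in linear light, as a gamma-aware
    editor does, rather than averaging the encoded sRGB values directly."""
    lin = srgb_to_linear(src) * alpha + srgb_to_linear(dst) * (1.0 - alpha)
    return linear_to_srgb(lin)

# A 50% mix of white (1.0) and black (0.0): naive sRGB averaging gives 0.5,
# which appears too dark; linear-light blending gives roughly 0.735.
mid = blend_gamma_correct(1.0, 0.0, 0.5)
```

The same idea extends to antialiased edge coverage: treating the coverage fraction as `alpha` and compositing in linear light keeps antialiased edges from looking too dark against bright backgrounds.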
Releases Notes References External links 2011 poll on linuxquestions.org (7th place out of 13 Linux graphics applications) Chip magazine in Russian mentioning version 6.0 Raster graphics editors Graphics Photo software Graphics software Cross-platform free software Cross-platform software Free graphics software Free multilingual software Free raster graphics editors Portable software Software using the GPL license Pascal (programming language) software Free software programmed in Pascal
23561086
https://en.wikipedia.org/wiki/261st%20Theater%20Tactical%20Signal%20Brigade
261st Theater Tactical Signal Brigade
The 261st Theater Tactical Signal Brigade (261st TTSB) is a unit in the Delaware Army National Guard, with a home station in Smyrna, Delaware. The 261st Theater Tactical Signal Brigade (261st TTSB) provides command and control to assigned and attached units. The 261st Signal Brigade supervises the installation, operation, and maintenance of up to 16 NODES in the theater communications system excluding the division and corps systems. The brigade also hosts a Joint Command, Control, Computer and Communications (C4) Center, or JCCC, at its headquarters in Smyrna, Delaware. Mission The 261st Theater Tactical Signal Brigade (261st TTSB) provides command and control to assigned and attached units. The 261st Signal Brigade supervises the installation, operation, and maintenance of up to 16 NODES in the theater communications system excluding the division and corps systems. The HHC, 261st Theater Tactical Signal Brigade is employed to command and control echelons above corps (EAC) tactical signal battalions, theater strategic organizations, and separate companies as required to support theater communications networks. Subordinate units One battalion currently falls under command of 261st Signal Brigade: 198th Signal Battalion (198th SB) Headquarters Company Alpha Company Bravo Company Charlie Company The 198th Signal Battalion provides command, control, and supervision of organic and assigned units, and provides nodal and extension communications support for the Combatant Commanders of unified or specified commands, Army Service Component Commanders (ASCC), or Joint Task Force/Joint Forces Land Component Commands (JTF/JFLCC). 198th Deployed to Camp Cropper, Iraq from **need dates and mission** The 198th traces its lineage all the way back to the Revolution, to Colonel John Haslet's 1st Delaware Regiment which was formed 21 January 1776. The regiment was a significant contributor at the Battle of Cowpens on 17 January 1781. 
The battalion also served at Fort Miles as the 198th Coastal Artillery Battalion during World War II. Former subordinate units: 280th Signal Battalion. The 280th mobilized at Dover Air Force Base to supplement the base security squadron from January 2003 through December 2004. The battalion was inactivated on **date** and its assets were incorporated into the 198th. History The 261st Signal Brigade began its lineage in 1943 in the Army of the United States as the Headquarters and Headquarters Battery, 68th Antiaircraft Artillery Brigade, activated in New Caledonia. After the war, the unit was allotted to the National Guard as the 261st Antiaircraft Artillery Brigade. The modern 261st TTSB is a mobile force able to rapidly deploy, supporting both state and federal missions. The 261st served a 10-month tour at Camp Victory, Iraq, managing communications in support of Operation Iraqi Freedom, and returned to Delaware 2 October 2009. The 261st returned to drill status in March 2010. Col. David A. Passwaters III took command of the unit in a ceremony 6 March at the Smyrna Readiness Center. The next day, outgoing commander Brig. Gen. Scott E. Chambers was installed as Deputy Adjutant General for the entire Delaware Army National Guard. 
Lineage and honors Constituted 25 February 1943 in the Army of the United States as Headquarters and Headquarters Battery, 68th Antiaircraft Artillery Brigade (Activated 10 August 1943 at New Caledonia) Inactivated 28 February 1946 in Japan Redesignated 16 May 1946 as Headquarters and Headquarters Battery, 261st Antiaircraft Artillery Brigade and allotted to the Delaware National Guard Organized and federally recognized 5 December 1949 at Wilmington Redesignated 1 April 1959 as Headquarters and Headquarters Battery, 261st Artillery Brigade (Location change 31 January 1968 to Dover, Delaware) Converted and redesignated 1 January 1970 as Headquarters and Headquarters Company, 261st Signal Command Reorganized and redesignated 1 June 1971 as Headquarters and Headquarters Company, 261st United States Army Strategic Communications Command Reorganized and redesignated 1 July 1974 as Headquarters and Headquarters Company 261st Signal Command Redesignated 1 September 1996 as Headquarters and Headquarters Company 261st Signal Brigade (Location change 2003 to Smyrna, Delaware) Adopted the Joint C4 Coordination Center (JCCC) in 2006 Insignia Shoulder sleeve insignia Description: A lozenge with horizontal axis overall and vertical axis overall consisting of four lozenges conjoined, the upper of colonial blue and bearing a white five pointed star and each of the remaining three divided into white and orange areas by a zig-zag partition line, all within a buff border. The insignia has a hook-and-loop (Velcro) backing for wear on the uniform. Symbolism: Orange and white are the colors used by Signal units. Colonial blue and buff were suggested by the flag of the state of Delaware. The single star alludes to Delaware as the "first state" to sign the Constitution; it is also used to indicate the capital city of Dover, the unit's home area. The pattern formed by the conjoined lozenges is indicative of precise planning and represents the unit's capabilities. 
The white and orange zig-zag simulates electric flashes and refers to the technology of a communications system and the unit's mission.
Background: The shoulder sleeve insignia was originally approved for the 261st Signal Command on 7 January 1971. It was amended on 27 January 1971 to correct the description of the insignia. On 19 January 1972 the insignia was redesignated for the 261st U.S. Army Strategic Communications Command. It was redesignated on 4 December 1974 for the 261st Signal Command. The shoulder sleeve insignia was redesignated for the 261st Signal Brigade on 1 September 1996.

Distinctive unit insignia
Description: A silver color metal and enamel device in height overall consisting of four orange lightning flashes issuing vertically from a blue lozenge charged with a white star between four white vertical arcing lightning bolts, all above an orange scroll inscribed "FORESEE" in silver letters.
Symbolism: Orange and white are the traditional colors of the Signal Corps. The four white electronic flashes represent command and control, communications and computers, the four "C's" of a modern military Signal organization. The diamond shape and blue color refer to the Delaware State flag, and the star refers to Dover, the state capital and the unit's home area. The orange flashes represent the unit's mission to disseminate and direct communication efforts over a wide area.
Background: The distinctive unit insignia was originally approved for the 261st Signal Command on 31 August 1979. It was redesignated for the 261st Signal Brigade on 1 September 1996.

Operation Iraqi Freedom
Headquarters and Headquarters Company (HHC), 261st TTSB was activated for federal service in support of Operation Iraqi Freedom on 1 October 2008. The unit conducted pre-mobilization training at Fort Bliss, Texas, and formally accepted the tactical signal mission for the Iraqi Theater of Operations from the 11th Signal Brigade on 20 December.
During the deployment, the 261st was the headquarters element of "Task Force Diamond", with command and control over the 146th Expeditionary Signal Battalion of the Florida Army National Guard and the 51st Expeditionary Signal Battalion from Fort Lewis, WA. The Task Force had personnel and equipment assets at over 50 locations throughout the Iraqi Theater of Operations. The 261st transferred authority for the theater signal mission to the 35th Signal Brigade in mid-September and returned home to Delaware on 1 October 2009. On 11 July 2010, the unit received the Meritorious Unit Commendation for its outstanding service during the deployment.

See also
Francis D. Vavala
198th Signal Battalion (United States)

References

External links
Delaware National Guard
261st Signal Brigade at GlobalSecurity.org
261st Signal Brigade at the Institute of Heraldry

Military units and formations established in 1943
261
Military units and formations in Delaware
632224
https://en.wikipedia.org/wiki/Compare-and-swap
Compare-and-swap
In computer science, compare-and-swap (CAS) is an atomic instruction used in multithreading to achieve synchronization. It compares the contents of a memory location with a given value and, only if they are the same, modifies the contents of that memory location to a new given value. This is done as a single atomic operation. The atomicity guarantees that the new value is calculated based on up-to-date information; if the value had been updated by another thread in the meantime, the write would fail. The result of the operation must indicate whether it performed the substitution; this can be done either with a simple boolean response (this variant is often called compare-and-set), or by returning the value read from the memory location (not the value written to it).

Overview
A compare-and-swap operation is an atomic version of the following pseudocode, where * denotes access through a pointer:

function cas(p: pointer to int, old: int, new: int) is
    if *p ≠ old
        return false
    *p ← new
    return true

This operation is used to implement synchronization primitives like semaphores and mutexes, as well as more sophisticated lock-free and wait-free algorithms. Maurice Herlihy (1991) proved that CAS can implement more of these algorithms than atomic read, write, or fetch-and-add, and, assuming a fairly large amount of memory, that it can implement all of them. CAS is equivalent to load-link/store-conditional, in the sense that a constant number of invocations of either primitive can be used to implement the other one in a wait-free manner.

Algorithms built around CAS typically read some key memory location and remember the old value. Based on that old value, they compute some new value. Then they try to swap in the new value using CAS, where the comparison checks for the location still being equal to the old value. If CAS indicates that the attempt has failed, it has to be repeated from the beginning: the location is re-read, a new value is re-computed and the CAS is tried again.
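The boolean (compare-and-set) variant of the pseudocode above maps directly onto C11's atomic_compare_exchange_strong. The following is a minimal sketch, not canonical code from any implementation; the wrapper name cas is ours:

```c
#include <stdatomic.h>
#include <stdbool.h>

/* Boolean compare-and-set: store new_val only if *p still equals old.
   atomic_compare_exchange_strong writes the observed value back into
   its second argument on failure, so a local copy of `old` is passed. */
static bool cas(atomic_int *p, int old, int new_val)
{
    return atomic_compare_exchange_strong(p, &old, new_val);
}
```

Here cas(&x, 5, 6) succeeds only while x still holds 5; a second attempt against the same stale snapshot fails, which is what drives the retry loops described above.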
Instead of immediately retrying after a CAS operation fails, researchers have found that total system performance can be improved in multiprocessor systems—where many threads constantly update some particular shared variable—if threads that see their CAS fail use exponential backoff—in other words, wait a little before retrying the CAS.

Example application: atomic adder
As an example use case of compare-and-swap, here is an algorithm for atomically incrementing or decrementing an integer. This is useful in a variety of applications that use counters. The function performs the action *p ← *p + a, atomically (again denoting pointer indirection by *, as in C) and returns the final value stored in the counter. Unlike in the pseudocode above, there is no requirement that any sequence of operations is atomic except for cas.

function add(p: pointer to int, a: int) returns int
    done ← false
    while not done
        value ← *p  // Even this operation doesn't need to be atomic.
        done ← cas(p, value, value + a)
    return value + a

In this algorithm, if the value of *p changes after (or while!) it is fetched and before the CAS does the store, CAS will notice and report this fact, causing the algorithm to retry.

ABA problem
Some CAS-based algorithms are affected by and must handle the problem of a false positive match, or the ABA problem. It is possible that between the time the old value is read and the time CAS is attempted, some other processors or threads change the memory location two or more times such that it acquires a bit pattern which matches the old value. The problem arises if this new bit pattern, which looks exactly like the old value, has a different meaning: for instance, it could be a recycled address, or a wrapped version counter.

A general solution to this is to use a double-length CAS (DCAS). E.g., on a 32-bit system, a 64-bit CAS can be used. The second half is used to hold a counter.
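The counter-in-the-other-half idea can be sketched in C11 by packing a value and a version counter into one atomic word. For brevity this sketch uses a 16-bit value plus a 16-bit counter in a 32-bit word (the freelist-index variant discussed below); the layout and the names pack and versioned_cas are illustrative, not from any particular implementation:

```c
#include <stdatomic.h>
#include <stdbool.h>
#include <stdint.h>

/* Pack a 16-bit value (e.g. a freelist index) and a 16-bit version
   counter into one 32-bit word so that a single CAS covers both. */
static uint32_t pack(uint16_t value, uint16_t counter)
{
    return ((uint32_t)counter << 16) | value;
}

/* Swing the packed slot to new_value, bumping the counter. A snapshot
   taken before an intervening update is rejected even if the value
   half has since been recycled back to its old bit pattern (ABA). */
static bool versioned_cas(_Atomic uint32_t *slot,
                          uint32_t snapshot, uint16_t new_value)
{
    uint16_t counter = (uint16_t)(snapshot >> 16);
    uint32_t desired = pack(new_value, (uint16_t)(counter + 1u));
    return atomic_compare_exchange_strong(slot, &snapshot, desired);
}
```

Because every successful swap increments the counter half, a stale snapshot fails the CAS even when the value half happens to match again.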
The compare part of the operation compares the previously read value of the pointer and the counter with the current pointer and counter. If they match, the swap occurs - the new value is written - but the new value has an incremented counter. This means that if ABA has occurred, although the pointer value will be the same, the counter is exceedingly unlikely to be the same (for a 32-bit value, a multiple of 2^32 operations would have to have occurred, causing the counter to wrap, and at that moment the pointer value would have to also by chance be the same).

An alternative form of this (useful on CPUs which lack DCAS) is to use an index into a freelist, rather than a full pointer, e.g. with a 32-bit CAS, use a 16-bit index and a 16-bit counter. However, the reduced counter lengths begin to make ABA possible at modern CPU speeds.

One simple technique which helps alleviate this problem is to store an ABA counter in each data structure element, rather than using a single ABA counter for the whole data structure. A more complicated but more effective solution is to implement safe memory reclamation (SMR). This is in effect lock-free garbage collection. The advantage of using SMR is the assurance that a given pointer will exist only once at any one time in the data structure; the ABA problem is thus completely solved. (Without SMR, something like a freelist will be in use, to ensure that all data elements can be accessed safely (no memory access violations) even when they are no longer present in the data structure. With SMR, only elements actually currently in the data structure will be accessed.)

Costs and benefits
CAS, and other atomic instructions, are sometimes thought to be unnecessary in uniprocessor systems, because the atomicity of any sequence of instructions can be achieved by disabling interrupts while executing it. However, disabling interrupts has numerous downsides.
For example, code that is allowed to do so must be trusted not to be malicious and monopolize the CPU, as well as to be correct and not accidentally hang the machine in an infinite loop or page fault. Further, disabling interrupts is often deemed too expensive to be practical. Thus, even programs only intended to run on uniprocessor machines will benefit from atomic instructions, as in the case of Linux's futexes.

In multiprocessor systems, it is usually impossible to disable interrupts on all processors at the same time. Even if it were possible, two or more processors could be attempting to access the same semaphore's memory at the same time, and thus atomicity would not be achieved. The compare-and-swap instruction allows any processor to atomically test and modify a memory location, preventing such multiple-processor collisions.

On server-grade multi-processor architectures of the 2010s, compare-and-swap is cheap relative to a simple load that is not served from cache. A 2013 paper points out that a CAS is only 1.15 times more expensive than a non-cached load on Intel Xeon (Westmere-EX) and 1.35 times on AMD Opteron (Magny-Cours).

Implementations
Compare-and-swap (and compare-and-swap-double) has been an integral part of the IBM 370 (and all successor) architectures since 1970. The operating systems that run on these architectures make extensive use of this instruction to facilitate process (i.e., system and user tasks) and processor (i.e., central processors) parallelism while eliminating, to the greatest degree possible, the "disabled spinlocks" which had been employed in earlier IBM operating systems. Similarly, the use of test-and-set was also eliminated. In these operating systems, new units of work may be instantiated "globally", into the global service priority list, or "locally", into the local service priority list, by the execution of a single compare-and-swap instruction. This substantially improved the responsiveness of these operating systems.
In the x86 (since 80486) and Itanium architectures this is implemented as the compare and exchange (CMPXCHG) instruction (on a multiprocessor the LOCK prefix must be used). As of 2013, most multiprocessor architectures support CAS in hardware, and the compare-and-swap operation is the most popular synchronization primitive for implementing both lock-based and non-blocking concurrent data structures. The atomic counter and atomic bitmask operations in the Linux kernel typically use a compare-and-swap instruction in their implementation. The SPARC-V8 and PA-RISC architectures are two of the very few recent architectures that do not support CAS in hardware; the Linux port to these architectures uses a spinlock.

Implementation in C
Many C compilers support using compare-and-swap either with the C11 <stdatomic.h> functions, or some non-standard C extension of that particular C compiler, or by calling a function written directly in assembly language using the compare-and-swap instruction. The following C function shows the basic behavior of a compare-and-swap variant that returns the old value of the specified memory location; however, this version does not provide the crucial guarantees of atomicity that a real compare-and-swap operation would:

int compare_and_swap(int* reg, int oldval, int newval)
{
    ATOMIC();
    int old_reg_val = *reg;
    if (old_reg_val == oldval)
        *reg = newval;
    END_ATOMIC();
    return old_reg_val;
}

old_reg_val is always returned, but it can be tested following the compare_and_swap operation to see if it matches oldval, as it may be different, meaning that another process has managed to succeed in a competing compare_and_swap to change the reg value from oldval. For example, an election protocol can be implemented such that every process checks the result of compare_and_swap against its own PID (= newval). The winning process finds the compare_and_swap returning the initial non-PID value (e.g., zero). For the losers it will return the winning PID.
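The election protocol just described can be sketched with the value-returning variant. The compare_and_swap below is our C11-based stand-in for the non-atomic illustration in the text, and try_elect is an illustrative name, not part of any standard API:

```c
#include <stdatomic.h>

/* Value-returning CAS: store newval only if *reg == oldval; either way,
   return the value that was observed in *reg before the attempt. */
static int compare_and_swap(atomic_int *reg, int oldval, int newval)
{
    atomic_compare_exchange_strong(reg, &oldval, newval);
    return oldval; /* on failure this now holds the observed value */
}

/* Each process tries to claim the leader slot with its own PID. The
   winner sees the initial value (0); every loser sees the winner's PID. */
static int try_elect(atomic_int *leader, int my_pid)
{
    return compare_and_swap(leader, 0, my_pid);
}
```

The first caller to reach the slot gets back 0 and wins; every later caller gets back the winner's PID, exactly as described above.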
bool compare_and_swap(int *accum, int *dest, int newval)
{
    if (*accum == *dest) {
        *dest = newval;
        return true;
    } else {
        *accum = *dest;
        return false;
    }
}

This is the logic in the Intel Software Developer's Manual Vol. 2A.

Extensions
Since CAS operates on a single pointer-sized memory location, while most lock-free and wait-free algorithms need to modify multiple locations, several extensions have been implemented.

Double compare-and-swap (DCAS)
Compares two unrelated memory locations with two expected values, and if they're equal, sets both locations to new values. The generalization of DCAS to multiple (non-adjacent) words is called MCAS or CASN. DCAS and MCAS are of practical interest in the convenient (concurrent) implementation of some data structures like deques or binary search trees. DCAS and MCAS may, however, be implemented using the more expressive hardware transactional memory present in some recent processors such as IBM POWER8, or in Intel processors supporting Transactional Synchronization Extensions (TSX).

Double-wide compare-and-swap
Operates on two adjacent pointer-sized locations (or, equivalently, one location twice as big as a pointer). On later x86 processors, the CMPXCHG8B and CMPXCHG16B instructions serve this role, although early 64-bit AMD CPUs did not support CMPXCHG16B (modern AMD CPUs do). Some Intel motherboards from the Core 2 era also hamper its use, even though the processors support it. These issues came into the spotlight at the launch of Windows 8.1 because it required hardware support for CMPXCHG16B.

Single compare, double swap
Compares one pointer but writes two. The Itanium's cmp8xchg16 instruction implements this, where the two written pointers are adjacent.

Multi-word compare-and-swap
Is a generalisation of normal compare-and-swap. It can be used to atomically swap an arbitrary number of arbitrarily located memory locations.
Usually, multi-word compare-and-swap is implemented in software using normal double-wide compare-and-swap operations. The drawback of this approach is a lack of scalability.

See also

References

External links
Basic algorithms implemented using CAS
2003 discussion "Lock-Free using cmpxchg8b..." on Intel x86, with pointers to various papers and source code
Implementations of CAS: AIX compare_and_swap Kernel Service; Java package implements 'compareAndSet' in various classes; .NET Class methods Interlocked::CompareExchange; Windows API InterlockedCompareExchange

Computer arithmetic
Concurrency control
4358711
https://en.wikipedia.org/wiki/Information%20and%20Communications%20University
Information and Communications University
Information and Communications University (ICU), established in 1998, was a Korean university focused primarily on research and engineering in the field of information technology. It was located in the city of Daejeon and comprised an engineering school and a management school. As of 2006, about 20% of the enrolled graduate students were international students. Unlike at other Korean universities, almost all courses were taught in English. On March 1, 2009, the university merged into KAIST as a separate department, under the name of the Information Technology Convergence Campus. After this controversial merger, a consortium of professors and directors of research centres (at ICU) in South Korea founded The Information and Communications University (ICU) to perpetuate the legacy and continuity of the Information and Communications University through international training and distance education, with a subsidiary in Zambia (ICU Zambia).

Schools
The university consisted of an engineering school and a business school, both focused on information technology. The engineering school was focused on research in computer science and communications technology and had a total of 764 students. The business school was focused on the managerial and financial aspects of the IT industry and had a total of 162 students.

Academics
Undergraduate students were offered B.S. degrees in computer science and engineering or electrical and computer engineering by the engineering school, and B.S. degrees in IT-Business by the business school. Graduate students enrolled in M.S. or Ph.D. programs in specialized tracks in engineering or IT-business. It was also possible to enroll in a combined M.S./Ph.D. program. The engineering school also offered a master of software engineering (MSE) program, run jointly with Carnegie Mellon University. Courses were taught in spring, summer, and fall terms, almost all of them in English.
In preparation, undergraduate students received instruction in the language even before entering the university, and had to take additional courses in English up to their second year of studies. Both undergraduate and graduate students were also required to take a number of courses outside of their discipline, so that a student in the engineering school had to take courses offered by the business school and vice versa.

Campus
The main campus was located in Daedeok Science Town in the city of Daejeon. Most lecture, administrative, and faculty facilities were located in adjoining buildings, while research facilities were dispersed across several separate buildings. Other facilities on the campus included a cafeteria, a dormitory, and a playing field. A branch campus housing the Digital Media Lab and the Software Technology Institute was located in Gangnam-gu, Seoul. A Lithuanian branch campus was planned to open in September 2007, which would have been the first foreign branch campus opened by a Korean university.

Research
A heavy emphasis was placed on research. In 2005, the university had the most patents submitted per faculty member among Korean universities and was selected for the seventh Auto-ID Lab. The university operated several research centers:
The ICU-Samsung Joint Research Center, in collaboration with Samsung
4 IT research centers sponsored by the Ministry of Information and Communication
2 engineering research centers sponsored by the Ministry of Science and Technology

History
1996 – Establishment proposed by the Ministry of Information and Communication
1998 – Establishment of graduate school with 114 students
1999 – Opening of the Hwa-am campus
2002 – Establishment of undergraduate school with 105 students
2004 – Moved to the Munji campus
2007 – Ordered to merge with KAIST by the government
2008 – KAIST officially agreed to the merger
2009 – Merged into KAIST

Notes

External links
ICU Zambia website
ICU South Africa website
https://www.hea.org.zm/
https://www.zaqa.gov.zm/higher-education-institutions/
https://www.mohe.gov.zm/

Universities and colleges in Daejeon
Educational institutions established in 1998
1998 establishments in South Korea
Defunct universities and colleges in South Korea
41978982
https://en.wikipedia.org/wiki/Honeywell%20Intelligrated
Honeywell Intelligrated
Honeywell Intelligrated is a material handling automation and software engineering company based in Mason, Ohio. In 2017, Honeywell Intelligrated reported revenue of $1 billion. Honeywell Intelligrated has production and service locations in the United States, Canada, Mexico, Brazil, and China.

History
Intelligrated was founded in 2001 by Chris Cole and Jim McCarthy, and has its headquarters in Mason, Ohio, a suburb of Cincinnati. In 2002, Intelligrated acquired the Versa Conveyor product line from Conveyors Ltd. Later that same year, the company opened a new manufacturing facility in London, Ohio. By the end of 2008, Intelligrated employed more than 500 associates, with field operations throughout the U.S. In 2009, after purchasing the North and South American operations of FKI Logistex, Intelligrated grew to more than 1,500 associates and expanded its product line to include Alvey palletizers, Real Time Solutions order fulfillment systems, and tilt-tray and cross-belt sorters. Following the FKI Logistex acquisition, Intelligrated continued its international growth by expanding its operations in both Canada and Mexico. In 2012, Intelligrated began operations in São Paulo, Brazil. In August 2012, Intelligrated was acquired by Permira. In late 2012, Intelligrated completed the acquisition of supply chain software company Knighted, and followed that with the March 2013 purchase of Datria Systems, Inc., a provider of voice-enabled solutions for distribution and logistics organizations. The company acquired United Sortation Solutions, Inc. in February 2016, adding vertical conveyor, tote stacking and de-stacking, and other sortation solutions to the growing Intelligrated portfolio. In August 2016, Honeywell completed its acquisition of Intelligrated in a transaction valued at $1.5 billion, with the company joining Honeywell's Safety and Productivity Solutions business group.
To support the integrated path forward as part of Honeywell, Intelligrated was rebranded as Honeywell Intelligrated — capitalizing on the global reach and brand recognition of Honeywell, while leveraging the legacy and reputation of Intelligrated.

Industries served
Honeywell Intelligrated provides solutions for the retail, wholesale, e-commerce, food, beverage, consumer packaged goods, pharmaceutical and medical supply, third-party logistics, and postal and parcel industries. Honeywell Intelligrated designs, builds and installs automated material handling solutions including case and pallet conveyors, IntelliSort sortation systems, Alvey palletizers, robotic systems, order fulfillment systems and advanced machine controls. Honeywell Intelligrated Software offers warehouse execution systems, labor management software, voice and light order fulfillment technologies, and mobility and wireless solutions. The company's Lifecycle Support Services group administers aftermarket services such as technology refreshes, equipment modifications, maintenance and system assessments, spare parts strategies and track-driven training. Material handling equipment brands supported by Honeywell Intelligrated include Alvey, Buschman, Cleco, Crisplant, Davco, FKI Logistex, Intelligrated, Mathews Real Time Solutions and Versa material handling systems and machinery. The company also provides operational support services, design and build, solutions development and systems integration services.

References

Companies based in Ohio
Software companies based in Ohio
Software companies of the United States
1209790
https://en.wikipedia.org/wiki/Viaccess
Viaccess
Viaccess is a conditional access system developed by Orange S.A. There are several versions in use today: Viaccess PC2.3, Viaccess PC2.4, Viaccess PC2.5, Viaccess PC2.6, Viaccess ACS3.x/Prime Sentinel, Viaccess ACS4.1, Viaccess ACS5.0, and Viaccess ACS6.x/Adaptive Sentinel. Viaccess was developed as the digital version of the EuroCrypt system used with the hybrid MAC system. The first version is sometimes referred to as Viaccess 1, and the later versions, although different, as Viaccess 2. PC2.3 and PC2.4 are known to be insecure, and many set-top boxes can be 'patched' to decrypt Viaccess signals without payment; however, PC2.5 and PC2.6 are secure, with PC2.5 remaining secure two years after its first commercial deployment. PC2.6 was introduced at the end of 2005. PC3.0 was introduced during mid-2007.

There are two modifications of Viaccess PC2.3 in use. The first, known as TPS Crypt, is used by TPS. Despite also being compromised, the TPS Crypt system has been further modified to use Advanced Encryption Standard (AES) keys. These AES keys were originally updated once weekly; however, after this did little to inconvenience unauthorised viewers, a second TPS Crypt system was introduced, by which keys are changed every 12 minutes, with keys being sent over TPS's internal OpenTV system. This meant that only TPS receivers could receive the new AES key, and not the insecure TPS subscription cards. Monitoring and analysis of the keys by hacking groups, however, has brought about key lists, where the AES keys have been successfully predicted. Implementation of this procedure of automatically updating keys has proved difficult, if not impossible, on many satellite receivers, rendering the TPS Crypt AES system a general success.

In 2008, Viaccess acquired Orca Interactive, and in 2012 they merged under the name Viaccess-Orca.
The second Viaccess modification, called ThalesCrypt, is used by Canal Satellite France to protect its content on the transport network to the head-ends of the cable networks; it is an over-encryption mechanism of the original protocol's encryption keys.

Viaccess is currently used by a large number of providers. These include:
Team:Media Bosnia
ART (some programs)
NTV
Televisa Networks
Canal Satellite France
FRANSAT
AB Sat
ETTV
TBLTV
Home2US
Orange TV Romania
Orange Polska
YouSee
Croatian Radiotelevision
RTV Slovenija
Skylink Czechia/Slovakia
SRG SSR idée suisse
Cyfrowy Polsat
Platforma Canal+

Viaccess was the third-largest conditional access system provider in the world in 2004. Viaccess is a subsidiary of Orange S.A. and offers pay TV and DRM-enabled software. Orca Interactive has been a subsidiary of Viaccess, offering IPTV middleware, since 2008.

References

External links
Viaccess home page

Digital television
Conditional-access television broadcasting
466260
https://en.wikipedia.org/wiki/IRC%20operator
IRC operator
An IRC operator (often abbreviated as IRCop or oper) is a user on an Internet Relay Chat network who has privileged access. IRC operators are charged with the task of enforcing the network's rules, and in many cases, improving the network in various areas. The permissions available to an IRC operator vary according to the server software in use, and the server's configuration.

IRC operators are divided into local and global operators. The former are limited to the server(s) they have specific access to; global operators, however, can perform actions affecting all users on the network. In order to perform their duties, IRC operators usually have the ability to:
Forcibly disconnect users (Kill)
Ban (K-line or G-line) users
Change network routing by disconnecting (squitting) or connecting servers

Traditionally, a list of operators on a particular server is available in the MOTD, or through the command. A user can become an operator by sending the command /oper to the IRC server he or she is currently on, using a pre-selected username and a password as parameters. The command only works on a server which has the proper O-line in its IRCd configuration file. The IP address that the user is opering from may also have to match a predefined one, as an extra layer of security to prevent unauthorized users from opering if they have cracked the operator's password.

Operator types
In many IRC networks, IRCops have different types of access on a network. These ranks often depend upon the IRCd software used, though a few specific access levels remain fairly constant throughout variations:

Local operator
The Local Operator (LocOp) is the lowest of the operator access levels. The LocOp has minimal control over one server out of a network, and usually has the ability to kill (disconnect) people from the server or perform local K-lines (server bans).
Global operator
The Global Operator (GlobOp) is similar to the LocOp, but has control over the entire network of servers, as opposed to a single server. GlobOps may perform G-lines or AKills (network-wide bans) and Shuns (forcibly muting users) over an entire network.

Services administrator
Commonly abbreviated as SA, this admin type has control over all functionality on an IRC network available via network service bots, including the commonly used NickServ, ChanServ, and MemoServ nicks. Usually, an SA has the ability to use the /sa* commands. The /sa* commands, like all actions performed by network services, are typically implemented using a virtual services node on the network, effectively masking the origin of the actions.

Network administrator
The Network Administrator (NetAdmin) has the highest level of access on a network. In most cases, the founder of the network is the netadmin. Networks may, however, have multiple netadmins - especially networks with large populations.

Ban types
An IRCop with enough privileges may give a ban to unwanted users. The ban types are listed below:

K-line
The K-line is a local server ban (specific to a single server, not the entire IRC network) that bans the unwanted user's hostname.

G-line
The G-line (Global K-line) works exactly like the K-line, but is global. G-lines can expire, but in some cases they are permanent.

Z-line
On IRCds such as UnrealIRCd, the Z-line is a "powerful" ban that is performed on a user's IP address rather than the hostmask, denying access to all users from the offending IP. Z-lines may expire, but in many cases are permanent.

GZ-line
Some IRCds support the GZ-line (Global Z-line). It works exactly like the Z-line, but is global and network-wide. UnrealIRCd has GZ-line support. The GZ-line is an effective way of blocking static IP users and keeping them out.

D-line
On other IRCds, such as Charybdis, a D-line replaces a Z-line. It is called a D-line because it "denies" the IP address from connecting.
Charybdis does not have support for a Z-line or a GZ-line. By using its "cluster" configuration feature, D-lines can be synchronized between servers, providing a type of 'GZ-line'. This provides effective support for large networks, since the administrator of one server can allow certain trusted servers to synchronize D-lines and K-lines while excluding others.

See also
Internet Relay Chat channel operator

References

External links
IRC Operators Guide

Operator, Internet Relay Chat
143270
https://en.wikipedia.org/wiki/HyperTransport
HyperTransport
HyperTransport (HT), formerly known as Lightning Data Transport (LDT), is a technology for interconnection of computer processors. It is a bidirectional serial/parallel high-bandwidth, low-latency point-to-point link that was introduced on April 2, 2001. The HyperTransport Consortium is in charge of promoting and developing HyperTransport technology.

HyperTransport is best known as the system bus architecture of AMD central processing units (CPUs) from Athlon 64 through AMD FX and the associated motherboard chipsets. HyperTransport has also been used by IBM and Apple for the Power Mac G5 machines, as well as a number of modern MIPS systems.

The current specification, HTX 3.1, remained competitive in 2014 with high-speed DDR4 RAM (2666 and 3200 MT/s, or about 10.4 GB/s and 12.8 GB/s) and with slower technology (around 1 GB/s, similar to high-end PCIe SSDs and ULLtraDIMM flash RAM)—a wider range of RAM speeds on a common CPU bus than any Intel front-side bus. Intel technologies require each speed range of RAM to have its own interface, resulting in a more complex motherboard layout but with fewer bottlenecks. HTX 3.1 at 26 GB/s can serve as a unified bus for as many as four DDR4 sticks running at the fastest proposed speeds. Beyond that, DDR4 RAM may require two or more HTX 3.1 buses, diminishing its value as a unified transport.

Overview

Links and rates
HyperTransport comes in four versions—1.x, 2.0, 3.0, and 3.1—which run from 200 MHz to 3.2 GHz. It is also a DDR or "double data rate" connection, meaning it sends data on both the rising and falling edges of the clock signal. This allows for a maximum data rate of 6400 MT/s when running at 3.2 GHz. The operating frequency is autonegotiated with the motherboard chipset (North Bridge) in current computing. HyperTransport supports an autonegotiated bit width, ranging from 2 to 32 bits per link; there are two unidirectional links per HyperTransport bus.
With the advent of version 3.1, using full 32-bit links and utilizing the full HyperTransport 3.1 specification's operating frequency, the theoretical transfer rate is 25.6 GB/s (3.2 GHz × 2 transfers per clock cycle × 32 bits per link) per direction, or 51.2 GB/s aggregated throughput, making it faster than most existing bus standards for PC workstations and servers, as well as most bus standards for high-performance computing and networking. Links of various widths can be mixed together in a single system configuration, as in one 16-bit link to another CPU and one 8-bit link to a peripheral device, which allows for a wider interconnect between CPUs and a lower-bandwidth interconnect to peripherals, as appropriate. It also supports link splitting, where a single 16-bit link can be divided into two 8-bit links. The technology also typically has lower latency than other solutions due to its lower overhead. Electrically, HyperTransport is similar to low-voltage differential signaling (LVDS) operating at 1.2 V. HyperTransport 2.0 added post-cursor transmitter deemphasis. HyperTransport 3.0 added scrambling and receiver phase alignment as well as optional transmitter precursor deemphasis. Packet-oriented HyperTransport is packet-based, where each packet consists of a set of 32-bit words, regardless of the physical width of the link. The first word in a packet always contains a command field. Many packets contain a 40-bit address. An additional 32-bit control packet is prepended when 64-bit addressing is required. The data payload is sent after the control packet. Transfers are always padded to a multiple of 32 bits, regardless of their actual length. HyperTransport packets enter the interconnect in segments known as bit times. The number of bit times required depends on the link width.
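The per-direction bandwidth figure and the padding/bit-time rules above can be sketched as follows (illustrative Python; the function names are my own, not from the specification):

```python
def bandwidth_gb_per_s(clock_hz: float, link_width_bits: int) -> float:
    """Per-direction throughput: clock x 2 transfers/cycle x link width,
    converted from bits/s to GB/s."""
    return clock_hz * 2 * link_width_bits / 8 / 1e9

def bit_times(packet_bits: int, link_width_bits: int) -> int:
    """Bit times needed to move a packet across a link of a given width.
    Packets are always padded to a multiple of 32 bits."""
    padded = -(-packet_bits // 32) * 32   # round up to whole 32-bit words
    return padded // link_width_bits

per_direction = bandwidth_gb_per_s(3.2e9, 32)   # 25.6 GB/s per direction
aggregate = per_direction * 2                   # 51.2 GB/s both directions

# A single 32-bit control word takes 4 bit times on an 8-bit link,
# and a 40-bit address field pads out to 64 bits (8 bit times).
control_word_time = bit_times(32, 8)
address_time = bit_times(40, 8)
```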
HyperTransport also supports system management messaging, signaling interrupts, issuing probes to adjacent devices or processors, I/O transactions, and general data transactions. There are two kinds of write commands supported: posted and non-posted. Posted writes do not require a response from the target. This is usually used for high bandwidth devices such as uniform memory access traffic or direct memory access transfers. Non-posted writes require a response from the receiver in the form of a "target done" response. Reads also require a response, containing the read data. HyperTransport supports the PCI consumer/producer ordering model. Power-managed HyperTransport also facilitates power management as it is compliant with the Advanced Configuration and Power Interface specification. This means that changes in processor sleep states (C states) can signal changes in device states (D states), e.g. powering off disks when the CPU goes to sleep. HyperTransport 3.0 added further capabilities to allow a centralized power management controller to implement power management policies. Applications Front-side bus replacement The primary use for HyperTransport is to replace the Intel-defined front-side bus, which is different for every type of Intel processor. For instance, a Pentium cannot be plugged into a PCI Express bus directly, but must first go through an adapter to expand the system. The proprietary front-side bus must connect through adapters for the various standard buses, like AGP or PCI Express. These are typically included in the respective controller functions, namely the northbridge and southbridge. In contrast, HyperTransport is an open specification, published by a multi-company consortium. A single HyperTransport adapter chip will work with a wide spectrum of HyperTransport enabled microprocessors. 
AMD used HyperTransport to replace the front-side bus in their Opteron, Athlon 64, Athlon II, Sempron 64, Turion 64, Phenom, Phenom II and FX families of microprocessors. Multiprocessor interconnect Another use for HyperTransport is as an interconnect for NUMA multiprocessor computers. AMD used HyperTransport with a proprietary cache coherency extension as part of their Direct Connect Architecture in their Opteron and Athlon 64 FX (Dual Socket Direct Connect (DSDC) Architecture) line of processors. Infinity Fabric, used with the EPYC server CPUs, is a superset of HyperTransport. The HORUS interconnect from Newisys extends this concept to larger clusters. The Aqua device from 3Leaf Systems virtualizes and interconnects CPUs, memory, and I/O. Router or switch bus replacement HyperTransport can also be used as a bus in routers and switches. Routers and switches have multiple network interfaces and must forward data between these ports as fast as possible. For example, a four-port, 1000 Mbit/s Ethernet router needs a maximum of 8000 Mbit/s of internal bandwidth (1000 Mbit/s × 4 ports × 2 directions)—HyperTransport greatly exceeds the bandwidth this application requires. However, a 4 + 1 port 10 Gb router would require 100 Gbit/s of internal bandwidth. Add to that support for 802.11ac with 8 antennas and the WiGig 60 GHz standard (802.11ad), and HyperTransport becomes more feasible (with anywhere between 20 and 24 lanes used for the needed bandwidth). Co-processor interconnect The issue of latency and bandwidth between CPUs and co-processors has usually been the major stumbling block to their practical implementation. Co-processors such as FPGAs have appeared that can access the HyperTransport bus and become integrated on the motherboard. Current-generation FPGAs from both main manufacturers (Altera and Xilinx) directly support the HyperTransport interface and have IP cores available. Companies such as XtremeData, Inc.
and DRC take these FPGAs (Xilinx in DRC's case) and create a module that allows FPGAs to plug directly into the Opteron socket. AMD started an initiative named Torrenza on September 21, 2006 to further promote the usage of HyperTransport for plug-in cards and coprocessors. This initiative opened their "Socket F" to plug-in boards such as those from XtremeData and DRC. Add-on card connector (HTX and HTX3) A connector specification that allows a slot-based peripheral to have direct connection to a microprocessor using a HyperTransport interface was released by the HyperTransport Consortium. It is known as HyperTransport eXpansion (HTX). Using a reversed instance of the same mechanical connector as a 16-lane PCI-Express slot (plus an x1 connector for power pins), HTX allows development of plug-in cards that support direct access to a CPU and DMA to the system RAM. The initial card for this slot was the QLogic InfiniPath InfiniBand HCA. IBM and HP, among others, have released HTX compliant systems. The original HTX standard is limited to 16 bits and 800 MHz. In August 2008, the HyperTransport Consortium released HTX3, which extends the clock rate of HTX to 2.6 GHz (5.2 GT/s, 10.7 GTi, 5.2 real GHz data rate, 3 MT/s edit rate) and retains backwards compatibility. Testing The "DUT" test connector is defined to enable standardized functional test system interconnection. 
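The internal-bandwidth arithmetic from the router or switch example above can be checked with a small sketch (illustrative Python; the function name is my own):

```python
def internal_bandwidth_mbit(ports: int, port_rate_mbit: float) -> float:
    """Worst-case internal bandwidth for a router or switch: every port
    simultaneously sending and receiving at full line rate."""
    return ports * port_rate_mbit * 2

# Four-port gigabit router: 1000 Mbit/s x 4 ports x 2 directions
four_port = internal_bandwidth_mbit(4, 1000)        # 8000 Mbit/s

# A 4 + 1 port 10 Gb router: 10000 Mbit/s x 5 ports x 2 directions
five_port_10g = internal_bandwidth_mbit(5, 10_000)  # 100000 Mbit/s = 100 Gbit/s
```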
Implementations

AMD AMD64 and Direct Connect Architecture based CPUs
AMD chipsets: AMD-8000 series, AMD 480 series, AMD 580 series, AMD 690 series, AMD 700 series, AMD 800 series, AMD 900 series
ATI chipsets: ATI Radeon Xpress 200 for AMD processors, ATI Radeon Xpress 3200 for AMD processors
Broadcom (then ServerWorks) HyperTransport SystemI/O controllers: HT-2000, HT-2100
Cisco QuantumFlow Processors
ht_tunnel from the OpenCores project (MPL licence)
IBM CPC925 and CPC945 (PowerPC 970 northbridges) chipsets
Loongson-3 MIPS processor
Nvidia nForce chipsets: nForce Professional MCPs (Media and Communication Processors), nForce 3 series, nForce 4 series, nForce 500 series, nForce 600 series, nForce 700 series, nForce 900 series
PMC-Sierra RM9000X2 MIPS CPU
Power Mac G5
Raza Thread Processors
SiByte MIPS CPUs from Broadcom
Transmeta TM8000 Efficeon CPUs
VIA chipsets: K8 series

Frequency specifications

AMD Athlon 64, Athlon 64 FX, Athlon 64 X2, Athlon X2, Athlon II, Phenom, Phenom II, Sempron, Turion series and later use one 16-bit HyperTransport link. AMD Athlon 64 FX (1207) and Opteron use up to three 16-bit HyperTransport links. Common clock rates for these processor links are 800 MHz to 1 GHz (older single- and multi-socket systems on 754/939/940 links) and 1.6 GHz to 2.0 GHz (newer single-socket systems on AM2+/AM3 links—most newer CPUs use 2.0 GHz). While HyperTransport itself is capable of 32-bit-wide links, that width is not currently utilized by any AMD processors. Some chipsets do not even utilize the 16-bit width used by the processors. These include the Nvidia nForce3 150, nForce3 Pro 150, and the ULi M1689—which use a 16-bit HyperTransport downstream link but limit the HyperTransport upstream link to 8 bits.

Name

There has been some marketing confusion between the use of HT referring to HyperTransport and the later use of HT to refer to Intel's Hyper-Threading feature on some Pentium 4-based and the newer Nehalem and Westmere-based Intel Core microprocessors.
Hyper-Threading is officially known as Hyper-Threading Technology (HTT) or HT Technology. Because of this potential for confusion, the HyperTransport Consortium always uses the written-out form: "HyperTransport." Infinity Fabric Infinity Fabric (IF) is a superset of HyperTransport announced by AMD in 2016 as an interconnect for its GPUs and CPUs. It is also usable as an interchip interconnect for communication between CPUs and GPUs (for Heterogeneous System Architecture), an arrangement known as Infinity Architecture. The company said the Infinity Fabric would scale from 30 GB/s to 512 GB/s, and be used in the Zen-based CPUs and Vega GPUs which were subsequently released in 2017. On Zen and Zen+ CPUs, the "SDF" data interconnects are run at the same frequency as the DRAM memory clock (MEMCLK), a decision made to remove the latency caused by different clock speeds. As a result, using a faster RAM module makes the entire bus faster. The links are 32 bits wide, as in HT, but 8 transfers are done per cycle (128-bit packets) compared to the original 2. Electrical changes are made for higher power efficiency. On Zen 2 and Zen 3 CPUs, the IF bus is on a separate clock, either in a 1:1 or 2:1 ratio to the DRAM clock, because of Zen's early problems with high-speed DRAM affecting IF speed, and therefore system stability. The bus width has also been doubled. See also Elastic interface bus Fibre Channel Front-side bus Intel QuickPath Interconnect List of device bandwidths PCI Express RapidIO AGESA References External links Computer buses Macintosh internals Serial buses
4639235
https://en.wikipedia.org/wiki/Website%20monitoring
Website monitoring
Website monitoring is the process of testing and verifying that end-users can interact with a website or web application as expected. Website monitoring is often used by businesses to ensure website uptime, performance, and functionality are as expected. Website monitoring companies provide organizations the ability to consistently monitor a website, or server function, and observe how it responds. The monitoring is often conducted from several locations around the world to a specific website, or server, in order to detect issues related to general Internet latency and network hop issues, and to prevent false positives caused by local or inter-connect problems. Monitoring companies generally report on these tests in a variety of reports, charts and graphs. When an error is detected, monitoring services send out alerts via email, SMS, phone, SNMP trap, or pager that may include diagnostic information, such as a network trace route, a code capture of a web page's HTML file, a screenshot of a web page, and even a video of a website failing. These diagnostics allow network administrators and webmasters to correct issues faster. Monitoring gathers extensive data on website performance, such as load times, server response times, and page element performance, which is often analyzed and used to further optimize website performance. Purpose Monitoring is essential to ensure that a website is available to users, downtime is minimized, and performance can be optimized. Users who rely on a website or an application for work or pleasure will get frustrated or even stop using the application if it is not reliably available. Monitoring can cover many things that an application needs to function, like network connectivity, Domain Name System records, database connectivity, bandwidth, and computer resources like free RAM, CPU load, disk space, events, etc. Commonly measured metrics are response time and availability (or uptime), but consistency and reliability metrics are gaining popularity.
Measuring a website's availability and reliability under various amounts of traffic is often referred to as load testing. Website monitoring also helps benchmark the website against a competitor's performance to help determine how well a site is performing. Website speed is also used as a metric for search engine rankings. Website monitoring can be used to hold web hosting providers accountable to their service-level agreement. Most web hosts offer a 99.9% uptime guarantee and, when uptime is less than that, individuals can be refunded for the excessive downtime. Note that not all hosts will refund individuals for excessive downtime, so customers should become familiar with their host's terms of service. Most paid website monitoring services will also offer security features such as virus and malware scanning, which is of growing importance as websites become more complicated and integral to business. Internal vs. external Website monitoring can be done from both inside and outside of a corporate firewall. Traditional network management solutions focus on inside-the-firewall monitoring, whereas external performance monitoring will test and monitor performance issues across the Internet backbone and in some cases all the way to the end-user. Third-party website performance monitoring solutions can monitor internal (behind the firewall), external (customer-facing), or cloud-based Web applications. Inside-firewall monitoring is done by special hardware appliances, which can help determine whether sluggish performance of internal applications is caused by application design, internal infrastructure, or connections to the public Internet. External performance monitoring is also known as end-user monitoring or end-to-end performance monitoring. Real user monitoring measures the performance and availability experienced by actual users, diagnoses individual incidents, and tracks the impact of a change.
Measures of website availability Types of protocol A website monitoring service can check other internet protocols besides HTTP and HTTPS, such as FTP, SMTP, POP3, ActiveSync, IMAP, DNS, SSH, Telnet, SSL, TCP, PING, UDP, SOAP, domain name expiry, SSL certificate expiry and a range of ports. Monitoring frequency ranges from once every four hours to once every 15 seconds. Typically, most website monitoring services test a server, or application, between once per hour and once per minute. Advanced monitoring services capture browser interactions with websites using macro recorders, or browser add-ons such as Selenium or iMacros. These services test a website by running a web browser through a typical website transaction (such as a shopping cart) or a custom scenario, in order to check for user experience issues, performance problems, and availability errors. Browser-driven monitoring services detect not only network and server issues, but also webpage object issues (such as slow-loading JavaScript, or third-party hosted page elements). An implementation of time performance monitoring for the Apache HTTP Server is the mod_arm4 module. Types of monitoring Users of website monitoring (typically network administrators, web masters, web operations personnel) may monitor a single page of a website, but can also monitor a complete business process (often referred to as multi-step transactions). Monitoring from servers around the globe Website monitoring services usually have a number of servers around the globe – in South America, Africa, North America, Europe, Asia, Australia and other locations. By having multiple servers in different geographic locations, a monitoring service can determine if a web server is available across continents over the Internet. Some vendors claim that the more locations, the better the picture of website availability, while others say that three globally distributed stations are sufficient and that more stations do not give more information.
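A minimal availability check of the kind described above could be sketched in Python (an illustrative sketch only; the function name and URL are hypothetical, and a real monitoring service would test many protocols from multiple locations):

```python
import time
import urllib.request
import urllib.error

def check_url(url: str, timeout: float = 10.0):
    """Return (available, status_or_error, response_time_seconds).

    A site is counted as available when it answers with a 2xx/3xx status
    within the timeout; any network or HTTP error counts as unavailable."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            elapsed = time.monotonic() - start
            return (200 <= resp.status < 400, resp.status, elapsed)
    except (urllib.error.URLError, OSError) as err:
        return (False, str(err), time.monotonic() - start)

# A failed DNS lookup is reported as unavailable rather than raising,
# so the monitor can record the outage and alert.
ok, detail, elapsed = check_url("http://nonexistent.invalid/")
```

A production monitor would run such checks on a schedule, aggregate the response times, and fire alerts (email, SMS, SNMP trap) when availability drops.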
Types There are two main types of website monitoring: synthetic monitoring, also known as active monitoring, and passive monitoring, also known as real user monitoring. Levels of Website Monitoring There are different levels of website monitoring; the more complex the website, the more complex its monitoring needs:

Level 1 Uptime Monitoring – Availability of a Critical Page
Level 2 Transaction Monitoring – Availability of a Critical Process
Level 3 Performance Monitoring – Performance of a Critical Page
Level 4 Synthetic Monitoring – Performance of a Critical Process
Level 5 Customer Journey Monitoring – Level 1 to 4 plus Security Information

Notification options: alerts As the information brought by website monitoring services is in most cases urgent and may be of crucial importance, various notification methods, often known as "alerts", are used: e-mail, IM, regular and cell phones, SMS, fax, pagers, Skype, RSS feed, SNMP trap, URL notifications, etc. Website monitoring services The website monitoring market is very competitive. There are 150+ active service providers and more than 100 documented to have gone out of business. Most of the providers offer a free plan with low-frequency monitoring. See also Application performance management Application Response Measurement Internet server monitoring InterWorx Network monitoring Page view Real user monitoring Traceroute Web analytics Website audit Website tracking References Amazon Downtime Cost $66,000 - Forbes Speed Affects Website Usage - Google Research Blog Website management Web server management software Website monitoring software
19834537
https://en.wikipedia.org/wiki/Portraiture%20of%20Elizabeth%20I
Portraiture of Elizabeth I
The portraiture of Elizabeth I spans the evolution of English royal portraits in the early modern period (1400/1500-1800), depicting Queen Elizabeth I of England and Ireland (1533–1603), from the earliest representations of simple likenesses to the later complex imagery used to convey the power and aspirations of the state, as well as of the monarch at its head. Even the earliest portraits of Elizabeth I contain symbolic objects such as roses and prayer books that would have carried meaning to viewers of her day. Later portraits of Elizabeth layer the iconography of empire—globes, crowns, swords and columns—and representations of virginity and purity, such as moons and pearls, with classical allusions, to present a complex "story" that conveyed to Elizabethan era viewers the majesty and significance of the 'Virgin Queen'. Overview Portraiture in Tudor England Two portraiture traditions had arisen in the Tudor court since the days of Elizabeth's father, Henry VIII. The portrait miniature developed from the illuminated manuscript tradition. These small personal images were almost invariably painted from life over the space of a few days in watercolours on vellum, stiffened by being glued to a playing card. Panel paintings in oils on prepared wood surfaces were based on preparatory drawings and were usually executed at life size, as were oil paintings on canvas. Unlike her contemporaries in France, Elizabeth never granted rights to produce her portrait to a single artist, although Nicholas Hilliard was appointed her official limner, or miniaturist and goldsmith. George Gower, a fashionable court portraitist created Serjeant Painter in 1581, was responsible for approving all portraits of the queen created by other artists from 1581 until his death in 1596. Elizabeth sat for a number of artists over the years, including Hilliard, Cornelis Ketel, Federico Zuccaro or Zuccari, Isaac Oliver, and most likely to Gower and Marcus Gheeraerts the Younger. 
Portraits were commissioned by the government as gifts to foreign monarchs and to show to prospective suitors. Courtiers commissioned heavily symbolic paintings to demonstrate their devotion to the queen, and the fashionable long galleries of later Elizabethan country houses were filled with sets of portraits. The studios of Tudor artists produced images of Elizabeth working from approved "face patterns", or approved drawings of the queen, to meet this growing demand for her image, an important symbol of loyalty and reverence for the crown in times of turbulence. European context By far the most impressive models of portraiture available to English portraitists were the many portraits by Hans Holbein the Younger, the outstanding Northern portraitist of the first half of the 16th century, who had made two lengthy visits to England, and had been Henry VIII's court artist. Holbein had accustomed the English court to the full-length life-size portrait, although none of his originals now survive. His great dynastic mural at Whitehall Palace, destroyed in 1698, and perhaps other original large portraits, would have been familiar to Elizabethan artists. Both Holbein and his great Italian contemporary Titian had combined great psychological penetration with a sufficiently majestic impression to satisfy their royal patrons. By his second visit, Holbein had already begun to move away from a strictly realist depiction; in his Jane Seymour, "the figure is no longer seen as displacing with its bulk a recognizable section of space: it approaches rather to a flat pattern, made alive by a bounding and vital outline". This tendency was to be taken much further by the later portraits of Elizabeth, where "Likeness of feature and an interest in form and volume have gradually been abandoned in favour of an effect of splendid majesty obtained by decorative pattern, and the forms have been flattened accordingly". 
Titian continued to paint royal portraits, especially of Philip II of Spain, until the 1570s, but in sharply reduced numbers after about 1555, and he refused to travel from Venice to do them. The full-length portrait of Philip (1550–51) now in the Prado was sent to Elizabeth's elder sister and predecessor Mary I in advance of their marriage. Towards the mid-16th century, the most influential Continental courts came to prefer less revealing and intimate works, and at the mid-century the two most prominent and influential royal portraitists in paint, other than Titian, were the Netherlandish Anthonis Mor and Agnolo Bronzino in Florence, besides whom the Habsburg court sculptor and medallist Leone Leoni was similarly skilled. Mor, who had risen rapidly to prominence in the 1540s, worked across Europe for the Habsburgs in a tighter and more rigid version of Titian's compositional manner, drawing also on the North Italian style of Moretto. Mor had actually visited London in 1554, and painted three versions of his well-known portrait of Queen Mary; he also painted English courtiers who visited Antwerp. Mor's Spanish pupil Alonso Sánchez Coello continued in a stiffer version of his master's style, replacing him as Spanish court painter in 1561. Sofonisba Anguissola had painted in an intimately informal style, but after her recruitment to the Spanish court as the Queen's painter in 1560 she was able to adapt her style to the much more formal demands of state portraiture. Moretto's pupil Giovanni Battista Moroni was Mor's contemporary and formed his mature style in the 1550s, but few of his spirited portraits were of royalty, or yet to be seen outside Italy. Bronzino developed a style of coldly distant magnificence, based on the Mannerist portraits of Pontormo, working almost entirely for Cosimo I, the first Medici Grand-Duke.
Bronzino's works, including his striking portraits of Cosimo's Duchess, Eleanor of Toledo, were distributed in many versions across Europe, continuing to be made for two decades from the same studio pattern; a new portrait painted in her last years, about 1560, exists in only a few repetitions. At the least many of the foreign painters in London are likely to have seen versions of the earlier type, and there may well have been one in the Royal Collection. French portraiture remained dominated by small but finely drawn bust-length or half-length works, including many drawings, often with colour, by François Clouet following, with a host of imitators, his father Jean, or even smaller oils by the Netherlandish Corneille de Lyon and his followers, typically no taller than a paperback book. A few full-length portraits of royalty were produced, dependent on German or Italian models. Creating the royal image William Gaunt contrasts the simplicity of the 1546 portrait of Elizabeth Tudor as a Princess with later images of her as queen. He wrote, "The painter...is unknown, but in a competently Flemish style he depicts the daughter of Anne Boleyn as quiet and studious-looking, ornament in her attire as secondary to the plainness of line that emphasizes her youth. Great is the contrast with the awesome fantasy of the later portraits: the pallid, mask-like features, the extravagance of headdress and ruff, the padded ornateness that seemed to exclude all humanity." The lack of emphasis given to depicting depth and volume in her later portraits may have been influenced by the Queen's own views. In the Art of Limming, Hilliard cautioned against all but the minimal use of chiaroscuro modelling seen in his works, reflecting the views of his patron: "seeing that best to show oneself needeth no shadow of place but rather the open light...Her Majesty...chose her place to sit for that purpose in the open alley of a goodly garden, where no tree was near, nor any shadow at all..."
From the 1570s, the government sought to manipulate the image of the queen as an object of devotion and veneration. Sir Roy Strong writes: "The cult of Gloriana was skilfully created to buttress public order and, even more, deliberately to replace the pre-Reformation externals of religion, the cult of the Virgin and saints with their attendant images, processions, ceremonies and secular rejoicing." The pageantry of the Accession Day tilts, the poetry of the court, and the most iconic of Elizabeth's portraits all reflected this effort. The management of the queen's image reached its heights in the last decade of her reign, when realistic images of the aging queen were replaced with an eternally youthful vision, defying the reality of the passage of time. Early portraits The young queen Portraits of the young queen, many of them likely painted to be shown to prospective suitors and foreign heads of state, show a naturality and restraint similar to that of the portrait of Elizabeth as a princess. The full-length Hampden image of Elizabeth in a red satin gown, originally attributed to Steven van der Meulen and reattributed to George Gower in 2020, has been identified by Sir Roy Strong as an important early portrait, "undertaken at a time when her image was being tightly controlled", and produced "in response to a crisis over the production of the royal image, one which was reflected in the words of a draft proclamation dated 1563". The draft proclamation (never published) was a response to the circulation of poorly-made portraits in which Elizabeth was shown "in blacke with a hoode and cornet", a style she no longer wore. Symbolism in these pictures is in keeping with earlier Tudor portraiture; in some, Elizabeth holds a book (possibly a prayer book) suggesting studiousness or piety. In other paintings, she holds or wears a red rose, symbol of the Tudor Dynasty's descent from the House of Lancaster, or white roses, symbols of the House of York and of maidenly chastity. 
In the Hampden portrait, Elizabeth wears a red rose on her shoulder and holds a gillyflower in her hand. Of this image, Strong says "Here Elizabeth is caught in that short-lived period before what was a recognisable human became transmuted into a goddess". One artist active in Elizabeth's early court was the Flemish miniaturist Levina Teerlinc, who had served as a painter and gentlewoman to Mary I and stayed on as a Gentlewoman of the Privy Chamber to Elizabeth. Teerlinc is best known for her pivotal position in the rise of the portrait miniature. There is documentation that she created numerous portraits of Elizabeth I, both individual portraits and portraits of the sovereign with important court figures, but only a few of these have survived and been identified. Elizabeth and the goddesses Two surviving allegorical paintings show the early use of classical mythology to illustrate the beauty and sovereignty of the young queen. In Elizabeth I and the Three Goddesses (1569), attributed to Hans Eworth, the story of the Judgement of Paris is turned on its head. Elizabeth, rather than Paris, is now sent to choose among Juno, Venus, and Pallas-Minerva, all of whom are outshone by the queen with her crown and royal orb. As Susan Doran writes, "Implicit to the theme of the painting ... is the idea that Elizabeth's retention of royal power benefits her realm. Whereas Paris's judgement in the original myth resulted in the long Trojan Wars 'to the utter ruin of the Trojans', hers will conversely bring peace and order to the state" after the turbulent reign of Elizabeth's sister Mary I. The latter theme lies behind the 1572 The Family of Henry VIII: An Allegory of the Tudor Succession (attributed to Lucas de Heere). In this image, Catholic Mary and her husband Philip II of Spain are accompanied by Mars, the god of War, on the left, while Protestant Elizabeth on the right ushers in the goddesses Peace and Plenty. 
An inscription states that this painting was a gift from the queen to Francis Walsingham as a "Mark of her people's and her own content", and this may indicate that the painting commemorates the signing of the Treaty of Blois (1572), which established an alliance between England and France against Spanish aggression in the Netherlands during Walsingham's tour of duty as ambassador to the French court. Strong identifies both paintings as celebrations of Elizabeth's just rule by Flemish exiles, to whom England was a refuge from the religious persecution of Protestants in the Spanish Netherlands. Hilliard and the queen Nicholas Hilliard was an apprentice to the Queen's jeweller Robert Brandon, a goldsmith and city chamberlain of London, and Strong suggests that Hilliard may also have been trained in the art of limning by Levina Teerlinc. Hilliard emerged from his apprenticeship at a time when a new royal portrait painter was "desperately needed." Hilliard's first known miniature of the Queen is dated 1572. It is not known when he was formally appointed limner (miniaturist) and goldsmith to Elizabeth, though he was granted the reversion of a lease by the Queen in 1573 for his "good, true and loyal service." Two panel portraits long attributed to him are the Phoenix and Pelican portraits. These paintings are named after the jewels the queen wears, her personal badges of the pelican in her piety and the phoenix. National Portrait Gallery researchers announced in September 2010 that the two portraits were painted on wood from the same two trees; they also found that a tracing of the Phoenix portrait matches the Pelican portrait in reverse, deducing that both pictures of Elizabeth in her forties were painted around the same time. However, Hilliard's panel portraits seem to have been found wanting at the time, and in 1576 the recently married Hilliard left for France to improve his skills.
Returning to England, he continued to work as a goldsmith, and produced some spectacular "picture boxes" or jewelled lockets for miniatures: the Armada Jewel, given by Elizabeth to Sir Thomas Heneage and the Drake Pendant given to Sir Francis Drake are the best known examples. As part of the cult of the Virgin Queen, courtiers were expected to wear the Queen's likeness, at least at Court. Hilliard's appointment as miniaturist to the Crown included the old sense of a painter of illuminated manuscripts and he was commissioned to decorate important documents, such as the founding charter of Emmanuel College, Cambridge (1584), which has an enthroned Elizabeth under a canopy of estate within an elaborate framework of Flemish-style Renaissance strapwork and grotesque ornament. He also seems to have designed woodcut title-page frames and borders for books, some of which bear his initials. The Darnley Portrait The problem of an official portrait of Elizabeth was solved with the Darnley Portrait. Likely painted from life around 1575–6, this portrait is the source of a face pattern which would be used and reused for authorized portraits of Elizabeth into the 1590s, preserving the impression of ageless beauty. Strong suggests that the artist is Federico Zuccari or Zucaro, an "eminent" Italian artist, though not a specialist portrait-painter, who is known to have visited the court briefly with a letter of introduction to Elizabeth's favourite Robert Dudley, 1st Earl of Leicester, dated 5 March 1575. Zuccaro's preparatory drawings for full-length portraits of both Leicester and Elizabeth survive, although it is unlikely the full-length of Elizabeth was ever painted. Curators at the National Portrait Gallery believe that the attribution of the Darnley portrait to Zuccaro is "not sustainable", and attribute the work to an unknown "continental" (possibly Netherlandish) artist. 
The Darnley Portrait features a crown and sceptre on a table beside the queen, and was the first appearance of these symbols of sovereignty separately used as props (rather than worn and carried) in Tudor portraiture, a theme that would be expanded in later portraits. Recent conservation work has revealed that Elizabeth's now-iconic pale complexion in this portrait is the result of deterioration of red lake pigments, which has also altered the coloring of her dress. The Virgin Empress of the Seas Return of the Golden Age The excommunication of Elizabeth by Pope Pius V in 1570 led to increased tension with Philip II of Spain, who championed the Catholic Mary, Queen of Scots, as the legitimate heir of his late wife Mary I. This tension played out over the next decades in the seas of the New World as well as in Europe, and culminated in the invasion attempt of the Spanish Armada. It is against this backdrop that the first of a long series of portraits appears, depicting Elizabeth with heavy symbolic overlays of the possession of an empire based on mastery of the seas. Combined with a second layer of symbolism representing Elizabeth as the Virgin Queen, these new paintings signify the manipulation of Elizabeth's image as the destined Protestant protector of her people. Strong points out that there is no trace of this iconography in portraits of Elizabeth prior to 1579, and identifies its source as the conscious image-making of John Dee, whose 1577 General and Rare Memorials Pertayning to the Perfect Arte of Navigation encouraged the establishment of English colonies in the New World supported by a strong navy, asserting Elizabeth's claims to an empire via her supposed descent from Brutus of Troy and King Arthur. Dee's inspiration lies in Geoffrey of Monmouth's History of the Kings of Britain, which was accepted as true history by Elizabethan poets, and formed the basis of the symbolic history of England. 
In this 12th-century pseudohistory, Britain was founded by and named after Brutus, the descendant of Aeneas, who founded Rome. The Tudors, of Welsh descent, were heirs of the most ancient Britons and thus of Aeneas and Brutus. By uniting the Houses of York and Lancaster following the strife of the Wars of the Roses, the Tudors ushered in a united realm where Pax - Latin for "peace", and the Roman goddess of peace - reigned. The Spenserian scholar Edwin Greenlaw states "The descent of the Britons from the Trojans, the linking of Arthur, Henry VIII, and Elizabeth as Britain's greatest monarchs, and the return under Elizabeth of the Golden Age are all commonplaces of Elizabethan thought." This understanding of history and Elizabeth's place in it forms the background to the symbolic portraits of the latter half of her reign. The Virgin Queen A series of Sieve Portraits copied the Darnley face pattern, and added an allegorical overlay that depicted Elizabeth as Tuccia, a Vestal Virgin who proved her chastity by carrying a sieve full of water from the Tiber River to the Temple of Vesta without spilling a drop. The first Sieve Portrait was painted by George Gower in 1579, but the most influential image is the 1583 version by Quentin Metsys (or Massys) the Younger. In the Metsys version, Elizabeth is surrounded by symbols of empire, including a column and a globe, iconography that would appear again and again in her portraiture of the 1580s and 1590s, most notably in the Armada Portrait of . The medallions on the pillar to the left of the queen illustrate the story of Dido and Aeneas, ancestor of Brutus, suggesting that like Aeneas, Elizabeth's destiny was to reject marriage and found an empire. This painting's patron was likely Sir Christopher Hatton, as his heraldic badge of the white hind appears on the sleeve of one of the courtiers in the background, and the work may have expressed opposition to the proposed marriage of Elizabeth to François, Duke of Anjou.
The virgin Tuccia was familiar to Elizabethan readers from Petrarch's "The Triumph of Chastity". Another symbol from this work is the spotless ermine, wearing a collar of gold studded with topazes. This symbol of purity appears in the Ermine Portrait of 1585, attributed to the herald William Segar. The queen bears the olive branch of Pax (Peace), and the sword of justice rests on the table at her side. In combination, these symbols represent not only the personal purity of Elizabeth but the "righteousness and justice of her government." Visions of empire The Armada Portrait is an allegorical panel painting depicting the queen surrounded by symbols of empire against a backdrop representing the defeat of the Spanish Armada in 1588. There are three surviving versions of the portrait, in addition to several derivative paintings. The version at Woburn Abbey, the seat of the Dukes of Bedford, was long accepted as the work of George Gower, who had been appointed Serjeant Painter in 1581. A version in the National Portrait Gallery, London, which had been cut down at both sides leaving just a portrait of the queen, was also formerly attributed to Gower. A third version, owned by the Tyrwhitt-Drake family, may have been commissioned by Sir Francis Drake. Scholars agree that this version is by a different hand, noting distinctive techniques and approaches to the modelling of the queen's features. Curators now believe that the three extant versions are all the output of different workshops under the direction of unknown English artists. The combination of a life-sized portrait of the queen with a horizontal format is "quite unprecedented in her portraiture", although allegorical portraits in a horizontal format, such as Elizabeth I and the Three Goddesses and the Family of Henry VIII: An Allegory of the Tudor Succession pre-date the Armada Portrait.
The queen's hand rests on a globe below the crown of England, "her fingers covering the Americas, indicating England's [command of the seas] and [dreams of establishing colonies] in the New World". The Queen is flanked by two columns behind, probably a reference to the famous impresa of the Holy Roman Emperor, Charles V, Philip II of Spain's father, which represented the pillars of Hercules, gateway to the Atlantic Ocean and the New World. In the background view on the left, English fireships threaten the Spanish fleet, and on the right the ships are driven onto a rocky coast amid stormy seas by the "Protestant Wind". On a secondary level, these images show Elizabeth turning her back on storm and darkness while sunlight shines where she gazes. An engraving by Crispijn van de Passe (Crispin van de Passe) published in 1596, but showing costume of the 1580s, carries similar iconography. Elizabeth stands between two columns bearing her arms and the Tudor heraldic badge of a portcullis. The columns are surmounted by her emblems of a pelican in her piety and a phoenix, and ships fill the sea behind her. The cult of Elizabeth The various threads of mythology and symbolism that created the iconography of Elizabeth I combined into a tapestry of immense complexity in the years following the defeat of the Spanish Armada. In poetry, portraiture and pageantry, the queen was celebrated as Astraea, the just virgin, and simultaneously as Venus, the goddess of love. Another exaltation of the queen's virgin purity identified her with the moon goddess, who held dominion over the waters. Sir Walter Raleigh had begun to use Diana, and later Cynthia, as aliases for the queen in his poetry around 1580, and images of Elizabeth with jewels in the shape of crescent moons or the huntress's arrows begin to appear in portraiture around 1586 and multiply through the remainder of the reign. 
Courtiers wore the image of the Queen to signify their devotion, and had their portraits painted wearing her colours of black and white. The Ditchley Portrait seems to have always been at the Oxfordshire home of Elizabeth's retired Champion, Sir Henry Lee of Ditchley, and likely was painted for (or commemorates) her two-day visit to Ditchley in 1592. The painting is attributed to Marcus Gheeraerts the Younger, and was almost certainly based on a sitting arranged by Lee, who was the painter's patron. In this image, the queen stands on a map of England, her feet on Oxfordshire. The painting has been trimmed and the background poorly repainted, so that the inscription and sonnet are incomplete. Storms rage behind her while the sun shines before her, and she wears a jewel in the form of a celestial or armillary sphere close to her left ear. Many versions of this painting were made, likely in Gheeraerts' workshop, with the allegorical items removed and Elizabeth's features "softened" from the stark realism of her face in the original. One of these was sent as a diplomatic gift to the Grand Duke of Tuscany, and is now in the Palazzo Pitti. The last sitting and the Mask of Youth Around 1592, the queen also sat to Isaac Oliver, a pupil of Hilliard, who produced an unfinished portrait miniature used as a pattern for engravings of the queen. Only a single finished miniature from this pattern survives, with the queen's features softened, and Strong concludes that this realistic image from life of the aging Elizabeth was not deemed a success. Prior to the 1590s, woodcuts and engravings of the queen were created as book illustrations, but in this decade individual prints of the queen first appear, based on the Oliver face pattern. In 1596, the Privy Council ordered that unseemly portraits of the queen which had caused her "great offence" should be sought out and burnt, and Strong suggests that these prints, of which comparatively few survive, may be the offending images.
Strong writes "It must have been exposure to the searching realism of both Gheeraerts and Oliver that provoked the decision to suppress all likenesses of the queen that depicted her as being in any way old and hence subject to mortality." In any event, no surviving portraits dated between 1596 and Elizabeth's death in 1603 show the aging queen as she truly was. Faithful resemblance to the original is only to be found in the accounts of contemporaries, as in the report written in 1597 by André Hurault de Maisse, Ambassador Extraordinary from Henry IV of France, after an audience with the sixty-five year-old queen, during which he noted, "her teeth are very yellow and unequal ... and on the left side less than on the right. Many of them are missing, so that one cannot understand her easily when she speaks quickly." Yet he added, "her figure is fair and tall and graceful in whatever she does; so far as may be she keeps her dignity, yet humbly and graciously withal." All subsequent images rely on a face pattern devised by Nicholas Hilliard sometime in the 1590s called by art historians the "Mask of Youth", portraying Elizabeth as ever-young. Some 16 miniatures by Hilliard and his studio are known based on this face pattern, with different combinations of costume and jewels likely painted from life, and it was also adopted by (or enforced on) other artists associated with the Court. The coronation portraits Two portraits of Elizabeth in her coronation robes survive, both dated to 1600 or shortly thereafter. One is a panel portrait in oils, and the other is a miniature by Nicholas Hilliard. The warrant to the queen's tailor for remodelling Mary I's cloth of gold coronation robes for Elizabeth survives, and costume historian Janet Arnold's study points out that the paintings accurately reflect the written records, although the jewels differ in the two paintings, suggesting two different sources, one possibly a miniature by Levina Teerlinc. 
It is not known why, and for whom, these portraits were created, at, or just after, the end of her reign. The Rainbow Portrait Attributed to Marcus Gheeraerts the Younger, perhaps the most heavily symbolic portrait of the queen is the Rainbow Portrait at Hatfield House. It was painted around 1600–1602, when the queen was in her sixties. In this painting, an ageless Elizabeth appears dressed as if for a masque, in a linen bodice embroidered with spring flowers and a mantle draped over one shoulder, her hair loose beneath a fantastical headdress. She wears symbols out of the popular emblem books, including the cloak with eyes and ears, the serpent of wisdom, and the celestial armillary sphere, and carries a rainbow with the motto Non sine sole iris ("no rainbow without the sun"). Strong suggests that the complex "programme" for this image may be the work of the poet John Davies, whose Hymns to Astraea honouring the queen use much of the same imagery, and suggests it was commissioned by Robert Cecil as part of the decor for Elizabeth's visit in 1602, when a "shrine to Astraea" featured in the entertainments of what would prove to be the "last great festival of the reign". Books and coins Prior to the wide dissemination of prints of the queen in the 1590s, the common people of Elizabeth's England would be most familiar with her image on the coinage. In December 1560, a systematic recoinage of the debased money then in circulation was begun. The main early effort was the issuance of sterling silver shillings and groats, but new coins were issued in both silver and gold. This restoration of the currency was one of the three principal achievements noted on Elizabeth's tomb, illustrating the value of stable currency to her contemporaries. Later coinage represented the queen in iconic fashion, with the traditional accompaniments of Tudor heraldic badges including the Tudor rose and portcullis. Books provided another widely available source of images of Elizabeth.
Her portrait appeared on the title page of the Bishops' Bible, the standard Bible of the Church of England, issued in 1568 and revised in 1572. In various editions, Elizabeth is depicted with her orb and sceptre accompanied by female personifications. "Reading" the portraits The many portraits of Elizabeth I constitute a tradition of imagery steeped in classical mythology and the Renaissance understanding of English history and destiny, filtered by allusions to Petrarch's sonnets and, late in her reign, to Edmund Spenser's Faerie Queene. This mythology and symbology, though directly understood by Elizabethan contemporaries for its political and symbolic meaning, makes it difficult to 'read' the portraits today as viewers would have seen them at the time of their creation. Though knowledge of the symbology of Elizabethan portraits has not been lost, Dame Frances Yates points out that the most complexly symbolic portraits may all commemorate specific events, or have been designed as part of elaborately themed entertainments, knowledge left unrecorded within the paintings themselves. The most familiar images of Elizabeth—the Armada, Ditchley, and Rainbow portraits—are all associated with unique events in this way. To the extent that the contexts of other portraits have been lost to scholars, so too the keys to understanding these remarkable images as the Elizabethans understood them may be lost in time; even those portraits that are not overtly allegorical may have been full of meaning to a discerning eye. Elizabethan courtiers familiar with the language of flowers and the Italian emblem books could have read stories in the flowers the queen carried, the embroidery on her clothes, and the design of her jewels. According to Strong: Fear of the wrong use and perception of the visual image dominates the Elizabethan age. The old pre-Reformation idea of images, religious ones, was that they partook of the essence of what they depicted.
Any advance in technique which could reinforce that experience was embraced. That was now reversed, indeed it may account for the Elizabethans failing to take cognisance of the optical advances which created the art of the Italian Renaissance. They certainly knew about these things but, and this is central to the understanding of the Elizabethans, chose not to employ them. Instead the visual arts retreated in favour of presenting a series of signs or symbols through which the viewer was meant to pass to an understanding of the idea behind the work. In this manner the visual arts were verbalised, turned into a form of book, a 'text' which called for reading by the onlooker. There are no better examples of this than the quite extraordinary portraits of the queen herself, which increasingly, as the reign progressed, took on the form of collections of abstract pattern and symbols disposed in an unnaturalistic manner for the viewer to unravel, and by doing so enter into an inner vision of the idea of monarchy. See also 1550–1600 in fashion Artists of the Tudor Court Cultural depictions of Elizabeth I of England Notes References Bibliography Arnold, Janet: "The 'Coronation' Portrait of Queen Elizabeth I", The Burlington Magazine, CXX, 1978, pp. 727–41. Arnold, Janet: Queen Elizabeth's Wardrobe Unlock'd, W S Maney and Son Ltd, Leeds, 1988. Blunt, Anthony: Art and Architecture in France, 1500–1700, 2nd edn, 1957, Penguin. Cooper, Tarnya; Bolland, Charlotte (2014). The Real Tudors: Kings and Queens Rediscovered. London: National Portrait Gallery. Freedberg, Sidney J.: Painting in Italy, 1500–1600, 3rd edn, 1993, Yale. Gaunt, William: Court Painting in England from Tudor to Victorian Times. London: Constable, 1980. Gent, Lucy, and Nigel Llewellyn, eds: Renaissance Bodies: The Human Figure in English Culture c.
1540–1660, Reaktion Books, 1990. Hearn, Karen, ed.: Dynasties: Painting in Tudor and Jacobean England 1530–1630. New York: Rizzoli, 1995. (Hearn 1995) Hearn, Karen: Marcus Gheeraerts II: Elizabethan Artist, London: Tate Publishing, 2002. (Hearn 2002) Kinney, Arthur F.: Nicholas Hilliard's "Art of Limning", Northeastern University Press, 1983. Levey, Michael: Painting at Court, Weidenfeld & Nicolson, London, 1971. Penny, Nicholas: National Gallery Catalogues (new series): The Sixteenth Century Italian Paintings, Volume 1, 2004, National Gallery Publications Ltd. Museo del Prado: Catálogo de las pinturas, 1996, Ministerio de Educación y Cultura, Madrid. (Prado) Reynolds, Graham: Nicholas Hilliard & Isaac Oliver, Her Majesty's Stationery Office, 1971. Strong, Roy: The English Icon: Elizabethan and Jacobean Portraiture, 1969, Routledge & Kegan Paul, London. (Strong 1969) Strong, Roy: Nicholas Hilliard, 1975, Michael Joseph Ltd, London. (Strong 1975) Strong, Roy: The Cult of Elizabeth, 1977, Thames and Hudson, London. (Strong 1977) Strong, Roy: Artists of the Tudor Court: The Portrait Miniature Rediscovered 1520–1620, Victoria & Albert Museum exhibit catalogue, 1983. (Strong 1983) Strong, Roy: Art and Power; Renaissance Festivals 1450–1650, 1984, The Boydell Press. (Strong 1984) Strong, Roy: "From Manuscript to Miniature" in John Murdoch, Jim Murrell, Patrick J.
Noon & Roy Strong, The English Miniature, Yale University Press, New Haven and London, 1981. (Strong 1981) Strong, Roy: Gloriana: The Portraits of Queen Elizabeth I, Thames and Hudson, 1987. (Strong 1987) Strong, Roy: The Spirit of Britain, 1999, Hutchison, London. (Strong 1999) Trevor-Roper, Hugh: Princes and Artists, Patronage and Ideology at Four Habsburg Courts 1517–1633, Thames & Hudson, London, 1976. Waterhouse, Ellis: Painting in Britain, 1530–1790, 4th edn, 1978, Penguin Books (now Yale History of Art series). Yates, Frances: Astraea: The Imperial Theme in the Sixteenth Century, London and Boston: Routledge and Kegan Paul, 1975. Further reading Connolly, Annaliese, and Hopkins, Lisa (eds.), Goddesses and Queens: The Iconography of Elizabeth, 2007, Manchester University Press. Elizabeth I English Renaissance Renaissance art English art Iconography 16th-century portraits Portraits of the English royal family Portraits of women Material culture of royal courts
23559792
https://en.wikipedia.org/wiki/Allway%20Sync
Allway Sync
Allway Sync is backup and file synchronization software that allows backing up and synchronizing files to the same or different drives, to different media (CD, DVD, Flash, zip), or to a remote server. Features Like Super Flexible File Synchronizer, GoodSync and Unison File Synchronizer, it has the capability to remember the previous state of directories in a database, and thus also synchronize deletions. It can synchronize more than two directories at once. It can update and back up files over a local network or the Internet, although the functionality of its FTP support is debated. It supports many file systems, including FAT, NTFS, SAMBA, NetWare, CDFS, and UDF, and provides a scheduler as well as file masks and filters. Versions There are various versions. Users of the freeware version are requested to buy the Pro version if they use the software for a commercial purpose or to synchronize more than 40,000 files in a 30-day period. The Pro version has exactly the same functionality as the free version. Allway Sync (freeware version) Allway Sync Pro (for use in businesses, governments, military and other enterprise environments; removes the 40,000 files/month limitation) Allway Sync 'n' Go (for USB and other removable drives) Allway Sync 'n' Go U3 (for U3-enabled flash devices) Allway Sync 'n' Go in Portable Application Format (for USB sticks) See also Backup List of backup software Disk image Comparison of disc image software File synchronization Comparison of file synchronization software References External links http://www.allwaysync.com/ Backup software
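The feature noted above, remembering the previous state of directories in a database so that deletions can be propagated rather than undone, can be sketched as follows. This is a hypothetical Python illustration of the general technique, not Allway Sync's actual code; the function and variable names are invented.

```python
# Hypothetical sketch of deletion-aware two-way sync planning.
# A stateless mirror that sees a file on side A but not side B cannot tell
# "newly created on A" apart from "deleted on B"; a snapshot of the state
# after the previous sync resolves the ambiguity.

def plan_sync(side_a, side_b, last_state):
    """side_a/side_b map filename -> content; last_state is the set of
    filenames present on both sides after the previous sync run."""
    actions = []
    for name in sorted(set(side_a) | set(side_b) | set(last_state)):
        in_a, in_b = name in side_a, name in side_b
        if in_a and not in_b:
            # Seen before: it was deleted on B, so delete on A as well.
            # Never seen before: it is new on A, so copy it to B.
            actions.append(("delete", "A", name) if name in last_state
                           else ("copy", "A->B", name))
        elif in_b and not in_a:
            actions.append(("delete", "B", name) if name in last_state
                           else ("copy", "B->A", name))
        # Present on both sides (or on neither): nothing to plan here;
        # a real tool would also compare timestamps or hashes for updates.
    return actions
```

A stateless mirror would instead re-copy every file that had been deleted on the other side, "resurrecting" deletions; persisting the equivalent of `last_state` in a small database after each run is what lets tools in this category propagate them.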
6761673
https://en.wikipedia.org/wiki/Jaypee%20University%20of%20Information%20Technology
Jaypee University of Information Technology
Jaypee University of Information Technology (also J. P. University of Information Technology and JUIT) is a private university in Waknaghat, Solan, Himachal Pradesh, India. History Jaypee University of Information Technology is recognised by the state government of Himachal Pradesh, India. It was set up by Act No. 14 of 2002 vide Extraordinary Gazette notification of the Government of Himachal Pradesh dated 23 May 2002. JUIT was approved by the University Grants Commission under section 2(f) of the UGC Act, and the university commenced academic activities in July 2002. Location The university is located at Waknaghat, 4 kilometers off National Highway 22, which runs from Kalka to Shimla, and 22 km from Shimla. The campus is spread across the green slopes of Waknaghat. The nearest railway station is Kaithleeghat, 4 kilometers from Waknaghat, and the nearest airport is at Shimla. Academics The university offers Bachelor of Technology programmes in various fields. It also offers Bachelor of Pharmacy and Master of Pharmacy programmes as well as various doctoral programmes. In addition to the above-mentioned programs, a dual degree program is offered, wherein a student spends four years at JUIT attending the aforementioned programs and one year of the MBA program at the Jaypee Business School, Jaypee Institute of Information Technology. Rankings Jaypee University of Information Technology was ranked 115 and 84 among engineering colleges by the National Institutional Ranking Framework (NIRF) in 2020 and 2019 respectively. Departments The university includes the following departments: Department of Computer Science Engineering and IT Department of Electronics and Communication Technology Department of Civil Engineering Department of Biotechnology and Bioinformatics Department of Physics and Material Sciences Department of Mathematics Department of Pharmacy The JUIT Department of Pharmacy was approved by the Pharmacy Council of India (PCI) for BPharma from 2008.
See also Jaypee University of Engineering and Technology. Jaypee institute of information technology. References External links Universities in Himachal Pradesh All India Council for Technical Education Information technology institutes Jaypee Group Engineering colleges in Himachal Pradesh Education in Solan district Educational institutions established in 2002 2002 establishments in Himachal Pradesh
52943725
https://en.wikipedia.org/wiki/Cyber%20and%20Information%20Domain%20Service%20%28Germany%29
Cyber and Information Domain Service (Germany)
The Cyber and Information Domain Service (KdoCIR) is the youngest branch of Germany's military, the Bundeswehr. The decision to form a new military branch was presented by Defense Minister Ursula von der Leyen on 26 April 2016, and the branch became operational on 5 April 2017. The Cyber and Information Domain Service is headquartered in Bonn. History In November 2015 the German Ministry of Defense activated a Staff Group within the ministry tasked with developing plans for a reorganization of the Cyber, IT, military intelligence, geo-information, and operative communication units of the Bundeswehr. On 26 April 2016 Defense Minister Ursula von der Leyen presented the plans for the new military branch to the public and on 5 October 2016 the command's staff became operational as a department within the ministry of defense. On 5 April 2017 the Cyber and Information Domain Service was activated as the 6th branch of the Bundeswehr. The Cyber and Information Domain Service Headquarters will take command of all existing Cyber, IT, military intelligence, geoinformation, and operative communication units. The Command is planned to be fully operational by 2021. Organisation The CIDS is commanded by the Chief of the Cyber and Information Domain Service (InspCIR), a three-star general position, based in Bonn.
Chief CIDS and Commander CIDS HQ Deputy Commander CIDS HQ and Chief of Staff Command Staff Operations Staff Planning Staff Strategic Reconnaissance Command (KSA), in Gelsdorf 911th Electronic Warfare Battalion 912th Electronic Warfare Battalion, mans the Oste-class SIGINT/ELINT and reconnaissance ships 931st Electronic Warfare Battalion 932nd Electronic Warfare Battalion, provides airborne troops for operations in enemy territory Bundeswehr Strategic Reconnaissance School Bundeswehr Operational Communications Center Cyber-Operations Center Electronic Warfare Analysis Center Central Imaging Reconnaissance, operating the SAR-Lupe satellites Central Bundeswehr Investigation Authority for Technical Reconnaissance Bundeswehr Geoinformation Centre, in Euskirchen Information Technology Command, in Bonn Bundeswehr IT Operations Center Bundeswehr Information Technology School 281st Information Technology Battalion 282nd Information Technology Battalion 292nd Information Technology Battalion 293rd Information Technology Battalion 381st Information Technology Battalion 383rd Information Technology Battalion Bundeswehr Cyber-Security Center Bundeswehr Software Competence Center See also List of cyber warfare forces References External links Cyber security, the German way Dossier on cyber security in the Bundeswehr White Paper on German security policy and the future of the Bundeswehr The Faculty of Joint Operations Bundeswehr Information operations units and formations Computer security
55187092
https://en.wikipedia.org/wiki/Rothacker
Rothacker
Rothacker is a surname. Notable people with the surname include: Erich Rothacker (1888–1965), German philosopher Nate Rothacker (born 1981), American drummer Rick Rothacker (born 1972), American journalist
61906365
https://en.wikipedia.org/wiki/Librem%205
Librem 5
The Librem 5 is a smartphone manufactured by Purism that is part of their Librem line of products. The phone is designed with the goal of using free software whenever possible, includes PureOS, a GNU/Linux operating system, by default, and as of 2021 is the only smartphone recommended by the Free Software Foundation. Like other Librem products, the Librem 5 focuses on privacy and freedom, and includes features like hardware kill switches and easily replaceable components. Its name, with a numerical "5", refers to its screen size, and not a release version. After an announcement on August 24, 2017, the distribution of developer kits and limited pre-release models occurred throughout 2019 and most of 2020. The first mass-production version of the Librem 5 was shipped on November 18, 2020. History On August 24, 2017, Purism started a crowdfunding campaign for the Librem 5, a smartphone aimed not only to run purely on free software provided in PureOS, but to "[focus] on security by design and privacy protection by default". Purism claimed that the phone would become "the world's first ever IP-native mobile handset, using end-to-end encrypted decentralized communication". Purism has cooperated with GNOME in its development of the Librem 5 software. It is planned that KDE and Ubuntu Touch will also be offered as optional interfaces. The release of the Librem 5 was delayed several times. It was originally planned to launch in January 2019. Purism announced on September 4, 2018, that the launch date would be postponed until April 2019, due to two power management bugs in the silicon and the Europe/North America holiday season. Development kits for software developers, which were shipped out in December 2018, were unaffected by the bugs, since developers normally connect the device to a power outlet, rather than rely on the phone battery. In February 2019, the launch date was postponed again to the third quarter of 2019, because of the necessity of further CPU tests.
Specifications and pre-orders were announced in July 2019, at a price of $649, set to increase to $699. On September 5, 2019, Purism announced that shipping was scheduled to occur later that month, but that it would be done as an "iterative" process. The iterative release plan included the announcement of six different "batches" of Librem 5 releases, of which the first four would be limited pre-production models. Each consecutive batch had its own arboreal-themed code name and release date, and would feature hardware, mechanical, and software improvements. Purism contacted each customer who had pre-ordered to allow them to choose which batch they would prefer to receive. Pre-mass-production batches, in order of release, included code names "Aspen", "Birch", "Chestnut", and "Dogwood". The fifth batch, "Evergreen", would be the official mass-production model, while the sixth batch, "Fir", would be the second mass-production model. On September 24, 2019, Purism announced that the first batch of limited-production Librem 5 phones (Aspen) had started shipping. A video of an early phone was produced, and a shipping and status update was released soon after. However, it was later reported that the Aspen batch had been shipped only to employees and developers. On November 22, 2019, it was reported that the second batch (Birch) would consist of around 100 phones, and would be in the hands of backers by the first week of December. In December 2019, Jim Salter of Ars Technica reported "prototype" devices were being received; however, they were not really a "phone" yet. There was no audio when attempting to place a phone call, and the cameras did not work yet. Reports of the third batch of limited pre-mass-production models (Chestnut) being received by customers and reviewers occurred in January 2020. Chestnut Librem 5 models were the first to have the ability to make calls and send text messages.
By May 2020, TechRadar reported that the call quality was fine, though the speaker mode was "a bit quiet", and volume adjustment did not work. According to TechRadar, the 3 to 5-hour battery time and the inability of the phone to charge while turned on were "A stark reminder of the Librem 5's beta status". On November 18, 2020, Purism announced via press release that they had begun shipping the finished version of the Librem 5, known as "Evergreen". In December 2019, Purism announced that it would offer a "Librem 5 USA" version of the phone for the price of $1999, which is assembled in the United States for extra supply chain security. According to Purism CEO Todd Weaver, "having a secure auditable US based supply chain including parts procurement, fabrication, testing, assembly, and fulfillment all from within the same facility is the best possible security story." Hardware The Librem 5 features an i.MX 8M Quad Core processor, with an integrated GPU which supports OpenGL 3.0, OpenGL ES 3.1, Vulkan 1.0 and OpenCL 1.2 with default drivers; however, since the driver used is the open source Etnaviv driver, it currently only supports OpenGL 2.1 and OpenGL ES 2.0. It has 3GB of RAM, 32GB of eMMC storage, a 13MP rear camera, and an 8MP front camera. The left side of the phone features three hardware kill switches, which cut power to the camera and microphone, Wi-Fi and Bluetooth modem, and the baseband modem. The device uses a USB-C connector to charge its 4500 mAh lithium battery. The 144 mm (5.7") IPS display has a resolution of 1440×720 pixels. It also has a 3.5 mm TRRS headphone/mic jack, a single SIM slot, and a microSD card slot. Mobile security The hardware features three hardware kill switches that physically cut off power from both cameras and the microphone, Wi-Fi and Bluetooth, and baseband processor, respectively.
Further precautionary measures can be taken with Lockdown Mode, which, in addition to powering off the cameras, microphone, Wi-Fi, Bluetooth and cellular baseband, also cuts power to the GNSS, IMU, and ambient light and proximity sensors. This is possible because these components are not integrated into the system on a chip (SoC), as they are in conventional smartphones. Instead, the cellular baseband and Wi-Fi/Bluetooth components are located on two replaceable M.2 cards, meaning they can be swapped to support different wireless standards. The kill switch that cuts the circuit to the microphone also prevents the 3.5 mm audio jack from being used for acoustic cryptanalysis. In place of the integrated mobile SoC found in most smartphones, the Librem 5 uses six separate chips: the i.MX 8M Quad, Silicon Labs RS9116, Broadmobi BM818 / Gemalto PLS8, STMicroelectronics Teseo-LIV3F, Wolfson Microelectronics WM8962, and Texas Instruments bq25895. The downside of dedicated chips instead of an integrated system-on-chip is that separate chips take more energy to operate, and the phone's circuit boards are much larger. On the other hand, separate components receive longer support from their manufacturers than mobile SoCs, which have short support timelines. According to Purism, the Librem 5 is designed to avoid planned obsolescence and will receive lifetime software updates. The Librem 5 is the first phone to contain a smartcard reader, into which an OpenPGP card can be inserted for secure cryptographic operations. Purism plans to use OpenPGP cards to implement storage of GPG keys, disk unlocking, secure authentication, a local password vault, protection of sensitive files, user personas, and travel personas. To promote better security, all the source code in the root file system is free/open-source software and can be reviewed by the user. 
Purism publishes the schematics of the Librem 5's printed circuit boards (PCBs) under the GPL 3.0+ license, and publishes X-rays of the phone so that users can verify that no changes have been made to the hardware, such as inserted spy chips. Software The Librem 5 ships with Purism's PureOS, a Debian GNU/Linux derivative. The operating system uses a new mobile user interface developed by Purism called Phosh, a portmanteau of "phone" and "shell". It is based on Wayland, wlroots, GTK 3, and GNOME. Unlike other mobile GNU/Linux interfaces, such as Ubuntu Touch and KDE Plasma Mobile, Phosh is tightly integrated with the desktop GNU/Linux software stack, which Purism developers believe will make it easier to maintain in the long term and to incorporate into existing desktop GNU/Linux distributions. Phosh has been packaged in a number of desktop distributions (Debian, Arch, Manjaro, Fedora and openSUSE) and is used by eight of the sixteen GNU/Linux ports for the PinePhone. The phone is a convergence device: if connected to a keyboard, monitor, and mouse, it can run GNU/Linux applications as a desktop computer would. Many desktop GNU/Linux applications can run on the phone as well, albeit possibly without a touch-friendly UI. Purism is taking a distinctive approach to convergence by downsizing existing desktop software for reuse in a mobile environment. Purism has developed the libhandy library to make GTK software adaptive, so that its interface elements adjust to smaller mobile screens. In contrast, other companies, such as Microsoft, Samsung, and Canonical with Ubuntu (before Unity8), tried to achieve convergence by having separate sets of software for the mobile and desktop PC environments. Most iOS apps, Android apps and Plasma Mobile's Kirigami apps implement convergence by upsizing existing mobile apps for use in a desktop interface. 
Purism claims that the "Librem 5 will be the first ever Matrix-powered smartphone, natively using end-to-end encrypted decentralised communication in its dialer and messaging app". Purism was unable to find a free/open-source cellular modem, so the phone uses a modem with proprietary hardware, but isolates it from the rest of the components rather than having it integrated with the System on a Chip (SoC). This prevents code on the modem from being able to read or modify data going to and from the SoC. See also Comparison of open-source mobile phones List of open-source mobile phones Microphone blocker Modular smartphone PinePhone References External links Librem 5 Mobile Linux Linux-based devices Open-source mobile phones Mobile/desktop convergence Mobile security Secure communication Modular smartphones
26020472
https://en.wikipedia.org/wiki/Osawatomie%20High%20School
Osawatomie High School
Osawatomie High School (OHS) is a public high school in Osawatomie, Kansas, United States. It is operated by the Osawatomie USD 367 school district. Its mascot is the Trojan, and the school colors are red and white. The principal is Jeff White, and the assistant principal/athletic director is Wade Welch. The school, located at 1200 Trojan Drive, has a current enrollment of 310 students in grades 9-12. Administration Osawatomie High School is currently under the leadership of Principal Jeff White. Extracurricular activities The Trojans compete in the Pioneer League and are classified as a 4A school, the third-largest classification in Kansas according to the Kansas State High School Activities Association. Many graduates have gone on to participate in Division I, Division II, and Division III athletics. Athletics Fall Cross Country Fall Cheerleading Football Volleyball Winter Boys Basketball Winter Cheerleading Boys Swimming Wrestling Spring Baseball Golf Softball Girls Swimming Track and Field Clubs Art Club Band Choir Class Officers Debate Team DAZZLERS Drama Club FCCLA FFA KAY Club Renaissance SADD Scholar's Bowl Science Club Student Council Gaming Club Technology As part of a program initiated in 2005, OHS provides a 1:1 laptop program, meaning every high school student gets a laptop to use at school and at home during the school year. The overall objective of the program is to provide access to technology for all students. The program is intended to increase student learning and achievement and to better prepare students for the challenges of the digital workforce. 
Notable alumni Lynn Dickey (class of 1967) – NFL quarterback (Houston Oilers, Green Bay Packers) Derrick Jensen (class of 1974) – NFL tight end for the Oakland Raiders and scout for the Seattle Seahawks See also List of high schools in Kansas List of unified school districts in Kansas References External links School Osawatomie High School Website District Website Maps Osawatomie city map, KDOT Miami County map, KDOT Public high schools in Kansas Schools in Miami County, Kansas 1893 establishments in Kansas
2484242
https://en.wikipedia.org/wiki/Chartered%20IT%20Professional
Chartered IT Professional
Chartered IT Professional (in full, Chartered Information Technology Professional), denoted by CITP, is a professional qualification awarded under Royal Charter to IT professionals who satisfy strict criteria set by the British Computer Society (BCS), the professional body for IT in the United Kingdom. Status The title Chartered IT Professional is aligned with the Skills Framework for the Information Age (SFIA), the UK Government-backed competency framework; CITP is the benchmark of IT excellence and the terminal (highest) qualification in IT. Criteria and requirements for chartered status in the UK have to be approved by the Privy Council, and as such the CITP designation is on par with chartered qualifications in other fields (such as the Chartered Accountant qualification awarded by the ICAEW). CITP status is gaining increasing national and international recognition as the benchmark of excellence for IT professionals. It provides evidence of an individual's commitment to the profession and an endorsement of their experience and knowledge. Eligibility To qualify for this award, a person normally needs at least 8 to 10 years of professional experience in IT, with evidence of experience at a senior level (5) of the Skills Framework for the Information Age (SFIA), must have passed a professional competency examination, and must have successfully completed a skills assessment interview with two BCS assessors. A fast-track scheme exists for holders of the Certified Architect and Certified IT Specialist certifications from The Open Group, which exempts applicants from the initial review and interview elements of the application process. To maintain a certificate of current competence, demonstrable CPD must be undertaken. Designatory lettering Chartered IT Professionals are entitled to use the suffix CITP after their names. 
This is written after honours, decorations and university degrees, and before letters denoting membership of professional engineering institutions - for example: BSc (Hons) CITP MBCS. The BCS maintains an online register of members with CITP status. Licensing Other professional membership bodies may apply to the BCS for a licence that enables them to award CITP to their eligible members. Ireland The Irish Computer Society is the awarding body for CITP status in Ireland. The standard has been developed in consultation with the BCS, and applicants undergo the same assessment process. New Zealand The Institute of IT Professionals New Zealand is licensed to award CITP status in New Zealand. The body extends the CITP standard to CITPNZ, incorporating additional requirements such as a mandatory practising certificate to retain the designation. United Kingdom The Institution of Engineering and Technology (IET) was the first membership body licensed to award CITP, but the agreement ended on 22 February 2020. References External links Chartered IT Professional Chartered IT Professional Ireland - Irish Computer Society Chartered IT Professional New Zealand Institute of Chartered IT Professionals (South Africa) British Computer Society Information technology education Information technology qualifications Post-nominal letters Titles in the United Kingdom
18987740
https://en.wikipedia.org/wiki/SLOB
SLOB
The SLOB (simple list of blocks) allocator is one of three available memory allocators in the Linux kernel; the other two are SLAB (the slab allocator) and SLUB. The SLOB allocator is designed to require little memory for its implementation and housekeeping, for use in small systems such as embedded systems. A major limitation of the SLOB allocator, however, is that it suffers greatly from internal fragmentation. SLOB currently uses a first-fit algorithm, which places an allocation in the first free space large enough to hold it. In 2008, in a reply on a Linux mailing list, Linus Torvalds suggested the use of a best-fit algorithm, which tries to find the memory block that suits the request best. Best fit finds the smallest free space that fits the required amount, which helps limit fragmentation. By default, the Linux kernel used the SLAB allocator until version 2.6.23, when SLUB became the default. When the CONFIG_SLAB flag is disabled, the kernel falls back to using the SLOB allocator. The SLOB allocator was used in DSLinux on the Nintendo DS handheld console. See also Slab allocation References Memory management algorithms Linux kernel
32189367
https://en.wikipedia.org/wiki/Qurtuba%20University
Qurtuba University
Qurtuba University (QU) was established in 2001. Qurtuba is the Arabic name of Córdoba, a city of Al-Andalus in present-day Spain. The university has two campuses, in Dera Ismail Khan and Peshawar, Pakistan; both are separately recognized and are placed in the highest 'W3' category by the HEC. It holds seventh position among the universities of Pakistan in the rankings of the Higher Education Commission of Pakistan. It is chartered by the Government of Khyber Pakhtunkhwa and approved by the Higher Education Commission of Pakistan. Qurtuba University of Science and Information Technology is one of the pioneer private-sector universities of Khyber Pakhtunkhwa, Pakistan. History The university was established through a charter issued by the Governor of Khyber Pakhtunkhwa on 30 August 2001 and is recognized by the Higher Education Commission (formerly the UGC), Islamabad. In 2006, the Girls School & College was separated. In 2008, a model school in the P.I.A. society, Lahore, was established. Achievements Qurtuba University of Science & IT has been awarded the 'W3' category by the Higher Education Commission of Pakistan in the new ranking system and has been placed among the top-ranked universities of the country. Qurtuba University is the only university of Khyber Pakhtunkhwa to hold the highest ranking position from the Higher Education Commission of Pakistan. It has been consistently placed third in the field of Computer Science & Information Technology in HEC rankings. Campuses Qurtuba University of Science and Information Technology has two campuses, both in Khyber Pakhtunkhwa. The main campus is in Dera Ismail Khan and the other is in Peshawar. Programs offered Qurtuba University offers the following graduate, postgraduate and doctorate programs. 
Bachelor's programs Bachelor of Business Administration (BBA) Bachelor of Commerce (B.Com) Bachelor of Education (B.Ed) Bachelor of Computer Science Bachelor of Science in Electrical Engineering Bachelor of Science in Civil Engineering Bachelor of Technology in Electrical Technology Bachelor of Technology in Civil Technology Bachelor of Economics Master's programs Master of Business Administration (MBA) Master of Commerce (M.Com) Master of Education (M.Ed) Master of Science in Mathematics Master of Science in Computer Sciences Master of Arts in English Master of Arts in International Relations Master of Science in Physics Master of Science in Chemistry Master of Arts in Political Science MS programs MS Computer Science MS Management Sciences MS English M.Phil programs M.Phil Education M.Phil Economics M.Phil International Relations M.Phil Political Sciences M.Phil Pakistan Studies M.Phil Urdu Ph.D programs Ph.D Management Sciences Ph.D International Relations Ph.D Political Science Ph.D Education Ph.D Urdu Diplomas Diploma in Human Rights Diploma in Islamic Banking Diploma in Marketing Diploma in Human Resource Management (HRM) Diploma in Physical Education Certificates Certificate in Accounting References External links Official site Dera Ismail Khan District 2001 establishments in Pakistan Educational institutions established in 2001 Private universities and colleges in Khyber Pakhtunkhwa Peshawar District Universities and colleges in Peshawar Peshawar
8400335
https://en.wikipedia.org/wiki/Software%20portability
Software portability
Portability in high-level computer programming is the usability of the same software in different environments. The prerequisite for portability is a generalized abstraction between the application logic and the system interfaces. When software with the same functionality is produced for several computing platforms, portability is the key issue for reducing development cost. Strategies for portability Software portability may involve: Transferring installed program files to another computer of basically the same architecture. Reinstalling a program from distribution files on another computer of basically the same architecture. Building executable programs for different platforms from source code; this is what is usually understood by "porting". Similar systems When operating systems of the same family are installed on two computers with processors that have similar instruction sets, it is often possible to transfer the files implementing a program between them. In the simplest case, the file or files may simply be copied from one machine to the other. However, in many cases the software is installed on a computer in a way that depends upon its detailed hardware, software, and setup, with device drivers for particular devices, using installed operating-system and supporting software components, and using different drives or directories. In some cases, software, usually described as "portable software", is specifically designed to run on different computers with compatible operating systems and processors, without any machine-dependent installation. Porting is then no more than transferring specified directories and their contents. Software installed on portable mass storage devices such as USB sticks can be used on any compatible computer simply by plugging the storage device in, and stores all configuration information on the removable device. Hardware- and software-specific information is often stored in configuration files in specified locations (e.g. 
the registry on machines running Microsoft Windows). Software which is not portable in this sense will have to be transferred with modifications to support the environment on the destination machine. Different processors The majority of desktop and laptop computers use microprocessors compatible with the 32- and 64-bit x86 instruction sets, while smaller portable devices use processors with different and incompatible instruction sets, such as ARM. The difference between larger and smaller devices is such that detailed software operation differs; an application designed to display suitably on a large screen cannot simply be ported to a pocket-sized smartphone with a tiny screen, even if the functionality is similar. Web applications are required to be processor independent, so portability can be achieved by using web programming techniques, writing in JavaScript. Such a program can run in a common web browser. Such web applications must, for security reasons, have limited control over the host computer, especially regarding reading and writing files. Non-web programs, installed upon a computer in the normal manner, can have more control, and yet achieve system portability by linking to portable libraries that provide the same interface on different systems. Source code portability Software can be compiled and linked from source code for different operating systems and processors if written in a programming language supporting compilation for the platforms. This is usually a task for the program developers; typical users have neither access to the source code nor the required skills. In open-source environments such as Linux the source code is available to all. In earlier days, source code was often distributed in a standardised format and could be built into executable code with a standard Make tool for any particular system by moderately knowledgeable users, provided no errors occurred during the build. Some Linux distributions distribute software to users in source form. 
In these cases there is usually no need for detailed adaptation of the software for the system; it is distributed in a way which modifies the compilation process to match the system. Effort to port source code Even with seemingly portable languages like C and C++, the effort required to port source code can vary considerably. The authors of UNIX/32V (1979) reported that "[t]he (Bourne) shell [...] required by far the largest conversion effort of any supposedly portable program, for the simple reason that it is not portable." Sometimes the effort consists of recompiling the source code, but sometimes it is necessary to rewrite major parts of the software. Many language specifications describe implementation-defined behaviour (e.g. right-shifting a signed integer in C can do a logical or an arithmetic shift). Operating system functions or third-party libraries might not be available on the target system. Some functions are available on a target system but exhibit slightly different behaviour (e.g., utime() fails under Windows with EACCES when it is called for a directory). The program code itself can also contain unportable things, like the paths of include files. Drive letters and the backslash as path delimiter are not accepted on all operating systems. Implementation-defined things like byte order and the size of an int can also raise the porting effort. In practice, the claim that languages like C and C++ are "write once, compile anywhere" (WOCA) is arguable. See also Interoperability Cross-platform software Hardware-dependent software C (programming language) Language interoperability Portability testing Source-to-source compiler Data portability References Sources Software quality
14948587
https://en.wikipedia.org/wiki/Unslung
Unslung
Unslung is an open-source firmware for the Linksys NSLU2, based on the stock Linksys firmware. Because the device runs Linux, which is licensed under and subject to the terms of the GNU General Public License, Linksys released the source code. Unslung takes the Linksys firmware and expands upon it; it is still subject to some of the restrictions of the Linksys firmware, but removes others. Because it is based on the old Linux 2.4 kernel, support for some newer devices may not exist. The web interface of the default Linksys firmware is kept fully functioning, except for the upgrade interface. Through ipkg, users are able to install over 1000 Optware packages on the device, which have been specially compiled for the NSLU2. Current Release The Unslung 6.10 Beta firmware is based on the Linksys V2.3R63/A5 firmware and is a minor bugfix update of Unslung 6.8. See also SlugOS References External links NSLU2-Linux NSLU2-Linux Firmware page Linksys NSLU2 Product Information Page NSLU2-Linux
24095830
https://en.wikipedia.org/wiki/List%20of%20important%20publications%20in%20theoretical%20computer%20science
List of important publications in theoretical computer science
This is a list of important publications in theoretical computer science, organized by field. Some reasons why a particular publication might be regarded as important: Topic creator – A publication that created a new topic Breakthrough – A publication that changed scientific knowledge significantly Influence – A publication that has significantly influenced the world or has had a massive impact on the teaching of theoretical computer science. Computability Cutland's Computability: An Introduction to Recursive Function Theory (Cambridge) The review of this early text by Carl Smith of Purdue University (in the Society for Industrial and Applied Mathematics Reviews) reports that this is a text with an "appropriate blend of intuition and rigor… in the exposition of proofs" that presents "the fundamental results of classical recursion theory [RT]... in a style... accessible to undergraduates with minimal mathematical background". While he states that it "would make an excellent introductory text for an introductory course in [RT] for mathematics students", he suggests that an "instructor must be prepared to substantially augment the material… " when it is used with computer science students (given a dearth of material on RT applications to this area). Decidability of second order theories and automata on infinite trees Michael O. Rabin Transactions of the American Mathematical Society, vol. 141, pp. 1–35, 1969 Description: The paper presented the tree automaton, an extension of finite automata. Tree automata have had numerous applications to proofs of correctness of programs. Finite automata and their decision problems Michael O. Rabin and Dana S. Scott IBM Journal of Research and Development, vol. 3, pp. 114–125, 1959 Online version Description: Mathematical treatment of automata, proof of core properties, and definition of the non-deterministic finite automaton. Introduction to Automata Theory, Languages, and Computation John E. Hopcroft, Jeffrey D. 
Ullman, and Rajeev Motwani Addison-Wesley, 2001, Description: A popular textbook. On certain formal properties of grammars Description: This article introduced what is now known as the Chomsky hierarchy, a containment hierarchy of classes of formal grammars that generate formal languages. On computable numbers, with an application to the Entscheidungsproblem Alan Turing Proceedings of the London Mathematical Society, Series 2, vol. 42, pp. 230–265, 1937, .Errata appeared in vol. 43, pp. 544–546, 1938, . HTML version, PDF version Description: This article set the limits of computer science. It defined the Turing Machine, a model for all computations. On the other hand, it proved the undecidability of the halting problem and Entscheidungsproblem and by doing so found the limits of possible computation. Rekursive Funktionen The first textbook on the theory of recursive functions. The book went through many editions and earned Péter the Kossuth Prize from the Hungarian government. Reviews by Raphael M. Robinson and Stephen Kleene praised the book for providing an effective elementary introduction for students. Representation of Events in Nerve Nets and Finite Automata S. C. Kleene U. S. Air Force Project Rand Research Memorandum RM-704, 1951 Online version republished in: Description: this paper introduced finite automata, regular expressions, and regular languages, and established their connection. 
Computational complexity theory Arora & Barak's Computational Complexity and Goldreich's Computational Complexity (both Cambridge) Sanjeev Arora and Boaz Barak, "Computational Complexity: A Modern Approach," Cambridge University Press, 2009, 579 pages, Hardcover Oded Goldreich, "Computational Complexity: A Conceptual Perspective," Cambridge University Press, 2008, 606 pages, Hardcover Besides the estimable press bringing these recent texts forward, they are very positively reviewed in ACM's SIGACT News by Daniel Apon of the University of Arkansas, who identifies them as "textbooks for a course in complexity theory, aimed at early graduate… or... advanced undergraduate students… [with] numerous, unique strengths and very few weaknesses." The reviewer notes that there is "a definite attempt in [Arora and Barak] to include very up-to-date material, while Goldreich focuses more on developing a contextual and historical foundation for each concept presented," and that he "applaud[s] all… authors for their outstanding contributions." A machine-independent theory of the complexity of recursive functions Description: The Blum axioms. Algebraic methods for interactive proof systems Description: This paper showed that PH is contained in IP. The complexity of theorem proving procedures Description: This paper introduced the concept of NP-Completeness and proved that the Boolean satisfiability problem (SAT) is NP-Complete. Note that similar ideas were developed independently slightly later by Leonid Levin in "Levin, Universal Search Problems. Problemy Peredachi Informatsii 9(3):265–266, 1973". Computers and Intractability: A Guide to the Theory of NP-Completeness Description: The main importance of this book is its extensive list of more than 300 NP-Complete problems. This list became a common reference and definition. Though the book was published only a few years after the concept was defined, it already assembled such an extensive list. 
Degree of difficulty of computing a function and a partial ordering of recursive sets Description: This technical report was the first publication on what was later renamed computational complexity. How good is the simplex method? Victor Klee and George J. Minty Description: Constructed the "Klee–Minty cube" in dimension D, whose 2^D corners are each visited by Dantzig's simplex algorithm for linear optimization. How to construct random functions Description: This paper showed that the existence of one-way functions leads to computational randomness. IP = PSPACE Description: IP is a complexity class whose characterization (based on interactive proof systems) is quite different from the usual time/space-bounded computational classes. In this paper, Shamir extended the technique of the previous paper by Lund, et al., to show that PSPACE is contained in IP, and hence IP = PSPACE, so that each problem in one complexity class is solvable in the other. Reducibility among combinatorial problems R. M. Karp In R. E. Miller and J. W. Thatcher, editors, Complexity of Computer Computations, Plenum Press, New York, NY, 1972, pp. 85–103 Description: This paper showed that 21 different problems are NP-Complete, demonstrating the importance of the concept. The Knowledge Complexity of Interactive Proof Systems Description: This paper introduced the concept of zero knowledge. A letter from Gödel to von Neumann Kurt Gödel A Letter from Gödel to John von Neumann, March 20, 1956 Online version Description: Gödel discusses the idea of an efficient universal theorem prover. On the computational complexity of algorithms Description: This paper gave computational complexity its name and seed. Paths, trees, and flowers Description: There is a polynomial-time algorithm to find a maximum matching in a graph that is not bipartite; the paper was another step toward the idea of computational complexity. For more information see . 
Theory and applications of trapdoor functions Description: This paper creates a theoretical framework for trapdoor functions and describes some of their applications, as in cryptography. Note that the concept of trapdoor functions had been introduced in "New directions in cryptography" six years earlier (see section V, "Problem Interrelationships and Trap Doors"). Computational Complexity C.H. Papadimitriou Addison-Wesley, 1994, Description: An introduction to computational complexity theory; the book explains its author's characterization of P-SPACE and other results. Interactive proofs and the hardness of approximating cliques Probabilistic checking of proofs: a new characterization of NP Proof verification and the hardness of approximation problems Description: These three papers established the surprising fact that certain problems in NP remain hard even when only an approximate solution is required. See PCP theorem. The Intrinsic Computational Difficulty of Functions Description: First definition of the complexity class P. One of the founding papers of complexity theory. Algorithms "A machine program for theorem proving" Description: The DPLL algorithm. The basic algorithm for SAT and other NP-Complete problems. "A machine-oriented logic based on the resolution principle" Description: First description of resolution and unification used in automated theorem proving; used in Prolog and logic programming. "The traveling-salesman problem and minimum spanning trees" Description: The use of an algorithm for the minimum spanning tree as an approximation algorithm for the NP-Complete travelling salesman problem. Approximation algorithms became a common method for coping with NP-Complete problems. "A polynomial algorithm in linear programming" L. G. Khachiyan Soviet Mathematics - Doklady, vol. 20, pp. 191–194, 1979 Description: For a long time, there was no provably polynomial-time algorithm for the linear programming problem. 
Khachiyan was the first to provide an algorithm that was provably polynomial (and not merely fast enough most of the time, as previous algorithms were). Later, Narendra Karmarkar presented a faster algorithm in: Narendra Karmarkar, "A new polynomial time algorithm for linear programming", Combinatorica, vol 4, no. 4, p. 373–395, 1984. "Probabilistic algorithm for testing primality" Description: The paper presented the Miller–Rabin primality test and outlined the program of randomized algorithms. "Optimization by simulated annealing" Description: This article described simulated annealing, which is now a very common heuristic for NP-Complete problems. The Art of Computer Programming Donald Knuth Description: This monograph has four volumes covering popular algorithms. The algorithms are written in both English and MIX assembly language (or MMIX assembly language in more recent fascicles), which makes them both understandable and precise. However, the use of a low-level programming language frustrates some programmers more familiar with modern structured programming languages. Algorithms + Data Structures = Programs Niklaus Wirth Prentice Hall, 1976, Description: An early, influential book on algorithms and data structures, with implementations in Pascal. The Design and Analysis of Computer Algorithms Alfred V. Aho, John E. Hopcroft, and Jeffrey D. Ullman Addison-Wesley, 1974, Description: One of the standard texts on algorithms for the period of approximately 1975–1985. How to Solve It By Computer Description: Explains the whys of algorithms and data structures, the creative process, the line of reasoning, and the design factors behind innovative solutions. Algorithms Robert Sedgewick Addison-Wesley, 1983, Description: A very popular text on algorithms in the late 1980s. It was more accessible and readable (but more elementary) than Aho, Hopcroft, and Ullman. There are more recent editions. Introduction to Algorithms Thomas H. Cormen, Charles E. Leiserson, Ronald L. 
Rivest, and Clifford Stein 3rd Edition, MIT Press, 2009. Description: This textbook has become so popular that it is almost the de facto standard for teaching basic algorithms. The 1st edition (with the first three authors) was published in 1990, the 2nd edition in 2001, and the 3rd in 2009. Algorithmic information theory "On Tables of Random Numbers" Description: Proposed a computational and combinatorial approach to probability. "A formal theory of inductive inference" Ray Solomonoff Information and Control, vol. 7, pp. 1–22 and 224–254, 1964 Online copy: part I, part II Description: This was the beginning of algorithmic information theory and Kolmogorov complexity. Note that though Kolmogorov complexity is named after Andrey Kolmogorov, he said that the seeds of the idea are due to Ray Solomonoff. Andrey Kolmogorov contributed a lot to this area, but in later articles. "Algorithmic information theory" Description: An introduction to algorithmic information theory by one of the important people in the area. Information theory "A mathematical theory of communication" Description: This paper created the field of information theory. "Error detecting and error correcting codes" Description: In this paper, Hamming introduced the idea of error-correcting codes. He created the Hamming code and the Hamming distance and developed methods for code optimality proofs. "A method for the construction of minimum redundancy codes" Description: Huffman coding. "A universal algorithm for sequential data compression" Description: The LZ77 compression algorithm. Elements of Information Theory Description: A popular introduction to information theory. 
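The minimum-redundancy construction described in Huffman's paper above can be sketched briefly. The following is an illustrative implementation (the function name and representation are this sketch's own, not taken from any of the cited works):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a minimum-redundancy (Huffman) prefix code for `text`.

    Returns a dict mapping each symbol to its bit string.
    """
    freq = Counter(text)
    if len(freq) == 1:                      # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    # Heap of (weight, tiebreak, tree); a tree is a symbol or a pair of trees.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lightest trees, as in Huffman's algorithm.
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):         # internal node: extend both branches
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                               # leaf: record the symbol's code
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

Frequent symbols receive shorter bit strings, and no code word is a prefix of another, so encoded streams decode unambiguously.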
Formal verification Assigning Meaning to Programs Description: Robert Floyd's landmark paper Assigning Meanings to Programs introduces the method of inductive assertions and describes how a program annotated with first-order assertions may be shown to satisfy a pre- and post-condition specification – the paper also introduces the concepts of loop invariant and verification condition. An Axiomatic Basis for Computer Programming Description: Tony Hoare's paper An Axiomatic Basis for Computer Programming describes a set of inference (i.e. formal proof) rules for fragments of an Algol-like programming language described in terms of (what are now called) Hoare triples. Guarded Commands, Nondeterminacy and Formal Derivation of Programs Description: Edsger Dijkstra's paper Guarded Commands, Nondeterminacy and Formal Derivation of Programs (expanded by his 1976 postgraduate-level textbook A Discipline of Programming) proposes that, instead of formally verifying a program after it has been written (i.e. post facto), programs and their formal proofs should be developed hand-in-hand (using predicate transformers to progressively refine weakest pre-conditions), a method known as program (or formal) refinement (or derivation), or sometimes "correctness-by-construction". Proving Assertions about Parallel Programs Edward A. Ashcroft J. Comput. Syst. Sci. 10(1): 110–135 (1975) Description: The paper that introduced invariance proofs of concurrent programs. An Axiomatic Proof Technique for Parallel Programs I Susan S. Owicki, David Gries Acta Inform. 6: 319–340 (1976) Description: In this paper, along with the same authors' paper "Verifying Properties of Parallel Programs: An Axiomatic Approach. Commun. ACM 19(5): 279–285 (1976)", the axiomatic approach to the verification of parallel programs was presented. A Discipline of Programming Edsger W. 
Dijkstra 1976 Description: Edsger Dijkstra's classic postgraduate-level textbook A Discipline of Programming extends his earlier paper Guarded Commands, Nondeterminacy and Formal Derivation of Programs and firmly establishes the principle of formally deriving programs (and their proofs) from their specification. Denotational Semantics Joe Stoy 1977 Description: Joe Stoy's Denotational Semantics is the first (postgraduate-level) book-length exposition of the mathematical (or functional) approach to the formal semantics of programming languages (in contrast to the operational and algebraic approaches). The Temporal Logic of Programs Description: The use of temporal logic was suggested as a method for formal verification. Characterizing correctness properties of parallel programs using fixpoints (1980) E. Allen Emerson, Edmund M. Clarke In Proc. 7th International Colloquium on Automata, Languages and Programming, pages 169–181, 1980 Description: Model checking was introduced as a procedure to check correctness of concurrent programs. Communicating Sequential Processes (1978) C.A.R. Hoare 1978 Description: Tony Hoare's (original) communicating sequential processes (CSP) paper introduces the idea of concurrent processes (i.e. programs) that do not share variables but instead cooperate solely by exchanging synchronous messages. A Calculus of Communicating Systems Robin Milner 1980 Description: Robin Milner's A Calculus of Communicating Systems (CCS) paper describes a process algebra permitting systems of concurrent processes to be reasoned about formally, something that had not been possible for earlier models of concurrency (semaphores, critical sections, original CSP). 
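The inductive-assertion method of Floyd and Hoare described above can be illustrated with a small annotated program. This is an illustrative sketch (the function and its annotations are not from any of the cited works), with the precondition, loop invariant, and postcondition checked at run time:

```python
def int_sqrt(n):
    """Largest r with r*r <= n, annotated in the Floyd–Hoare style.

    Precondition:  n >= 0
    Postcondition: r*r <= n < (r+1)*(r+1)
    """
    assert n >= 0                       # {precondition}
    r = 0
    # Loop invariant: r*r <= n
    while (r + 1) * (r + 1) <= n:
        assert r * r <= n               # invariant holds on entry...
        r = r + 1
        assert r * r <= n               # ...and is preserved by the body
    # Invariant plus negated loop guard give the postcondition.
    assert r * r <= n < (r + 1) * (r + 1)
    return r
```

The invariant together with the negated loop guard implies the postcondition, which is exactly the reasoning pattern that Floyd's verification conditions and Hoare's triples formalize.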
Software Development: A Rigorous Approach Cliff Jones 1980 Description: Cliff Jones' textbook Software Development: A Rigorous Approach is the first full-length exposition of the Vienna Development Method (VDM), which had evolved (principally) at IBM's Vienna research lab over the previous decade and which combines the idea of program refinement as per Dijkstra with that of data refinement (or reification), whereby algebraically defined abstract data types are formally transformed into progressively more "concrete" representations. The Science of Programming David Gries 1981 Description: David Gries' textbook The Science of Programming describes Dijkstra's weakest-precondition method of formal program derivation, but in a much more accessible manner than Dijkstra's earlier A Discipline of Programming. It shows how to construct programs that work correctly (without bugs, other than from typing errors). It does this by showing how to use precondition and postcondition predicate expressions and program-proving techniques to guide the way programs are created. The examples in the book are all small-scale and clearly academic (as opposed to real-world). They emphasize basic algorithms, such as sorting and merging, and string manipulation. Subroutines (functions) are included, but object-oriented and functional programming environments are not addressed. Communicating Sequential Processes (1985) C.A.R. Hoare 1985 Description: Tony Hoare's Communicating Sequential Processes (CSP) textbook (currently the third most cited computer science reference of all time) presents an updated CSP model in which cooperating processes do not even have program variables and which, like CCS, permits systems of processes to be reasoned about formally. Linear logic (1987) Description: Girard's linear logic was a breakthrough in designing typing systems for sequential and concurrent computation, especially for resource-conscious typing systems. A Calculus of Mobile Processes (1989) R. 
Milner, J. Parrow, D. Walker 1989 Online version: Part 1 and Part 2 Description: This paper introduces the Pi-Calculus, a generalisation of CCS that allows process mobility. The calculus is extremely simple and has become the dominant paradigm in the theoretical study of programming languages, typing systems and program logics. The Z Notation: A Reference Manual Description: Mike Spivey's classic textbook The Z Notation: A Reference Manual summarises the formal specification language Z notation, which, although originated by Jean-Raymond Abrial, had evolved (principally) at Oxford University over the previous decade. Communication and Concurrency Robin Milner Prentice-Hall International, 1989 Description: Robin Milner's textbook Communication and Concurrency is a more accessible, although still technically advanced, exposition of his earlier CCS work. a Practical Theory of Programming Eric Hehner Springer, 1993, current edition available online Description: The up-to-date version of predicative programming, the basis for C.A.R. Hoare's UTP. Among the simplest and most comprehensive formal methods.
15345161
https://en.wikipedia.org/wiki/Software%20effect%20processor
Software effect processor
A software effect processor is a computer program which is able to modify the signal coming from a digital audio source in real time. Principle of operation The digital audio signal, whose origin may be analog (by conversion to digital) or an already digital source (such as an audio file or a software synthesizer), is stored in temporary allotments of computer memory called buffers. Once there, the software effect processor modifies the signal according to a specific algorithm, which creates the desired effect. After this operation, the signal may be converted from digital to analog and sent to an audible output, stored in digital form for later reproduction or editing, or sent to other software effect processors for additional processing. Latency The larger the buffer is, the more time it takes to play the audio data sent for playback. Large buffers increase the time required before the next buffer can be played; this delay is usually called latency. Every system has certain limitations: buffers small enough to give negligible latency cannot be processed smoothly by the computer, so practical sizes start at about 32 samples. Processor load does not affect latency; once a certain buffer size is set, the latency is constant. With very high processor loads, however, the buffer is not filled with new sound in time for playback, and the sound drops out. Increasing the buffer size or quitting other applications helps to keep playback smooth. Drivers Microsoft Windows The default Windows drivers are not optimized for low-latency effect processing. As a solution, Audio Stream Input/Output (ASIO) was created. ASIO is supported by most professional music applications, and most sound cards directed at this market support ASIO. If the hardware manufacturer doesn't provide ASIO drivers, there are free ASIO alternatives which can be used with any audio interface. ASIO drivers can also be emulated; in this case the driver name is ASIO Multimedia. 
However, the latency when using these drivers is very high. Apple Mac OS X All Mac-compatible hardware uses Core Audio drivers, so software effect processors can work with low latency and good performance. See also List of music software
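The buffer-size/latency relationship described in the Latency section above can be made concrete. In an idealized single-buffer model (a simplification; real drivers typically add further buffering stages), the latency contributed by one buffer is its length divided by the sample rate, and an "effect" is just a transformation applied to each buffer. The function names below are this sketch's own:

```python
def buffer_latency_ms(buffer_samples, sample_rate_hz):
    """Idealized latency of one buffer: the time it takes to play it."""
    return 1000.0 * buffer_samples / sample_rate_hz

def apply_gain(buffer, gain):
    """A trivial software 'effect': scale every sample in the buffer."""
    return [sample * gain for sample in buffer]

# At 44.1 kHz, common buffer sizes give roughly:
#   32 samples  -> 0.7 ms      256 samples  -> 5.8 ms
#   64 samples  -> 1.5 ms      1024 samples -> 23.2 ms
```

This is why about 32 samples is described above as a practical lower bound: below roughly a millisecond of headroom, the computer cannot reliably refill the buffer in time.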
2652103
https://en.wikipedia.org/wiki/Tiling%20window%20manager
Tiling window manager
In computing, a tiling window manager is a window manager that organizes the screen into mutually non-overlapping frames, as opposed to the more common approach of coordinate-based stacking of overlapping objects (windows) that tries to fully emulate the desktop metaphor. History Xerox PARC The first Xerox Star system (released in 1981) tiled application windows, but allowed dialogs and property windows to overlap. Later, Xerox PARC also developed CEDAR (released in 1982), the first windowing system using a tiled window manager. Various vendors Next, in 1983, came Andrew WM, a complete tiled windowing system later replaced by X11. Microsoft's Windows 1.0 (released in 1985) also used tiling (see sections below). In 1986 came Digital Research's GEM 2.0, a windowing system for CP/M which used tiling by default. One of the early tiling WMs (created in 1988) was Siemens' RTL, still a textbook example today for its algorithms for automated window scaling, placement and arrangement, and (de)iconification. RTL ran on X11R2 and R3, mainly on the "native" Siemens systems, e.g., SINIX. Its features are described by its promotional video. The Andrew Project (AP or tAP) was a desktop client system (like early GNOME) for X with a tiling and overlapping window manager. OS X 10.11 El Capitan, released in September 2015, introduced new window management features such as a full-screen split view, limited to two app windows side by side in full screen, activated by holding down the full-screen button in the upper-left corner of a window. Tiling window managers Microsoft Windows The built-in Microsoft Windows window manager has, since Windows 95, followed the traditional stacking approach by default. It can also act as a rudimentary tiling window manager. To tile windows, the user selects them in the taskbar and uses the context menu choice Tile Vertically or Tile Horizontally. 
Choosing Tile Vertically will cause the windows to tile horizontally but take on a vertical shape, while choosing Tile Horizontally will cause the windows to tile vertically but take on a horizontal shape. These options were later changed in Windows Vista to Show Windows Side by Side and Show Windows Stacked, respectively. Windows 7 added "Aero Snap", which gives the ability to drag windows to either side of the screen to create a simple side-by-side tiled layout, or to the top of the screen to maximize. Windows 8 introduced Windows Store apps; unlike desktop applications, they did not operate in a window, and could only run in full screen, or "snapped" as a sidebar alongside another app, or the desktop environment. Along with allowing Windows Store apps to run in a traditional window, Windows 10 enhanced the snapping features introduced in Windows 7 by allowing windows to be tiled into screen quadrants by dragging them to the corner, and adding "Snap Assist" — which prompts the user to select the application they want to occupy the other half of the screen when they snap a window to one half of the screen, and allows the user to automatically resize both windows at once by dragging a handle in the center of the screen. Windows 10 also supports FancyZones, a more complete tiling window manager facility allowing customized tiling zones and greater user control, configured through Microsoft PowerToys. History The first version (Windows 1.0) featured a tiling window manager, partly because of litigation by Apple claiming ownership of the overlapping-window desktop metaphor. But due to complaints, the next version (Windows 2.0) followed the desktop metaphor. All later versions of the operating system stuck to this approach as the default behaviour. List of tiling window managers for Windows AquaSnap – made by Nurgo Software. Freeware, with an optional "Professional" license. Amethyst for Windows – dynamic tiling window manager along the lines of Amethyst for macOS. 
bug.n – open source, configurable tiling window manager built as an AutoHotKey script and licensed under the GNU GPL. MaxTo – customizable grid, global hotkeys. Works with elevated applications, 32-bit and 64-bit applications, and multiple monitors. WS Grid+ – move and/or resize windows using a grid selection system combining benefits of floating, stacking and tiling. It provides keyboard/mouse shortcuts to instantly move and resize a window. Stack – customizable grid (XAML), global hotkeys and/or middle mouse button. Supports HiDPI and multiple monitors. Plumb – lightweight tiling manager with support for multiple versions of Windows. Supports HiDPI monitors, keyboard hotkeys and customization of hotkeys (XAML). workspacer – an MIT-licensed tiling window manager for Windows 10 that aims to be fast and compatible. Written and configurable using C#. dwm-win32 – port of dwm's general functionality to win32. Is MIT licensed and is configured by editing a config header in the same style as dwm. X Window System In the X Window System, the window manager is a separate program. X itself enforces no specific window management approach and remains usable even without any window manager. The current X protocol version, X11, explicitly mentions the possibility of tiling window managers. The Siemens RTL Tiled Window Manager (released in 1988) was the first to implement automatic placement/sizing strategies. Another tiling window manager from this period was the Cambridge Window Manager developed by IBM's Academic Information System group. In 2000, both larswm and Ion released a first version. List of tiling window managers for X awesome – a dwm derivative with window tiling, floating and tagging, written in C and configurable and extensible in Lua. It was the first WM to be ported from Xlib to XCB, and supports D-Bus, pango, XRandR, Xinerama. bspwm – a small tiling window manager that, similarly to yabai, represents windows as the leaves of a full binary tree. 
It does not handle key-binds on its own, requiring another program (e.g. sxhkd) to translate input to X events. Compiz – a compositing window manager available for use without leaving familiar interfaces such as those of GNOME, KDE or MATE. One of its plugins (called Grid) allows the user to configure several keybindings to move windows to any corner, with five different lengths. There are also options to configure default placement for specific windows. The plugins can be configured through the Compiz Config Settings Manager (CCSM). dwm – allows for switching tiling layouts by clicking a textual ASCII-art 'icon' in the status bar. The default is a main area + stacking area arrangement, represented by a []= character glyph. Other standard layouts are a single-window "monocle" mode represented by an M and a non-tiling floating layout that permits windows to be moved and resized, represented by a fish-like ><>. Third-party patches exist to add a golden-section-based Fibonacci layout, horizontal and vertical row-based tiling, or a grid layout. The keyboard-driven menu utility "dmenu", developed for use with dwm, is used with other tiling WMs such as xmonad, and sometimes also with other "light-weight" software like Openbox and uzbl. herbstluftwm – a manual tiling window manager (similar to i3 or Sway) that uses the concept of monitor-independent tags as workspaces. Exactly one tag can be viewed on a monitor, with each tag containing its own layout. Like i3 and Sway, herbstluftwm is configured at runtime via IPC calls from herbstclient. i3 – a built-from-scratch window manager, based on wmii. It has vi-like keybindings, and treats extra monitors as extra workspaces, meaning that windows can be moved between monitors easily. It allows vertical and horizontal splits, tabbed and stacked layouts, and parent containers. It can be controlled entirely from the keyboard, but a mouse can also be used. 
Ion – combines tiling with a tabbing interface: the display is manually split in non-overlapping regions (frames). Each frame can contain one or more windows. Only one of these windows is visible and fills the entire frame. Larswm – implements a form of dynamic tiling: the display is vertically split in two regions (tracks). The left track is filled with a single window. The right track contains all other windows stacked on top of each other. LeftWM – a tiling window manager based on theming and supporting large monitors such as ultrawides. Qtile – a tiling window manager written, configurable and extensible in Python. Ratpoison – a keyboard-driven GNU Screen for X. spectrwm – a dynamic tiling and reparenting window manager for X11. It tries to stay out of the way so that valuable screen real estate can be used for more important content. It strives to be small, compact and fast. Formerly called "scrotwm". StumpWM – a keyboard-driven offshoot of ratpoison supporting multiple displays (e.g. xrandr) that can be customized on the fly in Common Lisp. It uses Emacs-compatible keybindings by default. wmii (window manager improved 2) – supports tiling and stacking window management with extended keyboard, mouse, and filesystem-based remote control, replacing the workspace paradigm with a new tagging approach. The default configuration uses keystrokes derived from those of the vi text editor. The window manager offers extensive configuration through a virtual filesystem using the 9P filesystem protocol, similar to that offered by Plan 9 from Bell Labs. Every window, tag, and column is represented in the virtual filesystem, and windows are controlled by manipulating their file objects (in fact, the configuration file is just a script interfacing the virtual files). This RPC system allows many different configuration styles, including those provided in the base distribution in plan9port and Bourne shell. The latest release, 3.9, also includes configurations in Python and Ruby. 
The latest release supports Xinerama, shipping with its own keyboard-based menu program called wimenu, featuring history and programmable completion. xmonad – an extensible WM written in Haskell, which was both influenced by and has since influenced dwm. Wayland Wayland is a new windowing system with the aim of replacing the X Window System. There are only a few tiling managers that support Wayland natively. List of tiling window managers for Wayland Sway – a drop-in replacement for the i3 window manager, but for Wayland instead of X11. It works with existing i3 configurations and supports most of i3's features, plus a few extras. Way Cooler – an unmaintained Wayland compositor for the Awesome window manager. It is written in C and, like Awesome, configurable using Lua, and extendable with D-Bus. River – a dynamic tiling Wayland compositor with flexible runtime configuration; it is maintained and regularly updated. Newm – a Wayland compositor written with laptops and touchpads in mind. Cagebreak – a tiling compositor for Wayland, based on Cage and inspired by Ratpoison, easily controlled through the keyboard and a Unix domain socket. Others The Oberon operating and programming system from ETH Zurich includes a tiling window manager. The Acme programmer's editor / windowing system / shell program in Plan 9 is a tiling window manager. The Samsung Galaxy S3, S4, Note II and Note 3 smartphones, running a custom variant of Android 4, have a multi-window feature that allows the user to tile two apps on the device's screen. This feature was integrated into stock Android as of version 7.0 "Nougat". The Pop Shell extension from Pop!_OS can add tiling window manager functionality to GNOME. The Amethyst window manager by ianyh provides window tiling for macOS and was inspired by xmonad. 
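The full-binary-tree representation mentioned in the bspwm entry above can be sketched in a few lines. This illustrative function (this sketch's own, not bspwm's actual code) maps a split tree to screen rectangles:

```python
def layout(tree, x, y, w, h, ratio=0.5):
    """Map a binary split tree to window rectangles.

    A leaf is a window name; an internal node is ('h', left, right)
    for a side-by-side split or ('v', top, bottom) for a stacked one.
    Returns {window: (x, y, w, h)}.
    """
    if isinstance(tree, str):               # leaf: the window gets the rectangle
        return {tree: (x, y, w, h)}
    kind, first, second = tree
    rects = {}
    if kind == 'h':                         # split side by side
        lw = int(w * ratio)
        rects.update(layout(first, x, y, lw, h, ratio))
        rects.update(layout(second, x + lw, y, w - lw, h, ratio))
    else:                                   # split top and bottom
        th = int(h * ratio)
        rects.update(layout(first, x, y, w, th, ratio))
        rects.update(layout(second, x, y + th, w, h - th, ratio))
    return rects

# A master window on the left, two stacked windows on the right:
tiles = layout(('h', 'editor', ('v', 'terminal', 'browser')), 0, 0, 1920, 1080)
```

Each internal node splits its rectangle between two subtrees, so the leaves (the windows) always tile the screen without gaps or overlaps.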
Tiling applications Although tiling is not the default mode of window managers on any widely used platform, most applications already display multiple functions internally in a similar manner. Examples include email clients, IDEs, web browsers, and contextual help in Microsoft Office. The main windows of these applications are divided into "panes" for the various displays. The panes are usually separated by a draggable divider to allow resizing. Paned windows are a common way to implement a master–detail interface. Developed since the 1970s, the Emacs text editor contains one of the earliest implementations of tiling. In addition, HTML frames can be seen as a markup language-based implementation of tiling. The tiling window manager extends this usefulness beyond multiple functions within an application, to multiple applications within a desktop. The tabbed document interface can be a useful adjunct to tiling, as it avoids having multiple window tiles on screen for the same function. See also Split screen (computer graphics) Integrated development environment style interface References External links Comparison of Tiling Window Managers — Arch Linux Wiki
10057294
https://en.wikipedia.org/wiki/Economy%20of%20metropolitan%20Detroit
Economy of metropolitan Detroit
The metropolitan area surrounding and including Detroit, Michigan, is a ten-county area with a population of over 5.9 million, a workforce of 2.6 million, and about 347,000 businesses. Detroit's six-county Metropolitan Statistical Area has a population of about 4.3 million, a workforce of about 2.1 million, and a gross metropolitan product of $200.9 billion. Detroit's urban area has a population of 3.9 million. A 2005 PricewaterhouseCoopers study estimated that Detroit's urban area had a gross domestic product of $203 billion. About 180,500 people work in downtown Detroit, comprising one-fifth of the city's employment base. Metro Detroit has propelled Michigan's national ranking in emerging technology fields such as life sciences, information technology, and advanced manufacturing; Michigan ranks fourth in the U.S. in high-tech employment with 568,000 high-tech workers, including 70,000 in the automotive industry. Michigan typically ranks third or fourth in overall research and development expenditures in the United States. Metro Detroit is the second-largest source of architectural and engineering job opportunities in the U.S. Detroit is known as the automobile capital of the world, with the domestic auto industry primarily headquartered in Metro Detroit. As of 2003, the Alliance of Automobile Manufacturers claimed that new vehicle production, sales, and jobs related to automobile use accounted for one of every ten jobs in the United States. In April 2008, metropolitan Detroit's unemployment rate was 6.9 percent; in November 2012, it was 7.9 percent. Economic issues include the city of Detroit's unemployment rate, which stood at 15.8 percent in April 2012; the suburbs typically have low unemployment. The metropolitan economy began an economic recovery in 2010. Real estate and corporate location Driven by the metro area economy, Michigan ranked second in the U.S. in 2004 for new corporate facilities and expansions. 
From 1997 to 2004, Michigan was the only state to top the 10,000 mark for the number of major new developments. Among metro areas with more than one million people, Metro Detroit was fourth in the U.S. from 2007 to 2009 for new corporate facilities and expansions. Metro Detroit has one of the nation's largest office markets. Major inter-connected office complexes include the Renaissance Center, the Southfield Town Center, and the Cadillac Place joined with the Fisher Building in the historic New Center area. The metro area's resilience has kept the state's economy growing in spite of difficulties. From the third quarter of 2006 to the fourth quarter of 2009, Metro Detroit's residential resale housing market struggled along with the residential real estate market across the United States, creating opportunities for buyers. The Case–Shiller index projected Metro Detroit as the nation's third-strongest housing market by 2014, attracting interest from international investors. Among the top fifty metropolitan areas, Detroit ranked as the third most affordable in the United States in a Forbes 2011 report. Detroit was among the top five cities in the U.S. for job growth from 2010 to 2012. A 2011 economic study showed Metro Detroit with the highest share of employment (13.7%) in the technology sectors in the U.S. The state repealed its business tax in 2011 and replaced it with a 6% corporate income tax, which substantially reduced taxes on business. Michigan became the 24th right-to-work state in the U.S. in 2012. Metro Detroit is home to highly successful real estate developers. Area suburbs are among the more affluent in the U.S. Some of the newer multimillion-dollar estates in the metro area include those of the Turtle Lake development in Bloomfield Hills by Victor International. The region is the headquarters for Pulte Homes, one of the USA's largest home builders, and Taubman Centers, one of the USA's largest shopping mall developers. 
There is a full range of retail shopping centers, from upscale stores to discount chains. In 2007, Bank of America, with regional offices in Troy, announced that it would commit $25 billion to community development in Michigan. The Cool Cities Initiative is an innovative reinvestment strategy for America's northern cities, begun by Michigan leaders to rebuild inner cities and downtowns. Immigration continues to play a role in the region's projected growth, with the population of Detroit-Ann Arbor-Flint (CMSA) estimated to reach 6,191,000 by 2025. Cities with existing infrastructure like Detroit are equipped to accommodate future increases in projected U.S. population growth. A 2007 report showed the city of Detroit's average household income at $47,962. Redevelopment of historic buildings is a priority for the city. OnStar, Ally Financial, Compuware, Quicken Loans, and Blue Cross Blue Shield Association have brought an increased employment base to downtown Detroit. In the decade leading up to 2006, downtown Detroit gained more than $15 billion in new investment from private and public sectors. The Detroit Riverfront Conservancy has secured the $500 million investment for Detroit International Riverfront development through a series of public and private grants, completing the first phase of the five-and-a-half-mile (8.8 km) parkway along the riverfront east from Hart Plaza and the Renaissance Center to the Belle Isle Bridge, with phase II running west of Hart Plaza to the Ambassador Bridge. In 2010, Henry Ford Health System and Vanguard Health Systems announced substantial renovations and expansions in New Center and Midtown Detroit. Lifestyles for rising professionals in Detroit reflect those of other major cities. A 2007 study found that Detroit's new downtown residents are predominantly young professionals (57 percent are ages 25–34, 45 percent have bachelor's degrees, and 34 percent have a master's or professional degree). 
This dynamic is luring many younger residents to the downtown area. Some are choosing to live in the grandiose mansions of Grosse Pointe in order to be closer to the urban scene. The river east development is a plan to invest billions of dollars in new mixed-use residential, commercial, and retail space for downtown Detroit to serve the people where they work and live. To spearhead the development, Michigan created the William G. Milliken State Park and Harbor downtown along the Detroit International Riverfront. In 2007, downtown Detroit was named among the best big-city neighborhoods in which to retire by CNN Money editors. In 2008, Troy, Michigan, ranked as the fourth-most affordable U.S. city with a median household income of $78,800. Oakland County is the fourth-wealthiest county in the United States among counties with more than one million people. Redevelopment of the Fort Shelby Hotel and the Westin Book-Cadillac Hotel has spurred economic growth downtown. The Cobo Hall convention and exhibit facility, which hosts the North American International Auto Show, began a nearly $300 million renovation to be completed in 2014. Development of Detroit's west river area and its Michigan Central Station are the next important challenges for the city. Finance Metro Detroit is among the top five financial centers in the U.S., with offices of all of the Big Four accounting firms. The area's major financial service employers include Quicken Loans, Ally Financial, Ford Motor Credit Company, Bank of America, Comerica, PNC Financial Services, Fifth Third Bank, JP Morgan Chase, GE Capital, TD Auto Finance, Deloitte Touche, KPMG, Ernst & Young, PricewaterhouseCoopers, Baker Tilly, Plante Moran, Robert Half International, and Raymond James. Financial and investment executives have diverse employment opportunities in metropolitan Detroit. Ally Financial, headquartered at Tower 200 of the Renaissance Center, is among the largest holders of mortgages in the United States. 
Detroit-based Quicken Loans is the fifth-largest retail home mortgage lender in the U.S. and the largest online. The metropolitan area has a range of venture capital firms which finance business start-ups and acquisitions. The area's real estate investment trusts (REITs), which include Taubman Centers, are an important part of the investment community and own and operate many major shopping malls across the U.S. Pulte Homes, one of the largest home builders in the U.S., has its own mortgage company. As another example, General Motors invests its $85 billion pension trust. Detroit's historic Penobscot Building in the downtown financial district is in the heart of the city's wireless Internet zone and fiber-optic network. Fifth Third Bank, which maintains its regional headquarters at Tower 1000 of the Southfield Town Center, announced a $100 million expansion in the Metro Detroit area in order to take market share from Dallas-based rival Comerica, which also maintains a large presence in Michigan. Fifth Third announced it would create 350 new jobs in the area and open 30 to 40 new branches. In 2009, Quicken Loans more than doubled its mortgage volume from the previous year to $25 billion, experiencing significant growth in market share. In 2010, Quicken began a new division within the company to provide mortgage services to community banks nationwide. In 2011, Quicken Loans relocated its headquarters to downtown Detroit, consolidating about 4,000 of its suburban employees in a move considered to be of high importance by city planners seeking to reestablish the historic downtown. In 2011, Blue Cross Blue Shield of Michigan consolidated 6,000 of its employees in downtown Detroit, relocating 3,000 to Towers 500 and 600 of the Renaissance Center from Southfield. 
Information technology

Metro Detroit accounts for the state's national ranking in emerging technology fields such as life sciences, information technology, and advanced manufacturing; Metro Detroit's technology sector is fifth in the U.S. for total employment and fourth in the percentage of employment concentrated within the sector. In 2010, the Detroit area became the fastest-growing region in the U.S. for high-technology jobs. Downtown Detroit maintains a wireless Internet zone and has seen an influx of information technology jobs. A report by the Silicon Valley-based TechNet group found Michigan to be the leading state for stimulating demand for broadband during the early 2000s. The Michigan Information Technology Center provides education, support services, and conferencing facilities for the region's information technology companies. The metro area is home to high-tech business incubators such as the Michigan Security Network, a consortium which coordinates business growth in the cybersecurity, biodefense, and border security sectors. Some of the metro area's information technology and software companies with a major presence or headquarters include Compuware, HP Enterprise Services, IBM, Google, General Electric, Unisys, Fiserv, Covansys, and ProQuest. HP Enterprise Services makes Metro Detroit its regional headquarters and one of its largest global employment locations. On June 26, 2009, General Electric announced that it would create software at a new advanced manufacturing and technology center in Van Buren Township. Comcast and Verizon maintain a large presence in the area. OnStar, based in the Renaissance Center, is also a source of growth. Chrysler's largest corporate facility is its U.S. headquarters and technology center in the Detroit suburb of Auburn Hills. VisionIT and Kelly IT Resources are other large employers headquartered in the metro area, filling a wide range of needs.
Five of the world's twenty largest employers began in Metro Detroit. On June 30, 2015, Quicken Loans announced the opening of its new state-of-the-art, 66,000-square-foot Technical Center in Corktown. The new facility will feature two 10,000-square-foot server rooms in addition to training, office, meeting, and technical support space. Half of the data center, including one server room, will be occupied by Quicken Loans' technology team. An equal-sized 33,000-square-foot portion of the building, including the second 10,000-square-foot server room, is available for lease.

Higher education and research

Metro Detroit has diversified its economic base through initiatives in emerging technologies. Michigan typically ranks third or fourth in overall research and development (R&D) expenditures in the United States. In 2011, Detroit received the first U.S. Patent and Trademark Office satellite office outside the Washington, D.C., area. Metro area universities provide a source of top talent for the region. The University of Michigan in Ann Arbor is one of the world's leading research institutions and is among the most highly ranked institutions in the U.S. The University of Michigan schools of engineering, medicine, business, and law are consistently among the top-ranked in the United States. In 2002, the state constructed the NextEnergy Center just north of Wayne State University to focus on fuel cell development and alternative energy. The area is home to many post-secondary institutions of higher learning and research, including: Baker College, Carnegie Institute, Cleary University, Cranbrook Educational Community, Eastern Michigan University, Lawrence Technological University, Oakland University, Thomas M. Cooley Law School (Rochester), Walsh College, Rochester College, Madonna University, Marygrove College, University of Detroit Mercy, the University of Michigan, and Wayne State University.
On the Canadian side of the border, Windsor's two post-secondary institutions have partnered with automakers to open high-tech research and training facilities. The University of Windsor is home to the University of Windsor/DaimlerChrysler Canada Automotive Research and Development Centre. St. Clair College has the Ford Centre for Excellence in Manufacturing.

Health care and biomedical

The Metro Detroit area is one of the leading health care economies in the U.S. according to a 2003 study measuring health care industry components, with the region's hospital sector ranking fourth in the nation. A 2006 economic impact report showed that the metropolitan region supported 245,379 direct health care jobs with an additional 120,408 indirect and induced jobs. Major health system networks in the region include the University of Michigan, Henry Ford, Beaumont, Detroit Medical Center, St. John, Oakwood, St. Joseph, Karmanos Cancer Center, and the John D. Dingell and Ann Arbor Veterans Affairs Medical Centers. Beginning in 2010, Oakland University in Rochester opened Michigan's fourth medical school in a partnership with Beaumont Hospitals. The school will boost the region's economy with jobs in the life sciences, research, clinical trials, and medicine. Wayne State University in Detroit has the largest single-campus medical school in the United States, and the nation's fourth-largest medical school overall. Detroit Medical Center formally became a part of Vanguard Health Systems on December 30, 2010, as a for-profit corporation. Vanguard has agreed to invest nearly $1.5 billion in the Detroit Medical Center complex, which will include $417 million to retire debts, at least $350 million in capital expenditures, and an additional $500 million for new capital investment.
In January 2009, the University of Michigan established the North Campus Research Complex through its purchase of the former Pfizer research facility, with 30 buildings in Ann Arbor, in order to create about 2,000 jobs through commercial partnerships. The Community Foundation for Southeast Michigan administers $100 million of private foundation grants for the region's New Economy Initiative to spur investment in a variety of metro area projects. A BioEnterprise Midwest Healthcare Venture report found that the Detroit–Ann Arbor region attracted $312 million in new biotechnology venture capital investments from 2006 to 2009. In 2012, two major construction projects were begun in New Center: the Henry Ford Health System started the first phase of its South Campus site, a $500 million, 300-acre revitalization project, with the construction of a new $30 million, 275,000-square-foot Medical Distribution Center for Cardinal Health, Inc., and Wayne State University started construction on a new $93 million, 207,000-square-foot Integrative Biosciences Center (IBio). As many as 500 researchers and staff will work out of the IBio Center.

Manufacturing and industry

As the world's traditional automotive center, Metro Detroit is headquarters to America's "Big Three" automakers: General Motors, Ford Motor Company, and Chrysler. Virtually every major global automaker has a presence in the area, including technology and design centers. Oakland County's "Automation Alley" has over 1,800 of the world's advanced technology companies, with Metro Detroit ranking fifth in the U.S. in technology sector employment. There are about 4,000 factories in the area. The automotive headquarters for the Society of Automotive Engineers (SAE) is in the suburb of Troy. OnStar and Ally Financial are sources of growth.
In spite of foreign competition for market share, Detroit's automakers have continued to gain volume from previous decades with the expansion of the American and global automotive markets. Manufacturing in the state grew 6.6% from 2001 to 2006. In 2008, an economic and financial crisis impacted global auto industry sales. For 2010, the domestic automakers reported significant profits, indicating the beginning of a rebound. The sales revenue from just one of Detroit's automakers exceeds the combined total for all of the top companies in many major U.S. cities. A Center for Automotive Research (CAR) study estimated that tax revenue generated by the automotive industry in the United States for a single year, 2010, amounted to $91.5 billion in state and local tax revenue and an additional $43 billion in federal tax revenue. The area includes a variety of manufacturers and is an important component of U.S. national security. United States Army TACOM Life Cycle Management Command (TACOM) is headquartered in Metro Detroit, together with Selfridge Air National Guard Base. The region has important defense contractors such as General Dynamics. The area is home to Rofin-Sinar, a leading maker of lasers used for industrial processes. Advanced robotics is another important segment in the metro area. On June 27, 2009, General Electric announced plans to build a new $100 million center for advanced manufacturing technology and software in Van Buren Township in Wayne County, expected to employ 1,200 people with pay in the range of $100,000 per year. Dow Chemical is a significant company in the metro region. The metro region's large energy producers include DTE and CMS. With its major port status, the city's infrastructure accommodates heavy industry. Marathon Oil Company maintains a large refinery in Detroit, expanded to refine oil sands from Canada.
Lafarge's cement distribution facility, constructed at the city's Springwells Industrial Park in 2005, includes North America's largest cement silo. Detroit's automakers are building vehicles like the Chevrolet Volt plug-in hybrid and the Buick LaCrosse with eAssist. In 2006, Ford announced a dramatic increase in production of its hybrid gas-electric models. Ford and GM have also promoted E85 ethanol-capable flexible-fuel vehicles as a viable alternative to gasoline. General Motors has invested heavily in fuel cell-equipped vehicles, while Chrysler is focusing much of its research and development on biodiesel. Two days after the September 11, 2001, attacks, GM announced it had developed the world's most powerful fuel cell stack, capable of powering large commercial vehicles. In 2002, the state of Michigan established NextEnergy, a non-profit corporation whose purpose is to enable commercialization of various energy technologies, especially hydrogen fuel cells. Its main complex is located north of Wayne State University. In August 2009, Michigan and Detroit's auto industry received $1.36 billion in grants from the U.S. Department of Energy for the manufacture of lithium-ion batteries, which are expected to generate 6,800 immediate jobs and employ 40,000 in the state by 2020. On quality, Cadillac outscored all other luxury automakers in two of three quality surveys by AutoPacific, Strategic Vision, and J.D. Power in 2003. Ford led all other automakers in the 2007 J.D. Power Initial Quality survey.

Trade

The Greater Detroit Foreign Trade Zone (GDFTZ) was created in 1981 through the U.S. Department of Commerce to allow for the reduction of taxes across borders and to attract, retain, and facilitate international trade. In 2011, Metro Detroit ranked as the fourth-largest export market in the United States. Infrastructure is an important component of the metro area economy.
Detroit has an extensive toll-free expressway system which, together with its status as a major port city, provides advantages to its location as a global business center. There are no toll roads in Michigan. Metro Detroit is the country's number-one exporting region and busiest commercial port. Detroit is at the center of the Great Lakes Megalopolis. The Ambassador Bridge is the busiest commercial border crossing in North America, carrying 27 percent of the total trade between the U.S. and Canada. More than fifteen million people and ten million vehicles cross the Ambassador Bridge and the Detroit–Windsor Tunnel annually. A 2004 Border Transportation Partnership study showed that 150,000 jobs in the Detroit–Windsor region and $13 billion in annual production depend on Detroit's international border crossing. The Detroit River International Crossing project calls for a second bridge to be built across the Detroit River to facilitate increased trade and ease of travel. Many people commute across the Detroit–Windsor international border daily. Professions identified in the Canada–United States Free Trade Agreement, which began in 1988, are permitted TN visas for legal work in the United States and Canada, creating freedom of labor movement. TN status is recognized in the North American Free Trade Agreement (NAFTA), which began in 1994. As an example, a large number of nurses in Detroit hospitals also live in Windsor. The Quebec City–Windsor Corridor contains over 18 million people, with 51 percent of the Canadian population and three of the four largest metropolitan areas in Canada, according to the 2001 Census. Headquartered in Detroit, the international law firm of Miller, Canfield, Paddock & Stone P.L.C. is one of the largest in the United States. Metro area business leaders belong to the Detroit Economic Club, headquartered at 211 West Fort Street. The U.S. dollar is readily accepted as currency in Windsor.
Transportation

Metro Detroit offers a comprehensive system of transit services for the central city and region. The Michigan Department of Transportation (MDOT) administers the advanced network of freeways in metropolitan Detroit and Michigan. The region offers mass transit with bus services provided jointly by the Detroit Department of Transportation (DDOT) and the Suburban Mobility Authority for Regional Transportation (SMART) through a cooperative service and fare agreement. Cross-border service between the downtown areas of Windsor and Detroit is provided by Transit Windsor via the Tunnel Bus. An elevated rail system, known as the People Mover, operates daily through a 2.9-mile (4.6 km) loop in the downtown area. Amtrak provides service to Detroit, operating between Chicago, Illinois, and Pontiac. Greyhound provides nationwide bus service to Detroit from its station on Howard Street near Michigan Avenue. A proposed SEMCOG commuter rail service could link Ann Arbor, Detroit Metropolitan Airport, Ypsilanti, The Henry Ford, Dearborn, and Detroit's New Center Amtrak station. As a major U.S. port, Detroit is an important center for transportation and logistics employment, including its aviation, rail, truck, and ship docking facilities. Detroit maintains a cruise ship dock and passenger terminal on Hart Plaza adjacent to the Renaissance Center. Commercial vessels dock at Michigan's 38 deep-water ports, which provide access to the Great Lakes Waterway and the Saint Lawrence Seaway. Detroit Metropolitan Airport (DTW) is one of America's largest and most recently modernized facilities, with six major runways, Boeing 747 maintenance facilities, and an attached Westin Hotel and Conference Center. Located in nearby Romulus, DTW is metro Detroit's principal airport and is a hub for Delta Air Lines and Spirit Airlines. Bishop International Airport in Flint and Toledo Express Airport in Toledo, Ohio, are other commercial passenger airports. Coleman A.
Young International Airport (DET), commonly called Detroit City Airport, is on Detroit's northeast side and offers charter service. Willow Run Airport in Ypsilanti serves cargo and general aviation. One proposed economic development strategy is an aerotropolis, a concept utilizing Detroit Metropolitan Airport as a central business district. Detroit Renaissance, now known as Business Leaders for Michigan, announced an eleven-point strategy to transform the region's economy which includes development of the aerotropolis. The U.S. Department of Transportation has awarded $244 million in grants for high-speed rail upgrades between Chicago and Detroit. A consortium of investors including the Canadian Pacific Railway has proposed a new, larger rail tunnel under the Detroit River to accommodate double-stacked freight cars, which could open in 2015. With the new tunnel potentially emerging near the Michigan Central Station, a redeveloped station could play a role as a trade inspection facility.

Tourism

Tourism in metropolitan Detroit is an important economic factor, comprising nine percent of the area's two million jobs. About 15.9 million people visit the area annually, spending an estimated $4.8 billion. Besides casino gaming, the region's leading attraction is The Henry Ford, America's largest indoor-outdoor museum complex. The Detroit International Riverfront links the Renaissance Center to a series of venues, parks, restaurants, and hotels by a riverfront walkway. The region hosts large multi-day events with crowds of hundreds of thousands to over three million people for annual events such as the Windsor-Detroit International Freedom Festival, the North American International Auto Show, and the Motown Winter Blast on Campus Martius Park. The city's Midtown and New Center areas, anchored by Wayne State University, attract millions of visitors each year to their museums and cultural centers; for example, the Detroit Festival of the Arts in Midtown draws about 350,000 people.
Mall developers consider the metro area's Somerset Collection to be among the nation's top privately held mall properties, with 2004 gross annual sales of about $600 million and sales per square foot of $620, compared to the national average of $341. The area has hosted several major sporting events such as Super Bowl XL; in fact, Detroit is the only northern city to have hosted two Super Bowls. Ford Field hosted the 2009 NCAA Final Four; in April 2007 it hosted WrestleMania 23. Major League Baseball's 2005 All-Star Game was held at Comerica Park, as were 2006 World Series games due to the Detroit Tigers' success. Metro Detroit is one of thirteen U.S. cities with teams from four major sports. The area's network of Huron-Clinton Metroparks receives about nine million visitors annually. About 5.9 million people live in the Detroit–Windsor region, making it one of the largest metropolitan areas in North America. An estimated 46 million people live within a 300-mile (480 km) radius of Metro Detroit. Thus, the metro area has many opportunities for growth in tourism, with great potential for development and expansion. The region's abundance of natural lakes and coastal landscape presents investment potential for beachfront resorts and luxury high-rise condominiums. In addition, there is the Detroit River International Wildlife Refuge, the only international wildlife preserve in North America, uniquely located in the heart of a major metropolitan area. The refuge includes islands, coastal wetlands, marshes, shoals, and waterfront lands along the Detroit River and the western Lake Erie shoreline. The city of Detroit functions as an entertainment hub for the entire region, as casino resorts, major sports venues, and the theatre district increase development prospects for new retail. Detroit is the largest American city and metropolitan region to offer casino resort hotels.
The MGM Grand Detroit (2007), MotorCity Casino (2008), Caesars Windsor (2007), and Greektown Casino (2008) comprise the region's four major casino resorts. Movie studios in the metro area help to establish the state as a legitimate contender in the 12-month-a-year film business. Motown Motion Picture Studios (2009) will produce movies at the Pontiac Centerpoint Business Campus for a film industry expected to employ over 4,000 people in the metro area.

Retail

Metro Detroit has many chain retailers and super-regional shopping malls, in both upscale and outlet-style venues, which, in addition to the "land" malls of Southland Center in Taylor, Eastland Center in Harper Woods, and Westland Center in Westland (Southfield's Northland Center closed in 2015), are located throughout other suburban municipalities such as Troy, Novi, Auburn Hills, Sterling Heights, and Dearborn. In the 2000s, some older malls closed, while some inner-ring suburban malls were remodeled. Others have a new role with "big box" establishments. During the same decade, upscale lifestyle centers appeared in Detroit suburbs, most notably The Mall at Partridge Creek in Clinton Township. Several suburban municipalities, including Birmingham, Royal Oak, Rochester, and Grosse Pointe, contain their own street-side shopping districts. Many local merchants and restaurants are located within the Detroit city limits, including those in the Lower Woodward Avenue Historic District, Greektown Historic District, the Renaissance Center, and the Eastern Market Historic District; however, the city of Detroit has few big chain retailers. A 2007 Selzer and Co. poll found that nearly two-thirds of suburban residents said they occasionally dine and attend cultural or professional sporting events in downtown Detroit. The Fairlane Town Center, a super-regional shopping mall in Dearborn, is about 15 minutes from downtown Detroit.
A 2007 Social Compact report showed that city of Detroit residents spend about $1.7 billion annually in the suburbs for retail goods and services. As of 2009, "big box" super-centers had yet to open stores within the city limits of Detroit. In April 2009, developers announced they had leased 60 percent of the retail space for a planned $90 million open-air mall, the Gateway Marketplace, to be located within the city limits of Detroit. In August 2009, the Meijer chain of super-centers announced it would open its first store within the city limits at the Gateway Marketplace. Gateway Marketplace opened in June 2013. Meijer then opened another store in the Old Redford section of the northwest side in 2015. The city of Detroit has four Starbucks coffee shops, several Tim Hortons coffee shops, and three Dunkin' Donuts shops (including one Baskin-Robbins combo outlet), all of which face Michigan-based competitors Coffee Beanery and Biggby Coffee. The city's major bookstore is the Wayne State University Bookstore, leaving an opening for a major bookstore chain. New car dealerships have migrated to the suburbs. The decline of chain fast-food outlets within Detroit has closely paralleled that of the city itself, including a notable decline in locations of Yum! Brands-owned restaurants within the city limits, to the point that Taco Bell is down to two locations on the city's west side, as well as an additional store at Wayne State University, as of 2019.

Supermarkets and grocery stores

As of 2009, the German-based supermarket chain Aldi, which opened Detroit locations in 2001 and 2005, and the Michigan-based Spartan Stores were the grocery chains operating within the city of Detroit. In 2011, Whole Foods Market announced a new Midtown location in the city of Detroit. This location opened in June 2013 to much fanfare.
Many independent grocery stores serve neighborhoods in Detroit; however, a 2009 University of Michigan report estimated that neighborhoods within the city limits of Detroit have sufficient income to sustain from $210 million to $377 million in additional grocery retail spending, which has leaked to nearby suburbs, and that the city could support additional retail grocery space. The report noted that retail grocery traffic tends to stimulate growth of other types of retail and that large retail chains have been slow to realize the growth potential of the city. As of 2011, according to Martin Manna, the Chaldean American Chamber of Commerce's executive director, 75 of the 84 supermarkets in the Detroit city limits are owned by Chaldean Americans. Metro Foodland in the city is an African American-owned business; it is the final remaining black-owned supermarket in Detroit, a majority-black city. The owner, James Hooks, said that there have always been few black-owned grocery stores in Detroit. Former employees of Hooks had established two other black-owned stores, and both stores closed. Southwest Detroit has many independent grocery stores; in particular, it has several Hispanic supermarkets, or supermercados, that stock meat, specialty produce, and tortillas.

Media

As the traditional automotive center, the region is a major source for related journalism and business news. Gale publishing and Crain Communications are headquartered in the metro area.
The Detroit television market is the thirteenth-largest in the United States; however, these ratings do not include Canadian cable viewers who watch Detroit television stations. Cities served by Detroit channels in Ontario include London, Ottawa, and Thunder Bay; many Western Canadians, such as Saskatoon residents, also watch Detroit channels. These channels include WJBK 2 (Fox), WDIV-TV 4 (NBC), WXYZ-TV 7 (ABC), WMYD 20 (MyNetworkTV), WPXD-TV 31 (Ion Television), WKBD-TV 50 (The CW), WTVS 56 (PBS) and WWJ-TV 62 (CBS). Detroit has the twelfth-largest radio market in the United States, though this ranking does not take into account Canadian audiences.

Movie theaters

As of 2015, there was one movie theater within the Detroit city limits showing first-run films: Bel Air 10 in northeast Detroit. There are some independent theater options: the Detroit Institute of Arts' Detroit Film Theatre, Cinema Detroit in Midtown, and the Redford Theatre in northwest Detroit. The Renaissance Center previously had the first-run theater Ren Cen 4, but it closed in the summer of 2015. In 2015, there were 49 movie theaters in the Metro Detroit area outside the city of Detroit, totaling 522 screens, many of them also showing first-run films and offering stadium seating options; they range from the five-screen Ford Drive-In in Dearborn to decades-old single-screen theaters in communities such as Farmington and Plymouth to the AMC Theatres Forum 30 megaplex in Sterling Heights. Of these, ten are megaplexes with 20 or more screens, found in Sterling Heights, Auburn Hills, Clinton, Dearborn, Southfield, Southgate, Brighton and Ypsilanti. Since then, Cinemark Theatres opened a 12-screen location at Southland Center in Taylor in April 2016; Cinemark also operates the Rave Motion Pictures Ann Arbor 20 in Ypsilanti.
IMAX options in Metro Detroit include dedicated theaters at The Henry Ford and the Michigan Science Center, as well as individual auditoriums at several AMC outlets and the aforementioned Rave 20 in Ypsilanti. AMC, Cinemark and Regal Entertainment Group, operator of the United Artists Commerce Stadium 14 just outside Walled Lake, face competition from Michigan-based chains Emagine Entertainment, MJR Digital Cinemas and Phoenix Theaters.

Historic highlights

President Franklin Roosevelt referred to America as the "Arsenal of Democracy". Detroit and its automotive industries played a pivotal role in the Allied victory during World War II. With Europe, Asia, and the Pacific islands under siege by the Axis powers, Henry Ford's genius was turned to mass production for the war effort. Specifically, the B-24 Liberator bomber, still the most-produced Allied heavy bomber in history, quickly shifted the balance of power. The aviation industry could produce, at best, one Consolidated Aircraft B-24 bomber a day at an aircraft plant. Ford would show the world how to produce one B-24 an hour, and at peak production Ford produced 650 per month at Willow Run by 1944. Ford's Willow Run factory broke ground in April 1941. At the time, it was the largest assembly plant in the world. Edsel Ford, Henry Ford's son, died of stomach cancer in the spring of 1943, prompting Henry Ford to resume day-to-day control of the Ford Motor Company. Willow Run completed its first B-24 in October 1942, with production increasing substantially by August 1943. Pilots and crew slept on the 1,300 cots waiting to fly the B-24s as they rolled off the assembly line at Ford's Willow Run facility.
264466
https://en.wikipedia.org/wiki/Electronic%20business
Electronic business
Electronic business (or "online business" or "e-business") is any kind of business or commercial transaction that includes sharing information across the internet. Commerce constitutes the exchange of products and services between businesses, groups, and individuals and can be seen as one of the essential activities of any business. Electronic commerce focuses on the use of information and communication technology to enable the external activities and relationships of the business with individuals, groups, and other businesses, while e-business refers to business conducted with the help of the internet. Electronic business differs from electronic commerce in that it does not only deal with online transactions of selling and buying a product and/or service but also enables the conduct of business processes (inbound/outbound logistics, manufacturing and operations, marketing and sales, customer service) within the value chain through internal or external networks. The term "e-business" was coined by IBM's marketing and Internet team in 1996.

Market participants in electronic business

Electronic business can take place between a very large number of market participants; it can be between business and consumer, private individuals, public administrations, or any other organizations such as NGOs. These various market participants can be divided into three main groups:
1) Business (B)
2) Consumer (C)
3) Administration (A)
All of them can be either buyers or service providers within the market. There are nine possible combinations for electronic business relationships. B2C and B2B belong to e-commerce, while A2B and A2A belong to the e-government sector, which is also a part of electronic business.

Supply chain management and e-business

With the development of the e-commerce industry, business activities are becoming more and more complex to coordinate, so efficiency in the business is crucial for the success of e-commerce.
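The nine relationship types mentioned above follow mechanically from pairing the three participant groups; a minimal illustrative sketch in Python (the short labels are the conventional B/C/A abbreviations):

```python
# Enumerate the nine e-business relationship types formed by pairing
# the three market participant groups described above.
from itertools import product

groups = ["B", "C", "A"]  # Business, Consumer, Administration

# Each relationship pairs a provider group with a buyer group, e.g. "B2C".
relationships = [f"{provider}2{buyer}" for provider, buyer in product(groups, groups)]

print(len(relationships))  # 9
print(relationships)       # B2B, B2C, B2A, C2B, C2C, C2A, A2B, A2C, A2A
```

Of these nine, B2B and B2C fall under e-commerce, while the administration pairings such as A2B and A2A fall under e-government.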
Hence, well-developed supply chain management is a key component of e-commerce, because the e-commerce industry focuses not only on building an appropriate website but also on suitable infrastructure, a well-developed supply chain strategy, etc. By definition, supply chain management refers to the management of the flow of goods and services, and all activities connected with transforming raw materials into final products. The goal of the business is to maximize customer value and gain a competitive advantage over others. Supply chain management in the e-commerce industry mainly focuses on manufacturing, supplying the raw materials, managing supply and demand, distribution, etc. Effective supply chain management in e-commerce often gives companies an advantage in positively leveraging new opportunities to maximize their profit by satisfying and meeting customers' expectations. With well-developed supply chain management, a company has a better chance of success by forming the right partnerships and supply network, automating the business, etc. The enabling role of e-business technologies in supply chain organizations in the development of intra- and inter-organizational collaboration, and its impact on performance, is discussed in depth in an article by Nada Sanders. To sum up, effective supply chain management in the e-commerce industry is needed for three main reasons:
- Ensuring high service levels and stock availability;
- Encouraging positive customer experience and reviews, and building a brand reputation;
- Cost efficiency.

History

One of the founding pillars of electronic business was the development of electronic data interchange (EDI). This system replaced traditional mailing and faxing of documents with a digital transfer of data from one computer to another, without any human intervention. Michael Aldrich is considered the developer of the predecessor to online shopping.
In 1979, the entrepreneur connected a television set to a transaction processing computer with a telephone line and called it "teleshopping", meaning shopping at a distance. From the mid-nineties, major advancements were made in the commercial use of the Internet. Amazon, which launched in 1995, started as an online bookstore and grew to become the largest online retailer worldwide, selling food, toys, electronics, apparel and more. Other success stories of online marketplaces include eBay and Etsy. In 1994, IBM, with its agency Ogilvy & Mather, began to use its foundation in IT solutions and expertise to market itself as a leader of conducting business on the Internet through the term "e-business." Then-CEO Louis V. Gerstner, Jr. was prepared to invest $1 billion to market this new brand. After conducting worldwide market research in October 1997, IBM began with an eight-page piece in The Wall Street Journal that would introduce the concept of "e-business" and advertise IBM's expertise in the new field. IBM decided not to trademark the term "e-business" in the hope that other companies would use the term and create an entirely new industry. However, this proved to be too successful, and by 2000, to differentiate itself, IBM launched a $300 million campaign about its "e-business infrastructure" capabilities. Since that time, the terms "e-business" and "e-commerce" have been loosely interchangeable and have become a part of the common vernacular. According to the U.S. Department of Commerce, estimated retail e-commerce sales in Q1 2020 represented almost 12% of total U.S. retail sales, against 4% for Q1 2010.

Business model

The transformation toward e-business is complex, and in order for it to succeed, there is a need to balance strategy, an adapted business model (e-intermediary, marketplaces), the right processes (sales, marketing) and technology (supply chain management, customer relationship management).
When organizations go online, they have to decide which e-business models best suit their goals. A business model is defined as the organization of product, service and information flows, and the source of revenues and benefits for suppliers and customers. The concept of the e-business model is the same but applied to an online presence.

Revenue model

A key component of the business model is the revenue model or profit model, which is a framework for generating revenues. It identifies which revenue source to pursue, what value to offer, how to price the value, and who pays for the value. It primarily identifies what product or service will be created in order to generate revenues and the ways in which the product or service will be sold. Without a well-defined revenue model, that is, a clear plan for how to generate revenues, new businesses are more likely to struggle due to costs which they will not be able to sustain. By having a revenue model, a business can focus on a target audience, fund development plans for a product or service, establish marketing plans, begin a line of credit and raise capital.

E-commerce

E-commerce (short for "electronic commerce") is trading in products or services using computer networks, such as the Internet. Electronic commerce draws on technologies such as mobile commerce, electronic funds transfer, supply chain management, Internet marketing, online transaction processing, electronic data interchange (EDI), inventory management systems, and automated data collection. Modern electronic commerce typically uses the World Wide Web for at least one part of the transaction's life cycle, although it may also use other technologies such as e-mail.

Customer Relationship Management in e-business

Customer Relationship Management (CRM) is the strategy that is used to build relationships and interactions with customers and potential/future customers.
CRM provides better customer service, allowing companies to analyze their past, current and future customers on a variety of levels. It is one of the elements essential for any business, including e-commerce, because it allows companies to grow and succeed; it cannot be done without technology. It is the formation of bonds between customers and the company. CRM impacts e-commerce sites by becoming an essential part of business success. Interactively collecting and considering customer data helps to build a company's e-CRM capability, which then leads to the company's corporate success. The goal of CRM is to establish a profitable, long-term, one-to-one relationship with customers by understanding their needs and expectations. This strategy uses two different approaches: software applications and software as a service. E-commerce CRM (e-CRM) primarily focuses on customer experiences and sales that are conducted online. Most e-CRM software has the ability to analyze customer information, sales patterns, record and store data, and website metrics, for example:
Conversion rates
Customer click-through rate
E-mail subscription opt-ins
Which products customers are interested in

Concerns

While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick and mortar businesses resulting in increases in income inequality. Author Andrew Keen, a long-time critic of the social transformations caused by the Internet, has recently focused on the economic effects of consolidation from Internet businesses, since these businesses employ far fewer people per dollar of sales than traditional retailers.
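The website metrics listed above are simple ratios. A minimal sketch of how an e-CRM tool might compute two of them, using made-up visitor and traffic counts purely for illustration:

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Share of site visitors who completed the desired action (e.g. a purchase)."""
    return conversions / visitors

def click_through_rate(clicks: int, impressions: int) -> float:
    """Share of ad or e-mail impressions that resulted in a click."""
    return clicks / impressions

# Hypothetical traffic numbers, for illustration only.
visitors, conversions = 12_500, 375
impressions, clicks = 40_000, 1_200

print(f"conversion rate: {conversion_rate(conversions, visitors):.1%}")      # 3.0%
print(f"click-through rate: {click_through_rate(clicks, impressions):.1%}")  # 3.0%
```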
Security

E-business systems naturally have greater security risks than traditional business systems; therefore it is important for e-business systems to be fully protected against these risks. A far greater number of people have access to e-businesses through the internet than would have access to a traditional business. Customers, suppliers, employees, and numerous other people use any particular e-business system daily and expect their confidential information to stay secure. Hackers are one of the great threats to the security of e-businesses. Some common security concerns for e-businesses include keeping business and customer information private and confidential, the authenticity of data, and data integrity. Some of the methods of protecting e-business security and keeping information secure include physical security measures as well as data storage, data transmission, anti-virus software, firewalls, and encryption, to list a few.

Privacy and confidentiality

Confidentiality is the extent to which businesses make personal information available to other businesses and individuals. With any business, confidential information must remain secure and only be accessible to the intended recipient. However, this becomes even more difficult when dealing with e-businesses specifically. To keep such information secure means protecting any electronic records and files from unauthorized access, as well as ensuring safe transmission and data storage of such information. Tools such as encryption and firewalls manage this specific concern within e-business.

Authenticity

E-business transactions pose greater challenges for establishing authenticity due to the ease with which electronic information may be altered and copied. Both parties in an e-business transaction want to have the assurance that the other party is who they claim to be, especially when a customer places an order and then submits a payment electronically.
One common way to ensure this is to limit access to a network or trusted parties by using virtual private network (VPN) technology. The establishment of authenticity is even stronger when a combination of techniques is used; such techniques involve checking "something you know" (e.g. a password or PIN), "something you have" (e.g. a credit card), or "something you are" (e.g. voice recognition or other biometric methods). Many times in e-business, however, identity is verified simply by checking the purchaser's "something you have" (e.g. a credit card) and "something you know" (e.g. the card number).

Data integrity

Data integrity answers the question "Can the information be changed or corrupted in any way?" This leads to the assurance that the message received is identical to the message sent. A business needs to be confident that data is not changed in transit, whether deliberately or by accident. To help with data integrity, firewalls protect stored data against unauthorized access, while simply backing up data allows recovery should the data or equipment be damaged.

Non-repudiation

This concern deals with the existence of proof in a transaction. A business must have the assurance that the receiving party or purchaser cannot deny that a transaction has occurred, and this means having sufficient evidence to prove the transaction. One way to address non-repudiation is using digital signatures. A digital signature not only ensures that a message or document has been electronically signed by the person, but since a digital signature can only be created by one person, it also ensures that this person cannot later deny that they provided their signature.

Access control

When certain electronic resources and information are limited to only a few authorized individuals, a business and its customers must have the assurance that no one else can access the systems or information.
There are a variety of techniques to address this concern, including firewalls, access privileges, user identification and authentication techniques (such as passwords and digital certificates), virtual private networks (VPN), and much more.

Availability

This concern is specifically pertinent to a business's customers, as certain information must be available when customers need it. Messages must be delivered in a reliable and timely fashion, and information must be stored and retrieved as required. Because the availability of service is important for all e-business websites, steps must be taken to prevent disruption of service by events such as power outages and damage to physical infrastructure. Examples to address this include data backup, fire-suppression systems, uninterruptible power supply (UPS) systems, virus protection, as well as making sure that there is sufficient capacity to handle the demands posed by heavy network traffic.

Cost Structure

The business internet which supports e-business has a maintenance cost of about $2 trillion in outsourced IT dollars in the United States alone. With each website custom crafted and maintained in code, the maintenance burden is enormous. In the twenty-first century, look for new businesses that will help standardize the look and feel of the internet presence of a business to be more uniform in nature to help reduce the cost of maintenance. The cost structure of e-businesses varies greatly depending on the industry they operate in. There are two major categories that have common characteristics. The first group is fully digital businesses that do not provide any products or services outside of the digital world, including, for example, software companies and social networks. For those, the most significant operational cost is the maintenance of the platform. Those costs are almost unrelated to each additional customer the business acquires, making the marginal cost almost equal to zero.
This is one of the major advantages of that kind of business. The second group consists of businesses that provide services or products outside of the digital world, like online shops; for those, costs are much harder to determine. Some common advantages over traditional businesses are lower marketing costs, lower inventory costs, lower payroll, lower rent, etc.

Security solutions

When it comes to security solutions, sustainable electronic business requires support for data integrity, strong authentication, and privacy. Numerous things can be done to protect an e-business, starting with basic steps like switching from the plain HTTP protocol, which is more vulnerable to attacks, to HTTPS. Other things that require full attention are securing servers and admin panels, payment gateway security, antivirus and anti-malware software, using firewalls (also a must), regular updates, and backing up data.

Access and data integrity

There are several different ways to prevent access to the data that is kept online. One way is to use anti-virus software. This is something that most people use to protect their networks regardless of the data they have. E-businesses should use this because they can then be sure that the information sent and received by their system is clean. A second way to protect the data is to use firewalls and network protection. A firewall is used to restrict access to private networks, as well as public networks that a company may use. The firewall also has the ability to log attempts into the network and provide warnings as they happen. Firewalls are very beneficial for keeping third parties out of the network. Businesses that use Wi-Fi need to consider different forms of protection because these networks are easier for someone to access. They should look into protected access, virtual private networks, or internet protocol security. Another option they have is an intrusion detection system.
This system alerts when there are possible intrusions. Some companies set up traps or "honeypots" to attract people and are then able to know when someone is trying to hack into that area.

Encryption

Encryption, which is a part of cryptography, involves transforming texts or messages into a code which is unreadable. These messages have to be decrypted in order to be understandable or usable for someone. There is a key that identifies the data to a certain person or company. With public-key encryption, there are actually two keys used. One is public and one is private. The public one is used for encryption and the private one for decryption. The level of the actual encryption can be adjusted and should be based on the information. The key can be just a simple shift of letters or a completely random mix-up of letters. This is relatively easy to implement because there is software that a company can purchase. A company needs to be sure that its keys are registered with a certificate authority.

Digital certificates

The point of a digital certificate is to identify the owner of a document. This way the receiver knows that it is an authentic document. Companies can use these certificates in several different ways. They can be used as a replacement for user names and passwords. Each employee can be given these to access the documents that they need from wherever they are. These certificates also use encryption; they are a little more complicated than normal encryption, however, because they use important information within the code. They do this in order to assure the authenticity of the documents as well as the confidentiality and data integrity which always accompany encryption. Digital certificates are not commonly used because they are confusing for people to implement. There can be complications when using different browsers, which means they need to use multiple certificates. The process is being adjusted so that it is easier to use.
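The public/private key split described above can be illustrated with textbook RSA on deliberately tiny numbers. This is a toy sketch for intuition only; real systems use 2048-bit-plus keys, padding schemes, and vetted libraries:

```python
# Textbook RSA with toy primes -- illustrative only, never use in practice.
p, q = 61, 53
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent (coprime to phi)
d = pow(e, -1, phi)      # private exponent: modular inverse of e, here 2753

message = 65                       # a message encoded as a number < n
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)  # only the private exponent d can decrypt

assert recovered == message
print(n, ciphertext, recovered)    # 3233 2790 65
```

The asymmetry is the whole point: publishing (n, e) lets anyone encrypt, while decryption requires d, which is hard to derive without knowing the factors of n.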
Digital signatures

A final way to secure information online is to use a digital signature. If a document has a digital signature on it, no one else is able to edit the information without being detected: if the document is edited, the signature no longer verifies, so the change can be discovered after the fact. In order to use a digital signature, one must use a combination of cryptography and a message digest. A message digest is used to give the document a unique value. That value is then encrypted with the sender's private key.

Advantages & Disadvantages

Advantages

When looking at e-business we see many advantages, which are mostly connected to making doing business easier. The benefits of implementing e-business tools lie in the streamlining of business processes and not so much in the use of technology. Here are some:
Easy to set up: electronic business is easy to set up, even from home; the only requirements are software, a device and an internet connection.
Flexible Business Hours: there are no time barriers that a location-based business can encounter, since the internet is available to everyone all the time. Your products and services can be accessed by everyone with an internet connection.
Cheaper than Traditional Business: electronic business is less costly than a traditional business, but it is more expensive to set up. Transaction costs are also cheaper.
No Geographical Boundaries: the greatest benefit is the possibility of geographical dispersion. Anyone can order anything from anywhere at any time.
Government Subsidies: digitalisation is strongly encouraged by governments, which provide the necessary support.
New market entry: it has great potential to enable entry to a previously unknown market that a traditional business could not reach.
Lower levels of inventory: electronic business enables companies to lower their level of inventory by digitalizing their assets (e.g. Netflix no longer sells physical DVDs but offers online streaming content instead).
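The digest-then-encrypt flow described above can be sketched with a SHA-256 message digest signed by a textbook RSA key. The parameters are toy-sized and the scheme is unpadded, so this is for intuition only; real signatures use standardized schemes such as RSASSA-PSS:

```python
import hashlib

# Toy RSA key pair -- far too small for real use.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)

def digest(doc: bytes) -> int:
    """Message digest of the document, reduced mod n so the toy key can sign it."""
    return int.from_bytes(hashlib.sha256(doc).digest(), "big") % n

def sign(doc: bytes) -> int:
    return pow(digest(doc), d, n)  # encrypt the digest with the PRIVATE key

def verify(doc: bytes, signature: int) -> bool:
    # Decrypt with the PUBLIC key and compare against a fresh digest.
    return pow(signature, e, n) == digest(doc)

doc = b"wire $100 to account 42"
sig = sign(doc)
assert verify(doc, sig)  # the untouched document verifies
```

Because any edit to the document changes its digest, a tampered document will (with overwhelming probability) fail verification against the original signature.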
Lower costs of marketing and sales: e-commerce allows the actors of the industry to advertise their product/service offer (e.g. house rental) at generally lower costs than by promoting their business physically.

Disadvantages

Despite all the advantages, there are also some disadvantages that need to be addressed. The most common limitations of electronic business are:
Lack of Personal Touch: the products cannot be examined or felt before the final purchase. In the traditional model, we have a more personal customer experience, while in electronic business that is mostly not the case. Another missing factor of personal touch could also be in online transactions.
Delivery Time: traditional business enables instant satisfaction, as you obtain the product the moment you purchase it, while in electronic business that is not possible; there will always be a waiting period before you receive the product. For example, Amazon offers one-day delivery. This does not resolve the issue completely, but it is an improvement.
Security Issues: scams could be mentioned as a factor in people's distrust of electronic business. Hackers can easily get customers' financial and personal details. Some customers still find it hard to trust electronic businesses because of the lack of security, reliability and integrity issues.

See also

Electronic commerce
Electronic Commerce Modeling Language
Very Large Business Applications
Digital economy
Types of E-commerce
Shopping cart software
Extended memory
In DOS memory management, extended memory refers to memory above the first megabyte (2^20 bytes) of address space in an IBM PC or compatible with an 80286 or later processor. The term is mainly used under the DOS and Windows operating systems. DOS programs, running in real mode or virtual x86 mode, cannot directly access this memory, but are able to do so through an application programming interface called the Extended Memory Specification (XMS). This API is implemented by a driver (such as HIMEM.SYS) or the operating system, which takes care of memory management and copying memory between conventional and extended memory, by temporarily switching the processor into protected mode. In this context, the term "extended memory" may refer to either the whole of the extended memory or only the portion available through this API. Extended memory can also be accessed directly by DOS programs running in protected mode using VCPI or DPMI, two (different and incompatible) methods of using protected mode under DOS. Extended memory should not be confused with expanded memory (EMS), an earlier method for expanding the IBM PC's memory capacity beyond 640 kB (655,360 bytes) using an expansion card with bank-switched memory modules. Because of the available support for expanded memory in popular applications, device drivers were developed that emulated expanded memory using extended memory. Later, two additional methods were developed allowing direct access to a small portion of extended memory from real mode. These memory areas are referred to as the high memory area (HMA) and the upper memory area (UMA; also referred to as upper memory blocks or UMBs).

Overview

On x86-based PCs, extended memory is only available with an Intel 80286 processor or higher. Only these chips can address more than 1 megabyte of RAM. The earlier 8086/8088 processors can make use of more than 1 MB of RAM if one employs special hardware to make selectable parts of it appear at addresses below 1 MB.
On a 286 or better PC equipped with more than 640 kB of RAM, the additional memory would generally be re-mapped above the 1 MB boundary, since the IBM PC architecture reserves addresses between 640 kB and 1 MB for system ROM and peripherals. Extended memory is not accessible in real mode (except for a small portion called the high memory area). Only applications executing in protected mode can use extended memory directly. A supervising protected-mode operating system such as Microsoft Windows manages application programs' access to memory. The processor makes this memory available through the Global Descriptor Table (GDT) and one or more Local Descriptor Tables (LDTs). The memory is "protected" in the sense that memory segments assigned a local descriptor cannot be accessed by another program because that program uses a different LDT, and memory segments assigned a global descriptor can have their access rights restricted, causing a processor exception (e.g., a general protection fault or GPF) on violation. This prevents programs running in protected mode from interfering with each other's memory. A protected-mode operating system such as Microsoft Windows can also run real-mode programs and provide expanded memory to them. The DOS Protected Mode Interface (DPMI) is Microsoft's prescribed method for a DOS program to access extended memory under a multitasking environment.

Extended Memory Specification (XMS)

The Extended Memory Specification (XMS) is the specification describing the use of IBM PC extended memory in real mode for storing data (but not for running executable code in it). Memory is made available by extended memory manager (XMM) software such as HIMEM.SYS. The XMM functions are accessible through software interrupt 2Fh, function 4310h. XMS version 2.0, released in July 1988, allowed for up to 64 MB of memory; with XMS version 3.0 this increased to 4 GB (2^32 bytes).
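The high memory area mentioned above falls out of real-mode segment:offset arithmetic: a linear address is segment × 16 + offset, so with the A20 line enabled the segment 0xFFFF reaches slightly above 1 MB. A small sketch of the calculation:

```python
def linear(segment: int, offset: int) -> int:
    """Real-mode address translation: 16-byte paragraphs plus an offset."""
    return (segment << 4) + offset

ONE_MB = 1 << 20

# The highest segment, from the lowest offset that lands above 1 MB
# to the highest offset it can reach:
hma_start = linear(0xFFFF, 0x0010)  # 0x100000 -- exactly the 1 MB boundary
hma_end   = linear(0xFFFF, 0xFFFF)  # 0x10FFEF -- last byte reachable in real mode

assert hma_start == ONE_MB
print(hex(hma_end), hma_end - hma_start + 1)  # 0x10ffef 65520
```

This is why the HMA is 65,520 bytes (64 KB minus 16 bytes): it is exactly the span of addresses a real-mode program can form above 1 MB once the A20 address line is enabled.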
To differentiate between the possibly different amounts of memory that might be available to applications, depending on which version of the specification they were developed to, the latter may be referred to as super extended memory (SXMS). The extended memory manager is also responsible for managing allocations in the high memory area (HMA) and the upper memory area (UMA; also referred to as upper memory blocks or UMBs). In practice the upper memory area will be provided by the expanded memory manager (EMM), after which DOS will try to allocate them all and manage them itself.

See also

DOS memory management
Conventional memory
Expanded memory (EMS)
High memory area (HMA)
Upper memory area (UMA)
Global EMM Import Specification (GEMMIS)
Unreal mode

References

Specifications

Microsoft, Lotus, Intel, and AST Research (1988-07-19). eXtended Memory Specification (XMS), version 2.0.
Microsoft, Lotus, Intel, and AST Research (January 1991). eXtended Memory Specification (XMS), version 3.0.
Microsoft Knowledge Base

External links

Extended Memory (XMS) Specification
Elliptic-curve cryptography
Elliptic-curve cryptography (ECC) is an approach to public-key cryptography based on the algebraic structure of elliptic curves over finite fields. ECC allows smaller keys compared to non-EC cryptography (based on plain Galois fields) to provide equivalent security. Elliptic curves are applicable for key agreement, digital signatures, pseudo-random generators and other tasks. Indirectly, they can be used for encryption by combining the key agreement with a symmetric encryption scheme. Elliptic curves are also used in several integer factorization algorithms based on elliptic curves that have applications in cryptography, such as Lenstra elliptic-curve factorization. Rationale Public-key cryptography is based on the intractability of certain mathematical problems. Early public-key systems based their security on the assumption that it is difficult to factor a large integer composed of two or more large prime factors. For later elliptic-curve-based protocols, the base assumption is that finding the discrete logarithm of a random elliptic curve element with respect to a publicly known base point is infeasible: this is the "elliptic curve discrete logarithm problem" (ECDLP). The security of elliptic curve cryptography depends on the ability to compute a point multiplication and the inability to compute the multiplicand given the original and product points. The size of the elliptic curve, measured by the total number of discrete integer pairs satisfying the curve equation, determines the difficulty of the problem. The U.S. National Institute of Standards and Technology (NIST) has endorsed elliptic curve cryptography in its Suite B set of recommended algorithms, specifically elliptic-curve Diffie–Hellman (ECDH) for key exchange and Elliptic Curve Digital Signature Algorithm (ECDSA) for digital signature. The U.S. National Security Agency (NSA) allows their use for protecting information classified up to top secret with 384-bit keys. 
However, in August 2015, the NSA announced that it plans to replace Suite B with a new cipher suite due to concerns about quantum computing attacks on ECC. While the RSA patent expired in 2000, there may be patents in force covering certain aspects of ECC technology. However, some (including RSA Laboratories and Daniel J. Bernstein) argue that the US government elliptic curve digital signature standard (ECDSA; NIST FIPS 186-3) and certain practical ECC-based key exchange schemes (including ECDH) can be implemented without infringing them. The primary benefit promised by elliptic curve cryptography is a smaller key size, reducing storage and transmission requirements, i.e. that an elliptic curve group could provide the same level of security afforded by an RSA-based system with a large modulus and correspondingly larger key: for example, a 256-bit elliptic curve public key should provide comparable security to a 3072-bit RSA public key.

History

The use of elliptic curves in cryptography was suggested independently by Neal Koblitz and Victor S. Miller in 1985. Elliptic curve cryptography algorithms entered wide use in 2004 to 2005.

Theory

For current cryptographic purposes, an elliptic curve is a plane curve over a finite field (rather than the real numbers) which consists of the points satisfying the equation

    y^2 = x^3 + ax + b,

along with a distinguished point at infinity, denoted ∞. The coordinates here are to be chosen from a fixed finite field of characteristic not equal to 2 or 3, or the curve equation will be somewhat more complicated. This set together with the group operation of elliptic curves is an abelian group, with the point at infinity as an identity element.
The structure of the group is inherited from the divisor group of the underlying algebraic variety.

Cryptographic schemes

Several discrete logarithm-based protocols have been adapted to elliptic curves, replacing the multiplicative group of a finite field with an elliptic curve:
The Elliptic Curve Diffie–Hellman (ECDH) key agreement scheme is based on the Diffie–Hellman scheme,
The Elliptic Curve Integrated Encryption Scheme (ECIES), also known as Elliptic Curve Augmented Encryption Scheme or simply the Elliptic Curve Encryption Scheme,
The Elliptic Curve Digital Signature Algorithm (ECDSA) is based on the Digital Signature Algorithm,
The deformation scheme using Harrison's p-adic Manhattan metric,
The Edwards-curve Digital Signature Algorithm (EdDSA) is based on the Schnorr signature and uses twisted Edwards curves,
The ECMQV key agreement scheme is based on the MQV key agreement scheme,
The ECQV implicit certificate scheme.
At the RSA Conference 2005, the National Security Agency (NSA) announced Suite B, which exclusively uses ECC for digital signature generation and key exchange. The suite is intended to protect both classified and unclassified national security systems and information. Recently, a large number of cryptographic primitives based on bilinear mappings on various elliptic curve groups, such as the Weil and Tate pairings, have been introduced. Schemes based on these primitives provide efficient identity-based encryption as well as pairing-based signatures, signcryption, key agreement, and proxy re-encryption.

Implementation

Some common implementation considerations include:

Domain parameters

To use ECC, all parties must agree on all the elements defining the elliptic curve, that is, the domain parameters of the scheme. The size of the field used is typically either prime (and denoted as p) or a power of two (2^m); the latter case is called the binary case, and also necessitates the choice of an auxiliary curve denoted by f.
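The group law and point multiplication underlying all of the schemes above can be illustrated with a toy implementation over a very small prime field. The parameters here (the textbook curve y^2 = x^3 + 2x + 2 over F_17) are for illustration only; real deployments use standardized curves of 256 bits or more:

```python
# Toy short-Weierstrass curve y^2 = x^3 + 2x + 2 over F_17 (NOT secure).
p, a, b = 17, 2, 2
INF = None  # the point at infinity, the group identity

def add(P, Q):
    """Elliptic-curve point addition (chord-and-tangent rule)."""
    if P is INF: return Q
    if Q is INF: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return INF  # P + (-P) = infinity
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def mult(k, P):
    """Scalar multiplication kP by double-and-add."""
    R = INF
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (5, 1)                  # a generator; its order on this curve is 19
assert add(G, G) == (6, 3)  # 2G
assert mult(3, G) == (10, 6)
assert mult(19, G) is INF   # nG = infinity, confirming the order n = 19
```

The private key in ECC protocols is a scalar k, and the public key is the point kP; recovering k from kP is the elliptic curve discrete logarithm problem described above.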
Thus the field is defined by p in the prime case and the pair of m and f in the binary case. The elliptic curve is defined by the constants a and b used in its defining equation. Finally, the cyclic subgroup is defined by its generator (a.k.a. base point) G. For cryptographic application the order of G, that is the smallest positive number n such that nG = ∞ (the point at infinity of the curve, and the identity element), is normally prime. Since n is the size of a subgroup of the curve group it follows from Lagrange's theorem that the number h = |E|/n is an integer, where |E| is the number of points on the curve. In cryptographic applications this number h, called the cofactor, must be small (h ≤ 4) and, preferably, h = 1. To summarize: in the prime case, the domain parameters are (p, a, b, G, n, h); in the binary case, they are (m, f, a, b, G, n, h). Unless there is an assurance that domain parameters were generated by a party trusted with respect to their use, the domain parameters must be validated before use. The generation of domain parameters is not usually done by each participant because this involves computing the number of points on a curve, which is time-consuming and troublesome to implement. As a result, several standards bodies published domain parameters of elliptic curves for several common field sizes. Such domain parameters are commonly known as "standard curves" or "named curves"; a named curve can be referenced either by name or by the unique object identifier defined in the standard documents:
NIST, Recommended Elliptic Curves for Government Use
SECG, SEC 2: Recommended Elliptic Curve Domain Parameters
ECC Brainpool (RFC 5639), ECC Brainpool Standard Curves and Curve Generation
SECG test vectors are also available. NIST has approved many SECG curves, so there is a significant overlap between the specifications published by NIST and SECG. EC domain parameters may be either specified by value or by name.
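For a tiny field the cofactor can be checked by brute force, which is feasible here only because the example is toy-sized (the textbook curve y^2 = x^3 + 2x + 2 over F_17, whose base point G = (5, 1) has order 19):

```python
# Brute-force point count for y^2 = x^3 + 2x + 2 over F_17 (toy example only;
# real curves need Schoof-style algorithms, as brute force scales as O(p^2)).
p, a, b = 17, 2, 2

points = 1  # start at 1 to count the point at infinity
for x in range(p):
    for y in range(p):
        if (y * y - (x * x * x + a * x + b)) % p == 0:
            points += 1

n = 19           # order of the base point G = (5, 1) on this curve
h = points // n  # cofactor: curve group order divided by subgroup order

print(points, h)  # 19 1 -- the group order is prime, so the cofactor is 1
```

Since the whole group has prime order 19, the subgroup generated by G is the entire group and h = 1, the preferred case described above.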
If one (despite the above) wants to construct one's own domain parameters, one should select the underlying field and then use one of the following strategies to find a curve with an appropriate (i.e., near prime) number of points:
Select a random curve and use a general point-counting algorithm, for example, Schoof's algorithm or the Schoof–Elkies–Atkin algorithm,
Select a random curve from a family which allows easy calculation of the number of points (e.g., Koblitz curves), or
Select the number of points and generate a curve with this number of points using the complex multiplication technique.
Several classes of curves are weak and should be avoided:
Curves over GF(2^m) with non-prime m are vulnerable to Weil descent attacks.
Curves such that n divides p^B − 1 (where p is the characteristic of the field: q for a prime field, or 2 for a binary field) for sufficiently small B are vulnerable to the Menezes–Okamoto–Vanstone (MOV) attack, which applies the usual discrete logarithm problem (DLP) in a small-degree extension field of F_p to solve ECDLP. The bound B should be chosen so that discrete logarithms in the field F_(p^B) are at least as difficult to compute as discrete logs on the elliptic curve.
Curves such that the number of points equals q are vulnerable to the attack that maps the points on the curve to the additive group of F_q.

Key sizes

Because all the fastest known algorithms that allow one to solve the ECDLP (baby-step giant-step, Pollard's rho, etc.) need O(√n) steps, it follows that the size of the underlying field should be roughly twice the security parameter. For example, for 128-bit security one needs a curve over F_q, where q ≈ 2^256. This can be contrasted with finite-field cryptography (e.g., DSA), which requires 3072-bit public keys and 256-bit private keys, and integer factorization cryptography (e.g., RSA), which requires a 3072-bit value of n, where the private key should be just as large.
However, the public key may be smaller to accommodate efficient encryption, especially when processing power is limited. The hardest ECC scheme (publicly) broken to date had a 112-bit key for the prime field case and a 109-bit key for the binary field case. For the prime field case, this was broken in July 2009 using a cluster of over 200 PlayStation 3 game consoles and could have been finished in 3.5 months using this cluster when running continuously. The binary field case was broken in April 2004 using 2600 computers over 17 months. A current project is aiming at breaking the ECC2K-130 challenge by Certicom, by using a wide range of different hardware: CPUs, GPUs, FPGAs. Projective coordinates A close examination of the addition rules shows that in order to add two points, one needs not only several additions and multiplications in F_q but also an inversion operation. The inversion (for given x ∈ F_q, find y ∈ F_q such that xy = 1) is one to two orders of magnitude slower than multiplication. However, points on a curve can be represented in different coordinate systems which do not require an inversion operation to add two points. Several such systems were proposed: in the projective system each point is represented by three coordinates (X, Y, Z) using the following relation: x = X/Z, y = Y/Z; in the Jacobian system a point is also represented with three coordinates (X, Y, Z), but a different relation is used: x = X/Z^2, y = Y/Z^3; in the López–Dahab system the relation is x = X/Z, y = Y/Z^2; in the modified Jacobian system the same relations are used but four coordinates (X, Y, Z, aZ^4) are stored and used for calculations; and in the Chudnovsky Jacobian system five coordinates (X, Y, Z, Z^2, Z^3) are used. Note that there may be different naming conventions, for example, IEEE P1363-2000 standard uses "projective coordinates" to refer to what is commonly called Jacobian coordinates. An additional speed-up is possible if mixed coordinates are used. 
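Inversion-free doubling can be sketched with the standard Jacobian-coordinate formulas (S = 4XY^2, M = 3X^2 + aZ^4, X' = M^2 − 2S, Y' = M(S − X') − 8Y^4, Z' = 2YZ); only the final conversion back to affine coordinates pays for a single inversion. The example below is an illustrative check on a small textbook curve over F_17, not a production implementation.

```python
# Point doubling in Jacobian coordinates (X, Y, Z), where the affine
# point is (X/Z^2, Y/Z^3). No field inversion is needed per doubling;
# one inversion suffices at the very end to convert back to affine.
p, a = 17, 2                           # toy curve y^2 = x^3 + 2x + 2

def jacobian_double(X, Y, Z):
    S = 4 * X * Y * Y % p
    M = (3 * X * X + a * pow(Z, 4, p)) % p
    X2 = (M * M - 2 * S) % p
    Y2 = (M * (S - X2) - 8 * pow(Y, 4, p)) % p
    Z2 = 2 * Y * Z % p
    return X2, Y2, Z2

def to_affine(X, Y, Z):
    """The single inversion needed to leave projective form."""
    zi = pow(Z, -1, p)
    return X * zi * zi % p, Y * zi * zi * zi % p

# Doubling G = (5, 1): embed with Z = 1, double, convert back.
X2, Y2, Z2 = jacobian_double(5, 1, 1)
print(to_affine(X2, Y2, Z2))           # (6, 3), matching the affine formula
```

A long scalar multiplication chains many such inversion-free doublings and additions and converts to affine only once at the end, which is where the speed-up comes from.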
Fast reduction (NIST curves) Reduction modulo p (which is needed for addition and multiplication) can be executed much faster if the prime p is a pseudo-Mersenne prime, that is p ≈ 2^d; for example, p = 2^521 − 1 or p = 2^192 − 2^64 − 1. Compared to Barrett reduction, there can be an order of magnitude speed-up. The speed-up here is a practical rather than theoretical one, and derives from the fact that reduction modulo numbers near a power of two can be performed efficiently by computers operating on binary numbers with bitwise operations. The curves over F_p with pseudo-Mersenne p are recommended by NIST. Yet another advantage of the NIST curves is that they use a = −3, which improves addition in Jacobian coordinates. According to Bernstein and Lange, many of the efficiency-related decisions in NIST FIPS 186-2 are sub-optimal. Other curves are more secure and run just as fast. Applications Elliptic curves are applicable for encryption, digital signatures, pseudo-random generators and other tasks. They are also used in several integer factorization algorithms that have applications in cryptography, such as Lenstra elliptic-curve factorization. In 1999, NIST recommended fifteen elliptic curves. Specifically, FIPS 186-4 has ten recommended finite fields: Five prime fields for certain primes p of sizes 192, 224, 256, 384, and 521 bits. For each of the prime fields, one elliptic curve is recommended. Five binary fields for m equal 163, 233, 283, 409, and 571. For each of the binary fields, one elliptic curve and one Koblitz curve were selected. The NIST recommendation thus contains a total of five prime curves and ten binary curves. The curves were ostensibly chosen for optimal security and implementation efficiency. In 2013, The New York Times stated that Dual Elliptic Curve Deterministic Random Bit Generation (or Dual_EC_DRBG) had been included as a NIST national standard due to the influence of NSA, which had included a deliberate weakness in the algorithm and the recommended elliptic curve. 
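The folding trick behind fast reduction can be sketched in a few lines. Since 2^k ≡ c (mod p) for p = 2^k − c, the bits above position k can be shifted down and multiplied by c instead of performing a division. This is a minimal sketch, checked here against Python's built-in modulo, using the NIST P-192 prime as the example.

```python
# Fast reduction modulo a pseudo-Mersenne prime p = 2^k - c.
# Because 2^k ≡ c (mod p), the high bits fold back into the low bits
# with shifts and one multiplication -- no general division needed.
def reduce_pm(x, k, c):
    mask = (1 << k) - 1
    while x >> k:                     # fold until x fits in k bits
        x = (x & mask) + (x >> k) * c
    p = (1 << k) - c
    return x - p if x >= p else x     # at most one final subtraction

# Example with the NIST P-192 prime p = 2^192 - 2^64 - 1, so c = 2^64 + 1.
k, c = 192, (1 << 64) + 1
p = (1 << k) - c
x = 1234567 ** 40                     # an operand of roughly 800 bits
assert reduce_pm(x, k, c) == x % p
```

After the loop x < 2^k, and since x − p < c ≤ p one conditional subtraction suffices, so the whole reduction is shifts, masks and adds, which is the practical speed-up the text describes.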
RSA Security in September 2013 issued an advisory recommending that its customers discontinue using any software based on Dual_EC_DRBG. In the wake of the exposure of Dual_EC_DRBG as "an NSA undercover operation", cryptography experts have also expressed concern over the security of the NIST recommended elliptic curves, suggesting a return to encryption based on non-elliptic-curve groups. Elliptic curve cryptography is used by the cryptocurrency Bitcoin. Ethereum version 2.0 makes extensive use of elliptic curve pairs using BLS signatures—as specified in the IETF draft BLS specification—for cryptographically assuring that a specific Eth2 validator has actually verified a particular transaction. Security Side-channel attacks Unlike most other DLP systems (where it is possible to use the same procedure for squaring and multiplication), the EC addition is significantly different for doubling (P = Q) and general addition (P ≠ Q) depending on the coordinate system used. Consequently, it is important to counteract side-channel attacks (e.g., timing or simple/differential power analysis attacks) using, for example, fixed pattern window (a.k.a. comb) methods (note that this does not increase computation time). Alternatively one can use an Edwards curve; this is a special family of elliptic curves for which doubling and addition can be done with the same operation. Another concern for ECC-systems is the danger of fault attacks, especially when running on smart cards. Backdoors Cryptographic experts have expressed concerns that the National Security Agency has inserted a kleptographic backdoor into at least one elliptic curve-based pseudo random generator. Internal memos leaked by former NSA contractor Edward Snowden suggest that the NSA put a backdoor in the Dual EC DRBG standard. One analysis of the possible backdoor concluded that an adversary in possession of the algorithm's secret key could obtain encryption keys given only 32 bytes of PRNG output. 
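One classic regularisation technique, closely related to the fixed-pattern methods mentioned above, is the Montgomery ladder: every key bit triggers exactly one group addition and one doubling, whatever the bit's value, so the sequence of operations leaks nothing about the key. The sketch below is illustrative and deliberately uses modular integer addition as a stand-in for elliptic-curve point addition; the uniform control flow is the point being demonstrated.

```python
# Montgomery ladder: scalar multiplication with one "add" and one
# "double" per key bit regardless of the bit value. Group operations
# use integers mod m here as a stand-in for elliptic-curve points.
def ladder(k, P, add, dbl, identity):
    R0, R1 = identity, P                  # invariant: R1 = R0 + P
    for i in reversed(range(k.bit_length())):
        if (k >> i) & 1:
            R0, R1 = add(R0, R1), dbl(R1)
        else:
            R1, R0 = add(R0, R1), dbl(R0)
    return R0

m = 1000                                  # toy additive group Z/1000
add = lambda a, b: (a + b) % m
dbl = lambda a: 2 * a % m
assert ladder(201, 7, add, dbl, 0) == (201 * 7) % m   # 407
```

In a real ECC implementation the add/dbl callbacks would be constant-time point operations; note that branching on the key bit, as this sketch does for clarity, would itself need to be replaced by a constant-time conditional swap.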
The SafeCurves project has been launched in order to catalog curves that are easy to securely implement and are designed in a fully publicly verifiable way to minimize the chance of a backdoor. Quantum computing attacks Shor's algorithm can be used to break elliptic curve cryptography by computing discrete logarithms on a hypothetical quantum computer. The latest quantum resource estimates for breaking a curve with a 256-bit modulus (128-bit security level) are 2330 qubits and 126 billion Toffoli gates. In comparison, using Shor's algorithm to break the RSA algorithm requires 4098 qubits and 5.2 trillion Toffoli gates for a 2048-bit RSA key, suggesting that ECC is an easier target for quantum computers than RSA. All of these figures vastly exceed any quantum computer that has ever been built, and estimates place the creation of such computers at a decade or more away. Supersingular Isogeny Diffie–Hellman Key Exchange provides a post-quantum secure form of elliptic curve cryptography by using isogenies to implement Diffie–Hellman key exchanges. This key exchange uses much of the same field arithmetic as existing elliptic curve cryptography and requires computational and transmission overhead similar to many currently used public key systems. In August 2015, the NSA announced that it planned to transition "in the not distant future" to a new cipher suite that is resistant to quantum attacks. "Unfortunately, the growth of elliptic curve use has bumped up against the fact of continued progress in the research on quantum computing, necessitating a re-evaluation of our cryptographic strategy." Invalid curve attack When ECC is used in virtual machines, an attacker may use an invalid curve to get a complete ECDH private key. Patents At least one ECC scheme (ECMQV) and some implementation techniques are covered by patents. 
Alternative representations Alternative representations of elliptic curves include: Hessian curves Edwards curves Twisted curves Twisted Hessian curves Twisted Edwards curve Doubling-oriented Doche–Icart–Kohel curve Tripling-oriented Doche–Icart–Kohel curve Jacobian curve Montgomery curves See also Cryptocurrency Curve25519 FourQ DNSCurve RSA (cryptosystem) ECC patents Elliptic curve Diffie-Hellman (ECDH) Elliptic Curve Digital Signature Algorithm (ECDSA) EdDSA ECMQV Elliptic curve point multiplication Homomorphic Signatures for Network Coding Hyperelliptic curve cryptography Pairing-based cryptography Public-key cryptography Quantum cryptography Supersingular isogeny key exchange Notes References Standards for Efficient Cryptography Group (SECG), SEC 1: Elliptic Curve Cryptography, Version 1.0, September 20, 2000. (archived as of Nov 11, 2014) D. Hankerson, A. Menezes, and S.A. Vanstone, Guide to Elliptic Curve Cryptography, Springer-Verlag, 2004. I. Blake, G. Seroussi, and N. Smart, Elliptic Curves in Cryptography, London Mathematical Society 265, Cambridge University Press, 1999. I. Blake, G. Seroussi, and N. Smart, editors, Advances in Elliptic Curve Cryptography, London Mathematical Society 317, Cambridge University Press, 2005. L. Washington, Elliptic Curves: Number Theory and Cryptography, Chapman & Hall / CRC, 2003. The Case for Elliptic Curve Cryptography, National Security Agency (archived January 17, 2009) Online Elliptic Curve Cryptography Tutorial, Certicom Corp. (archived here as of March 3, 2016) K. Malhotra, S. Gardner, and R. Patz, Implementation of Elliptic-Curve Cryptography on Mobile Healthcare Devices, Networking, Sensing and Control, 2007 IEEE International Conference on, London, 15–17 April 2007 Page(s):239–244 Saikat Basu, A New Parallel Window-Based Implementation of the Elliptic Curve Point Multiplication in Multi-Core Architectures, International Journal of Network Security, Vol. 13, No. 
3, 2011, Page(s):234–241 (archived here as of March 4, 2016) Christof Paar, Jan Pelzl, "Elliptic Curve Cryptosystems", Chapter 9 of "Understanding Cryptography, A Textbook for Students and Practitioners". (companion web site contains online cryptography course that covers elliptic curve cryptography), Springer, 2009. (archived here as of April 20, 2016) Luca De Feo, David Jao, Jerome Plut, Towards quantum-resistant cryptosystems from supersingular elliptic curve isogenies, Springer 2011. (archived here as of May 7, 2012) Jacques Vélu, Courbes elliptiques (...), Société Mathématique de France, 57, 1-152, Paris, 1978. External links Elliptic Curves at Stanford University Elliptic curve cryptography Public-key cryptography Finite fields
63692463
https://en.wikipedia.org/wiki/Gabriele%20Kotsis
Gabriele Kotsis
Gabriele Kotsis (born 29 October 1967, Vienna, Austria) is an Austrian computer scientist. She is full professor in computer science at Johannes Kepler University (JKU), Linz, Austria, while leading the Department of Telecommunication and the division of Cooperative Information Systems. She was vice-rector for Research and the Advancement of Women, and longstanding chairwoman of Universities Austria's Policy Committee on Research. She is a distinguished member and elected president of the Association for Computing Machinery (ACM). Early life and education Gabriele Kotsis received her master's degree in business informatics (1986–1991) at the University of Vienna and finished her doctoral studies in social and economic sciences (1992–1995) there, graduating both times with distinction. In 2000, she habilitated in Informatics at the University of Vienna. She received scientific recognition early on: her master's thesis Interconnection Topologies and Routing for Parallel Processing Systems at the University of Vienna was honored with a student sponsorship award of the Austrian Computer Society. Furthermore, her PhD dissertation Workload Modeling for Parallel Processing was honored with the prestigious Heinz Zemanek Award in 1996. Career and research Already during her doctoral studies, she took a job as a university assistant at the Department of Applied Informatics and Information Systems at the University of Vienna (1991–2001). She worked as a guest professor at the Department of Information Processing and Information Economy at the Vienna University of Economics and Business (2001–2002). In 2002, she was a guest professor at the Department of Informatics at the Copenhagen Business School in Denmark and the Department of Telecooperation at the JKU, before she was appointed as professor of informatics. 
In the same year, she was one of the co-founding chairs of the working group for professors in computer science within the Austrian Computer Society (OCG), of which she was the first female President from 2003 to 2007. In addition to her two-term presidency at OCG, Kotsis takes an active role in the Editorial board of the OCG Book Series, in the working group Fem-IT (Association of Female University Professors in IT) and in the OCG award committee. From 2007 to 2015, she served as vice-rector for Research at JKU. Her responsibilities included the development of R&D strategies and policies in the university, coordination and interaction with national and international governmental organizations and funding bodies, and the establishment of collaborations with other research organizations and business partners. Since 2016, Gabriele has been JKU's representative in the ASEA-UNINET academic research network, which promotes cooperation among European and south-East Asian public universities. Her active involvement in this network led to her nomination and election as President for the current period, February 2019 to July 2020. Kotsis is a founding member of the ACM Europe Council, serving at the council from 2008 to 2016. In 2014, she became an ACM Distinguished Member for her contributions to workload characterization for parallel and distributed systems and for the founding of ACM Europe. In 2016, she received an award in appreciation of her accomplishments regarding the ACM WomEncourage conference series. As of 2016, she is an elected Member-at-Large of the ACM council. She serves as President of ACM for the term July 1, 2020 to June 30, 2022. She is a member of the Austrian Center for Parallel Computation (ACPC), the Austrian Computer Society (OCG) and Distinguished Scientist of the Association for Computing Machinery (ACM). 
Honors and awards 2020 – ACM President 2016 – Member of the Board of Trustees, University of Klagenfurt 2016 – ACM Recognition of Service Award, in appreciation of contributions to ACM. Chair of WomEncourage 2016 2016 – IFIP EGOV-ePart 2016 Meritorious Paper Award, for the paper "Making Computers Understand Coalition and Opposition in Parliamentary Democracy" 2015 – Emerald Highly Commended Paper Award, International Journal of Pervasive Computing and Communications 2014 – ACM Distinguished Scientist 2013 – iiWAS Best Short Paper Award 2009 – iiWAS Decennial Award, in recognition of her outstanding scientific, didactic and organizational contributions to the @WAS organization and the iiWAS and MoMM conferences series 2006 (September) – Expert of the Month, Federal Ministry for Transport, Innovation and Technology 2000 – OPNET / Mil3 Distinguished Paper Award, International Conference on Application and Network Performance 1996 – Heinz Zemanek-Preis, Austrian Computer Society Award for outstanding scientific publications in computer science 1992 – OCG-Förderpreis, Austrian Computer Society Award for Diploma Theses Selected publications Her publications include: Al Zubaidi-Polli, Anna M.; Anderst-Kotsis, Gabriele (2018), "Conceptual Design of a hybrid Participatory IT supporting in-situ and ex-situ collaborative text authoring",  iiWAS, ACM, , 243–252 Steinbauer, Matthias; Anderst-Kotsis, Gabriele (2016), "DynamoGraph: extending the Pregel paradigm for large-scale temporal graph processing", in International Journal of Grid and Utility Computing (IJGUC), 7 (2), Inderscience, 141–151, ISSN 1741-847X. Bachmayer, Sabine; Lugmayr, Artur; Kotsis, Gabriele (2010), "Convergence of collaborative web approaches and interactive TV program formats", International Journal of Web Information Systems, 6 (1), pp. 74–94. Kotsis, Gabriele; Khalil-Ibrahim, Ismail (2008), "The Web Goes Mobile: Can We Keep the Pace?", Proceedings CISIS 2008 (The Second Int. Conf. 
on Complex, Intelligent and Software Intensive Systems, 4–7 March 2008, Barcelona, Spain), IEEE Computer Society, 240–246, van der Heijden, Hans; Kotsis, Gabriele; Kronsteiner, Reinhard (July 2005), "Mobile recommendation systems for decision making‚ on the go". In International Conference on Mobile Business (ICMB'05) (pp. 137–143). IEEE. Ibrahim, Ismail K.; Kronsteiner, Reinhard; Kotsis, Gabriele (2005), "A semantic solution for data integration in mixed sensor networks", Computer Communications, 28 (3), 1564-1574. Hlavacs, Helmut; Hotop, Ewald; Kotsis, Gabriele (2000), "Workload Generation in OPNET by User Behaviour Modelling", OPNETWORK 2000, Awarded with the Distinguished Paper Award Kotsis, Gabriele; Kacsuk, Peter (2000), "Distributed and Parallel Systems: From Instruction Parallelism to Cluster Computing", DAPSYS2000, Kluwer International Series in Engineering and Computer Science, 567, Kluwer Academic Publishers, Bullnheimer, Bernd; Kotsis, Gabriele; Strauß, Christine (1998), "Parallelization Strategies for the Ant System", In: De Leone R., Murli A., Pardalos P.M., Toraldo G. (eds) High Performance Algorithms and Software in Nonlinear Optimization. Applied Optimization, 24, Springer, Boston, MA Kotsis, Gabriele; Krithivasan, Kamala; Raghavan Serugudi (1997), "Generative Workload Models of Internet Traffic", Proceedings of the ICICS Conference, 1, 152–156, Singapore, IEEE A. Ferscha, G. Kotsis (1991) Eliminating Routing Overheads in Neural Network Simulation Using Chordal Ring Interconnection Topologies, Proc. of the Neuro-Nimes 91, 4th Int. Conf. on Neural Networks and their Applications, 625–638 References 1967 births Living people Austrian women computer scientists Austrian women academics University of Vienna alumni Johannes Kepler University Linz faculty Presidents of the Association for Computing Machinery
21490852
https://en.wikipedia.org/wiki/Highland%20Clearances
Highland Clearances
The Highland Clearances ("the eviction of the Gaels") were the evictions of a significant number of tenants in the Scottish Highlands and Islands, mostly from 1750 to 1860. In the first phase (from the mid-18th century), clearance resulted from agricultural improvement, driven by the need for landlords to increase their income (many landlords had crippling debts, with bankruptcy playing a large part in the history). This involved the enclosure of the open fields managed on the run rig system and the shared grazing. Especially in the North and West of the region, these were usually replaced with large-scale pastoral farms stocked with sheep, on which much higher rents were paid, with the displaced tenants getting alternative tenancies in newly created crofting communities, where they were expected to be employed in industries such as fishing, quarrying or the kelp industry. The reduction in status from farmer to crofter was one of the causes of resentment from these changes. The second phase (c.1815–20 to 1850s) involved overcrowded crofting communities from the first phase that had lost the means to support themselves, through famine and/or collapse of industries that they had relied on (such as the kelp trade), as well as the effects of continuing population growth. This is when "assisted passages" were common, when landowners paid the fares for their tenants to emigrate. Tenants who were selected for this had, in practical terms, little choice but to emigrate. The Highland Potato Famine struck towards the end of this period, giving greater urgency to the process. Agriculture in the Highlands had always been marginal, with famine a recurrent risk for pre-clearance communities. Nevertheless, population levels increased steadily through the 18th and early 19th centuries. This increase continued through nearly all of the time of the clearances, peaking in 1851, at around 300,000. 
Emigration was part of Highland history before and during the clearances, and reached its highest level after them. During the first phase of the clearances, emigration could be considered a form of resistance to the loss of status being imposed by a landlord's social engineering. The eviction of tenants went against dùthchas, the principle that clan members had an inalienable right to rent land in the clan territory. This was never recognised in Scottish law. It was gradually abandoned by clan chiefs as they began to think of themselves simply as commercial landlords, rather than as patriarchs of their people—a process that arguably started with the Statutes of Iona of 1609. The clan members continued to rely on dùthchas. This difference in viewpoints was an inevitable source of grievance. The actions of landlords varied. Some did try to delay or limit evictions, often to their financial cost. The Countess of Sutherland genuinely believed her plans were advantageous for those resettled in crofting communities and could not understand why tenants complained. However, a few landlords displayed complete lack of concern for evicted tenants. The clearances continue to be debated between historians and others who approach the subject with different analysis of sources and determination of causes. As such, there are significant differences between the understanding of the Highland clearances held by historians and the popular view of these events. Furthermore, the historical analysis is complicated by political views, including an ongoing debate about who should possess and exercise control over land in Scotland. The clearances were condemned by many writers at the time, and in the late 19th century they were invoked in opposition to the enormous power of landlords under Scottish law and calls for land reform related to crofting, notably in Alexander Mackenzie's 1883 History of the Highland Clearances. 
The effects of the clearances were evoked in fictional works by authors including Neil M. Gunn and Fionn MacColla in the 1930s and 1940s. The subject was largely ignored by academic historians until the publication of a best-selling history book by John Prebble in 1963 attracted worldwide attention to his view that Highlanders had been forced into tragic exile by their former chieftains turned brutal landlords. Though some historians have disputed this work as an over-simplification, other authors went further and stated that it promoted misconceptions that the clearances were equivalent to genocide or ethnic cleansing and/or that British authorities in London played a major, persistent role in carrying them out. In particular, popular remembrance of the Highland clearances is sometimes intertwined with the comparatively short-lived reprisals that followed the failed Jacobite rebellion of 1745. However, a large body of thoroughly researched academic work now exists on the subject, differing significantly from the accounts of Prebble and his successors. Importantly, as there are so many historical works now focused on the Highlands, there is even an argument that the balance of journals and books in all Scottish history is now excessively tilted toward the Highlands, despite the clearances that took place in the Lowlands as well. Definition The definition of "clearance" (as it relates to the Highland Clearances) is debatable. The term was not in common use during much of the clearances; landowners, their factors and other estate staff tended, until the 1840s, to use the word "removal" to refer to the eviction of tenants. However, by 1843, "clearance" had become a general (and derogatory) word to describe the activities of Highland landlords. Its use was ambiguous, as for some it meant only the displacement of large numbers of people from a single place at one time. For others, the eviction of a single tenant at the end of a lease could be termed "clearance". 
Eric Richards suggests that current usage is broad, meaning "any displacement of occupiers (even of sheep) by Highland landlords". He adds that it can apply to both large and small evictions, and includes voluntary or forced removal and instances involving either emigration or resettlement nearby. T. M. Devine also takes the view that "clearance" has a broader meaning now than when it was used in the 19th century. Economic and social context Agricultural Revolution The first phase of the Highland Clearances was part of the Scottish Agricultural Revolution but happened later than the same process in the Scottish Lowlands. Scottish agriculture in general modernised much more rapidly than in England and, to a large extent, elsewhere in Europe. The growing cities of the Industrial Revolution presented an increased demand for food; land came to be seen as an asset to meet this need, and as a source of profit, rather than a means of support for its resident population. Before improvement, Highland agriculture was based on run rig arable areas and common land for grazing. Those working in this system lived in townships or bailtean. Under the run rig system, the open fields were divided into equivalent parts and these were allocated, once a year, to each of the occupiers, who then worked their land individually. With no individual leases or ownership of plots of land, there was little incentive to improve it (for instance by drainage or crop rotation systems). Nor, with common grazing, could an individual owner improve the quality of his stock. Enclosure of the common lands and the run rig fields was a method of improvement. More commonly, there was a greater change in land use: the replacement of mixed farming (in which cattle provided a cash crop) with large-scale sheep farming. This involved displacement of the population to crofts on the same estate, other land in the Highlands, the industrial cities of Scotland or other countries. 
The common view is that the shepherds employed to manage these flocks were from outside the Highlands. This is an oversimplification, as Gaelic-speaking tacksmen and drovers were to be found in the sheep trade from the 1780s. When sheep were introduced in the Sutherland Clearances, over half the leases were taken up by Sutherlanders. Clanship Since their origin in the early Middle Ages, clans were the major social unit of the Highlands. They were headed by a clan chief, with members of his family taking positions of authority under him. The mechanisms of clanship gave protection and agricultural land to the clansmen, who in return paid with service and rent which was paid, especially in earlier periods, mostly in kind (as opposed to money). Service included military service when required. The Highlands was one of the parts of Scotland where law and order were not maintained by central government, hence the need for protection from a powerful leader. Clan leaders controlled the agricultural land, with its distribution generally being achieved through leases to tacksmen, who sublet to the peasant farmers. The basic farming unit was the baile or township, consisting of a few (anything from 4 to 20 or more) families working arable land on the run rig management system, and grazing livestock on common land. Clans provided an effective business model for running the trade in black cattle: the clan gentry managed the collection of those beasts ready for sale and negotiated a price with lowland drovers for all the stock produced on the clan lands. The sale proceeds were offset against the rentals of the individual producers. The growth in the trade in cattle demonstrates the ability of pre-clearance Highland society to adapt to and exploit market opportunities—making clear that this was not an immutable social system. James VI was one of the kings who sought to impose control on the Highlands. On becoming James I of England in 1603, he gained the military force to do this. 
The Statutes of Iona controlled some key aspects; this forced the heirs of the wealthier Highlanders to be educated in the Lowlands and required clan chiefs to appear annually in front of the Privy Council in Edinburgh. This exposed the top layer of Highland society to the costs of living in Edinburgh in a manner fitting to their status. Unlike their Lowland counterparts, their lands were less productive and were not well integrated into the money economy. Large financial sureties were taken from clan leaders to guarantee the good behaviour of the clan. Overall, this reduced the need for the protection provided by a clan whilst increasing the costs for the clan leaders. The clan chiefs who fully subscribed to this new system of regulation were rewarded with charters that formalised their ownership of clan lands. The combination of these initiated the demise of clanship. The process continued as clan chiefs began to think of themselves as landlords, rather than as patriarchs of their people. The various intervals of warfare since the Statutes of Iona reined in the steady transition to landlordism because the ability to raise a band of fighting men at short notice became important again. The civil war that started in 1638 reinvigorated the military aspects. The restoration of Charles II in 1660 brought peace, but also increased taxes, restarting the financial pressure. The succession of Jacobite rebellions emphasised again the martial aspects of clanship, but the defeat at Culloden brought an end to any willingness to go to war again. The loss of heritable jurisdictions across Scotland highlighted the changed role of clan chiefs. Elimination of the tacksman A tacksman (a member of the daoine uaisle, sometimes described as "gentry" in English) was the holder of a lease or "tack" from the landowner, subletting the land to lesser tenants. They were often related to the landowner, even if only distantly. 
They acted as the middle stratum of pre-clearance society, with a significant role in managing the Highland economy. They were the first sector of Gaelic society to feel the effect of the social and economic changes that included the Clearances, when landlords restricted their power to sub-let, so increasing the rental income directly to the laird; simple rent increases were also applied. This was part of a slow phasing out of this role; it accelerated from the 1770s, and by the next century, tacksmen were a minor component of society. T. M. Devine describes "the displacement of this class as one of the clearest demonstrations of the death of the old Gaelic society." Many emigrated to America, in the words of Eric Richards: "often cocking a snook at the landlords as they departed". Emigrating tacksmen, and the larger farmers who departed at the same time, represented not only a flight of capital from Gaeldom but also a loss of entrepreneurial energy. In the opinion of T. M. Devine, tacksmen and the middle-ranked tenant farmers represented the economic backbone of the peasant communities of the Western Highlands. Devine repeats the views of Marianne McLean, that those of them who emigrated were not refusing to participate in a commercial economy; rather they rejected the loss of status that the changes of improvement gave them. Phases of the Clearances The first phase of the Clearances occurred mostly over the period 1760 to 1815. However, it started before the Jacobite rebellion of 1745, with its roots in the decision of the Dukes of Argyll to put tacks (or leases) of farms and townships up for auction. This began with Campbell property in Kintyre in the 1710s and spread after 1737 to all their holdings. First phase clearances involved the break-up of the traditional townships (bailtean), the essential element of land management in Scottish Gaeldom. These multiple tenant farms were most often managed by tacksmen. 
To replace this system, individual arable smallholdings or crofts were created, with shared access to common grazing. This process was often accompanied by moving the people from the interior straths and glens to the coast, where they were expected to find employment in, for example, the kelp or fishing industries. The properties they had formerly occupied were then converted into large sheep holdings. Essentially, therefore, this phase was characterised by relocation rather than outright expulsion. The second phase of clearance started in 1815–20, continuing to the 1850s. It followed the collapse or stagnation of the wartime industries and the continuing rise in population. These economic effects are illustrated by the contemporary commodity prices. Kelp had been falling since 1810; in 1823 the market price in Liverpool was £9 a ton, but it fell to £3 13s 4d a ton in 1828, 41% of the 1823 price. Wool prices also shrank over a similar period to a quarter of the price obtained in 1818, and black cattle nearly halved in price between 1810 and the 1830s. In the second phase, landlords moved to the more draconian policy of expelling people from their estates. This was increasingly associated with 'assisted emigration', in which landlords cancelled rent arrears and paid the passage of the 'redundant' families in their estates to North America and, in later years, also to Australia. The process reached a climax during the Highland Potato Famine of 1846–55. Regional differences In general terms, the transformation of the Highlands resulted in two different types of rural economy. In the southern and eastern part of the region, as land was enclosed, it was let to fewer tenants, with larger individual holdings. These larger units employed farm servants and labourers and also provided work for cottars and crofters. This workforce included former tenants from the old system. 
Whilst there were large pastoral farms, there were also mixed and arable farms, both of which needed labour. The population of the south and east Highlands only grew slightly from 1755 to 1841. This is explained by migration to the accessible Lowlands to find work and the relative unavailability of small tenancies. This gave this part of the Highlands some similarities to the Lowland clearances. Together with the better climate of the southern and eastern Highlands, the more diverse agricultural system gave a reasonable level of prosperity to the area. Agricultural change in the Hebrides and the western coastal areas north of Fort William produced a different economic and social structure. This area is termed the "crofting region"; crofting communities became the dominant social system here, as land was enclosed and the run rig management of the multi-tenant townships was replaced. The major part of the land was given over to large-scale pastoral sheep farming. This provided few jobs, compared to the arable and mixed farms in the south and east Highlands. The main industries intended for those displaced to crofting communities were fishing and kelp. Initially, this seemed, to the landlords and their advisors, an ideal way of providing profitable employment for those made redundant by competition for farm leases by the higher-rent-paying sheep farms. Over time, crofts were subdivided, allowing more tenants to live on them (but with less land per person). Crofting communities had a high proportion of cottars—those with the least access to land and without any formal lease to document what they did hold. Population growth was rapid, due to both subdivision and the lower rate of migration to the Lowlands. When the kelp market collapsed a few years after the end of the Napoleonic Wars, the deficiency of the crofting model was exposed: overcrowded communities with limited or no ability to grow enough food for subsistence and now without the industry on which their community relied.
This is the area that was most reliant on the potato, and therefore severely hit by the Highland Potato Famine. The census of 1841 recorded 167,283 people living in the crofting region (as per T. M. Devine's definition of the term), whilst the "farming" south and east Highlands contained 121,224 people.

Causes

Different landowners decided to introduce the improvements that required clearance at different times and for different reasons. The common drivers of clearance are as follows:

Economic changes

Replacement of the old-style peasant farming with a small number of well-capitalised sheep farmers allowed land to be let at much higher rents. It also had the advantage, for the landowner, that there were fewer tenants to collect rent from, thus reducing the administrative burden of the estate. In some areas, land remained in arable use after clearance but was farmed with more intensive modern methods. Some of the earliest clearances had been to introduce large-scale cattle production. Some later clearances replaced agriculture with sporting estates stocked with deer. There were instances of an estate being first cleared for sheep and later being cleared again for deer. The major transition, however, was to pastoral agriculture based on sheep. The most productive sheep breed was the Cheviot, whose owners could pay twice as much rent as if they had stocked Blackfaces. The Cheviot's disadvantage was that it was less hardy and needed low-level land on which to overwinter. This was usually the old arable land of the evicted population, so the choice of sheep breed dictated the totality of clearance in any particular Highland location.

Social engineering

Some of those carrying out clearances believed that this was for the benefit of those affected. Patrick Sellar, the factor (agent) of the Countess of Sutherland, was descended from a paternal grandfather who had been a cottar in Banffshire and had been cleared by an improving landlord.
For the Sellars, this initiated a process of upward mobility (Patrick Sellar was a lawyer and a graduate of Edinburgh University), which Sellar took to be a moral tale that demonstrated the benefits to those forced to make a new start after eviction. The provision of new accommodation for cleared tenants was often part of a planned piece of social engineering; a large example of this was the Sutherland Clearances, in which farming tenants in the interior were moved to crofts in coastal regions. The intent was that the land allotted to them would not be enough to provide all of their needs, and they would need to seek employment in industries like fishing or as seasonal itinerant farm labourers. The loss of status from tenant farmer to crofter was one of the reasons for the resentment of the Clearances. The planned acts of social engineering needed investment. This money often originated from fortunes earned outside Scotland, whether from the great wealth of Sir James Matheson (the second son of a Sutherland tacksman, who returned from the Far East with a spectacular fortune), the more ordinary profits from Empire of other returning Scots, or Lowland or English industrialists attracted by lower land values in the Highlands. Large amounts of capital were used to start industrial and commercial enterprises or build infrastructure like roads, bridges and harbours, but the return on this capital was very low by contemporary standards. This wasted investment is described by Eric Richards as "a loss to the national economy to be set beside any gains to be tallied." Some of this expenditure was used to build new towns, such as Bettyhill, which received tenants cleared from Strathnaver. This displacement has been compared to the movement of Glaswegians to Castlemilk in the 1950s—with a similar distance from the original settlement and a comparable level of overall failure of the project to produce the anticipated social benefits. 
In the second phase of the clearances, when population reduction was the primary intention, the actions of landlords can be viewed as the crudest type of social engineering with a very limited understanding of the likely consequences.

Failure of the kelp industry

The kelp trade was badly affected by the end of the Napoleonic Wars in 1815 and had collapsed totally by 1820. Kelp (or seaweed) was harvested from the seashore at low tide, dried and burnt to yield an alkali extract used in the manufacture of soap and glass. It was a very labour-intensive industry. Production had steadily grown from the 1730s to a peak level in 1810, and was mostly located in the Hebrides. The end of the war reintroduced competition from Spanish barilla, a cheaper and richer product. This, combined with the reduction of duty on the foreign import, and the discovery that cheaper alkali could be extracted from common salt, destroyed the seasonal employment of an estimated 25 to 40 thousand crofters. There was little prospect of alternative employment; the only possibility was fishing, which was also in decline at the same time. The overall population of the Western Isles had grown by 80 per cent between 1755 and 1821. The economic collapse of an industry that was a major employer in a greatly over-populated region had an inevitable result. Not only did the level of poverty increase in the general population, but many landlords, failing to make prompt adjustments to their catastrophic fall in income, descended into debt and bankruptcy.

Famine

The Highlands, as an agriculturally marginal area, was the last part of mainland Britain to remain at risk of famine, with notable instances before the 19th century in 1680, 1688, the 1690s, 1740–1, 1756 and 1782–3. The history of the trade in meal suggests that the region balanced its imports of meal with exports of cattle, giving it a reliance on trade for survival that was greater than anywhere else in Britain.
There was near-contemporaneous dispute as to the severity of famines in the pre-clearance Highlands: in 1845, the Sutherland estate management argued over the level of famine relief that had been needed in the past, including this opinion: "The cattle on Sutherland were that Spring dying from scarcity of provender... and this is the condition to which your morbid Philanthropists of the present day refer as the days of comfort for the wretched Highlanders." (11 June 1845 letter to James Loch). Even accepting the level of debate on the subject among historians and the incomplete body of evidence, there is a clear case that, for example, pre-clearance Strathnaver (in Sutherland) experienced recurrent famine in a society operating at the margin of subsistence. Crofting communities became more common in the early part of the 19th century. Particularly in the West Highlands and the Isles, the residents of these small agricultural plots were reliant on potatoes for at least three quarters of their diet. Until 1750, potatoes had been relatively uncommon in the Highlands. With a crop yield four times higher than oats, they became an integral part of crofting. After partial crop failures in 1836 and 1837, a severe outbreak of potato blight arrived in Scotland in 1846. Blight continued to seriously affect the Highland potato crop until about 1856. This was a famine of much greater scale and duration than anything previously experienced. By the end of 1846, the north-west Highlands and the Hebrides had serious food shortages, with an estimated three quarters of the population having nothing to eat. The Highland Potato Famine started a year after potato blight had first struck Ireland. The knowledge of the Irish catastrophe helped mobilise a response to the Highland crisis, with government action, the establishment of a large charitable fund (the Central Board for Highland Destitution) and much more responsible landlord behaviour than was seen in Ireland.
The richer landlords, such as the Duke of Sutherland, were able to fund their own famine relief for their tenants. Some, already overstretched by large debts, were bankrupted by providing the necessary relief. The landlord of most of Islay, Walter Frederick Campbell, was a spectacular example. Another whose benevolence during the crisis brought bankruptcy was Norman Macleod of Macleod, owner of one of the two major estates in Skye. Conversely, some landlords were criticised for using the voluntarily raised relief funds to avoid supporting their tenants through the crisis. A few were recipients of strongly critical letters from senior civil servants, with threats that the government would recover the cost of famine relief from those who could provide it, but chose not to. Clearance and emigration were an integral part of the Highland potato famine; the length and severity of the crisis seemed to leave little alternative. The choice faced by the government was between indefinitely continuing with charitable efforts and public works, or removing the excess population permanently. Rumours circulated, from 1849, that the government planned to introduce an 'able-bodied Poor Law', so formally putting the potentially crippling burden of famine relief on each parish (and hence on the landlord); the Central Board made clear that they would wind up their relief effort in 1850. The new Highland landowner class (who had bought financially failing estates) and the remaining wealthier hereditary landlords had the funds to support emigration of their destitute tenants. The result was that almost 11,000 people were provided with "assisted passages" by their landlords between 1846 and 1856, with the greatest number travelling in 1851. A further 5,000 emigrated to Australia, through the Highland and Island Emigration Society. 
To this should be added an unknown, but significant, number who paid their own fares to emigrate, and a further unknown number assisted by the Colonial Land and Emigration commission.

Landlord debt

Many Highland landlords were in debt, despite rising commodity prices and the associated farm incomes which allowed higher rents to be charged. The origin of some of this debt went back to the Statutes of Iona, when some lairds were forced to live part of the year in Edinburgh, which was much more expensive than living on their own lands. Profligate spending was a significant cause. The costs of involvement in political activity were a factor for some. The landed classes of the Highlands socialised with southern landowners, who had more diverse sources of income, such as mineral royalties and windfall income from urban expansion. The low productivity of Highland lands made this a financial trap for their owners. In other cases, spending on famine relief depleted the financial resources of landowners—so even the prudent and responsible could ultimately be forced to increase the income from their estates. Lastly, investments in an estate, whether on roads, drainage, enclosure or other improvements, might not realise the anticipated returns. The major financial pressure, though, was the end of the Napoleonic Wars, which had supported high prices for the small range of commodities produced in the Highlands. The extent of indebtedness among Highland landowners was enormous. The evidence of this is the very high number of hereditary lands that were sold, especially in the first half of the 19th century. Over two-thirds of Highland estates had changed hands in this way by the end of the 1850s. Eric Richards describes this as a "financial suicide" by an entire class of people. Debt was not a new problem for Highland landowners in the 19th century—it had been equally prevalent in the 17th and 18th centuries. The change was in the lender.
The further development of the banking system at the beginning of the 19th century meant that landowners did not need to look to family members or neighbours as a source of finance. The downside to this was a greater readiness of the lender to foreclose—and an increased willingness to lend in the first place, perhaps unwisely. Debt had three possible consequences, all of which were likely to involve the eviction of tenants. The landlord could try to avoid bankruptcy by introducing immediate improvements, putting up rents and clearing tenants to allow higher-paying sheep farmers to be installed. Alternatively, the estate could be sold to wipe out the debts. A new owner was highly likely to have plans for improvement which would include clearance. They also had the money to fund assisted passages for cleared tenants to emigrate, so putting into practice ideas suggested in the 1820s and 1830s. As most purchasers were from outside the Highlands or from England, they neither understood nor followed traditional Gaelic principles, so removing a potential level of protection for tenants. Finally, the landlord might enter bankruptcy, with the estate passing into the hands of administrators whose legal obligation was to protect the financial interests of the creditors. This last case was often the worst outcome for tenants, with any considerations of them having no relevance whatsoever under the law.

Overpopulation

The 18th century was a time of population growth, almost continuous from the 1770s onwards. This was not initially seen as a problem by landlords, as people were considered to be an asset—both to provide a pool for military recruitment and as an economic resource. Landowners and the government sought to discourage emigration, an attitude that resulted in the Passenger Vessels Act 1803, which was intended to limit the ability of people to emigrate. The role of the Highlands in providing a source of recruitment for the army and navy was, in the words of T. M.
Devine, "quite remarkable". Starting in the Seven Years' War (1756–63) and increasing during the American War of Independence, by the time of the Napoleonic Wars, one estimate put the Highland contribution to regiments of the line, militia, Fencibles and Volunteers at 74,000. This was out of a population of about 300,000. Even allowing for this estimate overstating the case, in time of war, the Highlands was seen as a significant recruiting resource. The attitude towards increasing population altered in the first half of the 19th century. First, the kelp trade collapsed in the years immediately following the end of the Napoleonic Wars in 1815. Those working in the kelp trade were crofters, with not enough land to make a living, or cottars, the very poorest in society with the least access to land on which to grow food. With no alternative employment available, destitution was inevitable. The landlords (or in some cases the trustees of their bankrupt estates) no longer tried to retain their tenants on their land, either encouraging or assisting emigration, or, in the more desperate circumstances, virtually compelling those in substantial rent arrears to accept an assisted passage (i.e., to emigrate), with the alternative of simple eviction. The potato famine followed shortly after the collapse of the kelp industry. Faced with a severe famine, the government made it clear to any reluctant landlords that they had the primary responsibility of feeding their destitute tenants, whether through employment in public works or estate improvement, or simply by the provision of famine relief. The threat of full application, and possible reform, of the Poor Laws (which would have had the effect of formalising the obligation to feed all the destitute in each parish) was the final impetus to the various assisted emigration measures. In the decades following 1815, the ideological and political consensus changed.
Surplus population slowly came to be thought of as a liability; their need to be fed could not be ignored in a philanthropic age. Therefore, large-scale expatriation was considered a solution to the social crisis in the Highlands. The ideas of Malthus were adopted by many in a position to influence policy. The Passenger Vessels Act was repealed in 1827, and in 1841 a select committee of the House of Commons concluded that the crofting parishes had a surplus population of 45,000 to 60,000.

Discrimination

The primary motivation for clearance was economic. Associated with this was the suggestion by some theorists that the Celtic population were less hardworking than those of Anglo-Saxon stock (i.e. Lowlanders and, in some instances, English), so giving an economic element to a racial theory. James Hunter quotes a contemporary Lowland newspaper: "Ethnologically the Celtic race is an inferior one and, attempt to disguise it as we may, there is ... no getting rid of the great cosmical fact that it is destined to give way ... before the higher capabilities of the Anglo-Saxon." These views were held by people like Patrick Sellar, the factor employed by the Countess of Sutherland to put her plans into effect, who often wrote of his support for these ideas, and Sir Charles Trevelyan, the senior government representative in organising famine relief during the Highland Potato Famine. (Trevelyan regarded himself as a "reformed Celt", having a Cornish Celtic heritage.) There is little doubt that racism against the Gael formed some part of the story. Roman Catholics had experienced a sequence of discriminatory laws in the period up to 1708. Whilst English versions of these laws were repealed in 1778, in Scotland this did not happen until 1793. However, religious discrimination is not considered, by some historians, to be a reason for evicting tenants as part of any clearance, and is seen more as a source of voluntary emigration by writers such as Eric Richards.
There is one clear (and possibly solitary) case of harassment of Catholics resulting in eviction, by Colin MacDonald of Boisdale (a recent convert to Presbyterianism). The harassment temporarily stalled when voluntary emigration to escape persecution made apparent the risk of empty farms (and therefore loss of rent). However, in 1771, thirty-six families did not have their leases renewed (out of some 300 families who were tenants of Boisdale); 11 of these emigrated the next year with financial assistance from the Roman Catholic church.

Individual events

Year of the sheep

In 1792 tenant farmers from Strathrusdale led a protest by driving more than 6,000 sheep off the land surrounding Ardross. It was the first large protest over clearances. It started when land was let to sheep farmers who repeatedly impounded the cattle of neighbouring tenants, alleging that they had strayed onto the sheep grazing. Eventually the owners of the cattle obtained help from tenants of a nearby estate and recovered their stock after an angry confrontation. Emboldened by this success, a few days later, at a wedding celebration, a plan was developed to round up all the sheep and drive them south over the River Beauly. It was put into effect in a highly organised manner, gathering all the sheep in the area (except those of Donald Macleod of Geanies, the Sheriff Depute of Ross, perhaps out of fear of or respect for him). The first flocks were gathered on 31 July and by 4 August many thousand sheep had been collected. The protest was seen by local landowners and law officers in the context of revolution in France, and caused alarm. As Sheriff Depute, Donald Macleod reported events to the Lord Advocate, Robert Dundas, in Edinburgh, asking for military help. The request was forwarded to London, where Henry Dundas (Robert's uncle) gave orders for the Black Watch to be sent north to assist.
Until they arrived, Macleod avoided the risk of any action against the protestors that might not be successful. In the early hours of 5 August, Macleod, a large party of the gentry from the region, and three companies of soldiers moved to where the sheep had been gathered. Most of the protestors fled, but eight were captured in the vicinity and a further four were taken in their homes. The protest rapidly faded away. Eight of the captives were tried on charges of assaulting the sheep farmers who had impounded their cattle, but were acquitted on the grounds that the violence was self defence. Seven of the prisoners were tried on charges of insurrection and found guilty. The penalties, at a time when the death penalty was frequently used, were mild. The most severe was seven years' transportation to Australia for two of the protestors. This was not carried out, because they escaped from the tolbooth prison at Inverness. No serious attempt was made to track the escapees down.

Examples of individual clearances

Two of the best documented clearances are those from the land of the Duchess of Sutherland, carried out by, among other people, her factor Patrick Sellar, and the Glencalvie clearances, which were witnessed and documented by a London Times reporter.

The Sutherland Clearances

The Sutherland estate was inherited by Elizabeth Sutherland when she was one year old. It consisted of about half of the county of Sutherland, and purchases between 1812 and 1816 increased it to around 63%, as measured by rental value. On 4 September 1785, at the age of 20, Lady Sutherland married George Granville Leveson-Gower, Viscount Trentham, who was known as Earl Gower from 1786 until he succeeded to his father's title of Marquess of Stafford in 1803. In 1832, just six months before he died, he was created Duke of Sutherland and she became known as Duchess-Countess of Sutherland.
When Lady Sutherland inherited the estate, there were many wadsets (a type of mortgage) on much of the land; like many Highland estates, it had substantial debts. Some removals were made in 1772 while Lady Sutherland was still a child and the estate was managed by her tutors. They tried to dislodge many of the tacksmen on the estate. Many tenants had emigrated, and new fishing villages were planned to provide employment for tenants moved from the interior. But these plans did not proceed because the estate was short of money. In 1803 Leveson-Gower inherited the huge fortune of the Duke of Bridgewater, and the estate now had the money for improvements. Many of the estate's leases did not end until 1807, but planning was started to restructure the estate. Despite the conventions of the day and the provisions of the entailment on Lady Sutherland's inheritance, Leveson-Gower delegated overall control of the estate to his wife; she took an active interest in its management. As the major part of the Sutherland Clearances began, Lady Sutherland and her advisors were influenced by several things. First, the population was increasing. Second, the area was prone to famine; and it fell to the landlord to organise relief by buying meal and importing it into the area. How bad the famine was is debated, both among modern historians and also within the Sutherland Estate management soon after the clearances in 1845. The third driving force was the whole range of thinking on agricultural improvement. This took in economic ideas expressed by Adam Smith as well as those of many agriculturalists. For the Highlands, the main thrust of these theories was the much greater rental return to be obtained from sheep. Wool prices had increased faster than other commodities since the 1780s. This enabled sheep farmers to pay substantially higher rents than the current tenants. 
Patrick Sellar

Now that capital funding was available, the first big sheep farm was let at Lairg in 1807, involving the removal of about 300 people. Many of these did not accept their new homes and emigrated, to the dissatisfaction of the estate management and Lady Sutherland. In 1809, William Young and Patrick Sellar arrived in Sutherland and made contact with the Sutherland family, becoming key advisors to the owners of the estate. They offered ambitious plans which matched the wish for rapid results. Lady Sutherland had already dismissed the estate's factor, David Campbell, in 1807 for lack of progress. His replacement, Cosmo Falconer, found his position undermined by the advice offered by Young and Sellar. In August 1810 Falconer agreed to leave, with effect from 2 June 1811, and Young and Sellar took over in his place. Young had a proven track record of agricultural improvement in Moray, and Sellar was a lawyer educated at Edinburgh University; both were fully versed in the modern ideas of Adam Smith. They provided an extra level of ambition for the estate. New industries were added to the plans, to employ the resettled population. A coal mine was sunk at Brora, and fishing villages were built to exploit the herring shoals off the coast. Other ideas were tanning, flax, salt and brick manufacturing. The first clearances under the factorship of Young and Sellar were in Assynt in 1812, under the direction of Sellar, establishing large sheep farms and resettling the old tenants on the coast. Sellar had the assistance of the local tacksmen in this and the process was conducted without unrest—despite the unpopularity of events. However, in 1813, planned clearances in the Strath of Kildonan were accompanied by riots: an angry mob drove prospective sheep farmers out of the valley when they came to view the land, and a situation of confrontation existed for more than six weeks, with Sellar failing to successfully negotiate with the protesters.
Ultimately, the army was called out and the estate made concessions such as paying very favourable prices for the cattle of those being cleared. This was assisted by landlords in surrounding districts taking in some of those displaced and an organised party emigrating to Canada. The whole process was a severe shock to Lady Sutherland and her advisers, who were, in the words of historian Eric Richards, "genuinely astonished at this response to plans which they regarded as wise and benevolent". Further clearances were scheduled in Strathnaver starting at Whitsun, 1814. These were complicated by Sellar having successfully bid for the lease of one of the new sheep farms on land that it was now his responsibility, as factor, to clear. (Overall, this clearance was part of the removal of 430 families from Strathnaver and Brora in 1814—an estimated 2,000 people.) Sellar had also made an enemy of the local law officer, Robert Mackid, by catching him poaching on the Sutherlands' land. There was some confusion among the tenants, as Sellar made concessions to some of them, allowing them to stay in their properties a little longer. Some tenants moved in advance of the date in their eviction notice—others stayed until the eviction parties arrived. As was normal practice, the roof timbers of cleared houses were destroyed to prevent re-occupation after the eviction party had left. On 13 June 1814, this was done by burning in the case of Badinloskin, the house occupied by William Chisholm. Accounts vary, but it is possible that his elderly and bedridden mother-in-law was still in the house when it was set on fire. In James Hunter's understanding of events, Sellar ordered her to be immediately carried out as soon as he realised what was happening. The old lady died six days later. Eric Richards suggests that the old woman was carried to an outbuilding before the house was destroyed.
Whatever the facts of the matter, Sellar was charged with culpable homicide and arson, in respect of this incident and others during this clearance. The charges were brought by Robert Mackid, driven by the enmity he held for Sellar for catching him poaching. As the trial approached, the Sutherland estate was reluctant to assist Sellar in his defence, distancing itself from its employee. He was acquitted of all charges at his trial in 1816. The estate was hugely relieved, taking this as a justification of its clearance activity. (Robert Mackid became a ruined man and had to leave the county, providing Sellar with a grovelling letter of apology and confession.) Despite the acquittal, this event, and Sellar's role in it, was fixed in the popular view of the Sutherland Clearances. James Loch, the Stafford estate commissioner, was now taking a greater interest in the northern part of his employer's holdings; he thought Young's financial management was incompetent, and Sellar's actions among the people deeply concerning. Both Sellar and William Young soon left their management posts with the Sutherland estate (though Sellar remained as a major tenant). Loch, nevertheless, also subscribed to the theory that clearance was beneficial for the tenants as much as for the estate. Lady Sutherland's displeasure with events was compounded by critical reports in a minor London newspaper, the Military Register, from April 1815. These were soon carried in larger newspapers. They originated from Alexander Sutherland, who, together with his brother John Sutherland of Sciberscross, opposed clearance. Alexander, after serving as a captain in the army, had been thwarted in his hopes to take up leases on the Sutherland estate and now worked as a journalist in London. He was therefore well placed to cause trouble for the estate.

James Loch

The (effective) dismissal of Sellar placed him in the role of scapegoat, thereby preventing a proper critical analysis of the estate's policies.
Clearances continued under the factorship of Frances Suther and the overall control of James Loch. Through 1816 and 1817, famine conditions affected most of the inland areas and the estate had to provide relief to those who were destitute. This altered policy on emigration: if tenants wanted to emigrate, the estate would not object, but there was still no active encouragement. In 1818 the largest part of the clearance program was put into effect, lasting until 1820. Loch gave emphatic instructions intended to avoid another public relations disaster: rent arrears could be excused for those who co-operated, time was to be taken and rents for the new crofts were to be set as low as possible. The process did not start well. The Reverend David Mackenzie of Kildonan wrote to Loch on behalf of the 220 families due to be cleared from his parish. He categorically challenged the basic premise of the clearance: that the people from an inland region could make a living on their new coastal crofts. Loch was adamant that the removals would go ahead regardless of objections. Yet, at the same time, Suther and the local ground officer of the estate were pointing out to Loch that few of the new crofts were of an acceptable quality. Some tenants were considering moving off the estate, either to Caithness or emigrating to America or the Cape of Good Hope, which Suther encouraged by writing off their rent arrears. More positively for those with eviction notices, cattle prices were high in 1818. Ultimately, that year's clearances passed without serious protest. Over the next two years the scale of clearance increased: 425 families (about 2,000 people) in 1819 and 522 families in 1820. Loch was anxious to move quickly, whilst cattle prices were high and there was a good demand for leases of sheep farms. There was no violent resistance in 1819, but Suther, despite precise instructions to the contrary, used fire to destroy cleared houses. 
This came after a spell of dry weather, in which the turf and stone walls of the houses had dried out, so that even the turf in the walls ignited, adding to the blaze of the thatch and roof timbers. Multiplied over the large number of properties that were cleared, this made a horrific impression on those who observed it. The public relations disaster that Loch had wished to avoid now followed, with The Observer newspaper running the headline: "the Devastation of Sutherland". 1819 became known as "the year of the burnings". In the autumn of 1819, the Sutherland Estate management received reports of growing hostility to further clearances. The Sutherland family received anonymous threatening letters at their house in London. The Transatlantic Emigration Society provided a focus for resistance to the clearances planned in 1820, holding large meetings and conducting extensive correspondence with newspapers about the situation of Sutherland tenants. This publicity caused great concern to Loch, and the comment in the press increased as Whitsun 1820 approached. Lady Sutherland felt that her family was being particularly targeted by critics of the clearances, so she asked Loch to find out what neighbouring estates had done. The answer was that Lord Moray in Ross-shire had, on occasion, bought the cattle owned by evicted tenants, but otherwise had made no provision for them: they had simply been evicted with no compensation or alternative tenancies offered. The tenants of Munro of Novar were also simply evicted, with many of them emigrating. As the 1820 Sutherland clearances approached, there was notable rioting at Culrain on the Munro of Novar estate, in protest at that estate's clearance plans. Loch worried that this would spread to the Sutherland tenants, but no violent physical resistance occurred, with those cleared demonstrating (in the words of Eric Richards) "sullen acquiescence". In June there was serious resistance to clearance in another nearby estate, at Gruids.
Richards attributes the lack of violence in the Sutherland Estate to the resettlement arrangements in place there, stating: "In this sense the Sutherland estate was, despite its reputation, in strong and positive contrast to most other clearing proprietors." 1819 and 1820 represented the main clearance activity on the Sutherland Estate. The much smaller clearance in the spring of 1821 at Achness and Ascoilmore met with obstruction and the military had to be called in to carry out evictions by force. Complaints were made against the estate of cruelty and negligence, but an internal enquiry absolved the factor of any wrongdoing. However, it is highly likely that this conclusion glossed over the suffering experienced by those evicted. Figures gathered by the estate give some information on where tenants, sub-tenants and squatters went after the evictions in 1819. For tenants, 68% became tenants elsewhere on the estate, 7% went to neighbouring estates, 21% to adjoining counties and 2% emigrated. The remaining 2% were unaccounted for. Of the sub-tenants and squatters, 73% resettled on the coast, 7% moved to neighbouring estates, 13% to nearby counties and 5% emigrated; the remaining 2% were unaccounted for. This survey does not pick up information on those who subsequently travelled elsewhere. Loch issued instructions to Suther at the end of 1821 that brought the major clearance activity of the estate to an end. Some small-scale clearance activity continued for the next 20 years or so, but this was not part of the overall plan to resettle the population in coastal settlements and engage them in alternative industries.
Glengarry
The flamboyant Alexander Ranaldson MacDonell of Glengarry portrayed himself as the last genuine specimen of the true Highland chief while his tenants (almost all Catholic) were subjected to a relentless process of eviction.
He abandoned his disbanded regiment; its Catholic chaplain (and later bishop), Alexander Macdonell, led the men and their families to settle in Glengarry County, eastern Ontario, Canada.
Resistance
It has frequently been asserted that Gaels reacted to the Clearances with apathy and a near-total absence of active resistance from the crofting population. However, upon closer examination this view is at best an oversimplification. While troops intervened on at least 10 occasions, Richards notes that not a single death has been directly attributed to the physical act of the clearances themselves. Michael Lynch suggests that there were more than 50 major acts of resistance to clearance. Even before the Crofters' War of the 1880s, Gaelic communities had staved off or even averted removals by accosting law enforcement officials and destroying eviction notices, as in Coigach, Ross-shire, in 1852–3. Women took the front line in opposing the authorities, with their male relatives backing them up. Lowland shepherds imported to work the new sheep farms were subject to intimidating letters and maiming or theft of their sheep. More than 1,500 sheep were stolen on the Sutherland estate in a single year in the early 19th century. Many forms of resistance were practised covertly, such as poaching. After the introduction of watermills at Milton Farm, South Uist, in the early nineteenth century, the tenants continued to hand-grind their grain with querns. As this was considered undesirable, the landlord had the querns broken; similar episodes were recorded in Skye and Tiree. After the Disruption of 1843, many Gaelic-speaking areas deserted the Church of Scotland in favour of the Presbyterian Free Church, which refused to take money from landlords and was often overtly critical of them.
Richards describes three attempts at large-scale resistance before the Crofters' War: the Year of the Sheep, protests against Patrick Sellar's clearance of Strathnaver in 1812–4, and the "Dudgeonite agitation" in Easter Ross in 1819–20, sparked by a local tacksman's organization of an emigration fund.
Crofters' Act
The Highland Land League eventually achieved land reform in the enactment of the Crofters' Holdings (Scotland) Act 1886, but this could not bring economic viability and came too late, at a time when the land was already suffering from depopulation. However, the Crofters' Act put an end to the Clearances by granting security of tenure to crofters. Yet it did not grant security of tenure to cottars or break up large estates. As a result, the Scottish Highlands continues to have the most unequal distribution of land in Europe, with more than half of Scotland owned by fewer than 500 people. Land struggles occurred after the First and Second World Wars as returning servicemen could not get crofts.
Legacy
Literature
Poetry
Many Gaelic poets were heavily influenced by the Clearances. Responses varied from sadness and nostalgia, which dominated the poetry of Niall MacLeòid, to the anger and call to action found in the work of Mary MacPherson. The best-known Scottish Gaelic poem of the 20th century, , was written by Sorley MacLean about a cleared village near where he grew up on Raasay; many of his other poems deal with the effects of the Clearances. Many songs were in the form of satire of the landlord class. Perhaps the most famous of these is (Mackay Country or Northern Sutherland, a region hit hard by the Clearances), written by Ewen Robertson, who became known as the "Bard of the Clearances." The song mocks the Duke of Sutherland, his factor, Patrick Sellar, James Loch, James Anderson, and others involved in the Sutherland Clearances.
Similar sentiments were expressed with regard to the Ardnamurchan Clearances by a local doctor, Iain MacLachlainn. The Canadian Boat-Song expresses the desolation felt by some emigrants:
Prose
The clearances were an influential theme in Scottish literature, with notable examples such as Consider the Lilies, a novel by Iain Crichton Smith.
Memorials to the Clearances
On 23 July 2007, the Scottish First Minister Alex Salmond unveiled a bronze Exiles statue, by Gerald Laing, in Helmsdale, Sutherland, which commemorates the people who were cleared from the area by landowners and left their homeland to begin new lives overseas. The statue, which depicts a family leaving their home, stands at the mouth of the Strath of Kildonan and was funded by Dennis Macleod, a Scottish-Canadian mining millionaire who also attended the ceremony. An identical three-metre-high bronze Exiles statue has also been set up on the banks of the Red River in Winnipeg, Manitoba, Canada. In Golspie, Sutherland, a statue of George Granville Leveson-Gower, the first Duke of Sutherland, has been subject to vandalism due to his controversial role in the Sutherland Clearances.
Demographics
The diaspora was worldwide, but emigrants settled in close communities on Prince Edward Island, Nova Scotia (Antigonish and Pictou counties and later in Cape Breton), the Glengarry and Kingston areas of Ontario and the Carolinas of the American colonies. Canadian Gaelic was widely spoken for some two centuries. One estimate of Nova Scotia's population has 50,000 Gaels immigrating from Scotland between 1815 and 1870. At the beginning of the 20th century, there were an estimated 100,000 Gaelic speakers in Cape Breton.
See also
Enclosure
Clan MacDonell of Glengarry
Rural flight
Scottish Australians
Scottish Americans
Scottish Canadians
Scottish New Zealanders
Notes
References
Further reading
Devine, T M (1994). Clanship to Crofters' War: The social transformation of the Scottish Highlands (2013 ed.).
Manchester University Press.
Devine, T M (2018). The Scottish Clearances: A History of the Dispossessed, 1600–1900. London: Allen Lane.
Dodgshon, Robert A. (1998). From Chiefs to Landlords: Social and Economic Change in the Western Highlands and Islands, c.1493–1820. Edinburgh: Edinburgh University Press.
Hunter, James (2000). The Making of the Crofting Community. John Donald Publishers Ltd; 2nd revised edition. (Originally published in 1976; the 2000 edition has a preface that modifies some of the thinking in the main text of the book.)
Macinnes, Allan I. (1996). Clanship, Commerce and the House of Stewart, 1603–1788. East Linton: Tuckwell Press.
Macleod, Donald (1857). Gloomy Memories. (A first-hand account of the Sutherland clearances. Macleod should be read with caution as he frequently employed hyperbole for passionate emphasis.)
Prebble, John (1963). The Highland Clearances. Secker & Warburg. (This is the seminal work that brought the subject to modern attention. Later historical work corrects and challenges many points in this book.)
Richards, Eric (2000). The Highland Clearances: People, Landlords and Rural Turmoil. Birlinn Books.
External links
Narratives in a Landscape: Monuments and Memories of the Sutherland Clearances, a dissertation on the landscape of the Clearances
The Cultural Impact of the Highland Clearances, a BBC History article by Ross Noble
The Highland Clearances, an article by Thomas Devine, published in Refresh 4, Spring 1987
The impact of the Highland Clearances on one Mull family and their aftermath, a linear case study
Knockan Badbea Descendants Website
The Highland Clearances, BBC Radio 4, In Our Time
Professor Tom Devine speaking about the Scottish Highland and Lowland Clearances.
(Part of the Crossroads Lifelong Learning Partnership, published 31 Jan 2014.) Tom Devine on the Highlands & Lowland Clearances
History of the Scottish Highlands Highlands and Islands of Scotland Forced migration Scottish emigration Scottish diaspora 18th century in Scotland 19th century in Scotland Enclosures History of agriculture Forced migrations in Europe
https://en.wikipedia.org/wiki/Institute%20of%20Management%20and%20Information%20Technology%2C%20Cuttack
Institute of Management and Information Technology, Cuttack
Institute of Management and Information Technology, Cuttack (IMIT Cuttack) is a state government management institution and the first institute of information technology and management established by the government of Odisha, in 1962. It is in the city of Cuttack, in the eastern part of the state of Odisha, India. IMIT Cuttack is also known as the 'College of Accountancy and Management Studies (CAMS)'. It imparts postgraduate degrees in Master of Business Administration, Master of Computer Applications and MTech in Information Technology, and is affiliated with Biju Patnaik University of Technology under AICTE.
References
Technical universities and colleges in India All India Council for Technical Education Engineering colleges in Odisha Business schools in Odisha Universities and colleges in Odisha Colleges affiliated with Biju Patnaik University of Technology Education in Cuttack Educational institutions established in 1962 1962 establishments in Orissa
https://en.wikipedia.org/wiki/SNS%20College%20of%20Technology
SNS College of Technology
SNS College of Technology is one of the premier institutes in Coimbatore, Tamil Nadu, India. It was established in 2002 as part of the SNS Groups. The college is approved by AICTE and affiliated with Anna University, Coimbatore. Anna University, Chennai confirmed the conferment of autonomous status on SNS College of Technology for a period of 10 years, with effect from 2018-19 to 2027-28. SNS College of Technology is accredited by NAAC with the highest A+ grade, and SNS has adopted a new education strategy called design thinking, which has five elements. SNS College of Technology was founded in 2002 and has been administered and run by the Sri SNS Charitable Trust. The institution was established with the permission of the Government of Tamil Nadu and is recognized by UGC. It is a self-financing, co-educational college and performs its academic duties with the motto of Sincerity, Nobility and Service (SNS). The college is located in a rural area on Sathy Road (NH-209) at Coimbatore. It offers fourteen undergraduate courses, six postgraduate courses and seven research programmes. Four departments of the college, namely Mechanical, CSE, ECE and IT, are accredited by the National Board of Accreditation (NBA).
Undergraduate degree courses (B.E./B.Tech.)
Aerospace Engineering
Aeronautical Engineering (Lateral)
Agricultural Engineering
Automobile Engineering
Bio-Medical Engineering
Civil Engineering
Computer Science and Engineering
Electrical & Electronics Engineering
Electronics and Communication Engineering
Electronics and Instrumentation Engineering
Information Technology
Mechatronics Engineering
Mechanical Engineering
Computer Science and Technology
Postgraduate courses
Master of Business Administration (MBA)
Master of Computer Applications (MCA)
M.E. Computer Science and Engineering
M.E. Communication Systems
M.Tech. Information Technology
M.E. VLSI
Research Programme (Ph.D.)
Mechanical Engineering
Information Technology
Electronics and Communication Engineering
Electrical and Electronics Engineering
Computer Science Engineering
Civil Engineering
Business Administration
External links
SNS College of Technology
VIMZER - EEE Association of SNS College of Technology
Facebook group - created by students
Engineering colleges in Coimbatore
https://en.wikipedia.org/wiki/Avid%20Audio
Avid Audio
Avid Audio (formerly Digidesign) is an American digital audio technology company. It was founded in 1984 by Peter Gotcher and Evan Brooks. The company began as a project to raise money for the founders' band, selling EPROM chips for drum machines. It is a subsidiary of Avid Technology, and during 2010 the Digidesign brand was phased out. Avid Audio products continue to be produced and now carry the Avid brand name.
Products
Digidesign's flagship software product was Pro Tools, which came in three variants: Pro Tools|HD, Pro Tools LE, and Pro Tools M-Powered. Pro Tools|HD required a Digidesign TDM system and interface, and was intended for professional recording studios. Pro Tools LE was a complete package intended for home users and some post-production facilities. The package included the Pro Tools LE software and hardware such as the M-Box 2 or Digi 003. Pro Tools M-Powered was simply the Pro Tools application adapted to run on M-Audio hardware, and was generally comparable in power to an LE system. In 2010, these various editions of Pro Tools were mostly abandoned and it is now sold by Avid as a single software product, with the level of functionality dependent on the hardware chosen by the user. Digidesign also made a number of products for the Pro Tools platform, including several software plug-ins. They also manufactured a wide variety of hardware add-ons for Pro Tools, such as audio interfaces, MIDI interfaces, synchronizers, and control surfaces. In the spring of 2005 they introduced a system for live sound mixing called VENUE.
History
1984 - Founded as Digidrums by Peter Gotcher and Evan Brooks. Later became Digidesign.
1989 - Digidesign launches the first digital audio workstation system, Sound Tools, for the Apple Macintosh. The company refers to it as "the first tapeless recording studio".
1991 - Digidesign releases the first Pro Tools multitrack system, marking a significant advance in digital audio.
This integrated software and hardware system (digital audio workstation) is among the most popular for audio production for television, music, and film.
1995 - Avid Technology acquires Digidesign. Digidesign now operates as a business unit of Avid.
2001 - Digidesign wins a Grammy award for Pro Tools.
2003 - Avid acquires Bomb Factory's extensive product catalog. These products are now included with Pro Tools systems.
2003 - Academy of Motion Picture Arts and Sciences recognizes Digidesign's contribution to audio post production for film with an Oscar.
2004 - Avid acquires M-Audio, which now operates as a business unit of Digidesign, but maintains the M-Audio brand name and remains largely distinct from Digidesign.
2005 - Avid acquires Wizoo, which now operates as Digidesign Advanced Instrument Research Group (A.I.R.), but remains largely autonomous, operating out of Bremen, Germany.
2006 - Avid acquires Sibelius, the music software, in a deal worth over US$23 million.
2010 - The Digidesign brand name is phased out, with Digidesign products now falling under the Avid product banner.
2011 - Digidesign became Avid Audio.
History of Avid Audio software
Digidesign software was developed by UC Berkeley graduate Peter Gotcher and his friend Evan Brooks, both of whom majored in electrical engineering and computer science.
Sound Designer
The first incarnation of today's Pro Tools started life in 1984 as Sound Designer, while the pair were creating and selling drum sound chips under their Digidrums label. Sound Designer was originally designed to edit sounds for sampling synthesizers like the Prophet 2000, Ensoniq Mirage, Akai S900 and the E-MU Emulator sampling keyboard. Sounds would be sampled into the synth and transferred onto the Mac. Here, the user could manipulate the sample's sound (EQ, amplitude, etc.), truncate it to save memory, or set loop points, and then transfer it back into the synth for playback.
The software was originally written for editing drum samples and then for burning new chips for the LinnDrum-era drum machines.
Sound Designer file formats
The SDII (Sound Designer II, sometimes abbreviated as SD2) is a monophonic/stereophonic audio file format, originally developed by Digidesign for their Macintosh-based recording/editing products. It is the successor to the original monophonic Sound Designer I audio file format.
Sound Tools
Gotcher and Brooks discussed with E-MU Systems the possibility of integrating their renamed 'Sound Tools' software into the Emulator III keyboard released in 1987; however, E-MU rejected this option, and Gotcher and Brooks went on to start Digidrums. Sound Tools debuted on January 20, 1989, at NAMM (National Association of Music Merchants). At this stage, Sound Tools was a simple computer-based stereo audio editor. Although the software had the potential to do far more, it was limited by the hard drive technology used to stream the audio and allow for the non-destructive editing that Sound Tools offered. Independent public radio producers soon discovered the power and economy of Sound Tools as an alternative to traditional analog tape, sparking the transition to digital program production and distribution in radio.
Pro Tools
The first version of Pro Tools was launched in 1991, offering four tracks and selling for US$6,000. Digidesign continued to improve Pro Tools, adding a sequencer and more tracks, with the system offering recording at 16-bit and 44.1 kHz. In 1997, Pro Tools reached 24-bit and 48-track versions. At this point, the migration from more conventional studio technology to the Pro Tools platform took place within the industry.
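The destructive sample edits described above (amplitude adjustment, truncation to save memory) are simple operations on 16-bit signed sample data. The following Python sketch is a hypothetical illustration of those two edits; it is not Digidesign code, and the sample values are invented.

```python
# Illustrative sketch (not Digidesign code) of two destructive edits
# Sound Designer offered on 16-bit samples: amplitude scaling and
# truncation. Uses only the Python standard library.
import array

def scale_amplitude(samples: array.array, factor: float) -> array.array:
    """Scale 16-bit signed samples, clipping to the legal range."""
    out = array.array("h")
    for s in samples:
        v = int(s * factor)
        out.append(max(-32768, min(32767, v)))
    return out

def truncate(samples: array.array, start: int, end: int) -> array.array:
    """Keep only samples[start:end], discarding the rest to save memory."""
    return samples[start:end]

clip = array.array("h", [0, 1000, -2000, 30000, -30000])
louder = scale_amplitude(clip, 2.0)   # 30000 * 2 clips to 32767
print(list(louder))                   # [0, 2000, -4000, 32767, -32768]
print(list(truncate(clip, 1, 3)))     # [1000, -2000]
```

On an early Macintosh these edits were destructive, overwriting the sample in place before it was transferred back to the synthesizer; the copy-on-edit style above is used here only for clarity.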
References
External links
Avid Audio page on Avid.com
Avid Audio Forums (aka Digidesign User Conference)
John Payne: "The Software Chronicles" about Evan Brooks, EQ Magazine, March 2006
The Studio Files website: Pro Tools users tech blog
Evan Brooks Interview, NAMM Oral History Library
Peter Gotcher Interview, NAMM Oral History Library (2012)
Technology companies based in the San Francisco Bay Area Companies based in San Mateo County, California Technology companies established in 1984 1984 establishments in California Audio mixing console manufacturers Manufacturers of professional audio equipment Recipients of the Scientific and Technical Academy Award of Merit Audio equipment manufacturers of the United States
https://en.wikipedia.org/wiki/Sex%20Vixens%20from%20Space
Sex Vixens from Space
Sex Vixens from Space is an erotic text adventure game developed and self-published by Free Spirit Software and originally released in 1988 for the Commodore 64 and Apple II as part of the compilation Sex And Violence Vol. 1, and later released as a standalone game by November 1988 for DOS and Amiga, and in 1989 for the Atari ST. Sex Vixens from Space is the first entry in the Brad Stallion series, and is succeeded by Planet of Lust (1989), Bride of the Robot (1989), and Sex Olympics (1991). Sex Vixens was inspired by the 1974 sexploitation film Flesh Gordon. Sex Vixens from Space was panned by reviewers. In the United Kingdom in 1989, copies of Sex Vixens from Space were seized by British Customs authorities and subsequently destroyed due to their sexual content.
Plot
Brad Stallion, freelance government agent and captain of the phallic spaceship the Big Thruster, has been hired by the Federated Government to neutralize "The Tribe", a colony of cloned hypersexual amazon women that raid planets to castrate men using their sex-ray gun. Brad Stallion is tasked with travelling to their home planet of Mondo, and destroying the sex-ray gun.
Gameplay
The Commodore 64 version of Sex Vixens from Space is purely a text adventure, with no graphics or alternate controls. While most actions in the DOS, Atari ST, and Amiga ports of Sex Vixens from Space are still inputted through text commands, alongside the addition of graphics, objects and characters may be interacted with using the mouse in the style of a point-and-click adventure; this is absent in the Commodore 64 version. The player may move in the cardinal directions using the arrow keys, and certain UI elements may be accessed through shortcut keys. 'I' may be inputted to access the inventory, 'L' for information on current location, and 'S' for game status; the Big Thruster's computer AI Sandie gives advice pertinent to the player's location and situation.
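The typed commands described above follow the verb-noun pattern common to late-1980s adventure parsers. The Python sketch below is a hypothetical illustration of how such a parser works; the vocabulary, filler words, and behaviour are invented for illustration and are not taken from the game itself.

```python
# Hypothetical sketch of a verb-noun parser in the style of 1980s
# text adventures. The word lists are invented, not the game's own.
VERBS = {"go", "take", "talk", "look"}
NOUNS = {"north", "key", "sandie"}
FILLER = {"to", "the", "at"}  # words the parser silently discards

def parse(command: str):
    """Return a recognized (verb, noun) pair, or None if not understood."""
    words = [w for w in command.lower().split() if w not in FILLER]
    if not words or words[0] not in VERBS:
        return None  # the classic "I don't understand" case
    noun = next((w for w in words[1:] if w in NOUNS), None)
    return (words[0], noun)

print(parse("talk to Sandie"))  # ('talk', 'sandie')
print(parse("go north"))        # ('go', 'north')
print(parse("dance wildly"))    # None
```

The tiny fixed vocabulary is the point: as the QuestBusters review quoted below complains, parsers of this era rejected any synonym not in their word list, so knowing the solution did not guarantee being able to type it.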
Development
Sex Vixens from Space was originally released in 1988 on the Commodore 64 and Apple II as part of the game compilation Sex And Violence Vol. 1, alongside Bite of the Sorority Vampires and Hatchet Honeymoon. Sex And Violence Vol. 1 cost $29.95 USD in 1988. The Amiga version of Sex Vixens from Space cost 'around 85' Deutschmark in 1988, and $39.95 USD in 1989. Alongside the real credits, there are several fake names with fake roles for comedic effect. The Commodore 64 and Apple II versions of Sex Vixens from Space are credited to Lance Strate, although it is unclear whether this is a real person, as the other two games in Sex And Violence Vol. 1, Bite of the Sorority Vampires and Hatchet Honeymoon, are credited to 'Dick Long' and 'Peter Grow' respectively. Other ports of Sex Vixens from Space had a larger development team. Sex Vixens from Space was published by ASoft in Europe. While the Amiga and DOS versions of Sex Vixens support the use of a second disk drive to run both game disks at once, the Atari ST version is incompatible with using both disks at once, thus the disks must be swapped out as needed. Sex Vixens from Space was exhibited at the World of Commodore expo in 1989. A patched release of the Amiga version of Sex Vixens was released around October 1989. In the UK in 1989, a shipment of copies of the Amiga version of Sex Vixens from Space was seized by British Customs and subsequently destroyed due to their sexual content; while other publications did not specify a number, .info stated that 75 copies were destroyed. Joe Hubbard from Free Spirit Software defended Sex Vixens' sexual content in a statement to The Australian Commodore and Amiga Review, stating that "While Sex Vixens from Space may be a bit racy, it is not pornographic. Apparently, the British authorities are either quite prudish or completely lacking a sense of humor. Regardless, freedom of artistic expression and the freedom to disseminate such are cornerstones of democracy.
The act of seizing these game [sic] is the act of a fascist government." Tim Harris, the CEO of ASoft, Sex Vixens' European publisher, said in a statement to Your Sinclair that "The game's been hyped up, but there isn't really that much sexual content. It's a heck of a lot tamer than Strip Poker." In an interview with Italian gaming magazine Amiga Magazine, the president of Free Spirit Software, Joseph Hubbard, expressed that Sex Vixens from Space was inspired by the 1974 sexploitation film Flesh Gordon, stating that "When we made Sex Vixens, we thought of an old soft-core movie called Flesh Gordon ... It gave the public some eroticism and a lot of fun, although not necessarily in that order." When asked about the moral impact of sexual computer games, Hubbard expressed that "Everyone has some fantasy, and there's nothing wrong with that. A computer cannot do more damage than what an individual can already do on their own. I prefer to see people play with erotic software rather than violent games." Hubbard further expressed that "Anyone can sit at their computer and exterminate hundreds of aliens, or human beings in enemy uniforms ... Do you think this is healthier or less dangerous than sex?"
Reception
In a June 1990 issue of STart magazine, the plot of Sex Vixens from Space was given as an example of sexism in the gaming industry and objectification of women in media. In 1996, Computer Gaming World declared Sex Vixens from Space the 19th-worst computer game ever released, stating that "This funny, sexy adventure game was neither funny, sexy nor adventurous." German gaming magazine Aktueller Software Markt gave the Amiga version of Sex Vixens from Space an overall score of 3.2 out of 12, summarizing it as a "soft porno computer game" and calling the game's sexual content 'more tame' than what the cover or advertisements suggest.
ASM criticized Sex Vixens from Space's graphics and text input, expressing that "If you're hoping for crisp graphics, you'll be disappointed. What is also disappointing is the adventure, specifically the core elements that determine the quality of the adventure. The parser should be red in the face should you flatter it by describing it as such. The graphics are thoroughly low quality, and the title screen is an absolute catastrophe: for this alone Free Spirit Software should be ashamed until the end of their days." ASM also criticized Sex Vixens' plot as "stupid", as well as its 'disappointing' amount of sexual content, stating that "there is too little sex for a computer-porno, not enough processing power for a proper adventure, and too much money for both. Overall, the game's forgettable!" .info gave the Amiga version of Sex Vixens from Space an overall score of 2.5 stars out of 5, beginning their review by lamenting the game's glitches, expressing that "I wanted to like this game, and I know I would if it worked", but praised the game's adult content, stating that "The adults-only adventure game premise has been tried before, and this one comes very close to succeeding." .info further criticized Sex Vixens' glitches, expressing that "all the effort has gone into the graphics and story rather than the programming", particularly criticizing the game's parser and text input, noting the 'limited' commands and stating that the game has abrupt lag spikes in which both the game and the parser are unresponsive. .info praised Sex Vixens' humor and "quite well done" graphics, but expressed that the game needed to be patched by the developers, concluding that "I hope Free Spirit will do the necessary fixes, because Sex Vixens has all the potential to become a deliciously funny and entertaining game." 
Swedish gaming magazine Datormagazin gave the Amiga version of Sex Vixens from Space an overall score of 2 out of 10, bluntly captioning their review by stating that "Sex Vixens from Space is a piece of shit. Apparently it's broken too, and not even so-bad-it's-funny." Datormagazin criticized the game's plot as "tasteless" and furthermore called Sex Vixens "a game which is so bad that you wonder why it even exists". Datormagazin praised Sex Vixens' "very well made" graphics, but expressed that they didn't make up for the game's shortcomings, stating that "what good will good graphics do if the rest of the game is pure shit?" Datormagazin criticized the lack of a save feature, expressing that it "characterizes the game's terrible quality well", and noted a glitch where if the player drops an item after picking it up for the first time, it may no longer be picked up. Datormagazin concluded their review by expressing that "[Sex Vixens] can be summarized in a single word: Avoid!" QuestBusters reviewed the Amiga version of the game in a September 1989 issue, criticizing the game's "inconsistent" interface and inputs, and "far below average" parser. QuestBusters described Sex Vixens' possible inputs as 'limited', stating that "In most situations, the program won't even let you examine much of the surroundings" and expressed that this limited the game's puzzles, saying that "There aren't a lot of logical puzzles, since the parser won't even let you examine most items, let alone use them. Basically, all you can do is "make love" or "talk" to the women". QuestBusters further criticized the impact of Sex Vixens' limited inputs with regards to puzzles, expressing that "There's only one solution for each puzzle, and with such a limited vocabulary, you can fall into the trap of knowing the answer but get stuck trying to guess what synonym the game wants to hear."
QuestBusters praised Sex Vixens' futuristic graphics as "slick and colorful", but criticized the 'bad' quality of the art of women in-game, stating that "Not only are the majority of the women ugly, but the actual drawings are drastically inferior to that of the space ships". QuestBusters furthermore criticized Sex Vixens' minimal on-screen interaction with NPCs, expressing that "While most of these graphics [are of] professional quality, there are no people in 99% of them. This creates, oddly enough for a sex-oriented game, a sterile environment." QuestBusters criticized Sex Vixens' nudity as 'ugly and unappealing', and further expressed that the game's "limited" erotic text was similarly unattractive. QuestBusters condemned the lack of a save feature as 'unreasonable' for a modern adventure game, and also criticized the lack of music and sound effects. Italian gaming magazine Game Republic noted Sex Vixens from Space as an oddity in a 2013 retrospective review, praising its "interesting narrative" but noting it as "undeniably in poor taste" at times. Game Republic praised Sex Vixens from Space's "bizarre" environments and over-the-top presentation, calling it "a space opera as absurd as it is surprising".
See also
Leather Goddesses of Phobos
References
External links
Sex Vixens from Space at Hall of Light Amiga database
1988 video games Adventure games Adventure games set in space Amiga games Apple II games Atari ST games Censored video games Commodore 64 games DOS games Erotic video games Fictional secret agents and spies in video games Puzzle video games Science fiction video games Single-player video games Space opera video games Video games developed in the United States Video games set in the future Video games set on fictional planets
14859620
https://en.wikipedia.org/wiki/Dick%20Smith%20Super-80%20Computer
Dick Smith Super-80 Computer
The Dick Smith Super-80 was a Zilog Z80 based kit computer developed as a joint venture between Electronics Australia magazine and Dick Smith Electronics. It was presented as a series of construction articles in Electronics Australia magazine's August, September and October 1981 issues. Electronics Australia had published a number of computer projects before the Super-80, including the EDUC-8 in 1974, the Mini Scamp and the DREAM 6800 Video Computer. The computer was sold as a "short form" kit for A$289.50. For this, the purchaser received the computer PCB, an assembly manual (a copy of the construction articles from Electronics Australia) and basic components, including 16kB of RAM and a 2kB EPROM containing a machine code monitor program. The technical manual and power transformer were sold separately, as were a kit of I.C. sockets, a BASIC interpreter program and, from mid-1982 onwards, a metal case to house the computer. The computer proved to be a popular construction project, with an advertisement in November 1982 claiming: "Over 2000 sold." The popularity of the Super-80 led to a small industry growing up around addressing the shortcomings of the original computer, especially the black-and-white, 32 × 16 character, upper-case-only video display. The original name of the computer was "Nova-80", but it was changed at the last minute to avoid "possible legal ramifications".
Specifications
CPU: Zilog Z80
Clock Speed: 2 MHz
Expansion: S-100 Bus Slot (Optional)
Keyboard: 60 Key
Mass Storage: Cassette Tape (300 Baud, Kansas City Standard)
RAM: 16kB (maximum 48kB)
ROM: 2kB (maximum 12kB)
Sound: None
Video Display: Monochrome, 32 × 16 Characters, Upper Case Only
Technical description
The Super-80 was based on the Zilog Z80 8-bit microprocessor. As standard, it had 16 kB of dynamic RAM in the form of eight 4116 RAM chips. RAM could be expanded to 32 kB or 48 kB through the addition of rows of eight 4116 RAM chips. 
The computer was assembled on a single double-sided printed circuit board. The board was supplied in a light cardboard sleeve that appeared to be an LP record sleeve, having the words "Dick Smith Super 80 Microcomputer Kit Printed Circuit Board" and the part number "Cat H-8402" printed along the spine. To keep the price of the computer and the component count down, a novel technique was used to implement the video display. Instead of an expensive video display controller chip with dedicated memory, the Super-80 used discrete TTL logic to implement the video display and 512 bytes of system RAM was shared between the video display and the CPU. Fifty times per second, the CPU was turned off for around 10 ms by asserting the Z80 BUSREQ (DMA) pin. The video display circuitry would then read from the shared RAM while it refreshed the image on the screen. In addition to a 50% degradation in processor performance, this meant that it was not possible to perform any accurate timing in software, since the programmer had no control over when the next video display refresh cycle would occur. The video display could be switched off under software control for greater processing speed, or when accurate software timing was required. The most common situation in which that occurred was when the built-in cassette interface was being used. The location of the 512 bytes of video memory was normally at the top of the available RAM, but could be changed by writing to an I/O port. The keyboard was part of the main computer PCB, but before assembly, the constructor could opt to cut the keyboard section of the printed circuit board off and connect it to the main board with ribbon cable. The keyboard was wired as an 8 × 8 matrix and connected to the computer via the two 8-bit ports of a Z80 PIO chip. Pressing the keys <CTRL>, <C> and <4> at the same time generated an interrupt that would perform a "warm start" of the monitor program. 
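The 8 × 8 matrix arrangement described above lends itself to a simple scanning loop. The sketch below is an illustration in Python, not Super-80 firmware: which PIO port drives the columns, which reads the rows, and the active-low signaling are all assumptions made for the sake of the example.

```python
# Illustrative scan of an 8 x 8 key matrix of the kind wired to the two
# 8-bit ports of a Z80 PIO. Port roles and active-low signaling are
# assumptions here, not documented Super-80 hardware details.

def scan_matrix(read_rows):
    """Return (column, row) pairs for every key currently held down.

    read_rows(strobe) stands in for the PIO: given an active-low column
    strobe byte, it returns the active-low row byte for that column.
    """
    pressed = []
    for col in range(8):
        strobe = 0xFF & ~(1 << col)    # pull one column line low
        rows = read_rows(strobe)       # read the row port back
        for row in range(8):
            if not rows & (1 << row):  # a low row bit means key down
                pressed.append((col, row))
    return pressed

# Model a matrix with a single key held at column 2, row 5.
def fake_pio(strobe):
    return 0xFF & ~(1 << 5) if strobe == 0xFF & ~(1 << 2) else 0xFF

print(scan_matrix(fake_pio))  # [(2, 5)]
```

Polling the whole matrix this way, rather than blocking until a key is released as the ROM routine did, is essentially what the custom keyboard drivers in action games had to do.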
The keyboard "read" routine supplied in ROM was "negative edge triggered" and would block while a key was down. As a result, most action games incorporated their own keyboard driver. The standard computer had no serial or parallel I/O as such, relying on the optional S-100 bus interface for I/O and expansion. A 10-pin connector at the back of the board was labeled "PORT" and had power, as well as a pair of digital outputs and two available digital input lines. The connector was for a future RS-232 / 20 mA current loop serial interface, but that was never implemented. Mass storage was available in the form of a cassette tape interface running at 300 baud. Accessing the cassette interface required the video display to be switched off, so an LED was provided to show activity during a tape load or save operation. The LED would change state each time a 256-byte block of data was successfully transferred. The Z80 interrupt line was connected to the keyboard PIO and the "Non Maskable Interrupt" line was not connected.
Software
The Super-80 came with a 2kB machine code monitor program in ROM. A BASIC interpreter could be purchased either on cassette tape or on a set of three 4kB EPROMs. The first 4kB BASIC EPROM replaced the 2kB monitor EPROM supplied with the computer and contained the first 2kB of BASIC, plus the monitor program. The BASIC interpreter was based on Tiny BASIC, rewritten and modified by Ron Harris. In its November 1981 edition (p. 93), Electronics Australia announced a programming competition with the chance to win one of two dot matrix printers. EA later compiled the better programs submitted by readers into a book called "Software for the Super-80 Computer".
Accessories and Options
B-3600 Super-80 Technical Manual
B-3602 Super-80 BASIC Handbook
H-3200 Metal Case
K-3602 BASIC Interpreter on Cassette Tape
K-3603 I.C. Socket Kit
K-3604 BASIC Interpreter in EPROM
K-3606 S-100 Expansion Unit
K-3607 Lower Case Character Generator
Modifications
Many Super-80 owners chose to modify their machines to address the limitations of the original design. The El Graphix kit added the ability to display lower case characters and "chunky" graphics. The Printer Interface was an S-100 Bus card giving the Super-80 a Centronics parallel printer port. The VDU Expansion Board (VDUEB) was an enhanced video display board for the Super-80 developed by Microcomputer Engineering (MCE). The VDUEB gave the Super-80 an 80×25 video display with limited graphics capabilities. It was based on a 6845 CRTC I.C. and had its own 2kB of video RAM and 2kB of character generator RAM. Installation of the VDUEB board was a one-way process, as it required major modifications to the Super-80 printed circuit board, including cutting of tracks and soldering in many wire links between various parts of the board. The VDUEB was then connected via three I.C. sockets formerly occupied by the original video display circuitry. Removal of the original DMA-based video display effectively doubled the performance of the computer, since the CPU was no longer being disabled 50 times per second for video display refreshes. The board gave the Super-80 similar video display capabilities to the Applied Technology Microbee computer, released about six months after the Super-80. This led to many Microbee games being ported to the VDUEB-equipped Super-80. The VDUEB proved to be a popular modification, with a users' group forming for owners of VDUEB-equipped computers, the "Super-80 VDUEB Users' Club". The Universal Floppy Disk Controller (UFDC) was an add-on floppy disk interface developed by Microcomputer Engineering (MCE). The UFDC was based on the Western Digital WD2793 floppy disk controller chip and had a Z80 DMA controller on board. 
The most popular disk format was 5¼" (133 mm), 80 track, double-sided, double-density, using a Mitsubishi floppy drive mechanism. This gave a formatted disk capacity of 800kB. The UFDC's use of DMA required the VDUEB upgrade to be present. To install the disk controller, the Z80 CPU was removed from the main computer board and installed on the UFDC board. The UFDC then piggybacked on the socket vacated by the CPU. This meant that, in theory, the UFDC could be used with almost any Z80-based system, provided there was enough physical space above the CPU. The UFDC used a primitive track-based disk operating system called "Super-80 DOS"; however, a CP/M BIOS later became available. The MXB-1 Memory Expansion Board was designed by a member of the VDUEB Users' Club. The MXB-1 contained space for extra EPROMs, an optional battery-backed real-time clock, a Centronics-compatible printer interface and address decoding for up to 192kB of RAM. The extra RAM was installed by replacing the standard 4116 16k × 1 bit RAM chips with 4164 64k × 1 bit RAM chips. The CPU's 64kB address limit was worked around through bank switching. The MXB-1 was mostly of benefit to users running the CP/M disk operating system, since CP/M could then have a full 64kB of RAM for programs, with up to 128kB being used as a small RAM disk. The El Graphix "X-RAM" board provided up to 16kB of battery-backed CMOS RAM or EPROM. Several X-RAM boards could be piggybacked, allowing each to be port-selected as an independent 16kB bank of memory. This memory took the form of eight 2kB 24-pin sockets which accepted either 2016 CMOS RAM chips or 2716 EPROMs. Because the X-RAM board shared part of the address range occupied by the system ROM, the ROM could be copied into CMOS RAM on the fly and modified as required to add or change custom functions in the operating system. 
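Bank switching of the sort the MXB-1 used to put 192kB behind the Z80's 64kB address space can be shown with a toy model. The 32kB window at 0x8000 and the bank-select mechanism below are illustrative assumptions, not the board's actual register layout; the point is only how a 16-bit address plus a bank register together select a byte in a larger physical RAM.

```python
# Toy model of bank switching: 192kB of physical RAM behind a 16-bit
# (64kB) CPU address space. The window placement and bank-select scheme
# are assumptions for illustration, not the MXB-1's real design.

WINDOW_START = 0x8000   # assume a switchable 32kB window in the top half
WINDOW_SIZE = 0x8000    # lower 32kB stays fixed; banks 0-4 fill 192kB

class BankedMemory:
    def __init__(self, total_kb=192):
        self.ram = bytearray(total_kb * 1024)
        self.bank = 0                      # currently selected bank

    def select_bank(self, n):              # models an OUT to a bank port
        self.bank = n

    def _phys(self, addr):
        if addr < WINDOW_START:
            return addr                    # fixed region, always the same RAM
        return WINDOW_SIZE + self.bank * WINDOW_SIZE + (addr - WINDOW_START)

    def read(self, addr):
        return self.ram[self._phys(addr)]

    def write(self, addr, value):
        self.ram[self._phys(addr)] = value

mem = BankedMemory()
mem.select_bank(0)
mem.write(0x8000, 0xAA)   # lands in the first physical bank of the window
mem.select_bank(3)
mem.write(0x8000, 0xBB)   # same CPU address, different physical byte
mem.select_bank(0)
print(hex(mem.read(0x8000)))  # 0xaa
```

This is also why CP/M benefited most from the extra memory: the operating system could keep a full 64kB mapped for programs while treating the remaining banks as a RAM disk, swapping them through the window one at a time.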
See also Dick Smith System 80, a pre-built near-clone of the Tandy TRS-80 Model I References External links Dick Smith Super 80 on Groups.io Justin Kerk's Super-80 Technical Information Microbee Software Preservation Project Super-80 Repository Z80-based home computers
8614306
https://en.wikipedia.org/wiki/Telemarketing%20fraud
Telemarketing fraud
Telemarketing fraud is fraudulent selling conducted over the telephone. The term is also used for telephone fraud not involving selling. Telemarketing fraud is one of the most persuasive deceptions identified by the Federal Trade Commission (FTC). Telemarketing fraud often involves some sort of victim compliance, whether it involves the victim initiating contact with the perpetrator or voluntarily providing their private information to the offender; thus, fraud victims may experience feelings of shame and embarrassment that may prevent them from reporting their victimization. Older people are disproportionately targeted by fraudulent telemarketers and make up 80% of victims affected by telemarketing scams alone. Older people may be targeted more because the scammer assumes they may be more trusting, too polite to hang up, or have a nest egg. Many older people have large amounts of money to invest and hope to receive an unexpected profit from investing. They are also less likely to report the fraud because they do not believe that their complaints will be resolved.
Types of fraud
Advance fee fraud – Scammers typically require upfront payments in exchange for goods and services. This is also a popular technique used in lottery and student scholarship scams. Once the target has provided their personal information, the fraudsters will ask them to pay various fees – for example: taxes, legal fees, banking fees, etc. – so that they can release their non-existent winnings.
Pyramid schemes – Work by recruiting "members" to invest in a scheme. Most of the money is made by recruiting new members, and a prime characteristic of the scam is that the product is of little value. The people at the bottom of the pyramid pay the people at the top. Inevitably the scheme will run out of new recruits and collapse. Some individuals may profit from pyramid schemes, but the vast majority of those who join later on in the scheme will not. 
Credit card fraud – Fraudsters will call with promises to repair the victim's credit rating, provide credit options or credit facilities, credit cards with zero or very low interest, or instant and unlimited loans without credit checks or security. Such offers, if followed up, tend to come at a considerable cost to victims in terms of high interest rates or exorbitant fees. Scammers commonly target people who have bad credit and are therefore more likely to accept the offer in the hope of paying off debts or improving their credit rating.
Overpayment fraud – One of the most popular scams making the rounds. A generous overpayment is made to the victim, typically using a fake cheque. The scam can involve many kinds of goods but is especially common in online auctions and classifieds. Before the cheque has been cleared by a bank and the victim discovers that it has bounced, the scammer will request to have the difference refunded, usually through an online banking transfer, a pre-loaded money card, or a wire transfer service such as Western Union. Victims therefore lose the money they send along with the item itself.
Charity fraud – Telemarketers claiming to represent charities call asking the target for a donation. Fake charities try to take advantage of people's generosity and compassion for others in need. Scammers will steal the money by posing as a genuine charity; they may try to use recent events, such as natural disasters, to make their phony pleas for donations sound more believable. These scammers use aggressive, high-pressure techniques designed to make the victim feel guilty or selfish if they do not want to donate.
Cramming – Small charges that are secretly inserted into customers' credit card bills for services they did not order, buried so deep in credit card bills that many do not notice, usually fake fees from vaguely described financial services. Sometimes a one-time charge for entertainment services will be crammed onto phone bills. Other times it may be a recurring monthly charge. 
Cramming of recurring charges falls into two general categories: club memberships, such as psychic clubs, personal clubs, or travel clubs; and telecommunications products or service programs, such as voice mail, paging, and calling cards.
Summer jobs fraud – Much like an advance fee fraud, these scams are aimed at teenagers or young adults looking for work over the summer period. Telemarketers seek out victims by scanning student job-search listings. The telemarketer will then claim the victim has been singled out and specially selected to be hired for a particular job. Before they can start work, however, they are told that they have to pay various fees to cover training, materials and insurance. When they eventually realize the job is a scam, it is already too late; they have lost the money they paid in fees as well as the time it would take to find a new job.
Office supply scam – A very common scam where a telemarketer will target business managers responsible for purchasing office supplies, falsely representing their identity and the cost of office supplies – the most popular being toner. The caller might mislead a company's employees into thinking that an order for office supplies has already been placed, either by an existing or former colleague, and that they are calling to chase up a signature for the order form to help them keep complete records. The company is then invoiced for unwanted, and overpriced, stationery and office supplies.
Magazine subscriptions scam – Scammers call victims with an intriguing offer: for a small payment, they can get a yearly subscription to their favorite magazine, even though the caller has no affiliation with the magazine's publisher. When victims agree, the scammers will send random magazines at grossly inflated prices. Another way they extract money is by falsely telling callers it is time to renew their magazine subscriptions in an attempt to get their credit card or bank account information. 
Harassing/Stalking – Another common misuse of a mobile number. Scammers can find a victim's social media profile through their mobile number (if it has been shared on the profile). Through it they can gather a great deal of personal information: friends, family, profession, address, photos, and so on. This information can then be used to harass or stalk the victim on social media and over the phone, since the scammers know a great deal about the victim's family and friends and know how to contact them.
Telemarketing fraud tools
Caller I.D. spoofing – Allows the caller to present any number they want on the screen, including existing numbers, while keeping the real number they call from private. Caller I.D. spoofing is a low-cost option to help ensure anonymity. The same devices can also change voices to sound male or female.
Robocalls – Technology has made it cheap and easy for robocallers to make calls from anywhere in the world. Most of these calls originate overseas, many in boiler rooms in India, and they are surprisingly effective. Robocalls have been used for legitimate campaigning and public opinion polling, but have also been used for voter suppression, false endorsements, and negative campaigning that borders on fraud. The federal regulatory regime currently excludes political robocalls from most telemarketing regulations. Presidential campaigns and national interest groups have accidentally violated state laws in trying to communicate with voters by using robocalls.
Crawler devices – A majority of fraudulent calls originate from Nigerian phone scammers, who take in $12.7 billion a year from phone scams. Some callers have to make up to 1000 calls per day. To speed things up, they will sometimes use crawler devices, computerized systems that go through every area code, calling each number in turn. If the call is not answered, the lead is marked "no answer" and the system schedules it to be called again a few days later. 
If the company does not have a large lead pool, the number may be called again as soon as 12 hours later. As with email spammers, they know that a certain proportion of their hits will score.
False identity – Fraudulent telemarketers use aliases to cover their tracks and prevent detection by the law. Some fraudulent telemarketers are deliberately located in other countries, as it is more difficult for law enforcement agencies to pursue them. Typically, scammers will use common Western names in a bid to reassure victims that they are calling from the same country.
Scam Identification and blocking software
Scam Likely is a term used for scam call identification; the term was originally coined by T-Mobile for the scam ID technology created by First Orion. First Orion's scam blocking technology uses a combination of known bad actors and AI-powered blocking, including detection of neighborhood spoofing and unusual calling patterns. Scam Identification is a feature of the T-Mobile and Metro carrier network which can be controlled by the app Scam Shield, customer care, or dialing the short code #664 to turn scam blocking on or off. There are a number of phone apps which try to identify, screen, send to voicemail, or otherwise deter telemarketing calls, with most major carriers providing some level of free scam call screening. Additionally, both iOS and Android operating systems offer scam screening options. In addition, phone carriers may provide caller authentication using STIR/SHAKEN, a caller authentication protocol designed to stop caller ID spoofing by exchanging tokens between cellular carriers.
Popular scams
Grandparent scam
A telephone call is made to an elderly person about a family member who is supposedly in some kind of trouble; the caller usually claims to be a grandson or granddaughter. These calls are often placed late at night or early in the morning, when most people are not thinking as clearly. 
Callers assume that their targets have grandchildren and will usually have several other people in on the scam, such as a bail bondsman, the arresting police officer, a lawyer, a doctor at a hospital, or some other person. The first voice on the phone is usually the scammer pretending to be the grandchild, sounding upset and typically stating that there are only a few moments to talk. The caller may say that they have a cold if the victim does not recognize their voice. Their story generally follows a familiar line: they were traveling in another country with a friend, and after a car accident or legal infraction, they are in jail and need bail money wired to a Western Union account as soon as possible for their quick release. The caller does not want anyone told about the incident, especially not family. Before the victim can ask too much about the situation, the phony grandchild will hand the phone over to the accomplice, who will then request money to be transferred to release the grandchild from jail. While this is commonly called the grandparent scam, criminals may also claim to be a family friend, a niece or nephew, or another family member.
Microsoft scam
A telephone call is made saying, typically, that virus activity has been detected on the victim's computer; the overseas caller then states they are from Microsoft or are a Microsoft certified technician. Callers assume that the victim has a computer running a Microsoft Windows operating system (users of other operating systems, such as Linux, are a minority and are likely to be technically knowledgeable). They will get the computer owner to give the caller remote access using a genuine networking service or website like ammyy.com or TeamViewer. 
They will use the 'Event Viewer' tool on the computer to highlight the red-X errors and yellow warnings, which are supposedly signs of an infection but are in fact normal and harmless log entries. They also encrypt the owner's password database, preventing access to the computer without the scammers' password, essentially locking the victim out of their own computer and ensuring that they themselves will be paid. At this stage the caller has complete control over the computer and can display further alarming messages, as well as installing malware. The cold caller will then offer to remove the viruses and malware (some of which they have installed themselves), install security software, and provide an ongoing support service costing up to $500. The Federal Trade Commission (FTC) has reported a huge crackdown on tech support scams and ordered a halt to several alleged tech support scams. References Additional sources Fraud Telephone crimes
327612
https://en.wikipedia.org/wiki/RadioShack
RadioShack
RadioShack, formerly RadioShack Corporation, is an American retailer founded in 1921. At its peak in 1999, RadioShack operated stores named RadioShack or Tandy Electronics in the United States, Mexico, United Kingdom, Australia, and Canada. Outside of those territories, the company licensed other companies to use the RadioShack brand name in parts of Asia, North Africa, Latin America, and the Caribbean. In February 2015, RadioShack Corporation filed for Chapter 11 protection under United States bankruptcy law after 11 consecutive quarterly losses. By then, it was operating only in the United States and Latin America. In May 2015, General Wireless Inc., an affiliate of Standard General, bought the company's assets, including the RadioShack brand name and related intellectual property, for US$26.2 million. General Wireless Operations Inc. was formed to operate the RadioShack stores, and General Wireless IP Holdings LLC was formed to hold the intellectual property. During its 2015 bankruptcy proceedings, RadioShack Corporation sold the RadioShack brand rights to different entities around the world. Mexico-based Grupo Gigante, through its subsidiary RadioShack de México, owns the RadioShack brand within Mexico. El Salvador-based Unicomer Group owns the RadioShack brand within the rest of Latin America and the Caribbean. Egypt-based Delta RS for Trading owns the RadioShack brand within North Africa and the Middle East. General Wireless IP Holdings LLC retained rights to the RadioShack brand in all remaining territories, mainly the United States, since rights in other parts of the world, such as Australia, had previously been assigned to InterTAN in 1986. It held these rights until 2020, when it sold them to Retail Ecommerce Ventures (REV). In March 2017, General Wireless Inc. 
and subsidiaries filed for bankruptcy, claiming its Sprint partnership was not as profitable as expected, and announced plans to close most of its company-owned stores after Memorial Day Weekend in 2017 and to shift its business primarily to online. In November 2020, Retail Ecommerce Ventures (REV), a holding company owned by Tai Lopez and Alex Mehr, acquired RadioShack. RadioShack operates primarily as an e-commerce website, a network of independently owned, franchised RadioShack stores, and a supplier of parts for HobbyTown USA. History The first 40 years The company was started as Radio Shack in 1921 by two brothers, Theodore and Milton Deutschmann, who wanted to provide equipment for the then-nascent field of amateur radio (also known as ham radio). The brothers opened a one-store retail and mail-order operation in the heart of downtown Boston at 46 Brattle Street. They chose the name "Radio Shack", which was the term for a small, wooden structure that housed a ship's radio equipment. The Deutschmanns thought the name was appropriate for a store that would supply the needs of radio officers aboard ships, as well as hams (amateur radio operators). The idea for the name came from an employee, Bill Halligan, who went on to form the Hallicrafters company. The term was already in use — and is to this day — by hams when referring to the location of their stations. The company issued its first catalog in 1939 as it entered the high fidelity music market. In 1954, Radio Shack began selling its own private-label products under the brand name Realist, changing the brand name to Realistic after being sued by Stereo Realist. During the period the chain was based in Boston, it was commonly referred to by its customers as "Nagasaki Hardware", disparagingly, as much of the merchandise was sourced from Japan, then perceived as a source of low-quality, inexpensive parts. 
After expanding to nine stores plus an extensive mail-order business, the company fell on hard times in the 1960s. Radio Shack was essentially bankrupt, but Charles D. Tandy saw the potential of Radio Shack and retail consumer electronics, purchasing the company in 1962 for US$300,000. Tandy Corporation Tandy Corporation, a leather goods corporation, was looking for other hobbyist-related businesses into which it could expand. At the time of Tandy's 1962 acquisition of Radio Shack, the chain was nearly bankrupt. Tandy's strategy was to appeal to hobbyists. It created small stores that were staffed by people who knew electronics, and sold mainly private brands. Tandy closed Radio Shack's unprofitable mail-order business, ended credit purchases and eliminated many top management positions, keeping the salespeople, merchandisers and advertisers. The number of items carried was cut from 40,000 to 2,500, as Tandy sought to "identify the 20% that represents 80% of the sales" and replace Radio Shack's handful of large stores with many "little holes in the wall", large numbers of rented locations which were easier to close and re-open elsewhere if one location did not work out. Private-label brands from lower-cost manufacturers displaced name brands to raise Radio Shack profit margins; non-electronic lines from go-carts to musical instruments were abandoned entirely. Customer data from the former Radio Shack mail-order business determined where Tandy would locate new stores. As an incentive to work long hours and remain profitable, store managers were required to take an ownership stake in their stores. In markets too small to support a company-owned Radio Shack store, the chain relied on independent dealers who carried the products as a sideline. Charles D. 
Tandy said “We’re not looking for the guy who wants to spend his entire paycheck on a sound system”, instead seeking customers "looking to save money by buying cheaper goods and improving them through modifications and accessorizing", making it common among "nerds" and "kids aiming to excel at their science fairs". Charles D. Tandy, who had guided the firm through a period of growth in the 1960s and 1970s, died of a heart attack at age 60 in November 1978. In 1982, the breakup of the Bell System encouraged subscribers to own their own telephones instead of renting them from local phone companies; Radio Shack offered twenty models of home phones. Much of the Radio Shack line was manufactured in the company's own factories. By 1990/1991, Tandy was the world's biggest manufacturer of personal computers; its OEM manufacturing capacity was building hardware for Digital Equipment Corporation, GRiD, Olivetti, AST Computer, Panasonic, and others. The company manufactured everything from store fixtures to computer software to wire and cable, TV antennas, audio and videotape. At one point, Radio Shack was the world's largest electronics chain. In June 1991, Tandy closed or restructured its 200 Radio Shack Computer Centers, acquired Computer City, and attempted to shift its emphasis away from components and cables, toward mainstream consumer electronics. Tandy sold its computer manufacturing to AST Research in 1993, including laptop computer maker Grid Systems Corporation, which it had purchased in 1988. It sold the Memorex consumer recording trademarks to a Hong Kong firm, and divested most of its manufacturing divisions. House-brand products, which Radio Shack had long marked up heavily, were replaced with third-party brands already readily available from competitors. This reduced profit margins. In 1992, Tandy attempted to launch big-box electronics retailer Incredible Universe; most of the seventeen stores never turned a profit. 
Its six profitable stores were sold to Fry's Electronics in 1996; the others were closed. Other rebranding attempts included the launch or acquisition of chains including McDuff, Video Concepts and the Edge in Electronics; these were larger stores which carried TVs, appliances and other lines. Tandy closed the McDuff stores and abandoned Incredible Universe in 1996, but continued to add new RadioShack stores. By 1996, industrial parts suppliers were deploying e-commerce to sell a wide range of components online; it would be another decade before RadioShack would sell parts from its website, with a selection so limited that it was no rival to established industrial vendors with million-item specialised, centralised inventories. In 1994, the company introduced a service known as "The Repair Shop at Radio Shack", through which it provided inexpensive out-of-warranty repairs for more than 45 different brands of electronic equipment. The company already had over one million parts in its extensive parts warehouses and 128 service centers throughout the US and Canada; it hoped to leverage these to build customer relationships and increase store traffic. Len Roberts, president of the Radio Shack division since 1993, estimated that the new repair business could generate $500 million per year by 1999. "America's technology store" was abandoned for the "you've got questions, we've got answers" slogan in 1994. In early summer 1995, the company changed its logo; "Radio Shack" was spelled in CamelCase as "RadioShack". In 1996, RadioShack successfully petitioned the US Federal Communications Commission to allocate frequencies for the Family Radio Service, a short-range walkie-talkie system that proved popular. Battery of the Month From the 1960s until the early 1990s, Radio Shack promoted a "battery of the month" club; a free wallet-sized cardboard card offered one free Enercell a month in-store. 
Like the free tube testing offered in-store in the early 1970s, this small loss leader drew foot traffic. The cards also served as generic business cards for the salespeople. Allied Radio In 1970, Tandy Corporation bought Allied Radio Corporation (both retail and industrial divisions), merging the brands into Allied Radio Shack and closing duplicate locations. After a 1973 federal government review, the company sold off the few remaining Allied retail stores and resumed using the Radio Shack name. Allied Electronics, the firm's industrial component operation, continued as a Tandy division until it was sold to Spartan Manufacturing in 1981. Flavoradio The longest-running product for Radio Shack was the AM-only Realistic Flavoradio, sold from 1972 to 2000: 28 years across three designs, the longest production run in radio history. It was originally released in five colors in the 1972 catalog: vanilla, chocolate, strawberry, avocado and plum. For 1973, vanilla and chocolate were dropped (and thus are rare today) and replaced by lemon and orange. At some point, two-tone models with white backs were offered but never appeared in catalogs; these are extremely rare today. The original design had 5 transistors (model 166). A sixth was added in 1980 (model 166a). The case was redesigned for 1987, taller and thinner, and came in red, blue, and black. The final model, the 201a, came in 1996 and was designed around an integrated circuit. They were first made in Korea, then Hong Kong, and finally the Philippines. The Flavoradio carried the Realistic name until about 1996, when it switched to "Radio Shack" and finally "Optimus". When the Flavoradio was dropped from the catalog in 2001, it was the last AM-only radio on the market. CB radio The chain profited from the mass popularity of citizens band radio in the mid-1970s, which, at its peak, represented nearly 30% of the chain's revenue. 
Home computers In 1977, two years after the MITS Altair 8800, Radio Shack introduced the TRS-80, one of the first mass-produced personal computers. This was a complete pre-assembled system at a time when many microcomputers were built from kits, backed by a nationwide retail chain when computer stores were in their infancy. Sales of the initial, primitive US$600 TRS-80 exceeded all expectations despite its limited capabilities. This was followed by the TRS-80 Color Computer in 1980, designed to attach to a television. Tandy also inspired the Tandy Computer Whiz Kids (1982-1991), a comic-book duo of teen computer enthusiasts who teamed up with the likes of Archie and Superman. Radio Shack's computer stores offered lessons to pre-teens as "Radio Shack Computer Camp" in the early 1980s. By September 1982, the company had more than 4,300 stores, and more than 2,000 independent franchises in towns not large enough for a company-owned store. The latter also sold third-party hardware and software for Tandy computers, but company-owned stores did not sell or even acknowledge the existence of non-Tandy products. In the mid-1980s, Radio Shack began a transition from its proprietary 8-bit computers to its proprietary IBM PC-compatible Tandy computers, removing the "Radio Shack" name from the product in an attempt to shake off the long-running nicknames "Radio Scrap" and "Trash-80" and make the product appeal to business users. Poor compatibility, shrinking margins and a lack of economies of scale led Radio Shack to exit the computer-manufacturing market in the 1990s after losing much of the desktop PC market to newer, price-competitive rivals like Dell. Tandy acquired the Computer City chain in 1991, and sold the stores to CompUSA in 1998. In 1994, RadioShack began selling IBM's Aptiva line of home computers. 
This partnership would last until 1998, when RadioShack partnered with Compaq and created 'The Creative Learning Center' as a store-within-a-store to promote desktop PCs. Similar promotions were tried with 'The Sprint Store at RadioShack' (mobile telephones), 'RCA Digital Entertainment Center' (home audio and video products), and 'PowerZone' (RadioShack's line of battery products, power supplies, and surge protectors). RadioShack Corporation In the mid-1990s, the company attempted to move out of small components and into more mainstream consumer markets, focusing on marketing wireless phones. This placed the chain, long accustomed to charging wide margins on specialized products not readily available from other local retailers, into direct competition against vendors such as Best Buy and Walmart. In May 2000, the company dropped the Tandy name altogether, becoming RadioShack Corporation. The leather operating assets were sold to The Leather Factory on November 30, 2000; that business remains profitable. House brands Realistic and Optimus were discontinued. In 1999, the company agreed to carry RCA products in a five-year agreement for an "RCA Digital Entertainment Center" store-within-a-store. When the RCA contract ended, RadioShack introduced its own Presidian and Accurian brands, reviving the Optimus brand in 2005 for some low-end products. Enercell, a house brand for dry cell batteries, remained in use until approximately 2014. 
Most of the RadioShack house brands had been dropped when Tandy divested its manufacturing facilities in the early 1990s; the original list included: Realistic (stereo, hi-fi and radio), Archer (antenna rotors and boosters), Micronta (test equipment), Tandy (computers), TRS-80 (proprietary computer), ScienceFair (kits), DuoFone (landline telephony), Concertmate (music synthesizer), Enercell (cells and batteries), Road Patrol (radar detectors, bicycle radios), Patrolman (Realistic radio scanner), Deskmate (software), KitchenMate, Stereo Shack, Supertape (recording tape), Mach One, Optimus (speakers and turntables), Flavoradio (pocket AM radios in various colours), Weatheradio, Portavision (small televisions) and Minimus (speakers). In 2000, RadioShack was one of multiple backers of the CueCat barcode reader, a marketing failure. RadioShack had invested US$35 million in the CueCat's maker, included the barcodes in its catalogs and distributed CueCat devices to customers at no charge. The last annual RadioShack printed catalogs were distributed to the public in 2003. Until 2004, RadioShack routinely asked for the name and address of purchasers so they could be added to mailing lists. Name and mailing address were requested for special orders (RadioShack Unlimited parts and accessories, Direc2U items not stocked locally), returns, check payments, RadioShack Answers Plus credit card applications, service plan purchases and carrier activations of cellular telephones. On December 20, 2005, RadioShack announced the sale of its newly built riverfront Fort Worth, Texas headquarters building to German-based KanAm Grund; the property was leased back to RadioShack for 20 years. In 2008, RadioShack assigned this lease to the Tarrant County College District (TCC), remaining in 400,000 square feet of the space as its headquarters. In 2005, RadioShack parted with Verizon for a 10-year agreement with Cingular (later AT&T) and renegotiated its 11-year agreement with Sprint. 
In July 2011, RadioShack ended its wireless partnership with T-Mobile, replacing it with the "Verizon Wireless Store" within a store. Under the leadership of Jim Hamilton, 2005 had marked a banner year for wireless: RadioShack sold more mobile phones than Walmart, Circuit City and Best Buy combined. RadioShack had not made products under the Realistic name since the early 1990s. Support for many of Radio Shack's traditional product lines, including amateur radio, had ended by 2006. A handful of small-town franchise dealers used their ability to carry non-RadioShack merchandise to bring in parts from outside sources, but these represented a minority. PointMobl and "The Shack" In mid-December 2008, RadioShack opened three concept stores under the name "PointMobl" to sell wireless phones and service, netbooks, iPods and GPS navigation devices. The three Texas stores (Dallas, Highland Village and Allen) were furnished with white fixtures like those in the remodelled wireless departments of individual RadioShack stores, but no relationship to RadioShack itself was communicated to customers. Had the test proved successful, RadioShack could have moved to convert existing RadioShack locations into PointMobl stores in certain markets. While some PointMobl products, such as car power adapters and phone cases, were carried as store-brand products in RadioShack stores, the stand-alone PointMobl stores were closed and the concept abandoned in March 2011. In August 2009, RadioShack rebranded itself as "The Shack". The campaign increased sales of mobile products, but at the expense of its core components business. RadioShack aggressively promoted Dish Network subscriptions. In November 2012, RadioShack introduced Amazon Locker parcel pick-up services at its stores, only to end the program in September 2013. In 2013, the chain made token attempts to regain the do-it-yourself market, including a new "Do It Together" slogan. 
Long-time staff observed a slow and gradual shift away from electronic parts and customer service and toward promotion of wireless sales and add-ons; the pressure to sell gradually increased, while the focus on training and product knowledge decreased. Morale was abysmal; longtime employees who had been paid bonuses and retirement benefits in stock options saw the value of these instruments fade away. Financial decline In 1998, RadioShack called itself the single largest seller of consumer telecommunications products in the world; its stock reached its peak a year later. InterTAN, a former Tandy subsidiary, sold the Tandy UK stores in 1999 and the Australian stores in 2001. InterTAN was sold (with its Canadian stores) to rival Circuit City in 2004. The RadioShack brand remained in use in the United States, but the 21st century proved a period of long decline for the chain, which was slow to respond to key trends such as e-commerce, the entry of competitors like Best Buy and Amazon.com, and the growth of the maker movement. By 2011, smartphone sales, rather than general electronics, accounted for half of the chain's revenue. The traditional RadioShack clientele of do-it-yourself tinkerers was increasingly sidelined: electronic parts formerly stocked in stores were now mostly available only through online special order. Store employees concentrated their efforts on selling profitable mobile contracts, while other customers seeking assistance were neglected and left the stores in frustration. Demand for consumer electronics was also increasingly weakened by consumers buying the items online. 2004: "Fix 1500" initiative In early 2004, RadioShack introduced Fix 1500, a sweeping program to "correct" inventory and profitability issues company-wide. The program put the 1,500 lowest-graded store managers, of over 5,000, on notice of the need to improve. Managers were graded not on tangible store and personnel data but on one-on-one interviews with district management. 
Typically, a 90-day period was given for the manager to improve (with another manager then selected for Fix 1500 in their place). A total of 1,734 store managers were reassigned as sales associates or terminated in a six-month period. Also during this period, RadioShack cancelled the employee stock purchase plan. By the first quarter of 2005, the metrics of skill assessment used during Fix 1500 had already been discarded, and the corporate officer who created the program had resigned. In 2004, RadioShack was the target of a class-action lawsuit in which more than 3,300 current or former RadioShack managers alleged the company required them to work long hours without overtime pay. In an attempt to suppress the news, the company launched a successful strategic lawsuit against public participation (SLAPP) targeting Bradley D. Jones, the webmaster of RadioShackSucks.com and a former RadioShack dealer of 17 years. 2006: Management problems On February 20, 2006, CEO David Edmondson admitted to "misstatements" on his curriculum vitae and resigned after the Fort Worth Star-Telegram debunked his claim to degrees in theology and psychology from Heartland Baptist Bible College. Chief operating officer Claire Babrowski briefly took over as CEO and president. A 31-year veteran of McDonald's Corporation, where she had been vice president and Chief Restaurant Operations Officer, Babrowski had joined RadioShack several months prior. She left the company in August 2006, later becoming CEO and Executive Vice President of Toys "R" Us. RadioShack's board of directors appointed Julian C. Day as chairman and chief executive officer on July 7, 2006. Day had financial experience and had played a key role in revitalizing such companies as Safeway, Sears and Kmart, but lacked the practical front-line sales experience needed to run a retail company. The Consumerist named him one of the "10 Crappiest CEOs" of 2009 (among consumer-facing companies, according to their own employees). 
He resigned in May 2011. RadioShack Chief Financial Officer James "Jim" Gooch succeeded Day as CEO in 2011, but "agreed to step down" 16 months later following a 73% plunge in the price of the stock. On February 11, 2013, RadioShack Corp. hired Joseph C. Magnacca from Walgreens, citing his retail experience. 2006: Corporate layoffs and new strategy In the spring of 2006, RadioShack announced a strategy to increase average unit volume, lower overhead costs, and grow profitable square footage. In early to mid-2006, RadioShack closed nearly 500 locations. It was determined that some stores were too close to each other, causing them to compete with one another for the same customers. Most of the stores closed in 2006 brought in less than US$350,000 in revenue each year. Despite these actions, stock prices plummeted amid what was otherwise a booming market. On August 10, 2006, RadioShack announced plans to eliminate a fifth of its company headquarters workforce to reduce overhead expense, improving its long-term competitive position while supporting a significantly smaller number of stores. On Tuesday, August 29, the affected workers received an e-mail: "The work force reduction notification is currently in progress. Unfortunately your position is one that has been eliminated." Four hundred and three workers were given 30 minutes to collect their personal effects, say their goodbyes to co-workers and then attend a meeting with their senior supervisors. Instead of issuing severance payments immediately, the company withheld them to ensure that company-issued BlackBerrys, laptops and cellphones were returned. This move drew immediate widespread public criticism for its lack of sensitivity. 2009: Customer relations problems RadioShack and the Better Business Bureau of Fort Worth, Texas met on April 23, 2009 to discuss unanswered and unresolved complaints. The company implemented a plan of action to address existing and future customer service issues. 
Stores were directed to post a sign with the district manager's name, the question "How Are We Doing?" and a direct toll-free number to the individual district office for their area. RadioShackHelp.com was created as another portal for customers to resolve their issues through the Internet. The BBB subsequently upgraded RadioShack from an "F" to an "A" rating; this was changed to "no rating" after the 2015 bankruptcy filing. According to an experience ratings report published by Temkin Group, an independent research firm, RadioShack was ranked as the retailer with the worst overall customer experience; it maintained this position for six consecutive years. 2012–2014: Financial distress From 2000 to 2011, RadioShack spent US$2.6 billion repurchasing its own stock in an attempt to prop up a share price which fell from US$24.33 to US$2.53; the buyback and the stock dividend were suspended in 2012 to conserve cash and reduce debt as the company continued to lose money. Company stock had declined 81 percent since 2010 and was trading well below book value. The stock reached an all-time low on April 14, 2012. In September 2012, RadioShack's head office laid off 130 workers after a US$21 million quarterly loss. Layoffs continued in August 2013; headquarters employment dropped from more than 2,000 before the 2006 layoffs to slightly fewer than 1,000 in late 2013. At the end of 2013, the chain owned 4,297 US stores. The company had received a cash infusion in 2013 from Salus Capital Partners and Cerberus Capital Management. This debt carried onerous conditions, preventing RadioShack from gaining control over costs by limiting store closures to 200 per year and restricting the company's refinancing efforts. With too many underperforming stores remaining open, the chain continued to spiral toward bankruptcy. 
On March 4, 2014, the company announced a net trading loss for 2013 of US$400.2 million, well above the 2012 loss of US$139.4 million, and proposed a restructuring which would close 1,100 lower-performing stores, almost 20% of its US locations. On May 9, 2014, the company reported that creditors had prevented it from carrying out those closures, with one lender reasoning that fewer stores would mean fewer assets securing the loan, reducing any recovery it would get in a bankruptcy reorganization. On June 10, 2014, RadioShack said that it had enough cash to last 12 months, but that lasting a year depended on sales growing. Sales had fallen for nine straight quarters, and by year's end the company had realized a loss in "each of its 10 latest quarters". On June 20, 2014, RadioShack's stock price fell below US$1, triggering a July 25 warning from the New York Stock Exchange that it could be delisted for failure to maintain a stock price above $1. On July 28, 2014, Mergermarket's Debtwire reported RadioShack was discussing Chapter 11 bankruptcy protection as an option. On September 11, 2014, RadioShack admitted it might have to file for bankruptcy, and would be unable to finance its operations "beyond the very near term" unless the company was sold, restructured, or received a major cash infusion. On September 15, 2014, RadioShack replaced its CFO with a bankruptcy specialist. On October 3, RadioShack announced an out-of-court restructuring, a 4:1 dilution of shares, and a rights issue priced at 40 cents a share. RadioShack's stock was halted on the New York exchange for the entire day. Despite the debt restructuring proposal, in December Salus and Cerberus informed RadioShack that it was in default of the loan they had provided as a cash infusion in 2013. At the end of October 2014, quarterly figures indicated RadioShack was losing US$1.1 million per day. 
A November 2014 attempt to keep the stores open from 8 a.m. to midnight on Thanksgiving Day drew a sharp backlash from employees and a few resignations; comparable store sales for the three days (Thursday through Saturday) were 1% lower than the prior year, when the stores had been open for two of the days. The company's problems maintaining inventories of big-ticket items, such as Apple's iPhone 6, further cut into sales. By December 2014, former employees were suing RadioShack for having encouraged them to invest 401(k) retirement savings in company stock, alleging a breach of the fiduciary duty to "prudently" handle the retirement fund that caused "devastating losses" in the retirement plans as the stock dropped from US$13 in 2011 to 38 cents at the end of 2014. These claims were dismissed by the Fifth U.S. Circuit Court of Appeals in 2018. 2015: Bankruptcy On January 15, 2015, The Wall Street Journal reported RadioShack had delayed rent payments to some commercial landlords and was preparing a bankruptcy filing that could come as early as February. Officials of the company declined to comment on the report. A separate report by Bloomberg claimed the company might sell leases to as many as half its stores to Sprint. On February 2, 2015, the company was delisted from the New York Stock Exchange after its average market capitalization remained below US$50 million for longer than thirty consecutive days. That same day, Bloomberg News reported RadioShack was in talks to sell half of its stores to Sprint and close the rest, which would effectively end RadioShack's existence as a stand-alone retailer. Amazon.com and Brookstone were also mentioned as potential bidders, the former at the time seeking to establish a brick-and-mortar presence. On February 3, RadioShack defaulted on its loan from Salus Capital. 
On the days following these reports, some employees were instructed to reduce prices and transfer inventory out of stores designated for closing to those that would remain open during the presumed upcoming bankruptcy proceedings, while the rest remained "in the dark" as to the company's future. Many stores had already closed abruptly on Sunday, February 1, 2015, the first day of the company's fiscal year, with employees given only a few hours' advance notice. Some had been open with a skeleton crew, little inventory and reduced hours only because the Salus Capital loan terms limited the chain to 200 store closures a year. A creditor group alleged the chain had remained on life support instead of shutting down earlier and cutting its losses merely so that Standard General could avoid paying on credit default swaps which expired on December 20, 2014. On February 5, 2015, RadioShack announced that it had filed for Chapter 11 bankruptcy protection. Using bankruptcy to end contractual restrictions that had required it to keep unprofitable stores open, the company immediately published a list of 1,784 stores which it intended to close, a process it wished to complete by the month's end to avoid an estimated US$7 million in March rent. Customers had initially been given until March 6, 2015 to return merchandise or redeem unused gift cards. However, after legal pressure from the Attorneys General of several states, RadioShack ultimately agreed to reimburse customers for the value of unused gift cards. On March 31, 2015, the bankruptcy court approved a US$160 million offer by the Standard General affiliate General Wireless, gaining ownership of 1,743 RadioShack locations. As part of the deal, the company entered into a partnership with Sprint, in which Sprint would become a co-tenant at 1,435 RadioShack locations and establish store-within-a-store areas devoted to selling its wireless brands, including Sprint, Boost Mobile and Virgin Mobile. 
The stores would collect commissions on the sale of Sprint products, and Sprint would assist in promotion. Sprint stated that this arrangement would more than double its retail footprint; the company previously had around 1,100 company-owned retail outlets, in comparison to the over 2,000 run by AT&T Mobility. Although Sprint would nominally be a co-tenant, its branding would be more prominent in promotion and exterior signage than RadioShack's. The acquisition did not include rights to RadioShack's intellectual property (such as its trademarks), rights to RadioShack's franchised locations, or customer records, which were to be sold separately. RadioShack was criticized for including the personally identifying information of 67 million of its customers as part of its assets for sale during the proceedings, despite its long-standing policy and a promise to customers that data would never be sold for any reason at any time. The Federal Trade Commission and the Attorneys General of 38 states fought against this proposal. The sale of this data was ultimately approved, albeit greatly reduced in scope from what was initially proposed. General Wireless Operations, Inc. Standard General acquired the RadioShack brand after RadioShack Corporation filed for bankruptcy in 2015. It formed an affiliate, General Wireless Operations, to act as the new parent company for the brand. This new RadioShack focused on its partnership with Sprint in the hopes of carrying on the brand. Re-branded stores soft-launched on April 10, 2015, with a preliminary conversion of the stores' existing wireless departments to exclusively house Sprint brands, with all stores eventually to be renovated in waves to allocate larger spaces for Sprint. In May 2015, the acquisition of the "RadioShack" name and its assets by General Wireless for US$26.2 million was finalized. 
Chief marketing officer Michael Tatelman emphasized that the company that emerged from the 2015 proceedings was an entirely new company, and went on to affirm that the old RadioShack did not re-emerge from bankruptcy, calling it "defunct". Less than one year after the bankruptcy events of 2015, Ron Garriques and Marty Amschler stepped down from their respective chief executive officer and chief financial officer positions; Garriques had held his position for nine months. 2017: Bankruptcy It was speculated on March 2, 2017 that General Wireless was preparing to take RadioShack through its second bankruptcy in two years. The speculation was fueled by the layoff of dozens of corporate office employees and plans to shutter two hundred stores, and further by "all sales final" banners appearing on the RadioShack website for in-store purchases at all locations. RadioShack's Chapter 11 bankruptcy was formally filed on March 8, 2017. Of the then 1,300 remaining stores, several hundred were converted into Sprint-only locations. Despite declaring Chapter 11 bankruptcy (typically reserved for reorganization of debt) instead of Chapter 7 (liquidation), the company engaged in liquidation of all inventory, supplies, and store fixtures, as well as auctioning off old memorabilia. On May 26, RadioShack announced plans to close all but 70 corporate stores and shift its business primarily to online. These stores closed after Memorial Day weekend of 2017. Of the remaining stores, 50 more closed by the end of June 2017. One particular store closing in April 2017 garnered widespread media attention when a Facebook account, calling itself "RadioShack - Reynoldsburg, Ohio", began lashing out at customers with messages such as "We closed. F—k all of you." and "Always hated all you pr—k customers anyway." RadioShack addressed these posts on its official Facebook page, denying any involvement. 
On June 29, 2017, RadioShack's creditors sued Sprint, claiming that Sprint had sabotaged the co-branded locations by building new Sprint retail stores near well-performing RadioShack locations, as determined by confidential sales information. The suit argued that Sprint's actions "destroyed nearly 6,000 RadioShack jobs". General Wireless announced plans on June 12, 2017 to auction off the RadioShack name and IP, with bidding to begin on July 18. Bidding concluded on July 19, 2017, when one of RadioShack's creditors, Kensington Capital Holdings, obtained the RadioShack brand and other intellectual properties for US$15 million. Kensington was the sole bidder. In October 2017, General Wireless officially exited bankruptcy and was allowed to retain the company's warehouse, e-commerce site, dealer network operations, and up to 28 stores. Post-bankruptcy In late July 2018, RadioShack partnered with HobbyTown USA to open around 100 RadioShack "Express" stores, with HobbyTown owners selecting which RadioShack products to carry. RadioShack dealerships had re-opened around 500 stores by October 2018. By November 2018, RadioShack had signed 77 of HobbyTown's 137 franchise stores. Retail Ecommerce Ventures (REV) In November 2020, RadioShack's intellectual property and its remaining operations (about 400 independent authorized dealers, about 80 HobbyTown USA affiliate stores, and its online sales operation) were purchased by Retail Ecommerce Ventures (REV), a Florida-based company that had previously purchased defunct retailers Pier 1 Imports, Dress Barn, Modell's Sporting Goods, and Linens 'n Things, along with The Franklin Mint. In December 2021, REV announced that it would use the brand name for a cryptocurrency platform called RadioShack DeFi ("DeFi" being short for decentralized finance). The platform would allow customers to freely swap existing cryptocurrency tokens for a token called RADIO. 
Corporate headquarters In the 1970s, the company had a new headquarters, "Tandy Towers", built on Throckmorton Street in downtown Fort Worth. In 2001, RadioShack bought the former Ripley Arnold public housing complex in Downtown Fort Worth along the Trinity River for US$20 million. The company razed the complex and had a corporate headquarters campus built after the City of Fort Worth approved a 30-year economic agreement to ensure that the company stayed in Fort Worth. RadioShack moved into the campus in 2005. In 2009, with two years left on a rent-free lease of the building, the Fort Worth Star-Telegram reported that the company was considering a new site for its headquarters. The Tampa Bay Business Journal reported rumors among Tampa Bay Area real estate brokers and developers that RadioShack might select Tampa as the site of its headquarters. In 2010, however, RadioShack announced efforts to remain at its current site. The headquarters was ultimately reduced to a small group after the second bankruptcy filing. In September 2017, what was left of RadioShack, a staff of about 50 people, left the downtown location, moving to a warehouse on Terminal Road just north of the Stockyards. International operations InterTAN Inc. In 1986, Tandy Corp. announced it would create a spinoff of its international retail operations, called InterTAN Inc. The new company would take over operations of over 2,000 international company-owned and franchised stores, while Tandy retained its 7,253 domestic outlets and 30 of its manufacturing facilities. InterTAN had two main units: Tandy Electronics Ltd., which operated in Canada, the U.K., France, Belgium, West Germany, and the Netherlands; and Tandy Australia Ltd., which operated in Australia. At the end of 1989, there were 1,417 stores operated by InterTAN under the Tandy or Radio Shack names. InterTAN operated Tandy or Radio Shack stores in the UK until 1999 and Australia until 2001. 
RadioShack branded merchandise accounted for 9.5% of InterTAN's inventory purchases in its 2002-2003 fiscal year, the last complete year before the Circuit City acquisition, and later disappeared from stores entirely. Canada Following the creation of InterTAN, Tandy Electronics operated 873 stores in Canada and owned the rights to the RadioShack name there. In 2004, Circuit City, a competitor of RadioShack, purchased InterTAN, which held the rights to use the RadioShack name in Canada until 2010. RadioShack Corp., which operated RadioShack stores in the U.S., sued InterTAN in an attempt to end the contract for the company name early. On March 24, 2005, a U.S. district court judge ruled in favour of RadioShack, requiring InterTAN to stop using the brand name in products, packaging or advertising by June 30, 2005. The Canadian stores were rebranded by Circuit City under the name The Source. RadioShack briefly re-entered the Canadian market, but eventually closed all stores to refocus attention on its core U.S. business. The Source was acquired by Bell Canada in 2009. Asia In March 2012, the Malaysian company Berjaya Retail Berhad entered into a franchising agreement with RadioShack. Later that year, the company announced a second franchising deal with the Chinese company Cybermart. Berjaya had six stores in Malaysia before it quietly ceased operations in 2017. Mexico In 1986, Grupo Gigante signed a deal with Tandy Corporation to operate Radio Shack branded stores in Mexico. After growing its electronics chain within Mexico to 24 stores, Grupo Gigante signed a new deal with Tandy in 1992 to form a new joint venture called Radio Shack de México, in which both companies had an equal share. As part of the deal, Grupo Gigante transferred its electronics stores to Radio Shack de México. In 2008, Grupo Gigante separated from Radio Shack (by then renamed RadioShack Corporation) and sold its share of the joint venture to RadioShack Corp. for $42.3 million. 
In June 2015, Grupo Gigante repurchased 100 percent of RadioShack de México, including stores, warehouses, and all related brand names and intellectual properties for use within Mexico, from the U.S. Bankruptcy Court in Delaware for US$31.5 million. The chain had 247 stores in Mexico at the time of the sale. Following the sale, all Radio Shack stores, warehouses, brands, assets, and related trademarks in Mexico are owned by RadioShack de México S.A. de C.V., a subsidiary of Grupo Gigante. A major Mexican news magazine had reported in March 2015 that Grupo Gigante had actually purchased 100% of the stock in RadioShack de México from RadioShack Corporation for US$31.8 million two months prior to the bankruptcy filing, but had only had to hand over US$11.8 million to RadioShack Corp. because it also assumed approximately US$20 million in debt liabilities. While RadioShack was facing a second bankruptcy in the United States, Grupo Gigante announced in October 2017 that it planned to expand the RadioShack brand within Mexico by opening eight more stores. Latin America When RadioShack Corporation filed for bankruptcy the first time in 2015, the Unicomer Group (Grupo Unicomer) purchased the RadioShack brand from the bankruptcy court for its exclusive use in Latin America and the Caribbean, except Mexico. Unicomer, through its corporate parent Regal Forest Holding Co. Ltd., paid $5 million for the brand. The company's relationship with RadioShack dated back to 1998, when Unicomer opened its first RadioShack franchise store in El Salvador. It later expanded into Honduras, Guatemala, and Nicaragua. By January 2015, Unicomer had 57 RadioShack stores distributed throughout four countries in Central America. In April 2015, Unicomer began receiving franchise payments from franchises in several countries in which Unicomer had not previously had a business presence. It expanded into Trinidad in 2016 and into Jamaica, Barbados, and Guyana in 2017. 
By the end of 2017, Unicomer had company-owned stores in Barbados, El Salvador, Guatemala, Guyana, Honduras, Jamaica, Nicaragua, and Trinidad, while receiving franchise payments from independent franchised stores in Antigua, Aruba, Costa Rica, Paraguay, and Peru, countries in which Unicomer did not have a business presence of its own. Since 2014, the independent company Coolbox has been an authorized dealer for RadioShack products in Peru. In April 2018, the RadioShack brand returned to Bolivia when franchisee Cosworld Trading opened two franchised stores for Unicomer in the capital city of La Paz. The previous RadioShack stores had closed in 2015 as a result of RadioShack's first bankruptcy filing. Middle East When Radio Shack filed for bankruptcy the first time in 2015, the Egypt-based Delta RS for Trading purchased the Radio Shack brand from the bankruptcy court for its exclusive use in the Middle East and North Africa for US$5 million. Delta RS for Trading, as Radio Shack Egypt, had opened its first Radio Shack franchised store in 1998 in Nasr City. By March 2003, Radio Shack Egypt had 65 company-operated stores plus 15 sub-franchised stores. In 2017, the Egyptian government accused Radio Shack Egypt and its parent Delta RS of aiding the Muslim Brotherhood. Other operations Corporate citizenship In 2006, RadioShack supported the National Center for Missing & Exploited Children by providing store presence for the StreetSentz program, a child identification and educational kit offered to families without charge. RadioShack supported United Way of America Charities to assist their Oklahoma and Texas relief efforts after the 2013 Moore tornado. RadioShack's green initiative promotes the Rechargeable Battery Recycling Corporation, which accepts end-of-life rechargeable batteries and wireless phones dropped off in-store to be safely recycled. 
Other retailer partnerships In August 2001, RadioShack opened kiosk-style stores inside Blockbuster outlets, only to abandon the project in February 2002; CEO Len Roberts announced that the stores did not meet expectations. RadioShack operated wireless kiosks within 417 Sam's Club discount warehouses from 2004 to 2011. The kiosk operations, purchased from Arizona-based Wireless Retail Inc., operated as a subsidiary, SC Kiosks Inc., with employees contracted through RadioShack Corporation. No RadioShack-branded merchandise was sold. The kiosks closed in 2011, costing RadioShack an estimated US$10–15 million in 2011 operating income. RadioShack then attempted a joint venture with Target to deploy mobile telephone kiosks in 1,490 Target stores by April 2011. In April 2013, RadioShack's partnership with Target ended, and the Target Mobile in-store kiosks were turned over to a new partnership with Brightstar and MarketSource. No-contract wireless On September 5, 2012, RadioShack, in a partnership with Cricket Wireless, began offering its own branded no-contract wireless services using Cricket and Sprint's nationwide networks. The service was discontinued on August 7, 2014; clients who had already purchased the service from RadioShack continued to receive service from Cricket Wireless. Cycling team sponsorship In 2009, the company became the main sponsor of a new cycling team, Team RadioShack, with Lance Armstrong and Johan Bruyneel. RadioShack featured Armstrong in a number of television commercials and advertising campaigns. RadioShack came under fire for having Armstrong as a spokesperson in 2011, when allegations surfaced that the cyclist had used performance-enhancing drugs. Lawsuits and litigation In April 2004, AutoZone brought suit against RadioShack for using the name PowerZone to promote a section of its retail stores, citing trademark infringement. The lawsuit was dropped in June of the same year due to lack of evidence. 
In June 2011, a customer sued Sprint and RadioShack after finding pornography on their newly purchased cell phones. In 2012, a Denver jury awarded $674,938 to David Nelson, age 55, a 25-year RadioShack employee who had been fired by his supervisor in retaliation after complaining about age discrimination. In 2013, a federal jury awarded over $1 million in an age discrimination suit to a 54-year-old, longtime RadioShack store manager who was fired in 2010 from the San Francisco store he had managed since 1998. A 2013 class action judgment found that RadioShack had violated privacy requirements between August 24, 2010, and November 21, 2011, by printing the expiration date of clients' credit or debit cards on store receipts. A July 2014 ruling in Verderame v. RadioShack Corp., 13-02539, in the US District Court, Eastern District of Pennsylvania (Philadelphia), found that RadioShack owed its store managers a possible US$5.8 million for unpaid overtime in the state. In popular culture A "Radio Shock" store (owned by the "Dandy Corporation") appeared in the original 1991 release of Space Quest IV, displaced by "Hz. So Good" in later editions because of threats of legal action by Tandy. Radio Shack is featured prominently in Short Circuit 2, in which a store serves as a "clinic" for Johnny 5 while he repairs himself after being assaulted by thieves. Radio Shack is mentioned and briefly featured in the pilot episode of Young Sheldon. Visits to RadioShack are a frequent plot point in the Young Sheldon series, building off allusions to childhood visits made by the character Sheldon Cooper in its parent series, The Big Bang Theory. The family returns to the Radio Shack store in a later episode, where his mother purchases him a Tandy 1000. RadioShack appears in the second season of the Netflix series Stranger Things as the workplace of Bob Newby. In one scene, an Armatron (a product actually sold at Radio Shack during that period) can be seen on a shelf above his head. 
In the movie Ocean's Eleven, after Livingston asks an FBI agent not to touch his equipment by asking, "Do you see me grabbing the gun out of your holster and waving it around?", the agent retorts, "Hey 'Radio Shack', relax". In Kung Pow! Enter the Fist, Ling's mortally wounded father randomly asks The Chosen One to "let me know if you see a Radio Shack" as The Chosen One leads him into a town in search of help. In the movie Die Hard 2, an air traffic control engineer says, "Yeah right, somebody wanna run down to Radio Shack and get a transmitter?" In the television series The Sopranos, Tony breaks a phone and tells his son, "Forgot to tell you I got a job at Radio Shack. Product testing. Giving that phone an F for durability." References Further reading Hayden, Andrew, "Radio Shack: A Humble Beginning for an Electronics Giant", antiqueradio.com, February 2007 External links Radio Shack Records in Fort Worth Library Archives Radioshackcatalogs.com, an 80-year archive of RadioShack catalogs 1921 establishments in Massachusetts Companies based in Fort Worth, Texas Companies formerly listed on the New York Stock Exchange Companies that filed for Chapter 11 bankruptcy in 2015 Companies that filed for Chapter 11 bankruptcy in 2017 Consumer electronics retailers in the United States Electronic kit manufacturers Home computer hardware companies Loudspeaker manufacturers American companies established in 1921 Retail companies established in 1921 2015 mergers and acquisitions
7829
https://en.wikipedia.org/wiki/Chaos%20Computer%20Club
Chaos Computer Club
The Chaos Computer Club (CCC) is Europe's largest association of hackers with registered members. Founded in 1981, the association is incorporated as an eingetragener Verein in Germany, with local chapters (called Erfa-Kreise) in various cities in Germany and the surrounding countries, particularly where there are German-speaking communities. Since 1985, some chapters in Switzerland have organized an independent sister association called the (CCC-CH) instead. The CCC describes itself as "a galactic community of life forms, independent of age, sex, race or societal orientation, which strives across borders for freedom of information…". In general, the CCC advocates more transparency in government, freedom of information, and the human right to communication. Supporting the principles of the hacker ethic, the club also fights for free universal access to computers and technological infrastructure as well as the use of open-source software. The CCC spreads an entrepreneurial vision refusing capitalist control. It has been characterised as "…one of the most influential digital organisations anywhere, the centre of German digital culture, hacker culture, hacktivism, and the intersection of any discussion of democratic and digital rights". Members of the CCC have demonstrated and publicized a number of important information security problems. The CCC frequently criticizes new legislation and products with weak information security which endanger citizen rights or the privacy of users. Notable members of the CCC regularly function as expert witnesses for the German constitutional court, organize lawsuits and campaigns, or otherwise influence the political process. Activities Regular events The CCC hosts the annual Chaos Communication Congress, Europe's biggest hacker gathering. When the event was held in the Hamburg congress center in 2013, it drew guests. For the 2016 installment, guests were expected, with additional viewers following the event via live streaming. 
Every four years, the Chaos Communication Camp is the outdoor alternative for hackers worldwide. From 2009 to 2013, the CCC also held a yearly conference called SIGINT in Cologne, which focused on the impact of digitisation on society; the SIGINT conference was discontinued in 2014. The four-day conference in Karlsruhe, with more than 1,500 participants, is the second-largest annual event. Another yearly CCC event, taking place on the Easter weekend, is the Easterhegg, which is more workshop-oriented than the other events. The CCC often uses the c-base station located in Berlin as an event location or as function rooms. Publications, Outreach The CCC has published the irregular magazine Datenschleuder (data slingshot) since 1984. The Berlin chapter produces Chaosradio, a monthly two-hour talk radio show that picks up various technical and political topics. The program is aired on a local radio station and on the internet. Other programs have emerged in the context of Chaosradio, including radio programs offered by some regional Chaos Groups and the podcast spin-off CRE by Tim Pritlove. Many of the chapters of the CCC participate in the volunteer project Chaos macht Schule, which supports teaching in local schools. Its aims are to improve the technology and media literacy of pupils, parents, and teachers. CCC members are present in big tech companies and in administrative bodies. Andy Müller-Maguhn, one of the CCC's spokespersons since 1986, was a member of the executive committee of ICANN (the Internet Corporation for Assigned Names and Numbers) between 2000 and 2002. CryptoParty The CCC works to raise public awareness of data privacy issues. Some of its local chapters support or organize so-called CryptoParties to introduce people to the basics of practical cryptography and internet anonymity. 
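As a rough illustration of the kind of material such a workshop might cover (this particular exercise is an assumption for illustration, not documented CCC curriculum), the sketch below shows salted password hashing in Python's standard library, one reason a leaked credential database need not reveal the passwords themselves:

```python
import hashlib
import secrets

def hash_password(password: str, salt: bytes = None):
    """Derive a slow, salted hash of the password (illustrative parameters)."""
    if salt is None:
        salt = secrets.token_bytes(16)  # random per-password salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    return secrets.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("correct horse battery staple")
assert check_password("correct horse battery staple", salt, digest)
assert not check_password("wrong guess", salt, digest)
```

The per-password salt and the deliberately slow key-derivation function are the two ideas such introductions typically stress: identical passwords hash differently, and brute-force guessing becomes expensive.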
History Founding The CCC was founded in West Berlin on 12 September 1981, at a table that had previously belonged to the Kommune 1, in the rooms of the newspaper Die Tageszeitung, by Wau Holland and others, in anticipation of the prominent role that information technology would play in the way people live and communicate. BTX-Hack The CCC became world-famous in 1984 when it drew public attention to the security flaws of the German Bildschirmtext computer network by causing it to debit DM in a Hamburg bank in favor of the club. The money was returned the next day in front of the press. Prior to the incident, the system provider had failed to react to proof of the security flaw provided by the CCC, claiming to the public that its system was safe. Bildschirmtext was the biggest commercially available online system targeted at the general public in its region at that time, run and heavily advertised by the German telecommunications agency Deutsche Bundespost, which also strove to keep up-to-date alternatives out of the market. Karl Koch In 1987, the CCC was peripherally involved in the first cyberespionage case to make international headlines. A group of German hackers led by Karl Koch, who was loosely affiliated with the CCC, was arrested for breaking into US government and corporate computers and then selling operating-system source code to the Soviet KGB. This incident was portrayed in the movie 23. GSM-Hack In April 1998, the CCC successfully demonstrated the cloning of a GSM customer card, breaking the COMP128 encryption algorithm used at that time by many GSM SIMs. Project Blinkenlights In 2001, the CCC celebrated its twentieth birthday with an interactive light installation dubbed Project Blinkenlights that turned the building Haus des Lehrers in Berlin into a giant computer screen. A follow-up installation, Arcade, was created in 2002 by the CCC for the Bibliothèque nationale de France. 
In October 2008, the CCC's Project Blinkenlights went to Toronto, Ontario, Canada, with the project Stereoscope. Schäuble fingerprints In March 2008, the CCC acquired and published the fingerprints of German Minister of the Interior Wolfgang Schäuble. The club's magazine Datenschleuder also included the fingerprint on a film that readers could use to fool fingerprint readers. This was done to protest the use of biometric data in German identity devices such as e-passports. Staatstrojaner affair The Staatstrojaner (Federal Trojan horse) is a computer surveillance program installed secretly on a suspect's computer, which the German police use to wiretap Internet telephony. This "source wiretapping" is the only feasible way to wiretap in this case, since Internet telephony programs will usually encrypt the data when it leaves the computer. The Federal Constitutional Court of Germany has ruled that the police may only use such programs for telephony wiretapping, and for no other purpose, and that this restriction should be enforced through technical and legal means. On 8 October 2011, the CCC published an analysis of the Staatstrojaner software. The software was found to have the ability to remote-control the target computer, to capture screenshots, and to fetch and run arbitrary extra code. The CCC says that having this functionality built in is in direct contradiction to the ruling of the constitutional court. In addition, there were a number of security problems with the implementation. The software was controllable over the Internet, but the commands were sent completely unencrypted, with no checks for authentication or integrity. This leaves any computer under surveillance using this software vulnerable to attack. The captured screenshots and audio files were encrypted, but so incompetently that the encryption was ineffective. All captured data was sent over a proxy server in the United States, which is problematic since the data is then temporarily outside the German jurisdiction. 
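To make the missing safeguard concrete, the following is a minimal sketch of an authenticated command channel of the sort the CCC found absent; it is purely illustrative and has no connection to the trojan's actual code. Each command carries an HMAC tag over a shared secret, so a forged or tampered command is rejected:

```python
import hashlib
import hmac
import os

KEY = os.urandom(32)  # shared secret key (illustrative only)

def sign_command(command: bytes, key: bytes = KEY) -> bytes:
    """Prepend a 32-byte HMAC-SHA256 tag to the command."""
    tag = hmac.new(key, command, hashlib.sha256).digest()
    return tag + command

def verify_command(message: bytes, key: bytes = KEY):
    """Return the command if the tag checks out, else None."""
    tag, command = message[:32], message[32:]
    expected = hmac.new(key, command, hashlib.sha256).digest()
    # constant-time comparison avoids leaking tag bytes via timing
    return command if hmac.compare_digest(tag, expected) else None

signed = sign_command(b"capture_screenshot")
assert verify_command(signed) == b"capture_screenshot"
# a command without a valid tag is rejected
assert verify_command(b"\x00" * 32 + b"run_arbitrary_code") is None
```

Without such a check (and without encryption of the channel), anyone who could reach the infected machine over the network could issue commands to it, which is the vulnerability the CCC's analysis described.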
The CCC's findings were widely reported in the German press. This trojan has also been nicknamed R2-D2 because the string "C3PO-r2d2-POE" was found in its code; another alias for it is 0zapftis ("It's tapped!" in Bavarian, a sardonic reference to Oktoberfest). According to a Sophos analysis, the trojan's behavior matches that described in a confidential memo between the German Landeskriminalamt and a software firm called DigiTask; the memo was leaked on WikiLeaks in 2008. Among other correlations is the dropper's file name, short for Skype Capture Unit Installer. The 64-bit Windows version installs a digitally signed driver, but one signed by the non-existent certificate authority "Goose Cert". DigiTask later admitted selling spy software to governments. The Federal Ministry of the Interior released a statement in which it denied that R2-D2 had been used by the Federal Criminal Police Office (BKA); this statement, however, does not eliminate the possibility that it has been used by state-level German police forces. The BKA had previously announced (in 2007), however, that it had somewhat similar trojan software that can inspect a computer's hard drive. Domscheit-Berg affair Former WikiLeaks spokesman Daniel Domscheit-Berg was expelled from the national CCC (but not the Berlin chapter) in August 2011. This decision was revoked in February 2012. As a result of his role in the expulsion, board member Andy Müller-Maguhn was not reelected for another term. Phone authentication systems The CCC has repeatedly warned phone users of the weakness of biometric identification in the wake of the 2008 Schäuble fingerprints affair. In its "hacker ethics", the CCC includes "protect people's data", but also "Computers can change your life for the better". 
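The nickname points at a common identification technique: scanning a binary for distinctive byte strings, the way the "C3PO-r2d2-POE" marker tied samples to this trojan. A toy sketch of such a signature scan (the dictionary and function names are illustrative, not actual analyst tooling):

```python
# Known byte signatures mapped to sample names; the marker string below is
# the one reported in the Staatstrojaner's code, the mapping is illustrative.
SIGNATURES = {
    b"C3PO-r2d2-POE": "0zapftis / R2-D2",
}

def scan_blob(blob: bytes) -> list:
    """Return the names of all known signatures found in the blob."""
    return [name for sig, name in SIGNATURES.items() if sig in blob]

sample = b"\x90\x90...C3PO-r2d2-POE...\xcc"
assert scan_blob(sample) == ["0zapftis / R2-D2"]
assert scan_blob(b"benign bytes") == []
```

Real antivirus engines use far more robust matching (wildcards, hashes, behavioral rules), but fixed-string markers like this one are often the first clue that links a sample to a known family.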
The club regards privacy as an individual right: the CCC does not discourage people from sharing or storing personal information on their phones, but advocates better privacy protection and the use of specific browsing and sharing techniques by users. Apple TouchID Using "easy everyday means", the biometrics hacking team of the CCC was able to unlock an iPhone 5S from a photograph of the user's fingerprint on a glass surface. Samsung S8 iris recognition The Samsung Galaxy S8's iris recognition system claims to be "one of the safest ways to keep your phone locked and the contents private", as "patterns in your irises are unique to you and are virtually impossible to replicate", as quoted in official Samsung content. However, in some cases, using a high-resolution photograph of the phone owner's iris and a lens, the CCC claimed to be able to trick the authentication system. Fake Chaos Computer Club France The Chaos Computer Club France (CCCF) was a fake hacker organisation created in 1989 in Lyon, France, by Jean-Bernard Condat, under the command of Jean-Luc Delacour, an agent of the governmental agency Direction de la surveillance du territoire. The primary goal of the CCCF was to watch and gather information about the French hacker community, identifying the hackers who could harm the country. One journalist said that this organization also worked with the French National Gendarmerie. The CCCF had an electronic magazine called Chaos Digest (ChaosD). Between 4 January 1993 and 5 August 1993, seventy-three issues were published. 
See also 23 (film) c-base Chaos Communication Congress Chaosdorf, the local chapter of the Chaos Computer Club at Düsseldorf Datenschleuder Digitalcourage Digital identity Hacker culture Information privacy Netzpolitik.org Project Blinkenlights Security hacker Tron (hacker) Wau Holland Foundation References Further reading Chaos Computer Club hackers 'have a conscience', BBC News, 2011-02-11 External links CCC Events Blog Chaosradio Podcast Network Computer clubs in Germany Hacker groups Organisations based in Hamburg
35365068
https://en.wikipedia.org/wiki/Shadowrun%20Returns
Shadowrun Returns
Shadowrun Returns is a tactical role-playing game developed and published by Harebrained Schemes. It takes place in the science fantasy setting of the Shadowrun tabletop role-playing game. The game was crowd-funded through Kickstarter and released for Microsoft Windows, OS X, Linux, iOS, and Android in 2013. An expansion pack, Shadowrun: Dragonfall, was released in 2014; it was later converted into a standalone release, Shadowrun: Dragonfall – Director's Cut. In 2015, Harebrained Schemes launched another Kickstarter campaign to partially fund its next game, Shadowrun: Hong Kong. Similar to the Dragonfall – Director's Cut edition, Hong Kong was released in 2015 as a standalone release built on an upgraded version of the Shadowrun Returns engine. Gameplay Character generation The player can customize their character's gender and appearance. There are five races to choose from: humans, elves, dwarves, orcs, and trolls. The game lacks traditional character classes, but players may optionally play as one of six pregenerated archetypes: street samurai, cybernetically enhanced warriors who focus on weapon mastery; mages, who cast spells; deckers, who focus primarily on computer hacking; shamans, who summon spirits to assist them in battle; riggers, who control mechanical drones; and physical adepts, magically enhanced monks. As the story progresses, the player is given character points, known as "karma", to spend on improving their characters. If the player chose an archetype, it determines what skills and equipment the character starts with, but any character can gain any skill (e.g., deckers can spend karma to gain a shaman's summoning abilities). Besides the player character, up to three other characters can be hired during missions. Some non-player characters are required during certain missions. Exploration While gameplay is mostly linear, some exploration is possible while completing objectives. 
The player can converse with various characters; different statistics and skills give new dialogue options, affect quest progression, or give different rewards. The player can interact with the environment in some ways, for instance, pushing aside objects or hacking terminals to find hidden rooms, gaining access to new routes to their main objective, or finding items to use or sell. Mages can see magical ley lines, which enhance their abilities, and shamans summon spirits in designated places. Deckers can enter a virtual reality world known as the Matrix at specific points. The decker's avatar engages in a minigame in which they fight IC (intrusion countermeasures) programs and enemy deckers while trying to gather data and hack devices. For example, a decker may turn off elevators to stop enemy reinforcements or turn them on to progress further. Combat Combat is turn-based and includes tactical elements, such as taking cover. Players control the actions of their team, after which the enemies take theirs. Each action has a cost in terms of action points. Characters start with a base of two action points per turn and later gain a third point. They can gain or lose action points based on abilities, spells, or items. Action points are used in activities such as moving, firing a weapon, reloading, or using a spell or item. Each character carries up to three weapons and can switch between them at no cost. Weapons have different attacks, depending on the weapon and characters' skills. Ammunition is unlimited, but ranged weapons need to be reloaded. Riggers can equip drones like weapons and use their action points to manually control them, attacking enemies or healing allies. Those with shaman abilities can summon additional allied spirits to the field, either through consumable items or at special points where spirits linger. Depending on how much power the shaman imparts on them and how long they have been summoned, these spirits have a chance to break free and become hostile or flee. 
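The action-point economy described above can be sketched in a few lines; the class, the specific costs, and the method names below are illustrative assumptions, not taken from the game's actual code:

```python
# Illustrative action costs; the game's real costs vary by weapon and skill.
ACTION_COSTS = {"move": 1, "fire": 1, "reload": 1, "cast_spell": 2}

class Combatant:
    def __init__(self, base_ap: int = 2):
        self.base_ap = base_ap  # starts at 2 per turn, later grows to 3
        self.ap = base_ap

    def start_turn(self) -> None:
        self.ap = self.base_ap  # points refresh at the start of each turn

    def act(self, action: str) -> bool:
        """Spend points for an action; refuse it if the budget is exhausted."""
        cost = ACTION_COSTS[action]
        if cost > self.ap:
            return False
        self.ap -= cost
        return True

runner = Combatant()
assert runner.act("move") and runner.act("fire")
assert not runner.act("reload")      # two points already spent this turn
runner.start_turn()
assert runner.act("cast_spell")      # a 2-point spell fits in a fresh turn
```

The budget-per-turn structure is what forces the tactical trade-offs the section describes: moving to cover costs points that could otherwise have gone to an attack or a reload.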
Plot The game ships with a campaign called "Dead Man's Switch". Further campaigns can be downloaded from external websites. Dead Man's Switch The player assumes the role of a shadowrunner who receives a pre-recorded message from his or her old shadowrunner accomplice, Sam Watts, triggered by a dead man's switch embedded within his body. Sam's message states that he has 100,000 nuyen being held in escrow as a reward should the player bring whoever was responsible for his death to justice. Upon arriving in Seattle, the runner discovers that Sam is the latest victim of the Emerald City Ripper, a serial killer who has been surgically removing organs from their victims. Afterwards, the runner meets Jake Armitage, the protagonist of the SNES Shadowrun game, who provides some leads to investigate. After receiving help from Coyote, a female human bartender/shadowrunner who first asked for assistance in a private war against those who make "Better-Than-Life" ("BTL") chips, the runner discovers that the Ripper is a male elf named Silas Forsberg, whose victims were those who had received a transplanted organ from Sam's mother. After killing Silas, the player learns he was directed to commit the Ripper murders by Jessica Watts, Sam's twin sister. Sam and Jessica had lived a comfortable life before their father's passing; despite Sam's best efforts early on to live a decent life, he cracked under the pressure and spent the family savings on drugs and alcohol. He eventually became a shadowrunner to make ends meet and to further fuel his self-destructive habits. The runner confronts Jessica, but she escapes. The runner finds that Jessica is a high-ranking member of the Universal Brotherhood, an international New Age organization that attracts the disenfranchised. The runner and Coyote investigate the restricted areas of the facility and discover that the Universal Brotherhood is itself a front for a cult trying to create an insect spirit hive. 
Jessica is revealed to be a shaman, one of the few who are aware of the Brotherhood's true nature, and she unleashes extra-dimensional insect spirits that cannot be killed. The team flees, rescuing a woman named Mary-Louise who is designated to become "the queen", similar to a queen bee. Mary-Louise connects the team with her boyfriend, a decker going by the alias Baron Samedi, who organizes a shadowrun on Telestrian Industries to steal a sample of Project Aegis, a chemical weapon capable of killing the insect spirits. The runner acquires the sample but is captured while trying to escape and is brought before James Telestrian III. When it is revealed that the runner rescued Mary-Louise, who is Telestrian's daughter, he decides, instead of punishing the runner, to hire him or her to lead a team to deploy Project Aegis along with the immortal elf Harlequin. Telestrian explains that Jessica's ritual to bring an insect spirit queen into this world requires a blood relative. His father had an affair with Melinda Watts, Sam and Jessica's mother; thus Mary-Louise was a viable candidate, since they shared the same grandfather. Should Jessica perform the ritual on a blood relative, it would result in a full-scale invasion of the extra-dimensional insect spirits. Telestrian gives the runner and Harlequin each a shotgun able to fire capsules filled with the remaining Aegis compound, which can kill the insect spirits. The team infiltrates the hive and fights its way into the heart of the inner sanctum, where Telestrian's sister, Lynne, has volunteered to let the queen take over her body, as she is also a blood relation to Jessica. The team disrupts the summoning by seriously wounding Jessica and killing most of the insect spirits inside the hive. The queen spirit abandons Jessica, and the player is given the option to kill her or arrest her. Lynne survives but is arrested and eventually sent to a mental hospital. 
The game concludes with Armitage, Coyote, Harlequin, and James Telestrian III discussing the fallout of the raid, with Harlequin musing that other Brotherhood chapters across the world also hold hives similar to the one in Seattle. When the runner tries to collect the money for bringing Sam's killer to justice, Sam's prerecorded message asks the runner to apologize to Jessica for what he put her through, and reveals that he never actually had any money in escrow. An epilogue describes the immediate events after the game, tying in with the larger Shadowrun canon. Media coverage of the events left out details of shadowrunners and insect spirits, likely due to the influence of the Brotherhood. Aegis was eventually developed into a product called "Fluorescing Astral Bacteria-3", or "FAB-3". A raid on the Chicago Universal Brotherhood hive is botched, and the city is largely sealed up behind a wall to keep the rampaging insect spirits inside. FAB-3 is used some time later to cleanse Chicago of its insect spirit infestation. Dragonfall In the main campaign of the game's first expansion, players assume the role of a shadowrunner who has recently arrived in the anarchic free state of Berlin to join a team headed by an old colleague, Monika Schäfer. Hong Kong In 2056, the player travels to the Hong Kong Free Enterprise Zone and meets with their foster brother Duncan and his superior officer Carter, who agree to investigate their foster father's mysterious message. Ambushed by the HKPD, the group escapes to a small boat village built on the outskirts of a modern Kowloon Walled City, a nightmarish slum built on the ruins of the original, and continues the investigation from there. Development The game's lead designer is Jordan Weisman, the creator of the Shadowrun tabletop role-playing game, who was inspired to create a game with a "more authentic tone" after the release of the 2007 Shadowrun first-person shooter, which he was not involved with. 
Weisman was originally inspired to create a video game in the Shadowrun universe after reacquiring the rights to it from Microsoft through his Smith & Tinker startup company in 2007. However, due to restrictions on the license, he could not obtain the backing of other publishers for new Shadowrun projects. The success of crowd-funding models then motivated him to obtain funding for his project, Shadowrun Returns, through the Kickstarter crowd-funding platform. The project was opened to pledges in March 2012 and met its funding goal of $400,000 within 28 hours. Upon reaching the $1 million mark, Weisman recorded a video for the Kickstarter project stating that if the project were to reach $1.5 million, the developers would develop a "backers-only exclusive mission which will tie together the stories of the SNES title and the Sega Genesis title." This goal was achieved, but Kickstarter backers demanded that the mission be made available to everyone, so Harebrained Schemes announced that the mission would be available to backers for a limited amount of time before becoming available to the public. The funding period ended on 29 April 2012, by which time the project had gathered $1,895,772 worth of pledges. The success of the campaign led Complex to rank it number eight on its list of the biggest video game wins and fails on Kickstarter in 2012. The game was developed for Windows, Mac, and Linux platforms, as well as the iPad and Android tablets. Weisman justified the addition of tablets as a development platform alongside more traditional desktop development because "the style of game we want to make lends itself best to these platforms," and "the gameplay determined the platform." Weisman announced the game as a "graphically rich 2D turn-based single player game with deep story interaction, meaningful character development, and highly-contextual tactical combat." It is accompanied by a level editor for players to create their own content. 
The game implements character types from the role-playing game, including Street Samurai, Combat Mage, Decker (i.e. hacker), Shaman, Rigger, and Adept. In collaboration with Cliffhanger Productions, characters and plotlines of Shadowrun Returns will carry over to Shadowrun Online, which is set approximately 20 years later and is intended to explore the multiplayer potential of the Shadowrun setting. The music was composed by Marshall Parker and Sam Powell, who were involved in the original SNES and Sega Genesis iterations, as well as the independent composer Gavin Parker, whose previous work includes Test Drive Unlimited, Viva Piñata, and Scene It?. The official launch trailer was released on July 18 and included voice talent by Charles Legget and music by composer Jon Everist, who went on to compose the majority of the music for Shadowrun: Dragonfall and all of the critically acclaimed score for Shadowrun: Hong Kong. Release The game originally had an estimated delivery date of January 2013, but the designers stated that the additional content to be added after meeting stretch goals would require more time. On June 18, 2013, the developers announced an official release date of July 25, 2013. It was released on July 25, 2013 through Steam, with a DRM-free download available to Kickstarter backers only. Initially the DRM-free version was available only to backers, as the development team had managed to get an exception only for Kickstarter backers while licensing the Shadowrun brand, but on November 12, 2013, Harebrained Schemes announced that it had reached an agreement to release DRM-free versions of Shadowrun Returns and future expansions, as well as sell them through GOG.com. Reception Shadowrun Returns received generally favorable reviews upon release, garnering a 76/100 on the review aggregation website Metacritic. 
IGN reviewer Dan Stapleton stated that the game's best days were ahead of it, and that he would be more interested in it "a year from now, after the community has used the included mod tools to build on it, than in what it is today." Several reviewers criticized the game's save system, which saves the game only at area transitions. Functionality to allow the player to save the game at any time was omitted because of development resource constraints, but was later added in the Shadowrun: Dragonfall expansion. References External links 2013 video games Android (operating system) games Hacking video games Indie video games Interquel video games IOS games Kickstarter-funded video games Crowdfunded video games Harebrained Schemes games Linux games MacOS games Religion in science fiction Role-playing video games Video games with Steam Workshop support Shadowrun video games Single-player video games Tactical role-playing video games Video games about revenge Video games developed in the United States Video games featuring protagonists of selectable gender Video games scored by Sam Powell Video games set in the 2050s Video games set in Seattle Video games with expansion packs Video games with isometric graphics Video games with user-generated gameplay content Windows games Cyberpunk video games
5704580
https://en.wikipedia.org/wiki/TI-Nspire%20series
TI-Nspire series
The TI-Nspire is a graphing calculator made by Texas Instruments which was released in July 2007. The original TI-Nspire was developed out of the TI PLT SHH1 prototype calculator (which itself was derived from the Casio ClassPad 300), the TI-92 series of calculators released in 1995, and the TI-89 series of calculators released in 1998. The TI-Nspire features a non-QWERTY keyboard and a different key-by-key layout compared to its predecessors. The TI-Nspire allows users to swap out the existing removable keypad with a functional copy of the TI-84 Plus series keypad. The TI-Nspire series I/O has a connector for the TI-Nspire Lab Cradle, another that serves as a connector for TI's wireless network adapter, and a Mini-USB connector for transferring data. The TI-Nspire series is available with and without a computer algebra system. In 2011, Texas Instruments released the CX line of their TI-Nspire calculators, which effectively replaced the previous generation. The updates included improvements to the original's keyboard layout, the addition of a rechargeable lithium-ion battery, 3D graphing capabilities and a reduced form factor. TI eliminated the removable keypad with this generation, and with it the TI-84 compatibility mode. In 2019, the TI-Nspire CX II was added, with a boost in clock speed and changes to the existing operating system. Versions The TI-Nspire series uses a different operating system compared to Texas Instruments' other calculators. The TI-Nspire includes a file manager that lets users create and edit documents. As a result of being developed from PDA-esque devices, the TI-Nspire retains many functional similarities to a computer. TI-Nspire The standard TI-Nspire calculator is comparable to the TI-84 Plus in features and functionality. It features a TI-84 mode by way of a replaceable snap-in keypad and contains a TI-84 Plus emulator. 
The likely target of this is secondary schools that currently use the TI-84 Plus or have textbooks covering the TI-83 (Plus) and TI-84 Plus lines, allowing them to transition to the TI-Nspire line more easily. The TI-Nspire started development in 2004. It uses a proprietary SoC of the ARM9 variant for its CPU. The TI-Nspire and TI-Nspire CAS (computer algebra system) calculators have 32 MB of NAND Flash, 32 MB of SDRAM, and 512 KB of NOR Flash. However, only 20 MB and 16 MB are user-accessible respectively. The TI-Nspire was released in two models: a numeric and a CAS version. The numeric model is similar in features to the TI-84, but with a bigger, higher-resolution screen and a full keyboard. What the numeric model lacks is symbolic computation, such as evaluating indefinite integrals and derivatives. To fill this gap, Texas Instruments introduced a second model under the name TI-Nspire CAS. The CAS is designed for college and university students, giving them the ability to evaluate many expressions symbolically, like the Voyage 200 and TI-89 (which the TI-Nspire was intended to replace). However, the TI-Nspire lacks some of the programming capability and support for installing additional apps that the previous models had, although a limited version of TI-BASIC is supported, along with Lua in later versions. C and assembly are only possible through Ndless. Because the TI-Nspire lacks a QWERTY keyboard, it is acceptable for use on the PSAT, SAT, SAT II, ACT, AP, and IB exams. TI-Nspire CAS The TI-Nspire CAS calculator is capable of displaying and evaluating values symbolically, not just as floating-point numbers. It includes algebraic functions such as a symbolic differential equation solver: deSolve(...), the complex eigenvectors of a matrix: eigVc(...), as well as calculus-based functions, including limits, derivatives, and integrals. 
For this reason, the TI-Nspire CAS is more comparable to the TI-89 Titanium and Voyage 200 than to other calculators. Unlike the TI-Nspire, it is not compatible with the snap-in TI-84 Plus keypad. Since it has no QWERTY keyboard, it is accepted in the SAT and AP exams, but not in the ACT, IB, or British GCSE and A-level exams. The body color is grey. TI-Nspire Touchpad On 8 March 2010, Texas Instruments announced new models of the TI-Nspire Touchpad and TI-Nspire CAS Touchpad graphing calculators. In the United States the new calculator was listed on the TI website as a complement to the TI-Nspire with Clickpad, though it was introduced as a successor to the previous model in other countries. The calculators were released alongside the OS 2.0 update, which featured a number of updates to the user interface and new functions. The touchpad keypads featured a different and less crowded key layout, along with the touchpad itself, which is used for navigation. The touchpad keypads are also compatible with older calculators running OS 2.0 or newer. The new calculators that shipped with touchpad keypads supported an optional rechargeable battery. The second generation is also available in two models, the TI-Nspire Touchpad and TI-Nspire CAS Touchpad, each retaining its predecessor's color scheme: the standard model is white and black, while the CAS is black and gray. To reduce theft of school-owned TI-Nspire calculators, Texas Instruments also introduced the EZ-Spot Teacher Packs with a bright, easy-to-spot, "school bus yellow" frame and slide case. The hardware of both versions is the same, with the only differences being cosmetic. The TI-Nspire calculators released after the touchpad TI-Nspires also have EZ-Spot versions. TI-Nspire CX and TI-Nspire CX CAS In 2011, the TI-Nspire CX and CX CAS were announced as updates to the TI-Nspire series. 
They have a thinner design with a thickness of 1.57 cm (almost half that of the TI-89), a 1200 mAh (1060 mAh before 2013) rechargeable battery (a wall adapter is included in the American retail package), a 320 by 240 pixel full-color backlit display (3.2" diagonal), and OS 3.0, which includes features such as 3D graphing. The CX series was released in the same time frame as the Casio Prizm (fx-CG10/20), Casio's color-screen graphing calculator with similar features. The TI-Nspire CX series differ from all previous TI graphing calculator models in that they are the first to use a rechargeable lithium-ion battery. The device is charged via a USB cable. TI claims that the battery requires four hours to charge, that a full charge powers the device for up to two weeks under normal daily use, and that the battery should last up to three years before it requires replacement. The battery is user-replaceable. With the exception of interchangeable TI-84 keypads, the CX series retains all features of the previous TI-Nspire models. The colors of the calculators are still the same as those of the TI-Nspire models; the CX is white and dark blue, while the CX CAS is gray and black. The external connectors have changed slightly. The mini-USB port, located at the center of the top on the TI-Nspire series, has moved to the right of the top on the CX series. On the CX series, TI added a second port immediately left of the mini-USB port for a new wireless module. The new wireless TI-Nspire Navigator adapter, which allows teachers to monitor students and send files, is not compatible with the previous TI-Nspire models. The third port, located at the bottom of the handheld, is for the TI Charging Dock and Lab Cradle. The keypad layout is very similar to that of the TI-Nspire Touchpad. Both models have 100 MB of user memory and 64 MB of RAM. 
The retail package comes in a plastic blister case and does not include the full manual, while the teacher's edition comes in a box with a TI-Nspire CX poster for classrooms and the full manual (in English and French in the US). Both devices ship with the student/teacher software for Windows/Mac OS X. According to Texas Instruments, the CX is accepted in SAT, IB, AP, ACT and British GCSE and A-level exams; the CX CAS is accepted only for the SAT and AP. Chinese market Four models aimed at the Chinese market were launched, with specialized features. All four models have Chinese-labeled keyboards. The CX-C and CX-C CAS models are similar to the CX and CX CAS, but include a concise Chinese-English dictionary. The CM-C and CM-C CAS are cheaper and feature a more streamlined design, but have only 32 MB of RAM and no port for the wireless module. The operating systems of the Chinese versions are not interchangeable with those of the international models. TI-Nspire CX II and TI-Nspire CX II CAS In 2019, Texas Instruments introduced the TI-Nspire CX II and TI-Nspire CX II CAS. They feature a slightly different operating system with several enhancements and slightly improved hardware, including Python integration. European market Like China, Europe also has models aimed at its market. These calculators include a "-T" after the CX. The CX II-T and CX II-T CAS both have different body-color designs than their North American counterparts. One of the main feature differences in the European versions is the inclusion of an exact math engine in the non-CAS version. European models also omit the WiFi adapter port from the top of the calculator. Software Texas Instruments offers several different versions of software for their calculators. They offer CAS and non-CAS versions of their student and teacher software. This software allows users to share results with classmates and teachers and gives the user an emulated version of the TI-Nspire. 
TI also offers computer link software for connecting the handheld to a computer to transfer documents. The software allows for the syncing of documents to and from the calculator and/or computer, and requires a license in order to be used. Programming languages Besides TI-BASIC, the Nspire calculators can run scripts written in two additional programming languages under the standard TI firmware. With the release of OS 3.0, the Lua scripting language is supported, allowing third-party programs to be run without the need for exploits. There are currently more than 100 third-party programs and functions for the Nspire that introduce new functionality, such as Laplace transforms, Fourier transforms, and third- and fourth-degree differential equations, that are not included by default. As of OS version 5.2 (September 2020), the Lua version is 5.1. Since firmware version 5.2 it is also possible to program and run Python scripts (version 3.4.0 as of September 2020) in an interpreter shell or from the main calculator command line. Lab Cradle The TI-Nspire Lab Cradle continues the line of Calculator-Based Laboratory (CBL) systems, the first of which was introduced in 1994. It is a portable data collection device for the life sciences. The original CBL system was replaced in 1999 by the CBL 2. The TI-Nspire Lab Cradle has three analog and two digital inputs with a sampling rate of up to 100,000 readings per second. The cradle also has 32 MB of storage space to store sensor data. The Lab Cradle allows the TI-Nspire series to communicate with older Calculator-Based Laboratory systems that previous TI calculators used (TI-73 series, TI-82, TI-83 series, TI-85, and TI-86). The TI-Nspire Lab Cradle uses the rechargeable battery of the TI-Nspire and supports three different charging options: wall adapter, USB cable to computer, and TI-Nspire Cradle Charging Bay. 
The TI-Nspire Lab Cradle is marketed by Texas Instruments and developed as part of an ongoing business venture between TI and Vernier Software & Technology of Portland, Oregon. Navigator system The Navigator system allows teachers to connect multiple TI-Nspire calculators to a computer through the TI-Nspire Access Point and TI-Nspire Navigator Wireless Cradles. The system includes the TI-Nspire cradle charging bay and the main unit, which resembles a wireless router. The Navigator system was first available when the first-generation Nspires were launched, but when the TI-Nspire CX and CX CAS were released, a new wireless adapter was announced that is smaller but not compatible with the TI-Nspire and TI-Nspire Touchpad. Press-to-Test Press-to-Test is a feature that restricts access to the user's documents and certain features of the calculator for a limited time. Its intended purpose is to prevent cheating on tests and exams. Press-to-Test is enabled by pressing a certain button combination when turning on the calculator. The features that are blocked (for example 3D graphs and drag-and-drop for graphs) can be selectively enabled, but access to existing documents is always prohibited. When the handheld is running in Press-to-Test mode, an LED on top of it blinks to indicate that Press-to-Test has not been disabled. Press-to-Test can only be disabled by connecting to another calculator or to a computer with TI-Nspire compatible software installed. Removing the batteries or pressing the reset button will not disable it. Ndless Ndless (alternatively stylized Ndl3ss) is a third-party jailbreak for the TI-Nspire calculators that allows native programs, such as C, C++, and ARM assembly programs, to run. Ndless was developed initially by Olivier Armand and Geoffrey Anneheim and released in February 2010 for the Clickpad handheld. Organizations such as Omnimaga and TI-Planet promoted Ndless and built a community around Ndless and Ndless programs. 
With Ndless, low-level operations can be accomplished, for example overclocking, allowing the handheld devices to run faster. Downgrade prevention can be defeated as well. In addition, Game Boy, Game Boy Advance, and Nintendo Entertainment System emulators exist for the handhelds with Ndless. Major Ndless-powered programs also include a port of the game Doom. Unlike Lua scripts, which are supported by Texas Instruments, Ndless is actively counteracted by TI. Each subsequent OS attempts to block Ndless from operating. Technical specifications Texas Instruments developed its own proprietary system-on-chip based on the 32-bit ARM9 processor family. The first generation of the TI-Nspire is based on LSI Corporation's (now Broadcom Inc.) "Zevio" design, while the CX and CX II generations are built with Toshiba's application-specific integrated circuit design. Most Texas Instruments calculators contain only non-volatile NAND flash memory and volatile synchronous dynamic random-access memory (SDRAM). The NAND flash is not executable but contains parts of the operating system. However, the TI-Nspire also uses NOR flash to store boot instructions for the operating system. Texas Instruments most likely did this to free up the NAND flash and SDRAM in the calculator to be used by the user and operating system. The NAND flash and SDRAM are used to store user and operating system documents. Previous Texas Instruments calculators had a backup button cell battery used to maintain user information, system information and time and date between battery changes. This allowed a user to keep their information when a battery was removed. Because the TI-Nspire lacks this backup battery, the SDRAM content is deleted whenever the user has to swap the battery out. This necessitates that the calculator load the operating system and file structure from the NAND flash to the SDRAM, causing a longer loading time. 
Despite the overall performance increases between versions of the TI-Nspire, some differences favor the older models. The TI-Nspire CX II offers over 10 MB less storage space than its predecessor. The TI-Nspire CM-C and CM-C CAS (the Chinese versions of the CX and CX CAS) are cheaper and have an updated design, but have only 32 MB of RAM and no port for the wireless module. Operating System versions The TI-Nspire CX/CX CAS calculators are now running operating system (OS) version 4.5.5.79, released in August 2021. The TI-Nspire CX II/CX II CAS are running version 5.4.0.259, released in January 2022. The operating system has been updated frequently since the calculator's release in 2007, partly to fix bugs and add missing functions, and also to patch jailbreak exploits. Versions 2.0, 3.0, 4.0, and 5.0 were major upgrades. Added features in OS 2.0 Starting with OS 2.0, additional features were added to increase the usability and usefulness of the TI-Nspire. Below are major changes that were made. These features have stayed with the Nspire series to date. 
In OS 3.9, the area between curves can now be calculated on the graph bar. Added features in OS 4.0 An indicator now displays the angle mode (degrees, radians or gradians) in effect for the current application. In graphs, exact inputs such as 7/3 or 2*π can now be used for custom window settings. Added features in OS 5.0 OS 5.0 is currently exclusive to the CX II/CX II CAS and their -T counterparts. These features were added in this release: Animated Path plot Modernized user interface Dynamic coefficient values Points by coordinates Tick-mark labels Various TI-Basic programming enhancements Simplified Disable CAS (CAS model exclusive) DeSolve wizard (CAS model exclusive) Added features in OS 5.2 OS 5.2 is currently exclusive to the CX II/CX II CAS and their -T counterparts. These features were added in this release: Python programming language support in Host Software and on the calculator. Added features in OS 5.3 OS 5.3 is currently exclusive to the CX II/CX II CAS and their -T counterparts. These features were added in this release: Exam support Quick set-up code to enter Press-to-Test Python programming improvements Six TI-authored modules that serve as additional libraries extending Python's functionality. The new modules are: TI Draw TI Plotlib TI Hub TI Rover TI Image TI System See also Comparison of Texas Instruments graphing calculators Comparison of computer algebra systems References External links TI-Nspire series official site Texas Instruments Calculators & Education Technology Datamath Calculator Museum Lua programming on the TI-Nspire TI-Nspire series Google Discussion Forum TI-Nspire series user programs TI-Nspire series BASIC TI-Nspire series program collection Graphing calculators Texas Instruments calculators Computer algebra systems Lua (programming language)-scriptable hardware
16467275
https://en.wikipedia.org/wiki/9694%20Lycomedes
9694 Lycomedes
9694 Lycomedes is a Jupiter trojan from the Greek camp, with diameter estimates ranging from roughly 32 to 40 kilometers. It was discovered during the Palomar–Leiden survey at the Palomar Observatory in 1960 and later named after Lycomedes from Greek mythology. The dark Jovian asteroid is likely elongated in shape and has a rotation period of 18.2 hours. Discovery Lycomedes was discovered on 26 September 1960, by the Dutch astronomer couple Ingrid and Cornelis van Houten at Leiden, on photographic plates taken by astronomer Tom Gehrels at the Palomar Observatory in California. The body's observation arc begins the night after its official discovery observation at Palomar. Palomar–Leiden survey The survey designation "P-L" stands for "Palomar–Leiden", named after the Palomar and Leiden observatories, which collaborated on the fruitful Palomar–Leiden survey in the 1960s. Gehrels used Palomar's Samuel Oschin telescope (also known as the 48-inch Schmidt Telescope), and shipped the photographic plates to Ingrid and Cornelis van Houten at Leiden Observatory, where astrometry was carried out. The trio are credited with the discovery of several thousand asteroids. Naming This minor planet was named from Greek mythology after Lycomedes, the Greek king of Scyros. At the request of Thetis, he concealed her son Achilles, dressed in girl's clothes, among his own daughters to save him from the Trojan War, until Odysseus drew him out of his disguise. The official naming citation was published by the Minor Planet Center on 2 April 1999. Orbit and classification Like all Jupiter trojans, Lycomedes is in a 1:1 orbital resonance with Jupiter. It is located in the leading Greek camp at the gas giant's L4 Lagrangian point, 60° ahead of Jupiter on its orbit. It orbits the Sun at a distance of 4.9–5.3 AU once every 11 years and 6 months (4,206 days; semi-major axis of 5.1 AU). Its orbit has an eccentricity of 0.04 and an inclination of 5° with respect to the ecliptic. 
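Two of the figures quoted for Lycomedes can be checked from first principles: the orbital period follows from Kepler's third law (in solar units, the period in years equals the semi-major axis in AU raised to the power 3/2), and the Collaborative Asteroid Lightcurve Link's 40.33 km diameter estimate follows from the standard relation D = 1329 km / √p_V · 10^(−H/5), using the albedo of 0.057 and absolute magnitude of 10.7 given under Physical characteristics below. A quick check in Python:

```python
import math

# Kepler's third law in solar units: T[years]^2 = a[AU]^3, i.e. T = a ** 1.5.
a = 5.1                               # semi-major axis in AU
period_years = a ** 1.5               # orbital period in years
period_days = period_years * 365.25   # convert using the Julian year

# Standard asteroid size relation: D[km] = 1329 / sqrt(p_V) * 10 ** (-H / 5).
def diameter_km(H, p_V):
    """Diameter from absolute magnitude H and geometric albedo p_V."""
    return 1329 / math.sqrt(p_V) * 10 ** (-H / 5)

print(round(period_years, 2), round(period_days))   # ~11.5 yr, ~4,207 d
print(round(diameter_km(10.7, 0.057), 2))           # ~40.33 km
```

Both results agree with the article's figures: a period of about 11.5 years (the quoted 4,206 days, to within a day of rounding) and a diameter of 40.33 km.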
Physical characteristics Lycomedes is an assumed C-type asteroid, while most larger Jupiter trojans are D-types. Rotation period In October 2010, a first rotational lightcurve of Lycomedes was obtained from photometric observations by astronomers at the Palomar Transient Factory in California. Lightcurve analysis gave a rotation period of approximately 18.2 hours with a brightness amplitude of 0.38 magnitude. In November 2011, follow-up observations over two consecutive nights were made by Daniel Coley at the Goat Mountain Astronomical Research Station in California. They gave a concurring rotation period with a high brightness amplitude of 0.55 magnitude, indicative of a non-spherical shape. Diameter and albedo According to the survey carried out by the NEOWISE mission of NASA's Wide-field Infrared Survey Explorer, Lycomedes measures between 31.736 and 31.74 kilometers in diameter, while the Collaborative Asteroid Lightcurve Link assumes a standard albedo for a carbonaceous asteroid of 0.057 and calculates a diameter of 40.33 kilometers based on an absolute magnitude of 10.7. Notes References External links Lightcurve Database Query (LCDB), at www.minorplanet.info Dictionary of Minor Planet Names, Google books Discovery Circumstances: Numbered Minor Planets (5001)-(10000) – Minor Planet Center Asteroid 9694 Lycomedes at the Small Bodies Data Ferret 009694 Discoveries by Cornelis Johannes van Houten Discoveries by Ingrid van Houten-Groeneveld Discoveries by Tom Gehrels 6581 Minor planets named from Greek mythology Named minor planets 19600926
4689778
https://en.wikipedia.org/wiki/Database%20security
Database security
Database security concerns the use of a broad range of information security controls to protect databases (potentially including the data, the database applications or stored functions, the database systems, the database servers and the associated network links) against compromises of their confidentiality, integrity and availability. It involves various types or categories of controls, such as technical, procedural/administrative and physical. Security risks to database systems include, for example: Unauthorized or unintended activity or misuse by authorized database users, database administrators, or network/systems managers, or by unauthorized users or hackers (e.g. inappropriate access to sensitive data, metadata or functions within databases, or inappropriate changes to the database programs, structures or security configurations); Malware infections causing incidents such as unauthorized access, leakage or disclosure of personal or proprietary data, deletion of or damage to the data or programs, interruption or denial of authorized access to the database, attacks on other systems and the unanticipated failure of database services; Overloads, performance constraints and capacity issues resulting in the inability of authorized users to use databases as intended; Physical damage to database servers caused by computer room fires or floods, overheating, lightning, accidental liquid spills, static discharge, electronic breakdowns/equipment failures and obsolescence; Design flaws and programming bugs in databases and the associated programs and systems, creating various security vulnerabilities (e.g. unauthorized privilege escalation), data loss/corruption, performance degradation etc.; Data corruption and/or loss caused by the entry of invalid data or commands, mistakes in database or system administration processes, sabotage/criminal damage etc. Ross J. 
Anderson has often said that by their nature large databases will never be free of abuse by breaches of security; if a large system is designed for ease of access it becomes insecure; if made watertight it becomes impossible to use. This is sometimes known as Anderson's Rule. Many layers and types of information security control are appropriate to databases, including: Access control Auditing Authentication Encryption Integrity controls Backups Application security Database security applying statistical methods Databases have been largely secured against hackers through network security measures such as firewalls and network-based intrusion detection systems. While network security controls remain valuable in this regard, securing the database systems themselves, and the programs/functions and data within them, has arguably become more critical as networks are increasingly opened to wider access, in particular access from the Internet. Furthermore, system, program, function and data access controls, along with the associated user identification, authentication and rights management functions, have always been important to limit and in some cases log the activities of authorized users and administrators. In other words, these are complementary approaches to database security, working from both the outside-in and the inside-out as it were. Many organizations develop their own "baseline" security standards and designs detailing basic security control measures for their database systems. These may reflect general information security requirements or obligations imposed by corporate information security policies and applicable laws and regulations (e.g. concerning privacy, financial management and reporting systems), along with generally accepted good database security practices (such as appropriate hardening of the underlying systems) and perhaps security recommendations from the relevant database system and software vendors. 
The security designs for specific database systems typically specify further security administration and management functions (such as administration and reporting of user access rights, log management and analysis, database replication/synchronization and backups) along with various business-driven information security controls within the database programs and functions (e.g. data entry validation and audit trails). Furthermore, various security-related activities (manual controls) are normally incorporated into the procedures, guidelines etc. relating to the design, development, configuration, use, management and maintenance of databases. Privileges Two types of privileges are important relating to database security within the database environment: system privileges and object privileges. System Privileges System privileges allow a user to perform administrative actions in a database. Object Privileges Object privileges allow for the use of certain operations on database objects as authorized by another user. Examples include: usage, select, insert, update, and references. The principle of least privilege and separation of duties: Databases that fall under internal controls (that is, data used for public reporting, annual reports, etc.) are subject to the separation of duties, meaning there must be segregation of tasks between development and production. Each task has to be validated (via code walk-through/fresh eyes) by a third person who is not writing the actual code. The database developer should not be able to execute anything in production without an independent review of the documentation/code for the work that is being performed. Typically, the role of the developer is to pass code to a DBA; however, given the cutbacks that have resulted from the economic downturn, a DBA might not be readily available. If a DBA is not involved, it is important, at minimum, for a peer to conduct a code review. 
This ensures that the role of the developer is clearly separate. Another point of internal control is adherence to the principle of least privilege, especially in production. Rather than granting developers broad standing access, it is much safer to use impersonation for exceptions that require elevated privileges (e.g. EXECUTE AS or sudo, used temporarily). Developers may dismiss this as “overhead”, but DBAs must do all that is considered responsible, because they are the de facto data stewards of the organization and must comply with regulations and the law. Vulnerability Assessments to Manage Risk and Compliance One technique for evaluating database security involves performing vulnerability assessments or penetration tests against the database. Testers attempt to find security vulnerabilities that could be used to defeat or bypass security controls, break into the database, compromise the system etc. Database administrators or information security administrators may for example use automated vulnerability scans to search out misconfiguration of controls (often referred to as 'drift') within the layers mentioned above along with known vulnerabilities within the database software. The results of such scans are used to harden the database (improve security) and close off the specific vulnerabilities identified, but other vulnerabilities often remain unrecognized and unaddressed. In database environments where security is critical, continual monitoring for compliance with standards improves security. Security compliance requires, amongst other procedures, patch management and the review and management of permissions (especially public) granted to objects within the database. Database objects may include tables or other objects. The permissions granted for SQL language commands on objects are considered in this process. 
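The permission-review and "drift" checks described above can be sketched as a comparison of currently granted permissions against an approved baseline. A minimal illustration in Python; the user, table, and permission names are invented, and harvesting the current grants from a real DBMS catalog is omitted:

```python
# Approved baseline: (user, object) -> set of permitted SQL commands.
BASELINE = {
    ("app_user", "orders"): {"SELECT", "INSERT"},
    ("report_user", "orders"): {"SELECT"},
}

def find_drift(current):
    """Return grants present in `current` but absent from the baseline."""
    drift = []
    for (user, obj), perms in current.items():
        for perm in sorted(perms - BASELINE.get((user, obj), set())):
            drift.append((user, obj, perm))
    return drift

# A scan result in which an unapproved DELETE grant has crept in.
current = {
    ("app_user", "orders"): {"SELECT", "INSERT", "DELETE"},
    ("report_user", "orders"): {"SELECT"},
}
print(find_drift(current))  # [('app_user', 'orders', 'DELETE')]
```

In practice the current grants would be read from the DBMS's system catalog, and any reported drift would feed the hardening and remediation steps mentioned above.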
Compliance monitoring is similar to vulnerability assessment, except that the results of vulnerability assessments generally drive the security standards that lead to the continuous monitoring program. Essentially, vulnerability assessment is a preliminary procedure to determine risk, where a compliance program is the process of on-going risk assessment. The compliance program should take into consideration any dependencies at the application software level, as changes at the database level may have effects on the application software or the application server. Abstraction Application-level authentication and authorization mechanisms may be effective means of providing abstraction from the database layer. The primary benefit of abstraction is that of a single sign-on capability across multiple databases and platforms. A single sign-on system stores the database user's credentials and authenticates to the database on behalf of the user. Abstraction, in this sense, hides the complexity of the underlying databases behind a single, simpler interface. Database activity monitoring (DAM) Another security layer of a more sophisticated nature includes real-time database activity monitoring, either by analyzing protocol traffic (SQL) over the network, or by observing local database activity on each server using software agents, or both. Use of agents or native logging is required to capture activities executed on the database server, which typically include the activities of the database administrator. Agents allow this information to be captured in a fashion that cannot be disabled by the database administrator, who has the ability to disable or modify native audit logs. Analysis can be performed to identify known exploits or policy breaches, or baselines can be captured over time to build a normal pattern used for detection of anomalous activity that could be indicative of intrusion. 
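The baselining idea behind such monitoring can be sketched with a simple statistical rule: learn a normal activity level during a known-good window, then flag observations far outside it. The traffic figures below are invented for illustration; production DAM products use far richer per-user and per-statement models, but the principle is the same:

```python
import statistics

# Hourly query counts observed during a known-normal training window.
baseline = [120, 130, 118, 125, 122, 128, 131, 119]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(rate, k=3):
    """Flag an hourly rate more than k standard deviations from the mean."""
    return abs(rate - mean) > k * stdev

print(is_anomalous(124), is_anomalous(900))  # False True
```

A rate near the learned mean passes, while a sudden burst (for example, a bulk data exfiltration) falls far outside the three-sigma band and is flagged for review.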
These systems can provide a comprehensive database audit trail in addition to intrusion detection, and some can also provide protection by terminating user sessions and/or quarantining users demonstrating suspicious behavior. Some systems are designed to support separation of duties (SOD), a typical requirement of auditors: the database administrators, who are themselves monitored by the DAM, must not be able to disable or alter its functionality. This requires the DAM audit trail to be securely stored in a separate system not administered by the database administration group.

Native audit

In addition to external monitoring or auditing tools, native database audit capabilities are available for many database platforms. Native audit trails are extracted on a regular basis and transferred to a designated security system to which the database administrators do not (or should not) have access. This provides a level of segregation of duties and evidence that the native audit trails were not modified by authenticated administrators; the extraction should be conducted by a security-oriented senior DBA group with read rights into production. Turning on native auditing impacts the performance of the server. Generally, the native audit trails of databases do not provide sufficient controls to enforce separation of duties; network- and/or kernel-module-level host-based monitoring therefore provides a higher degree of confidence for forensics and preservation of evidence.

Process and procedures

A good database security program includes the regular review of privileges granted to user accounts and to accounts used by automated processes. For individual accounts, a two-factor authentication system improves security but adds complexity and cost.
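The tamper-evidence property expected of an extracted native audit trail can be sketched with a cryptographic digest. This is a minimal sketch under assumed file contents: the security system records a SHA-256 hash at transfer time, so a later recomputation shows whether administrators altered the file.

```python
# Sketch of tamper-evidence for an extracted native audit trail: the
# designated security system records a SHA-256 digest at transfer time;
# recomputing it later reveals any modification. The log line below is
# illustrative only.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# At extraction time, on the designated security system:
audit_trail = b"2024-01-02 10:15 dba1 ALTER USER app_user ...\n"
recorded = digest(audit_trail)

# Later, during a forensic review:
def verify(data: bytes, expected: str) -> bool:
    """True only if the trail is byte-for-byte what was extracted."""
    return digest(data) == expected

print(verify(audit_trail, recorded))                       # True
print(verify(audit_trail + b"tampered line\n", recorded))  # False
```

A hash alone proves integrity only if the recorded digest itself is kept out of the DBAs' reach, which is exactly why the audit copy lives on a separately administered system.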
Accounts used by automated processes require appropriate controls around password storage, such as sufficient encryption and access controls, to reduce the risk of compromise. In conjunction with a sound database security program, an appropriate disaster recovery program can ensure that service is not interrupted during a security incident, or any incident that results in an outage of the primary database environment; an example is replication of the primary databases to sites located in different geographical regions. After an incident occurs, database forensics can be employed to determine the scope of the breach and to identify appropriate changes to systems and processes.

See also

Negative database
Database firewall
FIPS 140-2, US federal standard for authenticating a cryptographic module
Virtual private database

References

External links

https://web.archive.org/web/20080511155031/http://iase.disa.mil/stigs/checklist/index.html
https://web.archive.org/web/20080515131426/http://iase.disa.mil/stigs/stig/index.html
38241078
https://en.wikipedia.org/wiki/Red%20October%20%28malware%29
Red October (malware)
Operation Red October, or Red October, was a cyberespionage malware program discovered in October 2012 and publicly disclosed in January 2013 by the Russian firm Kaspersky Lab. The malware had reportedly been operating worldwide for up to five years prior to discovery, transmitting information ranging from diplomatic secrets to personal information, including from mobile devices. The primary vector used to install the malware was email carrying attached documents that exploited vulnerabilities in Microsoft Word and Excel; a webpage was later found that exploited a known vulnerability in the Java browser plugin.

Red October was described as an advanced cyberespionage campaign targeting diplomatic, governmental and scientific research organizations worldwide. A map of the extent of the operation was released by Kaspersky Lab, the Moscow-based antivirus firm that uncovered the campaign. After the campaign was revealed, domain registrars and hosting companies shut down as many as 60 domains used by the malware's creators to receive information, and the attackers themselves shut down their end of the operation as well.

References

External links

Info at kaspersky.com

Spyware
Hacking in the 2010s