Colleges and Schools of North Carolina Agricultural and Technical State University
https://en.wikipedia.org/wiki/Colleges%20and%20Schools%20of%20North%20Carolina%20Agricultural%20and%20Technical%20State%20University

North Carolina Agricultural and Technical State University offers 177 undergraduate, 30 master's, and 9 doctoral degrees through its nine professional colleges. The colleges and schools function as autonomous units within the university while adhering to the university's mission and philosophy. Bachelor's and master's degree programs are offered through the Colleges of Agriculture and Environmental Sciences; Arts, Humanities & Social Sciences; Business and Economics; Education; Engineering; Health & Human Sciences; and Science & Technology. Doctoral programs are offered through the Colleges of Agriculture and Environmental Sciences; Engineering; Science & Technology; the Joint School of Nanoscience and Nanoengineering; and The Graduate College.
North Carolina A&T is one of the nation's leading producers of African-American engineers with bachelor's, master's and doctorate degrees, and it is the nation's top producer of minorities with degrees in science, technology, engineering and mathematics as a whole. The university is also a leading producer of minority certified public accountants, landscape architects, veterinarians, and agricultural graduates.
From 1968, A&T's academic programs were divided among nine academic divisions. This alignment remained until 2016, when the university realigned its academic programs in an effort to meet the objectives of its Preeminence 2020 strategic plan.
College of Arts, Humanities and Social Sciences
The College of Arts and Sciences was established in 1968. With 13 departments and programs spanning the arts, humanities, communications, mathematics, and the social, behavioral and natural sciences, the College of Arts and Sciences is the largest academic unit at North Carolina A&T. The College is the nation's largest producer of African-American psychology graduates.
As of 2012, the College of Arts and Sciences has a total enrollment of 3,465 students with 3,196 being undergraduates and 269 students enrolled in the graduate program. In the 2011-2012 academic year, the university awarded 647 bachelor's, 56 master's, and 8 doctoral degrees from the college.
The current Dean of the College of Arts and Sciences is Dr. Goldie Byrd, who was appointed to the position in 2011.
Programs Offered
Atmospheric Sciences and Meteorology
Biology
Chemistry
Criminal Justice
English
Energy and Environmental Systems
History
Journalism and Mass Communication
Liberal Studies
Mathematics
Physics
Political Science
Psychology
Secondary Education
Social Work
Sociology
Speech
Visual & Performing Arts/Theater & Dance
College of Engineering
Established in 1968, the College of Engineering comprises six departments: chemical, biological and bioengineering; civil, architectural and environmental engineering; computer science; electrical and computer engineering; industrial and systems engineering; and mechanical engineering; along with the interdisciplinary computational science and engineering program. The College of Engineering has ranked 1st in the nation for the number of engineering degrees awarded to African Americans at the undergraduate level for 13 consecutive years and has been the leading producer of African-American female engineers at the baccalaureate level in the U.S. for at least eight consecutive years.
As of 2013, the College of Engineering has a total enrollment of 1,549 students with 1,297 undergraduates and 307 students enrolled in the graduate program. In the 2011-2012 academic year, the university awarded 199 bachelor's, 60 master's, and 15 doctoral degrees from the college.
The current Dean of the College of Engineering is Dr. Robin N. Coger. Under Dr. Coger's leadership, the College continues to implement several initiatives designed to facilitate and showcase the College's excellence, in line with the University's Preeminence 2020 strategic plan and the College's strategic priorities. Prior to joining A&T's faculty in July 2011, Dr. Coger served as the Founder and Director of the Center for Biomedical Engineering Systems (now the Center for Biomedical Engineering and Science) and was a Professor in the Department of Mechanical Engineering and Engineering Science (MEES) at the University of North Carolina at Charlotte.
Programs Offered
Bachelor's degree Programs
Architectural Engineering
Bioengineering
Biological Engineering
Chemical Engineering
Civil Engineering
Computer Science
Computer Engineering
Electrical Engineering
Industrial and Systems Engineering
Mechanical Engineering
Master's degree Programs
Bioengineering
Chemical Engineering
Civil Engineering
Computer Science
Computational Science and Engineering
Electrical Engineering
Industrial and Systems Engineering
Mechanical Engineering
Doctoral Degree Programs
Computer Science
Computational Science and Engineering
Electrical Engineering
Industrial and Systems Engineering
Mechanical Engineering
College of Agriculture and Environmental Sciences
North Carolina A&T is home to the largest agricultural school among historically black universities and is the nation's second largest producer of minority agricultural graduates. The College of Agriculture and Environmental Sciences also is a leading producer of minority landscape architects and veterinarians. The school is divided into four academic departments: the Department of Agribusiness, Applied Economics and Agriscience Education; the Department of Animal Sciences; the Department of Family and Consumer Sciences; and the Department of Natural Resources and Environmental Design.
Since 1990, enrollment in the school has increased by 75 percent. As of 2012, the total enrollment is 956 students with 777 being undergraduates and 179 students enrolled in the graduate program. In the 2011-2012 academic year, the university awarded 132 bachelor's and 65 master's from the college.
The current Dean of the College is Dr. William “Bill” Randle, who assumed leadership in 2013.
Programs Offered
Agribusiness, Applied Economics & Agriscience Education
Agricultural Economics
Agricultural Education
Animal Science
Animal Science
Laboratory Animal Science
Animal Health Science
Animal Science (Animal Industry)
Family and Consumer Sciences
Child Development
Child Development Early Education And Family Studies
Family And Consumer Science
Fashion Design and Fashion Merchandising
Food Science
Nutritional Sciences and Dietetics
Natural Resources & Environmental Design
Landscape Architecture
Environmental Science
Agricultural Science, Natural Resources (Plant Science)
Biological Engineering
Earth and Biological Sciences
Horticulture
College of Education
Established in 1968, the School of Education spans five departments: Curriculum and Instruction; Human Development and Services; Human Performance and Leisure Studies; Sports Science and Fitness Management; and Leadership Studies. The School is housed in Samuel D. Proctor Hall, named for the University's fifth president.
As of 2012, the School of Education has a total enrollment of 1,322 students with 642 being undergraduates and 680 students enrolled in the graduate program. In the 2011-2012 academic year, the university awarded 64 bachelor's, 133 master's, and 6 doctoral degrees from the college.
The current Dean of the School of Education is Dr. William B. Harvey, who assumed the position in 2013.
Programs Offered
Bachelor's Programs
Elementary Education
Special Education
Business Administration
Leisure Studies
Pre-Physical Therapy
Masters Programs
Industrial Technology
Elementary Education
Teaching
Reading Education
Adult Education
School Counseling
School Administration
Mental Health Counseling
Physical Education
Doctoral Programs
Leadership Studies
John R. and Kathy R. Hairston College of Health & Human Sciences
The School of Nursing was established in 1953 with the first class of 15 baccalaureate nurses graduating in 1957. The school was first accredited by the National League for Nursing in 1971.
The school offers three distinct academic tracks for those wishing to pursue a degree in the field. The traditional Bachelor of Science in Nursing (BSN) program is designed for students pursuing their first degree in nursing. The Accelerated Bachelor of Science in Nursing (ABSN) Entry Option is designed for second-degree students who are high achievers and wish to pursue a career as a professional registered nurse; the ABSN curriculum is an intensive program delivered in block format over 12 months (January to December). Lastly, the BSN Completion Entry Option is designed specifically for registered nurses whose career goals will be enhanced through additional study; it is tailored for RNs who hold an associate degree and wish to advance their careers by earning a bachelor's degree. The entry option accommodates either part-time or full-time study and builds on the knowledge gained from the student's previous degree.
As of 2012, the School of Nursing has a total enrollment of 334 students. In the 2011-2012 academic year, the university awarded 34 bachelor's degrees from the school. The current Dean of the School of Nursing is Dr. Inez Tuck.
Programs Offered
Nursing
Pre-Nursing
Psychology
Social Work
Sociology
Sport Science and Fitness Management (SSFM)
Willie A. Deese College of Business and Economics
Established in 1970, the School of Business and Economics is one of the largest producers of African-American Certified Public Accountants in the nation. According to research conducted by the School, over the last 10 years graduates of the Department of Business Education have had a 98% success rate on the Praxis II examinations. The School of Business and Economics is housed in two buildings, Merrick and Craig Halls, the latter named for former Dean Dr. Quiester Craig, who served as head of the School for over four decades.
The school is accredited by the Association to Advance Collegiate Schools of Business (AACSB). The school's Department of Accounting and Finance was the first accounting program at a historically black college and university (HBCU) to receive accreditation from AACSB International.
As of 2012, the School of Business and Economics has a total enrollment of 1,100 students with 1054 being undergraduates and 46 students enrolled in the graduate program. In the 2011-2012 academic year, the university awarded 198 bachelor's and 11 master's degrees from the college.
The current interim Dean of the School of Business and Economics is Dr. Patrick R. Liverpool, who was appointed to the position in July 2013.
Programs Offered
Business Education
Accounting
Business Education
Administrative Systems
Vocational Business Education
Information Technology
Teaching (Business Education)
Business Teacher Education
Economics and Finance
Economics
Business Economics
Economics Law
Finance
Management
Business Administration
Management
Management Information Systems
Human Resources Management
Entrepreneurship
Marketing, Transportation, and Supply Chain
Supply Chain Management
Marketing
Marketing (Sales)
Transportation and Supply Chain Management
College of Technology
Established in 1987, the School of Technology offers 10 academic programs, ranging from Applied Engineering Technology, Construction Management, Electronics Technology, Environmental Health and Safety, Geomatics, Graphic Communication Systems, and Motorsports Technology to graduate programs in Information Technology, Technology Management, and Technology Education.
Graduates of the School in many cases enjoy placement rates of over 90% and command competitive salaries. As of 2012, the School of Technology has a total enrollment of 796 students, with 675 undergraduates and 121 graduate students. In the 2011-2012 academic year, the university awarded 136 bachelor's and 30 master's degrees from the school.
The current Dean of the School of Technology is Ben Obinero Uwakweh.
Programs Offered
Undergraduate Programs
Applied Engineering Technology
Computer Aided Drafting and Design
Construction Management
Electronics Technology
Environmental Health and Safety
Geomatics
Integrated Internet Technology
Motorsports Technology
Printing and Publishing
Technology Management
Graduate Programs
Technology Management
Information Technology
Technology Education
Joint School of Nanoscience and Nanoengineering
Established in 2010, the Joint School of Nanoscience & Nanoengineering (JSNN) is an academic collaboration between North Carolina A&T and The University of North Carolina at Greensboro. The JSNN opened with 17 students in the doctoral program in nanoscience and 1 student in the professional master's program in nanoscience. According to the National Nanotechnology Initiative, the JSNN became one of fewer than 10 schools nationally to offer degree programs in nanotechnology, and it is the only such program created and operated collaboratively by two universities. In 2011, N.C. A&T received approval from the University of North Carolina Board of Governors for its Master of Science in Nanoengineering program, to be offered through the JSNN. In addition to the Master of Science program, the university was approved to offer a doctoral program in Nanoengineering.
As of 2012, the JSNN has an enrollment of 26 master's and doctoral students. The current Dean of the JSNN is Dr. James G. Ryan. Dr. Ryan is the Founding Dean of the Joint School of Nanoscience and Nanoengineering. He received his bachelor's, master's, and doctorate degrees in Chemistry, in addition to an M.S. degree in Biomedical Engineering, from Rensselaer Polytechnic Institute. Ryan joined the JSNN after working at the College of Nanoscale Science and Engineering of the University at Albany as Associate Vice President of Technology and Professor of Nanoscience from 2005 to 2008.
Programs Offered
Currently, the JSNN offers four master's and doctoral degree programs.
Masters Programs
Professional Science Master's in Nanoscience
Nanoengineering
Doctoral Programs
Nanoscience
Nanoengineering
The Graduate College
North Carolina A&T offers 45 master's concentrations through 30 degree programs and 11 doctoral concentrations through 9 doctoral degree programs, as well as a number of certificate programs through its colleges and schools. Currently, master's and doctoral programs are offered through the schools and colleges of Agriculture and Environmental Sciences, Arts and Sciences, Business and Economics, Education, Engineering, the Joint School of Nanoscience and Nanoengineering, and Technology.
In the 2013 U.S. News’ Best Grad School edition, N.C. A&T was ranked 75th for industrial, manufacturing and systems engineering and 104th for social work.
The current Dean of the School of Graduate Studies is Dr. Sanjiv Sarin.
Programs Offered
Masters Programs
English and African American Literature
Elementary Education
Reading Education
Master of Arts in Teaching
Adult Education
Agricultural Education
Agricultural and Environmental Systems
Applied Mathematics
Bioengineering
Biology
Chemical Engineering
Chemistry
Civil Engineering
Computational Science and Engineering
Computer Science
Electrical Engineering
Food and Nutritional Science
Industrial and Systems Engineering
Information Technology
Instructional Technology
Management
Mechanical Engineering
Mental Health Counseling
Nanoengineering
Health and Physical Education
Physics
School Counseling
Technology Management
School Administration
Social Work
Doctoral Programs
Computational Science and Engineering
Computer Science
Electrical Engineering
Energy and Environmental Systems
Industrial and Systems Engineering
Leadership Studies
Mechanical Engineering
Nanoengineering
Rehabilitation Counseling & Rehabilitation Counselor Education
External links
College of Arts and Sciences Website
College of Engineering Website
College of Agriculture and Environmental Sciences Website
School of Business and Economics Website
School of Education Website
School of Graduate Studies Website
School of Nursing Website
School of Technology Website
Joint School of Nanoscience and Nanoengineering Website
References
Academics
University and college academics in the United States
University of North Carolina
Packet capture appliance
https://en.wikipedia.org/wiki/Packet%20capture%20appliance

A packet capture appliance is a standalone device that performs packet capture. Packet capture appliances may be deployed anywhere on a network; however, they are most commonly placed at the entrances to the network (i.e. the internet connections) and in front of critical equipment, such as servers containing sensitive information.
In general, packet capture appliances capture and record all network packets in full (both header and payload); however, some appliances may be configured to capture a subset of a network's traffic based on user-definable filters. For many applications, especially network forensics and incident response, it is critical to conduct full packet capture, though filtered packet capture may be used at times for specific, limited information-gathering purposes.
Deployment
The network data that a packet capture appliance captures depends on where and how the appliance is installed on a network. There are two options for deploying packet capture appliances on a network. One option is to connect the appliance to the SPAN port (port mirroring) on a network switch or router. A second option is to connect the appliance inline, so that network activity along a network route traverses the appliance (similar in configuration to a network tap, but the information is captured and stored by the packet capture appliance rather than passing on to another device).
When connected via a SPAN port, the packet capture appliance may receive and record all Ethernet/IP activity for all of the ports of the switch or router.
When connected inline, the packet capture appliance captures only the network traffic traveling between two points, that is, traffic that passes through the cable to which the packet capture appliance is connected.
There are two general approaches to deploying packet capture appliances: centralized and decentralized.
Centralized
With a centralized approach, one high-capacity, high-speed packet capture appliance connects to a data-aggregation point. The advantage of a centralized approach is that with one appliance you gain visibility over the network's entire traffic. This approach, however, creates a single point of failure that is a very attractive target for hackers; additionally, one would have to re-engineer the network to bring traffic to appliance and this approach typically involves high costs.
Decentralized
With a decentralized approach you place multiple appliances around the network, starting at the point(s) of entry and proceeding downstream to deeper network segments, such as workgroups. The advantages include: no network re-configuration required; ease of deployment; multiple vantage points for incident response investigations; scalability; no single point of failure (if one fails, you have the others); practical elimination of the danger of unauthorized access by hackers when combined with electronic invisibility; and low cost. The main disadvantage is the potentially increased maintenance burden of multiple appliances.
In the past, packet capture appliances were sparingly deployed, oftentimes only at the point of entry into a network. Packet capture appliances can now be deployed more effectively at various points around the network. When conducting incident response, the ability to see the network data flow from various vantage points is indispensable in reducing time to resolution and narrowing down which parts of the network ultimately were affected. By placing packet capture appliances at the entry point and in front of each work group, following the path of a particular transmission deeper into the network would be simplified and much quicker. Additionally, the appliances placed in front of the workgroups would show intranet transmissions that the appliance located at the entry point would not be able to capture.
Capacity
Packet capture appliances come with capacities ranging from 500 GB to 192 TB and more. Only a few organizations with extremely high network usage would have use for the upper ranges of capacities. Most organizations would be well served with capacities from 1 TB to 4 TB.
A good rule of thumb when choosing capacity is to allow 1 GB per day for heavy users down to 1 GB per month for regular users. For a typical office of 20 people with average usage, 1 TB would be sufficient for about 1 to 4 years.
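As a rough worked example of that rule of thumb, the sketch below estimates how long a given capacity lasts; the user mix is an illustrative assumption, not data from any particular deployment.

```python
# Rough storage-capacity estimate using the rule of thumb above:
# about 1 GB/day for heavy users down to about 1 GB/month for regular users.

def days_until_full(capacity_tb: float, heavy_users: int, regular_users: int) -> float:
    """Approximate days before a capture appliance of the given capacity fills up."""
    daily_gb = heavy_users * 1.0 + regular_users * (1.0 / 30.0)  # GB captured per day
    return capacity_tb * 1000.0 / daily_gb

# A 20-person office with a 1 TB appliance (the split of heavy vs. regular users is assumed):
print(round(days_until_full(1.0, 0, 20)))   # ~1500 days, about 4 years if all usage is light
print(round(days_until_full(1.0, 2, 18)))   # ~385 days, about 1 year with a couple of heavy users
```

With an all-light-usage mix, 1 TB stretches toward the four-year end of the range quoted above; a couple of heavy users pulls it down toward one year.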
Features
Filtered vs. full packet capture
Full packet capture appliances capture and record all Ethernet/IP activity, while filtered packet capture appliances capture only a subset of traffic based on a set of user-definable filters, such as IP address, MAC address or protocol. Unless using the packet capture appliance for a very specific purpose covered by the filter parameters, it is generally best to use full packet capture appliances or otherwise risk missing vital data. Particularly when using a packet capture for network forensics or cybersecurity purposes, it is paramount to capture everything because any packet not captured on the spot is a packet that is gone forever. It is impossible to know ahead of time the specific characteristics of the packets or transmissions needed, especially in the case of an advanced persistent threat (APT). APTs and other hacking techniques rely for success on network administrators not knowing how they work and thus not having solutions in place to counteract them.
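To make the distinction concrete, the sketch below contrasts the two modes using Scapy, a third-party Python packet library; the interface name, host address and filter expression are placeholders, and commercial appliances implement this in hardware or firmware rather than in a script like this.

```python
from scapy.all import sniff, wrpcap  # pip install scapy; capturing requires root/admin privileges

# Full packet capture: record everything seen on the interface.
full = sniff(iface="eth0", count=100)          # "eth0" is a placeholder interface name
wrpcap("full_capture.pcap", full)

# Filtered capture: only traffic matching a user-defined BPF filter is kept,
# for example packets to or from one host on one protocol.
filtered = sniff(iface="eth0", count=100,
                 filter="host 192.0.2.10 and tcp port 443")
wrpcap("filtered_capture.pcap", filtered)
```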
Intelligent Packet Capture
Intelligent packet capture uses machine learning to filter and reduce the amount of network traffic captured. Traditional filtered packet capture relies on rules and policies which are manually configured to capture all potentially malicious traffic. Intelligent packet capture uses machine learning models, including features from Cyber threat intelligence feeds, to scientifically target and capture the most threatening traffic. Machine learning techniques for network intrusion detection, traffic classification, and anomaly detection are used to identify potentially malicious traffic for collection.
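A minimal sketch of the idea follows, assuming scikit-learn's IsolationForest and a toy set of per-flow features; the feature set and the synthetic baseline data are illustrative assumptions, and production systems would train on real traffic and threat-intelligence feeds.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row summarizes one flow with simple features, e.g.
# [bytes_sent, packets_sent, distinct_dest_ports, mean_inter_arrival_ms].
# The features and the synthetic "normal" baseline below are illustrative only.
baseline_flows = np.random.default_rng(0).normal(loc=[5e4, 60, 3, 40],
                                                 scale=[1e4, 15, 1, 10],
                                                 size=(500, 4))

model = IsolationForest(contamination=0.02, random_state=0).fit(baseline_flows)

def should_capture(flow_features) -> bool:
    """Capture the flow in full only if the model scores it as anomalous."""
    return model.predict([flow_features])[0] == -1   # -1 means "anomaly" in scikit-learn's API

print(should_capture([5e4, 60, 3, 40]))      # typical flow -> likely False
print(should_capture([9e5, 4000, 90, 1]))    # unusual, scan-like flow -> likely True
```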
Encrypted vs. unencrypted storage
Some packet capture appliances encrypt the captured data before saving it to disk, while others do not. Considering the breadth of information that travels on a network or internet connection and that at least a portion of it could be considered sensitive, encryption is a good idea for most situations as a measure to keep the captured data secure. Encryption is also a critical element of authentication of data for the purposes of data/network forensics.
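As a hedged, file-level illustration of encryption at rest using the Python cryptography package: real appliances more commonly use full-disk or hardware encryption, and key management is glossed over here.

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()        # in practice the key would live in an HSM or key store
cipher = Fernet(key)

# Encrypt an existing capture file before it is written to long-term storage.
with open("capture.pcap", "rb") as f:
    ciphertext = cipher.encrypt(f.read())
with open("capture.pcap.enc", "wb") as f:
    f.write(ciphertext)

# Later, an authorized analyst holding the key can recover the original capture.
plaintext = cipher.decrypt(ciphertext)
```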
Sustained capture speed vs. peak capture speed
The sustained capture speed is the rate at which a packet capture appliance can capture and record packets without interruption or error over a long period of time. This is different from the peak capture rate, which is the highest speed at which a packet capture appliance can capture and record packets. The peak capture speed can only be maintained for a short period of time, until the appliance's buffers fill up and it starts losing packets. Many packet capture appliances share the same peak capture speed of 1 Gbit/s, but actual sustained speeds vary significantly from model to model.
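The practical consequence is that a burst at peak rate can only be absorbed for as long as the capture buffer holds out. A back-of-the-envelope sketch (the rates and buffer size are illustrative assumptions):

```python
def seconds_until_packet_loss(peak_gbps: float, sustained_gbps: float, buffer_gb: float) -> float:
    """How long a burst at peak rate can last before the buffer overflows.

    The buffer drains at the sustained (disk write) rate while filling at the peak rate.
    """
    excess_rate_gbps = peak_gbps - sustained_gbps
    if excess_rate_gbps <= 0:
        return float("inf")                      # sustained rate keeps up; no loss
    return buffer_gb * 8.0 / excess_rate_gbps    # GB -> gigabits

# Example: 1 Gbit/s line-rate burst, 0.4 Gbit/s sustained write speed, 2 GB buffer.
print(seconds_until_packet_loss(1.0, 0.4, 2.0))   # about 27 seconds before packets drop
```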
Permanent vs. overwritable storage
A packet capture appliance with permanent storage is ideal for network forensics and permanent record-keeping purposes because the data captured cannot be overwritten, altered or deleted. The only drawback of permanent storage is that eventually the appliance becomes full and requires replacement. Packet capture appliances with overwritable storage are easier to manage because once they reach capacity they will start overwriting the oldest captured data with the new, however, network administrators run the risk of losing important capture data when it gets overwritten. In general, packet capture appliances with overwrite capabilities are useful for simple monitoring or testing purposes, for which a permanent record is not necessary. Permanent, non-overwritable recording is a must for network forensics information gathering.
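Overwritable storage behaves like a ring buffer over capture files: when capacity is reached, the oldest capture is discarded to make room for the newest. A minimal sketch of that policy follows; the file names and the three-file limit are hypothetical.

```python
from collections import deque

class OverwritableStore:
    """Keeps at most `max_files` capture files, discarding the oldest when full."""

    def __init__(self, max_files: int):
        self.files = deque(maxlen=max_files)   # a bounded deque silently evicts the oldest entry

    def add_capture(self, filename: str):
        if len(self.files) == self.files.maxlen:
            print(f"overwriting oldest capture: {self.files[0]}")
        self.files.append(filename)

store = OverwritableStore(max_files=3)
for name in ["mon.pcap", "tue.pcap", "wed.pcap", "thu.pcap"]:
    store.add_capture(name)      # "mon.pcap" is overwritten when "thu.pcap" arrives
print(list(store.files))         # ['tue.pcap', 'wed.pcap', 'thu.pcap']
```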
GbE vs. 10 GbE
Most businesses use Gigabit Ethernet speed networks and will continue to do so for some time. If a business intends to use one centralized packet capture appliance to aggregate all network data, it would probably be necessary to use a 10 GbE packet capture appliance to handle the large volume of data coming to it from all over the network. A more effective way is to use multiple 1 Gbit/s inline packet capture appliances placed strategically around the network so that there is no need to re-engineer a gigabit network to fit a 10 GbE appliance.
Data security
Since packet capture appliances capture and store a large amount of data on network activity, including files, emails and other communications, they could, in themselves, become attractive targets for hacking. A packet capture appliance deployed for any length of time should incorporate security features, to protect the recorded network data from access by unauthorized parties. If deploying a packet capture appliance introduces too many additional concerns about security, the cost of securing it may outweigh the benefits. The best approach would be for the packet capture appliance to have built-in security features. These security features may include encryption, or methods to “hide” the appliance's presence on the network. For example, some packet capture appliances feature “electronic invisibility”, where they have a stealthy network profile by not requiring or using IP nor MAC addresses.
Though connecting a packet capture appliance via a SPAN port appears to make it more secure, the packet capture appliance would ultimately still have to be connected to the network in order to allow management and data retrieval. Though not accessible via the SPAN link, the appliance would be accessible via the management link.
Despite the benefits, the ability to control a packet capture appliance from a remote machine presents a security issue that could make the appliance vulnerable. Packet capture appliances that allow remote access should have a robust system in place to protect it against unauthorized access. One way to accomplish this is to incorporate a manual disable, such as a switch or toggle that allows the user to physically disable remote access. This simple solution is very effective, as it is doubtful that a hacker would have an easy time gaining physical access to the appliance in order to flip a switch.
A final consideration is physical security. All the network security features in the world are moot if someone is simply able to steal the packet capture appliance or make a copy of it and have ready access to the data stored on it. Encryption is one of the best ways to address this concern, though some packet capture appliances also feature tamperproof enclosures.
See also
Intrusion detection
Packet capture
Packet sniffer
References
Packets (information technology)
Computer network security
Software engineering demographics
https://en.wikipedia.org/wiki/Software%20engineering%20demographics

Software engineers form part of the workforce around the world. As of 2016, it is estimated that there are 21 million professional software developers.
United States
As of 2016, it was estimated that 3.87 million professional software developers worked in the US out of a total employed workforce of 152 million (2.54%).
Summary
Based on data from the U.S. Bureau of Labor Statistics from 2002, about 612,000 software engineers worked in the U.S. - about one out of every 200 workers. There were 55% to 60% as many software engineers as all traditional engineers. This comparison holds whether one compares the number of practitioners, managers, educators, or technicians/programmers. Software engineering had 612,000 practitioners; 264,790 managers, 16,495 educators, and 457,320 programmers.
Software Engineers Versus Traditional Engineers
The number of software engineers (611,900) can be compared with the number of traditional engineers (1,157,020); the ratio is 53%.
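A quick check of that ratio from the two headline counts quoted above (a worked verification, not additional data):

```python
software_engineers = 611_900
traditional_engineers = 1_157_020
print(f"{software_engineers / traditional_engineers:.0%}")   # 53%
```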
There are another 1,500,000 people in system analysis, system administration, and computer support, many of whom might be called software engineers. Many systems analysts manage software development teams and analysis is an important software engineering role, so many of them might be considered software engineers in the near future. This means that the number of software engineers may actually be much higher.
Note also that the number of software engineers declined by 5% to 10% from 2000 to 2002.
Computer Managers Versus Construction and Engineering Managers
Computer and information system managers (264,790) manage software projects, as well as computer operations. Similarly, construction and engineering managers (413,750) oversee engineering projects, manufacturing plants, and construction sites. Computer management is 64% the size of construction and engineering management.
Software Engineering Educators Versus Engineering Educators
Until now, computer science has been the main degree to acquire, whether one wanted to make software systems (software engineering) or study the theoretical and mathematical facets of software systems (computer science). The data shows that the number of chemistry and physics educators (29,610) nearly equals the number of engineering educators (29,310). It is estimated that, similarly, half of computer science educators emphasize the practical side (software engineering) (16,495) and half emphasize the theoretical side (computer science) (16,495). This means that software engineering education is 56% the size of traditional engineering education. Computer science is larger than all engineering, and larger than all physics and chemistry.
Other Software and Engineering Roles
Relation to IT demographics
Software engineers are part of the much larger software, hardware, application, and operations community. In 2000 in the U.S., there were about 680,000 software engineers and about 10,000,000 IT workers.
There are no numbers on testers in the BLS data.
India
There has been a healthy growth in the number of India's IT professionals over the past few years. From a base of 6,800 knowledge workers in 1985-86, the number increased to 522,000 software and services professionals by the end of 2001-02. It is estimated that out of these 528,000 knowledge workers, almost 170,000 are working in the IT software and services export industry; nearly 106,000 are working in the IT enabled services and over 230,000 in user organizations.
References
See also
Software engineering
List of software engineering topics
Software engineering economics
Software engineering professionalism
Software engineering
Demographics
Grml
https://en.wikipedia.org/wiki/Grml

Grml is a Linux distribution based on Debian. It is designed to run mainly from a live CD, but can be made to run from a USB flash drive. Grml aims to be well-suited to system administrators (sysadmins) and other users of text tools. It includes an X Window System server and a few minimalist window managers, such as wmii, Fluxbox, and Openbox, to run the graphical programs, such as Mozilla Firefox, that are included in the distribution.
Features
In addition to the sysadmin tools, security and network-related software, data recovery and forensic tools, editors, shells, and many text tools included with Grml, the distribution focuses on accessibility by providing kernel support for speakup and software like brltty, emacspeak, and flite.
Another feature of Grml is its use of the Z shell (zsh) as the default login shell. The customized zsh configuration used by Grml can be retrieved from the project's repository.
Since early 2009, Grml ISOs come with MirOS bsd4grml, a minimal MirOS BSD flavour. After the release of Grml “Lackdose-Allergie” 2009.05, daily ISOs and later releases, such as Grml “Hello-Wien” 2009.10, use the manifold-boot technology to provide ISOs that can be written directly to a USB stick, CF/SD card, hard disc, etc. and are immediately bootable. Since Grml 2010.12 the ISOLINUX loader is used in all cases by default, providing a consistent menu.
While Grml is primarily designed as a live CD image, it can also be run as a desktop operating system through its "persistent home" feature.
References
Further reading
Chirurgisches Besteck - Live-Werkzeugkasten für die Shell (Author: Michael Prokop, article in German)
External links
Operating system distributions bootable from read-only media
Debian-based distributions
Linux distributions
Not a Hero
https://en.wikipedia.org/wiki/Not%20a%20Hero

Not a Hero is a cover-based 2D shooter video game developed by the British indie development studio Roll7 and published by Devolver Digital. The game was released on 14 May 2015 for Microsoft Windows. A later update, built using the Chowdren runtime for Clickteam Fusion 2.5, introduced builds for OS X and Linux on 30 September 2015. The PlayStation 4 version of the game was released on 2 February 2016, with the PlayStation Vita version being cancelled. A Super Snazzy Edition, including a new extra campaign, was released on Xbox One by Team17 in May 2016, and on Nintendo Switch by Devolver Digital in August 2018.
Plot
The anthropomorphic purple rabbit BunnyLord has travelled back in time from 2048 to be elected mayor and save the world as we know it from total destruction and alien invasion. During his candidacy, he needs to show the citizens why he should be elected, and he enlists freelance anti-heroes to clean up crime in the city under his name.
Gameplay
Not a Hero is a 2D side-scrolling cover-based shooter presented in a pixel art style. Players can choose one of nine protagonists at the start of each level, each with their own twists on the mechanics of the game. The player is equipped with a primary weapon that can be upgraded temporarily or for the rest of the level with various upgrades found on the map. The player can pick up special weapons, such as deployable turrets or, more bizarrely, exploding cats. The player is unable to jump but can slide, and is able to use this to take cover against objects in the game world or tackle enemies, leaving them open to an execution. The player can only sustain a few hits, but health is regenerated quickly.
The objectives of each level vary, but they amount to accomplishing different tasks to promote BunnyLord's mayoral campaign, such as killing all criminals on the level, rescuing hostages or destroying drug production - each level also has three minor objectives to complete, with some randomness used to generate these. The game also boasts dynamic mid-level events, such as an attacking SWAT team or a helicopter gunship.
The nine protagonists, in order of unlocking, include: Steve, a cockney assassin who wields a fast-reloading pistol; Cletus, a supposedly-Scottish hillbilly who uses a shotgun to blow enemies backwards and shoot doors open; Samantha, a fast Welsh woman who can reload and fire while moving; Jesus, a hip-thrusting SMG-wielder who can run fast and execute enemies while moving; Mike, a rapidly-moving alcoholic from St. Helens with a powerful sawed-off shotgun; Stanley, a slow-moving and slow-reloading paramilitary soldier with a high-capacity rifle; Clive, a bumbling spy who can shoot while running, as well as fire two guns at each side of the screen; Ronald Justice, a deranged superhero wielding a hammer and pistol; and Kimmy, who can use a katana in conjunction with her SMG. Each has various modifiers upon the normal gameplay mechanics, such as movement speed.
Development
Not a Hero was originally created under the title Ur Not a Hero by John Ribbins as a free indie game, part of a list of game ideas he made in 2012 that also included OlliOlli. Ur Not a Hero was released on The Daily Click under the user name butterfingers on 10 January 2013. Although Not a Hero mostly builds on Ur Not a Hero, its mechanics draw on three games from John Ribbins' list: the cover-based shooter aspect, the major one, comes from Ur Not a Hero; the indoor level design comes from his game Jeffrey Archer; and BunnyLord's randomly generated sentences and sounds come from his game Hackathor. Roll7 later developed the game further using ISO-Slant, a Clickteam Fusion add-on that makes a 2D game appear 2¼D and allows one to look around it using ISO-Slant glasses. This port was arranged by Roll7's artist Jake Hollands, making Not a Hero the first game to use ISO-Slant technology. The visual style developed for the game is bright pixel art. In an interview, lead artist Hollands stated "I learned to create pixel art in the week before my interview at Roll7 and got much better at it whilst working on Not a Hero, but don't plan to return to it - I think that outside of a nostalgic choice there should be a good reason to use it".
The game was later picked up by Devolver Digital, who had published OlliOlli as well. BunnyLord was later given his own Twitter account to post his ideas and opinions about politics, as well as to parody it. On 21 April 2015, Not a Hero was announced to have been hacked, although this later turned out to be an official demo release. For the occasion, the developer, Roll7, created a web page for the so-called UJIP Party, representing BunnyLord's opponent, with an advertisement video, information on why BunnyLord's "illegal immigration" should not be tolerated, and a download for the game's "hacked" demo. On 1 May 2015, Roll7 announced that the game's release would be delayed one week, due to Jake Hollands wanting to implement 60 FPS into the game, as all previous versions had run at 30 FPS. A later update, built using the Chowdren runtime for Clickteam Fusion 2.5 by MP2 Games, introduced builds for OS X and Linux on 30 September 2015. On 12 January 2016, it was announced that Not a Hero would be released for PlayStation 4 on 2 February 2016, with the PlayStation Vita version being officially cancelled later that day.
Reception
Not a Hero received positive reviews upon release, with the PC version holding an aggregated Metacritic score of 75/100, based on 57 critic reviews, the PlayStation 4 version a score of 74/100, based on 26 critic reviews, and the Xbox One version a score of 75/100 based on 13 critic reviews.
References
2015 video games
Android (operating system) games
Black comedy video games
Cancelled PlayStation Vita games
Clickteam Fusion games
Devolver Digital games
Linux games
MacOS games
Nintendo Switch games
PlayStation 4 games
PlayStation Network games
Roll7 games
Tactical shooter video games
Team17 games
Video games developed in the United Kingdom
Windows games
Xbox One games
Lmctfy
https://en.wikipedia.org/wiki/Lmctfy

lmctfy ("Let Me Contain That For You", pronounced "l-m-c-t-fi") is an implementation of operating-system-level virtualization based on the Linux kernel's cgroups functionality.
It provides similar functionality to other container-related Linux tools such as Docker and LXC. Lmctfy is the release of Google's own container tools and is free and open-source software, subject to the terms of the Apache License, version 2.0.
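Because lmctfy is built on the kernel's cgroups interface, the kind of resource control it automates can be illustrated directly against the cgroup filesystem. The sketch below uses the legacy cgroup-v1 CPU controller; the paths and values are illustrative, and this is not lmctfy's own command-line or API syntax.

```python
import os

# Create a cgroup, cap its relative CPU share, and move the current process into it.
# Requires root and a mounted cgroup-v1 hierarchy at /sys/fs/cgroup/cpu (an assumption).
cgroup = "/sys/fs/cgroup/cpu/demo"
os.makedirs(cgroup, exist_ok=True)

with open(os.path.join(cgroup, "cpu.shares"), "w") as f:
    f.write("512")                     # half the default relative CPU weight of 1024

with open(os.path.join(cgroup, "cgroup.procs"), "w") as f:
    f.write(str(os.getpid()))          # place the current process in the cgroup
```

Tools such as lmctfy, Docker, and LXC wrap this low-level filesystem interface in higher-level container abstractions.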
In May 2015, the maintainers stated that they were working to merge lmctfy's concepts and abstractions into Docker's underlying library, libcontainer, and had therefore stopped active development of lmctfy.
References
External links
Presentation slides from initial release announcement
Project website
Project "README" file providing overview
Google Groups post providing in depth comparison with the LXC tools
Linux-only free software
Virtualization-related software for Linux
HP Business Service Management
https://en.wikipedia.org/wiki/HP%20Business%20Service%20Management

HP Business Service Management (BSM) is an end-to-end management software tool that integrates network, server, application and business transaction monitoring. HP Business Service Management is developed and marketed by the HP Software Division.
HP introduced HP Business Service Management 9.0 as a common single platform for managing complex applications, including those supported by both private and public cloud computing, outsourced IT, software-as-a-service (SaaS) and traditional IT service delivery.
The 9.0 release was made generally available in June 2010 and was part of a portfolio of applications developed by HP to aid businesses and government organizations with the management of cloud computing as well as traditional IT service delivery.
Business service management is an area of Information Technology that focuses on management of software tools, methods and processes that help the IT department manage technology in a way that supports the business through the services they provide. The BSM methodology connects key IT components to the goals of the business so that the IT department can forecast how technology will affect the business and how business will impact the IT infrastructure.
Components
HP Business Service Management includes operations intelligence, operations bridge, application performance management, systems and virtualization management, network management and storage management.
See also
Business service management
Cloud computing
Software as a service
References
Business software
Business Service Management
Network management
Stingray phone tracker
https://en.wikipedia.org/wiki/Stingray%20phone%20tracker

The StingRay is an IMSI-catcher, a cellular phone surveillance device, manufactured by Harris Corporation. Initially developed for the military and intelligence community, the StingRay and similar Harris devices are in widespread use by local and state law enforcement agencies across Canada, the United States, and in the United Kingdom. Stingray has also become a generic name to describe these kinds of devices.
Technology
The StingRay is an IMSI-catcher with both passive (digital analyzer) and active (cell-site simulator) capabilities. When operating in active mode, the device mimics a wireless carrier cell tower in order to force all nearby mobile phones and other cellular data devices to connect to it.
The StingRay family of devices can be mounted in vehicles, on airplanes, helicopters, and unmanned aerial vehicles. Hand-carried versions are referred to under the trade name KingFish.
Active mode operations
Extracting stored data such as International Mobile Subscriber Identity (IMSI) numbers and Electronic Serial Numbers (ESN)
Writing cellular protocol metadata to internal storage
Forcing an increase in signal transmission power
Forcing an abundance of radio signals to be transmitted
Forcing a downgrade to an older and less secure communications protocol if the older protocol is allowed by the target device, by making the Stingray pretend to be unable to communicate on an up-to-date protocol
Interception of communications data or metadata
Using received signal strength indicators to spatially locate the cellular device
Conducting a denial of service attack
Radio jamming for either general denial of service purposes or to aid in active mode protocol rollback attacks
Passive mode operations
Conducting base station surveys, which is the process of using over-the-air signals to identify legitimate cell sites and precisely map their coverage areas
Active (cell site simulator) capabilities
In active mode, the StingRay will force each compatible cellular device in a given area to disconnect from its service provider cell site (e.g., operated by Verizon, AT&T, etc.) and establish a new connection with the StingRay. In most cases, this is accomplished by having the StingRay broadcast a pilot signal that is either stronger than, or made to appear stronger than, the pilot signals being broadcast by legitimate cell sites operating in the area. A common function of all cellular communications protocols is to have the cellular device connect to the cell site offering the strongest signal. StingRays exploit this function as a means to force temporary connections with cellular devices within a limited area.
Extracting data from internal storage
During the process of forcing connections from all compatible cellular devices in a given area, the StingRay operator needs to determine which device is the desired surveillance target. This is accomplished by downloading the IMSI, ESN, or other identifying data from each of the devices connected to the StingRay. In this context, the IMSI or equivalent identifier is not obtained from the cellular service provider or from any other third-party. The StingRay downloads this data directly from the device using radio waves.
In some cases, the IMSI or equivalent identifier of a target device is known to the StingRay operator beforehand. When this is the case, the operator will download the IMSI or equivalent identifier from each device as it connects to the StingRay. When the downloaded IMSI matches the known IMSI of the desired target, the dragnet will end and the operator will proceed to conduct specific surveillance operations on just the target device.
In other cases, the IMSI or equivalent identifier of a target is not known to the StingRay operator and the goal of the surveillance operation is to identify one or more cellular devices being used in a known area. For example, if visual surveillance is being conducted on a group of protestors, a StingRay can be used to download the IMSI or equivalent identifier from each phone within the protest area. After identifying the phones, locating and tracking operations can be conducted, and service providers can be forced to turn over account information identifying the phone users.
Forcing an increase in signal transmission power
Cellular telephones are radio transmitters and receivers, much like a walkie-talkie. However, the cell phone communicates only with a repeater inside a nearby cell tower installation. At that installation, the devices take in all cell calls in its geographic area and repeat them out to other cell installations which repeat the signals onward to their destination telephone (either by radio or landline wires). Radio is used also to transmit a caller's voice/data back to the receiver's cell telephone. The two-way duplex phone conversation then exists via these interconnections.
To make all that work correctly, the system allows automatic increases and decreases in transmitter power (for the individual cell phone and for the tower repeater, too) so that only the minimum transmit power is used to complete and hold the call active, "on", and allows the users to hear and be heard continuously during the conversation. The goal is to hold the call active but use the least amount of transmitting power, mainly to conserve batteries and be efficient. The tower system will sense when a cell phone is not coming in clearly and will order the cell phone to boost transmit power. The user has no control over this boosting; it may occur for a split second or for the whole conversation. If the user is in a remote location, the power boost may be continuous. In addition to carrying voice or data, the cell phone also transmits data about itself automatically, and that is boosted or not as the system detects need.
Encoding of all transmissions ensures that no crosstalk or interference occurs between two nearby cell users. The boosting of power, however, is limited by the design of the devices to a maximum setting. The standard systems are not "high power" and thus can be overpowered by secret systems using much more boosted power that can then take over a user's cell phone. If overpowered that way, a cell phone will not indicate the change due to the secret radio being programmed to hide from normal detection. The ordinary user can not know if their cell phone is captured via overpowering boosts or not. (There are other ways of secret capture that need not overpower, too.)
Just as a person shouting drowns out someone whispering, the boost in RF watts of power into the cell telephone system can overtake and control that system—in total or only a few, or even only one, conversation. This strategy requires only more RF power, and thus it is more simple than other types of secret control. Power boosting equipment can be installed anywhere there can be an antenna, including in a vehicle, perhaps even in a vehicle on the move. Once a secretly boosted system takes control, any manipulation is possible from simple recording of the voice or data to total blocking of all cell phones in the geographic area.
Tracking and locating
A StingRay can be used to identify and track a phone or other compatible cellular data device even while the device is not engaged in a call or accessing data services.
A Stingray closely resembles a portable cellphone tower. Typically, law enforcement officials place the Stingray in their vehicle along with compatible computer software. The Stingray acts as a cellular tower, sending out signals that get the specific device to connect to it. Cell phones are programmed to connect with the cellular tower offering the best signal. When the phone and Stingray connect, the computer system determines the strength of the signal and thus the distance to the device. Then, the vehicle moves to another location and sends out signals until it connects with the phone. When the signal strength has been determined from enough locations, the computer system can pinpoint the phone and is able to find it.
Cell phones are programmed to constantly search for the strongest signal emitted from cell phone towers in the area. Over the course of the day, most cell phones connect and reconnect to multiple towers in an attempt to connect to the strongest, fastest, or closest signal. Because of the way they are designed, the signals that the Stingray emits are far stronger than those coming from surrounding towers. For this reason, all cell phones in the vicinity connect to the Stingray regardless of the cell phone owner's knowledge. From there, the stingray is capable of locating the device, interfering with the device, and collecting personal data from the device.
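The location step described above amounts to estimating a distance from the received signal strength at each measurement stop and intersecting those estimates. A simplified sketch, assuming a log-distance path-loss model and SciPy's least-squares solver; the model constants and the measurements are illustrative assumptions, not properties of any real device.

```python
import numpy as np
from scipy.optimize import least_squares

def rssi_to_distance(rssi_dbm, tx_power_dbm=-30.0, path_loss_exp=2.5):
    """Invert a log-distance path-loss model: RSSI = P0 - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

# (x, y) positions of the measuring vehicle in metres, and the RSSI observed at each stop.
positions = np.array([[0.0, 0.0], [120.0, 0.0], [60.0, 150.0], [200.0, 100.0]])
rssi = np.array([-80.0, -76.5, -79.1, -82.6])
distances = rssi_to_distance(rssi)

def residuals(p):
    # Difference between each measured distance and the distance to candidate point p.
    return np.linalg.norm(positions - p, axis=1) - distances

estimate = least_squares(residuals, x0=[50.0, 50.0]).x
print(estimate)   # estimated (x, y) of the handset, roughly (80, 60) for these inputs
```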
Denial of service
The FBI has claimed that when used to identify, locate, or track a cellular device, the StingRay does not collect communications content or forward it to the service provider. Instead, the device causes a disruption in service. Under this scenario, any attempt by the cellular device user to place a call or access data services will fail while the StingRay is conducting its surveillance. On August 21, 2018, Senator Ron Wyden noted that Harris Corporation confirmed that Stingrays disrupt the targeted phone's communications. Additionally, he noted that "while the company claims its cell-site simulators include a feature that detects and permits the delivery of emergency calls to 9-1-1, its officials admitted to my office that this feature has not been independently tested as part of the Federal Communication Commission’s certification process, nor were they able to confirm this feature is capable of detecting and passing-through 9-1-1 emergency communications made by people who are deaf, hard of hearing, or speech disabled using Real-Time Text technology."
Interception of communications content
By way of software upgrades, the StingRay and similar Harris products can be used to intercept GSM communications content transmitted over-the-air between a target cellular device and a legitimate service provider cell site. The StingRay does this by way of the following man-in-the-middle attack: (1) simulate a cell site and force a connection from the target device, (2) download the target device's IMSI and other identifying information, (3) conduct "GSM Active Key Extraction" to obtain the target device's stored encryption key, (4) use the downloaded identifying information to simulate the target device over-the-air, (5) while simulating the target device, establish a connection with a legitimate cell site authorized to provide service to the target device, (6) use the encryption key to authenticate the StingRay to the service provider as being the target device, and (7) forward signals between the target device and the legitimate cell site while decrypting and recording communications content.
The "GSM Active Key Extraction" performed by the StingRay in step three merits additional explanation. A GSM phone encrypts all communications content using an encryption key stored on its SIM card with a copy stored at the service provider. While simulating the target device during the above explained man-in-the-middle attack, the service provider cell site will ask the StingRay (which it believes to be the target device) to initiate encryption using the key stored on the target device. Therefore, the StingRay needs a method to obtain the target device's stored encryption key else the man-in-the-middle attack will fail.
GSM primarily encrypts communications content using the A5/1 call encryption cypher. In 2008 it was reported that a GSM phone's encryption key can be obtained using $1,000 worth of computer hardware and 30 minutes of cryptanalysis performed on signals encrypted using A5/1. However, GSM also supports an export weakened variant of A5/1 called A5/2. This weaker encryption cypher can be cracked in real-time. While A5/1 and A5/2 use different cypher strengths, they each use the same underlying encryption key stored on the SIM card. Therefore, the StingRay performs "GSM Active Key Extraction" during step three of the man-in-the-middle attack as follows: (1) instruct target device to use the weaker A5/2 encryption cypher, (2) collect A5/2 encrypted signals from target device, and (3) perform cryptanalysis of the A5/2 signals to quickly recover the underlying stored encryption key. Once the encryption key is obtained, the StingRay uses it to comply with the encryption request made to it by the service provider during the man-in-the-middle attack.
A rogue base station can force unencrypted links, if supported by the handset software. The rogue base station can send a 'Cipher Mode Settings' element (see GSM 04.08 Chapter 10.5.2.9) to the phone, with this element clearing the one bit that marks if encryption should be used. In such cases the phone display could indicate the use of an unsafe link—but the user interface software in most phones does not interrogate the handset's radio subsystem for use of this insecure mode nor display any warning indication.
Passive capabilities
In passive mode, the StingRay operates either as a digital analyzer, which receives and analyzes signals being transmitted by cellular devices and/or wireless carrier cell sites or as a radio jamming device, which transmits signals that block communications between cellular devices and wireless carrier cell sites. By "passive mode", it is meant that the StingRay does not mimic a wireless carrier cell site or communicate directly with cellular devices.
Base station (cell site) surveys
A StingRay and a test phone can be used to conduct base station surveys, which is the process of collecting information on cell sites, including identification numbers, signal strength, and signal coverage areas. When conducting base station surveys, the StingRay mimics a cell phone while passively collecting signals being transmitted by cell-sites in the area of the StingRay.
Base station survey data can be used to further narrow the past locations of a cellular device if used in conjunction with historical cell site location information ("HCSLI") obtained from a wireless carrier. HCSLI includes a list of all cell sites and sectors accessed by a cellular device, and the date and time each access was made. Law enforcement will often obtain HCSLI from wireless carriers in order to determine where a particular cell phone was located in the past. Once this information is obtained, law enforcement will use a map of cell site locations to determine the past geographical locations of the cellular device.
However, the signal coverage area of a given cell site may change according to the time of day, weather, and physical obstructions in relation to where a cellular device attempts to access service. The maps of cell site coverage areas used by law enforcement may also lack precision as a general matter. For these reasons, it is beneficial to use a StingRay and a test phone to map out the precise coverage areas of all cell sites appearing in the HCSLI records. This is typically done at the same time of day and under the same weather conditions that were in effect when the HCSLI was logged. Using a StingRay to conduct base station surveys in this manner allows for mapping out cell site coverage areas that more accurately match the coverage areas that were in effect when the cellular device was used.
Usage by law enforcement
In the United States
The use of the devices has been frequently funded by grants from the Department of Homeland Security. The Los Angeles Police Department used a Department of Homeland Security grant in 2006 to buy a StingRay for "regional terrorism investigations". However, according to the Electronic Frontier Foundation, the "LAPD has been using it for just about any investigation imaginable."
In addition to federal law enforcement, military and intelligence agencies, StingRays have in recent years been purchased by local and state law enforcement agencies.
In 2006, Harris Corporation employees directly conducted wireless surveillance using StingRay units on behalf of the Palm Bay Police Department, where Harris has a campus, in response to a bomb threat against a middle school. The search was conducted without a warrant or judicial oversight.
The American Civil Liberties Union (ACLU) confirmed that local police have cell site simulators in Washington, Nevada, Arizona, Alaska, Missouri, New Mexico, Georgia, and Massachusetts. State police have cell site simulators in Oklahoma, Louisiana, Pennsylvania, and Delaware. Local and state police have cell site simulators in California, Texas, Minnesota, Wisconsin, Michigan, Illinois, Indiana, Tennessee, North Carolina, Virginia, Florida, Maryland, and New York [60]. Police use of cell site simulators in the remaining states is unknown. However, many agencies do not disclose their use of StingRay technology, so these figures are potentially an under-representation of the actual number of agencies. According to the most recent information published by the American Civil Liberties Union, 72 law enforcement agencies in 24 states owned StingRay technology in 2017, up from 42 agencies in 17 states in 2014 [60]. The following federal agencies in the United States have confirmed their use of cell-site simulators: the Federal Bureau of Investigation, Drug Enforcement Administration, US Secret Service, Immigration and Customs Enforcement, US Marshals Service, Bureau of Alcohol, Tobacco, Firearms, and Explosives, US Army, US Navy, US Marine Corps, US National Guard, US Special Operations Command, and National Security Agency [60]. In fiscal years 2010–14, the Department of Justice confirmed spending "more than $71 million on cell-site simulation technology," while the Department of Homeland Security confirmed spending "more than $24 million on cell-site simulation technology."
Several court decisions have been issued on the legality of using a Stingray without a warrant, with some courts ruling a warrant is required and others not requiring a warrant.
Outside the United States
Police in Vancouver, British Columbia, Canada, admitted after much speculation across the country that they had made use of a Stingray device provided by the RCMP. They also stated that they intended to make use of such devices in the future. Two days later, a statement by Edmonton's police force had been taken as confirming their use of the devices, but they said later that they did not mean to create what they called a miscommunication.
Privacy International and The Sunday Times reported on the usage of StingRays and IMSI-catchers in Ireland, against the Irish Garda Síochána Ombudsman Commission (GSOC), which is an oversight agency of the Irish police force Garda Síochána. On June 10, 2015, the BBC reported on an investigation by Sky News about possible false mobile phone towers being used by the London Metropolitan Police. Commissioner Bernard Hogan-Howe refused comment.
Between February 2015 and April 2016, over 12 companies in the United Kingdom were authorized to export IMSI-catcher devices to states including Saudi Arabia, UAE, and Turkey. Critics have expressed concern about the export of surveillance technology to countries with poor human rights records and histories of abusing surveillance technology.
Secrecy
The increasing use of the devices has largely been kept secret from the court system and the public. In 2014, police in Florida revealed they had used such devices at least 200 additional times since 2010 without disclosing it to the courts or obtaining a warrant. One of the reasons the Tallahassee police provided for not pursuing court approval is that such efforts would allegedly violate the non-disclosure agreements (NDAs) that police sign with the manufacturer. The American Civil Liberties Union has filed multiple requests for the public records of Florida law enforcement agencies about their use of the cell phone tracking devices.
Local law enforcement and the federal government have resisted judicial requests for information about the use of stingrays, refusing to turn over information or heavily censoring it. In June 2014, the American Civil Liberties Union published court records documenting the extensive use of these devices by local Florida police. After this publication, the United States Marshals Service seized the local police's surveillance records in a bid to keep them from coming out in court.
In some cases, police have refused to disclose information to the courts citing non-disclosure agreements signed with Harris Corporation. The FBI defended these agreements, saying that information about the technology could allow adversaries to circumvent it. The ACLU has said "potentially unconstitutional government surveillance on this scale should not remain hidden from the public just because a private corporation desires secrecy. And it certainly should not be concealed from judges."
In 2015, Santa Clara County pulled out of contract negotiations with Harris for StingRay units, citing onerous restrictions imposed by Harris on what could be released under public records requests.
Criticism
In recent years, legal scholars, public interest advocates, legislators and several members of the judiciary have strongly criticized the use of this technology by law enforcement agencies. Critics have called the use of the devices by government agencies warrantless cell phone tracking, as they have frequently been used without informing the court system or obtaining a warrant. The Electronic Frontier Foundation has called the devices "an unconstitutional, all-you-can-eat data buffet."
In June 2015, WNYC Public Radio published a podcast with Daniel Rigmaiden about the StingRay device.
In 2016, Professor Laura Moy of the Georgetown University Law Center filed a formal complaint to the FCC regarding the use of the devices by law enforcement agencies, taking the position that because the devices mimic the properties of cell phone towers, the agencies operating them are in violation of FCC regulation, as they lack the appropriate spectrum licenses.
On December 4, 2019, the American Civil Liberties Union and the New York Civil Liberties Union (NYCLU) filed a federal lawsuit against the Customs and Border Protection and the Immigration and Customs Enforcement agencies. According to the ACLU, it had filed a Freedom of Information Act request in 2017 but was not given access to documents. The NYCLU and ACLU proceeded with the lawsuit on the grounds that both CBP and ICE had failed "to produce a range of records about their use, purchase, and oversight of Stingrays." In an official statement expanding on their reasoning for the lawsuit, the ACLU expressed concern over the StingRay's current and future applications, stating that ICE was using the devices for "unlawfully tracking journalists and advocates and subjecting people to invasive searches of their electronic devices at the border."
Countermeasures
A number of countermeasures to the StingRay and other devices have been developed. For example, crypto phones such as GSMK's Cryptophone have firewalls that can identify and thwart the StingRay's actions or alert the user to IMSI capture.
EFF developed a system to catch stingrays.
See also
Authentication and Key Agreement (protocol)
Cellphone surveillance
Evil Twin Attack
Mobile phone tracking
Kyllo v. United States (lawsuit re thermal image surveillance)
United States v. Davis (2014) found warrantless data collection violated constitutional rights, but okayed data use for criminal conviction, as data collected in good faith
References
Further reading
IMSI catchers, and specifically the Harris Stingray, are extensively used in the Intelligence Support Activity/Task Force Orange thriller written by J. T. Patten, a former counterterrorism intelligence specialist. Patten, J. T., Buried in Black. A Task Force Orange novel. Lyrical Press/Penguin, 2018.
Telecommunications equipment
Mass intelligence-gathering systems
Surveillance
Mobile security
Telephone tapping
Telephony equipment
Law enforcement equipment |
356488 | https://en.wikipedia.org/wiki/ATX | ATX | ATX (Advanced Technology eXtended) is a motherboard and power supply configuration specification developed by Intel in 1995 to improve on previous de facto standards like the AT design. It was the first major change in desktop computer enclosure, motherboard and power supply design in many years, improving standardization and interchangeability of parts. The specification defines the dimensions; the mounting points; the I/O panel; and the power and connector interfaces among a computer case, a motherboard, and a power supply.
ATX is the most common motherboard design. Other standards for smaller boards (including microATX, FlexATX, nano-ITX, and mini-ITX) usually keep the basic rear layout but reduce the size of the board and the number of expansion slots. Dimensions of a full-size ATX board are 305 × 244 mm (12 × 9.6 in), which allows many ATX chassis to accept microATX boards. The ATX specifications were released by Intel in 1995 and have been revised numerous times since. The most recent ATX motherboard specification is version 2.2. The most recent ATX12V power supply unit specification is 2.53, released in June 2020. EATX (Extended ATX) is a bigger version of the ATX motherboard, with dimensions of 305 × 330 mm (12 × 13 in). While some dual CPU socket motherboards have been implemented in ATX, the extra size of EATX makes it the typical form factor for dual socket systems, and with sockets that support four or eight memory channels, for single socket systems with a large number of memory slots.
In 2004, Intel announced the BTX (Balanced Technology eXtended) standard, intended as a replacement for ATX. While some manufacturers adopted the new standard, Intel discontinued any future development of BTX in 2006. The ATX design still remains the de facto standard for personal computers.
Connectors
On the back of the computer case, some major changes were made to the AT standard. Originally AT style cases had only a keyboard connector and expansion slots for add-on card backplates. Any other onboard interfaces (such as serial and parallel ports) had to be connected via flying leads to connectors which were mounted either on spaces provided by the case or brackets placed in unused expansion slot positions.
ATX allowed each motherboard manufacturer to put these ports in a rectangular area on the back of the system with an arrangement they could define themselves, though a number of general patterns depending on what ports the motherboard offers have been followed by most manufacturers. Cases are usually fitted with a snap-out panel, also known as an I/O plate or I/O shield, in one of the common arrangements. If necessary, I/O plates can be replaced to suit a motherboard that is being fitted; the I/O plates are usually included with motherboards not designed for a particular computer. The computer will operate correctly without a plate fitted, although there will be open gaps in the case which may compromise the EMI/RFI screening and allow ingress of dirt and random foreign bodies. Panels were made that allowed fitting an AT motherboard in an ATX case. Some ATX motherboards come with an integrated I/O plate.
ATX also made the PS/2-style mini-DIN keyboard and mouse connectors ubiquitous. AT systems used a 5-pin DIN connector for the keyboard and were generally used with serial port mice (although PS/2 mouse ports were also found on some systems). Many modern motherboards are phasing out the PS/2-style keyboard and mouse connectors in favor of the more modern Universal Serial Bus. Other legacy connectors that are slowly being phased out of modern ATX motherboards include 25-pin parallel ports and 9-pin RS-232 serial ports. In their place are onboard peripheral ports such as Ethernet, FireWire, eSATA, audio ports (both analog and S/PDIF), video (analog D-sub, DVI, HDMI, or DisplayPort), extra USB ports, and Wi-Fi.
A notable issue with the ATX specification was that it was last revised when power supplies were normally placed at the top, rather than the bottom, of computer cases. This has led to some problematic standard locations for ports, in particular the 4/8 pin CPU power, which is normally located along the top edge of the board to make it convenient for top mounted power supplies. This makes it very difficult for cables from bottom mounted power supplies to reach, and commonly requires a special cutout in the back plane for the cable to come in from behind and bend around the board, making insertion and wire management very difficult. Many power supply cables barely reach or fail to reach, or are too stiff to make the bend, and extensions are commonly required due to this placement.
Variants
Several ATX-derived designs have been specified that use the same power supply, mountings and basic back panel arrangement, but set different standards for the size of the board and number of expansion slots. Standard ATX provides seven slots at 0.8 in (20.32 mm) spacing; the popular microATX size removes 2.4 in (61 mm) of width and three slots, leaving four. Here width refers to the distance along the external connector edge, while depth is from front to rear.
AOpen has conflated the term Mini ATX with a more recent design of its own. Since references to Mini ATX were removed from the ATX specifications after the adoption of microATX, the AOpen definition is the more contemporary usage, and the original Mini ATX size is apparently only of historical significance. The name also invites confusion with the now common but unrelated Mini-ITX standard (170 × 170 mm). A number of manufacturers have added one to three additional expansion slots (at the standard 0.8 inch spacing) to the standard 12-inch ATX motherboard width.
Form factors considered obsolete in 1999 included Baby-AT, full size AT, and the semi-proprietary LPX for low-profile cases. Proprietary motherboard designs such as those by Compaq, Packard-Bell, Hewlett Packard and others existed, and were not interchangeable with multi-manufacturer boards and cases. Portable and notebook computers and some 19-inch rackmount servers have custom motherboards unique to their particular products.
Although true E-ATX measures 305 × 330 mm (12 × 13 in), most motherboard manufacturers also apply the E-ATX label to boards with somewhat smaller depths. While E-ATX and SSI EEB (the Server System Infrastructure (SSI) Forum's Enterprise Electronics Bay (EEB)) share the same dimensions, the screw holes of the two standards do not all align, rendering them incompatible.
In 2008, Foxconn unveiled a Foxconn F1 motherboard prototype, which has the same width as a standard ATX motherboard, but an extended 14.4" length to accommodate 10 slots. The firm called the new design of this motherboard "Ultra ATX" in its CES 2008 showing. Also unveiled during the January 2008 CES was the Lian Li Armorsuit PC-P80 case with 10 slots designed for the motherboard.
The name "XL-ATX" has been used by at least three companies in different ways:
In September 2009, EVGA Corporation had already released an "XL-ATX" motherboard, the EVGA X58 Classified 4-Way SLI.
Gigabyte Technology launched further XL-ATX motherboards, the GA-X58A-UD9 in 2010 and the GA-X79-UD7 in 2011. In April 2010, Gigabyte announced its GA-890FXA-UD7 motherboard, which allowed all seven slots to be moved downward by one slot position. The added length could have allowed placement of up to eight expansion slots, but the top slot position is vacant on this particular model.
MSI released the X58 Big Bang in 2010, the P67 Big Bang Marshal in 2011, the X79 XPower Big Bang 2 in 2012 and the Z87 XPower in 2013, all of them XL-ATX boards. Although these boards have room for additional expansion slots (eight or nine slot positions in total), each provides only seven expansion connectors; the topmost positions are left vacant to provide more room for the CPU, chipset and associated cooling.
In 2010, EVGA Corporation released a new motherboard, the "Super Record 2", or SR-2, whose size surpasses that of the "EVGA X58 Classified 4-Way SLI". The new board is designed to accommodate two dual-QPI LGA1366 socket CPUs (e.g. Intel Xeon), similar to the Intel Skulltrail motherboard that could accommodate two Intel Core 2 Quad processors, and has a total of seven PCI-E slots and 12 DDR3 RAM slots. The new design is dubbed "HPTX".
Power supply
The ATX specification requires the power supply to produce three main outputs, +3.3 V, +5 V and +12 V. Low-power −12 V and +5 VSB (standby) supplies are also required. The −12 V supply is primarily used to provide the negative supply voltage for RS-232 ports and is also used by one pin on conventional PCI slots primarily to provide a reference voltage for some models of sound cards. The 5 VSB supply is used to produce trickle power to provide the soft-power feature of ATX when a PC is turned off, as well as powering the real-time clock to conserve the charge of the CMOS battery. A −5 V output was originally required because it was supplied on the ISA bus; it was removed in later versions of the ATX standard, as it became obsolete with the removal of the ISA bus expansion slots (the ISA bus itself is still found in any computer which is compatible with the old IBM PC specification; e.g., not found in the PlayStation 4).
Originally, the motherboard was powered by one 20-pin connector. An ATX power supply provides a number of peripheral power connectors and (in modern systems) two connectors for the motherboard: an 8-pin (or 4+4-pin) auxiliary connector providing additional power to the CPU and a main 24-pin power supply connector, an extension of the original 20-pin version. The 20-pin connector mates a Molex 39-29-9202 header on the motherboard with a Molex 39-01-2200 housing on the cable; the connector pin pitch is 4.2 mm (one sixth of an inch).
Four wires have special functions:
PS_ON# (power on) is a signal from the motherboard to the power supply. When the line is connected to ground (by the motherboard), the power supply turns on. It is internally pulled up to +5 V inside the power supply.
PWR_OK ("power good") is an output from the power supply that indicates that its output has stabilized and is ready for use. It remains low for a brief time (100–500 ms) after the PS_ON# signal is pulled low.
+5 VSB (+5 V standby) supplies power even when the rest of the supply wire lines are off. This can be used to power the circuitry that controls the power-on signal.
+3.3 V sense should be connected to the +3.3 V on the motherboard or its power connector. This connection allows remote sensing of the voltage drop in the power-supply wiring. Some manufacturers also provided a +5 V sense wire (typically colored pink) connected to one of the red +5 V wires on some models of power supply; however, the inclusion of such wire was a non-standard practice and was never part of any official ATX standard.
Generally, supply voltages must be within ±5% of their nominal values at all times. The little-used negative supply voltages, however, have a ±10% tolerance. The specification also sets ripple limits over a 10 Hz–20 MHz bandwidth.
The 20–24-pin Molex Mini-Fit Jr. connector is rated at 600 volts and a maximum of 8 amperes per pin (while using 18 AWG wire). As large server motherboards and 3D graphics cards have required progressively more power to operate, it has been necessary to revise and extend the standard beyond the original 20-pin connector, to allow more current using multiple additional pins in parallel. The low circuit voltage, not the current rating, is what restricts power flow through each connector pin: at the maximum rated voltage, a single Mini-Fit Jr. pin would be capable of carrying 4800 watts.
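The arithmetic behind these figures can be made explicit with a short calculation (illustrative only; the numbers are those quoted above):

```python
# Worked arithmetic from the figures above (illustrative only).
rails = {"+3.3 V": 3.3, "+5 V": 5.0, "+12 V": 12.0}
for name, v in rails.items():
    lo, hi = v * 0.95, v * 1.05          # ±5% tolerance window
    print(f"{name}: {lo:.2f} V to {hi:.2f} V")

pin_voltage_rating = 600   # volts, Mini-Fit Jr. rating
pin_current_rating = 8     # amperes per pin with 18 AWG wire
print("At the rated voltage:", pin_voltage_rating * pin_current_rating, "W per pin")  # 4800 W
print("At 12 V in practice:", 12 * pin_current_rating, "W per pin")                   # 96 W
```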
Physical characteristics
ATX power supplies generally have dimensions of 150 × 86 × 140 mm (width × height × depth), with the width and height being the same as the preceding LPX (Low Profile eXtension) form factor (which are often incorrectly referred to as "AT" power supplies due to their ubiquitous use in later AT and Baby AT systems, even though the actual AT and Baby AT power supply form factors were physically larger) and share a common mounting layout of four screws arranged on the back side of the unit. The last dimension, the 140 mm depth, is frequently varied, with depths of 160, 180, 200 and 230 mm used to accommodate higher power, larger fans and/or modular connectors.
Main changes from AT and LPX designs
Power switch
Original AT cases (flat case style) have an integrated power switch that protrudes from the power supply and sits flush with a hole in the AT chassis. It utilizes a paddle-style DPST switch and is similar to the PC and PC-XT style power supplies.
Later AT (so-called "Baby AT") and LPX style computer cases have a power button that is directly connected to the system's power supply unit (PSU). The general configuration is a double-pole latching mains voltage switch with the four pins connected to wires from a four-core cable. The wires are either soldered to the power button (making it difficult to replace the power supply if it failed) or connected via blade receptacles.
An ATX power supply is typically controlled by an electronic switch connected to the power button on the computer case, which allows the computer to be turned off by the operating system. In addition, many ATX power supplies have a manual switch on the back that also ensures no power is being sent to the components. When that switch is turned off, the power supply is fully off and the computer cannot be turned on with the front power button.
Power connection to the motherboard
The power supply's connection to the motherboard was changed from the older AT and LPX standards; AT and LPX had two similar connectors that could be accidentally interchanged by forcing the different keyed connectors into place, usually causing short-circuits and irreversible damage to the motherboard (the rule of thumb for safe operation was to connect the side-by-side connectors with the black wires together). ATX uses one large, keyed connector which can not be connected incorrectly. The new connector also provides a 3.3 volt source, removing the need for motherboards to derive this voltage from the 5 V rail. Some motherboards, particularly those manufactured after the introduction of ATX but while LPX equipment was still in use, support both LPX and ATX PSUs.
If using an ATX PSU for purposes other than powering an ATX motherboard, power can be fully turned on (it is always partly on to operate "wake-up" devices) by shorting the "power-on" pin on the ATX connector (pin 16, green wire) to a black wire (ground), which is what the power button on an ATX system does. A minimum load on one or more voltages may be required (varies by model and vendor); the standard does not specify operation without a minimum load and a conforming PSU may shut down, output incorrect voltages, or otherwise malfunction, but will not be hazardous or damaged. An ATX power supply is not a replacement for a current-limited bench laboratory DC power supply; instead it is better described as a bulk DC power supply.
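As a hypothetical illustration of that handshake, the sketch below drives PS_ON# and watches PWR_OK from a Raspberry Pi. The pin numbers, the transistor used to short PS_ON# to ground, and the level shifter for the 5 V PWR_OK signal are assumptions made for the example, not a tested or recommended design.

```python
# Hypothetical sketch: soft-starting an ATX supply from a Raspberry Pi.
# PS_ON# (green wire) must be pulled to ground to start the supply; PWR_OK is
# a 5 V signal, so level-shift it before reading it on a 3.3 V GPIO.
import time
import RPi.GPIO as GPIO

PS_ON_PIN = 17    # drives a transistor that shorts PS_ON# to ground (placeholder pin)
PWR_OK_PIN = 27   # reads PWR_OK ("power good") through a level shifter (placeholder pin)

GPIO.setmode(GPIO.BCM)
GPIO.setup(PS_ON_PIN, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(PWR_OK_PIN, GPIO.IN)

GPIO.output(PS_ON_PIN, GPIO.HIGH)          # assert: pull PS_ON# low via the transistor
deadline = time.time() + 1.0               # PWR_OK should rise within roughly 100-500 ms
while time.time() < deadline:
    if GPIO.input(PWR_OK_PIN):
        print("PWR_OK high: outputs stable")
        break
    time.sleep(0.01)
else:
    print("PWR_OK never asserted; shutting the supply back down")
    GPIO.output(PS_ON_PIN, GPIO.LOW)
```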
Airflow
The original ATX specification called for a power supply to be located near to the CPU with the power supply fan drawing in cooling air from outside the chassis and directing it onto the processor. It was thought that in this configuration, cooling of the processor would be achievable without the need of an active heatsink. This recommendation was removed from later specifications; modern ATX power supplies usually exhaust air from the case.
ATX power supply revisions
Original ATX
ATX, introduced in late 1995, defined three types of power connectors:
4-pin "Molex connector" – transferred directly from AT standard: +5 V and +12 V for P-ATA hard disks, CD-ROMs, 5.25 inch floppy drives and other peripherals.
4-pin Berg floppy connector – transferred directly from AT standard: +5 V and +12 V for 3.5 inch floppy drives and other peripherals.
20-pin Molex Mini-fit Jr. ATX motherboard connector – new to the ATX standard.
A supplemental 6-pin AUX connector providing additional 3.3 V and 5 V supplies to the motherboard, if needed. This was used to power the CPU in motherboards with CPU voltage regulator modules which required 3.3 volt and/or 5 volt rails and could not get enough power through the regular 20-pin header.
The power distribution specification defined that most of the PSU's power should be provided on 5 V and 3.3 V rails, because most of the electronic components (CPU, RAM, chipset, PCI, AGP and ISA cards) used 5 V or 3.3 V for power supply. The 12 V rail was only used by computer fans and motors of peripheral devices (HDD, FDD, CD-ROM, etc.)
ATX12V 1.x
While designing the Pentium 4 platform in 1999/2000, the standard 20-pin ATX power connector was found insufficient to meet increasing power-line requirements; the standard was significantly revised into ATX12V 1.0 (ATX12V 1.x is sometimes inaccurately called ATX-P4). ATX12V 1.x was also adopted by AMD Athlon XP and Athlon 64 systems. However, some early model Athlon XP and MP boards (including some server boards) and later model lower-end motherboards do not have the 4-pin connector as described below.
Numbering of the ATX revisions may be a little confusing: ATX refers to the design, and goes up to version 2.2 in 2004 (with the 24 pins of ATX12V 2.0), while ATX12V describes only the PSU.
For instance, ATX 2.03 is quite commonly seen on PSUs from 2000 and 2001 and often includes the P4 12V connector, even if the norm itself did not define it yet.
ATX12V 1.0
The main changes and additions in ATX12V 1.0 (released in February 2000) were:
Increased the power on the 12 V rail (power on 5 V and 3.3 V rails remained mostly the same).
An extra 4-pin mini fit JR (Molex 39-01-2040), 12-volt connector to power the CPU.
Formally called the +12 V Power Connector, this is commonly referred to as the P4 connector because this was first needed to support the Pentium 4 processor.
Before the Pentium 4, processors were generally powered from the 5 V rail. Later processors operate at much lower voltages, typically around 1 V and some draw over 100 A. It is infeasible to provide power at such low voltages and high currents from a standard system power supply, so the Pentium 4 established the practice of generating it with a DC-to-DC converter on the motherboard next to the processor, powered by the 4-pin 12 V connector.
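A back-of-the-envelope calculation shows why the 12 V feed is preferred. The 1 V, 100 A load comes from the figures above, while the 90% converter efficiency is an assumed value for illustration:

```python
# Illustrative arithmetic: why CPU power moved to the 12 V connector.
cpu_voltage, cpu_current = 1.0, 100.0
cpu_power = cpu_voltage * cpu_current              # 100 W at the CPU
vrm_efficiency = 0.90                              # assumed converter efficiency
input_power = cpu_power / vrm_efficiency           # ~111 W drawn by the on-board converter
current_from_12v = input_power / 12.0              # ~9.3 A over the 4-pin 12 V connector
current_if_fed_at_5v = input_power / 5.0           # ~22 A if the 5 V rail were used instead
print(round(current_from_12v, 1), round(current_if_fed_at_5v, 1))
```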
ATX12V 1.1
This is a minor revision from August 2000. The power on the 3.3 V rail was slightly increased and other smaller changes were made.
ATX12V 1.2
A relatively minor revision from January 2002. The only significant change was that the −5 V rail was no longer required (it became optional). This voltage was required by the ISA bus, which is absent from almost all modern computers.
ATX12V 1.3
Introduced in April 2003 (shortly after ATX12V 2.0). This standard introduced some changes, mostly minor. Some of them are:
Slightly increased the power on 12 V rail.
Defined minimal required PSU efficiencies for light and normal load.
Defined acoustic levels.
Introduction of Serial ATA power connector (but defined as optional).
Guidance for the −5 V rail was removed (but it was not prohibited).
ATX12V 2.x
ATX12V 2.x brought a significant design change regarding power distribution. By analyzing the power demands of then-current PCs, it was determined that it would be much cheaper and more practical to power most PC components from 12 V rails, instead of from 3.3 V and 5 V rails.
In particular, PCI Express expansion cards take much of their power from the 12 V rail (up to 5.5 A), while the older AGP graphics cards took only up to 1 A on 12 V and up to 6 A on 3.3 V. The CPU is also driven by a 12 V rail, while it was done by a 5 V rail on older PCs (before the Pentium 4).
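Using the slot currents quoted above, the shift is easy to quantify (illustrative arithmetic only):

```python
# Illustrative arithmetic using the slot currents quoted above.
pcie_slot_12v = 12.0 * 5.5          # up to 66 W drawn from the 12 V rail
agp_slot = 12.0 * 1.0 + 3.3 * 6.0   # up to 12 W + 19.8 W = 31.8 W, mostly from 3.3 V
print(f"PCIe slot from 12 V: {pcie_slot_12v:.1f} W")
print(f"AGP slot total: {agp_slot:.1f} W")
```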
ATX12V 2.0
The power demands of PCI Express were incorporated in ATX12V 2.0 (introduced in February 2003), which defined quite different power distribution from ATX12V 1.x:
Most power is now provided on 12 V rails. The standard specifies that two independent 12 V rails (12 V2 for the four-pin connector and 12 V1 for everything else) with independent overcurrent protection are needed to meet the power requirements safely (some very high power PSUs have more than two rails; recommendations for such large PSUs are not given by the standard).
The power on 3.3 V and 5 V rails was significantly reduced.
The ATX motherboard connector was extended to 24 pins. The extra four pins provide one additional 3.3 V, 5 V and 12 V circuit.
The six-pin AUX connector from ATX12V 1.x was removed because the extra 3.3 V and 5 V circuits which it provided are now incorporated in the 24-pin ATX motherboard connector.
The power supply is required to include a Serial ATA power cable.
Many other specification changes and additions
ATX12V v2.01
This is a minor revision from June 2004. An errant reference for the −5 V rail was removed. Other minor changes were introduced.
ATX12V v2.1
This is a minor revision from March 2005. The power was slightly increased on all rails. Efficiency requirements changed.
ATX12V v2.2
Also released in March 2005, it includes corrections and specifies High Current Series wire terminals for the 24-pin ATX motherboard and 4-pin +12 V power connectors.
ATX12V v2.3
Effective March 2007. Recommended efficiency was increased to 80% (with at least 70% required) and the 12 V minimum load requirement was lowered. Higher efficiency generally results in less power consumption (and less waste heat) and the 80% recommendation brings supplies in line with new Energy Star 4.0 mandates. The reduced load requirement allows compatibility with processors that draw very little power during startup. The absolute over-current limit of 240 VA per rail was removed, allowing 12 V lines to provide more than 20 A per rail.
ATX12V v2.31
This revision became effective in February 2008. It added a maximum allowed ripple/noise specification of 400 millivolts to the PWR_ON and PWR_OK signals, requires that the DC power must hold for more than 1 millisecond after the PWR_OK signal drops, clarified country-specific input line harmonic content and electromagnetic compatibility requirements, added a section about Climate Savers, updated recommended power supply configuration charts, and updated the cross-regulation graphs.
ATX12V v2.32
This is the unofficial name given to the later revisions of the v2.31 spec.
ATX12V v2.4
The ATX12V 2.4 specifications were published in April 2013. It is specified in Revision 1.31 of the 'Design Guide for Desktop Platform Form Factors', which names this as ATX12V version 2.4.
ATX12V v2.51
The specifications for ATX12V 2.51 were released in September 2017 and introduced support for Alternative Sleep Mode (ASM), which supersedes the traditional S3 power state. Windows 10 implements this functionality as Modern Standby.
ATX12V v2.52
The specifications for ATX12V 2.52 were released in June 2018 and introduce minor changes to the standard; most notably, power supply manufacturers must ensure that power supplies with Alternative Sleep Mode (ASM) support are able to withstand power cycles every 180 seconds (480 times per day, or 175,200 per year). Power supply fans are also recommended to turn on with at least a two-second delay for an improved user experience.
ATX12V v2.53
The specifications for ATX12V 2.53 were released in June 2020 and constitute another minor update to the ATX standard. ATX12V 2.53 makes further recommendations on efficiency and references the Energy Star Computers Specification Version 8.0, which was finalized in April 2020.
ATX power supply derivatives
ATX12VO
Standing for ATX 12-volt-only, this is a new specification published by Intel in 2019, aimed at pre-built systems in the first run, and possibly affecting DIY and "high expandability" systems (defined as a pre-built computer with a discrete GPU) when a market emerges. It was motivated by stricter power efficiency requirements by the California Energy Commission going into effect in 2021. Several OEMs were already using a similar design with proprietary connectors and this effectively standardizes those.
Under this standard, power supplies provide only a 12V output. ATX12VO introduces a new 10-pin connector to supply the motherboard, replacing the 24-pin ATX12V connector. This greatly simplifies power supplies, but moves DC-to-DC conversion and some connectors to the motherboard instead. Notably, SATA power connectors, which include 3.3V and 5V pins, need to move to the motherboard instead of being connected directly to the power supply.
SFX
SFX is merely a design for a small form factor (SFF) power supply casing (for use in small form factor systems such as those using microATX, FlexATX, nano-ITX, mini-ITX, and NLX), with power specifications almost identical to ATX. Thus, an SFX power supply is mostly pin-compatible with the ATX power supply; the main difference is its reduced dimensions, and the only electrical difference is that the SFX specifications do not require the −5 V rail. Since −5 V is required only by some ISA-bus expansion cards, this is not an issue with modern hardware and decreases production costs. As a result, ATX pin 20, which carried −5 V, is absent in current power supplies; it was optional in ATX and ATX12V version 1.2 and deleted as of ATX version 1.3.
SFX has dimensions of 125 × 63.5 × 100 mm (width × height × depth), with a 60 mm fan, compared with the standard ATX dimensions of 150 × 86 × 140 mm. Optional 80 or 40 mm fan replacement increases or decreases the height of an SFX unit.
Some manufacturers and retailers incorrectly market SFX power supplies as μATX or MicroATX power supplies.
Some manufacturers also make SFX-L units with dimensions of 125 × 63.5 × 130 mm to accommodate a 120 mm fan.
TFX
Thin Form Factor is another small power supply design with standard ATX specification connectors. Generally dimensioned (W × H × D): 85 × 64 × 175 mm (3.34 × 2.52 × 6.89 in).
WTX
WTX power supplies provide a WTX style motherboard connector which is incompatible with the standard ATX motherboard connector.
AMD GES
This is an ATX12V power supply derivative made by AMD to power its Athlon MP (dual processor) platform. It was used only on high-end Athlon MP motherboards. It has a special 8-pin supplemental connector for motherboard, so an AMD GES PSU is required for such motherboards (those motherboards will not work with ATX(12 V) PSUs).
The AMD GES power supply uses a 24-pin P1 motherboard connector and an 8-pin P2 motherboard connector.
EPS12V
EPS12V is defined in Server System Infrastructure (SSI) and used primarily by SMP/multi-core systems such as Core 2, Core i7, Opteron and Xeon. It has a 24-pin ATX motherboard connector (same as ATX12V v2.x), an 8-pin secondary connector and an optional 4-pin tertiary connector. Rather than include the extra cable, many power supply makers implement the 8-pin connector as two combinable 4-pin connectors to ensure backwards compatibility with ATX12V motherboards.
Recent specification changes and additions
High-performance video card power demands dramatically increased during the 2000s and some high-end graphics cards have power demands that exceed AGP or PCIe slot capabilities. For these cards, supplementary power was delivered through a standard 4-pin peripheral or floppy power connector. Midrange and high-end PCIe graphics cards manufactured after 2004 typically use a standard 6 or 8-pin PCIe power connector directly from the PSU.
Interchanging PSUs
Although the ATX power supply specifications are mostly vertically compatible in both ways (both electrically and physically), there are potential issues with mixing old motherboards/systems with new PSUs and vice versa. The main issues to consider are the following:
The power allocation between 3.3 V, 5 V and 12 V rails is very different between older and newer ATX PSU designs, as well as between older and newer PC system designs.
Older PSUs may not have connectors which are required for newer PC systems to properly operate.
Newer systems generally have higher power requirements than older systems.
The following is practical guidance on what to mix and what not to mix:
Older systems (before Pentium 4 and Athlon XP platforms) were designed to draw most power from 5 V and 3.3 V rails.
Because of the DC-DC converters on the motherboard that convert 12 V to the low voltages required by the Intel Pentium 4 and AMD Athlon XP (and subsequent) processors, such systems draw most of their power from the 12 V rail.
Original ATX PSUs have power distribution designed for pre-P4/XP PCs. They lack the supplemental 4-pin 12-volt CPU power connector, so they most likely cannot be used with P4/XP or newer motherboards. Adapters do exist but power drain on the 12 V rail must be checked very carefully. There is a chance it can work without connecting the 4-pin 12 V connector, but caution is advised.
ATX12V 1.x PSUs have power distribution designed for P4/XP PCs, but they are also well suited to older PCs, since they provide plenty of power (relative to old PCs' needs) both on 12 V and on 5 V/3.3 V. It is not recommended to use ATX12V 1.x PSUs on ATX12V 2.x motherboards because those systems require much more power on 12 V than ATX12V 1.x PSUs provide.
ATX12V 2.x PSUs have power distribution designed for late P4/XP PCs and for Athlon 64 and Core Duo PCs. They can be used with earlier P4/XP PCs, but the power distribution will be significantly suboptimal, so a more powerful ATX12V 2.0 PSU should be used to compensate for that discrepancy. ATX12V 2.x PSUs can also be used with pre-P4/XP systems, but the power distribution will be greatly suboptimal (12 V rails will be mostly unused, while the 3.3 V/5 V rails will be overloaded), so this is not recommended.
Systems that use an ISA bus should have a PSU that provides the −5 V rail, which became optional in ATX12V 1.2 and was subsequently phased out by manufacturers.
Some proprietary brand-name systems require a matching proprietary power supply, but some of them may also support standard and interchangeable power supplies.
Efficiency
Efficiency in power supplies means the extent to which power is not wasted in converting electricity from a household supply to regulated DC. Computer power supplies vary from around 70% to over 90% efficiency.
Various initiatives exist to improve the efficiency of computer power supplies. Climate Savers Computing Initiative promotes energy saving and reduction of greenhouse gas emissions by encouraging development and use of more efficient power supplies. 80 PLUS certifies a variety of efficiency levels for power supplies and encourages their use via financial incentives. Efficient power supplies also save money by wasting less power; as a result they use less electricity to power the same computer, and they emit less waste heat which results in significant energy savings on central air conditioning in the summer. The gains of using an efficient power supply are more substantial in computers that use a lot of power.
Although a power supply with a larger than needed power rating will have an extra margin of safety against overloading, such a unit is often less efficient and wastes more electricity at lower loads than a more appropriately sized unit. For example, a 900-watt power supply with the 80 Plus Silver efficiency rating (which means that such a power supply is designed to be at least 85-percent efficient for loads above 180 W) may only be 73% efficient when the load is lower than 100 W, which is a typical idle power for a desktop computer. Thus, for a 100 W load, losses for this supply would be 27 W; if the same power supply was put under a 450 W load, for which the supply's efficiency peaks at 89%, the loss would be only 56 W despite supplying 4.5 times the useful power. For a comparison, a 500-watt power supply carrying the 80 Plus Bronze efficiency rating (which means that such a power supply is designed to be at least 82-percent efficient for loads above 100 W) may provide an 84-percent efficiency for a 100 W load, wasting only 19 W.
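The loss figures in this example follow from the efficiency definition; the small sketch below reproduces them, noting that "load" can be read either as power delivered to the PC or as power drawn from the wall:

```python
# Sketch of the loss arithmetic; "load" can mean output power (delivered to the
# PC) or input power (drawn from the wall), so both readings are shown.
def loss_for_output(p_out: float, eff: float) -> float:
    return p_out / eff - p_out        # input power minus output power

def loss_for_input(p_in: float, eff: float) -> float:
    return p_in * (1 - eff)

print(round(loss_for_input(100, 0.73)))   # 27 W wasted from a 100 W draw at 73%
print(round(loss_for_output(450, 0.89)))  # ~56 W wasted delivering 450 W at 89%
print(round(loss_for_output(100, 0.84)))  # ~19 W wasted delivering 100 W at 84%
```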
See also
AT (form factor)
BTX (form factor)
Computer form factor
Mini-ITX
PC 97
Power supply unit (computer)
SSI CEB
Notes
References
External links
ATX Motherboard Specifications
ATX Motherboard Specification, v1.1
ATX Motherboard Specification, v2.2
ATX Power Supply Specifications
ATX12V Power Supply Design Guide, v2.01
ATX12V Power Supply Design Guide, v2.2
ATX12V Power Supply Design Guide, v2.3
ATX12V Power Supply Design Guide, v2.31
ATX12V Power Supply Design Guide, v2.4
ATX12V Power Supply Design Guide, v2.5 (Revision 002 / v2.52)
ATX12V Power Supply Design Guide, v2.5 (Revision 003 / v2.53)
EPS Power Supply Specifications
EPS12V Power Supply Design Guide, v2.0
EPS12V Power Supply Design Guide, v2.91
EPS12V Power Supply Design Guide v2.92
Other
Power Supply Form Factors
Various power supply cables and connectors
A short history of power supply voltage rails
ATX power supply connectors with pinouts
More ATX power supply connectors with pinouts
ATX Power Supply Terminology
Motherboard form factors
IBM PC compatibles |
1228060 | https://en.wikipedia.org/wiki/Internet%20privacy | Internet privacy | Internet privacy involves the right or mandate of personal privacy concerning the storing, repurposing, provision to third parties, and displaying of information pertaining to oneself via Internet. Internet privacy is a subset of data privacy. Privacy concerns have been articulated from the beginnings of large-scale computer sharing.
Privacy can entail either personally identifiable information (PII) or non-PII information such as a site visitor's behaviour on a website. PII refers to any information that can be used to identify an individual. For example, age and physical address alone could typically identify who an individual is without explicitly disclosing their name, as these two factors are unique enough to identify a specific person. Other forms of PII may soon include GPS tracking data used by apps, as the daily commute and routine information can be enough to identify an individual.
It has been suggested that the "appeal of online services is to broadcast personal information on purpose." On the other hand, in his essay "The Value of Privacy", security expert Bruce Schneier says, "Privacy protects us from abuses by those in power, even if we're doing nothing wrong at the time of surveillance."
Levels of privacy
Internet and digital privacy are viewed differently from traditional expectations of privacy. Internet privacy is primarily concerned with protecting user information. Law Professor Jerry Kang explains that the term privacy expresses space, decision, and information. In terms of space, individuals have an expectation that their physical spaces (e.g. homes, cars) not be intruded. Privacy within the realm of decision is best illustrated by the landmark case Roe v. Wade. Lastly, information privacy is in regards to the collection of user information from a variety of sources, which produces great discussion.
In the United States, the 1997 Information Infrastructure Task Force (IITF) created under President Clinton defined information privacy as "an individual's claim to control the terms under which personal information — information identifiable to the individual — is acquired, disclosed, and used." At the end of the 1990s, with the rise of the internet, it became clear that governments, companies, and other organisations would need to abide by new rules to protect individuals' privacy. With the rise of the internet and mobile networks internet privacy is a daily concern for users.
People with only a casual concern for Internet privacy need not achieve total anonymity. Internet users may protect their privacy through controlled disclosure of personal information. The revelation of IP addresses, non-personally-identifiable profiling, and similar information might become acceptable trade-offs for the convenience that users could otherwise lose using the workarounds needed to suppress such details rigorously. On the other hand, some people desire much stronger privacy. In that case, they may try to achieve Internet anonymity to ensure privacy — use of the Internet without giving any third parties the ability to link the Internet activities to personally-identifiable information of the Internet user. In order to keep their information private, people need to be careful with what they submit to and look at online. When filling out forms and buying merchandise, this information is tracked, and because it is not private, some companies send Internet users spam and advertising for similar products.
There are also several governmental organizations that protect an individual's privacy and anonymity on the Internet, to a point. In an article presented by the FTC in October 2011, a number of pointers were brought to attention that help an individual internet user avoid possible identity theft and other cyber-attacks. Preventing or limiting the usage of Social Security numbers online, being wary and respectful of emails including spam messages, being mindful of personal financial details, creating and managing strong passwords, and intelligent web-browsing behaviors are recommended, among others.
Posting things on the Internet can be harmful or expose people to malicious attacks. Some information posted on the Internet persists for decades, depending on the terms of service and privacy policies of particular services offered online. This can include comments written on blogs, pictures, and websites, such as Facebook and Twitter. It is absorbed into cyberspace and once it is posted, anyone can potentially find it and access it. Some employers may research a potential employee by searching online for the details of their online behaviors, possibly affecting the candidate's chances of success.
Risks of Internet privacy
Companies are hired to track which websites people visit and then use the information, for instance by sending advertising based on one's web browsing history. There are many ways in which people can divulge their personal information, for instance by use of "social media" and by sending bank and credit card information to various websites. Moreover, directly observed behaviour, such as browsing logs, search queries, or contents of the Facebook profile can be automatically processed to infer potentially more intrusive details about an individual, such as sexual orientation, political and religious views, race, substance use, intelligence, and personality.
Those concerned about Internet privacy often cite a number of privacy risks — events that can compromise privacy — which may be encountered through online activities. These range from the gathering of statistics on users to more malicious acts such as the spreading of spyware and the exploitation of various forms of bugs (software faults).
Several social networking websites try to protect the personal information of their subscribers, as well as provide a warning through a privacy and terms agreement. On Facebook, for example, privacy settings are available to all registered users: they can block certain individuals from seeing their profile, they can choose their "friends", and they can limit who has access to their pictures and videos. Privacy settings are also available on other social networking websites such as Google Plus and Twitter. The user can apply such settings when providing personal information on the Internet. The Electronic Frontier Foundation has created a set of guides so that users may more easily use these privacy settings and Zebra Crossing: an easy-to-use digital safety checklist is a volunteer-maintained online resource.
In late 2007, Facebook launched the Beacon program in which user rental records were released to the public for friends to see. Many people were enraged by this breach of privacy, and the Lane v. Facebook, Inc. case ensued.
Children and adolescents often use the Internet (including social media) in ways that risk their privacy: a cause for growing concern among parents. Young people also may not realize that all their information and browsing can and may be tracked while visiting a particular site and that it is up to them to protect their own privacy. They must be informed about all these risks. For example, on Twitter, threats include shortened links that may lead to potentially harmful websites or content. Email threats include email scams and attachments that persuade users to install malware and disclose personal information. On Torrent sites, threats include malware hiding in video, music, and software downloads. When using a smartphone, threats include geolocation, meaning that one's phone can detect one's location and post it online for all to see. Users can protect themselves by updating virus protection, using security settings, downloading patches, installing a firewall, screening email, shutting down spyware, controlling cookies, using encryption, fending off browser hijackers, and blocking pop-ups.
However most people have little idea how to go about doing these things. Many businesses hire professionals to take care of these issues, but most individuals can only do their best to educate themselves.
In 1998, the Federal Trade Commission in the US considered the lack of privacy for children on the internet and created the Children's Online Privacy Protection Act (COPPA). COPPA limits websites' options for gathering information from children and created warning labels for potentially harmful information or content. In 2000, the Children's Internet Protection Act (CIPA) was developed to implement Internet safety policies. Policies required taking technology protection measures that can filter or block children's Internet access to pictures that are harmful to them. Schools and libraries need to follow these requirements in order to receive discounts from the E-rate program. These laws, awareness campaigns, parental and adult supervision strategies, and Internet filters can all help to make the Internet safer for children around the world.
The privacy concerns of Internet users pose a serious challenge (Dunkan, 1996; Till, 1997). Owing to advances in technology, access to the internet has become easier from any device at any time. However, the increase of access from multiple sources increases the number of access points for an attack. In an online survey, approximately seven out of ten individuals responded that what worries them most is their privacy over the Internet, rather than over the mail or phone. Threats to Internet privacy are slowly but surely growing, as a person's personal data may slip into the wrong hands if passed around through the Web.
Internet protocol (IP) addresses
All websites receive and many track the IP address of a visitor's computer. Companies match data over time to associate the name, address, and other information to the IP address. There is ambiguity about how private IP addresses are. The Court of Justice of the European Union has ruled they need to be treated as personally identifiable information if the website tracking them, or a third party like a service provider, knows the name or street address of the IP address holder, which would be true for static IP addresses, not for dynamic addresses.
California regulations say IP addresses need to be treated as personal information if the business itself, not a third party, can link them to name and street address.
An Alberta court ruled that police can obtain the IP addresses and the names and addresses associated with them without a search warrant; the Calgary, Alberta police found IP addresses that initiated online crimes. The service provider gave police the names and addresses associated with those IP addresses.
HTTP cookies
An HTTP cookie is data stored on a user's computer that assists in automated access to websites or web features, or other state information required in complex web sites. It may also be used for user-tracking by storing special usage history data in a cookie, and such cookies — for example, those used by Google Analytics — are called tracking cookies. Cookies are a common concern in the field of Internet privacy. Although website developers most commonly use cookies for legitimate technical purposes, cases of abuse occur. In 2009, two researchers noted that social networking profiles could be connected to cookies, allowing the social networking profile to be connected to browsing habits.
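For readers unfamiliar with the mechanism, the following minimal Python sketch shows how a server asks a browser to store a cookie and then recognizes it on later visits. The cookie name, value, and port are invented for the example; production sites typically set additional attributes such as Secure and HttpOnly.

```python
# Minimal illustration of how an HTTP cookie is set and returned, using only
# the standard library. Names and values are invented for the example.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        seen = "visitor_id=" in (self.headers.get("Cookie") or "")
        self.send_response(200)
        if not seen:
            # First visit: the server asks the browser to store an identifier,
            # which the browser will send back on every later request.
            self.send_header("Set-Cookie", "visitor_id=abc123; Path=/; Max-Age=31536000")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"returning visitor\n" if seen else b"first visit\n")

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CookieDemo).serve_forever()
```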
In the past, websites have not generally made the user explicitly aware of the storing of cookies; however, tracking cookies and especially third-party tracking cookies are commonly used as ways to compile long-term records of individuals' browsing histories — a privacy concern that prompted European and US lawmakers to take action in 2011. Cookies can also have implications for computer forensics. In past years, most computer users were not completely aware of cookies, but users have become conscious of the possible detrimental effects of Internet cookies: a recent study has shown that 58% of users have deleted cookies from their computer at least once, and that 39% of users delete cookies from their computer every month. Since cookies are advertisers' main way of targeting potential customers, and some customers are deleting cookies, some advertisers started to use persistent Flash cookies and zombie cookies, but modern browsers and anti-malware software can now block or detect and remove such cookies.
The original developers of cookies intended that only the website that originally distributed cookies to users could retrieve them, therefore returning only data already possessed by the website. However, in practice programmers can circumvent this restriction. Possible consequences include:
the placing of a personally-identifiable tag in a browser to facilitate web profiling, or
use of cross-site scripting or other techniques to steal information from a user's cookies.
Cookies do have benefits. One is that for websites that one frequently visits that require a password, cookies may allow a user to not have to sign in every time. A cookie can also track one's preferences to show them websites that might interest them. Cookies make more websites free to use without any type of payment. Some of these benefits are also seen as negative. For example, one of the most common ways of theft is hackers taking one's username and password that a cookie saves. While many sites are free, they sell their space to advertisers. These ads, which are personalized to one's likes, can sometimes freeze one's computer or cause annoyance. Cookies are mostly harmless except for third-party cookies. These cookies are not made by the website itself but by web banner advertising companies. These third-party cookies are dangerous because they take the same information that regular cookies do, such as browsing habits and frequently visited websites, but then they share this information with other companies.
Cookies are often associated with pop-up windows because these windows are often, but not always, tailored to a person's preferences. These windows are an irritation because the close button may be strategically hidden in an unlikely part of the screen. In the worst cases, these pop-up ads can take over the screen and while one tries to close them, they can take one to another unwanted website.
Cookies are viewed so negatively because they are not understood and go unnoticed while someone is simply surfing the internet. The idea that every move one makes while on the internet is being watched would frighten most users.
Some users choose to disable cookies in their web browsers. Such an action can reduce some privacy risks, but may severely limit or prevent the functionality of many websites. All significant web browsers have this disabling ability built-in, with no external program required. As an alternative, users may frequently delete any stored cookies. Some browsers (such as Mozilla Firefox and Opera) offer the option to clear cookies automatically whenever the user closes the browser. A third option involves allowing cookies in general, but preventing their abuse. There are also a host of wrapper applications that will redirect cookies and cache data to some other location. Concerns exist that the privacy benefits of deleting cookies have been over-stated.
The process of profiling (also known as "tracking") assembles and analyzes several events, each attributable to a single originating entity, in order to gain information (especially patterns of activity) relating to the originating entity. Some organizations engage in the profiling of people's web browsing, collecting the URLs of sites visited. The resulting profiles can potentially link with information that personally identifies the individual who did the browsing.
Some web-oriented marketing-research organizations may use this practice legitimately, for example: in order to construct profiles of "typical internet users". Such profiles, which describe average trends of large groups of internet users rather than of actual individuals, can then prove useful for market analysis. Although the aggregate data does not constitute a privacy violation, some people believe that the initial profiling does.
Profiling becomes a more contentious privacy issue when data-matching associates the profile of an individual with personally-identifiable information of the individual.
Governments and organizations may set up honeypot websites – featuring controversial topics – with the purpose of attracting and tracking unwary people. This constitutes a potential danger for individuals.
Flash cookies
When some users chose to disable HTTP cookies to reduce the privacy risks noted above, new types of cookies were invented: since cookies are advertisers' main way of targeting potential customers, and some customers were deleting cookies, some advertisers started to use persistent Flash cookies and zombie cookies. A 2009 study found Flash cookies to be a popular mechanism for storing data on the top 100 most visited sites. A 2011 study of social media found that, "Of the top 100 web sites, 31 had at least one overlap between HTTP and Flash cookies." However, modern browsers and anti-malware software can now block or detect and remove such cookies.
Flash cookies, also known as local shared objects, work in much the same way as normal cookies and are used by the Adobe Flash Player to store information on the user's computer. They pose a similar privacy risk as normal cookies, but are not as easily blocked, meaning that the option in most browsers to not accept cookies does not affect Flash cookies. One way to view and control them is with browser extensions or add-ons.
Flash cookies differ from HTTP cookies in that they are not automatically transferred from the client back to the server with each request; instead, web content reads and writes them through the Flash Player, and they can be used to track data about web usage.
Although browsers such as Internet Explorer 8 and Firefox 3 added a private browsing setting, they still allowed Flash cookies to track the user and operate fully. However, the Flash Player browser plugin can be disabled or uninstalled, and Flash cookies can be disabled on a per-site or global basis. Adobe's Flash Player and (PDF) Reader are not the only browser plugins whose past security defects have allowed spyware or malware to be installed: there have also been problems with Oracle's Java.
Evercookies
Evercookies, created by Samy Kamkar, are JavaScript-based applications which produce cookies in a web browser that actively "resist" deletion by redundantly copying themselves in different forms on the user's machine (e.g., Flash local shared objects, various HTML5 storage mechanisms, window.name caching, etc.), and resurrecting copies that are missing or expired. Evercookie accomplishes this by storing the cookie data in several types of storage mechanisms that are available in the local browser. It has the ability to store cookies in over ten types of storage mechanisms, so that once they are on a computer they are very difficult to remove. Additionally, if evercookie finds that the user has removed any of the cookies in question, it recreates them using each mechanism available. Evercookies are one type of zombie cookie. However, modern browsers and anti-malware software can now block or detect and remove such cookies.
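The browser-side TypeScript below is a minimal sketch of the redundancy idea only; it is not Kamkar's actual evercookie library and uses just three of the many storage mechanisms the real tool employs (an HTTP cookie, localStorage and window.name). The identifier name visitor_id is an arbitrary choice for the example.

```typescript
// Sketch: store the same identifier in several places and restore whichever
// copies the user has deleted. Real evercookies use many more mechanisms.
const KEY = "visitor_id"; // arbitrary name chosen for this example

function readAny(): string | null {
  const fromCookie =
    document.cookie
      .split(";")
      .map((c) => c.trim())
      .find((c) => c.startsWith(KEY + "="))
      ?.split("=")[1] ?? null;
  const fromStorage = localStorage.getItem(KEY);
  const fromWindowName = window.name.startsWith(KEY + "=")
    ? window.name.slice(KEY.length + 1)
    : null;
  return fromCookie ?? fromStorage ?? fromWindowName;
}

function writeEverywhere(id: string): void {
  document.cookie = `${KEY}=${id}; max-age=31536000; path=/`;
  localStorage.setItem(KEY, id);
  window.name = `${KEY}=${id}`;
}

// Resurrect: if any copy survives, re-propagate it to all mechanisms.
const id = readAny() ?? crypto.randomUUID();
writeEverywhere(id);
```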
Anti-fraud uses
Some anti-fraud companies have realized the potential of evercookies to protect against and catch cyber criminals. These companies already hide small files in several places on the perpetrator's computer but hackers can usually easily get rid of these. The advantage to evercookies is that they resist deletion and can rebuild themselves.
Advertising uses
There is controversy over where the line should be drawn on the use of this technology. Cookies store unique identifiers on a person's computer that are used to predict what one wants. Many advertisement companies want to use this technology to track what their customers are looking at online. This is known as online behavioral advertising which allows advertisers to keep track of the consumer's website visits to personalize and target advertisements. Evercookies enable advertisers to continue to track a customer regardless of whether their cookies are deleted or not. Some companies are already using this technology but the ethics are still being widely debated.
Criticism
Anonymizer "nevercookies" are part of a free Firefox plugin that protects against evercookies. This plugin extends Firefox's private browsing mode so that users will be completely protected from evercookies. Nevercookies eliminate the entire manual deletion process while keeping the cookies users want like browsing history and saved account information.
Device fingerprinting
A device fingerprint is information collected about the software and hardware of a remote computing device for the purpose of identifying individual devices even when persistent cookies (and also zombie cookies) can't be read or stored in the browser, the client IP address is hidden, and even if one switches to another browser on the same device.
This may allow a service provider to detect and prevent identity theft and credit card fraud, but also to compile long-term records of individuals' browsing histories even when they're attempting to avoid tracking, raising a major concern for internet privacy advocates.
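The deliberately simplified browser-side TypeScript sketch below illustrates the general idea: a handful of software and hardware attributes are concatenated and hashed into a quasi-stable identifier. Real fingerprinting scripts combine far more signals (canvas and audio rendering, installed fonts, plugin lists, and so on) and correlate them server-side.

```typescript
// Simplified fingerprint: combine a few browser/hardware attributes and hash
// them. Actual trackers use many more signals than shown here.
async function deviceFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,
    navigator.language,
    String(navigator.hardwareConcurrency),
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
    String(new Date().getTimezoneOffset()),
  ].join("|");

  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

deviceFingerprint().then((id) => console.log("fingerprint:", id));
```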
Third-party requests
Third-party requests are HTTP data connections from client devices to addresses on the web that are different from the website the user is currently visiting. Many alternative tracking technologies to cookies are based on third-party requests. Their importance has increased in recent years and even accelerated after Mozilla (2019), Apple (2020), and Google (2022) announced that they would block third-party cookies by default. Third-party requests may be used for embedding external content (e.g. advertisements) or for loading external resources and functions (e.g. images, icons, fonts, captchas, jQuery resources and many others). Depending on the type of resource loaded, such requests may enable third parties to execute a device fingerprint or place any other kind of marketing tag. Irrespective of the intention, such requests often disclose information that may be sensitive, and they can be used for tracking either directly or in combination with other personally identifiable information. Most of the requests disclose referrer details that reveal the full URL of the website actually visited. In addition to the referrer URL, further information may be transmitted by the use of other request methods such as HTTP POST. Since 2018, Mozilla has partially mitigated the risk of third-party requests by cutting the referrer information when the private browsing mode is used. However, personal information may still be revealed to the requested address in other areas of the HTTP header.
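As a rough illustration, the browser-side TypeScript snippet below shows how a page can limit what a third-party request discloses by setting a referrer policy on the request itself; the third-party host thirdparty.example is hypothetical. Sites can achieve the same effect declaratively with a Referrer-Policy response header or a referrerpolicy attribute on the embedding tag.

```typescript
// Default behaviour: a third-party fetch typically carries a Referer header
// revealing (at least part of) the URL of the page the user is viewing.
fetch("https://thirdparty.example/pixel.gif"); // hypothetical tracker URL

// Mitigation: ask the browser not to attach any referrer information.
fetch("https://thirdparty.example/pixel.gif", { referrerPolicy: "no-referrer" });

// Equivalent for embedded content:
//   <img src="https://thirdparty.example/pixel.gif" referrerpolicy="no-referrer">
// or a site-wide HTTP response header:
//   Referrer-Policy: strict-origin-when-cross-origin
```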
Photographs on the Internet
Today many people have digital cameras and post their photographs online; street photography practitioners, for example, do so for artistic purposes, while social documentary photography practitioners do so to document people in everyday life. The people depicted in these photos might not want them to appear on the Internet. Police arrest photos, considered public record in many jurisdictions, are often posted on the Internet by online mug shot publishing sites.
Some organizations attempt to respond to this privacy-related concern. For example, the 2005 Wikimania conference required that photographers have the prior permission of the people in their pictures, although this made it impossible for photographers to practice candid photography, and doing the same in a public place would violate the photographers' free speech rights. Some people wore a "no photos" tag to indicate they would prefer not to have their photo taken.
The Harvard Law Review published a short piece called "In The Face of Danger: Facial Recognition and Privacy Law", much of it explaining how "privacy law, in its current form, is of no help to those unwillingly tagged." Any individual can be unwillingly tagged in a photo and displayed in a manner that might harm them personally in some way, and by the time Facebook takes the photo down, many people will already have had the chance to view, share, or distribute it. Furthermore, traditional tort law does not protect people who are captured by a photograph in public because this is not counted as an invasion of privacy. The extensive Facebook privacy policy covers these concerns and much more. For example, the policy states that Facebook reserves the right to disclose member information or share photos with companies, lawyers, courts, government entities, etc. if it feels this absolutely necessary. The policy also informs users that profile pictures are mainly to help friends connect to each other. However, these, as well as other pictures, can allow other people to invade a person's privacy by finding out information that can be used to track and locate a certain individual. In an article featured in ABC News, it was stated that two teams of scientists found that Hollywood stars could be giving up information about their private whereabouts very easily through pictures uploaded to the internet. Moreover, it was found that pictures taken by some phones and tablets, including iPhones, automatically attach the latitude and longitude of the picture through metadata unless this function is manually disabled.
Face recognition technology can be used to gain access to a person's private data, according to a new study. Researchers at Carnegie Mellon University combined image scanning, cloud computing and public profiles from social network sites to identify individuals in the offline world. Data captured even included a user's social security number. Experts have warned of the privacy risks faced by the increased merging of online and offline identities. The researchers have also developed an 'augmented reality' mobile app that can display personal data over a person's image captured on a smartphone screen. Since these technologies are widely available, users' future identities may become exposed to anyone with a smartphone and an internet connection. Researchers believe this could force a reconsideration of future attitudes to privacy.
Google Street View
Google Street View, released in the U.S. in 2007, is currently the subject of an ongoing debate about possible infringement on individual privacy. In an article entitled "Privacy, Reconsidered: New Representations, Data Practices, and the Geoweb", Sarah Elwood and Agnieszka Leszczynski (2011) argue that Google Street View "facilitate[s] identification and disclosure with more immediacy and less abstraction." The medium through which Street View disseminates information, the photograph, is very immediate in the sense that it can potentially provide direct information and evidence about a person's whereabouts, activities, and private property. Moreover, the technology's disclosure of information about a person is less abstract in the sense that, if photographed, a person is represented on Street View in a virtual replication of his or her own real-life appearance. In other words, the technology removes abstractions of a person's appearance or that of his or her personal belongings – there is an immediate disclosure of the person and object, as they visually exist in real life. Although Street View began to blur license plates and people's faces in 2008, the technology is faulty and does not entirely ensure against accidental disclosure of identity and private property.
Elwood and Leszczynski note that "many of the concerns leveled at Street View stem from situations where its photograph-like images were treated as definitive evidence of an individual's involvement in particular activities." In one instance, Ruedi Noser, a Swiss politician, barely avoided public scandal when he was photographed in 2009 on Google Street View walking with a woman who was not his wife – the woman was actually his secretary. Similar situations arise when Street View provides high-resolution photographs – and photographs hypothetically offer compelling objective evidence. But as the case of the Swiss politician illustrates, even supposedly compelling photographic evidence is sometimes subject to gross misinterpretation. This example further suggests that Google Street View may provide opportunities for privacy infringement and harassment through public dissemination of the photographs. Google Street View does, however, blur or remove photographs of individuals and private property from image frames if the individuals request further blurring and/or removal of the images. This request can be submitted, for review, through the "report a problem" button located on the bottom left-hand side of every image window on Google Street View; however, Google has made reporting a problem more difficult by disabling the "Why are you reporting the street view" icon.
Search engines
Search engines have the ability to track a user's searches. Personal information can be revealed through searches by the user's computer, account, or IP address being linked to the search terms used. Search engines have claimed a necessity to retain such information in order to provide better services, protect against security threats, and protect against fraud.
A search engine takes all of its users and assigns each one a specific ID number. Those in control of the database often keep records of where on the internet each member has traveled to. AOL's system is one example. AOL has a database 21 million members deep, each with their own specific ID number. The way that AOLSearch is set up, however, allows for AOL to keep records of all the websites visited by any given member. Even though the true identity of the user isn't known, a full profile of a member can be made just by using the information stored by AOLSearch. By keeping records of what people query through AOLSearch, the company is able to learn a great deal about them without knowing their names.
Search engines also are able to retain user information, such as location and time spent using the search engine, for up to ninety days. Most search engine operators use the data to get a sense of which needs must be met in certain areas of their field. People working in the legal field are also allowed to use information collected from these search engine websites. The Google search engine is given as an example of a search engine that retains the information entered for roughly nine months before it becomes obsolete for public usage. Yahoo! follows in the footsteps of Google in that it also deletes user information after ninety days. Other search engines, such as Ask.com, have promoted tools such as "AskEraser" which essentially remove personal information when requested.
Some changes have also been made to Google's own search engine. Beginning in 2009, Google began to run a new system in which search became personalized: the items that are searched and the results that are shown take into account previous information that pertains to the individual. Google's search engine not only records what is searched, but also strives to make users feel that the search engine recognizes their interests. This is achieved through online advertising. One system that Google uses to filter advertisements and search results that might interest the user is a ranking system that tests relevancy, which includes observing the behavior users exhibit while searching on Google. Another function of search engines is the prediction of location: search engines are able to predict where one is currently located by examining IP addresses and geographical data.
Google publicly stated on January 24, 2012, that its privacy policy would once again be altered. This new policy would change the following for its users: (1) the privacy policy would become shorter and easier to comprehend and (2) the information that users provide would be used in more ways than it had been previously. The stated goal of Google was to make users' experiences better than they currently were.
This new privacy policy was planned to come into effect on March 1, 2012. Peter Fleischer, the Global Privacy Counselor for Google, explained that if a person is logged into his/her Google account, and only if he/she is logged in, information will be gathered from the multiple Google services he/she has used in order to be more accommodating. Google's new privacy policy combines all data used across Google's services (e.g., YouTube and Gmail) in order to work along the lines of a person's interests. A person, in effect, will be able to find what he/she wants more efficiently because all information searched during times of login will help to narrow down new search results.
Google's privacy policy explains what information it collects and why, how it uses the information, and how users can access and update their information. Google collects information to better serve its users, such as their language, which ads they find useful, or which people are important to them online. Google announced it would use this information to provide, maintain, and protect Google and its users. The information Google uses gives users more relevant search results and advertisements. The new privacy policy explains that Google can use shared information from one service in other Google services for people who have a Google account and are logged in. Google will treat a user as a single user across all of its products. Google claims the new privacy policy benefits its users by being simpler. Google will, for example, be able to correct the spelling of a user's friend's name in a Google search or notify a user that they are late based on their calendar and current location. Even though Google is updating its privacy policy, its core privacy guidelines will not change. For example, Google does not sell personal information or share it externally.
Users and public officials have raised many concerns regarding Google's new privacy policy. The main concern/issue involves the sharing of data from multiple sources. Because this policy gathers all information and data searched from multiple engines when logged into Google, and uses it to help assist users, privacy becomes an important element. Public officials and Google account users are worried about online safety because of all this information being gathered from multiple sources.
Some users do not like the overlapping privacy policy, wishing to keep the services of Google separate. The update to Google's privacy policy has alarmed both public and private sectors. The European Union asked Google to delay the onset of the new privacy policy in order to ensure that it does not violate E.U. law. This move is in accordance with objections to decreasing online privacy raised in other foreign nations where surveillance is more heavily scrutinized. In 2010, Canada and Germany both held investigations into the legality of Facebook's practices under their respective privacy acts. The new privacy policy only heightens unresolved concerns regarding user privacy.
An additional feature of concern to the new Google privacy policy is the nature of the policy. One must accept all features or delete existing Google accounts. The update will affect the Google+ social network, therefore making Google+’s settings uncustomizable, unlike other customizable social networking sites. Customizing the privacy settings of a social network is a key tactic that many feel is necessary for social networking sites. This update in the system has some Google+ users wary of continuing service. Additionally, some fear the sharing of data amongst Google services could lead to revelations of identities. Many using pseudonyms are concerned about this possibility, and defend the role of pseudonyms in literature and history.
Some solutions for protecting user privacy on the internet include programs such as "Rapleaf", a website with a search engine that allows users to make all of their search information and personal information private. Other websites that also give this option to their users are Facebook and Amazon.
Privacy-focused search engines/browsers
Search engines such as Startpage.com, Disconnect.me and Scroogle (defunct since 2012) anonymize Google searches. Some of the most notable privacy-focused search engines are:
Brave: A free web browser that describes itself as a privacy-first browsing service, blocking online trackers and ads and not tracking users' browsing data.
DuckDuckGo: A meta-search engine that combines the search results from various search engines (excluding Google) and provides some unique services, such as using search boxes on various websites and providing instant answers out of the box.
Qwant: An EU-based web search engine that focuses on privacy. It has its own index and has servers hosted in the European Union.
Searx: A free and open-source, privacy-oriented meta-search engine which is based on a number of decentralized instances. There are a number of existing public instances, but any user can create their own if they desire.
Fireball: Germany's first search engine, which obtains web results from various sources (mainly Bing). Fireball does not collect any user information. All servers are stationed in Germany, a plus considering that German legislation tends to respect privacy rights better than that of many other European countries.
MetaGer: A meta-search engine (obtaining results from various sources) and by far the most popular safe search engine in Germany. MetaGer uses similar safety features as Fireball.
Ixquick: A Dutch-based meta-search engine (obtaining results from various sources). It is also committed to protecting the privacy of its users. Ixquick uses similar safety features as Fireball.
Yacy: A decentralized search engine developed as a community project, which started in 2005. The search engine follows a slightly different approach from the previous two, using a peer-to-peer principle that does not require any stationary and centralized servers. This has its disadvantages, but also the simple advantage of greater privacy when surfing because there is no central server to attack.
Search Encrypt: An internet search engine that prioritizes maintaining user privacy and avoiding the filter bubble of personalized search results. It differentiates itself from other search engines by using local encryption on searches and delayed history expiration.
Tor Browser: Free software that provides access to an anonymized network enabling anonymous communication. It directs internet traffic through multiple relays. This method prevents others from tracking a certain user, thus allowing the user's IP address and other personal information to be concealed.
Privacy issues of social networking sites
The advent of Web 2.0 has enabled social profiling and is a growing concern for internet privacy. Web 2.0 is the system that facilitates participatory information sharing and collaboration on the internet, through social networking websites like Facebook, Instagram, Twitter, and MySpace. These social networking sites have seen a boom in their popularity beginning in the late 2000s. Through these websites, many people are giving their personal information out on the internet.
There has been discussion of who should be held accountable for the collection and distribution of personal information. Some blame social networks, because they are responsible for storing the information and data, while others blame the users who put their information on these sites. This relates to the ever-present issue of how society regards social media sites. A growing number of people are discovering the risks of putting their personal information online and trusting a website to keep it private. Yet in a recent study, researchers found that young people are taking measures to keep their posted information on Facebook private to some degree. Examples of such actions include managing their privacy settings so that certain content can be visible to "Only Friends" and ignoring Facebook friend requests from strangers.
In 2013 a class action lawsuit was filed against Facebook alleging that the company scanned user messages for web links, translating them to "likes" on the user's Facebook profile. Data lifted from the private messages was then used for targeted advertising, the plaintiffs claimed. "Facebook's practice of scanning the content of these messages violates the federal Electronic Communications Privacy Act (ECPA, also referred to as the Wiretap Act), as well as California's Invasion of Privacy Act (CIPA), and section 17200 of California's Business and Professions Code," the plaintiffs said. This shows that once information is online it is no longer completely private. The risk is increasing because younger people have easier internet access than ever before, so they can all too easily upload information without considering how difficult it can be to take that information down once it is out in the open. This is becoming a bigger issue now that so much of society interacts online, which was not the case fifteen years ago. In addition, because of the quickly evolving digital media arena, people's interpretation of privacy is evolving as well, and it is important to consider that when interacting online. New forms of social networking and digital media such as Instagram and Snapchat may call for new guidelines regarding privacy. What makes this difficult is the wide range of opinions surrounding the topic, so it is left mainly up to individual judgement to respect other people's online privacy in some circumstances.
Privacy issues of medical applications
With the rise of technology-focused applications, medical apps have become widely available to users on smart devices. In a survey of 29 migraine-management applications, researcher Mia T. Minen and colleagues found that 76% had clear privacy policies, and that 55% of the apps stated that they used or shared user data with third parties for advertising purposes. The study raised concerns about applications without accessible privacy policies, and especially about applications that do not properly adhere to the Health Insurance Portability and Accountability Act (HIPAA), arguing that such apps need proper regulation because they store medical data together with identifiable information about a user.
Internet service providers
Internet users obtain internet access through an internet service provider (ISP). All data transmitted to and from users must pass through the ISP. Thus, an ISP has the potential to observe users' activities on the internet and to collect personal information such as users' transaction history, search history, and social media profiles. Hackers could also compromise an ISP to obtain such sensitive information about its customers.
However, ISPs are usually prohibited from participating in such activities due to legal, ethical, business, or technical reasons.
Normally ISPs do collect at least some information about the consumers using their services. From a privacy standpoint, ISPs would ideally collect only as much information as they require in order to provide internet connectivity (IP address, billing information if applicable, etc.).
Which information an ISP collects, what it does with that information, and whether it informs its consumers, pose significant privacy issues. Beyond the usage of collected information typical of third parties, ISPs sometimes state that they will make their information available to government authorities upon request. In the US and other countries, such a request does not necessarily require a warrant.
An ISP cannot know the contents of properly encrypted data passing between its consumers and the internet. For encrypting web traffic, HTTPS has become the most popular and best-supported standard. Even if users encrypt the data, the ISP still knows the IP addresses of the sender and of the recipient. (However, see the IP addresses section for workarounds.)
An Anonymizer such as I2P – The Anonymous Network or Tor can be used for accessing web services without them knowing one's IP address and without one's ISP knowing what the services are that one accesses. Additional software has been developed that may provide more secure and anonymous alternatives to other applications. For example, Bitmessage can be used as an alternative for email and Cryptocat as an alternative for online chat. On the other hand, in addition to End-to-End encryption software, there are web services such as Qlink which provide privacy through a novel security protocol which does not require installing any software.
When a user signs up for internet service, each computer is assigned a unique Internet Protocol (IP) address. On its own this address does not give away private or personal information; however, a weak link could potentially reveal information from one's ISP.
General concerns regarding internet user privacy have become enough of a concern for a UN agency to issue a report on the dangers of identity fraud. In 2007, the Council of Europe held its first annual Data Protection Day on January 28, which has since evolved into the annual Data Privacy Day.
T-Mobile USA doesn't store any information on web browsing. Verizon Wireless keeps a record of the websites a subscriber visits for up to a year. Virgin Mobile keeps text messages for three months. Verizon keeps text messages for three to five days. None of the other carriers keep texts of messages at all, but they keep a record of who texted who for over a year. AT&T Mobility keeps for five to seven years a record of who text messages who and the date and time, but not the content of the messages. Virgin Mobile keeps that data for two to three months.
HTML5
HTML5 is the latest version of Hypertext Markup Language specification. HTML defines how user agents, such as web browsers, are to present websites based upon their underlying code. This new web standard changes the way that users are affected by the internet and their privacy on the internet. HTML5 expands the number of methods given to a website to store information locally on a client as well as the amount of data that can be stored. As such, privacy risks are increased. For instance, merely erasing cookies may not be enough to remove potential tracking methods since data could be mirrored in web storage, another means of keeping information in a user's web browser. There are so many sources of data storage that it is challenging for web browsers to present sensible privacy settings. As the power of web standards increases, so do potential misuses.
HTML5 also expands access to user media, potentially granting access to a computer's microphone or webcam, a capability previously only possible through the use of plug-ins like Flash. It is also possible to find a user's geographical location using the geolocation API. With this expanded access comes increased potential for abuse as well as more vectors for attackers. If a malicious site was able to gain access to a user's media, it could potentially use recordings to uncover sensitive information thought to be unexposed. However, the World Wide Web Consortium, responsible for many web standards, feels that the increased capabilities of the web platform outweigh potential privacy concerns. They state that by documenting new capabilities in an open standardization process, rather than through closed source plug-ins made by companies, it is easier to spot flaws in specifications and cultivate expert advice.
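The short browser-side TypeScript sketch below illustrates the two capabilities mentioned above. In current browsers both calls are gated behind an explicit permission prompt per origin, which is the main safeguard against the kind of abuse described.

```typescript
// Geolocation API: only succeeds if the user grants the permission prompt.
navigator.geolocation.getCurrentPosition(
  (pos) => console.log("lat/lon:", pos.coords.latitude, pos.coords.longitude),
  (err) => console.warn("geolocation denied or unavailable:", err.message)
);

// Media capture: likewise requires an explicit user grant.
navigator.mediaDevices
  .getUserMedia({ audio: true, video: true })
  .then((stream) => {
    console.log("got media stream with", stream.getTracks().length, "tracks");
    stream.getTracks().forEach((t) => t.stop()); // release the devices immediately
  })
  .catch((err) => console.warn("media access denied:", err.name));
```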
Besides elevating privacy concerns, HTML5 also adds a few tools to enhance user privacy. A mechanism is defined whereby user agents can share blacklists of domains that should not be allowed to access web storage. Content Security Policy is a proposed standard whereby sites may assign privileges to different domains, enforcing harsh limitations on JavaScript use to mitigate cross-site scripting attacks. HTML5 also adds HTML templating and a standard HTML parser which replaces the various parsers of web browser vendors. These new features formalize previously inconsistent implementations, reducing the number of vulnerabilities though not eliminating them entirely.
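As a minimal illustration of the Content Security Policy mechanism mentioned above, the Node.js (TypeScript) sketch below serves a page with a policy header that restricts where scripts may be loaded from, which blunts many cross-site scripting attacks. The allowed host static.example is an assumption made for the example, not a real deployment.

```typescript
import { createServer } from "node:http";

// Sketch: attach a Content-Security-Policy header that only allows scripts
// from the page's own origin and one hypothetical trusted static host.
const csp = [
  "default-src 'self'",
  "script-src 'self' https://static.example", // hypothetical trusted CDN
  "object-src 'none'",
].join("; ");

createServer((_req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/html; charset=utf-8",
    "Content-Security-Policy": csp,
  });
  res.end("<!doctype html><p>Inline and third-party scripts are restricted.</p>");
}).listen(8080, () => console.log("listening on http://localhost:8080"));
```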
Big data
Big data is generally defined as the rapid accumulation and compiling of massive amounts of information that is being exchanged over digital communication systems. The data is large (often exceeding exabytes) and cannot be handled by conventional computer processors; it is instead stored in large server-system databases. This information is assessed by analytics specialists using software programs that distill it into multi-layered user trends and demographics. This information is collected from all around the internet, such as by popular services like Facebook, Google, Apple, Spotify or GPS systems.
Big data provides companies with the ability to:
Infer detailed psycho-demographic profiles of internet users, even if they were not directly expressed or indicated by users.
Inspect product availability and optimize prices for maximum profit while clearing inventory.
Swiftly reconfigure risk portfolios in minutes and understand future opportunities to mitigate risk.
Mine customer data for insight, and create advertising strategies for customer acquisition and retention.
Identify customers who matter the most.
Create retail coupons based on a proportional scale to how much the customer has spent, to ensure a higher redemption rate.
Send tailored recommendations to mobile devices at just the right time, while customers are in the right location to take advantage of offers.
Analyze data from social media to detect new market trends and changes in demand.
Use clickstream analysis and data mining to detect fraudulent behavior.
Determine root causes of failures, issues and defects by investigating user sessions, network logs and machine sensors.
Other potential Internet privacy risks
Cross-device tracking identifies users' activity across multiple devices.
Massive personal data extraction through mobile device apps that receive carte-blanche-permissions for data access upon installation.
Malware is a term short for "malicious software" and is used to describe software designed to cause damage to a single computer, server, or computer network, whether through a virus, trojan horse, spyware, or other means.
Spyware is a piece of software that obtains information from a user's computer without that user's consent.
A web bug is an object embedded into a web page or email and is usually invisible to the user of the website or reader of the email. It allows checking to see if a person has looked at a particular website or read a specific email message.
Phishing is a criminally fraudulent process of trying to obtain sensitive information such as user names, passwords, credit card or bank information. Phishing is an internet crime in which someone masquerades as a trustworthy entity in some form of electronic communication.
Pharming is a hacker's attempt to redirect traffic from a legitimate website to a completely different internet address. Pharming can be conducted by changing the hosts file on a victim's computer or by exploiting a vulnerability on the DNS server.
Social engineering where people are manipulated or tricked into performing actions or divulging confidential information.
Malicious proxy server (or other "anonymity" services).
Use of weak passwords that are short, consist of all numbers, all lowercase or all uppercase letters, or that can be easily guessed such as single words, common phrases, a person's name, a pet's name, the name of a place, an address, a phone number, a social security number, or a birth date.
Use of recycled passwords or the same password across multiple platforms which have become exposed from a data breach.
Using the same login name and/or password for multiple accounts where one compromised account leads to other accounts being compromised.
Allowing unused or little used accounts, where unauthorized use is likely to go unnoticed, to remain active.
Using out-of-date software that may contain vulnerabilities that have been fixed in newer, more up-to-date versions.
WebRTC is a protocol that suffers from a serious security flaw compromising the privacy of VPN tunnels by allowing the true IP address of the user to be read (see the sketch below). It is enabled by default in major browsers such as Firefox and Google Chrome.
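The WebRTC issue in the last item can be demonstrated with a few lines of browser-side TypeScript: creating a peer connection causes the browser to gather ICE candidates, whose addresses may include the device's real local or public IP even when other traffic goes through a VPN or proxy. This is an illustrative sketch; modern browsers partially mitigate it by replacing local addresses with mDNS hostnames, and "disable WebRTC" settings or extensions essentially block this candidate gathering. The STUN server shown is a commonly cited public one, used here only as an example.

```typescript
// Sketch of the classic WebRTC address leak: ICE candidate gathering can
// expose IP addresses without any user interaction.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // example public STUN server
});
pc.createDataChannel(""); // a channel is needed to trigger candidate gathering

pc.onicecandidate = (event) => {
  if (event.candidate) {
    // The candidate string contains an address the browser is willing to use.
    console.log("ICE candidate:", event.candidate.candidate);
  }
};

pc.createOffer().then((offer) => pc.setLocalDescription(offer));
```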
Reduction of risks to Internet privacy
Inc. magazine reports that the Internet's biggest corporations have hoarded Internet users' personal data and sold it for large financial profits. The magazine reports on a band of startup companies that are demanding privacy and aiming to overhaul the social-media business. Popular privacy-focused mobile messaging apps include Wickr, Wire, and Signal, which provide peer-to-peer encryption and give the user the capacity to control what message information is retained on the other end; Ansa, an ephemeral chat application, also described as employing peer-to-peer encryption; and Omlet, an open mobile social network, described as giving the user control over their data so that if a user does not want their data saved, they are able to delete it from the data repository.
Noise society – protection through information overflow
According to Nicklas Lundblad, another perspective on privacy protection is the assumption that the quickly growing amount of information produced will be beneficial. The reasons for this are that the costs of surveillance will rise and that there will be more noise, noise being understood as anything that interferes with the process of a receiver trying to extract private data from a sender.
In this noise society, the collective expectation of privacy will increase, but the individual expectation of privacy will decrease. In other words, not everyone can be analyzed in detail, but one individual can be. Also, in order to stay unobserved, it can therefore be better to blend in with others than to rely on, for example, encryption technologies and similar methods. Technologies for this can be called Jante-technologies after the Law of Jante, which states that you are nobody special.
This view offers new challenges and perspectives for the privacy discussion.
Public views
While internet privacy is widely acknowledged as the top consideration in any online interaction, as evinced by the public outcry over SOPA/CISPA, public understanding of online privacy policies is actually being negatively affected by the current trends regarding online privacy statements. Users have a tendency to skim internet privacy policies for information regarding the distribution of personal information only, and the more legalistic the policies appear, the less likely users are to even read the information. Coupling this with the increasingly exhaustive license agreements companies require consumers to agree to before using their product, consumers are reading less about their rights.
Furthermore, if the user has already done business with a company, or is previously familiar with a product, they have a tendency to not read the privacy policies that the company has posted. As internet companies become more established, their policies may change, but their clients will be less likely to inform themselves of the change. This tendency is interesting because as consumers become more acquainted with the internet they are also more likely to be interested in online privacy. Finally, consumers have been found to avoid reading the privacy policies if the policies are not in a simple format, and even perceive these policies to be irrelevant. The less readily available terms and conditions are, the less likely the public is to inform themselves of their rights regarding the service they are using.
Concerns of internet privacy and real life implications
When dealing with the issue of internet privacy, one must be concerned not only with technological implications such as damaged property, corrupted files, and the like, but also with the potential implications for one's real life. One such implication, commonly viewed as one of the most daunting risks of the internet, is the potential for identity theft. Although it is a typical belief that larger companies and enterprises are the usual focus of identity theft, rather than individuals, recent reports seem to show a trend opposing this belief. Specifically, a 2007 "Internet Security Threat Report" found that roughly ninety-three percent of "gateway" attacks were targeted at unprepared home users. The term "gateway attack" refers to an attack which aims not at stealing data immediately, but rather at gaining access for future attacks.
According to Symantec's "Internet Security Threat Report", this continues despite the increasing emphasis on internet security due to the expanding "underground economy". With more than fifty percent of the supporting servers located in the United States, this underground economy has become a haven for internet thieves, who use the system in order to sell stolen information. These pieces of information can range from generic things such as a user account or email to something as personal as a bank account number and PIN.
While the processes these internet thieves use are abundant and unique, one popular trap unsuspecting people fall into is that of online purchasing. This is not to allude to the idea that every purchase one makes online will leave them susceptible to identity theft, but rather that it increases the chances. In fact, in a 2001 article titled "Consumer Watch", the popular online site PC World went as far as calling secure e-shopping a myth. Though unlike the gateway attacks mentioned above, these incidents of information being stolen through online purchases generally are more prevalent in medium to large e-commerce sites, rather than smaller individualized sites. This is assumed to be a result of the larger consumer population and purchases, which allow for more potential leeway with information.
Ultimately, however, the potential for a violation of one's privacy is typically out of one's hands after purchasing from an online "e-tailer" or store. One of the most common ways in which hackers obtain private information from online e-tailers is an attack placed upon the site's servers responsible for maintaining information about previous transactions. As experts explain, these e-tailers are not doing nearly enough to maintain or improve their security measures. Even those sites that clearly present a privacy or security policy can be subject to hackers' havoc, as most policies only rely upon encryption technology, which only applies to the actual transfer of a customer's data. However, with this being said, most e-tailers have been making improvements, going as far as covering some of the credit card fees if the information's abuse can be traced back to the site's servers.
As one of the fastest-growing concerns American adults have about current internet privacy policies, identity and credit theft remain a constant figure in the debate surrounding privacy online. A 1997 study by the Boston Consulting Group showed that participants of the study were most concerned about their privacy on the internet compared to any other medium. However, it is important to recall that these issues are not the only prevalent concerns society has. Another prevalent issue remains members of society sending disconcerting emails to one another. It is partly for this reason that, in 2001, for one of the first times, the public expressed approval of government intervention in their private lives.
With the overall public anxiety regarding the constantly expanding trend of online crimes, in 2001 roughly fifty-four percent of Americans polled showed a general approval of the FBI monitoring those emails deemed suspicious. Thus was born the idea for the FBI program "Carnivore", which was to be used as a searching method, allowing the FBI to home in on potential criminals. Unlike the overall approval of the FBI's intervention, Carnivore was not met with as much majority approval. Rather, the public seemed divided, with forty-five percent siding in its favor, forty-five percent opposed to the idea for its ability to potentially interfere with ordinary citizens' messages, and ten percent claiming indifference. While this may seem slightly tangential to the topic of internet privacy, it is important to consider that at the time of this poll, the general population's approval of government actions was declining, reaching thirty-one percent versus the forty-one percent it held a decade prior. This figure, in combination with the majority's approval of FBI intervention, demonstrates an emerging emphasis on the issue of internet privacy in society and, more importantly, the potential implications it may hold for citizens' lives.
Online users must seek to protect the information they share with online websites, specifically social media. In today's Web 2.0, individuals have become the public producers of personal information. Users create their own digital trails that hackers and companies alike capture and utilize for a variety of marketing and advertisement targeting purposes. A paper from the RAND Corporation claims "privacy is not the opposite of sharing – rather, it is control over sharing." Internet privacy concerns arise from the surrender of personal information to engage in a variety of acts, from transactions to commenting in online forums. Protection against invasions of online privacy will require individuals to make an effort to inform and protect themselves via existing software solutions, to pay premiums for such protections, or to place greater pressure on governing institutions to enforce privacy laws and regulations regarding consumer and personal information.
Impact of internet surveillance tools on marginalized communities
Internet privacy issues also affect existing class distinctions in the United States, often disproportionately impacting historically marginalized groups typically classified by race and class. Individuals with access to private digital connections that have protective services are able to more easily prevent data privacy risks to personal information and surveillance issues. Members of historically marginalized communities face greater risks of surveillance through the process of data profiling, which increases the likelihood of being stereotyped, targeted, and exploited, thus exacerbating pre-existing inequities that foster uneven playing fields. Big data can have severe, and often unintentional, implications when it results in data profiling. For example, automated systems of employment verification run by the federal government such as E-Verify tend to misidentify people with names that do not adhere to standardized Caucasian-sounding names as ineligible to work in the United States, thus widening unemployment gaps and preventing social mobility. This case exemplifies how some programs have bias embedded within their code.
Tools using algorithms and artificial intelligence have also been used to target marginalized communities with policing measures, such as facial recognition software and predictive policing technologies that use data to predict where a crime will most likely occur, and who will engage in criminal activity. Studies have shown that these tools exacerbate the existing issue of over-policing in areas that are predominantly home to marginalized groups. These tools and other means of data collection can also exclude historically marginalized and low-income groups from financial services regulated by the state, such as securing loans for home mortgages. Black applicants are rejected by mortgage and mortgage refinancing services at a much higher rate than white applicants, exacerbating existing racial divisions. Members of minority groups have lower incomes and lower credit scores than white people, and often live in areas with lower home values. Another example of technologies being used for surveillance practices is seen in immigration: border control systems often use artificial intelligence in facial recognition systems, fingerprint scans, ground sensors, aerial video surveillance machines, and decision-making in asylum determination processes. This has led to large-scale data storage and physical tracking of refugees and migrants.
While broadband was implemented as a means to transform the relationship between historically marginalized communities and technology to ultimately narrow the digital inequalities, inadequate privacy protections compromise user rights, profile users, and spur skepticism towards technology among users. Some automated systems, like the United Kingdom government’s Universal Credit system in 2013, have failed to take into account that people, often minorities, may already lack internet access or digital literacy skills and therefore be deemed ineligible for online identity verification requirements, such as forms for job applications or to receive social security benefits, for example. Marginalized communities using broadband services may also not be aware of how digital information flows and is shared with powerful media conglomerates, reflecting a broader sense of distrust and fear these communities have with the state. Marginalized communities may therefore end up feeling dissatisfied or targeted by broadband services, whether from nonprofit community service providers or state providers.
Laws and regulations
Global privacy policies
The General Data Protection Regulation (GDPR) is considered the toughest privacy and security law in the world. Though it was drafted and passed by the European Union (EU), it imposes obligations on organizations anywhere, so long as they target or collect data related to people in the EU. There are, however, no globally unified privacy laws and regulations.
European General Data Protection Regulation
In 2009 the European Union first created awareness of tracking practices when the ePrivacy Directive (2009/136/EC) was put into force. In order to comply with this directive, websites had to actively inform the visitor about the use of cookies. This disclosure has typically been implemented by showing small information banners. Nine years later, on 25 May 2018, the European General Data Protection Regulation (GDPR) came into force, which aims to regulate and restrict the usage of personal data in general, irrespective of how the information is processed. The regulation primarily applies to so-called "controllers", which are (a) all organizations that process personal information within the European Union, and (b) all organizations which process personal information of EU-based persons outside the European Union. Article 4(1) defines personal information as anything that may be used for identifying a "data subject" (e.g. a natural person) either directly or in combination with other personal information. In theory this even brings common internet identifiers such as cookies or IP addresses within the scope of the regulation. Processing such personal information is restricted unless a "lawful reason" according to Article 6(1) applies. The most important lawful reason for data processing on the internet is the explicit consent given by the data subject. Stricter requirements apply to sensitive personal information (Article 9), which may be used for revealing information about ethnic origin, political opinion, religion, trade union membership, biometrics, health or sexual orientation. However, explicit user consent is still sufficient to process such sensitive personal information (Article 9(2)(a)). "Explicit consent" requires an affirmative act (Article 4(11)), which is given if the individual person is able to choose freely and consequently actively opts in.
As of June 2020, typical cookie implementations are not compliant with this regulation, and other practices such as device fingerprinting, cross-website logins or third-party requests are typically not disclosed, even though many opinions consider such methods to be within the scope of the GDPR. The reason for this controversy is the ePrivacy Directive 2009/136/EC, which is still in force unchanged. An updated version of this directive, formulated as the ePrivacy Regulation, is intended to enlarge the scope from cookies alone to any type of tracking method. It is furthermore intended to cover any kind of electronic communication channel such as Skype or WhatsApp. The new ePrivacy Regulation was planned to come into force together with the GDPR, but as of July 2020 it was still under review. Some people assume that lobbying is the reason for this massive delay.
Irrespective of the pending ePrivacy Regulation, the European Court of Justice decided in October 2019 (case C-673/17) that the current law is not fulfilled if the information disclosed in the cookie disclaimer is imprecise, or if the consent checkbox is pre-checked. Consequently, many cookie disclaimers that were in use at that time were confirmed to be non-compliant with the current data protection laws. However, even this judgement refers only to cookies and not to other tracking methods.
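As an illustration of the "affirmative act" requirement, and of the court's finding that pre-ticked boxes do not constitute consent, the browser-side TypeScript sketch below only sets a tracking cookie after the user actively checks an initially unchecked box and clicks accept. The element ids consent-box and consent-accept, and the cookie name, are assumptions made for the example.

```typescript
// Sketch of GDPR-style consent: tracking is off by default and only enabled
// after an affirmative opt-in (the checkbox starts unchecked, per C-673/17).
const box = document.querySelector<HTMLInputElement>("#consent-box");
const accept = document.querySelector<HTMLButtonElement>("#consent-accept");

if (box && accept) {
  box.checked = false; // a pre-ticked box would not constitute valid consent

  accept.addEventListener("click", () => {
    if (box.checked) {
      // Only now may analytics/advertising cookies be written.
      document.cookie = "analytics_consent=yes; max-age=15552000; path=/";
      // loadTrackers(); // hypothetical: defer tracking scripts until opt-in
    } else {
      console.log("No consent given; no tracking cookies are set.");
    }
  });
}
```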
Internet privacy in China
China is one of the most frequently discussed countries with regard to internet privacy. Although China is better known for extensive monitoring of online activity than for protecting internet privacy, conditions there pose a major risk to the many online users whose information is exchanged on the web on a regular basis; for instance, new software in China enables the surveillance of the majority of online users and presents a risk to their privacy. The main concern with the privacy of internet users in China is the lack thereof. China has a well-known policy of censorship when it comes to the spread of information through public media channels. Censorship has been prominent in mainland China since the communist party gained power in China over 60 years ago. With the development of the internet, however, privacy became more of a problem for the government. The Chinese government has been accused of actively limiting and editing the information that flows into the country via various media. The internet poses a particular set of issues for this type of censorship, especially when search engines are involved. Yahoo!, for example, encountered a problem after entering China in the mid-2000s. A Chinese journalist, who was also a Yahoo! user, sent private emails using the Yahoo! server regarding the Chinese government. Yahoo! provided information that helped Chinese government officials track down the journalist, Shi Tao, who had allegedly posted state secrets to a New York-based website. Yahoo! provided incriminating records of the journalist's account logins to the Chinese government, and Shi Tao was sentenced to ten years in prison. These types of occurrences have been reported numerous times and have been criticized by foreign entities such as the creators of the Tor network, which was designed to circumvent network surveillance in multiple countries.
User privacy in China is not as cut-and-dried as it is in other parts of the world. China reportedly has a much more invasive policy when internet activity involves the Chinese government. For this reason, search engines are under constant pressure to conform to Chinese rules and regulations on censorship while still attempting to keep their integrity. Therefore, most search engines operate differently in China than in other countries, such as the US or Britain, if they operate in China at all. There are two types of intrusions that occur in China regarding the internet: the alleged intrusion of the company providing users with internet service, and the alleged intrusion of the Chinese government. The intrusion allegations made against companies providing users with internet service are based upon reports that companies, such as Yahoo! in the previous example, are using their access to internet users' private information to track and monitor users' internet activity. Additionally, there have been reports that personal information has been sold. For example, students preparing for exams would receive calls from unknown numbers selling school supplies. The claims made against the Chinese government lie in the fact that the government is forcing internet-based companies to track users' private online data without the users knowing that they are being monitored. Both alleged intrusions are relatively harsh and possibly force foreign internet service providers to decide if they value the Chinese market over internet privacy. Also, many websites are blocked in China, such as Facebook and Twitter. However, many Chinese internet users use special methods, such as a VPN, to unblock websites that are blocked.
Internet privacy in Sweden
Sweden is considered to be at the forefront of internet use and regulation. On 11 May 1973, Sweden enacted the Data Act, the world's first national data protection law. The country constantly innovates in the way the internet is used and how it impacts its people. In 2012, Sweden received a Web Index score of 100, a score that measures the internet's political, social, and economic influence, placing it first among 61 nations. Sweden received this score while in the process of implementing, and in places exceeding, new mandatory requirements from the European Union. Sweden placed more restrictive guidelines on the directive on intellectual property rights enforcement (IPRED) and passed the FRA-law in 2009, which allowed for the legal sanctioning of surveillance of internet traffic by state authorities. The FRA has a history of intercepting radio signals and has stood as the main intelligence agency in Sweden since 1942. Sweden thus presents a mixture of a government pushing strongly to implement policy and citizens who continue to perceive the internet as free and neutral. Both of the previously mentioned laws drew controversy from critics, but they did not change public perception, even though the new FRA-law was brought before the European Court of Human Rights for human rights violations. The law was established to allow the National Defence Radio Establishment (Försvarets Radioanstalt, FRA) to eliminate outside threats; however, it also allowed authorities to monitor all cross-border communication without a warrant. Sweden's recent emergence into internet prominence may be explained by its recent climb in users. Only 2% of all Swedes were connected to the internet in 1995, but at last count in 2012, 89% had broadband access. This was due in large part, once again, to the active Swedish government introducing regulatory provisions to promote competition among internet service providers. These regulations helped grow web infrastructure and forced prices below the European average.
Regarding copyright law, Sweden was the birthplace of the Pirate Bay, an infamous file-sharing website. Unauthorized file sharing has been illegal in Sweden since the practice developed; however, there was never any real fear of being prosecuted for the crime until 2009, when the Swedish Parliament became the first in the European Union to pass the intellectual property rights directive. This directive persuaded internet service providers to disclose the identity of suspected violators.
Sweden also maintains an infamous centralized block list. The list is generated by authorities and was originally crafted to eliminate sites hosting child pornography. However, there is no legal way to appeal a site's inclusion on the list, and as a result, many sites unrelated to child pornography have been blacklisted. Sweden's government enjoys a high level of trust from its citizens. Without this trust, many of these regulations would not be possible, and thus many of these regulations may only be feasible in the Swedish context.
Internet privacy in the United States
With the Republicans in control of all three branches of the U.S. government, lobbyists for internet service providers (ISPs) and tech firms persuaded lawmakers to dismantle privacy-protection regulations that had been made during the Obama administration. These FCC rules had required ISPs to get "explicit consent" before gathering and selling consumers' private internet information, such as browsing histories, the locations of businesses visited, and the applications used. Trade groups wanted to be able to sell this information for profit. Lobbyists persuaded Republican senator Jeff Flake and Republican representative Marsha Blackburn to sponsor legislation to dismantle the internet privacy rules; Flake received $22,700 in donations and Blackburn received $20,500 in donations from these trade groups. On March 23, 2017, abolition of these privacy protections passed on a narrow party-line vote. In June 2018, California passed a law restricting companies from sharing user data without permission. Under the law, users must also be informed to whom the data is being sold and why; if they refuse to allow the sale of their data, companies are permitted to charge these consumers slightly more. Mitt Romney, despite approving a Twitter comment by Mark Cuban during a conversation with Glenn Greenwald about anonymity in January 2018, was revealed as the owner of the Pierre Delecto lurker account in October 2019.
Legal threats
The array of technologies used by government agencies to track and gather internet users' information is the topic of much debate among privacy advocates, civil liberties advocates, and those who believe such measures are necessary for law enforcement to keep pace with rapidly changing communications technology.
Specific examples:
Following a decision by the European Union's council of ministers in Brussels, in January 2009, the UK's Home Office adopted a plan to allow police to access the contents of individuals' computers without a warrant. The process, called "remote searching", allows one party, at a remote location, to examine another's hard drive and internet traffic, including email, browsing history and websites visited. Police across the EU are now permitted to request that the British police conduct a remote search on their behalf. The search can be granted, and the material gleaned turned over and used as evidence, on the basis of a senior officer believing it necessary to prevent a serious crime. Opposition MPs and civil liberties advocates are concerned about this move toward widening surveillance and its possible impact on personal privacy. Says Shami Chakrabarti, director of the human rights group Liberty, "The public will want this to be controlled by new legislation and judicial authorisation. Without those safeguards it's a devastating blow to any notion of personal privacy."
The FBI's Magic Lantern software program was the topic of much debate when it was publicized in November 2001. Magic Lantern is a Trojan Horse program that logs users' keystrokes, rendering encryption useless to those infected.
Children and internet privacy
Internet privacy is a growing concern with children and the content they are able to view. Beyond that, there are also many concerns about the privacy of email, the vulnerability of internet users to having their internet usage tracked, and the collection of personal information. These concerns have begun to bring the issues of internet privacy before the courts and judges.
See also
Anonymous blogging
Anonymous P2P
Anonymous post
Anonymous remailer
Anonymous web browsing
Index of Articles Relating to Terms of Service and Privacy Policies
Internet censorship
Location-based service#Privacy issues
Privacy-enhancing technologies
PRISM
Privacy concerns with social networking services
Spatial cloaking
Right to be forgotten
Privacy in Australian law
Canadian privacy law
European Union Data Protection Directive
Privacy in English law
Privacy laws in Russia
Privacy laws of the United States
Computer and network surveillance
Mass surveillance
Unauthorized access in online social networks
References
Further reading
Lohr, Steve, "How Privacy Can Vanish Online, a Bit at a Time", The New York Times, Wednesday, March 17, 2010
Gazaleh, Mark (2008) "Online trust and perceived utility for consumers of web privacy statements – Overview" WBS, 35pp.
Federal Trade Commission, "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers", December 2010
Topolsky, J. (2012, February 16). "Tempted by cool apps, users should see Apple’s privacy issues as a wake-up call". Washington Post, p. A15.
"PRISM-Proof Security Considerations", Internet-Draft, Phillip Hallam-Baker, Internet Engineering Task Force (IETF), October 27, 2014.
External links
Electronic Frontier Foundation - an organization devoted to privacy and intellectual freedom advocacy
Ponemon Institute - independent research center dedicated to privacy, data protection and information security policy
Pew Research Center - Online Privacy and Safety - nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world
Expectation of privacy for company email not deemed objectively reasonable – Bourke v. Nissan
Internet Privacy: The Views of the FTC, the FCC, and NTIA: Joint Hearing before the Subcommittee on Commerce, Manufacturing, and Trade and the Subcommittee on Communications and Technology of the Committee on Energy and Commerce, House of Representatives, One Hundred Twelfth Congress, First Session, July 14, 2011
Data laws
Terms of service |
8546742 | https://en.wikipedia.org/wiki/Utmp | Utmp | utmp, wtmp, btmp and variants such as utmpx, wtmpx and btmpx are files on Unix-like systems that keep track of all logins and logouts to the system.
Format
utmp, wtmp and btmp
utmp maintains a full accounting of the current status of the system: the system boot time (used by uptime), which users are logged in at which terminals, logouts, system events, etc.
wtmp acts as a historical utmp
btmp records failed login attempts
These files are not regular text files, but rather use a binary format that needs to be read and edited by specially crafted programs. The implementation and the fields present in the file differ depending on the system or the libc version, and are defined in the utmp.h header file. The wtmp and btmp formats are exactly like utmp's, except that a null value for "username" indicates a logout on the associated terminal (the actual user name is located by finding the preceding login on that terminal). Furthermore, the value "~" as a terminal name with username "shutdown" or "reboot" indicates a system shutdown or reboot (respectively).
These files are not set by any given PAM module (such as pam_unix.so or pam_sss.so) but are set by the application performing the operation (e.g. mingetty, /bin/login, or sshd). As such it is the obligation of the program itself to record the utmp information.
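Because the records are binary, they are normally read through the accessor functions provided by the C library rather than parsed by hand. The following minimal sketch assumes a glibc-style <utmp.h> (the utmpname/setutent/getutent/endutent functions, the WTMP_FILE constant and the ut_* field names are glibc conventions and may differ on other systems); it walks the historical wtmp file and distinguishes login records from logout records:

/* Minimal sketch: walk wtmp with the glibc <utmp.h> accessors.
   Assumes a glibc-style libc; names may differ on other systems. */
#include <stdio.h>
#include <time.h>
#include <utmp.h>

int main(void)
{
    struct utmp *rec;

    utmpname(WTMP_FILE);   /* read the historical wtmp, not the live utmp */
    setutent();            /* rewind to the first record */

    while ((rec = getutent()) != NULL) {
        time_t when = rec->ut_tv.tv_sec;
        if (rec->ut_type == USER_PROCESS)          /* a user logged in */
            printf("login  %-12.32s %-12.32s %s",
                   rec->ut_user, rec->ut_line, ctime(&when));
        else if (rec->ut_type == DEAD_PROCESS)     /* terminal logged out */
            printf("logout %-12s %-12.32s %s",
                   "", rec->ut_line, ctime(&when));
    }

    endutent();
    return 0;
}

The same loop over the current utmp file (the default when utmpname() is not called) would list the presently logged-in users instead of the history.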
utmpx, wtmpx and btmpx
Utmpx and wtmpx are extensions to the original utmp and wtmp, originating from Sun Microsystems. Utmpx is specified in POSIX. The utmp, wtmp and btmp files were never a part of any official Unix standard, such as the Single UNIX Specification, while utmpx and the corresponding APIs are part of it. While some systems create distinct newer files for the utmpx variants and have deprecated or obsoleted the former formats, this is not always the case; Linux, for example, uses the utmpx structure in place of the older file structure.
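For the standardized interface, the POSIX <utmpx.h> functions can be used instead. The sketch below relies only on POSIX-specified calls and fields (setutxent, getutxent, endutxent and the ut_user, ut_line, ut_type and ut_tv members) and lists the active user sessions, roughly what the who command reports:

/* Minimal sketch of the POSIX utmpx API: list active user sessions
   from the system's current-status database. */
#include <stdio.h>
#include <time.h>
#include <utmpx.h>

int main(void)
{
    struct utmpx *rec;

    setutxent();                            /* rewind the utmpx database */
    while ((rec = getutxent()) != NULL) {
        if (rec->ut_type != USER_PROCESS)   /* skip boot, init and logout records */
            continue;
        time_t when = rec->ut_tv.tv_sec;
        printf("%-12.32s %-12.32s %s", rec->ut_user, rec->ut_line,
               ctime(&when));
    }
    endutxent();                            /* release the database */
    return 0;
}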
Location
Depending on the system, these files may commonly be found in different places (non-exhaustive list):
Linux:
/var/run/utmp
/var/log/wtmp
/var/log/btmp
Solaris:
/var/adm/utmp (deprecated), /var/adm/utmpx
/var/adm/wtmp (deprecated), /var/adm/wtmpx
HP-UX:
/etc/utmp (deprecated), /etc/utmpx
/var/adm/wtmp (deprecated), /var/adm/wtmpx
/var/adm/btmp (deprecated), /var/adm/btmpx
FreeBSD 9.0 introduced new files while adding support for utmpx:
/var/run/utx.active (replaces utmp)
/var/log/utx.lastlogin (replaces lastlog)
/var/log/utx.log (replaces wtmp)
Related commands
Different commands allow users to consult the information stored in those files, including who (which shows current system users), last (which shows the last logged-in users) and lastb (which shows the last failed login attempts; Linux-specific).
See also
lastlog
References
External links
Solaris Trim wtmpx file
Unix
Unix software |
4982675 | https://en.wikipedia.org/wiki/Get%20a%20Mac | Get a Mac | The "Get a Mac" campaign is a television advertising campaign created for Apple Inc. (Apple Computer, Inc. at the start of the campaign) by TBWA\Media Arts Lab, the company's advertising agency, that ran from 2006 to 2009. The advertisements were shown in the United States, Canada, Australia, New Zealand, the United Kingdom, Japan and Germany.
Synopsis
The Get a Mac advertisements follow a standard template. They open to a plain white background, and a man dressed in casual clothes introduces himself as an Apple Macintosh computer ("Hello, I'm a Mac."), while a man in a more formal suit-and-tie combination introduces himself as a Microsoft Windows personal computer ("And I'm a PC.").
The two then act out a brief vignette, in which the capabilities and attributes of Mac and PC are compared, with PC—characterized as formal and somewhat polite, though uninteresting and overly concerned with work—often being frustrated by the more laid-back Mac's abilities. The earlier commercials in the campaign involved a general comparison of the two computers, whereas the later ones mainly concerned Windows Vista and Windows 7.
The aim of this commercial film series is to associate PC users (namely Windows users) with the "unpopular nerd" cliché, while representing Apple Mac users as young, creative, attractive, and lucky.
The original American advertisements star actor Justin Long as the Mac, and author and humorist John Hodgman as the PC, and were directed by Phil Morrison. The American advertisements also aired on Canadian, Australian, and New Zealand television, and at least 24 of them were dubbed into Spanish, French, German, and Italian. The British campaign stars comedic duo Robert Webb as Mac and David Mitchell as PC while the Japanese campaign features the comedic duo Rahmens. Several of the British and Japanese advertisements, although based on the originals, were slightly altered to better target the new audiences. Both the British and Japanese campaigns also feature several original ads not seen in the American campaign.
The Get a Mac campaign is the successor to the Switch ads that were first broadcast in 2002. Both campaigns were filmed against a plain white background. Apple's former CEO, Steve Jobs, introduced the campaign during a shareholders meeting the week before the campaign started. The campaign also coincided with a change of signage and employee apparel at Apple retail stores detailing reasons to switch to Macs.
The Get a Mac campaign received the Grand Effie Award in 2007. The song in the commercial is called "Having Trouble Sneezing" by Mark Mothersbaugh.
On November 10, 2020, John Hodgman returned and portrayed a PC at the end of Apple's "One More Thing" event, criticizing the upgrades made to the Macintosh lineup earlier in the event.
Advertisements
The ads play on perceived weaknesses of non-Mac personal computers, especially those running Microsoft Windows, of which PC is clearly intended to be a parody, and corresponding strengths possessed by the Mac OS (such as immunity to circulating viruses and spyware targeted at Microsoft Windows). The target audience of these ads is not devoted PC users but rather, those who are more likely to "swing" towards Apple. Apple realizes that many consumers who choose PCs do so because of their lack of knowledge of the Apple brand. With this campaign, Apple was targeting those users who may not consider Macs when purchasing but may be persuaded to when they view these ads. Each of the ads is about 30 seconds in length and is accompanied by a song called "Having Trouble Sneezing," which was composed by Mark Mothersbaugh. The advertisements are presented below in alphabetical order, not chronological order.
North American campaign
The following is an alphabetical list of the ads that appeared in the campaign shown in the United States, Canada, Australia and New Zealand.
Accident—A wheelchair-bound PC, who is wearing casts on his arms, explains that he fell off his desk when someone tripped over his power cord, thus prompting Mac to point out that the MacBook's and MacBook Pro's magnetic power cord prevents such an occurrence. The MacBook pictured at the end demonstrates a harmless cord disconnection.
Angel/Devil—Mac gives PC an iPhoto book to view. Suddenly, angel and devil versions of PC appear behind him. The angel encourages PC to compliment Mac, while the devil prods PC to destroy the book. In the end, PC says the book is good and then turns around, feeling the air where the angel and devil versions of himself were.
Bake Sale—When Mac questions PC regarding a bake sale he has set up, PC replies that he is trying to raise money by himself in order to fix Vista's problems. Mac decides to contribute by buying a cupcake, but as soon as he takes a bite, PC asks him to pay ten million dollars for it.
Bean Counter—PC is trying to balance his budget, admitting that Vista's problems are frustrating PC users and it's time to take drastic action: spending almost all of the money on advertising. When Mac asks PC if he thinks the small amount of money left will fix Vista, PC reallocates all of it to advertising. This ad coincided with the introduction of Microsoft's "I'm a PC" campaign.
Better—Mac praises PC's ability with spreadsheets but explains that he is better with life-related activities such as music, pictures, and movies. PC defensively asks what Mac means by "better," only to sheepishly claim a different definition when Mac tells him.
Better Results—PC and Mac discuss making home movies and show each other their efforts. Supermodel Gisele Bündchen enters, representing Mac's movie, while PC's movie is represented by a man with a hairy chest wearing a blonde wig and a dress similar to Bündchen's. PC states that his movie is a "work-in-progress."
Biohazard Suit—PC first appears wearing a biohazard suit to protect himself from PC viruses and malware, of which PC says there are 20,000 discovered every day. Mac asks PC if he is going to live in the suit for the rest of his life, but PC cannot hear him because he is too protected by his virus-proof mask, and takes it off. PC then shrieks and struggles to place it on again.
Boxer—PC is introduced by a ring announcer as if he were in a boxing match, stating that he's not going down without a fight. Mac explains that the issue is not a competition but, rather, people switching to a computer that's simpler and more intuitive. The announcer admits his brother-in-law recently purchased a Mac and loves it. This is also the first ad to show Mac OS X Leopard.
Breakthrough—Mac and PC's therapist (played by Corinne Bohrer; see "Counselor" below) suggests that PC's problems are simply a result of software and hardware coming from various sources, whereas Mac gets all his hardware and software from one place. PC keeps repeating "It's not my fault!" with the support of Mac and the therapist before concluding, "It's Mac's fault! It's Mac's fault!" Mac and the therapist are disappointed in PC's conclusion, but PC nevertheless ends with the comment "What a Breakthrough!"
Broken Promises—PC tells Mac how excited he is about the launch of Windows 7 and assures him it won't have the same problems as Vista. However, Mac feels like he has heard this before and has a series of flashbacks with past versions of PC assuring him about Windows Vista, XP, ME, 98, 95, and 2.0. On the last flashback, PC says, "Trust me." Back in the present, he explains this time it's going to be different and says, "Trust me," in an almost identical way to his flashback counterpart.
Calming Teas—PC announces calming teas and bath salts to make Vista's annoyances easier to live with, such as "Crashy-time Chamomile", "Missing Driver Mint", "Pomegranate Patience", and "Raspberry Restart". He doesn't get time to talk about his bath salts.
Choose a Vista—Confused about which of the six versions of Windows Vista to get, PC spins a large game show wheel. PC lands on Lose a Turn, and Mac questions why PC put that space on the wheel.
Computer Cart—PC and three other men in suits are on a computer cart. When Mac asks why, PC says that he gets an error with a Windows Media Player Dynamic-link library file (WMP.DLL), and that the others suffer from similar errors. The man in the beige suit represents error 692, the man in the grey suit represents a Syntax error, and the man in the bottom of the cart represents a fatal system error (PC whispers, "He's a goner," at the commercial's end). Mac explains that Macs don't get cryptic error messages.
Counselor—PC and Mac visit a psychotherapist (played by Corinne Bohrer) to resolve their differences. While Mac finds it easy to compliment PC ("You are a wizard with numbers and you dress like a gentleman."), PC's resentment is too deep for him to reciprocate ("I guess you are better at creating stuff, even though it's completely juvenile and a waste of time."). The counselor suggests that they come twice a week.
Customer Care—Mac is seen with a Mac Genius from an Apple Retail Store's Genius Bar, who can fix Mac problems. PC then has a short montage of endless automated customer-support messages, never reaching a real person, much to his disappointment. PC then says that his source of help is "the same" as a Mac Genius.
Elimination—PC attempts to find Megan, a new laptop hunter, the perfect PC. He starts by eliminating from a lineup of fellow PCs all those who have too-small screens and too-slow processors. However, none of the PCs is "immune" to viruses, which is Megan's #1 concern, so PC leaves her with Mac.
Flashback—Mac asks PC if he would like to see the website and home movie that he made. This prompts PC to remember a time when both he and Mac were children: when the younger Mac asks the younger PC if he would like to see some artwork he did, the younger PC takes out a calculator and calculates the time they have just wasted (this may be a reference to the era when PCs were text-based, while Macs were slower but had GUIs). Returning from the flashback, PC does the same thing.
Genius—Mac introduces PC to one of the Mac Geniuses from the Apple Retail Store's Genius Bar. PC tests the Genius, starting with math questions, which culminates in asking her, on a scale of one to ten, how much does he loathe Mac, to which she answers "Eleven." Surprised, PC says "She's good. Very good."
Gift Exchange—Mac and PC exchange gifts for Christmas. PC, who is hoping for a C++ GUI programming guide, is disappointed to receive a photo album of previous Get a Mac ads made on iPhoto. In contrast, he gives Mac a C++ GUI programming guide.
Goodwill—Mac and PC agree to put aside their differences because of the Christmas season. Although PC momentarily slips and states that Mac wastes his time with frivolous pursuits like home movies and blogs, the two agree, as Mac says, to "Pull it into hug harbor," and they wish each other a good holiday.
Group—PC is at a support group for PCs living with Vista. The other PCs there tell him to take it one day at a time and that he is facing the biggest fact of all—that Vista isn't working as it should. They all wish the Vista problems will go away sooner and a lot easier. One of them says pleasingly that he has been error-free for a week, but he starts to repeat himself uncontrollably, discouraging the others.
iLife—PC listens to an iPod and praises iTunes. Mac replies that the rest of iLife works just as well and comes on every Mac. PC defensively responds by listing the cool apps that he comes with, but he can only identify a calculator and a clock.
I Can Do Anything—In this animated commercial designed for the holiday season, PC asks Mac why the former loves the holidays so much. Mac asks if it's the season for peace on earth, but PC replies that they get to be animated and can do anything. PC demonstrates by floating in the air, building a snowman in fast motion, and asking a hopping bunny where he is going. The bunny, who can speak, says he's going to the Apple Store for some last-minute gifts. PC then purposely tips off the snowman's head, making it fall on the bunny, and sarcastically apologizes to him, calling himself clumsy. The animation style for this ad mimics the Rankin/Bass animation style seen in a number of classic Christmas specials.
Legal Copy—Every time PC says something positive about himself, the legal copy that appears on the screen bottom increases. He finally states that PCs are now 100% trouble-free, and the legal copy covers the whole screen.
Meant for Work—PC, looking haggard and covered in stickers, complains about the children who use him and their activities, such as making movies and blogging, which are wearing him out. He also says he cries himself to sleep mode every night, complaining that, unlike Mac, he is meant more for office work. PC is then alerted because his user wants to listen to some emo music and, with a loud groan, trudges off, showing an Anarchy sticker on his back.
Misprint—PC is on the phone with PC World, attempting to report a misprint. He explains how the print said, "The fastest Windows Vista notebook we tested this year is a Mac." PC argues how impossible it is for a Mac to run Vista faster than a PC, while Mac tries to explain that it is true. While arguing with PCWorld over the phone, PC says that he'll put Mac on the line to set things straight. However, he instead impersonates Mac, saying that PCs are faster.
Network—Mac and PC are holding hands to demonstrate their ability to network with each other. A Japanese woman representing a new digital camera enters and takes Mac's free hand. While Mac and the camera are perfectly compatible and speak to each other fluently, PC—who cannot speak Japanese—is utterly confused and unable to communicate, representing that Windows PCs need a driver installation with virtually all new hardware. This commercial has been criticized as projection and blame-shifting, because Apple is infamous for having a proprietary system that is isolated from open standards (e.g. no support for Bluetooth file sharing, Wi-Fi Direct, or NFC file sharing; no USB charging connector on the iPhone; and a lack of modularity, such as no microSD-expandable storage in iPhones).
Now What—PC begins by showing off his new, long book, I Want to Buy a Computer — Now What? to help customers deal with all the difficult computer-buying decisions if they have no one to help. Mac then explains that, at Apple Stores, personal shoppers help customers find the perfect Mac, even offering workshops to teach people about using the computers. Upon hearing this, PC brings out his book's companion volume, I Just Bought a Computer — Now What?
Office Stress—Mac's new Microsoft Office 2008 has just been released. In the box that PC gives Mac is a stress toy for him to use when he gets overwhelmed from doing lots more work. However, PC begins using the toy, complaining that Microsoft Office is also compatible with Mac, that he wants to switch his files over, and that he is getting less work than Mac, eventually breaking the toy.
Off the Air—Mac and PC appear with a Mac Genius, who announces it is now easier than ever to switch to a Mac and that a Mac Genius can switch over a PC's files to a new Mac for free. PC then protests that fear is what keeps people from switching, and people don't need to hear about the Mac Genius. In protest, he pulls a cover over the camera, which has a test card drawn on it, and declares that they are off the air.
Out of the Box—Mac (in a white box) and PC (in a brown box doing some exercises) are discussing what they will do when they are unpacked. Mac says that he can get started right away, but PC is held up by the numerous activities that he must complete before being useful. Mac eventually leaves to get right to work, but PC is forced to wait for parts that are still in other boxes.
Party is Over—PC unhappily throws a party celebrating the release of Windows Vista. He complains to Mac that he had to upgrade his hardware and now can't use some of his old software and peripherals. He then talks with one of the party members about throwing another party in five years, which turns into five years and a day, and so on.
PC Choice Chat—PC has his own radio talk show called PC Choice Chat, and people begin to call in asking for advice on which computer to get. All the callers ask for advice on a computer that would qualify as a Mac but not as a PC. One caller asks for a computer for people who hate getting viruses, another caller asks for PC help like Mac Geniuses, and a third caller wants to switch to Mac altogether. PC ignores these calls.
PC Innovations Lab—PC introduces himself and then starts talking about the PC Innovations Lab he has set up. When Mac questions him about it, he tells Mac that in response to the Mac's magnetic power cord, he wrapped another PC in bubble wrap, and in response to Mac's all-day battery life, he made an extremely long power cord. Mac tells PC that innovations should make people's lives easier, to which PC shows Mac another PC with cupholders on its shoulders. PC then takes the cup and says "Cheers to innovation!"
PC News—PC is sitting at a news desk and turns it over to a correspondent at what seems to be a launch party for Windows 7. A person being interviewed reveals that he is switching to a Mac. PC is surprised by this and asks why, but more people speak of how Mac is #1 with customer satisfaction until PC finally says to cut the feed. He then suggests going to commercial, but Mac acknowledges that they are in a commercial, so PC instead suggests going to another commercial.
Pep Rally—PC is introduced by a cheerleading squad. When asked, PC explains Mac's number-one status on college campuses with a built-in iSight camera, a stable operating system, and an ability to run Microsoft Office so well, so he wants to win students back with a pep rally. The cheerleaders cheer, "Mac's Number One!" and upon PC's complaint, they cheer, "PC's Number Two!"
Pizza Box—PC tries to attract college students by posing as a free box of pizza. This ad was aired during Apple's 2008 back-to-school promotion.
Podium—PC, in the style of a political candidate, is standing at a podium making declarations about Windows Vista, urging those who are having compatibility problems with existing hardware to simply replace them and to ignore the new features of Mac OS X Leopard. However, he privately admits to Mac that he himself has downgraded to Windows XP three weeks ago. His key slogan is: "It's not about what Vista can do for you; it's what you can buy for Vista."
PR Lady—Mac and PC are joined by a public relations representative (played by Mary Chris Wall), who has been hired by PC to place a positive spin on the reaction to Windows Vista and claims that many people are even downgrading back to Windows XP. Her response to claims that more people are switching to Mac instead is a sheepish "No comment."
Referee—A referee is present, according to PC, to make sure that Mac doesn't go on saying that Leopard is better and faster than Vista. When Mac defends himself, saying it was The Wall Street Journal that compared the two, PC complains, and the referee sides with Mac. Upon insulting the referee, PC is ejected, but PC rebuts, saying that he has nowhere to go (in the ad's area).
Restarting—Mac and PC explain how they both have a lot in common, but their discussion is hampered by PC's unfortunate habit of freezing and restarting.
Sabotage—PC is present, but a different actor (Robert Webb in UK version) appears in Mac's place, obviously reciting poorly memorized lines to flatter PC. The real Mac arrives soon after, and, while PC denies anything is happening, the impostor Mac tells the real Mac that he is a big fan of his.
Sad Song—PC sings a short country-and-Western-style song to express his grievances about people leaving PCs for Macs and Vista's technical issues. A hound-dog then howls, which Mac says is a "nice touch." A longer version ends with Mac asking PC if the dog is his, which it isn't.
Sales Pitch—Although Mac introduces himself as usual, PC says, "And buy a PC." He explains that Mac's increasing popularity is forcing him to be more forward in his self-promotion, so he is reduced to holding up red signs depicting various pitches.
Santa Claus—Another animated Get a Mac commercial featuring Santa Claus and Christmas caroling by both PC and Mac. PC spoils the group's singing of "Santa Claus is Coming to Town" by inserting "Buy a PC and not a Mac this holiday season or any other time for goodness sake," and claims, "That's how I learned it." The animation style is similar to the Rankin/Bass television specials Rudolph the Red-Nosed Reindeer and Santa Claus Is Comin' to Town.
Security—In a reference to criticisms of Windows Vista's security features, PC is joined by a tall United States Secret Service-style bodyguard (Patrick Warburton) who represents Vista's new security feature. The guard intrusively demands PC's decisions to cancel or allow every incoming or outgoing interaction he has with Mac.
Self Pity—Mac, for once, is wearing a suit. He explains that he "does work stuff, too," and has been running Microsoft Office for years. Upon hearing this, PC becomes despondent and collapses on the floor, begging to be left alone to depreciate.
Stuffed—PC enters slowly with a ballooned torso, explaining that all the trial software is slowing him down. Mac replies that Macs only come with the specific software for which customers ask (namely, the iLife package). As PC finally gets on his mark, Mac begins his intro again, but PC realizes that he has forgotten something and begins to slowly leave.
Stacks—PC is searching through all of his pictures, trying to find a photograph of his friend. He searches one picture at a time, but Mac states that iPhoto has a feature called Faces, in which iPhoto can tag the face of a person and find other pictures of the same person, putting them all into the same folder and saving search time. PC dismisses the facial-recognition technology as expensive and tells Mac to sort the pictures instead because he has the technology to make it easier.
Surgery—PC appears in the garb of a patient awaiting surgery, and explains that he is upgrading to Windows Vista but requires surgery to upgrade (specifically, upgrading such items as graphics cards, processors, memory, etc.). In reference to perceived difficulties in upgrading, PC admits that he is worried about going through it and bequeaths his peripherals to Mac should he not survive. Mac asks PC if, like him, his upgrade could be straightforward.
Surprise—Mac appears alongside a customer (Andrée Vermeulen) with PC notably absent. Mac tries to convince the customer, who wants to buy an effective computer, that she should get a PC, telling her that they're much better and more stable. The customer seems skeptical, tells Mac she'll "think about it", and leaves. A frustrated Mac pulls off a mask and his clothes, revealing himself to be PC in disguise. The real Mac then appears, sees PC's discarded mask and clothes, and says, "I don't even want to ask."
Tech Support—A technician (Brian Huskey) is present to install a webcam on PC (using masking tape to attach it to his head). PC is extremely pleased by his new upgrade, but upon hearing from the technician that Mac has a built-in webcam, he storms off without waiting for the camera to be fully installed.
Teeter Tottering—A woman who owned a PC holds a box of things that were in her PC and says she's switching to Mac. PC tries to convince her to stay, but she goes over to Mac every time.
Throne—PC appears in a king's robe and on a throne saying, even though switching computers can be difficult, his subjects won't leave him and that he's still the "king" of computers. Mac then begins talking about how PC's subjects can bring their PC into an Apple Store wherein all PC files can be transferred over to a new Mac, at which point PC declares Mac banished.
Time Machine—Mac appears with nine clones of himself behind him, who all introduce themselves at once. PC is shocked, so the various Macs explain that it is simply Time Machine, a feature in Leopard that makes regular backups of a user's hard drive. PC is forced to admit that such a feature is pretty awesome followed by thanks from the various Macs.
Time Traveler—PC uses a time machine to travel to the year 2150 to see if any major issues such as freezing and crashing have been removed from the PC and to see if PCs will eventually be as hassle-free as Macs are. Promptly after PC arrives in 2150, his future self freezes, which answers the question.
Top of the Line—PC and Mac appear with a customer who is looking for a new computer. PC introduces her to the "top-of-the-line" PC (Patrick Warburton), a handsome and overly slick PC in a suit. She asks him about screen size and speed, to which the top-of-the-line PC says he's the best. However, he balks when she says she doesn't want to deal with any viruses or hassle. She decides to go with Mac, so the top-of-the-line PC hands her his business card and tells her, "When you're ready to compromise...you call me."
Touché—Right after PC introduces himself, Mac replies, "And I'm a PC, too." Mac explains to the confused PC that he can run both Mac OS X and Microsoft Windows, calling himself "the only computer you'll ever need." PC mutters, "Oh...touché." Mac explains, referring to the rules of fencing, that one only says touché after he or she makes a point and someone else makes a counterpoint, but PC continues to misuse the word. A similar conversation occurred in Dodgeball: A True Underdog Story, a film in which Justin Long (Mac) appeared.
Trainer—The commercial starts off traditionally, but PC is doing sit-ups with a trainer in a striped shirt (Robert Loggia), whose fierce coaching style discourages PC. PC suggests the trainer try some "positive reinforcement," but the trainer compliments Mac instead, and PC is offended. This is the first commercial to show the Mac OS X Snow Leopard.
Tree Trimming—In another animated Get a Mac commercial for the holiday season, Mac and PC set aside their disagreements and decide to trim a Christmas tree by hanging ornaments and stringing lights. Mac tells PC that they are good friends, while PC gets nervous. When they are finished, PC does not want to light the lights on the tree, but Mac persuades him to do so. PC plugs in the tree's lights, but, when illuminated, the lights spell: "PC RULES." He apologizes to Mac and says that it "just sort of happened."
Trust Mac—PC, in an attempt to hide from spyware, is wearing a trench coat, a fedora, dark glasses, and a false mustache. PC offers Mac a disguise, but Mac declines, saying he does not have to worry about the normal PC spyware and viruses with Mac OS X Leopard.
V Word—PC declares that people should stop referring to his operating system (Vista) by name. He says using the word "doesn't sit well with frustrated PC users. From now on, we're going to use a word with a lot less baggage: 'Windows.'" During the scene, he holds a black box with a large red button that sounds a buzzer when pressed. PC presses the button whenever Mac says Vista. After pointing out that not using the word isn't the same as fixing the operating system's problems, Mac ends the ad by saying Vista several times in rapid succession, thwarting PC's attempts to sound the buzzer.
Viruses—PC has caught a new virus (represented as a cold) and warns Mac to stay away from him, citing the 114,000 known viruses for PCs. Mac states the viruses that affect PCs do not affect him, and PC announces that he will crash before collapsing onto the floor in a faint.
Work vs. Home—Mac describes how he enjoys doing fun activities such as podcasts and movies, which leads PC to claim that he also does fun activities such as timesheets, spreadsheets, and pie charts. After Mac states that it's difficult to capture a family vacation using a pie chart, PC rebuts by showing a pie chart representing "hanging-out time" and "just kicking it" with different shades of gray. Mac replies, "I feel like I was there."
Wall Street Journal—Mac is reading a favorable review of himself by Walt Mossberg in The Wall Street Journal. Jealous, PC claims he also received a great review but is caught off-guard when Mac asks for specific details. This ad is currently not available on the Apple website but can be found on YouTube.
Yoga—Mac is watching PC have a yoga session in which the yoga instructor (Judy Greer) is coaching PC in expelling bad Vista energy and forgetting Vista's problems. When the yoga instructor goes on to complain that Vista caused errors in her yoga billing and then storms off, PC considers switching to pilates.
Web-exclusive campaign
Several advertisements have been shown exclusively in Flash ad campaigns running on numerous websites. Unlike the ads shown on television, these advertisements have not been posted as high-quality QuickTime videos on Apple's website. These ads run for approximately 20 seconds each and reference specific online advertising features (such as banner ads), making it unlikely they will ever appear on television.
The titles are taken from the Flash-video file names.
Banging—PC expresses his regret for upgrading to Windows Vista because it is causing him various problems. Mac tries to comfort him, but PC continues to bang his head on the side of the banner advertisement.
Booby Trap—PC and Mac are at PCMag. PC is angry that they put up a banner ad saying that iLife '09 is the best office suite. PC hooks some cables up to the banner claiming that whoever clicks that will get shocked. PC proves it himself by clicking it.
Claw—In a skyscraper ad, PC is using a grabber claw to try to grab a boxed copy of Microsoft Office 2008 for Mac that is sitting in the top banner ad. He claims that if people see that Office 08 is on the Mac, they will ask questions regarding what a PC can do that the Mac can't. Mac points out that Office has been on the Mac for years, and that this is simply the latest version. PC knocks over the Office box, which causes an alarm to go off. PC hands the grabber claw to Mac, saying "He did it!"
Cramped—In the only known UK web-exclusive ad, PC and Mac (portrayed by Mitchell and Webb) are lying head-to-head in a banner ad, complaining about the size and format of the banner ad, and encouraging the user to click the ad quicker.
Customer Experience—A banner ad shows that Mac is rated #1 in customer experience. PC is frustrated and seeks more opinions from a before-and-after hair ad. Both say that the Mac is better.
Customer Satisfaction—A "Mac Customer Satisfaction Score" meter appears in a banner ad above Mac and PC. The meter's needle is hovering at about 85 out of 100. PC excuses himself and climbs up to the upper banner ad, and pulls on the needle. He accidentally breaks off the tip of the meter, and then waves it at the 20 mark, saying "Customer satisfaction is dropping..."
Easy as 1–23—In a Web banner, PC shows Mac his new slogan. Mac assumes it means "PC. Easy as 1-2-3," but PC corrects him by stating it means "Easy as 1 through 23". He then pulls out 23 steps for using a PC.
Editorial—PC drags his own op-ed column into the banner ad (since these ads appeared on news sites, such as cnn.com, it "blends" in with the rest of the site). The op-ed headline says "Stop Switching to Mac!" PC explains that people are switching to Macs more than ever, and that they need to know how much it is hurting PC. He makes a couple of anguished poses in the photo box to illustrate how frustrated he is.
Hiding—PC peeks in from the left side of the screen. When Mac asks what PC is doing, PC explains that he is hiding from viruses and spyware. PC then leaves, saying that he has to run a scan. There are two versions of this ad: a 300x250 square ad and a 160x600 vertical banner ad. PC is identical in both versions, but Mac's performance features a different take in each.
Knocking—PC panics about needing to search for new drivers for his hardware now that he's upgraded to Windows Vista. He tries to force his way off the left side of the screen so he can leave to find the new drivers but repeatedly runs into a wall. When he finally succeeds in breaking through the left side of the screen, he finds himself jumping back in from the right side of the screen.
Newswire—PC, jealous of Mac's good press, gets his own newswire ticker above the ad. Unfortunately, the newswire displays unflattering headlines such as "Vista Users Upset Over Glitches" and "Users Downgrade to XP." PC says he hates his stupid newswire and then the next headline on the newswire is "PC Hates His Stupid Newswire."
Not—A banner ad on the top of the page reads, "Leopard is better and faster than Vista." —Wall Street Journal. On the side, Mac introduces himself while PC climbs a ladder. Mac asks what PC is doing and he says that he is fixing an embarrassing typo. He then climbs all the way to the top and staples a piece of paper that says NOT at the end of the quotation. He then tells Mac that they have the whole Internet to correct and asks Mac to grab the ladder.
PC Turf (PCMag and PCWorld exclusive)—PC welcomes Web surfers to his turf, PCWorld.com, and remarks that Mac must feel out of place there. Mac points out that they said some great things about Macs, so PC asks security to remove Mac because he's going to be a problem. The PCMag version is identical, except PC's voice is re-dubbed to say "PCMag.com."
Refresh—A banner ad on the top of the page reads, "Vista...one of the biggest blunders in technology?" —CNET.com. Off to the side, PC sees the banner and realizes it's another bad review of Vista and decides to do an emergency refresh. He walks over and opens a compartment door that says, "Emergency Banner Refresh." PC flips the switch, and the banner is replaced by another banner that reads, "It's time for a Vista do-over" —PC Magazine. PC, frustrated about this review, flips the switch again. The banner is replaced by another that reads, "Mac OS X Leopard: A Perfect 10" —InfoWorld. PC sees this positive review and is relieved until he realizes it's about Leopard. PC angrily flips the switch again to end the ad.
Sign—In a skyscraper ad, Mac asks PC about an unlit sign in a separate banner ad that reads, "DON'T GIVE UP ON VISTA." PC replies that it will stop the problem of frustrated Windows Vista users downgrading to XP or switching to Macs. He presses a button, lighting up only the GIVE UP part of the sign. He presses it again, lighting up ON VISTA. Frustrated, PC presses the button repeatedly, causing GIVE UP and ON VISTA to light up alternately.
Switcher Cams—A banner ad at the top of the page displays a bank of 5 security camera screens which show users walking into Apple Stores; as users walk past each camera "PC SWITCHER" lights up in red beneath each screen. On the side, PC sees the switchers and is disappointed they are upgrading to Mac instead of to Windows 7. Mac says he thought Windows 7 was "supposed to be an improvement", to which PC responds that Macs are still #1 in customer satisfaction and that people will have to move their files over anyway. Still observing the switchers, PC leaves the side and appears on one of the video screens, managing to stop one switcher from going into the Apple Store but says there are still "thousands and thousands to go".
UK campaign
For the British market, the ads were recast with the popular British comedy double act Mitchell and Webb in the lead roles; David Mitchell as PC and Robert Webb as Mac. As well as original ads, several ads from the American campaign were reshot with new dialogue and slightly altered scenes. These ads are about 40 seconds long, which is slightly longer than the US advertisements.
The following ads are exclusive to the UK:
Art Language—In an effort to relate to the creative artistic types whom he assumes own Macs, PC, dressed in a stereotypically bohemian fashion, begins speaking to Mac using unnecessarily pretentious language. Despite Mac's insistence that he enables anyone to be creative, PC continues using big words, eventually confusing even himself.
Court—PC, dressed in a barrister's outfit, questions Mac on how long it takes to make an iPhoto photo book that Mac claims to have made in a few minutes. Doubting Mac's claim, PC eventually resorts to cutting off Mac whenever he tries to speak.
Magic—Passing an average 50k Word document in a file to Mac, PC makes out that the process is much harder than it actually is through the use of a drum roll and a magician's assistant, shouting "Amazing!" at the end of the transfer. Bemused, Mac points out that he is compatible with PC and passes him back a photo with no fuss at all, at the end of which PC shouts "Amazing!"
Naughty Step—PC unveils his naughty step: the ultimate deterrent to an unruly errant child (similar to the technique used by Jo Frost in the UK and US series Supernanny). He goes on to explain that children should not be making pictures, movies and websites on a proper, grown-up PC. Mac points out that this is the fun stuff children like to do, resulting in his own banishment to the naughty step.
Office at Home—PC is proud of his role in both the office and the home, but Mac retaliates by stating that homes are not run like offices, and thus shouldn't have office computers. PC eagerly begins to describe the ways in which homes can be run like offices, with his increasing authoritarianism prompting Mac to sarcastically comment that PC's home sounds like a fun place.
Office Posse—PC wonders why Microsoft Office (Excel, PowerPoint, Word and Entourage) are standing with Mac and is surprised when Mac says that he runs Office also. PC attempts to order and then entice the Office members to join him, but they refuse, resulting in what Mac calls an awkward moment.
Tentacle—PC praises Britain's work ethic, chastising Mac's insistence on the need for fun in life. In attempting to persuade Mac of his point of view, PC employs the use of several animal metaphors, but becomes sidetracked through his increasingly eager musing about the practical applications of octopus tentacles in an office.
Several American ads were modified for the UK market. In some of these ads, the events that occur in the narrative differ significantly from the original American campaign. Others follow the original ads more closely, with only minor differences (many based on the differences in characterization from the actors involved or language differences between American English and British English). These ads are also performed by Mitchell and Webb.
The adapted ads are
Accident—The ad follows the same narrative, with a different ending: PC, clearly heavily drugged, requests to be pushed over to the window so he can look at the pigeons, only for Mac to point out that there are no pigeons nor a window. PC responds with a dreamy "You're funny...."
Network—The ad follows the same narrative, but in the British version Mac connects with a Japanese printer instead of a digital camera. PC is also more involved in the dialogue, attempting to communicate in Japanese with the printer, only to mangle his words, first declaring that he is a rice cake before asking, "Where is the train station?" This larger involvement of PC, when compared to PC in the American ad, is also shown by the appearance of subtitles whenever PC, Mac, or the printer speak in Japanese; in the American ad, there are no subtitles translating Mac and the camera's dialogue, further evidencing that PC is lost in the conversation.
Out of the Box—The ad is almost exactly the same as the American version. However, Mac doesn't mention his built-in camera. Also, at the end, PC pulls out an extremely thick user manual and starts reading it.
Pie Chart—The ad is based on the American Work vs. Home. The light-grey area of PC's family holiday pie chart now represents shenanigans and tomfoolery, and the dark-grey area represents hijinks. Also, PC further divides hijinks into capers, monkey business, and just larking about.
Restarting—The ad follows much the same narrative as the American ad, with the only major difference being that after Mac has left to get someone from IT, PC awakens and wonders where everyone has gone.
Stuffed—This ad contains no significant changes from the American version.
Trust Mac—The ad follows the same narrative as the American version, but at the end, PC yells out that there is nobody present but two Macs having fun.
Virus—Based on the American ad Viruses, it contains the dialogue "This one's a humdinger" instead of "a doozy" but otherwise contains no significant changes.
Japanese campaign
On December 12, 2006, Apple began to release ads in Japan that were similar in style to the US Get a Mac ads. The Mac and PC are played by the Rahmens, a Japanese comedy duo. The ads used to be viewable at Apple's Japan website.
The following ads are exclusive to Japan:
Nengajo—Mac shows PC the New Year's Card he made using iPhoto. PC then looks at it, remarking about the picture of the wild boar on the card.
Nicknames—PC is confused as to why Mac is not called a PC. Mac then explains that more people use him at home, and PC counters that he is more business-oriented. PC then asks for a nickname for himself; Mac then names him Wāku (work).
Practice Drawing—PC says he can create pictures, but they are all graphs. For example, what Mac thinks is Manhattan is a bar graph and what Mac thinks is a mountain view is a line graph. Mac catches on, correctly identifying a pie chart, but PC responds that it is a pizza, chiding Mac for having no artistic sense. This is similar to Art Language, in that PC is trying to connect with artsy people like Mac.
Steps—Mac tells PC that he has made his own webpage using iWeb. PC then asks for the steps to make his own. Mac gives them, finishing after step three. PC then pesters Mac for step four, which Mac finally explains is to have a cup of coffee.
Several American ads were modified for the Japanese market. In some of these ads, the events that occur in the narrative differ significantly from the original American campaign. Others follow the original ads more closely, with only minor differences (many based on the differences in characterization from the actors involved).
The adapted ads are
Bloated—This ad is similar to Stuffed, but in this ad, PC makes no reference to bloatware (limited or useless versions of programs loaded onto new PCs), instead complaining about how much space installing a new operating system takes. Mac expresses his hopes that PC didn't have to delete any important data.
iLife—This ad is almost exactly the same as the American version, except that PC is listening to Eurobeat on his iPod rather than slow jams, and Mac gives a pregnant pause instead of complimenting PC on his pre-loaded calculator and clock.
iMovie—This ad, with Miki Nakatani, is nearly identical to the American ad Better Results, except that PC actually thinks that his home movie is comparable to the Mac home movie.
Microsoft Office—Based on the UK ad Office Posse, the ad contains only minor differences. At the end of the ad, PC tries to entice Office by chanting, "Overtime! Overtime! All together now!"
Pie Chart—This ad is based on the American ad Work vs. Home. The narrative is largely the same, with the only significant differences being that Mac is blogging rather than working with movies, music, and podcasts, and that the divisions of the pie chart represent Sightseeing and Relaxing at a Café.
Restart—This ad is identical to the American ad Restarting, except that PC doesn't restart again after Mac goes off to get IT.
Security—This ad is based on the American ad Trust Mac, but contains some significant changes. Rather than disguising himself to hide from viruses, PC dons protective gear to fight viruses. PC demands that any virus out there come and fight him. After Mac points out a virus, PC slowly moves behind Mac to protect himself.
Virus—The ad contains no significant changes from the American ad Viruses.
Keynote videos
While not strictly a part of the ad campaign, Hodgman and Long appeared in videos during Steve Jobs's keynote addresses at the 2006, 2007, and 2009 Worldwide Developers Conference and the 2008 MacWorld Expo. Hodgman also appeared in the November 2020 Apple Event.
WWDC 2006—In an attempt to stall Mac development, PC claims to have a message from Steve Jobs that says that the developers should take the rest of the year off, and that Microsoft could use some help with Vista. He starts to go off-topic about his vacation with Jobs, but when Mac arrives he says he's just preparing for their next commercial and starts to sing the Meow Mix theme song off-key.
WWDC 2007—PC dresses up as Steve Jobs, and announces that he is quitting and shutting down Apple. He claims that Vista did so well, selling tens of dozens of copies, that there's no need for Leopard, and that he got his iPod-killer, a brown Zune. He tells the developers to just go home because they're no longer needed. Mac arrives and chides PC for trying to mislead the developers again like last year. He asks if PC really thinks the audience will believe he is Jobs. PC then claims he is Phil Schiller.
MacWorld Expo 2008—PC and Mac stand under a Happy New Year sign, and PC talks about what a terrible year 2007 has been for him, referring to Windows Vista as a failure while Apple Inc. experienced success with Mac OS X Leopard, iPod Touch, and iPhone. Despite this, PC says he is optimistic for the future, claiming it to be the Year of the PC. When asked what his plans are for 2008, PC states he is "just going to copy everything [Mac] did in 2007."
WWDC 2009—PC comes out, greets the crowd and says that he wants them to have a great conference with "incredible innovations that will keep Apple at the forefront..." He stops, then says, "I think I can do that better." On take 2, he instead wishes them a "week with some innovation, but not a lot, please. Yeah, I like that." He then brings up the 1 Billion App Countdown and asks the developers for apps and ideas, saying, "I hope you're thinking of some great ideas because I'm thinking of some great ideas too!...What are your ideas?" Eventually, at take 16, PC gives up and Mac tells everyone to have a great conference.
Apple Event November 2020—PC criticizes the upgrades made to the MacBook Air earlier in the event.
Release dates (U.S. campaign)
The different spots were released gradually:
The original set of ads (Viruses, Restarting, Better, iLife, Network, and WSJ) was launched on May 2, 2006.
Work vs. Home, Touché, and Out of the Box were released on June 12, 2006.
Accident, Angel/Devil and Trust Mac were released for the campaign on August 27, 2006, during the 2006 Primetime Emmy Awards.
In September 2006, three new commercials made their debut on Canadian television; one of them (Better Results) featured Gisele Bündchen alongside Hodgman and Long and had been sighted at certain Apple Stores. They were published on Apple's website on October 9, 2006.
In October 2006, three new ads, Better Results, Counselor, and Self Pity, were sighted on U.S. network TV.
In late November 2006, three new ads were released: Gift Exchange, Sales Pitch, and Meant for Work.
On December 19, 2006, the ad Goodwill was released on apple.com. Wall Street Journal disappeared from the See All the Ads section afterward (but is still on the site).
On January 9, 2007, with the introduction on Macworld 2007, Surgery was added, and Network was removed from the menu.
On January 16, 2007, Sabotage and Tech Support were added, and the 2006 holiday ads (Gift Exchange and Goodwill) and Better were removed. Network was added once again.
On February 6, 2007, Security was added.
On February 7, 2007, Gift Exchange, Goodwill, and Better were re-added, meaning that all of the U.S. campaign ads except for Wall Street Journal could be seen at apple.com/getamac/ads.
On April 11, 2007, Computer Cart and Flashback were added.
On April 14, 2007, The Stuffed ad was added.
On May 7, 2007, Choose a Vista, Genius, and Party Is Over were added.
On November 11, 2007, PR Lady, Boxer, and Podium were added. Network, iLife, and Restarting were no longer on the menu.
In November 2007, an internet-only ad, Sign, was sighted.
On December 4, 2007, Misprint was added.
On December 6, 2007, Now What? was added.
On December 13, 2007, a fully claymation Santa Claus ad was added.
On January 6, 2008, Referee was added in conjunction with the beginning of the NFL playoffs.
On January 13, 2008, Time Machine was added.
As of January 25, 2008, the Web-exclusive ad Not was sighted on the Yahoo! News opening page. It also appeared at the New York Times site and elsewhere.
On April 1, 2008, Breakthrough and Yoga were added.
On April 9, 2008, Office Stress was added.
On May 12, 2008, Group and Pep Rally were added.
On May 13, 2008, Sad Song was added.
On August 18, 2008, Calming Teas, Throne, Pizza Box, and Off the Air were added.
On October 19, 2008, Bean Counter and V Word were added.
On October 20, 2008, Bake Sale was added.
On December 16, 2008, Tree Trimming and I Can Do Anything were added.
On April 19, 2009, Time Traveler, Stacks, Legal Copy, and Biohazard Suit were added.
On May 12, 2009, Elimination, PC Choice Chat, and Customer Care were added.
On August 25, 2009, Surprise and Top of the Line were added.
On August 29, 2009, Trainer was added.
On September 11, 2009, PC Innovation Lab was added.
On October 23, 2009, Broken Promises, Teeter Tottering and PC News were added on Windows 7 launch day.
Effectiveness
Before the campaign's launch, Apple had seen lower sales in 2005–06. One month after the start of the "Get a Mac" campaign, Apple saw an increase of 200,000 Macs sold, and at the end of July 2006, Apple announced that it had sold 1.3 million Macs. Apple had an overall increase in sales of 39% for the fiscal year ending September 2006.
Criticism
In an article for Slate magazine, Seth Stevenson criticized the campaign as being too "mean-spirited", suggesting, "isn't smug superiority (no matter how affable and casually dressed) a bit off-putting as a brand strategy?".
In an article in The Guardian, Charlie Brooker points out that the use of the comedians Mitchell and Webb in the UK campaign is curious. They both star in the sitcom Peep Show in which, to quote the article's author, "Mitchell plays a repressed, neurotic underdog, and Webb plays a selfish, self-regarding poseur... So when you see the ads, you think, 'PCs are a bit rubbish yet ultimately lovable, whereas Macs are just smug, preening tossers.'"
Differentiating between a Mac and a PC
Many computer experts have argued over the definition of "PC", or personal computer, which raises questions about the actual differentiation between a Mac and a PC. Lance Ulanoff, editor-in-chief of PC Magazine, stated in a 2008 column, "Of course, the ads would then be far less effective, because consumers might realize that the differences Apple is trying to tout aren't quite as huge as Apple would like you to believe."
Criticism of the "Network" (Japanese camera) advertisement
The commercial "Network" has been criticized as projection and blame shifting, because Apple itself is known for proprietary systems and vendor lock-in, isolated from open standards (for example, no support for Bluetooth file sharing, Wi-Fi Direct, or NFC file sharing, and no standard USB charging connector on the iPhone), limited modularity (no microSD-expandable storage in iPhones), historically poor support for codecs and file systems other than those developed by Apple, and poor repairability (historically low iFixit scores).
I'm a PC
Microsoft responded to the Get a Mac advertising campaign in late 2008 by releasing the I'm a PC campaign, featuring Microsoft employee Sean Siler as a John Hodgman look-alike. While Apple's ads show personifications of both Mac and PC systems, the Microsoft ads show PC users instead proudly defining themselves as PCs.
In popular culture
Videos parodying the Get a Mac campaign have been published online by Novell to promote Linux, which is represented by a young and fashionable woman. A different set of videos parodying the campaign has been produced, but with Linux portrayed as a typical male nerd.
To promote Steam on Mac, Valve made a parody with Portal and Team Fortress 2 sentry guns.
After the 2007–2008 Writers Guild of America strike, the cast and crew of the American television show Numb3rs decided to parody the "Get a Mac" commercials to promote the return of the show on Friday, April 4, 2008. In the ad, brothers Don Eppes (Rob Morrow) and Dr. Charlie Eppes (David Krumholtz) debate the merits of being an FBI agent versus being a mathematician. The cast and crew used two hours of production time to film the 34-second ad.
The Get a Mac campaign became the basis for the long-running YouTube series Hi, I'm a Marvel...and I'm a DC by ItsJustSomeRandomGuy. The series took classic superhero characters from Marvel Comics and DC Comics and compared their film adaptations. Marvel was consistently touted as superior for having more successful film adaptations of its characters than DC, which not only had fewer adaptations but also several that were critically or commercially panned.
Late Show with David Letterman made parodies of the Get a Mac campaign, from Mac's wig being taken off by PC to reveal baldness, to Mac as David Hasselhoff eating a cheeseburger drunk.
The Comedy World theme of the website GoAnimate includes two characters modeled after the main characters.
An episode of Air Farce Live, aired around the time of the Canadian federal election, had a sketch in which one of the comedians was introduced as a Liberal and the other as a PC (Progressive Conservative Party of Canada). The sketch was split into separate parts during the episode.
City of Heroes offered a series of online video parodies, each a commercial featuring dialogue between two machinima characters. They all start the same way: one proclaiming "I'm a hero" and the other proclaiming "I'm a villain." The videos were made to promote the new Mac edition of the game for OS X computers, released in February 2009.
Instant Star and Degrassi: The Next Generation were in a parody where they would describe their own shows. Alexz Johnson portrayed Instant Star (Johnson portrays Jude Harrison in the show) and Miriam McDonald portrayed Degrassi (McDonald portrays Emma Nelson in the show).
SuperNews! made two shorts based on the "Get a Mac" ads, featuring Bill Gates and Steve Jobs fighting each other. Before all of the videos were removed from Current's YouTube channel by Al Jazeera Media Network, the first short was the most-viewed video on that channel, with over 3,000,000 views.
A Funny or Die promo video for the release of John Hodgman's book That Is All includes a segment in which Hodgman walks through a 'void' room in his deranged millionaire mansion. Justin Long sits alone in the white open space from the Get A Mac ads, happy to see Hodgman again and eager to make another commercial.
In the wake of the Mac transition to Apple silicon, in March 2021, Intel made a similar advertising campaign, known as Justin Gets Real, featuring Justin Long promoting Intel PCs over Macs.
See also
Apple Switch ad campaign
Apple evangelist
Cola Wars
Comparative advertising
References
Apple Inc. advertising
American television commercials
Advertising campaigns
2006 introductions
American advertising slogans
2006 neologisms
2000s television commercials |
8041752 | https://en.wikipedia.org/wiki/Commercial%20use%20of%20copyleft%20works | Commercial use of copyleft works | Commercial exploitation of copyleft works differs from the traditional commercial exploitation of Intellectual Property Rights (IPR). The economic focus tends to be on monetizing other scarcities and complementary goods rather than the free content itself. One way to make money with copylefted works is to sell consultancy and support to users of the copylefted work. Generally, financial profit is expected to be much lower in a "copyleft" business than in a business using proprietary works. Another way is to use the copylefted work as a commodity tool or component to provide a service or product. Android phones, for example, are based on the Linux kernel. Firms with proprietary products can make money by exclusive sales, by single and transferable ownership, and by litigation rights over the work.
Internal use
Businesses and governments can obtain value and cut costs by using copyleft software internally. See for example Linux adoption.
Development
By building on existing free software, businesses can reduce their development costs. With copylefted software, the business has the disadvantage that selling licences is rarely possible (because anyone can distribute copies at no cost), but it has the advantage that competitors cannot incorporate the improved version into a product and distribute it without also making their modifications available to the original distributor, thereby avoiding a type of free rider problem.
Copyleft enables volunteer programmers and organizations to feel involved and contribute to software and feel confident any future derivatives will remain accessible to them, and that their contributions are part of a larger goal, like developing the kernel of an operating system (OS). Copylefting software makes clear the intent of never abusing or hiding any knowledge that is contributed. Copyleft also ensures that all contributing programmers and companies cannot fork proprietary versions to create an advantage over another.
The argument for investment in research and development by copyleft businesses may seem weak, since the business has no exclusive claim to the profits gained from the result. Economically, copyleft has been argued to be the only mechanism able to compete with monopolistic firms that rely on financial exploitation of copyright, trademark and patent laws.
Distribution
Commercial distributors of Linux-based systems (like Red Hat and Mandriva) might have had some ups and downs in finding a successful construction (or business model) for setting up such businesses, but in time it was shown to be possible to base a business on a commercial service surrounding a copylefted creation. One well-known example is Mandrake, which was one of the first companies to succeed on the stock market after the implosion of large parts of the IT market in the early 21st century. They also had success in convincing government bodies to switch to their flavor of Linux.
However, excluding some notable exceptions like the Debian Project (which is expressly noncommercial and committed to free software on principle), most Linux distributors don't actively seek to limit their usage of proprietary software or restrict the proliferation of non-free licenses in connection with their distributions. There appears to be no real reason why an exploitation of commercial services surrounding copylefted creations would not be possible in small-scale business, which as a business concept is no more complex than making money with a "public domain" recipe for brewing coffee—successfully exploited by so many cafeteria owners. However, there are few examples so far of SMEs having risked such a leap for their core business. UserLinux, a project set up by Bruce Perens, supports the emergence of such small-scale business based on free software, that is, copylefted or otherwise freely licensed computer programs. The UserLinux website showcased some case studies and success stories of such businesses.
Art
In art, making commercial services out of a copylefted creation is more difficult to do in practice than in software development. Public performances could be considered as one of a few possibilities of providing such "services".
The music industry objected to peer-to-peer file exchanging software, but the Electronic Frontier Foundation (EFF) gave some suggestions to resolve the issue.
Those who object to the appropriation of ideas for commerce believe that intellectual works should not be compared to material property. Giving someone a physical object means losing possession and control of that thing, which can justify asking for something in return, whether payment or barter. But when someone gives an idea to another person, they lose nothing and need not ask for anything in return.
Often, copylefted artistic creations can be seen to have a (supporting) publicity function, promoting other, more traditionally copyrighted creations by the same artist(s). Artists sticking to an uncompromising copylefting of the whole of their artistic output could, in addition to services and consultancy, revert to some sort of patronage (sometimes considered as limiting artistic freedom), or to other sources of income not related to their artistic production (which mostly limits the time they can devote to artistic creation as well). The least that can be said is that copylefting in art tends toward keeping the art thus produced as much as possible out of the commercial arena, which is considered an intrinsically positive goal by some.
Some artists, such as Girl Talk and Nine Inch Nails, use copyleft licenses such as the Creative Commons Attribution-NonCommercial-ShareAlike license, which does not allow commercial use. In this way they can choose to sell their creations without having to compete with others selling copies of the same works. However, some argue that the Attribution-NonCommercial-ShareAlike license is not a true copyleft.
Where copylefted art has a large audience of modest means or a small audience of considerable wealth, the act of releasing the art may be offered for sale. See Street Performer Protocol. This approach can be used for the release of new works, or can be used for the conversion of proprietary works to copylefted works. See Blender.
See also
Business models for open source software
Open-source economics
Commons-based peer production
References
Copyright law
Copyleft
Economics of intellectual property |
65061820 | https://en.wikipedia.org/wiki/Mark%20Nixon%20%28academic%29 | Mark Nixon (academic) | Mark S. Nixon is an author, researcher, editor and an academic. He is the former president of IEEE Biometrics Council, and former vice-Chair of IEEE PSPB. He retired from his position as Professor of Electronics and Computer Science at University of Southampton in 2019.
Nixon's main research interests include the use of gait and the ear as biometrics and the use of soft biometrics for identification. He has served the International Association for Pattern Recognition (IAPR) in several offices. Nixon has authored four books, including Feature Extraction and Image Processing for Computer Vision, Human Identification Based on Gait and Introductory Digital Design. He has around 20,000 citations.
Nixon became a BMVA Distinguished Fellow in 2015. He is a Fellow of the International Association of Pattern Recognition.
Education
Nixon graduated from University of Reading in BSc Cybernetics Science with Subsidiary Mathematics in 1979. He completed his doctoral studies in Applied Estimation Theory from the same university in 1983.
Career
Nixon joined University of Southampton in 1983 and became a Professor of Electronics and Computer Science; he was awarded Personal Chair in 2001 and retired in 2019.
Nixon has held several leadership and management positions over the course of his career. He served as President of the IEEE Biometrics Council from 2017 to 2018. He was appointed as Chair of the International Association of Pattern Recognition (IAPR), while also serving as a member of its Nominating Committee and Advisory Committee. He has served as vice-chair of the IEEE PSPB since 2018. He is an advisory editor for Pattern Recognition Letters.
Research
Nixon's research on biometrics and computer vision has continued for over 20 years, focusing on the use of human gait as a biometric and on the use of ears in biometrics. In his initial work on face recognition, his team applied new techniques for shape extraction and description, and more recently formulated them for moving objects. This research has also been applied to medical imagery and to remotely sensed image analysis. Nixon's research approaches operate on spatial images, video and 3D data.
Nixon's later research focused on mechanisms that seek to cross the semantic gap: learning human descriptions from computer vision features and, conversely, learning image features from human descriptions. He also helped pioneer soft biometrics, in which human descriptions are used for biometric purposes. In recent years, Nixon and his team have described how subjects can be recognized from human descriptions (attributes) of their body, face and clothing. Nixon's work on the fusion of soft biometrics enables the identification of, and search for, subjects in video material and also has an impact on eyewitness procedures.
Nixon has also worked on developing a new approach to analyzing acceleration in image sequences, based on the observation that most approaches to analyzing velocity subsume many types of motion. By separating out acceleration, new capabilities emerge, such as using rotational acceleration to find a walking subject's feet, or detecting violent actions, since acceleration is innate to such acts.
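A simple way to see the idea is to difference the motion field between consecutive frame pairs. The sketch below is a generic illustration using OpenCV's dense optical flow, not a reproduction of Nixon's published method; the frame variables are placeholders.

```python
# Generic illustration (not Nixon's published algorithm): estimate a per-pixel
# acceleration field by differencing dense optical flow over three frames.
import cv2

def dense_flow(prev_gray, next_gray):
    # Farneback dense optical flow: one velocity vector per pixel
    return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

def acceleration_field(frame0, frame1, frame2):
    g0, g1, g2 = (cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in (frame0, frame1, frame2))
    v01 = dense_flow(g0, g1)   # velocity between frames 0 and 1
    v12 = dense_flow(g1, g2)   # velocity between frames 1 and 2
    return v12 - v01           # finite-difference acceleration per pixel
```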
Nixon was featured in the IAPR newsletter and in Biometric Technology Today in 2016. His work received extensive media coverage in 2018 when it was applied in a notorious case in Australia to identify a subject who cased a shop and then murdered its owner.
Nixon's book Feature Extraction and Image Processing for Computer Vision, published in 2002, was the first book concentrating solely on the topic. He was awarded the Notable Book Award by Computing Reviews in 2012 for the book. According to Elisa Smith of the IAPR newsletter, the book contains "a broad overview of the field presented at a level of depth aimed at those who are new to the field". The book is praised for its citation of "an overwhelming and impressive number of books, conference, and journal articles on a broad range of topics". She also stated that the book could be used "as a good framework to facilitate a broad range of discussion topics".
Awards and honors
2008 - Fellow, International Association of Pattern Recognition (IAPR)
2012 - Notable Book Award, Computing Reviews
2015 - BMVA Distinguished Fellow
Bibliography
Books
Introductory Digital Design - A Programmable Approach - 1995
Feature Extraction and Image Processing for Computer Vision - 2002
Human identification based on gait, MS Nixon, T Tan, R Chellappa – 2010
Digital Electronics: A Primer - 2015
Selected articles
D. Cunado, M. S. Nixon and J. N. Carter, Automatic Extraction and Description of Human Gait Models for Recognition Purposes, Computer Vision and Image Understanding, 90(1), pp. 1–41, 2003
D. J. Hurley, M. S. Nixon and J. N. Carter, Force Field Feature Extraction for Ear Biometrics, Computer Vision and Image Understanding, 98(3), pp. 491–512, 2005
D. Cunado, M. S. Nixon and J. N. Carter, Using Gait as a Biometric, via Phase-Weighted Magnitude Spectra, In: J Bigun, G. Chollet and G. Borgefors Eds.: Lecture Notes in Computer Science, 1206 (Proceedings of 1st Int. Conf. on Audio- and Video-Based Biometric Person Authentication AVBPA97), pp. 95–102, 1997
J. D. Shutler, M. G. Grant, M. S. Nixon, and J. N. Carter, On a Large Sequence-Based Human Gait Database, A. Lotfi, J. M. Garibaldi Eds., Applications in Science and Soft Computing, Springer, pp. 339–346, 2003
M. S. Nixon, J. N. Carter, M. G. Grant, L. G. Gordon and J. B. Hayfron-Acquah, Automatic Recognition by Gait: Progress and Prospects, Sensor Review, 23(4), 323-331, 2003
S. R. Gunn and M. S. Nixon, A Robust Snake Implementation; A Dual Active Contour, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(1), pp. 63–67, 1997
S. V. Stevenage, M. S. Nixon and K. Vince, Visual Analysis of Gait as a Cue to Identity, Applied Cognitive Psychology, 13(6), pp. 513–526, 1999
D. K. Wagg and M. S. Nixon, On Automated Model-Based Extraction and Analysis of Gait, IEEE Face and Gesture Analysis ‘04, Seoul (Korea), pp. 11–16, 2004
A. J. Tatem, H. G. Lewis, P. Atkinson and M. S. Nixon, Super-Resolution Land Cover Pattern Prediction using a Hopfield Neural Network, Remote Sensing of Environment, 79(1), pp. 1–14, 2002
J. B. Hayfron-Acquah, M. S. Nixon and J. N. Carter, Automatic Gait Recognition by Symmetry Analysis, Pattern Recognition Letters, 24(13), pp. 2175–2183, 2003 (Invited from AVBPA 2001)
P. S. Huang, C. J. Harris and M. S. Nixon, Recognising Humans by Gait via Parametric Canonical Space, Proc. International ICSC Workshop on Engineering of Intelligent Systems EIS’98, 3, pp. 384–389, 1998
P. S. Huang, C. J. Harris and M. S. Nixon, Human Gait Recognition in Canonical Space using Temporal Templates, IEE Proceedings Vision Image and Signal Processing, 146(2), pp. 93–100, 1999
References
Living people
1958 births
Fellows of the International Association for Pattern Recognition
British engineers
Alumni of the University of Reading |
829828 | https://en.wikipedia.org/wiki/Picasa | Picasa | Picasa was a cross-platform image organizer and image viewer for organizing and editing digital photos, integrated with a now defunct photo-sharing website, originally created by a company named Lifescape (which at that time was incubated by Idealab) in 2002. "Picasa" is a blend of the name of Spanish painter Pablo Picasso, the phrase mi casa (Spanish for "my house") and "pic" for pictures.
Native applications for Windows XP, Windows Vista, Windows 7, and macOS were available; for Linux, the Windows version was bundled with the Wine compatibility layer. An iPhoto plugin and a standalone program for uploading photos were available for Mac OS X 10.4 and later.
In July 2004, Google acquired Picasa from Lifescape and began offering it as freeware. On February 12, 2016, Google announced it was discontinuing support for Picasa Desktop and Web Albums, effective March 15, 2016, and focusing on the cloud-based Google Photos as its successor. Picasa Web Albums, a companion service, was closed on May 1, 2016.
Version history
Windows
The final version of Picasa for Windows is 3.9, which supports Windows XP, Windows Vista, and Windows 7, and has Google+ integration for users of that service. Version 3.9 also removed integration with Picasa Web Albums for users of Google+.
Linux
Since June 2006, Linux versions have become available as free downloads for most distributions of the Linux operating system. It is not a native Linux program but an adapted Windows version that uses the Wine libraries. Google has announced that there will be no Linux version for 3.5. Currently, Google has only officially offered Picasa 3.0 Beta for Linux.
On April 20, 2012, Google announced that they were deprecating Picasa for Linux and will no longer maintain it for Linux.
To use the latest version of Picasa on Linux, users can install the Windows version of Picasa under Wine. Linux users can also use other programs to upload to Picasa Web Albums, including Shotwell and digiKam.
Mac OS X
On January 5, 2009, Google released a beta version of Picasa for Mac (Intel-based Macs only). A plugin is also available for iPhoto to upload to the Picasa Web Albums hosting service, as is a standalone Picasa Web Albums uploading tool for OS X 10.4 or later. Picasa for Mac is a Google Labs release.
Features
Organization and editing
For organizing photos, Picasa has file importing and tracking features, as well as tags, facial recognition, and collections for further sorting. It also offers several basic photo editing functions, including color enhancement, red eye reduction, and cropping. Other features include slide shows, printing, and image timelines. Images can also be prepared for external use, such as for e-mailing or printing, by reducing file size and setting up page layouts. There is also integration with online photo printing services. Other simple editing features include adding text to the image. Picasa supports Google's WebP image format as well as the JPG format and most Raw image format (RAW files). A user can view and edit RAW files and save the finished edit (as JPG, or other forms) without any changes to the original RAW file.
Keywords
Picasa uses picasa.ini files to keep track of keywords for each image. In addition to this, Picasa attaches IPTC Information Interchange Model (IPTC) keyword data to JPEG files, but not to any other file format. Keywords attached to JPEG files in Picasa can be read by other image library software like Adobe Photoshop Album, Adobe Bridge, Adobe Photoshop Lightroom, digiKam, Aperture, and iPhoto.
According to the Picasa Readme, Picasa can parse Extensible Metadata Platform (XMP) data. However, it cannot search local files for existing XMP keywords.
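Because the IPTC keywords Picasa writes are standard JPEG metadata, they can be read by general-purpose tools. The following minimal sketch, which is illustrative only and not part of Picasa, uses the Pillow library to list the IPTC keywords embedded in a JPEG file; the file name is a placeholder.

```python
# Illustrative sketch (not Picasa code): read IPTC keywords from a JPEG
# using Pillow. The IPTC "Keywords" dataset lives under record (2, 25).
from PIL import Image, IptcImagePlugin

def read_iptc_keywords(path):
    with Image.open(path) as im:
        iptc = IptcImagePlugin.getiptcinfo(im) or {}
    raw = iptc.get((2, 25), [])
    if isinstance(raw, bytes):        # a single keyword is returned as bare bytes
        raw = [raw]
    return [k.decode("utf-8", "replace") for k in raw]

print(read_iptc_keywords("photo.jpg"))   # "photo.jpg" is a placeholder path
```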
Searching
Picasa has a search bar that is always visible when viewing the library. Searches are live, so that displayed items are filtered as one types. The search bar will search filenames, captions, tags, folder names, and other metadata.
Picasa also has an experimental feature that allows searching for images that contain certain colors with the "color" operator.
Viewing
Picasa has no separate view window. There is only an "edit view" with a viewing area. Fullscreen view is available in slideshow mode, by holding down the ctrl+alt keys while in "edit view", or by pressing the Alt Gr key. This feature is also available through the context menu of Windows Explorer, and provides a way to start the Picasa editor as well.
Backup
In Picasa 2 and earlier versions, changes to pictures made in Picasa overwrite the original file, but a backup version of the original is saved in a hidden folder named "Originals" in the same folder as the original picture (.picasaoriginals on Mac OS X).
In Picasa 3, changes to pictures made in Picasa are saved to a hidden file picasa.ini in the same folder as the original picture. This allows multiple edits to be performed without altering the original image. Viewing the picture in Picasa or using the Picasa Photo Viewer will apply modifications on the fly, whereas viewing through other programs (such as Windows XP's Photo and Fax Viewer) will display the original image. Changes can also be made permanent using the "Save" function, where the original file is backed up in a hidden folder .picasaoriginals located in the same folder as the original picture and the modified version is written in its place.
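The picasa.ini file is a plain INI-style text file with, in practice, one section per image. The sketch below is an illustration rather than Picasa source code: it uses Python's configparser to list which images have an edit recorded under a "filters" key, where the section layout and key name are assumptions based on observed files, since the format is undocumented.

```python
# Illustrative only: the picasa.ini format is undocumented, so the section
# layout and the "filters" key used here are assumptions based on observed files.
import configparser

def edited_images(ini_path="picasa.ini"):
    cfg = configparser.ConfigParser(strict=False, interpolation=None)
    cfg.read(ini_path, encoding="utf-8")
    edits = {}
    for section in cfg.sections():            # typically one section per image filename
        if cfg.has_option(section, "filters"):
            edits[section] = cfg.get(section, "filters")
    return edits

print(edited_images())
```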
Face recognition
On August 15, 2006, Google announced it had acquired Neven Vision, whose technology can be used to search for features within photos such as people or buildings. Google applied this technology for face recognition, and this functionality was launched on Picasa Web Albums on September 2, 2008.
Neven Vision incorporates several patents specifically centered around face recognition from digital photo and video images.
Geotagging
Since June 2007, Picasa can write geographic coordinates to Exif metadata, thus geotagging an image.
Since version 3.5 of Picasa, geotagging may be done directly inside Picasa, in the Places panel.
The geotagging functionality is described in the Picasa User's Guide.
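Writing GPS coordinates into a JPEG's Exif block, as Picasa does when geotagging, can be reproduced with generic tools. The sketch below is a simplified illustration using the third-party piexif library rather than anything from Picasa itself; the file path and coordinates are placeholders, and the coordinate conversion keeps only limited precision.

```python
# Simplified illustration (not Picasa code): write GPS Exif tags with piexif.
import piexif

def to_dms_rational(value):
    # degrees/minutes/seconds as Exif rational pairs; precision is limited
    deg = int(value)
    minutes = int((value - deg) * 60)
    seconds = round((value - deg - minutes / 60) * 3600 * 100)
    return ((deg, 1), (minutes, 1), (seconds, 100))

def geotag(path, lat, lon):
    gps = {
        piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
        piexif.GPSIFD.GPSLatitude: to_dms_rational(abs(lat)),
        piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
        piexif.GPSIFD.GPSLongitude: to_dms_rational(abs(lon)),
    }
    exif_bytes = piexif.dump({"GPS": gps})
    piexif.insert(exif_bytes, path)   # note: replaces the file's existing Exif segment

geotag("photo.jpg", 48.8584, 2.2945)  # placeholder file and coordinates
```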
Other Picasa applications
Picasa Web Albums
Besides Google+, Picasa also integrated with Picasa Web Albums, an image hosting and sharing web service. The service allowed users with a Google account to store and share their photos on the service. Users with a Google+ account received unlimited storage for photos of a resolution less than 2048x2048 pixels; all others received unlimited storage for photos of a resolution less than 800x800.
Hello
Hello by Google's Picasa was a free computer program that enabled users to send images across the Internet and publish them to their blogs. It was similar to an instant messaging program in that it allowed users to send text, but Hello focused on digital photographs. Users could opt to view the same pictures as their friends in real time. One of the advantages claimed on its website was that photos could be shared through firewalls.
The service was canceled at the end of 2006, and users were instructed to try the Picasa "Blog This" functionality for uploading pictures to their blogs.
According to the official website, the Hello project was shut down on May 15, 2008.
Discontinuation
On February 12, 2016, Google announced that the Picasa desktop application would be discontinued on March 15, 2016, followed by the closure of the Picasa Web Albums service on May 1, 2016. Google stated that the primary reason for retiring Picasa was that it wanted to focus its efforts "entirely on a single photos service", the cross-platform, web-based Google Photos. While support for the desktop version of Picasa ended, Google stated that users who had downloaded the software, or who chose to download it prior to the March 15 deadline, would still be able to use its functionality, albeit with no support from Google.
See also
Google Photos
Comparison of image viewers
Desktop organizer
List of Google products
List of photo sharing websites
References
External links
Picasa Web Albums
Picasa Release Notes
2002 software
Freeware
Discontinued Google acquisitions
Discontinued Google software
Discontinued Google services
Image organizers
Image sharing websites
Software derived from or incorporating Wine
Windows graphics-related software
MacOS graphics-related software
2014 mergers and acquisitions |
2374317 | https://en.wikipedia.org/wiki/Par%20%28command%29 | Par (command) | The computer program par is a text formatting utility for Unix and Unix-like operating systems, written by Adam M. Costello as a replacement for the fmt command.
Par reformats paragraphs of text to fit into a given line length optimally, keeping prefixes and suffixes intact, which is useful for formatting source code comments. It also understands the conventions commonly used for quoting in email replies, and is capable of intelligently reformatting these several levels deep while rewrapping the text they quote.
Par can be invoked from text editors such as Vim or Emacs. To support Unicode, par needs to be compiled with a patch that adds multi-byte character support, typically for the UTF-8 encoding. Unlike fmt, par also supports text justification.
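As an illustration of how par is typically driven from other programs, the following sketch pipes text through the par binary from Python; it assumes par is installed and on the PATH, and uses the w option to set the target line width.

```python
# Minimal sketch: rewrap text by piping it through par (assumes the binary is installed).
import subprocess

def rewrap(text, width=60):
    # "w<width>" sets par's target line width (e.g. "w60")
    result = subprocess.run(["par", f"w{width}"], input=text,
                            capture_output=True, text=True, check=True)
    return result.stdout

reply = "> This quoted reply line is much too long and will be rewrapped by par while the '> ' quote prefix is preserved on every output line.\n"
print(rewrap(reply))
```

A common editor configuration, described in the Vim links below, is to point Vim's formatprg option at par so that the gq command reformats text through it.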
References
Costello, Adam M. (2001). "par.doc". Accessed August 4, 2005.
External links
Add multi-byte character support to par (Patch author's website). Accessed April 15, 2015.
Vimcasts.org: formatting text with par
Vim wikia: Par text reformatter
Unix text processing utilities |
17310927 | https://en.wikipedia.org/wiki/Service%20Component%20Architecture | Service Component Architecture | Service Component Architecture (SCA) is a software technology designed to provide a model for applications that follow service-oriented architecture principles. The technology, created by major software vendors, including IBM, Oracle Corporation and TIBCO Software, encompasses a wide range of technologies and as such is specified in independent specifications to maintain programming language and application environment neutrality. It is often used together with an enterprise service bus (ESB).
History
The original partners announced on November 30, 2005 were: BEA Systems, IBM, IONA Technologies, Oracle Corporation, SAP AG, Sybase, Xcalia and Zend Technologies.
Additional members announced on July 26, 2006 were Cape Clear, Interface21, Primeton Technologies, Progress Software, Red Hat, Rogue Wave Software, Software AG, Sun Microsystems and TIBCO Software.
Siemens AG joined the collaboration of companies working on the technology on September 18, 2006.
In addition to the partners, the SCA community had some formal supporters.
Definition
On March 21, 2007, the OSOA Collaboration released the first version of specification.
The specifications said that an application designed with SCA should have:
Decoupling of application business logic from the details of its invoked service calls
Target services in a multitude of languages including C++, Java, COBOL, and PHP as well as XML, BPEL, and XSLT
The ability to work with various communications constructs including one-way, asynchronous, call-return, and notification
The ability to "bind" to legacy components or services, accessed normally by technologies such as Web Services, EJB, JMS, JCA, RMI, RPC, CORBA and others
The ability to declare (outside of business logic) the quality of service requirements, such as security, transactions and the use of reliable messaging
Data could be represented in Service Data Objects
SCA, therefore, was promoted to offer flexibility for composite applications, flexibly incorporating reusable components in an SOA programming style.
Research and advisory firm Gartner Group published a short brief in December 2005 that promoted SCA and its companion technology, Service Data Objects (SDO).
Advantages:
caters for all existing Java platform technologies and C++
less technology dependence – does not have to rely on the Java programming language or XML
Service Data Objects is a technology specification for data access
Disadvantages:
Specification does not address performance of SOA applications, which continues to be a detractor of adoption.
Focusing on portability (instead of interoperability), making it vulnerable to repeating CORBA's mistakes.
SCA was said to provide interoperability through an approach called "Activation". It is the method that provides the highest degree of component autonomy, compared to older "mediation" (e.g., JBI) or "Invocation" method used in JCA, as explained by an architect at SAP.
Artifacts
The SCA Assembly Model consists of a series of artifacts, which are defined by elements contained in XML files. An SCA runtime may have other non-standard representations of the artifacts represented by these XML files, and may allow for the configuration of systems to be modified dynamically. However, the XML files define the portable representation of the SCA artifacts.
The basic artifact is the composite, which is the unit of deployment for SCA and which holds services that can be accessed remotely. A composite contains one or more components, which contain the business function provided by the module. Components offer their function as services, which can either be used by other components within the same module or which can be made available for use outside the module through Entry Points. Components may also depend on services provided by other components — these dependencies are called references. References can either be linked to services provided by other components in the same module, or references can be linked to services provided outside the module, which can be provided by other modules. References to services provided outside the module, including services provided by other modules, are defined by External Services in the module. Also contained in the module are the linkages between references and services, represented by wires.
A component consists of a configured implementation, where an implementation is the piece of program code implementing business functions. The component configures the implementation with specific values for settable properties declared by the implementation. The component can also configure the implementation with wiring of references declared by the implementation to specific target services.
Composites are deployed within an SCA System. An SCA System represents a set of services providing an area of business functionality that is controlled by a single organization. As an example, for the accounts department in a business, the SCA System might cover all financial-related functions, and it might contain a series of modules dealing with specific areas of accounting, with one for customer accounts and another dealing with accounts payable. To help build and configure the SCA System, Composites can be used as component implementations, in the same way as Java classes or BPEL processes. In other words, SCA allows a hierarchy of composites that is arbitrarily deep – such a nested model is termed recursive.
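The relationships between these artifacts can be summarised schematically. The following Python sketch is purely illustrative and is not an SCA API or runtime: it simply models composites, components, services, references and wires as data, with all class, field and example names invented for clarity.

```python
# Schematic illustration only; not an SCA runtime or standard API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Service:                 # function a component offers to others
    name: str

@dataclass
class Reference:               # dependency on a service provided elsewhere
    name: str

@dataclass
class Component:               # a configured implementation (Java class, BPEL process, ...)
    name: str
    services: List[Service] = field(default_factory=list)
    references: List[Reference] = field(default_factory=list)

@dataclass
class Wire:                    # links a component's reference to a target service
    source: str                # "Component/reference"
    target: str                # "Component/service"

@dataclass
class Composite:               # unit of deployment; may itself serve as an implementation
    name: str
    components: List[Component] = field(default_factory=list)
    wires: List[Wire] = field(default_factory=list)

billing = Composite(
    name="Billing",
    components=[
        Component("InvoiceService",
                  services=[Service("createInvoice")],
                  references=[Reference("taxRates")]),
        Component("TaxRateProvider", services=[Service("lookup")]),
    ],
    wires=[Wire("InvoiceService/taxRates", "TaxRateProvider/lookup")],
)
```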
The capture and expression of non-functional requirements, such as security, is an important aspect of service definition, and affects SCA throughout the life-cycle of components and compositions. SCA provides the Policy Framework to support specification of constraints, capabilities and Quality of Service (QoS) expectations, from component design through to concrete deployment.
Transition to a standards body
After several years of incubation under an informal industry collaboration, early (V1.0) implementations of the specification are now coming to market. The collaboration partners indicated that formal industry standardization would be the appropriate next step and announced their intentions in March 2007. The chosen Standards Development Organization is the OASIS organization, and a new OASIS Open CSA Member Section has been established. Charters for six new Technical Committees (TCs) have been submitted to OASIS and a Call for Participation for Technical Committee members has been issued within the OASIS organization. The Technical Committees planned to start their work in September 2007. Participation in these OASIS SCA TCs remains open to all companies, non-profit groups, governments, academic institutions, and individuals. Archives of the work will be accessible to both members and non-members, and OASIS will offer a mechanism for public comment.
See also
Apache ServiceMix
Business Process Model and Notation (BPMN)
Docker (software)
Enterprise application integration (EAI)
Mule (software)
Open ESB
Reactive programming
Semantic service-oriented architecture (SSOA)
Service-oriented modeling
Windows Communication Foundation (WCF)
References
Further reading
Understanding SCA from experts Jim Marino and Michael Rowley
SOA for the Business Developer: Concepts, BPEL, and SCA—
Apache Tuscany in Action,
Open Source SOA,
External links
Mail of 2013-02-19 announcing the closure of the OASIS SCA Assembly technical committee
Mail of 2015-02-20 from Jim Marino discussing Oracle ending its support for SCA
NetBeans SOA Composite Application Project Home
camelse
Running Apache Camel in OpenESB
Introduction to programming for SCA Dr. Dobb's
Apache Tuscany – Open Source implementation of the SCA specification
SALT - Enterprise ready SCA runtime for C++, Python, Ruby, and PHP
PocoCapsule for WS and SCA An open source C++ SCA container based on inversion-of-control (IoC) and domain-specific-modeling (DSM)
Newton open source distributed SCA & OSGi runtime
A French public research project, which includes a SCA runtime called FraSCati
SCA Home Page at OASIS web site
Current SCA Resources & Collateral
Latest SCA & SDO News
Introducing SCA – A tutorial by David Chappell, Chappell & Associates
Eclipse STP/SCA sub project An Open Source Eclipse project that provides a set of tools for SCA
Microservice architecture patterns and best practices
martinfowler microservice architecture site
Smart endpoints and dumb pipes – martinfowler
Enterprise application integration
Service-oriented (business computing) |
9864381 | https://en.wikipedia.org/wiki/Mii | Mii | A Mii ( ) is a customizable avatar used on several Nintendo video game consoles and mobile apps. Miis were first introduced on the Wii console in 2006 and later appeared on the 3DS, Wii U, the Switch, and various apps for smart devices. Miis can be created using different body, facial and clothing features, and can then be used as characters within games on the consoles, either as an avatar of a specific player (such as in the Wii series) or in some games (such as Tomodachi Life and Miitopia) portrayed as characters with their own personalities. Miis can be shared and transferred between consoles, either manually or automatically with other users over the internet and local wireless communications.
On the 3DS and Wii U, user accounts are associated with a Mii as their avatar and used as the basis of the systems' social networking features, most prominently the now-defunct Miiverse. On the Nintendo Switch, a Mii can still be used as an account avatar, but avatars depicting various Nintendo characters are also available. Miis are also used as profile pictures for Nintendo Accounts and can be used in Nintendo smart device games such as Super Mario Run and the now-defunct Miitomo.
Games such as Mario Kart 8 Deluxe, Go Vacation, Super Mario Maker 2, Super Smash Bros. Ultimate and New Super Mario Bros. U Deluxe use Miis as playable characters.
History
Nintendo's first public debut of free-form personal avatar software was at the Game Developers Conference in 1997, during the Nintendo 64 era. There, Shigeru Miyamoto said that the personal avatar concept had originally been intended as a Famicom demo, where a user could draw a face onto an avatar. Miyamoto commented that the concept could not be turned into a game and the concept was suspended.
In 1999, the 64DD (a disk drive peripheral for the N64) was launched in Japan. Nintendo produced a short film using the 64DD's Mario Artist: Talent Studio avatar maker, which included clothes and a built-in movie editor. The player could optionally use the Game Boy Camera and the 64DD's Capture Cassette to put their own face on the avatar.
The next avatar implementation was for the Nintendo e-Reader and GameCube, which, together with the Game Boy Camera, could function as an avatar maker. Miyamoto showed another short film made with this software at E3 2002 under the name Stage Debut. The software, renamed Manebito, was discontinued prior to release.
Nintendo designer Yamashita Takayuki credits his work on Talent Studio as having been foundational to his eventual work on the Mii, which was necessitated by the development of the game Wii Sports.
Mii creation
Mii characters are created and stored in the Mii Channel or the Mii Maker, which are pre-installed on the Wii and the Nintendo 3DS/Wii U consoles respectively. While the user can assign a gender, name, birthday, favorite color, and mingle preference to a Mii, the majority of the interface used for Mii creation focuses on the appearance of its face and head: the user is given a variety of different hairstyles, eye, nose, and mouth shapes, and other features such as facial hair or wrinkles, to select from. Most of the facial features can be further adjusted, including their size, position, color, and alignment. Accessories such as hats and glasses are also available to add, and the Mii's height and build can also be adjusted. The Mii Maker installed on the Nintendo 3DS and Wii U can use facial recognition to generate a Mii, which selects facial features based on a photograph of a person's face taken with the system's and GamePad's cameras respectively. The features can then be fine-tuned by the user. These versions also have more options than their Wii counterpart. Because the selection of facial features is considered by some to be limited, users are encouraged to develop caricatures of real persons instead of accurate depictions.
Special Miis
Nintendo periodically released special Miis, usually during E3 or to commemorate game and franchise anniversaries. For a limited time between March 13 and March 30, 2007, Wii owners in Japan were sent Mii versions of comedian Sanma Akashiya and tennis player Shuzo Matsuoka. The duo had been featured in Japanese promotions for the Wii, highlighting Miis themselves. Miis of Satoru Iwata and Reggie Fils-Aime (the presidents of Nintendo and Nintendo of America, respectively) were released on the 3DS for the 1st anniversary of the handheld console. During 2013, Nintendo released special Miis of Shigeru Miyamoto and Kensuke Tanabe, and during E3 2013, also released special Miis for Takashi Tezuka, Koichi Hayashida, Eiji Aonuma, and Hideki Konno. Their Miis feature gold pants, as opposed to a gray pair, and cannot be edited or copied. If owners transfer them to another Wii or Wii Remote, they will be removed from their original location, instead of traditionally making another copy.
In late 2011, Nintendo released Swapnote/Nintendo Letter Box for the Nintendo 3DS, which features an original female Mii character called Nikki (ニッキー). Nikki gained a relatively small fan base in her own right, especially in Japan. After Nikki's debut, Nintendo featured the character in a few other games and apps, such as a Nintendo 3DS travel guide app that was exclusively distributed via the now-defunct Club Nintendo in Japan.
Editing limitations
It is possible to create special Mii-like characters through the use of third-party software, but Nintendo typically forces the makers of such tools to shut down. Sometimes, when a customer needs to return his or her Wii for service, a replacement machine must be sent. When that happens, the Mii software recognizes that it is a different system and will not allow any editing of Miis created on the original system. Nintendo, while offering to copy game data and Miis to the new machine, will not alter the Miis so that they can be edited on the replacement machine.
Wii
Mii Channel
The Mii Channel is the app that allows Mii creation on the Wii menu. Like the Nintendo 3DS and Nintendo Switch, it can store up to 100 Miis, and Wii Remotes are also able to store and transfer up to ten Miis to other consoles. It is also possible to see other Miis from TV shows and games.
Uses in games
Miis are intended to be an extension of the player, and in keeping with this spirit, the user can use them in several Nintendo titles for the Wii. Wii Sports is perhaps the best-known example of this, and it adds a further personal touch to Miis by saving game statistics and records for individual Miis. Miis will make cameo appearances as computer-controlled opponents, teammates, or within the audience. Miis have been used to serve as game file icons (profiles) within several games. Often appearing as just a head for identification, this Mii has no impact on the actual gameplay other than to identify a player in another way besides the name, or representation based on looks.
Miis are primarily used in games such as Wii Sports, Wii Play, Wii Fit, Wii Party, Wii Fit Plus, Wii Music and Wii Sports Resort. Players can also use their Miis, however, in other first-party games, most noticeably within WarioWare: Smooth Moves, Mario & Sonic at the Olympic Games, Mario Party 8, Mario Kart Wii, Mario Super Sluggers, Animal Crossing: City Folk (using their Mii's head as a mask) and in Go Vacation. The Japan-only Sega game Pachinko: Sammy's Collection is the first third-party game to incorporate Miis, while the Wii version of FIFA Soccer 08 is the first third-party game released in North America and in Europe and Australia to use the Mii Channel. Many other games, like We Ski, and Guitar Hero World Tour and Sonic Colors also use Miis.
While a Mii's head always remains the same, its body varies between games. For example, in Wii Sports the Mii's body is stylized, with spherical floating hands and no arms, while Mii audience members and CPU players float on spherical bottoms with no legs; in Wii Fit the body is designed to look more natural, and its build is determined by the player's weight as measured in Wii Fit's body tests. Sometimes Miis wear outfits that fit the context of the game. In Super Mario Galaxy and its sequel, a Mii can optionally be used for the planet icon that represents the save file, the other option being to use Mario characters for the planets; only the Mii's head is shown, inside a sphere shaped like the planet. In Mario Kart Wii, Mii racers can be dressed in jumpsuits, or Mario-style overalls for males and a Peach-style dress for females; in Dr. Mario Online Rx, Miis appear in medical clothing; and in Metroid Prime 3: Corruption, where they appear as bobblehead dolls, they are dressed in bounty hunter Samus Aran's Power Suit. In MLB Power Pros, Miis are designed to look like regular Power Pro-Kun avatars, with legs detached from the main body. In Dance Dance Revolution Hottest Party 2, the Mii's body is formed more like a regular human; this design was criticized by IGN's Lucas M. Thomas, who sarcastically commented that "[it] doesn't look disturbing at all."
While Miis are not playable in Super Smash Bros. Brawl, the creator of the series, Masahiro Sakurai, explained that Miis were originally considered to be playable in the game, but the idea was decided against due to fears of bullying. They would eventually debut in Super Smash Bros. for Nintendo 3DS and Wii U.
Everybody Votes Channel
Miis were incorporated in the downloadable Everybody Votes Channel, where Miis represented the voter. Up to six different Miis could be registered within the channel to use in voting. The channel was discontinued along with the Check Mii Out channel by Nintendo on June 28, 2013, as they moved on to other next-generation projects.
Check Mii Out Channel
Another Mii-centric channel, the Check Mii Out Channel, also known as the Mii Contest Channel in Japan, Europe, and Oceania, was released on November 11, 2007. Perhaps an evolution of an idea shared by Shigeru Miyamoto at the Game Developers Conference in 2007, this channel allowed players to upload their Mii characters and share them with other users. There were also popularity contests, in which players would design a Mii that would personify a specific idea or character and then vote on the Mii that would best fit the suggestion. The channel was available for free download on the Wii Shop Channel from November 12, 2007, until June 27, 2013, when Nintendo discontinued the channel along with the Everybody Votes channel.
Nintendo DS
Although the platform lacks native support for Mii creation, a few games on the Nintendo DS console do support Mii functionality, and will work in conjunction with the Wii's Mii Channel.
Uses in games
Miis can be transferred from a user's Wii to supported Nintendo DS games via the Mii Channel. A code must be entered by the user to unlock the feature on the Mii Channel.
The Nintendo DS game Personal Trainer: Walking uses Miis to allow players to track their progress in the game. Players are also able to create Miis in-game. The Nintendo DS version of Final Fantasy Crystal Chronicles: Echoes of Time also uses Miis.
The life-simulation game Tomodachi Collection, only released in Japan for the Nintendo DS, also uses Miis and has a built-in Mii editor. Miis from the user's Wii's Mii Channel can be transferred to the game, and vice versa.
Nintendo 3DS
Mii Maker
The Mii Maker is the app that allows Mii creation on the Nintendo 3DS. Like the Wii and the Nintendo Switch, it can store up to 100 Mii characters, and it is also possible to see other Mii characters from TV shows and games. The Mii Maker on the Nintendo 3DS can use facial recognition to generate a Mii, selecting facial features based on a photograph of a person's face taken with the system's cameras.
Uses in games
Unlike the Nintendo DS, which features limited support of Mii characters, its successor the Nintendo 3DS features Miis as a standard. Similarly to the Wii's Mii Channel, the Nintendo 3DS features its own Mii-creating application called Mii Maker, which is more advanced than the Mii Channel.
Mii characters can be created manually with Mii Maker as on the Wii's Mii Channel, but they can also be created automatically through the use of the Nintendo 3DS's cameras. The system captures an image of a subject's face, and the application converts the image into a Mii likeness using integrated recognition software. Automated Mii character designs can be manually adjusted. Mii characters can also be imported from the Wii to the 3DS or from the 3DS to the Wii U. However, Miis cannot be sent from the 3DS to the Wii, as Mii Maker features an expanded selection of design parts that are not available on Mii Channel.
The Nintendo 3DS can generate and read QR codes that represent Mii characters. QR codes and pictures of Mii characters can also be transferred to an SD card in any picture format, and be used in various ways, such as posting them on a web page. Miis on the Nintendo 3DS can also be used in conjunction with the device's Augmented Reality software - the software includes a mini-app named 'Mii Pics' which allows the user to take a photo of their Mii within a regular photo, using an augmented reality card included with the system.
The first Nintendo 3DS game to include support for Mii characters is Pilotwings Resort. Miis obtained through StreetPass appear as non-player characters in Nintendogs + Cats. Mii characters also appear in Pokémon Rumble Blast, Mario Kart 7, and in many more games.
Currently, the most notable game to feature Miis in their entirety is Tomodachi Life, the sequel to Tomodachi Collection. It is also the first game to give Miis complete lines of dialogue, as well as the first to allow players to choose what Miis say. Miitopia is another Mii-centric title for the Nintendo 3DS, also released on the Nintendo Switch in May 2021.
StreetPass Mii Plaza
A feature on the Nintendo 3DS, the StreetPass Mii Plaza makes use of the handheld's StreetPass feature, which exchanges data between nearby Nintendo 3DS consoles in standby mode. As Miis are gathered in the plaza, they can be used in numerous minigames, the initial two being Puzzle Swap and Find Mii (known as StreetPass Quest in PAL regions). In Puzzle Swap, players can exchange pieces of several jigsaw puzzle panels based on Nintendo games, of which there were initially seven, a number that increased with occasional updates. Find Mii is an RPG minigame in which players use the Miis they have gathered to fight through dungeons, earning accessories for their Mii. Each Mii possesses a different type of magic depending on its color, and becomes more powerful if the player meets it more than once. These games can optionally be played with Play Coins, though the results are more random than with StreetPass Miis. On December 6, 2011, the feature was updated to include SpotPass functionality, as well as new puzzle panels, a sequel to Find Mii, a map showing where players met other Miis, Accomplishments and a music player.
Special Miis released by Nintendo and obtained through SpotPass can also be used in StreetPass Mii Plaza. They have access to all Puzzle Swap pieces and provide a level 5 player for Find Mii.
Wii U
Mii Maker
Mii Maker is the app that allows Mii creation on the Wii U. It can store up to 3,000 Miis. It is also possible to see other Mii characters from TV shows and games. The Mii Maker installed on the Wii U can use facial recognition to generate a Mii, selecting facial features based on a photograph of a person's face taken with the Wii U GamePad camera.
Uses in games
Mii characters evolved further on the Wii's successor, the Wii U. In addition to their previous uses on the Wii, Mii characters are wholly integrated into the Wii U's social online network Miiverse and the WaraWara Plaza, a community screen where clusters of Mii characters crowd around the most popular games, and they are depicted as personal avatars for individual Wii U players; up to twelve separate Nintendo Network ID user accounts can be used on a single console. User accounts with Mii representatives are used for both games and apps such as Nintendo TVii. Mii characters can be transferred from the Wii or the Nintendo 3DS to the Wii U; in the latter case, transfers between consoles can occur as many times as desired, as the Wii U has its own Mii Maker app similar to the Nintendo 3DS version, in which users can transfer, create, and store up to 3,000 Mii characters. Mii characters also return as in-game characters in certain Wii U games, including Nintendo-published titles such as Super Smash Bros. for Wii U, Mario Kart 8, New Super Mario Bros. U and Nintendo Land, as well as third-party titles such as Sonic & All-Stars Racing Transformed. In Mario Kart 8, tapping an amiibo figure on the left side of the Wii U GamePad unlocks one of more than 19 character-themed racing suits for the player's Mii.
The Legend of Zelda: Breath of the Wild uses an evolved form of Mii, UMii, to render non-essential NPCs.
Nintendo Switch
Miis can be used on the Nintendo Switch to represent user accounts; unlike on previous Nintendo consoles, users may optionally use a Nintendo character as their avatar instead, but Miis can still appear as playable characters in games such as Mario Kart 8 Deluxe, Go Vacation, Super Smash Bros. Ultimate and New Super Mario Bros. U Deluxe. The Mii Maker is located within the console's system settings menu and, unlike previous Mii Makers, provides unnatural hair and eye colors. Miis can be transferred between Nintendo Switch consoles and imported from a Nintendo 3DS or Wii U using an amiibo figure. Miis can also be imported from a Nintendo Account. In Mario Kart 8 Deluxe, 20 amiibo-themed racing suits can be unlocked by tapping amiibo figures on the NFC reader of the right Joy-Con or on the top of the Nintendo Switch Pro Controller. Like the Wii and Nintendo 3DS, the Nintendo Switch can hold up to 100 Miis.
The first Mii-centered game released on the Nintendo Switch is Miitopia, a remaster of the 2017 role-playing video game originally released on the Nintendo 3DS.
Miis are once again incorporated into Nintendo Switch Sports, released on April 29, 2022.
Other platforms
When the freemium mobile app Miitomo (since shut down) launched on iOS and Android devices, it became possible for the first time to officially create Mii characters without a Nintendo console. Mii characters created in the app initially resembled their Wii U counterparts and used the same attributes, with a later update introducing Nintendo Switch standards. Nintendo Account holders could use the app to create Mii avatars without linking their accounts to any Nintendo console. Miitomo could also be used to support Mii avatars in other Nintendo apps for smart devices; for example, an update released for Super Mario Run on April 25, 2017, introduced Mii icon customization options and in-game costumes supported by Miitomo via the same Nintendo Account. Miitomo was only available in 60 of the 165 countries and territories recognized by Nintendo Accounts when it was discontinued on May 9, 2018.
On May 24, 2018, Nintendo introduced a browser-based Mii editor called Mii Studio. The editor is integrated into all Nintendo Accounts for users in all regions, including regions that did not originally have official support for Miitomo. Mii Studio supports all Mii attributes and standards introduced for the Nintendo Switch. Up to six Mii avatars can be created at a time, including any Mii imported from a linked Nintendo Network ID (also known as NNID).
See also
Xbox Avatars
PlayStation Home, which also featured avatar creation
References
External links
Mii Channel at Nintendo.com
Computer-related introductions in 2006
Nintendo protagonists
Video game characters of selectable gender
Video game mascots
Virtual avatars
Wii
Super Smash Bros. fighters |
47690099 | https://en.wikipedia.org/wiki/Summer%20School%20Marktoberdorf | Summer School Marktoberdorf | The International Summer School Marktoberdorf is an annual two-week summer school for international computer science and mathematics postgraduate students and other young researchers, held annually since 1970 in Marktoberdorf, near Munich in southern Germany. Students are accommodated in the boarding house of a local high school, Gymnasium Marktoberdorf. Proceedings are published when appropriate.
Status
This is a summer school for theoretical computer science researchers, with some directors/co-directors who are Turing Award winners (the nearest equivalent to the Nobel Prize in computer science).
The summer school is supported as an Advanced Study Institute of the NATO Science for Peace and Security Program. It is administered by the Faculty of Informatics at the Technical University of Munich.
Directors
Past academic directors and co-directors include:
Manfred Broy
Robert Lee Constable
Javier Esparza
Orna Grumberg
David Harel
Tony Hoare*
Orna Kupferman
Tobias Nipkow
Doron Peled
Amir Pnueli*
Alexander Pretschner
Shmuel Sagiv
Helmut Schwichtenberg
Helmut Seidl
Stanley S. Wainer
* Turing Award winners.
References
External links
1970 establishments in Germany
Recurring events established in 1970
August events
Marktoberdorf
Computer science conferences
Computer science education
Theoretical computer science
Annual events in Germany
Events in West Germany
NATO
Technical University of Munich
Education in Bavaria |
9424 | https://en.wikipedia.org/wiki/Ericsson | Ericsson | Telefonaktiebolaget LM Ericsson (lit. "Telephone Stock Company of LM Ericsson"), commonly known as Ericsson, is a Swedish multinational networking and telecommunications company headquartered in Stockholm. The company sells infrastructure, software, and services in information and communications technology for telecommunications service providers and enterprises, including, among others, 3G, 4G, and 5G equipment, and Internet Protocol (IP) and optical transport systems. The company employs around 100,000 people and operates in more than 180 countries. Ericsson has over 57,000 granted patents.
Ericsson has been a major contributor to the development of the telecommunications industry and is one of the leaders in 5G.
The company was founded in 1876 by Lars Magnus Ericsson and is jointly controlled by the Wallenberg family through its holding company Investor AB and the investment company Industrivärden. The Wallenbergs and the Handelsbanken sphere acquired their voting-strong A-shares, and thus the control of Ericsson, after the fall of the Kreuger empire in the early 1930s.
Ericsson is the inventor of Bluetooth technology.
History
Foundation
Lars Magnus Ericsson began his association with telephones in his youth as an instrument maker. He worked for a firm that made telegraph equipment for the Swedish government agency Telegrafverket. In 1876, at the age of 30, he started a telegraph repair shop with help from his friend Carl Johan Andersson in central Stockholm and repaired foreign-made telephones. In 1878 Ericsson began making and selling his own telephone equipment. His telephones were not technically innovative. In 1878 he made an agreement to supply telephones and switchboards to Sweden's first telecommunications operating company, Stockholms Allmänna Telefonaktiebolag.
International expansion
As production grew in the late 1890s, and the Swedish market seemed to be reaching saturation, Ericsson expanded into foreign markets through a number of agents. The UK (Ericsson Telephones Ltd.) and Russia were early markets, where factories were later established to improve the chances of gaining local contracts and to augment the output of the Swedish factory. In the UK, the National Telephone Company was a major customer; by 1897 Ericsson sold 28% of its output in the UK. The Nordic countries were also Ericsson customers; they were encouraged by the growth of telephone services in Sweden.
Other countries and colonies were exposed to Ericsson products through the influence of their parent countries. These included Australia and New Zealand, which by the late 1890s were Ericsson's largest non-European markets. Mass production techniques were now firmly established, and telephones were losing some of their ornate finish and decoration.
Despite their successes elsewhere, Ericsson did not make significant sales into the United States. The Bell Group, Kellogg and Automatic Electric dominated the market. Ericsson eventually sold its U.S. assets. Sales in Mexico led to inroads into South American countries. South Africa and China were also generating significant sales. With his company now multinational, Lars Ericsson stepped down from the company in 1901.
Automatic equipment
Ericsson ignored the growth of automatic telephony in the United States and concentrated on manual exchange designs. Their first dial telephone was produced in 1921, although sales of the early automatic switching systems were slow until the equipment had proven itself on the world's markets. Telephones of this period had a simpler design and finish, and many of the early automatic desk telephones in Ericsson's catalogues were magneto styles with a dial on the front and appropriate changes to the electronics. Elaborate decals decorated the cases. World War I, the subsequent Great Depression, and the loss of its Russian assets after the Revolution slowed the company's development while sales to other countries fell by about half.
Shareholding changes
The acquisition of other telecommunications companies put pressure on Ericsson's finances; in 1925, Karl Fredric Wincrantz took control of the company by acquiring most of the shares. Wincrantz was partly funded by Ivar Kreuger, an international financier. The company was renamed Telefonaktiebolaget L M Ericsson. Kreuger started showing interest in the company, being a major owner of Wincrantz holding companies.
Wallenberg era begins
Ericsson was saved from bankruptcy and closure with the help of banks including Stockholms Enskilda Bank (now Skandinaviska Enskilda Banken) and other Swedish investment banks controlled by the Wallenberg family, and some Swedish government backing. Marcus Wallenberg Jr. negotiated a deal with several Swedish banks to rebuild Ericsson financially. The banks gradually increased their possession of LM Ericsson "A" shares, while ITT was still the largest shareholder. In 1960, the Wallenberg family bought ITT's shares in Ericsson, and has since controlled the company.
Market development
In the 1920s and 1930s, the world telephone markets were being organized and stabilized by many governments. The fragmented town-by-town systems serviced by small, private companies that had evolved were integrated and offered for lease to a single company. Ericsson obtained some leases, which represented further sales of equipment to the growing networks. Ericsson got almost one-third of its sales under the control of its telephone operating companies.
Further development
Ericsson introduced the world's first fully automatic mobile telephone system, MTA, in 1956. It released one of the world's first hands-free speaker telephones in the 1960s. In 1954, it released the Ericofon. Ericsson crossbar switching equipment was used in telephone administrations in many countries. In 1983 the company introduced the ERIPAX suite of network products and services.
Emergence of the Internet (1995–2003)
In the 1990s, during the emergence of the Internet, Ericsson was regarded as slow to realize its potential and falling behind in the area of IP technology. But the company had established an Internet project in 1995 called Infocom Systems to exploit opportunities leading from fixed-line telecom and IT. CEO Lars Ramqvist wrote in the 1996 annual report that in all three of its business areas – Mobile Telephones and Terminals, Mobile Systems, and Infocom Systems – "we will expand our operations as they relate to customer service and Internet Protocol (IP) access (Internet and intranet access)".
The growth of GSM, which became a de facto world standard, combined with Ericsson's other mobile standards, such as D-AMPS and PDC, meant that by the start of 1997, Ericsson had an estimated 40% share of the world's mobile market, with around 54 million subscribers. There were also around 188 million AXE lines in place or on order in 117 countries. Telecom and chip companies worked in the 1990s to provide Internet access over mobile telephones. Early versions such as Wireless Application Protocol (WAP) used packet data over the existing GSM network, in a form known as GPRS (General Packet Radio Service), but these services, known as 2.5G, were fairly rudimentary and did not achieve much mass-market success.
The International Telecommunication Union (ITU) had prepared the specifications for a 3G mobile service that included several technologies. Ericsson pushed hard for the WCDMA (wideband CDMA) form based on the GSM standard, and began testing it in 1996. Japanese operator NTT Docomo signed deals to partner with Ericsson and Nokia, who came together in 1997 to support WCDMA over rival standards. DoCoMo was the first operator with a live 3G network, using its own version of WCDMA called FOMA. Ericsson was a significant developer of the WCDMA version of GSM, while US-based chip developer Qualcomm promoted the alternative system CDMA2000, building on the popularity of CDMA in the US market. This resulted in a patent infringement lawsuit that was resolved in March 1999 when the two companies agreed to pay each other royalties for the use of their respective technologies and Ericsson purchased Qualcomm's wireless infrastructure business and some R&D resources.
Ericsson issued a profit warning in March 2001. Over the coming year, sales to operators halved. Mobile telephones became a burden; the company's telephones unit made a loss of SEK 24 billion in 2000. A fire in a Philips chip factory in New Mexico in March 2000 caused severe disruption to Ericsson's phone production, dealing a coup de grâce to Ericsson's mobile phone hopes. Mobile phones would be spun off into a joint venture with Sony, Sony Ericsson Mobile Communications, in October 2001.
Ericsson launched several rounds of restructuring, refinancing and job-cutting; during 2001, staff numbers fell from 107,000 to 85,000. A further 20,000 went the next year, and 11,000 more in 2003. A new rights issue raised SEK 30 billion to keep the company afloat. The company had survived as mobile Internet started growing. With record profits, it was in better shape than many of its competitors.
Rebuilding and growing (2003–2018)
The emergence of full mobile Internet began a period of growth for the global telecom industry, including Ericsson. After the launch of 3G services during 2003, people started to access the Internet using their telephones.
Ericsson was working on ways to improve WCDMA as operators were buying and rolling it out; it was the first generation of 3G access. New advances included IMS (IP Multimedia Subsystem) and the next evolution of WCDMA, called High-Speed Packet Access (HSPA). It was initially deployed in the download version called HSDPA; the technology spread from the first test calls in the US in late 2005 to 59 commercial networks in September 2006. HSPA would provide the world's first mobile broadband.
In July 2016, Hans Vestberg stepped down as Ericsson's CEO after heading the company for six years. Jan Frykhammar, who had been working for the company since 1991, stepped in as interim CEO while Ericsson searched for a full-time replacement. Following Ericsson's announcement on 26 October 2016, new CEO Börje Ekholm started on 16 January 2017, and interim CEO Jan Frykhammar stepped down the following day.
In June 2018, Ericsson, Inc. and Ericsson AB agreed to pay $145,893 to settle potential civil liability for an apparent violation of the International Emergency Economic Powers Act (IEEPA) and the Sudanese Sanctions Regulations, 31 C.F.R. part 538 (SSR).
Acquisitions and cooperation
Around 2000, companies and governments began to push for standards for mobile Internet. In May 2000, the European Commission created the Wireless Strategic Initiative, a consortium of four telecommunications suppliers in Europe – Ericsson, Nokia, Alcatel (France) and Siemens (Germany) – to develop and test new prototypes for advanced wireless communications systems. Later that year, the consortium partners invited other companies to join them in a Wireless World Research Forum in 2001. In December 1999, Microsoft and Ericsson announced a strategic partnership to combine the former's web browser and server software with the latter's mobile-internet technologies. In 2000, the Dot-com bubble burst with marked economic implications for Sweden. Ericsson, the world's largest producer of mobile telecommunications equipment, shed thousands of jobs, as did the country's Internet consulting firms and dot-com start-ups. In the same year, Intel, the world's largest semiconductor chip manufacturer, signed a $1.5 billion deal to supply flash memory to Ericsson over the next three years.
The short-lived joint venture called Ericsson Microsoft Mobile Venture, owned 70/30 percent by Ericsson and Microsoft, respectively, ended in October 2001 when Ericsson announced it would absorb the former joint venture and adopt a licensing agreement with Microsoft instead. The same month, Ericsson announced the launch of Sony Ericsson, a joint venture mobile telephone business, together with Sony. Sony Ericsson remained in operation until February 2012, when Sony bought out Ericsson's share; Ericsson said it wanted to focus on the global wireless market as a whole.
Lower stock prices and job losses affected many telecommunications companies in 2001. The major equipment manufacturers – Motorola (U.S.), Lucent Technologies (U.S.), Cisco Systems (U.S.), Marconi (UK), Siemens (Germany), Nokia (Finland), as well as Ericsson – all announced job cuts in their home countries and in subsidiaries around the world. Ericsson's workforce worldwide fell during 2001 from 107,000 to 85,000.
In September 2001, Ericsson purchased the remaining shares in EHPT from Hewlett Packard. Founded in 1993, Ericsson Hewlett Packard Telecom (EHPT) was a joint venture made up of 60% Ericsson interests and 40% Hewlett-Packard interests.
In 2002, ICT investor losses topped $2 trillion and share prices fell by 95% until August that year. More than half a million people lost their jobs in the global telecom industry over the two years. The collapse of U.S. carrier WorldCom, with more than $107 billion in assets, was the biggest in U.S. history. The sector's problems caused bankruptcies and job losses, and led to changes in the leadership of a number of major companies. Ericsson made 20,000 more staff redundant and raised about $3 billion from its shareholders. In June 2002, Infineon Technologies (then the sixth-largest semiconductor supplier and a subsidiary of Siemens) bought Ericsson's microelectronics unit for $400 million.
Ericsson was an official backer in the 2005 launch of the .mobi top level domain created specifically for the mobile internet.
Co-operation with Hewlett-Packard did not end with EHPT; in 2003 Ericsson outsourced its IT to HP, which included Managed Services, Help Desk Support, Data Center Operations, and HP Utility Data Center. The contract was extended in 2008. In October 2005, Ericsson acquired the bulk of the troubled UK telecommunications manufacturer Marconi Company, including its brand name that dates back to the creation of the original Marconi Company by the "father of radio" Guglielmo Marconi. In September 2006, Ericsson sold the greater part of its defense business Ericsson Microwave Systems, which mainly produced sensor and radar systems, to Saab AB, which renamed the company to Saab Microwave Systems.
In 2007, Ericsson acquired carrier edge-router maker Redback Networks, and then Entrisphere, a US-based company providing fiber-access technology. In September 2007, Ericsson acquired an 84% interest in German customer-care and billing software firm LHS, a stake later raised to 100%. In 2008, Ericsson sold its enterprise PBX division to Aastra Technologies, and acquired Tandberg Television, the television technology division of Norwegian company Tandberg.
In 2009, Ericsson bought the CDMA2000 and LTE business of Nortel's carrier networks division for US$1.18 billion; Bizitek, a Turkish business support systems integrator; the Estonian manufacturing operations of electronic manufacturing company Elcoteq; and completed its acquisition of LHS. Acquisitions in 2010 included assets from the Strategy and Technology Group of inCode, a North American business and consulting-services company; Nortel's majority shareholding (50% plus one share) in LG-Nortel, a joint venture between LG Electronics and Nortel Networks providing sales, R&D and industrial capacity in South Korea, now known as Ericsson-LG; further Nortel carrier-division assets, relating to Nortel's GSM business in the United States and Canada; Optimi Corporation, a U.S.–Spanish telecommunications vendor specializing in network optimization and management; and Pride, a consulting and systems-integration company operating in Italy.
In 2011, Ericsson acquired manufacturing and research facilities, and staff from the Guangdong Nortel Telecommunication Equipment Company (GDNT) as well as Nortel's Multiservice Switch business. Ericsson acquired U.S. company Telcordia Technologies in January 2012, an operations and business support systems (OSS/BSS) company. In March, Ericsson announced it was buying the broadcast-services division of Technicolor, a media broadcast technology company. In April 2012 Ericsson completed the acquisition of BelAir Networks a strong Wi-Fi network technology company.
On 3 May 2013, Ericsson announced it would divest its power cable operations to Danish company NKT Holding. On 1 July 2013, Ericsson announced it would acquire the media management company Red Bee Media, subject to regulatory approval. The acquisition was completed on 9 May 2014. In September 2013, Ericsson completed its acquisition of Microsoft's Mediaroom business and televisions services, originally announced in April the same year. The acquisition makes Ericsson the largest provider of IPTV and multi-screen services in the world, by market share; it was renamed Ericsson Mediaroom. In September 2014, Ericsson acquired majority stake in Apcera for cloud policy compliance. In October 2015, Ericsson completed the acquisition of Envivio, a software encoding company. In April 2016, Ericsson acquired Polish and Ukrainian operations of software development company Ericpol, a long-time supplier to Ericsson. Approximately 2,300 Ericpol employees joined Ericsson, bringing software development competence in radio, cloud, and IP.
On 20 June 2017, Bloomberg disclosed that Ericsson hired Morgan Stanley to explore the sale of its media businesses. The Red Bee Media business was kept in-house as an independent subsidiary company, as no suitable buyer was found, but a 51% stake of the remainder of the Media Solution division was sold to private equity firm One Equity Partners, the new company being named MediaKind. The transaction was completed on 31 January 2019. In February 2018, Ericsson acquired the location-based mobile data management platform Placecast. Ericsson has since integrated Placecast's platform and capabilities with its programmatic mobile ad subsidiary, Emodo. In May 2018, SoftBank partnered with Ericsson to trial new radio technology. In September 2020, Ericsson acquired US-based carrier equipment manufacturer Cradlepoint for $1.1 billion.
In November 2021, Ericsson announced it has reached an agreement to acquire Vonage for $6.2 billion.
Corporate governance
Members of the board of directors of LM Ericsson were: Leif Johansson, Jacob Wallenberg, Kristin S. Rinne, Helena Stjernholm, Sukhinder Singh Cassidy, Börje Ekholm, Ulf J. Johansson, Mikael Lännqvist, Zlatko Hadzic, Kjell-Åke Soting, Nora Denzel, Kristin Skogen Lund, Pehr Claesson, Karin Åberg and Roger Svensson.
Research and development
Ericsson has structured its R&D in three levels depending on when products or technologies will be introduced to customers and users. Its research and development organization is part of 'Group Function Technology' and addresses several facets of network architecture: wireless access networks; radio access technologies; broadband technologies; packet technologies; multimedia technologies; services software; EMF safety and sustainability; security; and global services. The head of research since 2012 is Sara Mazur.
Group Function Technology holds research co-operations with several major universities and research institutes including: Lund University in Sweden, Eötvös Loránd University in Hungary and Beijing Institute of Technology in China. Ericsson also holds research co-operations within several European research programs such as GigaWam and OASE. Ericsson holds 33,000 granted patents, and is the number-one holder of GSM/GPRS/EDGE, WCDMA/HSPA, and LTE essential patents. In 2021, the WIPO’s annual World Intellectual Property Indicators report ranked Ericsson's number of patent applications published under the PCT System as 6th in the world, with 1,989 patent applications being published during 2020. This position is up from their previous ranking as 7th in 2019 with 1,698 applications.
Ericsson hosts a developer program called Ericsson Developer Connection designed to encourage development of applications and services. Ericsson also has an open innovation initiative for beta applications and beta API's & tools called Ericsson Labs. The company hosts several internal innovation competitions among its employees.
Products and services
Ericsson's business includes technology research, development, network systems and software development, and running operations for telecom service providers. Ericsson offers end-to-end services for all major mobile communication standards, and has three main business units.
Business Area Networks
Business Area Networks, previously called Business Unit Networks, develops network infrastructure for communication needs over mobile and fixed connections. Its products include radio base stations, radio network controllers, mobile switching centers and service application nodes. Operators use Ericsson products to migrate from 2G to 3G and, most recently, to 4G networks.
The company's network division has been described as a driver in the development of 2G, 3G, 4G/LTE and 5G technology, and the evolution towards all-IP, and it develops and deploys advanced LTE systems, but it is still developing the older GSM, WCDMA, and CDMA technologies. The company's networks portfolio also includes microwave transport, Internet Protocol (IP) networks, fixed-access services for copper and fiber, and mobile broadband modules, several levels of fixed broadband access, radio access networks from small pico cells to high-capacity macro cells and controllers for radio base stations.
Network services
Ericsson's network rollout services employ in-house capabilities, subcontractors and central resources to make changes to live networks. Services such as technology deployment, network transformation, support services and network optimization are also provided.
Business Area Digital Services
This unit provides core networks, Operations Support Systems such as network management and analytics, and Business Support Systems such as billing and mediation. Within the Digital Services unit, there is an m-Commerce offering, which focuses on service providers and facilitates their working with financial institutions and intermediaries. Ericsson has announced m-commerce deals with Western Union and African wireless carrier MTN.
Business Area Managed Services
The unit is active in 180 countries; it supplies managed services, systems integration, consulting, network rollout, design and optimization, broadcast services, learning services and support.
The company also works with television and media, public safety, and utilities. Ericsson claims to manage networks that serve more than 1 billion subscribers worldwide, and to support customer networks that serve more than 2.5 billion subscribers.
Broadcast services
Ericsson's Broadcast Services unit was evolved into a unit called Red Bee Media, which has been spun out into a joint venture. It deals with the playout of live and pre-recorded, commercial and public service television programmes, including presentation (continuity announcements), trailers, and ancillary access services such as closed-caption subtitles, audio description and in-vision sign language interpreters. Its media management services consist of Managed Media Preparation and Managed Media Internet Delivery.
Divested businesses
Sony Ericsson Mobile Communications AB (Sony Ericsson) was a joint venture with Sony that merged the previous mobile telephone operations of both companies. It manufactured mobile telephones, accessories and personal computer (PC) cards. Sony Ericsson was responsible for product design and development, marketing, sales, distribution and customer services. On 16 February 2012, Sony announced it had completed the full acquisition of Sony Ericsson, after which it changed name to Sony Mobile Communications, and nearly a year later it moved headquarters from Sweden to Japan.
Mobile (cell) telephones
As a joint venture with Sony, Ericsson's mobile telephone production was moved into the company Sony Ericsson in 2001. The following is a list of mobile phones marketed under the brand name Ericsson.
Ericsson GS88 – Cancelled mobile telephone for which Ericsson coined the name "Smartphone"
Ericsson GA628 – Known for its Z80 CPU
Ericsson SH888 – First mobile telephone to have wireless modem capabilities
Ericsson A1018 – Dualband cellphone, notably easy to hack
Ericsson A2618 & Ericsson A2628 – Dualband cellphones. Use graphical LCD display based on PCF8548 I²C controller.
Ericsson PF768
Ericsson GF768
Ericsson GH388
Ericsson T10 – Colourful Cellphone
Ericsson T18 – Business model of the T10, with active flip
Ericsson T28 – Very slim telephone. Uses lithium polymer batteries and a graphical LCD display based on the PCF8558 I²C controller.
Ericsson T20s
Ericsson T29s – Similar to the T28s, but with WAP support
Ericsson T29m – pre-alpha prototype for the T39m
Ericsson T36m – Prototype for the T39m. Announced in yellow and blue. Never hit the market due to the release of the T39m
Ericsson T39 – Similar to the T28, but with a GPRS modem, Bluetooth and triband capabilities
Ericsson T65
Ericsson T66
Ericsson T68m – The first Ericsson handset to have a color display, later branded as Sony Ericsson T68i
Ericsson R250s Pro – Fully dust and water resistant telephone
Ericsson R310s
Ericsson R320s
Ericsson R320s Titan – Limited Edition with titanium front
Ericsson R320s GPRS – Prototype for testing GPRS networks
Ericsson R360m – Pre-alpha prototype for the R520m
Ericsson R380 – First cellphone to use the Symbian OS
Ericsson R520m – Similar to the T39, but in a candy bar form factor and with added features such as a built-in speakerphone and an optical proximity sensor
Ericsson R520m UMTS – Prototype to test UMTS networks
Ericsson R520m SyncML – Prototype to test the SyncML implementation
Ericsson R580m – Announced in several press releases. Supposed to be a successor of the R380s without external antenna and with color display
Ericsson R600
Telephones
Ericsson Dialog
Ericofon
Ericsson Mobile Platforms
Ericsson Mobile Platforms existed for eight years; on 12 February 2009, Ericsson announced it would be merged with the mobile platform company of STMicroelectronics, ST-NXP Wireless, to create a 50/50 joint venture owned by Ericsson and STMicroelectronics.
This joint venture was divested in 2013 and remaining activities can be found in Ericsson Modems and STMicroelectronics. Ericsson Mobile Platform ceased being a legal entity early 2009.
Ericsson Enterprise
Starting in 1983 Ericsson Enterprise provided communications systems and services for businesses, public entities and educational institutions. It produced products for voice over Internet protocol (VoIP)-based private branch exchanges (PBX), wireless local area networks (WLAN), and mobile intranets.
Ericsson Enterprise operated mainly from Sweden but also operated through regional units and other partners/distributors. In 2008 it was sold to Aastra.
Corruption
On 7 December 2019, Ericsson agreed to pay more than $1.2 billion (€1.09 billion) to settle U.S. Department of Justice FCPA criminal and civil investigations into foreign corruption. US authorities accused the company of conducting a campaign of corruption between 2000 and 2016 across China, Indonesia, Vietnam, Kuwait and Djibouti. Ericsson admitted to paying bribes, falsifying books and records and failing to implement reasonable internal accounting controls in an attempt to strengthen its position in the telecommunications industry.
In 2022, an internal investigation into corruption inside the company was leaked. It detailed corruption in at least 10 countries. Ericsson has admitted “serious breaches of compliance rules”.
See also
Cedergren
Damovo
Ericsson Nikola Tesla
Erlang (programming language)
Investor AB
List of networking hardware vendors
List of Sony Ericsson products
Red Jade
Tandberg Television
References
Further reading
John Meurling & Richard Jeans (1994) A switch in time: AXE – creating a foundation for the information age. London: Communications Week International. .
John Meurling & Richard Jeans (1997). The ugly duckling. Stockholm: Ericsson Mobile Communications. .
John Meurling & Richard Jeans (2000). The Ericsson Chronicle: 125 years in telecommunications. Stockholm: Informationsförlaget. .
The Mobile Phone Book: The Invention of the Mobile Telephone Industry.
Mobile media and applications – from concept to cash: successful service creation and launch.
Anders Pehrsson (1996). International Strategies in Telecommunications. London: Routledge Research.
External links
General reference to all chapters on the Ericsson history:
Ericsson Logo
1876 establishments in Sweden
Companies based in Stockholm
Manufacturing companies established in 1876
Companies formerly listed on the London Stock Exchange
Companies listed on the Nasdaq
Companies listed on Nasdaq Stockholm
Companies related to the Wallenberg family
Electronics companies of Sweden
Mobile phone manufacturers
Multinational companies headquartered in Sweden
Networking hardware companies
Swedish brands
Telecommunications equipment vendors
Video equipment manufacturers
Swedish companies established in 1876 |
274144 | https://en.wikipedia.org/wiki/Home%20directory | Home directory | A home directory is a file system directory on a multi-user operating system containing files for a given user of the system. The specifics of the home directory (such as its name and location) are defined by the operating system involved; for example, Linux / BSD (FHS) systems use /home/ and Windows systems between 2000 and Server 2003 keep home directories in a folder called Documents and Settings.
Description
A user's home directory is intended to contain that user's files; including text documents, music, pictures or videos, etc. It may also include their configuration files of preferred settings for any software they have used there and might have tailored to their liking: web browser bookmarks, favorite desktop wallpaper and themes, passwords to any external services accessed via a given software, etc. The user can install executable software in this directory, but it will only be available to users with permission to this directory. The home directory can be organized further with the use of sub-directories.
The content of a user's home directory is protected by file system permissions, and by default is accessible to all authenticated users and administrators. Any other user that has been granted administrator privileges has the authority to access any protected location on the filesystem, including other users' home directories.
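As an illustration of how that protection can be examined, the following minimal Python sketch (an illustrative example, not part of any operating system's own tooling) reports the owner and permission bits of the current user's home directory; it assumes a Unix-like system, since the pwd module is not available on Windows.

import pwd
import stat
from pathlib import Path

def describe_home(path: Path = Path.home()) -> str:
    # stat() the directory to read its owner and permission bits
    st = path.stat()
    owner = pwd.getpwuid(st.st_uid).pw_name   # account that owns the directory
    mode = stat.filemode(st.st_mode)          # e.g. 'drwx------' or 'drwxr-xr-x'
    return f"{path} is owned by {owner} with mode {mode}"

if __name__ == "__main__":
    print(describe_home())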
Benefits
Separating user data from system-wide data avoids redundancy and makes backups of important files relatively simple. Furthermore, Trojan horses, viruses and worms running under the user's name and with their privileges will in most cases only be able to alter the files in the user's home directory, and perhaps some files belonging to workgroups the user is a part of, but not actual system files.
Default home directory per operating system
Subdirectories
On many Linux systems, a configuration file defines the subdirectories created for users by default.
Creation is normally done on first login by xdg-user-dirs, a tool that helps manage "well known" user directories like desktop, downloads, documents, pictures, videos or music. The tool is also capable of localization (i.e. translation) of the folder names.
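As a sketch of how these assignments can be read back, the following Python example parses the per-user configuration that xdg-user-dirs commonly writes to ~/.config/user-dirs.dirs; the file location and its XDG_*_DIR="$HOME/..." line format are assumptions based on typical Linux setups rather than a guaranteed interface.

import re
from pathlib import Path

def read_xdg_user_dirs() -> dict:
    """Return a mapping such as {'XDG_MUSIC_DIR': Path('/home/alice/Music')}."""
    config = Path.home() / ".config" / "user-dirs.dirs"
    dirs = {}
    if not config.is_file():
        return dirs  # the tool has not been run, or the system does not use it
    pattern = re.compile(r'^(XDG_[A-Z]+_DIR)="(.*)"$')
    for line in config.read_text().splitlines():
        match = pattern.match(line.strip())
        if match:
            name, raw = match.groups()
            # paths are stored with a literal "$HOME" prefix in this file
            dirs[name] = Path(raw.replace("$HOME", str(Path.home())))
    return dirs

if __name__ == "__main__":
    for name, path in read_xdg_user_dirs().items():
        print(name, "->", path)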
Other features per operating system
Unix
In Unix, the current working directory is automatically set to a user's home directory when they log in. The ~ (tilde character) shorthand refers to that particular user's home directory.
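For example, a minimal Python sketch of how a program resolves the home directory and expands the tilde shorthand (the ~root form assumes a Unix-like system):

import os
from pathlib import Path

home = Path.home()                        # the current user's home directory
same = os.path.expanduser("~")            # "~" expands to the same path
root_home = os.path.expanduser("~root")   # "~name" resolves another user's home on Unix

print(home)
print(same)
print(os.environ.get("HOME"))             # in Unix shells, $HOME holds the same value
print(root_home)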
The Unix superuser has access to all directories on the filesystem, and hence can access home directories of all users. The superuser's home directory on older systems was /, but on many newer systems it is located at /root (Linux, BSD) or /var/root (Mac OS X).
VMS
In the OpenVMS operating system, a user's home directory is called the root directory, and the equivalent of a Unix/DOS/Windows/AmigaOS root directory is referred to as the Master File Directory.
Contrast with single-user operating systems
Single-user operating systems simply have a single directory or partition for all user files; there is no individual directory set up per user (though users can still create and maintain directories inside this main working directory manually).
AmigaOS versions 2 and up have "System" and "Work" partitions on hard disks by default.
BeOS (and its successors) have a /home directory which contains the files belonging to the single user of the system.
Versions of Windows prior to Windows 95 OEM Service Release 2 did not have a user folder, but since that release, My Documents became in effect the single user's home directory.
NeXTSTEP and OPENSTEP in a single user, non-networked setup, is used, as well as when logged in as superuser.
See also
Filesystem hierarchy standard
My Documents
Parent directory
Root directory
Working directory
References
Computer file systems
File system directories |
34037291 | https://en.wikipedia.org/wiki/Joseph%20Walther | Joseph Walther | Joseph B. Walther (born 1958) is the Mark and Susan Bertelsen Presidential Chair in Technology and Society and the Director of the Center for Information Technology & Society at the University of California, Santa Barbara. His research focuses on social and interpersonal dynamics of computer-mediated communication, in groups, personal relationships, organizational and educational settings. He is noted for creating social information processing theory in 1992 and the hyperpersonal model in 1996.
Life and work
Joseph B. Walther was born in 1958 in Santa Monica, Calif. Walther attended Santa Ana College, Saddleback College and spent time with the Royal Shakespeare Company at Coastline Community College before transferring to the University of Arizona and graduating magna cum laude in 1983. Walther continued at the University of Arizona, earning a master's degree in speech communication in 1984 and a doctorate in 1990.
Walther has previously held appointments in Information Technology, Psychology, and Education and Social Policy at universities in the U.S. and the United Kingdom and was chair of the Organizational Communication and Information Systems division of the Academy of Management, and the Communication and Technology division of the International Communication Association.
Based on his research into computer-mediated communication, Walther introduced social information processing theory in 1992. Social information processing theory finds that the development of relationships via computer-mediated communication depends on sufficient time and message exchanges, and on the application of available communicative cues by users. The lack of nonverbal cues means that computer-mediated communications contain less information than face-to-face communications, however social information processing theory finds that longer and/or more frequent communication as well as the use of other cues (i.e. spelling ability) while participating in computer-mediated communication help address the issue of information exchange.
The social information perspective assumes that communicators in computer-mediated exchanges are similarly driven to acquire social information that will encourage the development of social relationships as are communicators using other media. Support for social information processing theory has been found in contexts such as online dating and online multi-player video games.
Walther's research also led him to develop the hyperpersonal model of communication in 1996. His work on the hyperpersonal model is his most cited research. The hyperpersonal model finds that in certain circumstances, computer-mediated communication surpasses the affection and emotion of comparable face-to-face interpersonal communication. This model also offers a robust view of computer-mediated communication, taking into account the contributions of the sender, receiver, channel and feedback in a computer-mediated interaction.
The hyperpersonal model finds that two characteristics of computer-mediated communication – reduced communication cues and potentially asynchronous communication – facilitate both optimized self-presentation by message senders and idealized perceptions of the sender by message receivers. Walther's hyperpersonal model predicts that media classified as less rich by media richness theory or less natural by media naturalness theory allow more socially desirable levels of interaction than face-to-face communication.
Academic appointments
1990-1992: Assistant Professor, Dept. of Communication, University of Oklahoma
Fall 1995: Visiting Professor, Dept. of Psychology, University of Manchester
1992-1997: Assistant Professor of Communication Studies, Northwestern University
1995-1996: Ameritech Research Professor, Northwestern University
1995-1997: Assistant Professor of Education and Social Policy, Northwestern University
Spring, 1999: Adjunct Associate Professor (Virtual) of Communication Studies, University of Kansas
1997-2002: Associate Professor in Language, Literature, & Communication, Social Psychology, and Information Technology, Rensselaer Polytechnic Institute
May, 2005: Visiting Professor, School of Communication Studies, Kent State University
2002-2006: Associate Professor, Professor in Communication, Information Science, Cornell University
2006- 2013: Professor, Dept. of Communication, Michigan State University & Professor, Dept. of Telecommunication, Information Studies and Media, Michigan State University
2013 - 2017: Wee Kim Wee Professor, Division of Communication Research, Nanyang Technological University
2017- Present: Mark and Susan Bertelsen Presidential Chair in Technology and Society; Director of the Center for Information Technology & Society, UC Santa Barbara
Bibliography
Selected works:
See also
Social information processing (theory)
Hyperpersonal model
Computer-mediated communication
Warranting theory
Relational Maintenance and CMC - Tie Signs
References
External links
https://web.archive.org/web/20111228224314/http://www.tism.msu.edu/users/joseph-walther
https://michiganstate.academia.edu/JosephWalther
Living people
Michigan State University faculty
Cornell University faculty
1958 births |
8556788 | https://en.wikipedia.org/wiki/Parallels%20Desktop%20for%20Mac | Parallels Desktop for Mac | Parallels Desktop for Mac is software providing hardware virtualization for Macintosh computers with Intel processors, and since version 16.5 also for Apple silicon-based Macintosh computers. It is developed by Parallels, since 2018 a subsidiary of Corel.
Overview
Parallels, Inc. is a developer of desktop and server virtualization software.
History
Released on June 15, 2006, it was the first software product to bring mainstream virtualization to Macintosh computers utilizing the Apple–Intel architecture (earlier software products ran PC software in an emulated environment).
Its name initially was 'Parallels Workstation for Mac OS X', which was consistent with the company's corresponding Linux and Windows products. This name was not well received within the Mac community, where some felt that the name, particularly the term “workstation,” evoked the aesthetics of a Windows product. Parallels agreed: “Since we've got a great Mac product, we should make it look and sound like a Mac product...”; it was therefore renamed ‘Parallels Desktop for Mac’.
On January 10, 2007, Parallels Desktop 3.0 for Mac was awarded “Best in Show” at MacWorld 2007.
Technical
Parallels Desktop for Mac is hardware-emulation virtualization software, using hypervisor technology that works by mapping the host computer's hardware resources directly to the virtual machine's resources. Each virtual machine thus operates identically to a standalone computer, with virtually all the resources of a physical computer. Because all guest virtual machines use the same hardware drivers irrespective of the actual hardware on the host computer, virtual machine instances are highly portable between computers. For example, a running virtual machine can be stopped, copied to another physical computer, and restarted.
Parallels Desktop for Mac is able to virtualize a full set of standard PC hardware, including
A virtualized CPU of the same type as the host's physical processor,
ACPI compliance system,
A generic motherboard compatible with the Intel i965 chipset,
Up to 64 GB of RAM for guest virtual machines,
Up to 2 GB of video RAM (VRAM),
VGA and SVGA video adapter with VESA 3.0 support and OpenGL and DirectX 10.1 acceleration,
A 1.44 MB floppy drive, which can be mapped to a physical drive or to an image file,
Up to four IDE devices. This includes virtual hard drives ranging in size from 20 MB to 2 TB each and CD/DVD-ROM drives. Virtual CD/DVD-ROM drives can be mapped to either physical drives or ISO image files.
DVD/CD-ROM “pass-through” access,
Up to four serial ports that can be mapped to a pipe or to an output file,
Up to three bi-directional parallel ports, each of which can be mapped to a real port, to a real printer, or to an output file,
An Ethernet virtual network card compatible with Realtek RTL8029(AS), capable of up to 16 network interface connections,
Up to eight USB 2.0 devices and two USB 1.1 devices,
An AC'97-compatible sound card.
A 104-key Windows enhanced keyboard and a PS/2 wheel mouse.
Version history
Version 2.5
The first official release of version 2.5 was on February 27, 2007, as build 3186.
Version 2.5 brought support for USB 2.0 devices, which expanded the number of USB devices supported at native speed, including support for built-in iSight USB webcams. The amount of video RAM allocated to the guest OS was made adjustable, up to 32MB. Full featured CD/DVD drives arrived in this version, which allowed the user to burn disks directly in the virtual environment, and play any copy-protected CD or DVD as one would in Mac OS X. In addition, a shared clipboard and drag-drop support between Mac OS X and the guest OS was implemented. This version brought the ability for users with a Windows XP installation to upgrade to Windows Vista from within the VM environment. A new feature known as Coherence was added, which removed the Windows chrome, desktop, and the virtualization frames to create a more seamless desktop environment between Windows and Mac OS X applications. This version also allowed users to boot their existing Boot Camp Windows XP partitions, which eliminated the need to have multiple Windows installations on their Mac. A tool called Parallels Transporter was included to allow users to migrate their Windows PC, or existing VMware or Virtual PC VMs to Parallels Desktop for Mac.
Netsys lawsuit
In 2007, the German company Netsys GmbH sued Parallels' German distributor Avanquest for copyright violation, claiming that Parallels Desktop and Parallels Workstation are directly based on a line of products called “twoOStwo” that Parallels developed on paid commission for Netsys, of which it says, Netsys has been assigned all copyrights. Additionally, the lawsuit claimed that Parallels Desktop 2.5's compatibility with “twoOStwo” showed that the two software products are run by essentially the same functional core. When Netsys lost its initial urgency proceeding, it filed a new suit, in which it requested a temporary injunction from the Landgericht district court of Berlin.
Version 3.0
On June 7, 2007 build 4124 was released as the first publicly available version of Desktop 3.0.
Version 3.0 retained all of the functionality from previous versions and added new features and tools. Support for DirectX 8.1 and OpenGL was added, allowing Mac users to play some Windows games without the need to boot into Windows with Boot Camp. A new feature called SmartSelect offers cross OS file and application integration by allowing the user to open Windows files with Mac OS X programs and vice versa. Parallels Explorer was introduced, which allows the user to browse their Windows system files in Mac OS X without actually launching Windows. A new snapshot feature was included, allowing one to restore their virtual machine environment to a previous state in case of issues. Further, Parallels added a security manager to limit the amount of interaction between the Windows and Mac OS X installations. This version included a long-awaited complete “Parallels tools'” driver suite for Linux guest operating systems. Therefore, integration between Mac OS X and Linux guest-OS's was greatly improved.
Despite the addition of numerous new features, tools and added functionality, the first iteration of Parallels Desktop for Mac 3.0 was missing some of the features that Parallels had planned for it. A Parallels, Inc. representative stated at MacWorld in January 2007 that version 3.0 would bring accelerated graphics, “multi-core virtual machines/virtual SMP, some SCSI support, a more Mac-like feel, as well as a more sophisticated coherence mode, dubbed Coherence 2.0”. While accelerated graphics have materialised, Coherence, as well as the overall look and feel of Parallels Desktop for Mac has only changed slightly. Also, SCSI support has not been implemented.
It is currently unknown if these features have been abandoned altogether, or if they will show up in a later build of version 3.0.
Build 4560, released on July 17, 2007, added an imaging tool which allowed users to add capacity to their virtual disks.
Feature update
Build 5160, released on September 11, 2007, added some new features and updated some current features.
The release focused on updates to Coherence, with support for Exposé, window shadows, transparent windows, and the ability to overlap several Windows and Mac windows. Further, Parallels' Image Tool was updated to allow one to change their virtual hard disk format between plain and expanding.
Parallels Explorer was updated to allow for one to automatically mount an offline VM hard drive to the Mac desktop. Some new features added are iPhone support in Windows, allowing iTunes in Windows to sync with it. Users can now mirror desktops or other folders. Further, Mac drives can now be mapped by Windows and sound devices can now be changed ‘on the fly’. Up to 2 GB of RAM can be allocated to a virtual machine, with a total of 4 GB of RAM available.
Parallels Desktop for Mac Build 5608 added support for guest Parallels Tools for Linux in the latest Linux distributions (including Ubuntu 8). It also added support for running 3D graphics in Windows virtual machines on Mac OS X Leopard 10.5.3.
Use of code from the Wine project
According to Parallels' Licensing page, Desktop for Mac version 3.0 contains Direct3D code that was originally developed by the Wine open-source project. Wine software is licensed under the GNU Lesser General Public License, which required Parallels to release the source code. Parallels released the modified source code on July 2, 2007, about 2 weeks after the promised release date. A Parallels spokesman explained the reasons for the delay in a message on the official company blog.
Version 4.0
Version 4.0, released November 11, 2008, updates its GUI, adds some new features, enhances its performance by up to 50% and consumes 15–30% less power than previous versions. Version 4.0 is the first version that supports both 32-bit and 64-bit guest operating systems. Parallels Desktop 4.0 for Mac's 3D support includes DirectX 9.0, DirectX Pixel Shader 2.0 and OpenGL 2.0 as well as 256 MB video memory. It also adds support for 8 GB RAM in a virtual machine and 8-way SMP. Parallels Desktop 4.0 introduces an adaptive hypervisor, which allows users to focus the host computer's resources towards either host or the guest operating system.
Parallels Desktop 4.0 for Mac adds some new features such as:
A fourth viewing mode called Modality, which allows users to scale the size of an active guest operating system on the Mac's desktop
A new screenshot utility called Clips, which lets users take and share screenshots between the host and the guest operating systems.
Start Menu integration and Automatic Windows Notifications on the Apple Menu Bar.
The ability to use select voice commands to remotely control the virtual machine.
The ability to start and stop a virtual machine via the iPhone. (Requires installing an iPhone application from Apple's AppStore.)
Starting with the Version 4.0 release, Parallels Desktop for Mac has a new logo, which resembles an aluminum iMac, with what appears to be Windows XP on the screen and 2 parallel red lines overlaid on the right side.
Feature update
Build 3810, released January 9, 2009, includes performance enhancements and features, such as DirectX 9.0 Shaders Model 2 and Vertex Shader support for additional 3D support Intel Streaming SIMD Extensions (SSE4) for better media applications performance. Build 3810 also adds support for running Windows 7 in a VM and for running Mac OS X Snow Leopard Server as either a host or as a guest OS.
Also included are usability features such as the ability to share Windows files by dragging them directly to a Mac application in the Mac Dock. Windows can now also automatically start in the background when a user opens a Windows application on the Mac desktop. Version 4.0 drew criticism for problems upgrading from Version 3.0 shortly after its initial release. Build 3810 also addresses installation and upgrade issues previously experienced with Version 4.0 and introduces the option to enroll in the company's new Customer Experience Program, which lets customers provide information about their preferences and user priorities.
Version 5
Officially released on November 4, 2009, Parallels Desktop 5 adds several new features, mainly to improve integration with the host OS.
New features include:
3D graphics and speed improvements
Optimized for Mac OS X 10.6 (Snow Leopard)
Support for Windows 7
Theming of Windows applications to make them look like native applications
Support for Multi-Touch gestures (from a trackpad or Magic Mouse) and the Apple Remote
The ability to drag and drop formatted text and images between Windows, Linux, and Mac applications,
The ability for a system administrator to lock down a virtual machine so that users can't change the state of the virtual machine,
Support for OpenGL 2.1 for Linux guest virtual machines.
Support for DirectX 9c with Shader Model 3.
Feature update
Build 9308, released on December 21, 2009, added some new features.
Linux guest operating systems
Parallels Tools support Xorg 1.7 in Fedora 12 virtual machines (experimental)
Parallels Tools support Mandriva 2010 (experimental)
OpenSUSE 11.1 installation media auto detection
Virtualization
Improved performance for USB mass storage.
Windows guest operating systems
Improved resume from suspend in virtual machines with multiple monitors assigned.
Improved performance for file access via Shared Folders.
3D and video
Improved performance for video playback in Windows Vista and Windows 7.
Windows Aero is not available by default for machines with Intel GMA X3100 and GMA 950 graphic adapters (some MacBook and Mac Mini models). It is available on MacBooks with NVIDIA 9400M graphics cards.
Vertical synchronization is now configurable. You can configure these settings using the corresponding option in the virtual machine video configuration page.
Improved 3D performance for the video game Mirror's Edge.
macOS Server guest operating system
The ability to pass kernel options to the macOS Server guest OS has been added. To do so, enable the "Select boot device on startup" option in the virtual machine configuration, which will enable you to specify the necessary kernel options in the 5-second timeout before booting the kernel.
Version 6
Officially announced on September 9, 2010 and launched on September 14, 2010, Parallels Desktop 6 has full 64-bit support for the first time. Parallels claims that Parallels Desktop 6 for Mac "[has] over 80 new and improved features, including speed 40% above the previous version."
Specific new features include:
An all-new 64-bit engine
5.1 Surround Sound support
Better import of VMware and Virtual PC virtual machines and Boot Camp partitions
Improved network, hard drive and Transporter performance
Windows program Spotlight integration
Faster Windows launch time
Enhanced 3D graphics that are 40% better than previous versions
Ability to extend Mac OS X Parental Controls to Windows applications
Ability to use Mac OS X keyboard shortcuts in Windows applications
Enhanced Spaces and Exposé support
Version 7
Officially announced on September 1, 2011 and released on September 6, 2011, Parallels Desktop 7 adds many new features. These include:
Integration with OS X 10.7 "Lion":
Full-screen support
Use of Launchpad for Windows apps
Mission Control support
Lion as a guest OS
Lion animations support
Improved user interface
New standard help and documentation
Shared devices with Mac OS X
Longer battery life
Mac OS X parental controls support
Support for Intel AES-NI encryption
Enhanced performance and 3D graphics
Support for up to 1GB video memory in virtual machine
Enhanced audio support - up to 192 kHz
Surround sound 7.1
Added support for Windows 7
Version 8
Officially announced August 22, 2012 and released September 4, 2012, Parallels Desktop 8 adds many new features:
OS X 10.8 "Mountain Lion" as a guest OS
Retina resolution can be passed to virtual machines
Windows 7 and Windows 8 automatically optimised for best experience on Retina
Parallels Desktop notifications
Notification Center support for Windows 8 toast notifications
Mountain Lion Dictation in Windows apps
Full screen on demand for Windows applications in Coherence
Presentation Wizard
Open in Internet Explorer button for Safari
Drag & drop file to Outlook in the Dock opens new email with attachment
Multi-language Keyboard Sync in Mac and Windows
Full support for new Modern UI Windows 8 applications (Dock, Mission Control, Launchpad)
Reworked Keyboard shortcuts preferences
Use the standard OS X system preferences to set Parallels Desktop application shortcuts.
Resources (CPU/RAM) monitoring
Indication for VM hard drive space usage
Shared Bluetooth
Improved virtual machine boot time; Windows boots up to 25% faster than in the previous version
Pause & resume Windows up to 25% faster than previous version
Input/output (I/O) operations are up to 35% faster than previous version
Games run up to 30% faster than previous version
DirectX 10 support
Full USB 3.0 support for faster connections to peripheral devices for Virtual Machines starting from Parallels Desktop 8.0.18305
Version 9
Officially announced on August 29, 2013 and released on September 5, 2013, Parallels Desktop 9 for Mac includes these new features and enhancements:
Brings back the "real" Start menu for Windows 8 and enables Modern apps in separate windows instead of full screen
Power Nap support, so applications stay up-to-date on Retina Display Mac and MacBook Air computers
Thunderbolt and FireWire storage devices can be designated to connect to a Windows virtual machine
Sticky Multi-monitor setup remembers settings and puts Windows virtual machines back into Full Screen mode on the remote monitor
Sync iCloud, SkyDrive, Dropbox and more without unnecessary duplication of files
Windows apps can launch the OS X Mountain Lion Dictionary with Dictionary gesture
Enhanced integration with MacOS for Linux users
Enhanced New Virtual Machine Wizard makes it easier to set up a new virtual machine, especially on computers without hard drives
PDF printer for Windows to print from any Windows application to a PDF on the Mac desktop, even if the application doesn't have that functionality
Compatibility with OS X 10.9 "Mavericks"
Easily install and access complimentary security software subscriptions from one location
Up to 40% better disk performance than previous versions
Virtual machines shut down up to 25% faster and suspend up to 20% faster than with Parallels Desktop 8
3D graphics and web browsing are 15% faster than in Parallels Desktop 8
Enterprise version:
Set an expiration date for the virtual machine.
Run virtual machines in headless mode.
Start virtual machines on Mac boot.
Version 9 is the last version to support Snow Leopard.
Version 10
Released August 20, 2014, Parallels Desktop 10 for Mac includes support for OS X 10.10 "Yosemite", and ends support for Snow Leopard.
Less than a year after its release, Parallels spokesperson John Uppendahl confirmed version 10 would not be fully compatible with Windows 10. The coherence mode, which integrates the Windows user interface with OS X, would not be updated, and users would need to purchase and upgrade to version 11 to continue using this feature.
Version 11
Released August 19, 2015, Parallels Desktop 11 for Mac includes support for Windows 10 and is ready for OS X 10.11 "El Capitan".
Parallels Desktop 11 for Mac is available as a one-time purchase of $79.99 for the Desktop edition, and as an annual subscription of $99.99 for the Pro edition. Version 11 has multiple issues with macOS 10.13 High Sierra. Parallels' website offers only a full-price upgrade to Version 13 as a remedy, effectively making this version obsolete on newer macOS releases.
Version 12
Released August 18, 2016.
Version 13
Released August 22, 2017, Parallels Desktop 13 for Mac provides macOS High Sierra readiness and support for upcoming Windows 10 features. According to Parallels, the new version makes it simple for MacBook Pro users to add Windows applications to the Touch Bar, and to use the Touch Bar within Windows applications. It is also the first solution to bring the upcoming Windows 10 People Bar feature to the Mac, including integration with the Mac Dock and Spotlight. The new version also features up to 100 percent performance improvements for completing certain tasks. The update also brings in a slightly refreshed UI to better match macOS and visual improvements for Windows users on Retina displays.
Version 14
Released August 21, 2018, Parallels Desktop 14 supports macOS 10.14 "Mojave".
Version 15
Released August 13, 2019.
Version 16
Released August 11, 2020, Parallels Desktop 16 for Mac comes with the following highlights:
Is ready for the new macOS Big Sur architecture
In Windows and Linux VMs, DirectX 11 is 20 percent faster and there are improvements for the OpenGL 3 graphics
Battery life is up to 10 percent longer when users activate “Travel Mode” in Windows
Users can now zoom and rotate with the Trackpad in Windows apps
More printing options: Print on both sides and paper sizes from A0 to envelope.
New features are added to Parallels Desktop for Mac Pro Edition:
Easier export of a virtual machine in a compressed format, ready for transfer to another Mac or an SSD
Give custom networks an individual name
On April 14, 2021, Parallels updated the software to version 16.5, notably adding support for Apple silicon-based Macs. On such Macs, only ARM-compatible OSes can be run in VMs; Parallels does not emulate the x86 architecture. Supported guest OSes include Windows Insider builds of Windows 10 (as no retail ARM versions of Windows 10 nor installation disk images for such versions are publicly available), as well as ARM builds of various Linux distributions.
Version 17
Released August 10, 2021, Parallels Desktop 17 for Mac comes with the following highlights:
Optimized for Apple M1 chip.
Added support for USB 3.1 devices.
Added multi-monitor support for Linux.
Added drag-and-drop support for text or graphics between Mac and Windows applications.
Version 17.1, released October 14, 2021, is fully compatible with macOS Monterey and adds support for Windows 11 as a guest OS.
Supported operating systems
Parallels Desktop for Mac Business, Home and Pro Editions require these versions of macOS:
Parallels Desktop 11 and 12 only partially support macOS "High Sierra":
A In Coherence Mode, windows may appear under macOS windows, and some graphics artifacts may occur.
B Neither Parallels Desktop 11 nor 12 fully supports Apple File System (APFS) disks, including virtual disks and Boot Camp partitions. Therefore, a "High Sierra" guest machine must be installed 'manually' by passing the "--converttoapfs NO" command line switch, and cannot use the automated Parallels virtual machine creation process.
C Versions are partially compatible with the corresponding macOS versions and may not work correctly.
Guest
Parallels Desktop 16 for Mac includes support for a variety of different guest operating systems:
Several versions of Windows: Windows 10, Windows 8.1, Windows 8, Windows Server 2019, Windows Server 2016, Windows Server 2012 R2, Windows 7 (SP0-SP1), Windows Server 2008 R2 (SP0-SP2), Windows Vista Home, Business, Ultimate and Enterprise (SP0-SP2), Windows Server 2003 R2 (SP0-SP2), Windows XP (SP0-SP3), Windows 2000 Professional SP4, Windows 2000 Server SP4
Linux distributions: Red Hat Enterprise Linux (RHEL) 8, 7 and 6, CentOS Linux 8, 7 and 6, Fedora Linux 32, 31, 30 and 29, Ubuntu 20.04, 19.10, 19.04, 18.04 LTS and 16.04 LTS, Debian GNU/Linux 10, 9 and 8, Suse Linux Enterprise 15, OpenSUSE Linux 15.2, 15.1 and 15, Linux Mint 20, 19 and 18, Kali 2020.2, 2019 and 2018, elementary OS 5.0, Manjaro 18, Mageia 7 and 6 and more
Android (only when users download the version of Parallels Desktop that includes the Installation Assistant)
It is also possible to install macOS versions in a VM: macOS Big Sur 11, macOS Catalina 10.15, macOS Mojave 10.14, macOS High Sierra 10.13, macOS Sierra 10.12, OS X El Capitan 10.11, OS X Yosemite 10.10, OS X Mavericks 10.9, OS X Mountain Lion 10.8, OS X Lion 10.7, OS X Lion Server 10.7, Mac OS X Snow Leopard Server 10.6, Mac OS X Leopard Server 10.5
In Parallels Desktop 10 for Mac, support for guest operating systems includes a variety of 32-bit and 64-bit x86 operating systems, including:
MS-DOS
Multiple versions of Windows, including Windows 8 and Windows 8.1. Windows 8.1 generally had to be installed from a DVD, since Microsoft initially offered only the ".exe" version of Windows 8.1 in downloadable form and did not offer the ".iso" version as a download; Microsoft released an ISO version of Windows 8.1 a few months later.
Mac OS X Leopard Server, Snow Leopard Server, and Mac OS X Lion (only with Mac OS X Lion as host OS)
Various Linux distributions
FreeBSD
eComStation, OS/2, Solaris
See also
Desktop virtualization
Virtual machine
Platform virtualization
x86 virtualization
Virtual disk image
References
External links
Virtualization software
MacOS-only software
Software derived from or incorporating Wine
Software that uses Qt
Proprietary software that uses Qt |
Tony Hoare
Sir Charles Antony Richard Hoare (Tony Hoare or C. A. R. Hoare; born 11 January 1934) is a British computer scientist who has made foundational contributions to programming languages, algorithms, operating systems, formal verification, and concurrent computing. His work earned him the Turing Award, usually regarded as the highest distinction in computer science, in 1980.
Hoare developed the sorting algorithm quicksort in 1959–1960. He developed Hoare logic, an axiomatic basis for verifying program correctness. In the semantics of concurrency, he introduced the formal language communicating sequential processes (CSP) to specify the interactions of concurrent processes, and along with Edsger Dijkstra, formulated the dining philosophers problem. He is also credited with development (and later criticism) of the null pointer, having introduced it in the ALGOL family of languages. Since 1977, he has held positions at the University of Oxford and Microsoft Research in Cambridge.
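For illustration, the partitioning idea at the heart of quicksort can be sketched as follows. This is a minimal modern C++ rendering of the Hoare partition scheme, not Hoare's original 1961 ALGOL code; it scans from both ends of the range and swaps out-of-place elements, which is why it typically performs fewer swaps than the later Lomuto variant.

    #include <utility>
    #include <vector>

    // Hoare partition: two indices move toward each other, swapping elements
    // that are on the wrong side of the pivot; returns the split point.
    int partition(std::vector<int>& a, int lo, int hi) {
        int pivot = a[lo + (hi - lo) / 2];
        int i = lo - 1, j = hi + 1;
        while (true) {
            do { ++i; } while (a[i] < pivot);
            do { --j; } while (a[j] > pivot);
            if (i >= j) return j;
            std::swap(a[i], a[j]);
        }
    }

    void quicksort(std::vector<int>& a, int lo, int hi) {
        if (lo < hi) {
            int p = partition(a, lo, hi);
            quicksort(a, lo, p);      // the element at p stays in the left part
            quicksort(a, p + 1, hi);
        }
    }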
Education and early life
Tony Hoare was born in Colombo, Ceylon (now Sri Lanka) to British parents; his father was a colonial civil servant and his mother was the daughter of a tea planter. Hoare was educated in England at the Dragon School in Oxford and the King's School in Canterbury. He then studied Classics and Philosophy ("Greats") at Merton College, Oxford. On graduating in 1956 he did 18 months National Service in the Royal Navy, where he learned Russian. He returned to the University of Oxford in 1958 to study for a postgraduate certificate in statistics, and it was here that he began computer programming, having been taught Autocode on the Ferranti Mercury by Leslie Fox. He then went to Moscow State University as a British Council exchange student, where he studied machine translation under Andrey Kolmogorov.
Research and career
In 1960, Hoare left the Soviet Union and began working at Elliott Brothers Ltd, a small computer manufacturing firm located in London. There, he implemented the language ALGOL 60 and began developing major algorithms.
He was involved with developing international standards in programming and informatics, as a member of the International Federation for Information Processing (IFIP) Working Group 2.1 on Algorithmic Languages and Calculi, which specified, maintains, and supports the languages ALGOL 60 and ALGOL 68.
He became the Professor of Computing Science at the Queen's University of Belfast in 1968, and in 1977 returned to Oxford as the Professor of Computing to lead the Programming Research Group in the Oxford University Computing Laboratory (now Department of Computer Science, University of Oxford), following the death of Christopher Strachey. He is now an Emeritus Professor there, and is also a principal researcher at Microsoft Research in Cambridge, England.
Hoare's most significant work has been in the following areas: his sorting and selection algorithm (Quicksort and Quickselect), Hoare logic, the formal language communicating sequential processes (CSP) used to specify the interactions between concurrent processes (and implemented in various programming languages such as occam), structuring computer operating systems using the monitor concept, and the axiomatic specification of programming languages.
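As a brief illustration of the notation (a standard textbook sketch, not drawn from this article), a Hoare triple, the assignment axiom, and a simple CSP process can be written as:

    % A Hoare triple: if P holds before C runs and C terminates, then Q holds afterwards.
    \{P\}\; C \;\{Q\}

    % The assignment axiom schema, where P[e/x] is P with e substituted for x:
    \{P[e/x]\}\; x := e \;\{P\}

    % An example instance:
    \{x + 1 \le 10\}\; x := x + 1 \;\{x \le 10\}

    % A simple CSP process: a vending machine that repeatedly accepts a coin and dispenses a chocolate.
    VM = coin \rightarrow (choc \rightarrow VM)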
Apologies and retractions
Speaking at a software conference in 2009, Tony Hoare apologized for inventing the null reference, calling it his "billion-dollar mistake".
For many years under his leadership, Hoare's Oxford department worked on formal specification languages such as CSP and Z. These did not achieve the expected take-up by industry, and in 1995 Hoare was led to reflect upon the original assumptions.
Awards and honours
Distinguished Fellow of the British Computer Society (1978)
Turing Award (1980) for "fundamental contributions to the definition and design of programming languages". The award was presented to him at the ACM Annual Conference in Nashville, Tennessee, on 27 October 1980, by Walter Carlson, chairman of the Awards committee. A transcript of Hoare's speech was published in Communications of the ACM.
Harry H. Goode Memorial Award (1981)
Fellow of the Royal Society (1982)
Honorary Doctorate of Science by the Queen's University Belfast (1987)
Honorary Doctorate of Science, from the University of Bath (1993)
Honorary Fellow, Kellogg College, Oxford (1998)
Knighted for services to education and computer science (2000)
Kyoto Prize for Information science (2000)
Fellow of the Royal Academy of Engineering (2005)
Member of the National Academy of Engineering (2006) for fundamental contributions to computer science in the areas of algorithms, operating systems, and programming languages.
Computer History Museum (CHM) in Mountain View, California Fellow of the Museum "for development of the Quicksort algorithm and for lifelong contributions to the theory of programming languages" (2006)
Honorary Doctorate from Heriot-Watt University (2007)
Honorary Doctorate of Science from the Department of Informatics of the Athens University of Economics and Business (AUEB) (2007)
Friedrich L. Bauer-Prize, Technical University of Munich (2007)
SIGPLAN Programming Languages Achievement Award (2011)
IEEE John von Neumann Medal (2011)
Honorary Doctorate, University of Warsaw (2012)
Honorary Doctorate, Complutense University of Madrid (2013)
Personal life
In 1962, Hoare married Jill Pym, a member of his research team.
Books
C. A. R. Hoare (1985). Communicating Sequential Processes. Prentice Hall International Series in Computer Science. Available online at http://www.usingcsp.com/ in PDF format.
References
External links
1934 births
Living people
People from Colombo
People educated at The Dragon School
People educated at The King's School, Canterbury
Alumni of Merton College, Oxford
Academics of Queen's University Belfast
British computer scientists
Fellows of the British Computer Society
Fellows of the Royal Academy of Engineering
Fellows of the Royal Society
Foreign associates of the National Academy of Sciences
Fellows of Wolfson College, Oxford
Formal methods people
History of computing in the United Kingdom
Knights Bachelor
Kyoto laureates in Advanced Technology
Members of the Department of Computer Science, University of Oxford
Microsoft employees
Moscow State University alumni
Programming language researchers
Turing Award laureates
Computer science writers
British expatriates in Sri Lanka
British expatriates in the Soviet Union
Fellows of Merton College, Oxford |
Renfe Class 319 (later versions)
The Renfe classes 319.2, 319.3 and 319.4 are six-axle Co'Co' medium-power mainline diesel-electric locomotives manufactured by Macosa using General Motors Electro-Motive Division components under license.
Background and design
The first GM mainline locomotives in Spain were the Renfe Class 1900 locomotives, introduced in the mid 1960s and built both in America and under license by Macosa; over one hundred were built and these were later given the numbers 319-001 to 319-103. In the 1980s the company started to upgrade its diesel fleet; the original class 319s began to be scrapped and a new version was constructed (initially twenty locomotives), forming the sub-class 319.2 with numbers running from 319.201 upwards.
The new locomotive continued the GM-EMD tradition using the same engine and bogies but with other components completely new such as the generator. Opinions differ as to whether or not the locomotives represented a re-build or conversion of the old class. However some of the old components from the earlier locomotives were reused.
A trial batch of 20 locomotives was produced, and eventually 108 locomotives of the class were built in total.
The first locomotives were constructed between 1984 and 1985 and were numbered 319.201 to 319.220; these had three windows in the front of the cabin, and as a result got the nickname Retales (meaning 'patchwork', from retal, a scrap, piece or oddment). Most were painted in a grey/blue/yellow colour scheme, and later received the grey/yellow 'Taxi' colour scheme. Two units also received a brown, yellow and orange livery named Estrella. The machines worked well, and a further 38 locomotives were built between 1990 and 1991 by Meinfesa, to the same electro-mechanical design but with two-pane front windows and a different body shape; these locomotives also had an air-conditioning unit fitted as standard. These 58 machines formed the sub-class 319.2. Eight of the units were built for AVE with 1435 mm gauge running gear (319.241 to 319.248) and received the white/grey AVE livery; the remainder received the standard freight yellow/grey 'Taxi' livery.
A second series of 40 locomotives, 319.3 were built between 1991–2 with an additional electrical generator for electrical train supply and a higher operating speed for passenger train services. The diesel engines had a higher engine bore helping to provide the additional 300 kW power required by the auxiliary generator, and the additional demands of higher speeds. The locomotives were delivered in standard grey/yellow livery but many subsequently received the blue and white liveries of the Grandes Lineas Altaria and Arco passenger services.
The final sub-class, 319.4, was made in 1992. Built for freight work without head-end power, these locomotives differed from all previous versions, replacing electromechanical speed control with more advanced microprocessor-controlled anti-wheel-slip control (described by GM-EMD as Super Series control) working in conjunction with speed-sensing ground radar. Externally the locomotives are indistinguishable from the earlier version. All were given the standard freight 'Taxi' blue and grey livery with large numerals.
Work history
The sub-classes 319.2 and 319.4 worked primarily on freight trains, operating in multiple on heavier trains; some of the 319.2 series, fitted with standard gauge wheelsets, were also used on the construction of the first AVE line between Madrid and Seville.
The class 319.3 initially worked on passenger services, as designed, but the locomotives have been displaced due to their relatively low power. By 2004, five had been assigned to infrastructure trains.
One unit fitted with Siemens equipment was used for testing on the installation of ETCS on the Madrid to Barcelona to France high speed line.
Exports
Argentina
Fifty-three of the 319 class, along with around 100 diesel multiple units, over one hundred and thirty Talgo carriages, and over one hundred other carriages and vans, were sold to the Ministry of Transport of Argentina in a sale worth €120 million. The vehicles were acquired for its Plan Nacional de Recuperación Ferroviaria de Argentina, part of a countrywide development investment plan worth $111 billion.
By 2010 over twenty of the class had been shipped, with some still undergoing overhaul in preparation. In Argentina the locomotives have been used for the Línea San Martín (LSM).
Saudi Arabia
In 2014 several class 319 were transferred to the Saudi Railways Organization.
Miniature models
In HO scale Roco has produced the different variants of the subclass 319.2, as well as the 319.3 and 319.4 subclasses. The original single cab "american" version has also been produced in HO scale.
In N scale Startrain has produced subclasses 319.2, 319.3 and 319.4.
References
Images and videos
External links
Serie 319, brief description and images, Jorge Sanz Mongay, www.jorges.arrakis.es
RENFE 319, GM Locomotives in Europe, images, Lolke Bijlsma, www.lolkebijlsma.com
EMD Locomotives in Spain personales.ya.com
Railway locomotives introduced in 1984
Diesel locomotives of Spain
Macosa/Meinfesa/Vossloh Espana locomotives
Electro-Motive Diesel locomotives
Standard gauge locomotives of Spain
5 ft 6 in gauge locomotives
Diesel-electric locomotives of Spain
Mixed traffic locomotives |
MindMeister
MindMeister is an online mind mapping application that allows its users to visualize, share and present their thoughts via the cloud. MindMeister was launched in 2007 by MeisterLabs GmbH, a software company founded by Michael Hollauf and Till Vollmer. After 10 years in the market, MindMeister has more than 7 million users, who have created more than a billion ideas to date.
Overview
MindMeister provides a way to visualize information in mind maps utilizing user modeling, while also providing tools to facilitate real-time collaboration, coordinate task management and create presentations. By using cloud storage, MindMeister users can share updates in mind maps in real-time with other users across in-browser and mobile apps. Mind maps can be shared both privately with an unlimited number of users or publicly.
MindMeister is based on a freemium model, with a basic account available free of charge, providing limited functionality. The commercial model is built upon 4 different pricing levels with a choice of monthly or yearly subscription-charges. These pricing plans are titled Basic, Personal, Pro and Business. For use in the education sector, 3 different functional levels are available.
The aim of MindMeister is to enable individuals to collaborate on a mind map, where everyone can share ideas, comments and plans, as well as vote on ideas in real-time. MindMeister allows users to share and edit mind maps, leave comments and feedback, attach files, images, videos, and link to external, as well as internal sources, via embedded URLs. Mind maps can be shared with colleagues internally or externally via an email invitation to collaborate, or via a hyperlink. Mind maps can also be turned into interactive presentations.
Development
The idea behind MindMeister was first devised when the two founders, Michael Hollauf and Till Vollmer, were working together using Writely, which had recently been acquired by Google and would become Google Docs, and the mind mapping tool MindManager. At the time, MindManager had to be installed locally, which made it hard to share mind maps externally or with anyone who had not installed the software. While using Google Docs and MindManager together, the idea was born to combine the two, forming a collaborative online mind mapping tool which could be easily shared and edited via the cloud. In 2015, MeisterTask was released, a Kanban-style collaborative task management application for agile teams that tightly integrates with the technology and methodology of MindMeister.
Milestones
In 2006, MindMeister's first prototypes were created, in which mind maps were developed with 1x1 px DIVs
On February 7, 2007, MindMeister was released as a private beta. In the same year, MindMeister was awarded the Red Herring 100 Europe Award.
In 2008, MindMeister 2.0 was released. In this release, the History View was added.
In 2009, 5 additional languages were added and, with the advent of the iPhone, MindMeister for iOS was released.
In 2010, MindMeister was added to the Google Apps Marketplace and the first native version for iPad was released.
In 2011, MindMeister for Android was released and the presentation mode was integrated in the online version.
In 2012, MindMeister integrated with Google Drive.
In 2013, MindMeister Groups were introduced and MindMeister integrated with Google Hangouts.
In 2014, the add-on for Google Docs was released and MindMeister became a Google Cloud Premier Partner. New features released include Comments and Votes, New Map Layouts and Video Support.
In 2015, an integration with BiggerPlate was added, as well as Geistesblitz for the Apple Watch.
See also
List of collaborative software
List of concept- and mind-mapping software
Mind mapping
Visual thinking
References
Further reading
"MindMeister". Maximum PC. October 2007.
External links
Knowledge representation software
Mind-mapping software
Collaborative software
Concept mapping software
Argument mapping
Visual thinking |
Comparison of OpenGL and Direct3D
Direct3D and OpenGL are competing application programming interfaces (APIs) which can be used in applications to render 2D and 3D computer graphics. Modern graphics processing units (GPUs) almost always implement one version of both of these APIs. Examples include: DirectX 9 and OpenGL 2 circa 2004; DirectX 10 and OpenGL 3 circa 2008; and most recently, DirectX 11 and OpenGL 4 circa 2011. GPUs that support more recent versions of the standards are backward compatible with applications that use the older standards; for example, one can run older DirectX 9 games on a more recent DirectX 11-certified GPU.
Availability
Direct3D application development targets the Microsoft Windows platform.
The OpenGL API is an open standard, which means that various hardware makers and operating system developers can freely create an OpenGL implementation as part of their system. OpenGL implementations exist for a wide variety of platforms. Most notably, OpenGL is the dominating graphics API of Unix-like computer systems.
From an application developer's perspective, Direct3D and OpenGL are equally open; full documentation and necessary development tools are available with no restrictions.
In more detail, the two computer graphics APIs are the following:
Direct3D is a proprietary API by Microsoft that provides functions to render two-dimensional (2D) and three-dimensional (3D) graphics, and uses hardware acceleration if available on the graphics card. It was designed by Microsoft Corporation for use on the Windows platform.
OpenGL is an open standard API that provides many functions to render 2D and 3D graphics, and is available on most modern operating systems including but not limited to Windows, macOS, and Linux.
Note that many essential OpenGL extensions and methods, although documented, are also patented, thus imposing serious legal troubles to implement them (see issues with Mesa).
OpenGL and Direct3D are both implemented in the display device driver. A significant difference however is that Direct3D implements the API in a common runtime (supplied by Microsoft), which in turn talks to a low-level device driver interface (DDI). With OpenGL, every vendor implements the full API in the driver. This means that some API functions may have slightly different behavior from one vendor to the next. The GLSL shader compilers of different vendors also show slightly different behavior.
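Because of these per-vendor differences, portable applications typically check the compile status and read back the driver's info log after compiling a shader. A minimal C++ sketch using standard OpenGL 2.0 entry points (the GL_GLEXT_PROTOTYPES route below assumes a Linux-style header setup; on Windows the entry points must be fetched through an extension loader):

    #define GL_GLEXT_PROTOTYPES   // assumption: prototypes exposed by the headers
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <cstdio>

    // Compile a GLSL shader and print the driver's info log, which is where
    // vendor-specific compiler differences usually surface.
    GLuint compileShader(GLenum type, const char* source) {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &source, nullptr);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (ok != GL_TRUE) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
            std::fprintf(stderr, "shader compile failed:\n%s\n", log);
        }
        return shader;
    }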
The following compares the two APIs, structured around various considerations mostly relevant to game development.
Portability
The proprietary Direct3D is officially implemented only on Microsoft's Windows family of operating systems, including embedded versions used in the Xbox family of video game consoles and Sega's Dreamcast. Several mostly functional reimplementations of the Direct3D API have been made by third parties such as Wine, a project to port common Windows APIs to Unix-like operating systems, and Cedega, a proprietary fork of Wine. However, this process is progressively impeded due to the interdependence of DirectX upon many other proprietary components of Windows, and because Direct3D's proprietary nature requires the difficult process of reverse engineering.
OpenGL has implementations available across many platforms, including Microsoft Windows and Unix-based systems such as Mac OS X and Linux. Nintendo and Sony have developed their own libraries which are similar but not identical to OpenGL. A subset of OpenGL was chosen as the main graphics library for Android, BlackBerry, iOS, and Symbian in the OpenGL ES form.
Microsoft's OpenGL driver provides hardware acceleration in Windows Vista; support was dropped in Windows XP, soon after they failed to deliver Fahrenheit graphics API low level support for an OpenGL-Direct3D merger in the late 1990s. OpenGL hardware acceleration on Windows is achieved by users first installing installable client drivers (ICDs) developed by GPU makers. These ICDs are, in virtually all cases, bundled with the standard driver download package from the hardware vendor (IHV), so installing recent graphics drivers is sufficient to provide hardware OpenGL support.
More recently, Google's Almost Native Graphics Layer Engine (ANGLE) project provides a means to convert OpenGL ES 2.0 application calls to DirectX 9. This is done so that WebGL (a subset variant of OpenGL for the web) can run on the common Direct3D runtime, which means there will be less variation between vendors.
Ease of use
Direct3D
The first version of Direct3D in 1996 elicited broad criticism because even simple operations, such as state changes, required creating and submitting objects called execute buffers. In contrast, in OpenGL most state changes can be performed with one function call. The Direct3D model frustrated many programmers. A very famous complaint was made by high-profile game developer John D. Carmack in his .plan file in which he urged Microsoft to abandon Direct3D in favor of OpenGL. Carmack explained, "OpenGL is easy to use and fun to experiment with. D3D is not. ... Many things that are a single line of GL code require half a page of D3D code - to allocate a structure, set a size, fill something in, call a COM routine, then extract the result." Chris Hecker made a similar request in an "Open Letter to Microsoft" in the April–May 1997 issue of Game Developer Magazine.
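To illustrate the point about state changes, enabling alpha blending in immediate-mode OpenGL is a pair of direct calls into the state machine, whereas the Direct3D 3 path required building and submitting an execute buffer. A minimal sketch of the OpenGL side:

    #include <GL/gl.h>

    // Two calls into the OpenGL state machine -- the kind of one-liner the
    // .plan file contrasted with half a page of Direct3D 3 setup code.
    void enableAlphaBlending() {
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    }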
Version 5 (the second version, named to reflect its release as part of DirectX 5) replaced execute buffers with the new DrawPrimitive API, but it was still considered cumbersome. Chris Hecker's "Open Letter to Microsoft" referred to DrawPrimitive as "an immature and poorly-designed clone of OpenGL that's missing some of the architectural decisions that make OpenGL fast."
Despite the controversy, Microsoft continued to evolve the API. A detailed history of releases and added features is given on the Microsoft Direct3D web pages.
Some former critics of Direct3D acknowledge that now Direct3D is as good if not better than OpenGL in abilities and ease of use. In January 2007, John Carmack said that "…DX9 is really quite a good API level. Even with the Direct3D side of things, where I know I have a long history of people thinking I'm antagonistic against it. Microsoft has done a very, very good job of sensibly evolving it at each step—they're not worried about breaking backwards compatibility—and it's a pretty clean API. I especially like the work I'm doing on the 360, and it's probably the best graphics API as far as a sensibly designed thing that I've worked with."
Some design features of Direct3D have remained unchanged since version one, most notably its reliance on Microsoft's Component Object Model (COM). One advantage of using COM is that the API can be used in any COM-aware language, notably Object Pascal (Delphi), and Microsoft Visual C++, C#, and Visual Basic .NET.
OpenGL
OpenGL is a specification implemented in the programming language C, though it can be used in other languages. It is built on the concept of a state machine. As an API, OpenGL depends on no one programming language feature, and can be made callable from almost any language with the proper bindings. Such bindings exist for most current programming languages.
Comparison
In general, Direct3D is designed to virtualize 3D hardware interfaces. Direct3D frees the game programmer from accommodating the graphics hardware. OpenGL, on the other hand, is designed to be a 3D hardware-accelerated rendering system that may be emulated in software. These two APIs are fundamentally designed under two separate modes of thought.
As such, there are functional differences in how the two APIs work. One functional difference between the APIs is in how they manage hardware resources. Direct3D expects the application to do it, OpenGL makes the implementation do it. This tradeoff for OpenGL decreases difficulty in developing for the API, while at the same time increasing the complexity of creating an implementation (or driver) that performs well. With Direct3D, the developer must manage hardware resources independently; however, the implementation is simpler, and developers have the flexibility to allocate resources in the most efficient way possible for their application.
Until about 2005, another functional difference between the APIs was the way they handled rendering to textures. The Direct3D method (SetRenderTarget()) is convenient, while prior versions of OpenGL required manipulating pixel buffers (P-buffers). This was cumbersome and risky: if the codepath used in a program was different from that anticipated by a driver maker, the code would fall back to software rendering, causing a substantial performance drop. However, broad support for the frame buffer objects extension, which provided an OpenGL equivalent of the Direct3D method, successfully addressed this shortcoming, and the render target feature of OpenGL brought it up to par with Direct3D in this aspect.
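A sketch of the framebuffer-object approach (ARB_framebuffer_object, core since OpenGL 3.0), with error handling omitted and prototypes assumed to be available through the headers or an extension loader:

    #define GL_GLEXT_PROTOTYPES   // assumption: prototypes exposed by the headers
    #include <GL/gl.h>
    #include <GL/glext.h>

    // Create a 512x512 texture and attach it to a framebuffer object so that
    // subsequent draw calls render into the texture instead of the window.
    GLuint createRenderTarget(GLuint* outColorTex) {
        GLuint colorTex = 0, fbo = 0;

        glGenTextures(1, &colorTex);
        glBindTexture(GL_TEXTURE_2D, colorTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, colorTex, 0);

        // Bind framebuffer 0 again when done to return to the default framebuffer.
        *outColorTex = colorTex;
        return fbo;
    }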
Outside of a few minor functional differences which have mostly been addressed over the years, the two APIs provide nearly the same level of function. Hardware and software makers generally respond rapidly to changes in DirectX, e.g., pixel processor and shader requirements in DirectX 9 to stream processors in DirectX 10, to tessellation in DirectX 11. In contrast, new features in OpenGL are usually implemented first by vendors and then retroactively applied to the standard.
Performance
Shortly after the establishment of both Direct3D and OpenGL as viable graphics libraries (circa 1995), Microsoft and SGI engaged in what has been called the "API Wars". Much of the argument revolved around which API offered superior performance. This question was relevant due to the very high cost of dedicated graphics processors during this time, which meant the consumer market was using software renderers implemented by Microsoft for both Direct3D and OpenGL.
Early debate
DOS business software such as AutoCAD and DOS games such as id Software's Quake originally had to be optimized to run on many different graphics chipsets. When hardware makers such as 3Dlabs (member of the OpenGL Architecture Review Board) made OpenGL compatible graphics accelerators (e.g., GLint chip), developers such as John Carmack of id Software optimized their products for OpenGL. As multitasking user environments such as Windows and the X Window System (X11) on Unix-like systems became prevalent, the relevance of this hardware faded.
Microsoft had marketed Direct3D as faster based on in-house performance comparisons of these two software libraries. The performance deficit was blamed on the rigorous specification and conformance required of OpenGL. This perception was changed at the 1996 Special Interest Group on GRAPHics and Interactive Techniques (SIGGRAPH) conference. At that time, Silicon Graphics (SGI) challenged Microsoft with their own optimized Windows software implementation of OpenGL called CosmoGL which in various demos matched or exceeded the performance of Direct3D. For SGI, this was a critical milestone as it showed that OpenGL's poor software rendering performance was due to Microsoft's reference OpenGL implementation, and not due to alleged design flaws in OpenGL.
In contrast, software rendering by the 3D API was largely irrelevant for both Direct3D and OpenGL applications. Not many DirectX applications used Direct3D's software rendering, preferring to perform their own software rendering using DirectDraw's facilities to access the display hardware. As for OpenGL applications, hardware support was expected, and the hardware was so much faster that software fallback by the OpenGL application constituted a rude surprise to the OpenGL developer.
In any case, by the time SGI had demonstrated that OpenGL software rendering performance could be competitive with that of Direct3D, software rendering was fast becoming irrelevant due to the wide availability of low cost 3D graphics hardware. By 1998, even the S3 ViRGE graphics accelerator was substantially faster than the fastest Pentium II running Direct3D's MMX rasterizer.
Marshalling
A more substantive and modern performance difference arises because of the structure of the hardware drivers provided by hardware developers. Under DirectX, independent hardware vendor (IHV) drivers are kernel-mode drivers installed into the operating system. The user-mode part of the API is handled by the DirectX runtime provided by Microsoft. Under OpenGL however, the IHV driver is divided in two parts: a user-mode part that implements the OpenGL API, and a kernel-mode driver that is called by the user-mode part.
This is an issue because calling kernel-mode operations from user-mode requires performing a system call (i.e., making the CPU switch to kernel mode). This is a slow operation, taking on the order of microseconds to complete. During this time, the CPU can perform no operations. As such, minimizing the number of times this switching operation occurs would improve performance. For example, if the GPU's command buffer is full of rendering data, the API could simply store the requested rendering call in a temporary buffer and, when the command buffer is nearly empty, it can perform a switch to kernel-mode and add a set of stored commands in a batch. This is termed marshalling.
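The idea can be sketched with a hypothetical user-mode command buffer; this is purely illustrative and not any vendor's actual driver code:

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Hypothetical user-mode marshalling: API calls are appended to a buffer and
    // handed to the kernel-mode driver in one batch, so many calls share the cost
    // of a single user-to-kernel transition.
    struct Command { std::uint32_t opcode; std::uint64_t args[4]; };

    class UserModeDriver {
    public:
        void record(const Command& cmd) {
            pending_.push_back(cmd);
            if (pending_.size() >= kBatchSize) flush();
        }
        void flush() {
            if (pending_.empty()) return;
            submitToKernel(pending_.data(), pending_.size());  // the one expensive transition
            pending_.clear();
        }
    private:
        static constexpr std::size_t kBatchSize = 1024;
        std::vector<Command> pending_;
        void submitToKernel(const Command*, std::size_t) {
            // Placeholder: a real driver would issue an ioctl/system call here.
        }
    };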
Because Direct3D IHV drivers are kernel-mode, and the user-mode code is out of the IHV's hand, there is no chance for such optimizations to occur. Because the Direct3D runtime, the user-mode part that implements the API, cannot have explicit knowledge of the driver's inner workings, it cannot effectively support marshalling. This means that every Direct3D call that sends commands to the hardware must perform a kernel-mode switch, which again, takes time in the order of microseconds to complete. This has led to several behaviors regarding use of Direct3D, the most important being the need for submitting large batches of triangles in one function call.
Since OpenGL's IHV drivers have a user-mode component to them, IHVs have the ability to implement marshalling, thus improving performance. There is still kernel-mode switching, but the theoretical maximum number of switches under OpenGL implementations is simply equal to the Direct3D standard behavior.
Direct3D 10, the release included with Windows Vista, allows parts of drivers to run in user-mode, making it possible for IHVs to implement marshalling, thus bringing the two back into relative performance parity. Mac OS X's OpenGL system is very similar, where IHVs implement a simpler version of the OpenGL API (with both user and kernel mode components), and Apple's additions to the runtime provide the direct interface to the user code, and some basic work to make IHVs' jobs easier.
Race to zero driver overhead
The introduction of Mantle by AMD led to increased discussion about modernizing APIs, and updating abstraction concepts used by all APIs to reflect graphics processing unit (GPU) operations. Both Microsoft and OpenGL vendors began to showcase their visions for limiting or removing altogether driver overhead (the amount of work the CPU needs to do to prepare GPU commands).
In March 2014, Microsoft presented basic assumptions and goals for the DirectX 12 3D component (to be ready for December 2015). OpenGL vendors took a different approach, and during GDC 2014 presented a mix of features mandatory in OpenGL 4.3 and OpenGL 4.4 or already available as ARB extensions, to show fast paths already present in implementations from Nvidia, AMD, and Intel. AMD later donated Mantle to the Khronos Group; the API was renamed Vulkan, and it is now the cross-platform API dedicated to reducing driver overhead while better distributing work among multiple CPU and GPU cores, using a unified management of compute kernels and graphical shaders.
During the presentation, apitest was introduced. It is a new tool for microbenchmarking specific solutions for given problems emphasizing exploration of fast paths in current APIs. Both OpenGL 4.x and Direct3D 11 are supported. Gathered results showed that modern OpenGL can be many times faster than Direct3D 11.
Structure
OpenGL, originally designed for then-powerful SGI workstations, includes many features, like stereo rendering and the imaging subset, that were generally considered of limited use for games, although stereoscopic gaming has drawn more interest with the development of consumer-level 3D displays. The API as a whole contains about 250 calls, but only a subset of perhaps 100 are useful for game development. However, no official gaming-specific subset was ever defined. MiniGL, released by 3Dfx as a stopgap measure to support GLQuake, might have served as a starting point, but additional features like stencil were soon adopted by games, and support for the full OpenGL standard continued. Today, workstations and consumer machines use the same architectures and operating systems, and so modern versions of the OpenGL standard still include these features, although only special workstation-class video cards accelerate them.
Extensions
The OpenGL extension mechanism is probably the most heavily disputed difference between the two APIs. OpenGL includes a mechanism where any driver can advertise its own extensions to the API, thus introducing new functions such as blend modes, new ways to transfer data to GPUs, or different texture wrapping parameters. This allows new functions to be exposed quickly, but can lead to confusion if different vendors implement similar extensions with different APIs. Many of these extensions are periodically standardized by the OpenGL Architecture Review Board (ARB), and some are made a core part of future OpenGL revisions.
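For example, an application targeting classic (pre-3.0) OpenGL typically queries the driver's advertised extension string before taking an extension-specific code path; a minimal sketch:

    #include <GL/gl.h>
    #include <cstring>

    // Classic extension check: the driver advertises a space-separated list of
    // extension names (a simple substring search is shown for brevity).
    bool hasExtension(const char* name) {
        const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return all != nullptr && std::strstr(all, name) != nullptr;
    }

    // Usage: only take the anisotropic-filtering path if the driver exposes it.
    // if (hasExtension("GL_EXT_texture_filter_anisotropic")) { ... }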
On the other hand, Direct3D is specified by one vendor only (Microsoft), leading to a more consistent API, but denying access to vendor-specific features. NVIDIA's UltraShadow technology, for instance, is not available in the stock Direct3D APIs at the time of writing. Direct3D does support texture format extensions (via FourCC). These were once little-known and rarely used, but are now used for S3 Texture Compression.
When graphics cards added support for pixel shaders (known on OpenGL as "fragment shaders"), Direct3D provided one "Pixel Shader 1.1" (PS1.1) standard with which the GeForce 3 and up, and Radeon 8500 and up, advertised compatibility. Under OpenGL the same functions were accessed through a variety of custom extensions.
In theory, the Microsoft approach allows one code path to support both brands of card, whereas under OpenGL, programmers must write two separate systems. In reality, though, because of the limits on pixel processing of those early cards, Pixel Shader 1.1 was nothing more than a pseudo-assembly language version of the NVIDIA-specific OpenGL extensions. For the most part, the only cards that claimed PS 1.1 functionality were by NVIDIA, and that is because they were built for it natively. When the Radeon 8500 was released, Microsoft released an update to Direct3D that included Pixel Shader 1.4, which was nothing more than a pseudo-assembly language version of the ATI-specific OpenGL extensions. The only cards that claimed PS 1.4 support were ATI cards because they were designed with the precise hardware needed to make that functionality happen.
This situation existed only for a short time under both APIs. Second-generation pixel shading cards functioned far more similarly, with each architecture evolving toward the same kind of pixel processing conclusion. As such, Pixel Shader 2.0 allowed a unified code path under Direct3D. Around the same time OpenGL introduced its own ARB-approved vertex and pixel shader extensions (GL_ARB_vertex_program and GL_ARB_fragment_program), and both sets of cards supported this standard also.
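A sketch of how such an ARB assembly shader was loaded through the extension interface, assuming the ARB_fragment_program tokens and entry points are available (e.g. via GL/glext.h or an extension loader):

    #define GL_GLEXT_PROTOTYPES   // assumption: prototypes exposed by the headers
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <cstring>

    // A minimal ARB fragment program that simply passes the interpolated color through.
    static const char* kMinimalFp =
        "!!ARBfp1.0\n"
        "MOV result.color, fragment.color;\n"
        "END\n";

    void bindMinimalFragmentProgram() {
        GLuint prog = 0;
        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                           static_cast<GLsizei>(std::strlen(kMinimalFp)), kMinimalFp);
        glEnable(GL_FRAGMENT_PROGRAM_ARB);
    }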
Users
Professional graphics
OpenGL has always seen more use in the professional graphics market than DirectX, while DirectX is used mostly for computer games. (The term professional is used here to refer to the professional production and display of graphics, such as in computer animated films and scientific visualisation, as opposed to games where the graphics produced are for the end user's personal, rather than professional, use.) Currently both OpenGL and DirectX have a large enough overlap in functionality that either could be used for most common purposes, with the operating system often being the main criterion dictating which is used; DirectX is the common choice for Windows, and OpenGL for nearly everything else. Some esoteric applications still divide the applicability of the two APIs: doing accelerated 3D across a network connection is only directly supported by OpenGL with OpenGL Extension to the X Window System (GLX), for example.
In the past, many professional graphics cards supported only OpenGL. As of 2010, virtually all professional cards which work on the Windows platform will also support Direct3D. Part of this has been a change in the professional graphics market from largely Unix-based hardware like SGIs and Suns to lower cost PC-based systems, leading to the growth of Windows in this market segment, while at the same time providing a new market for OpenGL software in Unix-based consumer systems running Linux or Mac OS X.
The principal historical reason for OpenGL's dominance in the professional market was performance. Many professional graphics applications (for example, Softimage|3D, Alias PowerAnimator) were originally written in IRIS GL for high-end SGI workstations, which were far more capable, both graphically and in raw CPU power, than the PCs of the time. Later, many of these were ported to OpenGL, even as the personal computer was evolving into a system powerful enough to run some professional graphics applications. Users were able to run Maya, for example, the successor to Alias PowerAnimator on either SGIs or Windows-based personal computers (and today on Linux, Mac OS X, and Windows). Price competition eventually broke SGI's dominance in the market, but the established base of OpenGL software engineers and the broadening user base for OpenGL in Apple, Linux, and other operating systems, has resulted in a market where both DirectX and OpenGL are viable, widespread APIs.
The other reason for OpenGL's historic advantage was marketing and design. DirectX is a set of APIs that were not marketed for professional graphics applications. Indeed, they were not even designed for such uses. DirectX was an API designed for low-level, high-performance access to broadly available, lower-performance, consumer-priced graphics hardware for the purpose of game development. OpenGL is a much more general purpose 3D API, targeting a full range of graphics hardware from low-end commodity graphics cards up to professional and scientific graphics visualization well out of the range of the average consumer, and providing features that are not necessarily exclusive for a specific kind of user.
Gaming developers typically haven't demanded as wide an API as professional graphics system developers. Many games don't need overlay planes, stencils, and so on, although this hasn't prevented some game developers from using them when available. Specifically, game designers are rarely interested in the pixel invariance demanded in certain parts of the OpenGL standards, which are conversely highly useful to film and computer-aided modeling.
An attempt was once made to merge OpenGL and DirectX by SGI and Microsoft. The Fahrenheit graphics API was intended to bring together both the high end ability of OpenGL with the broad low-level support of DirectX. Microsoft eventually retreated from the project, having never allocated sufficient resources to produce its part of the rendering engine. The move was widely held to be purposed to ensure lock-in of developers to the Windows-DirectX platform, which would be lost if the Fahrenheit API became the world de facto standard graphics API. However, Fahrenheit led to many improvements in DirectX, and the main architect of Fahrenheit now works at Microsoft on DirectX.
Gaming
In the earliest days of 3D accelerated gaming, performance and reliability were key benchmarks and several 3D accelerator cards competed against each other for dominance. Software was written for a specific brand of graphics card. However, over the years, OpenGL and Direct3D emerged as software layers above the hardware, mainly because of industry support for a cross-hardware graphics library. Competition between the two rose as each game developer would choose either one or the other.
In the early days of 3D accelerated gaming, most vendors did not supply a full OpenGL driver. The reason for this was twofold. Firstly, most of the consumer-oriented accelerators did not implement enough functionality to properly accelerate OpenGL. Secondly, many vendors struggled to implement a full OpenGL driver with good performance and compatibility. Instead, they wrote MiniGL drivers, which only implemented a subset of OpenGL, enough to run GLQuake (and later other OpenGL games, mostly based on the Quake engine). Proper OpenGL drivers became more prevalent as hardware evolved, and consumer-oriented accelerators caught up with the SGI systems for which OpenGL was originally designed. This would be around the time of DirectX 6 or DirectX 7.
In the console world proprietary native APIs are dominant, with some consoles (e.g., the PS3) providing an OpenGL wrapper around its native API. The original Xbox supported Direct3D 8.1 as its native API while the Xbox 360 supports DirectX9 as its native API. Most console developers prefer to use the native APIs for each console to maximize performance, making OpenGL and Direct3D comparisons relevant for mostly PC platforms.
Mobile phones and other embedded devices
OpenGL for Embedded Systems, called OpenGL ES, is a subset of the OpenGL 3D graphics API designed for embedded devices. Various versions of smartphone operating systems support OpenGL ES, such as Android, iOS (iPad, iPhone, iPod Touch), Maemo (Nokia N900), and Symbian.
OpenGL ES is available in 6 variants, OpenGL ES 1.0, 1.1, 2.0, 3.0, 3.1, 3.2. The release of 2.0 removed backward compatibility with older variants, due to the extensive programmable pipeline functions available in GL ES 2.0, over the fixed-pipeline functions of GL ES 1.0 and 1.1. OpenGL ES 3.0 needed new hardware over OpenGL ES 2.0, while OpenGL ES 3.1 is meant as a software update, needing only new drivers.
Direct3D Mobile, a Direct3D derivative, is supported by Windows CE. Currently all Windows Phone 7 devices use a .NET Framework UI accelerated by Direct3D Mobile 9 on Adreno 200/205 integrated GPUs by Qualcomm.
Windows Phone 8 implements Direct3D 11 (limited to feature level 9_3).
References
External links
OpenGL to be fully supported by Vista
MSDN library OpenGL
Eli asked about OpenGL, Direct Draw, and WPF, and how they work with Desktop composition...
OpenGL 3 & DirectX 11: The War Is Over
Valve: OpenGL outpaces DirectX, even under Windows
Application programming interfaces
DirectX
Microsoft application programming interfaces
OpenGL
OpenGL and Direct3D |
Unigine
UNIGINE is a proprietary cross-platform game engine developed by UNIGINE Company. Apart from its use as a game engine, it is mainly used in the enterprise area: simulators, virtual reality systems, serious games and visualization. A distinguishing feature of UNIGINE is support for large open worlds, up to the planet scale. It also has an advanced 3D renderer that currently supports OpenGL 4 and DirectX 11. An updated UNIGINE SDK is released every three to four months.
UNIGINE Engine is a core technology for a lineup of benchmarks (CPU, GPU, power supply, cooling system), which are used by overclockers and technical media: Tom's Hardware, Linus Tech Tips, PC Gamer, JayzTwoCents, and others. UNIGINE benchmarks are also included as part of the Phoronix Test Suite for benchmarking purposes on Linux and other systems.
UNIGINE 1
The first public release was the 0.3 version on May 4, 2005. UNIGINE Engine was created from scratch and is not based on any other engine.
Platforms
Initially started with only Microsoft Windows and Linux support, more platforms were added later: OS X, PlayStation 3, Android, iOS. Experimental support for WebGL was not included into the official SDK. UNIGINE 1 had support for several graphical APIs: DirectX 9, DirectX 10, DirectX 11, OpenGL, OpenGL ES, PlayStation 3. Initial versions (v0.3x) had only OpenGL support.
There are three APIs for developers: C++, C#, and UnigineScript (a proprietary scripting language similar to C++ in syntax). Custom shaders can be written in the GLSL and HLSL languages.
Serious game features
UNIGINE 1 has several features required by professional simulators and enterprise VR systems, applications often called serious games; these features mostly concern support for large virtual scenarios and for specific hardware.
Support for large virtual worlds was implemented via double precision of coordinates (64-bit per axis), zone-based background data streaming, and optional operations in geographic coordinate system (latitude, longitude, and elevation instead of X, Y, Z).
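To make the geographic-coordinate option concrete, the sketch below shows a generic WGS84 geodetic-to-Cartesian (ECEF) conversion; it is not UNIGINE's actual implementation, but it illustrates why planet-scale engines keep 64-bit coordinates, since positions of Earth-radius magnitude leave 32-bit floats with roughly metre-level resolution.

```python
import math

# Generic WGS84 geodetic-to-ECEF conversion (illustrative only, not
# UNIGINE's implementation): latitude/longitude/elevation -> X, Y, Z metres.
WGS84_A = 6378137.0                    # semi-major axis in metres
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, elevation_m):
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + elevation_m) * math.cos(lat) * math.cos(lon)
    y = (n + elevation_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + elevation_m) * math.sin(lat)
    return x, y, z

# Coordinates are of order 1e6-1e7 metres, so single precision would leave
# only metre-level resolution; 64-bit doubles keep sub-millimetre detail.
print(geodetic_to_ecef(48.8584, 2.2945, 330.0))
```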
Video output to sophisticated displays was implemented via so-called multi-channel rendering (network-synchronized generation of a single large image across several computers), which is a standard approach in professional simulators. The same system enabled support for multiple output devices with asymmetric projections (e.g. CAVE). Curved screens with multiple projectors (requiring image warping and edge blending) were also supported, as were various types of stereoscopic 3D output: anaglyph, separate image output, Nvidia 3D Vision, and VR HMDs (Oculus Rift). UNIGINE 1 also supported multi-monitor output (video walls).
Other features
The UNIGINE renderer supports shader model 5.0 with hardware tessellation and DirectCompute (as well as OpenCL), together with a set of post-processes including screen space ambient occlusion (SSAO) and real-time global illumination. There is a set of built-in high-level objects such as terrain, grass, water, and clouds. UNIGINE uses a proprietary physics engine (collision detection, rigid body physics, dynamic destruction of objects, ragdolls, cloth, fluid buoyancy, force fields, and time reversal). Pathfinding is also implemented with a proprietary engine, together with basic AI components (spatial triggers, callbacks). Other features include an interactive 3D GUI, video playback using the Theora codec, a 3D audio system based on the OpenAL library, and a WYSIWYG scene editor (UNIGINE Editor).
UNIGINE 2
UNIGINE 2 was originally released on October 10, 2015.
UNIGINE 2 has all the features of UNIGINE 1, with a further focus on simulators and enterprise use. The main differences are the transition from forward rendering to a deferred rendering approach, PBR shading, the introduction of several new graphics technologies such as geometry water, multi-layered volumetric clouds, SSRTGI, and voxel-based lighting, and the addition of a C# API.
Platforms
Supported platforms: Microsoft Windows, Linux, and OS X (support discontinued as of version 2.6). UNIGINE 2 supports the following graphics APIs: DirectX 11 and OpenGL 4.x.
There are three APIs for developers: C++, C#, and UnigineScript. Supported shader languages: HLSL, GLSL, and UUSL (Unified UNIGINE Shader Language).
SSRTGI
The proprietary SSRTGI (Screen Space Ray-Traced Global Illumination) rendering technology was introduced in version 2.5. It was presented at the SIGGRAPH 2017 Real-Time Live! event.
Development
The roots of UNIGINE are in the frustum.org open source project, which was initiated in 2002 by Alexander "Frustum" Zaprjagaev, who is a co-founder (along with Denis Shergin, CEO) and ex-CTO of UNIGINE Company.
Linux game competition
On November 25, 2010, UNIGINE Company announced a competition to support Linux game development. They agreed to give away a free license of the UNIGINE engine to anyone willing to develop and release a game with a Linux native client, and would also grant the team a Windows license. The competition ran until December 10, 2010, with a considerable number of entries being submitted. Due to the unexpected response, UNIGINE decided to extend the offer to the three best applicants, with each getting full UNIGINE licenses. The winners were announced on December 13, 2010, with the developers selected being Kot-in-Action Creative Artel (who previously developed Steel Storm), Gamepulp (who intend to make a puzzle platformer), and MED-ART (who previously worked on Painkiller: Resurrection).
UNIGINE-based projects
As of 2021, the company claimed to have more than 250 B2B customers worldwide.
Games
Released
Cradle - released for Windows and Linux in 2015
Oil Rush - released for Windows, Linux and Mac OS X in 2012
Syndicates of Arkon - released for Windows in 2010
Tryst - released for Windows in 2012
Petshop - released for Windows and Mac, featuring web-player in 2011
Sumoman - released for Windows and Linux in 2017 (Steam page)
Demolicious - released for iOS in 2012
Dual Universe - MMO RPG on a planetary scale (currently in Beta, full release planned for 2021)
Upcoming
Dilogus: The Winds of War
Node - VR shooter (Steam page)
Kingdom of Kore - action RPG for PC (in future for PS3) - cancelled by publisher
El Somni Quas - MMORPG (Patreon page)
Acro FS - aerobatic flight simulator (Steam page)
Hydrofoil Generation Sailing - realistic sailing simulator reproducing in-shore regatta of modern foiling sailing vessels as well as traditional sailing boats by Jaxx Vane Studio
Simulation and visualization
CarMaker 10.0 by IPG Automotive
NAUTIS maritime simulators by VSTEP
Train driver simulator by Oktal Sydac
Be-200 flight simulator
Klee 3D (3D visualization solution for digital marketing and research applications)
The visualization component of the analytical software complex developed for JSC "ALMAZ-ANTEY" MSDB", an affiliate of JSC "Concern "Almaz-Antey"
Real-time interactive architectural visualization projects of AI3D
Bell-206 Ranger rescue helicopter simulator
Magus ex Machina (3D animated movie)
SIMREX CDS, SIMREX FDS, SIMREX FTS car driving simulators by INNOSIMULATION
Real-time artworks by John Gerrard (artist): Farm, Solar Reserve, Exercise, Western Flag (Spindletop, Texas), X. laevis (Spacelab)
Train simulators by SPECTR
DVS3D by GDI
RF-X flight simulator
NAVANTIS Ship Simulator
VR simulator for learning of computer vision for autonomous flight control at Daedalean AI
Benchmarks
UNIGINE Engine is used as a platform for a series of benchmarks, which can be used to determine the stability of PC hardware (CPU, GPU, power supply, cooling system) under extremely stressful conditions, as well as for overclocking:
Superposition Benchmark (featuring online leaderboards) - UNIGINE 2 (2017)
Valley Benchmark - UNIGINE 1 (2013)
Heaven Benchmark (the first DirectX 11 benchmark) - UNIGINE 1 (2009)
Tropics Benchmark - UNIGINE 1 (2008)
Sanctuary Benchmark - UNIGINE 1 (2007)
References
Computer physics engines
Game engines for Linux
Middleware
Unigine SDK
Video game development software
Video game engines
Video game IDE
Virtual reality |
22967037 | https://en.wikipedia.org/wiki/Performance%20Co-Pilot | Performance Co-Pilot | Performance Co-Pilot (also known as PCP) is an open source software infrastructure for monitoring, visualizing, recording, responding to, and controlling the status, activity, and performance of networks, computers, applications, and servers.
Features
The following features are provided by the Performance Co-Pilot:
Runs on many Unix/Linux variants, as well as Windows and Mac OS X.
Has a fully distributed architecture; any client may interact with any instrumented server or application.
Has a plug-in architecture for instrumenting any custom application or server.
Can query hundreds of operational measurements from operating systems, Apache, Sendmail, MySQL, the Java VM, VMware, KVM, etc. (a minimal query is sketched after this list)
Can send operational parameters to remote processes, to change their behavior (cf. computational steering).
Can query or send any type of value, including: integers, strings, floating point numbers, and arbitrary composite data structures.
Has a communication protocol designed to minimize consumption of network bandwidth.
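As a minimal illustration of the querying capability listed above, the sketch below shells out to the standard pminfo and pmval command-line tools. It is only one way to read PCP metrics (the C and Python pmapi bindings are the programmatic route), and it assumes PCP is installed, the local pmcd daemon is running, and the kernel.all.load metric exists on the host.

```python
import subprocess

# Hedged sketch: read one PCP metric using the standard command-line tools.
# Assumes PCP is installed and the local pmcd collector daemon is running.
METRIC = "kernel.all.load"   # 1-, 5- and 15-minute load averages on Linux

def describe_metric(name):
    # Print the metric's descriptor and one-line help text via pminfo.
    subprocess.run(["pminfo", "-d", "-t", name], check=True)

def sample_metric(name, samples=3, interval="1sec"):
    # Print a few live samples of the metric via pmval.
    subprocess.run(
        ["pmval", "-s", str(samples), "-t", interval, name], check=True
    )

if __name__ == "__main__":
    describe_metric(METRIC)
    sample_metric(METRIC)
```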
History
Performance Co-Pilot was originally created by SGI as a proprietary software product, exclusively for SGI customers. PCP's initial design was done at SGI in Melbourne, Australia, by Ken McDonell and Mark Goodwin, starting in October 1993. The pair were joined by Seppo Keronen and Jonathan Knispel, early in 1994. These four produced the initial version of Performance Co-Pilot 1.0, which was released in April 1995 as add-on software for SGI's IRIX operating system. Components included in that initial release were: (Ken and Jonathan), (Mark), (Ken), (Seppo), (Jonathan), and a host of other smaller bits and pieces. Other significant early contributors were Ania Bodeit, David Chatterton, Ivan Rayner, Nathan Scott and Tim Shimmin.
In 2000, the core of PCP was re-released as free software, under the GNU LGPL. Additional proprietary components have been re-released as free software since then.
Currently an active community of contributors is enhancing the open source distribution of PCP and releasing new tools built upon it.
Netflix built Vector.io, which used PCP; it has since been reworked as a Grafana data source that will be integrated into mainline PCP.
See also
Comparison of network monitoring systems
OpenLMI, which includes PCP monitoring agent
References
External links
Performance Co-Pilot official website
SLAC list of network monitoring tools
Free network-related software |
53805936 | https://en.wikipedia.org/wiki/King%20Kaluha | King Kaluha | Michael Alegado (born August 30, 1958), better known by his ring name King Kaluha, is an American semi-retired professional wrestler and trainer. He is best known for his time in DC Drake's Continental Wrestling Alliance, the International Championship Wrestling and National Wrestling Federation during the 1980s. He also made brief appearances in the American Wrestling Association and Jim Crockett Promotions.
Alegado worked for numerous Mid-Atlantic independent promotions during the 1990s including, most notably, Eastern Championship Wrestling, House of Pain Wrestling Federation / National Wrestling League, Maryland Championship Wrestling, Mid-Eastern Wrestling Federation, National Championship Wrestling, Steel City Wrestling and the Tri-State Wrestling Alliance. He occasionally wrestled as Doink the Clown in the Century Wrestling Alliance and NWA New Jersey from 1995 to 1997.
He is considered one of the most respected wrestlers in the Northeastern United States according to Kenny Casanova and other wrestlers. Alegado mentored or trained a number of East Coast independent stars including, most notably, Tom Brandi and Steve Corino. He has occasionally appeared for Corino's Pro Wrestling WORLD-1 promotion as both a performer and trainer. In 2014, Alegado was inducted into the Maryland Wrestling Federation Hall of Fame.
Early life
Alegado was born and raised in the Fishtown section of Philadelphia, Pennsylvania. He was the son of a Filipino World War II veteran and a local Polish-American woman. Alegado began playing ice hockey as a youth. After graduating from Villanova University with a psychology degree, he began working out in order to try out for a European ice hockey team. He spent many hours at a local gym heavy lifting which saw his size increase from 180 to . Alegado was eventually noticed by a sports agent who suggested that he consider a career in professional wrestling. He trained with a man who had recently opened a wrestling school before making his pro debut in 1983.
Professional wrestling career
Early career (1983–1986)
Alegado got his start with DC Drake's Continental Wrestling Alliance based in Allentown, Pennsylvania. One of his first matches turned into a wild brawl as he and his opponent, Damien Kane, fought outside the ring and used foreign objects such as pliers and a sandal. A teenager reportedly became so excited that, after the match, he threatened another fan with a steel chair before being restrained. On August 22, 1984, he and Mickey Gilligan unsuccessfully challenged The Salt and Pepper Riot Squad (Damien Kane and Sweet Daddy White) for the CWA Tag Team Championship at the Carbon County Fair in Lehighton. On September 18, 1984, Kaluha wrestled Concrete Cowboy at the 20th annual Palmerton Hospital Festival, a yearly benefit show for the facility. Alegado also wrestled for the Empire Wrestling Federation, a New Jersey-based group promoted by Jack Barnett and Enzo Morabito, where he won the promotion's heavyweight and tag team championships with Mickey Gilligan.
On January 17, 1985, Alegado defeated Diamond Jim at an International Championship Wrestling show in Portland, Maine. Later that night, he participated in a 12-man battle royal also involving Carlos Colón, King Tonga, Diamond Jim, The Prince of Pain, Rudy Diamond, Tony Ulysses, The Invaders (Invader #1 and Invader #2), and The Sheepherders (Butch Miller and Luke Williams). On May 15, 1985, Alegado took part in an American Wrestling Association television taping at the Tropicana Casino & Resort Atlantic City in Atlantic City, New Jersey. He lost his first bout against Sgt. Slaughter, as well as a tag team match with Mark Pole against Steve Olsonoski and Buck Zumhofe, and another with Lou Fabiano versus The High Flyers (Greg Gagne and Jim Brunzell). Alegado returned to ICW at the end of the year. On March 31, 1986, Alegado wrestled Tom Brandi for a Jim Crockett Promotions show at the Baltimore Civic Center. He was primarily involved in Brandi's training and served as a mentor throughout his career. While wrestling for promoter Mark Tendler in the mid-1980s, Alegado teamed with Mick Foley as The South Sea Islanders. The team frequently wrestled The Rock and Roll Connection (Tom Brandi and Bill Woods).
National Wrestling Federation (1987)
In 1986, DC Drake sold the Continental Wrestling Alliance to Robert Raskin. Alegado was one of many CWA stars who became part of the "new" National Wrestling Federation when Raskin decided to revive the 1970s-era promotion. With the promotion's weekly show on cable television, Alegado enjoyed the highest exposure of his career. On the April 11, 1987 TV taping of NWF Wrestle Power, Alegado defeated Mark Sampson at the Ocean Ice Palace in Bricktown, New Jersey. He and JD McSlade wrestled The Beach Boys (Eddie Miranda and Larry Winters) in a tag team match that same night. On June 20, 1987, Alegado defeated Tom Brandi and Steve Sampson in separate bouts at NWF Rage In A Cage. Alegado was also accompanied by 12-year-old Darren Wyse as his manager for a match in Reading, Pennsylvania that year.
International Championship Wrestling (1987–1988)
Alegado returned to International Championship Wrestling where he formed a tag team with Tom Brandi. They won the ICW Tag Team Championship together that summer. The two remained champions until losing the belts to The Moondogs (Moondog Spike and Moondog Spot) on December 28, 1987. On July 11, 1988, Alegado wrestled The Dungeon Master in Rockland, Maine. He also feuded with Vic Steamboat. Alegado remained with ICW up to the end of the year.
Tri-State Wrestling Alliance (1990)
In early 1990, Alegado began wrestling for Joel Goodhart's Tri-State Wrestling Alliance promotion in Philadelphia. On June 9, 1990, Alegado and Tom Brandi fought The American Pitbulls (Pitbull Rex and Pitbull Spike) to a double-countout at Summer Sizzler. The two were on opposite sides the following month when Alegado and Ron Shaw lost to Brandi and The Cheetah Kid by disqualification at Madison High School on July 1. He was similarly disqualified in his match against Larry Winters that same night. On August 17, 1990, Alegado defeated The Cheetah Kid for the WWCA Light Heavyweight Championship in Wall, New Jersey.
International World Class Championship Wrestling (1991–1992)
Alegado also returned to work for the Savoldi family when ICW merged with World Class Championship Wrestling to form International World Class Championship Wrestling. He was one of Kevin Von Erich's opponents in 1991. On June 6, 1992, Alegado wrestled Neil Superior to a 10 min. time-limit draw at an IWCCW show in Fleetwood, Pennsylvania.
Eastern Championship Wrestling (1992–1993)
In 1992, Joel Goodhart sold his share of the TWA to his partner Tod Gordon. Alegado stayed in Philadelphia when Gordon formed Eastern Championship Wrestling. He made his debut at ECW's second-ever event on March 24, 1992, held at the Original Sports Bar in Philadelphia, defeating Max Thrasher. Alegado regularly appeared at ECW's original sports bar shows throughout the year. He and C.N. Redd were defeated by The Flames (Mr. Anthony and Mr. Perez) on April 26. On July 14, Alegado and Scott Summers beat J.T. Smith and Hurricane Curry. He lost to Smith at the Chestnut Cabaret the following night. On August 22, Alegado won a battle royal at The Aztec Club in Philadelphia. He also received a title shot against then-ECW Heavyweight Champion Jimmy Snuka. On September 12, 1992, Alegado and Summers lost to Larry Winters and Jimmy Jannetty at The Aztec Club. In early 1993, Alegado wrestled Tommy Cairo at the Kensington Sports Arena. This was the first television taping for ECW's long-running show on Sports Channel America.
Independent circuit (1993–1999)
Alegado spent much of the early 1990s working for Larry Sharpe's World Wrestling Association. On March 19, 1993, Alegado wrestled The Sandman at a WWA show in Pleasantville, New Jersey. He also wrestled The Big Boss Man for an All States Wrestling Association show at Lincoln High School in Ellwood City, Pennsylvania. That same year, Alegado traveled to Venezuela where he and Bastion Booger took on The Bushwhackers (Bushwhacker Butch and Bushwhacker Luke). They lost the match after Alegado was hit with the "Bushwhacker battering ram" finisher. Other international tours took Alegado to Ecuador, where "the crowd whistled and threw oranges at him" while he wrestled in a bullfighting ring, and to Rome, where he competed before 14,000–15,000 people.
In 1994, Alegado and Dory Funk Jr. trained Steve Corino. Corino later credited Alegado for mentoring him during his early career. He also feuded with his former student Tom Brandi, now wrestling under the name "Johnny Gunn", on the Mid-Atlantic independent circuit. On April 16, 1994, Alegado lost to Gunn at the World Wide Wrestling Alliance supercard The Brawl At The Taj Mahal in Atlantic City, New Jersey. He was also beaten by Gunn at an NWA New Jersey show in Red Lion, Pennsylvania two weeks later. They also brought their feud to Steel City Wrestling in Pittsburgh. In October, Alegado entered a championship tournament to crown the inaugural SCW Heavyweight Championship. He was eliminated in the semi-finals by Shane Douglas in Connellsville, Pennsylvania on October 10, 1994. In August 1995, Alegado became the first heavyweight champion for the United States Wrestling Federation. While in the USWF, Alegado was one of the first wrestlers to be managed by The Prince of Passion. He also wrestled as Doink the Clown in the Century Wrestling Alliance and NWA New Jersey. On October 21, 1994, Alegado and The Viper wrestled Duane Gill and Wayne Gill at a Maryland Championship Wrestling show in Pasadena, Maryland. On August 13, 1995, he and Jim Powers lost to Johnny Gunn and The Rockin' Rebel at the MEWF Arena in Essex, Maryland.
On June 30, 1996, Alegado faced Jimmy Snuka for a New Jack City Wrestling show in Asbury Park, New Jersey. Alegado wrestled Spellbinder at an Eastern Shores Wrestling event in New York City on November 30, 1996. On March 1, 1997, Alegado (as Doink the Clown) defeated Steve Corino at the Salaam Shrine Temple in Livingston, New Jersey. On November 8, 1997, he also defeated Doink the Clown at a MEWF show in Keyser, West Virginia. On April 19, 1998, Alegado unsuccessfully challenged Tom Brandi for the SCW Heavyweight Championship at St. Vincent College in Latrobe, Pennsylvania. He again wrestled Brandi for New Breed Wrestling in Saratoga Springs, New York two months later. Alegado and Brandi's feud on the independent circuit continued for much of the 1990s.
Independent circuit (2000–2004)
In 2000, Alegado made a brief cameo on the MTV reality TV series True Life episode "I'm A Pro Wrestler". On April 29, 2001, Alegado headlined Deaf Wrestlefest 2001, an annual fundraiser for Western Pennsylvania School for the Deaf, with The Patriot. On August 17, 2002, Alegado once again wrestled The Patriot for an International Wrestling Cartel show in Canonsburg, Pennsylvania. On January 24, 2004, Alegado faced Snatch Haggis at a National Wrestling League show at Blackhawk High School in Beaver Falls, Pennsylvania. Alegado was disqualified when his cornerman, Johnny Valiant, entered the ring to attack Haggis while Haggis attempted a pinfall.
Pro Wrestling WORLD-1 (2004)
Alegado was recruited by former student Steve Corino as both a performer and trainer for Pro Wrestling WORLD-1, the U.S. affiliate of Pro Wrestling ZERO1, in the mid-2000s. In the spring of 2004, he took part in WORLD-1 "Revisited Tour" defeating W1 Dojo student Alex Law in Pottstown, Pennsylvania on April 8 and Eddie Guapo in Essington, Pennsylvania on April 10. He and Corino operated the Revolution Puroresu Dojo training center inside the Gold's Gym in Limerick, Pennsylvania.
Independent circuit (2005)
Elsewhere on the independent circuit, Alegado teamed with Mana the Polynesian Warrior as The Tsunami Express. He also served as the opponent for "Dr. Death" Steve Williams at WrestleReunion 2. It was Williams' first match after his battle with throat cancer, and according to promoter Sal Corrente, the wrestler was initially hesitant to work with Alegado.
Pro Wrestling WORLD-1 (2005)
He returned to action the following year for WORLD-1's "Return Show". On August 6, 2005, Alegado defeated Ricky Landell in Philadelphia after a 21 min. bout. He continued feuding with Landell on and off for the next few years. That fall, Alegado entered a championship tournament to crown the first-ever World-1 Internet Television Champion. He defeated Larry Sweeney in the opening rounds on September 16, 2005. Joined by manager Marcus "King Kong" Dowling, Alegado and Sweeney eventually began teaming together as The Downtown Playboys.
Independent circuit (2005)
Alegado wrestled against Leatherman at Piledriver Pro's debut show in September 2005. He also appeared for Dan McDevitt's wrestling shows at Fort Meade. On October 1, Alegado and Eagle (accompanied by manager Royce C. Profit) battled Romeo Valentino and The Patriot at the facility. Alegado continued feuding with The Patriot throughout the Mid-Atlantic area. On November 12, Alegado lost to The Patriot at CPW's "Cruel Intentions" in Moorefield, West Virginia. The two also clashed in the Pittsburgh-based International Wrestling Cartel. He was defeated by The Patriot at the IWC's "Showdown in Utown 4" in Uniontown, Pennsylvania on November 19. The following night at the "Newville Knockout" show, Alegado and Mark Mest lost to The Patriot and King Kong Bundy at Big Spring High School in Newville, Pennsylvania. On December 3, 2005, at "Winter Bash 2005", Alegado was beaten by The Patriot for a third time at Clarion University of Pennsylvania.
Pro Wrestling WORLD-1 (2006)
On January 29, 2006, Alegado defeated Claudio Castagnoli, Josh Daniels and The Masked Grappler in a four-way elimination match in Boyertown, Pennsylvania to win the World-1 Internet Television Championship. On April 9, Alegado and Larry Sweeney wrestled The Patriot and Josh Daniels to a double-countout. One month later, a Champion vs Champion match against WORLD-1 Heavyweight Champion Ricky Landell in Philadelphia ended in a double-disqualification. The Blue Meanie was the special guest referee. On May 7, 2006, The Downtown Playboys won the WORLD-1 Tag Team Championship in a 3-Way Dance against Greg Spitz & Mark Mest and The Patriot & Josh Daniels in Boyertown, Pennsylvania. He entered a tournament for the WORLD-1 Great Lakes Openweight Championship weeks later. Alegado advanced to the semi-finals in Bay City, Michigan where he and his opponent were eliminated via double-countout. During their time in WORLD-1, The Downtown Playboys were involved in an 18-month storyline which saw The Patriot enlisting various tag team partners in an attempt to unseat the tag team champions. Alegado and Sweeney were eventually beaten for the title by The Patriot and Mike Kehner at WORLD-1's "Rebirth or Destruction" at the end of the year.
Independent circuit (2006–2009)
Alegado spent the fall of 2006 traveling the local independent circuit. These included stops in National Wrestling Superstars where he faced WWE Hall of Famers such as Brutus "The Barber" Beefcake, King Kong Bundy, and Tito Santana. On September 29, Alegado was Salvatore Sincere's cornerman in his NWS bout against Larry Zbyszko. On October 14 in Romney, West Virginia, Alegado lost to The Patriot at CPW's "Wrestle War 2006" in a no-disqualification match. On December 29, 2006, Alegado was accompanied by Salvatore Sincere in his NWS match against Draven in Long Branch, New Jersey.
After a brief stay in the Cincinnati-based Northern Wrestling Federation in early 2007, Alegado headed back to the Mid-Atlantic area. On March 3, 2007, he took on King Kong Bundy and Salvatore Sincere in a three-way match for NWS in Point Pleasant Beach, New Jersey. The following week, Alegado and Corvis Fear lost to Mikey Pacifica and Big Slam Vader in a Kiss-My-Butt (Or Be Fired) match. On March 24, Alegado made a one-time return to the International Wrestling Cartel for "Night of Legends III" where he lost to The Patriot. Alegado was also in the corner of Salvatore Sincere during his match against Jim "The Anvil" Neidhart and was attacked by the wrestler after Sincere submitted to the camel clutch. Two days later, Alegado wrestled Test at Maryland Championship Wrestling's "March Madness: When Monsters Collide" in Dundalk, Maryland. On April 24, Alegado (with Foxxy Foxxy) and Johnny Candido wrestled to a no-contest at an NWS show in Carteret, New Jersey. The two wrestlers faced each other in the main event, which saw Candido and Monsta Mack defeat Alegado and Greg "The Hammer" Valentine. That same year, Alegado became part of Royce Profit's Creative Control stable with Salvatore Sincere, Danny Doring, Nicky Benz, and Sgt. Jimmy Storm. On May 5, 2007, Creative Control (King Kaluha, Nicky Benz and Sgt. Jimmy Storm) wrestled The Patriot, The Rogue and NWS referee Mike "Quick Count" Dillon at The Elks Lodge in Brick, New Jersey. Tracy Brooks was the special guest referee. On November 3, Kaluha returned to NCW and defeated Adam Flash for the NCW Heavyweight Championship. On December 31, 2007, a match between Alegado and "Mr. USA" Tony Atlas ended in a no-contest. In the main event, Atlas teamed with Jimmy "Superfly" Snuka to defeat Alegado and Salvatore Sincere.
On January 25, 2008, Alegado faced Rikishi at an NWS show in South River, New Jersey. He received a post-match stinkface after losing to him. The next night in Brick Township, New Jersey, Alegado and Miss Michelle were in the corner of Julio Dinero in his NWS match against The Patriot. Dinero had initially won the bout using a chain to pin his opponent but the decision was reversed when a second official came out to the ring. Alegado joined Dinero and Danny Doring in a 3-on-1 attack after The Patriot was declared the winner. The trio were run off by Rikishi and The Blue Meanie setting up a 6-man tag team match for the main event. Alegado, Dinero and Doring lost to The Patriot, Rikishi and The Blue Meanie in a Best-Of-Three-Falls Match when Alegado was pinned by Rikishi.
That spring, Alegado retained the NCW Heavyweight Championship against Ricky Morton at NCW's "The Hart of Rock And Roll". On May 9, King Kaluha defeated Sweet Daddy Ebony at an NWA 3000 Revolution event in Limerick Township, Pennsylvania. A month later in Maple Shade, New Jersey, at 3K Wrestling's War Before RAW, Alegado and Ricky Landell lost to Kid America and Steve Corino in a match for the vacant PWF Tag Team Championship. On June 22, in a rematch for the NCW title, Kaluha (with Jonathan Luvstruk) beat Ricky Morton by disqualification at a New Elite Wrestling show. He was among the eight wrestlers to compete in 3K Wrestling's one-night Shinya Hashimoto Memorial Tournament at Gold's Gym in Limerick Township, Pennsylvania. The winner was given a title shot at the NWA World Heavyweight Championship. On July 11, Kaluha and Mr. Wrestling III were eliminated from the tournament after battling to a 15-minute time-limit draw. On July 23, Kaluha wrestled Tito Santana at an ECPW show in Middletown, New Jersey. On August 22, Alegado defeated Ricky Landell for the Zero-One United States Heavyweight Championship in Limerick, Pennsylvania. He also held the 3KW World Heavyweight Championship during this time. Alegado's Zero-One title reign lasted nearly three months until his defeat by Mr. Wrestling III on November 11. One week later, Alegado entered a tournament for the B4W Heavyweight Championship. He was eliminated in the semi-finals via disqualification by Gordon P. Samsonite on November 15, 2008. That same year, Alegado and Tom Brandi were among the "roasters" for the Iron Sheik Roast produced by Kayfabe Commentaries. C. M. Burnham of Oklafan.com praised the performances of Alegado and Brandi in his review of the event, noting that the two men had "the best material out of all of the wrestlers".
On January 30, 2009, Alegado lost to Gordon P. Samsonite in a champion vs. champion match at "When Worlds Collide", an interpromotional supercard co-hosted by 3K Wrestling and Brookwood 4 Wrestling in Shark River Hills, New Jersey. As a result, Samsonite won both the B4W and 3KW Heavyweight Championships. On March 27, Alegado lost the B4W North American Championship to Alex Anthony at B4W's "Collision Course". On April 17, Alegado defeated Chris Rockwell and Kid America in a 3-Way Dance at B4W's "Dead End" to qualify for the promotion's upcoming "Triad Tournament". On August 4, Alegado defeated CK Kross at an East Coast Pro Wrestling show in Ridgefield Park, New Jersey. On November 29, Alegado wrestled Greg Matthews in a Lumberjack match for Darren Wyse's National Championship Wrestling promotion. On December 5, 2009, Alegado took part in a Tri-State Wrestling Alliance reunion show at St. Matthews Baptist Church in Williamstown, New Jersey, with him and Larry Winters appearing in an undercard match. Alegado lost the bout via countout.
Semi-retirement (2010–)
On March 6, 2010, Alegado defeated Chest Flexor at "IWC Clearfield Cataclysm 2" in Clearfield, Pennsylvania. Three weeks later, he beat Doink the Clown at a show for Slam Championship Wrestling in Gloucester City, New Jersey. In May 2010, Alegado was one of several independent stars to appear for a weekend supercard in King of Prussia to raise awareness about childhood autism. It was promoted by Tom Brandi under the American Wrestling Federation banner. Proceeds from the event went to Autism Speaks. On June 4, 2010, Alegado wrestled Jimmy Snuka for an East Coast Pro Wrestling show in Amsterdam, New York.
On October 15, 2011, Alegado took part in a special benefit show held in East Norriton, Pennsylvania. The event was organized by Tom Brandi to celebrate National Breast Cancer Awareness Month, with a portion of the proceeds being donated to the Susan G. Komen for the Cure foundation. On November 5, Alegado wrestled Kid Kattrell at a National Championship Wrestling event in York, Pennsylvania for the promotion's heavyweight championship. Alegado's "management team" under "Playboy" Jonathan Luvstruk were banned from the building during the bout. On November 12, Alegado was inducted into the IPWA Hall of Fame as part of Devastation Wrestling Federation's tribute to Larry Sharpe. He made an appearance for the Worldwide Wrestling Alliance that same month. On December 10, 2011, Alegado defeated Amadeus Thorn at the Silver Wing Arena in Gilbertsville, Pennsylvania. On September 23, 2012, Alegado wrestled Steve Corino in a "student vs. teacher" match in Birdsboro, Pennsylvania for the DREAMS Project.
On March 1, 2014, Alegado was inducted into the Maryland Wrestling Federation Hall of Fame along with Barry Hardy, The Stro, Thunder Morgan, Rockin’ Rebel, Johnny Rambo and Ed DeWitt. Since 2014, Alegado has been wrestling for Right Coast Pro in Newark, Delaware. In the summer of 2015, Alegado entered a championship tournament for the RCP Heavyweight Championship. He made it to the semi-finals at RCP Festivus where he lost to Steeler in a 3-Way Dance also involving Chachi.
Personal life
Alegado and his wife Jackie have a daughter and a son: Victoria "Tori" Alegado (born 1993) and Michael Alegado, Jr. (born 1996). His wife suffered from epileptic seizures after a roller skating accident as a teenager. Fearing that Mike Alegado might suffer from a ring injury, she once threatened to stop dating him if he did not quit pro wrestling. Aside from chronic back pain, and a stress fracture in the back of one of his legs, Alegado escaped serious injury during his long career. He has attributed his longevity to a -hour, 6-day training regimen at Gold's Gym in Royersford. Alegado also avoided using anabolic steroids. Alegado worked as a training consultant and software designer during his wrestling career. As of 2010, he is employed as a Senior Technical Trainer at Qlik, a leading data and analytics technology firm based in King of Prussia, Pennsylvania.
Championships and accomplishments
American Wrestling Federation
AWF North American Championship (1 time)
Brookwood 4 Wrestling
B4W North American Championship (1 time)
Devastation Wrestling Federation
IPWA Hall of Fame (Class of 2011)
Empire Wrestling Federation
EWF Heavyweight Championship (1 time)
EWF Tag Team Championship (1 time) – with Mickey Gilligan
International Championship Wrestling
ICW Tag Team Championship (1 time) – with Tom Brandi
Maryland Wrestling Federation
MWF Hall of Fame (Class of 2014)
National Championship Wrestling
NCW Heavyweight Championship (1 time)
Pro Wrestling Illustrated
PWI ranked King Kaluha # 315 of the 500 best singles wrestlers of the PWI 500 in 1991
PWI ranked King Kaluha # 328 of the 500 best singles wrestlers of the PWI 500 in 1992
PWI ranked King Kaluha # 365 of the 500 best singles wrestlers of the PWI 500 in 1993
PWI ranked King Kaluha # 424 of the 500 best singles wrestlers of the PWI 500 in 1994
PWI ranked King Kaluha # 285 of the 500 best singles wrestlers of the PWI 500 in 1995
PWI ranked King Kaluha # 278 of the 500 best singles wrestlers of the PWI 500 in 1996
PWI ranked King Kaluha # 431 of the 500 best singles wrestlers of the PWI 500 in 2009
Pro-Wrestling WORLD-1
WORLD-1 Internet Television Championship (1 time)
WORLD-1 Tag Team Championship (1 time) – with Larry Sweeney
PWF Universal Tag Team Championship (1 time) – with Larry Sweeney
Pro Wrestling Zero1
Zero-One United States Heavyweight Championship (1 time)
3K Wrestling Fighting Athletes
3KW World Heavyweight Championship (1 time)
United States Wrestling Federation
USWF Heavyweight Championship (1 time)
United States Wrestling League
USWF Tag Team Championship (1 time) – with Siva Afi
Summit Wrestling Association
SWA Heavyweight Championship (1 time, Current)
References
External links
Kings of Independent Wrestling: Michael "King" Kaluha
King Kaluha at Cagematch.net
King Kaluha at Wrestlingdata.com
Professional wrestling record for King Kaluha from The Internet Wrestling Database
Living people
1958 births
Sportspeople from Pittsburgh
American male professional wrestlers
Professional wrestlers from Pennsylvania
Professional wrestling managers and valets
Professional wrestling trainers |
54414471 | https://en.wikipedia.org/wiki/GyazMail | GyazMail | GyazMail is an email client for macOS, developed and maintained by Japanese programmer Goichi Hirakawa. It supports the POP3, IMAP and SMTP protocols. It handles multiple email accounts as well as local mailboxes.
GyazMail is based on the native macOS Cocoa framework and written in Objective-C. For its search function, GyazMail uses the Oniguruma regular expression library, which supports a variety of character encodings; this is especially relevant for Asian languages.
Version 1.0.1 was released in 2003.
Features
Supports IMAP, POP3 and local storage folders
An option exists for setting the maximum line width when sending mail. This is helpful where automatic wrapping is not supported, such as on newsgroups, which expect lines hard-wrapped at a maximum of 72 characters (a generic illustration of such a hard wrap follows this list).
Message threading is supported
Multiple accounts
Sorting rules for incoming and outgoing mail can be defined
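The hard wrap mentioned in the line-width option above inserts real line breaks into the outgoing text instead of relying on the receiving client to reflow it. The following sketch is only a generic illustration of that behaviour using Python's standard textwrap module and does not reflect GyazMail's own Objective-C implementation.

```python
import textwrap

# Generic illustration of a hard wrap at 72 columns (the newsgroup
# convention mentioned above); not GyazMail's own code.
def hard_wrap(body, width=72):
    wrapped = []
    for paragraph in body.split("\n\n"):
        wrapped.append(textwrap.fill(paragraph, width=width))
    return "\n\n".join(wrapped)

message = ("This single long paragraph would normally be sent as one line "
           "and left to the receiving client to reflow; a hard wrap instead "
           "inserts real newline characters so that every line stays at or "
           "under the 72-character limit expected by many newsgroup readers.")
print(hard_wrap(message))
```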
References
External links
Official website
MacOS email clients
Unix Internet software |
864031 | https://en.wikipedia.org/wiki/Q-Chem | Q-Chem | Q-Chem is a general-purpose electronic structure package featuring a variety of established and new methods implemented using innovative algorithms that enable fast calculations of large systems on various computer architectures, from laptops and regular lab workstations to midsize clusters and HPCC, using density functional and wave-function based approaches. It offers an integrated graphical interface and input generator; a large selection of functionals and correlation methods, including methods for electronically excited states and open-shell systems; solvation models; and wave-function analysis tools. In addition to serving the computational chemistry community, Q-Chem also provides a versatile code development platform.
History
Q-Chem software is maintained and distributed by Q-Chem, Inc., located in Pleasanton, California, USA. It was founded in 1993 as a result of disagreements within the Gaussian company that led to the departure (and subsequent "banning") of John Pople and a number of his students and postdocs (see Gaussian License Controversy).
The first lines of the Q-Chem code were written by Peter Gill, at that time a postdoc of Pople, during a winter vacation (December 1992) in Australia. Gill was soon joined by Benny Johnson (a Pople graduate student) and Carlos Gonzalez (another Pople postdoc), but the latter left the company shortly thereafter. In mid-1993, Martin Head-Gordon, formerly a Pople student, but at that time on the Berkeley tenure track, joined the growing team of academic developers.
In preparation for the first commercial release, the company hired Eugene Fleischmann as marketing director and acquired its URL www.q-chem.com in January 1997. The first commercial product, Q-Chem 1.0, was released in March 1997. Advertising postcards celebrated the release with the proud headline, "Problems which were once impossible are now routine"; however, version 1.0 had many shortcomings, and a wit once remarked that the words "impossible" and "routine" should probably be interchanged! However, vigorous code development continued, and by the following year Q-Chem 1.1 was able to offer most of the basic quantum chemical functionality as well as a growing list of features (the continuous fast multipole method, J-matrix engine, COLD PRISM for integrals, and G96 density functional, for example) that were not available in any other package.
Following a setback when Johnson left, the company became more decentralized, establishing and cultivating relationships with an ever-increasing circle of research groups in universities around the world. In 1998, Fritz Schaefer accepted an invitation to join the Board of Directors and, early in 1999, as soon as his non-compete agreement with Gaussian had expired, John Pople joined as both a Director and code developer.
In 2000, Q-Chem established a collaboration with Wavefunction Inc., which led to the incorporation of Q-Chem as the ab initio engine in all subsequent versions of the Spartan package. The Q-Chem Board was expanded in March 2003 with the addition of Anna Krylov and Jing Kong. In 2012, John Herbert joined the Board and Fritz Schaefer became a Member Emeritus. In 2018, Evgeny Epifanovsky was named Chief Operations Officer. The following year, Shirin Faraji joined the Board; Peter Gill, who had been President of Q-Chem since 1998, stepped down; and Anna Krylov became the new president. The active Board of Directors currently consists of Faraji, Gill (past-President), Herbert, Krylov (President), and Hilary Pople (John's daughter). Martin Head-Gordon remains a Scientific Advisor to the Board.
Currently, there are thousands of Q-Chem licenses in use, and Q-Chem's user base is expanding, as illustrated by citation records for releases 2.0, 3.0, and 4.0, which reached 400 per year in 2016. Q-Chem has been used as an engine in high-throughput studies, such as the Harvard Clean Energy Project, in which about 350,000 calculations were performed daily on the IBM World Community Grid.
Innovative algorithms and new approaches to electronic structure have been enabling cutting-edge scientific discoveries. This transition, from in-house code to major electronic structure engine, has become possible due to contributions from numerous scientific collaborators; the Q-Chem business model encourages broad developer participation. Q-Chem defines its genre as open-teamware: its source code is open to a large group of developers. In addition, some Q-Chem modules are distributed as open source. Since 1992, over 400 man- (and woman-) years have been devoted to code development. Q-Chem 5.2.2, released in December 2019, consists of 7.5 million lines of code, which includes contributions by more than 300 active developers (current estimate is 312).
Features
Q-Chem can perform a number of general quantum chemistry calculations, such as Hartree–Fock, density functional theory (DFT) including time-dependent DFT (TDDFT), Møller–Plesset perturbation theory (MP2), coupled cluster (CC), equation-of-motion coupled-cluster (EOM-CC), configuration interaction (CI), algebraic diagrammatic construction (ADC), and other advanced electronic structure methods. Q-Chem also includes QM/MM functionality. Q-Chem 4.0 and higher releases come with the graphical user interface, IQMol, which includes a hierarchical input generator, a molecular builder, and general visualization capabilities (MOs, densities, molecular vibrations, reaction pathways, etc.). IQMol is developed by Andrew Gilbert (in coordination with Q-Chem) and is distributed as free open-source software. IQMol is written using the Qt libraries, enabling it to run on a range of platforms, including OS X, Windows, and Linux. It provides an intuitive environment to set up, run, and analyze Q-Chem calculations. It can also read and display a variety of file formats, including the widely available formatted checkpoint format. A complete, up-to-date list of features is published on the Q-Chem website and in the user manual.
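For readers unfamiliar with what an input generator such as IQMol ultimately produces, the sketch below writes a minimal single-point Q-Chem input file by hand. The $molecule/$rem section layout is the standard Q-Chem input structure, but the specific keywords and values shown (JOBTYPE, METHOD, BASIS) are illustrative assumptions and should be checked against the current user manual.

```python
# Hedged sketch: write a minimal Q-Chem single-point input file.
# The $molecule / $rem block structure is standard; the particular
# keywords and values below are illustrative only (see the manual).
WATER_XYZ = [
    ("O",  0.0000,  0.0000,  0.1173),
    ("H",  0.0000,  0.7572, -0.4692),
    ("H",  0.0000, -0.7572, -0.4692),
]

def write_qchem_input(path, atoms, charge=0, multiplicity=1):
    lines = ["$molecule", f"{charge} {multiplicity}"]
    lines += [f"{sym}  {x: .4f}  {y: .4f}  {z: .4f}" for sym, x, y, z in atoms]
    lines += ["$end", "", "$rem",
              "JOBTYPE   sp",       # single-point energy
              "METHOD    b3lyp",    # assumed keyword spelling; check manual
              "BASIS     6-31G*",
              "$end", ""]
    with open(path, "w") as handle:
        handle.write("\n".join(lines))

write_qchem_input("water_sp.in", WATER_XYZ)
```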
In addition, Q-Chem is interfaced with WebMO and is used as the computing engine in Spartan, or as a back-end to CHARMM, GROMACS, NAMD, and ChemShell. Other popular visualization programs such as Jmol and Molden can also be used.
In 2018, Q-Chem established a partnership with BrianQC, produced by StreamNovation, Ltd., a new integral engine exploiting the computational power of GPUs. The BrianQC plug-in speeds up Q-Chem calculations by taking advantage of GPUs on mixed architectures, which is highly efficient for simulating large molecules and extended systems. BrianQC is the first GPU Quantum Chemistry software capable of calculating high angular momentum orbitals.
Ground State Self-Consistent Field Methods
Restricted, unrestricted, and restricted open-shell formulations
Analytical first and second derivatives for geometry optimizations, harmonic frequency analysis, and ab initio molecular dynamics
Efficient algorithms for fast convergence
Variety of guess options (including MOM)
Density functional theory
Variety of local, GGA, mGGA, hybrid, double-hybrid, dispersion-corrected, range separated functionals (energies and analytic first and second derivatives)
TDDFT and spin-flip-TDDFT formulations (energies, gradients, and frequencies)
Constrained DFT
Innovative algorithms for faster performance and reduced scaling of integral calculations, HF/DFT and many-body methods
Dual basis
Resolution of identity
Cholesky decomposition of electron-repulsion integrals (a generic numerical sketch of this factorization follows the list)
Continuous Fast Multipole Method (CFMM)
Fast numerical integration of exchange-correlation with mrXC (multiresolution exchange-correlation)
Linear-scaling HF-exchange method (LinK)
Fourier transform Coulomb method (FTC)
COLD PRISM and J-matrix engine
Mixed-precision arithmetic for correlated methods
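The Cholesky-decomposition item above refers to factorizing the positive semidefinite matrix of electron-repulsion integrals into a tall, thin matrix times its transpose, so that far fewer quantities need to be stored and manipulated. The sketch below is a generic numerical illustration of such a pivoted, truncated Cholesky factorization on a synthetic low-rank matrix; it is not Q-Chem's implementation and says nothing about how the integrals themselves are computed.

```python
import numpy as np

# Generic pivoted, truncated Cholesky factorization V ~= L @ L.T of a
# positive semidefinite matrix (illustrative only, not Q-Chem's code).
def truncated_cholesky(V, tol=1e-8):
    diag = np.array(np.diag(V), dtype=float)   # residual diagonal
    vectors = []
    while diag.max() > tol:
        p = int(np.argmax(diag))               # pivot on the largest residual
        column = np.array(V[:, p], dtype=float)
        for l in vectors:                      # subtract earlier contributions
            column -= l[p] * l
        column /= np.sqrt(diag[p])
        vectors.append(column)
        diag -= column ** 2
    return np.column_stack(vectors)

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 15))
V = A @ A.T                                    # rank-15 PSD test matrix
L = truncated_cholesky(V)
print(L.shape)                                 # about (200, 15): few columns
print(np.max(np.abs(V - L @ L.T)))             # error at or below the tolerance
```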
Post Hartree–Fock methods
MP2 (including RI-MP2, energies and analytic gradients)
SCS, SOS-MP2, and OO-MP2
CCD, QCISD, CCSD, OOCCD, VOOCCD
(T), (2), (dT), and (fT) corrections
EOM-XX-CCSD methods for open-shell and electronically excited species (XX=EE, SF, IP, EA, DIP, DEA, 2SF; energies, properties, and gradients for most methods), including complex-valued variants for treating resonances (states metastable with respect to electron detachment)
Extensions of DFT and many-body methods to treat core-level states and related spectroscopies
ADC methods
CIS, TDDFT, CIS(D), and SOS-CIS(D) methods for excited states
Variety of implicit solvent models
Wave-function analysis tools enabled by libwfa developed by Felix Plasser and co-workers
QM/MM and QM/EFP methods for extended systems
Janus QM/MM interface
YinYang Atom model without linked atoms
ONIOM model
EFP method (including library of effective fragments, EFP interface with CC/EOM, DFT/TDDFT, and other methods)
Version history
Beginning with Q-Chem 2.0, only major release versions are shown.
Q-Chem 1.0: March 1997
Q-Chem 1.1: 1997
Q-Chem 1.2: 1998
Q-Chem 2.0: 2000
Q-Chem 3.0: 2006
Q-Chem 4.0: February 2012
Q-Chem 5.0: June 2017
Q-Chem 5.2.2: December 2019
Q-Chem 5.3.2: December 2020
Q-Chem 5.4: June 2021
See also
References
External links
Wavefunction, Inc., Spartan Graphical Interface accesses and includes Q-Chem computational Engines
WebMO
BrianQC
Computational chemistry software
Chemistry software for Linux
Science software that uses Qt
Proprietary commercial software for Linux
Proprietary software that uses Qt |
1492625 | https://en.wikipedia.org/wiki/Macintosh%20clone | Macintosh clone | A Macintosh clone, also known as a Clonintosh (a portmanteau of "Clone" and "Macintosh"), is a computer running the Mac OS operating system that was not produced by Apple Inc. The earliest Mac clones were based on emulators and reverse-engineered Macintosh ROMs. During Apple's short-lived Mac OS 7 licensing program, authorized Mac clone makers were able either to purchase 100% compatible motherboards or to build their own hardware using licensed Mac reference designs.
Since Apple's switch to the Intel platform, many non-Apple Wintel/PC computers are technologically so similar to Mac computers that they are able to boot the Mac operating system using a varying combination of community-developed patches and hacks. Such a Wintel/PC computer running macOS is more commonly referred to as a Hackintosh and the most popular community effort developing and sharing the requisite software patches is known as OSx86.
Background
The Apple II and IBM PC computer lines were "cloned" by other manufacturers who had reverse-engineered the minimal amount of firmware in the computers' ROM chips and subsequently legally produced computers that could run the same software. These clones were seen by Apple as a threat, as Apple II sales had presumably suffered from the competition provided by Franklin Computer Corporation and other clone manufacturers, both legal and illegal. At IBM, the threat proved to be real: most of the market eventually went to clone-makers, including Compaq, Leading Edge, Tandy, Kaypro, Packard Bell, Amstrad in Europe, and dozens of smaller companies, and in short order IBM found it had lost control over its own platform.
Apple eventually licensed the Apple II ROMs to other companies, primarily to educational toy manufacturer Tiger Electronics in order to produce an inexpensive laptop with educational games and the AppleWorks software suite: the Tiger Learning Computer (TLC). The TLC lacked a built-in display. Its lid acted as a holster for the cartridges that stored the bundled software, as it had no floppy drive.
Emulators
Long before true clones were available, the Atari ST could emulate a Mac by adding the third-party Magic Sac emulator, released in 1985, and, later, the Spectre, Spectre GCR, and Aladin emulators. The first three of those emulators required that the user purchase a set of Mac ROMs sold as system upgrades to Macintosh users. Later, multiple emulators were released for the Amiga.
Starting with the sales of PowerPC Macs, a CPU emulator to run 68000 applications was built into the Mac OS. By the time 68060 processors were available, PowerPC Macs had become so powerful that they ran 68000 applications faster than any 68000-based computer, including any Amiga, Atari ST or Sharp X68000, making it unnecessary for Apple to release a 68060-equipped Mac. This meant that even a 68060-upgraded Atari ST clone or Amiga, which avoided CPU emulation, was always slower; in addition, imperfect virtualization of the Mac system and of the remaining machine components caused some programs not to work.
Connectix also released another 68k emulator for Macs, called Speed Doubler, which replaced Apple's original emulator and was reported to be even faster. As the years went by, the emulator was not updated to work with later versions of the original Mac OS, supposedly because Apple's own 68k emulator eventually surpassed it in performance and because the OS itself relied further on native PowerPC code with each new Mac OS update.
There was also a software emulator for x86 platforms running DOS/Windows and Linux called Executor, from ARDI. ARDI reverse-engineered the Mac ROM and built a 68000 CPU emulator, enabling Executor to run most (but not all) Macintosh software, from System 5 to System 7, with good speed. The migration from 68000 to PowerPC, and the added difficulties of emulating a PowerPC on x86 platforms, made targeting the later Mac OS versions impractical.
Unlicensed clones
Wary of repeating history and wanting to retain tight control of its product, Apple's Macintosh strategy included technical and legal measures that rendered production of Mac clones problematic. The original Macintosh system software contained a very large amount of complex code, which embodied the Mac's entire set of APIs, including the use of the GUI and file system. Through the 1980s and into the 1990s, much of the system software was included in the Macintosh's physical ROM chips. Therefore, any competitor attempting to create a Macintosh clone without infringing copyright would have to reverse-engineer the ROMs, which would have been an enormous and costly process without certainty of success. Only one company, Nutek, managed to produce "semi-Mac-compatible" computers in the early 1990s by partially re-implementing System 7 ROMs.
This strategy, making the development of competitive Mac clones prohibitively expensive, successfully shut out manufacturers looking to create computers that would directly compete with Apple's product lines. However, companies like Outbound Systems, Dynamac and Colby Systems, were able to sidestep the Mac cloning process by targeting high-end, high-profit market segments without suitable product offerings from Apple and offering Mac conversions instead.
In the early 1980s, Brazil's military dictatorship instituted trade restrictions that prohibited the importation of computers from overseas manufacturers, and these restrictions were not lifted until 1993. A Brazilian company called Unitron (which had previously produced Apple II clones) developed a Macintosh clone with specifications similar to the Mac 512K, and proposed to put it on sale. Although Unitron claimed to have legitimately reverse-engineered the ROMs and hardware, and Apple did not hold patents covering the computer in Brazil, Apple claimed the ROMs had simply been copied. Ultimately, under pressure from the US government and local manufacturers of PC clones the Brazilian Computer and Automation Council did not allow production to proceed.
Hackintosh
When Apple migrated to the PC-Intel platform in the mid 2000s, Apple hardware was more or less the same as generic PC hardware from a platform perspective. This theoretically allowed for installation of Mac OS X on non-Apple hardware. Hackintosh is the term appropriated by hobbyist programmers, who have collaborated on the Internet to install versions of Mac OS X v10.4 onwards dubbed Mac OSx86 to be used on generic PC hardware rather than on Apple's own hardware. Apple contends this is illegal under the DMCA, so in order to combat illegal usage of their operating system software, they continue to use methods to prevent Mac OS X (now macOS) from being installed on unofficial non-Apple hardware, with mixed success. At present, with proper knowledge and instruction, macOS installation is more or less straightforward. Several online communities have sprung up to support end-users who wish to install macOS on non-Apple hardware. Some representative examples of these are Dortania and InsanelyMac.
Psystar Corporation
In April 2008, Psystar Corporation based in Miami, Florida, announced the first commercially available OSx86, a Wintel/PC computer with Mac OS X Leopard pre-installed partially with software from the OSx86 community project. Apple immediately sued in July 2008 and a protracted legal battle followed, ending in November 2009 with a summary judgement against Psystar. In May 2012, the U.S. Supreme Court denied Psystar's appeal, closing the case for good.
Licensed Macintosh clones
In 1992, Macworld published an editorial stating that Apple clones were coming, and that the company should license its technology to others so it would benefit as the overall Macintosh market grew.
By 1995, Apple Macintosh computers accounted for around 7% of the worldwide desktop computer market. Apple executives decided to launch an official clone program in order to expand Macintosh market penetration. Apple's Mac OS 7 licensing program entailed the licensing of the Macintosh ROMs and system software to other manufacturers, each of which agreed to pay a flat fee for a license, and a royalty (initially ) for each clone computer they sold. This generated quick revenues for Apple during a time of financial crisis.
From early 1995 through mid-1997, it was possible to buy PowerPC-based clone computers running Mac OS, most notably from Power Computing and UMAX. However, by 1996 Apple executives were worried that high-end clones were cannibalizing sales of their own high-end computers, where profit margins were highest.
A total of 75 distinct Macintosh clone models are known to have been introduced during the licensee era.
The following companies produced licensed Mac clones:
Jobs ends the official program
Soon after Steve Jobs returned to Apple in 1997, he personally tried to renegotiate licensing deals more favorable to Apple five times over the course of three weeks and in his words each time was "basically told to pound sand". This response caused him to halt negotiations of upcoming licensing deals with OS licensees that Apple executives complained were still financially unfavorable.
Because the clone makers' licenses were valid only for Apple's System 7 operating system, Apple's release of Mac OS 8 left the clone manufacturers without the ability to ship a current Mac OS version and effectively ended the cloning program. Apple bought Power Computing's Mac clone business for and gave their users free Mac OS 8 upgrade disks, ending the clone era. Only UMAX ever obtained a license to ship Mac OS 8 and get Mac OS 8 upgrade disks, which expired in July 1998 (Power Computing also got Mac OS 8 disks through its acquisition by Apple).
All other manufacturers had their Macintosh clone contracts terminated by late 1997 and either continued their brands as PC clones or discontinued them altogether. Some of the clone manufacturers even went out of business. Reportedly, a heated telephone conversation between Jobs and Motorola CEO Christopher Galvin resulted in the contentious termination of Motorola's clone contract, with the long-favored Apple being demoted to "just another customer", mainly for PowerPC CPUs.
In 1999, Jobs had discussions with Ben Rosen, Chairman and interim CEO of Compaq at the time, for the world's then-largest Wintel PC manufacturer to license Mac OS, which would have been a coup for Apple. However no agreement was reached, as Apple had second thoughts about licensing its "crown jewel", while Compaq did not want to offend Microsoft, which it had partnered with since its founding in 1982. By 2007, five years after Compaq merged with HP, Rosen told Jobs he had switched to being a Mac user.
In 2001, Jobs reportedly had a meeting with Sony executives, saying he was "willing to make an exception" for Sony VAIO to run Mac OS X, although the negotiations later fell through.
Since Apple transitioned the Macintosh to an Intel platform in 2006, and subsequent to a major increase in visibility and a gain in computer market share for Apple with the success of the iPod, large computer system manufacturers such as Dell have expressed renewed interest in creating Macintosh clones. While various industry executives, notably Michael Dell, have stated publicly that they would like to sell Macintosh-compatible computers, Apple VP Phil Schiller said the company does not plan to let people run Mac OS X (macOS) on other computer makers' hardware. "We will not allow running Mac OS X on anything other than an Apple Mac," he said.
Macintosh conversion
Unlike Mac clones that contain little or no original Apple hardware, a Mac conversion is an aftermarket enclosure kit that requires the core components of a previously purchased, genuine Apple Mac computer, such as the Macintosh ROM or the motherboard, in order to become a functional computer system. This business model is most commonly used in the car industry, with one of the most famous examples being the Shelby Mustang, a high performance variant of the Ford Mustang, and is protected in the U.S. by the First-sale doctrine and similar legal concepts in most other countries.
While Mac clones traditionally aim to compete directly with Apple's solutions through lower prices, Mac conversions target market segments that lack dedicated solutions from Apple, and where the need for a Mac solution is high enough to justify the combined cost of the full price of the Mac donor computer plus the price of the conversion kit & labor.
The following companies produced Mac conversions:
See also
IBM PC clone
References
External links
Mac Clones and New O/S movie from archive.org
Mac Clones by Manufacturer (at EveryMac.com)
Macintosh clones (at LowEndMac.com)
Infos on all macs and clones (incl. details on some mainboard PCBs / at MacInfo.de)
Apple Squeezes Mac Clones Out of the Market (at LowEndMac.com) |
69196954 | https://en.wikipedia.org/wiki/System%2011 | System 11 | System 11 may refer to:
Computing
Namco System 11, the arcade system board
X Window System (or X11), a windowing system
Operating systems
Android 11, the Google operating system
PDP-11 operating systems
RT-11, real-time operating system
RSX-11
DSM-11
BATCH-11/DOS-11
Ultrix-11
Linux operating system distribution versions:
Debian 11, the Debian Project distribution
Fedora 11, the Red Hat-sponsored distribution
Mandriva 11, the Mandriva distribution
Mint 11, the Ubuntu-based distribution
openSUSE 11, the openSUSE Project distribution
Ubuntu 11.04 and Ubuntu 11.10, the Canonical distributions (2011)
Windows 11, the Microsoft operating system
Other
STS-11 (Space Transportation System-11), a cancelled Space Shuttle mission
Undecimal numbering system
See also
OS 11 |
508309 | https://en.wikipedia.org/wiki/Iphitos | Iphitos | Iphitos or Īphitus (; Ancient Greek: Ἴφιτος) is the name of six individuals in Greek mythology.
Iphitos, son of Eurytus, king of Oechalia.
Iphitos, son of Naubolus and king of Phocis; others say his father was the son of Hippasus from the Peloponnesus. He entertained Jason when he consulted the Delphic Oracle and later joined the Argonauts. Iphitos was an ally of the Thebans in the war of the Seven against Thebes, leading the men from Phocis and the cities of Panope, Daulis, Cyparissos, Lebadia and Hyampolis during the war. By his wife Hippolyte or Thrasybule, Iphitos became the father of Schedius and Epistrophus, who led the Phocians in the Trojan War.
Iphitos, an Elean who was killed by Copreus, son of Pelops; Copreus fled from Elis after the murder and was later purified by King Eurystheus in Mycenae. According to the writer Alcman, Iphitos, along with Lycurgus, was among the participants in the first Olympic Games.
Iphitus, father of Eurynome, who was the mother of King Adrastus of Argos, one of the Seven against Thebes.
Iphitos, an elderly Trojan during the Trojan War. In Book VIII of the Iliad, his son Archeptolemus became the charioteer of Hector after Eniopeus was killed by Diomedes; Teucer then killed Archeptolemus in the same battle. In Book II of the Aeneid, Aeneas names Iphitos among half a dozen Trojan heroes who fought by his side during the fall of Troy. When the battle turned against them, Iphitos was the only one of these who remained standing, and he was apparently by Aeneas's side until King Priam was killed. In some accounts, Iphitos was also the father of Coeranus, who was killed by Odysseus.
Iphitos, king of Elis, restored the Olympic Games after the Dorian invasion. The restoration came after he asked the Oracle at Delphi about what should be done to save Greece from civil war and the diseases that were killing the population. The Oracle answered: "Iphitos and the people of Elis must declare a sacred truce for the duration of the game and revive the Olympic Games".
Notes
References
Gaius Julius Hyginus, Fabulae from The Myths of Hyginus translated and edited by Mary Grant. University of Kansas Publications in Humanistic Studies. Online version at the Topos Text Project.
Gaius Valerius Flaccus, Argonautica translated by Mozley, J H. Loeb Classical Library Volume 286. Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1928. Online version at theoi.com.
Gaius Valerius Flaccus, Argonauticon. Otto Kramer. Leipzig. Teubner. 1913. Latin text available at the Perseus Digital Library.
Homer, The Iliad with an English Translation by A.T. Murray, Ph.D. in two volumes. Cambridge, MA., Harvard University Press; London, William Heinemann, Ltd. 1924. Online version at the Perseus Digital Library.
Homer, Homeri Opera in five volumes. Oxford, Oxford University Press. 1920. Greek text available at the Perseus Digital Library.
Pseudo-Apollodorus, The Library with an English Translation by Sir James George Frazer, F.B.A., F.R.S. in 2 Volumes, Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1921. Online version at the Perseus Digital Library. Greek text available from the same website.
Publius Ovidius Naso, Metamorphoses translated by Brookes More (1859-1942). Boston, Cornhill Publishing Co. 1922. Online version at the Perseus Digital Library.
Publius Ovidius Naso, Metamorphoses. Hugo Magnus. Gotha (Germany). Friedr. Andr. Perthes. 1892. Latin text available at the Perseus Digital Library.
Publius Papinius Statius, The Thebaid translated by John Henry Mozley. Loeb Classical Library Volumes. Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1928. Online version at the Topos Text Project.
Publius Papinius Statius, The Thebaid. Vol I-II. John Henry Mozley. London: William Heinemann; New York: G.P. Putnam's Sons. 1928. Latin text available at the Perseus Digital Library.
Publius Vergilius Maro, Aeneid. Theodore C. Williams. trans. Boston. Houghton Mifflin Co. 1910. Online version at the Perseus Digital Library.
Publius Vergilius Maro, Bucolics, Aeneid, and Georgics. J. B. Greenough. Boston. Ginn & Co. 1900. Latin text available at the Perseus Digital Library.
Tzetzes, John, Allegories of the Iliad translated by Goldwyn, Adam J. and Kokkini, Dimitra. Dumbarton Oaks Medieval Library, Harvard University Press, 2015.
Kings of Phocis
Kings in Greek mythology
Argonauts
Trojans
Characters in the Argonautica
Phocian characters in Greek mythology
Achaean characters in Greek mythology
Characters in Greek mythology |
508309 | https://en.wikipedia.org/wiki/Novay | Novay | Novay, formerly known as the Telematica Instituut, was a Dutch research institute in the field of information technology, founded in 1997 and known for its development of ArchiMate. In 2009 the Telematica Instituut was reorganized and operated under the new name Novay. It filed for bankruptcy on April 3, 2014, and was subsequently dissolved.
Overview
Novay was a public-private partnership of knowledge institutes and companies with the objective of increasing the competitive power and innovative capability of the Dutch business community. It was managed and funded by top companies and the government, and it aimed to translate fundamental knowledge into market-oriented research for the public and private sectors in the field of telematics: multimedia, electronic commerce, mobile communications, CSCW, knowledge management, etc.
The work of Novay focused on total solutions that could be applied directly in businesses and society at large. It gathered insights from a wide range of subject areas, such as information technology, business economics, organisational science, psychology and sociology. These insights were developed further in multidisciplinary teams and forged into new concepts, products and services in the area of information and communication technology.
Research and development
Notable Telematica Instituut/Novay research projects focused on:
Testbed, an architecture of business processes: a software tool and supporting resources for analysing, designing, simulating and introducing business processes. The tool provides information on the consequences of changes in business processes in areas such as service levels, production time and costs, workflow and automation, even before the actual processes have been introduced. In addition, the graphical interface makes the processes understandable for non-specialists.
Friends, Customised Internet services: This project delivered a component-based middleware platform for the development, rollout and use of all kinds of Internet services.
ArchiMate: An open and independent modelling language for enterprise architecture, supported by different tool vendors and consulting firms. It provides instruments to support enterprise architects in describing, analyzing and visualizing the relationships among business domains in an unambiguous way. ArchiMate is one of the open standards hosted by the Open Group and based on the IEEE 1471 standard. The ArchiMate framework was developed by the Telematica Instituut to offer companies a simple and intuitive concept that reconciles both IT and business aspects. The standard is constantly being enhanced through the work of an international forum, the ArchiMate forum.
See also
European Research Center for Information Systems
IYOUIT
Maeslantkering
MobileHCI
Personal knowledge management
References
External links
Novay.nl confirming "Novay ends its activities"
Enterprise modelling
Information technology research institutes
Research institutes in the Netherlands |
9815218 | https://en.wikipedia.org/wiki/Imeem | Imeem | The online service imeem was a social media website where users interacted with each other by streaming, uploading and sharing music and music videos. It operated from 2003 until 2009 when it was shut down after being acquired by MySpace.
The company was founded in 2003 by Dalton Caldwell (formerly of VA Linux) and Jan Jannink (formerly of Napster), and many of its core engineering team came from the original Napster file-sharing service. The company took its name from "meme", a term coined by biologist Richard Dawkins to describe the ideas and cultural phenomena that spread as if they had a life of their own.
Helping to pioneer the free, advertising-supported model for online music, imeem permitted consumers to legally upload, stream and share music and music playlists for free with the costs supported by advertising. In 2007, imeem became the first-ever online music site to secure licenses from all four U.S. major music labels to offer their music catalogs for free streaming and sharing on the web.
The company also created the web's first embeddable music and video playlists. People could use imeem's widgets to embed songs and playlists from imeem virtually anywhere on the web, including on their MySpace and Facebook profiles or on their personal blogs.
Headquartered in San Francisco's South of Market district (SoMa), imeem had additional offices in New York City and Los Angeles. The company's investors included Morgenthaler Ventures and Warner Music Group.
On December 8, 2009, imeem was bought out by MySpace Music in a fire sale for an undisclosed amount, reported to have been less than $1 million.
Business model
Revenue generation at imeem was through a combination of direct and indirect advertising sales, sales of MP3 downloads, ringtones and concert tickets, and subscription revenue from premium services. The bulk of its revenue came from advertising; advertisers who ran direct campaigns with imeem included TheTruth.com/American Legacy Foundation, Kia Motors, and Dr Pepper, among others.
The company was one of the pioneers of the ad-supported streaming music model. In 2007, imeem became the first-ever online music site to secure licenses from all four major music labels to offer their music catalogs for free streaming and sharing on the web.
Under this model, artists and labels were paid a share of imeem's ad revenue in proportion to the popularity of their music on imeem, and had the right to register their content and determine how (or whether) that content is available on the site or through its embeddable widgets.
This business model was made possible by imeem's proprietary content fingerprinting and digital registry technology. Initially, imeem licensed this technology from SNOCAP, the digital rights and content management startup founded by Napster creator Shawn Fanning. In 2008, imeem acquired SNOCAP and its technology outright. imeem continued to operate the SNOCAP digital registry, and used the technology acquired from SNOCAP to power its ad-supported streaming music service.
Products
The company provided two main services: imeem.com, where people could discover, stream and share music and music videos for free, and imeem Mobile, an Internet radio service for mobile devices. In addition, the company offered a premium service, imeem VIP, that gave people access to additional features on the imeem site.
imeem.com
Registered users of the site could stream and share millions of songs and tens of thousands of music videos free of charge, with the costs for licensing and streaming supported by advertising on the site and on imeem Mobile.
One of imeem's key features was the playlist. Users could create personal playlists, via a "Create Music Playlist" page, with music they had uploaded themselves or with music and video already available on the site. They could publish and share these playlists on imeem, where they could be played by, shared with, commented on, or tagged by other users.
Visitors could also share music, videos and playlists beyond imeem by embedding imeem players into external sites.
imeem Mobile
With the free cell phone application imeem Mobile, users could discover, purchase and enjoy music on their mobile device. It was available as a free download to users on the Android and iPhone/iPod touch platforms.
Users could create custom Internet radio stations based on their favorite artists, discover new music through personalized recommendations and buy DRM-free MP3 downloads directly onto their mobile device (on Android, downloads were from the Amazon MP3 application; on the iPhone and iPod touch, downloads were from the iTunes Store).
The app also enabled people to browse and stream their personal imeem music libraries to their mobile device. People could upload up to 20,000 songs of the music they own directly to imeem.com, and then access that music through their mobile devices. To upload more than 100 songs, users had to subscribe to one of imeem's premium services.
The company introduced imeem Mobile on the Android platform in October 2008, and launched it for iPhone and iPod touch users in May 2009. At launch, it was the only streaming music application on the Android platform, which in turn led to it being one of the most popular applications installed on Android devices. In June 2009, imeem Mobile crossed a milestone of over 1 million installs on the Android and iPhone platforms.
The application received several awards, including a 2009 Crunchie Award for Best Mobile Application, the Editor's Choice award from the blog AndroidGuys, and an award for 'Best Streaming Music App' in the 2009 Android Network Awards.
imeem VIP
In 2008, imeem launched a premium imeem VIP service that gave subscribers access to additional site features, most prominently the ability to upload more music (over 100 songs) and to watch videos at up to 1080p resolution. There were three imeem VIP subscription tiers. The "VIP Lite" plan, at US$9.99 per year, gave subscribers access to streaming songs through the VIP Player and 480p video (up from 360p for basic users). The "VIP" subscription allowed uploading of up to 1,000 songs and viewing of 720p video for $29.99 per year, and the "VIP Plus" subscription allowed uploading of up to 20,000 songs and viewing of 1080p video for $99.99 per year.
History
Early history: 2003-2005
The imeem service changed drastically from its original inception as a messaging application that let people communicate by online chat, blogging, instant messaging and file sharing. The service was billed as a "distributed, peer-to-peer, social network".
Founder Dalton Caldwell began working on what would become the imeem messaging application during Thanksgiving weekend in 2003. Initially, he worked on the software from his home. In 2004, imeem moved into offices in downtown Palo Alto's 285 Hamilton building, with Caldwell, Jannink and a small team of engineers continuing work on the software.
The company first unveiled its software in February 2005 at the DEMO conference and formally launched it that August.
When imeem first launched, to use the service, users were required to download and install the desktop messaging and file-sharing software; the imeem Web site merely existed as a means for users to register and download the client. Though originally designed for messaging, it was the file-sharing function that took off. The client software supported the service's distributed database model: Every imeem client on the network had a database that would generate and store references to media content shared on the network; this system would accelerate access to content deemed to be close to the user. The service's media-sharing was peer-to-peer – if a user shared photos or a podcast, then the data would only exist on the client database network; users who wanted to view the actual content would access it by peering directly with the publisher.
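A small sketch can illustrate the idea behind that design: each client keeps only lightweight references to shared media and resolves the actual bytes by contacting the publishing peer directly. The names and structure below are hypothetical, chosen only to mirror the description above; imeem never published its client's internals.

```typescript
// Hypothetical sketch of a client-side reference index with peer-to-peer fetch.
// None of these names come from imeem's actual software.
interface MediaReference {
  contentId: string;     // identifier for a shared photo, podcast, etc.
  publisherPeer: string; // address of the peer that actually holds the bytes
  title: string;
}

class ClientReferenceIndex {
  private refs = new Map<string, MediaReference>();

  // Store only a reference when another client announces shared content.
  addReference(ref: MediaReference): void {
    this.refs.set(ref.contentId, ref);
  }

  // To view the content itself, peer directly with the publisher.
  async fetchContent(contentId: string): Promise<Uint8Array | undefined> {
    const ref = this.refs.get(contentId);
    if (!ref) return undefined;
    return fetchFromPeer(ref.publisherPeer, contentId); // direct peer transfer
  }
}

// Placeholder for the peer-to-peer transfer; a real client would open a
// connection to the publishing peer and stream the file from it.
async function fetchFromPeer(peer: string, contentId: string): Promise<Uint8Array> {
  return new Uint8Array();
}
```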
2006
In March 2006, imeem re-launched at the South by Southwest (SXSW) conference in Austin, Texas, with a new focus on enabling people to interact through the imeem.com website, using media (photos, videos, music) to express their personalities and interests. Timed to coincide with the re-launch, imeem introduced new features enabling users to upload and play music and video on the site.
In September 2006, imeem introduced embeddable Adobe Flash-based playlists that gave people the ability to take music and video playlists they created (or found) on the site and embed them virtually anywhere on the web. The company's players quickly became popular with consumers using MySpace and other social networks, giving them a way to customize & personalize their profiles with music.
As a result, imeem quickly gained traction, with the site's traffic growing to 10 million unique monthly users by the end of 2006. By March 2007, imeem's monthly traffic reached over 16 million unique monthly users.
2007
In February 2007, MySpace took steps to limit the posting of imeem content on its site: any updates or comments with "imeem" even mentioned in them were removed upon posting. However, MySpace stopped blocking imeem in 2008.
In March 2007, imeem announced it was partnering with SNOCAP, the digital rights and content management startup founded by Napster creator Shawn Fanning, to enable legal uploading, streaming and sharing of music on imeem, utilizing SNOCAP's content fingerprinting and digital registry technology. The goal was to provide a way for consumers to upload and share music with their friends, for free, and to do so in a way where label and artists can both make money and have greater control over where and how their music was available.
The partnership marked imeem's move to an ad-supported model, in which consumers could freely stream and share music and video content with the costs supported by advertising. Under this model, artists and labels are paid a share of the site's ad revenue in proportion to the popularity of their music on imeem, and have the right to register their content and determine whether that content is available on the site or through its embeddable widgets.
Ultimately, the imeem messaging and file-sharing application had proven to be something of a resource hog for power-users, since the database could grow to large proportions just by associating with a few individuals who were sharing a lot of content. This messaging product was ultimately phased out; the site became entirely Web-based beginning in June 2007. While this distributed model was interesting and received positive press, it proved to be difficult to attract many users since the only way to participate was to download the imeem client software. Over time, imeem integrated many of the client's features into its website and the innovative distributed database model was centralized.
Throughout the first half of 2007, imeem had negotiated with the major labels to secure licenses for this new model. Warner Music Group and imeem announced a licensing agreement for imeem's new Web-based service in July 2007, followed by Sony Music Entertainment and EMI Music in September. In December 2007, imeem signed a licensing agreement with Universal Music Group, becoming the first online music service to partner with all four major music labels to let people legally stream and share music for free online.
2008
On January 28, 2008, imeem announced that it was acquiring the music locker service Anywhere.FM.
On February 1, 2008, imeem acquired SNOCAP. It had already been making extensive use of SNOCAP's audio fingerprinting technology and music database. As part of the acquisition, SNOCAP's chief operating officer, Ali Aydar (ex-Napster), joined imeem.
On March 24, 2008, imeem announced the launch of a developer's platform that would permit third-party developers to interact with imeem data.
In April 2008, imeem received a new round of funding from Sequoia Capital.
In October 2008, imeem launched imeem Mobile, a free mobile music application. However, on the 22nd of that month, the company laid off 25% of its staff.
2009
The company's troubles continued into 2009, as Warner Music wrote off its entire $15 million investment in imeem in anticipation that no return would come of it; at the same time, Warner did not extend its licensing deal with imeem.
It seemed possible the company could close in April 2009, but it was able to renegotiate deals with its major label partners, and subsequently found enough new investors to continue service. Sources told TechCrunch that imeem raised $6 million in this most recent funding round, with Morgenthaler Ventures and Warner Music Group among those investing.
The company launched imeem Mobile for the iPhone and iPod touch in May 2009. In June 2009, imeem Mobile crossed a milestone of over 1 million installs on the Android and iPhone platforms.
On June 25, 2009, imeem announced that it was removing all user-uploaded video and photos from the site. This move, and the lack of advance notice, was unpopular with many users.
In October 2009, imeem and Google announced the integration of links to music on imeem within Google search results; imeem was one of several online music companies involved in such efforts.
Closure
On December 8, 2009, MySpace (owned by News Corporation) acquired imeem, and angered many imeem users when the new parent company closed down the beleaguered service on the same day, redirecting all imeem traffic to MySpace Music. Furthermore, the MySpace social network did not pay artists or labels the money still owed to them by imeem for music streaming. The controversial closure was criticized as a sign that MySpace was out of touch with the times. On December 22, 2009, MySpace assured imeem.com users that their playlists were safe, that it was duplicating every user's playlist and would migrate them onto MySpace Music as soon as possible, and that features and functionality users were accustomed to at imeem would soon find their way onto MySpace, complementing the existing platform alongside free full-song streaming, artist profiles, music videos, and more. MySpace said it would email imeem users instructions on how to claim their playlists.
On January 15, 2010, MySpace began allowing users to import imeem playlists. However, songs that were not available via MySpace Music were not converted over, and no means was provided even to recover the names of the missing tracks. Additionally, user "favorites" metadata could not be carried over, with the result that users who depended upon their favorites lists instead of normal playlists were unable to retrieve their music. Other complaints included incorrect artist info, garbled tracks, and an increase in between-song advertising.
Technology
The back-end software for imeem's services was primarily written in C#. While most of the front-end Web servers ran under Microsoft Windows, some used the Linux operating system. The Web site heavily used Ajax programming and Flash animation.
Audio streams were delivered in 128 kbit/s MP3 (MPEG Audio Layer III) format. Video was encoded with the Sorenson Video codec at over 700 kbit/s in the Flash Video (FLV) container format, resized to 400 pixels wide while preserving its aspect ratio, with embedded MP3 audio at 96 kbit/s. While the video quality and resolution were significantly better than those of other video sites at the time (YouTube, for example, used 300 kbit/s video), the late-arriving video sharing aspect of imeem was largely eclipsed by the original audio sharing component. In 2008 imeem upgraded the video quality further and became one of the first media sharing sites to offer video encoded with the MPEG-4 H.264 codec and at the original source resolution.
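Those figures imply a simple resize rule: scale every source video to a fixed 400-pixel width and derive the height from the source aspect ratio. The helper below merely restates that arithmetic for illustration; it is not code from imeem's encoder.

```typescript
// Illustrative restatement of the resize rule described above (not imeem code).
const TARGET_WIDTH = 400;       // pixels, per the description above
const VIDEO_BITRATE_KBPS = 700; // approximate lower bound cited above
const AUDIO_BITRATE_KBPS = 96;  // embedded MP3 audio

// Preserve the source aspect ratio at the fixed target width.
function targetDimensions(srcWidth: number, srcHeight: number) {
  const height = Math.round(TARGET_WIDTH * (srcHeight / srcWidth));
  return { width: TARGET_WIDTH, height };
}

// A 640x480 (4:3) source becomes 400x300; a 1280x720 (16:9) source becomes 400x225.
console.log(targetDimensions(640, 480));  // { width: 400, height: 300 }
console.log(targetDimensions(1280, 720)); // { width: 400, height: 225 }
```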
The original imeem client software conducted most of its network activity using an encrypted protocol, making it difficult to monitor user activity. Thus, conversations via the client's IM functionality and group chats were encrypted and only visible to participants. On startup, the application validated with a central server. This ensured that unauthorized clients could not connect and run malicious exploits (such as for monitoring network traffic or spoofing identities) against the network. Software updates were also delivered via the client and authenticated before they were installed. The company's move to a Web-based file-sharing business model in 2007 made most of these considerations moot.
Copyright infringement lawsuit
Warner Music Group announced on July 12, 2007 that it had dropped its copyright infringement lawsuit against imeem by agreeing to license its music and video content to the site for a percentage of imeem's advertising revenue. Financial details were not disclosed. Under the agreement, imeem could carry music and videos from all of the record company's artists. Warner also released financial statements indicating that it invested $15 million into imeem.
Meems (groups)
Users of imeem could link to each other through topic groups (which were originally called meems), based on common interests. Some meems were created by imeem itself, while others were user-generated. Media content could be placed in custom profile pages and topic groups, as well as in browseable content channels and charts. Meems could serve as basic online communities for artists, bands, clubs, films, schools, festivals, concert tours, friends, and sports enthusiasts. A late redesign of the site replaced most of the "meem" references with the familiar word "group". Early on, it had been possible for links to be made between groups which had related subject matter, but this feature had only been implemented in the original client software. After the transition to the Web-only service model, it became impossible for users to add (or even remove) such links, although official imeem-created groups sometimes had links added at creation time, by an administrative means not available to subscribers.
References
External links
(defunct)
SNOCAP site
2009 disestablishments in the United States
Android (operating system) software
BlackBerry software
Blog hosting services
Companies based in San Francisco
Defunct digital music services or companies
Defunct social networking services
Defunct websites
Instant messaging clients
Internet properties established in 2004
Internet properties disestablished in 2009
Internet radio in the United States
Online music and lyrics databases
Former video hosting services
Former News Corporation subsidiaries
Defunct video on demand services
Defunct online companies of the United States
American music websites |
1214676 | https://en.wikipedia.org/wiki/Flight%20Unlimited | Flight Unlimited | Flight Unlimited is a 1995 aerobatic flight simulator video game developed and published by LookingGlass Technologies. It allows players to pilot reproductions of real-world aircraft and to perform aerobatic maneuvers. They may fly freely, race through floating rings against a timer or take lessons from a virtual flight instructor. The instructor teaches basic and advanced techniques, ranging from rudder turns to maneuvers such as the tailslide, Lomcovák and Immelmann turn.
Flight Unlimited was the first self-published game released by Looking Glass Technologies. It was intended to establish the company as a video game publisher and to compete with flight simulator franchises such as Microsoft Flight Simulator. Project leader Seamus Blackley, a particle physicist and amateur pilot, conceived the game in 1992. He felt that other flight simulators failed to convey the experience of real flight, and he reacted by coding a simulated atmosphere for Flight Unlimited based on real-time computational fluid dynamics. Aerobatic pilot Michael Goulian endorsed the game and assisted the team in making it more true to life.
Flight Unlimited received positive reviews from critics and was a commercial success; its sales exceeded 780,000 copies by 2002. Reviewers lauded its realism, flight instruction, graphics and sense of flight, but some criticized its high system requirements. The game was followed by two sequels: Flight Unlimited II (1997) and Flight Unlimited III (1999). A combat-oriented successor, Flight Combat, was released in 2002 as Jane's Attack Squadron after a series of setbacks. Soon after Flight Unlimited's completion, Blackley was fired from Looking Glass. He went on to design Jurassic Park: Trespasser at DreamWorks Interactive and later spearhead the Xbox project at Microsoft.
Gameplay
Flight Unlimited is a three-dimensional (3D) flight simulator video game: its gameplay is a simulation of piloting real-world airplanes. Players may control the Bellanca Decathlon, Extra 300S, Pitts Special S-2B, Sukhoi Su-31 and Grob G103a Twin II sailplane. The game begins at the fixed-base operator (FBO) interface—a traversable 3D room whose contents represent menu options. For example, the player interacts with a row of scale airplane models to select an aircraft, and with a world globe to change airfield locations. Six settings are available, including Sedona, Arizona and Springfield, Vermont.
The player may choose to begin flight on a runway or taxiway, or in the air. Aircraft are controlled via keyboard, joystick, head-mounted display or specialized input devices such as pedals. During flight, several third- and first-person camera angles may be selected. For example, the third-person Flyby View places the camera in front of the plane as it flies past, while the first-person Three-Way View displays more information about the plane's position and speed than other angles. Certain camera angles, including the Three-Way View and 3-D Cockpit view, provide the player with simulated flight instruments such as an altimeter, airspeed indicator, accelerometer, variometer and tachometer. The game is designed to allow players to perform aerobatic maneuvers such as the Immelmann turn, tailslide, Lomcevak and Cuban Eight. Performances may be recorded and played back, with controls that allow the player to pause, rewind and fast forward. At any time, the player may stop a recording and resume flight from that point.
The game contains lessons that cover basic and advanced flight techniques, ranging from rudder turns to challenging aerobatic maneuvers. A simulated flight instructor offers real-time advice based on the player's performance. Certificates are earned by performing well during lessons. In Hoops courses, the player undertakes a time trial through rings that float in the sky, with the option to enable a "ghost plane" of the highest score. Four types of Hoops courses are available: Basic, Challenge, Distance and Trick. The last is intended as a highly demanding test of the player's aerobatic ability. The game's sole non-powered aircraft, the Grob G103a Twin II sailplane, features its own game mode focused on energy management. The player attempts to use the direction of the wind, thermals—which realistically occur above areas that absorb more heat, such as plains and parking lots—and the orographic lift caused by slopes to stay airborne for as long as possible.
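In soaring terms, the energy being managed is the glider's total mechanical energy: potential energy from altitude plus kinetic energy from airspeed. The relationship below is the standard physics of soaring rather than a formula taken from the game's code; altitude and airspeed can be traded against each other, while thermals and ridge lift are what add to the total.

```latex
% Total mechanical energy of a glider (standard soaring relationship,
% shown for illustration; not extracted from Flight Unlimited):
\[
  E_{\text{total}} = \underbrace{m g h}_{\text{potential (altitude)}}
                   + \underbrace{\tfrac{1}{2} m v^{2}}_{\text{kinetic (airspeed)}}
\]
```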
Development
Origin
The concept of Flight Unlimited originated from Looking Glass Technologies' discontent with contemporary flight simulators. Company co-founders Paul Neurath and Ned Lerner wanted to develop an exceptional game in the genre, and Neurath considered the idea during the production of Ultima Underworld: The Stygian Abyss and Ultima Underworld II: Labyrinth of Worlds. In 1992, Seamus Blackley, who had been undertaking graduate studies in particle physics at the Fermilab research facility, was hired through a want advertisement that Lerner had placed on a bulletin board. At the company, Blackley programmed the physics modeling system for a racing game and designed a large number of standalone physics demonstrations. He became fascinated by physics programming. An amateur pilot and flight devotee, Blackley asked Lerner extensive questions about his earlier game Chuck Yeager's Advanced Flight Trainer, which Blackley held in high regard. In reaction to Blackley's enthusiasm, Neurath suggested that the company develop a "traditional Cessna sim".
However, Blackley instead proposed an aerobatics training simulation, which he had conceived while reading an aerobatics magazine on a Lexington, Massachusetts bus. Collaborating with Ultima Underworld II programmer Greg Travis, he created a thirty page concept document that outlined the game. His core idea was to recreate the "yummy, visceral, fluid feeling that you get when flying a real airplane". He wanted the project to bear more resemblance to a playground than to a video game, and he sought to give it simple controls and realistic terrain to decrease the learning curve for beginners. Blackley assumed the role of project leader and then engaged the team in "flaming sessions" to generate ideas. According to programmer Doug Church, Blackley's concept of the game was not fully developed, but he clearly expressed his thoughts and motivated the team. The first months of the project produced disparate prototypes that demonstrated prospective features. The company committed to full development of the game in early 1993, and production commenced in March.
Production
Blackley's first objective was to code the game's simulated physics. He began by deciding on a programming method—in particular, he sought one that would allow aircraft to perform the "knife-edge spin" maneuver that he had witnessed at air shows. In 1995, he said that he had never played a flight simulator with an accurate sense of flight. He later described his belief that the genre had stagnated, and that flight games were evaluated "by [their] implementation of the standard feature set", rather than by their enjoyability. Blackley researched physics programming in contemporary flight simulators, and he discovered that many used large databases of wind tunnel and plane sensory equipment information to dictate how aircraft would operate in prerecorded scenarios. Higher-end simulators used a "Newtonian" system, in which algebra-based measurements of force vectors determine a plane's position in real-time. However, Blackley believed that neither system correctly simulated the experience of flight.
In reaction, he used his knowledge of particle physics to create a real-time computational fluid dynamics (CFDs) model for Flight Unlimited. The result is a simulated atmosphere: air acts as a fluid that automatically reacts to the shape of any object placed within it. Blackley gave the example that a lawn chair, if placed within the game's real-time CFDs model, would fall merely because of its shape. The game's planes fly because the interaction of their architecture with the atmosphere creates lift, as with real-world aircraft. Changes in the plane's direction are caused by the interaction of their flight control surfaces (ailerons, elevators and rudders) with the simulated atmosphere. Because it simulates the dynamics of flight in real-time, the system allows for aerobatic maneuvers that were impossible in previous flight simulators. In 1994, Blackley said that it was possibly the first flight code designed for aerobatics. In constructing the CFDs model, Blackley and the team built from the Navier–Stokes equations of fluid motion, which Blackley described as "horrible, complicated partial differential equations". According to Computer Gaming World, Blackley did not seek to represent the equations with perfect accuracy, and he was satisfied when the results were consistent and the sensation that they generated was correct.
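For reference, the incompressible Navier–Stokes equations that such a model starts from can be stated in their standard textbook form, shown below. This is the general statement of the equations, not Flight Unlimited's actual in-game formulation, which was never published.

```latex
% Incompressible Navier–Stokes equations in standard textbook form
% (illustrative reference only; not the game's actual discretization):
\[
  \rho \left( \frac{\partial \mathbf{v}}{\partial t}
        + (\mathbf{v} \cdot \nabla)\,\mathbf{v} \right)
  = -\nabla p + \mu \, \nabla^{2} \mathbf{v} + \mathbf{f},
  \qquad
  \nabla \cdot \mathbf{v} = 0
\]
```

Here \(\mathbf{v}\) is the flow velocity, \(p\) the pressure, \(\rho\) the density, \(\mu\) the viscosity and \(\mathbf{f}\) any body force; the second equation expresses incompressibility.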
After programming a basic version of the CFDs model, Blackley used several programs to examine the simulated currents of air that flowed across a model of a flat plate. He adjusted the code until the plate fell realistically, and then constructed test models for a plane wing and fuselage. He eventually built a complete but dysfunctional plane by using data from "pinhead books". By reading aircraft design manuals, he discovered that the problems were caused by his plane's incorrect tail and center of gravity. Following this, he created an exact three-dimensional model of the Extra 300S over roughly three days. As he had not yet simulated the physical attributes of its propeller, Blackley programmed the plane to be propelled from the rear. However, the accurate model performed properly in the simulated atmosphere.
Artists Mike Marsicano and Kurt Bickenbach played critical roles in the creation of the game's aircraft models, which were built in 3D Studio. As reference material, the team photographed real planes at several airfields, and they received blueprints and datasheets from aircraft manufacturers. The game's Grob G103a Twin II sailplane was based directly on the one that Blackley owned at the time. The sophistication of the real-time CFDs complicated the 3D modeling process, as the planes required accurate geometry to fly properly. While attempting to meet this goal, however, Bickenbach said that the models he created were overly detailed, which caused the team to struggle with performance issues related to the high number of polygons. Reducing the number altered the plane's shape, which in turn reduced its flight realism; this necessitated a balance between performance and accuracy. To obtain audio for the planes, Greg LoPiccolo and Tom Streit—former bassist and road manager, respectively, of the band Tribe—visited a Florida importer of Russian aerobatic aircraft. The two placed microphones inside the cockpits and next to the engines, and they flew each plane at multiple speeds while recording with a digital audio tape machine. Combining this material with digital recordings of wind sounds, the team fashioned a physics-based sound system: sounds of the wind and engine are altered in real-time based on wind speed in the game.
The flight instructor was created by programmer Andrew Grant and voiced by Tom Streit. It monitors the player's controller input during "each frame of animation". If a maneuver is attempted, the instructor "interpolates the initial control movements" and predicts which maneuver is being performed. The instructor then gives advice on how to complete the maneuver and offers guidance if a mistake is made. Grant believed that the code is sometimes "too picky", and he stated that it expects players to perform maneuvers more precisely than is humanly possible. The team initially planned to include an online multiplayer component, which would have allowed 64 planes to fly in the same area—thereby giving players the ability to compete with one another. However, the feature was not implemented into the final game. The staff members also sought to include aerobatic competitions in which the player could participate, but the idea was dropped because of difficulties with realism. Problems with artificially intelligent judges were also a factor in the feature's removal.
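As a rough illustration of the kind of logic described above for the instructor, watching early control input and guessing the intended maneuver from how it starts, the sketch below compares the opening stick movements against simple templates. It is a hypothetical reconstruction for illustration only; Looking Glass never published the instructor's actual code, and all names here are invented.

```typescript
// Hypothetical sketch of maneuver recognition from early control input
// (invented names; not Looking Glass's actual instructor code).
interface ControlSample { pitch: number; roll: number; yaw: number } // each in -1..1 per frame

// Crude templates: the sign pattern a maneuver's opening moments tend to show.
const templates: Record<string, ControlSample> = {
  "loop":         { pitch: 1, roll: 0, yaw: 0 },
  "aileron roll": { pitch: 0, roll: 1, yaw: 0 },
  "hammerhead":   { pitch: 1, roll: 0, yaw: 1 },
};

// Average the first few frames of input, then pick the closest template.
function guessManeuver(openingFrames: ControlSample[]): string {
  const n = openingFrames.length || 1;
  const avg = openingFrames.reduce(
    (a, s) => ({ pitch: a.pitch + s.pitch / n, roll: a.roll + s.roll / n, yaw: a.yaw + s.yaw / n }),
    { pitch: 0, roll: 0, yaw: 0 }
  );
  let best = "unknown";
  let bestDist = Infinity;
  for (const [name, t] of Object.entries(templates)) {
    const d = (avg.pitch - t.pitch) ** 2 + (avg.roll - t.roll) ** 2 + (avg.yaw - t.yaw) ** 2;
    if (d < bestDist) { bestDist = d; best = name; }
  }
  return best;
}

// Mostly back-stick input over the opening frames reads as an attempted loop.
console.log(guessManeuver([{ pitch: 0.9, roll: 0.05, yaw: 0 }, { pitch: 1, roll: 0, yaw: 0.1 }]));
```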
Flight Unlimited's terrain graphics were created with stereophotogrammetry. The team gathered aerial photographs from locations in France and the United States. They combined two to three images of each area to create digital reproductions roughly in size. Each location in the game was based on two stereoscopic sets of photographs, which were processed for more than 72 hours by a "dedicated Pentium tucked away in a dark corner". From the contrasting images, the computer generated a terrain "data blanket" with 3D height variations. While the team had considered using satellite or surveillance aircraft images to create the game's terrain graphics, they found that the resolution was inadequate. Material from geographic information systems was also studied, but associate producer Paul Schaffer said that it would have been "astronomically expensive" to obtain data with the necessary resolution.
After assembling a playable demo of Flight Unlimited, the team requested assistance from then-US Aerobatic Team member Michael Goulian, who worked as a flight instructor at the nearby Hanscom Field. Because of the game's flight code, Goulian was able to execute aerobatic maneuvers within less than three minutes of playing the game; and he later performed his "entire basic aerobatic routine". Blackley told PC Gamer US that, while Goulian disliked flight simulators, "When he flew Flight Unlimited, he just said 'pretty cool.' I was so psyched". Goulian assisted the team during the next year of development: he co-designed the game's flight lessons and advised the team on adjustments to the plane models. Aerobatic pilot Patty Wagstaff was also consulted. At one point, the team encountered problems while testing a maneuver in the game's Sukhoi Su-31, and Blackley was concerned that he would need to rework the game's physics code. However, Goulian phoned a colleague—a Russian pilot—who told them to compensate for the plane's abnormally large ailerons. Using his advice on flying the real-world plane, the team found that the maneuver worked correctly. Goulian endorsed Flight Unlimited and wrote the foreword to its official strategy guide.
The graphics and physics code increased the game's system requirements, and the team worked to optimize performance during development. They struggled to improve the game's memory usage: the process consumed nearly as much time as the creation of the physics model, according to Church. Programmer Eric Twietmeyer ran weekly tests of the game's performance by disabling certain parts of the code—such as the physics calculations—to isolate which parts used the most memory. By 1994, Blackley's physics code took up only 1% of CPU time, with the rest allocated to the terrain renderer. Blackley optimized his code by converting the mathematical calculations of air from the 3D game world into a "math-friendly space", during which time the Navier-Stokes equations are applied. Afterwards, the data is returned to 3D space. According to Computer Gaming World, this method increased speed by "a factor of 100, with almost no loss in precision." The team had trouble with complex memory-related glitches during development. Church called them "crazy", and programmer Greg Travis noted that debugging the terrain cache system was a "nightmare".
While leading the team, Blackley adopted a loose style of supervision. According to Opening the Xbox author Dean Takahashi, "Blackley [was not] ultra-organized. His idea of good management was to invite someone over for a gourmet dinner and have a casual conversation about work". However, Takahashi wrote that "Blackley worked hard to inspire his team", and he described artist James Dollar's belief that, "in contrast to other Looking Glass managers, he didn't take over tasks and make others feel stupid". During the first two years of production, the team was divided into small groups that worked on the game's elements separately. For example, Blackley programmed the game's physics, while Eric Twietmeyer and Tim Day created the terrain renderer. However, Doug Church later said that, while "the team [did] a bunch of very cool stuff, the FBO, the flight model, the instructor, the renderer, so on", the result "was almost like four separate programs, with no connection". Following the completion of the concurrently-developed System Shock, a significant part of that game's team—including Church, Marc LeBlanc and Rob Fermier—moved to Flight Unlimited to add connective material. At the time, Church said that it was difficult to meld the game's elements, but he later stated that they largely coalesced by the end.
Publication
Flight Unlimited was self-published by Looking Glass Technologies. Their previous games had been developed for other video game publishers, and had generated $90 million total earnings for those companies. However, Ronald Rosenberg of The Boston Globe reported that Looking Glass was "no longer satisfied as a backroom player surviving on royalties". Doug Church later explained that the company wanted to self-publish in order to escape the "treadmill of waiting for advances", which would allow them to make long-term plans without needing to satisfy the immediate demands of a publisher. In late 1994, Looking Glass announced that venture capital investors, including Matrix Partners and Institutional Venture Partners, had provided the company with $3.8 million. The sum was intended to fund the development and self-publication of Flight Unlimited. According to Michael Humphreys of Matrix Partners and Ruthann Quindlen of Institutional Venture Partners, the decision was partly influenced by the past success of the company's co-founders, Paul Neurath and Ned Lerner.
Looking Glass intended Flight Unlimited as a gateway into the video game publishing industry. According to Lerner, the idea of self-publishing had been considered when the company was founded. In 1995, Looking Glass projected that sales of Flight Unlimited would increase royalty revenues to $10 million that year, up from $1.5 million in 1994. Jeffrey A. Kalowski, the company's vice president of finance and administration, expected that the game would recoup its development costs and make a return before the end of the year. He predicted that, over the following 12 to 18 months, the company's number of employees would increase from 52 to 82. The company's executive vice president and general manager, Jerry Wolosenko, told The Boston Globe that the company hoped to publish six games each year. According to Doug Church, the pressure for Flight Unlimited to succeed meant that the concurrently-developed System Shock, which was not self-published, received little attention from the company's management.
Flight Unlimited was placed in direct competition with several major flight simulator franchises. Before the game's release, Shelby Bateman of Next Generation Magazine wrote, "1995 is going to be a real dogfight in the flight-sim and aerial-combat categories, and LookingGlass is betting its bankroll ... that it can capture significant market share from the likes of Microsoft Flight Simulator and the debut of Spectrum HoloByte's Falcon 4.0, among others." Describing the situation, Johnny L. Wilson of Computer Gaming World wrote, "The games that sell big are the ones that allow you to blow stuff up, so, if anything, that could be a problem for Flight Unlimited." Doug Church explained that, because the game did not feature combat and bore little resemblance to Microsoft Flight Simulator, the team spent "many late nights" on marketing strategies. However, he noted that the game had a wide appeal among those who tested it during development, which he called "a really good sign". Talking to Bernie Yee of PC Gamer US, Paul Neurath said that he thought the game would sell well. Yee noted that Neurath "fully [expected] it to prove more popular than Microsoft Flight Simulator".
In January 1995, Looking Glass showed Flight Unlimited alongside Terra Nova: Strike Force Centauri at the Winter Consumer Electronics Show, under their "Immersive Reality" marketing label. In March 1995, the Boston Globe reported that the team was performing "11th hour checks" of the game to prepare it for shipment to a Midwestern United States Compact Disc manufacturer. According to the newspaper, Looking Glass planned to begin by shipping 100,000 units to retailers in Canada and the United States. Another 100,000 copies were to be sent to France, Germany and the United Kingdom at a later date. However, upon the game's June 7, 1995 release for MS-DOS, 200,000 units were distributed simultaneously in the United States and Europe. The game's European releases were localized with German, French and English text and voice acting, which was made possible by "close coordination with international partners". Versions for Macintosh and Windows 95 were later released; the former was shown at the Macworld Expo in April 1996.
Reception
Flight Unlimited was a commercial success. It debuted in twelfth place on a June 1995 sales chart compiled by NPD Group, while Microsoft Flight Simulator 5.1 took first place. The game went on to sell more than 300,000 copies by 1997, and more than 780,000 by 2002. According to Constantine von Hoffman of the Boston Herald, Flight Unlimited successfully competed with Microsoft Flight Simulator. PC Gamer's Lee Buchanan wrote that it "soars above the pack of flight simulations", and he considered it to be "the most fun [he had] had in a computerized cockpit". Frank Vizard of Popular Mechanics hailed it as "the new top gun of flight simulators", and Doug Bailey of The Boston Globe considered it to be the "first real serious challenge to Microsoft's dominance of the genre". The Record's David Noack believed that the game's physics and stereoscopic terrain set "a new standard in flight simulation". Writing for Computer Gaming World, Bob and John Nolan stated, "If anything, you should at least take a look at this product, because you'll be looking at the future of simulations."
The game was a finalist in the 12th Annual Awards for Technical Excellence held by PC Magazine, whose staff called it "the simulator by which all others will be judged." It was named the best simulation of 1995 by Computer Games Strategy Plus (tied with Apache), and the best of 1996 by Macworld, whose editor Steven Levy wrote that it "puts you in touch with what makes flying special." Inside Mac Games and PC Gamer both nominated Flight Unlimited as the simulation of the year, although it lost these awards to A-10 Cuba! and Apache, respectively.
Design
Vizard stated that Flight Unlimited's "very advanced computational fluid dynamics make [each] plane react according to spec". Buchanan lauded the fluid model for creating a "sensation of actual flight [that] is nothing short of magnificent", while PC Magazine's staff commented that it makes "planes behave more like real aircraft than any simulator we have seen". Bob and John Nolan called the game's physics programming "groundbreaking", and Chris Ware of the Lexington Herald-Leader found the game to be the most accurate simulation of flight beyond "those multimillion-dollar flight simulators [used by] fighter pilots and astronauts". Noack agreed: he wrote that the game "is about as close to flying without going to the airport". In 1996, Computer Gaming World presented Flight Unlimited with a Special Artistic Achievement Award For Physical Model. The magazine's staff praised Blackley's programming for pushing the genre "higher into the realm of simulation", and listed the game's sophisticated physics model as #5 on its list of "the 15 best ways to die in computer gaming".
Ware found Flight Unlimited approachable and noted its "simplicity of use and depth of instruction". Buchanan hailed the lesson mode as "a dream come true for any budding pilot". A writer for The Washington Post called the game "[the] world's first truly easy-to-use flight simulator" and "a good entry product", in which "rank amateurs can just launch the program and start cruising immediately". The Washington Post's John Gaudiosi wrote that, while many games in the genre are overly complex, Flight Unlimited lets "those who aren't rocket scientists ... experience the thrills of stunt flying." He found its control scheme simple to understand. By contrast, Bailey found the game difficult and initially "frustrating": he complained that he had to play the lesson mode before even taking off. Denny Atkin of Computer Gaming World characterized the game's learning curve as steep, thanks to the accuracy of the physics programming, but he noted the scalable difficulty options. Bailey later recommended the game in a holiday shopping guide. He wrote that "it can be difficult to master. But once you're up, it's worth the trouble."
A writer for The Washington Post commented that "serious flight freaks will like the racing and advanced maneuvers". According to Gaudiosi, dedicated players will learn "all about aerodynamics and stunt flying"; he considered the latter to be "hard stuff, even with green hoops guiding you". Similarly, Buchanan characterized the Hoops courses as "incredibly demanding", and Atkin cited that mode's Trick difficulty level as "amazingly tough". Bob and John Nolan wrote that people who "love to loop around the skies of Flight Simulator 5 will go bananas for" the aerobatics; but the pair commented that combat flight simulator players "might get a little edgy once the wow-power wears off." However, Atkin believed that only those "never happy without something to shoot at" could be disappointed by the lack of combat: other players will "be too busy choreographing aerial ballets, pulling off death-defying aerobatic stunts, or just enjoying a quiet soar down the ridge line to miss that stuff". Likewise, Ware called the non-violent gameplay "refreshing", and Buchanan wrote, "If [you are] a battle-weary veteran of air combat sims, Flight Unlimited might be just the sort of [rest and relaxation] you need."
Presentation
Atkin found the cockpit and terrain graphics to look "almost real". He commented, "Every few years a sim comes along that lets reviewers use the 'sets new standards for graphics' cliché, and Flight Unlimited is the 1995 entry in this club." Bob and John Nolan called Flight Unlimited "the ultimate show off piece for your new Pentium", thanks to "unbelievable" graphics superior to those of any other computer game. Gaudiosi concurred: he characterized the visuals as "photo-sharp" and "better than any I have seen". PC Magazine's staff found the graphics "impressive" and "even more stunning than those in Microsoft Flight Simulator". Ware noted the "stunning 3-D photo-realistic scenery", while Bailey stated that the "graphics are brilliantly rendered and whiz by smoothly". Buchanan called Flight Unlimited's terrain "just superb" and Vizard described it as "amazingly real". Buchanan believed that "what you hear in Flight Unlimited is every bit as good as what you see", thanks to "utterly convincing" sound effects. Atkin praised the instructor as "one of the best uses of voice ever in a multimedia title".
Bailey wrote that the game needs "a real beefy machine" to run properly; Atkin stated that the "massive horsepower requirement will restrict many gamers to lower resolutions and detail levels". Bob and John Nolan similarly found that the game "hogs computing power". Buchanan wrote that the system requirements listed on the back of the game's box "must be a joke", and that a high-performance computer is necessary to run the game. In 1996, PC Gamer US presented the game with a "Special Achievement in Graphics" award. The editors wrote, "While it requires the most sophisticated computer hardware on the market to be enjoyed, Flight Unlimited rewards gamers with some of the most stunning scenery ever seen in a flight sim."
Aftermath
Flight Unlimited was the first of three self-published titles released by Looking Glass Technologies. However, the next two products, Terra Nova: Strike Force Centauri (1996) and British Open Championship Golf (1997), were commercial failures. As a result, the company ceased self-publishing and was left in dire financial circumstances. Doug Church later explained that Looking Glass' attempt to publish came at a difficult time for the video game industry: "the other mid-sized publishers were mostly going out of business or getting bought". He believed that the company had been "overreaching itself" with the venture, and that it was "being a little overambitious and a little cocky".
Sequels
Flight Unlimited was intended to be followed by a combat-oriented sequel, which was developed under the working title Flight Combat. In 1995, Seamus Blackley told PC Gamer US that he wanted the game to "feel so real that pilots will be afraid. They'll feel the gun hits." Talking to Computer Gaming World, he stated that the game would teach players the "same curriculum [as] the Air Force", and that it would feature competitive online play. However, a company manager, newly instated by venture capital investors who disliked Looking Glass' management style, instead demanded that Blackley create a direct sequel to Flight Unlimited. The two argued regularly, and Blackley later accused the manager of "ripp[ing] the guts out of Looking Glass". In response to Blackley's refusal to create Flight Unlimited II, the manager fired him. Blackley left the company in late 1995 with designer Austin Grossman, and both were hired by DreamWorks Interactive to create Jurassic Park: Trespasser. He later spearheaded development of the Xbox at Microsoft.
Constantine Hantzopoulos directed Flight Unlimited II, which was published by Eidos Interactive in 1997. The team could not continue using the real-time computational fluid dynamics of Flight Unlimited because, according to Hantzopoulos, it was "all black box spaghetti code from Seamus". The aerobatics focus of its predecessor was dropped in favor of general civilian aviation. The development of Flight Combat was hinted at during the production of Flight Unlimited II. A third game, Flight Unlimited III, was published by Electronic Arts in 1999; and it continued the focus on general aviation. That year, Flight Combat was officially announced as the World War II-themed, Electronic Arts-published Flight Combat: Thunder Over Europe, but its name was eventually changed to Jane's Attack Squadron. The game was canceled as a consequence of Looking Glass Studios' closure in 2000. However, it was later finished by developer Mad Doc Software and released in 2002 by publisher Xicat Interactive.
Notes
External links
Flight Unlimited download at the Internet Archive
1995 video games
DOS games
DOS games ported to Windows
Flight simulation video games
Looking Glass Studios games
Classic Mac OS games
Single-player video games
Video games developed in the United States
Windows games
JavaScript

JavaScript, often abbreviated JS, is a programming language that is one of the core technologies of the World Wide Web, alongside HTML and CSS. Over 97% of websites use JavaScript on the client side for web page behavior, often incorporating third-party libraries. All major web browsers have a dedicated JavaScript engine to execute the code on users' devices.
JavaScript is a high-level, often just-in-time compiled language that conforms to the ECMAScript standard. It has dynamic typing, prototype-based object-orientation, and first-class functions. It is multi-paradigm, supporting event-driven, functional, and imperative programming styles. It has application programming interfaces (APIs) for working with text, dates, regular expressions, standard data structures, and the Document Object Model (DOM).
The ECMAScript standard does not include any input/output (I/O), such as networking, storage, or graphics facilities. In practice, the web browser or other runtime system provides JavaScript APIs for I/O.
JavaScript engines were originally used only in web browsers, but are now core components of some servers and a variety of applications. The most popular runtime system for this usage is Node.js.
Although Java and JavaScript are similar in name, syntax, and respective standard libraries, the two languages are distinct and differ greatly in design.
History
Creation at Netscape
The first web browser with a graphical user interface, Mosaic, was released in 1993. Accessible to non-technical people, it played a prominent role in the rapid growth of the nascent World Wide Web. The lead developers of Mosaic then founded the Netscape corporation, which released a more polished browser, Netscape Navigator, in 1994. This quickly became the most-used browser.
During these formative years of the Web, web pages could only be static, lacking the capability for dynamic behavior after the page was loaded in the browser. There was a desire in the burgeoning web development scene to remove this limitation, so in 1995, Netscape decided to add a scripting language to Navigator. They pursued two routes to achieve this: collaborating with Sun Microsystems to embed the Java programming language, while also hiring Brendan Eich to embed the Scheme language.
Netscape management soon decided that the best option was for Eich to devise a new language, with syntax similar to Java and less like Scheme or other extant scripting languages. Although the new language and its interpreter implementation were called LiveScript when first shipped as part of a Navigator beta in September 1995, the name was changed to JavaScript for the official release in December.
The choice of the JavaScript name has caused confusion, implying that it is directly related to Java. At the time, the dot-com boom had begun and Java was the hot new language, so Eich considered the JavaScript name a marketing ploy by Netscape.
Adoption by Microsoft
Microsoft debuted Internet Explorer in 1995, leading to a browser war with Netscape. On the JavaScript front, Microsoft reverse-engineered the Navigator interpreter to create its own, called JScript.
JScript was first released in 1996, alongside initial support for CSS and extensions to HTML. Each of these implementations was noticeably different from their counterparts in Navigator. These differences made it difficult for developers to make their websites work well in both browsers, leading to widespread use of "best viewed in Netscape" and "best viewed in Internet Explorer" logos for several years.
The rise of JScript
In November 1996, Netscape submitted JavaScript to Ecma International, as the starting point for a standard specification that all browser vendors could conform to. This led to the official release of the first ECMAScript language specification in June 1997.
The standards process continued for a few years, with the release of ECMAScript 2 in June 1998 and ECMAScript 3 in December 1999. Work on ECMAScript 4 began in 2000.
Meanwhile, Microsoft gained an increasingly dominant position in the browser market. By the early 2000s, Internet Explorer's market share reached 95%. This meant that JScript became the de facto standard for client-side scripting on the Web.
Microsoft initially participated in the standards process and implemented some proposals in its JScript language, but eventually it stopped collaborating on Ecma work. Thus ECMAScript 4 was mothballed.
Growth and standardization
During the period of Internet Explorer dominance in the early 2000s, client-side scripting was stagnant. This started to change in 2004, when the successor of Netscape, Mozilla, released the Firefox browser. Firefox was well received by many, taking significant market share from Internet Explorer.
In 2005, Mozilla joined ECMA International, and work started on the ECMAScript for XML (E4X) standard. This led to Mozilla working jointly with Macromedia (later acquired by Adobe Systems), who were implementing E4X in their ActionScript 3 language, which was based on an ECMAScript 4 draft. The goal became standardizing ActionScript 3 as the new ECMAScript 4. To this end, Adobe Systems released the Tamarin implementation as an open source project. However, Tamarin and ActionScript 3 were too different from established client-side scripting, and without cooperation from Microsoft, ECMAScript 4 never reached fruition.
Meanwhile, very important developments were occurring in open-source communities not affiliated with ECMA work. In 2005, Jesse James Garrett released a white paper in which he coined the term Ajax and described a set of technologies, of which JavaScript was the backbone, to create web applications where data can be loaded in the background, avoiding the need for full page reloads. This sparked a renaissance period of JavaScript, spearheaded by open-source libraries and the communities that formed around them. Many new libraries were created, including jQuery, Prototype, Dojo Toolkit, and MooTools.
Google debuted its Chrome browser in 2008, with the V8 JavaScript engine that was faster than its competition. The key innovation was just-in-time compilation (JIT), so other browser vendors needed to overhaul their engines for JIT.
In July 2008, these disparate parties came together for a conference in Oslo. This led to the eventual agreement in early 2009 to combine all relevant work and drive the language forward. The result was the ECMAScript 5 standard, released in December 2009.
Reaching maturity
Ambitious work on the language continued for several years, culminating in an extensive collection of additions and refinements being formalized with the publication of ECMAScript 6 in 2015.
The creation of Node.js in 2009 by Ryan Dahl sparked a significant increase in the usage of JavaScript outside of web browsers. Node combines the V8 engine, an event loop, and I/O APIs, thereby providing a stand-alone JavaScript runtime system. As of 2018, Node had been used by millions of developers, and npm had the most modules of any package manager in the world.
The ECMAScript draft specification is currently maintained openly on GitHub, and editions are produced via regular annual snapshots. Potential revisions to the language are vetted through a comprehensive proposal process. Now, instead of edition numbers, developers check the status of upcoming features individually.
The current JavaScript ecosystem has many libraries and frameworks, established programming practices, and substantial usage of JavaScript outside of web browsers. Plus, with the rise of single-page applications and other JavaScript-heavy websites, several transpilers have been created to aid the development process.
Trademark
"JavaScript" is a trademark of Oracle Corporation in the United States.
Website client-side usage
JavaScript is the dominant client-side scripting language of the Web, with 97% of websites using it for this purpose. Scripts are embedded in or included from HTML documents and interact with the DOM. All major web browsers have a built-in JavaScript engine that executes the code on the user's device.
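A minimal sketch of such a script, assuming a browser environment and a page element with the illustrative id "greeting":

// Runs in a browser; `document` is provided by the host environment, not by ECMAScript.
document.addEventListener("DOMContentLoaded", function() {
  const heading = document.getElementById("greeting"); // "greeting" is an assumed element id
  heading.textContent = "Hello from JavaScript!";      // updates the page through the DOM
});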
Examples of scripted behavior
Loading new web page content without reloading the page, via Ajax or a WebSocket. For example, users of social media can send and receive messages without leaving the current page.
Web page animations, such as fading objects in and out, resizing, and moving them.
Playing browser games.
Controlling the playback of streaming media.
Generating pop-up ads.
Validating input values of a web form before the data is sent to a web server.
Logging data about the user's behavior then sending it to a server. The website owner can use this data for analytics, ad tracking, and personalization.
Redirecting a user to another page.
Libraries and frameworks
Over 80% of websites use a third-party JavaScript library or web framework for their client-side scripting.
jQuery is by far the most popular library, used by over 75% of websites. Facebook created the React library for its website and later released it as open source; other sites, including Twitter, now use it. Likewise, the Angular framework created by Google for its websites, including YouTube and Gmail, is now an open source project used by others.
In contrast, the term "Vanilla JS" has been coined for websites not using any libraries or frameworks, instead relying entirely on standard JavaScript functionality.
Other usage
The use of JavaScript has expanded beyond its web browser roots. JavaScript engines are now embedded in a variety of other software systems, both for server-side website deployments and non-browser applications.
Initial attempts at promoting server-side JavaScript usage were Netscape Enterprise Server and Microsoft's Internet Information Services, but they were small niches. Server-side usage eventually started to grow in the late 2000s, with the creation of Node.js and other approaches.
Electron, Cordova, React Native, and other application frameworks have been used to create many applications with behavior implemented in JavaScript. Other non-browser applications include Adobe Acrobat support for scripting PDF documents and GNOME Shell extensions written in JavaScript.
JavaScript has recently begun to appear in some embedded systems, usually by leveraging Node.js.
Features
The following features are common to all conforming ECMAScript implementations unless explicitly specified otherwise.
Imperative and structured
JavaScript supports much of the structured programming syntax from C (e.g., if statements, while loops, switch statements, do while loops, etc.). One partial exception is scoping: originally JavaScript only had function scoping with var; then block scoping was added in ECMAScript 2015 with the keywords let and const. Like C, JavaScript makes a distinction between expressions and statements. One syntactic difference from C is automatic semicolon insertion, which allows semicolons (which terminate statements) to be omitted.
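A short sketch of the scoping difference (the function and variable names are illustrative):

function scopes() {
  if (true) {
    var a = 1; // function-scoped: visible throughout scopes()
    let b = 2; // block-scoped: visible only inside this if-block
  }
  console.log(a);        // 1
  console.log(typeof b); // "undefined"; b is not in scope here
}
scopes();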
Weakly typed
JavaScript is weakly typed, which means certain types are implicitly cast depending on the operation used.
The binary + operator casts both operands to a string unless both operands are numbers. This is because the addition operator doubles as a concatenation operator
The binary - operator always casts both operands to a number
Both unary operators (+, -) always cast the operand to a number
Values are cast to strings like the following:
Strings are left as-is
Numbers are converted to their string representation
Arrays have their elements cast to strings after which they are joined by commas (,)
Other objects are converted to the string [object Object] where Object is the name of the constructor of the object
Values are cast to numbers by casting to strings and then casting the strings to numbers. These processes can be modified by defining toString and valueOf functions on the prototype for string and number casting respectively.
JavaScript has received criticism for the way it implements these conversions as the complexity of the rules can be mistaken for inconsistency. For example, when adding a number to a string, the number will be cast to a string before performing concatenation, but when subtracting a number from a string, the string is cast to a number before performing subtraction.
Often also mentioned is {} + [] resulting in 0 (number). This is misleading: the {} is interpreted as an empty code block instead of an empty object, and the empty array is cast to a number by the remaining unary + operator. Wrapping the expression in parentheses, ({} + []), causes the curly brackets to be interpreted as an empty object, and the result of the expression is "[object Object]", as expected.
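The rules above can be illustrated with a few expressions (output shown in the comments):

console.log("2" + 3);      // "23": the number is cast to a string and concatenated
console.log("5" - 2);      // 3: the string is cast to a number before subtraction
console.log(+"7");         // 7: unary plus casts the operand to a number
console.log([1, 2] + "");  // "1,2": array elements are cast to strings and joined by commas
console.log(String({}));   // "[object Object]"
console.log(({} + []));    // "[object Object]": in parentheses, {} is an empty object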
Dynamic
Typing JavaScript is dynamically typed like most other scripting languages. A type is associated with a value rather than an expression. For example, a variable initially bound to a number may be reassigned to a string. JavaScript supports various ways to test the type of objects, including duck typing.
Run-time evaluation JavaScript includes an eval function that can execute statements provided as strings at run-time.
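A brief sketch of both points (the variable name is illustrative):

let value = 42;              // the variable currently holds a number
console.log(typeof value);   // "number"
value = "forty-two";         // the same variable may later hold a string
console.log(typeof value);   // "string"

console.log(eval("2 * 21")); // 42: eval executes source code supplied as a string at run-time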
Object-orientation (prototype-based)
Prototypal inheritance in JavaScript is described by Douglas Crockford as:
In JavaScript, an object is an associative array, augmented with a prototype (see below); each key provides the name for an object property, and there are two syntactical ways to specify such a name: dot notation (obj.x = 10) and bracket notation (obj['x'] = 10). A property may be added, rebound, or deleted at run-time. Most properties of an object (and any property that belongs to an object's prototype inheritance chain) can be enumerated using a for...in loop.
Prototypes JavaScript uses prototypes where many other object-oriented languages use classes for inheritance. It is possible to simulate many class-based features with prototypes in JavaScript.
Functions as object constructors Functions double as object constructors, along with their typical role. Prefixing a function call with new will create an instance of a prototype, inheriting properties and methods from the constructor (including properties from the Object prototype). ECMAScript 5 offers the Object.create method, allowing explicit creation of an instance without automatically inheriting from the Object prototype (older environments can assign the prototype to null). The constructor's prototype property determines the object used for the new object's internal prototype. New methods can be added by modifying the prototype of the function used as a constructor. JavaScript's built-in constructors, such as Array or Object, also have prototypes that can be modified. While it is possible to modify the Object prototype, it is generally considered bad practice because most objects in JavaScript will inherit methods and properties from the Object prototype, and they may not expect the prototype to be modified.
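A short sketch of a constructor with a shared prototype method, and of Object.create (the names Point and lengthSquared are illustrative):

function Point(x, y) { // constructor function
  this.x = x;
  this.y = y;
}
Point.prototype.lengthSquared = function() { // shared by all instances via the prototype
  return this.x * this.x + this.y * this.y;
};
const p = new Point(3, 4);
console.log(p.lengthSquared());           // 25: the method is found on the prototype chain

const bare = Object.create(null);         // an object with no prototype at all
console.log(Object.getPrototypeOf(bare)); // null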
Functions as methods Unlike many object-oriented languages, there is no distinction between a function definition and a method definition. Rather, the distinction occurs during function calling; when a function is called as a method of an object, the function's local this keyword is bound to that object for that invocation.
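The binding of this at call time can be shown with one function shared by two objects (all names are illustrative):

function describe() {
  return "I am " + this.name; // this is bound when the function is called as a method
}
const cat = { name: "Felix", describe: describe };
const dog = { name: "Rex", describe: describe };
console.log(cat.describe()); // "I am Felix"
console.log(dog.describe()); // "I am Rex"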
Functional
A function is first-class; a function is considered to be an object. As such, a function may have properties and methods, such as .call() and .bind(). A nested function is a function defined within another function. It is created each time the outer function is invoked. In addition, each nested function forms a lexical closure: the lexical scope of the outer function (including any constant, local variable, or argument value) becomes part of the internal state of each inner function object, even after execution of the outer function concludes. JavaScript also supports anonymous functions.
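A short sketch of functions as objects with methods such as call and bind (the names are illustrative):

function greet(greeting) {
  return greeting + ", " + this.name;
}
const user = { name: "Ada" };
const greetUser = greet.bind(user);  // a new function with this permanently bound to user
console.log(greetUser("Hello"));     // "Hello, Ada"
console.log(greet.call(user, "Hi")); // "Hi, Ada": call invokes the function with an explicit this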
Delegative
JavaScript supports implicit and explicit delegation.
Functions as roles (Traits and Mixins) JavaScript natively supports various function-based implementations of Role patterns like Traits and Mixins. Such a function defines additional behavior by at least one method bound to the this keyword within its function body. A Role then has to be delegated explicitly via call or apply to objects that need to feature additional behavior that is not shared via the prototype chain.
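A minimal sketch of such a function-based role, delegated explicitly with call (the role name Serializable is illustrative):

function Serializable() { // a role: defines behavior on whatever this it is applied to
  this.serialize = function() {
    return JSON.stringify(this);
  };
}
const record = { id: 1, label: "example" };
Serializable.call(record);       // explicit delegation: the role is applied to record
console.log(record.serialize()); // '{"id":1,"label":"example"}'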
Object composition and inheritance Whereas explicit function-based delegation does cover composition in JavaScript, implicit delegation already happens every time the prototype chain is walked in order to, e.g., find a method that might be related to but is not directly owned by an object. Once the method is found it gets called within this object's context. Thus inheritance in JavaScript is covered by a delegation automatism that is bound to the prototype property of constructor functions.
Miscellaneous
JavaScript uses zero-based indexing: the elements of arrays and strings are numbered starting from zero.
Run-time environment JavaScript typically relies on a run-time environment (e.g., a web browser) to provide objects and methods by which scripts can interact with the environment (e.g., a web page DOM). These environments are single-threaded. JavaScript also relies on the run-time environment to provide the ability to include/import scripts (e.g., HTML <script> elements). This is not a language feature per se, but it is common in most JavaScript implementations. JavaScript processes messages from a queue one at a time. JavaScript calls a function associated with each new message, creating a call stack frame with the function's arguments and local variables. The call stack shrinks and grows based on the function's needs. When the call stack is empty upon function completion, JavaScript proceeds to the next message in the queue. This is called the event loop, described as "run to completion" because each message is fully processed before the next message is considered. However, the language's concurrency model describes the event loop as non-blocking: program input/output is performed using events and callback functions. This means, for instance, that JavaScript can process a mouse click while waiting for a database query to return information.
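Run-to-completion can be observed with a host-provided timer such as setTimeout, which is supplied by browsers and Node.js rather than by the ECMAScript standard:

console.log("first");
setTimeout(function() {
  console.log("third"); // queued as a separate message; runs only after the current one completes
}, 0);
console.log("second");  // the current message runs to completion first
// Output order: "first", "second", "third"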
Variadic functions An indefinite number of parameters can be passed to a function. The function can access them through formal parameters and also through the local arguments object. Variadic functions can also be created by using the bind method.
Array and object literals Like many scripting languages, arrays and objects (associative arrays in other languages) can each be created with a succinct shortcut syntax. In fact, these literals form the basis of the JSON data format.
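A short example of both literal forms, which also mirror the JSON notation:

const languages = ["JavaScript", "Scheme", "Self"];             // array literal
const release = { name: "ECMAScript", edition: 6, year: 2015 }; // object literal
console.log(languages[0]); // "JavaScript"
console.log(release.year); // 2015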
Regular expressions JavaScript also supports regular expressions in a manner similar to Perl, which provide a concise and powerful syntax for text manipulation that is more sophisticated than the built-in string functions.
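A brief sketch of regular expression literals (the patterns are illustrative):

const datePattern = /(\d{4})-(\d{2})-(\d{2})/;           // regular expression literal
const match = "Released on 2015-06-17".match(datePattern);
console.log(match[1]);                                   // "2015": the first capture group
console.log("too   many   spaces".replace(/\s+/g, " ")); // "too many spaces"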
Promises and Async/await JavaScript supports promises and Async/await for handling asynchronous operations. A built-in Promise object provides functionality for handling promises and associating handlers with an asynchronous action's eventual result. Recently, combinator methods were introduced in the JavaScript specification, which allows developers to combine multiple JavaScript promises and do operations based on different scenarios. The methods introduced are: Promise.race, Promise.all, Promise.allSettled and Promise.any. Async/await allows an asynchronous, non-blocking function to be structured in a way similar to an ordinary synchronous function. Asynchronous, non-blocking code can be written, with minimal overhead, structured similar to traditional synchronous, blocking code.
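A minimal sketch of promises combined with async/await; the delay helper is illustrative and relies on the host-provided setTimeout:

function delay(ms, value) { // wraps a timer in a Promise
  return new Promise(function(resolve) {
    setTimeout(function() { resolve(value); }, ms);
  });
}

async function main() {
  const single = await delay(100, "done");                      // await one promise
  const both = await Promise.all([delay(50, 1), delay(75, 2)]); // combine several promises
  console.log(single); // "done"
  console.log(both);   // [1, 2]
}
main();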
Vendor-specific extensions
Historically, some JavaScript engines supported these non-standard features:
conditional catch clauses (like Java)
array comprehensions and generator expressions (like Python)
concise function expressions (function(args) expr; this experimental syntax predated arrow functions)
ECMAScript for XML (E4X), an extension that adds native XML support to ECMAScript (unsupported in Firefox since version 21)
Syntax
Simple examples
Variables in JavaScript can be defined using either the var, let or const keywords.
// Declares a function-scoped variable named `x`, and implicitly assigns the
// special value `undefined` to it. Variables without value are automatically
// set to undefined.
var x;
// Variables can be manually set to `undefined` like so
var x2 = undefined;
// Declares a block-scoped variable named `y`, and implicitly sets it to
// `undefined`. The `let` keyword was introduced in ECMAScript 2015.
let y;
// Declares a block-scoped, un-reassignable variable named `z`, and sets it to
// a string literal. The `const` keyword was also introduced in ECMAScript 2015,
// and must be explicitly assigned to.
// The keyword `const` means constant, hence the variable cannot be reassigned
// as the value is `constant`.
const z = "this value cannot be reassigned!";
// Declares a variable named `myNumber`, and assigns a number literal (the value
// `2`) to it.
let myNumber = 2;
// Reassigns `myNumber`, setting it to a string literal (the value `"foo"`).
// JavaScript is a dynamically-typed language, so this is legal.
myNumber = "foo";
Note the comments in the example above, all of which were preceded with two forward slashes.
There is no built-in Input/output functionality in JavaScript; the run-time environment provides that. The ECMAScript specification in edition 5.1 mentions:
indeed, there are no provisions in this specification for input of external data or output of computed results.
However, most runtime environments have a console object that can be used to print output. Here is a minimalist Hello World program in JavaScript:
console.log("Hello World!");
A simple recursive function:
function factorial(n) {
if (n === 0)
return 1; // 0! = 1
return n * factorial(n - 1);
}
factorial(3); // returns 6
An anonymous function (or lambda):
function counter() {
let count = 0;
return function() {
return ++count;
};
}
let x = counter();
x(); // returns 1
x(); // returns 2
x(); // returns 3
This example shows that, in JavaScript, function closures capture their non-local variables by reference.
Arrow functions were first introduced in the 6th Edition (ECMAScript 2015). They shorten the syntax for writing functions in JavaScript. Arrow functions are anonymous, so a variable is needed to refer to them in order to invoke them after their creation.
Example of arrow function:
// Arrow functions let us omit the `function` keyword.
// Here `long_example` points to an anonymous function value.
const long_example = (input1, input2) => {
console.log("Hello, World!");
const output = input1 + input2;
return output;
};
// If there are no braces, the arrow function simply returns the expression
// So here it's (input1 + input2)
const short_example = (input1, input2) => input1 + input2;
long_example(2, 3); // Prints "Hello, World!" and returns 5
short_example(2, 5); // Returns 7
// If an arrow function only has one parameter, the parentheses can be removed.
const no_parentheses = input => input + 2;
no_parentheses(3); // Returns 5
In JavaScript, objects can be created using a constructor function; a function used in this way is known as a function object.
Object example:
function Ball(r) {
this.radius = r; // the "r" argument is local to the ball object
this.area = Math.PI * (r ** 2); // parentheses don't do anything but clarify
// objects can contain functions ("method")
this.show = function() {
drawCircle(this.radius); // references another function (that draws a circle)
};
}
let myBall = new Ball(5); // creates a new instance of the ball object with radius 5
myBall.radius++; // object properties can usually be modified from the outside
myBall.show(); // using the inherited "show" function
Variadic function demonstration (arguments is a special variable):
function sum() {
let x = 0;
for (let i = 0; i < arguments.length; ++i)
x += arguments[i];
return x;
}
sum(1, 2); // returns 3
sum(1, 2, 3); // returns 6
Immediately-invoked function expressions are often used to create closures. Closures allow gathering properties and methods in a namespace and making some of them private:
let counter = (function() {
let i = 0; // private property
return { // public methods
get: function() {
alert(i);
},
set: function(value) {
i = value;
},
increment: function() {
alert(++i);
}
};
})(); // module
counter.get(); // shows 0
counter.set(6);
counter.increment(); // shows 7
counter.increment(); // shows 8
Exporting and Importing modules in JavaScript
Export example:
/* mymodule.js */
// This function remains private, as it is not exported
let sum = (a, b) => {
return a + b;
}
// Export variables
export let name = 'Alice';
export let age = 23;
// Export named functions
export function add(num1, num2) {
return num1 + num2;
}
// Export class
export class Multiplication {
constructor(num1, num2) {
this.num1 = num1;
this.num2 = num2;
}
add() {
return sum(this.num1, this.num2);
}
}
Import example:
// Import one property
import { add } from './mymodule.js';
console.log(add(1, 2)); // 3
// Import multiple properties
import { name, age } from './mymodule.js';
console.log(name, age);
//> "Alice", 23
// Import all exported properties from a module as a namespace object
import * as mymodule from './mymodule.js';
console.log(mymodule.name, mymodule.age);
//> "Alice", 23
console.log(mymodule.add(1, 2));
//> 3
More advanced example
This sample code displays various JavaScript features.
/* Finds the lowest common multiple (LCM) of two numbers */
function LCMCalculator(x, y) { // constructor function
const checkInt = function(x) { // inner function
if (x % 1 !== 0)
throw new TypeError(x + " is not an integer");
return x;
};
this.a = checkInt(x)
// semicolons ^^^^ are optional, a newline is enough
this.b = checkInt(y);
}
// The prototype of object instances created by a constructor is
// that constructor's "prototype" property.
LCMCalculator.prototype = { // object literal
constructor: LCMCalculator, // when reassigning a prototype, set the constructor property appropriately
gcd: function() { // method that calculates the greatest common divisor
// Euclidean algorithm:
let a = Math.abs(this.a), b = Math.abs(this.b), t;
if (a < b) {
// swap variables
// t = b; b = a; a = t;
[a, b] = [b, a]; // swap using destructuring assignment (ES6)
}
while (b !== 0) {
t = b;
b = a % b;
a = t;
}
// Only need to calculate GCD once, so "redefine" this method.
// (Actually not redefinition—it's defined on the instance itself,
// so that this.gcd refers to this "redefinition" instead of LCMCalculator.prototype.gcd.
// Note that this leads to a wrong result if the LCMCalculator object members "a" and/or "b" are altered afterwards.)
// Also, 'gcd' === "gcd", this['gcd'] === this.gcd
this['gcd'] = function() {
return a;
};
return a;
},
// Object property names can be specified by strings delimited by double (") or single (') quotes.
"lcm": function() {
// Variable names do not collide with object properties, e.g., |lcm| is not |this.lcm|.
// not using |this.a*this.b| to avoid FP precision issues
let lcm = this.a / this.gcd() * this.b;
// Only need to calculate lcm once, so "redefine" this method.
this.lcm = function() {
return lcm;
};
return lcm;
},
// Methods can also be declared using es6 syntax
toString() {
// Using both es6 template literals and the (+) operator to concatenate values
return `LCMCalculator: a = ${this.a}, b = ` + this.b;
}
};
// Define generic output function; this implementation only works for Web browsers
function output(x) {
document.body.appendChild(document.createTextNode(x));
document.body.appendChild(document.createElement('br'));
}
// Note: Array's map() and forEach() are defined in JavaScript 1.6.
// They are used here to demonstrate JavaScript's inherent functional nature.
[
[25, 55],
[21, 56],
[22, 58],
[28, 56]
].map(function(pair) { // array literal + mapping function
return new LCMCalculator(pair[0], pair[1]);
}).sort((a, b) => a.lcm() - b.lcm()) // sort with this comparative function; => is a shorthand form of a function, called "arrow function"
.forEach(printResult);
function printResult(obj) {
output(obj + ", gcd = " + obj.gcd() + ", lcm = " + obj.lcm());
}
The following output should be displayed in the browser window.
LCMCalculator: a = 28, b = 56, gcd = 28, lcm = 56
LCMCalculator: a = 21, b = 56, gcd = 7, lcm = 168
LCMCalculator: a = 25, b = 55, gcd = 5, lcm = 275
LCMCalculator: a = 22, b = 58, gcd = 2, lcm = 638
Security
JavaScript and the DOM provide the potential for malicious authors to deliver scripts to run on a client computer via the Web. Browser authors minimize this risk using two restrictions. First, scripts run in a sandbox in which they can only perform Web-related actions, not general-purpose programming tasks like creating files. Second, scripts are constrained by the same-origin policy: scripts from one Web site do not have access to information such as usernames, passwords, or cookies sent to another site. Most JavaScript-related security bugs are breaches of either the same origin policy or the sandbox.
There are subsets of general JavaScript—ADsafe, Secure ECMAScript (SES)—that provide greater levels of security, especially on code created by third parties (such as advertisements). Closure Toolkit is another project for safe embedding and isolation of third-party JavaScript and HTML.
Content Security Policy is the main intended method of ensuring that only trusted code is executed on a Web page.
Cross-site vulnerabilities
A common JavaScript-related security problem is cross-site scripting (XSS), a violation of the same-origin policy. XSS vulnerabilities occur when an attacker can cause a target Web site, such as an online banking website, to include a malicious script in the webpage presented to a victim. The script in this example can then access the banking application with the privileges of the victim, potentially disclosing secret information or transferring money without the victim's authorization. A solution to XSS vulnerabilities is to use HTML escaping whenever displaying untrusted data.
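A minimal escaping sketch; production code would normally rely on the escaping facilities of a framework or templating library:

function escapeHTML(text) {
  return text
    .replace(/&/g, "&amp;") // the ampersand must be replaced first
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
console.log(escapeHTML('<script>alert("x")</script>'));
// "&lt;script&gt;alert(&quot;x&quot;)&lt;/script&gt;"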
Some browsers include partial protection against reflected XSS attacks, in which the attacker provides a URL including malicious script. However, even users of those browsers are vulnerable to other XSS attacks, such as those where the malicious code is stored in a database. Only correct design of Web applications on the server-side can fully prevent XSS.
XSS vulnerabilities can also occur because of implementation mistakes by browser authors.
Another cross-site vulnerability is cross-site request forgery (CSRF). In CSRF, code on an attacker's site tricks the victim's browser into taking actions the user did not intend at a target site (like transferring money at a bank). When target sites rely solely on cookies for request authentication, requests originating from code on the attacker's site can carry the same valid login credentials of the initiating user. In general, the solution to CSRF is to require an authentication value in a hidden form field, and not only in the cookies, to authenticate any request that might have lasting effects. Checking the HTTP Referrer header can also help.
"JavaScript hijacking" is a type of CSRF attack in which a <script> tag on an attacker's site exploits a page on the victim's site that returns private information such as JSON or JavaScript. Possible solutions include:
requiring an authentication token in the POST and GET parameters for any response that returns private information.
Misplaced trust in the client
Developers of client-server applications must recognize that untrusted clients may be under the control of attackers. The application author cannot assume that their JavaScript code will run as intended (or at all) because any secret embedded in the code could be extracted by a determined adversary. Some implications are:
Web site authors cannot perfectly conceal how their JavaScript operates because the raw source code must be sent to the client. The code can be obfuscated, but obfuscation can be reverse-engineered.
JavaScript form validation only provides convenience for users, not security. If a site verifies that the user agreed to its terms of service, or filters invalid characters out of fields that should only contain numbers, it must do so on the server, not only the client.
Scripts can be selectively disabled, so JavaScript cannot be relied on to prevent operations such as right-clicking on an image to save it.
It is considered very bad practice to embed sensitive information such as passwords in JavaScript because it can be extracted by an attacker.
Misplaced trust in developers
Package management systems such as npm and Bower are popular with JavaScript developers. Such systems allow a developer to easily manage their program's dependencies upon other developers' program libraries. Developers trust that the maintainers of the libraries will keep them secure and up to date, but that is not always the case. A vulnerability has emerged because of this blind trust. Relied-upon libraries can have new releases that cause bugs or vulnerabilities to appear in all programs that rely upon the libraries. Inversely, a library can go unpatched with known vulnerabilities out in the wild. In a study done looking over a sample of 133k websites, researchers found 37% of the websites included a library with at least one known vulnerability. "The median lag between the oldest library version used on each website and the newest available version of that library is 1,177 days in ALEXA, and development of some libraries still in active use ceased years ago." Another possibility is that the maintainer of a library may remove the library entirely. This occurred in March 2016 when Azer Koçulu removed his repository from npm. This caused tens of thousands of programs and websites depending upon his libraries to break.
Browser and plugin coding errors
JavaScript provides an interface to a wide range of browser capabilities, some of which may have flaws such as buffer overflows. These flaws can allow attackers to write scripts that would run any code they wish on the user's system. This code is not by any means limited to another JavaScript application. For example, a buffer overrun exploit can allow an attacker to gain access to the operating system's API with superuser privileges.
These flaws have affected major browsers including Firefox, Internet Explorer, and Safari.
Plugins, such as video players, Adobe Flash, and the wide range of ActiveX controls enabled by default in Microsoft Internet Explorer, may also have flaws exploitable via JavaScript (such flaws have been exploited in the past).
In Windows Vista, Microsoft has attempted to contain the risks of bugs such as buffer overflows by running the Internet Explorer process with limited privileges. Google Chrome similarly confines its page renderers to their own "sandbox".
Sandbox implementation errors
Web browsers are capable of running JavaScript outside the sandbox, with the privileges necessary to, for example, create or delete files. Such privileges are not intended to be granted to code from the Web.
Incorrectly granting privileges to JavaScript from the Web has played a role in vulnerabilities in both Internet Explorer and Firefox. In Windows XP Service Pack 2, Microsoft demoted JScript's privileges in Internet Explorer.
Microsoft Windows allows JavaScript source files on a computer's hard drive to be launched as general-purpose, non-sandboxed programs (see: Windows Script Host). This makes JavaScript (like VBScript) a theoretically viable vector for a Trojan horse, although JavaScript Trojan horses are uncommon in practice.
Hardware vulnerabilities
In 2015, a JavaScript-based proof-of-concept implementation of a rowhammer attack was described in a paper by security researchers.
In 2017, a JavaScript-based attack via browser was demonstrated that could bypass ASLR, known as "ASLR⊕Cache" or AnC.
In 2018, the paper that announced the Spectre attacks against Speculative Execution in Intel and other processors included a JavaScript implementation.
Development tools
Important tools have evolved with the language.
Every major web browser has built-in web development tools, including a JavaScript debugger.
Static program analysis tools, such as ESLint and JSLint, scan JavaScript code for conformance to a set of standards and guidelines.
Some browsers have built-in profilers. Stand-alone profiling libraries have also been created, such as benchmark.js and jsbench.
Many text editors have syntax highlighting support for JavaScript code.
Related technologies
Java
A common misconception is that JavaScript is the same as Java. Both indeed have a C-like syntax (the C language being their most immediate common ancestor language). They are also typically sandboxed (when used inside a browser), and JavaScript was designed with Java's syntax and standard library in mind. In particular, all Java keywords were reserved in original JavaScript, JavaScript's standard library follows Java's naming conventions, and JavaScript's Math and Date objects are based on classes from Java 1.0.
Java and JavaScript both first appeared in 1995, but Java was developed by James Gosling of Sun Microsystems and JavaScript by Brendan Eich of Netscape Communications.
The differences between the two languages are more prominent than their similarities. Java has static typing, while JavaScript's typing is dynamic. Java is loaded from compiled bytecode, while JavaScript is loaded as human-readable source code. Java's objects are class-based, while JavaScript's are prototype-based. Finally, Java did not support functional programming until Java 8, while JavaScript has done so from the beginning, being influenced by Scheme.
JSON
JSON, or JavaScript Object Notation, is a general-purpose data interchange format that is defined as a subset of JavaScript's object literal syntax.
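A short example using the built-in JSON object:

const text = JSON.stringify({ language: "JavaScript", standardized: true });
console.log(text);           // '{"language":"JavaScript","standardized":true}'
const value = JSON.parse(text);
console.log(value.language); // "JavaScript"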
WebAssembly
Since 2017, web browsers have supported WebAssembly, a binary format that enables a JavaScript engine to execute performance-critical portions of web page scripts close to native speed. WebAssembly code runs in the same sandbox as regular JavaScript code.
asm.js is a subset of JavaScript that served as the forerunner of WebAssembly.
Transpilers
JavaScript is the dominant client-side language of the Web, and many websites are script-heavy. Thus transpilers have been created to convert code written in other languages, which can aid the development process.
References
Further reading
Flanagan, David. JavaScript: The Definitive Guide. 7th edition. Sebastopol, California: O'Reilly, 2020.
Haverbeke, Marijn. Eloquent JavaScript. 3rd edition. No Starch Press, 2018. 472 pages.
Zakas, Nicholas. Principles of Object-Oriented JavaScript, 1st edition. No Starch Press, 2014. 120 pages.
External links
American inventions
Articles with example JavaScript code
Cross-platform software
Dynamically typed programming languages
Functional languages
Object-based programming languages
High-level programming languages
Programming languages created in 1995
Programming languages with an ISO standard
Prototype-based programming languages
Scripting languages
Web programming
History of animation

While the history of animation began much earlier, this article is concerned with the development of the medium after the emergence of celluloid film in 1888, as produced for theatrical screenings, television and (non-interactive) home entertainment.
Between 1895 and 1920, during the rise of the cinematic industry, several different animation techniques were re-invented or newly developed, including stop-motion with objects, puppets, clay or cutouts, and drawn or painted animation. Hand-drawn animation, mostly animation painted on cels, was the dominant technique throughout most of the 20th century and became known as traditional animation.
Around the turn of the millennium, computer animation became the dominant animation technique in most regions (while Japanese anime and European hand-drawn productions continue to be very popular). Computer animation is mostly associated with a three-dimensional appearance with detailed shading, although many different animation styles have been generated or simulated with computers. Some productions may be recognized as Flash animation, but in practice, computer animation with a relatively two-dimensional appearance, stark outlines and little shading, will generally be considered "traditional animation". For instance, the first feature movie made on computers, without a camera, is The Rescuers Down Under (1990), but its style can hardly be distinguished from cel animation.
This article details the history of animation which looks like drawn or painted animation, regardless of the underlying technique.
Influence of predecessors
Animated movies are part of ancient traditions in storytelling, visual arts and theatre. Popular techniques with moving images before film include shadow play, mechanical slides and mobile projectors in magic lantern shows (especially phantasmagoria). Techniques with similarly fanciful three-dimensional moving figures include masks and costumes, puppetry and automata. Illustrated children's books, caricature, political cartoons and especially comic strips are closely related to animation, with much influence on visual styles and types of humour.
The technical principles of modern animation are based on the stroboscopic illusion of motion that was introduced in 1833 with stroboscopic discs (better known as the phenakistiscope). These animated discs with an average of about 8 to 16 images were usually designed as endless loops (like many GIF animations), for home use as a hand-operated "philosophical toy". Although some pioneers hoped it could be applied to longer scenes for theatrical use, throughout the 19th century further development of the technique mostly concentrated on combinations with the stereoscope (introduced in 1838) and photography (introduced in 1839). The breakthrough of cinematography partly depended on the novelty of a technique that was able to record and reproduce reality in life-like motion pictures. During the first years, drawing animated pictures seemed an archaic technique, until some artists produced popular and influential animated shorts and producers embraced cheap techniques to turn popular comic strips into animated cartoons.
1888–1909: Earliest animations on film
Théâtre Optique
Charles-Émile Reynaud developed his projection praxinoscope into the Théâtre Optique with transparent hand-painted colorful pictures in a long perforated strip wound between two spools, patented in December 1888. From 28 October 1892 to March 1900 Reynaud gave over 12,800 shows to a total of over 500,000 visitors at the Musée Grévin in Paris. His Pantomimes Lumineuses series of animated films each contained 300 to 700 frames manipulated back and forth to last 10 to 15 minutes per film. A background scene was projected separately. Piano music, song and some dialogue were performed live, while some sound effects were synchronized with an electromagnet. The first program included three cartoons: Pauvre Pierrot (created in 1892), Un bon bock (created in 1892, now lost), and Le Clown et ses chiens (created in 1892, now lost). Later on the titles Autour d'une cabine (created in 1894) and Un rêve au coin du feu would be part of the performances.
Standard picture film
Despite the success of Reynaud's films, it took some time before animation was adopted by the film industry that came about after the introduction of Lumiere's Cinematograph in 1895. Georges Méliès' early fantasy and trick films (released between 1896 and 1913) occasionally contain elements that somewhat resemble animation, including painted props or painted creatures that were moved in front of painted backgrounds (mostly using wires), and film colorization by hand. Méliès also popularized the stop trick, with a single change made to the scene in between shots, that had already been used in Edison's The Execution of Mary Stuart in 1895 and probably led to the development of stop-motion animation some years later. It seems to have taken until 1906 before proper animated films appeared in cinemas. The dating of some presumed earlier films with animation is contested, while other early films that may have used stop motion or other animation techniques are lost or unidentified, and thus cannot be verified.
Printed animation film
In 1897 German toy manufacturer Gebrüder Bing had a first prototype of their kinematograph. In November 1898 they presented this toy film projector, possibly the first of its kind, at a toy festival in Leipzig. Soon other toy manufacturers, including Ernst Plank and Georges Carette, sold similar devices. Around the same time the French company Lapierre marketed a similar projector. The toy cinematographs were adapted toy magic lanterns with one or two small spools that used standard "Edison perforation" 35mm film. These projectors were intended for the same type of "home entertainment" toy market that most of these manufacturers already provided with praxinoscopes and toy magic lanterns. Apart from relatively expensive live-action films, the manufacturers produced many cheaper films by printing lithographed drawings. These animations were probably made in black-and-white from around 1898 or 1899, but at the latest by 1902 they were made in color. The pictures were often traced from live-action films (much like the later rotoscoping technique). These very short films typically depicted a simple repetitive action and were created to be projected as a loop - playing endlessly with the film ends put together. The lithograph process and the loop format follow the tradition that was set by the stroboscopic disc, zoetrope and praxinoscope.
Katsudō Shashin, from an unknown creator, was discovered in 2005 and is speculated to be the oldest work of animation in Japan, with Natsuki Matsumoto, an expert in iconography at the Osaka University of Arts, and animation historian Nobuyuki Tsugata determining the film was most likely made between 1907 and 1911. The film consists of a series of cartoon images printed in 50 frames on a celluloid strip and lasts three seconds at 16 frames per second. It depicts a young boy in a sailor suit who writes the kanji characters "活動写真" (katsudō shashin, or "moving picture"), then turns towards the viewer, removes his hat, and offers a salute. Evidence suggests it was mass-produced to be sold to wealthy owners of home projectors. To Matsumoto, the relatively poor quality and low-tech printing technique indicate it was likely from a smaller film company.
J. Stuart Blackton
J. Stuart Blackton was a British-American filmmaker, co-founder of the Vitagraph Studios and one of the first to use animation in his films. His The Enchanted Drawing (1900) can be regarded as the first theatrical film recorded on standard picture film that included animated elements, although this concerns just a few frames of changes in drawings. It shows Blackton doing "lightning sketches" of a face, cigars, a bottle of wine and a glass. The face changes expression when Blackton pours wine into the face's mouth and when Blackton takes his cigar. The technique used in this film was basically the stop trick: the single change to the scenes was the replacement of a drawing by a similar drawing with a different facial expression. In some scenes, a drawn bottle and glass were replaced by real objects. Blackton had possibly used the same technique in a lost 1896 lightning sketch film.
Blackton's 1906 film Humorous Phases of Funny Faces is often regarded as the oldest known drawn animation on standard film. It features a sequence made with blackboard drawings that are changed between frames to show two faces changing expressions and some billowing cigar smoke, as well as two sequences that feature cutout animation with a similar look for more fluid motion.
Blackton's use of stop motion in The Haunted Hotel (1907) was very influential.
Émile Cohl
The French artist Émile Cohl created the first animated film using what came to be known as traditional animation methods: the 1908 Fantasmagorie. The film largely consisted of a stick figure moving about and encountering all manner of morphing objects, such as a wine bottle that transforms into a flower. There were also sections of live action where the animator's hands would enter the scene. The film was created by drawing each frame on paper and then shooting each frame onto negative film, which gave the picture a blackboard look. Cohl later went to Fort Lee, New Jersey near New York City in 1912, where he worked for French studio Éclair and spread its animation technique to the US.
1910s: From original artists to "assembly-line" production studios
During the 1910s larger-scale animation studios began to come into being. From then onwards, solo artists faded from the public eye.
Winsor McCay
Starting with a short 1911 film of his most popular character Little Nemo, successful newspaper cartoonist Winsor McCay gave much more detail to his hand-drawn animations than any animation previously seen in cinemas. His 1914 film Gertie the Dinosaur featured an early example of character development in drawn animation. It was also the first film to combine live-action footage with animation. Originally, McCay used the film in his vaudeville act: he would stand next to the screen and speak to Gertie who would respond with a series of gestures. At the end of the film McCay would walk behind the projection screen, seamlessly being replaced with a prerecorded image of himself entering the screen, getting on the cartoon dinosaur's back and riding out of frame. McCay personally hand-drew almost every one of the thousands of drawings for his films. Other noteworthy titles by McCay are How a Mosquito Operates (1912) and The Sinking of the Lusitania (1918).
Cartoon Film Company – Buxton and Dyer
Between 1915 and 1916, during the First World War, Dudley Buxton and Anson Dyer produced a series of 26 topical cartoons, mainly using cutout animation, released as John Brown's Animated Sketchbook. The episodes included the shelling of Scarborough by German battleships and The Sinking of the Lusitania, No. 4 (June 1915).
Barré Studio
Around 1913 Raoul Barré developed the peg system that made it easier to align drawings by perforating two holes below each drawing and placing them on two fixed pins. He also used a "slash and tear" technique to not have to draw the complete background or other motionless parts for every frame. The parts where something needed to be changed for the next frame were carefully cut away from the drawing and filled in with the required change on the sheet below. After Barré had started his career in animation at Edison Studios, he founded one of the first film studios dedicated to animation in 1914 (initially together with Bill Nolan). Barré Studio had success with the production of the adaptation of the popular comic strip Mutt and Jeff (1916–1926). The studio employed several animators who would have notable careers in animation, including Frank Moser, Gregory La Cava, Vernon Stallings, Tom Norton and Pat Sullivan.
Bray Productions
In 1914, John Bray opened John Bray Studios, which revolutionized the way animation was created. Earl Hurd, one of Bray's employees, patented the cel technique. This involved animating moving objects on transparent celluloid sheets. Animators photographed the sheets over a stationary background image to generate the sequence of images. This, as well as Bray's innovative use of the assembly-line method, allowed John Bray Studios to create Colonel Heeza Liar, the first animated series. Many aspiring cartoonists started their careers at Bray, including Paul Terry (later of Heckle and Jeckle fame), Max Fleischer (later of Betty Boop and Popeye fame), and Walter Lantz (later of Woody Woodpecker fame). The cartoon studio operated from circa 1914 until 1928. Some of the first cartoon stars from the Bray studios were Farmer Alfalfa (by Paul Terry) and Bobby Bumps (by Earl Hurd).
Hearst's International Film Service
Newspaper tycoon William Randolph Hearst founded International Film Service in 1916. Hearst lured away most of Barré Studio's animators, with Gregory La Cava becoming the head of the studio. They produced adaptations of many comic strips from Hearst's newspapers in a rather limited fashion, giving just a little motion to the characters while mainly using the dialog balloons to deliver the story. The most notable series is Krazy Kat, probably the first of many anthropomorphic cartoon cat characters and other talking animals. Before the studio closed in 1918, it had employed some new talents, including Vernon Stallings, Ben Sharpsteen, Jack King, John Foster, Grim Natwick, Burt Gillett and Isadore Klein.
Rotoscoping
In 1915, Max Fleischer applied for a patent (granted in 1917) for a technique which became known as rotoscoping: the process of using live-action film recordings as a reference point to more easily create realistic animated movements. The technique was often used in the Out of the Inkwell series (1918–1929) for John Bray Productions (and others). The series resulted from experimental rotoscoped images of Dave Fleischer performing as a clown, evolving into a character who became known as Koko the Clown.
Felix the cat
In 1919, Otto Messmer of Pat Sullivan Studios created Felix the Cat. Pat Sullivan, the studio head, took all of the credit for Felix, a common practice in the early days of studio animation. Felix the Cat was distributed by Paramount Studios and attracted a large audience, eventually becoming one of the most recognized cartoon characters in film history. Felix was the first cartoon to be merchandised.
Quirino Cristiani: the first animated features
The first known animated feature film was El Apóstol by Quirino Cristiani, released on 9 November 1917 in Argentina. This successful 70-minute satire used a cardboard cutout technique, reportedly with 58,000 frames at 14 frames per second. Cristiani's next feature Sin dejar rastros was released in 1918, but it received no press coverage and drew poor attendance before it was confiscated by the police for diplomatic reasons. None of Cristiani's feature films has survived.
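As an illustrative back-of-the-envelope check (a calculation added here for clarity, not a figure from the original sources), the reported frame count and frame rate are consistent with the stated running time:
\[ \frac{58\,000\ \text{frames}}{14\ \text{frames/s}} \approx 4\,143\ \text{s} \approx 69\ \text{minutes}. \]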
1920s: Absolute film, synchronized sound and the rise of Disney
A number of key events occurred in the 1920s, including the development of the first animations with synchronized sound, and the founding of the Walt Disney Studio. The decade also saw the first appearance of Mickey Mouse in Steamboat Willie (1928).
Absolute film
In the early 1920s, the absolute film movement with artists such as Walter Ruttmann, Hans Richter, Viking Eggeling and Oskar Fischinger made short abstract animations which proved influential. Although some later abstract animation works by, for instance, Len Lye and Norman McLaren would be widely appreciated, the genre largely remained a relatively obscure avant-garde art form. Direct influences or similar ideas would nonetheless occasionally surface in mainstream animation, for instance in Disney's Toccata and Fugue in D Minor in Fantasia (1940), on which Fischinger originally collaborated until his work was scrapped and which was partly inspired by the works of Lye, and in The Dot and the Line (1965) by Chuck Jones.
Early synchronized sound: Song Car-Tunes and Aesop's Sound Fables
From May 1924 to September 1926, Dave and Max Fleischer's Inkwell Studios produced 19 sound cartoons, part of the Song Car-Tunes series, using the Phonofilm "sound-on-film" process. The series also introduced the "bouncing ball" above lyrics to guide audiences to sing along to the music. My Old Kentucky Home from June 1926 was probably the first film to feature a bit of synchronized animated dialogue, with an early version of Bimbo mouthing the words "Follow the ball, and join in, everybody". The Bimbo character was further developed in Fleischer's Talkartoons (1929–1932).
Paul Terry's Dinner Time, from his Aesop's Fables (1921–1936) series, premiered on 1 September 1928 with a synchronized soundtrack featuring dialogue. The new studio owner, Van Beuren, had urged Terry to add the novelty against his wishes. Although the series and its main character Farmer Al Falfa had been popular, audiences were not impressed by this first episode with sound.
Lotte Reiniger
The earliest surviving animated feature film is the 1926 silhouette-animated Die Abenteuer des Prinzen Achmed (The Adventures of Prince Achmed), which used colour-tinted film. It was directed by the German animator Lotte Reiniger together with her husband Carl Koch. Walter Ruttmann created visual background effects. French/Hungarian collaborator Berthold Bartosch and/or Reiniger created an impression of depth by placing scenographic elements and figures on several levels of glass plates, illuminated from below and filmed with the camera vertically above. A similar technique later became the basis of the multiplane camera.
Early Disney: Laugh-O-Grams, Alice, Oswald and Mickey
Between 1920 and 1922, cartoonists Walt Disney, Ub Iwerks and Fred Harman worked at the Slide Company (soon renamed the Kansas City Film Ad Company), which produced cutout animation commercials. Disney and Iwerks studied Muybridge's chronophotography and the one book on animation in the local library, and Disney experimented with drawn animation techniques in his parents' garage. They were able to bring some innovations to the company, but their employer did not want to forsake the trusted cutout technique. Disney's home experiments led to a series that satirized current local topics, which he managed to sell to the owner of the three local Newman Theatres as weekly Newman Laugh-O-Grams in 1921. While striking the deal, the 19-year-old Disney forgot to include a profit margin, but he was happy that someone paid for his "experiment" and he gained local renown from the screenings. Disney also created his first recurring character, Professor Whosis, appearing in humorous public announcements for Newman.
Disney and Harman started their own Kaycee Studio on the side, experimenting with films played backwards, but their efforts to make money with commercials and newsreel footage were not very fruitful and Harman left in 1922. Through a newspaper ad, Disney "hired" Rudolph Ising in exchange for teaching him the ins and outs of animation. Inspired by Terry's Aesop's Fables, Disney started a series of roughly seven-minute modernized fairy tale cartoons, and a new series of satirical actualities called Lafflets, with Ising's help. After two fairy-tale cartoons, Disney quit his job at Film Ad and started Laugh-O-Gram Films, Inc. with the help of investors. Iwerks, Fred's brother Hugh Harman and Carman Maxwell were among the animators who would produce five more Laugh-O-Gram fairy tale cartoons and the sponsored Tommy Tucker's Tooth in 1922. The series failed to make money and in 1923 the studio tried something else with the live-action "Song-O-Reel" Martha and with Alice's Wonderland. The latter, a 12-minute film, featured a live-action girl (Virginia Davis) interacting with numerous cartoon characters, including the Felix-inspired Julius the Cat (who had already appeared in the Laugh-O-Gram fairy tales, without a name). Before Disney was able to sell the picture, his studio went bankrupt.
Disney moved to Hollywood and managed to close a deal with New York film distributor Margaret J. Winkler, who had just lost the rights to Felix the Cat and Out of the Inkwell. To make the Alice Comedies series (1923–1927), Iwerks also moved to Hollywood, later followed by Ising, Harman, Maxwell and Film Ad colleague Friz Freleng. The series was successful enough to last 57 episodes, but Disney eventually preferred to create a new fully animated series.
Oswald the Lucky Rabbit followed in 1927 and became a hit, but after failed negotiations for continuation in 1928, Charles Mintz took direct control of production and Disney lost his character and most of his staff to Mintz.
Disney and Iwerks developed Mickey Mouse in 1928 to replace Oswald. A first film entitled Plane Crazy failed to impress a test audience and did not attract sufficient interest from potential distributors. After some live-action movies with synchronized sound had become successful, Disney put the new Mickey Mouse cartoon The Gallopin' Gaucho on hold to start work on a special sound production that would launch the series more convincingly. Much of the action in the resulting Steamboat Willie (November 1928) involves the making of sounds, for instance with Mickey making music using livestock aboard the boat. The film became a huge success and Mickey Mouse would soon become the most popular cartoon character in history.
Bosko
Bosko was created in 1927 by Hugh Harman and Rudolph Ising, specifically with talkies in mind. They were still working for Disney at the time, but they left in 1928 to work on the Oswald the Lucky Rabbit cartoons at Universal for about a year, and then produced the pilot Bosko, the Talk-Ink Kid in May 1929 to shop for a distributor. They signed with Leon Schlesinger Productions and started the Looney Tunes series for Warner Bros. in 1930. Bosko was the star of 39 Warner Bros. cartoons before Harman and Ising took the character to MGM after leaving Warner Bros. After two MGM cartoons, Bosko received a dramatic make-over that was much less appreciated by audiences, and his career ended in 1938.
1930s: Color, depth, cartoon superstars and Snow White
While the global economy suffered under the Great Depression through the 1930s, animation continued to flourish. Early colour processes came into use, along with the multiplane camera. In 1937, Snow White and the Seven Dwarfs, the first full-length traditionally animated feature film, debuted in theatres.
Lithographed colour
The lithographed films for home use that were available in Europe in the first decades of the twentieth century were multi-coloured, but the technique does not seem to have been applied for theatrically released animated films. While the original prints of The Adventures of Prince Achmed featured film tinting, most theatrically released animated films before 1930 were plain black and white. Effective color processes were a welcome innovation in Hollywood and seemed especially suitable for cartoons.
Two-strip color
A cartoon segment in the feature film King of Jazz (April 1930), made by Walter Lantz and Bill Nolan, was the first animation presented in two-strip Technicolor.
Fiddlesticks, released together with King of Jazz, was the first Flip the Frog film and the first project Ub Iwerks worked on after he had left Disney to set up his own studio. In England, the cartoon was released in Harris Color, a two-color process, probably as the first theatrically released standalone animated cartoon to boast both sound and color.
Disney's Silly Symphonies in Technicolor
The Silly Symphonies series, started in 1929, was less popular than Disney had hoped, so he turned to a new technical innovation to improve the impact of the series. In 1932 he worked with the Technicolor company to create the first full-colour animation, Flowers and Trees, debuting the three-strip technique (the first use in live-action movies came about two years later). The cartoon was successful and won the Academy Award for Short Subjects, Cartoons. Disney temporarily had an exclusive deal for the use of Technicolor's full-colour technique in animated films. He even waited a while before producing the ongoing Mickey Mouse series in colour, so that the Silly Symphonies would retain their special appeal for audiences. After the exclusive deal lapsed in September 1935, full-colour animation soon became the industry standard.
Silly Symphonies inspired many cartoon series that, until Technicolor was no longer exclusive to Disney, used various other colour systems, including Ub Iwerks' ComiColor Cartoons (1933–1936), Van Beuren Studios' Rainbow Parade (1934–1936), Fleischer's Color Classics (1934–1941), Charles Mintz's Color Rhapsody (1936–1949), MGM's Happy Harmonies (1934–1938), George Pal's Puppetoons (1932–1948), and Walter Lantz's Swing Symphony (1941–1945).
Multiplane cameras and the stereoptical process
To create an impression of depth, several techniques were developed. The most common technique was to have characters move between several background and/or foreground layers that could be moved independently, corresponding to the laws of perspective (e.g. the further away from the camera, the slower the speed).
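As a rough illustration of this perspective rule (an idealized pinhole-camera approximation, not a formula documented as being used by the studios), a layer at distance \(z\) from the camera that moves laterally at speed \(v\) appears to move on screen at roughly
\[ v_{\text{screen}} \approx \frac{f \cdot v}{z}, \]
where \(f\) is the focal length; a layer placed twice as far away therefore needs to be moved only half as fast to look consistent.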
Lotte Reiniger had already designed a type of multiplane camera for Die Abenteuer des Prinzen Achmed and her collaborator Berthold Bartosch used a similar setup for his intricately detailed 25-minute film L'Idée (1932).
In 1933, Ub Iwerks developed a multiplane camera and used it for a number of Willie Whopper (1933–1934) and ComiColor Cartoons episodes.
The Fleischers developed the very different stereoptical process in 1933 for their Color Classics. It was used in the first episode, Betty Boop in Poor Cinderella (1934), and in most of the following episodes. The process involved three-dimensional sets built and sculpted on a large turntable. The cels were placed within the movable set, so that the animated characters would appear to move in front of and behind the 3D elements within the scene when the turntable was rotated.
Disney-employee William Garity developed a multiplane camera that could have up to seven layers of artwork. It was tested in the Academy Award-winning Silly Symphony The Old Mill (1937) and used prominently in Snow White and later features.
New colourful cartoon superstars
After the additions of sound and colour proved a huge success for Disney, other studios followed. By the end of the decade, almost all theatrical cartoons were produced in full colour.
Initially, music and songs were the focus of many series, as indicated by series titles such as Song Car-Tunes, Silly Symphonies, Merrie Melodies and Looney Tunes, but it was the recognizable characters that really stuck with audiences. Mickey Mouse had been the first cartoon superstar to surpass Felix the Cat's popularity, but soon dozens more cartoon superstars followed, many remaining popular for decades.
Warner Bros. had a vast music library that could be popularized through cartoons based on the available tunes. While Disney needed to create the music for every cartoon, the readily available sheet music and songs at Warner Bros. provided inspiration for many cartoons. Leon Schlesinger sold Warner Bros. a second series called Merrie Melodies, which until 1939 was contractually required to contain at least one refrain from the music catalog. Unlike Looney Tunes with Bosko, Merrie Melodies featured only a few recurring characters, like Foxy, Piggy and Goopy Geer, before Harman and Ising left in 1933. Bosko was replaced with Buddy in the Looney Tunes series, but the new character lasted only two years, while Merrie Melodies initially continued without recurring characters. Eventually, the two series became indistinguishable and produced many new characters that became popular. Animator/director Bob Clampett designed Porky Pig (1935) and Daffy Duck (1937) and was responsible for much of the energetic animation and irreverent humour associated with the series. The 1930s also saw early anonymous incarnations of characters who would later become the superstars Elmer Fudd (1937/1940), Bugs Bunny (1938/1940) and Sylvester the Cat (1939/1945). From 1937 onward, Mel Blanc performed most of the characters' voices.
Disney introduced new characters to the Mickey Mouse universe who would become very popular, starring alongside Mickey and Minnie Mouse (1928): Pluto (1930), Goofy (1932), and a character who would soon become the new favourite, Donald Duck (1934). Disney had realized that the success of animated films depended upon telling emotionally gripping stories; he developed a "story department" in which storyboard artists, separate from the animators, focused on story development alone. The approach proved its worth when the studio released Three Little Pigs in 1933, the first animated short to feature well-developed characters. Disney would expand his studio and start more and more production activities, including comics, merchandise and theme parks. Most projects were based on the characters developed for theatrical short films.
Fleischer Studios introduced an unnamed dog character as Bimbo's girlfriend in Dizzy Dishes (1930), who evolved into the human female Betty Boop (1930–1939) and became Fleischer's best-known creation. In the 1930s they also added Hunky and Spunky (1938) and the popular animated adaptation of Popeye (1933) to their repertoire.
Hays code and Betty Boop
The Hays Motion Picture Production Code of moral guidelines was adopted in 1930 and rigidly enforced between 1934 and 1968. It had a big impact on filmmakers who liked to create relatively saucy material. As an infamous example, Betty Boop suffered greatly when she had to be changed from a carefree flapper with an innocent sex appeal into a more wholesome and much tamer character in fuller dress. Her boyfriend Bimbo's disappearance was probably also a result of the code's disapproval of mixed-species relationships.
Snow White and the breakthrough of the animated feature
At least eight animated feature films were released before Disney's Snow White and the Seven Dwarfs, while at least another two earlier animated feature projects remained unfinished. Most of these films (of which only four survive) were made using cutout, silhouette or stop-motion techniques. Among the lost animated features were three by Quirino Cristiani, who had premiered his third feature Peludópolis on 18 September 1931 in Buenos Aires with a Vitaphone sound-on-disc synchronized soundtrack. It was received quite positively by critics, but did not become a hit and was an economic fiasco for the filmmaker. Cristiani soon realized that he could no longer make a career with animation in Argentina. The only earlier feature that was totally hand-drawn was Disney's own Academy Award Review of Walt Disney Cartoons, released seven months prior to Snow White to promote its upcoming release. Many do not consider it a genuine feature film, because it is a package film and lasts only 41 minutes, but it does meet the official definitions of a feature film of the British Film Institute, the Academy of Motion Picture Arts and Sciences and the American Film Institute, which require only that a film be over 40 minutes long.
When it became known that Disney was working on a feature-length animation, critics regularly referred to the project as "Disney's folly", not believing that audiences could stand the expected bright colours and jokes for such a long time. Snow White and the Seven Dwarfs premiered on 21 December 1937 and became a worldwide success. The film continued Disney's tradition of appropriating old fairy tales and other stories (started with the Laugh-O-Grams in 1921), as would most of the Disney features that followed.
The Fleischer studios followed Disney's example with Gulliver's Travels in 1939, which was a minor success at the box office.
Early TV animation
In April 1938, when only about 50 television sets were connected, NBC aired the eight-minute low-budget cartoon Willie the Worm. It was made especially for this broadcast by former Disney employee Chad Grothkopf, mainly with cutouts and a bit of cel animation. About a year later, on 3 May 1939, Disney's Donald's Cousin Gus premiered on NBC's experimental W2XBS channel, as part of the first full-evening programme, a few weeks before it was released in movie theatres.
1940s
Wartime propaganda
Several governments had already used animation in public information films, such as those by the GPO Film Unit in the U.K. and Japanese educational films. During World War II, animation became a common medium for propaganda. The US had its best studios working for the war effort.
To instruct service personnel about all kinds of military subjects and to boost morale, Warner Bros. was contracted for several shorts and the special animated series Private Snafu. The character was created by the famous movie director Frank Capra, Dr. Seuss was involved in the screenwriting, and the series was directed by Chuck Jones. Disney also produced several instructive shorts and even personally financed the feature-length Victory Through Air Power (1943), which promoted the idea of long-range bombing.
Many popular characters promoted war bonds, like Bugs Bunny in Any Bonds Today?, Disney's little pigs in The Thrifty Pig and a whole bunch of Disney characters in All Together. Daffy Duck asked for scrap metal for the war effort in Scrap Happy Daffy. Minnie Mouse and Pluto invited civilians to collect their cooking grease so it could be used for making explosives in Out of the Frying Pan Into the Firing Line. There were several more political propaganda short films, like Warner Bros.' Fifth Column Mouse, Disney's Chicken Little and the more serious Education for Death and Reason and Emotion (nominated for an Academy Award).
Such wartime films were much appreciated. Bugs Bunny became something of a national icon and Disney's propaganda short Der Fuehrer's Face (starring Donald Duck) won the company its tenth Academy Award for cartoon short subjects.
Japan's first feature anime 桃太郎 海の神兵 (Momotaro: Sacred Sailors) was made in 1944, ordered by the Ministry of the Navy of Japan. It was designed for children and, partly inspired by Fantasia, was meant to inspire dreams and hope for peace. The main characters are an anthropomorphic monkey, dog, bear and pheasant who become parachute troopers (except the pheasant who becomes a pilot) tasked with invading Celebes. An epilogue hints at America being the target for the next generation.
Feature animation in the 1940s
High ambitions, setbacks and cutbacks in US feature animation
Disney's next features (Pinocchio and the very ambitious concert-film Fantasia, both released in 1940) and Fleischer Studios' second animated feature Mr. Bug Goes to Town (1941/1942) were all received favorably by critics but failed at the box office during their initial theatrical runs. The primary cause was that World War II had cut off most foreign markets. These setbacks discouraged most companies who had plans for animated features.
Disney cut back on costs for the next features and first released The Reluctant Dragon, mostly consisting of a live-action tour of the new studio in Burbank, partly in black and white, with four short cartoons. It was a mild success at the worldwide box office. It was followed a few months later by Dumbo (1941), only 64 minutes long and animated in a simpler, more economical style. This helped secure a profit at the box office, and critics and audiences reacted positively. Disney's next feature Bambi (1942) returned to a larger budget and more lavish style, but its more dramatic story, darker mood and lack of fantasy elements were not well received during its initial run, and the film lost money at the box office.
Although all eight of the other Disney features of the 1940s were package films and/or combinations with live-action (for instance Saludos Amigos (1943) and The Three Caballeros (1944)), Disney kept faith in feature animation. For decades, Disney was the only American studio to release animated theatrical feature films regularly, while hardly any other American studio managed to release more than a handful before the beginning of the 1990s.
Non-US animation forces
American cel-animated films had dominated the worldwide production and consumption of theatrical animated releases since the 1920s. Disney's work in particular proved very popular and most influential around the world. Studios from other countries could hardly compete with the American productions. Many animation producers outside the US chose to work with techniques other than "traditional" cel animation, such as puppet animation or cutout animation. However, several countries (most notably Russia, China and Japan) developed their own relatively large "traditional" animation industries. Russia's Soyuzmultfilm animation studio, founded in 1936, employed up to 700 skilled workers and, during the Soviet period, produced 20 films per year on average. Some titles noticed outside their respective domestic markets include 铁扇公主 (Princess Iron Fan) (China 1941, influential in Japan), Конёк-Горбуно́к (The Humpbacked Horse) (Russia 1947, winner of a special jury award at Cannes in 1950), I Fratelli Dinamite (The Dynamite Brothers) (Italy 1949) and La Rosa di Bagdad (The Rose of Baghdad) (Italy 1949, whose 1952 English dub starred Julie Andrews).
Successful theatrical short cartoons of the 1940s
During the "Golden Age of American animation", new studios competed with the studios that survived the sound and colour innovation battles of the previous decades. Cartoon animals were still the norm and music was still a relevant element, but often lost its main stage appeal to Disney's melodramatic storytelling or the wild humour in Looney Tunes and other cartoons.
Disney continued their cartoon successes, adding Daisy Duck (1940) and Chip 'n' Dale (1943/1947) to the Mickey Mouse universe, while Warner Bros. developed new characters to join their popular Merrie Melodies/Looney Tunes cast, including Tweety (1941/1942), Henery Hawk (1942), Yosemite Sam (1944/1945), Foghorn Leghorn (1946), Barnyard Dawg (1946), Marvin the Martian (1948), and Wile E. Coyote and Road Runner (1949).
Other new popular characters and series were Terrytoons' Mighty Mouse (1942–1961) and Heckle and Jeckle (1946), and Screen Gems' The Fox and the Crow (1941–1948).
Fleischer/Famous Studios
Fleischer launched its spectacular Superman adaptation in 1941. The success came too late to save the studio from its financial problems, and in 1942 Paramount Pictures took over the studio from the resigning Fleischer brothers. The renamed Famous Studios continued the Popeye and Superman series, developed popular adaptations of Little Lulu (1943–1948, licensed by Gold Key Comics) and Casper the Friendly Ghost (1945), and created new series such as Little Audrey (1947) and Baby Huey (1950).
Walter Lantz Productions
Walter Lantz had started his animation career at Hearst's studio at the age of 16. He had also worked for the Bray Studios and Universal Pictures, where he gained control over the Oswald the Lucky Rabbit cartoons in 1929 (reportedly by winning the character and control of the studio in a poker bet with Universal president Carl Laemmle). In 1935, the Universal studio was turned into the independent Walter Lantz Productions, but it remained on the Universal lot and continued to produce cartoons for Universal to distribute. When Oswald's popularity dwindled and the character was eventually retired in 1938, Lantz's productions went without successful characters until he developed Andy Panda in 1939. The anthropomorphic panda starred in over two dozen cartoons until 1949, but he was soon overshadowed by the iconic Woody Woodpecker, who debuted in the Andy Panda cartoon Knock Knock in 1940. Other popular Lantz characters include Wally Walrus (1944), Buzz Buzzard (1948), Chilly Willy (1953), and Hickory, Dickory, and Doc (1959).
MGM
After distributing Ub Iwerks' Flip the Frog and Willie Whopper cartoons and Harman and Ising's Happy Harmonies, Metro-Goldwyn-Mayer founded its own cartoon studio in 1937. The studio had much success with Barney Bear (1939–1954), William Hanna and Joseph Barbera's Tom and Jerry (1940) and Spike and Tyke (1942).
In 1941, Tex Avery left Warner Bros. for MGM, where he created Droopy (1943), Screwy Squirrel (1944) and George and Junior (1944).
UPA
While Disney and most of the other studios sought a sense of depth and realism in animation, UPA animators (including former Disney employee John Hubley) had a different artistic vision. They developed a much sparser and more stylized type of animation, inspired by Russian examples. The studio was formed in 1943 and initially worked on government contracts. A few years later they signed a contract with Columbia Pictures, took over The Fox and the Crow from Screen Gems and earned Oscar nominations for their first two theatrical shorts in 1948 and 1949. While the field of animation was dominated by anthropomorphic animals, when the studio was allowed to create a new character they came up with a near-sighted old man: Mr. Magoo (1949) became a hit and would be featured in many short films. Between 1949 and 1959 UPA received 15 Oscar nominations, winning their first Academy Award with the Dr. Seuss adaptation Gerald McBoing-Boing (1950), followed by two more for When Magoo Flew (1954) and Magoo's Puddle Jumper (1956). The distinctive style was influential and even affected the big studios, including Warner Bros. and Disney. Apart from affording greater freedom of artistic expression, UPA had proved that sparser animation could be appreciated as much as (or even more than) the expensive lavish styles.
TV animation in the 1940s
The back catalog of animated cartoons of many studios, originally produced for a short theatrical run, proved very valuable for television broadcasting. Movies for Small Fry (1947), presented by "big brother" Bob Emery on Tuesday evenings on the New York WABD-TV channel, was one of the first TV series for children and featured many classic Van Beuren Studios cartoons. It was continued on the DuMont Television Network as the daily show Small Fry Club (1948–1951) with a live audience in a studio setting.
Many classical series from Walter Lantz, Warner Bros., Terrytoons, MGM and Disney similarly found a new life in TV shows for children, with many reruns, for decades. Instead of studio settings and live-action presentation, some shows would feature new animation to present or string together the older cartoons.
The earliest American animated series specifically produced for TV came about in 1949, with Adventures of Pow Wow (43 five-minute episodes broadcast on Sunday mornings from January to November) and Jim and Judy in Teleland (52 episodes, later also sold to Venezuela and Japan).
1950s: Shift from classic theatrical cartoons to limited animation in TV series for children
Most theatrical cartoons had been produced for non-specific audiences. Dynamic action and gags with talking animals in clear drawing styles and bright colours were naturally appealing to young children, but the cartoons regularly contained violence and sexual innuendo and were often screened together with news reels and feature films that were not for children. On US television, cartoons were mainly programmed for children in convenient time slots on weekend mornings, weekday afternoons or early evenings.
The constraints of American TV animation in the 1950s, which required a higher quantity of material to be made in less time and on a lower budget than theatrical animation, led to the development of various techniques now known as limited animation. The sparser type of animation that had originally been an artistic choice of style for UPA was embraced as a means to cut production time and costs. Full-frame animation ("on ones") became rare in the United States outside the dwindling number of theatrical productions. Chuck Jones coined the term "illustrated radio" to refer to the shoddy style of most television cartoons, which depended more on their soundtracks than on their visuals. Some producers also found that limited animation looked better on the small (black-and-white) TV screens of the time.
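The savings from limited animation were substantial; as an illustrative calculation (typical projection figures, not numbers from any particular studio's records), film projected at 24 frames per second requires about
\[ 24 \times 60 = 1440 \]
drawings per minute when animated "on ones", about 720 per minute "on twos", and far fewer still once held cels, reused cycles and other limited-animation shortcuts are applied.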
Animated TV series of the 1950s
Jay Ward produced the popular Crusader Rabbit (tested in 1948, original broadcasts in 1949–1952 and 1957–1959), with successful use of a limited-animation style.
At the end of the 1950s, several studios dedicated to TV animation production started competing. While the focus of competition in theatrical animation had been on quality and innovation, it now shifted to delivering animation fast and cheap. Critics noted how the quality of many shows was often poor in comparison to the classic cartoons, with rushed animation and run-of-the-mill stories. Network executives were satisfied as long as there were enough viewers, and the huge numbers of young viewers were not bothered by the lack of quality that the critics perceived. From the mid-1960s onward, watching Saturday-morning cartoon programming, up to four hours long, became a favorite pastime of most American children and remained a mainstay for decades.
Disney had entered into TV production relatively early, but refrained from creating newly animated series for decades. Instead, Disney had its own anthology series on the air in primetime from 1954, starting with the Walt Disney's Disneyland series (1954–1958), which clearly promoted the Disneyland theme park that opened in 1955. Walt Disney personally hosted the series, which, apart from older cartoons, featured segments such as behind-the-scenes looks at film-making processes and new live-action adventures.
William Hanna and Joseph Barbera (the creators of Tom and Jerry) continued as Hanna-Barbera after Metro-Goldwyn-Mayer closed its animation studio in 1957, when MGM considered its back catalog sufficient for further sales. While Hanna-Barbera made only one theatrically released series, Loopy de Loop (1959–1965), they proved to be the most prolific and successful producers of animated television series for several decades. Starting with The Ruff and Reddy Show (1957–1960), they continued with successful series like The Huckleberry Hound Show (1958, the first half-hour television program to feature only animation) and The Quick Draw McGraw Show (1959–1961).
Other notable programs include UPA's Gerald McBoing Boing (1956–1957), Soundac's Colonel Bleep (1957–1960, the first animated TV series in color), Terrytoons's Tom Terrific (1958), and Jay Ward's The Adventures of Rocky and Bullwinkle and Friends (1959–1964).
In contrast to the international film market (developed during the silent era when language problems were limited to title cards), TV-pioneering in most countries (often connected to radio broadcasting) focused on domestic production of live programs. Rather than importing animated series that usually would have to be dubbed, children's programming could more easily and more cheaply be produced in other ways (for instance, featuring puppetry). One notable method was the real-time "animation" of cutout figures in Captain Pugwash (1957) on the BBC. One of the few early animated series for TV that was seen abroad was Belvision Studios' Les Aventures de Tintin, d'après Hergé (Hergé's Adventures of Tintin) (Belgium 1957–1964, directed by Ray Goossens), broadcast by the BBC in 1962 and syndicated in the United States from 1963 to 1971.
Theatrical short cartoons in the 1950s
Warner Bros. introduced new characters Sylvester Jr. (1950), Speedy Gonzales (1953), Ralph Wolf and Sam Sheepdog (1953), and Tasmanian Devil (1954).
Theatrical feature animation in the 1950s
Disney
After a string of package features and live-action/animation combinations, Disney returned to fully animated feature films with Cinderella in 1950 (the first since Bambi). Its success practically saved the company from bankruptcy. It was followed by Alice in Wonderland (1951), which flopped at the box office and was critically panned. Peter Pan (1953) and Lady and the Tramp (1955) were hits. The ambitious, much-delayed and more expensive Sleeping Beauty (1959) lost money at the box office and caused doubts about the future of Walt Disney's animation department. Like Alice in Wonderland and most of Disney's other flops, it would later become commercially successful through re-releases and would eventually be regarded as a true classic.
Non-US
Jeannot l'intrépide (Johnny the Giant Killer) (France, 1950 feature)
Le Roi et l'Oiseau (The King and the Mockingbird) (France, 1952 unfinished feature release, 1980 finished release, influential for Hayao Miyazaki and Isao Takahata)
Animal Farm (U.K./U.S.A., 1954 feature)
乌鸦为什么是黑的 (Why Is the Crow Black-Coated) (China, 1956 short film, Venice Film Festival)
Снежная королева (The Snow Queen) (Soviet Union, 1957 feature)
Krtek (Mole) (Czechoslovakia, 1956 short film series)
白蛇伝 (Panda and the Magic Serpent) (Japan, 1958 feature)
少年猿飛佐助 (Magic Boy) (Japan, 1959 feature, first anime released in U.S. in 1961)
1960s
US animated TV series and specials in the 1960s
Total Television was founded in 1959 to promote General Mills products with original cartoon characters in Cocoa Puffs commercials (1960–1969) and the General Mills-sponsored TV series King Leonardo and His Short Subjects (1960–1963, repackaged shows until 1969), Tennessee Tuxedo and His Tales (1963–1966, repackaged shows until 1972), The Underdog Show (1964–1967, repackaged shows until 1973) and The Beagles (1966–1967). Animation for all series was produced at Gamma Studios in Mexico. Total Television stopped producing after 1969, when General Mills no longer wanted to sponsor them.
Many of the American animated TV series from the 1960s to 1980s were based on characters and formats that had already proved popular in other media. UPA produced The Dick Tracy Show (1961–1962), based on the comic strip. Filmation, active from 1962 to 1989, created few original characters, but produced many adaptations of DC Comics, live-action TV series (including Lassie's Rescue Rangers (1973–1975) and Star Trek: The Animated Series), some live-action features (including Journey to the Center of the Earth (1967–1969)), and much more. Grantray-Lawrence Animation was the first studio to adapt Marvel Comics superheroes, in 1966. Pop groups got animated versions in The Beatles (1965–1966) and Rankin/Bass's The Jackson 5ive (1971–1972) and The Osmonds (1972). Hanna-Barbera turned comedians into cartoon characters with Laurel and Hardy (1966–1967) and The Abbott and Costello Cartoon Show (1967–1968). Format Films' The Alvin Show (1961–1962) was a spin-off of a 1958 novelty song and the subsequent comic books, with redesigned versions of Alvin and the Chipmunks. Other series contained unlicensed appropriations: for instance, Hanna-Barbera's The Flintstones (1960–1966) was clearly inspired by the sitcom The Honeymooners, and creator Jackie Gleason considered suing Hanna-Barbera, but he did not want to be known as "the guy who yanked Fred Flintstone off the air".
The Flintstones was the first prime-time animated series and became immensely popular; it remained the longest-running network animated television series until that record was broken three decades later. Hanna-Barbera scored more hits with The Yogi Bear Show (1960–1962), The Jetsons (1962–1963, 1985, 1987) and Scooby-Doo, Where Are You! (1969–1970, later followed by other Scooby-Doo series).
From around 1968, after the assassinations of Martin Luther King Jr. and Robert F. Kennedy and other violent events had made the public less at ease with violence in entertainment, networks hired censors to ban anything deemed too violent or suggestive from children's programming.
Apart from regular TV series, there were several noteworthy animated television (holiday) specials, starting with UPA's Mister Magoo's Christmas Carol (1962), followed a few years later by other classic examples such as the string of Bill Melendez's Peanuts specials (1965–2011, based on Charles M. Schulz's comic strip) and Chuck Jones's How the Grinch Stole Christmas! (1966, based on the story by Dr. Seuss).
Cambria Productions
Cambria Productions only occasionally used traditional animation and would often resort to camera movements, real-time movements between foreground and background cels, and integration of live-action footage. Creator Clark Haas explained: "We are not making animated cartoons. We are photographing 'motorized movement' and—the biggest trick of all—combining it with live action.... Footage that Disney does for $250,000 we do for $18,000." Their most famous trick was the Syncro-Vox technique of superimposing talking lips on the faces of cartoon characters in lieu of animating mouths synchronized to dialogue. This optical printing system had been patented in 1952 by Cambria partner and cameraman Edwin Gillette and was first used for popular "talking animal" commercials. The method would later be widely used for comedic effect, but Cambria used it straight in their series Clutch Cargo (1959–1960), Space Angel (1962) and Captain Fathom (1965). Thanks to imaginative stories, Clutch Cargo was a surprise hit. Their last series, The New 3 Stooges (1965–1966), no longer used Syncro-Vox. It contained 40 new live-action segments with the original Three Stooges that were spread and repeated throughout 156 episodes together with new animation (occasionally causing people to turn off their TV when live-action footage was repeated, convinced that they had already seen the episode).
US theatrical animation in the 1960s
For One Hundred and One Dalmatians (1961), production costs were restrained, helped by the xerography process that eliminated the hand-inking of cels. Although Walt Disney personally did not appreciate the relatively sketchy look, it did not bother critics or audiences and the film was another hit for the studio. The Sword in the Stone (1963) was another financial success, but over the years it has become one of the least-known Disney features. It was followed by the live-action/animation hit Mary Poppins (1964), which received 13 Academy Award nominations, including Best Picture. Disney's biggest animated feature of the 1960s was The Jungle Book (1967), which was both a critical and commercial success. It was also the final animated film overseen by Walt Disney before his death in 1966. Without Walt's imagination and creative drive, the animation teams produced few successful films during the 1970s and 1980s, until the release of The Little Mermaid (1989), 22 years later.
UPA produced their first feature, 1001 Arabian Nights (1959), starring Mr. Magoo as Aladdin's uncle, for Columbia Pictures, with little success. They tried again with Gay Purr-ee in 1962, released by Warner Bros. It was well received by critics, but failed at the box office and would be the last feature the studio ever made.
Decline of the theatrical short cartoon
The Supreme Court ruling of the Hollywood Anti-trust Case of 1948 prohibited "block bookings" in which hit feature films were exclusively offered to theatre owners in packages together with newsreels and cartoons or live-action short films. Instead of receiving a reasonable percentage of a package deal, short cartoons had to be sold separately for the prices that theatre owners were willing to pay for them. Short cartoons were relatively expensive and could now be dropped from the program without people losing interest in the main feature, which became a sensible way to reduce costs when more and more potential movie-goers seemed to stay at home to watch movies on their television sets. Most cartoons had to be re-released several times to recoup the invested budget. By the end of the 1960s most studios had ceased producing theatrical cartoons. Even Warner Bros. and Disney, with occasional exceptions, stopped making short theatrical cartoons after 1969. Walter Lantz was the last of the classic cartoon producers to give up, when he closed his studio in 1973.
DePatie–Freleng
DePatie–Freleng Enterprises, founded by Friz Freleng and David H. DePatie in 1963 after Warner Bros. closed its animation department, was the only studio that found new success with theatrical short cartoon series after the 1950s. They created the Pink Panther in 1963 for the opening and closing credits of the live-action The Pink Panther film series featuring Peter Sellers. Its success led to a series of short films (1964–1980) and TV series (1969–1980). The Pink Panther was followed by the spin-off The Inspector (1965–1969), The Ant and the Aardvark (1969–1971) and a handful of other theatrical series. The Dogfather (1974–1976) was the last new series, but Pink Panther cartoons appeared in theaters until 1980, shortly before the demise of the studio in 1981. From 1966 to 1981 DePatie–Freleng also produced many TV series and specials.
Rise of anime
Japan was notably prolific and successful with its own style of animation, which became known in English initially as Japanimation and eventually as anime. In general, anime was made with limited-animation techniques that put more emphasis on aesthetic quality than on movement, in comparison to US animation. It also applied a relatively "cinematic" approach, with zooming, panning, complex dynamic shots and much attention to backgrounds that are instrumental in creating atmosphere.
Anime was first domestically broadcast on TV in 1960. Export of theatrical anime features started around the same time. Within a few years, several anime TV series were made that would also receive varying levels of airplay in the United States and other countries, starting with the highly influential 鉄腕アトム (Astro Boy) (1963), followed by ジャングル大帝 (Kimba the White Lion) (1965–1966), エイトマン (8th Man) (1965), 魔法使いサリー (Sally the Witch) (1966–1967) and マッハGoGoGo (Mach GoGoGo a.k.a. Speed Racer) (1967).
The domestically popular サザエさん / Sazae-san started in 1969 and is probably the longest-running animated TV show in the world, with more than 7,700 episodes.
Early adult-oriented and counterculture animation
Before the end of the 1960s, hardly any adult-oriented animation had been produced. A notable exception was the pornographic short Eveready Harton in Buried Treasure (1928), presumably made by famous animators for a private party in honour of Winsor McCay, and not publicly screened until the late 1970s. After 1934, the Hays Code gave filmmakers in the United States little leeway to release risqué material, until the code was replaced by the Motion Picture Association of America film rating system in 1968. While television programming of animation had made most people think of it as a medium for children or for family entertainment, new theatrical animations proved otherwise.
Arguably, the philosophical, psychological, and sociological overtones of the Peanuts TV specials were relatively adult-oriented, while the specials were also enjoyable for children. In 1969 director Bill Melendez expanded the success of the series to cinemas with A Boy Named Charlie Brown. The theatrical follow-up Snoopy Come Home (1972) was a box-office flop, despite positive reviews. Race for Your Life, Charlie Brown (1977) and Bon Voyage, Charlie Brown (and Don't Come Back!!) (1980) were the only other theatrical traditionally animated Peanuts feature films, while the TV specials continued until 2011.
The anti-establishment counterculture boom at the end of the 1960s impacted Hollywood early on. In animation, anti-war sentiments were clearly present in several short underground films like Ward Kimball's Escalation (1968) (made independently from his employment at Disney) and the parody Mickey Mouse in Vietnam (1969). The less political parody Bambi meets Godzilla (1969) by Marv Newland, another underground short film for adults, is considered a great classic and was included in The 50 Greatest Cartoons (1994) (based on a poll of 1,000 people working in the animation industry).
The popularity of psychedelia reportedly made the 1969 re-release of Disney's Fantasia popular among teenagers and college students, and the film started to make a profit. Similarly, Disney's Alice in Wonderland became popular with TV screenings in this period and with its 1974 theatrical re-release.
Also influenced by the psychedelic revolution, The Beatles' animated musical feature Yellow Submarine (1968) showed a broad audience how animation could be quite different from the well-known television cartoons and Disney features. Its distinctive design came from art director Heinz Edelmann. The film received widespread acclaim and would prove to be influential. Peter Max further popularized a similar visual style in his artworks.
Non-US animation in the 1960s
わんぱく王子の大蛇退治 (The Little Prince and the Eight-Headed Dragon) (Japan, 1963 feature)
大鬧天宮 (Havoc in Heaven) (China, 1963 feature)
ガリバーの宇宙旅行 (Gulliver's Travels Beyond the Moon) (Japan, 1965 feature)
Calimero (Italy/Japan 1963–1972, TV series)
Belvision's Pinocchio in Outer Space (Belgium/USA 1965, feature directed by Ray Goossens)
West and Soda (Italy 1965, first feature by Bruno Bozzetto)
1970s
Breakthrough of adult-oriented and counterculture feature animation
Ralph Bakshi thought that the idea of "grown men sitting in cubicles drawing butterflies floating over a field of flowers, while American planes are dropping bombs in Vietnam and kids are marching in the streets, is ludicrous." He therefore created a more sociopolitical type of animation, starting with Fritz the Cat (1972), based on Robert Crumb's comic books and the first animated feature to receive an X rating. The X rating was used to promote the film, and it became the highest-grossing independent animated film of all time. The success of Heavy Traffic (1973) made Bakshi the first director since Disney to have two financially successful animated feature films in a row. The film blended techniques: still photography served as the background in parts, a live-action scene of models with painted faces was rendered in negative, one scene used very limited, sketchy and only partly coloured animation, and detailed drawing and archival footage were mixed with characters animated in a consistent cartoon style, except for the last ten minutes, which were filmed as standard live action. Bakshi continued to experiment with different techniques in most of his later projects. Hey Good Lookin' (finished in 1975, but shelved by Warner Bros. until its release in an adjusted version in 1982) and Coonskin (1975, which suffered from protests against its perceived racism while actually satirizing racism) were far less successful, but received more appreciation later on and became cult films.
Bakshi found new success with the fantasy films Wizards (1977) and The Lord of the Rings (1978). Both used rotoscoping for massive battle scenes. For Wizards the technique was applied to archival footage as a solution to budgetary problems and rendered in a psychedelic, artistic style. For The Lord of the Rings it became a means to create a look that Bakshi described as "real illustration as opposed to cartoons" for a film he wanted to be true to Tolkien's work, with reference material shot with costumed actors in Spain. The more family-oriented television film The Return of the King (1979) by Rankin/Bass and Topcraft is sometimes regarded as an unofficial sequel after Bakshi's intended second part was not made, though Rankin/Bass had already independently started adapting the story for television with The Hobbit in 1977.
The imaginative French/Czech science fiction production La Planète sauvage (1973) was awarded the Grand Prix special jury prize at the 1973 Cannes Film Festival, and in 2016, it was ranked the 36th-greatest animated movie ever by Rolling Stone.
The British production Watership Down (1978) was a huge international success. It featured animal characters that looked more realistic than anthropomorphic, set against watercolour backgrounds. Despite its dark and violent aspects, it was deemed suitable for all ages in the UK and rated PG in the United States.
Anime in Europe
Anime imports offered relatively cheap animated series, but some European broadcasters thought of animation as something for young children and all too easily programmed anime series accordingly. This led to much criticism when some programs were deemed too violent for children. Child-friendly adaptations of European stories were much more successful in Europe, with popular titles such as アルプスの少女ハイジ (Heidi, Girl of the Alps) (1974) and みつばちマーヤの冒険 (Maya the Honey Bee) (1975).
Only a few animation studios were active in Europe and starting a new studio required much time, effort and money. For European producers interested in animated series, it made sense to collaborate with Japanese studios who could provide affordable animation in relatively high quality. Resulting productions include Barbapapa (The Netherlands/Japan/France 1973–1977), Wickie und die starken Männer/小さなバイキング ビッケ (Vicky the Viking) (Austria/Germany/Japan 1974), Il était une fois... (Once Upon a Time...) (France/Japan 1978) and Doctor Snuggles (The Netherlands/West Germany/Japan/US 1979).
Artistic short-animation highlights
Short animated films mostly became a medium for film festivals in which independent animators showcased their talents. With the big studios away from the field, the Academy Award for Best Animated Short Film and nominations of the 1970s and 1980s were usually for relatively unknown artists.
La Linea (Italy 1971, 1978, 1986) is a popular animation series with a main character that consists of a part of an otherwise straight white line that runs horizontally across the screen.
Soviet/Russian animator Yuri Norstein "is considered by many to be not just the best animator of his era, but the best of all time". He released a handful of award-winning short films in the 1970s:
The Battle of Kerzhenets (1971), in collaboration with Ivan Ivanov-Vano
The Fox and the Hare (1973)
The Heron and the Crane (1974)
Hedgehog in the Fog (1975)
Tale of Tales (1979)
Since 1981, he has been working on The Overcoat.
Early animated music videos
Although the combination of music and animation had a long tradition, it took some time before animation became a regular part of music videos after the format established itself in the mid-1970s.
Halas and Batchelor produced an animated video for Roger Glover's Love Is All (1974) that was broadcast internationally over decades, often as an interstitial program.
Pink Floyd's 1977 Welcome to the Machine music video, animated by Gerald Scarfe, was initially only used as backdrop for concert performances.
Elvis Costello's Accidents Will Happen (1979) was made by Annabel Jankel and Rocky Morton, known for their animated commercials. Despite an initially lukewarm reception, the video has since received acclaim.
Roger Mainwood and John Halas created an animated music video for Kraftwerk's Autobahn in 1979. The short wordless documentary Making it move... showed the production process.
A cartoon for Linda McCartney's Seaside Woman was made by Oscar Grillo and won a Palme d'Or for Best Short Film at the Cannes festival in 1980.
1980s
US animation's low point (early 1980s)
Animation for US TV programming had grown formulaic, often based on characters known from other media, and with much of the actual (limited) animation work outsourced to low-cost studios in Asia.
Several popular animated TV series for children could be perceived as little more than commercials, since they were based on toy lines, including Mattel's He-Man and the Masters of the Universe (1983–1985) and Hasbro's G.I. Joe (1983–1986), The Transformers (1984–1987) and My Little Pony (1986–1987). Don Bluth, who had left Disney in 1979 together with nine other animators, began competing with his former employer in cinemas in 1982 with The Secret of NIMH. The film garnered critical acclaim but was only a modest success at the box office.
Mostly in retrospect, Disney feature films have been perceived as going through a dark age in the first decades after Walt Disney's death in 1966 (despite a steadier string of box office successes than during the decades in which Walt was alive). The failure of The Black Cauldron (1985), made on an ambitious budget, was clearly a new low. Tim Burton cited Disney's failure to train new animators during the 1960s and early 1970s as a reason for the decline, with Disney relying instead on an aging group of veterans.
Europe
In comparison to the US animation output around the start of the 1980s, international co-productions seemed more imaginative and more promising. The Smurfs (1981–1989), produced by Belgian Freddy Monnickendam's SEPP International in collaboration with Hanna-Barbera, was highly successful, and followed by Snorks (1984–1989) and Foofur (1986–1988). Production for Bzz Films' Bibifoc (Seabert) (1984–1988) was also handled by SEPP. Other notable international co-productions include Inspector Gadget (France/U.S.A. 1983) and The Wonderful Wizard of Oz (Canada/Japan 1986–1987).
Studio Ghibli and TV anime
Anime, together with printed manga, had built an enormous fandom in Japan and became a big part of the country's mainstream culture. Among anime's many genres, mecha (giant-robot science fiction) became particularly iconic. Print manga in particular entered a golden age during the 1980s, buoyed by series such as Dragon Ball (1984–1995), and these series received long-running, successful anime adaptations. The relatively new home video market grew very large and original video animation (OVA) became a much-appreciated medium, often with higher-quality productions than those made for TV (in contrast to the US, where direct-to-video was mainly a medium for releases that were not expected to be popular enough to warrant a theatrical release or TV broadcast and were therefore often produced on a much lower budget). Naturally, the OVA medium suited the consumption of erotic and pornographic animation. The first erotic OVA release was the ロリータアニメ (Lolita Anime) series from February 1984 to May 1985, soon followed by the Cream Lemon series (August 1984 – 2005). The genre became internationally known as hentai and is infamous for often containing perverse subject matter, including underage sex, monster sex and tentacle sex (the latter originally devised as a means to bypass Japanese censorship regulations). New anime series based on European material included ニルスのふしぎな旅 (The Wonderful Adventures of Nils) (1980–1981) and スプーンおばさん (Mrs. Pepper Pot) (1983–1984).
Hayao Miyazaki's epic theatrical features Nausicaä of the Valley of the Wind (1984), based on his own manga, and 天空の城ラピュタ (Castle in the Sky) (1986) are regularly praised as some of the greatest animated films of all time. Castle in the Sky was the first feature for Studio Ghibli, founded in 1985 by Miyazaki with Isao Takahata and others. Studio Ghibli continued its success with Takahata's WWII film 火垂るの墓 (Grave of the Fireflies) (1988) and Miyazaki's iconic となりのトトロ (My Neighbor Totoro) (1988) and 魔女の宅急便 (Kiki's Delivery Service) (1989).
Renaissance of US animation
Beginning in the mid-1980s, US animation would see a renaissance. This has been credited to a wave of talent that emerged from the California Institute of the Arts, primarily among the cohort that had studied there in the 1970s under Marc Davis, a member of Disney's Nine Old Men, who had influenced the language of animation in the 1920s and 1930s. At this time, many of the nine were in the process of retiring. As an inside joke, many students of classroom A113 have inserted the room code into their films, television series and so forth in the years since. The students of A113 include Jerry Rees, John Lasseter, Tim Burton, Michael Peraza, and Brad Bird. Two other members of the Nine Old Men, Ollie Johnston and Frank Thomas, published The Illusion of Life in 1981, an instructional book that has been voted the best animation book of all time and influenced James Baxter among other modern animators.
In cinemas, Robert Zemeckis' live-action/animation hit Who Framed Roger Rabbit (1988) also harked back to the quality and zany comedy of the golden age of cartoons, with cameos of many of the superstars of that era, including Mickey, Minnie, Donald, Goofy, Betty Boop, Droopy, Woody Woodpecker and the Mel Blanc-voiced Bugs Bunny, Daffy Duck, Porky Pig, Tweety and Sylvester. The film won several Oscars and helped revive interest in theatrical feature animation and the classic cartoons. The fully animated Roger Rabbit short film Tummy Trouble (1989) was then packaged with the live-action family comedy Honey, I Shrunk the Kids and believed to have helped that movie's quick start at the box-office. In collaboration with Steven Spielberg's Amblin Entertainment, Bluth's An American Tail (1986) became the highest-grossing non-Disney animated film at the time. The Land Before Time (1988) was equally successful, but Bluth's next five feature films flopped.
Mighty Mouse: The New Adventures (1987–1989) was one of the first animated TV shows to recapture the earlier quality and originality of American cartoons. It was produced by Ralph Bakshi and the first season was supervised by John Kricfalusi, with much freedom for artists to work in their own style. Rather than making a nostalgic rehash of the original Terrytoons series, it tried to recreate the quality and the zany humour of the Looney Tunes classics. Matt Groening's The Simpsons started in April 1987 as a short segment in the sketch comedy show The Tracey Ullman Show, and then launched as a separate prime-time half-hour sitcom in December 1989. It became one of the biggest cartoon hits in history and is the longest-running scripted US primetime television series.
While the successes of The Great Mouse Detective (1986) and Oliver and Company (1988) had already helped to get the Disney studio back on track, the studio struck gold with the box office record-breaking hit The Little Mermaid (1989). A shot for the rainbow sequence at the end of The Little Mermaid was the first piece of feature animation to be created with the Computer Animation Production System (CAPS) that Disney and Pixar had collaboratively assembled. This digital ink and paint system replaced the expensive method of inking and colouring cels by hand, and provided filmmakers with new creative tools. By 1990, the boom of animated hits was heralded as a comeback that might rival the golden age of cartoons.
Adult-oriented theatrical animation in the 1980s
Bakshi's rock musical American Pop (1981) was another success, mostly made with the rotoscope technique in combination with some watercolors, computer graphics, live-action shots, and archival footage. His next film Fire and Ice (1983) was a collaboration with artist Frank Frazetta. It was one of many films in the sword and sorcery genre released after the success of Conan the Barbarian (1982) and The Beastmaster (1982). Critics appreciated the visuals and action sequences, but not its script, and the film flopped at the box office. After failing to get several projects off the ground, Bakshi retired for a few years.
The Canadian anthology hit film Heavy Metal (1981) was based on comics published in the popular Heavy Metal magazine and co-produced by its founder. Mixed reviews thought the film was uneven, juvenile and sexist. It was eventually followed in 2000 by the poorly received Heavy Metal 2000 and re-imagined as the Netflix series Love, Death & Robots in 2019.
The dark rock opera film Pink Floyd – The Wall (1982) contained 15 minutes worth of animated segments by British cartoonist Gerald Scarfe, who had already designed related artwork for the 1979 album and 1980-81 concert tour. Some of the film's animated material was previously used for the 1979 music video for "Another Brick in the Wall: Part 2" and for the tour. Scarfe had also made animations for Pink Floyd's 1977 In the Flesh tour.
The successful British nuclear disaster film When the Wind Blows (1986) showed hand-drawn characters against real backgrounds, with stop-motion for objects that moved.
The violent post-apocalyptic cyberpunk anime Akira (1988) garnered increased popularity of anime outside Japan and is now widely regarded as a classic.
MTV and animated videos
MTV launched in 1981 and further popularized the music-video medium, which allowed relatively much artistic expression and creative techniques, since all involved wanted their video to stand out. Many of the most celebrated music videos of the 1980s featured animation, often created with techniques that differed from standard cel animation. For instance, the iconic video for Peter Gabriel's Sledgehammer (1986) featured claymation, pixilation, and stop motion by Aardman Animations and the Brothers Quay.
A-ha's "Take On Me" (1985) famously combined live-action with realistic pencil-drawing animation by Michael Patterson. The video was directed by Steve Barron, who would also direct the groundbreaking computer-animated Dire Straits "Money for Nothing" in the same year. The a-ha video was inspired by Alex Patterson's CalArts graduation film Commuter (1984), which had attracted the attention of Warner Bros. records executives and would be partly used again for A-ha's Train of Thought video.
Patterson also directed Paula Abdul's Opposites Attract (1989), featuring his animated creation MC Skat Kat.
The Rolling Stones' "The Harlem Shuffle" (1986) featured animated elements directed by Ralph Bakshi and John Kricfalusi, created in a few weeks.
The original moon-landing bumpers on MTV were pulled in early 1986 in the wake of the Challenger disaster. MTV then furthered its wild, artistic, postmodern image through a plethora of experimental ident bumpers, most of them animated. Animators usually went uncredited, but were free to work in their own identifiable styles. For instance, Canadian animator Danny Antonucci's anonymous contribution featured his Lupo the Butcher character, who was allowed to utter his psychotic ramblings.
From around 1987 MTV had a dedicated Animation department and slowly started introducing more animation in between its music-related programming. Bill Plympton's Microtoons is an early example.
1990s
Disney Renaissance
The 1990s saw Disney release numerous films that were both critically and commercially successful, returning to heights not seen since their heyday from the 1930s to 1960s. The period from 1989 to 1999 is now referred to as the Disney Renaissance or the Second Golden Age. Their success led other major film studios to establish new animation divisions such as Amblimation, Fox Animation Studios or Warner Bros. Feature Animation to replicate Disney's success by turning their animated films into Disney-styled musicals.
Disney's Beauty and the Beast (1991) (the first animated film in history to be nominated for the Academy Award for Best Picture), Aladdin (1992) and The Lion King (1994) successively broke box-office records. Pocahontas (1995) opened to mixed reviews but was a financial success and received two Academy Awards. Mulan (1998) and Tarzan (1999) did not surpass The Lion King as the highest-grossing (traditionally) animated film of all time, but were both successful, grossing over $300 million worldwide. The Hunchback of Notre Dame (1996) was a financial success at the time but contained very dark and adult themes and has since become one of Disney's lesser-known films. Only the sequel The Rescuers Down Under (1990) and Hercules (1997) underperformed box-office expectations. From 1994 onward, Disney continued to produce feature-length sequels to successful titles, but only as direct-to-video releases.
Television
John Kricfalusi's influential The Ren & Stimpy Show (1991–1995) garnered widespread acclaim. For a while it was the most popular cable TV show in the United States. Programmed as a children's cartoon, it was notoriously controversial for its dark humor, sexual innuendos, adult jokes, and shock value. The enormous success of The Simpsons and The Ren & Stimpy Show prompted more original and relatively daring series, including South Park (since 1997), King of the Hill (1997–2010), Family Guy (since 1999), and Futurama (1999–2003). The use of animation on MTV increased when the channel started to make more and more shows that did not really fit their "music television" moniker. Liquid Television (1991 to 1995) showcased contributions that were mostly created by independent animators specifically for the show and spawned separate Æon Flux and Beavis and Butt-Head (1993–1997) series. Other 1990s cartoon series on MTV included The Head (1994-1996) and The Maxx (1995), both under the MTV's Oddities banner. By 2001, MTV closed its animation department, began to outsource their animated series and eventually imported shows from associated networks. The 24-hour cable channel Cartoon Network was launched in the United States on 1 October 1992 and was soon followed by its first international versions. Originally the programming consisted of classic cartoons from the back catalogues of Warner Bros, MGM, Fleischer/Famous and Hanna-Barbera. From 1996 to 2003, new original series ran as Cartoon Cartoons and introduced the popular titles Dexter's Laboratory (1996–2003), Johnny Bravo (1997–2004), Cow and Chicken (1997–1999), I Am Weasel (1997–2000), The Powerpuff Girls (1998–2005) and Ed, Edd n Eddy (1999–2009). Television animation for children also continued to flourish in the United States on specialized cable channels like Nickelodeon, Disney Channel/Disney XD, PBS Kids, and in syndicated afternoon time slots.
2000s–2010s: In the shadow of computer animation
After the success of Pixar's Toy Story (1995) and DreamWorks Animation's Shrek (2001), computer animation grew into the dominant animation technique in the US and many other countries. Even animation that looked traditional was more and more often created fully with computers. By 2004, only small productions were still created with traditional techniques.
The first decades of the 21st century also saw 3D film turn mainstream in theatres. The production process and visual style of CGI lend themselves well to 3D viewing, much more than traditional animation styles and methods. However, many traditionally animated films can be very effective in 3D. Disney successfully released a 3D version of The Lion King in 2011, followed by Beauty and the Beast in 2012. A planned 3D version of The Little Mermaid was cancelled when Beauty and the Beast and two 3D-converted Pixar titles were not successful enough at the box office.
Disney-Pixar
Disney started producing their own 3D-style computer animated features with Dinosaur and Chicken Little, but continued to make animated features with a traditional look: The Emperor's New Groove (2000), Atlantis: The Lost Empire (2001), Lilo & Stitch (2002), Treasure Planet (2002), Brother Bear (2003) and Home on the Range (2004).
Treasure Planet and Home on the Range were big flops on big budgets and it looked like Disney would only continue with 3D computer animation. Financial analysis in 2006 proved that Disney had actually lost money on their animation productions of the previous ten years. In the meantime, Pixar's CGI features did extremely well. To turn things around Disney acquired Pixar in 2006, and put creative control over both Pixar and Walt Disney Animation Studios in the hands of Pixar's John Lasseter as part of the deal. The studios would remain separate legal entities. Under Lasseter, the Disney studio developed both traditionally styled and 3D-styled animation projects.
The theatrical short How to Hook Up Your Home Theater (2007) tested whether new paperless animation processes could be used for a look similar to cartoons of the 1940s and 1950s, with Goofy returning to his "Everyman" role in his first solo appearance in 42 years.
Ron Clements and John Musker's feature The Princess and the Frog (2009) was a moderate commercial and critical success, but not the comeback hit for traditional features that the studio had hoped it would be. Its perceived failure was mostly blamed on the use of "princess" in the title, which caused potential movie-goers to think the film was old-fashioned and intended only for little girls.
Winnie the Pooh (2011) received favourable reviews, but failed at the box office and became Disney's last traditional feature to date. Frozen (2013) was originally conceived in the traditional style, but switched to 3D CGI to enable the creation of certain required visual elements. It became Disney's biggest hit at the time, surpassing both The Lion King and Toy Story 3 as the highest-grossing animated film of all time, and won the studio's first Academy Award for Best Animated Feature.
Anime
Outside North America, hand-drawn animation continued to be more popular, most notably in Japan, where traditionally styled anime remained the dominant technique. The popularity of anime continued to rise domestically, with a record-high 340 anime series airing on television in 2015, as well as internationally, with a dedicated Toonami block on Cartoon Network (1997–2008) and Adult Swim (since 2012) and with streaming services like Netflix and Amazon Prime licensing and producing an increasing amount of anime.
Ghibli continued its enormous success with Miyazaki's Spirited Away (2001), ハウルの動く城 (Howl's Moving Castle) (2004), 崖の上のポニョ (Ponyo) (2008) and 風立ちぬ (The Wind Rises) (2013), as well as Hiromasa Yonebayashi's 借りぐらしのアリエッティ (The Secret World of Arrietty) (2010), all grossing more than $100 million worldwide and appearing in the top 10 of the highest-grossing anime films of all time (as of 2020). Takahata's かぐや姫の物語 (The Tale of the Princess Kaguya) (2013) was nominated for the Academy Award for Best Animated Feature and many other awards.
Makoto Shinkai directed 君の名は。(Your Name) (2016, highest-grossing anime film of all time internationally) and 天気の子 (Weathering with You) (2019).
Comparison to stop motion
After the pioneering work by the likes of J. Stuart Blackton, Segundo de Chomón and Arthur Melbourne-Cooper, stop motion became a branch of animation that has been much less dominant than hand-drawn animation and computer animation. Nonetheless, there have been many successful stop motion films and television series. Among the animators whose work with animated puppets has received the highest acclaim are Wladyslaw Starewicz, George Pal and Henry Selick. Popular titles using animated clay include Gumby (1953), Mio Mao (1970), The Red and the Blue (1976), Pingu (1990-2000) and many Aardman Animations productions (Morph (1977) and Wallace and Gromit (1989)).
In the hands of influential filmmakers such as Jan Svankmajer and Brothers Quay, stop motion has been regarded as a highly artistic medium.
Until largely replaced by computer-animated effects, stop motion was also a popular technique for special effects in live-action films. Pioneer Willis O'Brien and his protégé Ray Harryhausen animated many monsters and creatures for live-action Hollywood films, using models or puppets with armatures. In comparison, hand-drawn animation has relatively often been combined with live-action, but usually in an obvious fashion and often used as a surprising gimmick that combines a "real" world and a fantasy or dream world. Only very occasionally has hand-drawn animation been used as convincing special effects (for instance in the climax of Highlander (1986)).
Comparison to cutout animation
Cutout techniques were relatively often used in animated films until cel animation became the standard method (at least in the United States). The earliest animated feature films, by Quirino Cristiani and Lotte Reiniger, were cutout animations.
Before 1934, Japanese animation mostly used cutout techniques rather than cel animation, because celluloid was too expensive.
As cutouts often have been hand-drawn and some productions combine several animation techniques, cutout animation can sometimes look very similar to hand-drawn traditional animation.
While sometimes used as a simple and cheap animation method in children's programs (for instance in Ivor the Engine), cutout animation has remained a relatively artistic and experimental medium in the hands of for instance Harry Everett Smith, Terry Gilliam and Jim Blashfield.
Today, cutout-style animation is frequently produced using computers, with scanned images or vector graphics taking the place of physically cut materials. South Park is a notable example of the transition since its pilot episode was made with paper cutouts before switching to computer software. Similar stylistic choices and blends with different techniques in computer animation have made it harder to differentiate between "traditional", cutout and Flash animation styles.
Computer animation
Early experiments with computers to generate (abstract) moving images have been conducted since the 1940s.
The earliest known interactive electronic game was developed in 1947, paving the way for a medium that can be regarded as an interactive branch of computer animation (which is quite different from animated movies).
A short vector animation of a car traveling down a planned highway was broadcast on Swedish national television on 9 November 1961.
In 1968 Soviet physicists and mathematicians created a mathematical model for the motion of a cat, with which they produced a short animated film.
In 1971 the first commercial (coin-operated) video game was marketed. The next year, Pong by Atari, Inc., with very simple two-dimensional graphics, became a huge success.
Since the 1970s digital image processing and computer-generated imagery, including early 3D wire-frame model animations, were occasionally used in commercials as well as for the representation of futuristic computer technology in big Hollywood productions (including Star Wars).
Since 1974 the annual SIGGRAPH conventions have been organised to demonstrate current developments and new research in the field of computer graphics.
3D animation started to have more cultural impact during the 1980s, demonstrated for instance in the 1982 movie Tron and the music video for "Money for Nothing" (1985) by Dire Straits. The concept even spawned a popular faux 3D-animated AI character: Max Headroom.
During the 1990s, 3D animation became more and more mainstream, especially in video games, and eventually had a big breakthrough in 1995 with Pixar's feature film hit Toy Story.
More or less photo-realistic 3D animation has been used for special effects in commercials and films since the 1980s. Breakthrough effects were seen in Terminator 2: Judgment Day (1991) and Jurassic Park (1993). Since then techniques have developed to the stage that the difference between CGI and real life cinematography is seldom obvious. Filmmakers can blend both types of images seamlessly with virtual cinematography. The Matrix (1999) and its two sequels are usually regarded as breakthrough films in this field.
Due to the complexity of human body functions, emotions and interactions, movies with important roles for fully 3D-animated realistic-looking human characters have been rare. The more realistic a CG character becomes, the more difficult it is to create the nuances and details of a living person, and the greater the likelihood of the character falling into the uncanny valley. Films that have attempted to create realistic-looking humans include Final Fantasy: The Spirits Within in 2001, Final Fantasy: Advent Children in 2005, The Polar Express in 2004, Beowulf in 2007 and Resident Evil: Degeneration in 2009.
The creation of virtual worlds allows real-time animation in virtual reality, a medium that has been experimented with since 1962 and started to see commercial entertainment applications in the 1990s.
In the first decades of the 21st century, computer animation techniques slowly became much more common than traditional cel animation. To recreate the much-appreciated look of traditional animation in 3D computer animation, cel-shading techniques were developed. True real-time cel-shading was first introduced in 2000 by Sega's Jet Set Radio for their Dreamcast console.
Other developments per region
Americas
History of Cuban animation
1970: Juan Padrón creates the character of Elpidio Valdés, star of a long-running series of shorts and two motion pictures.
1985: Juan Padrón's ¡Vampiros en la Habana!
1992: An animation category is added to the Festival Internacional del Nuevo Cine Latinoamericano.
History of Mexican animation
1935: Alfonso Vergara produces Paco Perico en premier, an animated short film.
1974: Fernando Ruiz produces Los tres reyes magos, Mexico's first animated feature-length film.
1977: Anuar Badin creates the film Los supersabios, based on the comic.
1983: Roy del espacio
2003: Ánima Estudios releases Magos y gigantes, a full-length Mexican animated feature, after many years of hiatus in the country's industry.
Modern animation in the United States (1986 through present)
Success of Disney animated series: The Disney Afternoon (1985–1997).
Steven Spielberg's collaborations with Warner Bros. Animation (1990–1999).
The decline of Saturday-morning cartoons in the 1990s.
Cartoon Network's late-night animation block Adult Swim becomes immensely popular and leads to a resurgence in short, adult animation.
"Disney Revival" films (2009-2018).
Europe
History of Estonian animation
Estonian animation began in the 1930s and has carried on into the modern day.
1931 – The Adventures of Juku the Dog, first Estonian animated short film
1950s – founding of puppet animation division of Tallinnfilm by Elbert Tuganov
1970s – founding of drawn animation division, Joonisfilm, by Rein Raamat
History of Italian animation
1914: First use of stop-motion animation as special effects in Cabiria
1936: The Adventures of Pinocchio, unfinished, considered lost
1949: The first two Italian animated movies are released: La Rosa di Bagdad directed by Anton Gino Domeneghini and The Dynamite Brothers directed by Nino Pagot
1962: The Italian animated cartoon art and industry (La Linea, Mio Mao, Calimero...) is born.
1977: The animated Italian classic, Allegro non troppo, is both a parody of and homage to Disney's Fantasia; this is director Bruno Bozzetto's most ambitious work and his third feature-length animation, after West and Soda, an animated Spaghetti Western, and VIP my Brother Superman, a parody of superheroes, although he also directed several notable shorter works including Mr. Rossi and the Oscar-nominated Grasshoppers (Cavallette).
2001: Magic Bloom, the pilot for Winx Club
2004: Winx Club, produced by Rainbow S.p.A.
2009: Huntik: Secrets & Seekers
2016: Regal Academy
History of animation in Croatia (in former Yugoslavia)
1953: Zagreb Film inaugurates the Zagreb school of animation.
1975: Škola Animiranog Filma Čakovec (ŠAF) inaugurates the Čakovec school of animation.
Asia
Animation was part of Chinese cinema as early as the 1920s, as seen in extant films. Princess Iron Fan (1941), by the Wan brothers, is said to be China's first full-length animated film.
Oceania
History of Australian animation
See: Animal Logic, Yoram Gross, Flying Bark Productions
1977: Dot and the Kangaroo
1979: The Little Convict
1982: The Seventh Match (also known as Sarah)
1984: The Camel Boy
1984: Epic: Days of the Dinosaurs (also known as EPIC)
1991: The Magic Riddle
1992: Blinky Bill: The Mischievous Koala
1992: FernGully: The Last Rainforest
2000: The Magic Pudding
2006: Happy Feet (co-production with America)
History of New Zealand animation
See: Weta Digital
1986: Footrot Flats: The Dog's Tale
2015: 25 April
2019: Mosley
Media
Notes
References
Citations
Works cited
Bibliography
Online sources
External links
Chinese Film Classics: Animation and cartoons (manhua): Examples and discussion of animation in the early Chinese film industry, from the scholarly website chinesefilmclassics.org
History of film
Articles containing video clips
Animation |
43154124 | https://en.wikipedia.org/wiki/CodeHS | CodeHS | CodeHS is an interactive online learning platform offering computer science and programming instruction for schools and individual learners. CodeHS is focused on spreading access to and knowledge of computer science by offering online instructional materials supported by remote tutors. In the introductory learning module, students on the site practice computer science concepts and programming skills by giving commands to a dog named Karel. Similar to the original Karel programming language developed by Richard E. Pattis, Karel the dog must complete various tasks by moving around a grid world, and putting down and picking up tennis balls using only simple commands. Later learning modules teach more advanced concepts using languages like JavaScript, Java, and HTML.
History
CodeHS was founded in 2012 by Jeremy Keeshin and Zach Galant, both Stanford University Computer Science graduates. Keeshin and Galant based CodeHS on their experience as section leaders and teaching assistants for several of Stanford's introductory computer science courses. The company joined the Imagine K12 incubator's third class, launching in October 2012, and its investors include NewSchools Venture Fund, Seven Peaks Ventures, Kapor Capital, Learn Capital, Imagine K12, Marc Bell Ventures, and Lighter Capital. In total, CodeHS has raised $2.9 million as of December 2016.
NBC Education Nation
CodeHS was selected as one of three education technology companies to take part in the 2013 Innovation Challenge, part of the NBC Education Nation initiative. Innovation Challenge participants CodeHS, Teachley, and GigaBryte took part in a series of challenges in October 2013, culminating in a pitch contest broadcast live on NBC during the Education Nation Summit. CodeHS won the Innovation Challenge, earning a $75,000 prize awarded by the Robin Hood Foundation.
Hour of Code
During the week of December 9, 2013, CodeHS participated in the nationwide Hour of Code challenge promoted by Code.org. CodeHS was featured as a tutorial for learning JavaScript on the Computer Science Education Week website. Over the course of the week, an estimated 116,648 participants started learning to code for an hour on CodeHS.
Karel the Dog
The first learning module on CodeHS teaches introductory programming concepts by having students give basic commands to Karel the Dog using Karel-specific JavaScript commands. This approach is based on the original Karel programming language developed by Richard E. Pattis and is used in Stanford University's introductory computer science classes. Karel initially knows only a few basic commands: move() to have Karel move one spot forward, turnLeft() to have Karel turn left, putBall() to have Karel put down one tennis ball in the current spot, and takeBall() to have Karel pick up one tennis ball from the current spot. Karel can be "taught" additional commands by defining new functions composed of these basic commands. The programmer can also use elements like loops and conditionals to control the flow of the program.
After Karel the dog, Tracy the turtle was introduced.
SuperKarel
Karel evolves into SuperKarel and gains the ability to turnRight() and turnAround().
Example
The following is an example of a simple program to have Karel repeat a series of commands (put down a tennis ball, move, turn left, move, then turn right) three times:
function start() {
    for (var i = 0; i < 3; i++) {
        putBall();
        move();
        turnLeft();
        move();
        turnRight();
    }
}

function turnRight() {
    turnLeft();
    turnLeft();
    turnLeft();
}
Reception
CodeHS received significant media coverage upon its launch, including articles in Forbes, TechCrunch, and Education Week. The site has also been featured on various blogs for its interactive and beginner-focused approach to teaching programming.
See also
Blended learning
Karel (programming language)
Code.org
Codecademy
CodeCombat
Code Avengers
Khan Academy
Team Treehouse
Udacity
References
External links
CodeHS
CodeHS Blog
American educational websites
Computer programming
Privately held companies of the United States |
47792762 | https://en.wikipedia.org/wiki/Stina%20Ehrensv%C3%A4rd | Stina Ehrensvärd | Stina Ehrensvärd is a Swedish-American entrepreneur, innovator and industrial designer. She is the founder and CEO of Yubico and co-inventor of the YubiKey authentication device.
Biography
Ehrensvärd was born in the United States. Her father, who was an architect like her mother, spent a year at the University of Washington in Seattle, undertaking research on urban planning and computer graphics. The following year, the family moved back to Lund, Sweden, where she grew up with three siblings. She went on to study industrial design at the Konstfack University College of Arts, Crafts and Design in Stockholm. It was around this time that she met her husband-to-be Jakob Ehrensvärd, an electronics enthusiast. They now have three children.
The two began collaborating on a series of innovations combining their design and computing talents. Their first significant joint development was Cypak, an intelligent pharmaceutical packaging system that did not take off. In 2007, the couple founded Yubico, and began manufacturing the YubiKey authentication device for account logins. The YubiKey quickly gained worldwide popularity and attracted millions of users, including nine of the top ten internet companies. In 2011, the couple moved to Palo Alto to become part of the Silicon Valley IT scene.
Yubico is a leading contributor to the FIDO Universal 2nd Factor (U2F) open authentication standard (co-authored with Google), and invented the concept of having one authentication device access any number of online services with no shared secrets. Under Ehrensvärd’s guidance, Yubico is the innovator behind driverless one-time password (OTP) authentication, PIV smart cards with touch-to-sign, and Hardware Security Modules that sit inside standard USB-ports.
Ehrensvärd also frequently speaks on internet identity and entrepreneurship. In 2013, she was listed in the monthly magazine Inc. as one of the "10 Women to Watch in Tech in 2013". The following year, Yubico was awarded the Swedish Innovation Award, and in 2016, she was awarded the KTH Great Prize, one of the most prestigious innovation and entrepreneur awards in Sweden.
In 2013, Ehrensvärd was interviewed by the business magazine The Next Woman about using the YubiKey for the first FIDO U2F pilot with Google. When asked if there was anything she wished to share with the community, she revealed: "We women are trained in our DNA to please. I have stopped trying to please everyone, but to instead follow my dreams."
Ehrensvärd continues to work towards her vision of bringing FIDO U2F to the masses, with a single YubiKey usable across an unlimited number of services, and making secure login easy and available for everyone.
Ehrensvärd's company, Yubico, is also a supporter of the Hong Kong protests, having donated 500 YubiKeys to the activists.
References
1967 births
Living people
Businesspeople from Seattle
American people of Swedish descent
Swedish business executives
Swedish women business executives
Swedish designers
Konstfack alumni |
39450160 | https://en.wikipedia.org/wiki/Dell%20Networking%20Operating%20System | Dell Networking Operating System | DNOS or Dell Networking Operating System is a network operating system running on switches from Dell Networking. It is derived from either the PowerConnect OS (DNOS 6.x) or the Force10 OS/FTOS (DNOS 9.x). DNOS 9.x is available for the 10G and faster Dell Networking S-series switches and the Z-series 40G core switches, while DNOS 6.x is available for the N-series switches.
Two version families
The DNOS network operating system family comes in a few main versions:
DNOS3
DNOS 3.x: This is a family of firmware for campus access switches that can only be managed using a web-based GUI or run as unmanaged devices.
DNOS6
DNOS 6.x: This is the operating system running on the Dell Networking N-series (campus) networking switches. It is the latest version of the 'PowerConnect' operating system, running on a Linux kernel. It is available as an upgrade for the PowerConnect 8100 series switches (which then become Dell Networking N40xx switches) and it is also installed on all DN N1000, N2000 and N3000 series switches. It has a full web-based GUI together with a full CLI (command line interface); the CLI is very similar to the original PowerConnect CLI, though with a range of new features like PVSTP (per-VLAN spanning tree), Policy Based Routing and MLAG.
DNOS9
DNOS 9.x: This is the operating system running on the Dell Networking S-series (10G and faster) and Z-series switches. It is derived from the Force10 Operating System (FTOS) and runs on top of NetBSD.
Only the PowerConnect 8100 will be able to run DNOS 6.x: all other PowerConnect Ethernet switches will continue to run their own PowerConnect OS (on top of VxWorks), while the PowerConnect W-series run on a Dell-specific version of ArubaOS.
The Dell Networking S-xxxx and Z9x00 series will run on DNOS 9.x, while the other Dell Networking switches will continue to run FTOS 8.x firmware.
OS10
OS10 is a Linux-based open networking OS that can run on all Open Network Install Environment (ONIE) switches. As it runs directly in a Linux environment, network admins can highly automate the network platform and manage the switches in a similar way to their (Linux) servers.
Hardware Abstraction Layer
Three of the four product families from Dell Networking use Broadcom Trident+ ASICs, but the company does not use Broadcom's APIs: the developers at Dell Networking have written their own Hardware Abstraction Layer (HAL) so that DNOS 9.x can run on different hardware platforms with minimal impact on the firmware. Currently three of the four Dell Networking switch families are based on the Broadcom Trident family (the fourth, the E-series, runs on self-developed ASICs), and two of them (the S- and Z-series) run DNOS 9.x. If the product developers want or need to use different hardware for new products, they only need to develop a HAL for that new hardware and the same firmware can run on it. This keeps the company flexible and not dependent on a specific hardware vendor, as it can use both third-party and self-designed ASICs and chipsets.
The underlying OS on which DNOS 9.x runs is based on NetBSD (while DNOS 6.x runs on a Linux kernel), an implementation that is often used in embedded networking systems. NetBSD is a very stable, open-source OS running on many different hardware platforms. By choosing a proven technology with extended TCP functionality built into the core of the OS, Dell Networking reduces the time needed to develop new products or extend DNOS with new features.
Modular setup
DNOS 9.x is also modular: different parts of the OS run independently from each other within one switch, so if one process fails the impact on other processes is limited. This modular setup is also taken to the hardware level in some product lines, where a routing module has three separate CPUs: one for management, one for L2 and one for L3 processing. The same approach is used in the newer firmware families from Cisco, such as NX-OS for the Nexus product line and IOS XR for the high-end Carrier Routing System routers (unlike the original IOS, where processes are not isolated from each other). This approach is regarded not only as a way to make the firmware more resilient but also as a way to increase the security of the switches.
Capabilities
All DNOS 9.x based switches offer a wide range of layer 2 and layer 3 protocols. All features are available on all switches: some switch models (in the S-series) offer an additional license for layer 3 or routing, but this license is not required to use those protocols, only to get support from the Dell Networking support department on using these features. All interfaces on switches running DNOS 9.x are configured as layer 3 interfaces and are shut down by default. To use such an interface as an Ethernet switchport you need to configure it as such (with the command "switchport") and then enable the port using "no shutdown", as shown in the sketch below.
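The following minimal sketch shows what that sequence might look like on the CLI; the interface name and the comment lines are illustrative assumptions rather than taken from Dell's documentation:

! illustrative example: turn a default layer 3 port into an active layer 2 switchport
interface TenGigabitEthernet 0/1
 switchport
 no shutdown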
Unlike DNOS 6.x (which provides both a web interface and a CLI, with extensive API control via the undocumented "debug console" and "dev help" commands), DNOS 9.x only offers a documented command line interface (CLI) to configure and monitor the switch directly, though it is possible with the "Automation Tools" to create your own web GUI on DNOS 9.x switches.
Layer2 capabilities
All standardized Ethernet protocols are supported by switches running FTOS, including Spanning Tree Protocol and RSTP, VLANs and the IEEE 802.1Q standards, QinQ or IEEE 802.1ad, and Link Layer Discovery Protocol (LLDP) including LLDP-MED.
The S-series switches whose model names end with a V and some of the E-series line cards support Power over Ethernet (PoE) according to the relevant standards.
Layer3 capabilities
As mentioned above, by default an interface on a switch running DNOS 9.x is configured as a layer 3 port. All these switches are thus routers with many interfaces, which can be (and most often are) reconfigured into layer 2 Ethernet switchports.
All DNOS 9 switches run at least the following routing protocols: Routing Information Protocol and RIP version 2, OSPF, IS-IS and Border Gateway Protocol version 4.
Open Automation
Under the name Open Automation 2.0, Dell Networking switches running DNOS 9.x offer a number of features. These include:
Smart Scripting
Dell Networking switches support so-called smart scripting: it is possible to develop scripts that run on switches running DNOS 9. Both Perl and Python are supported as scripting languages to automate environment-specific repetitive tasks or to build in custom behavior. Users who write such scripts are encouraged to share them with the user community and make them available to other Force10/DNOS users. Force10 introduced smart scripting in FTOS in 2010, following other vendors like Cisco with their Nexus product range.
Bare metal provisioning
Dell Networking switches support a bare metal provisioning (BMP) option: if you need to deploy a number of similar switches, you can put both the desired (or latest) firmware release and a standard user-specific configuration on a USB key; when deploying the switches you insert the USB key and power up the switch, and it will automatically load the correct firmware and configuration. In combination with smart scripting, these features allow a fully automated installation and configuration of new switches. It is also possible to run BMP via the network: unless reconfigured to start in 'normal' mode, all DNOS 9.x switches (and the earlier FTOS switches) will check whether there is a BMP server on the network by sending out a DHCP/BOOTP request at boot. If the switch gets the correct response from the DHCP server (an IP address, the address of a TFTP server and a script/config file name), it will contact the TFTP server to download the correct firmware and configuration files and run them. You can disable this feature during initial configuration so that the switch boots from the firmware and configuration saved in the switch's NVRAM. A sketch of a matching DHCP server configuration is shown below.
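As a rough illustration of the network-based BMP flow, the following is a minimal sketch of such a DHCP scope on an ISC dhcpd server; the addresses and the file name are made-up examples, and the exact options a DNOS switch expects should be verified against Dell's documentation:

subnet 10.0.0.0 netmask 255.255.255.0 {
    range 10.0.0.100 10.0.0.199;    # addresses handed out to booting switches
    next-server 10.0.0.10;          # TFTP server holding firmware and config files
    filename "dnos-bmp-config.txt"; # script/config file the switch downloads and runs
}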
Virtual server networking
Part of the Open Automation platform are special features for the use of virtualisation in the datacenter. Virtualisation allows you to create complete (virtual) server systems running on a standard hypervisor farm. This creates new challenges for networking in such a datacenter, such as supporting the automated configuration of datacenter switches to connect newly created virtual servers. The Open Automation platform has several features to support this.
Network Automation
According to Dell, the move to (server and datacenter) virtualisation is one of the most important developments in the IT industry. In the vendor's view, the industry must prevent this path from leading to lock-in to specific vendors through the use of proprietary technologies. The Open Automation framework is an open framework that does not rely on proprietary solutions.
Alternative OS
On some models of Dell Networking switches (currently the S3048-ON, S4048-ON, S4810-ON, S6000-ON and Z9100) it is possible to run an alternative network OS: Cumulus Linux. This runs instead of DNOS (which runs on top of NetBSD). Cumulus Linux is a complete Linux distribution that uses the full TCP/IP stack of Linux.
References
Computer networking
Dell
Embedded operating systems
Internet Protocol based network software
Network operating systems |
38136532 | https://en.wikipedia.org/wiki/Pakistan%20Library%20Automation%20Group | Pakistan Library Automation Group | Pakistan Library Automation Group (PakLAG), a not-for-profit trust, came into existence in the year 2000, when some young professionals from the field of Library & Information Science in Pakistan wanted to institutionalize their volunteer work. The Lahore-based activity soon spread all over the country, and volunteers from other provinces and cities joined the efforts to promote the use of ICTs in libraries. PakLAG has chapters in all four provinces and the federal capital. There is no membership fee and no official sponsorship: the idea was to achieve the objectives by promoting self-reliance and economical solutions.
Objectives
Pakistan Library Automation Group (PakLAG) is committed to empowering libraries and librarians of Pakistan to create a true learning and research environment through the use of the latest technologies, software and techniques. The objectives of PakLAG are:
To provide professional and technical advice to libraries, information centers and documentation centers in their development programs.
To recommend training programs for librarians so as to help them to develop, update and automate their libraries and documentation centers.
To develop library automation and capacity building programs.
To coordinate library development activities in the country with national as well as international development agencies and institutions.
To provide information and conduct research studies on library development.
To provide platform to the information professionals for the exchange of views, sharing of experiences, networking among libraries as well as to develop consensus upon the common issues faced by the profession.
To provide research support and policy recommendations to government at all levels and to legislative bodies in the formation of policies regarding the libraries and information services.
Activities
Library Information Management System (LIMS): Free software for library housekeeping routines. Used in more than 100 libraries.
Multilingual Web OPAC: First multilingual web OPAC solution is distributed free of cost.
LOC Gateway and Zebra Server: PakLAG promotes the use of Library of Congress Gateway for Web OPACs in Pakistani libraries. Zebra Server (free software under GPL) is used for this purpose. PakLAG provides free help and training to librarians.
Listserv for LIS professionals: First mailing list for librarians in Pakistan. Current members: 4000
Online Directory of LIS Professionals: Contact information of Pakistani librarians.
Training of ICTs and Indexing & Retrieval Tools: Conducted eight workshops for librarians. Introduced many new ICT products for libraries.
PakLAG Koha: PakLAG has localized this open source library software. Local languages are incorporated. Some new features have also been added.
Searchable Database of Journals in National Digital Library: Available at PakLAG website. Journal title, subject, database and publisher searching can be done.
Publication Program: A PhD dissertation has been published. Some software user manuals have also been electronically published on CD-ROM.
Virtual Library: National and international links include web OPACs of libraries, online bookstores, online newspapers, online databases and journals, online directories and other reference sources, LIS resources, and computer science resources.
Free Consultancy in Library Automation: Helps in selection and purchase of hardware and software and advice on retrospective conversion. More than 200 libraries have enjoyed the benefits of this service.
Survey of ICT Training Needs: Conducted a survey of Pakistani librarians to design future training program.
References
Dmoz Pakistan
External links
Official website
Library associations
Library automation |
38563478 | https://en.wikipedia.org/wiki/Infocommunications | Infocommunications | Infocommunications is the natural expansion of telecommunications with information processing and content handling functions including all types of electronic communications (fixed and mobile telephony, data communications, media communications, broadcasting, etc.) on a common digital technology base, mainly through Internet technology.
History
The term infocommunications, or in its short forms infocom(s) or infocomm(s), first emerged in the early 1980s at scientific conferences and was then gradually adopted in the 1990s by the players of the telecommunications sector, including manufacturers, service providers, regulatory authorities and international organizations, to clearly express their participation in the convergence of the telecommunications and information technology sectors. The convergence process was triggered by the large-scale development of digital technology: digital technology has unified telecommunications, while Internet technology has radically reshaped it and integrated information processing and content management functions.
Related terms
The term "infocommunications" is also used in politics in a wider sense as a shorter form of information and communication(s) technology (ICT).
The terms info-com(s) and info-communications (with a hyphen) are also used to express the integration of the information technology (IT) and (tele)communication sectors, or simply to interpret the abbreviation ICT.
The term Information and Communication(s) Technology (ICT) has been defined as an extended synonym for information technology (IT) to emphasize the integration with (tele)communications. Content published in mass communication media such as printed, audio-visual and online contents and related services are not considered as ICT products, but are referred to as the Media & Content products.
The abbreviation TIM, as the Telecom IT Media sector is used to express the full integration of the Telecommunications, IT and Media & Content sectors. As well, the abbreviation TIME, as the Telecom IT/Internet Media & Entertainment/Edutainment sector is used to express the integration of these sectors.
The relationship and position of the terms can be demonstrated by a digital convergence prism (Figure 1), which shows the three components (T, I, M), their pairs and the triple combination (the convergent TIM triplet) according to the rule of additive colour mixing. Assuming that telecommunications (Telecom) is blue, informatics (IT) is green and Media & Content is red, then teleinformatics/telematics is cyan, telemedia/networked media is magenta, media informatics is yellow, and the convergent TIM is white. In this way, the integrated TIM sector corresponds to the prism as a whole, the ICT sector to the whole minus the red area (Media & Content), and infocommunications to Telecom and its three neighbouring areas (blue, cyan, magenta and white). That means that, for example, media informatics is a part of ICT but not part of infocommunications.
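To make the mapping concrete, the following is a small illustrative Python sketch of the prism's colour assignments; the sector names and the helper function are assumptions made for this example only:

# Illustrative sketch of the digital convergence prism's additive colour mapping.
SECTOR_COLOURS = {
    frozenset({"Telecom"}): "blue",
    frozenset({"IT"}): "green",
    frozenset({"Media"}): "red",
    frozenset({"Telecom", "IT"}): "cyan",           # teleinformatics/telematics
    frozenset({"Telecom", "Media"}): "magenta",     # telemedia/networked media
    frozenset({"IT", "Media"}): "yellow",           # media informatics
    frozenset({"Telecom", "IT", "Media"}): "white", # convergent TIM
}

def colour_of(*sectors):
    """Return the prism colour for a combination of sectors."""
    return SECTOR_COLOURS[frozenset(sectors)]

# Infocommunications covers Telecom plus every convergence area that includes it,
# so media informatics (IT + Media) belongs to ICT but not to infocommunications.
infocom_colours = sorted(c for s, c in SECTOR_COLOURS.items() if "Telecom" in s)
print(colour_of("IT", "Media"))  # yellow
print(infocom_colours)           # ['blue', 'cyan', 'magenta', 'white']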
See also
Information Age
Cognitive infocommunications
Information and communication technologies for environmental sustainability
References
External links
Infocommunications Journal
Detailed information on infocommunications and the related terms
Information technology |
40067793 | https://en.wikipedia.org/wiki/ConQAT | ConQAT | The Continuous Quality Assessment Toolkit (ConQAT) is a configurable software quality analysis engine. ConQAT is based on a pipes and filters architecture that enables flexible, complex analysis configurations using a graphical configuration language. This architecture differs from other analysis tools, which usually have a fixed data model and hard-wired analysis logic.
Architecture
ConQAT's underlying pipes and filters architecture manifests itself in its analysis configurations, so-called ConQAT blocks. These blocks contain a network of ConQAT processors or additional blocks. This allows analyses to be configured with a high degree of flexibility and adapted to the context of the system being analyzed. For example, different kinds of source code (manually written code, generated code, test code) can be treated in different ways. Furthermore, this architecture enables the reuse of blocks and processors in different contexts. For example, graph metrics can be calculated using the same blocks for the dependency or control-flow graph of a program or for a revision graph from a version management system.
Functionality
ConQAT analyses are usually executed on a command line in batch mode. Besides its application in software quality audits, it is also often integrated into a system's nightly build. ConQAT implements processors (so-called Scopes) to read data from different sources, such as source code or binary code files as well as issue trackers or version management systems. For languages such as Java, C#, C/C++, and ABAP, Lexer processors and other preprocessing operations are available. ConQAT implements algorithms for redundancy detection and architecture analysis in processors/blocks. Furthermore, it integrates established tools, like FindBugs, FxCop etc., using processors that read their output formats. Although ConQAT supports different output formats (e.g. XML), generated HTML files are usually used to present the analysis results. Visualizations include various diagrams and treemaps.
Background
ConQAT was developed in 2007 at the Technische Universität München and has received acclaim due to several scientific publications on its architecture as well as analysis techniques for detecting redundancy (clone detection) or architecture conformance analyses. Since 2009, ConQAT has been maintained and developed in a partnership between TU Munich and CQSE GmbH as an open-source project.
End-of-life
ConQAT is no longer maintained; its end-of-life was announced in 2018.
References
External links
Tool Support for Continuous Quality Control by F. Deissenboeck, E. Juergens, B. Hummel, S. Wagner, B. Mas y Parareda, M. Pizka, IEEE Computer Society, IEEE Software, Vol. 25, num. 5, 2008, Sept., pages 60 – 67, ISSN 0740-7459, IEEE Xplore Digital Library, DOI 10.1109/MS.2008.129
Comparison of Clone Detection Tools: CONQAT and SolidSDD by Prabhjot Kaur, Harpreet Kaur, Rupinder Kaur, International Journal of Advanced Research in Computer Science and Software Engineering, pdf, Volume 2, Issue 5, May 2012
Using clone detection to identify bugs in concurrent software by Jabier Martinez, Anil Kumar Thurimella, IEEE Explore, IEEE International Conference on Software Maintenance (ICSM), 2010, ISSN 1063-6773
Type 2 Clone Detection On ASCET Models by Francesco Gerardi, Jochen Quante, University Siegen Softwaretechnik-Trends, 2012, Springer
Using mutation analysis for a model-clone detector comparison framework by Matthew Stephan, Manar H. Alalfi, Andrew Stevenson, James R. Cordy, ACM Digital Library, Proceedings of the 2013 International Conference on Software Engineering, Pages 1261-1264, IEEE Press
Static program analysis tools
Free software testing tools |
18540632 | https://en.wikipedia.org/wiki/ShiVa | ShiVa | ShiVa3D is a 3D game engine with a graphical editor designed to create applications and video games for desktop PCs, the web, game consoles and mobile devices. Games made with ShiVa can be exported to over 20 target platforms, with new export targets being added regularly.
Numerous applications have been created using ShiVa, including the Prince of Persia 2 remake for mobile devices and Babel Rising, published by Ubisoft.
ShiVa 2.0, the next version of the Editor, is currently under heavy development. ShiVa users with licenses newer than January 1, 2012 are invited to download the beta builds, test thoroughly and provide feedback.
Engine
Current core engine features include:
engine runs natively on a wide variety of target platforms, including mobile devices, desktop PCs, web browsers and game consoles
support for both 32-bit and 64-bit operating systems
Lua API with native C++ compilation support
industry standard plugin architecture that can be used to extend the engine with libraries like NVIDIA PhysX, F-Mod, and ARToolKit
rendering in OpenGL, DirectX 9 or 11, and OpenGL ES modes
realtime point and directional lights with screen space blurred cascaded shadow maps
full lightmap control (UV2, import, export, built-in shadow mapper with Ambient occlusion support)
Material overrides, particle systems and Polygon Trails
post processing effects like Bloom, Depth of Field, Motion Blur and Camera Distortion
realtime mesh modification API with morphing support
chunked terrain and ocean rendering
ODE physics with compound bodies
local and remote video, texture and content streaming
2D HUD system for on-screen information displays
multiple viewports and scene cameras
several stereoscopic 3D rendering modes with Oculus Rift support
Built-in XML API and file exchange
network API for Multiplayer games, best used in connection with ShiVa Server
Stereo audio and 5.1 surround sound for 3D sound effects
Platforms
The ShiVa engine currently runs on the following platforms:
Desktop
Microsoft Windows XP/Vista/7/8/8.1 in legacy mode with DirectX 9 or OpenGL 2.0 (x86)
Microsoft Windows 8/8.1 in DirectX 11 mode with Windows Store support as WinRT application (x86, x86-64, ARM)
Mac OS X 10.6+ in x86 and x86-64 or as Universal binary
Linux as separate x86 and x86-64 binaries
Mobile
iOS 6+ devices including iPad, iPhone and iPod Touch
Android 2.3+ including microconsoles like Ouya
Windows Phone 7.5, 8 and 8.1
BlackBerry Tablet OS and BlackBerry 10
Marmalade (SDK) for targets like Bada and Symbian
HP/Palm WebOS (legacy)
Game consoles
Sony PlayStation 3 (requires Sony developer certificate)
Microsoft Xbox 360 (requires Microsoft developer certificate)
Nintendo Wii (requires Nintendo developer certificate)
Sony PlayStation 4 (in testing)
Microsoft Xbox One (in testing)
Sony PlayStation Vita (in closed beta)
Apple TV
Web
ShiVa Web Player (browser plugin) available for x86 and x86-64 on Windows, Mac OS X and Linux for Firefox, Safari (web browser), Opera (web browser), Internet explorer < v10 and Chrome (web browser)
Adobe Flash 11.2+
Google Native Client
HTML5/WebGL in beta testing
ShiVa Editor
Games for the ShiVa Engine are made with the ShiVa Editor, a WYSIWYG RAD tool designed to let developers create 3D games and applications in a fraction of the usual time.
ShiVa Editor 1.x runs exclusively on Windows XP/Vista/7/8/8.1. ShiVa 1.x games are built by the ShiVa Authoring Tool, a free companion product to the Editor, which transforms game packages into native applications. Since not all platforms have SDKs for Windows, the Authoring Tool is available for Mac OS X as well as Windows.
ShiVa Editor 2.x runs natively on Windows XP/Vista/7/8/8.1 x86/x86-64, Mac OS X x86-64 and Linux x86-64. The once-separate Authoring Tool is now built into the Editor itself.
While the ShiVa Editor is capable of exporting ready-to-run games, advanced users may also use it to export Xcode, Eclipse or Visual Studio projects in order to modify their games further in their preferred IDE.
Editor key features
WYSIWYG live preview of virtually any component of game development
editor modules for creating materials, particles, polygon trails, HUDs, animations, terrains, and many more
fully scriptable interface with custom modules (ShiVa 2.0 only)
Lua code editor with auto suggestion and auto-completion, debugging, syntax highlighting, code folding and integrated help
DAE, DWF and STE 3D asset import
auto-conversion of imported sound, video, texture and model files
fine-grained compression control for textures, sounds and videos
export profiles (affects texture, sound and video formats)
binary asset merging, SVN, performance analysis (Advanced license only)
embedded server for Multiplayer testing
Command-line interface (2.0 only)
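ShiVa gameplay logic is written in Lua, typically as AIModel scripts whose handler functions the engine calls in response to events. The fragment below is only an illustrative sketch of that style: the handler names (onInit, onEnterFrame) and API calls (log.message, this.getObject, object.rotate, object.kLocalSpace) are assumptions modelled on common ShiVa AIModel conventions and have not been verified against the SDK documentation.

-- Hypothetical ShiVa-style AIModel named "Spinner" (illustrative sketch only).
-- Assumed API: log.message, this.getObject, object.rotate, object.kLocalSpace.
function Spinner.onInit ( )
    -- Called once when the AIModel is attached to its host object.
    log.message ( "Spinner AIModel initialised" )
end

function Spinner.onEnterFrame ( )
    -- Called every frame: rotate the host object around its local Y axis.
    local hObject = this.getObject ( )
    object.rotate ( hObject, 0, 1, 0, object.kLocalSpace )
end

In the Editor, a script of this kind would be written in the Lua code editor described above and attached to a scene object; the same handlers are intended to run unchanged on every export target.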
Release timeline
ShiVa 1.5
July 4, 2007: initial public release
ShiVa 1.6
July 1, 2008: New Modules
TerrainEditor and AnimClipEditor
HLD script library
ShiVa 1.7
March 10, 2009: XML and previews
live preview for Particles, Trails, Materials and HUDs
XML API
full light-mapping control
DDZ texture streaming
ShiVa 1.8
November 16, 2009: Terrain and Ocean
more than 150 new features, improvements and bugfixes
more than 300 new API functions and constants
projectors, ocean and post processing
ShiVa 1.9
November 2, 2010: Plugins
over 300 new API functions and constants
C++/C# plugin support
dynamic point light shadows
ShiVa Device Development Tools (local device testing without publishing)
ShiVa 1.9.1
January 10, 2012: Debugging
more than 250 new API functions
AI Debugger, unicode text support
multiple viewports and soft particles
SVN support, Asset Merger Tool
ShiVa 1.9.2
December 21, 2013: final 1.x release
all engines updated to the current SDKs
Flash, WinRT and NaCl exporters
ShiVa 2.0
On July 4, 2014, the first public beta version of ShiVa Editor 2.0 was released to licensees who had purchased a ShiVa license after January 1, 2012.
Licensing
ShiVa is proprietary software and licensed under the ShiVa EULA. Several license packages are available:
ShiVa Web Edition
ShiVa Web Edition is free to download and use. Exports are watermarked except for the ShiVa Web Player browser plugin, Adobe Flash and HTML5/WebGL.
ShiVa Basic Edition
ShiVa Basic Edition was built for indie developers and small development studios. All standard exporters are unlocked. Beta versions may be downloaded and tested. C++ Plugins can be designed and tested, but not exported.
ShiVa Advanced Edition
ShiVa Advanced Edition comes with additional tools typically needed by big development teams, like asset merging, SVN, performance analysis modules, Game Console export and full C++ Plugin development/signing/export.
References
External links
Official site of ShiVa engine
Game engines for Linux
IPhone video game engines
Lua (programming language)-scriptable game engines
Video game development software
Video game engines
2007 software |
56704 | https://en.wikipedia.org/wiki/California%20State%20University | California State University | The California State University (Cal State or CSU) is a public university system in California. With 23 campuses and eight off-campus centers enrolling 485,550 students with 55,909 faculty and staff, CSU is the largest four-year public university system in the United States. It is one of three public higher education systems in the state, with the other two being the University of California system and the California Community Colleges. The CSU System is incorporated as The Trustees of the California State University. The California State University system headquarters are in Long Beach, California.
The California State University system was created in 1960 under the California Master Plan for Higher Education, and it is a direct descendant of the California State Normal Schools chartered in 1857. With nearly 100,000 graduates annually, the CSU is the country's greatest producer of bachelor's degrees. The university system collectively sustains more than 150,000 jobs within the state, and its related expenditures reach more than $17 billion annually.
In the 2011–12 academic year, CSU awarded 52 percent of newly issued California teaching credentials, 47 percent of the state's engineering degrees, 28 percent of the state's information technology bachelor's degrees, and it had more graduates in business (50 percent), agriculture (72 percent), communication studies, health (53 percent), education, and public administration (52 percent) than all other universities and colleges in California combined. Altogether, about half of the bachelor's degrees, one-third of the master's degrees, and nearly two percent of the doctoral degrees awarded annually in California are from the CSU.
Furthermore, the CSU system is one of the top U.S. producers of graduates who move on to earn their Ph.D. degrees in a related field. The CSU has a total of 17 AACSB accredited graduate business schools which is over twice as many as any other collegiate system. Since 1961, nearly three million alumni have received their bachelor's, master's, or doctoral degrees from the CSU system. CSU offers more than 1,800 degree programs in some 240 subject areas. In fall of 2015, 9,282 (or 39 percent) of CSU's 24,405 faculty were tenured or on the tenure track.
History
The State Normal Schools
Today's California State University system is the direct descendant of the Minns Evening Normal School, a normal school in San Francisco that educated the city's future teachers in association with the high school system. The school was taken over by the state in 1862 and moved to San Jose and renamed the California State Normal School; it eventually evolved into San Jose State University. A southern branch of the California State Normal School was created in Los Angeles in 1882.
In 1887, the California State Legislature dropped the word "California" from the name of the San Jose and Los Angeles schools, renaming them "State Normal Schools." Later Chico (1887), San Diego (1897), and other schools became part of the State Normal School system. However, these did not form a system in the modern sense, in that each normal school had its own board of trustees and all were governed independently from one another. In 1919, the State Normal School at Los Angeles became the Southern Branch of the University of California; in 1927, it became the University of California at Los Angeles (the "at" was later replaced with a comma in 1958).
The State Teachers Colleges
In May 1921, the legislature enacted a comprehensive reform package for the state's educational system, which went into effect that July. The State Normal Schools were renamed State Teachers Colleges, their boards of trustees were dissolved, and they were brought under the supervision of the Division of Normal and Special Schools of the new California Department of Education located at the state capital in Sacramento. This meant that they were to be managed from Sacramento by the deputy director of the division, who in turn was subordinate to the State Superintendent of Public Instruction (the ex officio director of the Department of Education) and the State Board of Education. By this time it was already commonplace to refer to most of the campuses with their city names plus the word "state" (e.g., "San Jose State," "San Diego State," "San Francisco State").
The resulting administrative situation from 1921 to 1960 was quite complicated. On the one hand, the Department of Education's actual supervision of the presidents of the State Teachers Colleges was minimal, which translated into substantial autonomy when it came to day-to-day operations. According to Clark Kerr, J. Paul Leonard, the president of San Francisco State from 1945 to 1957, once boasted that "he had the best college presidency in the United States—no organized faculty, no organized student body, no organized alumni association, and...no board of trustees." On the other hand, the State Teachers Colleges were treated under state law as ordinary state agencies, which meant their budgets were subject to the same stifling bureaucratic financial controls as all other state agencies (except the University of California). At least one president would depart his state college because of his express frustration over that issue: Leonard himself.
During the 1920s and 1930s, the State Teachers Colleges started to transition from normal schools (that is, vocational schools narrowly focused on training elementary school teachers in how to impart basic literacy to young children) into teachers colleges (that is, providing a full liberal arts education) whose graduates would be fully qualified to teach all K–12 grades. A leading proponent of this idea was Charles McLane, the first president of Fresno State, who was one of the earliest persons to argue that K–12 teachers must have a broad liberal arts education. Having already founded Fresno Junior College in 1907 (now Fresno City College), McLane arranged for Fresno State to co-locate with the junior college and to synchronize schedules so teachers-in-training could take liberal arts courses at the junior college.
The State Colleges
In 1932, the Carnegie Foundation for the Advancement of Teaching was asked by the state legislature and governor to perform a study of California higher education. The Foundation's 1933 report sharply criticized the State Teachers Colleges for their intrusion upon UC's liberal arts prerogative and recommended their transfer to the Regents of the University of California (who would be expected to put them back in their proper place). This recommendation spectacularly backfired when the faculties and administrations of the State Teachers Colleges rallied to protect their independence from the Regents. In 1935, the State Teachers Colleges were formally upgraded by the state legislature to State Colleges and were expressly authorized to offer a full four-year liberal arts curriculum, culminating in bachelor's degrees, but they remained under the Department of Education.
During World War II, a group of local Santa Barbara leaders and business promoters (with the acquiescence of college administrators) were able to convince the state legislature and governor to transfer Santa Barbara State College to the University of California in 1944. After losing a second campus to UC, the state colleges' supporters arranged for the California state constitution to be amended in 1946 to prevent it from happening again.
The period after World War II brought a great expansion in the number of state colleges. Additional state colleges were established in Los Angeles, Sacramento, and Long Beach from 1947 to 1949, and then seven more state colleges were authorized to be established between 1957 and 1960. Six more state colleges were founded after the enactment of the Donahoe Higher Education Act of 1960, bringing the total number to 23.
The California State Colleges
During the 1950s, the state colleges' peculiar mix of fiscal centralization and operational decentralization began to look rather incongruous in comparison to the highly centralized University of California (then on the brink of its own decentralization project) and the highly decentralized local school districts around the state which operated K–12 schools and junior colleges—all of which enjoyed much more autonomy from the rest of the state government than the state colleges.
The state legislature was limited to merely suggesting locations to the UC Board of Regents for the planned UC campus on the Central Coast. In contrast, because the state colleges lacked autonomy, they became vulnerable to pork barrel politics in the state legislature. In 1959 alone, state legislators introduced separate bills to individually create nineteen state colleges. Two years earlier, one bill that had actually passed had resulted in the creation of a new state college in Turlock, a town better known for its turkeys than its aspirations towards higher education, and which made no sense except that the chair of the Senate Committee on Education happened to be from Turlock.
In April 1960, the California Master Plan for Higher Education and the resulting Donahoe Higher Education Act granted similar autonomy to the state colleges. The Donahoe Act merged all the state colleges into the State College System of California, severed them from the Department of Education (and also the State Board of Education and the State Superintendent of Public Instruction), and authorized the appointment of a systemwide board of trustees and a systemwide chancellor. The board was initially known as the "Trustees of the State College System of California"; the word "board" was not part of the official name. In March 1961, the state legislature renamed the system to the California State Colleges (CSC) and the board became the "Trustees of the California State Colleges."
As enacted, the Donahoe Act provides that UC "shall be the primary state-supported academic agency for research" and "has the sole authority in public higher education to award the doctoral degree in all fields of learning". In contrast, CSU may only award the doctoral degree as part of a joint program with UC or "independent institutions of higher education" and is authorized to conduct research "in support of" its mission, which is to provide "undergraduate and graduate instruction through the master’s degree." This language reflects the intent of UC President Kerr and his allies to bring order to "a state of anarchy"—in particular, the state colleges' repeated attempts (whenever they thought UC was not looking) to quietly blossom into full-fledged research universities, as was occurring elsewhere with other state colleges like Michigan State.
Kerr explained in his memoirs: "The state did not need a higher education system where every component was intent on being another Harvard or Berkeley or Stanford." As he saw it, the problem with such "academic drift" was that state resources would be spread too thin across too many universities, all would be too busy chasing the "holy grail of elite research status" (in that state college faculty members would inevitably demand reduced teaching loads to make time for research) for any of them to fulfill the state colleges' traditional role of training teachers, and then "some new colleges would have to be founded" to take up that role. At the time, California already had too many research universities; it had only 9 percent of the American population but 15 percent of the research universities (12 out of 80). The language about joint programs and authorizing the state colleges to conduct some research was offered by Kerr at the last minute on December 18, 1959, as a "sweetener" to secure the consent of a then-wavering Glenn Dumke, the state colleges' representative on the Master Plan survey team. (Dumke had succeeded Leonard in 1957 as president of San Francisco State College.)
Most state college presidents and approximately 95 percent of state college faculty members (at the nine campuses where polls were held) strongly disagreed with the Master Plan's express endorsement of UC's primary role with respect to research and the doctorate, but they were still subordinate to the State Board of Education. In January 1960, Louis Heilbron was elected as the new chair of the State Board of Education. A Berkeley-trained attorney, Heilbron had already revealed his loyalty to his alma mater by joking that UC's ownership of the doctorate ought to be protected from "unreasonable search and seizure." He worked with Kerr to get the Master Plan's recommendations enacted in the form of the Donahoe Act, which was signed into state law on April 27, 1960.
Heilbron went on to serve as the first chairman of the Trustees of the California State Colleges (1960-1963), where he had to "rein in some of the more powerful campus presidents," improve the smaller and weaker campuses, and get all campuses accustomed to being managed for the first time as a system. Heilbron set the "central theme" of his chairmanship by saying that "we must cultivate our own garden" (an allusion to Candide) and stop trying to covet someone else's. Under Heilbron, the board also attempted to improve the quality of state college campus architecture, "in the hope that campuses no longer would resemble state prisons."
Buell G. Gallagher was selected by the board as the first chancellor of the California State Colleges (1961-1962), but resigned after only nine unhappy months to return to his previous job as president of the City College of New York. Dumke succeeded him as the second chancellor of the California State Colleges (1962-1982). As chancellor, Dumke faithfully adhered to the system's role as prescribed by the Master Plan, despite continuing resistance and resentment from state college dissidents who thought he had been "out-negotiated" and bitterly criticized the Master Plan as a "thieves' bargain". Looking back, Kerr thought the state colleges had failed to appreciate the vast breadth of opportunities reserved to them by the Master Plan, as distinguished from UC's relatively narrow focus on basic research and the doctorate. In any event, "Heilbron and Dumke got the new state college system off to an excellent start."
The California State University and Colleges
In 1972, the system became The California State University and Colleges, and the board was renamed the "Trustees of the California State University and Colleges". The board also became known in the alternative as the "Board of Trustees," similar to how the Regents of the University of California are also known in the alternative as the Board of Regents.
On May 23, 1972, fourteen of the nineteen CSU campuses were renamed to "California State University," followed by a comma and then their geographic designation. The five campuses exempted from renaming were the five newest state colleges created during the 1960s. The new names were very unpopular at certain campuses, and in 1974, all CSU campuses were given the option to revert to an older name: e.g., San Jose State, San Diego State, San Francisco State, etc.
The California State University
In 1982, the CSU system dropped the word "colleges" from its name.
Today the campuses of the CSU system include comprehensive universities and polytechnic universities along with the only maritime academy in the western United States—one that receives aid from the U.S. Maritime Administration.
In May 2020, it was announced that all 23 institutions within the CSU system would host majority-online courses in the Fall 2020 semester as a result of the COVID-19 pandemic and the impact of the pandemic on education.
Governance
The governance structure of the California State University is largely determined by state law. The California State University is ultimately administered by the 25 member (24 voting, one non-voting) Board of Trustees of the California State University. The Trustees appoint the Chancellor of the California State University, who is the chief executive officer of the system, and the Presidents of each campus, who are the chief executive officers of their respective campuses.
The Academic Senate of the California State University, made up of elected representatives of the faculty from each campus, recommends academic policy to the Board of Trustees through the Chancellor.
Board of Trustees
The California State University is administered by the 25 member Board of Trustees (BOT). Regulations of the BOT are codified in Title 5 of the California Code of Regulations (CCR). The BOT is composed of:
The Governor of California (president ex officio)
Sixteen members who are appointed by the Governor of California with the consent of the Senate
Two students from the California State University appointed by the Governor
One tenured faculty member appointed by the Governor selected from a list of names submitted by the Academic Senate
One representative of the alumni associations of the state university selected for a two-year term by the alumni council of the California State University
Four ex officio members aside from the Governor:
Lieutenant Governor
Speaker of the Assembly
State Superintendent of Public Instruction
The CSU Chancellor
Current members
Ex officio trustees:
Gavin Newsom, Governor of California
Eleni Kounalakis, Lieutenant Governor of California
Anthony Rendon, Speaker of the Assembly
Tony Thurmond, State Superintendent of Public Instruction
Joseph I. Castro, CSU Chancellor
Appointed trustees: Silas Abrego, Jane W. Carney, Adam Day (Chair), Rebecca D. Eisen, Douglas Faigin, Debra S. Farar, Jean P. Firstenberg, Wenda Fong, Lillian Kimbell (Vice-chair), Jack McGrory, Thelma Melendez de Santa Ana, Hugo M. Moralas, John Nilon, J. Lawrence Norton, Romey Sabalius, Lateefah Simon, Christopher Steinhauser, Peter J. Taylor.
Student Trustees (also appointed): Emily F. Hinton (voting) and Juan Garcia (non-voting).
Chancellor
The position of the Chancellor is declared by statute and is defined by resolutions of the BOT. The delegation of authority from the BOT to the Chancellor was historically governed by a BOT resolution titled "Statement of General Principles in the Delegation of Authority and Responsibility" of August 4, 1961. It is now controlled by the Standing Orders of the Board of Trustees of the California State University. Under the Standing Orders, the Chancellor is the chief executive officer of the CSU, and all Presidents report directly to the Chancellor.
Chancellors
Buell Gallagher (1961–1962)
Glenn S. Dumke (1962–1982)
W. Ann Reynolds (1982–1990)
Ellis E. McCune [Acting] (1990–1991)
Barry Munitz (1991–1998)
Charles B. Reed (1998–2012)
Timothy P. White (2012–2020)
Joseph I. Castro (2021–2022)
Student government
All 23 campuses have mandatory student body organizations with mandatory fees, all with the "Associated Students" moniker, and are all members of the California State Student Association (CSSA). California Education Code § 89300 allows for the creation of student body organizations at any state university for the purpose of providing essential activities closely related to, but not normally included as a part of, the regular instructional program. A vote approved by two-thirds of all students causes the Trustees to fix a membership fee required of all regular, limited, and special session students attending the university such that all fee increases must be approved by the Trustees and a referendum approved by a majority of voters. Mandatory fee elections are called by the president of the university, and the membership fees are fixed by the Chancellor. All fees are collected by the university at the time of registration except where a student loan or grant from a recognized training program or student aid program has been delayed and there is reasonable proof that the funds will be forthcoming. The Gloria Romero Open Meetings Act of 2000 mandates that the legislative body of a student body organization conduct its business in public meetings.
Student body organization funds obtained from mandatory fees may be expended for:
Programs of cultural and educational enrichment and community service.
Recreational and social activities.
Support of student unions.
Scholarships, stipends, and grants-in-aid for only currently admitted students.
Tutorial programs.
Athletic programs, both intramural and intercollegiate.
Student publications.
Assistance to recognized student organizations.
Student travel insurance.
Administration of student fee program.
Student government-scholarship stipends, grants-in-aid, and reimbursements to student officers for service to student government. Before such scholarship stipends, grants-in-aid, and reimbursements are established by a student body association, the principle of establishing such payments shall be approved by a student referendum.
Student employment to provide payment for services in connection with the general administration of student fee.
Augmentation of counseling services, including draft information, to be performed by the campus. Such counseling may also include counseling on legal matters to the extent of helping the student to determine whether he should retain legal counsel, and of referring him to legal counsel through a bar association, legal aid foundation or similar body.
Transportation services.
Child day care centers for children of students and employees of the campus.
Augmentation of campus health services. Additional programs may be added by appropriate amendment to this section by the Board.
Impact
The CSU confers over 70,000 degrees each year, awarding 46% of the state's bachelor's degrees and 32% of the state's master's degrees. The entire 23-campus system sustains nearly 150,000 jobs statewide, generating nearly $1 billion in tax revenue. Total CSU-related expenditures equate to nearly $17 billion.
The CSU produces 62% of the bachelor's degrees awarded in agriculture, 54% in business, 44% in health and medicine, 64% in hospitality and tourism, 45% in engineering, and 44% of those in media, culture and design. The CSU is the state's largest source of educators, with more than half of the state's newly credentialed teachers coming from the CSU, expanding the state's rank of teachers by nearly 12,500 per year.
Over the last 10 years, the CSU has significantly expanded programs serving underserved students. 56% of bachelor's degrees granted to Latinos in the state are from the CSU, while 60% of bachelor's degrees awarded to Filipinos were from the CSU. In the Fall of 2008, 42% of incoming students were from California Community Colleges.
Enrollment
Compensation and hiring
During the recession years (December 2007 – June 2009), the CSU lost 1/3 of its revenue – roughly $1 billion – and 4,000 employees. With the state's reinvestment in higher education, the CSU is restoring its employee ranks and currently employs a record number of instructional faculty. Between 2010 and 2015, the number of CSU faculty increased by 3,500, but the number of tenure-track faculty declined by 150, leaving the CSU system with its lowest percentage of tenure-track faculty (39%) in the system's history. In the two years (2013–14, 2014–15) through the state's reinvestment, the CSU has directed $129.6 million to enhance employee compensation. Another $65.5 million is slated in the 2015–16 operating budget for employee compensation. However, according to the California Faculty Association (CFA) report, "Race to the Bottom: CSU's 10-Year Failure to Fund Its Core Mission", written in 2015, "Over the past decade— in good times and bad, whether state funding was up or down, when tuition was raised and when it wasn't— CSU expenditures on faculty salaries have remained essentially flat... When compared to other university systems around the country, and to every education segment in California, the CSU stands out for its unparalleled failure to improve faculty salaries or even to protect them from the ravages of inflation."
(For data definitions and additional statistics, please see the CSU Employee Profile at www.calstate.edu/hr/employee-profile/.)
Campuses
The CSU is composed of the following 23 campuses listed here by order of the year founded:
* U.S. News & World Report ranks San Diego State and Fresno State in the National Universities category as they offer several Ph.D programs. The other universities in the California State University system are ranked in the Regional Universities (West) category as they offer few or no Ph.D programs.
^ Cal Maritime only awards undergraduate degrees and therefore is ranked separately from the other campuses of the California State University. It is ranked in the "Regional Colleges" category.
Off campus branches
A handful of universities have off campus branches that make education accessible in a large state. Unlike the typical university extension courses, they are degree-granting and students have the same status as other California State University students. The newest campus, the California State University, Channel Islands, was formerly an off campus branch of CSU Northridge. Riverside County and Contra Costa County, which have three million residents between them, have lobbied for their off campus branches to be free-standing California State University campuses. The total enrollment for all off campus branches of the CSU system in Fall 2005 was 9,163 students, the equivalent of 2.2 percent of the systemwide enrollment. The following are schools and their respective off campus branches:
California State University, Bakersfield
Antelope Valley (in Lancaster, California)
California State University, Chico
Redding (affiliated with Shasta College)
California State University, Fullerton
Garden Grove
California State University, East Bay
Concord
Oakland (Professional & Conference Center)
California State University, Fresno
Visalia
California State University, Los Angeles
Downtown Los Angeles
California State University, Monterey Bay
Salinas (Professional & Conference Center)
California State University, San Bernardino
Palm Desert
California State University, San Marcos
Temecula/Murrieta
San Diego State University
Imperial Valley (in Brawley, California and Calexico, California)
SDSU-Georgia (in Tbilisi in the former Soviet Republic of Georgia)
San Francisco State University
Cañada College (in Redwood City, California)
Downtown Campus (in San Francisco, California)
California State University, Stanislaus
Stockton, California
Sonoma State University
Rohnert Park, California
Laboratories and observatories
Research facilities owned and operated by units of the CSU:
Desert Studies Center
Research consortium and field site managed by California State University, Fullerton
Moss Landing Marine Laboratories
Independent degree-granting campus managed by San Jose State University
Oceanographic laboratory located in the Monterey Bay area
Murillo Family Observatory
Newest research observatory in the San Bernardino Metropolitan Area and the CSU system. It is located in and managed by California State University, San Bernardino.
Southern California Marine Institute
Oceanographic laboratory in the Los Angeles Basin
Mount Laguna Observatory
Astronomical observatory part of the Astronomy Department of San Diego State University
T.S. Golden Bear
The training ship of the California Maritime Academy
Telonicher Marine Laboratory at Humboldt State University in Trinidad, CA
Marine research laboratory on the Northern California coast
Home of the research vessel RV Coral Sea
Former campuses
Former campuses of the CSU system:
Los Angeles State Normal School (State Normal School at Los Angeles), founded 1882, became the University of California at Los Angeles in 1919.
Santa Barbara State College, founded 1909, became the University of California at Santa Barbara in 1944.
Differences between the CSU and UC systems
Both California public university systems are publicly funded higher education institutions. Despite having far fewer students, the largest UC campus, UCLA, as a result of its research emphasis and medical center, has a budget ($7.5 billion as of 2019) roughly equal to that of the entire CSU system ($7.2 billion as of 2019). According to a 2002 study, faculty at the CSU spend about 30 hours a week teaching and advising students and about 10 hours a week on research/creative activities, while a 1984 study reports faculty at the UC spend about 26 hours a week teaching and advising students and about 23 hours a week on research/creative activities. CSU's Chancellor, Dr. Charles B. Reed, pointed out in his Pullias Lecture at the University of Southern California that California was big enough to afford two world-class systems of public higher education, one that supports research (UC) and one that supports teaching (CSU). However, student per capita spending is lower at CSU, and that, together with the lack of a research mission or independent doctoral programs under the California Master Plan, has led some in American higher education to develop the perception that the CSU system is less prestigious than the UC system. Kevin Starr, the seventh State Librarian of California, stated that the "University of California drew from the top ten percent of the state's high school graduates" while "the CSU system was expected to draw its students from the top 33 percent of each graduating high school class." However, per the California Master Plan, the UC draws from the top 12.5 percent of California's public high school graduates.
According to the California Master Plan for Higher Education (1960), both university systems may confer bachelor's or master's degrees as well as professional certifications; however, only the University of California has the authority to issue Ph.D. degrees (Doctor of Philosophy) and professional degrees in the fields of law, medicine, veterinary medicine, and dentistry. As a result of recent legislation (SB 724 and AB 2382), the California State University may now offer the Ed.D. (also known as the Doctor of Education or "education doctorate degree") and DPT (Doctor of Physical Therapy) degrees to its graduate students. Additionally, the California State University (CSU) offers Ph.D. degrees and some professional doctorates (for instance, audiology, Au.D.) as a "joint degree" in combination with other institutions of higher education, including "joint degrees" with the University of California (UC) and accredited private universities. This is why, for instance, San Diego State can qualify as a "Doctoral University: High Research Activity" by offering some 22 doctoral degrees.
There are 23 CSU campuses and 10 UC campuses representing approximately 437,000 and 237,000 students respectively. The cost of CSU tuition is approximately half that of UC. Thus, the CSU system has been referred to by former California State University authorities as "The People's University."
CSU and UC use the terms "president" and "chancellor" internally in opposite ways: At CSU, the campuses are headed by presidents who report to a systemwide chancellor; but at UC, they are headed by chancellors who report to a systemwide president.
CSU has traditionally been more accommodating to older students than UC, by offering more degree programs in the evenings and, more recently, online. In addition, CSU schools, especially in more urban areas, have traditionally catered to commuters, enrolling most of their students from the surrounding area. This has changed as CSU schools increase enrollment and some of the more prestigious urban campuses attract a wider demographic.
The majority of CSU campuses operate on the semester system while UC campuses operate on the quarter system (with the exception of UC Berkeley, UC Merced, the UCLA medical school, and all UC law schools). As of Fall 2014, the CSU began converting its six remaining quarter campuses to the semester calendar. Cal State LA and Cal State Bakersfield converted in Fall 2016, while Cal State East Bay and Cal Poly Pomona transitioned to semesters in Fall 2018. Cal State San Bernardino is planning to make the conversion in Fall 2020, while Cal Poly San Luis Obispo has not announced a date for conversion to semesters.
Admission standards
Historically the requirements for admission to the CSU have been less stringent than the UC system. However, both systems require completion of the A-G requirements in high school as part of admission. The CSU attempts to accept applicants from the top one-third of California high school graduates. In contrast, the UC attempts to accept the top one-eighth. In an effort to maintain a 60/40 ratio of upper division students to lower division students and to encourage students to attend a California community college first, both university systems give priority to California community college transfer students.
However, the following 17 CSU campuses use higher standards than the basic admission standards due to the number of qualified students who apply, which makes admissions at these schools more competitive:
Chico
Fresno
Fullerton
Humboldt (freshmen)
Long Beach
Los Angeles
Monterey Bay (freshmen)
Northridge
Pomona
Sacramento
San Bernardino
San Diego
San Francisco
San Jose
San Luis Obispo
San Marcos
Sonoma
Furthermore, seven California State University campuses are fully impacted for both freshmen and transfers, meaning that, in addition to admission to the campus itself, admission to every major is also impacted for the 2020–2021 academic year. The seven campuses that are fully impacted are Los Angeles, Fresno, Fullerton, Long Beach, San Diego, San Jose, and San Luis Obispo.
Campus naming conventions
The UC system follows a consistent style in the naming of campuses, using the words "University of California" followed by the name of its declared home city, with a comma as the separator. Most CSU campuses follow a similar pattern, though several are named only for their home city or county, such as San Francisco State University, San Jose State University, San Diego State University, or Sonoma State University.
Some of the colleges follow neither pattern. California Polytechnic State University, San Luis Obispo, California State Polytechnic University, Humboldt, and California State Polytechnic University, Pomona use the word "polytechnic" in both their full names (but in different word orders) per California Education Code sections 89000 and 89005.5. CSU's editorial style guide refers to the same formal names while also referring to the abbreviated forms "Cal Poly San Luis Obispo" and "Cal Poly Pomona" respectively, but not the name "Cal Poly" by itself. Cal Poly San Luis Obispo unilaterally claims the "Cal Poly" name per its own marketing brand guides and, since the 1980s, the CSU Chancellor's Office has taken numerous small and medium-sized businesses to court on Cal Poly San Luis Obispo's behalf for not having a licensing agreement to sell merchandise with the words "Cal Poly".
In addition, the California Maritime Academy (Cal Maritime) is the only campus whose official name does not refer to its location in California. Both Channel Islands and San Marcos campuses' official names do not include a comma, unlike the typical style of the CSU naming convention, and instead follow California State University San Marcos, or Channel Islands. Some critics, including Donald Gerth (former President of Sacramento State), have claimed that the weak California State University identity has contributed to the CSU's perceived lack of prestige when compared to the University of California.
Fall 2018 enrolled freshmen profile
Impacted campuses
An impacted campus or major is one which has more CSU-qualified students than capacity permits. Sixteen out of the 23 campuses are impacted, including Chico, Fresno, Fullerton, Humboldt, Long Beach, Los Angeles, Northridge, Pomona, San Bernardino, Sacramento, San Diego, San Francisco, San Jose, Sonoma, San Marcos, and San Luis Obispo. Some programs at other campuses are similarly impacted. As of the 2021–22 academic year, all undergraduate programs, pre-programs, and undeclared/undecided programs are impacted at the following campuses: Cal Poly San Luis Obispo, Fresno State, CSU Fullerton, Cal State LA, CSU Long Beach, San Diego State University, and San José State. Despite this, CSU undergraduate admissions are quantitatively based and generally do not include items such as personal statements, SAT Subject Test scores, letters of recommendation, or portfolios. In addition, there is geographic preference given to those residing within the commuting areas of the colleges.
Special admissions process for the California Maritime Academy
The Maritime Academy uses a different admissions process from other CSU schools. Because of the nature of its programs, the Maritime Academy requires all applicants to pass a standard physical examination prior to enrollment.
Research and academics
AAU, AASCU and APLU
The University of California and most of its campuses are members of the Association of American Universities (AAU) and the Association of Public and Land-grant Universities (APLU).
The California State University (CSU) and most of its campuses are members of APLU and the American Association of State Colleges and Universities (AASCU).
ABET
ABET (Accreditation Board for Engineering and Technology, Inc.) is the recognized U.S. accreditor of college and university programs in applied and natural science, computing, engineering, and engineering technology. The California State University has 18 colleges with ABET-accredited programs.
Cal Poly Pomona, Cal Poly Pomona College of Engineering
Cal Poly San Luis Obispo, College of Engineering
CSU Maritime Academy, School of Engineering
CSU Bakersfield, School of Natural Sciences, Mathematics, and Engineering
CSU Chico, College of Engineering, Computer Science, & Construction Management
CSU Dominguez Hills, College of Natural & Behavioral Sciences
CSU East Bay, College of Science
CSU Fresno, Lyles College of Engineering
CSU Fullerton, College of Engineering & Computer Science
CSU Long Beach, College of Engineering
CSU Los Angeles, College of Engineering, Computer Science, and Technology
CSU Northridge, College of Engineering & Computer Science
CSU Sacramento, College of Engineering & Computer Science
CSU San Bernardino, College of Natural Sciences
Cal Poly Humboldt, College of Natural Resources & Sciences
San Diego State University, College of Engineering
San Francisco State University, College of Science & Engineering
San José State University, Charles W. Davidson College of Engineering
CENIC
The CSU is a founding and charter member of CENIC, the Corporation for Education Network Initiatives in California, the nonprofit organization which provides extremely high-performance Internet-based networking to California's K–20 research and education community.
Statewide university programs
Agricultural Research Initiative
A comprehensive applied agricultural and environmental research program joining the CSU's four colleges of agriculture (at San Luis Obispo, Pomona, Chico and Fresno) and the state's agriculture and natural resources industries and allied business communities.
Cal Poly Pomona
Cal Poly San Luis Obispo
Chico State
Fresno State
Biotechnology
The California State University Program for Education and Research in Biotechnology (CSUPERB) mission is to develop a professional biotechnology workforce. CSUPERB provides grant funding, organizes an annual symposium, sponsors industry-responsive curriculum, and serves as a liaison for the CSU with government, philanthropic, educational, and biotechnology industry partners. The program involves students and faculty from Life, Physical, Computer and Clinical Science, Engineering, Agriculture, Math and Business departments at all 23 CSU campuses.
Coastal Research and Management
The CSU Council on Ocean Affairs, Science & Technology (CSU COAST) affinity group is the umbrella organization for marine, coastal, and coastal watershed-related activities. A highly effective CSU affinity group with active faculty and administration members across each of the system's 23 campuses, CSU COAST functions primarily as a coordinating force to stimulate new research, teaching, and policy tools via administrative support, robust networking opportunities, and by providing small incubator/accelerator funding to students and faculty.
Graduation Initiative 2025
The Graduation Initiative 2025 is a plan to increase graduation rates, eliminate equity gaps in degree completion and meet California's workforce needs. The initiative organizes an annual symposium with keynote speakers such as California Governor Gavin Newsom. It focuses mainly on enhancing guidance and academic planning for first-generation and transfer students. The initiative has resulted in a six percentage point increase in the four-year graduation rate of first-time freshmen over three years, from 19.2 percent in 2015 to 25.4 percent in 2018, and an increase in the six-year graduation rate by four percentage points, from 57 percent in 2015 to 61.1 percent in 2018.
Hospitality Management
The Hospitality Management Education Initiative (HMEI) was formed in 2008 to address the shortage of hospitality leaders in California. HMEI is a collaboration between the 14 CSU campuses that have hospitality-related degrees and industry executives. CSU awarded 95% of hospitality bachelor's degrees in the state in 2011.
Nursing
Headquartered and administered at the Dominguez Hills campus, the CSU Statewide Nursing Program offers registered nurses courses available throughout California that lead to bachelor's, Master of Science, and doctoral degrees in Nursing (awarded by the closest participating CSU campus). The campuses that award a Doctorate in Nursing Practice (DNP) are:
Fullerton
Los Angeles
San Jose
Long Beach
Fresno
Online Education and Concurrent Enrollment
Beginning in 2013, the CSU made a radical change in the way it delivered online education. The university approved more than 30 courses for system-wide consumption, meaning any student attending one of the 23 campuses will be able to enroll in an online course offered at another campus, concurrently. The new online education delivery method is part of $17 million additional funding from the state to improve online education, and ultimately improve graduation rates and access to "bottleneck courses" across the 23 campuses. Courses offered include biology, business finance, chemistry, and microeconomics.
Pre-doctoral Program
California Pre-Doctoral Program is designed to increase the pool of potential faculty by supporting the doctoral aspirations of California State University students who have experienced economic and educational disadvantages.
The Chancellor's Doctoral Incentive Program provides financial and other assistance to individuals pursuing doctoral degrees. The program seeks to provide loans to doctoral students who are interested in applying and competing for California State University instructional faculty positions after completion of the doctoral degree.
Professional Science Master's Degrees
The CSU intends to expand its post-graduate education focus to establish and encourage Professional Science master's degree (PSM) programs using the Sloan model.
See also
California State Employees Association
California State University Emeritus and Retired Faculty Association
California State University Police Department
Colleges and universities
List of colleges and universities in California
References
Further reading
Donald R. Gerth. The People's University: A History of the California State University. Berkeley: Institute of Governmental Studies, University of California, 2010.
External links
Schools accredited by the Western Association of Schools and Colleges
California State University system
Educational institutions established in 1857
1857 establishments in California |
20722416 | https://en.wikipedia.org/wiki/Science%20and%20technology%20in%20Pakistan | Science and technology in Pakistan | Science and technology is a growing field in Pakistan and has played an important role in the country's development since its founding. Pakistan has a large pool of scientists, engineers, doctors, and technicians assuming an active role in science and technology. The real growth in science in Pakistan occurred after the establishment of the Higher Education Commission in 2002, which strongly supported science and also became the major sponsor of the Pakistan Academy of Sciences under the leadership of Prof. Atta-ur-Rahman. The emphasis was placed on quality rather than numbers during this period. The quality measures introduced by Prof. Atta-ur-Rahman as the founding Chairman of the HEC included: (1) all Ph.D. theses were evaluated by eminent foreign scientists; (2) all Ph.D. theses and research papers were checked for plagiarism; (3) some 11,000 students were sent abroad to leading universities for Ph.D.-level training and absorbed on their return; (4) appointments to faculty positions were linked to the international stature of the applicants as judged from their international publications, patents and citations; (5) Quality Enhancement Cells were established in all universities for the first time in the history of the country; (6) minimum criteria for the establishment of a new university were approved by the Cabinet, and universities that did not meet these criteria were closed down; (7) the Model University Ordinance was approved (Appendix 3 in the reference), setting the governance parameters for new universities; (8) a list of fake higher education institutions was prepared and made public; (9) a Quality Assurance Agency (QAA) was set up within the Higher Education Commission that established Quality Enhancement Cells (QECs) as its operational units in public and private-sector universities across the country; and (10) the funding of universities was linked to excellence in teaching and research under a formula-based funding mechanism that considered enrolment, subjects and quality of teaching and research. The first IT policy and implementation strategy was approved under the leadership of Prof. Atta-ur-Rahman, then Federal Minister of Science & Technology, in August 2000, which laid the foundations of the development of this sector. On the request of Prof. Atta-ur-Rahman, Intel initiated a nationwide programme to train school teachers in information and communication technologies in March 2002, which has led to the training of 220,000 school teachers in 70 districts and cities across Pakistan. A 15-year tax holiday was approved on the recommendation of Prof. Atta-ur-Rahman, which has resulted in growth of the IT business from $30 million in 2001 to over $3 billion. The Pakistan Austria University of Applied Engineering (Fachhochschule) has been established in Haripur Hazara under a steering committee chaired by Prof. Atta-ur-Rahman, in which students will receive degrees from several Austrian universities. Pakistan's growth in scientific output can be seen from the fact that in 1990 Pakistan published 926 scholarly documents, while in 2018 the number rose to 20,548, a more than twenty-fold increase. In contrast, India published 21,443 scholarly documents in 1990 and the number rose to 171,356 in 2018, an eight-fold increase. In 2018, 336 people per million were researchers in the R&D (research and development) sector in Pakistan, compared to 256 people per million in India.
The reforms begun by Prof. Atta-ur-Rahman FRS in 2003–2008 have continued over the subsequent decade and, according to the Web of Science report, there was a 300% growth in research publications in 2019 over the decade, with 2019 marking the first year in which Pakistan was ranked above the world average in research. In 2019, Pakistan produced 300% more publications indexed in the Web of Science Core Collection than in 2010. In the decade of 2010–2019, more than half of Pakistan's research was published in journals with an Impact Factor. The global influence of Pakistan's research is increasing as scientists in the country are publishing more in top-quartile journals. The Category Normalized Citation Impact of Pakistan's publications (which measures publications' impact against their peers worldwide) has risen from 0.67 to 1.03. As of 2020, Pakistan has 85% teledensity with 183 million cellular, 98 million 3G/4G and 101 million broadband subscribers, due to the foundations of the IT and telecom industry laid by Prof. Atta-ur-Rahman during 2000–2008. In an analysis of scientific research productivity of Pakistan, in comparison to Brazil, Russia, India and China, Thomson Reuters has applauded the developments that have taken place as a result of the reforms introduced by Prof. Atta-ur-Rahman FRS, since Pakistan has emerged as the country with the highest increase in the percentage of highly cited papers in comparison to the "BRIC" countries.
Chemistry remains the strongest subject in the country, with the International Center for Chemical and Biological Sciences playing the lead role and hosting the largest postgraduate research programme in the country, with about 600 students enrolled for Ph.D. studies. Physics (theoretical, nuclear, particle, laser, and quantum physics), materials science, metallurgy (engineering), biology, and mathematics are some of the other fields in which Pakistani scientists have contributed. From the 1960s onwards, the Pakistani government made the development and advancement of science a national priority and showered top scientists with honours. While the government has made efforts to make science a part of national development, there have been criticisms of federal policies, such as the government's dissolution of the Higher Education Commission of Pakistan (HEC) – an administrative body that supervised research in science – in 2011. This attempted dissolution failed to materialise because of a Supreme Court of Pakistan decision on a petition filed by Prof. Atta-ur-Rahman, former Federal Minister of Science & Technology and former founding Chairman of the Higher Education Commission. Pakistani scientists have also won acclaim in mathematics and in several branches of physical science, notably theoretical and nuclear physics, chemistry, and astronomy. Professor Abdus Salam, a theoretical physicist, won the Nobel Prize in Physics in 1979, being the first and only Pakistani to date to have received the honor. Prof. Atta-ur-Rahman, an organic chemist, was elected a Fellow of the Royal Society (London) in 2006 in recognition of his contributions in the field of natural products, thereby becoming the first scientist from the Islamic world to receive this honour for work carried out within an Islamic country. The contributions of Prof. Atta-ur-Rahman to uplifting science and higher education in Pakistan were internationally acknowledged, and a tribute was paid to him in the world's leading science journal Nature, which termed him "a force of nature". In recognition of building strong bridges between science in Pakistan and China, Prof. Atta-ur-Rahman FRS received the highest national award of China, the "International Science and Technology Cooperation Award". His book on NMR spectroscopy published by Springer Verlag was translated into Japanese and used for teaching courses on NMR spectroscopy in Japan. His book entitled "Stereoselective Synthesis in Organic Chemistry", published by Springer Verlag, was described as a "monumental tome" by the Nobel Laureate Sir Derek Barton, who wrote the foreword to the book.
Technology is highly developed in nuclear physics and explosives engineering, where the arms race with India convinced policymakers to set aside sufficient resources for research. Due to a programme directed by Munir Ahmad Khan and the Pakistan Atomic Energy Commission (PAEC), Pakistan is the seventh nation to have developed an atomic bomb, which the global intelligence community believes it had done by 1983 (see Kirana-I), nine years after India (see Pokhran-I). Pakistan first publicly tested its devices (see Chagai-I and Chagai-II) on 28 and 30 May 1998, two weeks after India carried out its own tests (See Pokhran-II).
Space exploration developed rapidly: in 1990 Pakistan launched Badr-1, followed by Badr-II in 2001. Since the 1980s, the space programme has dedicated itself to military technologies (space weapons programme and integrated missile systems), and maintains a strong programme developed for military applications.
Pakistan is an associate member of CERN, one of the few countries to obtain that status. Pakistan was ranked 107th in the Global Innovation Index in 2020, down from 105th in 2019.
During 2018–2019, the Government of Pakistan formed a number of task forces to strengthen science and technology, information technology and the knowledge economy. The task force formed in 2018 on "Technology Driven Knowledge Economy" is chaired by the Prime Minister, Imran Khan, and has Atta-ur-Rahman as its Vice Chairman. The group has several important Federal Ministers as members, including the Ministers of Finance, Planning, Education, IT/Telecom and Science & Technology and the Chairman of the Higher Education Commission. The task force aims to promote research in important and emerging technology fields. Another important task force of the Prime Minister is that on science and technology, with Atta-ur-Rahman as its chairman. As a result of the efforts of these task forces under the leadership of Prof. Atta-ur-Rahman FRS, a huge change has occurred in the Ministry of Science and Technology, and the development budget of the Federal Ministry of Science and Technology has been enhanced by over 600% due to the projects initiated by these task forces, allowing a large number of new important initiatives in the fields of materials engineering, genomics, industrial biotechnology, alternative energy, minerals, regenerative medicine, neuroscience, and artificial intelligence to be undertaken. Pakistan's first foreign engineering university (Pak Austria Fachhochschule) is a unique hybrid model involving a Fachhochschule half and a postgraduate research half, with a central technology park. With 8 foreign universities collaborating (3 Austrian and 5 Chinese), it has also started functioning under the supervision of a steering committee headed by Atta-ur-Rahman in Haripur, Hazara. A number of such foreign engineering universities are in the process of being established under the supervision of Prof. Atta-ur-Rahman FRS. These include one in Sialkot, the foundation stone of which has already been laid by the Prime Minister of Pakistan, and another in the lands behind Prime Minister House, Islamabad.
History
The Scientific and Technological Research Division (S&TR) was established in 1964 for (i) coordination and implementation of national science and technology policy; (ii) promotion and coordination of research and utilization of the results of research; (iii) development, production and utilization of nuclear energy; and (iv) coordination of utilization of scientific and technological manpower. The Division was administratively responsible for the National Science Council, the Council of Scientific and Industrial Research, the Atomic Energy Commission and Space and Upper Atmospheric Research Committee.
The Ministry of Science and Technology (MoS&T) has been functioning since 1972. It is the national focal point and enabling arm of the Government of Pakistan for planning, coordinating and directing efforts to initiate and launch scientific and technological programmes and projects, in line with the national agenda, towards a sound and sustainable science and technology research base for socio-economic development.
From the areas of industrial development to renewable energy and rural development, the Ministry suggests technological development for higher growth-rates and to improve standards of living. Its principal focus is on building Pakistan's technological competence and developing a larger pool of human resources to reverse brain drain, and for integrating the existing technological infrastructure for the strengthening of technology institutions, effective governance of S&TR and enhancing the capacity of indigenous innovation systems.
Golden age of science
The 1960s and 1970s are regarded as the initial rise of Pakistan's science, which gained an international reputation in the different science communities of the world. During this period, scientists contributed to, in particular, natural product chemistry; theoretical, particle, mathematical, and nuclear physics; and other major fields and subfields of chemistry and physics. The research was led by such scientists as Riazuddin, Ishfaq Ahmad, Salimuzzaman Siddiqui, Atta-ur-Rahman and Samar Mubarakmand. However, the major growth in scientific output occurred after the establishment of the Higher Education Commission, which was accompanied by a 60-fold increase in funding for science.
The real growth of science in Pakistan occurred under the leadership of Prof. Atta-ur-Rahman during 2000–2008, when he was the Federal Minister of Science & Technology and later Chairman of the Higher Education Commission (HEC) with the status of Federal Minister. The chairperson of the Senate Standing Committee on Education described the first 6 years of the HEC under Prof. Atta-ur-Rahman as "Pakistan's golden period". Thomson Reuters, in an independent assessment of Pakistan's progress in international publications, acknowledged that in the last decade there has been a fourfold increase in international publications and a tenfold growth in highly cited papers, statistics that were better than those of the BRIC countries.
The remarkable transformation of science and higher education under the leadership of Prof. Atta-ur-Rahman as Federal Minister of Science & Technology, and later as Chairman of the Higher Education Commission with the status of a Federal Minister, during the period 2000–2008 was applauded by many independent experts, and he was called a "force of nature" in a review published in Nature.
Dr. Abdus Salam, the first Pakistani winner of the Nobel Prize in Physics, was the father of physics research in Pakistan. Under Salam's direction, mathematicians and physicists tackled the greatest outstanding problems in physics and mathematics. From 1960 to 1974, Salam led the research at its peak, which prompted international recognition of Pakistani mathematicians and physicists and allowed them to conduct research at CERN. Salam and his students (Riazuddin, Fayyazuddin, and others) revolutionized particle and theoretical physics and are regarded as modern pioneers of particle physics in all its aspects. Pure research in quantum electrodynamics, quantum field theory, proton decay and other major fields of physics was pioneered by Pakistan's scientists. With the establishment of nuclear and neutron institutes in the country, Pakistan's mathematicians introduced complex mathematical applications to study and examine the behaviour of elements during the fission process. Salimuzzaman Siddiqui, Atta-ur-Rahman and Iqbal Choudhary are pioneering figures in the isolation of unique chemical compounds from the neem (Azadirachta indica), Rauvolfia, periwinkle (Catharanthus roseus), Buxus papillosa and various other plants.
State controlled science
Unlike in some Western countries, the majority of research programmes are conducted not at institutions (such as universities) but at specially established research facilities and institutes. Some of these institutes operate under the government's Ministry of Science, which oversees the development and promotion of science in the country, while others operate under the Pakistan Academy of Sciences, other specialized academies, and the research arms of various government ministries. At first, the core of fundamental science was the Pakistan Academy of Sciences, originally set up in 1953 and moved from Karachi to Islamabad in 1964. The Pakistan Academy of Sciences has a large percentage of researchers in the natural sciences, particularly physics. From 1947 to 1971, research was conducted independently with no government influence. The High Tension Laboratories (HTL) at the Government College University, Lahore (GCU) were established by R. M. Chaudhry with funds given by the British government in the 1950s. In 1967, Professor Abdus Salam led the foundation of the Institute of Theoretical Physics (ITP) at Quaid-e-Azam University, and the establishment of the Pakistan Institute of Nuclear Science and Technology (PINSTECH) and the Centre for Nuclear Studies; all were independently established by Pakistan's academic scientists with financial assistance provided by European countries. However, after Zulfikar Ali Bhutto became president, he took control of scientific research in 1972 as part of his intensified socialist reforms and policies. With advice from Dr. Mubashir Hassan, Bhutto established the Ministry of Science under Ishrat Hussain Usmani, a bureaucrat with a doctorate in atomic physics.
During the 1950s and 1960s, both West Pakistan and East Pakistan had their own academies of science, with East Pakistan relying on West Pakistan to allot the funds. Medical research is coordinated and funded by the Health Ministry and agricultural research is led by Agriculture Ministry and likewise, the research on environmental sciences is headed by the Environment Ministry.
In the aftermath of the 1971 Indo-Pakistan Winter War, President Bhutto increased government scientific funding by more than 200%, mostly dedicated to military research and development. Bhutto, with the help of his Science Adviser Dr. Salam, gathered hundreds of Pakistani scientists working abroad to develop what became Pakistan's atom bomb. This crash programme was directed at first by Dr. Abdus Salam until 1974, and then directed and led by Dr. Munir Ahmad Khan from 1974 until 1991. For the first time, a concerted government effort was made as Pakistan's citizens advanced nuclear physics, theoretical physics, and mathematics. In the 1980s, General Muhammad Zia-ul-Haq radicalized science by enforcing pseudoscience – with his Muslim fundamentalists installed as administrators – in Pakistan's schools and universities. Zia-ul-Haq later promoted Dr. Abdul Qadeer Khan to export sensitive industrial (military) technologies to Libya, Iran, and North Korea.
Because of government control, academic research in Pakistan remains highly classified and unknown to the international scientific community. There have been several failed attempts by foreign powers to infiltrate the country's research facilities to learn how far research has progressed and how much clandestine knowledge has been gained by Pakistan's scientists. One notable case was in the 1970s, when Libyan intelligence made an unsuccessful attempt to gain knowledge of critical aspects of nuclear technology and crucial mathematical fast-neutron calculations in theoretical physics; it was thwarted by the ISI Directorate for Joint Intelligence Technical (JIT). From the 1980s onward, both Russian intelligence and the Central Intelligence Agency made several attempts to access Pakistan's research, but because of the ISI they were unable to gain any information.
From 1980 to 2004, scientific research fell short, until General Pervez Musharraf established the Higher Education Commission (HEC), which heightened the contribution of science and technology in Pakistan. The major boost to science in Pakistan occurred under the leadership of Prof. Atta-ur-Rahman as the founding Chairman of the Higher Education Commission, when about 11,000 students were sent to top universities abroad for PhD and postdoctoral training. This resulted in an enormous increase in Pakistan's research output in impact-factor journals, from about 800 publications per year in 2000 to over 12,000 per year, and drew positive comments from Thomson Reuters about the sharp increase in highly cited papers in comparison to Brazil, Russia, India and China. Major research was undertaken by Pakistan's institutes in the field of natural sciences. In 2003, the Ministry of Science and Technology of the Government of Pakistan and the United States Department of State signed a comprehensive Science and Technology Cooperation Agreement that established a framework to increase cooperation in science, technology, engineering and education for mutual benefit and peaceful purposes between the science and education communities in both countries. In 2005, the United States Agency for International Development (USAID) joined with the Ministry of Science and Technology (MOST) and the Higher Education Commission of Pakistan to support the joint Pakistan-U.S. Science and Technology Cooperation Program. Beginning in 2008, the U.S. Department of State joined USAID as U.S. co-sponsor of the program.
This program, which is implemented by the National Academy of Sciences on the U.S. side, is intended to increase the strength and breadth of cooperation and linkages between Pakistani scientists and institutions and their counterparts in the United States. However, under unfavourable circumstances, research declined. In 2011, the government dissolved the HEC and control of education was taken over by governmental ministries. Prof. Atta-ur-Rahman filed a petition in the Supreme Court of Pakistan against the government action. The Supreme Court decided in favour of the stand taken by Prof. Atta-ur-Rahman, and the federal nature of the Higher Education Commission was preserved.
Science policy
National Science, Technology and Innovation Policy
The Federal Ministry of Science and Technology has overseen the S&T sector since 1972. However, it was not until 2012 that Pakistan's first National Science, Technology and Innovation Policy was formulated: this was also the first time that the government had formally recognized innovation as being a long-term strategy for driving economic growth. The policy principally emphasizes the need for human resource development, endogenous technology development, technology transfer and greater international co-operation in research and development (R&D).
The policy was informed by the technology foresight exercise undertaken by the Pakistan Council for Science and Technology from 2009 onwards. By 2014, studies had been completed in 11 areas: agriculture, energy, ICTs, education, industry, environment, health, biotechnology, water, nanotechnology, and electronics. Further foresight studies were planned on pharmaceuticals, microbiology, space technology, public health, sewage, and sanitation, as well as higher education.
National Science, Technology and Innovation Strategy
Following the change of government in Islamabad after the May 2013 general election, the new Ministry of Science and Technology issued the draft National Science, Technology and Innovation Strategy 2014–2018, along with a request for comments from the public. This strategy has been mainstreamed into the government's long-term development plan, Vision 2025, a first for Pakistan.
The central pillar of the draft National Science, Technology and Innovation Strategy is human development. Although the pathway to implementation is not detailed, the new strategy fixes a target of raising Pakistan's gross domestic expenditure on R&D (GERD) from 0.29% (2013) to 0.5% of GDP by 2015 then to 1% of GDP by the end of the current government's five-year term in 2018. The ambitious target of tripling the GERD/GDP ratio in just seven years is a commendable expression of the government's resolve but ambitious reforms will need to be implemented concurrently to achieve the desired outcome.
National prizes
The most prestigious government prize awarded for achievements in science and technology is the Nishan-e-Imtiaz (in English, Order of Excellence), while the Hilal-i-Imtiaz, Pride of Performance, Sitara-i-Imtiaz, and Tamgha-e-Imtiaz each occupy a unique role and importance in Pakistan's civil society. Atta-ur-Rahman is the only Pakistani scientist to have won all four of these civil awards.
Achievements
International achievements were first recorded in 1961, when Pakistan became the third Asian country and the tenth in the world to launch a rocket, the Rehbar-I – a solid-fuel expendable rocket – from Sonmani Spaceport. The Rehbar-I was developed and launched under the leadership of Dr. W. J. M. Turowicz, a Polish-Pakistani scientist and then project director of this programme. Since then, the programme continued its flights until the 1970s.
A major breakthrough occurred in 1979, when the Nobel Prize Committee awarded the Nobel Prize in Physics to Abdus Salam for formulating the electroweak theory – a theory that provides the basis for the unification of the weak nuclear force and the electromagnetic force. In 1990, the Space and Upper Atmosphere Research Commission (SUPARCO) launched the first, locally designed communication satellite, Badr-1, from the Xichang Satellite Launch Center (XSLC) in the People's Republic of China. With the launch, Pakistan became the first Muslim-majority country to have developed an artificial satellite and the second South Asian state, after India, to have launched one.
One of the most widely reported achievements came in 1998, when the country joined the nuclear club. India had conducted nuclear tests on 11 and 13 May 1998, under the codename Operation Shakti, at the long-constructed Pokhran Test Range (PTR). In response, under Prime Minister Nawaz Sharif, the Pakistan Atomic Energy Commission (PAEC) conducted five simultaneous tests at the Chagai Hills under the codename Chagai-I on 28 May 1998. PAEC carried out another test in the Kharan Desert, under Chagai-II, meaning it had tested six devices in under one week. With the testing of these atomic devices, Pakistan became the seventh nuclear power in the world, and the only Muslim-majority country to have mastered the technology. On 13 August 2011, SUPARCO launched its first indigenously developed geosynchronous satellite, Paksat-1R, also from XSLC in China.
In 2006, Prof. Atta-ur-Rahman was elected a Fellow of the Royal Society (London), thereby becoming the first scientist from the Muslim world to be so honoured in recognition of research and contributions carried out within an Islamic country. He has made major contributions to the development of natural product chemistry, and several international journals have published special issues in his honour in recognition of these contributions. As Chairman of the Higher Education Commission during 2002–2008, he contributed to a major development of science and technology that resulted in a significant increase in research publications in Pakistan, from only about 800 research papers in impact-factor journals in 2002 to over 11,000 publications in 2016, the quality of which has been recognised by Thomson Reuters. The International Centre for Chemical and Biological Sciences at the University of Karachi, which developed into a leading research centre in the region under the leadership of Prof. Atta-ur-Rahman, was designated a UNESCO Centre of Excellence in 2016. Prof. Atta-ur-Rahman was awarded the high civil award of the Government of Austria (the 'Grosses goldenes Ehrenzeichen am Bande') in 2007 in recognition of his contributions to uplifting science in Pakistan, and the Government of China honoured him with its highest award for foreigners (the Friendship Award) in recognition of his eminent contributions. The largest university of Malaysia, Universiti Teknologi Mara, established a research centre entitled the "Dr. Atta-ur-Rahman Research Institute of Natural Product Discovery" to honour him for uplifting science in Pakistan and in the Muslim world in his capacity as Coordinator General of COMSTECH, a ministerial committee comprising the Ministers of Science and Technology of the 57 OIC member countries. More recently, the leading Chinese university of traditional medicine in Changsha, Hunan, decided to name a research institute in honour of Prof. Atta-ur-Rahman FRS, in recognition of his eminent contributions to uplifting science in Pakistan and to establishing strong linkages with China.
In another landmark study undertaken by Thomson Reuters, highlighting the impact of the reforms introduced by Atta-ur-Rahman, it was revealed that the rate of growth of highly cited papers from Pakistan over a decade was even greater than that in Brazil, Russia, India or China.
Information technology
The rapid progress made by Pakistan in the IT and telecom sector during 2000–2002, under Professor Atta-ur-Rahman as Federal Minister, led to the spread of the internet from 29 cities in the year 2000 to 1,000 cities, towns and villages by 2002, and the spread of fiber from 40 cities to 400 cities in this period. The first IT policy and implementation strategy was approved under the leadership of Prof. Atta-ur-Rahman, then Federal Minister of Science & Technology, in August 2000, which laid the foundations of the development of this sector. Internet prices were reduced sharply from $87,000 per month for a 2 MB line to only $3,000 per month, and later to $90 per month. The mobile telephony boom also occurred under the leadership of Atta-ur-Rahman; it began with the drastic lowering of prices, the introduction of competition (Ufone) and a change in the system so that the person receiving a call was no longer required to pay any charges. A satellite (Paksat 1) was placed in space at a cost of only $4 million. These changes in the IT infrastructure proved invaluable for the higher education sector. The Pakistan Educational Research Network was set up in 2004, through which one of the finest digital libraries was established in universities. In 2002, few university libraries could subscribe to more than a handful of journals; today every student in every public sector university has free access to over 20,000 international journals with back volumes and over 60,000 books from 250 international publishers.
As of 2011, Pakistan has over 20 million internet users and is ranked as one of the top countries that have registered a high growth rate in internet penetration. Overall, it has the 15th largest population of internet users in the world. In the fiscal year 2012–2013, the Government of Pakistan aims to spend 4.6 billion rupees (Rs.) on information technology projects, with emphasis on e-government, human resource and infrastructure development.
Pakistan's information technology industry has gone through a dramatic change, and the country has taken the lead in adopting some technologies while also setting an example for others in global best practices. Matters relating to the IT industry are overseen and regulated by the Ministry of Information Technology of the Government of Pakistan. The IT industry is regarded as an economically successful sector of Pakistan, even during the financial crisis. The Government of Pakistan has extended numerous incentives to IT investors in the country over the last decade, which has resulted in the development of the IT sector. In the years 2003–2005, the country's IT exports rose by about fifty percent, amounting to a total of about US$48.5 million. The World Economic Forum, assessing the development of information and communication technology in the country, ranked Pakistan 102nd among 144 countries in its Global Information Technology Report of 2012.
Higher education reforms
Reform 2002–2009
In 2002, the University Grants Commission was replaced by the Higher Education Commission (HEC), which has an independent chairperson. The HEC was charged with reforming Pakistan's higher education system by introducing better financial incentives, increasing university enrolment and the number of PhD graduates, boosting foreign scholarships and research collaboration and providing all the major universities with state-of-the-art ICT facilities.
In a series of reforms in 2002, the HEC instituted major upgrades of scientific laboratories, rehabilitated existing educational facilities, expanded research support and oversaw the development of one of the best digital libraries in the region. Seeking to meet international standards, a quality assurance and accreditation process was also established. Some 95% of students sent abroad for training returned, an unusually high rate for a developing country, in response to improved salaries and working conditions at universities as well as bonding and strict follow-up by the commission, Fulbright and others. Within a limited timespan, the HEC provided all universities with free, high-speed Internet access to scientific literature, an upgrade of research equipment accessible across the country and a programme for the creation of new universities of science and technology, including science parks which attracted foreign investors.
International praise : Pakistan's Golden Period for Higher Education
Since the Higher Education Commission (HEC) reforms were carried out in 2002, the HEC has received praise from international higher education observers. Rahman, founding Chairman of the HEC, has received a number of international awards for the transformation of the higher education sector under his leadership. The German academic Dr. Wolfgang Voelter of Tübingen University reviewed the performance of the HEC under Rahman's leadership and described the reforms with the words "A miracle happened." After teaching and visiting at 15 universities in Pakistan, Voelter wrote that the "scenario of education, science and technology in Pakistan has changed dramatically, as never before in the history of the country". The chairperson of the Senate Standing Committee on Education announced the first 6 years of the HEC under Rahman as "Pakistan's golden period in higher education".
The American academic Prof. Fred M. Hayward has also praised the reform process undertaken by Pakistan, noting that "since 2002, a number of extraordinary changes have taken place." Hayward pointed out that "over the last six years almost 4,000 scholars have participated in Ph.D. programs in Pakistan", with more than 600 students studying in foreign PhD programs.
The HEC's reforms were also applauded by the United Nations Commission on Science and Technology for Development (UNCSTD), which reported that the "progress made was breath-taking and has put Pakistan ahead of comparable countries in numerous aspects". The UNCSTD has closely monitored developments in Pakistan in the past years, concluding that the HEC programme initiated under the leadership of Rahman is a "best-practice" example for developing countries aiming at building their human resources and establishing an innovative, technology-based economy. According to an article published in the leading science journal Nature, "Rahman's strong scientific background, enthusiasm for reform and impressive ability to secure cash made him a hit at home and abroad"; the same article quoted the view that "it really was an anomaly that we had a person of that stature with that kind of backing" and described Atta-ur-Rahman as "a force of nature".
Rahman has won four international awards for the revolutionary changes brought about in the higher education sector through the HEC. Nature, a leading science journal, has also written a number of editorials and articles about the transformation brought about in Pakistan's higher education sector under the HEC. In an article entitled "Pakistan Threat to Indian Science", published in the leading daily newspaper Hindustan Times, India, it was reported that Professor C. N. R. Rao, Chairman of the Indian Prime Minister's Scientific Advisory Council, made a presentation to the Indian Prime Minister on the rapid progress made by Pakistan in the higher education sector under the leadership of Rahman, Chairman of the Higher Education Commission. It was reported that, as a result of the reforms, "Pakistan may soon join China in giving India serious competition in science", and that "Science is a lucrative profession in Pakistan. It has tripled the salaries of its scientists in the last few years."
Decentralizing the governance of higher education
In 2011–2012, the HEC found itself on the brink of dissolution in the face of the 18th amendment to the Constitution, which devolved several governance functions, including higher education, to provincial governments. It was only after Prof. Atta-ur-Rahman FRS, former Chairman of the HEC, filed a petition before the Supreme Court of Pakistan, and the Supreme Court intervened in April 2011, that the commission was spared from being divided up among the four provinces of Baluchistan, Khyber Pakhtunkhwa, Punjab and Sindh.
Notwithstanding this, the HEC's developmental budget – that spent on scholarships, faculty training and the like – was slashed by 37.8% in 2011–2012, from a peak of Rs 22.5 billion (circa US$0.22 billion) in 2009–2010 to Rs 14 billion (circa US$0.14 billion). The higher education sector continues to face an uncertain future, despite the marginal increase in developmental spending brought by the new administration in Islamabad: Rs 18.5 billion (circa US$0.18 billion) in the 2013–2014 budget. According to HEC statistics, the organization's budget as a percentage of national GDP has consistently fallen from the 2006–2007 peak of 0.33% to 0.19% in 2011–2012.
In defiance of the Supreme Court ruling of April 2011, the provincial assembly of Sindh Province passed the unprecedented Sindh Higher Commission Act in 2013 creating Pakistan's first provincial higher education commission. In October 2014, Punjab Province followed suit as part of a massive restructuring of its own higher education system.
Effect of reforms on student numbers and academic output
Despite the turbulence caused by the legal battle being waged since the 2011 constitutional amendment discussed above, the number of degree-awarding institutions continues to grow throughout the country, both in the private and public sectors. University student rolls have continued to rise, from 0.28 million in 2001 to 0.47 million in 2005 and more than 1.2 million in 2014. Just under half of universities are privately owned.
Between 2002 and 2009, the HEC increased the number of PhD graduates to 6,000 per year and provided up to 11,000 scholarships for study abroad. The number of Pakistani publications recorded in Thomson Reuters' Web of Science (Science Citation Index Expanded) leapt from 714 to 3,614 over the same period, then to 6,778 by 2014, and to over 20,000 by 2020. This progress in scientific productivity appears to be due to the momentum generated by the larger numbers of faculty and student scholarships for study abroad, as well as the swelling ranks of PhD graduates. Critics argue that the rapid, massive increase in numbers has compromised quality; however, this claim has been refuted by neutral international experts.
Challenges
Pakistan has been known for some of its achievements in science and technology, such as the successful development of media and military technologies and a growing base of doctors and engineers, as well as a new influx of software engineers who have been contributing to Pakistan's information technology industry. Due to the present situation in Pakistan, around 3,000 Pakistani doctors emigrate to Western economies in search of suitable employment opportunities, contributing intellectually to the health sectors of developed countries while leaving Pakistan with the effects of a brain drain.
Pervez Hoodbhoy published a report on scientific output in Pakistan in which he claimed that research and scientific activity are lower than in many other developing countries, and asserted that Pakistan has produced fewer papers than neighboring India. Hoodbhoy's contentions have been questioned for using outdated data, and the increase in research output from Pakistan after the establishment of the Higher Education Commission in 2002 has been praised. This is borne out by a graphical comparison between Pakistan and India, which shows that Pakistan was 400% behind India in research publications per 10 million population in the year 2000 but overtook India in 2017 and, by 2018, was about 20% ahead of India according to Web of Science data.
In a report published by Thomson Reuters in 2016, it has been concluded that the rate of increase of highly cited papers in international journals from Pakistan is higher than that from Brazil, Russia, India or China.
Pakistan’s public-sector infrastructure for science and technology is complemented by academic institutions and the strategic and defence sectors. Over the years, these three components have vied for political patronage and societal recognition, leading to duplication and competition between the different bodies.
Scientific research institutions (SRI)
A large part of research is conducted by scientific research institutes that are semi-controlled by the government.
International Center for Chemical and Biological Sciences
H.E.J. Research Institute of Chemistry
Dr. Panjwani Centre for Molecular Medicine and Drug Research
School of Biological Sciences, Punjab University
National Center for Physics
National Institute for Biotechnology and Genetic Engineering
Abdus Salam School of Mathematical Sciences
PU Centre for High Energy Physics
Atta-ur-Rahman School of Applied Biosciences, NUST
Institute of Space and Planetary Astrophysics
National Engineering and Scientific Commission
Pakistan Institute of Nuclear Science and Technology
Institute of Space Technology
Council of Scientific and Industrial Research
Nuclear Institute for Agriculture and Biology
Nuclear Institute for Food and Agriculture
Technology Resource Mobilization Unit
Federal Bureau of Statistics
Mathematics Statistical Division
Science community of Pakistan
NUST Science Society
Pakistan Mathematical Society
Pakistan Agricultural Research Council
Pakistan Academy of Sciences
Pakistan Institute of Physics
Pakistan Astrophysicist Society
Pakistan Atomic Energy Commission
Pakistan Atomic Scientists Society
Pakistan Nuclear Society
National Information and Communication Technologies Research and Development Funds
Pakistan Science Foundation
Department of Pakistan Survey
Pakistan Geo-engineering and Geological Survey
Pakistan Cave Research & Caving Federation
Pakistan Physical Society
Pakistan Optical Society
Khwarizmi Science Society
Pakistan science club
Ghulam Ishaq Khan Institute of Engineering Sciences and Technology
Shaheed Zulfiqar Ali Bhutto Institute of Science and Technology
Pakistan Institute of Nuclear Science and Technology
National Institute of Food Science and Technology
USTAD Institute of Science & Technology Abbottabad
Royal Institute of Science & Technology Karachi
Gandhara Institute of Science & Technology
Sukkur Institute of Science & Technology
Bright Institute of Science and technology - Peshawar
Pakistan Advanced Institute of Science and Technology
See also
List of Pakistani inventions and discoveries
List of Pakistani scientists
Economy of Pakistan
Nergis Mavalvala
Sources
Further reading
References
External links
Pakistan to introduce technology in four Muslim countries
Science, Economy and Peace: A study focusing Pakistan |
3077577 | https://en.wikipedia.org/wiki/Libwww | Libwww | libwww (Library World Wide Web) is a modular client-side web API for Unix and Windows. It is also the name of the reference implementation of the libwww API.
It has been used for applications of varying sizes, including web browsers, editors, Internet bots, and batch tools. Pluggable modules provided with libwww add support for HTTP/1.1 with caching, pipelining, POST, Digest Authentication, and deflate.
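To give a sense of how the library is used, the following sketch shows a minimal client fetch in C. It is adapted from memory of the "load to chunk" pattern in the W3C libwww examples; the header and function names (HTProfile_newPreemptiveClient, HTRequest_new, HTLoadToChunk, HTChunk_toCString) are assumptions that should be verified against the libwww documentation before use.
```c
/* Minimal libwww client sketch -- function and header names are recalled from
 * the W3C libwww examples and must be checked against the actual libwww
 * documentation; treat this as indicative, not a verified build. */
#include "WWWLib.h"
#include "WWWInit.h"

int main(int argc, char **argv)
{
    const char *uri = argc > 1 ? argv[1] : "http://www.w3.org/";
    HTRequest *request;
    HTChunk *chunk;

    /* Set up a preemptive (blocking) client profile. */
    HTProfile_newPreemptiveClient("ExampleApp", "1.0");
    request = HTRequest_new();

    /* Load the document into a memory chunk. */
    chunk = HTLoadToChunk(uri, request);
    if (chunk) {
        char *text = HTChunk_toCString(chunk);   /* takes ownership of the data */
        if (text) {
            HTPrint("%s", text);
            HT_FREE(text);
        }
    }

    HTRequest_delete(request);
    HTProfile_delete();                          /* shut the library down */
    return 0;
}
```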
The purpose of libwww is to serve as a testbed for protocol experiments so that software developers do not have to "reinvent the wheel."
libcurl is considered to be a modern replacement for libwww.
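For comparison, the equivalent fetch with libcurl, named above as libwww's modern replacement, is a short sketch along these lines (the URL and build command are illustrative):
```c
/* Fetch a URL with libcurl; compile with: cc fetch.c -lcurl */
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    CURL *curl = curl_easy_init();               /* one easy handle per transfer */
    CURLcode res;

    if (!curl)
        return 1;
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.org/");
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);   /* follow redirects */
    res = curl_easy_perform(curl);               /* response body goes to stdout */
    if (res != CURLE_OK)
        fprintf(stderr, "curl_easy_perform() failed: %s\n", curl_easy_strerror(res));
    curl_easy_cleanup(curl);
    return 0;
}
```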
History
In 1991 and 1992, Tim Berners-Lee and a student at CERN named Jean-François Groff rewrote various components of the original WorldWideWeb browser for the NeXTstep operating system in portable C code, in order to demonstrate the potential of the World Wide Web. In the beginning libwww was referred to as the Common Library and was not available as a separate product. Before becoming generally available, libwww was integrated in the CERN program library (CERNLIB). In July 1992 the library was ported to DECnet. In the May 1993 World Wide Web Newsletter Berners-Lee announced that the Common Library was now called libwww and was licensed as public domain to encourage the development of web browsers. He initially considered releasing the software under the GNU General Public License, rather than into the public domain, but decided against it due to concerns that large corporations such as IBM would be deterred from using it by the restrictions of the GPL. The rapid early development of the library caused Robert Cailliau problems when integrating it into his MacWWW browser.
From 25 November 1994 (version 2.17) Henrik Frystyk Nielsen was responsible for libwww.
On 21 March 1995, with the release of version 3.0, CERN put the full responsibility for libwww on the World Wide Web Consortium (W3C). From 1995 onwards, the Line Mode Browser was no longer released separately, but part of the libwww package.
The W3C created the Arena web browser as a testbed and testing tool for HTML3, CSS, PNG and other features, as well as for libwww, but after beta 3, Arena was replaced by Amaya. On 2 September 2003, the W3C stopped development of the library due to a lack of resources, with the expectation that any further development would come from the open source community.
Features
Libwww supports the following protocols:
file
FTP
Gopher
HTTP 1.1 with a Persistent Cache Manager, pipelining
NNTP
Telnet
WAIS
Other features include:
TLS and SSL can be used through OpenSSL.
gzip compression and decompression through zlib
an HTML, RDF, SGML and XML parser and a style sheet manager
integration with a SQL database (using MySQL), e.g. for web crawlers
Libwww supports plug-ins.
Applications using libwww
Over 19 applications have used libwww.
Agora
Arena
Amaya
Cello
CERN httpd server
Cygwin
Distributed Oceanographic Data Systems with the OPeNDAP
GRIF Symposia, an HTML editor
Lynx
MacWWW
Mosaic
ROS (Robot Operating System)
TkWeb
tkWWW
WorldWideWeb (later Nexus)
Integrated applications in libwww are:
Command Line Tool, an application which shows how to use libwww for building simple batch mode tools for accessing the Web.
Line Mode Browser, a Spartan web browser.
Webbot, a simple application showing how to use libwww for building robots.
Mini Server, a small application showing how to implement a server or a proxy using libwww.
Criticism
The developers of libcurl have criticised libwww as being less portable, not thread-safe and lacking several HTTP authentication types.
Neither libcurl nor libwww is lightweight enough for some projects.
See also
Library for WWW in Perl
cURL
References
External links
libwww hackers
The architecture of libwww
The historic architecture of libwww presented on the Mosaic and the Web conference in Chicago
Application programming interfaces
C (programming language) libraries
Cross-platform free software
Free FTP clients
Free software programmed in C
Free web crawlers
Hypertext Transfer Protocol clients
Software using the W3C license
CERN software |
67440345 | https://en.wikipedia.org/wiki/Vincit | Vincit | Vincit Oyj (Vincit Plc) is a Finnish technology company concentrating on software development, service design, consulting, and products for data-driven service management and employee-oriented leadership. Vincit's transparent company culture received awards both in Finland and in Europe.
Vincit Oyj is listed on Nasdaq First North Growth Market Finland.
History
Vincit Oy 2007–2016
Mikko Kuitunen and Olli-Pekka Virtanen founded Vincit in 2007, with Kuitunen as CEO.
In autumn 2014, Vincit, then employing 100 people, opened two new offices in Helsinki. One focused on projects in the public sector, finance, commerce, and media, while the other developed medical systems in cooperation with GE Healthcare.
In July 2015, former Jolla CEO Tomi Pienimäki was appointed as Vincit's CEO. In August, Vincit merged with ICT service provider Javerdel Oy. The new company offered software development, devices, continuous services, server capacity, and financial services for companies and public administration. The merger raised Vincit's number of staff above 200. Vincit offered startups services that they couldn't otherwise afford in exchange for shares or after investing their own money into the start-up. After investing in 12 companies, Vincit decided to found a separate private equity company called Amor & Labor to manage its investments.
Vincit Group Oyj 2016–2018
In 2016, Vincit operated in Tampere and Helsinki, had a service center in Savonlinna, Finland, and an office in Palo Alto, California. In September, Vincit announced it was listing on the First North Growth Market operated by Nasdaq Helsinki, and the initial public offering began that month. Two-thirds of Vincit staff took part in the private offering to the employees.
In February 2017, Vincit announced the opening of a new office in Turku. Vincit acquired XTOPLY, a design company based in California specializing in digital design and service design, in June, and Avoltus, a software company based in Turku, Finland, in November. In October, Vincit acquired Linja Design, an IoT service design company located in Helsinki and Tampere. Vincit also acquired Solid Angle Oy, a Finnish WordPress company.
In January 2018, Vincit sold its service business Vincit Services Oy to DataCenter Finland Oy. Most of the services had become a part of Vincit during the Javerdel acquisition.
Vincit Oyj 2018–recent
Mikko Kuitunen made a comeback as Vincit's CEO in June 2018. At the same time, the word ‘Group’ was dropped from the company's name. Kuitunen wanted his employment contract to state that both the Vincit board and staff had the power to fire the CEO.
In January 2019, Vincit acquired LeanCraft Innovations, a software company based in Oulu, Finland. In June, one of the hundreds of Slack channels used by Vincit employees caused a stir due to disturbing material shared on the channel; the channel was closed, and the company created guidelines for internal communication. At the time, Vincit employed 450 people. In August, Vincit announced an organizational renewal, introducing a cell structure to regroup the staff, which had grown to 500 people; as the people no longer all knew each other, shared decision-making had become difficult. In October, Vincit sold its Vincit SIGN business to Telia Finland.
In early 2020, VincitEAM and Vincit LaaS began to operate as incorporated companies. During the spring, the outbreak of the COVID-19 pandemic hindered Vincit's business activities, as the company was operating on a relatively short order backlog. In the "humane cooperation negotiations", some of Vincit's employees decided to take unpaid leave voluntarily. In May, Vincit announced it would renew a web service of the Association for Finnish Work pro bono to support Finnish economic life during the exceptional circumstances.
Organization
Mikko Kuitunen is Vincit's CEO.
Since 2019, Vincit employees have been grouped into cells that operate like companies within a company. Cells have their own clients, own directors, reward systems, study groups and people and culture officers. Cells are responsible for employee and customer experience, utilization rate, and allocation. There can be up to a hundred people in one cell. The smallest cell with 30 employees is located in Oulu. In 2019, Tampere had three cells. Employees also have different informal cross-cell interest groups.
Vincit has offices at Tampere, Turku, Oulu, Helsinki, and Irvine, California. In addition to the parent company, the group comprises Vincit Coop Oy, VincitEAM Oy, Vincit LaaS Oy and Vincit's subsidiary in the USA.
Markets
Vincit offers software development and ICT services for corporate customers and the public sector. The majority of Vincit's clients are large or medium-sized enterprises and public sector actors. In 2016, Vincit put effort into the fastest-growing software development areas, such as the Industrial Internet. In 2019, Vincit's key product-based businesses were VincitEAM and LaaS. In 2017, Vincit had over 200 clients from several different fields, including General Electric, Taksi Helsinki, Yleinen työttömyyskassa, Kalmar, Logitech and Tommy Car Wash.
Vincit's core business area is project business.
VincitEAM (Vincit Enterprise Asset Management) provides ERP services, for example for power plants’ maintenance and asset management.
Vincit LaaS was a web service for employee-driven leadership. In 2017, Vincit turned its internal LaaS platform (Leadership as a Service) tool into a product. The tool has been compared to a web store where employees can order leadership services like performance appraisals, feedback, massages, LinkedIn profile coaching, or sleep trackers. The employees could also review the services, making the further development or cutting of services easier. The LaaS tool also offers real-time analytics and supports low organizational hierarchy. In November 2019, close to 50 organizations used LaaS.
Company culture
The distinctive features of Vincit's company culture are its recognizable communication style, low hierarchy, and transparency: salary data and operating figures are open to everyone. Vincit employees are free to make decisions related to their workplace well-being, such as purchasing their work devices with the credit cards they have access to. Vincit's employee benefits also include childcare services for sick children, family coaching, and sleep school for families with small children. Employees can borrow childcare products from the office and use the office spaces for children's birthday parties during the weekends. Vincit's company culture has won competitions, such as Työhyvinvointiteko 2012 (Workplace Well-being Accomplishment) and the Finnish Employer of the Year 2014 awards.
Vincit has also performed well in the Great Place to Work contest in Finland and Europe:
The Great Place to Work Institute Finland awarded Vincit as the Best Workplace in Finland in 2014, 2015, and 2016.
In June 2014, Vincit was in 8th place and a year later, in 3rd place as the Best Workplace in Europe. In June 2016, Vincit was awarded the Best Workplace in Europe.
In 2015, Vincit's Utopia project was awarded the Employee Wellbeing Accomplishment of the Year (Vuoden Henkilöstöteko). The Utopia project helped Vincit employees to turn their work-related dreams into reality. Their dreams focused, for example, on daily workflow, competence development, or work-life balance. The project led to the introduction of a new model where the role of administrative superiors was dropped, and the employees were given more autonomy over their personal development.
In 2020, Vincit was on Fast Company’s global list of Best Workplaces for Innovators.
Recognitions
Vincit and its management have received various entrepreneurship awards, including:
Vincit was among the top 10 finalists in the European Business Awards in 2011. Vincit also won the Ruban d’Honneur honorable mention and represented Finland in the 2013/2014 competition.
In December 2011, The Finnish Enterprise Agencies awarded Vincit the New Entrepreneur of the year.
In June 2012, Tietoviikko (Tivi) magazine awarded Vincit as the Tivi company of the year. CEO Kuitunen won the special prize of Young Entrepreneur of the Year in the Ernst & Young Entrepreneur of the Year Awards and the Best Young Entrepreneur in Finland award granted by the Finnish Ministry of Economic Affairs and Employment.
Kauppalehti magazine and OP Financial Group named Vincit the Company of the Year in November 2013.
In November 2014, Junior Chamber International Finland awarded Mikko Kuitunen as the Outstanding Young Professional of the Year and in January 2015, he was one of the winners of the Most Marketing-minded Engineer 2014 competition.
In 2016, the City of Tampere presented Kuitunen with the Tampere prize.
In 2018, Ville Houttu, CEO of Vincit California, was named the Entrepreneur of the Year by the Greater Irvine Chamber of Commerce.
References
External links
Official website
Software companies of Finland
Companies listed on Nasdaq Helsinki
2007 establishments in Finland |
29726536 | https://en.wikipedia.org/wiki/Computer%20processing%20of%20body%20language | Computer processing of body language | The normal way that a computer functions manually is through a person that controls the computer. An individual generates computer actions with the use of either a computer mouse or keyboard. However the latest technology and computer innovation might allow a computer to not only detect body language but also respond to it. Modern devices are being experimented with, that may potentially allow that computer related device to respond to and understand an individual's hand gesture, specific movement or facial expression.
Relating to computer science
Being able to read body language is an example of artificial intelligence. As stated by John McCarthy, who coined the term, artificial intelligence is "the science and engineering of making intelligent machines". In relation to computers and body language, research is being done using mathematics to teach computers to interpret human movements, hand gestures and even facial expressions. This differs from the way people generally communicate with computers, for example through mouse clicks, the keyboard, or other physical contact between the user and the computer.
Background information on research being done
MIAUCE and Chaabane Djeraba
This type of research is being done by a group of European researchers and other scientists. One example is the project MIAUCE (Multimodal interactions analysis and exploration of users within a Controlled Environment), whose scientists are working on making this kind of advance in computer technology a reality. Chaabane Djeraba, the project coordinator, stated: "The motivation of the project is to put humans in the loop of interaction between the computer and their environment."
Aims and motives of this project
Researchers and scientists are trying to apply these modern technological devices to the daily needs of businesses and of places people visit, such as malls or airports. The project coordinator of MIAUCE stated: "We would like to have a form of ambient intelligence where computers are completely hidden…this means a multimodal interface so people can interact with their environment. The computer sees their behavior and then extracts information useful for the user." This research group has developed several real-life prototypes of computer technology that use body language as a means of communication and operation.
Prototypes created by scientists and researchers
General personal use
Scientists and researchers experimented with people and computers to test whether computers could function by interpreting what an individual was physically doing, without the person actually having to touch the computer. In this experiment, researchers asked volunteers to try to control the computer using nothing but their eye movements. The science behind this involves electrooculography (EOG), a technique used to measure someone's eye movements. The measurements are then turned into a cursor on the computer screen for the individual, in this case the volunteers. The volunteers were shown a display of the letters of the alphabet on the computer screen and, with their eye movements alone, were able to move the cursor and perform tasks such as typing words.
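As an illustration of the idea described above, the sketch below maps two normalized EOG channel readings (horizontal and vertical eye position) to cursor coordinates. It is a hypothetical, simplified example: real EOG systems need signal filtering, drift correction and per-user calibration, and the value ranges and screen size used here are assumptions.
```c
/* Hypothetical sketch: map normalized EOG readings (-1.0 .. 1.0 on each
 * channel) to pixel coordinates. Filtering, drift correction and calibration,
 * all essential in a real EOG system, are deliberately omitted. */
#include <stdio.h>

#define SCREEN_W 1920
#define SCREEN_H 1080

static double clamp(double v)                 /* keep signals in the expected range */
{
    return v < -1.0 ? -1.0 : (v > 1.0 ? 1.0 : v);
}

static void eog_to_cursor(double eog_h, double eog_v, int *x, int *y)
{
    *x = (int)((clamp(eog_h) + 1.0) / 2.0 * (SCREEN_W - 1));
    *y = (int)((clamp(eog_v) + 1.0) / 2.0 * (SCREEN_H - 1));
}

int main(void)
{
    int x, y;
    eog_to_cursor(0.25, -0.5, &x, &y);        /* eyes slightly right and upward */
    printf("cursor at (%d, %d)\n", x, y);
    return 0;
}
```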
Security purposes in busy and overcrowded areas
An example of an applied version of this computer technology is monitoring the safety of people in heavily populated areas such as bus terminals, airports or shopping complexes. Monitoring such places requires security cameras at the site, but researchers are trying to use computers to manage such areas beyond video cameras alone. A video stream is evaluated mathematically, for example by describing shapes, the flow of people and their movements. That data is then analyzed in terms of crowd density, pace and direction of movement. Finally, the computer helps evaluate any activity in these crowded areas that seems irregular, and the information it produces can provide an alert or warning that something is not going the way it should. In the example of a mall, this sort of computer could provide an alert when, for instance, an individual has fallen from an escalator.
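A minimal sketch of the alerting step described above might look as follows; the grid, counts, speeds and thresholds are all invented for illustration and are not taken from the MIAUCE project.
```c
/* Hypothetical sketch: flag floor-grid cells whose crowd density and average
 * speed suggest an overcrowded, stalled crowd. All numbers are illustrative. */
#include <stdio.h>

#define CELLS 6

int main(void)
{
    int    count[CELLS] = { 3, 5, 4, 21, 4, 0 };             /* people per cell     */
    double speed[CELLS] = { 1.2, 1.1, 1.3, 0.2, 1.2, 0.0 };  /* average speed, m/s  */
    const int    max_count = 15;    /* density threshold                     */
    const double min_speed = 0.4;   /* a packed, stalled crowd moves slowly  */

    for (int i = 0; i < CELLS; i++) {
        if (count[i] > max_count && speed[i] < min_speed)
            printf("ALERT: cell %d overcrowded and stalled (%d people, %.1f m/s)\n",
                   i, count[i], speed[i]);
    }
    return 0;
}
```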
Marketing purposes
Helping businesses and local stores analyze how their customers behave while shopping is another example of how this type of computer technology could be applied. Researchers are using the idea of body-language-driven computing for marketing purposes. This involves monitoring how many people are in the street next to a given shop, as well as using a heat map generator. The heat map generator allows the manager or staff of a store to observe exactly how people move through it and to see which things in the store attract customers' attention the most.
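The heat map idea reduces to a simple accumulation over a grid, as in the sketch below; the grid size and the tracked positions are invented for the example.
```c
/* Hypothetical sketch: accumulate tracked customer positions into a coarse
 * 2D grid -- the core data structure behind a retail heat map generator. */
#include <stdio.h>

#define GRID_W 8
#define GRID_H 6

int main(void)
{
    int heat[GRID_H][GRID_W] = { 0 };
    /* (x, y) grid-cell positions, e.g. from a people-tracking camera */
    int positions[][2] = { {1, 2}, {1, 2}, {5, 4}, {1, 3}, {1, 2} };
    int n = (int)(sizeof positions / sizeof positions[0]);

    for (int i = 0; i < n; i++)
        heat[positions[i][1]][positions[i][0]]++;   /* one observation = one unit of heat */

    for (int y = 0; y < GRID_H; y++) {              /* print the accumulated map */
        for (int x = 0; x < GRID_W; x++)
            printf("%2d ", heat[y][x]);
        printf("\n");
    }
    return 0;
}
```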
Controversy and privacy concerns
Although this sort of technology seems helpful and practical, it raises concerns among people who are doubtful about how such computers would be used, particularly around security and privacy. Researchers and scientists therefore have to take into account that computer devices able to read body language should warn people of that capability. When building and deploying this kind of computer, for instance as a means of security, there has to be consideration of whether the technology would be accepted by the general public, including the people who would be observed and monitored by a computer that analyzes their body movements and gestures while they are doing something as simple as shopping at a mall or travelling through an airport.
See also
Emotion recognition
Facial recognition system
Facial Action Coding System
Machine translation of sign languages
3D pose estimation
References
External links
Computers Detecting Body Language
Artificial Intelligence
John McCarthy
Subfields of Computer Science
Online Artificial Intelligence Resource
Computers and Gestures
Mathematics and Computer Science
https://web.archive.org/web/20110717201127/http://www.faculty.iu-bremen.de/llinsen/publications/theses/Alen_Stojanov_Guided_Research_Report.pdf
http://www.physorg.com/news/2010-11-human-computer-music-links-musical-gestures.html
Human–computer interaction |
27565075 | https://en.wikipedia.org/wiki/Asana%20%28software%29 | Asana (software) | Asana ( or ) is a web and mobile work management platform designed to help teams organize, track, and manage their work. It is produced by the San Francisco based company of the same name (Asana, Inc.).
The company was founded in 2008 by Dustin Moskovitz and Justin Rosenstein. The product launched commercially in April 2012. In September 2020, the company was valued at $5.5 billion following its direct listing.
History
The co-founders met at Facebook, where Moskovitz, Facebook's co-founder and vice president of engineering, and his colleague Rosenstein created a productivity tool called Tasks. In 2008, Moskovitz and Rosenstein left Facebook to start Asana. Asana officially launched for free out of beta in November 2011 and commercially in April 2012.
In 2016, Asana raised $50 million in Series C financing led by Sam Altman, President of Y Combinator.
By January 2018, more than 35,000 paying customers were using Asana, including CityFibre, AB-InBev, Viessmann, eBay, Uber, Overstock, Navy Federal Credit Union, Icelandair, and IBM. That same year, the company raised $75 million in Series D funding led by Generation Investment Management, a firm backed by Al Gore. In November 2018, Asana raised another $50 million in funding in Series E to invest in international and product expansion.
In September 2020, Asana went public on the New York Stock Exchange via a direct public offering. In August 2021, Asana dual listed on the Long-Term Stock Exchange.
By December 2021, Asana’s customer count had increased to 114,000 with two million paid seats globally, and 739 of these customers were spending $50,000 or more on an annualized basis.
Product
Asana is a software-as-a-service platform designed to improve team collaboration and work management. It helps teams manage projects and tasks in one tool. Teams can create projects, assign work to teammates, specify deadlines, and communicate about tasks directly in Asana. It also includes reporting tools, file attachments, calendars, and more.
In May 2013, Asana launched Organizations, a way for companies of all sizes to use Asana, reporting tools to help teams monitor project progress, and IT admin tools.
In 2014, Asana launched its native iOS app and in January 2015, Asana released its native Android app. In September 2015, the company redesigned its application and brand adding a conversations feature.
In September 2016, the company launched custom fields, “an interface and architecture that will let you tailor Asana’s information management to cover a variety of structured data points”. A few months later, Asana launched Boards so teams could organize and visualize their projects in columns. The Verge reported that, “By integrating lists and boards into a single product, Asana may have just vaulted ahead of its rivals.” The company also released pre-made project templates.
In March 2017, Asana announced its integration with Microsoft Teams. In fall 2017, start dates, a new integration with Gmail, and comment-only projects were released. Also in November, Asana launched its app in French and German.
In March 2018, Asana announced a new interactive feature called Timeline, which businesses can use to visualize and map out their projects. Later that year, Asana launched its Business tier for enterprises using Asana for multiple projects.
In response to the start of the COVID-19 pandemic in 2020, Asana released its Do Not Disturb feature, targeted at workers who were more likely to work from home as a result of the sudden shift to remote work.
In July 2021, Asana launched an app for Zoom that can be opened within the videoconferencing software.
In October 2021, Asana announced its new Enterprise Work Graph suite.
API and integrations
In April 2012, Asana released its API to third-party developers. Asana's open API provides a means to programmatically read and input information, and create automations within Asana. Common use cases include automating repetitive tasks, chaining processes, automating reporting on tasks and projects, and syncing with databases or other tools.
The Asana API is a RESTful interface, allowing users to update and access much of their data on the platform.
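The following sketch illustrates how such programmatic read and write access might look from a script, using the Python requests library. The base URL, endpoint paths, payload layout, and identifiers are assumptions for illustration and should be checked against the current API documentation rather than treated as the definitive Asana API.

```python
# Hypothetical sketch of reading and creating tasks over a REST API such
# as Asana's. Endpoint paths, payload shapes and the project id below are
# illustrative assumptions, not verified API details.
import requests

BASE_URL = "https://app.asana.com/api/1.0"    # assumed REST base endpoint
TOKEN = "personal-access-token"               # placeholder credential
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def list_tasks(project_gid: str) -> list:
    """Programmatically read information: list the tasks in a project."""
    resp = requests.get(f"{BASE_URL}/tasks", headers=HEADERS,
                        params={"project": project_gid})
    resp.raise_for_status()
    return resp.json()["data"]

def create_task(project_gid: str, name: str) -> dict:
    """Programmatically input information: create a task in a project."""
    payload = {"data": {"name": name, "projects": [project_gid]}}
    resp = requests.post(f"{BASE_URL}/tasks", headers=HEADERS, json=payload)
    resp.raise_for_status()
    return resp.json()["data"]

if __name__ == "__main__":
    create_task("1200000000000001", "Review launch checklist")  # hypothetical ids
    for task in list_tasks("1200000000000001"):
        print(task.get("gid"), task.get("name"))
```

A script along these lines is the starting point for the common use cases mentioned above, such as automating repetitive tasks or syncing with other tools.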
In April 2021, Asana launched Asana Partners, which allows for cross platform integration with its project management software. Asana has integrations with more than 200 SaaS tools, including Gmail, Slack, Microsoft Outlook, Dropbox, Box, Google Drive, Zapier, IFTTT, Wufoo, JotForm, Okta, OneLogin, Harvest, Instagantt, Zendesk, Zoom and Qatalog.
Reception
Asana twice received a 4.5 out of 5 from PC Magazine. In 2017 it was an Editors' Choice, with the magazine calling it "one of the best collaboration and productivity apps for teams" and noting its "thoughtful design, fluid interactive elements, and generous member allotment." In 2020, it received a Best of the Year award, with the review remarking that it "is one of the best apps for managing tasks, workflows, and—yes—certain kinds of projects" despite not being a full-scale project management platform. Asana was also named one of the Best Workplaces for Parents in the US by Great Place to Work in 2020 and 2021.
See also
Collaboration software
Comparison of project management software
List of collaborative software
List of project management software
Project management software
References
External links
Task management software
Mission District, San Francisco
Web applications
Software companies based in the San Francisco Bay Area
2008 establishments in California
Companies based in San Francisco
American companies established in 2008
Software companies established in 2008
Software companies of the United States
Companies listed on the New York Stock Exchange
Direct stock offerings |
3118940 | https://en.wikipedia.org/wiki/Semantic%20wiki | Semantic wiki | A semantic wiki is a wiki that has an underlying model of the knowledge described in its pages. Regular, or syntactic, wikis have structured text and untyped hyperlinks. Semantic wikis, on the other hand, provide the ability to capture or identify information about the data within pages, and the relationships between pages, in ways that can be queried or exported like a database through semantic queries.
Semantic wikis were first proposed in the early 2000s, and began to be implemented seriously around 2005. As of 2021, well-known semantic wiki engines are Semantic MediaWiki and Wikibase.
Key characteristics
Formal notation
The knowledge model found in a semantic wiki is typically available in a formal language, so that machines can process it into an entity-relationship model or relational database.
The formal notation may be included in the pages themselves by users, as in Semantic MediaWiki, or it may be derived from the pages or the page names or the means of linking. For example, using a specific alternative page name might indicate that a specific type of link was intended.
Providing information through a formal notation allows machines to calculate new facts (e.g. relations between pages) from the facts represented in the knowledge model.
Semantic Web compatibility
The technologies developed by the Semantic Web community provide one basis for formal reasoning about the knowledge model that is developed by importing this data. However, there is also a wide array of technologies that work on relational data.
Example
Imagine a semantic wiki devoted to food. The page for an apple would contain, in addition to standard text information, some machine-readable or at least machine-intuitable semantic data. The most basic kind of data would be that an apple is a kind of fruit—what's known as an inheritance relationship. The wiki would thus be able to automatically generate a list of fruits, simply by listing all pages that are tagged as being of type "fruit." Further semantic tags in the "apple" page could indicate other data about apples, including their possible colors and sizes, nutritional information and serving suggestions, and so on.
If the wiki exports all this data in RDF or a similar format, it can then be queried in a similar way to a database—so that an external user or site could, for instance, request a list of all fruits that are red and can also be baked in a pie.
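A minimal sketch of this example, using the Python rdflib library, shows how facts exported from such a wiki could be represented as RDF triples and then queried like a database; the namespace and property names are invented for illustration.

```python
# Sketch of the food-wiki example: the structured facts behind the
# "apple" page are held as RDF triples and queried with SPARQL.
# The namespace and property names are illustrative only.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/foodwiki/")
g = Graph()

# Facts a semantic wiki could export from its pages
g.add((EX.Apple, RDF.type, EX.Fruit))              # inheritance: apple is a fruit
g.add((EX.Apple, EX.hasColor, Literal("red")))
g.add((EX.Apple, EX.canBeBakedInPie, Literal(True)))
g.add((EX.Banana, RDF.type, EX.Fruit))
g.add((EX.Banana, EX.hasColor, Literal("yellow")))

# "List all fruits that are red and can also be baked in a pie"
query = """
PREFIX ex: <http://example.org/foodwiki/>
SELECT ?fruit WHERE {
    ?fruit a ex:Fruit ;
           ex:hasColor "red" ;
           ex:canBeBakedInPie true .
}
"""
for row in g.query(query):
    print(row.fruit)   # -> http://example.org/foodwiki/Apple
```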
History
In the 1980s, before the Web began, there were several technologies to process typed links between collectively maintained hypertext pages, such as NoteCards, KMS, and gIBIS. Extensive research was published on these tools by the collaboration software, computer-mediated communication, hypertext, and computer supported cooperative work communities.
The first known usage of the term "Semantic Wiki" was a Usenet posting by Andy Dingley in January 2001. Its first known appearance in a technical paper was in a 2003 paper by Austrian researcher Leo Sauermann.
Many of the existing semantic wiki applications were started in the mid-2000s, including ArtificialMemory (2004), Semantic MediaWiki (2005), Freebase (2005), and OntoWiki (2006).
June 2006 saw the first meeting dedicated to semantic wikis, the "SemWiki" workshop, co-located with the European Semantic Web Conference in Montenegro. This workshop ran annually until 2010.
The site DBpedia, launched in 2007, though not a semantic wiki, publishes structured data from Wikipedia in RDF form, which enables semantic querying of Wikipedia's data.
In March 2008, Wikia, the world's largest wiki farm, made the use of Semantic MediaWiki available for all their wikis on request, thus allowing all the wikis they hosted to function as semantic wikis. However, since upgrading to version 1.19 of MediaWiki in 2013, they have stopped supporting Semantic MediaWiki for new requests on the basis of performance problems.
In July 2010, Google purchased Metaweb, the company behind Freebase.
In April 2012, work began on Wikidata, a collaborative, multi-language store of data, whose data could then be used within Wikipedia articles, as well as by the outside world.
Semantic wiki software
There are a number of wiki applications that provide semantic functionality. Some standalone semantic wiki applications exist, including OntoWiki. Other semantic wiki software is structured as extensions or plugins to standard wiki software. The best-known of these is Semantic MediaWiki, an extension to MediaWiki. Another example is the SemanticXWiki extension for XWiki.
Some standard wiki engines also include the ability to add typed, semantic links to pages, including PhpWiki and Tiki Wiki CMS Groupware.
Freebase, though not billed as a wiki engine, is a web database with semantic-wiki-like properties.
Common features
Semantic wikis vary in their degree of formalization. Semantics may be either included in, or placed separately from, the wiki markup. Users may be supported when adding this content, using forms or autocompletion, or more complex proposal generation or consistency checks. The representation language may be wiki syntax, a standard language like RDF or OWL, or some database directly populated by the tool that withdraws the semantics from the raw data. Separate versioning support or correction editing for the formalized content may also be provided. Provenance support for the formalized content, that is, tagging the author of the data separately from the data itself, varies.
What data can get formalized also varies. One may be able to specify types for pages, categories, or paragraphs or sentences (the latter features were more common in pre-web systems). Links are usually also typed. The source, property, and target may be determined by some defaults, e.g. in Semantic MediaWiki the source is always the current page.
Reflexivity also varies. More reflexive user interfaces provide strong ontology support from within the wiki, and allow it to be loaded, saved, created, and changed.
Some wikis inherit their ontology entirely from a pre-existing strong ontology like Cyc or SKOS, while, on the other extreme, in other semantic wikis the entire ontology is generated by users.
Conventional, non-semantic wikis typically still have ways for users to express data and metadata, usually by tagging, categorizing, and using namespaces. In semantic wikis, these features typically still exist, but integrated with other semantic declarations, and sometimes with their use restricted.
Some semantic wikis provide reasoning support, using a variety of engines. Such reasoning may require that all instance data comply with the ontologies.
Most semantic wikis have simple querying support (such as searching for all triples with a certain subject, predicate, object), but the degree of advanced query support varies; some semantic wikis provide querying in standard languages like SPARQL, while others instead provide a custom language. User interface support to construct these also varies. Visualization of the links especially may be supported.
Many semantic wikis can display the relationships between pages, or other data such as dates, geographical coordinates, and number values, in various formats, such as graphs, tables, charts, calendars, and maps.
See also
Microformats
Ontology
RDF, RDFS, OWL, SPARQL
Business Intelligence 2.0 (BI 2.0)
Software and websites:
Familypedia
Freebase
Gardenology.org
Math Images Project
Metavid
NeuroLex
OpenEI
SKYbrary
SNPedia
Wikidata
References
External links
Semantic wiki article at SemanticWeb.org
Semantic wiki projects - contains a list of active, defunct and proposed semantic wiki applications
SemanticWiki mini-series - a mini-series of online conferences about semantic wikis that ran in 2008 and 2009.
Semantic wiki software
Wikis by genre |
10354714 | https://en.wikipedia.org/wiki/LinuxForums.org | LinuxForums.org | LinuxForums.org was an Internet forum for Linux users needing free help and support with their Linux distributions and software, and computer hardware. It was owned by MAS Media Inc. With more than 200,000 registered members, it was one of the most active Linux forums and free software community sites on the Internet.
Support was given in different ways in specific forums, such as on a distribution level (for major distribution such as Red Hat/Fedora Core, Ubuntu, Suse, Slackware, Debian, Gentoo, Arch), but also on an operational level (for areas such as Wireless, Applications, Servers, Networking, Desktop / X Window, Programming & Scripting).
In November 2008 the forum changed ownership and did a complete overhaul of the site.
Breach and shutdown
In May 2018, the Linux Forums website suffered a data breach which resulted in the disclosure of 276k unique email addresses.
Since May 2020, the website has been offline. No explanation has been provided.
References
Internet properties established in 2009
Internet properties disestablished in 2020
Defunct websites |
34516594 | https://en.wikipedia.org/wiki/Hacker%20T.%20Dog | Hacker T. Dog | Hacker T. Dog (born 27 October) is a dog puppet who appears on the CBBC television channel in the United Kingdom. He is described as being a Border Terrier who was born and lives in Wigan. He is the half-brother of Dodge T. Dog and is the son of Mrs. T. Dog (his father is never mentioned or seen).
Development
Hacker was introduced as a character in the CBBC television programme Scoop, performed by Andy Heath. The character became popular, and began to appear as a weekday presenter at the CBBC office in May 2009, with the puppetry and voice now being performed by Phil Fletcher. His "half-brother" Dodge T. Dog joined the following year. In July, he began presenting on weekdays with Scottish comedian Iain Stirling. He presented with Stirling until his departure in 2013, when they began presenting with Chris Johnson.
In 2011, Hacker was given a solo presenting role on a separate chat show titled Hacker Time.
The character took a hiatus from CBBC in April 2014. During his absence, numerous guest presenters filled in for him during his usual weekday afternoon slot. Hacker returned temporarily on 24 May 2014, before making a permanent return on 18 June 2014.
Hacker is very fond of TV presenter Sue Barker and mentions her often. In 2009 Hacker was the mascot for the Wimbledon Lawn Tennis Championship.
Hacker is known for saying ‘cockers’, a Northern term for ‘mate’.
In other media
Hacker, Barry Davies and Amberley Lobo provided CBBC commentary on the Russia v Belgium match at the 2014 FIFA World Cup.
Hacker appeared on an episode of Celebrity Mastermind in January 2018, his specialist subject being the Pet Shop Boys. He came in second place, losing to TV chef Paul Rankin.
A Hacker T. Dog plush puppet was released in 2015 and made by Kidz Kreations, and sold in Argos and The Entertainer.
Social media
Hacker gained an official Twitter account in 2014, which as of March 2021 has over 42 thousand followers. An Instagram account was eventually created in April 2019, followed by an official YouTube channel. These social media channels are run by both Phil Fletcher and the BBC social media team.
Television
The character has appeared in the following roles:
See also
Hacker Time
Andy Heath (puppeteer)
Warrick Brownlow-Pike
Sue Barker
List of CBBC presenters
References
Fictional dogs
British comedy puppets |
927470 | https://en.wikipedia.org/wiki/Single%20sign-on | Single sign-on | Single sign-on (SSO) is an authentication scheme that allows a user to log in with a single ID to any of several related, yet independent, software systems.
True single sign-on allows the user to log in once and access services without re-entering authentication factors.
It should not be confused with same-sign on (Directory Server Authentication), often accomplished by using the Lightweight Directory Access Protocol (LDAP) and stored LDAP databases on (directory) servers.
A simple version of single sign-on can be achieved over IP networks using cookies but only if the sites share a common DNS parent domain.
For clarity, a distinction is made between Directory Server Authentication (same-sign on) and single sign-on: Directory Server Authentication refers to systems requiring authentication for each application but using the same credentials from a directory server, whereas single sign-on refers to systems where a single authentication provides access to multiple applications by passing the authentication token seamlessly to configured applications.
Conversely, single sign-off or single log-out (SLO) is the property whereby a single action of signing out terminates access to multiple software systems.
As different applications and resources support different authentication mechanisms, single sign-on must internally store the credentials used for initial authentication and translate them to the credentials required for the different mechanisms.
Other shared authentication schemes, such as OpenID and OpenID Connect, offer other services that may require users to make choices during a sign-on to a resource, but can be configured for single sign-on if those other services (such as user consent) are disabled. An increasing number of federated social logons, like Facebook Connect, do require the user to enter consent choices upon first registration with a new resource, and so are not always single sign-on in the strictest sense.
Benefits
Benefits of using single sign-on include:
Mitigate risk for access to 3rd-party sites ("federated authentication") because user passwords are not stored or managed externally
Reduce password fatigue from different username and password combinations
Reduce time spent re-entering passwords for the same identity
Reduce IT costs due to lower number of IT help desk calls about passwords
Simpler administration. SSO-related tasks are performed transparently as part of normal maintenance, using the same tools that are used for other administrative tasks.
Better administrative control. All network management information is stored in a single repository. This means that there is a single, authoritative listing of each user’s rights and privileges. This allows the administrator to change a user’s privileges and know that the results will propagate network wide.
Improved user productivity. Users are no longer bogged down by multiple logons, nor are they required to remember multiple passwords in order to access network resources. This is also a benefit to Help desk personnel, who need to field fewer requests for forgotten passwords.
Better network security. Eliminating multiple passwords also reduces a common source of security breaches—users writing down their passwords. Finally, because of the consolidation of network management information, the administrator can know with certainty that when he disables a user’s account, the account is fully disabled.
Consolidation of heterogeneous networks. By joining disparate networks, administrative efforts can be consolidated, ensuring that administrative best practices and corporate security policies are being consistently enforced.
SSO shares centralized authentication servers that all other applications and systems use for authentication purposes and combines this with techniques to ensure that users do not have to actively enter their credentials more than once.
Criticism
The term reduced sign-on (RSO) has been used by some to reflect the fact that single sign-on is impractical in addressing the need for different levels of secure access in the enterprise, and as such more than one authentication server may be necessary.
As single sign-on provides access to many resources once the user is initially authenticated ("keys to the castle"), it increases the negative impact in case the credentials are available to other people and misused. Therefore, single sign-on requires an increased focus on the protection of the user credentials, and should ideally be combined with strong authentication methods like smart cards and one-time password tokens.
Single sign-on also makes the authentication systems highly critical; a loss of their availability can result in denial of access to all systems unified under the SSO. SSO can be configured with session failover capabilities in order to maintain the system operation. Nonetheless, the risk of system failure may make single sign-on undesirable for systems to which access must be guaranteed at all times, such as security or plant-floor systems.
Furthermore, the use of single-sign-on techniques utilizing social networking services such as Facebook may render third party websites unusable within libraries, schools, or workplaces that block social media sites for productivity reasons. It can also cause difficulties in countries with active censorship regimes, such as China and its "Golden Shield Project," where the third party website may not be actively censored, but is effectively blocked if a user's social login is blocked.
Security
In March 2012, a research paper reported an extensive study on the security of social login mechanisms. The authors found 8 serious logic flaws in high-profile ID providers and relying party websites, such as OpenID (including Google ID and PayPal Access), Facebook, Janrain, Freelancer, FarmVille, and Sears.com. Because the researchers informed ID providers and relying party websites prior to public announcement of the discovery of the flaws, the vulnerabilities were corrected, and there have been no security breaches reported.
In May 2014, a vulnerability named Covert Redirect was disclosed. It was first reported as "Covert Redirect Vulnerability Related to OAuth 2.0 and OpenID" by its discoverer Wang Jing, a mathematics PhD student from Nanyang Technological University, Singapore. In fact, almost all single sign-on protocols are affected. Covert Redirect takes advantage of third-party clients susceptible to an XSS or Open Redirect.
In December 2020, flaws in federated authentication systems were discovered to have been utilized by attackers during the 2020 United States federal government data breach.
Due to how single sign-on works, by sending a request to the logged-in website to get an SSO token and sending a request with the token to the logged-out website, the token cannot be protected with the HttpOnly cookie flag and thus can be stolen by an attacker if there is an XSS vulnerability on the logged-out website, in order to do session hijacking. Another security issue is that if the session used for SSO is stolen (which can be protected with the HttpOnly cookie flag, unlike the SSO token), the attacker can access all the websites that are using the SSO system.
Privacy
As originally implemented in Kerberos and SAML, single sign-on did not give users any choices about releasing their personal information to each new resource that the user visited. This worked well enough within a single enterprise, like MIT where Kerberos was invented, or major corporations where all of the resources were internal sites. However, as federated services like Active Directory Federation Services proliferated, the user's private information was sent out to affiliated sites not under control of the enterprise that collected the data from the user. Since privacy regulations are now tightening with legislation like the GDPR, the newer methods like OpenID Connect have started to become more attractive; for example MIT, the originator of Kerberos, now supports OpenID Connect.
Email address
Single sign-on in theory can work without revealing identifying information such as email addresses to the relying party (credential consumer), but many credential providers do not allow users to configure what information is passed on to the credential consumer. As of 2019, Google and Facebook sign-in do not require users to share email addresses with the credential consumer. 'Sign in with Apple' introduced in iOS 13 allows a user to request a unique relay email address each time the user signs up for a new service, thus reducing the likelihood of account linking by the credential consumer.
Common configurations
Kerberos-based
Initial sign-on prompts the user for credentials, and gets a Kerberos ticket-granting ticket (TGT).
Additional software applications requiring authentication, such as email clients, wikis, and revision-control systems, use the ticket-granting ticket to acquire service tickets, proving the user's identity to the mail-server / wiki server / etc. without prompting the user to re-enter credentials.
Windows environment - Windows login fetches TGT. Active Directory-aware applications fetch service tickets, so the user is not prompted to re-authenticate.
Unix/Linux environment - Login via Kerberos PAM modules fetches TGT. Kerberized client applications such as Evolution, Firefox, and SVN use service tickets, so the user is not prompted to re-authenticate.
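A minimal sketch of this pattern for a scripted client is shown below. It assumes the third-party requests-kerberos package and a SPNEGO-enabled intranet server; once a ticket-granting ticket exists in the credential cache (for example from a desktop login or kinit), a service ticket is obtained and presented without prompting the user again.

```python
# Illustrative sketch: a Kerberized HTTP client that reuses an existing
# TGT to authenticate to an intranet web application without collecting
# a username or password. Assumes the third-party requests-kerberos
# package and a server configured for SPNEGO/Negotiate authentication.
import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL

def fetch_protected_page(url: str) -> str:
    # The library negotiates a service ticket from the cached TGT,
    # so no credentials are entered here.
    auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)
    resp = requests.get(url, auth=auth)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    # Hypothetical intranet wiki protected by Kerberos
    print(fetch_protected_page("https://wiki.example.corp/")[:200])
```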
Smart-card-based
Initial sign-on prompts the user for the smart card. Additional software applications also use the smart card, without prompting the user to re-enter credentials. Smart-card-based single sign-on can either use certificates or passwords stored on the smart card.
Integrated Windows Authentication
Integrated Windows Authentication is a term associated with Microsoft products and refers to the SPNEGO, Kerberos, and NTLMSSP authentication protocols with respect to SSPI functionality introduced with Microsoft Windows 2000 and included with later Windows NT-based operating systems. The term is most commonly used to refer to the automatically authenticated connections between Microsoft Internet Information Services and Internet Explorer. Cross-platform Active Directory integration vendors have extended the Integrated Windows Authentication paradigm to Unix (including Mac) and Linux systems.
Security Assertion Markup Language
Security Assertion Markup Language (SAML) is an XML-based method for exchanging user security information between an SAML identity provider and a SAML service provider. SAML 2.0 supports W3C XML encryption and service-provider–initiated web browser single sign-on exchanges. A user wielding a user agent (usually a web browser) is called the subject in SAML-based single sign-on. The user requests a web resource protected by a SAML service provider. The service provider, wishing to know the identity of the user, issues an authentication request to a SAML identity provider through the user agent. The identity provider is the one that provides the user credentials. The service provider trusts the user information from the identity provider to provide access to its services or resources.
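The sketch below illustrates one step of that exchange: how a service provider could construct the redirect that carries the authentication request to the identity provider using the SAML 2.0 HTTP-Redirect binding (deflate, then Base64, then URL-encode). The entity IDs and URLs are placeholders, and production deployments would use a maintained SAML library and signed requests rather than hand-built XML.

```python
# Minimal sketch of a service-provider-initiated redirect in the SAML 2.0
# HTTP-Redirect binding. Entity IDs and endpoints are placeholders.
import base64
import datetime
import uuid
import zlib
from urllib.parse import urlencode

IDP_SSO_URL = "https://idp.example.org/sso"          # assumed IdP endpoint
SP_ENTITY_ID = "https://sp.example.com/metadata"     # assumed SP entity ID
SP_ACS_URL = "https://sp.example.com/acs"            # assertion consumer service

def build_redirect_url() -> str:
    issue_instant = datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ")
    authn_request = (
        f'<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" '
        f'xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion" '
        f'ID="_{uuid.uuid4().hex}" Version="2.0" IssueInstant="{issue_instant}" '
        f'AssertionConsumerServiceURL="{SP_ACS_URL}">'
        f'<saml:Issuer>{SP_ENTITY_ID}</saml:Issuer>'
        f'</samlp:AuthnRequest>'
    )
    # Raw DEFLATE (no zlib header), then Base64, as the binding requires
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)
    deflated = compressor.compress(authn_request.encode()) + compressor.flush()
    saml_request = base64.b64encode(deflated).decode()
    return IDP_SSO_URL + "?" + urlencode({"SAMLRequest": saml_request})

if __name__ == "__main__":
    # The user agent would be redirected to this URL, authenticate at the
    # IdP, and return to the ACS URL with a signed SAML response.
    print(build_redirect_url())
```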
Emerging configurations
Mobile devices as access credentials
A newer variation of single-sign-on authentication has been developed using mobile devices as access credentials. Users' mobile devices can be used to automatically log them onto multiple systems, such as building-access-control systems and computer systems, through the use of authentication methods which include OpenID Connect and SAML, in conjunction with an X.509 ITU-T cryptography certificate used to identify the mobile device to an access server.
A mobile device is "something you have," as opposed to a password which is "something you know," or biometrics (fingerprint, retinal scan, facial recognition, etc.) which is "something you are." Security experts recommend using at least two out of these three factors (multi-factor authentication) for best protection.
See also
Central Authentication Service
Identity management
Identity management systems
List of single sign-on implementations
Password manager
Security Assertion Markup Language
Usability of web authentication systems
References
External links
Password authentication
Federated identity
Computer access control |
49724389 | https://en.wikipedia.org/wiki/Octaware%20Technologies | Octaware Technologies | Octaware Technologies is a software development company based in Mumbai, India. It was incorporated in 2005. It is India's first company claiming sharia compliance to be listed on the Bombay Stock Exchange; the listing took place on 3 April 2017. It received approval for listing in early 2016. The company operates domestically and internationally with subsidiaries registered in the United Arab Emirates, the United States, and India. It has marketing offices in Zimbabwe, Nigeria, Saudi Arabia and Qatar.
Profile and products
The company is appraised at SEI-CMMI Level 3, certified to ISO 9001:2008, and rated “SE-2A” (High Performance Capability and High Financial Strength) by NSIC-CRISIL. Octaware is a member of the Electronics and Software Promotion Council.
Octaware provides specialized software application and product development services and solutions in the areas of healthcare, finance, and e-government. The company has proprietary products for domestic as well as international markets, such as PowerERM (human capital relationship management), Hospice (a healthcare and citizen services solution), and an inventory management and tracking system. These products are available as packaged products as well as in a software-as-a-service model integrated with legacy systems.
People
Aslam Khan is Chairman and CEO of the company. Aslam Khan is a member of NASSCOM Foundation's Business Responsibility Forum. He is an alumnus of M. H. Saboo Siddik College of Engineering, and worked in Japan and the US before his return to India in 2005. Shariq Nisar, academic, activist and finance professional is a non-executive independent director.
CSR initiatives
Octaware Technologies has designed material for a training programme at the National Association for the Blind's Employment and Training (NABET) centre, which enables NABET personnel to train those with sight impairment in software testing; it has also provided trainee opportunities to work on professional assignments. CEO Khan is the founder of a school for children with special needs and a multi-speciality hospital in Mumbai. These are based on the social entrepreneurship model. Khan has been invited to speak with students so as to acquaint them with the opportunities available in the software industry in terms of employment and entrepreneurship.
References
Website Link
http://www.octaware.com/
Business software companies
Islamic legal occupations |
30053274 | https://en.wikipedia.org/wiki/Bonnie%20E.%20John | Bonnie E. John | Bonnie E. John (born September 10, 1955) is an American cognitive psychologist who studies human–computer interaction, predictive human performance modeling, and the relationship between usability and software architecture. She was a founding member of the Human-Computer Interaction Institute at Carnegie Mellon University, a research staff member at IBM's Thomas J. Watson Research Center, and the director of computation and innovation at The Cooper Union. She is currently a UX designer at Bloomberg L.P.
Background
A founding member of the Human-Computer Interaction Institute, established in 1993 at Carnegie Mellon University, she was previously an assistant professor in the Computer Science Department at Carnegie Mellon. She earned her Ph.D. in cognitive psychology at Carnegie Mellon University in 1988.
John has published over 100 technical papers in the area of human–computer interaction. She was elected to the CHI Academy in 2005. She was also a founding associate editor for ACM Transactions on Computer Human Interaction (TOCHI) and regularly serves on the ACM SIGCHI conference program committee. John was the director of the Masters Program in Human–Computer Interaction at Carnegie Mellon University from 1997 to 2009. John was a research staff member at IBM's Thomas J. Watson Research Center from December 2010 through December 2014. She returned to her alma mater, The Cooper Union, as the director of computation and innovation in December 2014. In July 2015, John joined Bloomberg's UX design team, to focus primarily on discoverability of new functionality on the Bloomberg Terminal.
Research
John researches techniques to improve the design of computer systems with respect to their usefulness and usability. She has investigated the effectiveness and usability of several HCI techniques (e.g., think-aloud usability studies, Cognitive Walkthrough, GOMS) and produced new techniques for bringing usability concerns to the design process (e.g., CPM-GOMS and Usability-Supporting Architectural patterns). Her team at Carnegie Mellon University has developed CogTool, an open-source tool to support Keystroke-Level Model analysis.
Honors
2005—Elected to CHI Academy
2007—Thomas A. Wasow Visiting Scholar in Symbolic Systems, Stanford University
References
External links
Bonnie E. John's Cooper Union projects, courses, publications, students, etc.
Bonnie E. John's IBM website
Bonnie E. John's Carnegie Mellon website
Human-Computer Interaction Institute
Human–computer interaction researchers
Carnegie Mellon University faculty
Cognitive scientists
Carnegie Mellon University alumni
Human-Computer Interaction Institute faculty
Cooper Union alumni
1955 births
Living people |
42440937 | https://en.wikipedia.org/wiki/Piranha%20%28software%29 | Piranha (software) | Piranha is a text mining system developed for the United States Department of Energy (DOE) by Oak Ridge National Laboratory (ORNL). The software processes large volumes of unrelated free-text documents and shows relationships amongst them, a technique valuable across numerous scientific and data domains, from health care fraud to national security. The results are presented in clusters of prioritized relevance to business and government analysts. Piranha uses the term frequency/inverse corpus frequency term weighting method which provides strong parallel processing of textual information, thus the ability to analyze very large document sets.
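A simplified sketch of term frequency/inverse corpus frequency style weighting is shown below: term counts in a new document are scaled by how rare each term is in a fixed reference corpus, so each document can be vectorized independently of the others, which is what makes parallel processing of large document sets straightforward. The exact TF-ICF formulation used in the ORNL publications may differ from this illustration.

```python
# Simplified, illustrative TF-ICF-style weighting. The reference corpus
# is fixed, so a new document's vector depends only on that corpus and
# the document itself (no global pass over the collection being mined).
import math
from collections import Counter

def corpus_frequencies(corpus: list[str]) -> Counter:
    """Count, for each term, how many reference documents contain it."""
    cf = Counter()
    for doc in corpus:
        cf.update(set(doc.lower().split()))
    return cf

def tf_icf_vector(document: str, cf: Counter, corpus_size: int) -> dict:
    """Weight each term's frequency by its rarity in the reference corpus."""
    tf = Counter(document.lower().split())
    return {
        term: count * math.log(1.0 + corpus_size / (1.0 + cf[term]))
        for term, count in tf.items()
    }

if __name__ == "__main__":
    reference = ["the reactor vessel design", "the turbine blade design",
                 "fraud detection in claims data"]
    cf = corpus_frequencies(reference)
    print(tf_icf_vector("reactor fraud detection report", cf, len(reference)))
```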
Piranha has six main strengths:
Collecting and Extracting: Millions of documents from numerous sources such as databases and social media can be collected, and text can be extracted from hundreds of file formats; this information can then be translated into any number of languages.
Storing and indexing: Documents in search servers, relational databases, etc. can be stored and indexed at will.
Recommending: Recommending the most valuable information for particular users.
Categorizing: Grouping items via supervised and semi-supervised machine learning methods and targeted search lists.
Clustering: Similarity is used to create a hierarchical group of documents.
Visualizing: Showing relationships among documents so that users can quickly recognize connections.
This work has resulted in eight issued patents (9,256,649, 8,825,710, 8,473,314, 7,937,389, 7,805,446, 7,693,9037, 7,315,858, 7,072,883), several commercial licenses (including TextOre and Pro2Serve), a spin-off company with the inventors, Covenant Health, and Pro2Serve called VortexT Analytics, two R&D 100 Awards, and scores of peer-reviewed research publications.
References
Cui, X., Beaver, J., St. Charles, J., Potok, T. (September 2008). Proceedings of the IEEE Swarm Intelligence Symposium, St. Louis, Mo. Dimensionality Reduction for High Dimensional Particle Swarm Clustering.
Yasin, Rutrell (Nov 29, 2012) GCN. Energy lab's Piranha puts teeth into text analysis
Franklin Jr., Curtis (Nov 30, 2012) Enterprise Efficiency. Piranha Brings Affordable Big-Data to Government
Breeden II, John (Dec 7, 2012) GCN. Swimming with Piranha: Testing Oak Ridge's text analysis tool
Kirby, Bob (Summer 2013) FedTech. Big Data Can Help the Federal Government Move Mountains. Here's How.
R. M. Patton, B. G. Beckerman, T. E. Potok, G. Tourassi, "A Recommender System for Web-Based Discovery and Refinement of Information Radiologists Seek", Radiological Society of North America (RSNA), 2012 Annual Meeting, Nov. 2012, Chicago, IL, USA.
R. M. Patton, T. E. Potok, B. A. Worley, "Discovery & Refinement of Scientific Information via a Recommender System", The Second International Conference on Advanced Communications and Computation, Oct. 2012, Venice, Italy.
J. W. Reed, T. E. Potok, and R. M. Patton, "A multi-agent system for distributed cluster analysis," in Proceedings of Third International Workshop on Software Engineering for Large-Scale Multi- Agent Systems (SELMAS'04)" W16L Workshop - 26th International Conference on Software Engineering Edinburgh, Scotland, UK: IEE, 2004, pp. 152-5.
J. Reed, Y. Jiao, T. E. Potok, B. Klump, M. Elmore, and A. R. Hurson, "TF-ICF: A New Term Weighting Scheme for Clustering Dynamic Data Streams," in Proceedings of 5th International Conference on Machine Learning and Applications (ICMLA'06). vol. 0 ORLANDO, FL, 2006, pp. 258–263.
Awards
2007 R&D 100 Magazine's Award Piranha (software)
Patents
– System for gathering and summarizing internet information
– Method for gathering and summarizing internet information
– Agent-based method for distributed clustering of textual information
– Dynamic reduction of dimensions of a document vector in a document search and retrieval system
– Method and system for determining precursors of health abnormalities from processing medical records
External links
DOE Energy Innovation Portal (2014) Agent-Based Software for Gathering and Summarizing Textual and Internet Information.
ORNL Piranha website
Cluster computing
Data mining and machine learning software
Agent-based software |
36694312 | https://en.wikipedia.org/wiki/Perf%20%28Linux%29 | Perf (Linux) | perf (sometimes called perf_events or perf tools, originally Performance Counters for Linux, PCL) is a performance analyzing tool in Linux, available from Linux kernel version 2.6.31 in 2009. Userspace controlling utility, named perf, is accessed from the command line and provides a number of subcommands; it is capable of statistical profiling of the entire system (both kernel and userland code).
It supports hardware performance counters, tracepoints, software performance counters (e.g. hrtimer), and dynamic probes (for example, kprobes or uprobes). In 2012, two IBM engineers recognized perf (along with OProfile) as one of the two most commonly used performance counter profiling tools on Linux.
Implementation
The interface between the perf utility and the kernel consists of only one syscall and is done via a file descriptor and a mapped memory region. Unlike LTTng or older versions of oprofile, no service daemons are needed, as most functionality is integrated into the kernel. The perf utility dumps raw data from the mapped buffer to disk when the buffer becomes filled up. According to R. Vitillo (LBNL), profiling performed by perf involves a very low overhead.
Architectures that provide support for hardware counters include x86, PowerPC64, UltraSPARC (III and IV), ARM (v5, v6, v7, Cortex-A8 and -A9), Alpha EV56 and SuperH. Usage of Last Branch Records, a branch tracing implementation available in Intel CPUs since Pentium 4, is available as a patch. Since version 3.14 of the Linux kernel mainline, released on March 31, 2014, perf also supports running average power limit (RAPL) for power consumption measurements, which is available as a feature of certain Intel CPUs.
Perf is natively supported in many popular Linux distributions, including Red Hat Enterprise Linux (since its version 6 released in 2010) and Debian in the linux-tools-common package (since Debian 6.0 (Squeeze) released in 2011).
Subcommands
perf is used with several subcommands (a scripted example of the stat subcommand follows the list):
stat: measure total event count for single program or for system for some time
top: top-like dynamic view of hottest functions
record: measure and save sampling data for single program
report: analyze file generated by perf record; can generate flat, or graph profile.
annotate: annotate sources or assembly
sched: tracing/measuring of scheduler actions and latencies
list: list available events
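As an illustration of the stat subcommand, the sketch below drives perf from a script to collect an event count for a single program. It assumes a Linux system with the perf utility installed and sufficient permission to read performance counters; the output handling is deliberately minimal.

```python
# Illustrative sketch: running "perf stat" on a single program and
# returning perf's counter summary, which perf writes to standard error.
# Event name and target command are examples only.
import subprocess

def perf_stat(command: list[str], event: str = "task-clock") -> str:
    """Run a command under 'perf stat -e <event>' and return perf's report."""
    result = subprocess.run(
        ["perf", "stat", "-e", event, "--"] + command,
        capture_output=True,
        text=True,
    )
    return result.stderr

if __name__ == "__main__":
    print(perf_stat(["sleep", "1"], event="task-clock"))
```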
Criticism
The documentation of perf is not very detailed (as of 2014); for example, it does not document most events or explain their aliases (often external tools are used to get names and codes of events). Perf tools also cannot profile based on true wall-clock time.
Security
The perf subsystem of Linux kernels from 2.6.37 up to 3.8.8 and RHEL6 kernel 2.6.32 contained a security vulnerability which was exploited to gain root privileges by a local user. The problem was due to an incorrect type being used (32-bit int instead of 64-bit) in the event_id verification code path.
See also
List of performance analysis tools
OProfile
Performance Application Programming Interface
Profiling (computer programming)
References
External links
perf's wiki on kernel.org
Arnaldo Carvalho de Melo, The New Linux ’perf’ tools, presentation from Linux Kongress, September, 2010
Linux kernel profiling with perf tutorial
Hardware PMU support charts - check perf_event column
perf Examples by Brendan Gregg
Linux kernel features
Linux programming tools
Profilers |
16977588 | https://en.wikipedia.org/wiki/Ubuntu-restricted-extras | Ubuntu-restricted-extras | Ubuntu Restricted Extras is a software package for the computer operating system Ubuntu that allows the user to install essential software which is not already included due to legal or copyright reasons.
It is a meta-package that installs:
Support for MP3 and unencrypted DVD playback
Microsoft TrueType core fonts
Adobe Flash plugin
codecs for common audio and video files
Background
The software in this package is not included in Ubuntu by default because Ubuntu maintainers want to include only completely free software in out-of-the-box installations. The software in this package may be closed-source, encumbered by software patents, or otherwise restricted. For example, the Adobe Flash plugin is a closed-source piece of software. Additionally, many multimedia formats such as MP3 and H.264 are patented. In countries where these patents apply, legally distributing software that use these formats may require paying licensing fees to the patent owners.
Contents
The Ubuntu Restricted Extras is a metapackage and has the following dependencies:
flashplugin-installer
gstreamer0.10-ffmpeg
gstreamer0.10-fluendo-mp3
gstreamer0.10-pitfdll
gstreamer0.10-plugins-bad
gstreamer0.10-plugins-ugly
gstreamer0.10-plugins-bad-multiverse
gstreamer0.10-plugins-ugly-multiverse
icedtea6-plugin
libavcodec-extra-52
libmp4v2-0
ttf-mscorefonts-installer
unrar
Starting with Ubuntu 10.10, several of these dependencies are included indirectly via another meta-package ubuntu-restricted-addons which is included by default.
Inclusion
Due to the legal status of the software included in Ubuntu Restricted Extras, the package is not included by default on any Ubuntu CDs. However, some distributions, such as Super OS, Pinguy OS and Revamplinux, do bundle the package on their installation CDs or DVDs. A listing of the formats offered via this package is available on help.ubuntu.com, Ubuntu's official online wiki.
The above does not imply that the software is pirated or in any way illegal to download, nor that Canonical is breaking the law by offering it. The software is distributed separately from the default installation so that the legal restrictions that apply to it, such as patents and licensing terms, are easier to comply with; that is what is meant by "due to the legal status" above.
See also
deb format
References
Linux package management-related software
Ubuntu |
3260155 | https://en.wikipedia.org/wiki/John%20Quarterman | John Quarterman | John S. Quarterman (born April 27, 1954) is an American author and long time Internet participant. He wrote one of the classic books about networking prior to the commercialization of the Internet. He has also written about risk management.
Biography
Quarterman grew up in the Bemiss community, near Valdosta, Georgia, USA.
He first used the ARPANET in 1974 while attending Harvard, and worked on UNIX ARPANET software at BBN (the original prime contractor on the ARPANET) from 1977 to 1981. He was twice elected to the Board of Directors of the USENIX Association, a professional association related to the UNIX operating system. While on that board, he was instrumental in its vote in 1987 to approve the first funding received by UUNET, which, along with PSINet, became one of the first two commercial Internet Service Providers (ISPs).
He co-founded the first Internet consulting firm in Texas (TIC) in 1986, and co-founded one of the first ISPs in Austin (Zilker Internet Park, since sold to Jump Point). He was a founder of TISPA, the Texas ISP Association.
He was a founder and Chief Technology Officer of Matrix NetSystems Inc., established as Matrix Information and Directory Services (MIDS) in 1990. At Matrix, Quarterman published the first maps of the whole Internet; conducted the first Internet Demographic Survey and started the first continuing series of performance data about the entire Internet in 1993, on the web since 1995 in the Internet Weather Report, and also visible as Internet Average, plus comparisons of ISPs visible as ISP Ratings. Matrix NetSystems, which had also been known as Matrix.Net, Inc., merged with Alignment Software, Inc. in April, 2003, briefly becoming Xaffire Inc., before Keynote Systems, Inc. acquired Xaffire's Austin operations in December 2003, and private equity firm Thoma Bravo merged Keynote, which it had acquired in 2013, into Dynatrace.
Inter@ctive Week listed John Quarterman as one of the 25 Unsung Heroes of the Internet in 1998, saying: "As president of [MIDS], Quarterman, 43, is to Net demographics what The Gallup Organization is to opinion polls." Internet World interviewed Quarterman at length, with a full-page picture, in its June 1996 issue, as Surveyors of Cyberspace.
On September 21, 2006, Quarterman served as a panelist with Hank Hultquist and Michele Chaboudy at a joint meeting of the IEEE Central Texas Section and Communications and Signal Processing Chapters titled "Network Neutrality: Altruism or Capitalism" at St. Edward's University in Austin, Texas. He also organized a November 2, 2006 panel on Net Neutrality for EFF-Austin, featuring Quarterman and Hank Hultquist, Michael Hathaway, and Austin Bay.
Major works
The Design and Implementation of the 4.3BSD UNIX Operating System, Addison-Wesley, January 1989. Co-authored with Samuel J. Leffler, Marshall K. McKusick, and Michael J. Karels (describing a system which has been very influential on the TCP/IP protocols).
The Matrix: Computer Networks and Conferencing Systems Worldwide, Digital Press, 1990 (a comprehensive book on the history, technology, and people of computer networks worldwide).
Practical internetworking with TCP/IP and UNIX, Addison-Wesley, September 1993. Co-authored with Smoot Carl-Mitchell.
The Design and Implementation of the 4.4BSD Operating System, Addison-Wesley, April 1996. Co-authored with Marshall K. McKusick, Keith Bostic, and Michael J. Karels.
Risk Management Solutions for Sarbanes-Oxley Section 404 IT Compliance, Wiley, 2006.
Trivia
At RIPE-58, Daniel Karrenberg revealed that John Quarterman originally came up with the acronym RIPE after seeing a slide made by Karrenberg that read "Réseaux IP Européens" at a meeting in Brussels in 1989.
References
External links
John Quarterman's website
Matrix (entry in the Jargon File)
The Design and Implementation of the 4.4BSD Operating System
John Quarterman's Clan Sinclair website
Harvard University alumni
BSD people
1954 births
Living people
People from Lowndes County, Georgia
American male writers |
5999928 | https://en.wikipedia.org/wiki/D.%20C.%20Heath%20and%20Company | D. C. Heath and Company | D.C. Heath and Company was an American publishing company located at 125 Spring Street in Lexington, Massachusetts, specializing in textbooks.
History
The company was founded in Boston by Edwin Ginn and Daniel Collamore Heath in 1885. D.C. Heath and Company was owned by Raytheon from 1966 to 1995. When Raytheon exited the textbook market, it sold the company to Houghton Mifflin.
D.C. Heath started a small division of software editors to supplement the textbooks in the early 1980s. The editors strove to make the software packages independent of the books. There were test banks that allowed teachers to pick and choose questions for their quizzes and tests. Development was further supported to enable teachers to create their own questions, including a formula editor, tagging items by objectives, and including custom graphics in the question as well as in the answer key. This was for the Apple II, and later for Windows and Macintosh computers. Many titles were commissioned in the areas of science, math, reading, social studies, and modern languages. These were interactive original programs. D.C. Heath gave this group their own identity, Collamore Educational Publishing. The editors were involved in all facets of the publishing process, including contracts, development, design, publishing, marketing, and sales. Schools were just transitioning from the one-computer classroom to the computer lab. In 1988 most of the software was being supported by the William K. Bradford Publishing Company, composed initially of D. C. Heath / Collamore personnel.
Publications (note: there are far more titles than are listed here)
Heath Elementary Science, by Herman and Nina Schneider, 6 volumes (1955)
Heath middle level literature (1996?)
Heath Physics (1992)
Fundamentals of Personal Rapid Transit (1978)
Discovering French Bleu: Complete Lesson Plans
The Enduring Vision: A History of the American People Third Edition (1996)
Ruy Blas by Victor Hugo (1933)
The Renaissance Medieval or Modern?
The Enduring South: Subcultural Persistence in Mass Society (1972), by John Shelton Reed.
The Story of Georgia, Massey and Wood, 1904
MC68000: Assembly Language and Systems Programming (1988)
Victor Hugo's Les Misérables: French Edition
Elizabeth Rice Allgeier, Albert Richard Allgeier, Sexual Interactions, 1991
A Short German Grammar for High Schools and Colleges.(The book cover just says "German Grammar" for the title) by E.S.Sheldon, tutor in German in Harvard University (1903), copyright 1879
The Causes of the American Revolution (1950, 1962, 1973)
Builders of the Old World, Written by Gertrude Hartman and Illustrated by Marjorie Quennell (1951)
Composition and Rhetoric by William Williams copyright 1890 published 1893
The Nazi Revolution: Germany's Guilt or Germany's Fate?
Children and Their Helpers New American Readers For Catholic Schools by School Sisters of Notre Dame. (1938)
Donald Duck Sees South America (1945) H. Marion Palmer, Walt Disney
Old Time Stories of the Old North State by L.A.McCorkle (1903)
Old Testament Narratives selected and edited by Roy L. French and Mary Dawson (1931)
Hamlet The Arden Shakespeare, edited by E. K. Chambers, B.A (1908)
Discussions of Literature series, general editor Joseph H. Summers
Eugenie Grandet: French Edition, by Honore de Balzac, Abridged and Edited with Introduction Notes and Vocabulary by A.G.H. Spiers, PH.D., (copyright 1914)
"The Bug In The Hut" and "Nat The Rat", unknown authors, (1968 - 1970)
Modern European History - Revised Edition - by Hutton Webster Ph.D. (Copyright 1920 and 1925)
World Civilization (1940, 1944, 1949) by Hutton Webster and Edgar Bruce Wesley
Hints Toward A Select and Descriptive Bibliography of Education - by G Stanley Hall and John M Mansfield (Copyright 13 August 1886)
Heath's Modern Language Series. Gerstacker's Germelshausen. 1894.
Elementary Linear Algebra: Second Edition by Roland E. Larson and Bruce H. Edwards (1991)
Software titles, initially edited by Hal Wexler, software editor, 1984–1988, then transitioned to William K. Bradford Publishing Company.
Campaign Organization by Xandra Kayden (1978)
References
Defunct book publishing companies of the United States
Companies based in Lexington, Massachusetts
Publishing companies established in 1885
Economic history of Boston
Educational software |
2986645 | https://en.wikipedia.org/wiki/Non-functional%20requirements%20framework | Non-functional requirements framework | Non-functional requirements (NFRs) need a framework to structure them. The analysis begins with softgoals that represent NFRs which stakeholders agree upon. Softgoals are goals that are hard to express, but tend to be global qualities of a software system, such as usability, performance, security, and flexibility in a given system. If the team starts collecting them, it often finds a great many of them. In order to reduce the number to a manageable quantity, structuring is a valuable approach. There are several frameworks available that are useful as structure.
Structuring Non-functional requirements
The following frameworks are useful to serve as structure for NFRs:
1. Goal Modelling
The finalised softgoals are then usually decomposed and refined to uncover a tree structure of goals and subgoals, for example for the flexibility softgoal. Once tree structures are uncovered, one is bound to find interfering softgoals in different trees; e.g., security goals generally interfere with usability. These softgoal trees now form a softgoal graph structure. The final step in this analysis is to pick some particular leaf softgoals, so that all the root softgoals are satisfied.[1]
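The toy sketch below illustrates the idea of decomposition and leaf selection: leaf softgoals are either selected (satisficed) or not, and satisfaction propagates up through AND/OR refinements to the root. Real NFR-framework evaluation procedures are richer (partial satisfaction, denial, contribution links), so this is only an illustration of the structure, and the goal names are invented.

```python
# Toy softgoal tree: satisfaction of selected leaves propagates upward
# through AND/OR decompositions to the root softgoal.
from dataclasses import dataclass, field

@dataclass
class Softgoal:
    name: str
    decomposition: str = "AND"               # how children combine: "AND" or "OR"
    children: list["Softgoal"] = field(default_factory=list)

def satisfied(goal: Softgoal, selected_leaves: set) -> bool:
    if not goal.children:                     # leaf: satisfied if selected
        return goal.name in selected_leaves
    results = [satisfied(c, selected_leaves) for c in goal.children]
    return all(results) if goal.decomposition == "AND" else any(results)

if __name__ == "__main__":
    flexibility = Softgoal("Flexibility", "AND", [
        Softgoal("Modifiability", "OR", [
            Softgoal("Use plug-in architecture"),
            Softgoal("Isolate configuration in files"),
        ]),
        Softgoal("Portability", "AND", [Softgoal("Avoid OS-specific calls")]),
    ])
    chosen = {"Isolate configuration in files", "Avoid OS-specific calls"}
    print(satisfied(flexibility, chosen))     # -> True
```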
2. IVENA - Integrated Approach to Acquisition of NFR
The method integrates a requirements tree. [2]
3. Context of an Organization
There are several models that describe the context of an organization, such as the Business Model Canvas, OrgManle [3], or others [4]. Those models are also a good framework for assigning NFRs.
Measuring the Non-functional requirements
SNAP is the Software Non-functional Assessment Process. While Function Points measure the functional requirements by sizing the data flow through a software application, IFPUG's SNAP measures the non-functional requirements.
The SNAP model consists of four categories and fourteen sub-categories to measure the non-functional requirements. Non-functional requirements are mapped to the relevant sub-categories. Each sub-category is sized, and the size of a requirement is the sum of the sizes of its sub-categories.
The SNAP sizing process is very similar to the Function Point sizing process. Within the application boundary, non-functional requirements are associated with relevant categories and their sub-categories. Using a standardized set of basic criteria, each of the sub-categories is then sized according to its type and complexity; the size of such a requirement is the sum of the sizes of its sub-categories. These sizes are then totaled to give the measure of non-functional size of the software application.
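The summation described above can be illustrated with a small worked example; the sub-category names and point values below are invented for illustration and are not taken from the SNAP manual.

```python
# Worked sketch of SNAP-style sizing: each requirement maps to one or
# more sub-categories, each sub-category gets a size, a requirement's
# size is the sum of its sub-category sizes, and the application size is
# the total. Names and point values are illustrative only.
requirements = {
    "Multi-language user interface": {"UI changes": 12, "Internal data movements": 6},
    "Batch job error logging":       {"Logging": 8},
    "Help and tooltips":             {"Help methods": 10, "UI changes": 4},
}

requirement_sizes = {
    name: sum(subcategories.values())
    for name, subcategories in requirements.items()
}
application_snap_size = sum(requirement_sizes.values())

for name, size in requirement_sizes.items():
    print(f"{name}: {size} SNAP points")
print(f"Application total: {application_snap_size} SNAP points")
```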
Beta testing of the model shows that SNAP size has a strong correlation with the work effort required to develop the non-functional portion of the software application.
See also
SNAP Points
References
[1] Mylopoulos, Chung, and Yu: “From Object-oriented to Goal-oriented Requirements Analysis" Communications of the ACM, January 1999
[2] Götz, Rolf; Scharnweber, Heiko: "IVENA: Integriertes Vorgehen zur Erhebung nichtfunktionaler Anforderungen". https://www.pst.ifi.lmu.de/Lehre/WS0102/architektur/VL1/Ivena.pdf
[3] Teich, Irene: Tutorial PlanMan. Working paper Postbauer-Heng, Germany 2005. Available on Demand.
[4] Teich, Irene: Context of the organization-Models. Working paper Meschede, Germany 2020. Available on Demand.
Systems engineering
Software requirements |
31187124 | https://en.wikipedia.org/wiki/Y.1564 | Y.1564 | ITU-T Y.1564 is an Ethernet service activation test methodology, which is the new ITU-T standard for turning up, installing and troubleshooting Ethernet-based services. It is the only standard test methodology that allows for complete validation of Ethernet service-level agreements (SLAs) in a single test.
Purposes
ITU-T Y.1564 is designed to serve as a network service level agreement (SLA) validation tool, ensuring that a service meets its guaranteed performance settings in a controlled test time, to ensure that all services carried by the network meet their SLA objectives at their maximum committed rate, and to perform medium- and long-term service testing, confirming that network elements can properly carry all services while under stress during a soaking period.
ITU-T Y.1564 defines an out-of-service test methodology to assess the proper configuration and performance of an Ethernet service prior to customer notification and delivery. The test methodology applies to point-to-point and point-to-multipoint connectivity in the Ethernet layer and to the network portions that provide, or contribute to, the provisioning of such services. This recommendation does not define Ethernet network architectures or services, but rather defines a methodology to test Ethernet-based services at the service activation stage.
Existing test methodologies: RFC 2544
The Internet Engineering Task Force RFC 2544 is a benchmarking methodology for network interconnect devices. This Request for Comments (RFC) was created in 1999 as a methodology to benchmark network devices such as hubs, switches and routers, and to provide accurate and comparable values for benchmarking.
RFC 2544 provides engineers and network technicians with a common language and results format. It describes six subtests (a simplified sketch of the throughput search follows the list):
Throughput: Measures the maximum rate at which none of the offered frames are dropped by the device/system under test (DUT/SUT). This measurement translates into the available bandwidth of the Ethernet virtual connection.
Back-to-back or burstability: Measures the longest burst of frames at maximum throughput or minimum legal separation between frames that the device or network under test will handle without any loss of frames. This measurement is a good indication of the buffering capacity of a DUT.
Frame loss: Defines the percentage of frames that should have been forwarded by a network device under steady state (constant) loads that were not forwarded due to lack of resources. This measurement can be used for reporting the performance of a network device in an overloaded state, as it can be a useful indication of how a device would perform under pathological network conditions such as broadcast storms.
Latency: Measures the round-trip time taken by a test frame to travel through a network device or across the network and back to the test port. Latency is the time interval that begins when the last bit of the input frame reaches the input port and ends when the first bit of the output frame is seen on the output port. It is the time taken by a bit to go through the network and back. Latency variability can be a problem. With protocols like voice over Internet protocol (VoIP), a variable or long latency can cause degradation in voice quality.
System reset: Measures the speed at which a DUT recovers from a hardware or software reset. This subtest is performed by measuring the interruption of a continuous stream of frames during the reset process.
System recovery: Measures the speed at which a DUT recovers from an overload or oversubscription condition. This subtest is performed by temporarily oversubscribing the device under test and then reducing the throughput to a normal or low load while measuring frame delay in these two conditions. The difference between the delay in the overloaded condition and the delay at normal or low load represents the recovery time.
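To illustrate the throughput subtest described above: the maximum lossless rate is commonly found by an iterative (binary) search over the offered load. The following is a minimal sketch in Python, assuming a hypothetical send_frames(rate_mbps, duration_s) helper that drives the device under test at a given rate and returns the number of frames lost; it illustrates the search idea only and is not text from RFC 2544.

```python
def find_throughput(send_frames, line_rate_mbps: float,
                    resolution_mbps: float = 1.0,
                    duration_s: int = 60) -> float:
    """Binary-search the highest offered rate with zero frame loss.

    send_frames(rate_mbps, duration_s) is an assumed helper that drives the
    device under test at the given rate and returns the count of lost frames.
    """
    low, high = 0.0, line_rate_mbps
    best = 0.0
    while high - low > resolution_mbps:
        rate = (low + high) / 2
        if send_frames(rate, duration_s) == 0:
            best, low = rate, rate   # no loss: try a higher rate
        else:
            high = rate              # loss seen: back off
    return best
```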
From a laboratory and benchmarking perspective, the RFC 2544 methodology is an ideal tool for automated measurement and reporting. From a service turn-up and troubleshooting perspective, RFC 2544, although acceptable and valid, does have some drawbacks:
Service providers are shifting from only providing Ethernet pipes to enabling services. Networks must support multiple services from multiple customers, and each service has its own performance requirements that must be met even under full load conditions and with all services being processed simultaneously. RFC 2544 was designed as a performance tool with a focus on a single stream to measure maximum performance of a DUT or network under test and was never intended for multiservice testing.
With RFC 2544's focus on identifying the maximum performance of a device or network under test, the overall test time is variable and heavily depends on the quality of the link and subtest settings. RFC 2544 test cycles can easily require a few hours of testing. This is not an issue for lab testing or benchmarking, but becomes a serious issue for network operators with short service maintenance windows.
Packet delay variation is a key performance indicator (KPI) for real-time services such as VoIP and Internet protocol television (IPTV) and is not measured by the RFC 2544 methodology. Network operators performing service testing with RFC 2544 typically must execute packet jitter testing outside of RFC 2544, as this KPI is not defined or measured by the RFC.
Testing is performed sequentially, one KPI after another. In today's multiservice environments, traffic experiences all KPIs at the same time; although throughput might be good, it can also be accompanied by very high latency due to buffering. Designed as a performance assessment tool, RFC 2544 measures each KPI individually through its subtests and therefore cannot immediately associate a very high latency with a good throughput, which should be cause for concern.
Service definitions
The ITU-T Y.1564 defines test streams (individually called a "Test Flow") with service attributes linked to the Metro Ethernet Forum (MEF) 10.2 definitions. Test Flows are traffic streams with specific attributes identified by different classifiers such as 802.1q VLAN, 802.1ad, DSCP and class of service (CoS) profiles. These services are defined at the user–network interface (UNI) level with different frame and bandwidth profiles such as the service's maximum transmission unit (MTU) or frame size, committed information rate (CIR), and excess information rate (EIR). A single Test Flow can also consist of up to five different frame sizes, called an EMIX (Ethernet mix). This flexibility allows the engineer to configure a Test Flow that closely resembles real-world traffic.
Test rates
The ITU Y.1564 defines three key test rates based on the MEF service attributes for Ethernet virtual connection (EVC) and user-to-network interface (UNI) bandwidth profiles.
CIR defines the maximum transmission rate for a service where the service is guaranteed certain performance objectives. These objectives are typically defined and enforced via SLAs.
EIR defines the maximum transmission rate above the committed information rate considered as excess traffic. This excess traffic is forwarded as capacity allows and is not subject to meeting any guaranteed performance objectives (best effort forwarding).
Overshoot rate defines a testing transmission rate above CIR or EIR and is used to ensure that the DUT or network under test does not forward more traffic than specified by the CIR or EIR of the service.
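The relationship between these rates can be sketched as a simple classification of a measured information rate. This is an illustrative Python sketch only, and it assumes the EIR is expressed as bandwidth in addition to the CIR, as in MEF bandwidth profiles; the function and threshold names are not taken from the recommendation.

```python
def classify_rate(measured_mbps: float, cir_mbps: float, eir_mbps: float) -> str:
    """Map a measured information rate onto the rate regions described above.

    Traffic up to the CIR is committed (SLA-guaranteed); traffic between CIR
    and CIR + EIR is excess (best effort); anything above that should have
    been policed or discarded by the network.
    """
    if measured_mbps <= cir_mbps:
        return "committed"       # must meet the guaranteed performance objectives
    if measured_mbps <= cir_mbps + eir_mbps:
        return "excess"          # forwarded as capacity allows, no guarantees
    return "non-conforming"      # expected to be dropped at the overshoot rate
```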
Service configuration test
Forwarding devices such as switches, routers, bridges and network interface units are the basis of any network as they interconnect segments. If a service is not correctly configured on any one of these devices within the end-to-end path, network performance can be greatly affected, leading to potential service outages and network-wide issues such as congestion and link failures.
The service configuration test is designed to measure the ability of the DUT or network under test to properly forward traffic in different states:
In the CIR phase, where performance metrics for the service are measured and compared to the SLA performance objectives.
In the EIR phase, where performance is not guaranteed and the service's transfer rate is measured to ensure that the CIR is the minimum bandwidth.
In the discard phase, where the service is generated at the overshoot rate and the expected forwarded rate is not greater than the committed information rate or excess rate (when configured).
In the CBS (Committed Burst Size) phase, performance metrics are measured while changing traffic from the CIR to the line rate.
In the EBS (Excess Burst Size) phase, performance metrics are measured while changing traffic from the EIR to the line rate.
Service performance test
As network devices come under load, they must prioritize one traffic flow over another to meet the KPIs set for each traffic class. With only one traffic class, there is no prioritization performed by the network devices since there is only one set of KPIs. As the number of traffic flows increases, prioritization is necessary and performance failures may occur. The service performance test measures the ability of the DUT or network under test to forward multiple services while maintaining SLA conformance for each service. Services are generated at the CIR, where performance is guaranteed, and pass/fail assessment is performed on the KPI values for each service according to its SLA.
Service performance assessment must also be maintained over a medium- to long-term period, since performance degradation is likely to occur as the network remains under stress for longer periods of time. The service performance test is designed to soak the network under full committed load for all services and to measure performance over medium and long test times.
The recommended time frame for this portion of the test follows ITU-T M.2110, which specifies intervals of 15 minutes, 2 hours or 24 hours, allowing network availability to be determined.
Metrics
The Y.1564 focuses on the following KPIs for service quality:
Bandwidth or Information rate (IR): This is a bit rate measure of available or consumed data communication resources expressed in bits/second or multiples of it (kilobits/s, megabits/s, etc.)
Frame transfer delay (FTD): Also known as latency, this is a measurement of the time delay between the transmission and the reception of a frame. Typically this is a round-trip measurement, meaning that the calculation measures both the near-end to far-end and far-end to near-end direction simultaneously.
Frame delay variation (FDV): Also known as packet jitter, this is a measurement of the variations in the time delay between packet deliveries. As packets travel through a network to their destination, they are often queued and sent in bursts to the next hop. Prioritization may also occur at arbitrary moments, resulting in packets being sent at varying rates. Packets are therefore received at irregular intervals. The direct consequence of this jitter is stress on the receiving buffers of the end nodes, which can be overused or underused when there are large swings of jitter.
Frame loss ratio (FLR): Typically expressed as a ratio, this is a measurement of the number of packets lost over the total number of packets sent. Frame loss can be due to a number of issues such as network congestion or errors during transmissions.
Frame loss ratio with reference to the SAC: Typically expressed as a pass/fail indication. The SAC (Service Acceptance Criteria) is the part of the network operator's SLA which references the FLR requirement for the network path under test.
Availability (AVAIL): Typically expressed as a percentage of uptime for the link under test, for example whether the network meets "five nines" (99.999%) uptime.
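As a minimal worked example of these KPIs (a sketch in Python, not a conformant Y.1564 measurement implementation), given per-frame transmit and receive timestamps for one test flow:

```python
def flow_kpis(sent, received):
    """Compute illustrative KPI values for one test flow.

    sent[i]     -- transmit timestamp of frame i, in seconds
    received[i] -- receive timestamp of frame i, or None if the frame was lost
    """
    delays = [rx - tx for tx, rx in zip(sent, received) if rx is not None]
    flr = (len(sent) - len(delays)) / len(sent)       # frame loss ratio
    if not delays:
        return {"FTD_s": None, "FDV_s": None, "FLR": flr}
    ftd = sum(delays) / len(delays)                   # mean frame transfer delay
    fdv = max(delays) - min(delays)                   # one simple delay-variation measure
    return {"FTD_s": ftd, "FDV_s": fdv, "FLR": flr}

# Example: three frames sent, the last one lost.
print(flow_kpis([0.0, 0.1, 0.2], [0.005, 0.107, None]))
```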
References
The Essentials of Ethernet Service Activation (Multi-Service Y.1564)
ITU Y.1564 Ethernet Testing
ITU y.1564 test methodology
New Ethernet test standards and procedures address changing network traffic
Benchmarking Terminology for Network Interconnection Devices
Benchmarking Methodology for Network Interconnect Devices
ITU specifications
External links
eSAM - Performance Assessment
IxNetwork Y.1564 QuickTest
SAMComplete
MEF 10.1
MEF 10.2
See also
Ethernet Private Line
Ethernet standards
ITU-T Y Series Recommendations |
30319 | https://en.wikipedia.org/wiki/Talk%20%28software%29 | Talk (software) | talk is a Unix text chat program, originally allowing messaging only between the users logged on to one multi-user computer—but later extended to allow chat to users on other systems.
Although largely superseded by IRC and other modern systems, it is still included with most Unix-like systems today, including Linux, BSD systems and macOS.
History
Similar facilities existed on earlier systems such as Multics, CTSS, PLATO, and NLS. Early versions of talk did not separate text from each user. Thus, if both users typed simultaneously, their characters were intermingled. Since slow teleprinter keyboards were used at the time (11 characters per second maximum), users often could not wait for each other to finish. It was common etiquette for a user typing a long message to stop when intermingling occurred and read the listener's interrupting response, much as one interrupts a long monologue when speaking in person. More modern versions use curses to break the terminal into multiple zones for each user, thus avoiding intermingled text.
In 1983, a new version of talk was introduced as a Unix command with 4.2BSD, and would also accommodate electronic conversations between users on different machines. Follow-ons to talk included ntalk, Britt Yenne's ytalk and Roger Espel Llima's utalk. ytalk was the first of these to allow conversations between more than two users, and was written in part to allow communication between users on computers with different endianness. utalk uses a special protocol over UDP (instead of the TCP used by the rest) that is more efficient and allows editing of the entire screen. All of these programs split the interface into different sections for each participant. The interfaces did not convey the order in which statements typed by different participants would be reassembled into a log of the conversation. Also, all three programs are real-time text: they transmit each character as it is typed. This gives the discussion a more immediate feel than recent instant messaging clients or IRC. Users more familiar with other forms of instant text communication would sometimes find themselves in embarrassing situations by typing something and then deciding to withdraw the statement, unaware that the other participants of the conversation had seen every keystroke in real time.
A similar program exists on VMS systems called phone.
Security
A popular program called "flash", which sent malformed information via the talk protocol, was frequently used by pranksters to corrupt the terminal output of the unlucky target in the early 1990s. It did this by including terminal commands in the field normally designated for providing the name of the person making the request. When the victim would receive the talk request, the name of the person sending the request would be displayed on their screen. This would cause the terminal commands to execute, rendering the person's display unreadable until they reset it. Later versions of talk blocked flash attempts and alerted the user that one had taken place. Later it became clear that, by sending different terminal commands, it is even possible to have the user execute commands. As it has proven impossible to fix all programs that output untrusted data to the terminal, modern terminal emulators have been rewritten to block this attack, though some may still be vulnerable.
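The attack described above works because escape and control characters in attacker-supplied text are written verbatim to the terminal. The following is a minimal defensive sketch in Python (not the actual fix applied in talk or in any particular terminal emulator) that strips such characters before display:

```python
import re

# ANSI/VT100 escape sequences begin with ESC (0x1B); other C0 control
# characters (except newline and tab) can also alter terminal state.
_ESCAPES = re.compile(r"\x1b\[[0-9;?]*[ -/]*[@-~]|\x1b.")
_CONTROL = re.compile(r"[\x00-\x08\x0b-\x1f\x7f]")

def sanitize_for_terminal(untrusted: str) -> str:
    """Strip escape sequences and control characters from untrusted text
    before printing it, so it cannot reprogram or corrupt the terminal."""
    without_escapes = _ESCAPES.sub("", untrusted)
    return _CONTROL.sub("", without_escapes)

if __name__ == "__main__":
    evil_name = "mallory\x1b[2J\x1b[31m"     # clear-screen and colour-change sequences
    print(sanitize_for_terminal(evil_name))  # prints just: mallory
```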
See also
List of Unix commands
Talker, a chat system
write (Unix)
wall (Unix)
References
External links
Unix network-related software
Unix SUS2008 utilities
Online chat |
67697286 | https://en.wikipedia.org/wiki/Mobilinux | Mobilinux | Mobilinux is a discontinued Linux distribution by MontaVista. It was announced on April 25, 2005.
History
In 2005, PalmSource joined MontaVista to collaborate on Mobilinux.
In April 2005, version 4.0 was released. In 2007, version 5.0 was released.
Usage
Around 35 million devices have run Mobilinux, mainly in Asian markets. LWN.net argued that because it was controlled by a single company and targeted mobile operators, it did not generate a large developer community. It has been used on smartphones and NAS devices. The Motorola ROKR ran Mobilinux.
Hardware support
It had support for Freescale's i.MX31 chipset.
See also
OpenEZX
References
Embedded Linux distributions
Mobile Linux
Linux distributions |
845521 | https://en.wikipedia.org/wiki/GNU%20arch | GNU arch | GNU arch software is a distributed revision control system that is part of the GNU Project and licensed under the GNU General Public License. It is used to keep track of the changes made to a source tree and to help programmers combine and otherwise manipulate changes made by multiple people or at different times.
As of 2009, GNU arch is officially deprecated, and only security fixes are applied. Bazaar (or 'bzr') has since also been made an official GNU project and can thus be considered the replacement for GNU arch; it is not a fork of arch.
Features
As arch is a distributed, decentralized versioning system, each revision stored using it is globally uniquely identifiable; such an identifier can be used in a distributed setting to easily merge or "cherry-pick" changes from completely disparate sources.
Being decentralized means that there is no need for a central server for which developers have to be authorized in order to contribute. As with other systems, a full read-only copy of a project is made accessible in an "official" repository via HTTP, FTP, or SFTP; but then, contributors are encouraged to make modifications and publish them in a public archive (repository) of their own, so that the head developer may manually merge changesets into the official repository.
To simulate the behavior of centralized revision control systems, the head developer could allow shell access (SSH) or write access (FTP, SFTP, WebDAV) to a server, allowing authorized users to commit to a central server. More often, GNU arch-managed projects have a lead benevolent dictator that merges changes from contributors.
GNU arch has several other features:
Atomic commits: Commits are all-or-nothing. The tree must be in proper condition before the commit begins, and commits are not visible to the world until complete. If the commit is interrupted before this, it remains invisible and must be rolled back before the next commit. This avoids corruption of the archive and other users' checked-out copies.
Changeset oriented: Instead of tracking individual files (as in CVS), GNU arch tracks changesets, which are akin to patches. Each changeset is a description of the difference between one source tree and another, and so a changeset can be used to produce one revision from another revision. Authors are encouraged to use one commit per feature or bugfix.
Easy branching: Branching is efficient and can span archives. A branch (or 'tag') simply declares the ancestor revision, and development continues from there.
Advanced merging: Due to the permanent record of all ancestors and merged revisions, merging can take into account which branch contains which patch, and can do three-way merging based on a shared ancestor revision.
Cryptographic signatures: Every changeset is stored with a hash to prevent accidental corruption. Using an external file signing program (such as GnuPG or another PGP client), these hashes can also optionally be signed, preventing unauthorized modification if the archive is compromised.
Renaming: All files and directories can be easily renamed. These are tracked by a unique ID rather than by name, so history is preserved, and patches to files are properly merged even if filenames differ across branches.
Metadata tracking: The permissions of all files are tracked. Symbolic links are supported and are tracked the same way as files and directories.
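The changeset-hashing idea mentioned under "Cryptographic signatures" can be illustrated with a small sketch (Python; the file layout and hash choice are assumptions made for illustration, not GNU arch's actual archive format):

```python
import hashlib
from pathlib import Path

def changeset_digest(patch_path: Path) -> str:
    """Hash a changeset (patch) file so later corruption can be detected.

    Illustrative only: GNU arch's real on-disk format and hash scheme differ.
    """
    h = hashlib.sha256()
    with patch_path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_changeset(patch_path: Path, recorded_digest: str) -> bool:
    """Return True if the stored digest still matches the patch contents."""
    return changeset_digest(patch_path) == recorded_digest
```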
History and maintainership
GNU arch version 1 and tla
The original author and maintainer of GNU arch was Thomas Lord who started the project in 2001. The command used to manipulate GNU arch repositories is tla, an initialism for Tom Lord's Arch. Lord started GNU arch as a collection of shell scripts to provide an alternative to CVS. In 2003, arch became part of the GNU project.
The GNU arch project forked several times, resulting in both Canonical Ltd.'s now abandoned Baz fork and Walter Landry's ArX project. Both forks provoked a hostile reaction: the ArX fork was due to a serious dispute in direction and Lord was strongly critical of Canonical's approach to announcing the Baz project.
In August 2005 Lord announced that he was resigning as the maintainer of GNU arch and recommended that Baz become the main GNU arch project. However, this did not happen: the Baz fork was abandoned by Canonical in favour of the separate Bazaar project, with the 1.5 release of Baz being scrapped in 2006. In October, 2005, Andy Tai announced that Lord and the Free Software Foundation had accepted his offer to be the maintainer of GNU arch. Tai subsequently merged many features from Baz back into tla, but in March 2008 indicated that tla was no longer under active development and was no longer competitive with other version control systems.
revc
revc was a prototype revision control project by Thomas Lord that he intended to become GNU arch 2.0, designed to be a radical departure from tla and to draw many ideas from the Git revision control system. It was announced in June 2005, the first pre-release was in July and the last in August, just prior to Lord's resignation as maintainer. revc only had 10 core commands and Lord intended to eliminate restrictive namespaces, complicated file naming conventions and increase the speed.
As of 2008 the last pre-release, 0.0x2, of revc is still available and Lord is still interested in some of the ideas in GNU arch but does not have the resources to resume development of revc.
Criticism
Perhaps the most common criticism of GNU arch is that it is difficult to learn, even for users who have experience with other SCM systems. In particular, GNU arch has a large number of commands, which can be intimidating for new users and some design elements arguably too strongly enforce Lord's taste in version control practices.
Some also criticize GNU arch for using very unusual file naming conventions, which can create difficulties when using it in scripts and some shells, and when porting it to non-Unix operating systems. GNU arch has also been criticised for a slow running time, a consequence of a design decision to lessen internal code complexity.
See also
Revision control
List of revision control software
Comparison of revision control software
References
External links
LWN.net article on arch
Free version control software
Free software programmed in C
GNU Project software
Distributed version control systems
Discontinued version control systems
2001 software |
473111 | https://en.wikipedia.org/wiki/Adventure%20Construction%20Set | Adventure Construction Set | Adventure Construction Set (ACS) is a computer game creation system written by Stuart Smith that is used to construct tile-based graphical adventure games. ACS was originally published by Electronic Arts (EA) in 1984 on the Commodore 64, and was later ported to the Apple II, Amiga, and MS-DOS. It was one of EA's biggest hits of 1985, earning a Software Publishers Association "Gold Disk" award.
ACS provides a graphical editor for the construction of maps, placement of creatures and items, and a simple menu-based scripting to control game logic. A constructed game is stored on its own disk which can be copied and shared with friends. For some ports (such as Amiga) the ACS software is still needed to play user-constructed games.
Included with the system is a complete game, Rivers of Light, based on the Epic of Gilgamesh. It features art by Smith and Connie Goldman and music by Dave Warhol. The Amiga version of ACS has art by Greg Johnson and Avril Harrison and an additional pre-made adventure called "Galactic Agent" by Ken St Andre.
Titles influenced by ACS include The Elder Scrolls Construction Set. Project lead Todd Howard had stated, "When we started Morrowind, I was really excited about making a tool like 'Stuart Smith's Adventure Construction Set for the Apple 2'. I even used part of the name."
Gameplay
Gameplay features of Adventure Construction Set include:
Turn-based system.
Up to four players may play.
A player character can be imported from another adventure. However the character might not retain the same graphic tile if the new adventure uses a different tile set.
Music and sound.
Random encounters.
Spells.
Range and melee combat.
Along with graphic tiles, text screens are also available for conveying information.
Creatures which behave as player-mimics, copying various traits and equipment of the player.
Shops.
Construction system
Adventure Construction Set was designed to make tile-based graphical adventure games similar to author Stuart Smith's earlier games Return of Heracles and Ali Baba and the Forty Thieves.
The framework of an adventure built within ACS is organized into the following main categories:
"World map": This is the top-level map from which characters begin their adventure. The world map differs from other playable areas of the game in that it has no fixed creature encounters, no stacked tiles, quicker movement, it is scrollable, and it optionally may wrap around (have no borders.) Random encounters may occur on the world map, during which the game switches to a special view similar to a "room" to handle the encounter.
"Regions": A region is a collection of rooms. A region is a construction concept and does not present itself to the player, except by indirect means such as disk access when traveling between regions.
"Rooms": A room is a rectangular, tiled area of a size which must fit within the game's viewport. Tiles may be used to make a room look like shapes other than rectangular.
"Things": A thing is a background tile, obstacle, or collectible item.
"Creatures"
"Pictures": These are art assets used by the tiles. For some platforms, four colors are available for images. For the Amiga platform, 32 colors are available, each of which can be assigned to be any of 4096 available colors.
Tiles may be stacked. Only the top tile of a stack may be directly interacted with by the player; however, special tiles allow game logic to be implemented via the stack. For example, a tile may be set to "Activate All Things at This Place". Tiles may also allow or disallow interaction based on the contents of the player's inventory, or activate if a specific object is dropped on top of the stack.
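The stacking rules described above can be pictured with a small data-structure sketch (Python; the class and attribute names are invented for illustration and do not reflect ACS's internal representation):

```python
class Tile:
    def __init__(self, name, activate_all_below=False, required_item=None):
        self.name = name
        self.activate_all_below = activate_all_below  # "Activate All Things at This Place"
        self.required_item = required_item            # interaction gated on inventory

class TileStack:
    """A stack of tiles; only the top tile is directly interactable."""
    def __init__(self, tiles):
        self.tiles = list(tiles)                      # ordered bottom ... top

    def interact(self, inventory):
        top = self.tiles[-1]
        if top.required_item and top.required_item not in inventory:
            return []                                 # interaction disallowed
        if top.activate_all_below:
            return [t.name for t in self.tiles]       # trigger the whole stack
        return [top.name]

# Example: a door hiding a trigger tile beneath it, opened only with a key.
stack = TileStack([Tile("trapdoor trigger"), Tile("door", activate_all_below=True, required_item="key")])
print(stack.interact({"key"}))  # ['trapdoor trigger', 'door']
```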
Spell-effects may be attached to Things.
The game allows for somewhat varied monster AI behavior. A creature may be specified to behave solely as a "fighter" or "slinker", or adjust its temperament based on its condition. In addition, it may be specified as either an "enemy", "friend", "neutral", or "thief", with a total of 8 possible behavioral patterns expressed.
There are maximum quotas applied to most categories in the game (including the total number of unique things, text messages, pictures, regions, creatures per region, things per region, and rooms per region.) These limits restrict the size of adventures. For example, "Each adventure can contain up to 15 regions and each region can contain up to 16 rooms."
ACS included a framework for fantasy adventures, as well as starter toolkits for fantasy, futurist, and "spy" game genres.
Auto-Construct Feature
Along with user-constructed adventures, the software can also auto-construct a random adventure. This feature can optionally be used to auto-complete a partially built adventure. The user may specify numerous parameters for auto-generation, including difficulty level.
Development
Smith denied that his software was inspired by Pinball Construction Set. Smith stated that the concept was based on his experience writing accounting software, during which he developed a report generator that would create a standalone COBOL program, and that Electronic Arts suggested the name Adventure Construction Set. ACS was produced by Don Daglow in parallel with the development of Racing Destruction Set.
Reception
Orson Scott Card criticized Adventure Construction Set's user interface, stating that it "was designed by the Kludge Monster from the Nethermost Hell". He praised the game's flexibility, however, reporting that his son was able to create a spell called "Summon Duck". Computer Gaming World's Scorpia described ACS as an "easy-to-use, albeit time-consuming, means of creating a graphic adventure."
Reviews
Casus Belli #35 (Dec 1986)
Community
Electronic Arts contest
Shortly after Adventure Construction Set's release, announcements were included in the packaging for players to submit their adventures for a contest to be judged by Electronic Arts and their playtesters. Approximately 50 games were submitted and winners chosen for three categories:
Fantasy - Festival by R.C. Purrenhage written for the Commodore 64
Science Fiction - Cosmos by Albert Jerng written for the C-64
Contemporary - Panama by Will Bryant for the C-64 and Codename:Viper by Peter Schroeder for the Apple II
Adventure Construction Set Club
The supplementary manual included with the Amiga port mentions, "If you're an ACS fanatic you can join the Adventure Construction Set Club. Club members receive access to a library of adventures created with ACS." The supplementary manual also mentions that the club is not affiliated with Electronic Arts.
See also
Music Construction Set
Pinball Construction Set
Racing Destruction Set
References
External links
Adventure Construction Set at AmigaMemo.com - Amiga.Game.Museum
Review of C64 version from 1985
Adventure Creation Kit - remake of Adventure Construction Set
1984 video games
Adventure games
Amiga games
Apple II games
Ariolasoft games
Commodore 64 games
DOS games
Electronic Arts games
Multiplayer and single-player video games
Video games developed in the United States
Video games with tile-based graphics
Video game development software |
60726 | https://en.wikipedia.org/wiki/Simputer | Simputer | The Simputer was a self-contained, open hardware Linux-based handheld computer, first released in 2002. Developed in, and primarily distributed within India, the product was envisioned as a low-cost alternative to personal computers. With initial goals of selling 50,000 simputers, the project had sold only about 4,000 units by 2005, and has been called a failure by news sources.
Design and Hardware
The device was designed by the Simputer Trust, a non-profit organization formed in November 1999 by seven Indian scientists and engineers led by Dr. Swami Manohar. The word "Simputer" is an acronym for "simple, inexpensive and multilingual people's computer", and is a trademark of the Simputer Trust. The device includes text-to-speech software and runs the Linux operating system. Similar in appearance to the PalmPilot class of handheld computers, the touch sensitive screen is operated on with a stylus; simple handwriting recognition software is provided by the program Tapatap.
The Simputer Trust licensed two manufacturers to build the devices, Encore Software, which has also built the Mobilis for Corporate/Educational purposes and the SATHI for Defence purposes, and PicoPeta Simputers, which released a consumer product named the Amida Simputer.
The device features include touchscreen, smart card, Serial port, and USB connections, and an Infrared Data Association (IrDA) port. It was released in both greyscale and color versions.
Software
The Simputer uses the Linux kernel (2.4.18 Kernel as of July 2005), and the Alchemy Window Manager (only the Amida Simputer). Software packages include: Scheduling, Calendar, Voice Recording and Playback, simple spreadsheet application, Internet and network connectivity, Web browsing and email, an e-Library, games, and support for Java ME, DotGNU (a free software implementation of .NET), and Flash.
In addition, both licensees developed custom applications for microbanking, traffic police, and medical applications.
Deployments
In 2004, Simputers were used by the government of Karnataka to automate the process of land records procurement. Simputers were also used in an ambitious project in Chhattisgarh for the purpose of e-education. In 2005, they were used in a variety of applications, such as automobile engine diagnostics (Mahindra & Mahindra in Mumbai), tracking of iron-ore movement from mine pithead to shipping point (Dempo, Goa), Microcredit (Sanghamitra, Mysore), Electronic Money Transfer between UK and Ghana (XK8 Systems, UK), and others. In recent times, the Simputer has seen deployment by the police force to track traffic offenders and issue traffic tickets.
Commercial production
Pilot production of the Simputer started in September 2002. In 2004, the Amida Simputer became commercially available for 12450 and up (approximately US$240). The prices for Amida Simputer vary depending on the screen type (monochrome or colour).
By 2006, both licensees had stopped actively marketing their Simputer devices. PicoPeta was acquired by Geodesic Information Systems (a developer of communication and collaboration systems) in 2005.
See also
Akash/Sakshat tablet
BOSS Linux developed by C-DAC
Digital divide
Longmeng or Dragon Dream is a Chinese low-cost computer being designed to cost €100
VIA pc-1 Initiative
ZX81, the first million-selling low-cost (£69/$99.95) computer
References
"Indian handheld to tackle digital divide". (July 18, 2001).
Srinivasan, S. (Apr. 3, 2005). "Handheld Computer Yet to Reach the Masses". Associated Press.
Swami Manohar (Nov. 2005) High Return Computing
BBC News, 10 September 2001 Computer deal for India's poor
Outlook India, September 22, 2002: Pilot production of Simputer begins
BBC News, 1 August 2005 - Woe for traffic offenders in Sim city
External links
Official website
PicoPeta Simputers
OpenAlchemy
Encore
Appropriate technology
Embedded Linux
Information and communication technologies in Asia
Linux-based devices
Mobile computers
Open-source hardware |
47891056 | https://en.wikipedia.org/wiki/Primetime%20Engineering%20Emmy%20Awards | Primetime Engineering Emmy Awards | A Primetime Emmy Engineering Award is an award given most years by the Television Academy, also known as the Academy of Television Arts & Sciences (ATAS). It is a Primetime Emmy Award given specifically for Outstanding Achievement in Engineering Development. According to the Television Academy, the Primetime Emmy Engineering Award (or Engineering Emmy) is presented to an individual, company or organization for engineering developments so significant an improvement on existing methods or so innovative in nature that they materially affect the transmission, recording or reception of television. The award, which is Television's highest engineering honor, is determined by a jury of highly qualified, experienced engineers in the Television industry.
The Primetime Emmy Awards have been given since 1948 to recognize outstanding achievements in Primetime Television for Performance, for the Creative Arts and for Engineering. The Primetime Emmy Engineering Award is not the same as the Technology & Engineering Emmy Award, which is given by the National Academy of Television Arts and Sciences (NATAS), the Television Academy's sister organization. NATAS gives Emmy Awards in various categories including "Daytime," "Sports," "News and Documentary," and "Public Service."
In addition to the Primetime Emmy Engineering Awards, since 2003 the Television Academy also bestows in most years the Philo T. Farnsworth Award, which is a Primetime Emmy Engineering Award given to honor companies and organizations that have significantly affected the state of television and broadcast engineering over a long period of time, and the Charles F. Jenkins Lifetime Achievement Award, which has been given in most years since 1991 to one or more individuals whose contributions over time have significantly affected the state of television technology and engineering.
The Primetime Engineering Emmys have been given annually since 1978 (the year that ATAS and the NATAS agreed to split ties), although Special Emmys for Outstanding Achievement in Engineering Development were occasionally bestowed in prior years. The awards which have been given include the Engineering Emmys, which are accorded the Emmy Statuette, and two other levels of recognition, the Engineering Plaque, and the Engineering Citation.
Awards
1978
Engineering Emmy Award: Petro Vlahos for the ULTIMATTE Video-Matting Device
Engineering Citation: To the Society of Motion Picture and Television Engineers (SMPTE)
1979
Engineering Emmy Award: Ampex Corporation for the Automatic Scan Tracking System for Helical Video Tape Equipment
Engineering Citation: Magicam, Inc. for the Development of Real Time Tracking of Independent Scenes
1980
Engineering Emmy Award: National Institute of Standards and Technology (NIST), Public Broadcasting Service (PBS), and American Broadcasting Company (ABC) for Closed Captioning for the Deaf System
Engineering Emmy Citation: David Bargan for the '409' and 'TRACE' Computer Programs used for Off-line Videotape Editing
Engineering Emmy Citation: Vital Industries for its Pioneering Development of Digital Video Manipulation Technology
Engineering Emmy Citation: Convergence Corporation for the ECS-100 Video Tape Editing Systems
1981
Engineering Emmy Award: Rank Cintel for the Mark III Flying Spot Telecine
1982
Engineering Emmy Award: Hal Collins for Contributions to the Art and Development of Videotape Editing (posthumous)
Engineering Emmy Award: Dubner Computer Systems, Inc., and the American Broadcasting Company (ABC) for the Dubner CBG-2 Electronic Character and Background Generator
Engineering Citation: Chapman Studio Equipment for the Development of Crane Systems
1983
Engineering Emmy Award: Eastman Kodak for the Development of High Speed Color Film 5294/7294 Color Negative Film
Engineering Citation: Ikegami Electronics for the Development of the EC-35 (a camera used for electronic cinematography)
Engineering Citation: Ampex Corporation for Digital Effects Displaying Capabilities with Improved Picture Quality
1984
Engineering Emmy Award: None given
Engineering Citation: Corporate Communications Consultants Inc. for the 60XL Color Correction by Armand Belmares Sarabia
1985
Engineering Emmy Award: Auricle Control Systems (ACS) for the Auricle Time Processor
1986
Engineering Emmy Award: Nagra, Inc., for the Nagra Recorder
Engineering Emmy Award: CBS, Sony and Cinedco for Design and Implementation of Electronic Editing Systems for Film Programs
1987
Engineering Emmy Award: Spectra Image, Inc. for D220 Dual Headed Video Disc Player
1988
Engineering Emmy Award: Optical Disc Corporation for the Recordable Laser Videodisc System
Engineering Emmy Award: Sony for the DVR-1000 Component Digital VTR
1989
Engineering Emmy Award: Pacific Video Inc. for the Electronic Laboratory
Engineering Emmy Award: Cinema Products Corporation for the Steadicam
Engineering Plaque: Composite Image Systems for the Pin Registered Transfer Process
Engineering Plaque: Istec, Inc. for the WESCAM Camera Mount
Engineering Plaque: Matthews Studio Electronics for the Nettman Cam-Remote
Engineering Plaque: Offbeat Systems for the Streamline Scoring System
Engineering Plaque: Steadi-Film Corporation for the Steadi-Film System
Engineering Plaque: UCLA Film and Television Archive for the restoration of the Fred Astaire Specials
1990
Engineering Emmy Award: Comark Communications, Inc. and Varian/Eimac for the Klystrode UHF High Power Amplifier Tube and Transmitter
Engineering Emmy Award: Zaxcom Video, Inc. for the TBC Control System
Engineering Plaque: Samuelson Alga Cinema for the Louma Camera Crane
Engineering Plaque: Alan Gordon Enterprises for Image 300 35mm High Speed Camera
1991
Engineering Emmy Award: Vari-Lite for the Series 200 Lighting System
Engineering Emmy Award: Camera Platforms International, Inc. for the D/ESAM Digital Mixer
Engineering Plaque: Manfred Klemme for the Dcode TS-1 Time Code Slate
Engineering Plaque: Lightmaker Company for the AC/DC HMI Ballast
Engineering Plaque: George Hill for Optex UK - Mini Image Intensifier for ENG Cameras
Engineering Plaque: Grass Valley Group for the Kadenza Digital Picture Processor
Charles F. Jenkins Lifetime Achievement Award: Harry Lubcke
1992
Engineering Emmy Award: Charles Douglass for the Invention and Development of the Post Production Sweetener
Engineering Emmy Award: The Accom D-Bridge 122 Video Encoder
Engineering Plaque: Filmlook, Inc. for the Filmlook Process for Film Simulation
Charles F. Jenkins Lifetime Achievement Award: Kerns H. Powers
1993
Engineering Emmy Award: Avid Technology for the Media Composer
Engineering Emmy Award: Newtek for Video Toaster
Engineering Plaque: CBS Laboratories for Mini-Rapid Deployment Earth Terminal (RADET)
Engineering Plaque: Les Aseere for the scientific detective work that solved the mystery of type "C" video tape dropout and ventilated scanner debris.
Charles F. Jenkins Lifetime Achievement Award: Richard S. O'Brien
1994
Engineering Emmy Award: Philips Vari-Lite for the VL5 Wash Luminaire
Engineering Emmy Award: Kodak for the Keykode Edgeprint Film Numbering System
Engineering Plaque: Cinema Products Corporation, Research in Motion, Evertz Microsystems, and the National Film Board of Canada, four creative hardware developers whose reader, decoder and user technology enabled the widespread use of Keykode.
1995
Engineering Emmy Award: C-Cube Microsystems for the MPEG Encoding Chip Set
Engineering Emmy Award: Barber Technologies for the Barber Boom
Engineering Emmy Award: Tascam for the DA-88 Digital Multitrack Recorder
Engineering Emmy Award: Philips Laboratories for the Ghost Cancellation
Engineering Plaque: Saunders Electric Incorporated for the Synchronized Load Commander System and Mobile Power Distribution
Charles F. Jenkins Lifetime Achievement Award: Julius Barnathan
1996
Engineering Emmy Award: LaserPacific Media Corporation for the Supercomputer Assembly
Engineering Emmy Award: General Instrument Corporation for the Digicipher Digital Television System
Engineering Emmy Award: Scientific-Atlanta for the Powervu Digital Video Compression System
Engineering Emmy Award: Tektronix for the Profile Professional Disk Recorder
Charles F. Jenkins Lifetime Achievement Award: Joseph Flaherty
1997
Engineering Emmy Award: J.L. Fisher for the J.L. Fisher Camera Dolly
Engineering Emmy Award: Panasonic for the AJ-LT75 DVCPRO Laptop Editor
Engineering Emmy Award: Grand Alliance for the Digital TV Standard
Engineering Plaque: The BOOM TRAC Microphone Dolly System
Engineering Plaque: Alan Gordon Enterprises for the Mark V Director's Viewfinder
Charles F. Jenkins Lifetime Achievement Award: Richard E. Wiley
1998
Engineering Emmy Award: Brian Critchley of Digital Projection International and Larry Hornbeck of Texas Instruments' for the Digital Micromirror Device POWER Displays Projector
Engineering Emmy Award: Tiffen for the Design and Manufacture of State-of-the-art Camera Lens Filters
Engineering Emmy Award: Philips Digital Video Systems and Eastman Kodak for the Design and Manufacture of the Industry-Standard High-definition Digital Telecine
Engineering Plaque: Avid for the Real-Time Multicamera System
Engineering Plaque: Tektronix for the Lightworks 'Heavyworks' Multistream Editing Systems
Charles F. Jenkins Lifetime Achievement Award: Yves Faroudja
1999
Engineering Emmy Award: Sony for the HDCAM HDW-500 Digital HD Studio VTR
Engineering Emmy Award: George Hill and Derek Lightbody for Optex UK., for the Aurasoft Soft Light
Engineering Plaque: Videotek for the VTM-200 Series Multi-Format, On-Screen Monitoring
Engineering Plaque: Spectracine, Inc., for the Spectra Professional IV-A Digital Exposure Meter
Charles F. Jenkins Lifetime Achievement Award: Charles A. Steinberg
2000
Engineering Emmy Award: The Dorrough Loudness Meter
Engineering Emmy Award: The Panavision Lightweight Camera
Engineering Emmy Award: Clairmont Camera for the MovieCam Superlight
Engineering Plaque: Lipsner-Smith Company and Consolidated Film Industries (CFI) for their joint development of the Model CF-8200 Ultrasonic Film Cleaning Machine
Engineering Plaque: TEAC America, Inc. for the MMR-8 and MMP-16 Recorders
Engineering Plaque: Soundmaster Group for the Integrated Operations Nucleus ION Operating Environment
Engineering Plaque: Cooke Optics for Cooke Prime Lenses
Charles F. Jenkins Lifetime Achievement Award: Charles Mesak
2001
Engineering Emmy Award: Vari-Lite for the VARI*LITE Virtuoso Console
Engineering Emmy Award: Cast Lighting, Ltd. for WYSIWYG
Engineering Emmy Award: Da Vinci Systems for 2K Color Enhancement System
Engineering Emmy Award: Pandora International for Pogle Platinum with MegaDef
Engineering Emmy Award: Panavision for the Primo Lens Series
Engineering Emmy Award: Apple, Inc. for FireWire
Engineering Emmy Award: Clairmont Camera for Clairmont Camera Lenses
Engineering Plaque: Chapman and Leonard Studio Equipment, Inc. for the LenCin Pedestal
Charles F. Jenkins Lifetime Achievement Award: Gilbert P. Wyland
2002
Engineering Emmy Award: TM Systems for The Digital Solution to Language Translation, Dubbing and Subtitling
Engineering Emmy Award: Apple Inc. for Final Cut Pro
Engineering Emmy Award: 2d3 for the Boujou Automated Camera Tracker
Engineering Emmy Award: ARRI for Arriflex Cameras
Engineering Plaque: Barber Technologies for the EZ Prompter
Engineering Plaque: HBO Interactive Ventures for Band Of Brothers Interactive Television Programming
Charles F. Jenkins Lifetime Achievement Award: Charles Cappleman
2003
Engineering Emmy Award: Dedo Weigert of Dedotec, USA Inc. for the Dedolight 400 Series Lighting System
Engineering Emmy Award: Emory Cohen, Randolph Blim and Doug Jaqua of LaserPacific Media Corporation for the 24P HDTV Post-Production System
Engineering Emmy Award: David Pringle, Leonard Pincus, Ashot Nalbandyan, Thomas Kong and George Johnson of Lightning Strikes, Inc. for Softsun
Engineering Plaque: NewTek, Inc. for LightWave 3D
Charles F. Jenkins Lifetime Achievement Award: Ray Dolby
Philo T. Farnsworth Corporate Achievement Engineering Award: Panavision
2004
Engineering Emmy Award: Dolby Laboratories for the Dolby LM 100 Broadcast Loudness Meter With Dialogue Intelligence
Engineering Emmy Award: Sony and Panavision for the First 24P Digital Imaging System
Engineering Plaque: Philip John Greenstreet of Rosco Laboratories, Inc. for Roscolite Scenic Backdrops
Engineering Plaque: David Grober and Scott Lewallen of Motion Picture Marine for Perfect Horizon
Charles F. Jenkins Lifetime Achievement Award: Les Paul
Philo T. Farnsworth Corporate Achievement Engineering Award: Chyron Corporation
2005
Engineering Emmy Award: Dolby Laboratories for Dolby E Audio Coding Technology
Engineering Emmy Award: Sprint Corporation for Sprint PCS VisionSM Multimedia Services
Engineering Emmy Award: Toon Boom Animation Inc. for USAnimation Opus
Engineering Emmy Award: MobiTV for the first mobile television network and technology platform to bring live broadcasts to mobile phones.
Engineering Plaque: Litepanels, Inc for Litepanels Mini LED Light
2006
Engineering Emmy Award: None awarded
Engineering Plaque: Scott Walker, Mark Walker, Jeff Watts, Scott Noe, Richard Brooker of BOXX Communications, LLC for Vid-Wave Boxx
Engineering Plaque: Harry Fagle for the Four-Channel Video Integrator (Quad-Split)
2007
Engineering Emmy Award: None awarded
Engineering Plaque: TM Systems, LLC for the TM Systems QC Station
Engineering Plaque: Osram Sylvania for Osram HMI Metal Halide Lamp Technology
Engineering Plaque: Digital Vision for DVNR Image Processing Hardware and DVO Image Processing Software
Engineering Plaque: Silicon Optix for the Teranex Video Computer
Charles F. Jenkins Lifetime Achievement Award: Howard A. Anderson, Jr.
2008
Engineering Emmy Award: Joint Video Team Standards Committee (JVT) for the development of the High Profile for H.264/MPEG-4 AVC.
Engineering Emmy Award: Glenn Sanders and Howard Stark of Zaxcom, Inc. for the Deva Location Sound Recorder.
Engineering Plaque: Scott Leva for the Precision Stunt Air Bag
Engineering Plaque: Sebastian Cramer and Andreas Dasser of P+S Technik GmbH for the Skater Dolly Product Family
Engineering Plaque: Craige Bandy and Ed Bandy of Tricam Video Productions Company for the 360 Overhead Jib
Engineering Plaque: Georg Dole, Swen Gerards, Jan Huewel and Daniel Schaefer of Coolux Media Systems for Pandoras Box Real-Time Compositing Media Server
Charles F. Jenkins Lifetime Achievement Award: Woo Paik
Philo T. Farnsworth Corporate Achievement Engineering Award: Evertz Technologies Limited
2009
Engineering Emmy Award: Dolby Laboratories for the Dolby DP600 Program Optimizer
Engineering Emmy Award: Fujinon and NHK for the Fujinon Precision Focus Assistance System
Engineering Emmy Award: Jim Henson's Creature Shop for the Henson Digital Puppetry Studio
Engineering Emmy Award: Litepanels, Inc. for Litepanels LED Lighting Products
Engineering Plaque: Herb Ault, Aaron Hammel and Bob Anderson of Grip Trix, Inc., for the Grip Trix Electric Motorized Camera Dolly
Philo T. Farnsworth Corporate Achievement Engineering Award: The National Aeronautics and Space Administration (NASA). In commemoration of the 40th anniversary of the technological innovations that made possible the first live broadcast from the lunar surface by the crew of Apollo 11 on July 20, 1969.
2010
Engineering Emmy Award: Stagetec for the NEXUS Digital Audio Routing
Engineering Plaque: Apple, Inc. for Apple Final Cut Studio
Engineering Plaque: Avid Technology for Avid Media Access
Engineering Plaque: David Eubank for the pCAM Film + Digital Calculator
Engineering Plaque: Showtime Sports Interactive
Charles F. Jenkins Lifetime Achievement Award: Ron Estes and Robert Seidenglanz
Philo T. Farnsworth Corporate Achievement Engineering Awards: Desilu and Digidesign (now Avid Technology)
2011
Engineering Emmy Award: IBM and Fox Group for the Development and Application of LTFS (Linear Tape File System)
Engineering Emmy Award: Panavision and Sony for Single Chip Digital Camera Technology used for Primetime Television Production.
Engineering Emmy Award: Ultimate Arm, for the Ultimate Gyrostabilized Remote Controlled Crane
Engineering Emmy Award: Apple, Inc. for the iPad
Engineering Plaque: Yahoo! for Connected TV to Yahoo!
Engineering Certificate: The Xfinity iPad app
Engineering Certificate: Time Warner iPad app
Charles F. Jenkins Lifetime Achievement Award: Andy Setos
Philo T. Farnsworth Corporate Achievement Engineering Awards: Time Warner and Time Warner Cable for the creation of the Full Service Network
2012
Engineering Emmy Award: Colorfront, Ltd. for Colorfront On-Set Dailies
Engineering Emmy Award: FilmLight for Truelight On-Set and Baselight TRANSFER
Engineering Emmy Award: Academy of Motion Picture Arts and Sciences for the Academy Color Encoding System (ACES)
Engineering Emmy Award: The American Society of Cinematographers (ASC) Technology Committee for the ASC Color Decision List (ASC CDL)
Engineering Emmy Award: Dolby Laboratories Inc. for the Dolby PRM-4200 Professional Reference Monitor
Engineering Emmy Award: Sony for the BVM E250 OLED Reference Monitor.
Engineering Emmy Award: Toon Boom Animation Inc. for the Toon Boom Storyboard Pro
Engineering Emmy Award: Netflix Inc. for its new streaming video service.
Engineering Plaque: Adobe Systems for the Adobe Pass Viewer Authentication process
Charles F. Jenkins Lifetime Achievement Award: Dr. Richard Green
Philo T. Farnsworth Corporate Achievement Engineering Award: Eastman Kodak Company
2013
Engineering Emmy Award: YouTube
Engineering Emmy Award: Aspera, for FASP Transport Technology
Engineering Emmy Award: Josh C. Kline of Digital Dailies Web Based Streaming Production Dailies and Cuts
Engineering Emmy Award: iZotope for RX Audio Repair Technology (iZotope)
Engineering Emmy Award: Lightcraft Technology for Previzion Virtual Studio System
Engineering Plaque: LAWO AG for its audio networking and routing system for large-scale television entertainment productions
Engineering Plaque: Final Draft Inc., Final Draft Screenwriting Software
Charles F. Jenkins Lifetime Achievement Award: Chris Cookson
Philo T. Farnsworth Corporate Achievement Engineering Award: Sennheiser Electronic Corporation
2014
Engineering Emmy Award: Philips Professional Broadcasting for the LDK6000, DPM CCD Multi-format HDTV Camera System
Engineering Emmy Award: Sony Professional Solutions of America for the Multi-format HDTV CCD Fiber Optic Camera System
Engineering Emmy Award: High-Definition Multimedia Interface (HDMI)
Engineering Emmy Award: Intel Corp for High-bandwidth Digital Content Protection (HDCP)
Engineering Emmy Award: Advanced Television Systems Committee (ATSC) for its Recommended Practice on Techniques for Establishing and Maintaining Audio Loudness for Digital Television
Charles F. Jenkins Lifetime Achievement Award: Laurence J. Thorpe
Philo T. Farnsworth Corporate Achievement Engineering Award: The Society of Motion Picture and Television Engineers (SMPTE)
2015
Engineering Emmy Award: Mark Franken for EdiCue
Engineering Emmy Award: Michael Sechrest, Chris King, and Greg Croft for SpeedTree
Engineering Emmy Award: Zhou Wang, Alan Bovik, Hamid Sheikh and Eero Simoncelli for the Structural Similarity (SSIM) Video Quality Measurement Model
Charles F. Jenkins Lifetime Achievement Award: Garrett Brown
Philo T. Farnsworth Corporate Achievement Engineering Award: Grass Valley USA, LLC
The 67th Primetime Emmy Engineering Awards Ceremony took place on October 28, 2015 at Loews Hollywood Hotel.
2016
Engineering Emmy Award: SyncOnSet software application for production design
Engineering Emmy Award: Ncam Technologies for camera tracking technology
Engineering Emmy Award: Sony for the Sony 2/3" 4K Imaging System
Engineering Emmy Award: Saunders Electric for Saunders Mobile UPS Power Station
Engineering Emmy Award: Zaxcom Inc for innovations in digital wireless technology.
Engineering Emmy Award: Group It For Me! cloud-based software
Charles F. Jenkins Lifetime Achievement Award: John C. Malone
Philo T. Farnsworth Corporate Achievement Engineering Award: NHK's Science & Technology Research Laboratories
The 68th Primetime Emmy Engineering Awards Ceremony took place on October 26, 2016 at Loews Hollywood Hotel.
2017
Engineering Emmy Award: Arri for ARRI Alexa Camera System
Engineering Emmy Award: Canon Inc and Fujifilm (Fujinon) for 4K Zoom Lenses
Engineering Emmy Award: The Walt Disney Company for Disney Global Localization
Engineering Emmy Award: McDSP for the SA-2 Dialog Processor
Engineering Emmy Award: Joint Collaborative Team on Video Coding (JCT-VC) for High Efficiency Video Coding (HEVC)
Engineering Emmy Award: Shotgun Software
Charles F. Jenkins Lifetime Achievement Award: Leonardo Chiariglione
Philo T. Farnsworth Corporate Achievement Engineering Award: Sony Corporation
The 69th Primetime Emmy Engineering Awards Ceremony took place on October 25, 2017 at Loews Hollywood Hotel.
2018
Engineering Emmy Award: Chemical Wedding for Artemis Digital Director's Viewfinder
Engineering Emmy Award: Cospective for cineSync Review and Approval
Engineering Emmy Award: Codex Digital for Codex Recording Platform and Capture Media
Engineering Emmy Award: Blue Microphones for Blue Mix-Fi Headphones
Engineering Emmy Award: Production Resource Group for PRG GroundControl Followspot
Engineering Plaque: Customized Animal Tracking Solutions (CATS) for the CATS Cam: Animal-Borne Multi-Sensor Video System
Charles F. Jenkins Lifetime Achievement Award: Wendy Aylsworth
Philo T. Farnsworth Corporate Achievement Engineering Award: Avid
The 70th Primetime Emmy Engineering Awards Ceremony took place on October 24, 2018 at the JW Marriott Hotel LA Live.
2019
Engineering Emmy Award: Boris FX for Sapphire
Engineering Emmy Award: iZotope for RX 7 Audio Repair
Engineering Emmy Award: FabFilter for Pro-Q3 Audio Equalizer
Engineering Emmy Award: SilhouetteFX LLC for SilhouetteFX Rotoscoping
Engineering Emmy Award: Boris FX for Mocha Pro Motion Tracking System
Engineering Emmy Award: Joint Photographic Experts Group for JPEG Image Compression
Charles F. Jenkins Lifetime Achievement Award: Hugo Gaggioni
Philo T. Farnsworth Corporate Achievement Engineering Award: The American Society of Cinematographers (ASC)
The 71st Primetime Emmy Engineering Awards Ceremony took place on October 23, 2019 at the JW Marriott Hotel LA Live.
2020
Engineering Emmy Award: Evercast for Evercast real-time collaboration platform
Engineering Emmy Award: HP Inc for ZCentral Remote Boost
Engineering Emmy Award: Sohonet for ClearView Flex
Engineering Emmy Award: Teradici for Cloud Access Software
Engineering Emmy Award: Apple Inc for Apple ProRes
Engineering Emmy Award: CODEX for CODEX RAW Workflow
Engineering Emmy Award: Dan Dugan for Gain Sharing Automatic Microphone Mixing
Engineering Emmy Award: Epic Games for Unreal Engine
Engineering Emmy Award: RE:Vision Effects for optical flow-based postproduction video tools
Engineering Emmy Award: Sound Radix for Sound Radix Auto-Align Post
Engineering Emmy Award: Bill Spitzak, Jonathan Egstad, Peter Crossley and Jerry Huxtable for Nuke
The 72nd Primetime Emmy Engineering Awards Ceremony was streamed live on Emmys.com on Thursday, Oct. 29, 2020 at 5:00 p.m. PDT.
2021
Engineering Emmy Award: Marcos Fajardo, Alan King, and Thiago Ize for Arnold Global Illumination Rendering System
Engineering Emmy Award: ARRI for ARRI Skypanel
Engineering Emmy Award: CEDAR Audio Ltd. for CEDAR Studio
Engineering Emmy Award: Golaem for Golaem Crowd
Engineering Emmy Award: Stephen Regelous for Massive
Engineering Emmy Award: Steve Vitolo, Felipe A. Mendez, and Franco Zuccar for Scriptation
Engineering Emmy Award: Nicolaas Verheem, Marius van der Watt, Dennis Scheftner, and Zvi Reznic for Teradek Bolt 4K
Engineering Emmy Award: Chaos for V-Ray
Charles F. Jenkins Lifetime Achievement Award: Reed Hastings
Philo T. Farnsworth Corporate Achievement Engineering Award: Dolby Laboratories
The 73rd Primetime Emmy Engineering Awards Ceremony took place on Thursday, October 21, 2021, at the JW Marriott Hotel LA Live, Los Angeles.
See also
List of American television awards
List of engineering awards
References
Engineering |
8024623 | https://en.wikipedia.org/wiki/28th%20Bomb%20Wing | 28th Bomb Wing | The 28th Bomb Wing is a United States Air Force unit assigned to the Eighth Air Force (8 AF) of the Air Force Global Strike Command (AFGSC) and is stationed at Ellsworth Air Force Base, South Dakota. The wing is also the "host unit" at Ellsworth AFB.
The wing is one of only two B-1B Lancer strategic bomber wings in the United States Air Force, the other being the 7th Bomb Wing at Dyess Air Force Base, Texas.
Active for over 60 years, the 28th was a component wing of Strategic Air Command's deterrent force throughout the Cold War.
The 28th Bomb Wing has been commanded by Colonel David Doss since July 2019; it was previously commanded by Colonel Gentry Boswell (2015–2017) and Colonel Edwards (2017–2019). Its Command Chief Master Sergeant is CMSgt Adam Vizi.
Units
28th Operations Group
28th Operations Support Squadron
34th Bomb Squadron
37th Bomb Squadron
28th Mission Support Group
28th Civil Engineering Squadron
28th Communications Squadron
28th Contracting Squadron
28th Logistics Readiness Squadron
28th Force Support Squadron
28th Security Forces Squadron
28th Maintenance Group
28th Aircraft Maintenance Squadron
28th Maintenance Squadron
28th Munitions Squadron
28th Maintenance Operations Squadron
28th Medical Group
28th Medical Operations Squadron
28th Medical Support Squadron
Background of name
The motto "Guardian of the North" hails from the 28th Operations Group's World War II service in Alaska, the Aleutian Islands, and the Kuril Islands. The 28th Bomb Wing carries on the traditions of the 28th Operations Group.
Contrary to a persistent myth, the motto "Guardian of the North" is not related to the wing's Cold War service with Boeing B-52 Stratofortress bombers. Although the motto seems to dovetail with the idea of guarding the north against the Soviets (whether by spearheading an attack over the North Pole or defending against one), this is not the case.
History
For related history and lineage, see 28th Operations Group
The 28th Bomb Wing, under various designations, has been assigned to Ellsworth Air Force Base, South Dakota for over 60 years. It is the longest assigned active-duty unit at a single base in the United States Air Force.
Cold War
Established as the 28th Bombardment Wing, Very Heavy on 28 July 1947, the wing maintained proficiency in heavy bombardment from 1947 to 1948 and maintained proficiency in global bombardment, deploying tactical components or segments thereof as needed from 1948 to 1950.
In March 1953, an RB-36 and its entire crew of 23 crashed in Newfoundland while returning from a routine exercise in Europe. On 13 June 1953, President Dwight D. Eisenhower made a personal visit to dedicate the base in memory of Brig Gen Richard E. Ellsworth, commander of the 28th Strategic Reconnaissance Wing, who lost his life in that mishap.
Although the wing's aerial reconnaissance capability lasted until September 1958, by April 1955 the Air Force had already changed the wing back to its former status as the 28th Bombardment Wing, Heavy, under the 15th Air Force (later attached to the 3rd Air Division), which specialized almost exclusively in nuclear ordnance delivery. Headquarters Strategic Air Command (SAC) reassigned the 28 BMW from 8th Air Force back to 15th Air Force in October 1955. The wing also completed a deployment to Andersen Air Force Base, Guam, from April 1955 to July 1955.
Approximately one year later, SAC set plans in motion to replace the 28th's Convair B-36 Peacemakers with the new all-jet B-52 Stratofortress. The last B-36 left Ellsworth on 29 May 1957 and the first B-52 arrived sixteen days later. In 1958, all base units came under the command of the 821st Strategic Aerospace Division, headquartered at Ellsworth.
On 26 September 1958, two B-52Ds from the wing set world speed records for heavy aircraft. One flew a 5,000 km closed course at an average speed of 597.695 mph, while the other flew a 10,000 km closed course at a speed of 560.705 mph.
The wing added aerial refueling to its mission in 1959 with the addition of the Boeing KC-135 Stratotanker and also began operating post-attack command and control system for Fifteenth Air Force in January 1965, maintaining this capability through a rear echelon during the absences of the remainder of the wing.
The wing also temporarily controlled the 850th Strategic Missile Squadron, a non-equipped Titan I missile squadron from December 1960 to December 1961 pending the later establishment of the 44th Strategic Missile Wing at Ellsworth.
In April 1966, B-52Ds of the wing, together with D-series bombers of the 484th Bombardment Wing, deployed to Andersen Air Force Base, Guam, replacing the B-52Fs that SAC had been deploying to Andersen for the Vietnam War since the previous year. The 28th's planes and other B-52Ds had been modified under a program called Big Belly, which increased the bombload of wing aircraft from the 27 bombs previously carried to 84 500-pound bombs or 42 750-pound bombs. From this point, the Big Belly B-52D became the SAC workhorse in Southeast Asia.
Except for a small rear echelon left at Ellsworth, the wing's headquarters staff, aircraft and crews, and most support personnel were integrated into Operation Arc Light forces for combat in Southeast Asia, c. 9 March – c. 21 September 1966, c. 15 January – c. 19 July 1968, and c. 9 September 1969 – c. 18 March 1970. From April 1972 to October 1973 the wing also had most of its tactical aircraft and crews on loan to SAC organizations involved in combat operations, and the wing continued supporting Pacific forces with planes and crews into 1975.
In 1971, the wing converted from the B-52D to the B-52G, and converted again from B-52G to B-52H models in 1977. The B-52H mission expanded in 1984 to include sea reconnaissance, surveillance, and conventional operations from forward bases overseas, to include employment of the AGM-84 Harpoon missile. The wing also upgraded its tanker force to KC-135R variant in 1985 and 1986.
From 1 April 1970 to 30 September 1992, the 4th Airborne Command and Control Squadron (ACCS), part of the 28th BMW, provided airborne command post responsibilities with specially modified Boeing EC-135 airborne command post aircraft for Strategic Air Command as part of the Post Attack Command and Control System. The 4th ACCS was the workhorse of Airborne Launch Control System (ALCS) operations. Three dedicated Airborne Launch Control Centers (ALCC) (pronounced "Al-see"), designated ALCC No. 1, ALCC No. 2, and ALCC No. 3 were on ground alert around-the-clock providing ALCS coverage for five of the six Minuteman Intercontinental Ballistic Missile (ICBM) Wings. These dedicated ALCCs were mostly EC-135A aircraft but sometimes were EC-135C or EC-135G aircraft, depending on availability. ALCC No. 1 was on ground alert at Ellsworth AFB, SD and during a wartime scenario, its role would have been to take off and orbit between the Minuteman Wings at Ellsworth AFB, SD and F.E. Warren AFB, WY, providing ALCS assistance if needed. ALCCs No. 2 and No. 3 were routinely on forward deployed ground alert at Minot AFB, ND. During a wartime scenario, ALCC No. 3’s role would have been to take off and orbit between the Minuteman ICBM Wings at Minot AFB, ND and Grand Forks AFB, ND, providing ALCS assistance if needed. ALCC No. 2’s dedicated role was to take off and orbit near the Minuteman ICBM Wing at Malmstrom AFB, MT, providing ALCS assistance if needed. The 4th ACCS also maintained an EC-135C or EC-135G on ground alert at Ellsworth as the West Auxiliary Airborne Command Post (WESTAUXCP), which was a backup to SAC’s Looking Glass Airborne Command Post (ABNCP), as well as a radio relay link between the Looking Glass and ALCCs when airborne. Although equipped with ALCS, the WESTAUXCP did not have a dedicated Minuteman ICBM wing to provide ALCS assistance to.
In 1986, the 28 BMW made extensive preparations to phase out the aging B-52 fleet and become the new home for the advanced B-1 Lancer. Contractors completed new unaccompanied enlisted dormitories in March, a new security police group headquarters in October, and gave Ellsworth's 13,497-foot runway a much-needed facelift. In addition, they completed new aircraft maintenance facilities for the complex new B-1B. In January 1987, the wing received the first of 35 B-1B bombers.
The 37 BS returned to operational duty with the 28 BW in January 1987, just in time to join the 77 BS in training on the new bombers. The first B-1B arrived on 21 January 1987. In July 1988 the 57th Air Division became the wing's new higher headquarters. In 1989 the wing's B-1Bs earned the Fairchild Trophy, Crumm Linebacker Trophy, Eaker Trophy, and the Omaha Trophy for superior bomber operations and the most outstanding wing in SAC. The wing also provided tanker support for Operation Just Cause, December 1989 – January 1990.
In July 1990 the Strategic Warfare Center became the latest of the wing's intermediate headquarters. Adding to its extensive combat experience, the wing deployed both tanker and airborne command post aircraft to Operations Desert Shield and Desert Storm from August 1990 to March 1991.
On 1 September 1991 SAC redesignated the 28th as the 28th Wing, and once again assigned it directly under Eighth Air Force, and as part of the new objective wing organization, reactivated the old 28 BG under the new name of the 28th Operations Group. The 28th Wing also regained host wing responsibilities for Ellsworth from the 44th Missile Wing.
Post Cold War era
With the end of the Cold War, on 28 September 1991 the Secretary of Defense ordered B-1Bs and tankers off alert. The 4 ACCS continued to maintain an alert crew until May 1992. On 1 June 1992, Strategic Air Command was inactivated and Air Combat Command was activated in its place; at the same time, the 28th Wing was redesignated the 28th Bomb Wing, and the 28 AREFS became a geographically separated unit assigned to Malmstrom Air Force Base, Montana. In September 1992 the 4 ACCS also inactivated, having effectively worked itself out of a job by providing airborne command and control so faithfully for so long.
In 1993 the wing's B-1Bs were the first in ACC to transition from their former strategic role to an all-conventional mission. The 28th's operational squadrons could conceivably strike anywhere in the world to meet national defense needs. Ellsworth tested this concept in 1993 and early 1994 during such events as "Team Spirit" (the first B-1Bs ever to land in South Korea); "Global Power" (various long-duration, round-trip sorties flown from Ellsworth to bomb training ranges on another continent); and "Bright Star" (the wing's second visit, but the B-1B's first, to a major JCS exercise in Southwest Asia).
From June through December 1994, 28 BW B-1Bs participated in a Congressionally directed operational readiness assessment known locally as "Dakota Challenge." Towards the end of the exercise the wing deployed a squadron to Roswell, NM, to simulate flying from an austere location at wartime sortie rates. This test proved the B-1B to be a versatile weapon system and excellent results were obtained. However, World Airpower Journal argued that '[i]t could be argued that the excellent results were meaningless, because they were so unrepresentative. Spare parts, equipment and people were brought in from the 7th and 384th Bomb Wings, at last bringing the 28th up to 100% in all three respects. This was done at the expense of degrading the other two wings. It did however show what was possible, given funding and commitment.' The journal favorably quoted General John M. Loh in this regard.
On 31 March 1995, the 77 BS—a unit that had served under the wing since 1948—inactivated. Its B-1Bs became part of ACC's reconstitution reserve. This action freed funds to allow the Air Force to develop new precision-guided munitions. The Air Force announced in early 1996 that the 77 BS would once again activate under the 28 BW on 1 April 1997. In November 1998, the squadron received the first Block D upgraded B-1B in the USAF inventory. The Block D upgrade gave the B-1 the capability to drop the Joint Direct Attack Munition, a global positioning system (GPS) guided munition. (These upgrades were paid for with the funds freed by the 77 BS's inactivation.)
One B-1B from the 28th Bomb Wing departed for Southwest Asia on 18 December 1997 to supply additional bomber forces in the Middle East. The 7th Bomb Wing at Dyess Air Force Base also launched a B-1B on 18 December, bringing the total number of B-1Bs in theater to six—three from Ellsworth and three from Dyess. B-1Bs from both bases saw their first combat action in air raids over Iraq on 17 December 1997. Details on the number of B-1Bs used and battle damage assessment information have not been released; however, the missions were characterized as "very successful."
In December 1998, 28th deployed aircraft, which flew under the flag of the 28th Air Expeditionary Group in Operation Desert Fox, were the first B-1s to drop bombs on an enemy target.
In late March 2011, B-1 bombers from the 28th Bomb Wing were deployed on a mission to Libya to attack military targets in support of Operation Odyssey Dawn.
The Department of the Air Force announced in the spring of 2015 that effective 1 October 2015 the 28th, along with the 7th Bomb Wing at Dyess Air Force Base, would be realigned under Air Force Global Strike Command (AFGSC), reuniting all the Air Force's bombers and strategic missiles under a single command for the first time since Strategic Air Command was disestablished 23 years earlier.
Operations and Decorations
Combat Operations: Except for a small rear echelon left at Ellsworth AFB, SD, the wing's headquarters staff, tactical aircraft and crews, and most support personnel were integrated into ARC LIGHT forces for combat in Southeast Asia, c. 9 March – c. 21 September 1966, c. 15 January – c. 19 July 1968, and c. 9 September 1969 – c. 18 March 1970. From April 1972 to October 1973 the wing also had most of its tactical aircraft and crews on loan to SAC organizations involved in combat operations, and the wing continued supporting Pacific forces with planes and crews into 1975. In November 1997, the wing deployed four B-1s and crews to Southwest Asia for a show of force against Iraq, the first real-world contingency use of the B-1 bomber. A year later (November 1998), the wing deployed three B-1s and crews to Southwest Asia for punitive attacks against Iraqi targets, the first combat use of the B-1. After terrorist attacks against the United States in September 2001, the wing deployed B-1 aircraft and crews to Diego Garcia, an island in the Indian Ocean, for bombing missions against enemy targets in Afghanistan.
Campaigns: None
Decorations: Air Force Outstanding Unit Awards with Combat "V" Device: 1 June 2001 – 31 May 2003; 20 September 2001 – 17 January 2002 (conferred). Air Force Outstanding Unit Awards: 1 September 1957 – 30 June 1958; 1 January – 31 December 1966; 1 January – 1 March 1968; 2 March – 1 July 1968; 9 June – 10 July 1972; 1 July 1976 – 30 June 1978; 1 July 1978 – 30 June 1980; 1 July 1981 – 30 June 1983; 1 July 1988 – 30 June 1990; 1 June – 30 November 1994; 1 June 1997 – 31 May 1999; 1 June 2003 – 31 May 2005.
In 1989, won the Fairchild Trophy for excellence in bombing and navigation and the Omaha Trophy, presented to the outstanding wing in SAC.
Lineage
Designated as the 28th Bombardment Wing, Very Heavy on 28 July 1947
Organized on 15 August 1947
Redesignated 28th Bombardment Wing, Medium on 12 July 1948
Redesignated 28th Bombardment Wing, Heavy on 16 May 1949
Redesignated 28th Strategic Reconnaissance Wing on 1 April 1950
Redesignated 28th Strategic Reconnaissance Wing, Heavy on 16 July 1950
Redesignated 28th Bombardment Wing, Heavy on 1 October 1955
Redesignated 28th Wing on 1 September 1991
Redesignated 28th Bomb Wing on 1 June 1992.
Wing resources form the provisional 28th Air Expeditionary Wing when the wing is the primary force provider for deployments.
Assignments
Fifteenth Air Force, 15 August 1947
Eighth Air Force, 1 April 1950
Fifteenth Air Force, 1 April 1955 (attached to 3d Air Division, 14 April – 24 July 1955)
821st Air Division (later 821st Strategic Aerospace Division), 1 January 1959
47th Air Division, 30 June 1971
4th Strategic Missile Division (later 4th Air Division), 15 January 1973
57th Air Division, 1 May 1982
4th Air Division, 23 January 1987
12th Air Division, 15 July 1988
Strategic Warfare Center, 31 July 1990
Eighth Air Force, 1 September 1991
Eighth Air Force, 1 June 1992 (continued from 1 September 1991)
Twelfth Air Force, 1 October 2002
Eighth Air Force, 1 October 2015
Components
Groups
28th Bombardment Group (later 28th Strategic Reconnaissance Group, 28th Operations Group): 15 August 1947 – 16 June 1952; since 1 September 1991
Detached 19 July – 18 October 1948
Squadrons
4th Airborne Command and Control Squadron: 1 April 1970 – 1 September 1991
37th Bomb Squadron: 1 July 1977 – 1 October 1982; 1 January 1987 – 1 September 1991
77th Strategic Reconnaissance Squadron (later 77th Bombardment Squadron): attached 10 February 1951 – 15 June 1952, assigned 16 June 1952 – 1 September 1991 (detached c. 9 March-c. 21 September 1966, c. 15 January-c. 19 July 1968, c. 19 August 1969-c. 23 March 1970).
717th Strategic Reconnaissance Squadron (later 717th Bombardment Squadron): attached 10 February 1951 – 15 June 1952, assigned 16 June 1952 – 1 February 1960
718th Strategic Reconnaissance Squadron (later 718th Bombardment Squadron): attached 10 February 1951 – 15 June 1952, assigned 16 June 1952 – 20 February 1960
928th Air Refueling Squadron: 1 February 1959 – 1 October 1960
28th Air Refueling Squadron: 1 October 1960 – 1 September 1991 (detached c. 9 March-c. 21 September 1966, c. 15 January-c. 19 July 1968, c. 19 August 1969-c. 23 March 1970)
97th Air Refueling Squadron: 1 July 1962 – 15 March 1964
850th Strategic Missile Squadron: 1 December 1960 – 1 January 1962
Stations
Rapid City Army Air Field (later Rapid City Air Force Base; Ellsworth Air Force Base), South Dakota, since 3 May 1947
28th Bombardment Group deployed at RAF Scampton, England, 19 July – 19 October 1948
77th Bombardment Squadron deployed to Andersen Air Force Base, Guam, c. 9 March – c. 21 September 1966, c. 15 January – c. 19 July 1968, and c. 9 September 1969 – c. 18 March 1970
Aircraft assigned
Boeing B-29 Superfortress, 1946–1950
Boeing RB-29 Superfortress, 1950
Convair B-36D Peacemaker, 1949–1950; RB-36D (24), June 1950 – 1957 (Seven B-36Bs were converted to RB-36D); 10 later converted to GRB-36D (FICON). Several RB-36D aircraft temporarily assigned to 91st Strategic Reconnaissance Squadron for duty during Korean War.
Boeing B-52 Stratofortress, 1957–1966, 1966–1968, 1968–1969, 1970–1986
B-52D, 1957–1971; B-52G, 1971–1977; B-52H, 1977–1986
Boeing KC-135 Stratotanker, 1959–1966, 1966–1968, 1968–1969, 1970–1992
Boeing EC-135A/C/G Airborne Launch Control Center and Looking Glass Airborne Command Post, 1967–1992
Boeing KC-97 Stratofreighter, 1962–1964
Rockwell B-1B Lancer, 1987–Present
See also
List of B-52 Units of the United States Air Force
References
Notes
Bibliography
External links
28th Bomb Wing fact sheet
Air Force Historical Research Agency: 28th Bomb Wing
Military units and formations established in 1947
Strategic Air Command units
0028
Military units and formations in South Dakota
1947 establishments in South Dakota |
66106756 | https://en.wikipedia.org/wiki/Robert%20N.%20Rose | Robert N. Rose | Robert N. Rose (born February 27, 1951) is an American Wall Street financier and cybersecurity expert.
Rose is a member of the U.S. Department of Homeland Security's Homeland Security Advisory Council and chair of the Information and Communications Risk Reduction Subcommittee. He was a Clinton Administration appointee to the Fulbright Foreign Scholarship Board.
Education
Rose obtained a BS from the School of Foreign Service at Georgetown University, majoring in international economics. During his studies at Georgetown, he was a member of the Delta Phi Epsilon Fraternity. In 1995, Rose received his Master of Public Administration from the Kennedy School of Government at Harvard University.
Career in finance
From 1995 to 2008, Rose was a Senior Managing Director at Bear Stearns, where he was Global Head of Sales and Marketing for PricingDirect and the Financial Analytics and Structured Transactions group.
Career in cybersecurity
Rose has served in various appointed U.S. government advisory positions in the areas of national security, cyber, and homeland security. In 1995, Rose was one of the founding members of the U.S. Secret Service’s Electronic Crimes Task Force (ECTF) in New York. He was later appointed to the U.S. Department of State’s International Security Advisory Board (ISAB).
Rose was invited by the Aspen Security Forum to speak on “Cyber Power and Cyber-Security.” He also played a critical role in the 2012 establishment of the George Washington University Center for Cyber and Homeland Security.
Political work
Rose has been a longtime Democratic Party fundraiser and activist. He was a co-founder of the National Jewish Democratic Council in 1990, and in 1992, he was appointed to the Democratic National Convention Site Selection Committee and was a member of the New York 1992 Convention Host Executive Committee. In 2000, he was a member of the DNC National Convention Rules Committee. Rose was an Alternate Delegate for the 2004 Democratic National Convention held in Boston. He also served as Finance Chairman of the Democratic Party of Connecticut in 1993.
Publications
Co-authored, "Final Report of the Emerging Technologies Subcommittee Biotechnology,” U.S. Department of Homeland Security (August 18, 2020)
Co-authored, “Final Report of the Emerging Technologies Subcommittee Unmanned Aerial and Ground Based Systems,” U.S. Department of Homeland Security (February 24, 2020)
Co-authored, “Final Report of the Emerging Technologies Subcommittee 3-D Printing,” U.S. Department of Homeland Security (February 24, 2020)
Co-authored, “Final Report of Cybersecurity Subcommittee: State, Local, Tribal & Territorial,” U.S. Department of Homeland Security (November 14, 2019)
Co-authored, “Final Report of the Emerging Technologies Subcommittee Artificial Intelligence / Machine Learning,” U.S. Department of Homeland Security (November 14, 2019)
“Restructuring the U.S. Intelligence Community”, Center for Cyber and Homeland Security, George Washington University (June 2017)
Co-authored “Report on Arctic Policy,” U.S. Department of State (September 21, 2016)
“The Future of Insider Threats,” Forbes Online (August 30, 2016)
“A Practical Path to Cybersecurity,” Forbes Online (December 22, 2015)
Co-authored “A Framework for International Cyber Stability,” U.S. Department of State (July 2, 2014)
Co-authored “Defer Capital Gains. Don’t Cut the Tax,” op-ed article published in the New York Times, October 18, 1992. Article cited in Forbes magazine (January 18, 1993)
References
1951 births
Living people
Georgetown University alumni
Walsh School of Foreign Service alumni
Harvard Kennedy School people
Harvard University alumni
Connecticut Democrats
United States Department of Homeland Security officials |
1257746 | https://en.wikipedia.org/wiki/Wesley%20A.%20Clark | Wesley A. Clark | Wesley Allison Clark (April 10, 1927 – February 22, 2016) was an American physicist who is credited with designing the first modern personal computer. He was also a computer designer and the main participant, along with Charles Molnar, in the creation of the LINC computer, which was the first minicomputer and shares with a number of other computers (such as the PDP-1) the claim to be the inspiration for the personal computer.
Clark was born in New Haven, Connecticut, and grew up in Kinderhook, New York, and in northern California. His parents, Wesley Sr. and Eleanor Kittell, moved to California, and he attended the University of California, Berkeley, where he graduated with a degree in physics in 1947. Clark began his career as a physicist at the Hanford Site.
In 1981, Clark received the Eckert–Mauchly Award for his work on computer architecture. He was awarded an honorary degree by Washington University in 1984. He was elected to the National Academy of Engineering in 1999. Clark is a charter recipient of the IEEE Computer Society Computer Pioneer Award for "First Personal Computer".
At Lincoln Laboratory
Clark moved to the MIT Lincoln Laboratory in 1952 where he joined the Project Whirlwind staff. There he was involved in the development of the Memory Test Computer (MTC), a testbed for ferrite core memory that was to be used in Whirlwind. His sessions with the MTC, "lasting hours rather than minutes," helped form his views that computers were to be used as tools on demand for those who needed them. That view carried over into his designs for the TX-0 and TX-2 and the LINC. He expressed this view clearly here:
...both of the Cambridge machines, Whirlwind and MTC, had been completely committed to the air defense effort and were no longer available for general use. The only surviving computing system paradigm seen by M.I.T. students and faculty was that of a very large International Business Machine in a tightly sealed Computation Center: the computer not as tool, but as demigod. Although we were not happy about giving up the TX-0, it was clear that making this small part of Lincoln's advanced technology available to a larger M.I.T. community would be an important corrective step.
Clark is one of the fathers of the personal computer... he was the architect of both the TX-0 and TX-2 at Lincoln Labs. He believed that "a computer should be just another piece of lab equipment." At a time when most computers were huge remote machines operated in batch mode, he advocated far more interactive access. He practiced what he preached, even though it often meant bucking current "wisdom" and authority (in a 1981 lecture, he mentioned that he had the distinction of being "the only person to have been fired three times from MIT for insubordination".)
Clark's design for the TX-2 "integrated a number of man-machine interfaces that were just waiting for the right person to show up to use them in order to make a computer that was 'on-line'. When selecting a PhD thesis topic, an MIT student named Ivan Sutherland looked at the simple cathode ray tube and light pen on the TX-2's console and thought one should be able to draw on the computer. Thus was born Sketchpad, and with it, interactive computer graphics."
At Washington University
In 1964, Clark moved to Washington University in St. Louis where he and Charles Molnar worked on macromodules, which were fundamental building blocks in the world of asynchronous computing. The goal of the macromodules was to provide a set of basic building blocks that would allow computer users to build and extend their computers without requiring any knowledge of electrical engineering.
The New York Times series on the history of the personal computer had this to say in an article on August 19, 2001, "How the Computer Became Personal":
In the pantheon of personal computing, the LINC, in a sense, came first—more than a decade before Ed Roberts made PC's affordable for ordinary people. Work started on the Linc, the brainchild of the M.I.T. physicist Wesley A. Clark, in May 1961, and the machine was used for the first time at the National Institute of Mental Health in Bethesda, MD, the next year to analyze a cat's neural responses.
Each Linc had a tiny screen and keyboard and comprised four metal modules, which together were about as big as two television sets, set side by side and tilted back slightly. The machine, a 12-bit computer, included a one-half megahertz processor. Lincs sold for about $43,000—a bargain at the time—and were ultimately made commercially by Digital Equipment, the first minicomputer company. Fifty Lincs of the original design were built.
Role in ARPANET
Clark had a key insight in the planning for the ARPANET (the predecessor to the Internet). In April 1967, he suggested to Larry Roberts the idea of using separate small computers (later named Interface Message Processors) as a way of forming a message switching network and reducing load on the local computers. The same idea had earlier been independently developed by Donald Davies for the NPL network. The concept of packet switching was introduced to the ARPANET later at the Symposium on Operating Systems Principles in October 1967.
Post-Nixon China trip
In 1972, shortly after President Nixon's trip to China, Clark accompanied five other computer scientists to China for three weeks to "tour computer facilities and to discuss computer technology with Chinese experts in Shanghai and Beijing. Officially, the trip was seen by the Chinese in two lights: as a step in reestablishing the long-interrupted friendship between the two nations and as a step in opening channels for technical dialogue." The trip was organized by his colleague Severo Ornstein from MIT Lincoln Laboratory and Washington University. The other members of the group were: Thomas E. Cheatham, Anatol Holt, Alan J. Perlis and Herbert A. Simon.
Death
He was 88 when he died on February 22, 2016, at his home in Brooklyn due to severe atherosclerotic cardiovascular disease.
See also
List of pioneers in computer science
References
External links
Wesley Clark article in Smart Computing Encyclopedia
Oral history interview with Wesley Clark. Charles Babbage Institute, University of Minnesota. Clark describes his research at Lincoln Laboratory and interaction with the Information Processing Techniques Office (IPTO) of the Advanced Research Projects Agency (ARPA). Topics include various custom computers built at MIT, including the LINC computer; timesharing and network research; artificial intelligence research; ARPA contracting; interaction with IPTO directors; the work of Larry Roberts at IPTO.
Functional Description of the L1 Computer, March 1960 at bitsavers.org
The Logical Structure of Digital Computers, October 1955 at bitsavers.org
Multi-Sequence Program Concept, November, 1954 at bitsavers.org
1927 births
2016 deaths
American engineers
Washington University in St. Louis faculty
Washington University physicists
Scientists from New Haven, Connecticut
Members of the United States National Academy of Engineering
Engineers from Connecticut
MIT Lincoln Laboratory people |
3927666 | https://en.wikipedia.org/wiki/Business%20intelligence%20software | Business intelligence software | Business intelligence software is a type of application software designed to retrieve, analyze, transform and report data for business intelligence. The applications generally read data that has been previously stored, often - though not necessarily - in a data warehouse or data mart.
History
Development of business intelligence software
The first comprehensive business intelligence systems were developed by IBM and Siebel (since acquired by Oracle) between 1970 and 1990. At the same time, small developer teams were emerging with attractive ideas and releasing some of the products companies still use today.
In 1988, specialists and vendors organized a Multiway Data Analysis Consortium in Rome, where they considered how to make data management and analytics more efficient and, above all, available to smaller and financially constrained businesses. By 2000, there were many professional reporting systems and analytic programs, some owned by top-performing software producers in the United States.
Cloud-hosted business intelligence software
In the years after 2000, business intelligence software producers became interested in producing universally applicable BI systems that do not require expensive installation and could therefore be adopted by smaller and midmarket businesses unable to afford on-premises maintenance. These aspirations emerged in parallel with the cloud-hosting trend, which led most vendors to develop hosted systems accessible from anywhere.
From 2006 onwards, cloud-hosted business intelligence shifted increasingly toward mobile access, largely to the benefit of decentralized and remote teams seeking to work with data or gain full visibility over it outside the office. Following the success of fully optimized browser-based versions, vendors have more recently begun releasing mobile-specific applications for both Android and iOS users. Cloud-hosted data analytics has also made it possible for companies to categorize and process large volumes of data, supporting richer visualization and better-informed decision making.
Types
The key general categories of business intelligence applications are:
Spreadsheets
Reporting and querying software: applications that extract, sort, summarize, and present selected data (a minimal example query is shown below)
Online analytical processing (OLAP)
Digital dashboards
Data mining
Business activity monitoring
Data warehouse
Local information systems
Data cleansing
Except for spreadsheets, these tools are provided as standalone applications, suites of applications, components of Enterprise resource planning systems, application programming interfaces or as components of software targeted to a specific industry. The tools are sometimes packaged into data warehouse appliances.
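As a concrete illustration of the "reporting and querying" category above, the following is a minimal sketch, in Python using the standard-library sqlite3 module, of the extract–summarize–sort–present pattern such tools automate. The table, columns and figures are hypothetical and not drawn from any particular BI product.

```python
# Minimal sketch of a reporting/querying workflow (illustrative only).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "Widget", 120.0), ("North", "Gadget", 80.0),
     ("South", "Widget", 200.0), ("South", "Gadget", 50.0)],
)

# Summarize revenue per region, sorted from highest to lowest -- the kind
# of aggregate a BI report or dashboard would present as a table or chart.
report = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC"
).fetchall()

for region, revenue in report:
    print(f"{region}: {revenue:.2f}")
```

A full BI product layers scheduling, security, visualization and data-source connectors on top of queries of this kind.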
Open source free products
Apache Hive, hosted by the Apache Software Foundation
BIRT Project, by the Eclipse Foundation
D3.js
KNIME
Orange
Pentaho
R
TACTIC
Superset
Grafana
Open source commercial products
JasperReports: reporting, analysis, dashboard
Palo: OLAP server, worksheet server and ETL server
Pentaho: reporting, analysis, dashboard, data mining and workflow capabilities
TACTIC: reporting, management, dashboard, data mining and integration, workflow capabilities
Proprietary free products
Biml - Business Intelligence Markup Language
Datacopia
icCube
InetSoft
Splunk
Proprietary products
References
Business intelligence
Business software |
43791369 | https://en.wikipedia.org/wiki/Geo-blocking | Geo-blocking | Geo-blocking or geoblocking is technology that restricts access to Internet content based upon the user's geographical location. In a geo-blocking scheme, the user's location is determined using Internet geolocation techniques, such as checking the user's IP address against a blacklist or whitelist, examining account details, or measuring the end-to-end delay of a network connection to estimate the user's physical location. The result of this check is used to determine whether the system will approve or deny access to the website or to particular content. The geolocation may also be used to modify the content provided, for example the currency in which goods are quoted, the price, or the range of goods that are available, among other aspects.
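The following is a minimal, illustrative sketch in Python of the IP-address check described above; it is not the implementation of any particular service. The prefix-to-country table, the single licensed territory and the sample addresses are hypothetical placeholders — real deployments rely on commercial geolocation databases and additional signals such as account details and measured network delay.

```python
# Minimal sketch of IP-based geo-blocking (illustrative only).
import ipaddress
from typing import Optional

# Hypothetical mapping of network prefixes to ISO country codes;
# production systems use commercial geolocation databases instead.
GEO_RANGES = {
    ipaddress.ip_network("203.0.113.0/24"): "US",
    ipaddress.ip_network("198.51.100.0/24"): "CA",
}

LICENSED_COUNTRIES = {"US"}  # territories where the content may be offered


def lookup_country(client_ip: str) -> Optional[str]:
    """Estimate the client's country from its IP address."""
    addr = ipaddress.ip_address(client_ip)
    for network, country in GEO_RANGES.items():
        if addr in network:
            return country
    return None  # location could not be estimated


def is_allowed(client_ip: str) -> bool:
    """Approve or deny access based on the estimated country."""
    return lookup_country(client_ip) in LICENSED_COUNTRIES


print(is_allowed("203.0.113.7"))   # True  - resolves to a licensed territory
print(is_allowed("198.51.100.7"))  # False - resolves outside the licensed territory
```

In practice the same country lookup can drive content modification rather than outright denial, for example selecting the currency, prices or catalogue shown to the visitor.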
The term is most commonly associated with its use to restrict access to premium multimedia content on the Internet, such as films and television shows, primarily for copyright and licensing reasons. Geo-blocking also has other uses, such as blocking malicious traffic and enforcing price discrimination, location-aware authentication, fraud prevention, and restrictions on online gambling (where gambling laws vary by region).
Justification
The ownership of exclusive territorial rights to content may differ between regions, requiring the providers of the content to disallow access for users outside of their designated region; for example, although an online service, HBO Now is only available to residents of the United States, and cannot be offered in other countries because its parent network HBO had already licensed exclusive rights to its programming to different broadcasters (such as in Canada, where HBO licensed its back-catalogue to Bell Media), who may offer their own, similar service specific to their own region and business model (such as Crave). For similar reasons, the library of content available on subscription video on demand services such as Netflix may also vary between regions, or the service may not even be available in the user's country at all.
Geo-blocking can be used for other purposes as well. Price discrimination by online stores can be enforced by geo-blocking, forcing users to buy products online from a foreign version of a site where prices may be unnecessarily higher than those of their domestic version (although the inverse is often the case). The "Australia Tax" has been cited as an example of this phenomenon, which has led to governmental pressure to restrict how geo-blocking can be used in this manner in the country.
Other noted uses include blocking access from countries that a particular website is not relevant to (especially if the majority of traffic from that country is malicious), and voluntarily blocking access to content or services that are illegal under local laws. This can include online gambling, and various international websites blocking access to users within the European Economic Area due to concerns of liability under the General Data Protection Regulation (GDPR).
Circumvention
Geo-blocking can be circumvented. When IP address-based geo-blocking is employed, virtual private network (VPN) and anonymizer services can be used to evade geo-blocks. A user can, for example, access a website using a U.S. IP address in order to access content or services that are not available from outside the country. Hulu, Netflix, Amazon and BBC iPlayer are among the foreign video services widely used through these means by foreign users. Netflix's popularity among VPN users in Australia prompted the company to officially establish an Australian version of its service in 2014. In response to complaints over the quality of NBC's domestic coverage, along with the requirement that viewers subscribe to a participating pay television provider in order to access the online content, a large number of American viewers used VPN services to stream foreign online coverage of the 2012 Summer Olympics and 2014 Winter Olympics from British and Canadian broadcasters. Unlike NBC's coverage, this foreign coverage only used a geo-block and did not require a TV subscription.
In 2013, the New Zealand internet service provider Slingshot introduced a similar feature known as "global mode"; initially intended for travellers to enable access to local websites blocked in New Zealand, the service was re-launched in July 2014 as a feature to all Slingshot subscribers. The consumer-focused re-launch focused on its ability to provide access to U.S. online video services. Unlike manually-configured VPN services, Global Mode was implemented passively at the ISP level and was automatically activated based on a whitelist, without any further user intervention.
Legality of circumvention for online video
The legality of circumventing geo-blocking to access foreign video services under local copyright laws is unclear and varies by country. Members of the entertainment industry (including broadcasters and studios) have contended that the use of VPNs and similar services to evade geo-blocking by online video services is a violation of copyright laws, as the foreign service does not hold the rights to make its content available in the user's country—thus infringing and undermining the rights held by a local rights holder. Accessing online video services from outside the country in which they operate is typically considered a violation of their respective terms of use; some services have implemented measures to block VPN users, despite there being legitimate uses for such proxy services, under the assumption that they are using them to evade geographic filtering.
Leaked e-mails from the Sony Pictures Entertainment hack revealed statements by Keith LeGoy, Sony Pictures Television's president of international distribution, describing the international usage of Netflix over VPN services as being "semi-sanctioned" piracy that helped to illicitly increase its market share, and criticizing the company for not taking further steps to prevent usage of the service outside of regions where they have licenses to their content, such as detecting ineligible users via their payment method. On 14 January 2016, Netflix announced its intent to strengthen measures to prevent subscribers from accessing regional versions of the service that they are not authorized to use.
Australia
In Australia, a policy FAQ published by then Minister for Communications Malcolm Turnbull states that a user who circumvents an "international commercial arrangement to protect copyright in different countries or regions" does not act illegally under Australian copyright law. However, an amendment to Australian copyright law allows courts to order the blocking of websites that primarily engage in "facilitating" copyright infringement—a definition which could include VPN services that market themselves specifically for the purpose of evading geo-blocking. Prior to the passing of this amendment in June 2015, Turnbull acknowledged that VPN services have "a wide range of legitimate uses, not least of which is the preservation of privacy—something which every citizen is entitled to secure for themselves—and [VPN providers] have no oversight, control or influence over their customers’ activities."
European Union
On 6 May 2015, the European Union announced the adoption of its "Digital Single Market" strategy, which would amongst other changes, aim to end the use of "unjustified" geo-blocking between EU countries, arguing that "too many Europeans cannot use online services that are available in other EU countries, often without any justification; or they are re-routed to a local store with different prices. Such discrimination cannot exist in a Single Market." However, proposals issued by the European Commission on 25 May 2016 excluded the territorial licensing of copyrighted audiovisual works from this strategy.
On 1 April 2018, new digital media portability rules took effect, which requires paid digital media services to offer "roaming" within the EU. This means that, for example, a subscriber to Netflix in one EU country must still be able to access their home country's version of the service when travelling into other EU countries.
The European Union has approved the Regulation on Measures to Combat Unjustified Geoblocking and Other Forms of Discrimination Based on Citizenship, Place of Residence or Location of a Person in the Internal Market, which entered into force on 3 December 2018.
The geo-blocking regulation aims to provide more options for consumers and businesses in the EU internal market. It addresses the problem that (potential) customers cannot buy goods and services from sellers located in another Member State for reasons related to their citizenship, place of residence or location, and therefore discriminate against them when they try to get access to the best offers, prices or terms of sale compared to the nationals or residents of the member state of the sellers.
The new rules apply only if the other party is a consumer or a company that purchases services or products exclusively for end use (B2C, or B2B end-use sales). The geo-blocking regulation does not apply if products are sold to business customers for commercial purposes. The regulation also does not completely prohibit geo-blocking and geo-discrimination: it only prohibits certain forms.
Geo-blocking regulations prohibit geo-blocking and geo-discrimination in three situations:
1) It is not permitted to deny visitors access to a website or to automatically redirect them to another website based on their location. Redirection is only allowed with the visitor's consent. Similar rules apply to apps: customers must be able to download and use them throughout the EU.
2) The rules apply to the means of payment accepted on the site. A payment method cannot be refused because the customer or his or her bank is located in another EU Member State, or because the means of payment was issued in another EU Member State. Different payment terms and higher transaction costs are also prohibited.
3) In certain situations, it is no longer allowed to apply different general conditions to foreign customers:
a) when providing digital services such as cloud services and web hosting;
b) when providing services in a physical location, such as renting cars or selling tickets for an event;
c) when selling goods and offering either delivery to a specific area or collection at a specific place (for example, a store).
The prohibition of direct or indirect discrimination on the basis of citizenship is a fundamental principle of EU law. In situations not covered by this Regulation, Article 20(2) of the Services Directive (2006/123/EC) may apply. According to this provision, sellers can only apply a difference of treatment based on nationality or place of residence if this is justified by objective criteria. In some cases, industry-specific legislation (such as transport or health) may also apply that addresses this issue. In addition, the Regulation does not affect the TFEU rules, including the non-discrimination rules.
New Zealand
In April 2015, a group of media companies in New Zealand, including MediaWorks, Spark, Sky Network Television, and TVNZ, jointly sent cease and desist notices to several ISPs offering VPN services for the purpose of evading geo-blocking, demanding that they pledge to discontinue the operation of these services by 15 April 2015, and to inform their customers that such services are "unlawful". The companies accused the ISPs of facilitating copyright infringement by violating their exclusive territorial rights to content in the country, and misrepresenting the alleged legality of the services in promotional material. In particular, Spark argued that the use of VPNs to access foreign video on demand services was cannibalizing its own domestic service Lightbox. At least two smaller providers (Lightwire Limited and Unlimited Internet) announced that they would pull their VPN services in response to the legal concerns. However, CallPlus, the parent company of Slingshot and Orcon, objected to the claims, arguing that the Global Mode service was "completely legal", and accused the broadcasters of displaying protectionism. Later that month, it was reported that the broadcasters planned to go forward with legal action against CallPlus.
On 24 June 2015, it was announced that the media companies had reached an out-of-court settlement, in which ByPass Network Services, the operator of the Global Mode service, would discontinue it effective 1 September 2015.
See also
Regional lockout
IP address blocking
Internet censorship
Blocking of YouTube videos in Germany
References
Copyright law
Criticism of intellectual property
Internet censorship
Geographic position |
409093 | https://en.wikipedia.org/wiki/Griffith%20University | Griffith University | Griffith University is a public research university in South East Queensland on the east coast of Australia. Formally founded in 1971, Griffith opened its doors in 1975, introducing Australia's first degrees in environmental science and Asian studies.
The university is named after Sir Samuel Walker Griffith, who was twice Premier of Queensland and the first Chief Justice of the High Court of Australia. Sir Samuel Griffith played a major role in the Federation of Australia and was the principal author of the Australian constitution.
Opening initially with the one campus at Nathan and 451 students, the University now has five physical campuses spanning three cities, the largest of which are the Gold Coast campus at Southport and the Nathan campus in Brisbane. The Mount Gravatt and South Bank campuses are also located in Brisbane, while the Logan campus is at Meadowbrook. In 2018, the University launched its Digital campus, now its sixth campus, which offers a range of online degrees.
Griffith has around 50,000 students and offers a full suite of undergraduate, postgraduate and research degrees in the areas of business and government, criminology and law, education, engineering and information technology, environment, planning and architecture, health, humanities and languages, music, science and aviation, and visual and creative arts. It is a verdant university and a member of the Innovative Research Universities (IRU) group.
In the 2019 Student Experience Survey, Griffith University recorded the seventh highest student satisfaction rating out of all Australian universities, and the highest student satisfaction rating out of all public Queensland universities, with an overall satisfaction rating of 82.1. Since 2012, Griffith University has received more Australian Awards for University Teaching than any other Australian university. Griffith has many distinguished alumni and academic staff, including 2017 Australian of the Year Emeritus Professor Alan Mackay-Sim.
History
Beginnings
In 1965, an area of natural bushland at Nathan was set aside for a new campus. Initially the site was to be part of the University of Queensland, which was experiencing strong demand in humanities and social sciences. By 1970 a new institution was being mooted, and Theodor Bray (later Sir Theodor Bray) was asked by the Queensland Government to establish a second university for Brisbane and the third for the state. After several months of discussion, the Queensland Government announced on 24 December 1970 that Bray would head a committee charged with establishing Griffith University. The Mount Gravatt site was renamed Nathan and set to become Griffith's first campus.
On 30 September 1971, the Queensland Government officially created and recognised Griffith University with the passing of the Assent to Griffith University Act 1971.
On 5 March 1975, Griffith University began teaching 451 students in four schools: Australian Environmental Studies, Humanities, Modern Asian Studies and Science. The university was distinguished by its "problem-based" rather than disciplinary approach to course design and research.
Expansion
In the 1990s, the Dawkins Revolution saw a number of tertiary education reforms in Australia, resulting in a series of amalgamations of colleges and universities.
In 1990, the Mount Gravatt Teacher's College (established in 1969) and Gold Coast College of Advanced Education (established in 1987) became official campuses of Griffith University. The Queensland Conservatorium of Music continued the higher education mergers and became an official part of Griffith University in 1991. Originally established in 1957, the new entity became known as Queensland Conservatorium Griffith University. In 1992, the amalgamations were completed for Griffith, with the Queensland College of Art (QCA), established in 1881 and recognised as the oldest continuous operating art training institution in Australia, officially becoming part of the university.
Griffith's fifth campus, Logan, opened in 1998. Located in the suburb of Meadowbrook, on an area of green fields south of Brisbane, the Logan campus was established to specifically address the interests and needs of the Logan City area.
Griffith University was an official Partner of the Gold Coast 2018 Commonwealth Games. Over 500 university students and staff were closely involved in the planning and delivery of the event.
Campuses
Griffith University's campuses are distinctive for their nature-based settings within urban environments.
Gold Coast campus
The Gold Coast campus is located in the Gold Coast suburb of Southport. Set in native bushland, on the land of the Aboriginal Yugambeh and Kombumerri peoples, this campus plays host to over 18,200 students from all over Australia and the world. It is Griffith University's largest campus.
The campus has seen significant growth and development over the last few years, with the opening of the $150 million Griffith Health Centre and the neighbouring Gold Coast University Hospital in 2013, and the launch of the $38 million Griffith Business School building in 2014. The campus is serviced by two Gold Coast light rail (G:link) stations, and is a major interchange for bus routes.
Logan campus
Logan is Griffith University's community-focused campus. Hosting almost 2500 students, the campus offers degrees in human services and social work, nursing and midwifery, business and commerce, education and information technology. The campus has strong connections with the local community, hosting numerous sporting and cultural events throughout the year.
Nathan campus
Nathan, Griffith's foundation campus, is situated in tranquil, native bushland on the edge of Toohey Forest and less than 10 kilometres from the Brisbane CBD. Nathan hosts over 13,000 students and offers degrees in business and government, engineering and information technology, environment, humanities and languages, law, and science and aviation.
The buildings at the Nathan campus were designed to fit into the environment by Roger Kirk Johnson, the founding architectural designer of the campus, following the slope of the land and using architectural means of cooling. The library building was designed by Robin Gibson and won the first national award for library design. The clusters of buildings, sports facilities, bushland reserves and recreational areas are connected by integrated networks of walking paths. On the northern edge of the campus lies the Dunn Memorial, a fitting tribute.
In 2013, the six-star, green-rated Sir Samuel Griffith Centre was opened on the Nathan campus. The building operates off the grid and is powered by a combination of photovoltaics and hydrogen.
The campus has two residential colleges for students and a range of sporting facilities.
Mount Gravatt campus
The Mount Gravatt campus, adjacent to the Nathan campus, hosts 4400 students. It is the university's social sciences and humanities hub and the base for research into crucial social issues, including education and suicide prevention.
Like Nathan, the campus is situated on the edge of Toohey Forest. The campus features a recently upgraded aquatic and fitness centre, with a heated pool and indoor and outdoor recreation areas, co-located with a 16-court tennis centre, a training oval, and basketball and netball courts. On-campus student accommodation is also available.
South Bank campus
Located in Brisbane's cultural precinct, the South Bank campus is Griffith University's creative hub. It encompasses Griffith's Queensland College of Art and Queensland Conservatorium, and the Griffith Film School and Griffith Graduate Centre. Enrolment across all four units is about 3,400 students.
Digital campus
Griffith's Digital campus, officially launched in 2018, offers over 100 degrees that can be studied online. With over 20,000 students, the Digital campus is Griffith's third largest and fastest growing campus. Griffith also offers online degrees in partnership with Open Universities Australia, and free online courses through FutureLearn.
Organisation
Griffith University is structured in four academic groups, with teaching offered through a range of schools, colleges and departments.
Arts, Education and Law
School of Criminology and Criminal Justice
School of Education and Professional Studies
School of Humanities, Languages and Social Science
Griffith Law School
Queensland College of Art
Griffith Film School
Queensland Conservatorium
Griffith Business School
Department of Accounting, Finance and Economics
Department of Employment Relations and Human Resources
Department of Business Strategy and Innovation
Department of Marketing
Department of Tourism, Sport and Hotel Management
School of Government and International Relations
Griffith Health
School of Applied Psychology
School of Health Sciences and Social Work
School of Medicine and Dentistry
School of Nursing and Midwifery
School of Pharmacy and Medical Sciences
Griffith Sciences
School of Engineering and Built Environment
School of Environment and Science
School of Information and Communication Technology
Academic profile
Rankings
In Australia, Griffith University ranks 18th out of 37 universities. Griffith is in the top 400 universities worldwide in five major world rankings; Academic Ranking of World Universities (ARWU), QS World University Rankings (QS), Leiden Ranking, Times Higher Education World University Rankings (THE-WUR), University Ranking by Academic Performance (URAP).
Griffith also ranks highly as a young university, ranking 33rd in the 2021 QS University Rankings Top 50 Under 50 and 30th in the 2020 Times Higher Education Top 100 under 50.
Griffith has several top ranking subjects according to the ShanghaiRanking Global Ranking of Academic Subjects 2020:
Top 10
Hospitality and Tourism Management (third globally, first in Australia)
Marine/Ocean Engineering (eighth globally, second in Australia)
Nursing and Midwifery (second globally, first in Australia)
Top 50-100
Computer Science and Engineering
Education
Geography
Law (first in Australia)
Water Resources
Top 101-150
Chemical Engineering (equal fifth in Australia)
Civil Engineering
Dentistry and Oral Sciences (third in Australia)
Ecology
Environmental Science and Engineering
Materials Science and Engineering
Nanoscience and Nanotechnology
Oceanography
Pharmacy and Pharmaceutical Sciences
Political Science (third to fifth in Australia)
Public Administration
According to Excellence in Research for Australia (2018), Griffith was rated ‘well above world-standard’ in 24 fields of research, including chemical sciences, dentistry, political science and technology.
MBA
The Griffith MBA is ranked among Australia's leading MBA programs in CEO Magazine and its 2015 MBA Rankings. The rankings are compiled by the International Graduate Forum and are designed to present a 360-degree view of the world's leading business schools. The Griffith MBA is placed sixth in the top tier of Australian programs, and is the only Queensland program to feature in the top 10. It also features in the magazine's top 20 Global MBA Rankings.
The MBA is also the highest-ranking Australian MBA in the Aspen Institute's Centre for Business Education's most recent Beyond Grey Pinstripes Global Top 100, ranked at number 26. Griffith University was awarded this ranking for its focus on responsible leadership, sustainable business practices and the Asia-Pacific. It was also acknowledged as one of Australia's best, ranking fourth in Australia in the 2015 Financial Review BOSS Magazine MBA Survey.
Teaching awards
Griffith features prominently in Australia's national teaching awards and citations. Since 2009, Griffith has won eight awards for Teaching Excellence, six awards for Programs that Enhance Learning, 42 Citations for Outstanding Contributions to Student Learning and seven National Teaching Fellowships. Three Griffith staff have been named the Prime Minister's Australian Teacher of the Year.
Research
Griffith researchers work in 38 centres and institutes, investigating areas such as water science, climate change adaptation, criminology and crime prevention, sustainable tourism and health and chronic disease.
The university's major research institutes include:
Advanced Design and Prototyping Technologies Institute (ADaPT)
Australian Rivers Institute
Cities Research Institute
Environmental Futures Research Institute
Griffith Asia Institute
Griffith Criminology Institute
Griffith Institute for Educational Research
Griffith Institute for Tourism
Institute for Glycomics
Institute for Integrated and Intelligent Systems
Menzies Health Institute Queensland (formerly the Griffith Health Institute)
Griffith Institute for Drug Discovery (GRIDD)
Additionally, Griffith hosts several externally supported centres and facilities, including:
Australian Institute for Suicide Research and Prevention
National Climate Change Adaptation Research Facility
Smart Water Research Centre
NHMRC Centre of Research Excellence in Nursing
Research commercialisation
Griffith offers research commercialisation and services for business, industry and government through Griffith Enterprise.
Other centres
As well as research centres and institutes, Griffith has a number of cultural and community focused organisations. These include the EcoCentre, which provides a space for environmental education activities, exhibitions, seminars and workshops; and the Centre for Interfaith & Cultural Dialogue (formerly the Multi-Faith Centre).
Recognised research
In 2021, a research team led by the university discovered a new type of tree frog in New Guinea which is commonly known as the "chocolate frog".
Student life
Student organisations
Griffith University has a wide array of cultural, intellectual, sporting and social groups. Its Student Guild takes care of these clubs on the Gold Coast campus, as well as student issues, accommodation, employment, publication, events, sport and recreation. On the Nathan campus, Campus Life supports many clubs including the long running GRUBS (Griffith University Bushwalking Club), the Karate and Kickboxing club and the Griffith University Aikido Club.
Uniquely, Griffith University students are represented by two statutory student organisations. The Griffith University Student Representative Council (GUSRC) represents undergraduate students and the Griffith University Postgraduate Students Association (GUPSA) represents postgraduate students on all campuses apart from the Gold Coast. GUPSA is a constituent member of the Council of Australian Postgraduate Associations. Unique to the Gold Coast is the Student Guild (GUSG), which represents all students on that campus and has an administrative structure independent of the university.
Griffith Honours College
The Griffith Honours College offers high-achieving students opportunities to enrich their university experience through mentoring, international experiences, leadership roles and community engagement activities.
Griffith Sports College
Students who are elite athletes are eligible to join the Griffith Sports College, which provides support by helping them balance sporting and university commitments.
GUMURRII Student Support Unit
The GUMURRII Student Support Unit (SSU) is the heart of Griffith's Aboriginal and Torres Strait Islander community and is located on each of Griffith's five campuses.
GUMURRII is a dedicated student support unit for Aboriginal and Torres Strait Islander students. Aboriginal and Torres Strait Islander staff assist students from recruitment and orientation through to graduation, providing support at both undergraduate and postgraduate levels.
Griffith College
Located on Griffith University's Mount Gravatt and Gold Coast campuses, Griffith College, formerly the Queensland Institute of Business and Technology, offers undergraduate diplomas in a range of areas, which provide a pathway into many of Griffith's degree programs.
Griffith English Language Institute
Students from non-English-speaking backgrounds can study English at the Griffith English Language Institute (GELI). A wide range of English language courses are available to help students improve their English for work, travel, study or everyday purposes.
Residential colleges
Griffith University has four residential colleges, with two located on its Nathan campus and one each on its Mt Gravatt and Gold Coast campuses. The three colleges located in Brisbane compete in the sporting Inter-College Cup, also known as the ICC. The premier event of the ICC is the Phar Cup, where both female and male teams compete in rugby league matches against each other. The colleges are as follows:
Bellenden Ker College, a.k.a. BK, is a co-educational college located on the Nathan Campus in the Toohey forest reserve.
KGBC, also known as "The Flats", consists of four co-educational undergraduate and postgraduate apartments on the Nathan Campus.
Mt Gravatt College, a.k.a. MG, is a co-educational college located on the Mt Gravatt Campus which itself sits on the hill for which the surrounding suburbs are named.
Griffith University Village is a collection of co-ed apartments on the Gold Coast Campus.
Safe Campuses initiative
Between 2011 and 2016 there were 46 officially reported cases of sexual abuse and harassment on campus released by the university, resulting in no expulsions and one six-month suspension, the highest reported figures in Queensland at the time. The 2017 Australian Human Rights Commission report on sexual assault and harassment found higher figures than those officially reported.
Following the release of the report, Griffith University established the Safe Campuses Taskforce. The Taskforce and its working parties are working to ensure Griffith's campuses provide safe, inclusive and respectful environments for all students and staff.
Alumni
Griffith has over 200,000 alumni. Notable graduates include journalists, musicians, actors, artists, filmmakers, photographers, athletes, activists and politicians in the Parliament of Australia and the Parliament of Queensland.
See also
List of universities in Australia
References
External links
Australian university rankings
Griffith University website
Griffith University Village
Griffith University Online
1971 establishments in Australia
APRA Award winners
Universities in Brisbane
Educational institutions established in 1971
Education on the Gold Coast, Queensland
Universities in Queensland
Schools in Queensland |
21876751 | https://en.wikipedia.org/wiki/Optical%20sorting | Optical sorting | Optical sorting (sometimes called digital sorting) is the automated process of sorting solid products using cameras and/or lasers.
Depending on the types of sensors used and the software-driven intelligence of the image processing system, optical sorters can recognize objects' color, size, shape, structural properties and chemical composition. The sorter compares objects to user-defined accept/reject criteria to identify and remove defective products and foreign material (FM) from the production line, or to separate product of different grades or types of materials.
Optical sorting achieves non-destructive, 100 percent inspection in-line at full production volumes.
Optical sorters are in widespread use in the food industry worldwide, with the highest adoption in the processing of harvested foods such as potatoes, fruits, vegetables and nuts. The technology is also used in pharmaceutical and nutraceutical manufacturing, tobacco processing, waste recycling and other industries. Compared to manual sorting, which is subjective and inconsistent, optical sorting helps improve product quality, maximize throughput and increase yields while reducing labor costs.
History
Optical sorting grew out of the desire to automate the industrial sorting of agricultural goods such as fruits and vegetables. Before automated optical sorting technology was conceived in the 1930s, companies like Unitec were producing wooden machinery to assist in the mechanical sorting of fruit. In 1931, a company known as "the Electric Sorting Company" was incorporated and began building the world's first color sorters, which were installed and in use in Michigan's bean industry by 1932. By 1937, optical sorting technology had advanced to allow systems based on a two-color principle of selection. The next few decades saw the installation of new and improved sorting mechanisms, such as gravity-feed systems, and the adoption of optical sorting in more agricultural industries.
In the late 1960s, optical sorting began to be applied in industries beyond agriculture, such as the sorting of ferrous and non-ferrous metals. By the 1990s, optical sorting was being used heavily in the sorting of solid wastes.
With the rapid technological advances of the late 1990s and early 2000s, optical sorters became more capable through the adoption of new optical sensors, such as CCD, UV, and IR cameras. Today, optical sorting is used in a wide variety of industries and is implemented with mechanisms tailored to each sorter's specific task.
The sorting system
In general, optical sorters feature four major components: the feed system, the optical system, image processing software, and the separation system. The objective of the feed system is to spread products into a uniform monolayer so products are presented to the optical system evenly, without clumps, at a constant velocity. The optical system includes lights and sensors housed above and/or below the flow of the objects being inspected. The image processing system compares objects to user-defined accept/reject thresholds to classify objects and actuate the separation system. The separation system — usually compressed air for small products and mechanical devices for larger products, like whole potatoes — pinpoints objects while in-air and deflects the objects to remove into a reject chute while the good product continues along its normal trajectory.
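The timing behind the separation system can be illustrated with a short calculation: once the optical system sees a defect, the controller must work out which ejector to fire and when, based on belt speed and the distance from the inspection line to the ejectors. The following Python sketch is purely illustrative; the belt speed, valve spacing and distances are assumptions, not figures from any particular sorter.

```python
# Illustrative sketch of separation-system timing on a belt sorter.
# A rejected object detected at the camera line is deflected by an air-jet
# valve a fixed distance downstream. All parameters below are assumptions.

BELT_SPEED_M_S = 2.5        # conveyor speed (assumed)
CAMERA_TO_VALVE_M = 0.40    # distance from inspection line to air jets (assumed)
VALVE_PITCH_M = 0.01        # lateral spacing between adjacent valves (assumed)

def schedule_ejection(object_x_m, detection_time_s):
    """Return (valve_index, fire_time_s) for a rejected object.

    object_x_m: lateral position of the object across the belt, in metres.
    detection_time_s: time at which the camera line saw the object.
    """
    travel_time_s = CAMERA_TO_VALVE_M / BELT_SPEED_M_S
    valve_index = round(object_x_m / VALVE_PITCH_M)
    return valve_index, detection_time_s + travel_time_s

# Example: a defect seen 12 cm across the belt at t = 3.20 s
valve, fire_at = schedule_ejection(0.12, 3.20)
print(valve, fire_at)   # valve 12, firing about 0.16 s after detection
```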
The ideal sorter to use depends on the application. Therefore, the product's characteristics and the user's objectives determine the ideal sensors, software-driven capabilities and mechanical platform.
Sensors
Optical sorters require a combination of lights and sensors to illuminate and capture images of the objects so the images can be processed. The processed images will determine if the material should be accepted or rejected.
There are camera sorters, laser sorters and sorters that feature a combination of the two on one platform. Lights, cameras, lasers and laser sensors can be designed to function within visible light wavelengths as well as the infrared (IR) and ultraviolet (UV) spectrums. The optimal wavelengths for each application maximize the contrast between the objects to be separated. Cameras and laser sensors can differ in spatial resolution, with higher resolutions enabling the sorter to detect and remove smaller defects.
Cameras
Monochromatic cameras detect shades of gray from black to white and can be effective when sorting products with high-contrast defects.
Sophisticated color cameras with high color resolution are capable of detecting millions of colors to better distinguish more subtle color defects. Trichromatic color cameras (also called three-channel cameras) divide light into three bands, which can include red, green and/or blue within the visible spectrum as well as IR and UV.
Coupled with intelligent software, sorters that feature cameras are capable of recognizing each object's color, size and shape; as well as the color, size, shape and location of a defect on a product. Some intelligent sorters even allow the user to define a defective product based on the total defective surface area of any given object.
Lasers
While cameras capture product information based primarily on material reflectance, lasers and their sensors are able to distinguish a material's structural properties along with their color. This structural property inspection allows lasers to detect a wide range of organic and inorganic foreign material such as insects, glass, metal, sticks, rocks and plastic; even if they are the same color as the good product.
Lasers can be designed to operate within specific wavelengths of light; whether on the visible spectrum or beyond. For example, lasers can detect chlorophyll by stimulating fluorescence using specific wavelengths; which is a process that is very effective for removing foreign material from green vegetables.
Camera/laser combinations
Sorters equipped with cameras and lasers on one platform are generally capable of identifying the widest variety of attributes. Cameras are often better at recognizing color, size and shape while laser sensors identify differences in structural properties to maximize foreign material detection and removal.
Hyperspectral Imaging
Driven by the need to solve previously impossible sorting challenges, a new generation of sorters featuring multispectral and hyperspectral imaging systems is being developed.
Like trichromatic cameras, multispectral and hyperspectral cameras collect data from the electromagnetic spectrum. Unlike trichromatic cameras, which divide light into three bands, hyperspectral systems can divide light into hundreds of narrow bands over a continuous range that covers a vast portion of the electromagnetic spectrum. Compared to the three data points per pixel collected by trichromatic cameras, hyperspectral cameras can collect hundreds of data points per pixel, which are combined to create a unique spectral signature (also called a fingerprint) for each object. When complemented by capable software intelligence, a hyperspectral sorter processes those fingerprints to enable sorting on the chemical composition of the product. This is an emerging area of chemometrics.
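One common way to compare such spectral fingerprints is to measure the angle between a pixel's spectrum and a reference spectrum, accepting pixels whose angle falls below a threshold. The sketch below illustrates the idea with NumPy on synthetic data; the band count, threshold and spectra are invented for the example and do not describe any specific sorter.

```python
# Illustrative classification of hyperspectral pixels by spectral angle:
# each pixel's spectrum ("fingerprint") is compared against a reference
# spectrum, and pixels within a small angle are accepted. The band count,
# threshold and synthetic data are assumptions made for this example.
import numpy as np

def spectral_angle(pixel, reference):
    """Angle in radians between two spectra; smaller means more similar."""
    cos_sim = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_sim, -1.0, 1.0))

def match_reference(cube, reference, max_angle=0.10):
    """cube: (rows, cols, bands) hyperspectral image.
    Returns a boolean mask of pixels whose spectra match the reference."""
    flat = cube.reshape(-1, cube.shape[-1])
    angles = np.array([spectral_angle(p, reference) for p in flat])
    return (angles <= max_angle).reshape(cube.shape[:2])

# Synthetic example: a 4x4 image with 100 spectral bands, where one pixel
# is a scaled copy of the reference spectrum and should be the only match.
rng = np.random.default_rng(0)
reference = rng.random(100)
cube = rng.random((4, 4, 100))
cube[0, 0] = reference * 0.9
print(match_reference(cube, reference))   # True only at pixel (0, 0)
```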
Software-driven intelligence
Once the sensors capture the object's response to the energy source, image processing is used to manipulate the raw data. The image processing extracts and categorizes information about specific features. The user then defines accept/reject thresholds that are used to determine what is good and bad in the raw data flow. The art and science of image processing lies in developing algorithms that maximize the effectiveness of the sorter while presenting a simple user-interface to the operator.
Object-based recognition is a classic example of software-driven intelligence. It allows the user to define a defective product based on where a defect lies on the product and/or the total defective surface area of an object. It offers more control in defining a wider range of defective products. When used to control the sorter's ejection system, it can improve the accuracy of ejecting defective products. This improves product quality and increases yields.
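A minimal version of the surface-area criterion can be expressed as a ratio of defective pixels to object pixels. The Python sketch below assumes the image processing stage has already produced boolean masks for the object and its defects; the 5 percent threshold is an illustrative user setting, not a recommended value.

```python
# Minimal sketch of object-based recognition: reject an object when the
# defective fraction of its visible surface exceeds a user-defined threshold.
# Masks are boolean NumPy arrays; the 5% threshold is illustrative.
import numpy as np

def reject_object(object_mask, defect_mask, max_defect_fraction=0.05):
    """object_mask: pixels belonging to the object.
    defect_mask: pixels flagged as defective anywhere in the image."""
    object_area = object_mask.sum()
    if object_area == 0:
        return False
    defect_area = np.logical_and(object_mask, defect_mask).sum()
    return defect_area / object_area > max_defect_fraction

# Example: a 10x10 object with 8 defective pixels (8% defective -> rejected)
obj = np.zeros((20, 20), dtype=bool)
obj[5:15, 5:15] = True
defects = np.zeros_like(obj)
defects[6, 6:14] = True
print(reject_object(obj, defects))   # True
```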
New software-driven capabilities are constantly being developed to address the specific needs of various applications. As computing hardware becomes more powerful, new software-driven advancements become possible. Some of these advancements enhance the effectiveness of sorters to achieve better results while others enable completely new sorting decisions to be made.
Platforms
The considerations that determine the ideal platform for a specific application include the nature of the product – large or small, wet or dry, fragile or unbreakable, round or easy to stabilize – and the user's objectives. In general, products smaller than a grain of rice and as large as whole potatoes can be sorted. Throughputs range from less than 2 metric tons of product per hour on low-capacity sorters to more than 35 metric tons of product per hour on high-capacity sorters.
Channel sorters
The simplest optical sorters are channel sorters, a type of color sorter that can be effective for products that are small, hard, and dry with a consistent size and shape, such as rice and seeds. For these products, channel sorters offer an affordable solution and ease of use with a small footprint. Channel sorters feature monochromatic or color cameras and remove defects and foreign material based only on differences in color.
For products that cannot be handled by a channel sorter – such as soft, wet, or nonhomogeneous products – and for processors that want more control over the quality of their product, freefall sorters (also called waterfall or gravity-fed sorters), chute-fed sorters or belt sorters are more suitable. These more sophisticated sorters often feature advanced cameras and/or lasers that, when complemented by capable software intelligence, detect objects' size, shape, color, structural properties, and chemical composition.
Freefall and chute-fed sorters
As the names imply, freefall sorters inspect product in-air during the freefall and chute-fed sorters stabilize product on a chute prior to in-air inspection. The major advantages of freefall and chute-fed sorters, compared to belt sorters, are a lower price point and lower maintenance. These sorters are often most suitable for nuts and berries as well as frozen and dried fruits, vegetables, potato strips and seafood, in addition to waste recycling applications that require mid-volume throughputs.
Belt sorters
Belt sorting platforms are often preferred for higher capacity applications such as vegetable and potato products prior to canning, freezing or drying. The products are often stabilized on a conveyor belt prior to inspection. Some belt sorters inspect products from above the belt, while other sorters also send products off the belt for an in-air inspection. These sorters can be designed for traditional two-way sorting, or for three-way sorting when equipped with two ejector systems and three outfeed streams.
ADR systems
A fifth type of sorting platform, called an automated defect removal (ADR) system, is specifically for potato strips (French fries). Unlike other sorters that eject products with defects from the production line, ADR systems identify defects and actually cut the defects from the strips. The combination of an ADR system followed by a mechanical nubbin grader is another type of optical sorting system because it uses optical sensors to identify and remove defects.
Single-file inspection systems
The platforms described above all operate with materials in bulk; meaning they do not need the materials to be in a single-file to be inspected. In contrast, a sixth type of platform, used in the pharmaceutical industry, is a single-file optical inspection system. These sorters are effective in removing foreign objects based on differences in size, shape and color. They are not as popular as the other platforms due to decreased efficiency.
Mechanical graders
For products that require sorting only by size, mechanical grading systems are used because sensors and image processing software are not necessary. These mechanical grading systems are sometimes referred to as sorting systems, but should not be confused with optical sorters that feature sensors and image processing systems.
Practical usage
Waste and recycling
Optical sorting machines can be used to identify and discard manufacturing waste, such as metals, drywall, cardboard, and various plastics. In the metal industry, optical sorting machines are used to discard plastics, glass, wood, and unwanted metals. The plastics industry uses optical sorting machines not only to discard materials like those listed, but also to separate different types of plastics. Optical sorting machines distinguish plastics by resin type, including HDPE, PVC, PLA, PE, and others.
Optical sorting also aids in recycling since the discarded materials are stored in bins. Once a bin is full of a given material, it can be sent to the appropriate recycling facility. Optical sorting machines’ ability to distinguish between resin types also aids in the process of recycling plastics because there are different methods used for each plastic type.
Food and drink
In the coffee industry, optical sorting machines are used to identify and remove underdeveloped coffee beans called quakers; quakers are beans that contain mostly carbohydrates and sugars. A more accurate calibration offers a lower total number of defective products. Some coffee companies like Counter Culture use these machines in addition to pre-existing sorting methods in order to create a better tasting cup of coffee. One limitation is that someone has to program these machines by hand to identify defective products.
However, the technology is not limited to coffee beans; food items such as mustard seeds, fruits, wheat, and hemp can all be processed through optical sorting machines.
In the wine manufacturing process, grapes and berries are sorted much like coffee beans. Grape sorting is used to ensure that no unripe or green parts of the plant are involved in the wine making process. In the past, manual sorting via sorting tables was used to separate defective grapes from sound ones. Now, mechanical harvesting provides a higher effectiveness rate compared to manual sorting. At different points in the line, materials are sorted out via several optical sorting machines, each looking for materials of differing shapes and sizes.
The berries or grapes can then be sorted accordingly using a camera, a laser, or a form of LED technology with regard to the shape and form of the given fruit. The sorting machine then discards any unnecessary elements.
See also
Food grading
Food safety
Food technology
Recycling#Sorting
References
Industrial machinery
Food processing
Image processing
Recycling
Applications of computer vision |
3538953 | https://en.wikipedia.org/wiki/Sebastian%20Thrun | Sebastian Thrun | Sebastian Thrun (born May 14, 1967) is a German-American entrepreneur, educator, and computer scientist. He is CEO of Kitty Hawk Corporation, and chairman and co-founder of Udacity. Before that, he was a Google VP and Fellow, a Professor of Computer Science at Stanford University, and before that at Carnegie Mellon University. At Google, he founded Google X and Google's self-driving car team. He is also an Adjunct Professor at Stanford University and at Georgia Tech.
Thrun led development of the robotic vehicle Stanley which won the 2005 DARPA Grand Challenge, and which has since been placed on exhibit in the Smithsonian Institution's National Museum of American History. His team also developed a vehicle called Junior, which placed second at the DARPA Grand Challenge in 2007. Thrun led the development of the Google self-driving car.
Thrun is also known for his work on probabilistic algorithms for robotics with applications including robotic mapping. In recognition of his contributions, and at the age of 39, he was elected into the National Academy of Engineering and also into the Academy of Sciences Leopoldina in 2007. The Guardian recognized him as one of 20 "fighters for internet freedom".
Early life and education
Thrun was born in 1967 in Solingen, Germany (former West Germany), the son of Winfried and Kristin (Grüner) Thrun. He completed his Vordiplom (intermediate examination) in computer science, economics, and medicine at the University of Hildesheim in 1988. At the University of Bonn, he completed a Diplom (first degree) in 1993 and a Ph.D. (summa cum laude) in 1995 in computer science and statistics.
Career and research
In 1995 he joined the Computer Science Department at Carnegie Mellon University (CMU) as a research computer scientist. In 1998 he became an assistant professor and co-director of the Robot Learning Laboratory at CMU. As a faculty member at CMU, he co-founded the Master's Program in Automated Learning and Discovery, which later would become a Ph.D. program in the broad area of machine learning and scientific discovery. In 2001 Thrun spent a sabbatical year at Stanford University. He returned to CMU to an endowed professorship, the Finmeccanica Associate Professor of Computer Science and Robotics.
Thrun left CMU in July 2003 to become an associate professor at Stanford University and was appointed as the director of SAIL in January 2004. From 2007–2011, Thrun was a full professor of computer science and electrical engineering at Stanford. On April 1, 2011, Thrun relinquished his tenure at Stanford to join Google as a Google Fellow. On January 23, 2012, he co-founded an online private educational organization, Udacity. He was a Google VP and Fellow, and worked on development of the Google driverless car system. Thrun was interviewed in the 2018 documentary on artificial intelligence Do You Trust This Computer?.
Robotics
Thrun developed a number of autonomous robotic systems that earned him international recognition. In 1994, he started the University of Bonn's Rhino project together with his doctoral thesis advisor Armin B. Cremers. In 1997, Thrun and his colleagues Wolfram Burgard and Dieter Fox developed the world's first robotic tour guide for the Deutsches Museum Bonn. In 1998, the follow-up robot "Minerva" was installed in the Smithsonian's National Museum of American History in Washington, DC, where it guided tens of thousands of visitors during a two-week deployment period. Thrun went on to found the CMU/Pitt Nursebot project, which fielded an interactive humanoid robot in a nursing home near Pittsburgh, Pennsylvania. In 2002, Thrun helped develop mine mapping robots in a project with his colleagues William L. Whittaker and Scott Thayer, research professors at Carnegie Mellon University. After his move to Stanford University in 2003, he engaged in the development of the robot Stanley, which in 2005 won the DARPA Grand Challenge. His former graduate student Michael Montemerlo, who was co-advised by William L. Whittaker, led the software development for this robot. Thrun's robot "Junior" won second place in the 2007 DARPA Urban Challenge. Thrun joined Google as part of a sabbatical, together with several Stanford students. At Google, he co-developed Google Street View.
Thrun's best-known contributions to robotics are theoretical. He contributed to the area of probabilistic robotics, a field that marries statistics and robotics. He and his research group made substantial contributions in areas of mobile robot localization, mapping (SLAM), and control. Probabilistic techniques have since become mainstream in robotics, and are used in numerous commercial applications. In the fall of 2005, Thrun published a textbook entitled Probabilistic Robotics together with his long-term co-workers Dieter Fox and Wolfram Burgard. Since 2007, a Japanese translation of Probabilistic Robotics has been available on the Japanese market.
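The flavour of these probabilistic techniques can be conveyed with a toy particle filter of the kind used for Monte Carlo localization, one of the methods treated in the probabilistic robotics literature. The one-dimensional world, noise levels and sensor model below are invented for illustration and are not taken from Thrun's publications.

```python
# Toy particle filter in the spirit of Monte Carlo localization, one of the
# probabilistic techniques described in the probabilistic-robotics literature.
# The 1-D world, noise levels and sensor model are invented for illustration.
import math
import random

def mcl_step(particles, control, measurement, sense_fn,
             motion_noise=0.1, measurement_noise=0.5):
    # 1. Motion update: move every particle by the control input, plus noise.
    moved = [p + control + random.gauss(0, motion_noise) for p in particles]
    # 2. Measurement update: weight particles by how well they explain the reading.
    weights = []
    for p in moved:
        error = measurement - sense_fn(p)
        weights.append(math.exp(-(error ** 2) / (2 * measurement_noise ** 2)))
    total = sum(weights)
    if total == 0:
        return moved            # degenerate case: keep the moved particles
    weights = [w / total for w in weights]
    # 3. Resampling: draw a new particle set in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(particles))

# Example: a robot on a line senses its distance to a wall at x = 10
# and moves 1 m per step; the particle cloud converges on the true position.
sense = lambda x: 10.0 - x
true_x = 2.0
particles = [random.uniform(0, 10) for _ in range(500)]
for _ in range(5):
    true_x += 1.0
    particles = mcl_step(particles, control=1.0,
                         measurement=sense(true_x), sense_fn=sense)
print(round(sum(particles) / len(particles), 2))   # close to true_x = 7.0
```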
Thrun is one of the principal investors of the Stanford spin-off VectorMagic.
Awards
Named one of Brilliant 5 by Popular Science in 2005
CAREER award from the National Science Foundation, 1999—2003
Olympus award, German Society for Pattern Recognition, 2001
Fast Company: Fifth most creative person in 2011
Ranked No. 4 on Foreign Policy magazine's Top 100 Global Thinkers of 2012
Max-Planck-Research Award, 2011
Inaugural AAAI Ed Feigenbaum Prize
Thrun was the 2012 recipient of Smithsonian magazine's American Ingenuity Award in the Education category.
Fellow of the European Association for Artificial Intelligence (EurAI)
References
External links
1967 births
Living people
Artificial intelligence researchers
German computer scientists
Machine learning researchers
German roboticists
People from Solingen
American investors
University of Bonn alumni
Stanford University School of Engineering faculty
West German expatriates in the United States
Carnegie Mellon University faculty
Google employees
Google Fellows
Fellows of the Association for the Advancement of Artificial Intelligence
Fellows of the European Association for Artificial Intelligence |
23597211 | https://en.wikipedia.org/wiki/Director%20of%20the%20Cybersecurity%20and%20Infrastructure%20Security%20Agency | Director of the Cybersecurity and Infrastructure Security Agency | The Director of the Cybersecurity and Infrastructure Security Agency is a high level civilian official in the United States Department of Homeland Security. The Director, as head of Cybersecurity and Infrastructure Security Agency at DHS, is the principal staff assistant and adviser to both the Secretary of Homeland Security and the Deputy Secretary of Homeland Security for all DHS programs designed to reduce the nation's risk to terrorism and natural disasters. The Director is appointed from civilian life by the President with the consent of the Senate to serve at the pleasure of the President.
The position was created in November 2018, replacing the position of Under Secretary of Homeland Security for National Protection and Programs.
Overview
The Director of the Cybersecurity and Infrastructure Security Agency is responsible for directing all of the Department of Homeland Security's integrated efforts to reduce the risk of terrorism and natural disasters to the Nation's physical, cyber and communications infrastructure.
The Director is a Level III position within the Executive Schedule. Since January 2010, the annual rate of pay for Level III is $165,300.
Directors
Reporting Officials
Officials reporting to the Director of the CISA include:
Deputy Director of the CISA - Nitin Natarajan
Executive Assistant Director for Cybersecurity Division (CSD) - Eric Goldstein
Executive Assistant Director for Infrastructure Security Division (ISD) - Dr. David Mussington
Executive Assistant Director for Emergency Communications (ECD) - Billy Bob Brown, Jr.
Assistant Director for National Risk Management Center (NRMC) - Bob Kolasky
Assistant Director for Integrated Operations Division (IOD) - Rick Driggers
Assistant Director for Stakeholder Engagement Division (SED) - Alaina Clark
Director of the Federal Protective Service - L. Eric Patterson
References |
11059950 | https://en.wikipedia.org/wiki/List%20of%20CDMA%20terminology | List of CDMA terminology |
This article contains terminology related to CDMA International Roaming.
#
1x – See 1xRTT
1xEV-DO – cdma2000 Evolution, Data Optimized
1xRTT – cdma2000 Radio Transmission Technology
2G Authentication – See CAVE-based Authentication
3G Authentication – See AKA
3GPP2 – Third Generation Partnership Project 2
A
A12 Authentication
AAA – Authentication, Authorization, and Accounting
AC – Authentication Center – See CAVE-based Authentication
Access Authentication
Acquisition_Table – See PRL
Active Pilot – base station(s) currently serving a call. A base station usually has 3 pilot numbers. Also See PN Offset.
AKA – Authentication and Key Agreement
A-key – Authentication Key – See CAVE-based Authentication
AMPS – Advanced Mobile Phone System
AN – Access Network
ANI – Automatic Number Identification
ANID – Access Network Identifiers
ANSI – American National Standards Institute
ANSI-41 – See IS-41
ARP – Authorized Receipt Point
ARPU – Average revenue per user
AT – Access Terminal
Authentication
Authorization
Automatic Call Delivery
Automatic Roaming
Autonomous Registration
B
Band
Bandclass
BID – Billing Identification
Bilateral Roaming
BILLID – BillingID
Border System
C
Call Disconnect
Caller ID
Call Release
Call termination
Carrier
CAVE – Cellular Authentication and Voice Encryption
CAVE-based Authentication
CDG – CDMA Development Group
CDMA – Code Division Multiple Access
CDR – Call Detail Record
Cell site
CIBER – Cellular Intercarrier Billing Exchange Roamer
Cibernet
CHAP – Challenge-Handshake Authentication Protocol (used with HDR – High Data Rate)
Clearing
Clearinghouse
CLI – Calling Line Identification – See Caller ID
CLIP – Calling Line Identification Presentation – See Caller ID
CLLI – Common Language Location Identifier
Clone
Closed PRL – See PRL
CoA – Care-of-Address – See Mobile IP
CND – Caller Number Display – See Caller ID
CNID – Calling Number Identification – See Caller ID
CRX – CDMA Packet Data Roaming Exchange
CSCF – Call Session Control Function – See IMS
CTIA – Cellular Telecommunications & Internet Association
D
D-AMPS – Digital Advanced Mobile Phone System
DES – Data Encryption Standard
Diameter
DO – See 1xEV-DO
DRRR – Direct Routing for Roamer to Roamer
Dual-mode handset (i.e. dual-mode mobile phones)
E
eHRPD – Enhanced HRPD
EDI – Electronic Data Interchange
EDT – Electronic Data Transfer
Encryption
Enhanced PRL
ERI – Enhanced Roaming Indicator – See Roaming Indicator
ESA – Enhanced Subscriber Authentication – See AKA
ESN – Electronic Serial Number
ESPM – Extended System Parameters Message
EV-DO – See 1xEV-DO
F
FA – Foreign Agent – See Mobile IP
FCC – U.S. Federal Communications Commission
Financial Settlement
FOTA – Firmware Over-the-Air – See OTA
Frequency Block
G
Global Challenge – See CAVE-based Authentication
Global Title
GTT – Global Title Translation
H
HA – Home Agent – See Mobile IP
Handoff (data)
Handoff (voice)
HLR – Home Location Register
Home Address – See Mobile IP
Home System
HNI – Home Network Identifier – See IMSI
Home SID/NID List
HRPD – High Rate Packet Data – See 1xEV-DO
HRPD Session
HSS – Home Subscriber Server – See IMS
Hybrid Device
I
ICCID – Integrated Circuit Card Identifier (SIM card number)
IETF – Internet Engineering Task Force
IFAST – International Forum on ANSI-41 Standards Technology
IIF – Interworking and Interoperability Function
IMEI – International Mobile Equipment Identity
IMS – IP Multimedia Subsystem
IMSI – International Mobile Subscriber Identity
IMSI 11 12 – Same as MNC (Mobile Network Code)
IMSI S – Short IMSI, Mobile Identification Number
Inbound Roamer
Industry Organizations
INF – Industry Negative File
Interconnection
Inter standard roaming
IRM – International roaming MIN
IS-2000 – Superseded by TIA-2000
IS-41 – Superseded by TIA-41
IS-835
IS-856 – Superseded by TIA-856
IS-95
ISG – International Signaling Gateway
ISUP – Integrated Services User Part
ITU – International Telecommunication Union
J
J-STD-038
K
Key
L
L2TP – Layer 2 Tunneling Protocol
LAC – L2TP Access Concentrator – See L2TP
Line Range
LNS – L2TP Network Server – See L2TP
M
MABEL – Major Account Billing Exchange Logistical
Main Service Instance – See Service Instance
MAP – Mobile Application Part – See TIA-41
MBI – MIN Block Identifier
MC – Message Center – See SMS
MCC – Mobile Country Code
MDN – Mobile Directory Number
ME – Mobile Equipment
MEID – Mobile Equipment Identifier
MIN – Mobile Identification Number
MIP – Mobile IP – See Mobile IP
MMD – Multimedia Domain
MMS – Multimedia Messaging Service
MN – Mobile Node
MNC – Mobile Network Code
MN ID – Mobile Node Identifier – See A12 Authentication
Mobile IP
MS – Mobile Station
MSC – Mobile Switching Center
MSCID – Mobile Switching Center Identification
MSCIN – MSC Identification Number
MSID – Mobile Station Identity
MSIN – Mobile Subscription Identification Number, same as MIN
MSISDN – Mobile Station Integrated Services Digital Network Number
MSL – Master Subsidy Lock
MTSO – Mobile Telephone Switching Office – See MSC
Multi-Band Handset
Multi-Mode Handset
N
NAI – Network Access Identifier
NAM – Number Assignment Module
NANP – North American Numbering Plan
Negative System – See PRL
Net Settlement
NID – Network Identification Number
NMSI – National Mobile Station Identity
NMSID – National Mobile Station IDentity, Same as NMSI
NPA-NXX – See NANP
O
OMA – Open Mobile Alliance
Open PRL – See PRL
OTAPA – Over The Air Parameter Administration
OTASP – Over The Air Service Provisioning
OTA – Over-The-Air Programming
Outbound Roamer
P
PAP – Password Authentication Protocol
Packet Data Service
Packet Data Service Option
Packet Data Session
PCS – Personal Communications Services
PDSN – Packet Data Serving Node
Permissive Mode – See PRL
PIN – Personal Identification Number – See RVR
Plus Code Dialing
PN Offset – Identifies a base station. A base station usually has 3 pilot numbers. Also See Active Pilot.
Point of Attachment – See Mobile IP
PPP – Point-to-Point Protocol
PPP Service Instance – See Service Instance
PPP Session – Point-to-Point Protocol Session
Preferred System – See PRL
PRL – Preferred Roaming List
Profiling
PUZL – Preferred User Zone List
PZID – Packet Zone Identification
Q
R
RADIUS – Remote Authentication Dial In User Service
RADIUS Server – See AAA
RAN – Radio Access Network – See AN
Restrictive Mode – See PRL
RFC – Request For Comments
RN – Radio Network – See AN
RoamEx
Roaming
Roaming Agreement
Roaming Indicator
RSP – Roaming Service Provider
RUIM – Removable User Identity Module
RVR – Roamer Verification and Reinstatement
S
Sector ID
Service Instance
Service Option
Serving System
SID – System ID
SID/NID Lockout List
SIP – Session Initiation Protocol
SMS – Short Message Service
SMSC – Short Message Service Centre – See SMS
Soft Handoff
SO33 – Service Option 33 – See Service Option
SO59 – Service Option 59 – See Service Option
SPC – Service Programming Code, same as MSL (Master Subsidy Lock)
SPASM – Subscriber Parameter Administration Security Mechanism
Subnet ID
Supplementary Services
SSPR – System Selection for Preferred Roaming
System table – See PRL
T
TDS – Technical Data Sheet
Telcordia
TIA – Telecommunications Industry Association
TIA-2000
TIA-41 – Cellular Radio-Telecommunications Intersystem Operations
TIA-856
TIA-878
TLDN – Temporary Local Directory Number
TMSI – Temporary Mobile Station Identity
Triple DES – Triple Data Encryption Standard
Trading Partner Agreements
U
UDR – Usage Data Records
UIM – User Identity Module – See RUIM
UIMID – UIM Identifier – See RUIM
Unique Challenge – See CAVE-based Authentication
V
Verification – See RVR
Visited System
VLR – Visitor Location Register
W
WIN – Wireless Intelligent Network
WCDMA – Wideband Code Division Multiple Access
X
X0 Records – See CIBER
X2 Records – See CIBER
Y
Z
External links
CDMA Development Group (CDG)
International Roaming, CDMA Development Group (CDG)
Code division multiple access |
17139861 | https://en.wikipedia.org/wiki/Anki%20%28software%29 | Anki (software) | Anki (/ˈɒŋkiː/; Japanese: [aŋki]) is a free and open-source flashcard program using spaced repetition, a technique from cognitive science for fast and long-lasting memorization. "Anki" () is the Japanese word for "memorization".
The SM-2 algorithm, created for SuperMemo in the late 1980s, forms the basis of the spaced repetition methods employed in the program. Anki's implementation of the algorithm has been modified to allow priorities on cards and to show flashcards in order of their urgency.
The cards are presented using HTML and may include text, images, sounds, videos, and LaTeX equations. The decks of cards, along with the user's statistics, are stored in the open SQLite format.
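Because the collection is an ordinary SQLite database, it can be inspected with standard tools. The sketch below uses Python's built-in sqlite3 module; the file name and the "cards" and "notes" table names reflect commonly documented Anki collection files, but the schema is not a stable public API and may differ between versions.

```python
# Minimal sketch of reading an Anki collection with the standard-library
# sqlite3 module. Table names ("cards", "notes") and the file name are
# assumptions based on commonly documented collection files, not a stable API.
import sqlite3

def collection_summary(path="collection.anki2"):
    with sqlite3.connect(path) as conn:
        (n_cards,) = conn.execute("SELECT count(*) FROM cards").fetchone()
        (n_notes,) = conn.execute("SELECT count(*) FROM notes").fetchone()
        return {"cards": n_cards, "notes": n_notes}

if __name__ == "__main__":
    print(collection_summary())
```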
Features
Notes
Cards are generated from information stored as "notes". Notes are analogous to database entries and can have an arbitrary number of fields. For example, with respect to learning a language, a note may have the following fields and example entries:
Field 1: Expression in target language –
Field 2: Pronunciation – [sound file with the word pronounced]
Field 3: Meaning of expression in familiar language – "cake"
This example illustrates what some programs call a three-sided flashcard, but Anki's model is more general and allows any number of fields to be combined in various cards.
The user can design cards that test the information contained in each note. One card may have a question (expression) and an answer (pronunciation, meaning).
By keeping the separate cards linked to the same fact, spelling mistakes can be adjusted against all cards at the same time, and Anki can ensure that related cards are not shown in too short a spacing.
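The relationship between notes and cards can be pictured as a set of templates applied to one record of fields. The following Python sketch is a deliberately simplified illustration; the field names, template names and formatting syntax are invented for the example and are much simpler than Anki's real HTML-based card templates.

```python
# Simplified illustration of generating several cards from one note.
# Field and template names are invented for this example; Anki's real
# templates are HTML with {{Field}} substitutions and are far richer.

note = {
    "Expression": "ケーキ",
    "Pronunciation": "keeki",
    "Meaning": "cake",
}

card_templates = [
    ("Recognition", "{Expression}", "{Pronunciation} — {Meaning}"),
    ("Recall", "{Meaning}", "{Expression} ({Pronunciation})"),
]

def generate_cards(note, templates):
    cards = []
    for name, front, back in templates:
        cards.append({
            "template": name,
            "front": front.format(**note),
            "back": back.format(**note),
        })
    return cards

for card in generate_cards(note, card_templates):
    print(card["template"], "|", card["front"], "->", card["back"])
```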
A special note type allows generation of cloze deletion cards (in Anki 1.2.x, those were ordinary cards with cloze markup added using a tool in the fact editor).
Syncing
Anki supports synchronization with a free (but proprietary) online service called AnkiWeb. This allows users to keep decks synchronized across multiple computers and to study online or on a cell phone.
There also is a third-party open-source (AGPLv3) AnkiWeb alternative, called ankisyncd, which users can run on their own local computers or servers.
Japanese and Chinese reading generation
Anki can automatically fill in the reading of Japanese and Chinese text. Since version 0.9.9.8.2, these features are in separate plug-ins.
Add-ons
More than 750 add-ons for Anki are available, often written by third-party developers. They provide support for speech synthesis, enhanced user statistics, image occlusion, incremental reading, more efficient editing and creation of cards through batch editing, modifying the GUI, simplifying import of flashcards from other digital sources, adding an element of gamification, etc.
Shared decks
While Anki's user manual encourages the creation of one's own decks for most material, there is still a large and active database of shared decks that users can download and use. Available decks range from foreign-language decks (often constructed with frequency tables) to geography, physics, biology, chemistry and more. Various medical science decks, often made by multiple users in collaboration, are also available.
Comparisons
Anki's current scheduling algorithm is derived from SM-2 (an older version of the SuperMemo algorithm), though the algorithm has been significantly changed from SM-2 and is also far more configurable. One of the most apparent differences is that while SuperMemo provides users a 6-point grading system (0 through 5, inclusive), Anki only provides at most 4 grades (again, hard, good, and easy). Anki also has significantly changed how review intervals grow and shrink (making many of these aspects of the scheduler configurable through deck options), though the core algorithm is still based on SM-2's concept of ease factors as the primary mechanism of evolving card review intervals.
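For readers unfamiliar with SM-2, the core idea is that each card carries an ease factor that stretches or shrinks its review interval according to the grade given at each review. The Python sketch below follows the published SM-2 description (0–5 grades, intervals of 1 and 6 days before multiplying by the ease factor); as noted above, Anki's actual scheduler departs from this in many ways, so this is an illustration of the principle rather than Anki's behaviour.

```python
# Simplified sketch of an SM-2-style review update (the algorithm SuperMemo
# published in the late 1980s, on which Anki's scheduler is based). Anki's
# real scheduler modifies this considerably, so treat this as an
# illustration of the idea, not Anki's exact behaviour.

def sm2_update(quality, repetitions, interval_days, ease_factor):
    """quality: 0-5 self-assessed grade; returns the updated card state."""
    if quality < 3:
        # Failed review: relearn from the start, keep the ease factor.
        return 0, 1, ease_factor

    if repetitions == 0:
        interval_days = 1
    elif repetitions == 1:
        interval_days = 6
    else:
        interval_days = round(interval_days * ease_factor)

    ease_factor += 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02)
    ease_factor = max(1.3, ease_factor)
    return repetitions + 1, interval_days, ease_factor

# Example: a new card answered "good" (4) on three consecutive reviews.
state = (0, 0, 2.5)
for _ in range(3):
    state = sm2_update(4, *state)
    print(state)   # intervals grow roughly 1 -> 6 -> 15 days
```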
Anki was originally based on the SM-5 algorithm, but the implementation was found to have seemingly incorrect behaviour (harder cards would have their intervals grow more quickly than easier cards in certain circumstances) leading the authors to switch Anki's algorithm to SM-2 (which was further evolved into the modern Anki algorithm). At the time, this led Elmes to claim that SM-5 and later algorithms were flawed which was strongly rebutted by Piotr Woźniak, the author of SuperMemo. Since then, Elmes has clarified that it is possible that the flaw was due to a bug in their implementation of SM-5 (the SuperMemo website does not describe SM-5 in complete detail), but added that due to licensing requirements Anki will not use any newer versions of the SuperMemo algorithm. The latest SuperMemo algorithm in 2019 is SM-18.
Some Anki users who have experimented with the Anki algorithm and its settings have published configuration recommendations, made add-ons to modify Anki's algorithm, or developed their own separate software.
Mobile versions
The following smartphone/tablet and Web clients are available as companions to the desktop version:
AnkiMobile for iPhone, iPod touch or iPad (paid)
AnkiWeb (online server, free to use; includes add-on and deck hosting)
AnkiDroid for Android (free of charge, under GPLv3; by Nicolas Raoul)
The flashcards and learning progress can be synchronized both ways with Anki using AnkiWeb. With AnkiDroid it is possible to have the flashcards read in several languages using text-to-speech (TTS). If a language does not exist in the Android TTS engine (e.g. Russian in the Android version Ice Cream Sandwich), a different TTS engine such as SVOX TTS Classic can be used.
History
The oldest mention of Anki that the developer Damien Elmes could find in 2011 was dated 5 October 2006, which was thus declared Anki's birthdate.
Version 2.0 was released on 6 October 2012.
Version 2.1 was released on 6 August 2018.
Utility
While Anki may primarily be used for language learning or a classroom setting, many have reported other uses for Anki: scientist Michael Nielsen is using it to remember complex topics in a fast-moving field, others are using it to remember memorable quotes, the faces of business partners or medical residents, or to remember business interviewing strategies.
In 2010, Roger Craig obtained the then-all-time record for single-day winnings on the quiz show Jeopardy! after using Anki to memorize a vast number of facts.
Medical education
Anki is quickly becoming an important resource for many medical students in the US. A study in 2015 at Washington University School of Medicine found that 31% of students who responded to a medical education survey reported using Anki as a study resource. The same study found a positive relationship between the number of unique Anki cards studied and USMLE Step 1 scores in a multi-variate analysis. Some third-party resources, such as Boards and Beyond, have Anki decks based on them.
Copera Inc.'s Anki for Palm OS
An unrelated flashcard program called Anki for Palm OS was created by Copera, Inc. (formerly known as Cooperative Computers, Inc.) and released at the PalmSource conference in February 2002. Anki for Palm OS was sold from 2002 to 2006 as a commercial product. In late 2007, Copera, Inc. decided to release Anki for Palm OS as freeware.
See also
List of flashcard software
Computer-assisted language learning
References
Further reading
(part 2)
External links
AnkiMobile Flashcards on the App Store
SM2 Algorithm
Anki Algorithm
Spaced repetition software
Free software programmed in Python
Educational software that uses Qt
Free educational software
Free and open-source Android software
Software using the GNU AGPL license |
17977433 | https://en.wikipedia.org/wiki/AME%20Accounting%20Software | AME Accounting Software | AME Accounting Software is a business accounting software application developed by AME Software Products, Inc.
AME Accounting Software includes Payroll, General Ledger, Accounts Receivable, Accounts Payable, 1099 Vendor Management, MICR check printing, and Direct Deposit. The software is mostly used by small and medium-sized businesses, as well as accounting practices that process payroll and do bookkeeping for other businesses.
The General Ledger software implements a double-entry bookkeeping system, and all modules are able to post entries to General Ledger.
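The double-entry rule means every journal entry must debit and credit equal amounts before it can post to the ledger. The Python sketch below illustrates that rule generically; it is not AME's code or data model, and the account names are invented for the example.

```python
# Minimal sketch of the double-entry rule behind a general ledger: a journal
# entry only posts if its debits and credits balance. This is a generic
# illustration of the bookkeeping principle, not AME's actual data model.

def post_entry(ledger, entry):
    """entry: list of (account, debit, credit) lines."""
    total_debit = sum(line[1] for line in entry)
    total_credit = sum(line[2] for line in entry)
    if round(total_debit, 2) != round(total_credit, 2):
        raise ValueError("unbalanced entry: debits must equal credits")
    for account, debit, credit in entry:
        ledger[account] = ledger.get(account, 0.0) + debit - credit
    return ledger

ledger = {}
# Record a $500 cash sale: debit Cash, credit Sales Revenue.
post_entry(ledger, [("Cash", 500.00, 0.00), ("Sales Revenue", 0.00, 500.00)])
print(ledger)   # {'Cash': 500.0, 'Sales Revenue': -500.0}
```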
The General Ledger software features comprehensive reports, that include Income Statement, Balance Sheet, Cash Flow Statement, Trial Balance Worksheet. The Payroll software calculates federal and state taxes, prints W2, 1099, and payroll checks, and is capable of producing reports for 50 states.
AME Accounting Software was initially developed for DOS. In 1998 AME released payroll software for Windows.
The current version, AME 2.0, released in 2004, includes all features required for running a small business or an accounting practice. The user interface is simple and intuitive.
As noted in 2008 June/July issue of CPA Technology Advisor Magazine: "AME offers a good payroll module and core financial functions that are sufficient for smaller entities, especially for businesses with limited technical expertise. It is attractively priced and covers the basic needs of a small company." AME stands for Accounting Made Easy.
References
External links
AME Accounting Software
Accounting software |
51675654 | https://en.wikipedia.org/wiki/Sam%20Darnold | Sam Darnold | Samuel Richard Darnold (born June 5, 1997) is an American football quarterback for the Carolina Panthers of the National Football League (NFL). He played college football at USC, where he won the 2017 Rose Bowl as a freshman, and was selected third overall by the New York Jets in the 2018 NFL Draft. At age 21, he was the NFL's youngest opening-day starting quarterback since the AFL–NFL merger. Darnold served as the Jets' starter from 2018 to 2020 until he was traded to the Panthers in 2021.
Early years
Darnold was born in Dana Point, California on June 5, 1997. He started playing basketball when he was five years old.
Darnold attended San Clemente High School in San Clemente, California. After playing baseball in his freshman year, he played football and basketball. During his high school basketball career, Darnold excelled, and was named South Coast League Most Valuable Player twice, along with being named to the all-CIF team. Basketball coach Marc Popovich stated Darnold's basketball skills helped translate into football, being the "only guy [I've] ever had who could get a defensive rebound and launch a 70-foot pass on target, pretty much in the same motion, to a guy breaking out in the fastbreak. It was almost Wes Unseld-like." Popovich added that Darnold could have played college basketball in the Pac-12 Conference or the Mountain West Conference "at worst."
On the football team, Darnold played wide receiver and linebacker, though he played quarterback as a sophomore after the starting quarterback was injured in a game against Tesoro High School. He threw a touchdown pass and scored the game-winning two-point conversion, but returned to playing receiver and linebacker a week later. When he became the school's starting quarterback, Darnold broke the school record for most touchdown passes in a game when he threw five on two occasions. He missed much of his junior year with a foot injury. In his senior year, San Clemente reached the CIF-Southern Section Southwest Division championship game, where they lost 44–37 to Trabuco Hills High School in a comeback upset after Darnold was knocked out of the game with a concussion. He ended his senior season with 3,000 passing yards and 39 touchdowns, along with 800 rushing yards and 13 rushing touchdowns.
Darnold was rated by Rivals.com as a four-star recruit and was ranked as the eighth best dual-threat quarterback in his class and 179th best player overall. However, he did not have much footage of him performing at recruiting camps, preferring to show his play in games. As a result, San Clemente head football coach Jaime Ortiz elected to provide video of his basketball career to football coaches. He received scholarship offers to play college football from schools like Oregon, Utah, Northwestern and Duke. During a football camp, USC coaches Clay Helton and Steve Sarkisian were impressed by Darnold's performance, and extended to him a scholarship to play for the Trojans.
College career
2015 season
USC defensive coordinator Justin Wilcox, who recruited Darnold to the school, wanted him to play linebacker before he declined. In the 2015 season, Darnold redshirted for his freshman year as he was behind Cody Kessler and Max Browne on the depth chart.
2016 season
Entering the 2016 season as a redshirt freshman, Darnold was the second-string quarterback behind Max Browne. In three games as backup quarterback, Darnold saw limited action, completing 14-of-22 passes for two touchdowns and an interception. After a 1–2 start to the season, Browne was benched in favor of Darnold. In his first career start with USC against the Utah Utes, Darnold completed 18-of-26 passes for 253 yards and recorded a rushing touchdown as USC lost 27–31. After the loss, Darnold's Trojans did not lose a game for the remainder of the season, including a 26–13 upset win over the #4-ranked Washington Huskies. The USC offense recorded an average of 37 points and 518 yards per game, while Darnold set the school record for most passing touchdowns by a freshman with 26, ten more than the previous record set by Todd Marinovich in 1989. Against Arizona and California, Darnold became the first quarterback in school history to record five touchdown passes in consecutive games, while also throwing for multiple touchdowns in eight straight games, the first USC quarterback to do so since Matt Leinart did in 2004. On the ground, Darnold recorded 230 rushing yards, the most by a USC quarterback since Reggie Perry's 254 yards in 1991. Darnold was named the 2016 Pac-12 Conference Freshman Offensive Player of the Year in late November.
USC was invited to play in the 2017 Rose Bowl, their first appearance in the bowl in eight seasons. In the 52–49 victory over Penn State, Darnold completed 33-of-53 passes for 453 yards, while also setting Rose Bowl records in passing touchdowns (5) and total yards (453). The 453 yards recorded ranked second in the bowl's history, only trailing Danny O'Neil's 456 in the 1995 game.
On January 4, 2017, it was announced that Darnold was awarded the Archie Griffin Award, which was awarded annually to college football's most valuable player to his team throughout the season, an award no other freshman had ever won previously. Darnold was also named to the Football Writers Association of America's Freshman All-America team.
2017 season
Entering the 2017 season as a redshirt sophomore, Darnold became an early favorite for the Heisman Trophy, and eventually a first-round pick in the 2018 draft. The season did not start the way Darnold had expected. In six games, he had matched the number of interceptions that he had thrown the previous year. This was attributed to breaking in a new receiver group, numerous injuries, and questionable coaching decisions. Despite this, he led USC to a dominant victory over Stanford by a score of 42–24. He then led an overtime victory over the Texas Longhorns, during which he drove the Trojans to a game-tying field goal in the final 39 seconds of regulation. Darnold guided USC to the Pac-12 Conference championship with a 31–28 victory over Stanford in the conference title game, where he was named the game's MVP after throwing for over 300 yards and two touchdowns. The win earned USC a spot in the 2017 Cotton Bowl where, despite 356 passing yards, the Trojans were soundly defeated by the Ohio State Buckeyes, 24–7.
Statistics
Professional career
On January 3, 2018, Darnold announced that he would enter the 2018 NFL Draft.
New York Jets
Darnold was selected by the New York Jets in the first round, with the third overall selection, of the 2018 NFL Draft. On July 30, 2018, Darnold signed a four-year deal worth $30.25 million fully guaranteed featuring a $20 million signing bonus with the Jets.
2018 season
Darnold made his professional debut on August 10, in the first preseason game against the Atlanta Falcons, where he finished with 96 passing yards and a touchdown as the Jets won 17–0. On August 29, the Jets named Darnold the starter for Week 1 of the season.
Darnold played his first regular season game on September 10, 2018 during Monday Night Football against the Detroit Lions, making him the youngest opening-day starting quarterback since the AFL–NFL merger. His first pass resulted in an interception returned for a touchdown by Quandre Diggs. However, he responded well and finished with 198 passing yards and two touchdowns as the Jets won 48–17. During the Jets' home opener against the Miami Dolphins in Week 2, Darnold finished with 334 passing yards, a touchdown, and two interceptions as the Jets lost 20–12. During a Thursday Night Football game against the Cleveland Browns in Week 3, Darnold finished with 169 passing yards and two interceptions as the Jets lost 21–17. During Week 4 against the Jacksonville Jaguars, Darnold finished with 167 passing yards and a touchdown as the Jets lost 31–12. During Week 5 against the Denver Broncos, Darnold finished with 198 passing yards, three touchdowns, and an interception, while the Jets combined for 323 rushing yards and won 34–16. During Week 6 against the Indianapolis Colts, Darnold finished with 280 passing yards, two touchdowns, and an interception as the Jets won 42–34. During Week 7 against the Minnesota Vikings, Darnold committed 4 turnovers, including 3 interceptions and a lost fumble. He finished with 206 passing yards and a touchdown as the Jets lost 37–17. During Week 8 against the Chicago Bears, Darnold finished with 153 passing yards and a touchdown as the Jets lost 24–10.
During a rematch against the Dolphins in Week 9, Darnold threw four interceptions, finishing the game with 229 passing yards in a 13–6 Jets loss. Darnold then suffered a foot injury that sidelined him for three games, with Josh McCown starting in his place. He returned to action in a Week 14 matchup against the Buffalo Bills and fellow rookie quarterback Josh Allen. Darnold temporarily left the game due to an injury to the same foot, but eventually returned, finishing with 170 passing yards, a touchdown, and an interception as the Jets ended their six-game losing streak and won 27–23. He led the team on its game-winning drive, completing a 37-yard pass to Robby Anderson to help set up a touchdown run by Elijah McGuire. During Saturday Night Football against the Houston Texans in Week 15, Darnold finished with 253 passing yards and two touchdowns as the Jets lost 29–22. During Week 16 against the Green Bay Packers, Darnold finished with 341 passing yards and three touchdowns, but the Jets, marred by 16 penalties, squandered a 15-point lead and lost 44–38 in overtime. During Week 17 against the New England Patriots, Darnold finished with 167 passing yards as the Jets lost 38–3 in the regular-season finale. Darnold finished the season with 2,865 passing yards, 17 passing touchdowns, and 15 interceptions.
2019 season
During the Jets' home opener against the Buffalo Bills in Week 1, Darnold finished with 179 passing yards and a touchdown. Despite the Jets holding a 16–0 lead midway through the third quarter and forcing four takeaways, the team lost 17–16. On September 12, it was reported that Darnold had been diagnosed with mononucleosis, and he subsequently missed three games. He returned in Week 6 against the Dallas Cowboys, where he finished with 338 passing yards, two touchdowns, and an interception as the Jets won 24–22. In the game, Darnold threw a 92-yard touchdown pass to Robby Anderson and was named AFC Offensive Player of the Week for his performance. During Monday Night Football against the New England Patriots in Week 7, the Jets managed only 154 total yards of offense, with Darnold throwing four interceptions and losing a fumble, as the Jets were shut out 33–0. A sound bite captured by NFL Films and ESPN showed Darnold, who was mic'd up, commenting that he was "seeing ghosts" while struggling during the game, which led to mockery by opposing NFL fanbases.
During Week 8 against the Jacksonville Jaguars, Darnold finished with 218 passing yards, two touchdowns, and three interceptions as the Jets lost 29–15. During Week 9 against the Miami Dolphins, Darnold finished with 260 passing yards, a touchdown, and an interception as the Jets lost 26–18. During Week 10 against the New York Giants, Darnold finished with 230 passing yards, 25 rushing yards, and two total touchdowns as the Jets won 34–27. During Week 11 against the Washington Redskins, Darnold finished with 293 passing yards, four touchdowns, and an interception as the Jets won 34–17. During Week 12 against the Oakland Raiders, Darnold finished with 315 passing yards and two touchdowns as the Jets won 34–3. In Week 13, Darnold finished with 239 passing yards, but penalties by the offensive line proved to be costly as the Jets lost 22–6 to the Cincinnati Bengals. During a Dolphins' rematch in Week 14, Darnold finished with 270 passing yards, two touchdowns, and an interception as the Jets won 22–21. Darnold finished the 2019 season with 3,024 passing yards, 19 touchdowns, and 13 interceptions and had 33 carries for 62 rushing yards and two rushing touchdowns.
2020 season
During the season opener against the Buffalo Bills in Week 1, Darnold finished with 215 passing yards, a touchdown, and an interception as the Jets lost 27–17. During the Jets' home opener against the San Francisco 49ers in Week 2, Darnold finished with 179 passing yards and a touchdown as the Jets lost 31–13. During Week 3 against the Indianapolis Colts, Darnold finished with 168 passing yards, a touchdown, and three interceptions, two of which were returned for touchdowns, and he was sacked in the end zone for a safety as the Jets lost 36–7. During Thursday Night Football against the Denver Broncos in Week 4, Darnold finished with 230 passing yards and 86 rushing yards, including a 46-yard rushing touchdown. He briefly left the game with a shoulder injury after a sack but was allowed back in; still, the Jets lost 37–28. After missing two games due to shoulder soreness, Darnold returned in Week 7 against the Bills, finishing with 120 passing yards and two interceptions as the Jets lost 18–10.
In Week 8 against the Kansas City Chiefs, Darnold reaggravated his shoulder injury and missed the Jets' next two games. He made his return in Week 12 against the Miami Dolphins. During the game, Darnold continued to struggle, throwing for 197 yards and two interceptions in the 20–3 loss.
In Week 13 against the Las Vegas Raiders, Darnold threw for 186 yards, two touchdowns, and one interception and also recorded a rushing touchdown during the 31–28 loss. This was Darnold’s first game of the season in which he threw more touchdown passes than interceptions. Darnold finished the season with 2,208 passing yards, nine touchdowns, and 11 interceptions to go along with 217 rushing yards and two rushing touchdowns in 12 games as the Jets finished 2–14.
Carolina Panthers
On April 5, 2021, Darnold was traded to the Carolina Panthers in exchange for a 2021 sixth-round pick and second- and fourth-round picks in 2022. On April 30, 2021, the team exercised the fifth-year option on Darnold's contract, worth a guaranteed $18.858 million for the 2022 season.
2021 season
Darnold made his first start for the Panthers on September 12, 2021, facing his former team, the New York Jets. During the game, he threw for 279 passing yards and a touchdown and added a five-yard rushing touchdown as the Panthers won 19–14. Against the New Orleans Saints, Darnold threw for 305 yards, two touchdowns, and an interception as the Panthers won 26–7. Against the Houston Texans, he threw for 304 yards and rushed for two touchdowns as the Panthers won 24–9. Darnold suffered a fractured scapula in the Panthers' 24–6 loss to the New England Patriots, an injury expected to sideline him for four to six weeks. He was placed on injured reserve on November 12, 2021, and activated on December 25. Darnold went on to replace a struggling Cam Newton in the second quarter of a Week 16 game against the division rival Tampa Bay Buccaneers, completing 15 of 32 passes for 190 yards as the Panthers lost 32–6, the team's 10th loss in 12 games.
Playing style
Though not widely regarded as a dual-threat quarterback, Darnold has been praised for his mobility in the pocket, which allows him to escape pressure, extend plays, and throw on the run when needed. He has also been described as a "gunslinger".
NFL career statistics
Highlights and awards
AFC Offensive Player of the Week (Week 6, 2019)
NFL records
Youngest quarterback to post a passer rating higher than 110 – 116.8 rating at 21 years, 97 days old
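For reference on the rating cited above, the snippet below is a minimal illustrative sketch of the standard NFL passer-rating formula; the stat line used in the example is hypothetical and is not taken from Darnold's games.

```python
# Minimal sketch of the standard NFL passer-rating formula.
# The example stat line at the bottom is hypothetical and only illustrates the calculation.
def nfl_passer_rating(completions, attempts, yards, touchdowns, interceptions):
    """Return the NFL passer rating (0 to 158.3) for a single stat line."""
    def clamp(value):
        # Each component of the formula is capped between 0 and 2.375.
        return max(0.0, min(value, 2.375))

    a = clamp((completions / attempts - 0.3) * 5)        # completion-percentage component
    b = clamp((yards / attempts - 3) * 0.25)             # yards-per-attempt component
    c = clamp((touchdowns / attempts) * 20)              # touchdown-rate component
    d = clamp(2.375 - (interceptions / attempts) * 25)   # interception-rate component
    return (a + b + c + d) / 6 * 100

if __name__ == "__main__":
    # Hypothetical line: 24-of-35 for 280 yards, 3 touchdowns, no interceptions.
    print(round(nfl_passer_rating(24, 35, 280, 3, 0), 1))  # prints 121.1
```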
Jets franchise records
Highest completion percentage by a rookie quarterback in a single season (2018) – 57.7
Highest rookie quarterback rating, minimum seven appearances – 77.6
Personal life
Darnold's mother is a physical education teacher at Shorecliffs Middle School. His older sister, Franki, is a college volleyball player at the University of Rhode Island. His grandfather, Dick Hammer, was a Marlboro Man actor and USC athlete.
He is friends with fellow quarterbacks Josh Allen of the Buffalo Bills and Kyle Allen of the Washington Football Team, with the three often training together in the off-season.
References
External links
Carolina Panthers bio
USC Trojans bio
1997 births
Living people
American football quarterbacks
Carolina Panthers players
New York Jets players
People from Dana Point, California
Players of American football from California
Sportspeople from Orange County, California
USC Trojans football players |
46459003 | https://en.wikipedia.org/wiki/George%20Penton | George Penton | George Washington "Doc" Penton (September 6, 1882 – July 11, 1969) was an American football player and coach. He served as the head football coach at Jacksonville State Normal School (now Jacksonville State University) in 1910 and at Troy State Normal School (now Troy University) from 1911 to 1912, compiling a career college football coaching record of 8–4–3. Penton played college football at Auburn University as a guard and fullback from 1907 to 1909. He was the brother of fellow football player and coach, John Penton.
Playing career
Penton played football, baseball, basketball, and track at Auburn University. He was a guard and fullback for Mike Donahue's Auburn Tigers football team from 1907 to 1909.
1909
Dick Jemison selected him second-team All-Southern at fullback.
Coaching career
1912
Penton served as athletic director at Troy and led the Troy Trojans to the program's only perfect season in 1912, a 3–0 record.
1913
Penton then served as an assistant under Donahue in 1913. In his first year there, the team won the Southern Intercollegiate Athletic Association championship.
1919–1921
Penton coached the Sidney Lanier High School Poets from 1919 to 1921.
Head coaching record
College
References
External links
1882 births
1969 deaths
American football fullbacks
American football guards
Auburn Tigers football coaches
Auburn Tigers football players
Jacksonville State Gamecocks football coaches
Troy Trojans athletic directors
Troy Trojans football coaches
High school football coaches in Alabama
People from Coosa County, Alabama
Sportspeople from Montgomery, Alabama
Coaches of American football from Alabama
Players of American football from Montgomery, Alabama |
11162665 | https://en.wikipedia.org/wiki/C%26C | C&C | C&C may refer to:
C&C Group (formerly Cantrell and Cochrane), a consumer goods group based in Ireland
C&C Yachts, sailboat builder
C+C Music Factory, an American dance-pop and hip hop group
Cambridge & Coleridge Athletic Club, based in Cambridge, United Kingdom
Castles & Crusades, a role-playing game
Chris & Cosey, an industrial music project of Throbbing Gristle members
City and Colour, acoustic project from musician Dallas Green
Coheed and Cambria, a rock band from Nyack, New York, formed in 1995
Chocolate and Cheese, album by Ween
Codes and ciphers, see Cryptography
Command and control, the exercise of authority by a commanding officer over military forces in the accomplishment of a mission
Command and control (management), an approach to decision making in organizations
Command and control (malware), a control mechanism for botnets
Command & Conquer, a real-time strategy video game series
Contraction and Convergence, an approach to limiting carbon dioxide emissions globally |
6136074 | https://en.wikipedia.org/wiki/Troy%20High%20School%20%28Ohio%29 | Troy High School (Ohio) | Troy High School is a public high school in Troy, Ohio, part of Troy City Schools. The current complex was built in 1958, and has an enrollment of 1,504 students. The school's mascot is the Trojan. As of 2019-20, the Trojans are again Miami Valley League (MVL) members.
1969–1971 Trojans football
In 1969, the Troy Trojans were a rebuilding high school team led by undersized sophomores, and they endured a terrible 2–7–1 season, their fourth straight losing season: two in the Western Ohio League (WOL) in 1968 and 1969, preceded by two more (1966–1967) in the Miami Valley League (MVL). On the last play of the last game of the season, with the game tied 22–22 against powerful rival Huber Heights Wayne, the pass went to 165-pound Randy Walker, who was tackled short of the end zone. After the game the coach, James "Jim" Conard, made the entire team carry a strip of cloth until the start of the 1970 season. Walker gave up his first love, baseball, joined the track team for speed and stamina, and started lifting weights, gaining 30 pounds and reporting to fall practice at 195 pounds.
The strip of cloth motivated the team: Conard and Walker would not lose another game over the remaining two seasons, going 20–0 and winning back-to-back WOL titles. In 1971 they outscored opponents 406–54, out-gained them while holding them to 1,267 total yards, and punted only 19 times all season. The defense dominated opponents, forcing 31 turnovers and posting five shutouts (including a 35–0 victory over Wayne).
Three backs were selected to the All-Western Ohio League team: Gordon Bell, Walker, and Joe Allen. Bell rushed for 1,447 yards (on 198 carries) and scored 19 touchdowns in 1971, and he was named first team All-Ohio in both 1970 and 1971; in both years he finished second only to Archie Griffin (Columbus Eastmoor) for "Ohio Back of the Year". Walker, whose main assignment was to block, also contributed as a rusher, and Allen averaged 8.1 yards per carry on 67 attempts. David Starkey, the heart of the defense, was named an All-Ohio defensive lineman. Elmo Boyd, a track star who only played football in his senior season (1971), finished with 12 catches for 374 yards (a 31.2 yards-per-catch average) and seven touchdowns.
In 2001, the 1971 team was selected by a panel of Dayton Daily News sports writers as the best Miami Valley prep football team of the last 50 years. Coach Conard retired after the 1971 season to become a principal at Troy Junior High.
Twenty players from the 1971 Trojans went on to play college football, 15 of them at the Division I level, and two played in the National Football League (NFL). Bell played for the Michigan Wolverines in college, gaining 2,900 yards, and spent four seasons in the NFL with the New York Giants and St. Louis Cardinals. Walker starred for three seasons at fullback for the Miami RedHawks, on teams that went 32–1–1 and won the Mid-American Conference title all three years. Drafted by the Cincinnati Bengals in the 13th round in 1976, Walker instead chose to become an assistant coach (and later head coach of both the Miami RedHawks and the Northwestern Wildcats). Boyd went on to play football at Eastern Kentucky and with the NFL's San Francisco 49ers. Starkey and fullback Joe Allen played for the Florida Gators. In addition, quarterback Al Mayer went on to play at Marshall.
Clubs and activities
The school's Latin Club functions as a local chapter of both the Ohio Junior Classical League (OJCL) and National Junior Classical League (NJCL).
Troy High School entered the National Archery in the Schools Program (NASP) in 2008. Since then, the Troy High School Archery Team has gone to the state NASP tournament in Columbus, OH and the national NASP tournament in Louisville, KY every year. In 2011 the team as a whole placed 29th in the nation.
Other clubs and activities include:
Academic Quiz Team
ASL Club
Yearbook
ASTRA
Book Club
Business Club (FBLA)
Drama Club
Great Outdoors Club
FCCLA
Interact
Junior Cabinet
Key Club
Math Club
Musical
National Honor Society
Newspaper
Senior Cabinet
Spanish Club (inactive for 2020-21 school year)
Student Council
Xtreme Bots
OHSAA State Championships
2016 Girls Bowling
2011 Boys Bowling
Trojan Marching Band
The Troy High School Trojan Marching Band has qualified for State Marching Band Finals since the event's inception in 1980. In addition, the band has received a Superior rating at this event since 2000. The Trojan Marching Band has performed in the London New Year's Day Parade in 2002, 2006, 2010, 2014, and 2018.
Recent years
In October 2005, a $12 million renovation commenced. These plans included a new gymnasium, student commons, science wing, new restrooms, new tennis court as well as a new student parking lot. Construction began about January 2006 and was completed about March 2007.
Notable alumni
Gordon Bell, NFL football player
Ryan Brewer, 1998 Ohio Mr. Football award winner
Nancy J. Currie, American astronaut
Kris Dielman, NFL football player and four time Pro Bowl selection
Heath Murray, Former MLB player (San Diego Padres, Detroit Tigers, Cleveland Indians)
Tommy Myers, NFL football player
Bob Ferguson, NFL football player
Tanya Thornton Shewell, Maryland politician
Tom Vaughn, NFL football player
Randy Walker, NCAA football coach
References
External links
District Website
High schools in Miami County, Ohio
Educational institutions established in 1852
Public high schools in Ohio
1852 establishments in Ohio |
18879 | https://en.wikipedia.org/wiki/Massachusetts%20Institute%20of%20Technology | Massachusetts Institute of Technology | The Massachusetts Institute of Technology (MIT) is a private land-grant research university in Cambridge, Massachusetts. Established in 1861, MIT has since played a key role in the development of modern technology and science, ranking it among the top academic institutions in the world.
Founded in response to the increasing industrialization of the United States, MIT adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River, and encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the Bates Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad and Whitehead Institutes.
A total of 98 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 41 astronauts, 16 Chief Scientists of the U.S. Air Force, and numerous heads of state have been affiliated with MIT. The institute also has a strong entrepreneurial culture, and MIT alumni have founded or co-founded many notable companies. MIT is a member of the Association of American Universities (AAU) and has received more Sloan Research Fellowships than any other university in North America.
History
Foundation and vision
In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a "Conservatory of Art and Science", but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.
Rogers, a graduate of William and Mary and professor at UVA, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:
The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.
The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.
Early developments
Two days after MIT was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT's first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions "to promote the liberal and practical education of the industrial classes" and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.
MIT was informally called "Boston Tech". The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.
The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these "Boston Tech" years, MIT faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot's repeated attempts to merge MIT with Harvard College's Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.
In 1916, the MIT administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT's move to a spacious new campus largely consisting of filled land on a tract along the Cambridge side of the Charles River. The neoclassical "New Technology" campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious "Mr. Smith", starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million in cash and Kodak stock to MIT.
Curricular reforms
In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms "renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering". Unlike Ivy League schools, MIT catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.
Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at MIT that "the Institute is widely conceived as basically a vocational school", a "partly unjustified" perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.
Defense research
MIT's involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at MIT's Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper's Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, MIT became the nation's largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million before 1946. Work on defense projects continued even afterward. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.
These activities affected MIT profoundly. A 1949 report noted the lack of "any great slackening in the pace of life at the Institute" to match the return to peacetime, remembering the "academic tranquility of the prewar years", though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of MIT between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, MIT no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.
In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and MIT's defense research. In this period, MIT's various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. MIT ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to "greater strength and unity" after these times of turmoil. However, six MIT students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT's role in military research and its suppression of these protests. (Richard Leacock's film, November Actions, records some of these tumultuous events.)
In the 1980s, there was more controversy at MIT over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, MIT's research for the military has included work on robots, drones and 'battle suits'.
Recent history
MIT has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman's GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the OpenCourseWare project has made course materials for over 2,000 MIT classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.
MIT was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new "backlot" buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School's eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.
In 2001, inspired by the open source and open access movements, MIT launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, MIT announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its "MITx" program, for a modest fee. The "edX" online platform supporting MITx was initially developed in partnership with Harvard and its analogous "Harvardx" initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the MIT faculty adopted an open-access policy to make its scholarship publicly accessible online.
MIT has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier's memorial service was attended by more than 10,000 people, in a ceremony hosted by the MIT community with thousands of police officers from the New England region and Canada. On November 25, 2013, MIT announced the creation of the Collier Medal, to be awarded annually to "an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the MIT community and in all aspects of his life". The announcement further stated that "Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness".
In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.
The Laser Interferometer Gravitational-Wave Observatory (LIGO) was designed and constructed by a team of scientists from California Institute of Technology, MIT, and industrial contractors, and funded by the National Science Foundation. It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and MIT physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also an MIT graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.
In 2021, MIT researchers in computer science and artificial intelligence developed an AI system that makes robots better at handling objects. The simulated anthropomorphic hand they created could manipulate more than 2,000 objects, and the system did not need to know what it was about to pick up in order to find a way to move it around in its hand.
Campus
MIT's campus in the city of Cambridge spans approximately a mile along the north side of the Charles River basin. The campus is divided roughly in half by Massachusetts Avenue, with most dormitories and student life facilities to the west and most academic buildings to the east. The bridge closest to MIT is the Harvard Bridge, which is known for being marked off in a non-standard unit of length – the smoot.
The Kendall/MIT MBTA Red Line station is located on the northeastern edge of the campus, in Kendall Square. The Cambridge neighborhoods surrounding MIT are a mixture of high tech companies occupying both modern office and rehabilitated industrial buildings, as well as socio-economically diverse residential neighborhoods. In early 2016, MIT presented its updated Kendall Square Initiative to the City of Cambridge, with plans for mixed-use educational, retail, residential, startup incubator, and office space in a dense high-rise transit-oriented development plan. The MIT Museum will eventually be moved immediately adjacent to a Kendall Square subway entrance, joining the List Visual Arts Center on the eastern end of the campus.
Each building at MIT has a number (possibly preceded by a W, N, E, or NW) designation, and most have a name as well. Typically, academic and office buildings are referred to primarily by number while residence halls are referred to by name. The organization of building numbers roughly corresponds to the order in which the buildings were built and their location relative (north, west, and east) to the original center cluster of Maclaurin buildings. Many of the buildings are connected above ground as well as through an extensive network of tunnels, providing protection from the Cambridge weather as well as a venue for roof and tunnel hacking.
MIT's on-campus nuclear reactor is one of the most powerful university-based nuclear reactors in the United States. The prominence of the reactor's containment building in a densely populated area has been controversial, but MIT maintains that it is well-secured. In 1999 Bill Gates donated US$20 million to MIT for the construction of a computer laboratory named the "William H. Gates Building", and designed by architect Frank Gehry. While Microsoft had previously given financial support to the institution, this was the first personal donation received from Gates.
MIT Nano, also known as Building 12, is an interdisciplinary facility for nanoscale research. Its cleanroom and research space, visible through expansive glass facades, is the largest research facility of its kind in the nation. With a cost of US$400 million, it is also one of the costliest buildings on campus. The facility also provides state-of-the-art nanoimaging capabilities with vibration damped imaging and metrology suites sitting atop a slab of concrete underground.
Other notable campus facilities include a pressurized wind tunnel for aerodynamic testing, a towing tank for testing ship and ocean structure designs, and previously Alcator C-Mod, which was the largest fusion device operated by any university. MIT's campus-wide wireless network was completed in the fall of 2005 and consists of nearly 3,000 access points covering the campus.
In 2001, the Environmental Protection Agency sued MIT for violating the Clean Water Act and the Clean Air Act with regard to its hazardous waste storage and disposal procedures. MIT settled the suit by paying a $155,000 fine and launching three environmental projects. In connection with capital campaigns to expand the campus, the Institute has also extensively renovated existing buildings to improve their energy efficiency. MIT has also taken steps to reduce its environmental impact by running alternative fuel campus shuttles, subsidizing public transportation passes, and building a low-emission cogeneration plant that serves most of the campus electricity, heating, and cooling requirements.
MIT has substantial commercial real estate holdings in Cambridge on which it pays property taxes, plus an additional voluntary payment in lieu of taxes (PILOT) on academic buildings which are legally tax-exempt. It is the largest taxpayer in the city, contributing approximately 14% of the city's annual revenues. Holdings include Technology Square, parts of Kendall Square, and many properties in Cambridgeport and Area 4 neighboring the educational buildings. The land is held for investment purposes and potential long-term expansion.
Architecture
MIT's School of Architecture, now the School of Architecture and Planning, was the first formal architecture program in the United States, and it has a history of commissioning progressive buildings. The first buildings constructed on the Cambridge campus, completed in 1916, are sometimes called the "Maclaurin buildings" after Institute president Richard Maclaurin who oversaw their construction. Designed by William Welles Bosworth, these imposing buildings were built of reinforced concrete, a first for a non-industrial – much less university – building in the US. Bosworth's design was influenced by the City Beautiful Movement of the early 1900s and features the Pantheon-esque Great Dome housing the Barker Engineering Library. The Great Dome overlooks Killian Court, where graduation ceremonies are held each year. The friezes of the limestone-clad buildings around Killian Court are engraved with the names of important scientists and philosophers. The spacious Building 7 atrium at 77 Massachusetts Avenue is regarded as the entrance to the Infinite Corridor and the rest of the campus.
Alvar Aalto's Baker House (1947), Eero Saarinen's MIT Chapel and Kresge Auditorium (1955), and I.M. Pei's Green, Dreyfus, Landau, and Wiesner buildings represent high forms of post-war modernist architecture. More recent buildings like Frank Gehry's Stata Center (2004), Steven Holl's Simmons Hall (2002), Charles Correa's Building 46 (2005), and Fumihiko Maki's Media Lab Extension (2009) stand out among the Boston area's classical architecture and serve as examples of contemporary campus "starchitecture". These buildings have not always been well received; in 2010, The Princeton Review included MIT in a list of twenty schools whose campuses are "tiny, unsightly, or both".
Housing
Undergraduates are guaranteed four-year housing in one of MIT's 11 undergraduate dormitories. Ten of the 11 dormitories are currently active; one residence hall, Burton Conner, is undergoing renovation from 2020 to 2022. Those living on campus can receive support and mentoring from live-in graduate student tutors, resident advisors, and faculty housemasters. Because housing assignments are made based on the preferences of the students themselves, diverse social atmospheres can be sustained in different living groups; for example, according to the Yale Daily News staff's The Insider's Guide to the Colleges, 2010, "The split between East Campus and West Campus is a significant characteristic of MIT. East Campus has gained a reputation as a thriving counterculture." MIT also has 5 dormitories for single graduate students and 2 apartment buildings on campus for married student families.
MIT has an active Greek and co-op housing system, including thirty-six fraternities, sororities, and independent living groups (FSILGs). Some 98% of all undergraduates lived in MIT-affiliated housing; 54% of the men participated in fraternities and 20% of the women were involved in sororities. Most FSILGs are located across the river in Back Bay near where MIT was founded, and there is also a cluster of fraternities on MIT's West Campus that face the Charles River Basin. After the 1997 alcohol-related death of Scott Krueger, a new pledge at the Phi Gamma Delta fraternity, MIT required all freshmen to live in the dormitory system starting in 2002. Because FSILGs had previously housed as many as 300 freshmen off-campus, the new policy could not be implemented until Simmons Hall opened in that year.
In 2013–2014, MIT abruptly closed and then demolished undergrad dorm Bexley Hall, citing extensive water damage that made repairs infeasible. In 2017, MIT shut down Senior House after a century of service as an undergrad dorm. That year, MIT administrators released data showing just 60% of Senior House residents had graduated in four years. Campus-wide, the four-year graduation rate is 84% (the cumulative graduation rate is significantly higher).
Organization and administration
MIT is chartered as a non-profit organization and is owned and governed by a privately appointed board of trustees known as the MIT Corporation. The current board consists of 43 members elected to five-year terms, 25 life members who vote until their 75th birthday, 3 elected officers (President, Treasurer, and Secretary), and 4 ex officio members (the president of the alumni association, the Governor of Massachusetts, the Massachusetts Secretary of Education, and the Chief Justice of the Massachusetts Supreme Judicial Court). The board is chaired by Diane Greene SM ’78, co-founder and former CEO of VMware and former CEO of Google Cloud. The Corporation approves the budget, new programs, degrees and faculty appointments, and elects the President to serve as the chief executive officer of the university and preside over the Institute's faculty. MIT's endowment and other financial assets are managed through a subsidiary called MIT Investment Management Company (MITIMCo). Valued at $16.4 billion in 2018, MIT's endowment was then the sixth-largest among American colleges and universities.
MIT has five schools (Science, Engineering, Architecture and Planning, Management, and Humanities, Arts, and Social Sciences) and one college (Schwarzman College of Computing), but no schools of law or medicine. While faculty committees assert substantial control over many areas of MIT's curriculum, research, student life, and administrative affairs, the chair of each of MIT's 32 academic departments reports to the dean of that department's school, who in turn reports to the Provost under the President. The current president is L. Rafael Reif, who formerly served as provost under President Susan Hockfield, the first woman to hold the post.
Academics
MIT is a large, highly residential, research university with a majority of enrollments in graduate and professional programs. The university has been accredited by the New England Association of Schools and Colleges since 1929. MIT operates on a 4–1–4 academic calendar with the fall semester beginning after Labor Day and ending in mid-December, a 4-week "Independent Activities Period" in the month of January, and the spring semester commencing in early February and ceasing in late May.
MIT students refer to both their majors and classes using numbers or acronyms alone. Departments and their corresponding majors are numbered in the approximate order of their foundation; for example, Civil and Environmental Engineering is Course 1, while Linguistics and Philosophy is Course 24. Students majoring in Electrical Engineering and Computer Science (EECS), the most popular department, collectively identify themselves as "Course 6". MIT students use a combination of the department's course number and the number assigned to the class to identify their subjects; for instance, the introductory calculus-based classical mechanics course is simply "8.01" at MIT.
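To make the subject-numbering convention above concrete, the following is a small illustrative sketch, not an official MIT data source; the department mapping covers only departments mentioned in this article, and the parsing logic is simply one way to split a subject number such as "8.01" into its department and class parts.

```python
# Illustrative sketch of the MIT subject-numbering convention described above.
# The mapping below is a small example limited to departments named in this article;
# it is not an official or complete list.
DEPARTMENTS = {
    "1": "Civil and Environmental Engineering",
    "6": "Electrical Engineering and Computer Science",
    "8": "Physics",
    "18": "Mathematics",
    "24": "Linguistics and Philosophy",
}

def describe_subject(subject_number: str) -> str:
    """Split a subject number like '8.01' into its department and class parts."""
    department, _, class_number = subject_number.partition(".")
    department_name = DEPARTMENTS.get(department, "Course " + department)
    return f"{subject_number}: class {class_number} in {department_name} (Course {department})"

if __name__ == "__main__":
    # "8.01" is the introductory classical mechanics class mentioned above.
    print(describe_subject("8.01"))
```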
Undergraduate program
The four-year, full-time undergraduate program maintains a balance between professional majors and those in the arts and sciences, and has been dubbed "most selective" by U.S. News, admitting few transfer students and 4.1% of its applicants in the 2020–2021 admissions cycle. MIT offers 44 undergraduate degrees across its five schools. In the 2017–2018 academic year, 1,045 bachelor of science degrees (abbreviated "SB") were granted, the only type of undergraduate degree MIT now awards. In the 2011 fall term, among students who had designated a major, the School of Engineering was the most popular division, enrolling 63% of students in its 19 degree programs, followed by the School of Science (29%), School of Humanities, Arts, & Social Sciences (3.7%), Sloan School of Management (3.3%), and School of Architecture and Planning (2%). The largest undergraduate degree programs were in Electrical Engineering and Computer Science, Computer Science and Engineering, Mechanical Engineering, Physics, and Mathematics.
All undergraduates are required to complete a core curriculum called the General Institute Requirements (GIRs). The Science Requirement, generally completed during freshman year as prerequisites for classes in science and engineering majors, comprises two semesters of physics, two semesters of calculus, one semester of chemistry, and one semester of biology. There is a Laboratory Requirement, usually satisfied by an appropriate class in a course major. The Humanities, Arts, and Social Sciences (HASS) Requirement consists of eight semesters of classes in the humanities, arts, and social sciences, including at least one semester from each division as well as the courses required for a designated concentration in a HASS division. Under the Communication Requirement, two of the HASS classes, plus two of the classes taken in the designated major must be "communication-intensive", including "substantial instruction and practice in oral presentation". Finally, all students are required to complete a swimming test; non-varsity athletes must also take four quarters of physical education classes.
Most classes rely on a combination of lectures, recitations led by associate professors or graduate students, weekly problem sets ("p-sets"), and periodic quizzes or tests. While the pace and difficulty of MIT coursework has been compared to "drinking from a fire hose", the freshmen retention rate at MIT is similar to other research universities. The "pass/no-record" grading system relieves some pressure for first-year undergraduates. For each class taken in the fall term, freshmen transcripts will either report only that the class was passed, or otherwise not have any record of it. In the spring term, passing grades (A, B, C) appear on the transcript while non-passing grades are again not recorded. (Grading had previously been "pass/no record" all freshman year, but was amended for the Class of 2006 to prevent students from gaming the system by completing required major classes in their freshman year.) Also, freshmen may choose to join alternative learning communities, such as Experimental Study Group, Concourse, or Terrascope.
In 1969, Margaret MacVicar founded the Undergraduate Research Opportunities Program (UROP) to enable undergraduates to collaborate directly with faculty members and researchers. Students join or initiate research projects ("UROPs") for academic credit, pay, or on a volunteer basis through postings on the UROP website or by contacting faculty members directly. A substantial majority of undergraduates participate. Students often become published, file patent applications, and/or launch start-up companies based upon their experience in UROPs.
In 1970, the then-Dean of Institute Relations, Benson R. Snyder, published The Hidden Curriculum, arguing that education at MIT was often slighted in favor of following a set of unwritten expectations and that graduating with good grades was more often the product of figuring out the system rather than a solid education. The successful student, according to Snyder, was the one who was able to discern which of the formal requirements were to be ignored in favor of which unstated norms. For example, organized student groups had compiled "course bibles"—collections of problem-set and examination questions and answers for later students to use as references. This sort of gamesmanship, Snyder argued, hindered development of a creative intellect and contributed to student discontent and unrest.
Graduate program
MIT's graduate program has high coexistence with the undergraduate program, and many courses are taken by qualified students at both levels. MIT offers a comprehensive doctoral program with degrees in the humanities, social sciences, and STEM fields as well as professional degrees. The Institute offers graduate programs leading to academic degrees such as the Master of Science (which is abbreviated as SM at MIT), various Engineer's Degrees, Doctor of Philosophy (PhD), and Doctor of Science (ScD) and interdisciplinary graduate programs such as the MD-PhD (with Harvard Medical School) and a joint program in oceanography with Woods Hole Oceanographic Institution.
Admission to graduate programs is decentralized; applicants apply directly to the department or degree program. More than 90% of doctoral students are supported by fellowships, research assistantships (RAs), or teaching assistantships (TAs).
MIT Bootcamps
MIT Bootcamps are intense week-long innovation and leadership programs that challenge participants to develop a venture in a week. Each Bootcamp centers around a particular topic, specific to an industry, leadership skill set, or emerging technology. Cohorts are organized into small teams who work on an entrepreneurial project together, in addition to individual learning and team coaching. The program includes a series of online seminars with MIT faculty, practitioners, and industry experts, innovation workshops with bootcamp instructors focused on putting the theory participants have learned into practice, coaching sessions, and informal office hours for learners to exchange ideas freely. Bootcampers are tasked with weekly "deliverables," which are key elements of a business plan, to help guide the group through the decision-making process involved in building an enterprise. The experience culminates in a final pitch session, judged by a panel of experts.
MIT Bootcamp instructors include Eric von Hippel, Sanjay Sarma, Erdin Beshimov, and Bill Aulet. MIT Bootcamps were founded by Erdin Beshimov.
Rankings
MIT also places among the top five in many overall rankings of universities (see right) and rankings based on students' revealed preferences. For several years, U.S. News & World Report, the QS World University Rankings, and the Academic Ranking of World Universities have ranked MIT's School of Engineering first, as did the 1995 National Research Council report. In the same lists, MIT's strongest showings apart from in engineering are in computer science, the natural sciences, business, architecture, economics, linguistics, mathematics, and, to a lesser extent, political science and philosophy.
Times Higher Education has recognized MIT as one of the world's "six super brands" on its World Reputation Rankings, along with Berkeley, Cambridge, Harvard, Oxford and Stanford. In 2019, it ranked 3rd among the universities around the world by SCImago Institutions Rankings. In 2017, the Times Higher Education World University Rankings rated MIT the #2 university for arts and humanities. MIT was ranked #7 in 2015 and #6 in 2017 of the Nature Index Annual Tables, which measure the largest contributors to papers published in 82 leading journals.
Georgetown University researchers ranked it #3 in the U.S. for 20-year return on investment.
Collaborations
The university historically pioneered research and training collaborations between academia, industry and government. In 1946, President Compton, Harvard Business School professor Georges Doriot, and Massachusetts Investors Trust chairman Merrill Griswold founded American Research and Development Corporation, the first American venture-capital firm. In 1948, Compton established the MIT Industrial Liaison Program. Throughout the late 1980s and early 1990s, American politicians and business leaders accused MIT and other universities of contributing to a declining economy by transferring taxpayer-funded research and technology to international – especially Japanese – firms that were competing with struggling American businesses. On the other hand, MIT's extensive collaboration with the federal government on research projects has led to several MIT leaders serving as presidential scientific advisers since 1940. MIT established a Washington Office in 1991 to continue effective lobbying for research funding and national science policy.
The US Justice Department began an investigation in 1989, and in 1991 filed an antitrust suit against MIT, the eight Ivy League colleges, and eleven other institutions for allegedly engaging in price-fixing during their annual "Overlap Meetings", which were held to prevent bidding wars over promising prospective students from consuming funds for need-based scholarships. While the Ivy League institutions settled, MIT contested the charges, arguing that the practice was not anti-competitive because it ensured the availability of aid for the greatest number of students. MIT ultimately prevailed when the Justice Department dropped the case in 1994.
MIT's proximity to Harvard University ("the other school up the river") has led to a substantial number of research collaborations such as the Harvard-MIT Division of Health Sciences and Technology and the Broad Institute. In addition, students at the two schools can cross-register for credits toward their own school's degrees without any additional fees. A cross-registration program between MIT and Wellesley College has also existed since 1969, and in 2002 the Cambridge–MIT Institute launched an undergraduate exchange program between MIT and the University of Cambridge. MIT also has a long term partnership with Imperial College London, for both student exchanges and research collaboration. More modest cross-registration programs have been established with Boston University, Brandeis University, Tufts University, Massachusetts College of Art and the School of the Museum of Fine Arts, Boston.
MIT maintains substantial research and faculty ties with independent research organizations in the Boston area, such as the Charles Stark Draper Laboratory, the Whitehead Institute for Biomedical Research, and the Woods Hole Oceanographic Institution. Ongoing international research and educational collaborations include the Amsterdam Institute for Advanced Metropolitan Solutions (AMS Institute), Singapore-MIT Alliance, MIT-Politecnico di Milano, MIT-Zaragoza International Logistics Program, and projects in other countries through the MIT International Science and Technology Initiatives (MISTI) program.
The mass-market magazine Technology Review is published by MIT through a subsidiary company, as is a special edition that also serves as an alumni magazine. The MIT Press is a major university press, publishing over 200 books and 30 journals annually, emphasizing science and technology as well as arts, architecture, new media, current events, and social issues.
Libraries, collections and museums
The MIT library system consists of five subject libraries: Barker (Engineering), Dewey (Economics), Hayden (Humanities and Science), Lewis (Music), and Rotch (Arts and Architecture). There are also various specialized libraries and archives. The libraries contain more than 2.9 million printed volumes, 2.4 million microforms, 49,000 print or electronic journal subscriptions, and 670 reference databases. The past decade has seen a trend of increased focus on digital over print resources in the libraries. Notable collections include the Lewis Music Library with an emphasis on 20th and 21st-century music and electronic music, the List Visual Arts Center's rotating exhibitions of contemporary art, and the Compton Gallery's cross-disciplinary exhibitions. MIT allocates a percentage of the budget for all new construction and renovation to commission and support its extensive public art and outdoor sculpture collection.
The MIT Museum was founded in 1971 and collects, preserves, and exhibits artifacts significant to the culture and history of MIT. The museum now engages in significant educational outreach programs for the general public, including the annual Cambridge Science Festival, the first celebration of this kind in the United States. Since 2005, its official mission has been, "to engage the wider community with MIT's science, technology and other areas of scholarship in ways that will best serve the nation and the world in the 21st century".
Research
MIT was elected to the Association of American Universities in 1934 and is classified among "R1: Doctoral Universities – Very high research activity"; research expenditures totaled $952 million in 2017. The federal government was the largest source of sponsored research, with the Department of Health and Human Services granting $255.9 million, Department of Defense $97.5 million, Department of Energy $65.8 million, National Science Foundation $61.4 million, and NASA $27.4 million. MIT employs approximately 1300 researchers in addition to faculty. In 2011, MIT faculty and researchers disclosed 632 inventions, were issued 153 patents, earned $85.4 million in cash income, and received $69.6 million in royalties. Through programs like the Deshpande Center, MIT faculty leverage their research and discoveries into multi-million-dollar commercial ventures.
In electronics, magnetic core memory, radar, single electron transistors, and inertial guidance controls were invented or substantially developed by MIT researchers. Harold Eugene Edgerton was a pioneer in high speed photography and sonar. Claude E. Shannon developed much of modern information theory and discovered the application of Boolean logic to digital circuit design theory. In the domain of computer science, MIT faculty and researchers made fundamental contributions to cybernetics, artificial intelligence, computer languages, machine learning, robotics, and cryptography. At least nine Turing Award laureates and seven recipients of the Draper Prize in engineering have been or are currently associated with MIT.
Current and previous physics faculty have won eight Nobel Prizes, four Dirac Medals, and three Wolf Prizes predominantly for their contributions to subatomic and quantum theory. Members of the chemistry department have been awarded three Nobel Prizes and one Wolf Prize for the discovery of novel syntheses and methods. MIT biologists have been awarded six Nobel Prizes for their contributions to genetics, immunology, oncology, and molecular biology. Professor Eric Lander was one of the principal leaders of the Human Genome Project. Positronium atoms, synthetic penicillin, synthetic self-replicating molecules, and the genetic bases for Amyotrophic lateral sclerosis (also known as ALS or Lou Gehrig's disease) and Huntington's disease were first discovered at MIT. Jerome Lettvin transformed the study of cognitive science with his paper "What the frog's eye tells the frog's brain". Researchers developed a system to convert MRI scans into 3D printed physical models.
In the domain of humanities, arts, and social sciences, as of October 2019 MIT economists have been awarded seven Nobel Prizes and nine John Bates Clark Medals. Linguists Noam Chomsky and Morris Halle authored seminal texts on generative grammar and phonology. The MIT Media Lab, founded in 1985 within the School of Architecture and Planning and known for its unconventional research, has been home to influential researchers such as constructivist educator and Logo creator Seymour Papert.
Spanning many of the above fields, MacArthur Fellowships (the so-called "Genius Grants") have been awarded to 50 people associated with MIT. Five Pulitzer Prize–winning writers currently work at or have retired from MIT. Four current or former faculty are members of the American Academy of Arts and Letters.
Allegations of research misconduct or improprieties have received substantial press coverage. Professor David Baltimore, a Nobel Laureate, became embroiled in a misconduct investigation starting in 1986 that led to Congressional hearings in 1991. Professor Ted Postol has accused the MIT administration since 2000 of attempting to whitewash potential research misconduct at the Lincoln Lab facility involving a ballistic missile defense test, though a final investigation into the matter has not been completed. Associate Professor Luk Van Parijs was dismissed in 2005 following allegations of scientific misconduct and found guilty of the same by the United States Office of Research Integrity in 2009.
In 2019, Clarivate Analytics named 54 members of MIT's faculty to its list of "Highly Cited Researchers". That number places MIT 8th among the world's universities.
Discoveries and innovation
Natural sciences
Oncogene – Robert Weinberg discovered the genetic basis of human cancer.
Reverse transcription – David Baltimore independently discovered, in 1970 at MIT, reverse transcriptase in two RNA tumor viruses: R-MLV and RSV.
Thermal death time – determined by Samuel Cate Prescott and William Lyman Underwood between 1895 and 1898 for the canning of food; the methods later found applications in medical devices, pharmaceuticals, and cosmetics.
Computer and applied sciences
Akamai Technologies – Daniel Lewin and Tom Leighton developed a faster content delivery network, now one of the world's largest distributed computing platforms, responsible for serving between 15 and 30 percent of all web traffic.
Cryptography – MIT researchers Ron Rivest, Adi Shamir and Leonard Adleman developed one of the first practical public-key cryptosystems, the RSA cryptosystem, and started a company, RSA Security (a toy illustration of the underlying idea appears after this list).
Digital circuits – Claude Shannon, while a master's degree student at MIT, developed the digital circuit design theory which paved the way for modern computers.
Electronic ink – developed by Joseph Jacobson at MIT Media Lab.
Emacs (text editor) – development began during the 1970s at the MIT AI Lab.
Flight recorder (black box) – Charles Stark Draper developed the black box at MIT's Instrumentation Laboratory. That lab later made the Apollo Moon landings possible through the Apollo Guidance Computer it designed for NASA.
GNU Project – Richard Stallman formally founded the free software movement in 1983 by launching the GNU Project at MIT.
Julia (programming language) – development began in 2009 with Jeff Bezanson, Stefan Karpinski, Viral B. Shah, and Alan Edelman, all then at MIT, and has continued with the contribution of a dedicated MIT Julia Lab.
Lisp (programming language) – John McCarthy invented Lisp at MIT in 1958.
Lithium-ion battery efficiencies – Yet-Ming Chiang and his group at MIT showed a substantial improvement in the performance of lithium batteries by boosting the material's conductivity by doping it with aluminium, niobium and zirconium.
Macsyma – one of the oldest general-purpose computer algebra systems; the GPL-licensed version Maxima remains in wide use.
MIT OpenCourseWare – the OpenCourseWare movement started in 1999 when the University of Tübingen in Germany published videos of lectures online for its timms initiative (Tübinger Internet Multimedia Server). The OCW movement only took off, however, with the launch of MIT OpenCourseWare and the Open Learning Initiative at Carnegie Mellon University in October 2002. The movement was soon reinforced by the launch of similar projects at Yale, Utah State University, the University of Michigan and the University of California Berkeley.
Perdix micro-drone – autonomous drone that uses artificial intelligence to swarm with many other Perdix drones.
Project MAC – a DARPA-funded project that produced groundbreaking research in operating systems, artificial intelligence, and the theory of computation.
Radar – developed at MIT's Radiation Laboratory during World War II.
SKETCHPAD – invented by Ivan Sutherland at MIT (presented in his PhD thesis). It paved the way for human–computer interaction (HCI). Sketchpad is considered to be the ancestor of modern computer-aided design (CAD) programs as well as a major breakthrough in the development of computer graphics in general.
VisiCalc – the first spreadsheet computer program for personal computers, originally released for the Apple II by VisiCorp. MIT alumni Dan Bricklin and Bob Frankston rented time-sharing at night on an MIT mainframe computer (which cost $1 per hour to use).
World Wide Web Consortium (W3C) – founded in 1994 by Tim Berners-Lee, it is the main international standards organization for the World Wide Web.
X Window System – pioneering architecture-independent system for graphical user interfaces that has been widely used for Unix and Linux systems.
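To illustrate the public-key idea behind RSA mentioned above: anyone can encrypt with the published key (e, n), but only the holder of the private exponent d can decrypt. The following deliberately tiny Python sketch uses the textbook primes 61 and 53, which are far too small to be secure; the variable names and the toy message are purely illustrative.

# Toy RSA-style key generation, encryption and decryption (requires Python 3.8+).
p, q = 61, 53                      # two small "secret" primes (insecure, illustrative only)
n = p * q                          # public modulus: 3233
phi = (p - 1) * (q - 1)            # Euler's totient of n: 3120
e = 17                             # public exponent, chosen coprime to phi
d = pow(e, -1, phi)                # private exponent: modular inverse of e mod phi (2753)

message = 65                       # a message already encoded as a number smaller than n
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n) -> 2790
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n) -> 65
assert recovered == message

Only (e, n) is published; recovering d from it would require factoring n, which is infeasible for the key sizes used in practice.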
Companies and entrepreneurship
MIT alumni and faculty have founded numerous companies, some of which are shown below:
Analog Devices, 1965, co-founders Ray Stata (SB, SM) and Matthew Lorber (SB)
BlackRock, 1988, co-founder Bennett Golub (SB, SM, PhD)
Bose Corporation, 1964, founder Amar Bose (SB, PhD)
Buzzfeed, 2006, co-founder Jonah Peretti (SM)
Dropbox, 2007, founders Drew Houston (SB) and Arash Ferdowsi (drop-out)
Hewlett-Packard, 1939, co-founder William R. Hewlett (SM)
HuffPost, 2005, co-founder Jonah Peretti (SM)
Intel, 1968, co-founder Robert Noyce (PhD)
Koch Industries, 1940, founder Fred C. Koch (SB), sons William (SB, PhD), David (SB)
Qualcomm, 1985, co-founders Irwin M. Jacobs (SM, PhD) and Andrew Viterbi (SB, SM)
Raytheon, 1922, co-founder Vannevar Bush (DEng, Professor)
Renaissance Technologies, 1982, founder James Simons (SB)
Texas Instruments, 1930, founder Cecil Howard Green (SB, SM)
TSMC, 1987, founder Morris Chang (SB, SM)
VMware, 1998, co-founder Diane Greene (SM)
Traditions and student activities
The faculty and student body place a high value on meritocracy and on technical proficiency. MIT has never awarded an honorary degree, nor does it award athletic scholarships, ad eundem degrees, or Latin honors upon graduation. However, MIT has twice awarded honorary professorships: to Winston Churchill in 1949 and Salman Rushdie in 1993.
Many upperclass students and alumni wear a large, heavy, distinctive class ring known as the "Brass Rat". Originally created in 1929, the ring's official name is the "Standard Technology Ring". The undergraduate ring design (a separate graduate student version exists as well) varies slightly from year to year to reflect the unique character of the MIT experience for that class, but always features a three-piece design, with the MIT seal and the class year each appearing on a separate face, flanking a large rectangular bezel bearing an image of a beaver. The initialism IHTFP, representing the informal school motto "I Hate This Fucking Place" and jocularly euphemized as "I Have Truly Found Paradise", "Institute Has The Finest Professors", "Institute of Hacks, TomFoolery and Pranks", "It's Hard to Fondle Penguins", and other variations, has occasionally been featured on the ring given its historical prominence in student culture.
Activities
MIT has over 500 recognized student activity groups, including a campus radio station, The Tech student newspaper, an annual entrepreneurship competition, a crime club, and weekly screenings of popular films by the Lecture Series Committee. Less traditional activities include the "world's largest open-shelf collection of science fiction" in English, a model railroad club, and a vibrant folk dance scene. Students, faculty, and staff are involved in over 50 educational outreach and public service programs through the MIT Museum, Edgerton Center, and MIT Public Service Center.
Fraternities and sororities provide a base of activities in addition to housing. Approximately 1,000 undergrads, 48% of men and 30% of women, participate in one of several dozen Greek Life men's, women's and co-ed chapters on the campus.
The Independent Activities Period is a four-week-long "term" offering hundreds of optional classes, lectures, demonstrations, and other activities throughout the month of January between the Fall and Spring semesters. Some of the most popular recurring IAP activities are Autonomous Robot Design (course 6.270), Robocraft Programming (6.370), and MasLab competitions, the annual "mystery hunt", and Charm School. More than 250 students pursue externships annually at companies in the US and abroad.
Many MIT students also engage in "hacking", which encompasses both the physical exploration of areas that are generally off-limits (such as rooftops and steam tunnels), as well as elaborate practical jokes. Examples of high-profile hacks have included the abduction of Caltech's cannon, reconstructing a Wright Flyer atop the Great Dome, and adorning the John Harvard statue with the Master Chief's Mjölnir Helmet.
Athletics
MIT sponsors 31 varsity sports and has one of the three broadest NCAA Division III athletic programs. MIT participates in the NCAA's Division III, the New England Women's and Men's Athletic Conference, the New England Football Conference, NCAA's Division I Patriot League for women's crew, and the Collegiate Water Polo Association (CWPA) for men's water polo. Men's crew competes outside the NCAA in the Eastern Association of Rowing Colleges (EARC). The intercollegiate sports teams, called the MIT Engineers, have won 22 team national championships and 42 individual national championships. MIT is the all-time Division III leader in producing Academic All-Americans (302) and ranks second across all NCAA divisions, behind only the University of Nebraska. MIT athletes have won 13 Elite 90 awards, ranking first among NCAA Division III programs and third among all divisions. In April 2009, budget cuts led to MIT eliminating eight of its 41 sports, including the mixed men's and women's teams in alpine skiing and pistol; separate teams for men and women in ice hockey and gymnastics; and men's programs in golf and wrestling.
People
Students
MIT enrolled 4,602 undergraduates and 6,972 graduate students in 2018–2019. Undergraduate and graduate students came from all 50 US states as well as from 115 foreign countries.
MIT received 33,240 applications for admission to the undergraduate Class of 2025: it admitted 1,365 (4.1 percent). In 2019, 29,114 applications were received for graduate and advanced degree programs across all departments; 3,670 were admitted (12.6 percent) and 2,312 enrolled (63 percent).
Undergraduate tuition and fees for 2019-2020 were $53,790 for nine months. 59% of students were awarded a need-based MIT scholarship. Graduate tuition and fees for 2019-2020 were also $53,790 for nine months, and summer tuition was $17,800. Financial support for graduate students is provided in large part by individual departments and includes fellowships, traineeships, teaching and research assistantships, and loans. The annual increase in expenses has led to a student tradition (dating back to the 1960s) of tongue-in-cheek "tuition riots".
MIT has been nominally co-educational since admitting Ellen Swallow Richards in 1870. Richards also became the first female member of MIT's faculty, specializing in sanitary chemistry. Female students remained a small minority prior to the completion of the first wing of a women's dormitory, McCormick Hall, in 1963. Between 1993 and 2009 the proportion of women rose from 34 percent to 45 percent of undergraduates and from 20 percent to 31 percent of graduate students. More recently, women have outnumbered men in Biology, Brain & Cognitive Sciences, Architecture, Urban Planning, and Biological Engineering.
Faculty and staff
At last count, MIT had 1,030 faculty members. Faculty are responsible for lecturing classes, for advising both graduate and undergraduate students, and for sitting on academic committees, as well as for conducting original research. Between 1964 and 2009 a total of seventeen faculty and staff members affiliated with MIT won Nobel Prizes (thirteen of them in the latter 25 years). As of October 2020, 37 MIT faculty members, past or present, have won Nobel Prizes, the majority in Economics or Physics.
Current faculty and teaching staff have included 67 Guggenheim Fellows, 6 Fulbright Scholars, and 22 MacArthur Fellows. Faculty members who have made extraordinary contributions to their research field as well as the MIT community are granted appointments as Institute Professors for the remainder of their tenures. Susan Hockfield, a molecular neurobiologist, served as MIT's president from 2004 to 2012. She was the first woman to hold the post.
MIT faculty members have often been recruited to lead other colleges and universities. Founding faculty-member Charles W. Eliot became president of Harvard University in 1869, a post he would hold for 40 years, during which he wielded considerable influence both on American higher education and on secondary education. MIT alumnus and faculty member George Ellery Hale played a central role in the development of the California Institute of Technology (Caltech), and other faculty members have been key founders of Franklin W. Olin College of Engineering in nearby Needham, Massachusetts.
Former provost Robert A. Brown served as president of Boston University; former provost Mark Wrighton is chancellor of Washington University in St. Louis; former associate provost Alice Gast is president of Lehigh University; and former professor Suh Nam-pyo is president of KAIST. Former dean of the School of Science Robert J. Birgeneau was the chancellor of the University of California, Berkeley (2004–2013); former professor John Maeda was president of Rhode Island School of Design (RISD, 2008–2013); former professor David Baltimore was president of Caltech (1997–2006); and MIT alumnus and former assistant professor Hans Mark served as chancellor of the University of Texas system (1984–1992).
In addition, faculty members have been recruited to lead governmental agencies; for example, former professor Marcia McNutt is president of the National Academy of Sciences, urban studies professor Xavier de Souza Briggs served as the associate director of the White House Office of Management and Budget, and biology professor Eric Lander was a co-chair of the President's Council of Advisors on Science and Technology. In 2013, faculty member Ernest Moniz was nominated by President Obama and later confirmed as United States Secretary of Energy. Former professor Hans Mark served as Secretary of the Air Force from 1979 to 1981. Alumna and Institute Professor Sheila Widnall served as Secretary of the Air Force between 1993 and 1997, making her the first female Secretary of the Air Force and first woman to lead an entire branch of the US military in the Department of Defense.
MIT has been the second-largest employer in the city of Cambridge. Based on feedback from employees, MIT was ranked #7 as a place to work among US colleges and universities. Surveys cited a "smart", "creative", "friendly" environment, noting that the work-life balance tilts towards a "strong work ethic" but complaining about "low pay" compared to an industry position.
Notable alumni
Many of MIT's over 120,000 alumni have achieved considerable success in scientific research, public service, education, and business. Among them, 41 MIT alumni have won Nobel Prizes, 48 have been selected as Rhodes Scholars, 61 as Marshall Scholars, and 3 as Mitchell Scholars.
Alumni in United States politics and public service include former Chairman of the Federal Reserve Ben Bernanke, former MA-1 Representative John Olver, former CA-13 Representative Pete Stark, Representative Thomas Massie, Senator Alex Padilla, former National Economic Council chairman Lawrence H. Summers, and former Council of Economic Advisors chairman Christina Romer. MIT alumni in international politics include Foreign Affairs Minister of Iran Ali Akbar Salehi, Israeli Prime Minister Benjamin Netanyahu, President of Colombia Virgilio Barco Vargas, President of the European Central Bank Mario Draghi, former Governor of the Reserve Bank of India Raghuram Rajan, former British Foreign Minister David Miliband, former Greek Prime Minister Lucas Papademos, former UN Secretary General Kofi Annan, former Iraqi Deputy Prime Minister Ahmed Chalabi, former Minister of Education and Culture of The Republic of Indonesia Yahya Muhaimin, former Jordanian Minister of Education, Higher Education and Scientific Research and former Jordanian Minister of Energy and Mineral Resources Khaled Toukan. Alumni in sports have included Olympic fencing champion Johan Harmenberg.
MIT alumni founded or co-founded many notable companies, such as Intel, McDonnell Douglas, Texas Instruments, 3Com, Qualcomm, Bose, Raytheon, Apotex, Koch Industries, Rockwell International, Genentech, Dropbox, and Campbell Soup. According to the British newspaper The Guardian, "a survey of living MIT alumni found that they have formed 25,800 companies, employing more than three million people including about a quarter of the workforce of Silicon Valley. Those firms collectively generate global revenues of about $1.9 trillion (£1.2 trillion) a year". If the companies founded by MIT alumni were a country, they would have the 11th-highest GDP of any country in the world.
MIT alumni have led prominent institutions of higher education, including the University of California system, Harvard University, the New York Institute of Technology, Johns Hopkins University, Carnegie Mellon University, Tufts University, Rochester Institute of Technology, Rhode Island School of Design (RISD), UC Berkeley College of Environmental Design, the New Jersey Institute of Technology, Northeastern University, Tel Aviv University, Lahore University of Management Sciences, Rensselaer Polytechnic Institute, Tecnológico de Monterrey, Purdue University, Virginia Polytechnic Institute, KAIST, and Quaid-e-Azam University. Berklee College of Music, the largest independent college of contemporary music in the world, was founded and led by MIT alumnus Lawrence Berk for more than three decades.
More than one third of the United States' crewed spaceflights have included MIT-educated astronauts, a contribution exceeding that of any university excluding the United States service academies. Of the 12 people who have set foot on the Moon, four graduated from MIT (among them Apollo 11 Lunar Module Pilot Buzz Aldrin). Alumnus and former faculty member Qian Xuesen led the Chinese nuclear-weapons program and became instrumental in the PRC rocket program.
Noted alumni in non-scientific fields include author Hugh Lofting, sculptor Daniel Chester French, guitarist Tom Scholz of the band Boston, the British BBC and ITN correspondent and political advisor David Walter, The New York Times columnist and Nobel Prize-winning economist Paul Krugman, The Bell Curve author Charles Murray, United States Supreme Court building architect Cass Gilbert, and Pritzker Prize-winning architects I.M. Pei and Gordon Bunshaft.
See also
The Coop, campus bookstore
Engineering
Glossary of engineering
Murray Eden
Notes
References
Citations
Sources
Also see the bibliography maintained by MIT's Institute Archives & Special Collections and Written Works in MIT in popular culture.
Nelkin, Dorothy. (1972). The University and Military Research: Moral politics at MIT (science, technology and society). New York: Cornell University Press.
Postle, Denis. (1965). How to be First. BBC documentary on MIT available at reidplaza.com
Renehan, Colm. (2007). Peace Activism at the Massachusetts Institute of Technology from 1975 to 2001: A case study, PhD thesis, Boston: Boston College.
External links
Universities and colleges in Cambridge, Massachusetts
Universities and colleges in Middlesex County, Massachusetts
Engineering universities and colleges in Massachusetts
Technological universities in the United States
Land-grant universities and colleges
Educational institutions established in 1861
1861 establishments in Massachusetts
Science and technology in Massachusetts
Private universities and colleges in Massachusetts |
8328026 | https://en.wikipedia.org/wiki/Classmate%20PC | Classmate PC | The Classmate PC, formerly known as Eduwise, is Intel's entry into the market for low-cost personal computers for children in the developing world. It is in some respects similar to the One Laptop Per Child (OLPC) trade association's Children's Machine (XO), which has a similar target market. Although made for profit, the Classmate PC is considered an Information and Communication Technologies for Development project (ICT4D). Introduced in 2006, the device falls into the then popular category of netbooks.
Intel's World Ahead Program was established in May 2006. The program designed a platform for low-cost laptops that third-party manufacturers could use to produce low-cost machines under their own respective brands.
The Classmate PC is a reference design by Intel. Intel does not build the subnotebooks, but does produce the chips that power them. The reference design is then used by original equipment manufacturers (OEMs) worldwide to build their own branded Classmate PC.
Classmate PC (Clamshell, first generation)
The reference hardware specifications as of September 28, 2006 are:
Customized mini chassis 245 mm × 196 mm × 44 mm
CPU: Intel Celeron M mobile processor (915GMS + ICH6-M)
CPU clock speed 900 MHz (with 32 KB L1 cache, no L2 cache, and 400 MHz FSB)
800 × 480 7-inch diagonal LCD, LVDS Interface, LED B/L
256 MB of DDR2 RAM
1 GB/2 GB flash memory (connected via USB)
10/100 Mbit/s Ethernet
Realtek WLAN 802.11b/g with antenna (connected via USB)
Intel GMS915 integrated graphics chip (8 MB shared memory)
Built in microphone
Built in stereo speakers
Stereo 2 channel audio, jacks for external stereo speakers and microphones, Line-out, and Mic-in
Integrated keyboard with hot keys
Cycle touch pad with left and right buttons
Customized Note Taker with wireless pen
TPM1.2 (Trusted Platform Module from Infineon Technologies or Nuvoton) used for the Intel anti-theft technology feature (discontinued in 2015)
Power source:
4-cell Li-ion battery with adapter – approximately 3.5 hours usage
6-cell Li-ion battery option – approximately 5 hours usage
There was a consumer model called the MiLeap X, built by HCL Infosystems, India.
Second generation (Convertible)
The successor to the original Classmate design was announced in April 2008 and reviewed. Different photos of the successor later leaked, and photos of the Classmate PC 3 as a tablet PC are also available. The second-generation Classmate was unveiled on 3 April 2008 at Intel's Developer Forum. Significant upgrades include:
Available 30 GB PATA hard disk drive (in addition to 1, 2, and 4 GB SSD).
Built-in webcam
Available 9" LCD (the 7" LCD is still available)
Up to 512 MB RAM
802.11s (mesh networking, currently only usable on Linux-based Classmates)
Available 6-cell battery for up to 6.5 hours usage
Touchscreen – pen and on-screen soft keyboard
Tablet mode – simple user-interface shell; quick launcher for tablet mode
Enhanced software – easier network connection and collaboration simple computer management, and localized, education-friendly content
Third generation (Convertible)
On Computex 2009, Intel presented the third generation of the Classmate PC. It comes with a camera and an accelerometer.
Intel-Powered Convertible Classmate PC
The Intel-Powered Convertible Classmate PC had its official release at CES in January 2009 and was aimed at students, teachers, and parents. The Convertible Classmate can be converted from a traditional laptop to a tablet PC to allow children to write and draw more naturally. This model was designed by TEAMS Design Shanghai and won several design awards such as the Appliance Design EID Award, 2008 Spark Award and IF 2008 China Award. The initial model includes the following:
Dual mode: tablet mode and traditional laptop mode
When open like a traditional laptop, the screen swivels 180 degrees for easier sharing
1.6 GHz Intel Atom processor
1 GB RAM (2 GB max.)
60 GB hard disk drive (PATA 1.8", ZIF socket)
8.9 inch touch screen & advanced palm-resting technology
Allows for writing or drawing directly on the screen
Built-in camera rotates 180° to enable students to interact in a new way
Portability: Carrying handle, lightweight and compact size
Water-resistant keyboard
Education-specific features and touch-optimized software
SD card reader; documentation says that you can boot the system off an SD card, or a standard USB thumbdrive
Comes with either Windows XP installed (standard) or Windows XP Professional (an extra US$186 on the 2go)
Software
Intel announced that its device would run either Linux or Windows XP Professional. Intel is not using Windows XP Embedded as initially planned. Intel has been actively working with various international and local Linux distributions in various countries.
Intel has worked with Mandriva to customize their Linux distribution for Classmate PC.
Currently, the Intel-powered Classmate PC has been shown to run the following Linux distributions:
Mandriva Linux (International & Pan-European Linux operating system mandriva.com)
Metasys (International Syst in Brazil)
Latin America
In Latin America, contingent upon the receipt of sufficient international fund monies, the Mexican and Brazilian governments are evaluating whether to buy Intel's or the OLPC's laptop. Regardless of the hardware chosen, the Brazilian government announced that it would use the Linux operating system. It has been confirmed that Intel will be shipping the laptops with Mandriva Linux, Discovery 2007 edition as well as the Classmate 2.0 Linux distribution by the Brazilian company Metasys.
In 2008 the Venezuelan government ordered one million Classmates from Portugal, one of several bilateral deals that Portuguese officials valued at more than US$3 billion.
Oscar Clarke, president of Intel of Brasil, delivered thirty production units to the Brazilian Ministry of Education (MEC), for evaluation by SERPRO (Federal Data Processing Service of Brazil).
It is currently available in Argentina, made by 10 different manufacturers with operating systems such as Rxart Linux (from the Argentinian company Pixart) and Windows, and in Paraguay through HITECER S.A - TROVATO CISA GROUP, also with Rxart Linux. Around the end of 2009, Argentina's government planned to give every public high school student one of these netbooks running Rxart Linux, and US$740 million was to be used to give all public schools routers and infrastructure to reduce the digital gap around the country.
In 2011, Mexico retracted its bid to buy several million Classmate PCs and instead selected a specialized variant of the Mexican-built Lanix LT laptop series running Windows 7 and Linux Rxart to equip students in 16,000 schools across the country.
Africa
In Africa, Intel has also started shipping to Libya as part of its deal of supplying 150,000 units.
In Seychelles, Classmate PCs were supplied to primary schools as part of the Sheikh Khalifa ICT project Seychelles 2010. In Kenya, Intel has partnered with the distributors of Mecer products, Mustek E.A, who have worked with other government and non-government organizations to distribute the Classmate PC to rural areas.
Asia
In 2009, the State Government of Terengganu, in cooperation with Top IT Industries Sdn Bhd and Intel Malaysia, introduced the Classmate PC through the local brand TC (now known as Top IT) as an e-book to replace traditional textbooks in primary schools throughout Terengganu. The first batch of the E09 model arrived in Terengganu in March 2009, and the state government has continued to provide Classmate PCs to primary school students for free. Successive models, the E09 (2009), E10 (2010) and the latest E11 (from 2011), have been distributed to boost ICT literacy among school children.
In Asia, it has been available in Indonesia since early March 2008, through two local brands: Axioo and Zyrex. The Zyrex brand, called Anoa, is a rebranded Classmate PC equipped with the Intel ULV 900 MHz (400 MHz FSB) processor, 512 MB RAM, 2 GB SSD, Wi-Fi, LAN, 7 inch screen, 2× USB ports, card reader. The Classmate PC is available in Linux or Windows XP operating system, with the XP version incurring extra cost to cover the licensing.
The Classmate PC is currently available in India from Connoiseur Electronics Pvt. Ltd. as the Smartbook & Classmate PC series, which comes with Windows or Ubuntu Linux pre-installed.
In late 2007, a deal was made with the Vietnamese government to supply local schools with a special Classmate PC for discounted price. As this version is loaded with Hacao Linux, the government was able to avoid operating system licensing fees.
Europe and USA
The second generation of the device will be available in Europe and the United States, in the hope that more sales will drive down the price.
The current generation is available from CTL (computers) in the United States under CTL's 2goPC brand as the 2go Classmate PC E10IS, the 2go Classmate PC E11 and the 2go Convertible Classmate PC NL2.
Classmate PC is available from CMS Computers and RM Education in the United Kingdom and most other European nations.
On May 20, 2008, Italian company Olidata announced the release of a modded version of the Classmate PC named Jumpc. This version was first on sale in Italy, but by the end of the year it was also available in many European countries.
On July 31, 2008, Intel, JP Sá Couto (the producer of the Tsunami Portuguese computers) and the Portuguese Government headed by Prime Minister José Sócrates announced the production of the "Magalhães" (a tribute to Portuguese navigator Magellan), an Intel Classmate-based computer that will be produced in Portugal (by JP Sá Couto) and distributed to Portuguese children in primary education for €50 (free or at €20 for students on social aid), as well as being exported to other countries.
Intel Classmate PC (the 7" version) is available in Greece, and sold as the InfoQuest "Quest Classmate", with a blue-coloured exterior. Its specifications include 2 GB storage, Windows XP Professional, no hard drive, no camera, and SD card support. It is sold by various retailers, including MediaMarkt.
Leading Serbian IT company "COMTRADE" planned to introduce the "ComTrade CoolBook" netbook (a Classmate PC) from April 2009; it had already donated 30 netbooks to a Belgrade elementary school.
Canada
The Intel Classmate series in Canada is available through MDG Computers under the following brand names:
The Intel Clamshell second generation is called the MDG Mini 8.9" Rugged Netbook PC
The Intel Convertible third generation is called the MDG Flip 8.9" Touchscreen Netbook PC
They are currently being sold through sears.ca and theshoppingchannel.com. For Q4 of 2009, 10.1" versions of both netbooks were planned.
Comparisons with OLPC project
The Classmate PC represents Intel's competitive response to the OLPC XO, whose unusually low prices and use of AMD chips threatened to steal market share from Intel and other major manufacturers. Although Intel initially criticized the XO laptops for their lack of functionality—Intel's Chairman of the Board, Craig Barrett, himself claimed consumers wanted the full functionality of a PC—the Classmate PCs are currently being heavily marketed against the XO worldwide. Intel had already secured deals to sell hundreds of thousands of Classmate PCs to Libya, Nigeria and Pakistan, the same developing countries the OLPC project had been targeting.
The goals of the Classmate PC project and the OLPC project have some differences. According to Intel, the Classmate PC aims to provide technology that fits into the larger, primarily Windows-based computing environment. However, according to the OLPC, the XO breaks from the desktop metaphor to provide a UI (Sugar) that they feel is more suited to the educational needs of children.
While the OLPC uses hardware and software highly customized for the educational environment, Intel has argued that the developing world wants to have generic PCs. In December 2005, Intel publicly dismissed the XO as a 'gadget'.
Intel joined the OLPC project in July 2007 and was widely expected to work on a version of the project's laptop that used an Intel chip, only to pull out of the project in January 2008. Intel spokesman Chuck Mulloy said it had pulled out because the OLPC organization had asked it to stop backing rival low-cost laptops, while OLPC's founder Nicholas Negroponte has accused Intel of underhand sales tactics and trying to block contracts to buy his machines.
Technical comparison
Media appearance
BBC World broadcast a program about the OLPC and the Classmate PC on 10 January 2008.
Notes
References
Intel reference hardware specifications sheet
External links
https://linuxmanr4.com/2013/11/26/micompumx-linux-argentina/
Appropriate technology
Information and communication technologies for development
Subnotebooks
Linux-based devices
Intel
E-learning |
2533718 | https://en.wikipedia.org/wiki/55th%20Wing | 55th Wing | The 55th Wing is a United States Air Force unit assigned to Air Combat Command. The wing is primarily stationed at Offutt Air Force Base, Nebraska, but maintains one of its groups and associated squadrons at Davis-Monthan Air Force Base, Arizona as a geographically separated unit.
The 55 WG is the only Air Force wing with continuous operations, maintenance, and aircraft presence in the United States Central Command area of responsibility since Operation Desert Storm.
The wing's mission is to provide worldwide reconnaissance, real-time intelligence, command and control, information warfare and combat support to U.S. leaders and commanders. One of the wing's units, the 55th Operations Group, operates 46 aircraft, including 13 models of seven different types. It is the largest wing in Air Combat Command and flies the most diverse number of aircraft.
History
For additional history and lineage, see 55th Operations Group
The "Fightin' Fifty-Fifth" has made significant contributions to the defense of the United States of America for more than 50 years. Since its inception, the unit has operated around the world, flying a wide variety of aircraft.
Cold War
On 1 November 1950, the 55th Strategic Reconnaissance Wing (55 SRW) was activated under the Wing Base Organization at Ramey Air Force Base, Puerto Rico as the headquarters for the 55th Strategic Reconnaissance Group and its supporting units. From 1950 to 1954 the Wing's task was to perform strategic reconnaissance, charting photography, precise electronic geodetic mapping, and electronic reconnaissance missions. In 1952, the wing moved to Forbes Air Force Base, Kansas and converted to Boeing RB-50 Superfortresses. On 13 March 1953, a wing RB-50 flying out of Eielson Air Force Base, Alaska was attacked by Soviet Mikoyan-Gurevich MiG-15 fighters near Siberia, but was able to ward off the fighter's attack with defensive fire. The United States protested the attack, stating the plane was on a weather reconnaissance flight over international waters, 25 miles from the Kamchatka Peninsula. The Soviets responded by saying the plane was intercepted over their territory near Cape Krestovoi. A little more than three months later, on 29 July 1953 an RB-50 of the wing's 343d Strategic Reconnaissance Squadron was shot down by Soviet fighters about ninety miles south of Vladivostok. The Soviet Union did not deny the plane's location was over water, but claimed that the bomber had twice flown over Soviet territory and fired on their MiGs, who then returned fire defensively.
The wing formally assumed a global strategic reconnaissance mission in 1954 and transitioned to the RB-47E "Stratojet." The Wing was deployed at Ben Guerir Air Base, in what was then French Morocco, between May and August 1955.
When the mapping and charting functions originally assigned to the 55th Reconnaissance Group were transferred on 1 May 1954, the wing assumed the mission of global strategic reconnaissance, including electronic reconnaissance. It also carried out weather reconnaissance operations until June 1963, and photographic reconnaissance missions until May 1964.
The 55 SRW moved to Offutt Air Force Base, Nebraska, in August 1966. That same year the 55th's 38th Strategic Reconnaissance Squadron assumed responsibility for SAC's airborne command and control system. The 2d Airborne Command and Control Squadron inherited this mission after activation in April 1970. The 1st Airborne Command and Control Squadron, flying E-4A aircraft, transferred to the 55th on 1 November 1975, bringing with it the National Emergency Airborne Command Post, now called the National Airborne Operations Center. The Wing flew reconnaissance operations during the U.S. military operations in Grenada in 1983 and Libya in 1986. On 1 March 1986, the 55 SRW became the host unit at Offutt after the inactivation of the 3902d Air Base Wing.
The Wing ended nearly twenty-five years of continuous Airborne Command Post ('Looking Glass') operations in 1990, assumed a modified alert posture, and continued worldwide reconnaissance. In October 1998, the wing transferred control of the EC-135 LOOKING GLASS mission to the United States Navy's TACAMO aircraft and the 7th Airborne Command and Control Squadron, which flew the EC-135 LOOKING GLASS aircraft, inactivated.
The wing deployed a Rivet Joint RC-135 from Hellenikon Air Base, Greece to Riyadh Air Base, Saudi Arabia on 8 August 1990, and began 24-hour-a-day reconnaissance of the region two days later for Central Command Commander Gen. Norman Schwarzkopf, under Operation Desert Shield. At the start of Operation Desert Storm, 18 January 1991, the wing continued to provide real-time information. In 1996, this operation moved to Prince Sultan Air Base, Saudi Arabia. On 9 August 2015, the wing celebrated 25 years of what is believed to be the longest continuous deployment by an Air Force unit.
Current operations
The 55th Strategic Reconnaissance Wing became the 55th Wing on 1 September 1991, to reflect the wing's performance of a diversity of missions. When SAC disestablished and Air Combat Command (ACC) established, the wing transferred to ACC and gained its fifth operational location.
As the 55th SRW and later the 55th Wing, the unit has been awarded the USAF's P. T. Cullen Award five times since 1971 for its contributions to photo and signal intelligence collection.
Aircraft and crews from the unit have at times temporarily relocated to the nearby Lincoln Air National Guard Base when Offutt's runway has been closed for repairs.
Mission
The 55th Operations Group, Air Combat Command's largest group, has operational control over 12 squadrons and two detachments worldwide. It employs 46 aircraft, including 13 models of seven different types.
The 55th Communications Group provides worldwide command, control, communications and computer (C4) systems, information management and combat support to war-fighting and national leadership. It also provides communications technology and support to the 55th Wing and 44 tenant units.
Combat-ready EC-130H Compass Call aircraft, crews, maintenance and operational support are provided to combatant commanders by the 55th Electronic Combat Group, based at Davis-Monthan Air Force Base, Arizona.
Operations are supported by the 55th Maintenance Group which provides centralized direction of all maintenance staff functions providing support to world-wide aircraft reconnaissance missions. The 55th Medical Group serves 50,000 beneficiaries with extensive outpatient clinic capabilities and ancillary support and the 55th Mission Support Group provides vital mission support for Offutt Air Force Base through engineering, security, mission support, services, supply, transportation, contracting and deployment readiness programs.
Component units and assigned aircraft
Unless otherwise indicated, units are based at Offutt AFB, Nebraska, and subordinate units are located at the same location as their commanding group.
55th Wing Staff
55th Comptroller Squadron
55th Operations Group
38th Reconnaissance Squadron – RC-135V/W Rivet Joint, TC-135W
45th Reconnaissance Squadron – OC-135B Open Skies, RC-135S Cobra Ball, RC-135U Combat Sent, TC-135W, WC-135W Constant Phoenix
55th Intelligence Support Squadron
55th Operations Support Squadron
82d Reconnaissance Squadron (Kadena AB, Japan) – RC-135
95th Reconnaissance Squadron (RAF Mildenhall, United Kingdom) – RC-135
97th Intelligence Squadron
338th Combat Training Squadron – RC-135, OC-135, WC-135
343rd Reconnaissance Squadron – RC-135V/W Rivet Joint, TC-135W
390th Intelligence Squadron
488th Intelligence Squadron
55th Communications Group
55th Cyber Squadron
55th Strategic Communications Squadron
55th Electronic Combat Group (Davis-Monthan AFB, Arizona)
41st Electronic Combat Squadron – EC-130H Compass Call
42nd Electronic Combat Squadron – EC-130H Compass Call
43rd Electronic Combat Squadron – EC-130H Compass Call
755th Aircraft Maintenance Squadron
755th Operations Support Squadron
55th Maintenance Group
55th Aircraft Maintenance Squadron
55th Maintenance Squadron
55th Medical Group
55th Aerospace Medicine Squadron
55th Dental Squadron
55th Medical Operations Squadron
55th Medical Support Squadron
55th Mission Support Group
55th Civil Engineering Squadron
55th Contracting Squadron
55th Force Support Squadron
55th Logistics Readiness Flight
55th Security Forces Squadron
Lineage
Established as the 55th Strategic Reconnaissance Wing on 29 June 1948
Activated on 19 July 1948
Inactivated on 14 October 1949
Redesignated 55th Strategic Reconnaissance Wing, Medium on 27 October 1950
Activated on 1 November 1950
Redesignated: 55th Strategic Reconnaissance Wing on 16 August 1966
Redesignated: 55th Wing on 1 September 1991
Assignments
311th Air Division, 19 July 1948 – 14 October 1949
Second Air Force, 1 November 1950
21 Air (later, 21 Strategic Aerospace) Division, 1 October 1952
Attached to 5th Air Division, 18 May-16 August 1955
810th Strategic Aerospace Division, 1 September 1964
12th Strategic Aerospace Division, 2 July 1966
14 Strategic Aerospace (later, 14 Air) Division, 30 June 1971
4th Air Division, 1 October 1976
57th Air Division, 1 April 1980
12th Air Division, 1 October 1982
14th Air Division, 1 October 1985
Second Air Force, 1 September 1991
Twelfth Air Force, 1 July 1993
Eighth Air Force, 1 October 2002
Twelfth Air Force, 1 October 2009
Twenty-Fifth Air Force, 1 October 2014
Sixteenth Air Force, 11 October 2019 – present
Components
Groups
55th Strategic Reconnaissance Group (later 55th Operations Group): 19 July 1948 – 14 October 1949; 1 November 1950 - 16 June 1952; 1 September 1991 – present
55th Electronic Combat Group: 3 February 2003 – present
Squadrons
1st Airborne Command and Control Squadron: 1 November 1975 – 6 October 2016
1st Strategic Reconnaissance Squadron (Provisional): attached 1 September-9 October 1948
1st Strategic Reconnaissance Squadron: attached 10–26 October 1948; attached 14 January-1 June 1949
2d Airborne Command and Control Squadron: 1 April 1970 – 19 July 1994
7th Airborne Command and Control Squadron: 19 July 1994 – 1 October 1998
23d Strategic Reconnaissance Squadron: attached 1–17 June 1949
38th Reconnaissance: attached 6 January 1951 – 15 June 1952, assigned 16 June 1952 – 1 April 1970; assigned 1 April 1979 – Present.
24th Reconnaissance Squadron, 7 July 1992 – 30 June 1994
45th Reconnaissance Squadron, 1 July 1994 – Present
55th Air Refueling Squadron: attached 8 January 1951 – 15 June 1952, assigned 16 June 1952 – 18 February 1954; assigned 1 October 1955 – 15 March 1963 (detached 31 October-27 December 1956)
55th Mobile Command and Control Squadron: 1984-29 September 2006
82d Reconnaissance Squadron: 2 October 1991 – present
97th Intelligence Squadron: ?–present
323d Strategic Reconnaissance Squadron: attached 19 September-10 October 1949
338th Strategic Reconnaissance Squadron: attached 25 November 1950 – 15 June 1952, assigned 16 June 1952 – 15 June 1963; assigned 25 March-25 December 1967
343d Strategic Reconnaissance Squadron (later 343d Reconnaissance Squadron): attached 19 July-26 October 1948; attached 4 January 1951 – 15 June 1952, assigned 16 June 1952 - Present.
390th Intelligence Squadron ?–present
488th Intelligence Squadron?–present
548th Strategic Missile Squadron: attached 1–31 August 1964, assigned 1 September 1964 – 25 March 1965
922d Reconnaissance Squadron: 1 June 1992 – 30 June 1994
Stations
Topeka Air Force Base (later Forbes Air Force Base), Kansas, 19 July 1948 – 14 October 1949
Ramey Air Force Base, Puerto Rico, 1 November 1950
Forbes Air Force Base, Kansas, 5 October 1952
Offutt Air Force Base, Nebraska, 16 August 1966 – present
Aircraft and missiles
B/RB-17 Flying Fortress, 1948–1949
B/RB-29 Superfortress, 1948–1949; 1950-1951
RC-54 Skymaster, 1948
RB-50 Superfortress, 1950–1954
EB/RB-47 Stratojet, 1954–1967
KC-97 Stratofreighter, 1956-?
SM-65 Atlas, 1964–1965
EC-135, 1966–1998
KC-135 Stratotanker, 1966–1998
RC-135, 1967–present
Boeing E-4, 1975–2016
C-135, 1977–1994
NKC-135, 1983–1994
TC-135, 1988–present
T-38, 1992–1995
WC-135, 1992–present
C-21: 1993-1997
OC-135: 1994–present
EC-130 Hercules, 2002–present
See also
List of B-29 units of the United States Air Force
List of B-50 units of the United States Air Force
List of B-47 units of the United States Air Force
References
Notes
Explanatory notes
Citations
Bibliography
External links
Offutt Air Force Base units - 55th Wing
55th Wing Fact Sheet
USAF Aircraft Serial Number Search
"A Tale of Two Airplanes" by Kingdon R. "King" Hawes, Lt Col, USAF (Ret.)
Military units and formations in Nebraska |
909695 | https://en.wikipedia.org/wiki/SuperTux | SuperTux | SuperTux is a free and open-source two-dimensional platform video game published under the GNU General Public License (GPL). The game was inspired by Nintendo's Super Mario Bros. series; instead of Mario, the hero in the game is Tux, the official mascot of the Linux kernel.
History
The game was originally created by Bill Kendrick and is maintained by the SuperTux Development Team. It is written mostly in the C++ programming language. Many of the in-game graphics were created by Ingo Ruhnke, author of Pingus.
The game was developed using the Simple DirectMedia Layer as a cross-platform middle layer targeting OpenGL and OpenAL. The game engine and physics engine were developed in-house. The game's metadata are stored as Lisp-style S-expressions, and scripts are written in Squirrel.
The development occurs in a series of stable milestones, each one improving steadily upon the last. Milestone 1 (versions 0.1.1-0.1.3) was first released in 2004. Version 0.4.0 was released on December 20, 2015, featuring significant improvements to gameplay, all new graphics, a switch to SDL2, and new features.
Milestone 2 (version 0.5.0) was officially released as stable in 2016, with the inclusion of the official level editor. Version 0.6.0 was released on December 23, 2018 with redesigned Icy Island and Forest, revamped rendering engine and many minor improvements.
On January 13, 2022, SuperTux was released on Steam as an Early Access game.
Gameplay
Gameplay in SuperTux is similar to Super Mario Bros.. Tux can jump under bonus blocks marked with question marks to gain coins or retrieve power-ups such as the egg, which makes Tux bigger and allows him to take an extra hit before dying. Other objects such as trampolines and invincibility granting stars can also be obtained from these blocks. Tux can defeat some bad guys by jumping on them, and most can be defeated or frozen by shooting bullets after collecting a fire flower or an ice flower. Earth flowers grant Tux a miner helmet with a spotlight for dark areas and can give invincibility for a few seconds, and air flowers allow Tux to glide in the air, jump higher and move faster. If Tux gets hit after he collected a flower, he loses his helmet and transforms back into big Tux. The objective of each level is to get to the end, usually marked by checker-patterned poles.
At the end of each world is a boss, such as the Yeti boss on Icy Island.
Contributed Levels
In addition to the two main worlds, there are contributed levels, which include the Bonus Islands and a special retro levelset (Revenge in Redmond) designed to celebrate the game's 20th anniversary. Beyond these, there are installable add-ons and custom levels added by the player, either created in the internal Level Editor or added manually.
Add-on levels
There are additional add-on levels in SuperTux which can be installed with the built-in add-on manager or manually. The add-on manager lists over 20 add-ons. New add-ons are usually published on the forum and can be added to the list after testing.
Plot
In the game, Tux begins on Icy Island, where he holds a picnic with Penny, his girlfriend. He starts dancing and, distracted, does not notice the game's villain, Nolok, kidnapping Penny. Once he finds that Penny is missing, Tux, determined to save her, begins his journey. He then navigates Icy Island and later the Forest to find her.
Reception
In 2007 Punto Informatico described the atmosphere of the game as pleasant and praised the free availability of the game.
In 2008, SuperTux was used as a game for children by school district #73 in British Columbia, which had decided to transition to free and open-source software.
Also in 2008, SuperTux was included in the Caixa Mágica operating system, a popular OS used in the Magalhães computer.
The game was ported to other platforms, including GP2X, Pocket PC, PSP, and Palm WebOS. The game was also scheduled to be included in the release of the EVO Smart Console as of April 2009.
As of May 2017, the download portal Softpedia listed over 80,000 downloads for the Linux version alone, and Softonic over 750,000 downloads for the Windows version. Between 2002 and May 2017, SuperTux also accumulated over 850,000 downloads via SourceForge.net.
See also
Secret Maryo Chronicles
Mari0
List of open-source video games
References
External links
Official SuperTux website
Download stable or development snapshots
Review and screen shots of Milestone 1 at HeadshotGamer.com
2004 video games
BeOS games
GP2X games
Linux games
Open-source video games
Platform games
MacOS games
Video games developed in the United States
Video games set in Antarctica
Video game clones
Windows games |
2010815 | https://en.wikipedia.org/wiki/How%20to%20Make%20a%20Monster%20%282001%20film%29 | How to Make a Monster (2001 film) | How to Make a Monster is a 2001 film starring Steven Culp and Clea DuVall. It is the third release in the Creature Features series of film remakes produced by Stan Winston. Julie Strain made a cameo appearance in the film as herself. How to Make a Monster debuted on October 14, 2001 on Cinemax. In 2005, it was nominated for a Hollywood Makeup Artist Award and Hair Stylist Guild Award.
Plot
Following a disastrous video game test, Clayton Software fires the development team and replaces them with weapons expert Hardcore, game artificial intelligence designer Sol, and sound effects creator Bug. Company CEO Faye Clayton promises $1 million to whoever makes the scariest game, sparking a rivalry amongst the trio. Three weeks later, the programmers try out their game and a computer network-integrated motion capture suit with help from company intern Laura Wheeler. However, lightning strikes the building, causing a blackout and wiping their data. Knowing someone has to stay overnight to monitor the backup, the four use the game to decide. Ultimately, Sol is chosen to stay behind.
After he inserts his new AI chip into the company mainframe, the suit activates and kills Sol. The following morning, Hardcore and Bug find Sol's body merged to the suit and the backup CD gone. The former attempts to review the security camera's footage, but is attacked by the suit, which decapitates him so it can take his body and weapons to better resemble one of the game's monsters. Upon learning of what happened, Bug theorizes the lightning strike, Sol's chip, and Hardcore's system rewrite caused the suit to believe the real world is part of the game. To stop the monster, Bug, Laura, and Clayton businessman Peter Drummond try to shut down the computer and wipe the game's data. However, the security system malfunctions, trapping them inside. The monster attacks Bug, but he exposes a gas line and uses his lighter to ignite the gas, killing himself and the monster. Nonetheless, it returns to the mo-cap suit and attacks Drummond, but Laura saves him by fighting the in-game monster. She tries to beat the game, but becomes frustrated and hysterical until Drummond suggests she use a virtual reality headset, promising to stay with her while she fights. In the midst of playing, though, she realizes he left her before the real-world monster returns. She escapes to the kitchen, where she finds Hardcore's PDA contains footage of Drummond stealing the backup CD. She later finds and confronts Drummond at gunpoint. After he mocks her, she shoots him in the knee and allows the monster to kill him before luring it to a fish tank to electrocute it.
Sometime later, a jaded, world-weary Laura turns in the final version of the game and demands the bonus for herself, which she uses to become the new CEO of Clayton Software, renaming it Wheeler Software.
Cast
Steven Culp as Peter S. Drummond, a hardened businessman at Clayton Software. He is revealed to have stolen the backup copy of the game.
Clea DuVall as Laura Wheeler, the kind, 24-year-old intern of Clayton Software. Eventually, Laura kills the monster and becomes a ruthless CEO.
Tyler Mane as 'Hardcore', the muscular developer who is responsible for designing the game's weapons for motion capture sessions.
Jason Marsden as 'Bug', the developer who creates the sound and music for the game.
Karim Prince as Sol, the developer who programs the game's artificial intelligence.
Julie Strain as Herself. Strain makes a cameo appearance in the film when she arrives for a motion-capture session.
James Sullivan as The Monster, who is brought to life through a motion capture suit as a result of a lightning strike.
Colleen Camp as Faye Clayton, the head of the computer software company Clayton Software.
Danny Masterson as Jeremy, the abusive rapist boyfriend of Laura Wheeler.
DVD release
How to Make a Monster was released on DVD on June 11, 2002. The film is presented in anamorphic widescreen and its audio is presented in 5.1 surround sound in both English and French. Extra features include a "making-of" featurette, photo galleries of drawings and behind-the-scenes images, and theatrical trailers for other Columbia TriStar horror films. The DVD also includes DVD-ROM content for personal computers.
Reception
Nathan Rabin of The A.V. Club wrote, "Huang's film intermittently qualifies as an intriguing experiment, but it quickly runs out of ideas and energy." Maitland McDonagh of TV Guide rated it 2/4 stars and wrote, "One in a series of made-for-cable, in-name-only remakes of cheesy cult classics, this sci-fi horror picture is tailor-made for people who hate video games." Beyond Hollywood wrote that the film's budget limitations and technological ignorance make it "at best silly, and at worst pure crap". Adam Tyner of DVD Talk rated it 2/5 stars and concluded, "It's not an awful movie, after all...just a decidedly lackluster one."
References
Further reading
Staiger, Michael. "Evilution - Die Bestie aus dem Cyberspace". In "film-dienst" (Germany), Vol. 55, Iss. 15, 16 July 2002, Pg. 32
External links
How to Make a Monster at Allmovie
Post-Halloween Horror Special – How to Make a Monster
Video Review on gameXcess.net
2001 films
2001 horror films
2001 television films
2001 science fiction films
Remakes of American films
American films
American science fiction horror films
English-language films
Horror film remakes
2000s monster movies
American robot films
American science fiction television films
Films directed by George Huang
American monster movies
American horror television films
Films about video games |
6759 | https://en.wikipedia.org/wiki/Context-free%20grammar | Context-free grammar | In formal language theory, a context-free grammar (CFG) is a formal grammar whose production rules are of the form
with a single nonterminal symbol, and a string of terminals and/or nonterminals ( can be empty). A formal grammar is "context free" if its production rules can be applied regardless of the context of a nonterminal. No matter which symbols surround it, the single nonterminal on the left hand side can always be replaced by the right hand side. This is what distinguishes it from a context-sensitive grammar.
A formal grammar is essentially a set of production rules that describe all possible strings in a given formal language. Production rules are simple replacements. For example, the first rule in the picture,
replaces with . There can be multiple replacement rules for a given nonterminal symbol. The language generated by a grammar is the set of all strings of terminal symbols that can be derived, by repeated rule applications, from some particular nonterminal symbol ("start symbol").
Nonterminal symbols are used during the derivation process, but do not appear in its final result string.
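This rewriting process can be sketched in a few lines of Python. The grammar in the sketch below (S → aSb and S → ab, generating strings of the form a^n b^n) is an assumed example chosen purely for illustration; the code repeatedly rewrites the leftmost nonterminal until only terminal symbols remain.

    import random

    # Assumed toy grammar for illustration: S -> aSb | ab.
    rules = {"S": ["aSb", "ab"]}

    def derive(form="S", max_steps=20):
        """Rewrite the leftmost nonterminal repeatedly, for at most max_steps steps."""
        for _ in range(max_steps):
            positions = [i for i, ch in enumerate(form) if ch in rules]
            if not positions:        # only terminal symbols remain
                break
            i = positions[0]         # leftmost nonterminal
            form = form[:i] + random.choice(rules[form[i]]) + form[i + 1:]
        return form

    print(derive())   # e.g. "aaabbb" -- a member of the generated language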
Languages generated by context-free grammars are known as context-free languages (CFL). Different context-free grammars can generate the same context-free language. It is important to distinguish the properties of the language (intrinsic properties) from the properties of a particular grammar (extrinsic properties). The language equality question (do two given context-free grammars generate the same language?) is undecidable.
Context-free grammars arise in linguistics, where they are used to describe the structure of sentences and words in a natural language, and they were in fact invented by the linguist Noam Chomsky for this purpose. In computer science, as the use of recursively defined concepts increased, context-free grammars came to be used more and more. In an early application, grammars were used to describe the structure of programming languages. In a newer application, they are used in an essential part of the Extensible Markup Language (XML) called the Document Type Definition.
In linguistics, some authors use the term phrase structure grammar to refer to context-free grammars, whereby phrase-structure grammars are distinct from dependency grammars. In computer science, a popular notation for context-free grammars is Backus–Naur form, or BNF.
Background
Since the time of Pāṇini, at least, linguists have described the grammars of languages in terms of their block structure, and described how sentences are recursively built up from smaller phrases, and eventually individual words or word elements. An essential property of these block structures is that logical units never overlap. For example, the sentence:
John, whose blue car was in the garage, walked to the grocery store.
can be logically parenthesized (with the logical metasymbols [ ]) as follows:
[John[, [whose [blue car]] [was [in [the garage]]],]] [walked [to [the [grocery store]]]].
A context-free grammar provides a simple and mathematically precise mechanism for describing the methods by which phrases in some natural language are built from smaller blocks, capturing the "block structure" of sentences in a natural way. Its simplicity makes the formalism amenable to rigorous mathematical study. Important features of natural language syntax such as agreement and reference are not part of the context-free grammar, but the basic recursive structure of sentences, the way in which clauses nest inside other clauses, and the way in which lists of adjectives and adverbs are swallowed by nouns and verbs, is described exactly.
Context-free grammars are a special form of Semi-Thue systems that in their general form date back to the work of Axel Thue.
The formalism of context-free grammars was developed in the mid-1950s by Noam Chomsky, and also their classification as a special type of formal grammar (which he called phrase-structure grammars). What Chomsky called a phrase structure grammar is also known now as a constituency grammar, whereby constituency grammars stand in contrast to dependency grammars. In Chomsky's generative grammar framework, the syntax of natural language was described by context-free rules combined with transformation rules.
Block structure was introduced into computer programming languages by the Algol project (1957–1960), which, as a consequence, also featured a context-free grammar to describe the resulting Algol syntax. This became a standard feature of computer languages, and the notation for grammars used in concrete descriptions of computer languages came to be known as Backus–Naur form, after two members of the Algol language design committee. The "block structure" aspect that context-free grammars capture is so fundamental to grammar that the terms syntax and grammar are often identified with context-free grammar rules, especially in computer science. Formal constraints not captured by the grammar are then considered to be part of the "semantics" of the language.
Context-free grammars are simple enough to allow the construction of efficient parsing algorithms that, for a given string, determine whether and how it can be generated from the grammar. An Earley parser is an example of such an algorithm, while the widely used LR and LL parsers are simpler algorithms that deal only with more restrictive subsets of context-free grammars.
Formal definitions
A context-free grammar is defined by the 4-tuple G = (V, Σ, R, S),
where
V is a finite set; each element v ∈ V is called a nonterminal character or a variable. Each variable represents a different type of phrase or clause in the sentence. Variables are also sometimes called syntactic categories. Each variable defines a sub-language of the language defined by G.
Σ is a finite set of terminals, disjoint from V, which make up the actual content of the sentence. The set of terminals is the alphabet of the language defined by the grammar G.
R is a finite relation in V × (V ∪ Σ)*, where the asterisk represents the Kleene star operation. The members of R are called the (rewrite) rules or productions of the grammar (R is also commonly symbolized by a P).
S is the start variable (or start symbol), used to represent the whole sentence (or program). It must be an element of V.
Production rule notation
A production rule in R is formalized mathematically as a pair (α, β) ∈ R, where α ∈ V is a nonterminal and β ∈ (V ∪ Σ)* is a string of variables and/or terminals; rather than using ordered pair notation, production rules are usually written using an arrow operator with α as its left hand side and β as its right hand side:
α → β.
It is allowed for β to be the empty string, and in this case it is customary to denote it by ε. The form α → ε is called an ε-production.
It is common to list all right-hand sides for the same left-hand side on the same line, using | (the pipe symbol) to separate them. Rules α → β1 and α → β2 can hence be written as α → β1 | β2. In this case, β1 and β2 are called the first and second alternative, respectively.
Rule application
For any strings u, v ∈ (V ∪ Σ)*, we say u directly yields v, written as u ⇒ v, if there is a rule (α, β) ∈ R with α ∈ V and strings u1, u2 ∈ (V ∪ Σ)* such that u = u1αu2 and v = u1βu2. Thus, v is a result of applying the rule α → β to u.
Repetitive rule application
For any strings u, v ∈ (V ∪ Σ)*, we say u yields v or v is derived from u if there is a positive integer k and strings u1, ..., uk ∈ (V ∪ Σ)* such that u = u1 ⇒ u2 ⇒ ... ⇒ uk = v. This relation is denoted u ⇒* v, or u ⇒⇒ v in some textbooks. If k ≥ 2, the relation u ⇒+ v holds. In other words, ⇒* and ⇒+ are the reflexive transitive closure (allowing a string to yield itself) and the transitive closure (requiring at least one step) of ⇒, respectively.
Context-free language
The language of a grammar G = (V, Σ, R, S) is the set
L(G) = {w ∈ Σ* : S ⇒* w}
of all terminal-symbol strings derivable from the start symbol.
A language L is said to be a context-free language (CFL) if there exists a CFG G such that L = L(G).
Non-deterministic pushdown automata recognize exactly the context-free languages.
Examples
Words concatenated with their reverse
The grammar G = ({S}, {a, b}, P, S), with productions
S → aSa,
S → bSb,
S → ε,
is context-free. It is not proper since it includes an ε-production. A typical derivation in this grammar is
S ⇒ aSa ⇒ aaSaa ⇒ aabSbaa ⇒ aabbaa.
This makes it clear that
L(G) = {ww^R : w ∈ {a, b}*} (where w^R denotes the reverse of w).
The language is context-free; however, it can be proved that it is not regular.
If the productions
S → a,
S → b,
are added, a context-free grammar for the set of all palindromes over the alphabet {a, b} is obtained.
Well-formed parentheses
The canonical example of a context-free grammar is parenthesis matching, which is representative of the general case. There are two terminal symbols "(" and ")" and one nonterminal symbol S. The production rules are
S → SS,
S → (S),
S → ()
The first rule allows the S symbol to multiply; the second rule allows the S symbol to become enclosed by matching parentheses; and the third rule terminates the recursion.
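Membership in the language of this grammar can be tested directly, since a string over "(" and ")" is derivable from S exactly when it is nonempty and balanced. The short Python sketch below is an added illustration and uses the usual counter argument rather than the grammar itself:

    def well_formed(s):
        """Check whether s is in the language of S -> SS | (S) | ()."""
        depth = 0
        for ch in s:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
                if depth < 0:            # a ")" with no open "(" to match
                    return False
            else:
                return False             # only "(" and ")" are terminal symbols
        return s != "" and depth == 0    # nonempty (no ε-production) and fully closed

    print(well_formed("(()())"))   # True
    print(well_formed("())("))     # False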
Well-formed nested parentheses and square brackets
A second canonical example is two different kinds of matching nested parentheses, described by the productions:
S → SS
S → ()
S → (S)
S → []
S → [S]
with terminal symbols [ ] ( ) and nonterminal S.
The following sequence can be derived in that grammar:
Matching pairs
In a context-free grammar, we can pair up characters the way we do with brackets. The simplest example:
S → aSb
S → ab
This grammar generates the language {a^n b^n : n ≥ 1}, which is not regular (according to the pumping lemma for regular languages).
The special character ε stands for the empty string. By changing the above grammar to
S → aSb
S → ε
we obtain a grammar generating the language {a^n b^n : n ≥ 0} instead. This differs only in that it contains the empty string while the original grammar did not.
Distinct number of a's and b's
A context-free grammar for the language consisting of all strings over {a,b} containing an unequal number of a's and b's:
Here, the nonterminal T can generate all strings with more a's than b's, the nonterminal U generates all strings with more b's than a's and the nonterminal V generates all strings with an equal number of a's and b's. Omitting the third alternative in the rules for T and U doesn't restrict the grammar's language.
Second block of b's of double size
Another example of a non-regular language is . It is context-free as it can be generated by the following context-free grammar:
First-order logic formulas
The formation rules for the terms and formulas of formal logic fit the definition of context-free grammar, except that the set of symbols may be infinite and there may be more than one start symbol.
Examples of languages that are not context free
In contrast to well-formed nested parentheses and square brackets in the previous section, there is no context-free grammar for generating all sequences of two different types of parentheses, each separately balanced disregarding the other, where the two types need not nest inside one another, for example:
or
The fact that this language is not context free can be proven using Pumping lemma for context-free languages and a proof by contradiction, observing that all words of the form
should belong to the language. This language belongs instead to a more general class and can be described by a conjunctive grammar, which in turn also includes other non-context-free languages, such as the language of all words of the form
.
Regular grammars
Every regular grammar is context-free, but not all context-free grammars are regular. The following context-free grammar, for example, is also regular:
S → a
S → aS
S → bS
The terminals here are a and b, while the only nonterminal is S.
The language described is all nonempty strings of a's and b's that end in a.
This grammar is regular: no rule has more than one nonterminal in its right-hand side, and each of these nonterminals is at the same end of the right-hand side.
Every regular grammar corresponds directly to a nondeterministic finite automaton, so we know that this is a regular language.
Using pipe symbols, the grammar above can be described more tersely as follows:
S → a | aS | bS
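This correspondence can be made concrete with a short Python sketch (an added illustration; the rules are assumed here to be S → a | aS | bS). Each rule is read as a transition of a nondeterministic finite automaton with one working state S and one accepting state:

    # NFA read off the right-linear grammar S -> a | aS | bS (illustrative sketch):
    # in state S, input "a" may either loop on S or move to the accepting state.
    transitions = {("S", "a"): {"S", "acc"}, ("S", "b"): {"S"}}

    def accepts(word, start="S", accepting=frozenset({"acc"})):
        states = {start}
        for ch in word:
            states = set().union(*(transitions.get((q, ch), set()) for q in states))
        return bool(states & accepting)

    print(accepts("abba"))  # True: nonempty and ends in "a"
    print(accepts("ab"))    # False: ends in "b"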
Derivations and syntax trees
A derivation of a string for a grammar is a sequence of grammar rule applications that transform the start symbol into the string.
A derivation proves that the string belongs to the grammar's language.
A derivation is fully determined by giving, for each step:
the rule applied in that step
the occurrence of its left-hand side to which it is applied
For clarity, the intermediate string is usually given as well.
For instance, with the grammar:
1. S → S + S
2. S → 1
3. S → a
the string
1 + 1 + a
can be derived from the start symbol S with the following derivation:
S ⇒ S + S (by rule 1. on S)
⇒ S + S + S (by rule 1. on the second S)
⇒ 1 + S + S (by rule 2. on the first S)
⇒ 1 + 1 + S (by rule 2. on the second S)
⇒ 1 + 1 + a (by rule 3. on the third S)
Often, a strategy is followed that deterministically chooses the next nonterminal to rewrite:
in a leftmost derivation, it is always the leftmost nonterminal;
in a rightmost derivation, it is always the rightmost nonterminal.
Given such a strategy, a derivation is completely determined by the sequence of rules applied. For instance, one leftmost derivation of the same string is
S ⇒ S + S (by rule 1 on the leftmost S)
⇒ 1 + S (by rule 2 on the leftmost S)
⇒ 1 + S + S (by rule 1 on the leftmost S)
⇒ 1 + 1 + S (by rule 2 on the leftmost S)
⇒ 1 + 1 + a (by rule 3 on the leftmost S),
which can be summarized as
rule 1
rule 2
rule 1
rule 2
rule 3.
One rightmost derivation is:
S ⇒ S + S (by rule 1 on the rightmost S)
⇒ S + S + S (by rule 1 on the rightmost S)
⇒ S + S + a (by rule 3 on the rightmost S)
⇒ S + 1 + a (by rule 2 on the rightmost S)
⇒ 1 + 1 + a (by rule 2 on the rightmost S),
which can be summarized as
rule 1
rule 1
rule 3
rule 2
rule 2.
The distinction between leftmost derivation and rightmost derivation is important because in most parsers the transformation of the input is defined by giving a piece of code for every grammar rule that is executed whenever the rule is applied. Therefore, it is important to know whether the parser determines a leftmost or a rightmost derivation because this determines the order in which the pieces of code will be executed. See for an example LL parsers and LR parsers.
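Because the strategy fixes which nonterminal is rewritten, the sequence of rule numbers alone reconstructs the entire derivation. The Python sketch below is an added illustration using the assumed rules 1: S → S + S, 2: S → 1 and 3: S → a; it replays a leftmost derivation from a list of rule numbers:

    # Assumed grammar for illustration: 1: S -> S + S,  2: S -> 1,  3: S -> a.
    rules = {1: ("S", ["S", "+", "S"]), 2: ("S", ["1"]), 3: ("S", ["a"])}

    def leftmost_derivation(rule_sequence, start="S"):
        """Apply each rule to the leftmost occurrence of its left-hand side."""
        form = [start]
        steps = [" ".join(form)]
        for r in rule_sequence:
            lhs, rhs = rules[r]
            i = form.index(lhs)                  # leftmost occurrence of lhs
            form = form[:i] + rhs + form[i + 1:]
            steps.append(" ".join(form))
        return steps

    print(" => ".join(leftmost_derivation([1, 2, 1, 2, 3])))
    # S => S + S => 1 + S => 1 + S + S => 1 + 1 + S => 1 + 1 + a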
A derivation also imposes in some sense a hierarchical structure on the string that is derived. For example, if the string "1 + 1 + a" is derived according to the leftmost derivation outlined above, the structure of the string would be:
{{1}S + {{1}S + {a}S}S}S
where {...}S indicates a substring recognized as belonging to S. This hierarchy can also be seen as a tree:
This tree is called a parse tree or "concrete syntax tree" of the string, by contrast with the abstract syntax tree. In this case the presented leftmost and the rightmost derivations define the same parse tree; however, there is another rightmost derivation of the same string
S ⇒ S + S (by rule 1 on the rightmost S)
⇒ S + a (by rule 3 on the rightmost S)
⇒ S + S + a (by rule 1 on the rightmost S)
⇒ S + 1 + a (by rule 2 on the rightmost S)
⇒ 1 + 1 + a (by rule 2 on the rightmost S),
which defines a string with a different structure
and a different parse tree:
Note however that both parse trees can be obtained by both leftmost and rightmost derivations. For example, the last tree can be obtained with the leftmost derivation as follows:
S ⇒ S + S (by rule 1 on the leftmost S)
⇒ S + S + S (by rule 1 on the leftmost S)
⇒ 1 + S + S (by rule 2 on the leftmost S)
⇒ 1 + 1 + S (by rule 2 on the leftmost S)
⇒ 1 + 1 + a (by rule 3 on the leftmost S),
If a string in the language of the grammar has more than one parsing tree, then the grammar is said to be an ambiguous grammar. Such grammars are usually hard to parse because the parser cannot always decide which grammar rule it has to apply. Usually, ambiguity is a feature of the grammar, not the language, and an unambiguous grammar can be found that generates the same context-free language. However, there are certain languages that can only be generated by ambiguous grammars; such languages are called inherently ambiguous languages.
Example: Algebraic expressions
Here is a context-free grammar for syntactically correct infix algebraic expressions in the variables x, y and z:
1. S → x
2. S → y
3. S → z
4. S → S + S
5. S → S − S
6. S → S * S
7. S → S / S
8. S → ( S )
This grammar can, for example, generate the string
as follows:
(by rule 5)
(by rule 6, applied to the leftmost )
(by rule 7, applied to the rightmost )
(by rule 8, applied to the leftmost )
(by rule 8, applied to the rightmost )
(by rule 4, applied to the leftmost )
(by rule 6, applied to the fourth )
(by rule 4, applied to the rightmost )
(etc.)
Note that many choices were made underway as to which rewrite was going to be performed next.
These choices look quite arbitrary. As a matter of fact, they are, in the sense that the string finally generated is always the same. For example, the second and third rewrites
(by rule 6, applied to the leftmost )
(by rule 7, applied to the rightmost )
could be done in the opposite order:
(by rule 7, applied to the rightmost )
(by rule 6, applied to the leftmost )
Also, many choices were made on which rule to apply to each selected .
Changing the choices made and not only the order they were made in usually affects which terminal string comes out at the end.
Let's look at this in more detail. Consider the parse tree of this derivation:
Starting at the top, step by step, an S in the tree is expanded, until no more unexpanded Ss (nonterminals) remain.
Picking a different order of expansion will produce a different derivation, but the same parse tree.
The parse tree will only change if we pick a different rule to apply at some position in the tree.
But can a different parse tree still produce the same terminal string,
which is in this case?
Yes, for this particular grammar, this is possible.
Grammars with this property are called ambiguous.
For example, can be produced with these two different parse trees:
However, the language described by this grammar is not inherently ambiguous:
an alternative, unambiguous grammar can be given for the language, for example:
,
once again picking as the start symbol. This alternative grammar will produce with a parse tree similar to the left one above, i.e. implicitly assuming the association , which does not follow standard order of operations. More elaborate, unambiguous and context-free grammars can be constructed that produce parse trees that obey all desired operator precedence and associativity rules.
Normal forms
Every context-free grammar with no ε-production has an equivalent grammar in Chomsky normal form, and a grammar in Greibach normal form. "Equivalent" here means that the two grammars generate the same language.
The especially simple form of production rules in Chomsky normal form grammars has both theoretical and practical implications. For instance, given a context-free grammar, one can use the Chomsky normal form to construct a polynomial-time algorithm that decides whether a given string is in the language represented by that grammar or not (the CYK algorithm).
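The idea behind the CYK algorithm is to fill a table recording which nonterminals derive which substrings. The Python sketch below is an added illustration; the Chomsky-normal-form grammar in it (for the language a^n b^n with n ≥ 1) is assumed for the example and is not taken from this article:

    # Assumed CNF grammar: S -> A T | A B,  T -> S B,  A -> a,  B -> b.
    binary = {("A", "T"): {"S"}, ("A", "B"): {"S"}, ("S", "B"): {"T"}}
    terminal = {"a": {"A"}, "b": {"B"}}

    def cyk(word, start="S"):
        n = len(word)
        if n == 0:
            return False                  # this grammar has no ε-rule, so reject ""
        # table[i][l-1] holds the nonterminals deriving word[i:i+l]
        table = [[set() for _ in range(n)] for _ in range(n)]
        for i, ch in enumerate(word):
            table[i][0] = set(terminal.get(ch, set()))
        for length in range(2, n + 1):
            for i in range(n - length + 1):
                for split in range(1, length):
                    for b in table[i][split - 1]:
                        for c in table[i + split][length - split - 1]:
                            table[i][length - 1] |= binary.get((b, c), set())
        return start in table[0][n - 1]

    print(cyk("aabb"))   # True
    print(cyk("abab"))   # False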
Closure properties
Context-free languages are closed under the various operations, that is, if the languages K and L are
context-free, so is the result of the following operations:
union K ∪ L; concatenation K ∘ L; Kleene star L*
substitution (in particular homomorphism)
inverse homomorphism
intersection with a regular language
They are not closed under general intersection (hence neither under complementation) and set difference.
Decidable problems
The following are some decidable problems about context-free grammars.
Parsing
The parsing problem, checking whether a given word belongs to the language given by a context-free grammar, is decidable, using one of the general-purpose parsing algorithms:
CYK algorithm (for grammars in Chomsky normal form)
Earley parser
GLR parser
LL parser (only for the proper subclass of LL(k) grammars)
Context-free parsing for Chomsky normal form grammars was shown by Leslie G. Valiant to be reducible to boolean matrix multiplication, thus inheriting its complexity upper bound of O(n^2.3728639). Conversely, Lillian Lee has shown O(n^(3−ε)) boolean matrix multiplication to be reducible to O(n^(3−3ε)) CFG parsing, thus establishing some kind of lower bound for the latter.
Reachability, productiveness, nullability
A nonterminal symbol X is called productive, or generating, if there is a derivation X ⇒* w for some string w of terminal symbols. X is called reachable if there is a derivation S ⇒* αXβ, for some strings α and β of nonterminal and terminal symbols, from the start symbol S. X is called useless if it is unreachable or unproductive. X is called nullable if there is a derivation X ⇒* ε. A rule X → ε is called an ε-production. A derivation X ⇒+ X is called a cycle.
Algorithms are known to eliminate from a given grammar, without changing its generated language,
unproductive symbols,
unreachable symbols,
ε-productions, with one possible exception, and
cycles.
In particular, an alternative containing a useless nonterminal symbol can be deleted from the right-hand side of a rule.
Such rules and alternatives are called useless.
In the depicted example grammar, the nonterminal D is unreachable, and E is unproductive, while C → C causes a cycle.
Hence, omitting the last three rules doesn't change the language generated by the grammar, nor does omitting the alternatives "| Cc | Ee" from the right-hand side of the rule for S.
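Both the productive and the reachable nonterminals can be computed by simple fixed-point iterations. The Python sketch below is an added illustration; the concrete rules are assumed to be S → Bb | Cc | Ee, B → Bb | b, C → C, D → Bd | Cd | d and E → Ee, with uppercase letters as nonterminals and lowercase letters as terminals:

    rules = {"S": ["Bb", "Cc", "Ee"], "B": ["Bb", "b"], "C": ["C"],
             "D": ["Bd", "Cd", "d"], "E": ["Ee"]}

    def productive(rules):
        prod = set()
        changed = True
        while changed:                     # iterate until a fixed point is reached
            changed = False
            for lhs, alts in rules.items():
                if lhs not in prod and any(all(s in prod or s.islower() for s in alt)
                                           for alt in alts):
                    prod.add(lhs)
                    changed = True
        return prod

    def reachable(rules, start="S"):
        reach, todo = {start}, [start]
        while todo:                        # breadth-first walk over right-hand sides
            for alt in rules.get(todo.pop(), []):
                for s in alt:
                    if s.isupper() and s not in reach:
                        reach.add(s)
                        todo.append(s)
        return reach

    print(sorted(productive(rules)))   # ['B', 'D', 'S']: C and E never yield a terminal string
    print(sorted(reachable(rules)))    # ['B', 'C', 'E', 'S']: D cannot be reached from S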
A context-free grammar is said to be proper if it has neither useless symbols nor ε-productions nor cycles.
Combining the above algorithms, every context-free grammar not generating ε can be transformed into a weakly equivalent proper one.
Regularity and LL(k) checks
It is decidable whether a given grammar is a regular grammar, as well as whether it is an LL(k) grammar for a given k≥0. If k is not given, the latter problem is undecidable.
Given a context-free language, it is neither decidable whether it is regular, nor whether it is an LL(k) language for a given k.
Emptiness and finiteness
There are algorithms to decide whether the language of a given context-free grammar is empty, as well as whether it is finite.
Undecidable problems
Some questions that are undecidable for wider classes of grammars become decidable for context-free grammars; e.g. the emptiness problem (whether the grammar generates any terminal strings at all), is undecidable for context-sensitive grammars, but decidable for context-free grammars.
However, many problems are undecidable even for context-free grammars. Examples are:
Universality
Given a CFG, does it generate the language of all strings over the alphabet of terminal symbols used in its rules?
A reduction can be demonstrated to this problem from the well-known undecidable problem of determining whether a Turing machine accepts a particular input (the halting problem). The reduction uses the concept of a computation history, a string describing an entire computation of a Turing machine. A CFG can be constructed that generates all strings that are not accepting computation histories for a particular Turing machine on a particular input, and thus it will accept all strings only if the machine doesn't accept that input.
Language equality
Given two CFGs, do they generate the same language?
The undecidability of this problem is a direct consequence of the previous: it is impossible to even decide whether a CFG is equivalent to the trivial CFG defining the language of all strings.
Language inclusion
Given two CFGs, can the first one generate all strings that the second one can generate?
If this problem was decidable, then language equality could be decided too: two CFGs G1 and G2 generate the same language if L(G1) is a subset of L(G2) and L(G2) is a subset of L(G1).
Being in a lower or higher level of the Chomsky hierarchy
Using Greibach's theorem, it can be shown that the two following problems are undecidable:
Given a context-sensitive grammar, does it describe a context-free language?
Given a context-free grammar, does it describe a regular language?
Grammar ambiguity
Given a CFG, is it ambiguous?
The undecidability of this problem follows from the fact that if an algorithm to determine ambiguity existed, the Post correspondence problem could be decided, which is known to be undecidable.
Language disjointness
Given two CFGs, is there any string derivable from both grammars?
If this problem was decidable, the undecidable Post correspondence problem could be decided, too: given strings over some alphabet , let the grammar consist of the rule
;
where denotes the reversed string and doesn't occur among the ; and let grammar consist of the rule
;
Then the Post problem given by has a solution if and only if and share a derivable string.
Extensions
An obvious way to extend the context-free grammar formalism is to allow nonterminals to have arguments, the values of which are passed along within the rules. This allows natural language features such as agreement and reference, and programming language analogs such as the correct use and definition of identifiers, to be expressed in a natural way. E.g. we can now easily express that in English sentences, the subject and verb must agree in number. In computer science, examples of this approach include affix grammars, attribute grammars, indexed grammars, and Van Wijngaarden two-level grammars. Similar extensions exist in linguistics.
An extended context-free grammar (or regular right part grammar) is one in which the right-hand side of the production rules is allowed to be a regular expression over the grammar's terminals and nonterminals. Extended context-free grammars describe exactly the context-free languages.
Another extension is to allow additional terminal symbols to appear at the left-hand side of rules, constraining their application. This produces the formalism of context-sensitive grammars.
Subclasses
There are a number of important subclasses of the context-free grammars:
LR(k) grammars (also known as deterministic context-free grammars) allow parsing (string recognition) with deterministic pushdown automata (PDA), but they can only describe deterministic context-free languages.
Simple LR, Look-Ahead LR grammars are subclasses that allow further simplification of parsing. SLR and LALR are recognized using the same PDA as LR, but with simpler tables, in most cases.
LL(k) and LL(*) grammars allow parsing by direct construction of a leftmost derivation as described above, and describe even fewer languages.
Simple grammars are a subclass of the LL(1) grammars mostly interesting for its theoretical property that language equality of simple grammars is decidable, while language inclusion is not.
Bracketed grammars have the property that the terminal symbols are divided into left and right bracket pairs that always match up in rules.
Linear grammars have no rules with more than one nonterminal on the right-hand side.
Regular grammars are a subclass of the linear grammars and describe the regular languages, i.e. they correspond to finite automata and regular expressions.
LR parsing extends LL parsing to support a larger range of grammars; in turn, generalized LR parsing extends LR parsing to support arbitrary context-free grammars. On LL grammars and LR grammars, it essentially performs LL parsing and LR parsing, respectively, while on nondeterministic grammars, it is as efficient as can be expected. Although GLR parsing was developed in the 1980s, many new language definitions and parser generators continue to be based on LL, LALR or LR parsing up to the present day.
Linguistic applications
Chomsky initially hoped to overcome the limitations of context-free grammars by adding transformation rules.
Such rules are another standard device in traditional linguistics; e.g. passivization in English. Much of generative grammar has been devoted to finding ways of refining the descriptive mechanisms of phrase-structure grammar and transformation rules such that exactly the kinds of things can be expressed that natural language actually allows. Allowing arbitrary transformations does not meet that goal: they are much too powerful, being Turing complete unless significant restrictions are added (e.g. no transformations that introduce and then rewrite symbols in a context-free fashion).
Chomsky's general position regarding the non-context-freeness of natural language has held up since then, although his specific examples regarding the inadequacy of context-free grammars in terms of their weak generative capacity were later disproved.
Gerald Gazdar and Geoffrey Pullum have argued that despite a few non-context-free constructions in natural language (such as cross-serial dependencies in Swiss German and reduplication in Bambara), the vast majority of forms in natural language are indeed context-free.
See also
Parsing expression grammar
Stochastic context-free grammar
Algorithms for context-free grammar generation
Pumping lemma for context-free languages
References
Notes
Further reading
. Chapter 4: Context-Free Grammars, pp. 77–106; Chapter 6: Properties of Context-Free Languages, pp. 125–137.
. Chapter 2: Context-Free Grammars, pp. 91–122; Section 4.1.2: Decidable problems concerning context-free languages, pp. 156–159; Section 5.1.1: Reductions via computation histories: pp. 176–183.
External links
Computer programmers may find the stack exchange answer to be useful.
CFG Developer created by Christopher Wong at Stanford University in 2014; modified by Kevin Gibbons in 2015.
1956 in computing
Compiler construction
Formal languages
Programming language topics
Wikipedia articles with ASCII art |
49555553 | https://en.wikipedia.org/wiki/New%20Regent%20Street | New Regent Street | New Regent Street is a pedestrian mall in Christchurch. Built as a private development in the early 1930s with 40 shops in Spanish Mission architectural style, it is one of the city's major tourist attractions. Providing a number of small shops as a comprehensive development was an advanced idea at the time, and New Regent Street is regarded as a forerunner to modern shopping malls. Due to its coherent architectural character, the buildings in the street are listed as Category I heritage items by Heritage New Zealand, and in addition, the entire street has a historic area listing. The street was pedestrianised in 1994 in preparation for the introduction of the Christchurch heritage tram, which began operation in February 1995. Damaged in the February 2011 Christchurch earthquake, the street and buildings reopened in April 2013, and the tram returned from November of that year. Following the 2016 Valentine's Day earthquake, five of the buildings that had not been repaired after the previous earthquakes were cordoned off, which stopped the tram from operating on its original heritage loop until May.
Background and location
New Regent Street is located in the Christchurch Central City. It runs north-south between Armagh Street at its north end and Gloucester Street at its south end. Cathedral Square, the centre of Christchurch, is located one block to the south-west. What is now New Regent Street was originally known as "The Circus paddock", as visiting circuses would make use of the land. From 1888, the land was occupied by a building called the Colosseum, which was initially an ice skating rink, then housed a boot factory, served for a time as a taxi rank and, in 1908, became Christchurch's first movie theatre. The Colosseum was demolished in January 1931.
History
Development
In 1929, businessman and chairman of The Press, George Gould, proposed a variety of measures to relieve traffic congestion in Colombo Street and Cathedral Square, including a new bridge over the Avon River connecting Oxford Terrace with Durham Street south, and a new diagonal street from the Armagh Street / Manchester Street intersection to Gloucester Street near its intersection with Colombo Street. The latter proposal would have required the demolition of the Colosseum. Gould suggested this diagonal street be called Little High Street, in reference to the diagonal High Street further south in the central city. Only two weeks after Gould's proposal, a group of businessmen led by Arthur Francis Stacey put a proposal for a new street with a Spanish theme to Christchurch City Council's town planning committee. The group had secured options on the Colosseum and on those two properties that separated the Colosseum from access onto Armagh Street. The plans had been drawn by Francis Willis, who had previously been employed by Christchurch City Council as their architect, but who had since 1924 been self-employed. Stacey and his business partners had formed a company called Regent Street Limited in 1929; other company directors were David Manson, Alexander Hamilton Forbes, and John Joseph Dougall. By February 1930, the project had been approved in principle by Christchurch City Council. The concept of a number of small shops all built as a comprehensive development was advanced for its time, and can be regarded as the forerunner of modern shopping malls.
A building permit for the construction of the buildings was imminent, but had not been issued by January 1931, while the demolition of the Colosseum was carried out. The contract for the construction of the 40 buildings and the roadway was let to the Boyle Brothers for NZ£32,000. The builder did not perform and the contract was retendered for approximately NZ£32,000, with P. Graham and Son, Limited, as the successful party. The overall cost of the project was NZ£90,000. The street was built at a width of , and the north-south length of Christchurch Central City blocks is . The sculptor William Trethewey carried out some of the interior decorations.
The company asked for the street to be named Regent Street, but the Christchurch City Council declined the request on the grounds that a street of that name already existed in Sydenham, and suggested New Regent Street instead. Regent Street in Sydenham was renamed Roxburgh Street in 1948. New Regent Street is named for London's Regent Street, itself that city's first planned street, and built between 1814 and 1825.
The street was finished in late March 1932. One of its features was a lighting system consisting of 400 lamps, and when this was first switched on for a trial on 24 March, hundreds of citizens went for a walk down the long street. To make use of the lighting, the formal opening was held on the evening of 1 April 1932. The proceedings were chaired by Dan Sullivan, the Mayor of Christchurch, who also gave the first speech. Other speakers were David Manson (chairman of Regent Street Ltd), city councillor A. H. Andrews (chairman of the town planning committee), and Stacey. Mrs. Manson then cut the ribbon. During the mayor's opening speech, a person in the crowd entertained onlookers by warning of a man with a sword, a reference to the de Groot incident at the Sydney Harbour Bridge two weeks earlier, in which a protester cut the ribbon prior to the official ceremony.
The opening happened during the depth of the Great Depression, and it was one of only a few larger projects undertaken in the South Island at the time. Whilst Manson claimed in his speech at the opening that half the shops had been let, only three shops were actually occupied. Later in 1932, the contractor, P. Graham and Son, took two of the directors (Manson and Forbes) to court over outstanding payments of NZ£1,000. The contractor won the case and was awarded costs.
Problems with long-term parking occurred, and in April 1933 parking in New Regent Street was restricted to 20 minutes. Around the same time, Regent Street Limited was in financial difficulty and a petition to wind up the company was filed; the judge gave the company a fortnight to pay up, which resolved the issue. Regent Street Limited sold off individual shops in the 1940s until all 40 units were in private ownership. The roadway was transferred to Christchurch City Council, making it a public road, after World War II.
Pedestrian mall
In 1986, a one-way restriction was imposed for driving on New Regent Street. This measure was in place for only eight years before the street was closed to traffic in 1994 and turned into a pedestrian mall in preparation for the reintroduction of the Christchurch tram. Originally, it was intended for the tram to go back and forth along Worcester Street, but the plans were changed and a loop created, with Rolleston Avenue, Armagh Street, New Regent Street and the John Britten property known as Cathedral Junction making up the route. The tram began operating on 4 February 1995.
Earthquakes
The buildings sustained damage during the 22 February 2011 Christchurch earthquake. The whole central city was cordoned off, and the public did not have access. Repairs to the street and buildings were carried out by Naylor Love Construction Limited for NZ$3,000,000, with Fulton Ross Team Architects providing the architectural inputs. New Regent Street was set to be reopened in December 2012, but this was delayed to February 2013, and then March, and it finally did open on Saturday, 20 April 2013. The Press described the opening as an "anti-climax", as only five of the shops were open for business. A further fourteen shops had been tenanted but were not ready, while tenants for seven shops had yet to be found. Five of the buildings, all owned by Helen Thacker, had not been repaired or earthquake strengthened. Two of those buildings were classed as earthquake-prone, and after months of negotiations by Canterbury Earthquake Recovery Authority staff, the owner agreed to have those two buildings earthquake strengthened. This allowed for safety fences to be removed from New Regent Street in December 2013.
The tram began operating again in November 2013 on a limited route from New Regent Street to Worcester Boulevard, as the tracks in Armagh Street still had to be repaired. In November 2014, the pre-earthquake loop reopened. Some New Regent Street retailers claim that half their custom comes from tourists arriving there by tram.
Following the 2016 Valentine's Day earthquake, the five properties in New Regent Street owned by Helen Thacker were cordoned off due to risk of collapse of their façades, which stopped the tram from doing its traditional route through the pedestrian mall. Later on, a further two properties not owned by Thacker were cordoned off. The operator of the tramway, Michael Esposito from Welcome Aboard, claimed that the tram had so far brought 100,000 customers to New Regent Street during the 2015/16 summer. Tim Hunter, the chief executive of Christchurch & Canterbury Tourism, lamented that the inaction of one owner "will put us back in the world news as not being visitor friendly." The street's spokesperson stated that business had been "disappointing" while the tram was not operating. The barriers were removed and the tram began operating again on 1 May 2016.
Heritage registrations
The buildings were classified as Category I heritage items by the New Zealand Historic Places Trust (since renamed Heritage New Zealand) on 28 June 1990, with registration number 4385. On 27 October 1994, the New Zealand Historic Places Trust registered the street as a historic area, with registration number 7057. The buildings are listed in the Christchurch District Plan as group 2 heritage buildings.
See also
List of historic places in Christchurch
References
External links
Official website
Construction plans and photo library held by Christchurch City Libraries
Christchurch Central City
NZHPT Category I listings in Canterbury, New Zealand
NZHPT historic areas register in the Canterbury Region
1932 establishments in New Zealand
Buildings and structures in Christchurch
Tourist attractions in Christchurch
Pedestrian malls in New Zealand
2011 Christchurch earthquake
Streets in Christchurch |
6891990 | https://en.wikipedia.org/wiki/Duplicity%20%28software%29 | Duplicity (software) | Duplicity is a software suite that provides encrypted, digitally signed, versioned, local or remote backup of files requiring little of the remote server. Released under the terms of the GNU General Public License (GPL), Duplicity is free software.
Duplicity uses a scheme in which the first archive is a complete (full) backup, and subsequent (incremental) backups only add differences from the latest full or incremental backup. A chain consisting of a full backup and a series of incremental backups can be recovered to any point in time at which one of the incremental steps was taken. If any incremental backup is missing, the incremental backups following it cannot be reconstructed. Duplicity does this using GnuPG, librsync, tar, and rdiff. To transmit data to the backup repository, it can use SSH/SCP/SFTP, local file access, rsync, FTP, Amazon S3, Google Cloud Storage, Rackspace Cloud Files, and others.
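The restore logic implied by this scheme can be pictured with a small Python sketch. This is a conceptual illustration only, not Duplicity's actual code or archive format, and the file names and delta representation are invented: a restore replays the full backup and then every incremental delta up to the chosen point, so a missing delta breaks all later restore points.

    # Conceptual sketch only -- file names and the delta format are invented for illustration.
    full = {"notes.txt": "v1", "todo.txt": "draft"}
    incrementals = [
        {"notes.txt": "v2"},                   # increment 1: notes.txt changed
        {"todo.txt": None, "plan.txt": "v1"},  # increment 2: todo.txt deleted, plan.txt added
    ]

    def restore(full, incrementals, point_in_time):
        """Rebuild the backed-up state after `point_in_time` incremental backups."""
        state = dict(full)
        for delta in incrementals[:point_in_time]:
            for path, content in delta.items():
                if content is None:
                    state.pop(path, None)      # the file was deleted in this increment
                else:
                    state[path] = content
        return state

    print(restore(full, incrementals, 0))  # state captured by the full backup
    print(restore(full, incrementals, 2))  # most recent state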
Duplicity works best under Unix-like operating systems (such as Linux, BSD, and Mac OS X), though it can be used with Windows under Cygwin or the Windows Subsystem for Linux. Currently, Duplicity supports deleted files, full Unix permissions, directories, symbolic links, fifos, and device files, but not hard links.
Déjà Dup is a graphical user interface for Duplicity.
See also
List of backup software
Duplicati is a C# re-implementation of Duplicity
References
External links
Duplicity home page
Free backup software
Backup software for Linux |
3009097 | https://en.wikipedia.org/wiki/Software%20and%20Information%20Industry%20Association | Software and Information Industry Association | The Software and Information Industry Association (SIIA) is a trade association dedicated to the entertainment, consumer and business software industries. Established in 1984 as the Software Publishers Association (SPA), the SIIA took its new name when it merged with the related Information Industry Association on January 1, 1999. The joint enterprise was headed by Software Publishers Association founder Ken Wasch and operated out of the SPA's existing offices.
The SPA was active in lobbying, industry research and anti-piracy efforts. Its head of research, Ann Stephens, went on to found PC Data in 1991. By 1995, the SPA had over 1,100 software companies in its membership and according to Wired was among "the most powerful computer-related trade groups" before its merger with the Information Industry Association. While Microsoft became a member of the SPA in 1986, it split with the SIIA in 2000 after the group sided against Microsoft in United States v. Microsoft Corp. The Wall Street Journal described Microsoft as the SIIA's "largest member" before the departure.
Until 1999, the Software Publishers Association hosted the SPA Annual Conference for software companies. It was renamed the InfoSoft Essentials conference in 1999.
Divisions
Public Policy ~ legal and public policy
IP Protection ~ protecting software content
Connectiv ~ business information
ETIN ~ Education Technology
FISD ~ Financial & Information
SIPA ~ Specialized Information Publishers
SSD ~ Software & Services
Advocacy
SIIA filed briefs in Allen v. Cooper, which was decided in 2020: the Supreme Court of the United States held the Copyright Remedy Clarification Act's abrogation of state sovereign immunity to be unconstitutional, whereas SIIA had argued the opposite view.
CODiE Awards
Beginning in 1986, the Software Publishers Association hosted the "Excellence in Software Awards" ceremony, an annual black-tie event that The Washington Post and Los Angeles Times compared to the Academy Awards. The Excellence in Software Awards were later renamed the "CODiE Awards", and are now presented by the Software and Information Industry Association.
The CODiE are awards to two broad categories: business technology and education technology. There are awards in more than 75 categories, advertised with the statement, "With a grand total of more than 75 different categories, you're sure to find several to meet your marketing/PR objectives!". Notable past winners include companies such as Adobe, BrainPOP, Google, Knewton, McGraw-Hill Education, Jigsaw, Netsuite, Red Hat, Rosetta Stone, Salesforce.com, Digimind, Scribe Software, Vocus, WSJ.com, codemantra, IXL Learning, itslearning, and more.
Jesse H. Neal Awards
The Jesse H. Neal Awards were created in 1955 for editorial excellence in business media and named after Jesse H. Neal, Connectiv's first managing director. Nations Restaurant News says winning a Neal Award is like winning the Pulitzer Prize for business-to-business (B2B) platforms. Entries are judged in three areas ~ editorial craftsmanship, extent of service to the field and journalistic enterprise. Out of the 21 categories, one winner is selected for The Grand Neal Award. As of 2018, there had been 23 winners of The Grand Neal Award. In 2019, John Heltman, business and finance reporter with American Banker and SourceMedia, won with "Nobody's Home".
See also
List of computer-related awards
References
External links
Technology trade associations
Computer-related awards |
39136609 | https://en.wikipedia.org/wiki/Bill%20Buchanan%20%28computer%20scientist%29 | Bill Buchanan (computer scientist) | William Buchanan OBE FBCS CEng PFHEA (born 6 March 1961) is a Scottish computer scientist. Buchanan was born in Falkirk, Scotland in 1961. He currently leads the Blockpass ID Lab and the Centre for Distributed Computing and Security at Edinburgh Napier University. He is a Professor in the School of Computing. Buchanan was awarded Best Lecturer/Tutor in the School of Computing at Edinburgh Napier University in the student-nominated excellence awards in 2019 and again in 2020.
Buchanan was appointed Officer of the Order of the British Empire (OBE) in the 2017 Birthday Honours for services to cyber security, and was the first person to receive an OBE related to cyber security. In 2018, he received an "Outstanding Contribution to Knowledge Exchange" award at the Scottish Knowledge Exchange awards. Buchanan has led research that produced three successful spin-out companies: Zonefox, Symphonic and Cyan Forensics. In October 2018, Zonefox was acquired by Fortinet, and in November 2020, Symphonic was acquired by Ping Identity. Buchanan also supported the creation of the MemCrypt spin-out, which focuses on the discovery of cryptographic keys in memory (based on the PhD work of Dr Peter McLaren). The work has since been applied to ransomware detection and recovery. In 2021, the research work related to MemCrypt received a Leading Light Innovation award at the Scottish Cybersecurity awards.
He is the creator and sole author of the Asecuritysite.com web site, which covers cryptography and various areas of networking and cybersecurity from both a theoretical and a practical perspective. Buchanan was also the software creator of the Bright Red Publishing Digital Zone, which contains web-based content for many of the subjects included in the Scottish Qualifications Authority (SQA) N5, Higher and Advanced Higher syllabuses in Scotland.
For a documentary on cyber security broadcast on Monday 8 November 2015 as part of the BBC Panorama programme, Buchanan and his team set up a fake web site for hackers to gain access to.
References
1961 births
Living people
Scottish computer scientists
British computer scientists
Academics of Edinburgh Napier University
Fellows of the British Computer Society
Officers of the Order of the British Empire
People from Falkirk
Principal Fellows of the Higher Education Academy |
20824597 | https://en.wikipedia.org/wiki/Sitrion | Sitrion | Sitrion (formerly NewsGator Technologies) is a multinational software company headquartered in Denver, Colorado. Sitrion develops and markets mobility and collaboration software. It was founded in 2004 under the name NewsGator. It was initially a consumer company focused on RSS aggregation, before shifting its focus to the enterprise market. The company raised $12 million in funding in 2007 and acquired Tomoye in 2010. In 2013, NewsGator acquired Sitrion and, in 2014, adopted the Sitrion name for the combined company.
Corporate history
Sitrion was founded as NewsGator in 2004 by Greg Reinacker and was initially self-funded. Mobius provided an undisclosed amount of Series A and Series B funding in the early 2000s. A third $6 million round of funding was raised in 2005.
In 2005, the company acquired Bradbury Software, which developed a desktop application called FeedDemon and a CSS/xHTML editor called TopStyle.
Another $12 million in funding was raised in December 2007, followed by $10 million in 2009. From 2004 to 2010, the company's CEO was JB Holston.
In 2010, the company acquired Canada-based Tomoye, an enterprise community and collaboration computing vendor, primarily used by government organizations.
Daniel Kraft became CEO & President in 2012.
In 2013, NewsGator acquired Sitrion, which had previously been a business partner. In January of the following year, the company changed its name to Sitrion. While the purchase of Sitrion was initially intended to expand NewsGator's product portfolio, the company then focused on selling social networking software to client companies (a business it had been in since 2007), specifically Sitrion Social Workplace. Kraft has said that introducing social collaboration concepts can create a more invested staff: "If you terminate nonproductive conversations, then you’re eliminating the ability to make the network a community people would like to belong to."
In 2018, Sitrion was acquired by employee engagement company, Limeade.
Software
Sitrion Social
Sitrion Social is an add-on for SharePoint that also integrates with SAP. Version 3.0 of Social Sites (now Sitrion Social) was released in late 2009. Among other improvements, it introduced a microblogging component to the software. In 2012, the company started developing software for vertical markets and products for regulatory compliance in partnership with HiSoftware. A simplified user interface for NewsGator Tomoye was introduced in 2011. Sitrion Social is comparable to Facebook, but is designed for employee collaboration and is typically used to add functionality to Microsoft SharePoint and SAP processes.
Computers and Applied Sciences reviewed version 2.7 of Social Sites, Sitrion's Facebook-like employee collaboration tool. It praised the software's user interface and synchronization across devices.
Past software
In 2002, Sitrion (then called NewsGator) developed consumer RSS software that allowed users to receive notifications when a blog, news site or other page was updated. It was one of the few readers that integrated directly with Microsoft Outlook. In 2003, version 1.3 was introduced, adding support for subscribing to newsgroups through Outlook, automating some of the organization features and making other improvements. Version 2.0, released in March 2004, allowed users to get notifications on mobile devices, access exclusive content and make use of other improvements. By 2004, it was one of the better-known RSS tools. That year, the company also partnered with PR firm Edelman to develop a reader tool called Hosted Conversations, used by Edelman clients and available to others. Integration with Factiva was also introduced that year. NewsGator started pursuing the enterprise market in late 2005 with a new NewsGator Enterprise Server (NGES) product.
References
Software companies based in Colorado
RSS
Companies based in Denver
2004 establishments in Colorado
Collaborative software
Software companies of the United States |
18938683 | https://en.wikipedia.org/wiki/GNU%20General%20Public%20License | GNU General Public License | The GNU General Public License (GNU GPL or simply GPL) is a series of widely used free software licenses that guarantee end users the four freedoms to run, study, share, and modify the software. The licenses were originally written by Richard Stallman, founder of the Free Software Foundation (FSF), for the GNU Project, and grant the recipients of a computer program the rights of the Free Software Definition. The GPL series are all copyleft licenses, which means that any derivative work must be distributed under the same or equivalent license terms. This is in distinction to permissive software licenses, of which the BSD licenses and the MIT License are widely used, less restrictive examples. GPL was the first copyleft license for general use.
Historically, the GPL license family has been one of the most popular software licenses in the free and open-source software domain. Prominent free software programs licensed under the GPL include the Linux kernel and the GNU Compiler Collection (GCC). David A. Wheeler argues that the copyleft provided by the GPL was crucial to the success of Linux-based systems, giving the programmers who contributed to the kernel the assurance that their work would benefit the whole world and remain free, rather than being exploited by software companies that would not have to give anything back to the community.
In 2007, the third version of the license (GPLv3) was released to address some perceived problems with the second version (GPLv2) which were discovered during the latter's long-time usage. To keep the license up to date, the GPL license includes an optional "any later version" clause, allowing users to choose between the original terms or the terms in new versions as updated by the FSF. Developers can omit it when licensing their software; the Linux kernel, for instance, is licensed under GPLv2 without the "any later version" clause.
History
The GPL was written by Richard Stallman in 1989, for use with programs released as part of the GNU project. The original GPL was based on a unification of similar licenses used for early versions of GNU Emacs (1985), the GNU Debugger, and the GNU C Compiler. These licenses contained similar provisions to the modern GPL, but were specific to each program, rendering them incompatible, despite being the same license. Stallman's goal was to produce one license that could be used for any project, thus making it possible for many projects to share code.
The second version of the license, version 2, was released in 1991. Over the following 15 years, members of the free software community became concerned over problems in the GPLv2 license that could let someone exploit GPL-licensed software in ways contrary to the license's intent. These problems included tivoization (the inclusion of GPL-licensed software in hardware that refuses to run modified versions of its software), compatibility issues similar to those of the Affero General Public License, and patent deals between Microsoft and distributors of free and open-source software, which some viewed as an attempt to use patents as a weapon against the free software community.
Version 3 was developed to attempt to address these concerns and was officially released on 29 June 2007.
Version 1
Version 1 of the GNU GPL, released on 25 February 1989, prevented what were then the two main ways that software distributors restricted the freedoms that define free software. The first problem was that distributors might publish binary files only—executable, but not readable or modifiable by humans. To prevent this, GPLv1 stated that anyone copying and distributing copies or any portion of the program must also make the human-readable source code available under the same licensing terms.
The second problem was that distributors might add restrictions, either to the license or by combining the software with other software that had other restrictions on distribution. The union of two sets of restrictions would apply to the combined work, thus adding unacceptable restrictions. To prevent this, GPLv1 stated that modified versions, as a whole, had to be distributed under the terms in GPLv1. Therefore, software distributed under the terms of GPLv1 could be combined with software under more permissive terms, as this would not change the terms under which the whole could be distributed. However, software distributed under GPLv1 could not be combined with software distributed under a more restrictive license, as this would conflict with the requirement that the whole be distributable under the terms of GPLv1.
Version 2
According to Richard Stallman, the major change in GPLv2 was the "Liberty or Death" clause, as he calls it – Section 7. The section says that licensees may distribute a GPL-covered work only if they can satisfy all of the license's obligations, despite any other legal obligations they might have. In other words, the obligations of the license may not be severed due to conflicting obligations. This provision is intended to discourage any party from using a patent infringement claim or other litigation to impair users' freedom under the license.
By 1990, it was becoming apparent that a less restrictive license would be strategically useful for the C library and for software libraries that essentially did the job of existing proprietary ones; when version 2 of the GPL (GPLv2) was released in June 1991, therefore, a second license – the GNU Library General Public License – was introduced at the same time and numbered with version 2 to show that both were complementary. The version numbers diverged in 1999 when version 2.1 of the LGPL was released, which renamed it the GNU Lesser General Public License to reflect its place in the GNU philosophy. The GPLv2 was also modified to refer to the new name of the LGPL, but its version number remained the same, resulting in the original GPLv2 not being recognised by the Software Package Data Exchange (SPDX).
The license includes instructions to specify "version 2 of the License, or (at your option) any later version" to allow the flexible optional use of either version 2 or 3, but some developers change this to specify "version 2" only.
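For illustration, the following is a minimal sketch of how such a notice commonly appears at the top of a source file, shown here as a C comment; the first form follows the wording recommended in the license's appendix, while the exact wording of the "version 2 only" variant differs between projects and is given here only as an illustrative assumption.

/*
 * This program is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 */

/* A "version 2 only" project instead drops the upgrade option, for example: */
/*
 * This program is free software; you can redistribute it and/or modify
 * it under the terms of version 2 of the GNU General Public License as
 * published by the Free Software Foundation.
 */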
Version 3
In late 2005, the Free Software Foundation (FSF) announced work on version 3 of the GPL (GPLv3). On 16 January 2006, the first "discussion draft" of GPLv3 was published, and the public consultation began. The public consultation was originally planned for nine to fifteen months, but finally stretched to eighteen months with four drafts being published. The official GPLv3 was released by the FSF on 29 June 2007. GPLv3 was written by Richard Stallman, with legal counsel from Eben Moglen and Richard Fontana from the Software Freedom Law Center.
According to Stallman, the most important changes were in relation to software patents, free software license compatibility, the definition of "source code", and hardware restrictions on software modifications, such as tivoization. Other changes related to internationalization, how license violations are handled, and how additional permissions could be granted by the copyright holder. The concept of "software propagation", as a term for the copying and duplication of software, was explicitly defined.
The public consultation process was coordinated by the Free Software Foundation with assistance from Software Freedom Law Center, Free Software Foundation Europe, and other free software groups. Comments were collected from the public via the gplv3.fsf.org web portal, using purpose-written software called stet.
During the public consultation process, 962 comments were submitted for the first draft. By the end of the comment period, a total of 2,636 comments had been submitted.
The third draft was released on 28 March 2007. This draft included language intended to prevent patent-related agreements such as the controversial Microsoft-Novell patent agreement, and restricted the anti-tivoization clauses to a legal definition of a "user" and a "consumer product". It also explicitly removed the section on "Geographical Limitations", whose probable removal had been announced at the launch of the public consultation.
The fourth discussion draft, which was the last, was released on 31 May 2007. It introduced Apache License version 2.0 compatibility (prior versions are incompatible), clarified the role of outside contractors, and added an exception in Section 11, paragraph 6 to avoid the perceived problems of a Microsoft–Novell style agreement.
This aimed to make future such deals ineffective. The license was also meant to cause Microsoft to extend the patent licenses it granted to Novell customers for the use of GPLv3 software to all users of that GPLv3 software; this was possible only if Microsoft was legally a "conveyor" of the GPLv3 software.
Early drafts of GPLv3 also let licensors add an Affero-like requirement that would have plugged the ASP loophole in the GPL. As there were concerns expressed about the administrative costs of checking code for this additional requirement, it was decided to keep the GPL and the Affero license separated.
Others, notably some high-profile Linux kernel developers such as Linus Torvalds, Greg Kroah-Hartman, and Andrew Morton, commented to the mass media and made public statements about their objections to parts of discussion drafts 1 and 2. The kernel developers referred to GPLv3 draft clauses regarding DRM/Tivoization, patents, and "additional restrictions", and warned of a Balkanisation of the "Open Source Universe". Linus Torvalds, who decided not to adopt the GPLv3 for the Linux kernel, reiterated his criticism several years later.
GPLv3 improved compatibility with several free software licenses such as the Apache License, version 2.0, and the GNU Affero General Public License, which GPLv2 could not be combined with. However, GPLv3 software could only be combined and share code with GPLv2 software if the GPLv2 license used had the optional "or later" clause and the software was upgraded to GPLv3. While the "GPLv2 or any later version" clause is considered by FSF as the most common form of licensing GPLv2 software, Toybox developer Rob Landley described it as a lifeboat clause. Software projects licensed with the optional "or later" clause include the GNU Project, while a prominent example without the clause is the Linux kernel.
The final version of the license text was published on 29 June 2007.
Terms and conditions
The terms and conditions of the GPL must be made available to anybody receiving a copy of the work that has a GPL applied to it ("the licensee"). Any licensee who adheres to the terms and conditions is given permission to modify the work, as well as to copy and redistribute the work or any derivative version. The licensee is allowed to charge a fee for this service or do this free of charge. This latter point distinguishes the GPL from software licenses that prohibit commercial redistribution. The FSF argues that free software should not place restrictions on commercial use, and the GPL explicitly states that GPL works may be sold at any price.
The GPL additionally states that a distributor may not impose "further restrictions on the rights granted by the GPL". This forbids activities such as distributing the software under a non-disclosure agreement or contract.
The fourth section of version 2 of the license and the seventh section of version 3 require that programs distributed as pre-compiled binaries be accompanied by a copy of the source code, a written offer to distribute the source code via the same mechanism as the pre-compiled binary, or the written offer to obtain the source code that the user got when they received the pre-compiled binary under the GPL. The second section of version 2 and the fifth section of version 3 also require giving "all recipients a copy of this License along with the Program". Version 3 of the license allows making the source code available in additional ways in fulfillment of the seventh section. These include downloading source code from an adjacent network server or by peer-to-peer transmission, provided that is how the compiled code was available and there are "clear directions" on where to find the source code.
The FSF does not hold the copyright for a work released under the GPL unless an author explicitly assigns copyrights to the FSF (which seldom happens except for programs that are part of the GNU project). Only the individual copyright holders have the authority to sue when a license violation is suspected.
Use of licensed software
Software under the GPL may be run for all purposes, including commercial purposes and even as a tool for creating proprietary software, such as when using GPL-licensed compilers. Users or companies who distribute GPL-licensed works (e.g. software) may charge a fee for copies or give them free of charge. This distinguishes the GPL from shareware licenses that allow copying for personal use but prohibit commercial distribution, and from proprietary licenses where copying is prohibited by copyright law. The FSF argues that freedom-respecting free software should also not restrict commercial use and distribution (including redistribution).
In purely private (or internal) use—with no sales and no distribution—the software code may be modified and parts reused without requiring the source code to be released. For sales or distribution, the entire source code needs to be made available to end users, including any code changes and additions—in that case, copyleft is applied to ensure that end users retain the freedoms defined above.
However, software running as an application program under a GPL-licensed operating system such as Linux is not required to be licensed under GPL or to be distributed with source-code availability—the licensing depends only on the used libraries and software components and not on the underlying platform. For example, if a program consists only of original source code, or is combined with source code from other software components, then the custom software components need not be licensed under GPL and need not make their source code available; even if the underlying operating system used is licensed under the GPL, applications running on it are not considered derivative works. Only if GPLed parts are used in a program (and the program is distributed), then all other source code of the program needs to be made available under the same license terms. The GNU Lesser General Public License (LGPL) was created to have a weaker copyleft than the GPL, in that it does not require custom-developed source code (distinct from the LGPL'ed parts) to be made available under the same license terms.
The fifth section of version 3 states that no GPL-licensed code shall be considered an effective "technical protection measure" as defined by Article 11 of the WIPO Copyright Treaty, and that those who convey the work waive all legal power to prohibit circumvention of the technical protection measure "to the extent such circumvention is effected by exercising rights under this License with respect to the covered work". This means that users cannot be held liable for circumventing DRM implemented using GPL v3-licensed code under laws such as the U.S. Digital Millennium Copyright Act (DMCA).
Copyleft
The distribution rights granted by the GPL for modified versions of the work are not unconditional. When someone distributes a GPL'ed work plus their own modifications, the requirements for distributing the whole work cannot be any greater than the requirements that are in the GPL.
This requirement is known as copyleft. It earns its legal power from the use of copyright on software programs. Because a GPL work is copyrighted, a licensee has no right to redistribute it, not even in modified form (barring fair use), except under the terms of the license. One is only required to adhere to the terms of the GPL if one wishes to exercise rights normally restricted by copyright law, such as redistribution. Conversely, if one distributes copies of the work without abiding by the terms of the GPL (for instance, by keeping the source code secret), they can be sued by the original author under copyright law.
Copyright law has historically been used to prevent distribution of work by parties not authorized by the creator. Copyleft uses the same copyright laws to accomplish a very different goal. It grants rights to distribution to all parties insofar as they provide the same rights to subsequent ones, and they to the next, etc. In this way the GPL and other copyleft licenses attempt to enforce libre access to the work and all derivatives.
Many distributors of GPL'ed programs bundle the source code with the executables. An alternative method of satisfying the copyleft is to provide a written offer to provide the source code on a physical medium (such as a CD) upon request. In practice, many GPL'ed programs are distributed over the Internet, and the source code is made available over FTP or HTTP. For Internet distribution, this complies with the license.
Copyleft applies only when a person seeks to redistribute the program. Developers may make private modified versions with no obligation to divulge the modifications, as long as they do not distribute the modified software to anyone else. Copyleft applies only to the software, and not to its output (unless that output is itself a derivative work of the program). For example, a public web portal running a modified derivative of a GPL'ed content management system is not required to distribute its changes to the underlying software, because its output is not a derivative.
There has been debate on whether it is a violation of the GPL to release the source code in obfuscated form, such as in cases in which the author is less willing to make the source code available. The consensus was that while unethical, it was not considered a violation. The issue was clarified when the license was altered with v2 to require that the "preferred" version of the source code be made available.
License versus contract
The GPL was designed as a license, rather than a contract. In some Common Law jurisdictions, the legal distinction between a license and a contract is an important one: contracts are enforceable by contract law, whereas licenses are enforced under copyright law. However, this distinction is not useful in the many jurisdictions where there are no differences between contracts and licenses, such as Civil Law systems.
Those who do not accept the GPL's terms and conditions do not have permission, under copyright law, to copy or distribute GPL-licensed software or derivative works. However, if they do not redistribute the GPL'ed program, they may still use the software within their organization however they like, and works (including programs) constructed by the use of the program are not required to be covered by this license.
Software developer Allison Randal argued that the GPLv3 as a license is unnecessarily confusing for lay readers, and could be simplified while retaining the same conditions and legal force.
In April 2017, a US federal court ruled that an open-source license is an enforceable contract.
Derivations
The text of the GPL is itself copyrighted, and the copyright is held by the Free Software Foundation.
The FSF permits people to create new licenses based on the GPL, as long as the derived licenses do not use the GPL preamble without permission. This is discouraged, however, since such a license might be incompatible with the GPL and contributes to perceived license proliferation.
Other licenses created by the GNU project include the GNU Lesser General Public License, GNU Free Documentation License, and Affero General Public License.
The text of the GPL is not itself under the GPL. The license's copyright disallows modification of the license. Copying and distributing the license is allowed since the GPL requires recipients to get "a copy of this License along with the Program". According to the GPL FAQ, anyone can make a new license using a modified version of the GPL as long as they use a different name for the license, do not mention "GNU", and remove the preamble, though the preamble can be used in a modified license if permission to use it is obtained from the Free Software Foundation (FSF).
Linking and derived works
Libraries
According to the FSF, "The GPL does not require you to release your modified version or any part of it. You are free to make modifications and use them privately, without ever releasing them." However, if one releases a GPL-licensed entity to the public, there is an issue regarding linking: namely, whether a proprietary program that uses a GPL library is in violation of the GPL.
The key dispute is whether non-GPL software can legally statically link or dynamically link to GPL libraries. Different opinions exist on this issue. The GPL is clear in requiring that all derivative works of code under the GPL must themselves be under the GPL. Ambiguity arises with regard to using GPL libraries and bundling GPL software into a larger package (perhaps mixed into a binary via static linking). This is ultimately a question not of the GPL per se, but of how copyright law defines derivative works. The following points of view exist:
Point of view: dynamic and static linking violate GPL
The Free Software Foundation (which holds the copyright of several notable GPL-licensed software products and of the license text itself) asserts that an executable that uses a dynamically linked library is indeed a derivative work. This does not, however, apply to separate programs communicating with one another.
The Free Software Foundation also created the LGPL, which is nearly identical to the GPL, but with additional permissions to allow linking for the purposes of "using the library".
Richard Stallman and the FSF specifically encourage library writers to license under the GPL so that proprietary programs cannot use the libraries, in an effort to protect the free-software world by giving it more tools than the proprietary world.
Point of view: static linking violates GPL, but dynamic linking is unclear
Some people believe that while static linking produces derivative works, it is not clear whether an executable that dynamically links to a GPL code should be considered a derivative work (see weak copyleft). Linux author Linus Torvalds agrees that dynamic linking can create derived works but disagrees over the circumstances.
A Novell lawyer has written that dynamic linking not being derivative "makes sense" but is not "clear-cut", and that evidence for good-intentioned dynamic linking can be seen by the existence of proprietary Linux kernel drivers.
In Galoob v. Nintendo, the United States Ninth Circuit Court of Appeals defined a derivative work as having "form or permanence" and noted that "the infringing work must incorporate a portion of the copyrighted work in some form", but there have been no clear court decisions to resolve this particular conflict.
Point of view: linking is irrelevant
According to an article in the Linux Journal, Lawrence Rosen (a one-time Open Source Initiative general counsel) argues that the method of linking is mostly irrelevant to the question about whether a piece of software is a derivative work; more important is the question about whether the software was intended to interface with client software and/or libraries.
He states, "The primary indication of whether a new program is a derivative work is whether the source code of the original program was used [in a copy-paste sense], modified, translated or otherwise changed in any way to create the new program. If not, then I would argue that it is not a derivative work," and lists numerous other points regarding intent, bundling, and linkage mechanism.
He further argues on his firm's website that such "market-based" factors are more important than the linking technique.
There is also the specific issue of whether a plugin or module (such as the NVidia or ATI graphics card kernel modules) must also be GPL, if it could reasonably be considered its own work. This point of view suggests that reasonably separate plugins, or plugins for software designed to use plugins, could be licensed under an arbitrary license if the work is GPLv2. Of particular interest is the GPLv2 paragraph:
The GPLv3 has a different clause:
As a case study, some supposedly proprietary plugins and themes/skins for GPLv2 CMS software such as Drupal and WordPress have come under fire, with arguments made on both sides.
The FSF differentiates on how the plugin is being invoked. If the plugin is invoked through dynamic linkage and it performs function calls to the GPL program then it is most likely a derivative work.
Communicating and bundling with non-GPL programs
The mere act of communicating with other programs does not, by itself, require all software to be GPL; nor does distributing GPL software with non-GPL software. However, minor conditions must be followed that ensure the rights of GPL software are not restricted. The gnu.org GPL FAQ describes to what extent software is allowed to communicate with and be bundled with GPL programs.
The FSF thus draws the line between "library" and "other program" via 1) "complexity" and "intimacy" of information exchange and 2) mechanism (rather than semantics), but concedes that the question is not clear-cut and that in complex situations, case law will decide.
Legal status
The first known violation of the GPL was in 1989, when NeXT extended the GCC compiler to support Objective-C, but did not publicly release the changes. After an inquiry they created a public patch. There was no lawsuit filed for this violation.
In 2002, MySQL AB sued Progress NuSphere for copyright and trademark infringement in United States district court. NuSphere had allegedly violated MySQL's copyright by linking MySQL's GPL'ed code with NuSphere Gemini table without complying with the license. After a preliminary hearing before Judge Patti Saris on 27 February 2002, the parties entered settlement talks and eventually settled. After the hearing, FSF commented that "Judge Saris made clear that she sees the GNU GPL to be an enforceable and binding license."
In August 2003, the SCO Group stated that they believed the GPL to have no legal validity and that they intended to pursue lawsuits over sections of code supposedly copied from SCO Unix into the Linux kernel. This was a problematic stand for them, as they had distributed Linux and other GPL'ed code in their Caldera OpenLinux distribution, and there is little evidence that they had any legal right to do so except under the terms of the GPL. In February 2018, after federal circuit court judgement, appeal, and the case being (partially) remanded to the circuit court, the parties restated their remaining claims and provided a plan to move toward final judgement. The remaining claims revolved around Project Monterey, and were finally settled in November 2021 by IBM paying $14.25 million to the TSG (previously SCO) bankruptcy trustee.
In April 2004, the netfilter/iptables project was granted a preliminary injunction against Sitecom Germany by Munich District Court after Sitecom refused to desist from distributing Netfilter's GPL'ed software in violation of the terms of the GPL. Harald Welte, of Netfilter, was represented by ifrOSS co-founder Till Jaeger. In July 2004, the German court confirmed this injunction as a final ruling against Sitecom. The court's justification was that:
Defendant has infringed on the copyright of plaintiff by offering the software 'netfilter/iptables' for download and by advertising its distribution, without adhering to the license conditions of the GPL. Said actions would only be permissible if the defendant had a license grant.... This is independent of the questions whether the licensing conditions of the GPL have been effectively agreed upon between plaintiff and defendant or not. If the GPL were not agreed upon by the parties, defendant would notwithstanding lack the necessary rights to copy, distribute, and make the software 'netfilter/iptables' publicly available.
This exactly mirrored the predictions given previously by the FSF's Eben Moglen. This ruling was important because it was the first time that a court had confirmed that violating terms of the GPL could be a copyright violation and established jurisprudence as to the enforceability of the GPL version 2 under German law.
In May 2005, Daniel Wallace filed suit against the Free Software Foundation in the Southern District of Indiana, contending that the GPL is an illegal attempt to fix prices (at zero). The suit was dismissed in March 2006, on the grounds that Wallace had failed to state a valid antitrust claim; the court noted that "the GPL encourages, rather than discourages, free competition and the distribution of computer operating systems, the benefits of which directly pass to consumers". Wallace was denied the possibility of further amending his complaint, and was ordered to pay the FSF's legal expenses.
On 8 September 2005, the Seoul Central District Court ruled that the GPL was not material to a case dealing with trade secrets derived from GPL-licensed work. The defendants argued that since it is impossible to maintain trade secrets while complying with the GPL and distributing the work, they were not in breach of trade secrets. The court rejected this argument as groundless.
On 6 September 2006, the gpl-violations.org project prevailed in court litigation against D-Link Germany GmbH regarding D-Link's copyright-infringing use of parts of the Linux kernel in storage devices they distributed. The judgment stated that the GPL is valid, legally binding, and stands in German court.
In late 2007, the BusyBox developers and the Software Freedom Law Center embarked upon a program to gain GPL compliance from distributors of BusyBox in embedded systems, suing those who would not comply. These were claimed to be the first US uses of courts for enforcement of GPL obligations. (See BusyBox GPL lawsuits.)
On 11 December 2008, the Free Software Foundation sued Cisco Systems, Inc. for copyright violations by its Linksys division, of the FSF's GPL-licensed coreutils, readline, Parted, Wget, GNU Compiler Collection, binutils, and GNU Debugger software packages, which Linksys distributes in the Linux firmware of its WRT54G wireless routers, as well as numerous other devices including DSL and Cable modems, Network Attached Storage devices, Voice-Over-IP gateways, virtual private network devices, and a home theater/media player device.
After six years of repeated complaints to Cisco by the FSF, of claims by Cisco that it would correct, or was correcting, its compliance problems (it was not providing complete copies of all source code and their modifications), of repeated new violations being discovered and reported in more products, and of a lack of action by Linksys (a process described on the FSF blog as a "five-years-running game of Whack-a-Mole"), the FSF took them to court.
Cisco settled the case six months later by agreeing "to appoint a Free Software Director for Linksys" to ensure compliance, "to notify previous recipients of Linksys products containing FSF programs of their rights under the GPL," to make source code of FSF programs freely available on its website, and to make a monetary contribution to the FSF.
In 2011, it was noticed that GNU Emacs had been accidentally releasing some binaries without corresponding source code for two years, in opposition to the intended spirit of the GPL, resulting in a copyright violation. Richard Stallman described this incident as a "very bad mistake", which was promptly fixed. The FSF did not sue any downstream redistributors who also unknowingly violated the GPL by distributing these binaries.
In 2017 Artifex, the maker of Ghostscript, sued Hancom, the maker of an office suite which included Ghostscript. Artifex offers two licenses for Ghostscript; one is the Affero GPL License and the other is a commercial license. Hancom did not acquire a commercial license from Artifex nor did it release its office suite as free software. Artifex sued Hancom in US District Court and made two claims. First, Hancom's use of Ghostscript was a violation of copyright; and second, Hancom's use of Ghostscript was a license violation. Judge Jacqueline Scott Corley found the GPL license was an enforceable contract and Hancom was in breach of contract.
Compatibility and multi-licensing
Code licensed under several other licenses can be combined with a program under the GPL without conflict, as long as the combination of restrictions on the work as a whole does not put any additional restrictions beyond what GPL allows. In addition to the regular terms of the GPL, there are additional restrictions and permissions one can apply:
If a user wants to combine code licensed under different versions of the GPL, this is only allowed if the code with the earlier GPL version includes an "or any later version" statement. For instance, the GPLv3-licensed GNU LibreDWG library can no longer be used by LibreCAD and FreeCAD, which have GPLv2-only dependencies.
Code licensed under the LGPL is permitted to be linked with any other code no matter what license that code has, though the LGPL does add additional requirements for the combined work. LGPLv3 and GPLv2-only code thus commonly cannot be linked, as the combined work would add additional LGPLv3 requirements on top of the GPLv2-only licensed software. Code licensed under LGPLv2.x without the "any later version" statement can be relicensed if the whole combined work is licensed under GPLv2 or GPLv3 (see the sketch below).
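The following is a minimal sketch of how such version distinctions are often recorded per file, assuming the SPDX-License-Identifier tagging convention used in many source trees (the Linux kernel among them); the comments merely summarize the compatibility rules described above and carry no additional legal effect.

/* SPDX-License-Identifier: GPL-2.0-or-later */
/* May be combined with GPLv2-only or GPLv3 code; the combined work
   is then distributed under a version both parts allow
   (GPLv2 in the first case, GPLv3 in the second). */

/* SPDX-License-Identifier: GPL-2.0-only */
/* Cannot be combined with GPLv3-only or LGPLv3-only code. */

/* SPDX-License-Identifier: LGPL-2.1-or-later */
/* Linkable with code under other licenses; may be relicensed under
   GPLv2 or GPLv3 when incorporated into a combined GPL work. */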
FSF maintains a list of GPL-compatible free software licenses containing many of the most common free software licenses, such as the original MIT/X license, the BSD license (in its current 3-clause form), and the Artistic License 2.0.
Since GPLv3, compatibility with the Creative Commons Attribution-ShareAlike 4.0 International License is one-way: material under that license (such as text and other media) may be remixed into GPL-licensed works (most prominently software), but not vice versa. This serves niche use cases such as a game engine under the GPL combined with game scripts under CC BY-SA.
David A. Wheeler has advocated that free/open source software developers use only GPL-compatible licenses, because doing otherwise makes it difficult for others to participate and contribute code. As a specific example of license incompatibility, Sun Microsystems' ZFS cannot be included in the GPL-licensed Linux kernel, because it is licensed under the GPL-incompatible Common Development and Distribution License. Furthermore, ZFS is protected by patents, so distributing an independently developed GPL-ed implementation would still require Oracle's permission.
A number of businesses use multi-licensing to distribute a GPL version and sell a proprietary license to companies wishing to combine the package with proprietary code, using dynamic linking or not. Examples of such companies include MySQL AB, Digia PLC (Qt framework, before 2011 from Nokia), Red Hat (Cygwin), and Riverbank Computing (PyQt). Other companies, like the Mozilla Foundation (products include Mozilla Application Suite, Mozilla Thunderbird, and Mozilla Firefox), used multi-licensing to distribute versions under the GPL and some other open-source licenses.
Text and other media
It is possible to use the GPL for text documents instead of computer programs, or more generally for all kinds of media, if it is clear what constitutes the source code (defined as "the preferred form of the work for making changes in it"). For manuals and textbooks, though, the FSF recommends the GNU Free Documentation License (GFDL) instead, which it created for this purpose. Nevertheless, the Debian developers recommended (in a resolution adopted in 2006) to license documentation for their project under the GPL, because of the incompatibility of the GFDL with the GPL (text licensed under the GFDL cannot be incorporated into GPL software). Also, the FLOSS Manuals foundation, an organization devoted to creating manuals for free software, decided to eschew the GFDL in favor of the GPL for its texts in 2007.
If the GPL is used for computer fonts, any documents or images made with such fonts might also have to be distributed under the terms of the GPL. This is not the case in countries that recognize typefaces (the appearance of fonts) as being a useful article and thus not eligible for copyright, but font files as copyrighted computer software (which can complicate font embedding, since the document could be considered 'linked' to the font; in other words, embedding a vector font in a document could force it to be released under the GPL, but a rasterized rendering of the font would not be subject to the GPL). The FSF provides an exception for cases where this is not desired.
Adoption
Historically, the GPL license family has been one of the most popular software licenses in the FOSS domain.
A 1997 survey of MetaLab, then the largest free software archive, showed that the GPL accounted for about half of the software licensed therein. Similarly, a 2000 survey of Red Hat Linux 7.1 found that 53% of the source code was licensed under the GPL. Later surveys found that about 68% of all projects and 82.1% of the open source industry certified licensed projects listed on SourceForge.net were from the GPL license family, and that the GPL family accounted for 70.9% of the 44,927 free software projects listed on Freecode.
After the release of the GPLv3 in June 2007, adoption of this new GPL version was much discussed and some projects decided against upgrading. For instance the Linux kernel, MySQL, BusyBox, AdvFS, Blender, VLC media player, and MediaWiki decided against adopting GPLv3.
On the other hand, in 2009, two years after the release of GPLv3, Google open-source programs office manager Chris DiBona reported that the number of open-source project licensed software that had moved from GPLv2 to GPLv3 was 50%, counting the projects hosted at Google Code.
In 2011, four years after the release of the GPLv3, 6.5% of all open-source licensed projects were GPLv3 while 42.5% were GPLv2, according to Black Duck Software data. Also in 2011, 451 Group analyst Matthew Aslett argued in a blog post that copyleft licenses had gone into decline while permissive licenses had increased, based on statistics from Black Duck Software. Similarly, in February 2012 Jon Buys reported that among the top 50 projects on GitHub, five were under a GPL license, including dual-licensed and AGPL projects.
GPL usage statistics from 2009 to 2013 were extracted from Freecode data by Walter van Holst while analyzing license proliferation.
In August 2013, according to Black Duck Software, the website's data shows that the GPL license family is used by 54% of open-source projects, with a breakdown of the individual licenses shown in the following table. However, a later study in 2013 showed that software licensed under the GPL license family has increased, and that even the data from Black Duck Software has shown a total increase of software projects licensed under GPL. The study used public information gathered from repositories of the Debian Project, and the study criticized Black Duck Software for not publishing their methodology used in collecting statistics. Daniel German, Professor in the Department of Computer Science at the University of Victoria in Canada, presented a talk in 2013 about the methodological challenges in determining which are the most widely used free software licenses, and showed how he could not replicate the result from Black Duck Software.
In 2015, according to Black Duck, GPLv2 lost its first position to the MIT license and fell to second place, while GPLv3 dropped to fourth place and the Apache License kept its third position.
A March 2015 analysis of the GitHub repositories revealed, for the GPL license family, a usage percentage of approximately 25% among licensed projects. In June 2016, an analysis of Fedora Project's packages revealed the GNU GPL version 2 or later as the most popular license, and the GNU GPL family as the most popular license family (followed by the MIT, BSD, and GNU LGPL families).
An April 2018 analysis of the FOSS ecosystem by whitesourcesoftware.com placed GPLv3 in third place (18%) and GPLv2 in fourth (11%), after the MIT license (26%) and the Apache 2.0 license (21%).
Reception
Legal barrier to app stores
The GPL is incompatible with many application digital distribution systems, like the Mac App Store, and certain other software distribution platforms (on smartphones as well as PCs). The problem lies in the right "to make a copy for your neighbour", as this right is violated by digital rights management systems embedded within the platform to prevent copying of paid software. Even if the application is free in the app store in question, it might result in a violation of that app store's terms.
There is a distinction between an app store, which sells DRM-restricted software under proprietary licenses, and the more general concept of digital distribution via some form of online software repository. Various UNIX-like distributions provide app repositories, including Fedora, RHEL, CentOS, Ubuntu, Debian, FreeBSD, OpenBSD, and so on. These specific app repos all contain GPL-licensed software apps, in some cases even when the core project does not permit GPL-licensed code in the base system (for instance OpenBSD). In other cases, such as the Ubuntu App Store, proprietary commercial software applications and GPL-licensed applications are both available via the same system; the reason that the Mac App Store (and similar projects) is incompatible with GPL-licensed apps is not inherent in the concept of an app store, but is rather specifically due to Apple's terms-of-use requirement that all apps in the store utilize Apple DRM restrictions. Ubuntu's app store does not demand any such requirement: "These terms do not limit or restrict your rights under any applicable open source software licenses."
Microsoft
In 2001, Microsoft CEO Steve Ballmer referred to Linux as "a cancer that attaches itself in an intellectual property sense to everything it touches". In response to Microsoft's attacks on the GPL, several prominent Free Software developers and advocates released a joint statement supporting the license. Microsoft has released Microsoft Windows Services for UNIX, which contains GPL-licensed code. In July 2009, Microsoft itself released a body of around 20,000 lines of Linux driver code under the GPL. The Hyper-V code that is part of the submitted code used open-source components licensed under the GPL and was originally statically linked to proprietary binary parts, the latter being inadmissible in GPL-licensed software.
"Viral" nature
The description of the GPL as "viral", under names such as "General Public Virus" or "GNU Public Virus" (GPV), dates back to a year after the GPLv1 was released.
In 2001, the term received broader public attention when Craig Mundie, Microsoft Senior Vice President, described the GPL as being "viral". Mundie argues that the GPL has a "viral" effect in that it only allows the conveyance of whole programs, which means programs that link to GPL libraries must themselves be under a GPL-compatible license, else they cannot be combined and distributed.
In 2006, Richard Stallman responded in an interview that Mundie's metaphor of a "virus" is wrong as software under the GPL does not "attack" or "infect" other software. Accordingly, Stallman believes that comparing the GPL to a virus is inappropriate, and that a better metaphor for software under the GPL would be a spider plant: if one takes a piece of it and puts it somewhere else, it grows there too.
On the other hand, the concept of a viral nature of the GPL was taken up by others later too. For instance, a 2008 article stated: "The GPL license is 'viral,' meaning any derivative work you create containing even the smallest portion of the previously GPL licensed software must also be licensed under the GPL license."
Barrier to commercialization
The FreeBSD project has stated that "a less publicized and unintended use of the GPL is that it is very favorable to large companies that want to undercut software companies. In other words, the GPL is well suited for use as a marketing weapon, potentially reducing overall economic benefit and contributing to monopolistic behavior" and that the GPL can "present a real problem for those wishing to commercialize and profit from software."
Richard Stallman wrote about the practice of selling license exceptions to free software licenses as an example of ethically acceptable commercialization practice. Selling exceptions here means that the copyright holder of a given software releases it (along with the corresponding source code) to the public under a free software license, "then lets customers pay for permission to use the same code under different terms, for instance allowing its inclusion in proprietary applications". Stallman considered selling exceptions "acceptable since the 1990s, and on occasion I've suggested it to companies. Sometimes this approach has made it possible for important programs to become free software". Although the FSF does not practice selling exceptions, a comparison with the X11 license (which is a non-copyleft free software license) is proposed for suggesting that this commercialization technique should be regarded as ethically acceptable. Releasing a given program under a non-copyleft free software license would permit embedding the code in proprietary software. Stallman comments that "either we have to conclude that it's wrong to release anything under the X11 license—a conclusion I find unacceptably extreme—or reject this implication. Using a non-copyleft license is weak, and usually an inferior choice, but it's not wrong. In other words, selling exceptions permits some embedding in proprietary software, and the X11 license permits even more embedding. If this doesn't make the X11 license unacceptable, it doesn't make selling exceptions unacceptable".
Open-source criticism
In 2000, developer and author Nikolai Bezroukov published an analysis and comprehensive critique of GPL's foundations and Stallman's software development model, called "Labyrinth of Software Freedom".
Version 2 of the WTFPL (Do What The Fuck You Want To Public License) was created by Debian project leader Sam Hocevar in 2004 as a parody of the GPL.
In 2005, open source software advocate Eric S. Raymond questioned the relevance of GPL then for the FOSS ecosystem, stating: "We don't need the GPL anymore. It's based on the belief that open source software is weak and needs to be protected. Open source would be succeeding faster if the GPL didn't make lots of people nervous about adopting it." Richard Stallman replied: "GPL is designed to ... ensure that every user of a program gets the essential freedoms—to run it, to study and change the source code, to redistribute copies, and to publish modified versions... [Raymond] addresses the issue in terms of different goals and values—those of 'open source,' which do not include defending software users' freedom to share and change software."
In 2007, Allison Randal, who took part in the GPL draft committee, criticized the GPLv3 for being incompatible with the GPLv2 and for lacking clarity in its formulation. Similarly, in 2007 Whurley predicted the downfall of the GPL, arguing that GPLv3's lack of focus on developers would drive them toward permissive licenses.
In 2009, David Chisnall described in an InformIT article, "The Failure of the GPL", the problems with the GPL, among them incompatibility and complexity of the license text.
In 2014, DTrace developer and Joyent CTO Bryan Cantrill called the copyleft GPL a "Corporate Open Source Anti-pattern" for being "anti-collaborative" and recommended permissive software licenses instead.
GPLv3 criticism
Already in September 2006, during the drafting of the GPLv3, several high-profile Linux kernel developers, for instance Linus Torvalds, Greg Kroah-Hartman, and Andrew Morton, warned of a split in the FOSS community: "the release of GPLv3 portends the Balkanisation of the entire Open Source Universe upon which we rely."
Similarly, Benjamin Mako Hill argued in 2006 regarding the GPLv3 draft that a united, collaborating community is more important than a single license.
Following the GPLv3 release in 2007, some journalists and Toybox developer Rob Landley argued that the introduction of the GPLv3 had widened the split between the open-source and free-software communities more than ever. As the significantly extended GPLv3 is essentially incompatible with the GPLv2, compatibility between the two exists only under the GPL's optional "or later" clause, which was not adopted by, for instance, the Linux kernel. Bruce Byfield noted that before the release of the GPLv3, the GPLv2 was a unifying element between the open-source and free-software communities.
For the LGPLv3, GNU TLS maintainer Nikos Mavrogiannopoulos similarly argued, "If we assume that its [the LGPLv3] primary goal is to be used by free software, then it blatantly fails that", after he re-licensed GNU TLS from LGPLv3 back to LGPLv2.1 due to license compatibility issues.
Lawrence Rosen, attorney and computer specialist, praised in 2007 how the community using the Apache license was now able to work together with the GPL community in a compatible manner, as the problems of GPLv2 compatibility with Apache licensed software were resolved with the GPLv3. He said, "I predict that one of the biggest success stories of GPLv3 will be the realization that the entire universe of free and open-source software can thus be combined into comprehensive open source solutions for customers worldwide."
In July 2013, Flask developer Armin Ronacher drew a less optimistic conclusion about GPL compatibility in the FOSS ecosystem: "When the GPL is involved the complexities of licensing becomes a non fun version of a riddle", also noting that the conflict between the Apache License 2.0 and GPLv2 still had an impact on the ecosystem.
See also
Anti-copyright
Dual-licensing
European Union Public Licence (EUPL)
GPL font exception
GPL linking exception
List of software licenses
Permissive and copyleft licenses
:Category:Software using the GPL license
Notes
References
External links
GNU General Public License (version 3)
GNU General Public License v2.0—This version is deprecated by the FSF but is still used by many software projects, including the Linux kernel and GNU packages.
GNU General Public License v1.0—This version is deprecated by the FSF.
The Emacs General Public License, a February 1988 version, a direct predecessor of the GNU GPL
History of the GPL by Li-Cheng Tai, 4 July 2001
A Practical Guide to GPL Compliance (Covers GPLv2 and v3)—from the Software Freedom Law Center
A paper on enforcing the GPL
Frequently Asked Questions about the GPL
GNU General Public License and Commentaries, edited by Robert Chassell
List of presentation transcripts about the GPL and free software licenses by the FSFE
The Labyrinth of Software Freedom BSD vs GPL and social aspects of free licensing debate, by Nikolai Bezroukov
Free and open-source software licenses
Copyleft software licenses
GNU Project
Copyleft |