13073470
https://en.wikipedia.org/wiki/Font%20Fusion
Font Fusion
Bitstream Font Fusion is a small, fast, object-oriented font engine written in ANSI C that can render high-quality text on any platform, any device, and at any resolution. The source code is portable, optimized, and runs independently of operating system and processor. The engine can render 2,400-3,300 characters per second on a 100 MIPS CPU, and is designed to meet memory and performance requirements even when Asian languages containing thousands of characters must be supported. Font Fusion is also the core technology behind other Bitstream products: Panorama, ThunderHawk and myMMS.

Version history
In the late 1980s, Sampo Kaasila, lead developer of TrueType and founder of Type Solutions (now a wholly owned subsidiary of Bitstream Inc.), designed T2K, a font renderer with an object-oriented design and advanced architecture and algorithms that could be embedded in all sorts of devices. In 1998, Bitstream acquired Type Solutions, and T2K evolved into Bitstream's font rasterizer, Font Fusion.

Features
Enhanced Font Support — Font Fusion supports the Web Open Font Format (WOFF), OpenType fonts, Multiple Master PostScript fonts, and Type 1 fonts.
New Font Manager — The Font Manager module has been rewritten from scratch. Compared to the previous version, it is faster, consumes less memory, and has a richer set of user APIs. It also includes an optional Android wrapper add-on that lets an Android application use the Font Fusion rendering engine.
Optimized hmtx Structure — Includes an optimized loading process for the Horizontal Metrics (hmtx) table.
32-bit Filter Tag Support — Font Fusion supports 32-bit filter tags, allowing users to add a larger number of filters.
Lossless Font Compression — The font engine can read and render industry-standard fonts, bitmap fonts, and outline fonts in a compressed format. Each compressed font consumes less memory, achieving roughly a 2-to-1 compression factor; for example, a unified stroke-based CJK font with 37,000 characters is under 1 MB with optimum compression.
CJK Bitmap Font Compression — Font Fusion implements a compression algorithm for CJK bitmap fonts that compresses the embedded bitmaps; the resulting format is Bitstream's proprietary compression format for CJK bitmap fonts.
Fractional Size and Positioning — Supports fractional sizing and positioning of characters, so that text strings can be fit into any region.
Smart Scaling — Regulates the adjustment of characters that extend beyond the set height parameters and might otherwise be clipped when rendered on small-screen devices, ensuring that scaled characters remain in proportion to the other characters in the font.
Cache Management — Includes a dedicated cache manager that stores rendered characters (bitmaps) to maintain system performance.
Small Footprint — The Font Fusion code size for devices varies from 65 to 187 KB, depending on the configuration chosen.
Extraordinary Typographical Quality — Native TrueType hinting produces high-quality output on any device. Anti-aliasing techniques and TV/LCD modes further improve glyph output, whether the device is a mobile handset or a large digital TV.
Low Memory Requirement — Only 16-40 KB of RAM is required for a Latin font and 27-34 KB of RAM for a stroke-based Asian font.
Scalable Text — Supports high-quality scalable text for mobile and smart phones. Device manufacturers and mobile developers can replace bitmaps at a single point size with a scalable font that can be rendered at different sizes.
Stroke-based Font Support — Uses a proprietary stroke-based font technology built on a library of common components, called "radicals", that appear repeatedly in characters. The radicals and strokes are pieced together and rendered on the fly to create characters.

Language Coverage/Font Support
Compact Asian fonts
Industry-standard Asian fonts
Cyrillic
Greek
Arabic (complex script)
Hebrew (complex script)
Indic (complex scripts)
Thai (complex script)
Over 50 other worldwide languages

Font Formats Supported
Multiple master fonts
WOFF fonts
Type 1
TrueType
TrueType collections
OpenType
Compact Font Format (CFF)/Type 2
TrueDoc Portable Font Resources (PFRs)
Bitstream Speedo
T2K
Font Fusion Stroke (FFS)
Embedded bitmaps (TrueType, TrueDoc, and T2K)
Bitmap Distribution Format (BDF)
Mac font suitcase (Dfont)
Windows bitmap font format FNT/FON
PCLeo (PCL Encapsulated Outline), an Intellifont font format
PCLetto (PCL Encapsulated TrueType Outline), soft fonts for printing applications

Applications/Operating Systems Supported
Cross-platform applications
Web (HTML) applications
Macintosh & Windows
BREW
Linux & UNIX
Embedded operating systems
Real-time operating systems

Devices Supported
Consumer electronic devices, mobile handsets, set-top boxes, digital TVs, printers, printer controllers, fax machines, multi-function devices, medical imaging devices, GPS systems, automobile displays, and other embedded systems

Software Applications Supported
Web, graphics, and gaming applications

Font Fusion Plug-In for Symbian
The Font Fusion plug-in is available for the Symbian OS as a dynamic-link library (DLL). The plug-in inherits all the features of the core Font Fusion engine.

Font Fusion Plug-In for BREW
The Font Fusion plug-in for the BREW platform provides a standard font-rendering framework that implements the relevant BREW interfaces, supporting scalable and multilingual text.

Font Fusion Plug-In for Qtopia
A Font Fusion framework is available for Qtopia, allowing a third-party font rendering engine to work as a plug-in with the Qt/Qtopia application platform. The framework also gives Qt/Qtopia compatibility with any font format supported by the engine.

See also
Bitstream Inc.

References

External links
Bitstream Releases Panorama 5.0 and Font Fusion 5.0 2007 Bitstream Press Releases
Font Engine and Font Rasterizer Technology Embedded Technology Journal
Font Fusion Bitstream Offers Small, Fast Font Rendering Subsystem for Symbian OS Mobile Phones

Typography
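The Cache Management feature described above stores rendered character bitmaps so that repeated glyphs do not have to be re-rasterized. The Python sketch below illustrates that idea only; the class and field names are assumptions for illustration and are not Font Fusion's actual API.

```python
from collections import OrderedDict

class GlyphCache:
    """Tiny LRU cache for rendered glyph bitmaps, keyed by (font, size, codepoint).

    Illustrative only: a real engine such as Font Fusion manages this internally
    and typically budgets the cache in bytes rather than by entry count.
    """

    def __init__(self, max_entries=256):
        self.max_entries = max_entries
        self._entries = OrderedDict()

    def get(self, font, size, codepoint, rasterize):
        key = (font, size, codepoint)
        if key in self._entries:
            self._entries.move_to_end(key)         # mark as most recently used
            return self._entries[key]
        bitmap = rasterize(font, size, codepoint)  # expensive path: render the glyph
        self._entries[key] = bitmap
        if len(self._entries) > self.max_entries:
            self._entries.popitem(last=False)      # evict least recently used glyph
        return bitmap

# Example usage with a stand-in rasterizer that just returns a placeholder string.
cache = GlyphCache(max_entries=2)
fake_rasterize = lambda font, size, cp: f"<bitmap {font}/{size}/{chr(cp)}>"
print(cache.get("Swiss721", 12, ord("A"), fake_rasterize))  # rendered
print(cache.get("Swiss721", 12, ord("A"), fake_rasterize))  # served from cache
```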
35506284
https://en.wikipedia.org/wiki/Software%20bus
Software bus
A software bus is a software architecture model in which a shared communication channel facilitates connections and communication between software modules. This makes software buses conceptually similar to a bus in computer hardware: a shared pathway that interconnects components. In the early microcomputer era of the 1970s, Digital Research's operating system CP/M was often described as a software bus. Lifeboat Associates, an early distributor of CP/M and later of MS-DOS software, had a whole product line named Software Bus. D-Bus is used in many modern desktop environments to allow multiple processes to communicate with one another.

Examples
Lifeboat Associates Software Bus-80 (a.k.a. SB-80), a version of CP/M-80 for 8080/Z80 8-bit computers
Lifeboat Associates Software Bus-86 (a.k.a. SB-86), a version of MS-DOS for x86 16-bit computers
Component Object Model, for in-process and interprocess communication
D-Bus, for interprocess communication
Enterprise service bus, for distributed communication

See also
Bus (computing)

References

External links
Microsoft MSDN: Microsoft on the Enterprise Service Bus (ESB)

Software architecture
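As a minimal illustration of the shared-channel idea described in this article, the Python sketch below implements a tiny in-process publish/subscribe bus. The API and module names are hypothetical; a real software bus such as D-Bus additionally crosses process boundaries and handles discovery, typing and security.

```python
from collections import defaultdict

class SoftwareBus:
    """Toy in-process software bus: modules publish messages on named
    channels, and any module subscribed to that channel receives them."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        self._subscribers[channel].append(callback)

    def publish(self, channel, message):
        for callback in self._subscribers[channel]:
            callback(message)

# Two hypothetical modules communicating without knowing about each other.
bus = SoftwareBus()
bus.subscribe("battery.status", lambda msg: print("UI module:", msg))
bus.subscribe("battery.status", lambda msg: print("Logger module:", msg))
bus.publish("battery.status", {"level": 42, "charging": False})
```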
26814551
https://en.wikipedia.org/wiki/Agression%20%28band%29
Agression (band)
Agression is an American hardcore punk band from Silver Strand Beach, Oxnard, California, United States. Agression was one of the first bands of the "Nardcore" scene and an early example of the "skate punk" style. The band fused skate culture with the punk scene, featuring a song about skateboarding ("Intense Energy") and a Glen E. Friedman photo of Arthur Lake skating a pool on their first and most popular album, Don't Be Mistaken. The band is known for its fast-paced, aggressive songs such as "Slammin' at the Club", "Money Machine", "Dear John Letter", "Never Alone", "Go to War", and "Locals Only". History Early 1980s - 2000 Before becoming a punk band, in 1979 singer Mark Hickey, guitarist Henry Knowles, and bassist Bob Clark skated for Sims. Clark claims to be a co-founding member of Agression. Although Clark was verifiably close friends with Hickey and Knowles at the time, Hickey contradicts this claim in an interview with The Hooligan magazine. Regardless, Clark was the original and most senior bassist according to the timeline of lineups and albums released with his participation. Big Bob, Mark and Henry were roommates. The unit was one of the first groups to be involved with the Better Youth Organization (BYO) – the collective started by Shawn and Mark Stern of Youth Brigade — and contributed three songs, "Intense Energy", "Dear John Letter" and "Rat Race" to Someone Got Their Head Kicked In, the label's 1982 compilation album. Don't Be Mistaken, their debut album, was released on BYO in 1983, full of rapid-fire skatepunk rhythms. Contributions to various Mystic Records compilations followed, with Agression's sound wavering between skatepunk and more generic heavy metal; a self-titled LP on that label appeared in 1985, as did the "official" bootleg of an Agression show at the New York club CBGB. Agression was able to get as far as CBGB with the aid of their manager, Scott Hatch, who also managed the more established hardcore outfit Fear, among a stable of many hardcore punk rock bands, Killer Pussy, TSOL and Rik L Rik. As Hickey said at the time, it was difficult not to appreciate the coolness of Agression and Fear listed on one bill. Aber left the group in 1985 for a spot with Angry Samoans. Agression went through many iterations of band members, with vocalist Mark Hickey being the only original member remaining throughout the band's career. Agression relocated from Oxnard, California to a more central location in Denver, Colorado as its home base in the late 1980s. Knowles left the band after a few years in Colorado, returning to California. A short while after, Clark also left the band. For a short time, Agression went on hiatus as new members were sought out. The remaining members in Colorado were Hickey and Mike Minnick, A.K.A. "Fluffy Machete". Minnick was well known in the So-Cal punk scene as the former drummer of two seminal Nardcore bands, Ill Repute and Habeas Corpus. To fill the vacuum left by the loss of Knowles, in 1992, Hickey permanently added his friend and locally well known guitar virtuoso Kent Taylor, and went through several fill-in bassists including "Jelly Roll" (real name unknown), and Conrad Sear, before finally settling on bassist Adam Pittman A.K.A. "Commander Adama" as a permanent addition to the band. Sear was temporarily retained for a short period as a rhythm guitarist, and left the band soon after. 
With the new Colorado lineup, Agression began touring again and recorded two albums, Nowhere to Run and Live at the Lair, the latter a live album recorded in the infamous Denver punk club the Lion's Lair. Hickey's version of Agression fizzled out and disbanded after he finally pulled the plug in June 1998. The distinctive "bar-s" logo the band had used since 1981, designed by artist Jamie Hernandez (illustrator of the comic Love and Rockets), resembles a black bar with a single lightning-bolt-style S (more accurately the Sig, or Sowilo, Germanic rune meaning "sun" or "victory") offset to the left of the bar. The logo was misunderstood by many younger members of the punk scene: it was mistaken for a Nazi "SS" symbol, and rumors circulated that the band were racists, Nazis or white supremacists. This was compounded by Hickey's self-identification as a "traditional skinhead".

2000 onwards
In early 2000, the band reformed in San Jose, California, with Knowles, Rob Thacker, Ryan Fassler and Drew Klein. This line-up played many shows with bands like Dead Kennedys, DI, Jim Jones Brigade, and Pitch Black, and continued until the summer of 2001. In late 2001, Knowles contacted original drummer Mark Aber/Rooney about playing together again with Big Bob on bass, but Henry became ill. The new millennium brought the deaths of two of the band's founding members: Hickey in 2000, of acute liver failure, and Knowles in 2002, of leukemia. In mid-2002, a "Henry Knowles Memorial Show" took place at Skatelab in Simi Valley, California, featuring the "Nardcore All-Stars" (who played an all-Agression set list), Rejected Society, The Missing 23rd, Dr. Know and Ill Repute, among others. To this day, many bands remain highly influenced by the sound of Agression; the high-energy, surf-guitar-influenced sound and Hickey's distinctive voice set the band apart from its contemporaries. Dr. Strange Records has released a tribute to Agression titled Taking Out A Little Agression, featuring many of their contemporaries along with modern bands covering Agression songs. In October 2006, several former members of Agression played the Spike Fest in Long Beach; this lineup consisted of Clark, Mark Aber, Thacker, Dave Haro, and JellyRoll for a few songs. This continued in November with two shows in Northern California. From June 14-17, 2007, the same lineup minus JellyRoll performed in Northern California in a four-day tour to celebrate the release of the Agression tribute compilation. The release is out on Dr. Strange Records on CD and vinyl, and was expected to be available in stores worldwide on July 24, 2007. The compilation is dedicated to all those who have ever played in Agression over the years; track 28 is a song called "What Did I Do", written and recorded by Henry Knowles a few years before he died. More recently, Agression played at Punk Rock Picnic 2011 on Saturday, April 9, 2011, with the latest line-up: Mark Aber/Rooney, Clark, Thacker, Jess Leedy and Danny Dorman from Circle One. They have continued to play shows and are set to record a new album for DC-Jam Records. On January 20, 2013, a man was shot and killed and another man was stabbed when a fight at an Agression concert in Torrance spilled into the parking lot.

Members
Original members:
Mark Hickey - vocals
Henry Knowles - guitar
Big Bob Clark - bass (died 2021)
Mark Aber/Van Haeln - drums
Other former drummers include Larry White and John Mitchell.
Current line-up: Jess Leedy - lead vocals Chuck Schultz - drums Wyatt Torres - guitar Discography Better Youth Organization Presents - Someone Got Their Head Kicked In Compilation LP (BYO, 1982) Don't Be Mistaken LP (BYO 1983) Copulation - The Sound of Hollywood Compilation LP (Mystic, 1984) Nardcore Compilation LP (Mystic, 1984) Agression Bootleg LP, (Mystic/Bootleg 1985) Agression LP (Mystic 1985) Live Underground Railroad 7" (Mystic 1986) Best of Agression CD (Mystic 1995) Nowhere to Run CD (High Five Records 1995) Live at the Lair CD (High Five Records 1996) Full Circle CD (Cleopatra Records 2003) Locals Only: Live CD (Mystic 2005) Grind Kings CD (Lucky 13) References Other sources last.fm All Music Guide, Loftus KPFR radio interview, July, 1995, Hickey, Pittman, Minnick, Taylor and Sear Hooligan Magazine, 1996 article: "Interview with Mark Hickey" PEAK radio interview 1995 by DJ Sam Stock with Hickey, Minnick Totalpunk web radio Negative Man information page Musical groups from Ventura County, California Hardcore punk groups from California Musicians from Oxnard, California Cleopatra Records artists
18606825
https://en.wikipedia.org/wiki/Billix
Billix
Billix is a live USB or live CD that supports multiple Linux distributions (including several special-purpose distributions). Billix was designed as a toolkit for UNIX and Linux system administrators. As configured, it supports several distributions:
Damn Small Linux
Ubuntu (three most current versions plus server versions of each)
Debian (two most current versions)
CentOS (two most current versions)
Fedora (latest version)
The following programs are also included:
Memtest86
DBAN

External links
Billix homepage
Billix project on SourceForge.net
Billix: Sysadmin's Swiss Army Knife at LinuxJournal.com
Announcing Billix 0.27 and... SuperBillix 0.27! at LinuxJournal.com

Live USB
Operating system distributions bootable from read-only media
55604212
https://en.wikipedia.org/wiki/R%C3%A9sum%C3%A9%20parsing
Résumé parsing
Resume parsing, also known as CV parsing, resume extraction, or CV extraction, allows for the automated storage and analysis of resume data. The resume is imported into parsing software and the information is extracted so that it can be sorted and searched.

Principle
Resume parsers analyze a resume, extract the desired information, and insert the information into a database with a unique entry for each candidate. Once the resume has been analyzed, a recruiter can search the database for keywords and phrases and get a list of relevant candidates. Many parsers support semantic search, which adds context to the search terms and tries to understand intent in order to make the results more reliable and comprehensive.

Machine learning
Machine learning is central to resume parsing: each block of information needs to be given a label and sorted into the correct category, whether that is education, work history, or contact information. Rule-based parsers use a predefined set of rules to parse the text. This method works poorly for resumes because the parser needs to "understand the context in which words occur and the relationship between them." For example, if the word "Harvey" appears on a resume, it could be the name of an applicant, refer to the college Harvey Mudd, or reference the company Harvey & Company LLC. The abbreviation MD could mean "Medical Doctor" or "Maryland". A rule-based parser would require incredibly complex rules to account for all the ambiguity and would still provide limited coverage. This is where machine learning, and specifically natural language processing (NLP), comes in. NLP is a branch of artificial intelligence that uses machine learning to understand content and context and to make predictions. Many NLP features are important in resume parsing. Acronym normalization and tagging accounts for the different possible formats of acronyms and normalizes them. Lemmatization reduces words to their root using a language dictionary, while stemming strips suffixes such as "s" and "ing". Entity extraction uses regular expressions, dictionaries, statistical analysis and complex pattern-based extraction to identify people, places, companies, phone numbers, email addresses, important phrases and more.

Effectiveness
Resume parsers have achieved up to 87% accuracy, which refers to the accuracy of data entry and of categorizing the data correctly. Human accuracy is typically not greater than 96%, so resume parsers have achieved "near human accuracy." One executive recruiting company tested three resume parsers against humans to compare data-entry accuracy. They ran 1,000 resumes through the resume parsing software and had humans manually parse and enter the data, then brought in a third party to evaluate how the humans did compared to the software. The results from the resume parsers were more comprehensive and had fewer mistakes; the humans did not enter all the information on the resumes and occasionally misspelled words or wrote incorrect numbers. In a 2012 experiment, a resume for an ideal candidate was created based on the job description for a clinical scientist position. After going through the parser, one of the candidate's work experiences was completely lost because the date was listed before the employer. The parser also missed several educational degrees. As a result, the candidate received a relevance ranking of only 43%.
If this had been a real candidate's resume, they would not have moved on to the next step even though they were qualified for the position. A similar study on current resume parsers would be useful to see whether there have been any improvements over the past few years.

Benefits
A well-known study conducted by Marianne Bertrand and Sendhil Mullainathan in 2003 looked at whether candidates with the names Emily and Greg were more employable than Lakisha and Jamal. The conclusion was that resumes with white-sounding names received 50% more callbacks than ones with black-sounding names. In 2014, a study was done in Australia and New Zealand to investigate name discrimination based on gender. Insync Surveys, a research firm, and Hays, a recruitment specialist, sent out a resume to 1,029 hiring managers with the name being the only difference: half the hiring managers received a resume for Simon Cook and the other half got a resume for Susan Campbell. The study found that Simon was more likely to get a callback. Resume parsing allows candidates to be ranked based on objective information and can help prevent the bias that so easily shows up in the hiring process. The software can be programmed to ignore and hide factors that contribute to bias, such as name, gender, race, age and address. The technology is also cost-effective and a resource saver. Rather than asking candidates to manually enter their information, which could discourage them from applying or waste recruiters' time, data entry is done automatically. The contact information, relevant skills, work history, educational background and other specific information about the candidate is easily accessible. The applicant screening process becomes significantly faster and more efficient: instead of having to look at every resume, recruiters can filter, sort and search them by specific characteristics, allowing them to move through the interview process and fill positions at a faster rate. One of the biggest complaints of people searching for jobs is the length of the application process; with resume parsers, the process is faster and candidates have an improved experience. The technology also helps prevent qualified candidates from slipping through the cracks. On average, a recruiter spends 6 seconds looking at a resume, and when looking through hundreds or thousands of them it is easy to miss or lose track of potential candidates. Once a candidate's resume has been analyzed, their information remains in the database; if a position comes up that they are qualified for but have not applied to, the company still has their information and can reach out to them.

Challenges
Parsing software has to rely on complex rules and statistical algorithms to correctly capture the desired information in resumes. There are many variations of writing style, word choice and syntax, and the same word can have multiple meanings; the date alone can be written hundreds of different ways. It is still a challenge for resume parsers to account for all this ambiguity. Natural language processing and artificial intelligence still have a way to go in understanding context-based information and what humans mean to convey in written language.

Resume optimization
Resume parsers have become so omnipresent that rather than writing to a recruiter, candidates should focus on writing to the parsing system.
Understanding how they work is a good first step, but there are also specific changes an applicant can make to optimize their resume:
Use keywords from the job description in relevant places on your resume; these keywords will almost certainly be included in the parsing process.
Don't use headers or footers, which tend to confuse the parsing algorithms.
Use a simple style for fonts, layouts and formatting, and avoid graphics.
Use standard section names such as "Work Experience" and "Education".
Avoid acronyms unless they are included in the job description; the safest option may be to write the long form and include the acronym after it in parentheses.
Don't start with dates in the "Work Experience" section; parsers typically look for dates following job titles or company names.
Stay consistent when formatting past work experience; the standard order is job title, company name, and then employment dates.
Most resume parsers claim to work with all of the main file types, but stick with docx, doc and pdf to be on the safe side.

Software and vendors
There are many stand-alone resume parsers, including RChilli, Skillate, CandidateZip, Sovren, Daxtra, Textkernel and Hireability, and parsers are also typically bundled with Applicant Tracking Systems, which companies use to streamline the hiring process. 90% of Fortune 500 companies use Applicant Tracking Systems, which can do everything from processing job applications to managing the recruiting process and executing the hiring decision. With recent advances in AI and machine learning, and improvements in text mining and analysis that push data-processing accuracy to as high as 95%, many AI services have sprung up to help job seekers create application documents. These services focus on creating ATS-friendly resumes, performing resume checks and screening, and helping with the preparation and application process. Some of the AI builders, such as Leap.ai and Skillroads, concentrate on resume creation, while others, like Stella, also help with the job hunt itself by matching candidates to appropriate vacancies. In 2017, Google made an attempt at dismantling the US$215.68 billion (as of 2017) global recruitment market with Google for Jobs, which is predicted to greatly affect the labor market. This extension of the search engine uses Cloud Talent Solution, Google's own invention, which is another iteration of the smart AI resume builder and matching system.

Future
Resume parsers are already standard in most mid- to large-sized companies, and this trend will continue as the parsers become even more affordable. A qualified candidate's resume can still be ignored if it is not formatted the proper way or does not contain specific keywords or phrases. As machine learning and natural language processing improve, so will the accuracy of resume parsers. One of the areas resume parsing software is expanding into is performing contextual analysis on the information in the resume rather than purely extracting it. One employee at a parsing company said "a parser needs to classify data, enrich it with knowledge from other sources, normalize data so it can be used for analysis and allow for better searching." Parsing companies are also being asked to expand beyond resumes and LinkedIn profiles; they are working on extracting information from industry-specific sites such as GitHub and from social media profiles.
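To make the entity-extraction and keyword ideas above concrete, here is a minimal Python sketch that pulls contact details out of resume text with regular expressions and scores the resume against keywords taken from a job description. It is illustrative only; production parsers rely on trained NLP models rather than hand-written patterns like these, and the sample resume and keyword list are hypothetical.

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"(?:\+?\d[\d\s().-]{7,}\d)")

def extract_contacts(resume_text):
    """Very small slice of 'entity extraction': find emails and phone numbers."""
    return {
        "emails": EMAIL_RE.findall(resume_text),
        "phones": [p.strip() for p in PHONE_RE.findall(resume_text)],
    }

def keyword_score(resume_text, job_keywords):
    """Fraction of job-description keywords that appear in the resume."""
    words = set(re.findall(r"[a-z+#.]+", resume_text.lower()))
    hits = [kw for kw in job_keywords if kw.lower() in words]
    return len(hits) / len(job_keywords), hits

resume = """Jane Doe
jane.doe@example.com  +1 (555) 010-0199
Work Experience: Data Analyst, Acme Corp, 2019-2023 (Python, SQL, Tableau)"""

print(extract_contacts(resume))
score, matched = keyword_score(resume, ["Python", "SQL", "Spark"])
print(f"keyword coverage: {score:.0%}, matched: {matched}")
```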
References

E-recruitment
Tasks of natural language processing
236654
https://en.wikipedia.org/wiki/Internet%20Group%20Management%20Protocol
Internet Group Management Protocol
The Internet Group Management Protocol (IGMP) is a communications protocol used by hosts and adjacent routers on IPv4 networks to establish multicast group memberships. IGMP is an integral part of IP multicast and allows the network to direct multicast transmissions only to hosts that have requested them. IGMP can be used for one-to-many networking applications such as online streaming video and gaming, and allows more efficient use of resources when supporting these types of applications. IGMP is used only on IPv4 networks; multicast management on IPv6 networks is handled by Multicast Listener Discovery (MLD), which is part of ICMPv6, in contrast to IGMP's bare IP encapsulation.

Architecture
A network designed to deliver a multicast service using IGMP might use the following basic architecture: IGMP operates between a host and a local multicast router, and switches featuring IGMP snooping also derive useful information by observing these IGMP transactions. Protocol Independent Multicast (PIM) is then used between the local and remote multicast routers to direct multicast traffic from hosts sending multicasts to hosts that have registered through IGMP to receive them. IGMP operates on the network layer (layer 3), just like other network management protocols such as ICMP. The IGMP protocol is implemented on hosts and within routers. A host requests membership in a group through its local router, while a router listens for these requests and periodically sends out subscription queries. A single router per subnet is elected to perform this querying function. Some multilayer switches include an IGMP querier capability to allow their IGMP snooping features to work in the absence of an IGMP-capable router in the layer 2 network. IGMP is vulnerable to some attacks, and firewalls commonly allow the user to disable it if not needed.

Versions
There are three versions of IGMP. IGMPv1 is defined by RFC 1112, IGMPv2 by RFC 2236, and IGMPv3 initially by RFC 3376; RFC 4604 updates both IGMPv3 and MLDv2 to better support source-specific multicast. IGMPv2 improves on IGMPv1 by adding the ability for a host to signal its desire to leave a multicast group. IGMPv3 improves on IGMPv2 by supporting source-specific multicast and introducing membership report aggregation. These versions are backwards compatible: a router supporting IGMPv3 can support clients running IGMPv1, IGMPv2 and IGMPv3.
IGMPv1 uses a query-response model. Queries are sent to 224.0.0.1. Membership reports are sent to the group's multicast address.
IGMPv2 accelerates the process of leaving a group and adjusts other timeouts. Leave-group messages are sent to 224.0.0.2. A group-specific query is introduced; group-specific queries are sent to the group's multicast address. A means for routers to select an IGMP querier for the network is also introduced.
IGMPv3 introduces source-specific multicast capability. Membership reports are sent to 224.0.0.22.

Messages
There are several types of IGMP messages:
General membership queries: Sent by multicast routers to determine which multicast addresses are of interest to the systems attached to the networks they serve, and to refresh the group membership state for all systems on the network.
Group-specific membership queries: Used to determine the reception state for a particular multicast address.
Group-and-source-specific queries: Allow the router to determine whether any systems desire reception of messages sent to a multicast group from a source address specified in a list of unicast addresses.
Membership reports: Sent by multicast receivers in response to a membership query, or asynchronously when first registering for a multicast group.
Leave group messages: Sent by multicast receivers when specified multicast transmissions are no longer needed at the receiver.
IGMP messages are carried in bare IP packets with IP protocol number 2. As with the Internet Control Message Protocol, there is no transport layer used with IGMP messaging.

IGMPv2 messages
An IGMPv2 message contains the following fields:
Type: Indicates the message type, with the following values:
Membership Query: 0x11
IGMPv1 Membership Report: 0x12
IGMPv2 Membership Report: 0x16
IGMPv3 Membership Report: 0x22
Leave Group: 0x17
Max Resp Time: Specifies the required responsiveness of replies to a Membership Query (0x11). This field is meaningful only in a Membership Query; in other messages it is set to 0 and ignored by the receiver. The field specifies time in units of 0.1 second (a field value of 10 specifies 1 second). Larger values reduce IGMP traffic burstiness, and smaller values improve protocol responsiveness when the last host leaves a group.
Group Address: The multicast address being queried when sending a Group-Specific or Group-and-Source-Specific Query. The field is zeroed when sending a General Query.
General Queries are sent to the all-hosts address 224.0.0.1, while Group-Specific Queries are sent to the queried group's multicast address.

IGMPv3 membership query
An IGMPv3 membership query contains the following fields:
Max Resp Code: Specifies the maximum time (in 1/10 second increments) allowed before sending a responding report. If the value is below 128, it is used directly; if it is 128 or more, it is interpreted as an exponent and mantissa.
Checksum: The 16-bit one's complement of the one's complement sum of the entire IGMP message.
Group Address: The multicast address being queried when sending a Group-Specific or Group-and-Source-Specific Query. The field is zeroed when sending a General Query.
Resv: This field is reserved. It should be zeroed when sent and ignored when received.
S (Suppress Router-side Processing) Flag: When set, indicates to receiving routers that they are to suppress the normal timer updates.
QRV (Querier's Robustness Variable): If non-zero, contains the Robustness Variable value used by the sender of the query. Routers should update their Robustness Variable to match the most recently received query unless the value is zero.
QQIC (Querier's Query Interval Code): Specifies the Query Interval value (in seconds) used by the querier. If the value is below 128, it is used directly; if it is 128 or more, it is interpreted as an exponent and mantissa.
Number of Sources (N): Specifies the number of source addresses present in the query. For General and Group-Specific Queries, this value is zero. For Group-and-Source-Specific Queries, it is non-zero, but limited by the network's MTU.
Source Address [i]: A vector of N IP unicast addresses, where N is the value in the Number of Sources (N) field.

Implementations
The FreeBSD, Linux and Windows operating systems support IGMP at the host side.
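On the host side, an application does not usually craft IGMP packets itself: joining a multicast group through the sockets API causes the operating system to send the appropriate IGMP membership report. The sketch below shows a standard multicast join in Python, plus a small helper for the exponent/mantissa encoding of the Max Resp Code field described above. The group address and port are arbitrary examples, and the decoding helper is a reference sketch following the RFC 3376 formula rather than a full implementation.

```python
import socket
import struct

GROUP = "239.1.2.3"   # example administratively scoped multicast group
PORT = 5000

# Joining the group via setsockopt triggers the kernel's IGMP membership report.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

def decode_max_resp_code(code: int) -> int:
    """Decode an IGMPv3 Max Resp Code into tenths of a second.

    Values below 128 are taken literally; values of 128 or more are an
    exponent/mantissa encoding: (mant | 0x10) << (exp + 3), per RFC 3376.
    """
    if code < 128:
        return code
    exp = (code >> 4) & 0x07
    mant = code & 0x0F
    return (mant | 0x10) << (exp + 3)

print(decode_max_resp_code(100))   # 100 tenths -> 10 seconds
print(decode_max_resp_code(0x8F))  # exponent/mantissa encoded value
```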
See also
Internet Group Management Protocol with Access Control

Notes

References

Internet protocols
Internet Standards
Internet layer protocols
Network layer protocols
14851
https://en.wikipedia.org/wiki/Ian%20Murdock
Ian Murdock
Ian Ashley Murdock (April 28, 1973 – December 28, 2015) was an American software engineer, best known as the founder of the Debian project and of Progeny Linux Systems, a commercial Linux company.

Life and career
Although Murdock's parents were both from Southern Indiana, he was born on April 28, 1973, in Konstanz, West Germany, where his father was pursuing postdoctoral research. The family returned to the United States in 1975, and Murdock grew up in Lafayette, Indiana, beginning in 1977 when his father became a professor of entomology at Purdue University. Murdock graduated from Harrison High School in 1991, and then earned his bachelor's degree in computer science from Purdue in 1996. While a college student, Murdock founded the Debian project in August 1993 and wrote the Debian Manifesto in January 1994. Murdock conceived Debian as a Linux distribution that embraced open design, contributions, and support from the free software community. He named Debian after his then-girlfriend (later wife) Debra Lynn and himself. They later married, had three children, and divorced in January 2008. In January 2006, Murdock was appointed Chief Technology Officer of the Free Standards Group and elected chair of the Linux Standard Base workgroup. He continued as CTO of the Linux Foundation when that group was formed from the merger of the Free Standards Group and Open Source Development Labs. Murdock left the Linux Foundation to join Sun Microsystems in March 2007 to lead Project Indiana, which he described as "taking the lesson that Linux has brought to the operating system and providing that for Solaris": a full OpenSolaris distribution with GNOME and userland tools from GNU plus a network-based package management system. From March 2007 to February 2010, he was Vice President of Emerging Platforms at Sun, until the company merged with Oracle and he resigned his position. From 2011 until 2015, Murdock was Vice President of Platform and Developer Community at Salesforce Marketing Cloud, based in Indianapolis. From November 2015 until his death, Murdock worked for Docker, Inc.

Death
Murdock died on December 28, 2015, in San Francisco. Though initially no cause of death was released, in July 2016 it was announced that his death had been ruled a suicide; the police confirmed that the cause of death was asphyxiation from hanging himself with a vacuum cleaner electrical cord. The last tweets from Murdock's Twitter account first announced that he would commit suicide, then said he would not. He reported having been accused of assault on a police officer after having been himself assaulted and sexually humiliated by the police, then declared an intent to devote his life to opposing police abuse. His Twitter account was taken down shortly afterwards. The San Francisco police confirmed he had been detained, saying he matched the description in a reported attempted break-in and that he appeared to be drunk. The police stated that he became violent and was ultimately taken to jail on suspicion of four misdemeanor counts. They added that he did not appear to be suicidal and was medically examined prior to release. Later, police returned on reports of a possible suicide, and the city medical examiner's office confirmed Murdock had been found dead.

See also
List of Debian project leaders

References

External links
https://archive.org/details/IanMurdockHomepage.tar (unofficial backup of Murdock's website, archived after his death)
https://archive.org/download/AutopsyIanMurdockDebianLinuxFounder/Autopsy-Ian-Murdock-Debian-Linux-Founder.pdf (official autopsy documents)

1973 births 2015 deaths American computer programmers Debian Project leaders Free software programmers Open source people People from Indianapolis People from Konstanz Purdue University alumni Suicides by hanging in California Sun Microsystems people
37004368
https://en.wikipedia.org/wiki/FusionCharts
FusionCharts
FusionCharts, part of InfoSoft Global (P) Ltd, is a privately held provider of data visualization software (JavaScript charts, maps, widgets and dashboards) with offices in Bangalore and Kolkata, India. FusionCharts has 23,000 customers and 500,000 users in 120 countries, including technology giants such as Apple, Google, ZOHO, Cisco, Facebook, Intel, LinkedIn, Microsoft, Hewlett-Packard, IBM, EMC, Nokia, and Tibco, as well as The Weather Channel, NASA, and the Federal Government of the United States. A 100% bootstrapped company, FusionCharts earned revenue of $4.5 million in 2010-11 and has clocked revenues of up to $7 million, or Rs 39 crore.

History
The idea behind FusionCharts was born in 2001 when 16-year-old Pallav Nadhani found himself dissatisfied with Microsoft Excel's charting capabilities while using the program to complete high school class assignments. Nadhani subsequently authored an article on Wrox Press's ASPToday.com technology website which examined the thesis that Macromedia Flash, then used mainly for web banners and pop-up ads, could be used to build an interactive charting solution for business applications such as dashboards and reports. The article earned him $1,500 and feedback from developers, which together acted as seed money and motivation for establishing the FusionCharts concept. In 2002, at 17, Nadhani founded InfoSoft Global. The initial product had six charts and was built using ActionScript. Nadhani worked alone on product development, the website, documentation, sales and marketing, and customer support for the first three years. As the company began to take off in 2005, he acquired office space in Bangur and hired 20 employees over the following 2-3 years. The venture grew during this period by bootstrapping, without raising external funding. By 2009, the company had moved to Salt Lake City, Kolkata, and had grown to over 50 employees. Since arriving in Salt Lake, the staff has expanded by 250 percent of its original size, and in 2011 FusionCharts opened its second office in Bangalore. FusionCharts' client list, with customers across 118 countries and numerous business sectors, has drawn significant attention. The company was placed squarely on the global platform following its 2010 selection by US President Barack Obama to design the digital dashboards for the federal administration, the Federal IT Dashboard; FusionCharts was the first Indian startup to gain the attention of the Obama administration. Co-founder Pallav Nadhani is the CEO of FusionCharts and also runs a seed-funding venture capital fund named Seeders Inc. In March 2020, the company was acquired by Idera, Inc., a U.S.-based software company.

Marketing
Since its founding in 2003, FusionCharts has put together an almost completely online network of international resellers serving markets such as Korea, Brazil, China and the United States. The company has also made use of search engine optimization and pay-per-click marketing, and engages users and developers by publishing articles, whitepapers, and case studies on data visualization in various online publications. FusionCharts also advertises in both online and print developer magazines in key markets such as the US, Europe and Korea. The company is increasing its social media marketing and is emphasizing a push-pull marketing strategy. Another aspect of its marketing is a simple licensing framework, as well as a FOSS version of the FusionCharts product that serves over 100,000 users.
FusionCharts is typically targeted to developers who wish to integrate interactive charts in their reports, dashboards, analytics, monitors, and surveys. Products Pallav Nadhani's original ASPToday.com article called for creating a charting library using Flash, combined with ASP to power it with data. Developers responded positively and shared ideas on how to increase its power and functionality. Subsequently, Pallav coded this idea into a charting application, which led to the birth of the FusionCharts software. FusionCharts has since transitioned to use JavaScript, SVG and VML to render charts, widgets and maps. This allows its components to be used on all mobile devices and cross-platform browsers. It does allow for optional rendering using Flash. FusionCharts Suite XT can today be used with any web scripting language to deliver interactive and powerful charts. Using XML and JSON as its data interfaces, FusionCharts Suite XT makes full use of HTML 5 technologies to create compact, interactive, and visually-arresting charts. FusionCharts Suite XT Book British book publisher Packt UK has released a guidebook aimed at helping new users learn the basics of the FusionCharts Suite XT. Achievements In 2009, the company was included in the NASSCOM EMERGE 50 Leaders for 2009 due to its success in establishing its data visualization software globally. In the same year it was also awarded the Deloitte Technology Fast50 India award in 2009. FusionCharts was in 2010 named as one of the 15 companies likely to become the next Infosys. The chart-gadget within Google Docs is powered by FusionCharts. Internally, Google employees also employ FusionCharts software for reporting. FusionCharts says that it powers over one billion charts every month globally. See also JavaScript framework JavaScript library References Charts Indian companies established in 2003 Software companies of India Information technology companies of Bangalore Companies based in Kolkata Data visualization software International information technology consulting firms Information technology consulting firms of India Infographics 2003 establishments in Karnataka Software companies established in 2003
4240965
https://en.wikipedia.org/wiki/Operational%20risk%20management
Operational risk management
The term operational risk management (ORM) is defined as a continual cyclic process which includes risk assessment, risk decision making, and implementation of risk controls, which results in acceptance, mitigation, or avoidance of risk. ORM is the oversight of operational risk, including the risk of loss resulting from inadequate or failed internal processes and systems; human factors; or external events. Unlike other types of risk (market risk, credit risk, etc.), operational risk has rarely been considered strategically significant by senior management.

Four principles
The U.S. Department of Defense summarizes the principles of ORM as follows:
Accept risk when benefits outweigh the cost.
Accept no unnecessary risk.
Anticipate and manage risk by planning.
Make risk decisions at the right time at the right level.

Three levels
In depth: In-depth risk management is used before a project is implemented, when there is plenty of time to plan and prepare. Examples of in-depth methods include training, drafting instructions and requirements, and acquiring personal protective equipment.
Deliberate: Deliberate risk management is used at routine periods throughout the implementation of a project or process. Examples include quality assurance, on-the-job training, safety briefs, performance reviews, and safety checks.
Time critical: Time-critical risk management is used during operational exercises or execution of tasks. It is defined as the effective use of all available resources by individuals, crews, and teams to safely and effectively accomplish the mission or task using risk management concepts when time and resources are limited. Examples of tools used include execution checklists and change management. This level requires a high degree of situational awareness.

Process
The International Organization for Standardization defines the risk management process in a four-step model:
Establish context
Risk assessment (risk identification, risk analysis, risk evaluation)
Risk treatment
Monitor and review
This process is cyclic, as any change to the situation (such as the operating environment or the needs of the unit) requires re-evaluation from step one.

Deliberate
The U.S. Department of Defense summarizes the deliberate level of the ORM process in a five-step model:
Identify hazards
Assess hazards
Make risk decisions
Implement controls
Supervise (and watch for changes)

Time critical
The U.S. Navy summarizes the time-critical risk management process in a four-step model:
1. Assess the situation. The three conditions of the Assess step are task loading, additive conditions, and human factors. Task loading refers to the negative effect of increased tasking on performance of the tasks. Additive factors refers to maintaining situational awareness of the cumulative effect of variables (conditions, etc.). Human factors refers to the limitations of the ability of the human body and mind to adapt to the work environment (e.g. stress, fatigue, impairment, lapses of attention, confusion, and willful violations of regulations).
2. Balance your resources. This refers to balancing resources in three different ways:
Balancing resources and options available. This means evaluating and leveraging all the informational, labor, equipment, and material resources available.
Balancing resources versus hazards. This means estimating how well prepared you are to safely accomplish a task and making a judgement call.
Balancing individual versus team effort. This means observing individual risk warning signs.
It also means observing how well the team is communicating, how well each member knows the role they are supposed to play, and the stress level and participation level of each team member.
3. Communicate risks and intentions. Communicate hazards and intentions, communicate to the right people, and use the right communication style. Asking questions is a technique for opening the lines of communication; a direct and forceful style of communication gets a specific result from a specific situation.
4. Do and debrief (take action and monitor for change). This is accomplished in three different phases: Mission Completion is the point where the exercise can be evaluated and reviewed in full; Execute and Gauge Risk involves managing change and risk while an exercise is in progress; and Future Performance Improvements refers to preparing a "lessons learned" report for the next team that plans or executes the task.

Benefits
Reduction of operational loss.
Lower compliance/auditing costs.
Early detection of unlawful activities.
Reduced exposure to future risks.

Chief Operational Risk Officer
The role of the Chief Operational Risk Officer (CORO) continues to evolve and gain importance. In addition to being responsible for setting up a robust operational risk management function at companies, the role also plays an important part in increasing awareness of the benefits of sound operational risk management. Most complex financial institutions have a Chief Operational Risk Officer. The position is also required for banks that fall into the Basel II Advanced Measurement Approach "mandatory" category.

Software
The impact of the Enron failure and the implementation of the Sarbanes–Oxley Act has caused several software development companies to create enterprise-wide software packages to manage risk. These software systems allow financial audits to be executed at lower cost. Forrester Research has identified 115 Governance, Risk and Compliance vendors that cover operational risk management projects. Active Agenda is an open source project dedicated to operational risk management.

See also
Basel II
Benefit risk
Cost risk
Data governance
Fuel price risk management
Key risk indicator (KRI)
Operational risk
Optimism bias
Risk
Risk management
Risk management tools
Solvency II
Tactical Risk Management

References
General
OPNAVINST 3500.39C OPERATIONAL RISK MANAGEMENT (ORM)
MARINE CORPS ORDER 3500.27B OPERATIONAL RISK MANAGEMENT (ORM)
Cited

External links
The Institute of Operational Risk: provides professional recognition and enables members to maintain competency in the discipline of operational risk.
Operational Risk Institute: an association of operational risk training professionals that provides training on operational risk subjects, including business continuity.
Operational Risk Management Software: essential features ORM software should incorporate.
Operational Risk Management of U.S. Insurers

Operational risk Risk management in business
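As a small illustration of the "assess hazards" and "make risk decisions" steps described above, the following Python sketch scores hazards on a probability/severity matrix. The numeric scales, thresholds, and hazard examples are illustrative assumptions, not values taken from the DoD, Navy, or ISO documents cited in this article.

```python
# Illustrative risk-assessment matrix: probability and severity are each
# rated 1 (lowest) to 5 (highest); their product drives the risk decision.
def risk_level(probability: int, severity: int) -> str:
    score = probability * severity
    if score >= 15:
        return "high - mitigate or avoid before proceeding"
    if score >= 8:
        return "medium - apply controls and supervise"
    return "low - accept and monitor"

hazards = [
    ("forklift traffic in loading bay", 4, 4),
    ("slippery floor near entrance", 3, 2),
    ("server room over-temperature", 2, 5),
]

for name, prob, sev in hazards:
    print(f"{name}: {risk_level(prob, sev)}")
```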
613569
https://en.wikipedia.org/wiki/Build%20%28game%20engine%29
Build (game engine)
Build is a first-person shooter engine created by Ken Silverman, author of Ken's Labyrinth, for 3D Realms. Like the Doom engine, the Build engine represents its world on a two-dimensional grid using closed 2D shapes called sectors, and uses simple flat objects called sprites to populate the world geometry with objects. The Build engine is generally considered a 2.5D engine, since the basic world geometry is two-dimensional with an added height component, allowing each sector to have its own ceiling height and floor height; in game, the floors and ceilings of different sectors can therefore sit at different heights relative to one another. Floors and ceilings can hinge along one of the sector's walls, resulting in a slope. With this information, the Build engine renders the world in a way that looks three-dimensional, unlike modern game engines that create actual 3D environments. Though the Build engine achieved most of its fame as a result of powering the 1996 first-person shooter Duke Nukem 3D, it was also used for many other games.

Technical features

Sectors
Sectors are the building blocks of a level's layout, consisting of a two-dimensional polygonal outline when viewed from above, with the top and bottom faces of the sector given separate altitudes to create a three-dimensional space. Hence, all walls are perfectly vertical; anything appearing otherwise is technically a sloped floor or ceiling. The word room can be used as a loose substitute to aid understanding, though one room in the game world can consist of many sectors, and parallaxed skies can give the illusion of being outdoors. Sectors can be manipulated in real time: all of their attributes, such as shape, height, and slope, can be modified "on the fly" by games, unlike in the earlier Doom engine. This allowed games to have destructible environments, such as those seen in Blood. This technique is similar to the use of push walls in the earlier Apogee Software title Rise of the Triad, which featured similar dynamic environments. Developers of games based on the engine used special reserved "sprites" (game objects), often called "sector effectors", that, when given special tags (numbers with defined meanings), would allow the level designer to construct a dynamic world; similar tag information could be given to sector walls and floor areas to give a sector special characteristics. For example, a particular sector effector might let players fall through the floor if they walk over it and teleport them to another sector; in practice, this could be used to create the effect of falling down a hole into a bigger room, or a body of water that could be jumped into and explored underwater. A sector could be given a tag that made it behave like an elevator or lift. Sectors could overlap one another, provided they could not be seen at the same time (if two overlapping sectors were seen at the same time, a hall-of-mirrors effect resulted). This allowed the designers to create, for instance, air ducts that appeared to extend across the top of another room (though doing so could be tricky for designers because of the 2D viewpoint used for much of the editing process), and worlds that would be physically impossible (e.g. the doorway of a small building could lead into a network of rooms larger than the building itself).
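A minimal Python sketch of the sector idea described above follows. The field names are illustrative assumptions and do not match Build's actual on-disk map structures; the point is simply that a "room" is a closed 2D outline plus separate floor and ceiling heights, with walls optionally linking to a neighbouring sector.

```python
from dataclasses import dataclass

@dataclass
class Wall:
    x: int
    y: int                 # one vertex of the sector outline (next wall holds the other end)
    next_sector: int = -1  # index of the sector on the other side, or -1 if solid

@dataclass
class Sector:
    walls: list            # closed 2D outline of the "room", as Wall vertices
    floor_z: int           # floor height
    ceiling_z: int         # ceiling height; differs per sector, giving the 2.5D effect
    tag: int = 0           # game-defined tag, e.g. "behave like an elevator"

# Two adjacent sectors: a low corridor opening into a taller room.
corridor = Sector(
    walls=[Wall(0, 0), Wall(0, 64), Wall(128, 64, next_sector=1), Wall(128, 0)],
    floor_z=0, ceiling_z=64,
)
hall = Sector(
    walls=[Wall(128, 0, next_sector=0), Wall(128, 64), Wall(512, 64), Wall(512, 0)],
    floor_z=-16, ceiling_z=192,   # lower floor and higher ceiling than the corridor
)
level = [corridor, hall]
print(f"{len(level)} sectors, hall height = {hall.ceiling_z - hall.floor_z}")
```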
While all these made the games using the engine appear to be 3D, it wouldn't be until later first-person shooters, such as Quake, which used the Quake engine, that the engine actually stored the world geometry as true 3D information, making the creation of one area stacked atop another area in a single map very feasible. Voxels Later versions of Ken Silverman's Build engine allowed game selected art tiles to be replaced by 3D objects made of voxels. This feature appeared too late to be used in Duke Nukem 3D, but was seen in some of the later Build engine games. Blood uses voxels for weapon and ammo pickups, power-ups, and eye-candy (such as the tombstones in the "Cradle to Grave" level, some chairs, and a crystal ball in "Dark Carnival"). Shadow Warrior makes even more advanced use of the technology, with voxels that can be placed on walls (all of the game's switches and buttons are voxels). For several years, Ken worked on a modern engine based entirely on voxels, known as Voxlap. Room over room One limitation of the Build engine is that its level geometry is only capable of representing one connection between sectors for any given wall. Due to this, a structure as simple as a shelf with space both above and below it is impossible, though sometimes sprites or voxels can be substituted. Buildings with several floors are technically possible, but it is not possible for such a building to contain an external window directly above or below another window. In addition, some liberties will need to be taken with the staircases, elevators, and other methods of access for each floor. Several Build engine games (namely Shadow Warrior, Blood, and Redneck Rampage) worked around this by displaying a "viewport" to another sector through an additional rendering pass. This technique, called room-over-room (ROR), appears seamless to the player. In addition to an expanded range of vertical construction, ROR was often used to give bodies of water translucent surfaces. ROR was never a feature of the Build engine itself, but rather a "trick" that was created by game developers. A trick used in Duke Nukem 3D to get around this, as in the case of its opaque underwater sections, was to simply transport the player quickly to another region of the map made to mimic it, similar to the elevators from Rise of the Triad. In 2011, a feature was added to EDuke32 called true room over room (TROR), which allows multiple sectors to be stacked vertically so that each sector's wall has its own connection, enabling vertically-unrestricted structures. The difference between ROR and TROR is that TROR sectors physically overlap in the map data and editor (allowing for easy creation and visualization), rather than being drawn from separate locations using view portals, hence true room over room. TROR is a feature of the EDuke32 source port, not a game feature or trick. List of Build Engine games Games that are built directly on the Build engine Rock'n'Shaolin: Legend of the Seven Paladins 3D (1994) (only released locally, illegally used an early version of the Build engine) Witchaven (1995) William Shatner's TekWar (1995) Duke Nukem 3D (1996) Duke Nukem 3D: Plutonium PAK (1996) Duke Nukem 3D: Atomic Edition (1996) Duke!ZONE II (1997) Duke Xtreme (1997) Duke it Out in D.C. 
(1997) Duke Caribbean: Life's a Beach (1997) Duke: Nuclear Winter (1997) PowerSlave (Exhumed in Europe and Seireki 1999: Pharaoh no Fukkatsu in Japan) (1996) Witchaven II: Blood Vengeance (1996) Blood (1997) Blood: Plasma Pak (1997) Blood: Cryptic Passage (1997) Shadow Warrior (1997) Shadow Warrior: Twin Dragon (1998) Shadow Warrior: Wanton Destruction (2005) Games that are based on the Duke Nukem 3D code Redneck Rampage (1997) Redneck Rampage: Suckin' Grits on Route 66 (1997) So You Wanna Be A Redneck (1998) Redneck Rampage Rides Again (1998) Redneck Deer Huntin' (1998) Extreme Paintbrawl (1998) NAM (1998) Liquidator (1998) (only released locally, illegally used the Build engine and published by Akella) World War II GI (1999) World War II GI: Platoon Leader (1999) Ion Fury (2019) (via EDuke32) Unreleased Build engine games Fate (unfinished, only a demo exists) Corridor 8: Galactic Wars (unfinished, source code is available) Shadow Warrior: Deadly Kiss (unreleased; screenshots were released in January 1998) Development The Build engine was essentially a one-man project for Ken Silverman, though he consulted John Carmack for guidance early in the project. Silverman was hired by 3D Realms on the basis of his demo for Build. Though he continued to refine the engine after becoming employed at 3D Realms, according to Silverman he never teamed with any other 3D Realms employees on the project and was never directed to tailor the engine towards any particular game. Source release and further developments On June 20, 2000 (according to his website), Ken Silverman released the Build engine source code under a proprietary license. Silverman explained that after id Software set a precedent by releasing the source code for the Doom engine, fans had been pressuring him to release the source code for the Build engine. Early days Version 2.0 of Matt Saettler's EDuke, a project to improve Duke Nukem 3D for modders, was sent to 3D Realms for packaging shortly after the release of the Build source, leaving Duke Nukem 3D with the pre-built libraries that 3D Realms had used with the original Duke. (Both Duke Nukem 3D and EDuke were still closed-source at this point.) With the 2.1 private betas, Saettler worked towards integrating Silverman's Build source into the Duke source code, but the project fizzled out before producing anything more than some very buggy private betas. A few total conversion teams for Build games decided to work from Silverman's Build code directly, and an enhanced version of the Build editor known as Mapster was also developed. It was claimed at the time by many on the 3D Realms forums that it would be impossible to port Build to a multitasking OS, as it needed a large contiguous block of memory that would not be available in a multitasking environment. This statement did not hold up to scrutiny, as all modern operating systems use virtual memory, which allows applications to get contiguous logical memory without using contiguous physical memory, but the conventional wisdom of the time was that porting Build to such an OS was unfeasible. Duke Nukem 3D source release On April 1, 2003, after several years of claims to the contrary, 3D Realms released the source code to Duke Nukem 3D under the GPL-2.0-or-later license. Not long afterwards, both Ryan C. Gordon and Jonathon Fowler created and released source ports of the game, including the Build engine.
It was possible to play Duke Nukem 3D well on the NT line of Windows (including Windows 2000/XP) and on Linux and other Unix operating systems, and interest in the source ports soared. icculus.org port Ryan C. Gordon (icculus), with the help of others, made the first port of the engine using SDL. The port was first to Linux, then to Cygwin, and finally to a native Windows build using the Watcom C++ compiler, which was the compiler used for the original DOS build (despite being compiled with Watcom C++, Build is plain C.) There was some talk of Matt Saettler using this to port EDuke to Windows, but nothing came of it. JonoF port A second source port was made to Windows, and later to Linux and Mac OS X, by Jonathon Fowler (JonoF). This port, JFDuke3D, initially did not have network game support, though this was added later in development. Polymost The task of updating the Build engine to a true 3D renderer was taken on by Silverman himself. In the release notes for Polymost, he wrote: "When 3D Realms released the Duke Nukem 3D source code, I thought somebody would do a OpenGL or Direct3D port. Well, after a few months passed, I saw no sign of somebody working on a true hardware-accelerated port of Build, just people saying it wasn't possible. Eventually, I realized the only way this was going to happen was for me to do it myself." The Polymost renderer allowed for 3D hardware-accelerated graphics using OpenGL. It also introduced "hightile", a feature that made it possible to replace the game's original textures with high-resolution replacements in a variety of formats. Polymost has been utilized in Jonathon Fowler's JFBuild, JFDuke3D, JFShadowWarrior, and source ports derived from their code bases. EDuke32 The source for EDuke 2.0 was later released, followed by the source for the last private beta of EDuke 2.1 (which never made it to a release version). Richard Gobeille (TerminX) merged the EDuke 2.0 source with JFDuke3D to make EDuke32. Another port, Wineduke, based on the icculus code, has since died off, leaving EDuke32 the only EDuke port still in development. EDuke32 also supports the games NAM and WWII GI, as EDuke was based on the code to those games. Polymer On April 1, 2009, an OpenGL shader model 3.0 renderer was revealed to have been developed for EDuke32, named Polymer to distinguish from Ken Silverman's Polymost. At first it was thought to be an April Fools' joke, but the renderer was later made public. It allows for more modern effects such as real-time dynamic colored lighting and shadow mapping, specular and normal mapping, and other shader-based features in addition to most of the features added to Polymost over the years. Although Polymer is completely usable, it is technically incomplete and unoptimised, and is still in development. The developers of EDuke32 have stated that once Polymer has been rewritten for speed, it will supplant Polymost completely, as it is a superior renderer, and can be made to look identical to Polymost. Other game ports The Shadow Warrior source code was released on April 1, 2005 under the GPL-2.0-or-later license, and JonoF released a source port of it, JFShadowWarrior, on April 2, 2005. However, he admitted that he had access to the Shadow Warrior source code about a week before its release. This port was later forked by ProASM for the SWP port. 
The Transfusion project aimed to re-create Blood in the DarkPlaces engine, but as of 2006 the project was far from complete, though it had complete deathmatch multiplayer; a similar project is BloodCM, which recreates all of the Monolith-made single-player levels for Blood on top of EDuke32, as well as ZBlood, which ports some Blood assets and levels onto ZDoom. The source code of Witchaven, Witchaven II: Blood Vengeance, William Shatner's TekWar, and Corridor 8: Galactic Wars has also surfaced. The legal status of these, however, is unclear. The full source code to an alpha version of Blood was also leaked, and was used as a reference for an otherwise reverse-engineered port to Java using LibGDX called BloodGDX in May 2017. This followed from the author's previous port of TekWar released in January 2016, and has been followed up by ports for Witchaven, Redneck Rampage, Duke Nukem 3D, Powerslave, Legends of the Seven Paladins and Shadow Warrior, now all collectively called BuildGDX. A further port of Blood, called NBlood, was released in January 2019 based on EDuke32 and the creator's previous Rednukem port for Redneck Rampage. An EDuke32 port for PowerSlave, called PCExhumed, was released on November 21, 2019. The source port Raze forks various Build engine ports, including JFDuke3D, SWP, NBlood, Rednukem, and PCExhumed, and ties them to a new underlying backend based on GZDoom. Successor After multiple attempts to design a successor to Build, Silverman again began experimenting with such an idea in 2006. He used this work - now called Build 2 - while teaching 3D game programming to children at a summer camp from 2007 until 2009, and work continued until 2011, when he lost interest in the project. It features a more advanced lighting system, voxel rendering for entities, and true room-over-room 3D spaces, and at least in part retains backwards compatibility with the original Build. Silverman released his drafts to the public on March 7, 2018. The source code was published under a proprietary license on June 8, 2019. References External links Ken Silverman's Build Engine Page BUILD2 Demo and Tools Build engine basic tutorial using mapster32 1995 software Game engines for Linux Video game engines
47740780
https://en.wikipedia.org/wiki/Eps1.0%20hellofriend.mov
Eps1.0 hellofriend.mov
"eps1.0_hellofriend.mov" is the pilot episode of USA Network's drama-thriller television series Mr. Robot. The pilot was directed by Niels Arden Oplev and written by creator and showrunner Sam Esmail. The episode aired on June 24, 2015, and was watched by approximately 1.75 million people in the U.S., the highest rating the series has ever received. The episode was praised for its writing, music, cinematography, and performances, particularly that of Rami Malek, although it received some criticism for its similarity to David Fincher's Fight Club, a film which Esmail has stated he took inspiration from. Plot Elliot Alderson is a socially anxious cybersecurity engineer who works at Allsafe Security in New York City while moonlighting as a computer hacker. Elliot narrates directly to the audience, speaking to an imaginary character in his mind. He believes that he is being followed by men in suits, possibly over his actions the night before. In a flashback, Elliot engineers a child pornographer's arrest by hacking the man's computer and sending its illegal content to the police. On the train ride home, he again sees the men in suits along with a man in glasses who attempts to talk to him. The next day, Elliot reports to work at Allsafe, where he provides computer security for the very corporations he despises. At a therapy session, Elliot narrates how he has hacked his therapist, Krista, and has unsuccessfully attempted to hack her boyfriend, Michael Hansen. Allsafe executive Gideon is preparing to host their largest client, the multi-national conglomerate E Corp (which Elliot refers to as "Evil Corp"). During their tour of the office, Elliot has a strange interaction with E Corp's Senior VP of Technology, Tyrell Wellick. After work, Elliot snorts morphine to help him cope with his depression and loneliness and afterward takes suboxone in case he goes through withdrawal. His neighbor and drug dealer, Shayla, offers him molly, and they have sex. Later that night, he gets a notification on his phone that Krista has checked in at a local restaurant with Michael. Using a ruse, Elliot manages to get Michael's telephone number. While walking home, Elliot receives a panicked phone call from Angela, his childhood best friend, begging him to come back to work. At Allsafe, Elliot finds Angela and his colleague Lloyd attempting to stop a DDoS attack on E Corp's servers. Elliot realizes that they cannot stop the hack locally because of the rootkit that the hackers wrote and placed in the root directory of the server (CS 30), and together with Gideon he flies to E Corp's server farm to stop the hack in person. While examining the hacked server, Elliot finds a file with a message in it for him. The message simply says, "Leave me here," and after a quick debate with himself, he leaves it on the server, but changes the file so that only he can access it. On his train ride home from Allsafe, Elliot is once again confronted by the man in glasses, whom he refers to as Mr. Robot due to the logo on his shirt. Mr. Robot tells Elliot to follow him off the train, but only if he didn't delete the file from E Corp's server. They head to an abandoned arcade in Coney Island, where Mr. Robot explains that he and a small group of hackers are the ones who attacked E Corp's server. Saving their file instead of deleting it was a test, which Elliot has passed. Mr. Robot welcomes Elliot into the hackers’ group: "fsociety". Elliot returns home and compiles all the evidence needed to turn fsociety into the FBI. Elliot visits Mr. 
Robot again to tell him that he will be turning him in. On the Wonder Wheel, Mr. Robot asks Elliot to modify the file to show that E Corp's CTO Terry Colby was behind the hack instead of fsociety. Mr. Robot offers Elliot the chance to take E Corp down completely, and Elliot returns home where he modifies the data file as asked. In a meeting with E Corp, the FBI, and Allsafe the next day, Elliot prepares to give the FBI the evidence against fsociety. However, after Terry Colby insults Angela and has her removed from the meeting, he gives the FBI the falsified info that incriminates Colby. Nineteen days later, Elliot is anxious for something to happen to Terry Colby or E Corp. To occupy his mind, Elliot turns back to hacking Michael. He discovers that “Michael” is using a fake name and profile, and is actually married to someone else. He confronts and threatens the man, telling him that he must reveal his deception to Krista, or Elliot will dump all his collected evidence on the man's wife. Elliot also demands that the man gives him his dog, which he had been abusing. In his next therapy session, Elliot sees Krista is obviously emotionally distracted and knows that the man broke up with her. Elliot returns to work and attempts to patch his relationship with Angela, who hasn't spoken to him since the meeting with the FBI. They make up, and as they hug, everyone in the office begins to stare. They realize that everyone is staring at the TV monitor behind them, which is showing the news that Terry Colby has been arrested by the FBI. Elliot goes to Times Square to watch the news, but he is confronted by the men in suits. They escort him to E Corp's headquarters and lead him into a room to be confronted by Tyrell Wellick. Production Writing Sam Esmail originally intended Mr. Robot to be a feature film. However, midway through writing the first act, he found that script had expanded considerably, and that it became a script more suited for a television show. He removed 20 of around 89 pages of the script, and used it as the pilot for the series. Esmail took the script to film and television production company Anonymous Content to see if it could be developed into a television series, which was then picked up by USA Network. The network gave a pilot order to Mr. Robot in July 2014. Casting Casting of the series was an arduous task, as no actor was acclimated to the tone of the series. Casting director Susie Farris (a role she shares with Beth Bowling and Kim Miscia) was surprised by some requirements; she recounted that, "Some of the things [Sam] would say on the phone, I remember just being like, 'Are you kidding? What?' And I think that's what makes the show so good, is because he did have such a vision, and that's not necessarily what happens in episodic television always." According to Esmail, he contemplated rewriting the script before Malek had his audition, saying; "I was auditioning people, and a lot of great actors came in, [but] they were starting to sound very cold to me and so I started second-guessing the script. I felt like I was being lectured by this guy. I felt like the character was being too obnoxious. And then Rami came in and just auditioned with this vulnerability and this warmth that instead of me feeling cold and disconnected from the character, it made me want to reach out to him and hug him. It's something that, once he did the audition, we all knew this was our guy." Filming The pilot was filmed on location in New York. 
Filming locations included Silvercup Studios and Coney Island, which served as the base of operations for the hacking group fsociety. Reception The episode received universal acclaim from critics and audiences. The episode has an approval rating of 100% on Rotten Tomatoes, with an average score of 8.3/10. The episode got a rating of 8.7/10 from Amy Ratcliffe of IGN, who praised Malek, saying "Mr. Robot made a fantastic first impression with its pilot. It's obviously impossible to judge a series based on a single episode, but they've played a strong first hand. Rami Malek is positively brilliant as Elliot, and the character's Robin Hood-esque nature has appeal. The stakes are high and the conspiracies and mysteries are riveting – it's a world I can't wait to see more of." This episode was nominated for three Primetime Emmy Awards – Outstanding Lead Actor in a Drama Series for Rami Malek, Outstanding Writing for a Drama Series for Sam Esmail, and Outstanding Music Composition for a Series for Mac Quayle, winning for Malek's performance and Quayle's music composition. In its initial broadcast on USA Network on June 24, 2015, the episode received 1.75 million viewers. The episode also had 2.6 million views prior to its broadcast, as the episode was made available online beginning May 27. References External links "eps1.0_hellofriend.mov" at USA Network 2015 American television episodes American television pilots Mr. Robot episodes Television episodes about child sexual abuse Television episodes about pedophilia Articles with underscores in the title
308137
https://en.wikipedia.org/wiki/Avionics%20software
Avionics software
Avionics software is embedded software with legally mandated safety and reliability concerns used in avionics. The main difference between avionic software and conventional embedded software is that the development process is required by law and is optimized for safety. It is claimed that the process described below is only slightly slower and more costly (perhaps 15 percent) than the normal ad hoc processes used for commercial software. Since most software fails because of mistakes, eliminating the mistakes at the earliest possible step is also a relatively inexpensive and reliable way to produce software. In some projects, however, mistakes in the specifications may not be detected until deployment. At that point, they can be very expensive to fix. The basic idea of any software development model is that each step of the design process has outputs called "deliverables." If the deliverables are tested for correctness and fixed, then normal human mistakes cannot easily grow into dangerous or expensive problems. Most manufacturers follow the waterfall model to coordinate the design process, but almost all explicitly permit earlier work to be revised. The result is more often closer to a spiral model. For an overview of embedded software, see embedded system and software development models. The rest of this article assumes familiarity with that information, and discusses differences between commercial embedded systems and commercial development models. General overview Since most avionics manufacturers see software as a way to add value without adding weight, the importance of embedded software in avionic systems is increasing. Most modern commercial aircraft with autopilots use flight computers and so-called flight management systems (FMS) that can fly the aircraft without the pilot's active intervention during certain phases of flight. Also under development or in production are unmanned vehicles: missiles and drones which can take off, cruise and land without airborne pilot intervention. In many of these systems, failure is unacceptable. The reliability of the software running in airborne vehicles (civil or military) is indicated by the fact that most airborne accidents are caused by human error rather than software failure. Unfortunately, reliable software is not necessarily easy to use or intuitive; poor user interface design has been a contributing cause of many aerospace accidents and deaths. Regulatory issues Due to safety requirements, most nations regulate avionics, or at least adopt standards in use by a group of allies or a customs union. The three regulatory organizations that most affect international aviation development are the U.S., the E.U., and Russia. In the U.S., avionic and other aircraft components have safety and reliability standards mandated by the Federal Aviation Regulations, Part 25 for Transport Airplanes, Part 23 for Small Airplanes, and Parts 27 and 29 for Rotorcraft. These standards are enforced by "designated engineering representatives" of the FAA who are usually paid by a manufacturer and certified by the FAA. In the European Union, the IEC describes "recommended" requirements for safety-critical systems, which are usually adopted without change by governments. A safe, reliable piece of avionics has a "CE Mark." The regulatory arrangement is remarkably similar to fire safety in the U.S. and Canada. The government certifies testing laboratories, and the laboratories certify both manufactured items and organizations.
Essentially, the oversight of the engineering is outsourced from the government and manufacturer to the testing laboratory. To assure safety and reliability, national regulatory authorities (e.g. the FAA, CAA, or DOD) require software development standards. Some representative standards include MIL-STD-2167 for military systems, and RTCA DO-178B and its successor DO-178C for civil aircraft. The regulatory requirements for this software can be expensive compared to other software, but they are usually the minimum that is required to produce the necessary safety. Development process The main difference between avionics software and other embedded systems is that the actual standards are often far more detailed and rigorous than commercial standards, usually described by documents with hundreds of pages. Avionics software usually runs on a real-time operating system. Since the process is legally required, most processes have documents or software to trace requirements from numbered paragraphs in the specifications and designs to exact pieces of code, with exact tests for each, and a box on the final certification checklist. This is specifically to prove conformance to the legally mandated standard. A specific project may deviate from the processes described here because alternative methods are used or because its safety-level requirements are low. Almost all software development standards describe how to perform and improve specifications, designs, coding, and testing (see software development model). However, avionics software development standards add some steps to the development for safety and certification: Human interfaces Projects with substantial human interfaces are usually prototyped or simulated. The videotape is usually retained, but the prototype is retired immediately after testing, because otherwise senior management and customers can come to believe the system is complete. A major goal is to find human-interface issues that can affect safety and usability. Hazard analysis Safety-critical avionics usually have a hazard analysis. In the early stages of the project, the engineers already have at least a rough idea of the main parts of the design. An engineer then takes each block of a block diagram and considers the things that could go wrong with that block, and how they affect the system as a whole. Subsequently, the severity and probability of the hazards are estimated. The problems then become requirements that feed into the design's specifications. Projects involving military cryptographic security usually include a security analysis, using methods very like the hazard analysis. Maintenance manual As soon as the engineering specification is complete, writing the maintenance manual can start. A maintenance manual is essential to repairs, and of course, if the system cannot be fixed, it will not be safe. There are several levels to most standards. A low-safety product such as an in-flight entertainment unit (a flying TV) may escape with a schematic and procedures for installation and adjustment. A navigation system, autopilot or engine may have thousands of pages of procedures, inspections and rigging instructions. Documents are now (2003) routinely delivered on CD-ROM, in standard formats that include text and pictures. One of the odder documentation requirements is that most commercial contracts require an assurance that system documentation will be available indefinitely. The normal commercial method of providing this assurance is to form and fund a small foundation or trust.
The trust then maintains a mailbox and deposits copies (usually in ultrafiche) in a secure location, such as rented space in a university's library (managed as a special collection), or (more rarely now) buried in a cave or a desert location. Design and specification documents These are usually much like those in other software development models. A crucial difference is that requirements are usually traced as described above. In large projects, requirements traceability is such a large and expensive task that it requires dedicated, and often costly, software to manage. Code production and review The code is written, then usually reviewed by a programmer (or group of programmers, usually working independently) who did not write it originally (another legal requirement). Special organizations also usually conduct code reviews with a checklist of possible mistakes. When a new type of mistake is found, it is added to the checklist and fixed throughout the code. The code is also often examined by special programs that analyze correctness (static code analysis), such as SPARK Examiner for SPARK (a subset of the Ada programming language) or lint for the C family of programming languages (primarily C). The compilers or special checking programs like "lint" check to see whether types of data are compatible with the operations on them; such tools are also regularly used to enforce strict usage of valid programming language subsets and programming styles. Another set of programs measures software metrics, to look for parts of the code that are likely to have mistakes. All the problems are fixed, or at least understood and double-checked. Some code, such as digital filters, graphical user interfaces and inertial navigation systems, is so well understood that software tools have been developed to write the software. In these cases, specifications are developed and reliable software is produced automatically. Unit testing "Unit test" code is written to exercise every instruction of the code at least once to get 100% code coverage. A "coverage" tool is often used to verify that every instruction is executed, and then the test coverage is documented as well, for legal reasons. This test is among the most powerful. It forces detailed review of the program logic, and detects most coding, compiler and some design errors. Some organizations write the unit tests before writing the code, using the software design as a module specification. The unit test code is executed, and all the problems are fixed. Integration testing As pieces of code become available, they are added to a skeleton of code, and tested in place to make sure each interface works. Usually the built-in tests of the electronics should be finished first, so that burn-in and radio-emissions testing of the hardware can begin. Next, the most valuable features of the software are integrated. It is very convenient for the integrators to have a way to run small selected pieces of code, perhaps from a simple menu system. Some program managers try to arrange this integration process so that after some minimal level of function is achieved, the system becomes deliverable at any following date, with increasing numbers of features as time passes. Black box and acceptance testing Meanwhile, the test engineers usually begin assembling a test rig, and releasing preliminary tests for use by the software engineers. At some point, the tests cover all of the functions of the engineering specification. At this point, testing of the entire avionic unit begins.
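The unit-testing and coverage practice described above can be illustrated with a small, hypothetical example. The function, requirement number, and test values below are invented for illustration and are not drawn from any real avionics project; an actual DO-178B/C effort would trace each test case to a numbered requirement and archive the coverage results as certification evidence.

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical example: clamp a commanded altitude to a legal range.
 * In a real project this function would implement a numbered requirement,
 * e.g. "REQ-123: commanded altitude shall be limited to 0..45000 ft". */
static int32_t clamp_altitude_ft(int32_t commanded)
{
    if (commanded < 0)     return 0;      /* branch 1 */
    if (commanded > 45000) return 45000;  /* branch 2 */
    return commanded;                     /* branch 3 */
}

/* Unit tests chosen so that every statement and branch above executes at
 * least once; a coverage tool (e.g. gcov) run over this harness would
 * report 100% statement coverage, which is then archived as evidence. */
int main(void)
{
    assert(clamp_altitude_ft(-100)  == 0);      /* exercises branch 1 */
    assert(clamp_altitude_ft(60000) == 45000);  /* exercises branch 2 */
    assert(clamp_altitude_ft(30000) == 30000);  /* exercises branch 3 */
    puts("all unit tests passed");
    return 0;
}
```

Higher design-assurance levels demand stronger criteria than simple statement coverage, such as decision coverage or modified condition/decision coverage (MC/DC), but the workflow of writing tests, measuring coverage, and recording the results is the same.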
The object of the acceptance testing is to prove that the unit is safe and reliable in operation. The first test of the software, and one of the most difficult to meet in a tight schedule, is a realistic test of the unit's radio emissions. This usually must be started early in the project to ensure that there is time to make any necessary changes to the design of the electronics. The software is also subjected to a structural coverage analysis, where tests are run and code coverage is collected and analysed. Certification Each step produces a deliverable, either a document, code, or a test report. When the software passes all of its tests (or enough to be sold safely), these are bound into a certification report, which can run to thousands of pages. The designated engineering representative, who has been striving for completion, then decides if the result is acceptable. If it is, he signs it, and the avionic software is certified. See also Annex: Acronyms and abbreviations in avionics DO-178B Software development model Hazard analysis The Power of 10: Rules for Developing Safety-Critical Code References External links Generic Avionics Software Specification from the Software Engineering Institute (SEI) Avionics Transport software Software
34424906
https://en.wikipedia.org/wiki/Blocks%20That%20Matter
Blocks That Matter
Blocks That Matter is a 2D puzzle-platform game developed by French independent studio Swing Swing Submarine. It was released on August 19, 2011 for Windows, Mac, Linux and Xbox 360. Gameplay The player takes the role of "Tetrobot", a robot which is able to collect blocks by hitting them from underneath (a reference to Mario) or by drilling them, and can then construct shapes in a "puzzle mode" using four blocks at a time. Blocks That Matter heavily references the games Minecraft and Tetris through the main characters, in-game mechanics and concepts such as "Pajitnovian physics". The Tetrobot will also be playable as a character in the upcoming game UFHO2. The game's name comes from special blocks hidden in the story mode levels. They are unusable blocks that initially appear in levels as treasure chests, and they serve no purpose other than to unlock bonus levels. The blocks are direct references to famous blocks in games that inspired the developers, such as a ?-block from Super Mario Bros., an Aperture Science Weighted Companion Cube from Portal, a Rubik's Cube, and a die, hence the "blocks that matter". Release Blocks That Matter was made available as a bonus during the Voxatron Humble Indie Bundle along with The Binding of Isaac. In June 2018, the game was ported to the OpenPandora, an ARM processor-based Linux handheld. Reception Blocks That Matter received generally favourable reviews, with an aggregated score of 76 on Metacritic. Blocks That Matter was the Dream Build Play 2011 Challenge "Grand Prize" winner, earning $40,000 and the opportunity for an XBLA publishing contract. References External links Official website 2011 video games Linux games MacOS games Puzzle-platform games Single-player video games Video games with Steam Workshop support Video games developed in France Windows games Xbox 360 games Xbox 360 Live Indie games
11501406
https://en.wikipedia.org/wiki/Workday%2C%20Inc.
Workday, Inc.
Workday, Inc., is an American on-demand (cloud-based) financial management and human capital management software vendor. Workday was founded by David Duffield, founder and former CEO of ERP company PeopleSoft, along with former PeopleSoft chief strategist Aneel Bhusri, following Oracle's acquisition of PeopleSoft in 2005. In October 2012, Workday launched a successful IPO (initial public offering) that valued the company at $9.5 billion. Some competitors of Workday include SAP SuccessFactors, Ceridian, and Oracle. In 2020, Fortune magazine ranked Workday Inc. at number five on its Fortune List of the Top 100 Companies to Work For in 2020, based on an employee satisfaction survey. The San Francisco Business Times ranked Workday at number two on its Best Places to Work in the Bay list, in the largest-companies category. History Workday was founded in March 2005 and launched in November 2006. Initially, it was funded by Duffield and venture capital firm Greylock Partners. In December 2008, Workday moved its headquarters from Walnut Creek, California, to Pleasanton, California, where PeopleSoft, founder Duffield's prior company, was located. On February 6, 2008, Workday announced that it had reached a definitive agreement to purchase Cape Clear Software. In May 2008, Workday signed a large contract with Flextronics to provide human capital management software services. Other large, multinational companies that have publicly disclosed contracts or deployments of Workday include Aviva, Chiquita Brands, CAE Inc., Fairchild Semiconductor, Rentokil Initial, Thomson Reuters, and Time Warner. On April 29, 2009, Workday announced that it had secured $75 million in funding led by New Enterprise Associates. Existing investors Greylock Partners and Workday CEO and co-founder Dave Duffield also participated in the round. On October 24, 2011, Workday announced $85 million in new funding, bringing total capital raised to $250 million. Investors in the latest round included T. Rowe Price, Morgan Stanley Investment Management, Janus, and Bezos Expeditions, the personal investment entity of Amazon CEO and founder Jeff Bezos. As of spring 2012, Workday had 310 customers, ranging from mid-sized businesses to Fortune 500 companies. In October 2012, Workday launched its initial public offering (IPO) on the New York Stock Exchange with ticker symbol WDAY. Its shares were priced at $28 and ended trading Friday, October 12, at $48.69, which "propelled the start-up to a market capitalization of nearly $9.5 billion including unexercised stock options." It sold 22.75 million Class A shares, raising $637 million. The IPO raised more cash than any launch in the U.S. technology sector since Facebook's $16 billion IPO in May 2012. Its shares surged 74% in their IPO, underscoring investor interest in cloud computing. In 2018, Workday acquired Filip Doušek's company Stories.bi. The co-CEO of Workday is Aneel Bhusri, who is a partner at Greylock Partners and held senior leadership positions earlier in his career at PeopleSoft. In 2020, Chano Fernandez was promoted to co-CEO. Dave Duffield serves as the chairman of the board. In November 2021, Workday announced its acquisition of VNDLY, a startup that helps companies manage external workforce personnel, for $510 million. Business model Workday makes money by selling subscriptions to its services rather than selling the software outright. Expenses are booked up front when it signs on a new customer, but the associated revenue is recognized over the life of multiyear agreements.
In the first quarter of 2016, Workday announced that its revenue for fiscal year 2016 had exceeded $1 billion for the first time. Corporate governance Duffield holds voting rights to Workday shares worth $3.4 billion, and Bhusri holds rights to shares valued at $1.3 billion. Collectively, they hold 67% of the company's voting shares. This voting structure makes a hostile takeover much less likely. Product Workday has released 34 updates to its product line as of March 2020, the most recent being "2020 R1". It releases a major update every six months, in September and March. Workday operates data centers located in Ashburn, Virginia; Lithia Springs, Georgia; Portland, Oregon; Dublin, Ireland; and Amsterdam, Netherlands; it also uses Amazon Web Services for its primary computing infrastructure platforms in order to accelerate its worldwide expansion. In February 2014, Workday acquired the startup Identified and its artificial intelligence technology Syman to create its Insight Apps line of products. The first products running Syman were announced at Workday Rising 2014. In July 2017, Workday CEO Aneel Bhusri announced that the company had decided to open up its platform to developers, partners and third-party software. As a result, Workday would enter the Platform as a Service (PaaS) market. Bhusri said the move would allow customers to build custom extensions and applications to work with Workday. In January 2018, Workday announced that it had acquired SkipFlag, maker of an AI knowledge base that builds itself from a company's internal communications. Acquisitions References External links Software companies established in 2005 American companies established in 2005 2012 initial public offerings Software companies based in the San Francisco Bay Area Companies based in Pleasanton, California Cloud computing providers Companies listed on the Nasdaq Companies in the NASDAQ-100 Human resource management software ERP software companies 2005 establishments in California Software companies of the United States
23407868
https://en.wikipedia.org/wiki/Software%20development%20process
Software development process
In software engineering, a software development process is the process of dividing software development work into smaller, parallel or sequential steps or subprocesses to improve design and product management. It is also known as a software development life cycle (SDLC). The methodology may include the pre-definition of specific deliverables and artifacts that are created and completed by a project team to develop or maintain an application. Most modern development processes can be vaguely described as agile. Other methodologies include waterfall, prototyping, iterative and incremental development, spiral development, rapid application development, and extreme programming. A life-cycle "model" is sometimes considered a more general term for a category of methodologies, and a software development "process" a more specific term to refer to a specific process chosen by a specific organization. For example, there are many specific software development processes that fit the spiral life-cycle model. The field is often considered a subset of the systems development life cycle. History The software development methodology (also known as SDM) framework did not emerge until the 1960s. According to Elliott (2004), the systems development life cycle (SDLC) can be considered to be the oldest formalized methodology framework for building information systems. The main idea of the SDLC has been "to pursue the development of information systems in a very deliberate, structured and methodical way, requiring each stage of the life cycle – from the inception of the idea to delivery of the final system – to be carried out rigidly and sequentially" within the context of the framework being applied. The main target of this methodology framework in the 1960s was "to develop large scale functional business systems in an age of large scale business conglomerates. Information systems activities revolved around heavy data processing and number crunching routines". Methodologies, processes, and frameworks range from specific prescriptive steps that can be used directly by an organization in day-to-day work, to flexible frameworks that an organization uses to generate a custom set of steps tailored to the needs of a specific project or group. In some cases, a "sponsor" or "maintenance" organization distributes an official set of documents that describe the process. Specific examples include: 1970s Structured programming since 1969 Cap Gemini SDM, originally from PANDATA; the first English translation was published in 1974. SDM stands for System Development Methodology 1980s Structured systems analysis and design method (SSADM) from 1980 onwards Information Requirement Analysis/Soft systems methodology 1990s Object-oriented programming (OOP), developed in the early 1960s, which became a dominant programming approach during the mid-1990s Rapid application development (RAD), since 1991 Dynamic systems development method (DSDM), since 1994 Scrum, since 1995 Team software process, since 1998 Rational Unified Process (RUP), maintained by IBM since 1998 Extreme programming, since 1999 2000s Agile Unified Process (AUP), maintained since 2005 by Scott Ambler Disciplined agile delivery (DAD), which supersedes AUP 2010s Scaled Agile Framework (SAFe) Large-Scale Scrum (LeSS) DevOps It is notable that since DSDM in 1994, all of the methodologies on the above list except RUP have been agile methodologies - yet many organisations, especially governments, still use pre-agile processes (often waterfall or similar).
Software process and software quality are closely interrelated; some unexpected facets and effects have been observed in practice. Among these, another software development process has been established in open source. The adoption of these best practices and established processes within the confines of a company is called inner source. Prototyping Software prototyping is about creating prototypes, i.e. incomplete versions of the software program being developed. The basic principles are: Prototyping is not a standalone, complete development methodology, but rather an approach to try out particular features in the context of a full methodology (such as incremental, spiral, or rapid application development (RAD)). Attempts to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process. The client is involved throughout the development process, which increases the likelihood of client acceptance of the final implementation. While some prototypes are developed with the expectation that they will be discarded, it is possible in some cases to evolve from prototype to working system. A basic understanding of the fundamental business problem is necessary to avoid solving the wrong problems, but this is true for all software methodologies. Methodologies Agile development "Agile software development" refers to a group of software development frameworks based on iterative development, where requirements and solutions evolve via collaboration between self-organizing cross-functional teams. The term was coined in 2001, when the Agile Manifesto was formulated. Agile software development uses iterative development as a basis but advocates a lighter and more people-centric viewpoint than traditional approaches. Agile processes fundamentally incorporate iteration and the continuous feedback that it provides to successively refine and deliver a software system. The agile model also includes the following software development processes: Dynamic systems development method (DSDM) Kanban Scrum Crystal Atern Lean software development Continuous integration Continuous integration is the practice of merging all developer working copies to a shared mainline several times a day. Grady Booch first named and proposed CI in his 1991 method, although he did not advocate integrating several times a day. Extreme programming (XP) adopted the concept of CI and did advocate integrating more than once per day – perhaps as many as tens of times per day. Incremental development Various methods are acceptable for combining linear and iterative systems development methodologies, with the primary objective of each being to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process. There are three main variants of incremental development: A series of mini-Waterfalls is performed, where all phases of the Waterfall are completed for a small part of a system, before proceeding to the next increment, or Overall requirements are defined before proceeding to evolutionary, mini-Waterfall development of individual increments of a system, or The initial software concept, requirements analysis, and design of architecture and system core are defined via Waterfall, followed by incremental implementation, which culminates in installing the final version, a working system.
Rapid application development Rapid application development (RAD) is a software development methodology, which favors iterative development and the rapid construction of prototypes instead of large amounts of up-front planning. The "planning" of software developed using RAD is interleaved with writing the software itself. The lack of extensive pre-planning generally allows software to be written much faster, and makes it easier to change requirements. The rapid development process starts with the development of preliminary data models and business process models using structured techniques. In the next stage, requirements are verified using prototyping, eventually to refine the data and process models. These stages are repeated iteratively; further development results in "a combined business requirements and technical design statement to be used for constructing new systems". The term was first used to describe a software development process introduced by James Martin in 1991. According to Whitten (2003), it is a merger of various structured techniques, especially data-driven information technology engineering, with prototyping techniques to accelerate software systems development. The basic principles of rapid application development are: Key objective is for fast development and delivery of a high quality system at a relatively low investment cost. Attempts to reduce inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process. Aims to produce high quality systems quickly, primarily via iterative Prototyping (at any stage of development), active user involvement, and computerized development tools. These tools may include Graphical User Interface (GUI) builders, Computer Aided Software Engineering (CASE) tools, Database Management Systems (DBMS), fourth-generation programming languages, code generators, and object-oriented techniques. Key emphasis is on fulfilling the business need, while technological or engineering excellence is of lesser importance. Project control involves prioritizing development and defining delivery deadlines or “timeboxes”. If the project starts to slip, emphasis is on reducing requirements to fit the timebox, not in increasing the deadline. Generally includes joint application design (JAD), where users are intensely involved in system design, via consensus building in either structured workshops, or electronically facilitated interaction. Active user involvement is imperative. Iteratively produces production software, as opposed to a throwaway prototype. Produces documentation necessary to facilitate future development and maintenance. Standard systems analysis and design methods can be fitted into this framework. Waterfall development The waterfall model is a sequential development approach, in which development is seen as flowing steadily downwards (like a waterfall) through several phases, typically: Requirements analysis resulting in a software requirements specification Software design Implementation Testing Integration, if there are multiple subsystems Deployment (or Installation) Maintenance The first formal description of the method is often cited as an article published by Winston W. Royce in 1970, although Royce did not use the term "waterfall" in this article. Royce presented this model as an example of a flawed, non-working model. The basic principles are: Project is divided into sequential phases, with some overlap and splash back acceptable between phases. 
Emphasis is on planning, time schedules, target dates, budgets and implementation of an entire system at one time. Tight control is maintained over the life of the project via extensive written documentation, formal reviews, and approval/signoff by the user and information technology management occurring at the end of most phases before beginning the next phase. Written documentation is an explicit deliverable of each phase. The waterfall model is a traditional engineering approach applied to software engineering. A strict waterfall approach discourages revisiting and revising any prior phase once it is complete. This "inflexibility" in a pure waterfall model has been a source of criticism by supporters of other more "flexible" models. It has been widely blamed for several large-scale government projects running over budget, over time and sometimes failing to deliver on requirements due to the Big Design Up Front approach. Except when contractually required, the waterfall model has been largely superseded by more flexible and versatile methodologies developed specifically for software development. See Criticism of Waterfall model. Spiral development In 1988, Barry Boehm published a formal software system development "spiral model," which combines some key aspect of the waterfall model and rapid prototyping methodologies, in an effort to combine advantages of top-down and bottom-up concepts. It provided emphasis in a key area many felt had been neglected by other methodologies: deliberate iterative risk analysis, particularly suited to large-scale complex systems. The basic principles are: Focus is on risk assessment and on minimizing project risk by breaking a project into smaller segments and providing more ease-of-change during the development process, as well as providing the opportunity to evaluate risks and weigh consideration of project continuation throughout the life cycle. "Each cycle involves a progression through the same sequence of steps, for each part of the product and for each of its levels of elaboration, from an overall concept-of-operation document down to the coding of each individual program." Each trip around the spiral traverses four basic quadrants: (1) determine objectives, alternatives, and constraints of the iteration; (2) evaluate alternatives; Identify and resolve risks; (3) develop and verify deliverables from the iteration; and (4) plan the next iteration. Begin each cycle with an identification of stakeholders and their "win conditions", and end each cycle with review and commitment. Advanced methodologies Other high-level software project methodologies include: Behavior-driven development and business process management Chaos model - The main rule is always resolve the most important issue first. Incremental funding methodology - an iterative approach Lightweight methodology - a general term for methods that only have a few rules and practices Structured systems analysis and design method - a specific version of waterfall Slow programming, as part of the larger Slow Movement, emphasizes careful and gradual work without (or minimal) time pressures. Slow programming aims to avoid bugs and overly quick release schedules. V-Model (software development) - an extension of the waterfall model Unified Process (UP) is an iterative software development methodology framework, based on Unified Modeling Language (UML). 
UP organizes the development of software into four phases, each consisting of one or more executable iterations of the software at that stage of development: inception, elaboration, construction, and transition. Many tools and products exist to facilitate UP implementation. One of the more popular versions of UP is the Rational Unified Process (RUP). Big Bang methodology - an approach for small or undefined projects, generally consisting of little to no planning with high risk. Process meta-models Some "process models" are abstract descriptions for evaluating, comparing, and improving the specific process adopted by an organization. ISO/IEC 12207 is the international standard describing the method to select, implement, and monitor the life cycle for software. The Capability Maturity Model Integration (CMMI) is one of the leading models and is based on best practice. Independent assessments grade organizations on how well they follow their defined processes, not on the quality of those processes or the software produced. CMMI has replaced CMM. ISO 9000 describes standards for a formally organized process to manufacture a product and the methods of managing and monitoring progress. Although the standard was originally created for the manufacturing sector, ISO 9000 standards have been applied to software development as well. Like CMMI, certification with ISO 9000 does not guarantee the quality of the end result, only that formalized business processes have been followed. ISO/IEC 15504 Information technology—Process assessment, also known as Software Process Improvement Capability Determination (SPICE), is a "framework for the assessment of software processes". This standard is aimed at setting out a clear model for process comparison. SPICE is used much like CMMI. It models processes to manage, control, guide and monitor software development. This model is then used to measure what a development organization or project team actually does during software development. This information is analyzed to identify weaknesses and drive improvement. It also identifies strengths that can be continued or integrated into common practice for that organization or team. ISO/IEC 24744 Software Engineering—Metamodel for Development Methodologies, is a powertype-based metamodel for software development methodologies. SPEM 2.0 by the Object Management Group Soft systems methodology - a general method for improving management processes Method engineering - a general method for improving information system processes In practice A variety of such frameworks have evolved over the years, each with its own recognized strengths and weaknesses. One software development methodology framework is not necessarily suitable for use by all projects. Each of the available methodology frameworks is best suited to specific kinds of projects, based on various technical, organizational, project and team considerations. Software development organizations implement process methodologies to ease the process of development. Sometimes contractors require that particular methodologies be employed; an example is the U.S. defense industry, where a rating based on process models is required to obtain contracts. The international standard for describing the method of selecting, implementing and monitoring the life cycle for software is ISO/IEC 12207. A decades-long goal has been to find repeatable, predictable processes that improve productivity and quality. Some try to systematize or formalize the seemingly unruly task of designing software.
Others apply project management techniques to designing software. Large numbers of software projects do not meet their expectations in terms of functionality, cost, or delivery schedule - see List of failed and overbudget custom software projects for some notable examples. Organizations may create a Software Engineering Process Group (SEPG), which is the focal point for process improvement. Composed of line practitioners who have varied skills, the group is at the center of the collaborative effort of everyone in the organization who is involved with software engineering process improvement. A particular development team may also agree to programming environment details, such as which integrated development environment is used, and one or more dominant programming paradigms, programming style rules, or choice of specific software libraries or software frameworks. These details are generally not dictated by the choice of model or general methodology. See also Systems development life cycle Computer-aided software engineering (some of these tools support specific methodologies) List of software development philosophies Outline of software engineering OpenUP Project management Software development Software development effort estimation Software release life cycle Top-down and bottom-up design#Computer science References External links Selecting a development approach at cms.hhs.gov. Gerhard Fischer, "The Software Technology of the 21st Century: From Software Reuse to Collaborative Software Design", 2001 Subway map of agile practices at Agile Alliance Methodology Software engineering
7848203
https://en.wikipedia.org/wiki/Mike%20Sanford%20Sr.
Mike Sanford Sr.
Michael Charles Sanford (born April 20, 1955) is an American high school football coach and former player. He served as the head football coach at University of Nevada, Las Vegas (UNLV) from 2005 to 2009 and at Indiana State University from 2013 to 2016. He is a graduate of the University of Southern California (USC), where he played quarterback for the Trojans from 1973 through 1976. He is the father of former Western Kentucky head coach and current Colorado offensive coordinator, Mike Sanford Jr. Head coaching career UNLV On December 6, 2004, UNLV hired Sanford as the school's ninth head coach, taking over for the legendary John Robinson who went 2–9 his final year. Sanford inherited a program in decline. Some had blamed the program's problems on an inability to keep local talent at home. It was hoped that Sanford would reverse the fortunes of the long-suffering program with this new policy. His first two years at the helm produced just four total wins, on par with John Robinson's final season total. Sanford's Rebels achieved one of biggest victories in UNLV football history, a 23–20 overtime win at 15th-ranked Arizona State on September 13, 2008. Sanford said it was the biggest win of his coaching career. At the end of the 2009 season, UNLV announced it had fired Sanford. He left the Rebels after five seasons with an overall mark of 16–43. Indiana State On December 14, 2012, Indiana State hired Sanford as the school's twenty-fourth head coach, taking over for Trent Miles who went 7–4 his final year. The ensuing regime change for the Sycamores led to a season fraught with a struggle to adjust to the system implemented by Sanford and his new staff, multiple injuries to several key starters, and many off-field issues which led to Indiana State losing all but one game that season (a 70-7 victory over Quincy). Despite finishing the 2013 season with a disappointing 1-11 record, Sanford's Sycamores were more competitive than the overall record might indicate, with close losses to Purdue, Tennessee Tech, Youngstown State, South Dakota, and Western Illinois. The 2014 season, however, showed a marked improvement by Indiana State in their second year under Sanford. After suffering an early defeat at the hands of Big Ten member Indiana University; the Sycamores followed a dominant performance against Tennessee Tech by completing an upset victory over the Division I FBS and historic rival Ball State Cardinals to reclaim the Victory Bell for the first time since 1987. Sanford's leadership led to not only a record-setting senior season by quarterback Mike Perish, but a regular season record of 7-5 (4-4 in MVFC); as well as maintaining a Top 25 ranking since Week 4 (27 September). The Sycamores were 5-3 against ranked FCS teams and won their opening round NCAA playoffs game before falling to Chattanooga in the second round of the playoffs. The season concluded with an 8-6 record, the best for Indiana State in thirty years. Indiana State took a step back in 2015 despite entering the season with high expectations following the successful 2014 campaign. Following the graduation of starting quarterback Mike Perish, Sanford opted to start redshirt sophomore Matt Adam at quarterback. This ultimately forced ISU to alter the successful offensive game plan from the previous season and install a more rush-oriented offense tailored to Adam's skill set. 
Despite starting the season 4–2, the Sycamores suffered several devastating injuries that wore away at both depth and consistency, leading to losses against North Dakota State, Illinois State, Northern Iowa, and Western Illinois in consecutive games. The Sycamores concluded their season with a 27–24 victory over Youngstown State to finish 5–6. The 2016 season witnessed another disappointing regression. Starting quarterback Matt Adam was ruled academically ineligible to play, which led to a competition among three inexperienced quarterbacks that Isaac Harker ultimately won. Sparked by a potent passing game, the Sycamores started the season with a 3–1 record but ultimately faltered (blowing several leads) and lost six of their last seven games. Despite the poor 4–7 record, the Sycamores notched their first victory over rival Illinois State in six years. Sanford resigned as head coach of Indiana State on December 16, 2016, citing a "special opportunity" following the appointment of his son, Mike Sanford Jr., as head coach of Western Kentucky. Sanford was 18–30 after four years at ISU. Assistant coaching career Sanford began his coaching career as a graduate assistant at USC in 1977. Since then, he has served as an assistant coach for numerous teams in the collegiate and professional ranks, including San Diego City College, the United States Military Academy, Virginia Military Institute, Long Beach State, Purdue, USC, Notre Dame, the San Diego Chargers and Stanford. In 2003, Urban Meyer hired Sanford as his offensive coordinator at Utah. That year, the Utes won the Mountain West Conference Championship and won the Liberty Bowl. The next year, the Utes repeated as conference champions and finished the season 12–0, including a win over Pittsburgh in the Fiesta Bowl. Sanford's offense averaged 45.3 points a game, and quarterback Alex Smith was MWC Player of the Year as well as a finalist for the Heisman Trophy. The following spring, Smith was the first overall pick in the 2005 NFL Draft, selected by the San Francisco 49ers. Louisville On December 22, 2009, Sanford was named offensive coordinator and assistant head coach at the University of Louisville. After a lackluster offensive performance against Marshall on October 1, 2011, Sanford did not travel with the Cardinals for their next game, against North Carolina. He did not attend any of the practices in the week leading up to the game. Quarterbacks coach Shawn Watson served as offensive play-caller for that game. CBSSports.com's Brett McMurphy reported that Sanford had been fired and replaced by Watson. However, The Courier-Journal's Rick Bozich reported that Sanford was still with the team, but might be demoted to a position coach. Bozich later confirmed that Sanford was no longer offensive coordinator. The following Monday, October 10, head coach Charlie Strong announced that Watson would serve as offensive coordinator for the remainder of the season, but that Sanford would remain on the coaching staff in another capacity. He also denied rumors that there had been an altercation between them during the week. On October 22, 2011, Strong announced that Sanford was no longer with the program. Utah State (first stint) Sanford joined the Utah State coaching staff in March 2012. He served as the assistant head coach and running backs/tight ends coach. Sanford previously worked with head coach Gary Andersen at the University of Utah. Western Kentucky Sanford joined the Western Kentucky coaching staff in December 2016. 
He served as the running backs coach and special teams coordinator under his son, Mike Sanford Jr. Utah State (second stint) Sanford joined the Utah State coaching staff for the 2019 season. He served as a consultant under head coach Gary Andersen and offensive coordinator Mike Sanford Jr., his son. Personal He is the father of former Western Kentucky head coach and current Colorado offensive coordinator, Mike Sanford Jr. Head coaching record References External links Indiana State profile 1955 births Living people American football quarterbacks Army Black Knights football coaches High school football coaches in Nevada Indiana State Sycamores football coaches Junior college football coaches in the United States Long Beach State 49ers football coaches Louisville Cardinals football coaches People from Los Altos, California Players of American football from California Purdue Boilermakers football coaches San Diego Chargers coaches Stanford Cardinal football coaches UNLV Rebels football coaches USC Trojans football coaches USC Trojans football players Utah State Aggies football coaches Utah Utes football coaches VMI Keydets football coaches Western Kentucky Hilltoppers football coaches
5791719
https://en.wikipedia.org/wiki/Chambersburg%20Area%20Senior%20High%20School
Chambersburg Area Senior High School
Chambersburg Area Senior High School (CASHS) is a public high school located in Franklin County, Pennsylvania. The school serves grades 9, 10, 11, and 12. Students come from Chambersburg and the surrounding townships of Hamilton, Greene, Lurgan, Letterkenny and Guilford. CASHS is accredited by the Middle States Association of Colleges and Schools. As of the 2005-06 school year, the school had an enrollment of 1,858 students and 105.5 classroom teachers on an FTE basis, for a student-teacher ratio of 17.6. There is one building principal, five administrators, eight guidance counselors, and four secretaries. CASHS has occupied its current facilities since 1955. Graduation rate In 2012, the graduation rate at Chambersburg Area Senior High School was 80%. In 2011, the graduation rate was 85%. In 2010, the Pennsylvania Department of Education issued a new, 4-year cohort graduation rate. Chambersburg Area School District's rate was 87% for 2010. Former calculation graduation rate 2015 - 79% 2010 - 87% 2009 - 90% 2008 - 90% 2007 - 90% Academic achievement In 2012, Chambersburg Area Senior High School declined further to Corrective Action II 6th Year due to its ongoing failure to improve student achievement in mathematics and reading. The school achieved just 7 of 18 measured academic metrics. In 2011, Chambersburg Area Senior High School declined to Corrective Action II 5th Year due to its ongoing failure to improve student achievement in mathematics. Science achievement is also very low. Under the federal No Child Left Behind Act, the school administration was required to notify parents of the school's poor achievement outcomes and to offer parents the opportunity to transfer their children to a successful school within the District. Additionally, the Chambersburg Area Senior High School administration was required by the Pennsylvania Department of Education to develop a School Improvement Plan to address the school's low student achievement. Under the Pennsylvania Accountability System, the school must pay for additional tutoring for struggling students. Chambersburg Area Senior High School is eligible for special, extra funding under School Improvement Grants, which the school must apply for each year. 2010 - Corrective Action II 4th Year due to chronically low student achievement. 2009 - Corrective Action II 3rd Year due to unresolved low student achievement. 2008 - Corrective Action II 2nd Year due to chronic low student achievement. 2007 - Corrective Action II 2006 - School Improvement Level II 2005 - School Improvement Level I PSSA Results: 11th Grade Reading: 2012 - 64% on grade level (19% below basic). State - 67% of 11th graders are on grade level. 2011 - 69% (17% below basic). State - 69.1% 2010 - 64% (20% below basic). State - 67% 2009 - 63%, State - 65% 2008 - 64%, State - 65% 2007 - 61%, State - 65% 11th Grade Math: 2012 - 53% on grade level (28% below basic). In Pennsylvania, 59% of 11th graders are on grade level. 2011 - 60% (20% below basic). State - 60.3% 2010 - 61% (24% below basic). State - 59% 2009 - 57%, State - 56% 2008 - 59%, State - 56% 2007 - 60%, State - 53% 11th Grade Science: 2012 - 34% on grade level (17% below basic). State - 42% of 11th graders were on grade level. 2011 - 37% (18% below basic). State - 40%. 2010 - 34% (24% below basic). 
State - 39% 2009 - 35%, State - 40% 2008 - 32%, State - 39% Science in Motion Chambersburg Area Senior High School and both of the District's middle schools took advantage of a state program called Science in Motion, which brought college professors and sophisticated science equipment to the schools to raise science awareness and to provide inquiry-based experiences for the students. The Science in Motion program was funded by a state appropriation and cost the schools nothing to participate. Gettysburg College provided the experiences to the schools. College remediation According to a Pennsylvania Department of Education study released in January 2009, 16% of Chambersburg Area Senior High School graduates required remediation in mathematics and/or reading before they were prepared to take college-level courses in the Pennsylvania State System of Higher Education or community colleges. Less than 66% of Pennsylvania high school graduates who enroll in a four-year college in Pennsylvania will earn a bachelor's degree within six years. Among Pennsylvania high school graduates pursuing an associate degree, only one in three graduates in three years. Per the Pennsylvania Department of Education, one in three recent high school graduates who attend Pennsylvania's public universities and community colleges takes at least one remedial course in math, reading or English. Dual enrollment The high school offers a dual enrollment program. This state program permits high school students to take courses at local higher education institutions to earn college credits. Students remain enrolled at their high school, including for the graduation ceremony. The courses count towards high school graduation requirements and towards earning a college degree. The students continue to have full access to activities and programs at their high school. The college credits are offered at a deeply discounted rate. The state offers a small grant to assist students with the costs of tuition, fees and books. Under the Pennsylvania Transfer and Articulation Agreement, many Pennsylvania colleges and universities accept these credits for students who transfer to their institutions. The Pennsylvania College Credit Transfer System reported in 2009 that students saved nearly $35.4 million by having their transferred credits count towards a degree under the new system. For the 2009-10 funding year, the school district received a state grant of $12,909 for the program. SAT scores In 2012, 362 Chambersburg Area Senior High School students took the SAT exams. The District's Verbal average score was 490. The Math average score was 489. The Writing average score was 476. The statewide SAT results were: Verbal 491, Math 501, Writing 480. In the USA, 1.65 million students took the exams, achieving scores of Verbal 496, Math 514, Writing 488. According to the College Board, the maximum score on each section was 800, and 360 students nationwide scored a perfect 2,400. In 2011, 301 Chambersburg Area Senior High School students took the SAT exams. The District's Verbal average score was 485. The Math average score was 490. The Writing average score was 463. Pennsylvania ranked 40th among states with SAT scores: Verbal - 493, Math - 501, Writing - 479. In the United States, 1.65 million students took the exam in 2011. They averaged 497 (out of 800) verbal, 514 math and 489 in writing. 
Graduation requirements The Chambersburg Area School Board has determined that, in order to graduate, a student must earn 23 credits, including: 4 Credit Units of English; 4 Credit Units of Math (Algebra I, Geometry, Algebra II, 4th math); 3.5 Credit Units of Social Science (Early Am. Hist., World Hist., Am. Hist., and Civics); 3 Credit Units of Science (Biology plus 2 other sciences); 1.5 Credit Units of Wellness and Fitness; 6.5 Credit Units of Electives; 0.50 Credit Units of ICT (Information Communication Technology). By law, all Pennsylvania secondary school students must complete a project as a part of their eligibility to graduate from high school. The type of project, its rigor and its expectations are set by the individual school district. Chambersburg Area Senior High School requires the completion of a Junior Project to fulfill this requirement. The Junior Project is completed in conjunction with students' junior English class. By Pennsylvania School Board regulations, beginning with the class of 2017, public school students must demonstrate successful completion of secondary-level course work in Algebra I, Biology, and English Literature by passing the Keystone Exams. Each exam is given at the end of the course. Keystone Exams replace the PSSAs for 11th grade. Students have several opportunities to pass the exams; those who do not may complete a project in order to graduate. For the class of 2019, a Composition exam will be added. For the class of 2020, passing a civics and government exam will be added to the graduation requirements. In 2011, Pennsylvania high school students field-tested the Algebra 1, Biology and English Lit exams. The statewide results were: Algebra 1 - 38% on grade level, Biology - 35% on grade level, and English Lit - 49% on grade level. Individual student, school or district reports were not made public, although they were reported to district officials by the Pennsylvania Department of Education. Students identified as having special needs and qualifying for an Individual Educational Program (IEP) may graduate by meeting the requirements of their IEP. Other high school options Students in Chambersburg Area School District have several options outside of the traditional high school program. Chambersburg Area Career Magnet School offers a 9th through 12th grade program with a technology emphasis, career exploration and acceleration to graduate early. Students apply to attend. Franklin County Career and Technical Center - vo-tech training program. Franklin Virtual Academy - grades 9 through 12. A joint venture of Chambersburg Area School District, Fannett-Metal School District, Greencastle-Antrim School District, Southern Huntingdon County School District and Waynesboro Area School District. A self-paced, custom blend of rigorous, multimedia-rich online classes. FVA students have the option of filling their schedules with online classes or creating a blend of online and in-school classes in their home high school. Students graduate with a diploma from their respective high school. Awards and recognition Principal Dr. Barry Purvis was recognized as the 2006 High School Principal of the Year by the Pennsylvania Association of Elementary and Secondary School Principals. Extracurricular activities and Color Day Chambersburg Area Senior High School offers a wide variety of extracurricular activities and an extensive, costly sports program. 
In addition to a full range of sports, the school also maintains a band, an orchestra, a glee club, a student newspaper, a national honors society, a national art honors society, a variety of language clubs, a math club, a ping pong club, a ski club, an economics club, a sports club, a drama club, a camera club, a religious fellowship, a student government, a small business club, and a number of other organizations. Chambersburg Area Senior High School is also well known for its Color Day tradition. Every year since the early 1920s, classes are suspended for a series of games and competitions between the freshman, sophomore, junior, and senior classes. The term Color Day originated from the hues given to the different grades: gold and blue are worn by future graduates of an odd-numbered year (for example, 2013) and red and white by those of an even-numbered year (for example, 2014). Athletics The 7,000-seat Trojan Stadium was overhauled in 2006 as part of a $6.5 million renovation project that included additional home seating and renovated visitors' bleachers, along with a new press box. Other enhancements to the facility included artificial turf, a running track, concession stands, restrooms, ticket booths and parking lots. The Trojans called Henninger Field their home for football from 1898 until 1956, for soccer from 1968–2003 and 2005, and for baseball from the early 1900s until 2006. The district funds: Boys Baseball - AAAA Basketball - AAAA Cross Country - AAA Football - AAAA Golf - AAA Lacrosse - AAAA Soccer - AAA Swimming and Diving - AAA Tennis - AAA Track and Field - AAA Volleyball - AAA Wrestling - AAA Girls Basketball - AAAA Cross Country - AAA Field Hockey - AAA Golf - AAA Gymnastics - AAAA Lacrosse - AAAA Soccer (Fall) - AAA Softball - AAA Swimming and Diving - AAA Girls' Tennis - AAA Track and Field - AAA Volleyball - AAA On June 18, 2004, the Chambersburg Area Senior High School Trojans boys baseball team won the Pennsylvania Interscholastic Athletic Association (PIAA) Class AAA state championship, defeating Peters Township High School 12–5 in a game played at RiverSide Stadium in Harrisburg. Coming on the heels of this state title, the baseball team was ranked 7th in the Eastern United States by USA Today in its final 2004 rankings. The Girls' Gymnastics team was recognized as the 2005 team state champion in the State Silver Division. Track-and-field team member Lorraine Hill had the second-longest girls' high school javelin throw in the nation in 2006 with a throw of 157 feet, four inches. Hill won the 2006 Pennsylvania Interscholastic Athletic Association Class AAAA state javelin championship, earned a second-place finish at the Penn Relays, and finished third in the javelin at the Nike Team Nationals Outdoor competition that year. Hill was named a first-team All-American by American Track & Field Magazine for her achievements in 2006. References External links Chambersburg Area Senior High School National Center for Education Statistics for Chambersburg Area Senior High School Chambersburg Area Senior High School Web page at Great Schools Web site Alumni Site for the CASHS Class of 1968 High schools in Central Pennsylvania Educational institutions established in 1955 Chambersburg, Pennsylvania Schools in Franklin County, Pennsylvania Public high schools in Pennsylvania 1955 establishments in Pennsylvania
8609484
https://en.wikipedia.org/wiki/ArcSoft
ArcSoft
ArcSoft, Inc. is a photo and video imaging software development company that offers various imaging technologies across devices and major platforms – from smartphones, tablets, PCs, smart TVs, and digital cameras to cloud-based enterprise solutions. Established in 1994, ArcSoft is a private company with 800 employees, including more than 600 scientists and engineers. ArcSoft is headquartered in Fremont, California, with regional commercial and development facilities in Europe and Asia, specifically Taipei, Seoul, Tokyo, Shanghai, Hangzhou, and Nanjing. Michael Deng, ArcSoft's CEO, founded the company with $150,000 in funding from family and friends. Products ArcSoft markets the following desktop and mobile software: Mobile Apps Perfect365: portrait makeovers Closeli and simplicam ArcNote: for taking photo notes. Video entertainment and editing TotalMedia Theatre: movie experience on PC (Retired Product) ShowBiz: video editor Media management and utilities MediaConverter: converts music, photos, and videos into formats for standalone media players Photo+: photo viewer for Windows and Mac. Photo editing Panorama Maker: turns photos and videos into panoramas (this software is no longer being supported or updated by ArcSoft). PhotoStudio: for advanced photo editing Portrait+: for portrait enhancement PhotoImpression: for rich photo editing PhotoBase: for basic photo editing References Companies based in Fremont, California Software companies based in the San Francisco Bay Area Companies listed on the Shanghai Stock Exchange Software companies of the United States 1994 establishments in the United States 1994 establishments in California Software companies established in 1994 American companies established in 1994
1630037
https://en.wikipedia.org/wiki/Banking%20in%20India
Banking in India
Modern banking in India originated in the mid-18th century. Among the first banks were the Bank of Hindustan, which was established in 1770 and liquidated in 1829–32, and the General Bank of India, established in 1786, which failed in 1791. The largest and oldest bank still in existence is the State Bank of India (SBI). It originated and started working as the Bank of Calcutta in mid-June 1806. In 1809, it was renamed the Bank of Bengal. This was one of the three banks founded by a presidency government, the other two being the Bank of Bombay in 1840 and the Bank of Madras in 1843. The three banks were merged in 1921 to form the Imperial Bank of India, which upon India's independence became the State Bank of India in 1955. For many years, the presidency banks had acted as quasi-central banks, as did their successors, until the Reserve Bank of India was established in 1935, under the Reserve Bank of India Act, 1934. In 1960, the State Bank of India was given control of eight state-associated banks under the State Bank of India (Subsidiary Banks) Act, 1959. These are now called its associate banks. In 1969, the Government of India nationalised 14 major private banks; one of the big banks was Bank of India. In 1980, 6 more private banks were nationalised. These nationalised banks are the majority of lenders in the Indian economy. They dominate the banking sector because of their large size and widespread networks. The Indian banking sector is broadly classified into scheduled and non-scheduled banks. The scheduled banks are those included under the 2nd Schedule of the Reserve Bank of India Act, 1934. The scheduled banks are further classified into: nationalised banks; State Bank of India and its associates; Regional Rural Banks (RRBs); foreign banks; and other Indian private sector banks. SBI merged its associate banks into itself on 1 April 2017 to create the largest bank in India; with this merger, SBI has a global ranking of 236 on the Fortune 500 index. The term commercial banks refers to both scheduled and non-scheduled commercial banks regulated under the Banking Regulation Act, 1949. Generally, the supply, product range and reach of banking in India is fairly mature, even though reach in rural India and to the poor still remains a challenge. The government has developed initiatives to address this through the State Bank of India expanding its branch network and through the National Bank for Agriculture and Rural Development (NABARD) with facilities like microfinance. History Ancient India The Vedas, the ancient Indian texts, mention the concept of usury, with the word kusidin translated as "usurer". The Sutras (700–100 BCE) and the Jatakas (600–400 BCE) also mention usury. Texts of this period also condemned usury: Vasishtha forbade Brahmin and Kshatriya varnas from participating in usury. By the 2nd century CE, usury became more acceptable. The Manusmriti considered usury an acceptable means of acquiring wealth or leading a livelihood. It also considered money lending above a certain rate a grave sin, with different ceiling rates prescribed for different castes. The Jatakas, Dharmashastras and Kautilya also mention the existence of loan deeds, called rnapatra, rnapanna, or rnalekhaya. Later, during the Mauryan period (321–185 BCE), an instrument called adesha was in use, which was an order on a banker directing him to pay the sum on the note to a third person, which corresponds to the definition of a modern bill of exchange. 
The considerable use of these instruments has been recorded. In large towns, merchants also gave letters of credit to one another. Medieval Period The use of loan deeds continued into the Mughal era, when they were called dastawez (in Urdu/Hindi). Two types of loan deeds have been recorded: the dastawez-e-indultalab was payable on demand and the dastawez-e-miadi was payable after a stipulated time. The use of payment orders by royal treasuries, called barattes, has also been recorded. There are also records of Indian bankers issuing bills of exchange on foreign countries. Hundis, a type of credit instrument, also evolved during this period and remain in use. Colonial era During the period of British rule, merchants established the Union Bank of Calcutta in 1829, first as a private joint stock association, then a partnership. Its proprietors were the owners of the earlier Commercial Bank and the Calcutta Bank, who by mutual consent created Union Bank to replace these two banks. In 1840, it established an agency at Singapore, and closed the one at Mirzapore that it had opened in the previous year. Also in 1840, the Bank revealed that it had been the subject of a fraud by the bank's accountant. Union Bank was incorporated in 1845 but failed in 1848, having been insolvent for some time and having used new money from depositors to pay its dividends. The Allahabad Bank, established in 1865 and still functioning today, is the oldest joint stock bank in India, though it was not the first. That honour belongs to the Bank of Upper India, which was established in 1863 and survived until 1913, when it failed, with some of its assets and liabilities being transferred to the Alliance Bank of Simla. Foreign banks too started to appear, particularly in Calcutta, in the 1860s. Grindlays Bank opened its first branch in Calcutta in 1864. The Comptoir d'Escompte de Paris opened a branch in Calcutta in 1860, and another in Bombay in 1862; branches followed in Madras and Pondicherry, then a French possession. HSBC established itself in Bengal in 1869. Calcutta was the most active trading port in India, mainly due to the trade of the British Empire, and so became a banking centre. The first entirely Indian joint stock bank was the Oudh Commercial Bank, established in 1881 in Faizabad. It failed in 1958. The next was the Punjab National Bank, established in Lahore in 1894, which has survived to the present and is now one of the largest banks in India. Around the turn of the 20th century, the Indian economy was passing through a relative period of stability. Around five decades had elapsed since the Indian rebellion, and the social, industrial and other infrastructure had improved. Indians had established small banks, most of which served particular ethnic and religious communities. The presidency banks dominated banking in India, but there were also some exchange banks and a number of Indian joint stock banks. All these banks operated in different segments of the economy. The exchange banks, mostly owned by Europeans, concentrated on financing foreign trade. Indian joint stock banks were generally undercapitalised and lacked the experience and maturity to compete with the presidency and exchange banks. This segmentation led Lord Curzon to observe, "In respect of banking it seems we are behind the times. We are like some old fashioned sailing ship, divided by solid wooden bulkheads into separate and cumbersome compartments." 
The period between 1906 and 1911 saw the establishment of banks inspired by the Swadeshi movement. The Swadeshi movement inspired local businessmen and political figures to found banks of and for the Indian community. A number of banks established then have survived to the present, such as Catholic Syrian Bank, The South Indian Bank, Bank of India, Corporation Bank, Indian Bank, Bank of Baroda, Canara Bank and Central Bank of India. The fervour of the Swadeshi movement led to the establishment of many private banks in Dakshina Kannada and Udupi districts, which were earlier unified and known as South Canara (South Kanara) district. Four nationalised banks, as well as a leading private sector bank, started in this district. Hence, the undivided Dakshina Kannada district is known as the "Cradle of Indian Banking". The inaugural Governor of the Reserve Bank of India was the Britisher Sir Osborne Smith (1 April 1935), while C. D. Deshmukh (11 August 1943) was the first Indian governor. On December 12, 2018, Shaktikanta Das, who had been finance secretary with the Government of India, took charge as the new RBI Governor, succeeding Urjit R. Patel. The period from the First World War (1914–1918) through the end of the Second World War (1939–1945), and the two years thereafter until the independence of India, was challenging for Indian banking. The years of the First World War were turbulent and took their toll, with banks simply collapsing despite the Indian economy gaining an indirect boost from war-related economic activities. At least 94 banks in India failed between 1913 and 1918. Post-Independence During 1938–46, bank branch offices trebled to 3,469 and deposits quadrupled to 962 crore. Nevertheless, the partition of India in 1947 adversely impacted the economies of Punjab and West Bengal, paralysing banking activities for months. India's independence marked the end of a regime of laissez-faire for Indian banking. The Government of India initiated measures to play an active role in the economic life of the nation, and the Industrial Policy Resolution adopted by the government in 1948 envisaged a mixed economy. This resulted in greater involvement of the state in different segments of the economy, including banking and finance. The major steps to regulate banking included: The Reserve Bank of India, India's central banking authority, was established in April 1935, but was nationalized on 1 January 1949 under the terms of the Reserve Bank of India (Transfer to Public Ownership) Act, 1948 (RBI, 2005b). In 1949, the Banking Regulation Act was enacted, which empowered the Reserve Bank of India (RBI) to regulate, control, and inspect the banks in India. The Banking Regulation Act also provided that no new bank or branch of an existing bank could be opened without a license from the RBI, and no two banks could have common directors. Nationalisation in 1969 Despite the provisions, control and regulations of the Reserve Bank of India, banks in India, except the State Bank of India (SBI), remained owned and operated by private persons. By the 1960s, the Indian banking industry had become an important tool to facilitate the development of the Indian economy. At the same time, it had emerged as a large employer, and a debate had ensued about the nationalization of the banking industry. Indira Gandhi, the then Prime Minister of India, expressed the intention of the Government of India at the annual conference of the All India Congress Meeting in a paper entitled Stray thoughts on Bank Nationalization. 
Thereafter, the Government of India issued the Banking Companies (Acquisition and Transfer of Undertakings) Ordinance, 1969 and nationalized the 14 largest commercial banks with effect from the midnight of 19 July 1969. These banks contained 85 percent of bank deposits in the country. Within two weeks of the issue of the ordinance, the Parliament passed the Banking Companies (Acquisition and Transfer of Undertaking) Bill, and it received presidential approval on 9 August 1969. The following banks were nationalized in 1969: Allahabad Bank (now Indian Bank) Bank of Baroda Bank of India Bank of Maharashtra Central Bank of India Canara Bank Dena Bank (now Bank of Baroda) Indian Bank Indian Overseas Bank Punjab National Bank Syndicate Bank (now Canara Bank) UCO Bank Union Bank of India United Bank of India (now Punjab National Bank) Nationalisation in 1980 A second round of nationalizations of six more commercial banks followed in 1980. The stated reason for the nationalization was to give the government more control of credit delivery. With the second round of nationalizations, the Government of India controlled around 91% of the banking business of India. The following banks were nationalized in 1980: Punjab and Sind Bank Vijaya Bank (now Bank of Baroda) Oriental Bank of Commerce (now Punjab National Bank) Corporation Bank (now Union Bank of India) Andhra Bank (now Union Bank of India) New Bank of India (now Punjab National Bank) Later, in 1993, the government merged New Bank of India with Punjab National Bank. It was the only merger between nationalised banks and resulted in the reduction of the number of nationalised banks from 20 to 19. Until the 1990s, the nationalized banks grew at a pace of around 4%, closer to the average growth rate of the Indian economy. Liberalisation in the 1990s In the early 1990s, the then government embarked on a policy of liberalisation, licensing a small number of private banks. These came to be known as New Generation tech-savvy banks, and included Global Trust Bank (the first of such new generation banks to be set up), which later amalgamated with Oriental Bank of Commerce, IndusInd Bank, UTI Bank (since renamed Axis Bank), ICICI Bank and HDFC Bank. This move – along with the rapid growth in the economy of India – revitalised the banking sector in India, which has seen rapid growth with strong contribution from all three sectors of banks, namely, government banks, private banks and foreign banks. The next stage for Indian banking has been set up, with proposed relaxation of norms for foreign direct investment. All foreign investors in banks may be given voting rights that could exceed the present cap of 10%. In 2019, for Bandhan Bank specifically, the foreign investment limit was increased to 49%; it has since gone up to 74% with some restrictions. The new policy shook the banking sector in India completely. Bankers, till this time, were used to the 4–6–4 method (borrow at 4%; lend at 6%; go home at 4) of functioning. The new wave ushered in a modern outlook and tech-savvy methods of working for traditional banks. All this led to the retail boom in India. People demanded more from their banks and received more. PSB Amalgamations in the 2000s and 2010s SBI SBI merged with its associate bank State Bank of Saurashtra in 2008 and State Bank of Indore in 2009. Following a merger process, the merger of the five remaining associate banks (viz. 
State Bank of Bikaner and Jaipur, State Bank of Hyderabad, State Bank of Mysore, State Bank of Patiala and State Bank of Travancore) and the Bharatiya Mahila Bank with the SBI was given in-principle approval by the Union Cabinet on 15 June 2016. This came a month after the SBI board had, on 17 May 2016, cleared a proposal to merge its five associate banks and Bharatiya Mahila Bank with itself. On 15 February 2017, the Union Cabinet approved the merger of the five associate banks with SBI. An analyst foresaw an initial negative impact as a result of different pension liability provisions and accounting policies for bad loans. The merger went into effect from 1 April 2017. BOB On 17 September 2018, the Government of India proposed the amalgamation of Dena Bank and Vijaya Bank with the erstwhile Bank of Baroda, pending approval from the boards of the three banks. The Union Cabinet and the boards of the banks approved the merger on 2 January 2019. Under the terms of the amalgamation, Dena Bank and Vijaya Bank shareholders received 110 and 402 equity shares of the Bank of Baroda, respectively, of face value 2 for every 1,000 shares they held. The amalgamation became effective from 1 April 2019. PNB On 30 August 2019, the Finance Minister announced that the Oriental Bank of Commerce and United Bank of India would be merged with Punjab National Bank, making PNB the second largest PSB after SBI with assets of and 11,437 branches. MD and CEO of UBI, Ashok Kumar Pradhan, stated that the merged entity would begin functioning from 1 April 2020. The Union Cabinet approved the merger on 4 March 2020. PNB announced that its board had approved the merger ratios the next day. Shareholders of OBC and UBI received 1,150 shares and 121 shares of Punjab National Bank, respectively, for every 1,000 shares they held. The merger came into effect on 1 April 2020. Post merger, Punjab National Bank has become the second largest public sector bank in India. Canara Bank On 30 August 2019, the Finance Minister announced that Syndicate Bank would be merged with Canara Bank. The proposal would create the fourth largest PSB, trailing SBI, PNB and BoB, with assets of and 10,324 branches. The Board of Directors of Canara Bank approved the merger on 13 September 2019. The Union Cabinet approved the merger on 4 March 2020. Canara Bank assumed control over Syndicate Bank on 1 April 2020, with Syndicate Bank shareholders receiving 158 equity shares in the former for every 1,000 shares they held. Union Bank of India On 30 August 2019, the Finance Minister announced that Andhra Bank and Corporation Bank would be merged into Union Bank of India. The proposal would make Union Bank of India the fifth largest PSB with assets of and 9,609 branches. The Board of Directors of Andhra Bank approved the merger on 13 September. The Union Cabinet approved the merger on 4 March, and it was completed on 1 April 2020. Indian Bank On 30 August 2019, the Finance Minister announced that Allahabad Bank would be merged with Indian Bank. The proposal would create the seventh largest PSB in the country with assets of . The Union Cabinet approved the merger on 4 March 2020. Indian Bank assumed control of Allahabad Bank on 1 April 2020. Rescue of private and co-operative banks (2020s) Yes Bank In April 2020, RBI enlisted SBI to rescue the troubled lender Yes Bank, in the form of investment with assistance from other lenders, viz. ICICI Bank, HDFC Bank and Kotak Mahindra Bank. 
SBI went on to own 48% of the share capital of Yes Bank, which it later diluted to 30% in an FPO in the following months. Lakshmi Vilas Bank In November 2020, RBI asked DBS Bank India Limited (DBIL) to take over the operations of the private sector bank Lakshmi Vilas Bank, whose net worth had turned negative following mismanagement and two failed merger attempts with NBFCs. DBS India, which then had just 12 branches, benefited from LVB's 559 branches. In a first-of-its-kind move, Tier-II bondholders were asked by RBI to write off their holdings in LVB. Punjab and Maharashtra Co-operative Bank In January 2022, RBI asked Unity Small Finance Bank Limited (Unity SFB) to take over the operations of the co-operative bank Punjab and Maharashtra Co-operative Bank (PMC), following mismanagement and one failed merger attempt with an NBFC/SFB. Unity SFB was then being created by Centrum Finance and payment provider BharatPe to absorb the liabilities of the scam-hit bank. In another first-of-its-kind move, RBI allowed an established cooperative bank to merge into an SFB that was still being created. Regional Rural Banks revamp Under a new policy effected in late 2010, the RRBs, which each served a smaller locality spanning a few districts, were merged into state-level entities, following the merger of nationalised banks and their equity in RRBs becoming sequentially higher. This eliminated competition and cooperation between RRBs, essentially making each a subsidiary of its promoter nationalised bank with state equity. Current period The Indian banking sector is broadly classified into scheduled banks and non-scheduled banks. All banks included in the Second Schedule to the Reserve Bank of India Act, 1934 are Scheduled Banks. These banks comprise Scheduled Commercial Banks and Scheduled Co-operative Banks. Scheduled Co-operative Banks consist of Scheduled State Co-operative Banks and Scheduled Urban Cooperative Banks. In the bank group-wise classification, IDBI Bank Ltd. is included in the category of other public sector banks. With growth in the Indian economy expected to be strong for quite some time, especially in its services sector, the demand for banking services, especially retail banking, mortgages and investment services, is expected to be strong. One may also expect M&As, takeovers, and asset sales. In March 2006, the Reserve Bank of India allowed Warburg Pincus to increase its stake in Kotak Mahindra Bank (a private sector bank) to 10%. This was the first time an investor was allowed to hold more than 5% in a private sector bank since the RBI announced norms in 2005 that any stake exceeding 5% in private sector banks would need to be vetted by it. In recent years, critics have charged that the non-government-owned banks are too aggressive in their loan recovery efforts in connection with housing, vehicle and personal loans. There are press reports that the banks' loan recovery efforts have driven defaulting borrowers to suicide. By 2013, the Indian banking industry employed 1,175,149 people and had a total of 109,811 branches in India and 171 branches abroad, and managed an aggregate deposit of and bank credit of . The net profit of the banks operating in India was against a turnover of for the financial year 2012–13. Pradhan Mantri Jan Dhan Yojana is a scheme for comprehensive financial inclusion launched by the Prime Minister of India, Narendra Modi, in 2014. 
Run by the Department of Financial Services, Ministry of Finance, the scheme saw 1.5 crore (15 million) bank accounts opened on its inauguration day. By 15 July 2015, 16.92 crore accounts had been opened, with around deposited under the scheme, which also has an option for opening new bank accounts with zero balance. Payment Bank A payments bank is a new model of bank conceptualized by the Reserve Bank of India (RBI). These banks can accept a restricted deposit, which is currently limited to ₹2 lakh per customer. These banks may not issue loans or credit cards, but may offer both current and savings accounts. Payments banks may issue ATM and debit cards, and offer net-banking and mobile-banking. The draft guidelines for licensing of payments banks in the private sector were formulated and released for public comments on July 17, 2014. The banks will be licensed as payments banks under Section 22 of the Banking Regulation Act, 1949, and will be registered as public limited companies under the Companies Act, 2013. Small finance banks To further the objective of financial inclusion, the RBI granted approval in 2016 to ten entities to set up small finance banks. Since then, all ten have received the necessary licenses. A small finance bank is a niche type of bank intended to cater to the needs of people who traditionally have not used scheduled banks. Each of these banks is to open at least 25% of its branches in areas that do not have any other bank branches (unbanked regions). A small finance bank should hold 75% of its net credits in loans to firms in priority sector lending, and 50% of the loans in its portfolio must be less than lakh (US$,000). Banking codes and standards The Banking Codes and Standards Board of India is an independent and autonomous banking industry body that monitors banks in India. S. S. Tarapore (former deputy governor of the RBI) had the idea to form this body to improve the quality of banking services in India. Adoption of banking technology Information technology has had a great impact on the Indian banking system. The use of computers led to the introduction of online banking in India. The use of computers in the banking sector increased manyfold after the economic liberalisation of 1991, as the country's banking sector was exposed to the world market. Indian banks were finding it difficult to compete with the international banks in customer service without the use of information technology. The RBI set up a number of committees to define and co-ordinate banking technology. These have included: In 1984, the Committee on Mechanisation in the Banking Industry was formed, chaired by Dr. C. Rangarajan, Deputy Governor of the Reserve Bank of India. The major recommendation of this committee was the introduction of MICR technology in all banks in the metropolises of India. This provided for the use of standardised cheque forms and encoders. In 1988, the RBI set up the Committee on Computerisation in Banks, again headed by Dr. C. Rangarajan. It emphasised that settlement operations must be computerised in the clearing houses of RBI in Bhubaneshwar, Guwahati, Jaipur, Patna and Thiruvananthapuram. It further stated that there should be National Clearing of inter-city cheques at Kolkata, Mumbai, Delhi and Chennai, and that MICR should be made operational. It also focused on computerisation of branches and increasing connectivity among branches through computers. It also suggested modalities for implementing on-line banking. 
The committee submitted its reports in 1989, and computerisation began in 1993 with the settlement between the IBA and bank employees' associations. In 1994, the Committee on Technology Issues relating to Payment Systems, Cheque Clearing and Securities Settlement in the Banking Industry was set up under Chairman W. S. Saraf. It emphasised the Electronic Funds Transfer (EFT) system, with the BANKNET communications network as its carrier. It also said that MICR clearing should be set up in all branches of all those banks with more than 100 branches. In 1995, the Committee for Proposing Legislation on Electronic Funds Transfer and Other Electronic Payments again emphasised the EFT system. In July 2016, Deputy Governor R. Gandhi of the Reserve Bank of India "urged banks to work to develop applications for digital currencies and distributed ledgers." Automatic teller machine growth The total number of automated teller machines (ATMs) installed in India by various banks as of 2018 was 2,38,000. The new private sector banks in India have the most ATMs, followed by off-site ATMs belonging to SBI and its subsidiaries and then by nationalised banks and foreign banks, while the number of on-site ATMs is highest for the nationalised banks of India. Cheque truncation initiative In 2008, the Reserve Bank of India introduced a system to allow cheque truncation—the conversion of cheques from physical form to electronic form when sending them to the paying bank—in India. The Cheque Truncation System, as it was known, was first rolled out in the National Capital Region and then rolled out nationally. Expansion of banking infrastructure Physical as well as virtual expansion of banking, through mobile banking, internet banking, telebanking, and biometric and mobile ATMs, has been taking place since the last decade and has gained momentum in the last few years. Data Breaches 2016 Indian Banks data breach A huge data breach on debit cards issued by various Indian banks was reported in October 2016. It was estimated that 3.2 million debit cards were compromised. Major Indian banks – SBI, HDFC Bank, ICICI Bank, Yes Bank and Axis Bank – were among the worst hit. Many users reported unauthorised use of their cards in locations in China. This resulted in one of India's biggest card replacement drives in banking history. The biggest Indian bank, State Bank of India, announced the blocking and replacement of almost 600,000 debit cards. See also History of banking Institute of Banking Personnel Selection Banking Frontiers magazine, being published since 2002 Indian Rupee Private-sector banks in India Public sector banks in India References Further reading Banking Frontiers magazine, being published since 2002 The Evolution of the State Bank of India (The Era of the Imperial Bank of India, 1921–1955) (Volume III) External links Reserve Bank of India Indian Banking Failure Banking
49938759
https://en.wikipedia.org/wiki/Wireless%20Conference%20Microphone
Wireless Conference Microphone
A wireless conference microphone is a conference microphone or conference terminal that has evolved from a wired connection to a wireless one. Wireless conference microphones operate in various frequency ranges: the most commonly used is 2.4 GHz, based on Wi-Fi, while 5.8 GHz is also applied in wireless conference microphone communication systems. The traditional conference microphone is connected to the control unit or a PC with cables that carry the signal for modulation or encryption. Various encryption technologies are used in wireless conference systems. AES is frequently applied to encrypt the communication between wireless conference microphones (a minimal illustrative sketch of this kind of frame encryption follows the article text below). AES is available in many different encryption packages, and is the first (and only) publicly accessible cipher approved by the National Security Agency (NSA) for top secret information. The signal exchange and encryption between wireless conference microphones is coordinated by the central control unit. The deployment of a wireless conference system does not require cables. However, the radio environment of the operating site is important for the wireless conference microphone: with co-channel interference from mobile phones or other Wi-Fi devices, the communication of the wireless conference microphone can be disturbed. References Wireless
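As a minimal, hypothetical sketch of the AES protection described above (not drawn from any vendor's actual firmware), the following Python example encrypts a single audio frame with AES in GCM mode using the widely available cryptography package; the frame size, key provisioning, nonce handling, and the "mic-07" identifier are all assumptions made purely for illustration.

```python
# Minimal sketch: AES-GCM protection of one audio frame between a wireless
# conference microphone and its central control unit. Key provisioning,
# frame size, and the unit identifier are illustrative assumptions.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # in practice, provisioned per unit
aesgcm = AESGCM(key)

audio_frame = os.urandom(960)               # stand-in for ~20 ms of PCM audio
nonce = os.urandom(12)                      # must never repeat for a given key
unit_id = b"mic-07"                         # authenticated but unencrypted data

ciphertext = aesgcm.encrypt(nonce, audio_frame, unit_id)

# The control unit, holding the same key, authenticates and decrypts the frame.
recovered = aesgcm.decrypt(nonce, ciphertext, unit_id)
assert recovered == audio_frame
```

In a real system the nonce would typically be derived from a frame counter sent alongside the ciphertext, so the control unit can authenticate and decrypt frames even when they arrive out of order.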
42814162
https://en.wikipedia.org/wiki/Android%20Team%20Awareness%20Kit
Android Team Awareness Kit
Android Team Awareness Kit (ATAK) is an Android smartphone geospatial infrastructure and military situation awareness app. It allows for precision targeting, surrounding land formation intelligence, situational awareness, navigation, and data sharing. This Android app is a part of the larger TAK family of products. ATAK has a plugin architecture which allows developers to add functionality. This extensible plugin architecture allows enhanced capabilities for specific mission sets (Direct Action, Combat Advising, Law Enforcement, Protection Operations, Border Security, Disaster Response, Off-grid Communications, Precision Mapping and Geotagging). It enables users to navigate using GPS and geospatial map data overlaid with real-time situational awareness of ongoing events. The ATAK software represents the surrounding area using the military-standard MIL-STD-2525B symbology, along with customized symbols such as icons from Google Earth and Google Maps for iconography, and uses the Cursor on Target data format standard for communication (a minimal example of a Cursor on Target message appears at the end of this section). Initially created in 2010 by the Air Force Research Laboratory and based on the NASA WorldWind Mobile codebase, its development and deployment grew slowly at first, then rapidly from 2016 onward. As of 2020, ATAK has a growing base of 250,000 military and civilian users across numerous public safety agencies and US partner nations, and has seen the addition of 15 United States Department of Defense programs. Development and usage ATAK began in August 2010 and was originally based on NASA WorldWind Mobile. The goal was to demonstrate robust information sharing in a mobile format. In 2013, officials at Draper Laboratory said that the system would be compatible with Android mobile operating systems and could be used for navigation, spatial awareness, and controlling drones. On October 14, 2014, the U.S. Army Geospatial Center recommended AFRL's Android Team Awareness Kit (ATAK) over the market-leading Esri Commercial Joint Mapping Tool Kit (CJMTK), NASA's World Wind, and the Army's Globe Engine (AGE) as the map engine driving the Nett Warrior End User Device. ATAK was selected because it offered capabilities and risk similar to CJMTK at less than one-third of the total cost. According to a January 2016 article in National Defense Magazine, "[ATAK] has already been fielded to AFSOC units". In September 2015, DARPA reported that ATAK was used in a successful demonstration of the Persistent Close Air Support Program and was in use by thousands of users. Polaris integrated its Ground Guidance software into an ATAK plugin to allow on- and off-road routing for mounted and dismounted soldiers, accounting for terrain, weather, enemy activity and equipment load. In 2018, USAF Security Forces deployed ATAK at Eglin AFB, Florida. The Android Team Awareness Kit, or TAK, is currently used by thousands of Department of Homeland Security personnel, along with other members of the Homeland Security Enterprise including state and local public safety personnel. It is in various stages of transition across DHS components and is the emerging DHS-wide solution for tactical awareness. TAK has supported the rescue of over 2,000 people during disaster response for seven major hurricanes (Harvey, Irma, Maria, Florence, Lane, Michael, and Dorian). The capability is also regularly used during daily public safety operations and national security special events like United Nations General Assembly meetings and the Super Bowl. 
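As an illustration of the Cursor on Target data format mentioned above, the sketch below constructs a minimal CoT "event" message of the kind TAK clients exchange, using only Python's standard library; the uid, type code, and coordinates are invented for the example, and the element and attribute names follow the publicly documented CoT event schema rather than any specific ATAK release.

```python
# Hypothetical example: build a minimal Cursor on Target (CoT) event message.
# The uid, type code ("a-f-G-U-C" = friendly ground unit), and coordinates
# are placeholders chosen for illustration only.
import datetime
import xml.etree.ElementTree as ET

def cot_time(t: datetime.datetime) -> str:
    """Format a UTC timestamp the way CoT messages expect."""
    return t.strftime("%Y-%m-%dT%H:%M:%S.%fZ")

now = datetime.datetime.now(datetime.timezone.utc)
event = ET.Element("event", {
    "version": "2.0",
    "uid": "EXAMPLE-UNIT-01",                 # hypothetical unit identifier
    "type": "a-f-G-U-C",                      # symbol type (MIL-STD-2525 derived)
    "how": "m-g",                             # position derived from GPS
    "time": cot_time(now),
    "start": cot_time(now),
    "stale": cot_time(now + datetime.timedelta(minutes=5)),
})
ET.SubElement(event, "point", {
    "lat": "38.8895", "lon": "-77.0353",      # illustrative coordinates
    "hae": "9999999.0", "ce": "10.0", "le": "10.0",
})

print(ET.tostring(event, encoding="unicode"))
```

In practice a TAK client would serialize such an event and send it to peers or a TAK server; the "stale" attribute tells receivers when the reported position should no longer be trusted.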
ATAK Versions ATAK has various end-user versions: ATAK - Civilian (ATAK-CIV) - A distribution-controlled but fully releasable version of the TAK Product line for first responders and licensed commercial developers. Distribution for ATAK-CIV is through Approved, Government Hosted Sites and Direct Commercial Sales (DCS). This version has no ITAR capabilities. ATAK - Government (ATAK-GOV) - An ITAR-restricted version of the TAK Product line for USG entities and foreign governments. Distribution for ATAK-GOV is through Approved, Government Hosted Sites and Direct Commercial Sales (DCS). This version of ATAK has no military-sensitive (MIL) capabilities. ATAK - Military (ATAK-MIL) - A military-sensitive version of the TAK Product line for US and foreign military end-users. Similar to ATAK-GOV, distribution is through Approved, Government Hosted Sites. However, it is not available through Direct Commercial Sales (DCS). ATAK - Public Release (ATAK-PR) - (Discontinued) Made available for download on takmaps.com in April 2020, ATAK-PR was a publicly releasable version of the TAK Product line for public individuals and public uses. This version of ATAK is not plugin-capable and is only compatible with arm64-based systems due to file size restrictions. End users with armeabi-v7a or x86 devices are to use ATAK-CIV. ATAK - FVEY "Five Eyes" (ATAK-FVEY) ATAK-CIV On September 1, 2020, the TAK Product Center released ATAK-CIV (Android Team Awareness Kit - Civil Use) Version 4.1.1.0 on the Google Play Store. Other Versions In addition to the Android version, there is also a Microsoft Windows version (WinTAK) and an Apple iOS version under development (iTAK). WinTAK is an application developed for the Microsoft Windows operating system that uses maps to allow for precise targeting, intelligence on surrounding land formations, navigation, and generalized situational awareness. It was developed in conjunction with ATAK to provide similar functionality on a Windows platform. Commercial Licensing In January 2015, AFRL began licensing ATAK through TechLink to U.S. companies for commercial use, to support state/local government uses as well as civilian uses. As of January 2020, one hundred companies had licensed ATAK for commercial uses. Corona Fire Department is one example of a local public safety agency using ATAK. Corona uses PAR Government's Team Connect platform to leverage ATAK. In civilian use, ATAK is often referred to as Android Team Awareness Kit. Federal Government release As of March 31, 2020, the civilian version of ATAK, referred to as CivTAK, was approved for "Public Release" by Army Night Vision and made available for download on takmaps.com; it was subsequently named Android Team Awareness Kit (ATAK) - Civilian. License Grant Upon running ATAK-PR 4.0.0.1, the application splash screen shows the statement: "Approved for public release; distribution is unlimited." The license conditions are detailed in the ATAK Software License Agreement found in the Support menu of ATAK. Open Source On August 19, 2020, the source code for the Android Tactical Assault Kit for Civilian Use (ATAK-CIV), the official geospatial-temporal and situational awareness tool used by the US Government, was released on the United States Department of Defense - Defense Digital Service GitHub repository. ATAK-CIV is managed by the Tactical Assault Kit Configuration Steering Board (TAK CSB) and is designed for use by (US) federal employees. 
It is made available to the open source community with the hope that community contributions will improve functionality, add features, and mature this work. Users Military United States Special Operations Command United States Army United States Army Special Operations Command United States Air Force United States National Guard United States Coast Guard Federal, State and Local Government United States Department of Homeland Security United States Secret Service Federal Bureau of Investigation U.S. Customs and Border Protection Immigration and Customs Enforcement Federal Emergency Management Agency Albuquerque Fire Rescue Colorado Department of Public Safety, DFPC Center of Excellence for Advanced Technology Aerial Firefighting (CoE) Corona, California - Corona Fire Department New York City Police Department Philippine National Police - JTF CoViD Shield British Army - 1st The Queen's Dragoon Guards References External links TAK Product Center TAK Product Center - Legacy Portal CivTAK.org - News, Licensing, Support for TAK Tools TAKCiv Community Support Wiki Android (operating system) Geographic information systems
17606964
https://en.wikipedia.org/wiki/Movie%20Outline
Movie Outline
Movie Outline is a word processing program to step outline a cinematic story and format a screenplay. It was created by Dan Bronzite, a produced UK screenwriter. It was released in 2004 as an outliner but has expanded its features in later releases. The software is based on the principle of step-outlining, whereby a writer creates their story step-by-step before writing the screenplay. This essential process is sometimes overlooked by the novice writer who may prefer to jump headfirst into writing their full screenplay before properly planning their story structure. However, this approach normally results in more rewrites in the long run. One of Movie Outline's features is its "Reference Plugins" which allow comparison of the progress of the writer's story with outlines and analyses of Hollywood movies. Program features Step outlining - allows development of the story, script and characters step-by-step. Automatic script formatting - automatically formats the script as it is typed through auto-complete and keyboard shortcuts. Character development - allows creation of character profiles. Color-coded story structuring - the story and script can be structured into color-coded acts (or chapters if writing a novel etc.) and the structures created can be saved as templates. Software includes sample templates such as the Hero's Journey (Mythic Structure) and the classic three-act structure. Dialogue focus tools - this feature allows isolation of voice-over or dialogue between two characters allowing creation of unique characters and consistent voices. Story analysis tools - uses "FeelFactors" to analyze the pacing of the story (conflict, tension, action etc.) through a visual graph. Reference movie breakdowns - includes outlines and analyses of successful produced Hollywood movies. Drag and drop index cards - organizational tool that allows rearrangement of the story structure. Cross-platform - documents can be exchanged between Windows and Mac platforms. Import and export - imports plain text documents that have been written in standard word processors or other script-formatting software and automatically reformats them to industry standard screenplay format. Exports documents to many file formats including Rich Text, HTML (web page), Adobe Acrobat PDF and Movie Outline's own secure reference format. History Movie Outline was launched in early 2004 for Windows purely as an outliner. In 2005 Version 2 was released along with a Mac OS X version, adding new features including structuring tools and a story tasks "to do" list. In 2007 Nuvotech Limited released Version 3 at the Screenwriting Expo in Los Angeles, California. This incarnation had evolved into a fully fledged screenplay development application that now included professional script formatting features and character development tools. The product was well received by users and reviewers and highlighted the importance of outlining to the screenwriting community. In 2010 Version 3.1 was released with over 100 new features and improvements and although it was technically a major upgrade Nuvotech offered it for free to its existing users. In 2017 Nuvotech re-launched Movie Outline under the new brand name of Script Studio with a slate of new features including a dedicated Novel Mode, Night Mode, Unicode and right-to-left language support, WYSIWYG Dual Dialogue, global Scratch Pad, new templates and a brand new modern interface design. 
System requirements Windows 10/8/7/Vista/XP operating system 52 MB of free hard disk space 1 GHz CPU or higher Intel Pentium/Celeron Compatible Processor 512 MB of RAM. VGA or higher monitor resolution Macintosh OS X 10.5 or above 75 MB of free hard disk space 1 GHz CPU or higher G4 or above PowerPC or Intel processor 512 MB of RAM VGA or higher monitor resolution References External links Creative Screenwriting Magazine review StoryPros.com Review The Script Connection endorsement Screenwriting software MacOS text-related software Windows text-related software
4283705
https://en.wikipedia.org/wiki/W3C%20Software%20Notice%20and%20License
W3C Software Notice and License
The W3C Software Notice and License is a permissive free software license used by software released by the World Wide Web Consortium, such as Amaya. It is compatible with the GNU General Public License. Software using the License Arena Amaya Libwww Line Mode Browser See also Free software portal Software using the W3C license (category) World Wide Web Consortium References External links Text of the license Free and open-source software licenses
22349
https://en.wikipedia.org/wiki/Odyssey
Odyssey
The Odyssey is one of two major ancient Greek epic poems attributed to Homer. It is one of the oldest extant works of literature still widely read by modern audiences. As with the Iliad, the poem is divided into 24 books. It follows the Greek hero Odysseus, king of Ithaca, and his journey home after the Trojan War. After the war itself, which lasted ten years, his journey lasted for ten additional years, during which time he encountered many perils and all his crew mates were killed. In his absence, Odysseus was assumed dead, and his wife Penelope and son Telemachus had to contend with a group of unruly suitors who were competing for Penelope's hand in marriage. The Odyssey was originally composed in Homeric Greek in around the 8th or 7th century BCE and, by the mid-6th century BCE, had become part of the Greek literary canon. In antiquity, Homer's authorship of the poem was not questioned, but contemporary scholarship predominantly assumes that the Iliad and the Odyssey were composed independently, and the stories themselves formed as part of a long oral tradition. Given widespread illiteracy, the poem was performed by an aoidos or rhapsode, and was more likely to be heard than read. Crucial themes in the poem include the ideas of nostos (νόστος; "return"), wandering, xenia (ξενία; "guest-friendship"), testing, and omens. Scholars still reflect on the narrative significance of certain groups in the poem, such as women and slaves, who have a more prominent role in the epic than in many other works of ancient literature. This focus is especially remarkable when considered beside the Iliad, which centres the exploits of soldiers and kings during the Trojan War. The Odyssey is regarded as one of the most significant works of the Western canon. The first English translation of the Odyssey was made in the 16th century. Adaptations and re-imaginings continue to be produced across a wide variety of mediums. In 2018, when BBC Culture polled experts around the world to find literature's most enduring narrative, the Odyssey topped the list. Synopsis Exposition (books 1-4) The Odyssey begins after the end of the ten-year Trojan War (the subject of the Iliad), from which Odysseus, king of Ithaca, has still not returned due to angering Poseidon, the god of the sea. Odysseus' son, Telemachus, is about 20 years old and is sharing his absent father's house on the island of Ithaca with his mother Penelope and the suitors of Penelope, a crowd of 108 boisterous young men who each aim to win Penelope's hand in marriage, all the while reveling in the king's palace and eating up his wealth. Odysseus' protectress, the goddess Athena, asks Zeus, king of the gods, to finally allow Odysseus to return home when Poseidon is absent from Mount Olympus. Then, disguised as a chieftain named Mentes, Athena visits Telemachus to urge him to search for news of his father. He offers her hospitality and they observe the suitors dining rowdily while Phemius, the bard, performs a narrative poem for them. That night, Athena, disguised as Telemachus, finds a ship and crew for the true prince. The next morning, Telemachus calls an assembly of citizens of Ithaca to discuss what should be done with the insolent suitors, who then scoff at Telemachus. Accompanied by Athena (now disguised as Mentor), the son of Odysseus departs for the Greek mainland, to the household of Nestor, most venerable of the Greek warriors at Troy, who resided in Pylos after the war. From there, Telemachus rides to Sparta, accompanied by Nestor's son. 
There he finds Menelaus and Helen, who are now reconciled. Both Helen and Menelaus also say that they returned to Sparta after a long voyage by way of Egypt. There, on the island of Pharos, Menelaus encounters the old sea-god Proteus, who told him that Odysseus was a captive of the nymph Calypso. Telemachus learns the fate of Menelaus' brother, Agamemnon, king of Mycenae and leader of the Greeks at Troy: he was murdered on his return home by his wife Clytemnestra and her lover Aegisthus. The story briefly shifts to the suitors, who have only just now realized that Telemachus is gone. Angry, they formulate a plan to ambush his ship and kill him as he sails back home. Penelope overhears their plot and worries for her son's safety. Escape to the Phaeacians (books 5-8) In the course of Odysseus' seven years as a captive of the goddess Calypso on an island (Ogygia), she has fallen deeply in love with him, even though he spurns her offers of immortality as her husband and still mourns for home. She is ordered to release him by the messenger god Hermes, who has been sent by Zeus in response to Athena's plea. Odysseus builds a raft and is given clothing, food, and drink by Calypso. When Poseidon learns that Odysseus has escaped, he wrecks the raft but, helped by a veil given by the sea nymph Ino, Odysseus swims ashore on Scherie, the island of the Phaeacians. Naked and exhausted, he hides in a pile of leaves and falls asleep. The next morning, awakened by girls' laughter, he sees the young Nausicaä, who has gone to the seashore with her maids after Athena told her in a dream to do so. He appeals for help. She encourages him to seek the hospitality of her parents, Arete and Alcinous. Alcinous promises to provide him a ship to return him home, without knowing who Odysseus is. He remains for several days. Odysseus asks the blind singer Demodocus to tell the story of the Trojan Horse, a stratagem in which Odysseus had played a leading role. Unable to hide his emotion as he relives this episode, Odysseus at last reveals his identity. He then tells the story of his return from Troy. Odysseus' account of his adventures (books 9-12) Odysseus recounts his story to the Phaeacians. After a failed raid, Odysseus and his twelve ships were driven off course by storms. Odysseus visited the lotus-eaters who gave his men their fruit that caused them to forget their homecoming. Odysseus had to drag them back to the ship by force. Afterwards, Odysseus and his men landed on a lush, uninhabited island near the land of the Cyclopes. The men then landed on shore and entered the cave of Polyphemus, where they found all the cheeses and meat they desired. Upon returning home, Polyphemus sealed the entrance with a massive boulder and proceeded to eat Odysseus' men. Odysseus devised an escape plan in which he, identifying himself as "Nobody," plied Polyphemus with wine and blinded him with a wooden stake. When Polyphemus cried out, his neighbors left after Polyphemus claimed that "Nobody" had attacked him. Odysseus and his men finally escaped the cave by hiding on the underbellies of the sheep as they were let out of the cave. As they escaped, however, Odysseus, taunting Polyphemus, revealed himself. The Cyclops prayed to his father Poseidon, asking him to curse Odysseus to wander for ten years. After the escape, Aeolus gave Odysseus a leather bag containing all the winds, except the west wind, a gift that should have ensured a safe return home. 
Just as Ithaca came into sight, the sailors opened the bag while Odysseus slept, thinking it contained gold. The winds flew out and the storm drove the ships back the way they had come. Aeolus, recognizing that Odysseus had drawn the ire of the gods, refused to further assist him. After the cannibalistic Laestrygonians destroyed all of his ships except his own, Odysseus sailed on and reached the island of Aeaea, home of witch-goddess Circe. She turned half of his men into swine with drugged cheese and wine. Hermes warned Odysseus about Circe and gave Odysseus an herb called moly, making him resistant to Circe's magic. Odysseus forced Circe to change his men back to their human form, and was seduced by her. They remained with her for one year. Finally, guided by Circe's instructions, Odysseus and his crew crossed the ocean and reached a harbour at the western edge of the world, where Odysseus sacrificed to the dead. Odysseus summoned the spirit of the prophet Tiresias and was told that he may return home if he is able to stay himself and his crew from eating the sacred livestock of Helios on the island of Thrinacia and that failure to do so would result in the loss of his ship and his entire crew. For Odysseus' encounter with the dead, see Nekuia. Returning to Aeaea, they buried Elpenor and were advised by Circe on the remaining stages of the journey. They skirted the land of the Sirens. All of the sailors had their ears plugged up with beeswax, except for Odysseus, who was tied to the mast as he wanted to hear the song. He told his sailors not to untie him as it would only make him drown himself. They then passed between the six-headed monster Scylla and the whirlpool Charybdis. Scylla claimed six of his men. Next, they landed on the island of Thrinacia, with the crew overriding Odysseus's wishes to remain away from the island. Zeus caused a storm which prevented them from leaving, causing them to deplete the food given to them by Circe. While Odysseus was away praying, his men ignored the warnings of Tiresias and Circe and hunted the sacred cattle of Helios. The Sun God insisted that Zeus punish the men for this sacrilege. They suffered a shipwreck and all but Odysseus drowned. Odysseus clung to a fig tree. Washed ashore on Ogygia, he remained there as Calypso's lover. Return to Ithaca (books 13-20) Having listened to his story, the Phaeacians agree to provide Odysseus with more treasure than he would have received from the spoils of Troy. They deliver him at night, while he is fast asleep, to a hidden harbour on Ithaca. Odysseus awakens and believes that he has been dropped on a distant land before Athena appears to him and reveals that he is indeed on Ithaca. She hides his treasure in a nearby cave and disguises him as an elderly beggar so he can see how things stand in his household. He finds his way to the hut of one of his own slaves, swineherd Eumaeus, who treats him hospitably and speaks favorably of Odysseus. After dinner, the disguised Odysseus tells the farm laborers a fictitious tale of himself. Telemachus sails home from Sparta, evading an ambush set by the Suitors. He disembarks on the coast of Ithaca and meets Odysseus. Odysseus identifies himself to Telemachus (but not to Eumaeus), and they decide that the Suitors must be killed. Telemachus goes home first. Accompanied by Eumaeus, Odysseus returns to his own house, still pretending to be a beggar. He is ridiculed by the Suitors in his own home, especially Antinous. 
Odysseus meets Penelope and tests her intentions by saying he once met Odysseus in Crete. Closely questioned, he adds that he had recently been in Thesprotia and had learned something there of Odysseus's recent wanderings. Odysseus's identity is discovered by the housekeeper, Eurycleia, when she recognizes an old scar as she is washing his feet. Eurycleia tries to tell Penelope about the beggar's true identity, but Athena makes sure that Penelope cannot hear her. Odysseus swears Eurycleia to secrecy. Slaying of the Suitors (books 21-24) The next day, at Athena's prompting, Penelope maneuvers the Suitors into competing for her hand with an archery competition using Odysseus' bow. The man who could string the bow and shoot an arrow through a dozen axe heads would win. Odysseus takes part in the competition himself: he alone is strong enough to string the bow and shoot the arrow through the dozen axe heads, making him the winner. He then throws off his rags and kills Antinous with his next arrow. Odysseus kills the other Suitors, first using the rest of the arrows and then with swords and spears once both sides armed themselves. Once the battle is won, Telemachus also hangs twelve of their household maids whom Eurycleia identifies as guilty of betraying Penelope or having sex with the Suitors. Odysseus identifies himself to Penelope. She is hesitant but recognizes him when he mentions that he made their bed from an olive tree still rooted to the ground. Structure The Odyssey is 12,109 lines composed in dactylic hexameter, also called Homeric hexameter. It opens in medias res, in the middle of the overall story, with prior events described through flashbacks and storytelling. The 24 books correspond to the letters of the Greek alphabet; the division was likely made after the poem's composition by someone other than Homer, but is generally accepted. In the Classical period, some of the books (individually and in groups) were commonly given their own titles: Books 1–4: Telemachy—the story focuses on the perspective of Telemachus. Books 9–12: Apologoi—Odysseus recalls his adventures for his Phaeacian hosts. Book 22: Mnesterophonia ('slaughter of the suitors'). Book 22 concludes the Greek Epic Cycle, though fragments remain of the "alternative ending" of sorts known as the Telegony. The Telegony aside, the last 548 lines of the Odyssey, corresponding to Book 24, are believed by many scholars to have been added by a slightly later poet. Geography The events in the main sequence of the Odyssey (excluding Odysseus' embedded narrative of his wanderings) have been said to take place in the Peloponnese and in what are now called the Ionian Islands. There are difficulties in the apparently simple identification of Ithaca, the homeland of Odysseus, which may or may not be the same island that now bears that name. The wanderings of Odysseus as told to the Phaeacians, and the location of the Phaeacians' own island of Scheria, pose more fundamental problems, if geography is to be applied: scholars, both ancient and modern, are divided as to whether any of the places visited by Odysseus (after Ismaros and before his return to Ithaca) are real. Both ancient and modern scholars have attempted to map Odysseus' journey, but now largely agree that the landscapes, especially of the Apologia (Books 9 to 11), include too many mythological aspects as features to be uncontroversially mappable. Classicist Peter T. 
Struck created an interactive map which plots Odysseus' travels, including his near homecoming which was thwarted by the bag of wind. Influences Scholars have seen strong influences from Near Eastern mythology and literature in the Odyssey. Martin West notes substantial parallels between the Epic of Gilgamesh and the Odyssey. Both Odysseus and Gilgamesh are known for traveling to the ends of the earth, and on their journeys go to the land of the dead. On his voyage to the underworld, Odysseus follows instructions given to him by Circe, who is located at the edges of the world and is associated through imagery with the sun. Like Odysseus, Gilgamesh gets directions on how to reach the land of the dead from a divine helper: the goddess Siduri, who, like Circe, dwells by the sea at the ends of the earth, whose home is also associated with the sun. Gilgamesh reaches Siduri's house by passing through a tunnel underneath Mt. Mashu, the high mountain from which the sun comes into the sky. West argues that the similarity of Odysseus' and Gilgamesh's journeys to the edges of the earth are the result of the influence of the Gilgamesh epic upon the Odyssey. In 1914, paleontologist Othenio Abel surmised the origins of the Cyclops to be the result of ancient Greeks finding an elephant skull. The enormous nasal passage in the middle of the forehead could have looked like the eye socket of a giant, to those who had never seen a living elephant. Classical scholars, on the other hand, have long known that the story of the Cyclops was originally a folk tale, which existed independently of the Odyssey and which became part of it at a later date. Similar stories are found in cultures across Europe and the Middle East. According to this explanation, the Cyclops was originally simply a giant or ogre, much like Humbaba in the Epic of Gilgamesh. Graham Anderson suggests that the addition about it having only one eye was invented to explain how the creature was so easily blinded. Themes and patterns Homecoming Homecoming (Ancient Greek: νόστος, nostos) is a central theme of the Odyssey. Anna Bonafazi of the University of Cologne writes that, in Homer, nostos is "return home from Troy, by sea". Agatha Thornton examines nostos in the context of characters other than Odysseus, in order to provide an alternative for what might happen after the end of the Odyssey. For instance, one example is that of Agamemnon's homecoming versus Odysseus'. Upon Agamemnon's return, his wife Clytemnestra and her lover, Aegisthus kill Agamemnon. Agamemnon's son, Orestes, out of vengeance for his father's death, kills Aegisthus. This parallel compares the death of the suitors to the death of Aegisthus and sets Orestes up as an example for Telemachus. Also, because Odysseus knows about Clytemnestra's betrayal, Odysseus returns home in disguise in order to test the loyalty of his own wife, Penelope. Later, Agamemnon praises Penelope for not killing Odysseus. It is because of Penelope that Odysseus has fame and a successful homecoming. This successful homecoming is unlike Achilles, who has fame but is dead, and Agamemnon, who had an unsuccessful homecoming resulting in his death. Wandering Only two of Odysseus's adventures are described by the narrator. The rest of Odysseus' adventures are recounted by Odysseus himself. The two scenes described by the narrator are Odysseus on Calypso's island and Odysseus' encounter with the Phaeacians. 
These scenes are told by the poet to represent an important transition in Odysseus' journey: from being concealed to returning home. Calypso's name comes from a Greek word meaning 'to cover' or 'conceal', which is apt, as this is exactly what she does with Odysseus. Calypso keeps Odysseus concealed from the world and unable to return home. After Odysseus leaves Calypso's island, the poet describes his encounters with the Phaeacians—those who "convoy without hurt to all men"—which represent his transition from not returning home to returning home. Also, during Odysseus' journey, he encounters many beings that are close to the gods. These encounters are useful in understanding that Odysseus is in a world beyond that of men, and that this is part of why he cannot return home. These beings that are close to the gods include the Phaeacians, who lived near the Cyclopes and whose king, Alcinous, is the great-grandson of the king of the giants, Eurymedon, and the grandson of Poseidon. Some of the other characters that Odysseus encounters are the cyclops Polyphemus, the son of Poseidon; Circe, a sorceress who turns men into animals; and the cannibalistic giants, the Laestrygonians. Guest-friendship Throughout the course of the epic, Odysseus encounters several examples of xenia ("guest-friendship"), which provide models of how hosts should and should not act. The Phaeacians demonstrate exemplary guest-friendship by feeding Odysseus, giving him a place to sleep, and granting him many gifts and a safe voyage home, which are all things a good host should do. Polyphemus demonstrates poor guest-friendship. His only "gift" to Odysseus is that he will eat him last. Calypso also exemplifies poor guest-friendship because she does not allow Odysseus to leave her island. Another important aspect of guest-friendship is that kingship implies generosity. It is assumed that a king has the means to be a generous host and is more generous with his own property. This is best seen when Odysseus, disguised as a beggar, begs Antinous, one of the suitors, for food and Antinous denies his request. Odysseus essentially says that while Antinous may look like a king, he is far from a king since he is not generous. According to J. B. Hainsworth, guest-friendship follows a very specific pattern: The arrival and the reception of the guest. Bathing or providing fresh clothes to the guest. Providing food and drink to the guest. Questions may be asked of the guest and entertainment should be provided by the host. The guest should be given a place to sleep, and both the guest and host retire for the night. The guest and host exchange gifts, the guest is granted a safe journey home, and the guest departs. Another important factor of guest-friendship is not keeping the guest longer than they wish and also promising their safety while they are a guest within the host's home. Testing Another theme throughout the Odyssey is testing. This occurs in two distinct ways. Odysseus tests the loyalty of others and others test Odysseus' identity. An example of Odysseus testing the loyalties of others is when he returns home. Instead of immediately revealing his identity, he arrives disguised as a beggar and then proceeds to determine who in his house has remained loyal to him and who has helped the suitors. After Odysseus reveals his true identity, the characters test Odysseus' identity to see if he really is who he says he is. For instance, Penelope tests Odysseus' identity by saying that she will move the bed into the other room for him. 
This is a difficult task since it is made out of a living tree that would require being cut down, a fact that only the real Odysseus would know, thus proving his identity. For more information on the progression of testing type scenes, read more below. Testing also has a very specific type scene that accompanies it. Throughout the epic, the testing of others follows a typical pattern. This pattern is: Odysseus is hesitant to question the loyalties of others. Odysseus tests the loyalties of others by questioning them. The characters reply to Odysseus' questions. Odysseus proceeds to reveal his identity. The characters test Odysseus' identity. There is a rise of emotions associated with Odysseus' recognition, usually lament or joy. Finally, the reconciled characters work together. Omens Omens occur frequently throughout the Odyssey. Within the epic poem, they frequently involve birds. According to Thornton, most crucial is who receives each omen and in what way it manifests. For instance, bird omens are shown to Telemachus, Penelope, Odysseus, and the suitors. Telemachus and Penelope receive their omens as well in the form of words, sneezes, and dreams. However, Odysseus is the only character who receives thunder or lightning as an omen. She highlights this as crucial because lightning, as a symbol of Zeus, represents the kingship of Odysseus. Odysseus is associated with Zeus throughout both the Iliad and the Odyssey. Omens are another example of a type scene in the Odyssey. Two important parts of an omen type scene are the recognition of the omen, followed by its interpretation. In the Odyssey, all of the bird omens—with the exception of the first—show large birds attacking smaller birds. Accompanying each omen is a wish which can be either explicitly stated or only implied. For example, Telemachus wishes for vengeance and for Odysseus to be home, Penelope wishes for Odysseus' return, and the suitors wish for the death of Telemachus. Textual history Composition The date of the poem is a matter of some disagreement among classicists. In the middle of the 8th century BCE, the inhabitants of Greece began to adopt a modified version of the Phoenician alphabet to write down their own language. The Homeric poems may have been one of the earliest products of that literacy, and if so, would have been composed some time in the late 8th century BCE. Inscribed on a clay cup found in Ischia, Italy, are the words "Nestor's cup, good to drink from." Some scholars, such as Calvert Watkins, have tied this cup to a description of King Nestor's golden cup in the Iliad. If the cup is an allusion to the Iliad, that poem's composition can be dated to at least 700–750 BCE. Dating is similarly complicated by the fact that the Homeric poems, or sections of them, were performed regularly by rhapsodes for several hundred years. The Odyssey as it exists today is likely not significantly different. Aside from minor differences, the Homeric poems gained a canonical place in the institutions of ancient Athens by the 6th century. In 566 BCE, Peisistratos instituted a civic and religious festival called the Panathenaia, which featured performances of Homeric poems. These are significant because a "correct" version of the poems had to be performed, indicating that a particular version of the text had become canonised. Textual tradition The Iliad and the Odyssey were widely copied and used as school texts in lands where the Greek language was spoken throughout antiquity. 
Scholars may have begun to write commentaries on the poems as early as the time of Aristotle in the 4th century BCE. In the 3rd and 2nd centuries BCE, scholars affiliated with the Library of Alexandria—particularly Zenodotus of Ephesus and Aristarchus of Samothrace—edited the Homeric poems, wrote commentaries on them, and helped establish the canonical texts. The Iliad and the Odyssey remained widely studied and used as school texts in the Byzantine Empire during the Middle Ages. The Byzantine Greek scholar and archbishop Eustathios of Thessalonike (c. 1115–1195/6 AD) wrote exhaustive commentaries on both of the Homeric epics that became seen by later generations as authoritative; his commentary on the Odyssey alone spans nearly 2,000 oversized pages in a twentieth-century edition. The first printed edition of the Odyssey, known as the editio princeps, was produced in 1488 by the Greek scholar Demetrios Chalkokondyles, who had been born in Athens and had studied in Constantinople. His edition was printed in Milan by a Greek printer named Antonios Damilas. Since the late 19th century, many papyri containing fragments of the Odyssey have been found in Egypt, some with content different from later medieval versions. In 2018, the Greek Cultural Ministry revealed the discovery of a clay tablet near the Temple of Zeus at Olympia, containing 13 verses from the Odyssey 14th book. While it was initially reported to date from the 3rd century AD, the date still needs to be confirmed. English translations The poet George Chapman finished the first complete English translation of the Odyssey in 1614, which was set in rhyming couplets of iambic pentameter. Emily Wilson, a professor of classical studies at the University of Pennsylvania, noted that, as late as the first decade of the 21st century, almost all of the most prominent translators of Greek and Roman literature had been men. She called her experience of translating Homer one of "intimate alienation." Wilson writes that this has affected the popular conception of characters and events of the Odyssey, inflecting the story with connotations not present in the original text: "For instance, in the scene where Telemachus oversees the hanging of the slaves who have been sleeping with the suitors, most translations introduce derogatory language ("sluts" or "whores") [...] The original Greek does not label these slaves with derogatory language." In the original Greek, the word used is hai, the feminine article, equivalent to "those female people". Influence The influence of the Homeric texts can be difficult to summarise because of how greatly they have impacted the popular imagination and cultural values. The Odyssey and the Iliad formed the basis of education for members of ancient Mediterranean society. That curriculum was adopted by Western humanists, meaning the text was so much a part of the cultural fabric that it became irrelevant whether an individual had read it. As such, the influence of the Odyssey has reverberated through over a millennium of writing. The poem topped a poll of experts by BBC Culture to find literature's most enduring narrative. It is widely regarded by western literary critics as a timeless classic, and remains one of the oldest works of extant literature commonly read by Western audiences. 
Literature In Canto XXVI of the Inferno, Dante Alighieri meets Odysseus in the eighth circle of hell, where Odysseus himself appends a new ending to the Odyssey in which he never returns to Ithaca and instead continues his restless adventuring. Edith Hall suggests that Dante's depiction of Odysseus became understood as a manifestation of Renaissance colonialism and othering, with the cyclops standing in for "accounts of monstrous races on the edge of the world", and his defeat as symbolising "the Roman domination of the western Mediterranean". Irish poet James Joyce's modernist novel Ulysses (1922) was significantly influenced by the Odyssey. Joyce had encountered the figure of Odysseus in Charles Lamb's Adventures of Ulysses, an adaptation of the epic poem for children, which seems to have established the Latin name in Joyce's mind. Ulysses, a re-telling of the Odyssey set in Dublin, is divided into 18 sections ("episodes") which can be mapped roughly onto the 24 books of the Odyssey. Joyce claimed familiarity with the original Homeric Greek, but this has been disputed by some scholars, who cite his poor grasp of the language as evidence to the contrary. The book, and especially its stream of consciousness prose, is widely considered foundational to the modernist genre. Modern writers have revisited the Odyssey to highlight the poem's female characters. Canadian writer Margaret Atwood adapted parts of the Odyssey for her novella, The Penelopiad (2000). The novella focuses on Odysseus' wife, Penelope, and the twelve female slaves hanged by Odysseus at the poem's ending, an image which haunted her. Atwood's novella comments on the original text, wherein Odysseus' successful return to Ithaca symbolises the restoration of a patriarchal system. Similarly, Madeline Miller's Circe (2018) revisits the relationship between Odysseus and Circe on Aeaea. As a reader, Miller was frustrated by Circe's lack of motivation in the original poem, and sought to explain her capriciousness. The novel recontextualises the sorceress' transformations of sailors into pigs from an act of malice into one of self-defence, given that she has no superhuman strength with which to repel attackers. Film and television adaptations Ulysses (1954) is a film adaptation starring Kirk Douglas as Ulysses, Silvana Mangano as Penelope and Circe, and Anthony Quinn as Antinous. L'Odissea (1968) is an Italian-French-German-Yugoslavian television miniseries praised for its faithful rendering of the original epic. Ulysses 31 (1981) is a Japanese-French anime that updates the ancient setting into a 31st-century space opera. Nostos: The Return (1989) is an Italian film about Odysseus' homecoming. Directed by Franco Piavoli, it relies on visual storytelling and has a strong focus on nature. Ulysses' Gaze (1995), directed by Theo Angelopoulos, has many of the elements of the Odyssey set against the backdrop of the most recent and previous Balkan Wars. The Odyssey (1997) is a television miniseries directed by Andrei Konchalovsky and starring Armand Assante as Odysseus and Greta Scacchi as Penelope. O Brother, Where Art Thou? (2000) is a crime comedy-drama film written, produced, co-edited and directed by the Coen Brothers, and is very loosely based on Homer's poem. Opera and music Il ritorno d'Ulisse in patria, first performed in 1640, is an opera by Claudio Monteverdi based on the second half of Homer's Odyssey. 
Rolf Riehm composed an opera based on the myth, Sirenen – Bilder des Begehrens und des Vernichtens (Sirens – Images of Desire and Destruction), which premiered at the Oper Frankfurt in 2014. Robert W. Smith's second symphony for concert band, The Odyssey, tells four of the main highlights of the story in the piece's four movements: "The Iliad", "The Winds of Poseidon", "The Isle of Calypso", and "Ithaca". See also Aeneid Gulliver's Travels English translations of Homer List of literary cycles Odyssean gods Parallels between Virgil's Aeneid and Homer's Iliad and Odyssey Sinbad the Sailor Sunpadh The Voyage of Bran References Citations Bibliography Lattimore, Richmond, trans. 1975. The Odyssey of Homer. New York: Harper & Row. Further reading Austin, N. 1975. Archery at the Dark of the Moon: Poetic Problems in Homer’s Odyssey. Berkeley: University of California Press. Clayton, B. 2004. A Penelopean Poetics: Reweaving the Feminine in Homer's Odyssey. Lanham: Lexington Books. — 2011. "Polyphemus and Odysseus in the Nursery: Mother’s Milk in the Cyclopeia." Arethusa 44(3):255–77. Bakker, E. J. 2013. The Meaning of Meat and the Structure of the Odyssey. Cambridge: Cambridge University Press. Barnouw, J. 2004. Odysseus, Hero of Practical Intelligence. Deliberation and Signs in Homer's Odyssey. Lanham, MD: University Press of America. Dougherty, C. 2001. The Raft of Odysseus: The Ethnographic Imagination of Homer's Odyssey. New York: Oxford University Press. Fenik, B. 1974. Studies in the Odyssey. Hermes: Einzelschriften 30. Wiesbaden, West Germany: F. Steiner. Griffin, J. 1987. Homer: The Odyssey. Landmarks in World Literature. Cambridge: Cambridge University Press. Louden, B. 2011. Homer’s Odyssey and the Near East. Cambridge: Cambridge University Press. — 1999. The Odyssey: Structure, Narration and Meaning. Baltimore: Johns Hopkins University Press. Minchin, E. 2010. "The Expression of Sarcasm in the 'Odyssey'." Mnemosyne 63(4):533–56. Müller, W. G. 2015. "From Homer’s Odyssey to Joyce’s Ulysses: Theory and Practice of an Ethical Narratology." Arcadia 50(1):9–36. Perpinyà, Núria. 2008. Las criptas de la crítica. Veinte lecturas de la Odisea [The Crypts of Criticism: Twenty Interpretations of the 'Odyssey']. Madrid: Gredos. Lay summary via El Cultural (in Spanish). Reece, Steve. 1993. The Stranger's Welcome: Oral Theory and the Aesthetics of the Homeric Hospitality Scene. Ann Arbor: University of Michigan Press. — 2011. "Toward an Ethnopoetically Grounded Edition of Homer’s Odyssey." Oral Tradition 26:299–326. — 2011. "Penelope's Early Recognition’ of Odysseus from a Neoanalytic and Oral Perspective." College Literature 38(2):101–17. Saïd, S. 2011 [1998].. Homer and the Odyssey. New York: Oxford University Press. Turkeltaub, D. 2014. “Penelope's ‘Stout Hand’ and Odyssean Humour.” The Journal of Hellenic Studies 134:103–19. West, E. 2014. “Circe, Calypso, Hiḍimbā.” Journal of Indo-European Studies 42(1):144–74. External links The Odyssey (in Ancient Greek) on Perseus Project The Odyssey, trans. by A. T. Murray (1919) on Perseus Project BBC audio file — In our time BBC Radio 4 [discussion programme, 45 mins] The Odyssey Comix — A detailed retelling and explanation of Homer's Odyssey in comic-strip format by Greek Myth Comix The Odyssey — Annotated text and analyses aligned to Common Core Standards "Homer's Odyssey: A Commentary" by Denton Jaques Snider on Project Gutenberg 8th-century BC books Ancient Greek religion Epic Cycle Poems adapted into films Public domain books Pigs in literature Sequels
27118794
https://en.wikipedia.org/wiki/Kyocera%20Zio
Kyocera Zio
The Kyocera Zio (also known as SANYO Zio, also stylized ZIO, model numbers SCP-8600/M6000) is an Internet-enabled 3G smartphone manufactured by Kyocera, running Google's Android operating system. It was announced on March 23, 2010, and was expected to sell for a retail price between $169 and $216, with no carrier subsidies. As such, it was one of the lowest-cost smartphones running the Android operating system. Leap Wireless, a low-cost, prepaid CDMA-based wireless carrier in the US, announced on March 23, 2010 that it would introduce the Zio smartphone in late summer 2010. This would be the first Android smartphone offered by Leap Wireless or its Cricket Wireless subsidiary. Kyocera has stated that the phone can easily be upgraded to Android version 2.0 or 2.1, based on carrier wishes. Cricket Wireless released an update for the phone to Android 2.2 on February 28, 2011. The SANYO Zio became available to Sprint customers on October 10, 2010 with Android 2.1 (Eclair) and was one of the first devices to use Sprint's exclusive Sprint ID user interface. References External links Kyocera Zio M6000 Zio from Cricket Wireless Zio from Sprint Kyocera mobile phones Sanyo mobile phones Smartphones Touchscreen portable media players Mobile phones introduced in 2010 Android (operating system) devices Linux-based devices
21756219
https://en.wikipedia.org/wiki/List%20of%20HTTP%20status%20codes
List of HTTP status codes
This is a list of Hypertext Transfer Protocol (HTTP) response status codes. Status codes are issued by a server in response to a client's request made to the server. It includes codes from IETF Request for Comments (RFCs), other specifications, and some additional codes used in some common applications of the HTTP. The first digit of the status code specifies one of five standard classes of responses. The message phrases shown are typical, but any human-readable alternative may be provided. Unless otherwise stated, the status code is part of the HTTP/1.1 standard (RFC 7231). The Internet Assigned Numbers Authority (IANA) maintains the official registry of HTTP status codes. All HTTP response status codes are separated into five classes or categories. The first digit of the status code defines the class of response, while the last two digits do not have any classifying or categorization role. There are five classes defined by the standard: 1xx informational response – the request was received, continuing process 2xx successful – the request was successfully received, understood, and accepted 3xx redirection – further action needs to be taken in order to complete the request 4xx client error – the request contains bad syntax or cannot be fulfilled 5xx server error – the server failed to fulfil an apparently valid request 1xx informational response An informational response indicates that the request was received and understood. It is issued on a provisional basis while request processing continues. It alerts the client to wait for a final response. The message consists only of the status line and optional header fields, and is terminated by an empty line. As the HTTP/1.0 standard did not define any 1xx status codes, servers must not send a 1xx response to an HTTP/1.0 compliant client except under experimental conditions. 100 Continue The server has received the request headers and the client should proceed to send the request body (in the case of a request for which a body needs to be sent; for example, a POST request). Sending a large request body to a server after a request has been rejected for inappropriate headers would be inefficient. To have a server check the request's headers, a client must send Expect: 100-continue as a header in its initial request and receive a 100 Continue status code in response before sending the body. If the client receives an error code such as 403 (Forbidden) or 405 (Method Not Allowed) then it should not send the request's body. The response 417 Expectation Failed indicates that the request should be repeated without the Expect header as it indicates that the server does not support expectations (this is the case, for example, of HTTP/1.0 servers). 101 Switching Protocols The requester has asked the server to switch protocols and the server has agreed to do so. 102 Processing (WebDAV; RFC 2518) A WebDAV request may contain many sub-requests involving file operations, requiring a long time to complete the request. This code indicates that the server has received and is processing the request, but no response is available yet. This prevents the client from timing out and assuming the request was lost. 103 Early Hints (RFC 8297) Used to return some response headers before final HTTP message. 2xx success This class of status codes indicates the action requested by the client was received, understood, and accepted. 200 OK Standard response for successful HTTP requests. The actual response will depend on the request method used. 
In a GET request, the response will contain an entity corresponding to the requested resource. In a POST request, the response will contain an entity describing or containing the result of the action. 201 Created The request has been fulfilled, resulting in the creation of a new resource. 202 Accepted The request has been accepted for processing, but the processing has not been completed. The request might or might not be eventually acted upon, and may be disallowed when processing occurs. 203 Non-Authoritative Information (since HTTP/1.1) The server is a transforming proxy (e.g. a Web accelerator) that received a 200 OK from its origin, but is returning a modified version of the origin's response. 204 No Content The server successfully processed the request, and is not returning any content. 205 Reset Content The server successfully processed the request, asks that the requester reset its document view, and is not returning any content. 206 Partial Content (RFC 7233) The server is delivering only part of the resource (byte serving) due to a range header sent by the client. The range header is used by HTTP clients to enable resuming of interrupted downloads, or split a download into multiple simultaneous streams. 207 Multi-Status (WebDAV; RFC 4918) The message body that follows is by default an XML message and can contain a number of separate response codes, depending on how many sub-requests were made. 208 Already Reported (WebDAV; RFC 5842) The members of a DAV binding have already been enumerated in a preceding part of the (multistatus) response, and are not being included again. 226 IM Used (RFC 3229) The server has fulfilled a request for the resource, and the response is a representation of the result of one or more instance-manipulations applied to the current instance. 3xx redirection This class of status code indicates the client must take additional action to complete the request. Many of these status codes are used in URL redirection. A user agent may carry out the additional action with no user interaction only if the method used in the second request is GET or HEAD. A user agent may automatically redirect a request. A user agent should detect and intervene to prevent cyclical redirects. 300 Multiple Choices Indicates multiple options for the resource from which the client may choose (via agent-driven content negotiation). For example, this code could be used to present multiple video format options, to list files with different filename extensions, or to suggest word-sense disambiguation. 301 Moved Permanently This and all future requests should be directed to the given URI. 302 Found (Previously "Moved temporarily") Tells the client to look at (browse to) another URL. The HTTP/1.0 specification (RFC 1945) required the client to perform a temporary redirect with the same method (the original describing phrase was "Moved Temporarily"), but popular browsers implemented 302 redirects by changing the method to GET. Therefore, HTTP/1.1 added status codes 303 and 307 to distinguish between the two behaviours. 303 See Other (since HTTP/1.1) The response to the request can be found under another URI using the GET method. When received in response to a POST (or PUT/DELETE), the client should presume that the server has received the data and should issue a new GET request to the given URI. 304 Not Modified (RFC 7232) Indicates that the resource has not been modified since the version specified by the request headers If-Modified-Since or If-None-Match. 
In such a case, there is no need to retransmit the resource since the client still has a previously-downloaded copy. 305 Use Proxy (since HTTP/1.1) The requested resource is available only through a proxy, the address for which is provided in the response. For security reasons, many HTTP clients (such as Mozilla Firefox and Internet Explorer) do not obey this status code. 306 Switch Proxy No longer used. Originally meant "Subsequent requests should use the specified proxy." 307 Temporary Redirect (since HTTP/1.1) In this case, the request should be repeated with another URI; however, future requests should still use the original URI. In contrast to how 302 was historically implemented, the request method is not allowed to be changed when reissuing the original request. For example, a POST request should be repeated using another POST request. 308 Permanent Redirect (RFC 7538) This and all future requests should be directed to the given URI. 308 parallels the behaviour of 301, but does not allow the HTTP method to change. So, for example, submitting a form to a permanently redirected resource may continue smoothly. 4xx client errors This class of status code is intended for situations in which the error seems to have been caused by the client. Except when responding to a HEAD request, the server should include an entity containing an explanation of the error situation, and whether it is a temporary or permanent condition. These status codes are applicable to any request method. User agents should display any included entity to the user. 400 Bad Request The server cannot or will not process the request due to an apparent client error (e.g., malformed request syntax, size too large, invalid request message framing, or deceptive request routing). 401 Unauthorized (RFC 7235) Similar to 403 Forbidden, but specifically for use when authentication is required and has failed or has not yet been provided. The response must include a WWW-Authenticate header field containing a challenge applicable to the requested resource. See Basic access authentication and Digest access authentication. 401 semantically means "unauthorised": the user does not have valid authentication credentials for the target resource. Note: Some sites incorrectly issue HTTP 401 when an IP address is banned from the website (usually the website domain) and that specific address is refused permission to access a website. 402 Payment Required Reserved for future use. The original intention was that this code might be used as part of some form of digital cash or micropayment scheme, as proposed, for example, by GNU Taler, but that has not yet happened, and this code is not widely used. Google Developers API uses this status if a particular developer has exceeded the daily limit on requests. Sipgate uses this code if an account does not have sufficient funds to start a call. Shopify uses this code when the store has not paid its fees and is temporarily disabled. Stripe uses this code for failed payments where parameters were correct, for example blocked fraudulent payments. 403 Forbidden The request contained valid data and was understood by the server, but the server is refusing action. This may be due to the user not having the necessary permissions for a resource or needing an account of some sort, or attempting a prohibited action (e.g. creating a duplicate record where only one is allowed). 
This code is also typically used if the request provided authentication by answering the WWW-Authenticate header field challenge, but the server did not accept that authentication. The request should not be repeated. 404 Not Found The requested resource could not be found but may be available in the future. Subsequent requests by the client are permissible. 405 Method Not Allowed A request method is not supported for the requested resource; for example, a GET request on a form that requires data to be presented via POST, or a PUT request on a read-only resource. 406 Not Acceptable The requested resource is capable of generating only content not acceptable according to the Accept headers sent in the request. See Content negotiation. 407 Proxy Authentication Required (RFC 7235) The client must first authenticate itself with the proxy. 408 Request Timeout The server timed out waiting for the request. According to HTTP specifications: "The client did not produce a request within the time that the server was prepared to wait. The client MAY repeat the request without modifications at any later time." 409 Conflict Indicates that the request could not be processed because of conflict in the current state of the resource, such as an edit conflict between multiple simultaneous updates. 410 Gone Indicates that the resource requested is no longer available and will not be available again. This should be used when a resource has been intentionally removed and the resource should be purged. Upon receiving a 410 status code, the client should not request the resource in the future. Clients such as search engines should remove the resource from their indices. Most use cases do not require clients and search engines to purge the resource, and a "404 Not Found" may be used instead. 411 Length Required The request did not specify the length of its content, which is required by the requested resource. 412 Precondition Failed (RFC 7232) The server does not meet one of the preconditions that the requester put on the request header fields. 413 Payload Too Large (RFC 7231) The request is larger than the server is willing or able to process. Previously called "Request Entity Too Large". 414 URI Too Long (RFC 7231) The URI provided was too long for the server to process. Often the result of too much data being encoded as a query-string of a GET request, in which case it should be converted to a POST request. Called "Request-URI Too Long" previously. 415 Unsupported Media Type (RFC 7231) The request entity has a media type which the server or resource does not support. For example, the client uploads an image as image/svg+xml, but the server requires that images use a different format. 416 Range Not Satisfiable (RFC 7233) The client has asked for a portion of the file (byte serving), but the server cannot supply that portion. For example, if the client asked for a part of the file that lies beyond the end of the file. Called "Requested Range Not Satisfiable" previously. 417 Expectation Failed The server cannot meet the requirements of the Expect request-header field. 418 I'm a teapot (RFC 2324, RFC 7168) This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers. The RFC specifies this code should be returned by teapots requested to brew coffee. This HTTP status is used as an Easter egg in some websites, such as Google.com's I'm a teapot easter egg. 
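The interplay between 100 Continue and 417 Expectation Failed described in the entries above can be illustrated with a short sketch. The following Python fragment is a minimal illustration using only the standard socket module; the host, path, and body in the usage comment are placeholders, and a real client would need fuller response parsing and error handling. It sends only the headers of a POST request with Expect: 100-continue, transmits the body after an interim 100 response, and falls back when the server answers 417:

    import socket

    def post_with_expect(host, path, body):
        # Minimal sketch of the Expect: 100-continue handshake; not production code.
        with socket.create_connection((host, 80), timeout=10) as sock:
            head = ("POST {} HTTP/1.1\r\n"
                    "Host: {}\r\n"
                    "Content-Length: {}\r\n"
                    "Expect: 100-continue\r\n"
                    "Connection: close\r\n"
                    "\r\n").format(path, host, len(body))
            sock.sendall(head.encode("ascii"))            # send headers only, hold the body back
            reply = sock.recv(4096).decode("iso-8859-1")
            parts = reply.split(" ", 2)
            status = parts[1] if len(parts) > 1 else ""
            if status == "100":
                sock.sendall(body)                        # server agreed: now transmit the body
                final = sock.recv(4096).decode("iso-8859-1")
                return final.splitlines()[0] if final else ""
            if status == "417":
                return "417 Expectation Failed: retry without the Expect header"
            return reply.splitlines()[0] if reply else "" # server sent a final response at once

    # Hypothetical usage; host, path and body are placeholders:
    # print(post_with_expect("upload.example.com", "/upload", b"x" * 65536))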
421 Misdirected Request (RFC 7540) The request was directed at a server that is not able to produce a response (for example because of connection reuse). 422 Unprocessable Entity (WebDAV; RFC 4918) The request was well-formed but was unable to be followed due to semantic errors. 423 Locked (WebDAV; RFC 4918) The resource that is being accessed is locked. 424 Failed Dependency (WebDAV; RFC 4918) The request failed because it depended on another request and that request failed (e.g., a PROPPATCH). 425 Too Early (RFC 8470) Indicates that the server is unwilling to risk processing a request that might be replayed. 426 Upgrade Required The client should switch to a different protocol such as TLS/1.3, given in the Upgrade header field. 428 Precondition Required (RFC 6585) The origin server requires the request to be conditional. Intended to prevent the 'lost update' problem, where a client GETs a resource's state, modifies it, and PUTs it back to the server, when meanwhile a third party has modified the state on the server, leading to a conflict. 429 Too Many Requests (RFC 6585) The user has sent too many requests in a given amount of time. Intended for use with rate-limiting schemes. 431 Request Header Fields Too Large (RFC 6585) The server is unwilling to process the request because either an individual header field, or all the header fields collectively, are too large. 451 Unavailable For Legal Reasons (RFC 7725) A server operator has received a legal demand to deny access to a resource or to a set of resources that includes the requested resource. The code 451 was chosen as a reference to the novel Fahrenheit 451 (see the Acknowledgements in the RFC). 5xx server errors The server failed to fulfil a request. Response status codes beginning with the digit "5" indicate cases in which the server is aware that it has encountered an error or is otherwise incapable of performing the request. Except when responding to a HEAD request, the server should include an entity containing an explanation of the error situation, and indicate whether it is a temporary or permanent condition. Likewise, user agents should display any included entity to the user. These response codes are applicable to any request method. 500 Internal Server Error A generic error message, given when an unexpected condition was encountered and no more specific message is suitable. 501 Not Implemented The server either does not recognize the request method, or it lacks the ability to fulfil the request. Usually this implies future availability (e.g., a new feature of a web-service API). 502 Bad Gateway The server was acting as a gateway or proxy and received an invalid response from the upstream server. 503 Service Unavailable The server cannot handle the request (because it is overloaded or down for maintenance). Generally, this is a temporary state. 504 Gateway Timeout The server was acting as a gateway or proxy and did not receive a timely response from the upstream server. 505 HTTP Version Not Supported The server does not support the HTTP protocol version used in the request. 506 Variant Also Negotiates (RFC 2295) Transparent content negotiation for the request results in a circular reference. 507 Insufficient Storage (WebDAV; RFC 4918) The server is unable to store the representation needed to complete the request. 508 Loop Detected (WebDAV; RFC 5842) The server detected an infinite loop while processing the request (sent instead of 208 Already Reported). 
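Since 429 Too Many Requests and 503 Service Unavailable, described above, both signal temporary conditions, clients typically wait and retry, honouring the Retry-After header when one is supplied. A minimal sketch of such a client using only Python's standard library follows; the URL in the usage comment is a placeholder, and the sketch assumes Retry-After is given in seconds rather than as an HTTP-date:

    import time
    import urllib.error
    import urllib.request

    def fetch_with_retry(url, attempts=3):
        """GET a URL, retrying on 429/503 and honouring a Retry-After given in seconds."""
        for _ in range(attempts):
            try:
                with urllib.request.urlopen(url) as resp:
                    return resp.status, resp.read()
            except urllib.error.HTTPError as err:
                if err.code in (429, 503):
                    retry_after = err.headers.get("Retry-After", "1")
                    # Retry-After may also be an HTTP-date; this sketch only handles seconds.
                    time.sleep(float(retry_after) if retry_after.isdigit() else 1.0)
                    continue
                raise                          # all other error codes are passed through
        raise RuntimeError("gave up after repeated 429/503 responses")

    # Hypothetical usage; the URL is a placeholder:
    # status, body = fetch_with_retry("https://api.example.com/resource")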
510 Not Extended (RFC 2774) Further extensions to the request are required for the server to fulfil it. 511 Network Authentication Required (RFC 6585) The client needs to authenticate to gain network access. Intended for use by intercepting proxies used to control access to the network (e.g., "captive portals" used to require agreement to Terms of Service before granting full Internet access via a Wi-Fi hotspot). Unofficial codes The following codes are not specified by any standard. 419 Page Expired (Laravel Framework) Used by the Laravel Framework when a CSRF Token is missing or expired. 420 Method Failure (Spring Framework) A deprecated response used by the Spring Framework when a method has failed. 420 Enhance Your Calm (Twitter) Returned by version 1 of the Twitter Search and Trends API when the client is being rate limited; versions 1.1 and later use the 429 Too Many Requests response code instead. The phrase "Enhance your calm" comes from the 1993 movie Demolition Man, and its association with this number is likely a reference to cannabis. 430 Request Header Fields Too Large (Shopify) Used by Shopify, instead of the 429 Too Many Requests response code, when too many URLs are requested within a certain time frame. 450 Blocked by Windows Parental Controls (Microsoft) The Microsoft extension code indicated when Windows Parental Controls are turned on and are blocking access to the requested webpage. 498 Invalid Token (Esri) Returned by ArcGIS for Server. Code 498 indicates an expired or otherwise invalid token. 499 Token Required (Esri) Returned by ArcGIS for Server. Code 499 indicates that a token is required but was not submitted. 509 Bandwidth Limit Exceeded (Apache Web Server/cPanel) The server has exceeded the bandwidth specified by the server administrator; this is often used by shared hosting providers to limit the bandwidth of customers. 529 Site is overloaded Used by Qualys in the SSLLabs server testing API to signal that the site can't process the request. 530 Site is frozen Used by the Pantheon web platform to indicate a site that has been frozen due to inactivity. 598 (Informal convention) Network read timeout error Used by some HTTP proxies to signal a network read timeout behind the proxy to a client in front of the proxy. 599 Network Connect Timeout Error An error used by some HTTP proxies to signal a network connect timeout behind the proxy to a client in front of the proxy. Internet Information Services Microsoft's Internet Information Services (IIS) web server expands the 4xx error space to signal errors with the client's request. 440 Login Time-out The client's session has expired and must log in again. 449 Retry With The server cannot honour the request because the user has not provided the required information. 451 Redirect Used in Exchange ActiveSync when either a more efficient server is available or the server cannot access the users' mailbox. The client is expected to re-run the HTTP AutoDiscover operation to find a more appropriate server. IIS sometimes uses additional decimal sub-codes for more specific information, however these sub-codes only appear in the response payload and in documentation, not in the place of an actual HTTP status code. nginx The nginx web server software expands the 4xx error space to signal issues with the client's request. 444 No Response Used internally to instruct the server to return no information to the client and close the connection immediately. 
494 Request header too large Client sent too large request or too long header line. 495 SSL Certificate Error An expansion of the 400 Bad Request response code, used when the client has provided an invalid client certificate. 496 SSL Certificate Required An expansion of the 400 Bad Request response code, used when a client certificate is required but not provided. 497 HTTP Request Sent to HTTPS Port An expansion of the 400 Bad Request response code, used when the client has made a HTTP request to a port listening for HTTPS requests. 499 Client Closed Request Used when the client has closed the request before the server could send a response. Cloudflare Cloudflare's reverse proxy service expands the 5xx series of errors space to signal issues with the origin server. 520 Web Server Returned an Unknown Error The origin server returned an empty, unknown, or unexpected response to Cloudflare. 521 Web Server Is Down The origin server refused connections from Cloudflare. Security solutions at the origin may be blocking legitimate connections from certain Cloudflare IP addresses. 522 Connection Timed Out Cloudflare timed out contacting the origin server. 523 Origin Is Unreachable Cloudflare could not reach the origin server; for example, if the DNS records for the origin server are incorrect or missing. 524 A Timeout Occurred Cloudflare was able to complete a TCP connection to the origin server, but did not receive a timely HTTP response. 525 SSL Handshake Failed Cloudflare could not negotiate a SSL/TLS handshake with the origin server. 526 Invalid SSL Certificate Cloudflare could not validate the SSL certificate on the origin web server. Also used by Cloud Foundry's gorouter. 527 Railgun Error Error 527 indicates an interrupted connection between Cloudflare and the origin server's Railgun server. 530 Error 530 is returned along with a 1xxx error. AWS Elastic Load Balancer Amazon's Elastic Load Balancing adds a few custom return codes 460 Client closed the connection with the load balancer before the idle timeout period elapsed. Typically when client timeout is sooner than the Elastic Load Balancer's timeout. 463 The load balancer received an X-Forwarded-For request header with more than 30 IP addresses. 561 Unauthorized An error around authentication returned by a server registered with a load balancer. You configured a listener rule to authenticate users, but the identity provider (IdP) returned an error code when authenticating the user. Caching warning codes The following caching related warning codes are specified under RFC 7234. Unlike the other status codes above, these are not sent as the response status in the HTTP protocol, but as part of the "Warning" HTTP header. Since this header is often neither sent by servers nor acknowledged by clients, it will soon be obsoleted by the HTTP Working Group. 110 Response is Stale The response provided by a cache is stale (the content's age exceeds a maximum age set by a Cache-Control header or heuristically chosen lifetime). 111 Revalidation Failed The cache was unable to validate the response, due to an inability to reach the origin server. 112 Disconnected Operation The cache is intentionally disconnected from the rest of the network. 113 Heuristic Expiration The cache heuristically chose a freshness lifetime greater than 24 hours and the response's age is greater than 24 hours. 199 Miscellaneous Warning Arbitrary, non-specific warning. The warning text may be logged or presented to the user. 
214 Transformation Applied Added by a proxy if it applies any transformation to the representation, such as changing the content encoding, media type or the like. 299 Miscellaneous Persistent Warning Same as 199, but indicating a persistent warning. See also Custom error pages List of FTP server return codes List of HTTP header fields List of SMTP server return codes Common Log Format Notes References External links RFC 7231 – Hypertext Transfer Protocol (HTTP/1.1): Semantics and Content – Section 6, Response Status Codes Hypertext Transfer Protocol (HTTP) Status Code Registry Microsoft Knowledge Base: MSKB943891: The HTTP status codes in IIS 7.0 Microsoft Office Knowledge Base: Error Code 2–11 HTTP status codes
35774706
https://en.wikipedia.org/wiki/EM%20Client
EM Client
eM Client is a Windows- and macOS-based email client for sending and receiving emails, managing calendars, tasks, contacts, and notes. Live chat is integrated as well. It was developed as a user-friendly alternative to existing email clients and calendar solutions. eM Client was originally developed in 2006 and has been updated regularly since; the latest version of the software, 8.2, was released in March 2021. Features eM Client has a range of features for handling email, including advanced rules management, mass mail, delayed send, and a built-in translator for incoming and outgoing messages. It supports signatures, Quick Text, and tagging and categorization for easy searching. Watch for Replies and Snooze Email functions are available, as well as direct cloud attachments from cloud services like Dropbox, Google Drive, OneDrive, ownCloud or Nextcloud. eM Client also provides a lookup service for GnuPG public keys (eM Keybook) in order to more easily send encrypted communication via email, and generally simplify PGP encryption in email communication. Since eM Client 8.2, Online Meetings are supported (via Zoom, MS Teams, and Google Meet). eM Client allows extensive appearance customization (including a visual theme editor). Email support eM Client supports all major email platforms including Exchange, Gmail, G Suite, Office365, iCloud and any POP3, SMTP, IMAP or CalDAV server. Automatic setup works for Gmail, Exchange, Office 365, Outlook, iCloud, or other major email services. An auto-import option was recently added to transfer data from IncrediMail as well. Server compatibility eM Client is compatible with: G Suite (Gmail, Hangouts, and others) iCloud MS Office 365 MS Exchange IceWarp SmarterMail Kerio MDaemon Fastmail System requirements For Windows: Windows 7 or higher 350 MB of free space for installation (plus additional space for data, which can be stored on a different drive if needed; since eM Client has no limit on the number of emails/data stored in its database, the only limitation is the capacity of the user's hard drive) Minimum of 2 GB of RAM and 1.6 GHz CPU For macOS: OS X 10.11 and newer are supported; only the last three macOS versions are officially supported, but eM Client runs on El Capitan too eM Client requires Microsoft .NET Framework 2.0 to be installed. References Email clients Personal information managers Calendaring software
144589
https://en.wikipedia.org/wiki/Information%20warfare
Information warfare
Information warfare (IW) (as different from cyber warfare that attacks computers, software, and command control systems) is a concept involving the battlespace use and management of information and communication technology (ICT) in pursuit of a competitive advantage over an opponent. Information warfare is the manipulation of information trusted by a target without the target's awareness so that the target will make decisions against their interest but in the interest of the one conducting information warfare. As a result, it is not clear when information warfare begins, ends, and how strong or destructive it is. Information warfare may involve the collection of tactical information, assurance(s) that one's information is valid, spreading of propaganda or disinformation to demoralize or manipulate the enemy and the public, undermining the quality of the opposing force's information and denial of information-collection opportunities to opposing forces. Information warfare is closely linked to psychological warfare. The United States military focus tends to favor technology and hence tends to extend into the realms of electronic warfare, cyberwarfare, information assurance and computer network operations, attack, and defense. Most of the rest of the world use the much broader term of "Information Operations" which, although making use of technology, focuses on the more human-related aspects of information use, including (amongst many others) social network analysis, decision analysis, and the human aspects of command and control. Overview Information warfare can take many forms: Television, internet and radio transmission(s) can be jammed. Television, internet and radio transmission(s) can be hijacked for a disinformation campaign. Logistics networks can be disabled. Enemy communications networks can be disabled or spoofed, especially online social community in modern days. Stock exchange transactions can be sabotaged, either with electronic intervention, by leaking sensitive information or by placing disinformation. The use of drones and other surveillance robots or webcams. Communication management The U.S. Air Force has had Information Warfare Squadrons since the 1980s. In fact, the official mission of the U.S. Air Force is now "To fly, fight and win...in air, space and cyberspace", with the latter referring to its information warfare role. As the U.S. Air Force often risks aircraft and aircrews to attack strategic enemy communications targets, remotely disabling such targets using software and other means can provide a safer alternative. In addition, disabling such networks electronically (instead of explosively) also allows them to be quickly re-enabled after the enemy territory is occupied. Similarly, counter-information warfare units are employed to deny such capability to the enemy. The first application of these techniques was used against Iraqi communications networks in the Gulf War. Also during the Gulf War, Dutch hackers allegedly stole information about U.S. troop movements from U.S. Defense Department computers and tried to sell it to the Iraqis, who thought it was a hoax and turned it down. In January 1999, U.S. Air Intelligence computers were hit by a coordinated attack (Moonlight Maze), part of which came from a Russian mainframe. This could not be confirmed as a Russian cyber attack due to non-attribution – the principle that online identity may not serve as proof of real world identity. 
New battlefield The innovation of more advanced and autonomous ICTs has engendered a new revolution in military affairs, which encompasses nations' use of ICTs in both cyberspace and the physical battlefield to wage war against their adversaries. The three most prevalent revolutions in military affairs come in the form of cyberattacks, autonomous robots and communication management. Within the realm of cyberspace, there are two primary weapons: network-centric warfare and C4ISR, which denotes integrated Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance. Furthermore, cyberspace attacks initiated by one nation against another nation have an underlying goal of gaining information superiority over the attacked party, which includes disrupting or denying the victimized party's ability to gather and distribute information. A real-world occurrence that illustrated the dangerous potential of cyberattacks transpired in 2007, when a strike from Israeli forces demolished an alleged nuclear reactor in Syria that was being constructed via a collaborative effort between Syria and North Korea. The strike was accompanied by a cyberattack on Syria's air defenses, which left them blind to the attack on the nuclear reactor and ultimately allowed the attack to occur (New York Times 2014). An example of a more basic attack on a nation within cyberspace is a distributed denial of service (DDoS) attack, which is utilized to hinder networks or websites until they lose their primary functionality. As implied, cyberattacks do not just affect the military party being attacked, but rather the whole population of the victimized nation. Since more aspects of daily life are being integrated into networks in cyberspace, civilian populations can potentially be negatively affected during wartime. For example, if a nation chose to attack another nation's power grid servers in a specific area to disrupt communications, civilians and businesses in that area would also have to deal with power outages, which could potentially lead to economic disruptions as well. Moreover, physical ICTs have also been implemented into the latest revolution in military affairs by deploying new, more autonomous robots (e.g. unmanned drones) into the battlefield to carry out duties such as patrolling borders and attacking ground targets. Humans in remote locations pilot many of the unmanned drones; however, some of the more advanced robots, such as the Northrop Grumman X-47B, are capable of autonomous decisions. Despite piloting the drones from remote locations, a proportion of drone pilots still suffer from the stress factors of more traditional warfare. According to NPR, a study performed by the Pentagon in 2011 found that 29% of drone pilots are “burned out” and undergo high levels of stress. Furthermore, approximately 17% of the drone pilots surveyed in the study were labeled “clinically distressed”, with some of those pilots also showing signs of post-traumatic stress disorder. Modern ICTs have also brought advancements to communications management among military forces. Communication is a vital aspect of war for any involved party and, through the implementation of new ICTs such as data-enabled devices, military forces are now able to disseminate information faster than ever before. For example, some militaries are now employing the use of iPhones to upload data and information gathered by drones in the same area.
Legal and ethical concerns While information warfare has yielded many advances in the types of attack that a government can make, it has also raised concerns about the moral and legal ambiguities surrounding this particularly new form of war. Traditionally, wars have been analyzed by moral scholars according to just war theory. However, with Information Warfare, Just War Theory fails because the theory is based on the traditional conception of war. Information Warfare has three main issues surrounding it compared to traditional warfare: The risk for the party or nation initiating the cyberattack is substantially lower than the risk for a party or nation initiating a traditional attack. This makes it easier for governments, as well as potential terrorist or criminal organizations, to make these attacks more frequently than they could with traditional war. Information communication technologies (ICT) are so immersed in the modern world that a very wide range of technologies are at risk of a cyberattack. Specifically, civilian technologies can be targeted for cyberattacks and attacks can even potentially be launched through civilian computers or websites. As such, it is harder to enforce control of civilian infrastructures than a physical space. Attempting to do so would also raise many ethical concerns about the right to privacy, making defending against such attacks even tougher. The mass-integration of ICT into our system of war makes it much harder to assess accountability for situations that may arise when using robotic and/or cyber attacks. For robotic weapons and automated systems, it's becoming increasingly hard to determine who is responsible for any particular event that happens. This issue is exacerbated in the case of cyberattacks, as sometimes it is virtually impossible to trace who initiated the attack in the first place. Recently, legal concerns have arisen centered on these issues, specifically the issue of the right to privacy in the United States of America. Lt. General Keith B. Alexander, who served as the head of Cyber Command under President Barack Obama, noted that there was a "mismatch between our technical capabilities to conduct operations and the governing laws and policies" when writing to the Senate Armed Services Committee. A key point of concern was the targeting of civilian institutions for cyberattacks, to which the general promised to try to maintain a mindset similar to that of traditional war, in which they will seek to limit the impact on civilians. See also Active measures Black propaganda Character assassination Cyberwarfare Communications security Command and control warfare Disinformation Electronic warfare Historical revisionism Fake news Fifth Dimension Operations Gatekeeper (politics) Industrial espionage Information operations Internet manipulation Irregular warfare iWar Kompromat List of cyber warfare forces Network-centric warfare Political warfare Psychological warfare Public affairs (military) Public relations Storm botnet Transparency Group specific: Chinese information operations and information warfare Cyberwarfare in Russia Taliban propaganda White Paper on El Salvador US specific: Active Measures Working Group CIA COINTELPRO Edward Bernays Enemy Image, a documentary about the Pentagon's approach to news coverage of war Information Operations Roadmap Information Operations (United States) Pentagon military analyst program Special Activities Division Titan Rain References Bibliography Books Jerome Clayton Glenn, "Future Mind" Chapter 9. 
Defense p.195-201. Acropolis Books LTD, Washington, DC (1989) Winn Schwartau, "Information Warfare: Chaos on the Electronic Superhighway" Thunder's Mouth Press (1993) Winn Schwartau, ed, Information Warfare: Cyberterrorism: Protecting your personal security in the electronic age, Thunder's Mouth Press, 2nd ed, (1996) (). John Arquilla and David Ronfeldt, In Athena's Camp, RAND (1997). Dorothy Denning, Information Warfare and Security, Addison-Wesley (1998) (). James Adams, The Next World War: Computers are the Weapons and the Front line is Everywhere, Simon and Schuster (1998) (). Edward Waltz, Information Warfare Principles and Operations, Artech House, 1998, John Arquilla and David Ronfeldt, Networks and Netwars: The Future of Terror, Crime, and Militancy, RAND (2001) (). Ishmael Jones, The Human Factor: Inside the CIA's Dysfunctional Intelligence Culture, Encounter Books, New York (2010) (). Information/intelligence warfare. Gregory J. Rattray, Strategic Warfare in Cyberspace, MIT Press (2001) (). Anthony H. Cordesman, Cyber-threats, Information Warfare, and Critical Infrastructure Protection: DEFENDING THE US HOMELAND (2002) (). Leigh Armistead, Information Operations: The Hard Reality of Soft Power, Joint Forces Staff College and the National Security Agency (2004) (). Thomas Rid, War and Media Operations: The US Military and the Press from Vietnam to Iraq, Routledge (2007) (). Other Science at War: Information Warfare, The History Channel (1998). External links Resources Politically Motivated Computer Crime Cyberspace and Information Operations Study Center , Air University, U.S. Air Force. IWS - The Information Warfare Site Information Warfare Monitor - Tracking Cyberpower (University of Toronto, Canada/Munk Centre) Twitter: InfowarMonitor Information Warfare, I-War, IW, C4I, Cyberwar Federation of American Scientists - IW Resources Association of Old Crows http://www.crows.org The Electronic Warfare and Information Operations Association. C4I.org - Computer Security & Intelligence Information Warfare, Information Operations and Electronic Attack Capabilities Air Power Australia. Committee on Policy Consequences and Legal/Ethical Implications of Offensive Information Warfare , The National Academies. Program on Information and Warfare, Global Information Society Project, World Policy Institute. Information Warriors Information Warriors is web forum dedicated to the discussion of Navy Information Warfare. Mastermind Corporation Information Warfare Tactics Analysis Information Warfare in Biology Nature's Exploitation of Information to Win Survival Contests, Monash University, Computer Science. Course syllabi COSC 511 Information Warfare: Terrorism, Crime, and National Security @ Department of Computer Science, Georgetown University (1997–2002) (Dorothy Denning). CSE468 Information Conflict (Honours) @ School of Computer Science and Software Engineering, Monash University (2006) (Carlo Kopp). Information Warfare, Cyberterrorism, and Hacktivism from Cybercrime, Cyberterrorism and Digital Law Enforcement, New York Law School. Papers: research and theory Col Andrew Borden, USAF (Ret.), What is Information Warfare? Aerospace Power Chronicles (1999). Dr Carlo Kopp, A Fundamental Paradigm of Infowar (February 2000). Research & Theory Links, Cyberspace and Information Operations Study Center, Air War College, Air University, U.S. Air Force. Lachlan Brumley et al., Cutting Through the Tangled Web: An Information-Theoretic Perspective on Information Warfare (October 2012). 
Michael MacDonald (2012) "Black Logos: Rhetoric and Information Warfare", pages 189–220 in Literature, Rhetoric and Values: Selected Proceedings of a Conference held at University of Waterloo, 3–5 June 2011, editors Shelley Hulan, Murray McArthur and Randy Allen Harris, Cambridge Scholars Publishing . Taddeo, Mariarosaria (2012). Information Warfare: A Philosophical Perspective. Philosophy and Technology 25 (1):105-120. Inna, Vasilyeva. "The Value of Interaction for Russia, the USA and China Facing the Information Warfare." IJCWT 3.4 (2013): 1–9. . Papers: Other An essay on Information Operations by Zachary P. Hubbard News articles Army, Air Force seek to go on offensive in cyber war, GovExec.com (June 13, 2007). NATO says urgent need to tackle cyber attack, Reuters (June 14, 2007). America prepares for 'cyber war' with China, Telegraph.uk.co (June 15, 2007). NATO, US gear up for cyberpunk warfare, The Register (June 15, 2007). United States Department of Defense IO Doctrine Information Operations Roadmap (DOD 2003) Information Operations (JP 3-13 2006) Operations Security (JP 3-13.3) Military Deception (JP 3-13.4) Joint Doctrine for PSYOP (JP 3-53 2003) Joint Doctrine for Public Affairs (JP 3-61 2005) Destabilizing Terrorist Networks: Disrupting and Manipulating Information Flows in the Global War on Terrorism, Yale Information Society Project Conference Paper (2005). Seeking Symmetry in Fourth Generation Warfare: Information Operations in the War of Ideas, Presentation (PDF slides) to the Bantle - Institute for National Security and Counterterrorism (INSCT) Symposium, Syracuse University (2006). K. A. Taipale, Seeking Symmetry on the Information Front: Confronting Global Jihad on the Internet, 16 National Strategy F. Rev. 14 (Summer 2007). Propaganda in the United States Propaganda techniques using information Psychological warfare techniques Warfare post-1945 Disinformation
15891863
https://en.wikipedia.org/wiki/Southwestern%20Central%20High%20School
Southwestern Central High School
Southwestern Central High School is a public high school located at 600 Hunt Road in West Ellicott, New York, that is part of the Southwestern Central School District. It educates students in grades 9 through 12 from the surrounding areas of Busti, Celoron, Lakewood, and West Ellicott. Academics Among Western New York high schools, Southwestern ranked 35th of 131 in 2009, 36th in 2010, and 35th of 133 in 2011 in terms of academic performance. Athletics The Trojans compete in the fall in football, cross country, boys soccer, girls soccer, girls volleyball, girls tennis, and girls swimming and diving. Winter sports include basketball, wrestling, bowling, and boys swimming. Spring sports include track and field, boys tennis, baseball, softball, and golf. Swimming The Southwestern girls swim team won the CCAA league with an undefeated season. The girls also claimed the Section 6 Class C title in the 2011, 2012, 2013, and 2014 seasons. Football The football team won the New York State Public High School Athletic Association Class C championship in 2008. The Trojans completed the season with a 13–0 record. Coach Jay Sirianni was named New York State Class C Coach of the Year. Quarterback Zack Sopak and lineman Jasen Carlson were named to the first team All-State football team. In 2009 the Trojans (13–0) claimed their second straight New York State Public High School Athletic Association Class C football championship by downing Bronxville of Section 1, 40–14, at the Carrier Dome. Lineman Jasen Carlson was awarded the Trench trophy as the Best High School Lineman in Western New York, and Quarterback Zack Sopak was awarded the Connolly Cup as the most outstanding football player in Western New York. First Team All-State honors were awarded to Carlson, Sopak, Levi Bursch, and Ryan Buzzetto. From 2015 to 2019, the Trojans football team was coached by former NFL player Jehuu Caulcrick. Bowling In the 2011–12 season the boys and girls varsity bowling teams won the division one championship, led by seniors Jenny Goodwill, Jessica Watson, Stephen Pool, Tom Dietrick, Sam Nelson, Kim Bajdo, Nicole Vincent, Lauren Schweichler, and Hannah Johnston. Notable alumni Laura Kightlinger Amy King, flight attendant killed in the September 11 attacks J.C. Matteson, Army sergeant killed during the Second Battle of Fallujah Jackson Rohm Nick Sirianni, Philadelphia Eagles' head coach (2021–present) Aaron Swanson, U.S. Marine killed on active duty in Afghanistan. References External links Official website Public high schools in New York (state) Jamestown, New York Schools in Chautauqua County, New York 1951 establishments in New York (state) Educational institutions established in 1951
949758
https://en.wikipedia.org/wiki/GnuTLS
GnuTLS
GnuTLS (the GNU Transport Layer Security Library) is a free software implementation of the TLS, SSL and DTLS protocols. It offers an application programming interface (API) for applications to enable secure communication over the network transport layer, as well as interfaces to access X.509, PKCS #12, OpenPGP and other structures. Features GnuTLS consists of a library that allows client applications to start secure sessions using the available protocols. It also provides command-line tools, including an X.509 certificate manager, a test client and server, and random key and password generators. GnuTLS has the following features: TLS 1.3, TLS 1.2, TLS 1.1, TLS 1.0, and SSL 3.0 protocols Datagram TLS (DTLS) 1.2 and DTLS 1.0 protocols TLS-SRP: Secure remote password protocol (SRP) for TLS authentication TLS-PSK: Pre-shared key (PSK) for TLS authentication X.509 and OpenPGP certificate handling CPU-assisted cryptography and cryptographic accelerator support (/dev/crypto), VIA PadLock and AES-NI instruction sets Support for smart cards and for hardware security modules Storage of cryptographic keys in the system's Trusted Platform Module (TPM) History Origin GnuTLS was initially created around March 2003 by Nikos Mavrogiannopoulos to allow applications of the GNU Project to use secure protocols such as TLS. Although OpenSSL already existed, OpenSSL's license is not compatible with the GPL; thus software under the GPL, such as GNU software, could not use OpenSSL without making a GPL linking exception. License The GnuTLS library was originally licensed under the GNU Lesser General Public License v2, while included applications use the GNU General Public License. In August 2011 the library was updated to the LGPLv3. After it was noticed that the license change introduced new compatibility problems, especially with other free software, the license was, following further discussions, downgraded back to LGPLv2.1 in March 2013. Split from the GNU/FSF GnuTLS was created for the GNU Project, but in December 2012 its maintainer, Nikos Mavrogiannopoulos, dissociated the project from GNU after policy disputes with the Free Software Foundation. Richard Stallman opposed this move and suggested forking the project instead. Soon afterward, developer Paolo Bonzini ended his maintainership of GNU Sed and Grep, expressing concerns similar to those of GnuTLS maintainer Mavrogiannopoulos. Deployment Software packages using GnuTLS include(d): GNOME CenterIM Exim WeeChat Mutt Wireshark slrn Lynx CUPS gnoMint GNU Emacs Synology DiskStation Manager OpenConnect See also Comparison of TLS implementations wolfSSL (previously CyaSSL) mbed TLS (previously PolarSSL) List of free and open-source software packages Network Security Services References External links GNU Friends - An Interview with GNU TLS developer Nikos Mavroyanopoulos – a 2003 interview Fellowship interview with Simon Josefsson – a 2009 interview Cryptographic software GNU Project software Free security software Transport Layer Security implementation
28119486
https://en.wikipedia.org/wiki/4FFF%20N618
4FFF N618
4FFF N618 is a discontinued electronic-book reader developed by an Indian company, Condor Technology Associates, and based on a Linux platform. The device is sold under various brand names worldwide. Features The 4FFF N618 provides a 16-level grayscale SiPix touchscreen display for viewing digital content. Pages are turned using the buttons on the device. The N618 connects to the internet through available Wi-Fi connections. Users can read books without a wireless connection: disconnecting the wireless connection can prolong the battery's charge for up to 29 days. Specifications The display is an electronic paper touchscreen from SiPix with 800×600 pixels (4:3) on 6 inches (167 ppi density) and 16 levels of grayscale. CPU Samsung 2416 ARM9 @ 400 MHz OS Linux 2.6.23 Memory 128 MB (MDDR) 2 GB (NAND) External microSD/microSDHC (up to 16 GB) Connectivity Wi-Fi b/g microUSB high speed audio jack Miscellaneous 1530 mAh, 3.7 V 240 grams Reading mode 10,000 pages (Wi-Fi off) 3,000 pages (Wi-Fi on) Standby mode: 700 hours Formats supported Text ePUB HTML PDF RTF TXT Picture BMP JPEG PNG Audio MP3 Sold as The device is sold worldwide under various brand names. Asia India: eGriver Touch Europe France: OYO (as released in October 2010 by Medion, chapitre.com/Direct Group Bertelsmann) Germany, Austria, Switzerland: OYO (Medion, Thalia) Switzerland: ImCoSys ereader Italy: DeVo eVreader Netherlands: ProMedia eBook Reader; Icarus Sense E650SR; OYO (Medion, Selexyz) Poland: OYO (Medion, Empik) Spain: Nvsbl L337 / Booq Avant / Papyre 6.2 Russia: Mr. Book, One-XT, Айчиталка Bulgaria: Prestigio PER5062B Turkey: Reeder Reeder2 Across Europe available as: Icarus Sense Middle East Israel: E-vrit North America Canada: Pandigital Novel 6" Personal eReader United States: Qisda QD060B00 / Pandigital Novel 6" Personal eReader South America Brazil: Positivo Alfa (does not support audio files) Modification Because the device runs Linux-based software, its firmware can be modified or improved. The firmware is labeled as “QT Software” and varies from vendor to vendor. An upgrade package contains multiple image files with the .img extension, along with other system files. The firmware flashing process does not appear to check whether the version being installed is older than the current one, so firmware produced for one vendor's edition of the device can be installed on units sold by other vendors. Dual-boot The devices seem to contain a native dual-boot capability. When a device-specific key combination is pressed during power-on, any Linux kernel + ramdisk combination can be booted from the SD card. References External links Official website 4FFF.com 4FFF N618 User Manual (Dutch) MobileRead Wiki Oyo reader review (video) Reader review (blog) Dedicated e-book devices Electronic paper technology Linux-based devices
41066666
https://en.wikipedia.org/wiki/The%20Eye%20Tribe
The Eye Tribe
The Eye Tribe was a Danish startup company that produced eye tracking technology and sold it to software developers so that they could incorporate the eye tracking device into their applications and programs. The Eye Tribe’s software allowed a user to control a smartphone, tablet, or computer with eye movements alone. The company focused on a sleek appearance and a portable structure. History Sune Alstrup Johansen (CEO), Javier San Agustin, Martin Tall (CTO), and Henrik Skovsgaard are the four founders of the company, which was started in 2011. The four men met in 2006 at the IT University of Copenhagen. The four shared an ambition to create eye-tracking technology at an affordable cost, and soon acquired the rights and ownership of their ideas from the university where they were working and created their start-up company. The men initially named the company “Senseye”, later changing the name to “The Eye Tribe”. The company started to take off in 2011 when The Eye Tribe participated in the European Startup Bootcamp accelerator program. After the Startup Bootcamp, the company started to make its mark and become better known, earning a spot alongside five other companies in the “Cool Vendors in Human-Machine Interface, 2012” report by Gartner Inc. Later in 2012, The Eye Tribe received US$2.3 million from the Danish National Advanced Technology Foundation and another million from private European investors. The Eye Tribe also led a USD 4.4 million government-funded project, making it the major project for developing eye tracking for hand-held devices. On December 12, 2016, The Eye Tribe sent an email to its customer list, informing them that they "decided to go in a different direction with their technology and stopped development of their products." On 29 December, Facebook bought the company for its Oculus division in order to incorporate the technology into VR gaming. Technology The Eye Tribe was getting ready to send out the first shipments of its eye tracking technology. The Eye Tribe broke the record for the smallest eye tracker device in the world, measuring 20 × 1.9 × 1.9 cm. Also, the eye tracker does not require a separate power source, making it even more portable. The device uses a USB 3.0 connection, which allows it to run with most computers and tablets. The Eye Tribe tracker is compatible with Microsoft Windows 7 or newer and OS X, and the company was working on support for other major platforms, such as Android. The device was sold to developers with a simple software development kit supporting the C++, C# and Java programming platforms. The main components of the Eye Tribe tracker are a camera and a high-resolution infrared LED, which can easily be set up in a cell phone or mobile device. The Eye Tribe’s device uses a camera to track the user’s eye movement. The camera tracks even the most minuscule movements of the users’ pupils, by taking the images and running them through computer-vision algorithms. The algorithms read “on-screen gaze coordinates” and help the software to then determine where on the screen the user is looking. The algorithms also work with the hardware, camera sensor and light, to enhance the users’ experiences in many different kinds of light settings and environments, although the device works best indoors. Before using the eye tracking device, a calibration is needed in order for the device to find a user's pupils and identify unique eye characteristics needed to help enhance the accuracy of tracking one's gaze.
The tracker has an average accuracy of about 0.5 degrees of visual angle and can identify and follow the movement of an eye with sub-millimeter precision; on screen, this corresponds to an area around the size of a fingertip. Uses The Eye Tribe developed its eye tracking device in the hope that, in the near future, many products such as smartphones, tablets, and computers would carry Eye Tribe’s software. The company’s goal was for its eye tracking technology to become a household item and a common feature on most devices. In its demos, The Eye Tribe made it clear that it hoped the technology would become versatile, used for many things from games to work, and from browsing the web to security. A game most often used in the demos is Fruit Ninja, an application available on most smartphones (iPhones and Android devices). The game normally uses a touch screen to slice fruit, but with Eye Tribe technology the gamer would just look at the screen and use their gaze to play. Eye Tribe worked with other application designers to integrate the technology into other games as well. The Eye Tribe often demonstrated how its software works by showing someone scrolling down a web page by just staring at the screen. This exemplifies how the device can be hands-free when needed, making it easy and quick to read and browse the web. For example, a user watching a how-to video could pause or rewind it with their eyes while their hands are busy. Another example of eye tracking is security. Users can set a gaze-operated password, where they would have to look at certain parts of the screen in order to unlock the device. Some would argue that this is a more efficient and secure way to lock their devices. Closing down On December 12, 2016, The Eye Tribe sent the following email to their customers: An Update From The Eye Tribe "Thank you for supporting The Eye Tribe and ordering the world's first truly affordable eye tracker. It is customers like you that have helped us get to where we are today. Unfortunately, we've decided to go in a different direction with our technology and will stop development of our products. We thought you should hear this news directly from us. We thank you for the time you've spent in discussions. - The Eye Tribe Team" A similar email was sent to the users of their Eyeproof service: An Update From EyeProof "After several years of providing online eye tracking analysis, we are announcing that we are wrapping up EyeProof. On January 31st 2017 we will shut down our service and delete all data. No user data, account information, stimuli or recordings will be kept after this point. We want to make this closing as smooth as possible for everyone who uses EyeProof. If you have an active account you are able to download all your data before the closing date. We kindly ask you to navigate to the export section, which can be found on the right-side panel, for each of your studies and manually export your data and screenshots. The EyeProof documentation has a section explaining the different fields for the raw data, fixations, AOIs and area transitions for each resource: http://beta.eyeproof.net/docs/#export All the best and thank you for supporting EyeProof and contributing to the eye tracking community. Happy tracking! The EyeProof Team" There wasn't any further explanation provided as to the reason, or whether any support would be available to those who had already purchased the eye tracking device.
After the announcement, the company's website (theeyetribe.com) was taken down, and email sent to their support bounced back. A slightly different version of the company website was put back online a few days later, lacking any mention of the company ceasing operations. Neither the blog nor the website has any mentions of the ceasing of operations the company's customers had been notified of, leaving some uncertainty about the company's status or future. Acquisition It was reported recently that Oculus VR (owned by Facebook) acquired Eye Tribe and all of the employees and assets. References Software companies of Denmark
540289
https://en.wikipedia.org/wiki/Compatibility%20layer
Compatibility layer
In software engineering, a compatibility layer is an interface that allows binaries for a legacy or foreign system to run on a host system. This translates system calls for the foreign system into native system calls for the host system. With some libraries for the foreign system, this will often be sufficient to run foreign binaries on the host system. A hardware compatibility layer consists of tools that allow hardware emulation. Software Examples include: Wine, which runs some Microsoft Windows binaries on Unix-like systems using a program loader and the Windows API implemented in DLLs Windows's application compatibility layers to attempt to run poorly written applications or those written for earlier versions of the platform. Lina, which runs some Linux binaries on Windows, Mac OS X and Unix-like systems with native look and feel. KernelEX, which runs some Windows 2000/XP programs on Windows 98/Me. Executor, which runs 68k-based "classic" Mac OS programs in Windows, Mac OS X and Linux. Anbox, an Android compatibility layer for Linux. Columbia Cycada, which runs Apple iOS applications on Android systems Hybris, library that translates Bionic into glibc calls. Darling, a translation layer that attempts to run Mac OS X and Darwin binaries on Linux. Windows Subsystem for Linux v1, which runs Linux binaries on Windows via a compatibility layer which translates Linux system calls into native windows system calls. Cygwin, a POSIX-compatible environment that runs natively on Windows. 2ine, a project to run OS/2 application on Linux Rosetta 2, Apple's translation layer bundled with macOS Big Sur to allow x86-64 exclusive applications to run on ARM hardware. Compatibility layer in kernel: FreeBSD's Linux compatibility layer, which enables binaries built specifically for Linux to run on FreeBSD the same way as the native FreeBSD API layer. FreeBSD also has some Unix-like system emulations, including NDIS, NetBSD, PECoff, SVR4, and different CPU versions of FreeBSD. NetBSD has several Unix-like system emulations. Windows Subsystem for Linux provides a Linux-compatible kernel interface developed by Microsoft. The PEACE Project (aka COMPAT_PECOFF) has Win32 compatible layer for NetBSD. The project is now inactive. On RSTS/E for the PDP-11 series of minicomputers, programs written to run on the RT-11 operating system could run (without recompiling) on RSTS through the RT-11 Run-Time System having its EMT flag set, meaning that an RT-11 EMT instruction that matches a RSTS EMT is diverted to the RT-11 Run-Time System which translates them to the equivalent RSTS EMT. Programs written to take advantage of RSTS directly (or calls to RSTS within the Run-Time system itself) signal this by having a second EMT instruction (usually EMT 255) immediately before the actual RSTS EMT code. A compatibility layer avoids both the complexity and the speed penalty of full hardware emulation. Some programs may even run faster than the original, e.g. some Linux applications running on FreeBSD's Linux compatibility layer may perform better than the same applications on Red Hat Linux. Benchmarks are occasionally run on Wine to compare it to Windows NT-based operating systems. Even on similar systems, the details of implementing a compatibility layer can be quite intricate and troublesome; a good example is the IRIX binary compatibility layer in the MIPS architecture version of NetBSD. A compatibility layer requires the host system's CPU to be (upwardly) compatible to that of the foreign system. 
For example, a Microsoft Windows compatibility layer is not possible on PowerPC hardware because Windows requires an x86 CPU. In this case full emulation is needed. Hardware Hardware compatibility layers involve tools that allow hardware emulation. Some hardware compatibility layers involve breakout boxes because breakout boxes can provide compatibility for certain computer buses that are otherwise incompatible with the machine. See also Hypervisor Paravirtualization Emulator Cross-platform virtualization Computing platform Shim (computing) Driver wrapper Glue code References External links Windows XP Application Compatibility Technologies (Dave Morehouse and Todd Phillips, Microsoft Corporation, 1 June 2001)
42088557
https://en.wikipedia.org/wiki/DADiSP
DADiSP
DADiSP (Data Analysis and Display, pronounced day-disp) is a numerical computing environment developed by DSP Development Corporation which allows one to display and manipulate data series, matrices and images with an interface similar to a spreadsheet. DADiSP is used in the study of signal processing, numerical analysis, statistical and physiological data processing. Interface DADiSP is designed to perform technical data analysis in a spreadsheet like environment. However, unlike a typical business spreadsheet that operates on a table of cells each of which contain single scalar values, a DADiSP Worksheet consists of multiple interrelated windows where each window contains an entire series or multi-column matrix. A window not only stores the data, but also displays the data in several interactive forms, including 2D graphs, XYZ plots, 3D surfaces, images and numeric tables. Like a traditional spreadsheet, the windows are linked such that a change to the data in one window automatically updates all dependent windows both numerically and graphically. Users manipulate data primarily through windows. A DADiSP window is normally referred to by the letter "W" followed by a window number, as in "W1". For example, the formula W1: 1..3 assigns the series values {1, 2, 3} to "W1". The formula W2: W1*W1 sets a second window to compute the square of each value in "W1" such that "W2" will contain the series {1, 4, 9}. If the values of "W1" change to {3, 5, 2, 4}, the values of "W2" automatically update to {9, 25, 4, 16}. Programming language DADiSP includes a series based programming language called SPL (Series Processing Language) used to implement custom algorithms. SPL has a C/C++ like syntax and is incrementally compiled into intermediate bytecode, which is executed by a virtual machine. SPL supports both standard variables assigned with = and "hot" variables assigned with :=. For example, the statement A = 1..3 assigns the series {1, 2, 3} to the standard variable "A". The square of the values can be assigned with B = A * A. Variable "B" contains the series {1, 4, 9}. If "A" changes, "B" does not change because "B" preserves the values as assigned without regard to the future state of "A". However, the statement A := 1..3 creates a "hot" variable. A hot variable is analogous to a window, except hot variables do not display their data. The assignment B := A * A computes the square of the values of "A" as before, but now if "A" changes, "B" automatically updates. Setting A = {3, 5, 2, 4} causes "B" to automatically update with {9, 25, 4, 16}. History DADiSP was originally developed in the early 1980s as part of a research project at MIT to explore the aerodynamics of Formula One racing cars. The original goal of the project was to enable researchers to quickly explore data analysis algorithms without the need for traditional programming. 
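The dependency propagation behind SPL's "hot" variables, described under Programming language above, can be sketched for illustration in ordinary Python; the class and method names below are invented for this example and are not part of DADiSP or SPL:

class HotVar:
    """A value that pushes updates to its dependents when it changes."""
    def __init__(self, values):
        self._values = list(values)
        self._dependents = []                    # (dependent HotVar, function) pairs

    @property
    def values(self):
        return list(self._values)

    def set(self, values):
        self._values = list(values)
        for child, func in self._dependents:
            child.set(func(self._values))        # recompute downstream values

    def derive(self, func):
        """Create a dependent HotVar computed by func and kept up to date."""
        child = HotVar(func(self._values))
        self._dependents.append((child, func))
        return child

a = HotVar([1, 2, 3])
b = a.derive(lambda xs: [x * x for x in xs])     # behaves like  B := A * A
print(b.values)                                  # [1, 4, 9]
a.set([3, 5, 2, 4])
print(b.values)                                  # [9, 25, 4, 16]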
Version history DADiSP 6.7 B02, Jan 2017 DADiSP 6.7 B01, Oct 2015 DADiSP 6.5 B05, Dec 2012 DADiSP 6.5, May 2010 DADiSP 6.0, Sep 2002 DADiSP 5.0, Oct 2000 DADiSP 4.1, Dec 1997 DADiSP 4.0, Jul 1995 DADiSP 3.01, Feb 1993 DADiSP 2.0, Feb 1992 DADiSP 1.05, May 1989 DADiSP 1.03, Apr 1987 See also List of numerical-analysis software Comparison of numerical-analysis software References Further reading Allen Brown, Zhang Jun: First Course In Digital Signal Processing Using DADiSP, Abramis, Charles Stephen Lessard: Signal Processing of Random Physiological Signals (Google eBook), Morgan & Claypool Publishers External links DSP Development Corporation (DADiSP vendor) DADiSP Online Help DADiSP Tutorials Getting Started with DADiSP Introduction to DADiSP Data analysis software Data-centric programming languages Data mining and machine learning software Numerical linear algebra Data visualization software Statistical programming languages C (programming language) software Software modeling language
4102341
https://en.wikipedia.org/wiki/Church%20software
Church software
Church software is any type of computer software specifically designed for use by a church. There are administrative packages tailored to handle membership databases and finances, and also worship presentation programs to generate images for video projectors. Worship presentation software A worship presentation program is a specialised presentation program designed for displaying images (primarily song lyrics, often over video or cinemagraph backgrounds) during some forms of Christian worship. Some programs include other features to help plan the service or schedule participants. Programs are available commercially, as shareware, and as free open source software (for example, OpenLP). Church management software Church management software is specialized software that assists churches and other religious organizations in organizing and automating daily operations. These packages typically assist in the management of membership and mailings, fundraising, events, report generation, and bulletin publishing. Churches use the packages to reduce the cost of operations and track the growth in their congregations. The growth in the church management software business coincides with the growing trend of using computers for religious activity. In the UK, increased usage of such software is attributed to data management requirements such as GDPR. Larger systems allow multi-user access, with security options to protect confidentiality. Flexible features to keep and report information on attendance and pastoral visits can help church staff manage members. Using a purpose-made package guards against relying on the knowledge of a specific individual to maintain a custom database. However, different church management applications vary significantly from one another, and what works well for one church may not fit the needs of another. Free open source church management systems are also available. See also Bible software Contemporary worship music References Administrative software Christian software Presentation software
6407876
https://en.wikipedia.org/wiki/MapInfo%20Professional
MapInfo Professional
MapInfo Pro is a desktop geographic information system (GIS) software product produced by Precisely (formerly: Pitney Bowes Software and MapInfo Corporation) and used for mapping and location analysis. MapInfo Pro allows users to visualize, analyze, edit, interpret, understand and output data to reveal relationships, patterns, and trends. MapInfo Pro allows users to explore spatial data within a dataset, symbolize features, and create maps. History Version 4 of the product, released in 1995, saw the product renamed to "MapInfo Professional". Version 9.5 was released in June 2008. Version 9.5.1 was released in December 2008. The primary enhancements in these releases included the use of a new graphics engine which allows for translucency and anti-aliasing when displaying maps. A set of CAD like editing tools were also added in this release. Version 10 was released in June 2009. The primary enhancements included a more intuitive user interface, including a rewritten Layer Control dialog box, compatibility with PostGIS and a PDF generator that supports both Layered and georeference PDF files. Version 10.5 was released in May 2010. The primary enhancements included a new Table Manager window, a built in ability to publish to MapInfo Stratus, ability to ingest Bing Maps directly as background mapping and enhanced support for Catalog Service for the Web (CSW). Version 11 was released in June 2011. The primary enhancement included performance tuning and usability improvements on the Browser window for creating and analysing tabular data. Integration with MapInfo Manager, a product for managing spatial data and providing [INSPIRE] compliance. Support for 64 bit operating systems was improved with the ability to use up to 4 GB of RAM (instead of 2GB, the limit when running on 32 bit operating systems). Version 11.5 was released in June 2012. The primary enhancements include a new window for Creating Legends, further enhancements to the new Browser window (introduced in v11.0) and further integration with MapInfo Manager, including the ability to edit metadata within the Catalog Browser. Version 12 was released in June 2013, with improvements to Cartographic Output; Support for Windows 8, SQL Server 2012, PostGIS2; and a new In-Product Notifications feature utilizing RSS. Version 12.5 of MapInfo Pro was the first time that a 64 bit version of the product was introduced. MapInfo Pro 12.5 32 bit was released in July 2014 and 64 bit in October 2014. The 64 bit release saw the introduction of a new ribbon UI and layout window, as well allowing for a new framework to handle background processing and multi-threading. Version 15 of MapInfo Pro 32 bit was released in June 2015 and 64 bit (15.2) was released in October 2015. Highlights include geopackage support as well as changes to the TAB file format to allow larger files and Unicode. The 64 bit version of 15.2 saw the introduction of MapInfo Pro Advanced as a new licensing level for the product which incorporates all new raster capabilities into the product including a .NET SDK. MapInfo Pro Advanced allows users to visualize very large raster files at high resolution such as 1m for a whole country and incorporating multiple satellite bands. This is achieved using a new multi resolution raster file format (.mrr). Version 16 of MapInfo Pro 64 bit was released in September 2016. Notable features include redesigned Ribbon interface, new interactive interface for thematic mapping, WFS 2.0 and WMTS support, Geopackage support. 
An all-new 64-bit version of EasyLoader is included with the release. Version 17.0 of MapInfo Pro 64 bit was released in April 2018. Python support was added. Version 2019 of MapInfo Pro 64 bit was released in November 2019. A much-extended SQL capability is a key new feature. The parent company was rebranded as Precisely by its new owner, Syncsort. Uses MapInfo Pro is a 64-bit GIS (Geographic Information System) application used by GIS engineers and business analysts. Industry examples include: Insurance – Analyze exposure to risk from environmental or natural hazards such as floods, tornadoes, hurricanes or crime. Perform demographic and risk analysis to determine the best target locations to acquire new potential policy holders. Environment – Analyze and assess environmental impacts such as pollution, erosion, invasive species, and climate changes including human-induced changes to the environment. Engineering – Coordinate with local planning and engineering groups for construction projects. Assist related groups by helping them understand environmental impacts or locations of public or utility infrastructure such as water, gas and electrical services. Telco – Produce coverage maps, visualize gaps in coverage, plan for additional coverage. Maximize new investment based on demographics, local terrain and available real estate for cell tower sites. Marketing - The application of location intelligence to identify geographic areas in which to deliver marketing. Retail Site Selection - Determining the optimum location to open or close a site (store, factory, depot etc.). The selection process is typically based on customer or worker locations, demographics, buying patterns, transport links, and nearby facilities. Crime Analysis - Systematic analysis of spatial data for identifying and analyzing patterns and trends in crime and disorder. Mineral Exploration - Visualisation of spatial data such as drill holes, soil samples, geophysical survey data, tenement boundaries and cadastral data. System Features Data Format --- MapInfo Pro is a database which manages information as a system of Tables. Each table is either a map file (graph) or a database file (text) and is denoted by the file extension .TAB. MapInfo creates a visual display of the data in the form of a map (map window) and/or tabular form (browser window). Once data has been referenced in a table it is assigned X and Y coordinates so that the records can be displayed as objects on a map. This is known as Geocoding. Objects (points, lines, polygons) can be enhanced to highlight specific variations on a theme through the creation of a Thematic map. The basic data is overlaid with graphic styles (e.g. colour shades, hatch patterns) to display information on a more sophisticated level. For example, population density between urban and rural areas may show the cities in deep red (to indicate a high ratio of inhabitants per square mile), while showing remote areas in very pale red (to indicate a low concentration of inhabitants). Retrieval of information is conducted using data filters and "Query" functions. Selecting an object in a map window or records in a browser produces a temporary table that provides a range of values specified by the end-user. More advanced "Structured Query Language" (SQL) analysis allows the user to combine a variety of operations to derive answers to complex questions. This may involve a combination of tables, and the resulting calculations may include the number of points in polygons, proportional overlaps, and statistical breakdowns.
The quantity and quality of the attributes associated with objects are dependent on the structure of the original tables. Vector analysis is a primary function of MapInfo based on X, Y coordinates and the user can create and edit data directly with commands such as: node editing, combine, split, erase, buffer, clip region. MapInfo Pro includes a range of engineering “CAD like” drawing and editing tools such as lines, circles, and polygons (referred to as "regions") which can be incorporated into tables or drawn as temporary overlays. Printout of MapInfo maps and/or statistics is managed through design settings in the Layout Window. Layout design enables the creation of composite presentations with maps, tables, legends, text, images, lines and shapes. Output hardware includes large format plotters and high spec. business printers. Data from MapInfo may be embedded into applications such as Microsoft PowerPoint or Word using copy/paste commands and resized as required. Compatibility with External Software Systems --- MapInfo Pro can read and write other file formats for data exchange with applications such as: ESRI Shapefile and AutoCAD DXF CSV and delimited ASCII text Microsoft Excel and Microsoft Access Bitmaps or Raster Formats such as GeoTIFF, ECW, Mr. SID, JPEG, PNG, MRR Spatial Databases: Oracle, PostGIS, SQL Server, SQLite and GeoPackage Open Geospatial Consortium Web Services: Web Feature Service, Web Map Service, Catalog Service for the Web Web Base Maps: Bing, Open StreetMap (OSM) Historical Notes With MapInfo Professional, the Sydney Organising Committee for the Olympic Games (SOCOG) created hundreds of maps for the longest torch relay in the history of the modern games. The Olympic Torch Relay covered 26,940 kilometres (16,740 miles) in 100 days and traversed Australia by road, railway and boat. The torch route was designed to ensure that more than 85 percent of the Australian population was within a one-hour drive of the chosen route, which passed through 1,000 towns. In addition, TNT Express used MapInfo to map more than 5,500 delivery routes to deliver Olympic tickets to more than 400,000 Australian homes. See also MapBasic MapInfo TAB format List of GIS software References External links Pitney Bowes Software's MapInfo Professional Support Page LI360, MapInfo user community Review: Mapping the world with GIS wares MapInfo Discussion List Directions Magazine review of MapInfo Professional v11 GIS software
351541
https://en.wikipedia.org/wiki/Virtual%20Network%20Computing
Virtual Network Computing
In computing, Virtual Network Computing (VNC) is a graphical desktop-sharing system that uses the Remote Frame Buffer protocol (RFB) to remotely control another computer. It transmits the keyboard and mouse input from one computer to another, relaying the graphical-screen updates, over a network. VNC is platform-independent – there are clients and servers for many GUI-based operating systems and for Java. Multiple clients may connect to a VNC server at the same time. Popular uses for this technology include remote technical support and accessing files on one's work computer from one's home computer, or vice versa. VNC was originally developed at the Olivetti & Oracle Research Lab in Cambridge, United Kingdom. The original VNC source code and many modern derivatives are open source under the GNU General Public License. There are a number of variants of VNC which offer their own particular functionality; e.g., some optimised for Microsoft Windows, or offering file transfer (not part of VNC proper), etc. Many are compatible (without their added features) with VNC proper in the sense that a viewer of one flavour can connect with a server of another; others are based on VNC code but not compatible with standard VNC. VNC and RFB are registered trademarks of RealVNC Ltd. in the US and some other countries. History The Olivetti & Oracle Research Lab (ORL) at Cambridge in the UK developed VNC at a time when Olivetti and Oracle Corporation owned the lab. In 1999, AT&T acquired the lab, and in 2002 closed down the lab's research efforts. Developers who worked on VNC while still at the AT&T Research Lab include: Tristan Richardson (inventor) Andy Harter (project leader) Quentin Stafford-Fraser James Weatherall Andy Hopper Following the closure of ORL in 2002, several members of the development team (including Richardson, Harter, Weatherall and Hopper) formed RealVNC in order to continue working on open-source and commercial VNC software under that name. The original GPLed source code has fed into several other versions of VNC. Such forking has not led to compatibility problems because the RFB protocol is designed to be extensible. VNC clients and servers negotiate their capabilities with handshaking in order to use the most appropriate options supported at both ends. , RealVNC Ltd claims the term "VNC" as a registered trademark in the United States and in other countries. Etymology The name Virtual Network Computer/Computing (VNC) originated with ORL's work on a thin client called the Videotile, which also used the RFB protocol. The Videotile had an LCD display with pen input and a fast ATM connection to the network. At the time, network computer was commonly used as a synonym for a thin client; VNC is essentially a software-only (i.e. virtual) network computer. Operation The VNC server is the program on the machine that shares some screen (and may not be related to a physical display – the server can be "headless"), and allows the client to share control of it. The VNC client (or viewer) is the program that represents the screen data originating from the server, receives updates from it, and presumably controls it by informing the server of collected local input. The VNC protocol (RFB protocol) is very simple, based on transmitting one graphic primitive from server to client ("Put a rectangle of pixel data at the specified X,Y position") and event messages from client to server. In the normal method of operation a viewer connects to a port on the server (default port: 5900). 
Alternatively (depending on the implementation) a browser can connect to the server (default port: 5800). And a server can connect to a viewer in "listening mode" on port 5500. One advantage of listening mode is that the server site does not have to configure its firewall to allow access on port 5900 (or 5800); the duty is on the viewer, which is useful if the server site has no computer expertise and the viewer user is more knowledgeable. The server sends small rectangles of the framebuffer to the client. In its simplest form, the VNC protocol can use a lot of bandwidth, so various methods have been devised to reduce the communication overhead. For example, there are various encodings (methods to determine the most efficient way to transfer these rectangles). The VNC protocol allows the client and server to negotiate which encoding they will use. The simplest encoding, supported by all clients and servers, is raw encoding, which sends pixel data in left-to-right scanline order, and after the original full screen has been transmitted, transfers only rectangles that change. This encoding works very well if only a small portion of the screen changes from one frame to the next (as when a mouse pointer moves across a desktop, or when text is written at the cursor), but bandwidth demands get very high if a lot of pixels change at the same time (such as when scrolling a window or viewing full-screen video). VNC by default uses TCP port 5900+N, where N is the display number (usually :0 for a physical display). Several implementations also start a basic HTTP server on port 5800+N to provide a VNC viewer as a Java applet, allowing easy connection through any Java-enabled web-browser. Different port assignments can be used as long as both client and server are configured accordingly. A HTML5 VNC client implementation for modern browsers (no plugins required) exists too. Although possible even on low bandwidth, using VNC over the Internet is facilitated if the user has a broadband connection at both ends. However, it may require advanced network address translation (NAT), firewall and router configuration such as port forwarding in order for the connection to go through. Users may establish communication through virtual private network (VPN) technologies to ease usage over the Internet, or as a LAN connection if VPN is used as a proxy, or through a VNC repeater (useful in presence of a NAT). Xvnc is the Unix VNC server, which is based on a standard X server. To applications, Xvnc appears as an X "server" (i.e., it displays client windows), and to remote VNC users it is a VNC server. Applications can display themselves on Xvnc as if it were a normal X display, but they will appear on any connected VNC viewers rather than on a physical screen. Alternatively, a machine (which may be a workstation or a network server) with screen, keyboard, and mouse can be set up to boot and run the VNC server as a service or daemon, then the screen, keyboard, and mouse can be removed and the machine stored in an out-of-the way location. In addition, the display that is served by VNC is not necessarily the same display seen by a user on the server. On Unix/Linux computers that support multiple simultaneous X11 sessions, VNC may be set to serve a particular existing X11 session, or to start one of its own. It is also possible to run multiple VNC sessions from the same computer. On Microsoft Windows the VNC session served is always the current user session. 
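As a minimal sketch of the connection conventions just described, the following Python snippet derives the TCP port from the display number and reads the 12-byte RFB ProtocolVersion string that a VNC server sends on connection (for example "RFB 003.008\n"); the host address in the usage comment is hypothetical. A second small helper shows why the simple raw encoding is bandwidth-hungry.

import socket

def rfb_server_version(host, display=0, timeout=5.0):
    """Connect to a VNC server and return the RFB ProtocolVersion it advertises.

    The server listens on TCP port 5900 + display number; the first 12 bytes it
    sends are an ASCII version string naming the highest RFB version it supports.
    """
    port = 5900 + display                      # e.g. display :1 -> port 5901
    with socket.create_connection((host, port), timeout=timeout) as sock:
        banner = sock.recv(12)                 # ProtocolVersion handshake message
    return banner.decode("ascii").strip()

def raw_update_bytes(width, height, bytes_per_pixel=4):
    """Size of one full-screen rectangle in the simple raw encoding."""
    return width * height * bytes_per_pixel    # 1920x1080 at 32 bpp is about 8 MB

# Hypothetical usage against a server on the local machine, display :0:
# print(rfb_server_version("127.0.0.1", display=0))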
Users commonly deploy VNC as a cross-platform remote desktop system. For example, Apple Remote Desktop for Mac OS X (and more recently, "Back to My Mac" in 'Leopard' - Mac OS X 10.5) interoperates with VNC and will connect to a Unix user's current desktop if it is served with x11vnc, or to a separate X11 session if one is served with TightVNC. From Unix, TightVNC will connect to a Mac OS X session served by Apple Remote Desktop if the VNC option is enabled, or to a VNC server running on Microsoft Windows. In July 2014 RealVNC published a Wayland developer preview. Security By default, RFB is not a secure protocol. While passwords are not sent in plain-text (as in telnet), cracking could prove successful if both the encryption key and encoded password were sniffed from a network. For this reason it is recommended that a password of at least 8 characters be used. On the other hand, there is also an 8-character limit on some versions of VNC; if a password is sent exceeding 8 characters, the excess characters are removed and the truncated string is compared to the password. UltraVNC supports the use of an open-source encryption plugin which encrypts the entire VNC session including password authentication and data transfer. It also allows authentication to be performed based on NTLM and Active Directory user accounts. However, use of such encryption plugins makes it incompatible with other VNC programs. RealVNC offers high-strength AES encryption as part of its commercial package, along with integration with Active Directory. Workspot released AES encryption patches for VNC. According to TightVNC, TightVNC is not secure as picture data is transmitted without encryption. To circumvent this, it should be tunneled through an SSH connection (see below). VNC may be tunneled over an SSH or VPN connection which would add an extra security layer with stronger encryption. SSH clients are available for most platforms; SSH tunnels can be created from UNIX clients, Microsoft Windows clients, Macintosh clients (including Mac OS X and System 7 and up) – and many others. There are also freeware applications that create instant VPN tunnels between computers. An additional security concern for the use of VNC is to check whether the version used requires authorization from the remote computer owner before someone takes control of their device. This will avoid the situation where the owner of the computer accessed realizes there is someone in control of their device without previous notice. See also Comparison of remote desktop software LibVNCServer LinkVNC PocketVNC RealVNC Remmina SPICE TigerVNC TightVNC VirtualGL#TurboVNC UltraVNC Vinagre References External links RFB 3.8 Protocol Standard AT&T VNC - Original AT&T-Cambridge VNC website Free network-related software Remote desktop protocols
58905440
https://en.wikipedia.org/wiki/Water%20Witch%20%281835%20steamer%29
Water Witch (1835 steamer)
Water Witch (or Waterwitch) was an early British wood-hulled paddle steamer, built in 1835 at Harwich, England for steam packet services from Dover to London and to Boulogne. A successful fast ship, she was later operated on services on the South Coast of England and in the Bristol Channel Description Water Witch was launched on 6 August 1835 by George Graham in the former Royal Naval Dockyard at Harwich, Essex, completed her final outfitting on the River Thames, and arrived at Dover on 24 September 1835. She initially measured 89 tons burthen and the hull was long, in beam and deep. She was engined with a 2-cylinder beam engine, made by Maudslay, Sons and Field at Lambeth, of 80 horse power and driving two side paddle wheels. Service from Kent ports The steamer was owned by John Hayward of Dover and others, including her builder George Graham, and captained by William Hayward. The Haywards were the first private operator of steam vessels from Dover, beginning with Sovereign in 1822. Built specifically for the steam packet services from Dover to London and to Boulogne, Water Witch proved to be a fast vessel, beating both British Post Office packet steamers and French state vessels in speed trials. Initially she was partnered on the London service by the steamer Dover Castle under Capt. Luckhurst, and on sailings to Boulogne by Royal George under Capt. Swaffer, but by 1837 was fully dedicated to the Boulogne route. On 24 June 1843, with the South Eastern Railway Company's line from London having reached Folkestone, Water Witch was specially chartered from Capt. Hayward for a trial trip by its directors and engineer, together with their guests, of a steam ferry service from Folkestone Harbour (which the company had purchased) to Boulogne. The voyage was successful, and demonstrated that a day trip to France from London was possible. Although the subsequent public services were run by ships of the New Commercial Steam Packet Company, when that company withdrew its ships in February 1844, Haywards' Water Witch and Royal George were chartered to fill the gap for ten months. Poole-Portsmouth steam packet In early 1845 Haywards sold Water Witch to the short-lived Poole, Isle of Purbeck, Isle of Wight and Portsmouth Steam Packet Company, and she was re-registered at Poole on 31 May 1845. She was put on a twice-weekly service between Poole and Portsmouth, with calls at Brownsea Island, South Haven, Yarmouth and Cowes. In addition to the packet service, she was used as a tug to assist larger vessels entering and leaving Poole. The opening of the Southampton and Dorchester Railway in 1847 had an adverse effect on demand from passengers and for freight and they consequently looked for alternative trades for Water Witch, their only vessel; one possibility was a service between Poole and the Channel Islands. By mid-1848 other possibilities had not materialised and Water Witch was offered for sale, though a buyer was not found until the end of the year. Bristol Channel services Water Witch began a new service for the Bideford-based North Devon Steam Packet Company in February 1849, connecting Bideford and Barnstaple with Bristol through separate weekly services to each Devon port; calls were also made at Ilfracombe and Lynmouth, and the sailings were timed to connect with the Liverpool steamers at Bristol. 
In September 1851, after a period offering free return passages to customers making their way to the Great Exhibition in London, the ship was advertised for sale by auction, and then again in December when her North Devon sailings had ended. She was next offered for sale in early 1853, still at Bideford, but with no indication that she had been active in 1852. On 12 January 1857 Water Witch, after extensive repairs and with new boilers, commenced a freight service between Gloucester and Bideford, via Swansea as well as offering towage services to Bristol Channel ports. Notes References Ships built in Harwich 1835 ships Packet (sea transport) Dover, Kent Victorian-era passenger ships of the United Kingdom
32432213
https://en.wikipedia.org/wiki/Cameyo
Cameyo
Cameyo is an application virtualization product. It aims to virtualize Windows applications so that they can run on other machines or in HTML5 browsers. It is reported to be easy to use, lightweight, and compatible with a wide variety of applications. The company's web site includes a library of ready-to-use virtualized free and open-source applications which can be downloaded or run in the browser. Cameyo has a free edition for home users and small businesses with up to 49 machines. History The Cameyo application virtualization product was launched in 2010 and has since received at least two major and several minor releases each year to improve its quality and functionality. Cameyo claims to be one of the pioneers in linking application virtualization with cloud storage systems and HTML5. Recent additions make it possible to run virtualized applications on operating systems other than Windows, such as Linux and Android. Since 2014, Cameyo has also offered the ability to run virtualized Windows applications directly in a web browser, with each application running in its own tab. Operations Once Cameyo has packaged an application, its output is a standalone EXE that contains the virtualization engine and the original software's files and registry. It can then be run directly on target Windows machines. Since it is self-contained, it does not require an agent to be pre-installed on target machines. It can also be uploaded to a Cameyo server, making it possible to run virtual applications through a browser. Virtualizing applications Cameyo itself is a portable virtual application that does not need to be installed on the computer; it is a single file that can be deleted once virtualization is complete, without leaving any traces in the registry. Cameyo essentially reduces all the files, folders, registry items, and binaries of the application being virtualized into a single executable file that can run without installation from any storage device on any computer. This single executable can be carried on a USB device or uploaded to a cloud storage system. Packaging can be carried out either by downloading Cameyo to the computer or through the company's online system by uploading the installation file of the required application; however, running Cameyo locally is advisable, as it gives better control over the sequencing process. The following are the highlights of the procedure: The application to be virtualized is installed after Cameyo has been started; if it is already installed, it must first be uninstalled and then reinstalled while Cameyo is in capture mode. Cameyo takes snapshots of the computer before and after the installation of the desired application and compares the two snapshots, thereby capturing the changes the application makes to the registry and system files.
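The following minimal Python sketch illustrates the snapshot-and-diff idea described above in the simplest possible terms: hash every file under a directory before and after an installation and keep only what changed. It is a conceptual illustration only, not Cameyo's actual implementation; the registry side of the capture is omitted and the paths in the usage comments are hypothetical.

import os
import hashlib

def snapshot(root):
    """Map each file path under root to a hash of its contents."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as fh:
                    state[path] = hashlib.sha256(fh.read()).hexdigest()
            except OSError:
                pass                            # skip unreadable or locked files
    return state

def diff(before, after):
    """Return files that were added or modified between two snapshots."""
    return {path: digest for path, digest in after.items()
            if before.get(path) != digest}

# Hypothetical usage around an application's installer:
# before = snapshot(r"C:\Program Files")
# ...run the installer of the application being captured...
# after = snapshot(r"C:\Program Files")
# changed = diff(before, after)                 # candidate files for the package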
Both snapshot steps take some time, depending on the speed of the computer, the size of the application, and the state of the computer's registry. Cameyo makes it possible to virtualize multiple applications at the same time into one executable file. Background applications, Windows updates, and anti-virus programs should be closed so that no extraneous material is captured by Cameyo, which would otherwise make the packaged application bulky. Once the application is virtualized, it is saved in the chosen destination folder, from where it can be accessed, opened, or copied to a portable storage device for use on other computers. Cameyo also offers an option to edit virtualized applications in case changes need to be made to the settings, name, or registry items of the application. Running applications through browsers Cameyo has recently made it possible to run virtualized applications directly through web browsers, which it claims makes software discovery and usage easier. With this approach, there is no longer any need to carry the application, as it can be accessed directly from the company's website. The company provides a large variety of free, open-access applications that are pre-virtualized for direct use. Personal applications from the computer or a cloud storage service can also be uploaded to the company's server and accessed through the browser. Each application runs in a separate tab. As with locally virtualized applications, browser applications can also run on operating systems other than Windows. To use this function, the user creates a free account with the company and links Dropbox or another cloud storage system to the account in order to access personal files from applications running in the browser. References Creators Virtualization software
40000349
https://en.wikipedia.org/wiki/IcCube
IcCube
icCube is a company founded in Switzerland that provides business intelligence (BI) software of the same name. The software can be fully embedded, hosted in a managed environment, or installed on a customer's machine on premises. The BI tool allows end users to create or edit dashboards themselves and is capable of processing data from multiple sources in real time. The dashboards, the dashboard builder, the schema/cube builder and the server monitoring application are all accessible from a web browser only; no software has to be installed on the end user's device. Besides the browser-based dashboard builder, data can be accessed by running queries directly on the OLAP cube using MDX, SQL or R. History icCube sells an online analytical processing (OLAP) server. Its first public community version (0.9.2) was released in June 2010, and the company has released numerous further versions since. Architecture icCube is implemented in the Java programming language and follows Java EE (J2EE) standards; it embeds both an HTTP server (Jetty) and a servlet container to handle all communication tasks. Being an in-memory OLAP server, the icCube server does not need to source its data from an RDBMS; any data source that exposes its data in tabular form can be used, and several plugins exist for accessing files, HTTP streams, etc. Data sources that expose JSON objects (e.g., MongoDB) are also supported; icCube then takes care of the possibly complex relations (e.g., many-to-many) implied by the JSON structure. Accessing icCube (cube modeling, server monitoring, MDX queries, web reporting and dashboards) is performed through a web interface and a JSON REST API. The icCube OLAP server does not use any caching or pre-aggregation mechanism. Interfaces icCube uses Multidimensional Expressions (MDX) as its query language, with several extensions to the original language: function declarations, vectors (even at the measure level), matrices, objects, and Java and R interactions. icCube patented an MDX debugger. icCube supports both a standard interface and a proprietary one. The XML for Analysis (XMLA) protocol can connect to any compatible reporting tool. icCube also supports its own proprietary, HTTP-based protocol called GVI, which can be extended and uses the Google Visualization wire protocol; JavaScript is the primary implementation language and a Java mapping library is also available. Since icCube 6.8.6, the icCube server supports a JSON REST API for programmatic access. See also Comparison of OLAP servers Business intelligence References Online analytical processing Business intelligence Data analysis software Reporting software
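To give a feel for the MDX query language mentioned above, the sketch below builds a generic MDX statement and posts it as JSON from Python. The cube, dimension and measure names are invented, and the endpoint URL is a placeholder: the article states only that icCube exposes a JSON REST API, not its actual routes or request format, so this is an assumption-laden illustration rather than a working icCube client.

import json
import urllib.request

# Generic MDX: one measure on columns, the members of a Year level on rows,
# sliced to a single country. All object names here are invented.
MDX_QUERY = """
SELECT
  { [Measures].[Sales Amount] } ON COLUMNS,
  [Time].[Year].Members ON ROWS
FROM [SalesCube]
WHERE ( [Geography].[Country].[Switzerland] )
"""

def post_mdx(endpoint, mdx):
    """POST an MDX statement as a JSON payload to a (hypothetical) query endpoint."""
    payload = json.dumps({"mdx": mdx}).encode("utf-8")
    request = urllib.request.Request(endpoint, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

# Hypothetical usage (placeholder URL, not a documented icCube route):
# result = post_mdx("http://localhost:8080/query", MDX_QUERY)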
6734774
https://en.wikipedia.org/wiki/StopBadware
StopBadware
StopBadware is an anti-malware nonprofit organization focused on making the Web safer through the prevention, mitigation, and remediation of badware websites. It is the successor to StopBadware.org, a project started in 2006 at the Berkman Center for Internet and Society at Harvard University. It spun off to become a standalone organization, and dropped the ".org" in its name, in January 2010. People The founders of StopBadware.org were John Palfrey, then Executive Director of the Berkman Center, and Jonathan Zittrain, then at the Oxford Internet Institute. Both are now Professors of Law at Harvard University and faculty co-directors of the Berkman Center. Board members of StopBadware include Vint Cerf (Chair), Esther Dyson, Philippe Courtot, Alex Eckelberry, Michael Barrett, Brett McDowell, Eric Davis, and Maxim Weinstein, StopBadware's former executive director. John Palfrey, Ari Schwartz, John Morris, Paul Mockapetris, and Mike Shaver formerly served on the Board. Supporters StopBadware is funded by corporate and individual donations. Some of its current partners include Google, Mozilla, PayPal, Qualys, Verisign, Verizon, and Yandex. Google, GFI Software, and NSFocus participate as data providers in the organization's Badware Website Clearinghouse (see below). Previous supporters include AOL, Lenovo, Sun Microsystems, Trend Micro, and MySpace. Consumer Reports WebWatch, a now-defunct part of Consumers Union, served as an unpaid special advisor while StopBadware.org was a project at the Berkman Center. Activities StopBadware's current focus is on fighting badware "by working to strengthen the entire Web ecosystem." In pursuit of this, the organization's activities include maintaining a badware website clearinghouse, acting as an independent reviewer of blacklisted sites, educating website owners and users, and running a "We Stop Badware" program for Web hosts. In June 2012 StopBadware launched the Ads Integrity Alliance with support from founding members AOL, Facebook, Google, the Interactive Advertising Bureau (IAB), and Twitter. The Alliance is a resource for online ad platforms seeking to protect users from deceptive or harmful ads. The organization receives data from its data providers and maintains a searchable clearinghouse (the Badware Website Clearinghouse) of URLs blacklisted by those data providers. StopBadware's independent review process gives webmasters the option to request removal from data providers' blacklists and is intended to function as "due process" for webmasters whose sites have been listed as bad. StopBadware maintains a community forum, BadwareBusters.org, which includes an online form for reporting badware URLs encountered by the community. StopBadware also aggregates badware statistics, advocates for consumer protection in public policy, and publishes advisory documents (software guidelines, best practices for web hosting providers) compiled with input from the organization's working groups. Defining "badware" StopBadware.org originally defined "badware", in 2006, as follows: an application is badware if it acts deceptively or irreversibly, or if it engages in potentially objectionable behavior without first prominently disclosing to the user, in clear and non-technical language, that it will engage in such behavior, and then obtaining the user's affirmative consent to that aspect of the application.
The original mission was to "provide tools and information that assist industry and policymakers in meeting their responsibility to protect users from badware, and that help users protect themselves." StopBadware took the position that software is badware if it does certain prohibited things, despite any disclaimer in an EULA or purported consent by the user. "Silently downloading" and "Installing additional software without informing the user of the identity and purpose of that software (bundling)" are examples of such prohibited behavior. StopBadware investigated reports of improper behavior by programs, and offered vendors the opportunity to reply to their findings. Currently StopBadware now focuses on web-based malware and presently defines badware as "software that fundamentally disregards a user's choice about how his or her computer or network connection will be used." This includes viruses, Trojans, rootkits, botnets, spyware, scareware, and many other types of malware. A badware website is a website that helps distribute badware, either intentionally or because it has been compromised. Google and StopBadware There is a common misconception that StopBadware blacklists websites and that Google uses this blacklist to protect their users. In fact, Google's Safe Browsing initiative uses automated systems to identify and blacklist websites. This blacklist is used by Google to warn users before they visit potentially dangerous sites. The Firefox web browser and other applications also use Google's Safe Browsing API to warn their users based on the same blacklist. The confusion is likely due to the close relationship between Google and StopBadware. Google links to StopBadware from their interstitial warning pages. The link directs users to StopBadware's educational content about badware; it also points webmasters to StopBadware's independent review process so site owners can request removal from Google's blacklist. StopBadware's Badware Website Clearinghouse also lists websites blacklisted by Google. Google uses automated systems to search for websites that distribute badware, and issues warnings about websites on which malicious activity is detected. When a user tries to access one of these sites, that user is redirected to an interstitial page wherein Google warns the user of the detected malicious activity. Google attempts to notify site owners when blacklisting a website. On February 2nd, 2009, for the duration of approximately one hour, all sites were temporarily listed as "potentially harmful to [ones] computer". See also Malware References External links StopBadware website BadwareBusters, StopBadware's online community Organizations established in 2006 Consumer rights organizations Privacy organizations Information technology organizations based in North America Organizations based in Massachusetts Harvard Law School
356878
https://en.wikipedia.org/wiki/Xcode
Xcode
Xcode is Apple's integrated development environment (IDE) for macOS, used to develop software for macOS, iOS, iPadOS, watchOS, and tvOS. It was initially released in late 2003; the latest stable release is version 13.2.1, released on December 17, 2021, and is available via the Mac App Store free of charge for macOS Monterey users. Registered developers can download preview releases and prior versions of the suite through the Apple Developer website. Xcode includes Command Line Tools (CLT), which enable UNIX-style development via the Terminal app in macOS. They can also be downloaded and installed without the GUI. Major features Xcode supports source code for the programming languages: C, C++, Objective-C, Objective-C++, Java, AppleScript, Python, Ruby, ResEdit (Rez), and Swift, with a variety of programming models, including but not limited to Cocoa, Carbon, and Java. Third parties have added support for GNU Pascal, Free Pascal, Ada, C#, Go, Perl, and D. Xcode can build fat binary (universal binary) files containing code for multiple architectures with the Mach-O executable format. These helped ease the transitions from 32-bit PowerPC to 64-bit PowerPC, from PowerPC to Intel x86, from 32-bit to 64-bit Intel, and from x86 to Apple silicon by allowing developers to distribute a single application to users and letting the operating system automatically choose the appropriate architecture at runtime. Using the iOS SDK, tvOS SDK, and watchOS SDK, Xcode can also be used to compile and debug applications for iOS, iPadOS, tvOS, and watchOS. Xcode includes the GUI tool Instruments, which runs atop a dynamic tracing framework, DTrace, created by Sun Microsystems and released as part of OpenSolaris. Xcode also integrates built-in support for source code management using the Git version control system and protocol, allowing the user to create and clone Git repositories (which can be hosted on source code repository hosting sites such as GitHub, Bitbucket, and Perforce, or self-hosted using open-source software such as GitLab), and to commit, push, and pull changes, all from within Xcode, automating tasks that would traditionally be performed by using Git from the command line. Composition The main application of the suite is the integrated development environment (IDE), also named Xcode. The Xcode suite includes most of Apple's developer documentation, and built-in Interface Builder, an application used to construct graphical user interfaces. Up to Xcode 4.1, the Xcode suite included a modified version of the GNU Compiler Collection. In Xcode 3.1 up to Xcode 4.6.3, it included the LLVM-GCC compiler, with front ends from the GNU Compiler Collection and a code generator based on LLVM. In Xcode 3.2 and later, it included the Clang C/C++/Objective-C compiler, with newly-written front ends and a code generator based on LLVM, and the Clang static analyzer. Starting with Xcode 4.2, the Clang compiler became the default compiler, Starting with Xcode 5.0, Clang was the only compiler provided. Up to Xcode 4.6.3, the Xcode suite used the GNU Debugger (GDB) as the back-end for the IDE's debugger. Starting with Xcode 4.3, the LLDB debugger was also provided; starting with Xcode 4.5 LLDB replaced GDB as the default back-end for the IDE's debugger. Starting with Xcode 5.0, GDB was no longer supplied. Removed features Formerly, Xcode supported distributing a product build process over multiple systems. 
One technology involved was named Shared Workgroup Build, which used the Bonjour protocol to automatically discover systems providing compiler services, and a modified version of the free software product distcc to facilitate the distribution of workloads. Earlier versions of Xcode provided a system named Dedicated Network Builds. These features are absent in the supported versions of Xcode. Xcode also includes Apple's WebObjects tools and frameworks for building Java web applications and web services (formerly sold as a separate product). As of Xcode 3.0, Apple dropped WebObjects development inside Xcode; WOLips should be used instead. Xcode 3 still includes the WebObjects frameworks. Version history 1.x series Xcode 1.0 was released in fall 2003. Xcode 1.0 was based on Project Builder, but had an updated user interface (UI), ZeroLink, Fix & Continue, distributed build support, and Code Sense indexing. The next significant release, Xcode 1.5, had better code completion and an improved debugger. 2.x series Xcode 2.0 was released with Mac OS X v10.4 "Tiger". It included the Quartz Composer visual programming language, better Code Sense indexing for Java, and Ant support. It also included the Apple Reference Library tool, which allows searching and reading online documentation from Apple's website and documentation installed on a local computer. Xcode 2.1 could create universal binary files. It supported shared precompiled headers, unit testing targets, conditional breakpoints, and watchpoints. It also had better dependency analysis. The final version of Xcode for Mac OS X v10.4 was 2.5. 3.x series Xcode 3.0 was released with Mac OS X v10.5 "Leopard". Notable changes since 2.1 include the DTrace debugging tool (now named Instruments), refactoring support, context-sensitive documentation, and Objective-C 2.0 with garbage collection. It also supports Project Snapshots, which provide a basic form of version control; Message Bubbles, which show build errors debug values alongside code; and building four-architecture fat binaries (32 and 64-bit Intel and PowerPC). Xcode 3.1 was an update release of the developer tools for Mac OS X, and was the same version included with the iPhone SDK. It could target non-Mac OS X platforms, including iPhone OS 2.0. It included the GCC 4.2 and LLVM GCC 4.2 compilers. Another new feature since Xcode 3.0 is that Xcode's SCM support now includes Subversion 1.5. Xcode 3.2 was released with Mac OS X v10.6 "Snow Leopard" and installs on no earlier version of OS X. It supports static program analysis, among other features. It also drops official support for targeting versions earlier than iPhone OS 3.0. But it is still possible to target older versions, and the simulator supports iPhone OS 2.0 through 3.1. Also, Java support is "exiled" in 3.2 to the organizer. Xcode 3.2.6 is the last version that can be downloaded for free for users of Mac OS X Snow Leopard (though it’s not the last version that supports Snow Leopard; 4.2 is). Downloading Xcode 3.2.6 requires a free registration at Apple's developer site. 4.x series In June 2010, at the Apple Worldwide Developers Conference version 4 of Xcode was announced during the Developer Tools State of the Union address. Version 4 of the developer tools consolidates the Xcode editing tools and Interface Builder into one application, among other enhancements. Apple released the final version of Xcode 4.0 on March 9, 2011. 
The software was made available for free to all registered members of the $99 per year Mac Developer program and the $99 per year iOS Developer program. It was also sold for $4.99 to non-members on the Mac App Store (no longer available). Xcode 4.0 drops support for many older systems, including all PowerPC development and software development kits (SDKs) for Mac OS X 10.4 and 10.5, and all iOS SDKs older than 4.3. The deployment target can still be set to produce binaries for those older platforms, but for Mac OS platforms, one is then limited to creating x86 and x86-64 binaries. Later, Xcode was free to the general public. Before version 4.1, Xcode cost $4.99. Xcode 4.1 was made available for free on July 20, 2011 (the day of Mac OS X Lion's release) to all users of Mac OS X Lion on the Mac App Store. On August 29, 2011, Xcode 4.1 was made available for Mac OS X Snow Leopard for members of the paid Mac or iOS developer programs. Xcode 4.1 was the last version to include GNU Compiler Collection (GCC) instead of only LLVM GCC or Clang. On October 12, 2011, Xcode 4.2 was released concurrently with the release of iOS 5.0, and it included many more and improved features, such as storyboarding and automatic reference counting (ARC). Xcode 4.2 is the last version to support Mac OS X 10.6 "Snow Leopard", but is available only to registered developers with paid accounts; without a paid account, 3.2.6 is the latest download that appears for Snow Leopard. Xcode 4.3, released on February 16, 2012, is distributed as one application bundle, Xcode.app, installed from the Mac App Store. Xcode 4.3 reorganizes the Xcode menu to include development tools. Xcode 4.3.1 was released on March 7, 2012 to add support for iOS 5.1. Xcode 4.3.2 was released on March 22, 2012 with enhancements to the iOS Simulator and a suggested move to the LLDB debugger versus the GDB debugger (which appear to be undocumented changes). Xcode 4.3.3, released in May 2012, featured an updated SDK for Mac OS X 10.7.4 "Lion" and a few bug fixes. Xcode 4.4 was released on July 25, 2012. It runs on both Mac OS X Lion (10.7) and OS X Mountain Lion (10.8) and is the first version of Xcode to contain the OS X 10.8 "Mountain Lion" SDK. Xcode 4.4 includes support for automatic synthesizing of declared properties, new Objective-C features such as literal syntax and subscripting, improved localization, and more. On August 7, 2012, Xcode 4.4.1 was released with a few bug fixes. On September 19, 2012, iOS 6 and Xcode 4.5 were released. Xcode added support for iOS 6 and the 4-inch Retina Display on iPhone 5 and iPod touch 5th generation. It also brought some new Objective-C features to iOS, simplified localization, and added auto-layout support for iOS. On October 3, 2012, Xcode 4.5.1 was released with bug fixes and stability improvements. Less than a month later, Xcode 4.5.2 was released, with support for iPad Mini and iPad with Retina Display, and bug fixes and stability improvements. On January 28, 2013, iOS 6.1 and Xcode 4.6 were released. 5.x series On June 10, 2013, at the Apple Worldwide Developers Conference, version 5 of Xcode was announced. On September 18, 2013, Xcode 5.0 was released. It shipped with iOS 7 and OS X 10.8 Mountain Lion SDKs. However, support for OS X 10.9 Mavericks was only available in beta versions. Xcode 5.0 also added a version of Clang generating 64-bit ARM code for iOS 7. Apple removed support for building garbage collected Cocoa binaries in Xcode 5.1. 
6.x series On June 2, 2014, at the Worldwide Developers Conference, Apple announced version 6 of Xcode. One of the most notable features was support for Swift, an all-new programming language developed by Apple. Xcode 6 also included features like Playgrounds and live debugging tools. On September 17, 2014, at the same time, iOS 8 and Xcode 6 were released. Xcode could be downloaded on the Mac App Store. 7.x series On June 8, 2015, at the Apple Worldwide Developers Conference, Xcode version 7 was announced. It introduced support for Swift 2, and Metal for OS X, and added support for deploying on iOS devices without an Apple Developer account. Xcode 7 was released on September 16, 2015. 8.x series On June 13, 2016, at the Apple Worldwide Developers Conference, Xcode version 8 was announced; a beta version was released the same day. It introduced support for Swift 3. Xcode 8 was released on September 13, 2016. 9.x series On June 5, 2017, at the Apple Worldwide Developers Conference, Xcode version 9 was announced; a beta version was released the same day. It introduced support for Swift 4 and Metal 2. It also introduced remote debugging on iOS and tvOS devices wirelessly, through Wi-Fi. Xcode 9 was publicly released on September 19, 2017. 10.x series On June 4, 2018, at the Apple Worldwide Developers Conference, Xcode version 10 was announced; a beta version was released the same day. Xcode 10 introduced support for the Dark Mode announced for macOS Mojave, the collaboration platforms Bitbucket and GitLab (in addition to already supported GitHub), training machine learning models from playgrounds, and the new features in Swift 4.2 and Metal 2.1, as well as improvements to the editor and the project build system. Xcode 10 also dropped support for building 32-bit macOS apps and no longer supports Subversion integration. Xcode 10 was publicly released on September 17, 2018. 11.x series On June 3, 2019, at the Apple Worldwide Developers Conference, Xcode version 11 was announced; a beta version was released the same day. Xcode 11 introduced support for the new features in Swift 5.1, as well as the new SwiftUI framework (although the interactive UI tools are available only when running under macOS 10.15). It also supports building iPad applications that run under macOS; includes integrated support for the Swift Package Manager; and contains further improvements to the editor, including a "minimap" that gives an overview of a source code file with quick navigation. Xcode 11 requires macOS 10.14 or later and Xcode 11.4 requires 10.15 or later. Xcode 11 was publicly released on September 20, 2019. 12.x series On June 22, 2020, at the Apple Worldwide Developers Conference, Xcode version 12 was announced; a beta version was released the same day. Xcode 12 introduced support for Swift 5.3 and requires macOS 10.15.4 or later. Xcode 12 was publicly released on September 16, 2020. 13.x series Xcode 13 was announced on June 7, 2021 at WWDC21; the first beta version was released on the same day. The new version introduced support for Swift 5.5 and requires macOS 11.3 or later. Xcode 13 contains SDKs for iOS / iPadOS 15, macOS 12, watchOS 8, and tvOS 15. Xcode 13’s major features include the new concurrency model, improved support for version control providers (such as GitHub), including the ability to browse, view, and comment on pull requests right in the app interface. Xcode 13 was publicly released on September 20, 2021. 
Version comparison table Xcode 1.0 - Xcode 2.x (before iOS support) Xcode 3.0 - Xcode 4.x Xcode 5.0 - 6.x (since arm64 support) Xcode 7.0 - 10.x (since Free On-Device Development) Xcode 11.x - 13.x (since SwiftUI framework) Toolchain versions Xcode 1.0 - Xcode 2.x (before iOS support) Xcode 3.0 - Xcode 4.x Xcode 5.0 - 6.x (since arm64 support) Xcode 7.0 - 10.x (since Free On-Device Development) Xcode 11.x - 13.x (since SwiftUI framework) See also XcodeGhost References External links Xcode – Mac App Store Apple Developer Connection: Xcode tools and resources Xcode Release Notes — Archive Download Xcode 2003 software Freeware History of software Integrated development environments IOS IOS development software MacOS programming tools MacOS text editors MacOS-only software made by Apple Inc. Software version histories User interface builders
44498
https://en.wikipedia.org/wiki/State%20Council%20of%20the%20People%27s%20Republic%20of%20China
State Council of the People's Republic of China
The State Council, constitutionally synonymous with the Central People's Government since 1954 (particularly in relation to local governments), is the chief administrative authority of the People's Republic of China. It is chaired by the premier and includes the heads of each of the cabinet-level executive departments. Currently, the council has 35 members: the premier, one executive vice premier, three other vice premiers, five state councillors (of whom three are also ministers and one is also the secretary-general), and 26 in charge of the Council's constituent departments. In the politics of China, the Central People's Government forms one of three interlocking branches of power, the others being the Chinese Communist Party (CCP) and the People's Liberation Army (PLA). The State Council directly oversees provincial-level People's Governments, and in practice maintains membership with the top levels of the CCP. Aside from very few non-CCP ministers, members of the State Council are also members of the CCP's Central Committee. Organization The State Council meets every six months. Between meetings it is guided by a (Executive Meeting) that meets weekly. The standing committee includes the premier, one executive vice premier, three vice premiers, and five other state councillors (normally one of whom serves as Secretary-General of the State Council, and two of whom concurrently serve as ministers). The vice-premiers and state councillors are nominated by the premier, and appointed by the president with National People's Congress' (NPC) approval. Incumbents may serve two successive five-year terms. Each vice premier oversees certain areas of administration. Each State Councillor performs duties as designated by the Premier. The secretary-general heads the General Office which handles the day-to-day work of the State Council. The secretary-general has relatively little power and should not be confused with the General Secretary of the Chinese Communist Party. Each ministry supervises one sector. Commissions outrank ministries and set policies for and coordinate the related activities of different administrative organs. Offices deal with matters of ongoing concern. Bureaus and administrations rank below ministries. In addition to the 25 ministries, there are 38 centrally administered government organizations that report directly to the state council. The heads of these organizations attend full meetings of the state committee on an irregular basis. In practice, the vice premiers and State Councillors assume responsibility for one or more sectors or issues, and remain in contact with the various bodies responsible for policy related to that area. This allows the Standing Committee to oversee a wide range of government functions. The State Council, like all other governmental bodies, is nominally responsible to the NPC and its Standing Committee in conducting a wide range of government functions both at the national and at the local levels, and nominally acts by virtue of the NPC's authority. In practice, however, the NPC had historically done little more than ratify decisions already made by the State Council. More recently, however, the NPC has taken on a more independent role. There has been at least one case where the NPC has outright rejected an initiative of the State Council and a few cases where the State Council has withdrawn or greatly modified a proposal in response to NPC opposition. The State Council and the CCP are also tightly interlocked. 
With rare exceptions, State Councillors are high-ranking members of the CCP. Although, as Party members, they are supposed to follow Party instructions, because they tend to be senior members of the Party they also have substantial influence over what those instructions are. This results in a system which is unlike the Soviet practice in which the Party effectively controlled the State. Rather, the Party and State are fused at this level of government. The members of the State Council derive their authority from being members of the state, while as members of the Party they coordinate their activities and determine key decisions such as the naming of personnel. There were attempts to separate the party and state in the late 1980s under Deng Xiaoping and Zhao Ziyang and have the Party in charge of formulating policy and the State Council executing policy, but these efforts were largely abandoned in the early 1990s. As the chief administrative organ of government, its main functions are to formulate administrative measures, issue decisions and orders, and monitor their implementation; draft legislative bills for submission to the NPC or its Standing Committee; and prepare the economic plan and the state budget for deliberation and approval by the NPC. The State Council is the functional center of state power and clearinghouse for government initiatives at all levels. With the government's emphasis on economic modernization, the State Council clearly acquired additional importance and influence. The State Council controls the Ministry for National Defense but does not control the People's Liberation Army, which is instead controlled by the Central Military Commission. Members Executive Meeting (Standing Committee) Plenary Meeting The Plenary Meeting of State Council is hosted by the Premier, joined by Vice Premiers, State Councillors, Ministers in charge of Ministries and Commissions, the Governor of the People's Bank, the Auditor-General, and the Secretary General. It usually runs bi-annually and when necessary, non-members can be invited to participate. 
Organizational structure
General Office of the State Council
Secretary-General of the State Council
Deputy Secretaries-General of the State Council
Constituent Departments of the State Council (cabinet-level)
Special Organization directly under the State Council
Ministry-level
State-owned Assets Supervision and Administration Commission of the State Council (SASAC), established in 2003
Organizations directly under the State Council
Ministry-level
General Administration of Customs of the People's Republic of China
State Administration of Taxation
State Administration for Market Regulation
National Radio and Television Administration
General Administration of Sport
Counsellors' Office of the State Council
Names reserved (formerly ministry-level)
National Press and Publication Administration, additional name "National Copyright Administration", both names reserved by the CCP Central Propaganda Department
Sub-ministry-level
National Bureau of Statistics
China International Development Cooperation Agency
National Healthcare Security Administration
National Government Offices Administration, formerly the "Government Offices Administration of the State Council"
Names reserved (formerly sub-ministry-level)
National Religious Affairs Administration, a name reserved by the CCP Central United Front Work Department
Administrative Offices of the State Council
Ministry-level
Hong Kong and Macau Affairs Office of the State Council
State Council Research Office
Names reserved (formerly ministry-level)
Overseas Chinese Affairs Office of the State Council, a name reserved by the CCP Central United Front Work Department
Taiwan Affairs Office of the State Council, under the CCP Central Committee
Cyberspace Administration of China, under the CCP Central Committee
State Council Information Office, a name reserved by the CCP Central Publicity Department
Institutions directly under the State Council
Ministry-level
Xinhua News Agency, administered by the CCP Central Publicity Department
Chinese Academy of Sciences
Chinese Academy of Engineering
Chinese Academy of Social Sciences
Development Research Center of the State Council
China Media Group, administered by the CCP Central Publicity Department
China Banking and Insurance Regulatory Commission
China Securities Regulatory Commission
Names reserved (formerly ministry-level)
Chinese Academy of Governance, a name reserved by the CCP Central Party School
Sub-ministry-level
China Meteorological Administration
National Administrations administered by ministry-level agencies
Sub-ministry-level
National Public Complaints and Proposals Administration (国家信访局), administered by the State Council General Office
National Food and Strategic Reserves Administration (国家粮食和物资储备局), administered by the National Development and Reform Commission
National Energy Administration, administered by the National Development and Reform Commission
State Administration of Science, Technology and Industry for National Defense, administered by the Ministry of Industry and Information Technology
State Tobacco Monopoly Administration (officially sharing its office with China National Tobacco Corporation), administered by the Ministry of Industry and Information Technology
National Immigration Administration, additional name "Exit and Entry Administration" for Mainland-Hong Kong-Macau-Taiwan border control, administered by the Ministry of Public Security
National Forestry and Grassland Administration, additional name "National Park Administration", administered by the Ministry of Natural Resources
National Railway Administration, administered by the Ministry of Transport
Civil Aviation Administration of China (CAAC), administered by the Ministry of Transport
State Post Bureau, administered by the Ministry of Transport, officially sharing its office with China Post Group Corporation
National Administration for Rural Revitalization (established on 25 February 2021), administered by the Ministry of Agriculture and Rural Affairs
National Cultural Heritage Administration, administered by the Ministry of Culture and Tourism
National Administration of Traditional Chinese Medicine, administered by the National Health Commission
National Bureau of Disease Control and Prevention (established on 13 May 2021), administered by the National Health Commission
National Mine Safety Administration (国家矿山安全监察局), administered by the Ministry of Emergency Management
State Administration of Foreign Exchange, administered by the People's Bank of China
National Medical Products Administration, formerly China Food and Drug Administration (CFDA), now administered by the State Administration for Market Regulation
National Intellectual Property Administration, administered by the State Administration for Market Regulation
Names reserved (formerly sub-ministry-level)
National Civil Service Administration (国家公务员局), reserved by the Organization Department of the Chinese Communist Party
National Archives Administration, i.e. the Central Archives (of the CCP), under the General Office of the CCP Central Committee
National Administration of State Secrets Protection, i.e. the Office of the Central Secrecy Commission (of the CCP), under the General Office of the CCP Central Committee
National Cryptography Administration (国家密码管理局), i.e. the Office of the Central Leading Group for Cryptography Work (of the CCP), under the General Office of the CCP Central Committee
State Language Commission (国家语言文字工作委员会), reserved by the Ministry of Education
State Administration of Foreign Experts Affairs, reserved by the Ministry of Science and Technology
China National Space Administration, reserved by the Ministry of Industry and Information Technology
China Atomic Energy Authority, reserved by the Ministry of Industry and Information Technology
State Oceanic Administration, reserved by the Ministry of Natural Resources
National Nuclear Safety Administration (国家核安全局), reserved by the Ministry of Ecology and Environment
Certification and Accreditation Administration (国家认证认可监督管理委员会), reserved by the State Administration of Market Regulation
Standardization Administration, reserved by the State Administration of Market Regulation
Interdepartmental coordinating agencies
National Defense Mobilization Commission (NDMC), established in 1994
National Energy Commission (NEC), established in 2010
Financial Stability and Development Committee (FSDC), established in 2017
and many more...
Agencies dispatched by the State Council
Ministry-level
Liaison Office of the Central People's Government in the Hong Kong Special Administrative Region, established on 18 January 2000.
Liaison Office of the Central People's Government in the Macao Special Administrative Region, established on 18 January 2000.
Sub-ministry-level
Office for Safeguarding National Security of the Central People's Government in the Hong Kong Special Administrative Region, established on 1 July 2020.
See also
Department of State Affairs in the Three Departments and Six Ministries system
Ming dynasty: Central Secretariat → Grand Secretariat
Qing dynasty: Grand Secretariat → Grand Council → Cabinet
Republic of China: State Council (1912–28) → Executive Yuan (1928–present)
People's Republic of China: Government Administration Council of the Central People's Government (1949–54); Ministries of the PRC
References External links China, People's Republic of, State Council Government agencies established in 1954 1954 establishments in China
2598929
https://en.wikipedia.org/wiki/Avidemux
Avidemux
Avidemux is a free and open-source software application for non-linear video editing and transcoding multimedia files. The developers intend it as "a simple tool for simple video processing tasks" and to allow users "to do elementary things in a very straightforward way". It is written in C++ and uses Qt for its graphical user interface and FFmpeg for its multimedia functions. Starting with version 2.4, Avidemux also offers a command-line interface; since version 2.6, the original GTK-based interface is no longer maintained and has been discontinued. Avidemux is developed for Linux, macOS, and Windows. Unofficial builds are also available for FreeBSD, NetBSD, and OpenBSD. Features Avidemux is capable of non-linear video editing, applying visual effects (called "Filters" by Avidemux) to video, and transcoding video into various formats. Some of the filters were ported from MPlayer and Avisynth. Avidemux can also insert audio streams into a video file (an action known as multiplexing or "muxing") or extract audio streams from video files (an action known as "demuxing"). An important part of the program's design is its project system, which uses the SpiderMonkey JavaScript engine. Whole projects with all options, configurations, selections, and preferences can be saved into a project file (a conceptual sketch of this idea appears at the end of this article). Like VirtualDub's VCF scripting capabilities, Avidemux has advanced scripting available in both its GUI and command-line modes. It also supports a non-project workflow, just like VirtualDub, where users can simply create all of their configurations and save the video directly without making a project file. A project queue system is also available. Avidemux has built-in subtitle processing, both for optical character recognition of DVD subtitles and for rendering hard subtitles. Avidemux supports various subtitle formats, including MicroDVD (.SUB), SubStation Alpha (.SSA), Advanced SubStation Alpha (.ASS) and SubRip (.SRT). Components Avidemux was written from scratch, but additional code from FFmpeg, MPlayer, Transcode and Avisynth has been used on occasion as well. Nonetheless, it is a completely standalone program that does not require any other programs to read, decode, or encode media. The built-in libavcodec library from the FFmpeg project is used for decoding and encoding of various audio and video formats such as MPEG-4 ASP. The primary (though not the only) Avidemux programmer uses the nickname 'Mean' on the Avidemux forum. Multithreading Multithreading has been implemented in the following areas of Avidemux (some partially through libavcodec):
Encoding
MPEG-1 and MPEG-2 (using libavcodec)
MPEG-4 Part 2 SP/ASP (using libavcodec or Xvid); earlier versions of Xvid are not compatible with this feature
H.264/MPEG-4 Part 10 AVC (using x264)
H.265/HEVC (using x265)
Decoding
MPEG-1 and MPEG-2 (using libavcodec)
MPEG-4 Part 2 SP/ASP (using libavcodec)
Supported formats Avidemux supports the following file formats: See also List of video editing software Comparison of video editing software Comparison of video converters Notes References Further reading External links Cross-platform free software Free software programmed in C++ Free video software Free video conversion software Software that uses FFmpeg Video editing software Video editing software for macOS Video editing software for Linux Video editing software for Windows Video software that uses Qt Software that was ported from GTK to Qt Video editing software that uses GTK
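The project and scripting system described in the Avidemux article above can be illustrated with a minimal, self-contained Python sketch: an edit session is recorded as an ordered list of operations that can be saved to a "project file" and replayed later. This is an illustration of the concept only; it does not use Avidemux's actual scripting API, and the operation names and file names below are hypothetical.

```python
import json

class Project:
    """Toy model of a video-editing project file: a replayable list of operations."""
    def __init__(self):
        self.operations = []            # ordered edit operations (load, filter, encode, ...)

    def record(self, name, **params):
        self.operations.append({"op": name, **params})
        return self                     # allow chaining, like a small script

    def save(self, path):
        with open(path, "w", encoding="utf-8") as f:
            json.dump(self.operations, f, indent=2)   # the "project file"

    @classmethod
    def load(cls, path):
        project = cls()
        with open(path, encoding="utf-8") as f:
            project.operations = json.load(f)
        return project

if __name__ == "__main__":
    p = (Project()
         .record("load", source="input.avi")                  # hypothetical input file
         .record("select", start_frame=0, end_frame=1500)     # pick a segment
         .record("filter", name="crop", top=0, bottom=8)      # apply a visual filter
         .record("encode", video="x264", audio="copy", output="output.mkv"))
    p.save("edit.project")
    print(Project.load("edit.project").operations[0])
```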
2555859
https://en.wikipedia.org/wiki/Pardus%20%28operating%20system%29
Pardus (operating system)
Pardus is a Linux distribution developed with support from the government of Turkey. Pardus' main focus is office-related work, including use in Turkish government agencies. Nevertheless, Pardus ships in several languages. Its ease of use and availability free of charge have spawned numerous communities throughout the world. Development Pardus was started by the Turkish National Research Institute of Electronics and Cryptology (UEKAE), a division of the Scientific and Technological Research Council of Turkey (TÜBİTAK). The first live CD version of Pardus was a fork of Gentoo Linux. The current version is a fork of Debian. Release history PiSi package management PiSi (Packages Installed Successfully as Intended; also a Turkish word meaning "kitty", intended as a pun on the distribution's name, which is derived from pardus, the species name of the leopard) is a package management system that was developed for Pardus. It was used in the initial versions of the distribution, but was abandoned in favor of APT after the project moved to a Debian base. Pardus 2011.2, released on September 19, 2011, was the last Pardus release that used PiSi. PiSi stores and handles dependencies for various packages, libraries, and COMAR tasks. Some features of PiSi include:
Uses the LZMA compression algorithm
Written in Python
Package sources are written in XML and Python
Database access implemented with Berkeley DB
Integrates low-level and high-level package operations (dependency resolution)
Framework approach to build applications and tools upon
A community fork of the old Pardus with PiSi package management exists, called PiSi Linux. PiSi Linux's latest stable version is 1.2, and its latest development version is 2.0 Beta 2. eopkg, the package manager of the Solus project, a rolling-release Linux distribution, is derived from PiSi.
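PiSi's use of LZMA compression for its packages can be illustrated with a short, self-contained Python sketch using only the standard library. The archive layout and metadata fields below are simplified assumptions for illustration; they do not reproduce PiSi's real package format, schema, or APIs.

```python
import io
import json
import tarfile

def build_package(name: str, version: str, files: dict) -> bytes:
    """Pack metadata plus file payloads into an LZMA (xz) compressed tarball."""
    metadata = json.dumps({"name": name, "version": version,
                           "files": sorted(files)}).encode("utf-8")
    buf = io.BytesIO()
    # "w:xz" applies LZMA compression, the algorithm PiSi is described as using.
    with tarfile.open(fileobj=buf, mode="w:xz") as tar:
        for path, payload in {"metadata.json": metadata, **files}.items():
            info = tarfile.TarInfo(path)
            info.size = len(payload)
            tar.addfile(info, io.BytesIO(payload))
    return buf.getvalue()

def read_metadata(package: bytes) -> dict:
    """Extract and parse the metadata entry from a package built above."""
    with tarfile.open(fileobj=io.BytesIO(package), mode="r:xz") as tar:
        return json.loads(tar.extractfile("metadata.json").read())

if __name__ == "__main__":
    pkg = build_package("hello", "1.0", {"usr/bin/hello": b"#!/bin/sh\necho hello\n"})
    print(read_metadata(pkg))   # prints the stored name, version, and file list
```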
YALI YALI (Yet Another Linux Installer) is the first Pardus software a user encounters. It recognizes the hardware and installs Pardus software from installation media (i.e. a CD) to a user-selected hard disk partition. YALI can handle resizing of NTFS partitions found on the disk. A yalı is a waterside mansion common in the Bosphorus region. This project was discontinued and has not been used since the migration to the Debian base. KAPTAN KAPTAN is a desktop greeter that runs on the first start. It allows a user to change the desktop theme, mouse, keyboard and language settings, date and time, KDE menus, wallpaper, Package Manager settings, Smolt, and the number of desktops. The word Kaptan means 'captain' in Turkish. This project was discontinued and has not been used since the migration to the Debian base. Reception Ladislav Bodnar, the creator of DistroWatch, wrote in his round-up of Linux/*nix in 2006 that Pardus was one of the distros he was most impressed by that year "... thanks to unique package management ideas, innovative start-up sequence and general desktop polish ..." Dmitri Popov, an author at Linux User & Developer, called Pardus 2011 Beta the most exciting distro of the year in his review. Social events and participation Pardus participated in Google Summer of Code 2008 and 2009. Pardus attended CeBIT Eurasia in 2006, 2008, 2009, 2010, and 2011. Derivatives Pardus Community Edition, based on Debian, was released on April 12, 2013. The Pisi Linux and Pardus-Anka projects forked from the PiSi-based Pardus; a group of volunteers aims to continue PiSi and other features of Pardus independently. Pisi Linux has released two new versions. These versions are a direct continuation of the Pardus 2011.2 64-bit edition and include updated versions of PiSi, Kaptan, etc. Usage
Turkish Armed Forces (partially)
Ministry of Foreign Affairs (Turkey) (partially)
Ministry of National Defence (Turkey)
Turkish Police (partially)
Social Security Institution (migrating)
Schools (partially)
References External links PardusWiki – Multilingual resources PiSi Linux's website Pardus in OpenSourceFeed Gallery Debian-based distributions KDE Science and technology in Turkey Scientific and Technological Research Council of Turkey State-sponsored Linux distributions Turkish inventions Turkish-language Linux distributions Linux distributions
64799079
https://en.wikipedia.org/wiki/AllTrails
AllTrails
AllTrails is a fitness and travel mobile app used in outdoor recreational activities. AllTrails is commonly used for outdoor activities such as hiking, mountain biking, climbing and snow sports. The service allows users to access a database of trail maps, which includes crowdsourced reviews and images. Depending on a user's subscription status, these resources can be used online or offline. AllTrails first launched in 2010 within the AngelPad accelerator program and gained seed funding shortly thereafter from the firms 500 Startups and 2020 Ventures. In 2018, AllTrails was acquired by Spectrum Equity. AllTrails operates on a freemium business model. It is accessible through a mobile app or a web browser for computers. History Founding AllTrails was founded in 2010 by Russell Cook. It was accepted into AngelPad's inaugural class. This incubation period preceded its official launch in December 2010. 2011-2012 In 2011, within one year of its launch, AllTrails gained $400k in seed funding from two venture capital funds. A major partnership between National Geographic and AllTrails was announced in early 2012. This partnership boosted the app's user base and augmented its mapping data, and resulted in National Geographic's Topo.com being merged into and redirected to AllTrails. In Q4 of 2012, AllTrails crossed the 1 million install milestone. In 2018, though the company was said to be cash flow positive, it announced a new round of funding which ultimately resulted in its acquisition. 2015-2017 In August 2016, AllTrails announced that it had acquired EveryTrail from TripAdvisor, which had acquired that company in 2011. AllTrails was listed in the 100 Best Android Apps of 2017 by PhanDroid. 2018-2019 In 2018, Spectrum Equity provided $75M in funding to AllTrails in exchange for a majority position in the company. This infusion of capital enabled the company to focus on business expansion and to attract additional talent. At that time, Ben Spero and Matt Neidlinger of Spectrum joined AllTrails’ board of directors. In May 2019, AllTrails announced that the app was available in French, German and Spanish; previously, the app had only been available in English. AllTrails partnered with Rolling Strong in December 2019 to help truck drivers stay fit and healthy on the road. 2020 In the summer of 2020, Apple partnered with AllTrails to add hiking information to its Apple Maps service. This was largely in response to the growing popularity of outdoor activities during the global COVID-19 pandemic. The pandemic did not hurt the app, as individuals working from home reconnected with nature and outdoor activities such as hiking. These trends contributed to 8.7 million users installing the application in 2020, an 89% increase over the previous year. 2021 In early 2021, AllTrails announced that it had officially reached 1 million paid subscribers. Operation Registration is required on AllTrails. Users can register via Apple, Facebook, Google or by manually entering their name, email and password. Once a user has registered, they can search and explore trails, which is the core function of the app. Upon selecting a trail, a user can see information about it, track their activity, or add new trails to the service. Additional features have since been added, but a user must purchase a subscription in order to access them. Features AllTrails operates on a freemium model.
Users can access the app's advanced features via a subscription service called AllTrails Pro. Users As of January 2012, AllTrails had reached 200,000 users. In that year, AllTrails partnered with National Geographic, and by the end of the year it had crossed the 1 million install mark. As of August 2020, AllTrails claimed a global user base of over 20 million in more than 100 countries. Acquisitions Since its founding in 2010, AllTrails has acquired four organizations: EveryTrail, GPSies, iFootpath and Trails.com. With the exception of GPSies, all other apps were either integrated into the company or discontinued as of July 2020. The GPSies acquisition was announced on July 30, 2019. The Trails.com acquisition was announced on July 18, 2019. The iFootpath acquisition was announced on April 24, 2019. The EveryTrail acquisition was announced on August 16, 2016. All acquisitions were made for undisclosed sums. See also Comparison of bicycle route planning websites References External links Android (operating system) software Collaborative mapping GIS software Google Maps Internet properties established in 2010 Satellite navigation software IOS software Mobile route-planning software Route planning software Web Map Services Route planning websites
3426959
https://en.wikipedia.org/wiki/St.%20Thomas%20College%2C%20Thrissur
St. Thomas College, Thrissur
St Thomas' College (Autonomous), Thrissur is a government-aided college located in Thrissur City, the Cultural Capital of Kerala, India. Founded by Mar Adolph Medlycott in 1889, the college played an important role in the development of the state. It is the oldest college in the erstwhile princely state of Cochin and in present-day Thrissur district. It is the second private college to be recognised as a first grade college under the University of Madras in the then existing princely states of Travancore and Cochin and in Malabar, which together later became most of the present geographical area of Kerala. It is the first Catholic college in Kerala and is conducted by the Syro-Malabar Catholic Archdiocese of Thrissur. Mar Andrews Thazhath is currently the Patron and Mar Tony Neelankavil the Manager of the College. The college is affiliated with the University of Calicut. The College attained autonomous status in 2014 and was recognised as a College with Potential for Excellence by the University Grants Commission in 2016. The College was awarded an A Grade in its third cycle of reaccreditation by the National Assessment and Accreditation Council (NAAC) after the peer team review in 2016. History The institution started as a lower secondary school in 1889; its founder, the first Vicar Apostolic of Thrissur, Adolphus Edwin Medlycott, named it St. Thomas' College. The college started to function in the compound of the present-day Bishop's House. In 1919, it was raised to a second grade college in arts, affiliated to the University of Madras. The college was elevated to a first grade college under the University of Madras in April 1925, and it is the second non-government college in Kerala to achieve that status, Union Christian College, Aluva, being the first. The buildings of the College were constructed in a phased manner. Rev. Medlycott gave a description of the College at the time of his return in 1896 as below: The first building of the college is the one with the tower. Menacherry Hall was built in 1941. The Science Block (the present-day Academic Block on the opposite side of the road) was built in 1956. In 2004, St. Thomas' was accredited with a B++ grade by NAAC, an autonomous organization instituted by the UGC. In September 2010, the college was re-accredited with an A grade by NAAC. The College attained an A Grade in its NAAC reaccreditation in 2016. The year 2018-2019 was celebrated as Sathyaprayan 1919-2019, marking the centennial of its elevation to a second grade college, and the commemoration was inaugurated by the President of India, Shri Ram Nath Kovind, on 7 August 2018. Academic programmes The college offers graduate, postgraduate and Ph.D. level programmes in a wide range of subjects. There are twenty-four teaching departments (including ten research departments and thirteen postgraduate departments) which together have 140 teachers.
Departments
Humanities
Department of History
Department of Political Science
Languages
Department of English
Department of Hindi
Department of Malayalam
Department of Sanskrit
Mathematical Sciences
Department of Mathematics
Department of Statistics
Computer Science and Application
Department of Computer Science
Department of Computer Application
Department of Data Science
Life Sciences
Department of Botany
Department of Zoology
Physical Sciences
Department of Physics
Department of Chemistry
Department of Electronics
Social Sciences
Department of Commerce
Department of Criminology and Police Science
Department of Economics
Department of Forensic Science
Department of Management Studies
Department of Media Studies
Department of Psychology
Department of Social Work
Other
Department of Library and Information Studies
Department of Physical Education
Department of Physics The Department of Physics was started in 1922, within three years of the school being raised to a college. In 1951, a B.A. degree in Physics (later changed to B.Sc.) was started under the affiliation of the University of Madras. In 1961, M.Sc. Physics (of Kerala University) was started, which was a rare course in those days, offered by only four colleges in Kerala at the time. The Department offers bachelor's and master's degrees in Physics. Department of Zoology The Department of Zoology, established in 1957, was affiliated to the University of Kerala. A postgraduate course was introduced in 1969 with entomology as the special paper. The department was approved as a recognized research center under the University of Calicut in 1974 and has a full-fledged research laboratory supported by the Department of Science and Technology. Department of Statistics In 1955, the degree course in B.A. Statistics was started under the Department of Mathematics and Statistics, affiliated to the University of Madras. The B.A. course was converted to a three-year B.Sc. course in Statistics in 1958 under the University of Kerala. The Department of Statistics was established in 1984. The department offers bachelor's degrees, master's degrees and doctoral studies (PhD). Doctoral Studies (PhD) The College offers research programmes in the following ten departments:
Department of Botany
Department of Chemistry
Department of Commerce
Department of Computer Science
Department of Economics
Department of English
Department of Mathematics
Department of Physics
Department of Statistics
Department of Zoology
Department of Library and Information Studies The Department of Library and Information Studies has two constituents: (1) the library, which supports the academic, intellectual, informational, inspirational, spiritual and recreational requirements of the academic community with its rich resources and services, and (2) the department of information studies, which offers various academic programmes. Department of Library The college library is one of the largest and oldest college libraries in Kerala. It is divided into four sections: General, Science, English and a separate section for the self-financing courses in the Jubilee Block. The General section holds general, multidisciplinary, language and literature books in Malayalam, Sanskrit and Hindi, along with books in the humanities and social sciences. This section also has a rich collection of reference books. The Science section contains volumes in the natural, physical and mathematical sciences. The English section holds resources in English language and literature.
The Self Financing Section of the library has books in the disciplines of Social Work, Electronics, Computer Applications, Business and Media Studies. The Library houses about a hundred thousand books and subscribes to more than one hundred periodicals, including journals, magazines and dailies. The valuable collections include forty sets of encyclopaedias and twenty sets of dictionaries and directories of various categories. Some of the collections are very rare, viz. 4 editions of Encyclopædia Britannica, 2 editions of World Book Encyclopedia, Collier's Encyclopedia (24 volumes), Funk & Wagnalls New Encyclopedia (29 volumes), Compton's Pictured Encyclopedia (15 volumes), The Book of Knowledge: The Children's Encyclopædia, International Encyclopedia of Ecology and Environment (30 volumes), Encyclopedia of Visual Art (10 volumes) and Encyclopedia of Animal World (21 volumes). In the collection of dictionaries there are 20 sets of various types, among which A New English Dictionary (9 volumes), published in 1888, is the rarest and most valuable. Moreover, A Dictionary of English Language, abridged by Robert Gordon Latham from that of Samuel Johnson in 1882, the Malayalam-English Dictionary by Hermann Gundert and the Malayalam-Portuguese Dictionary by Arnos Padre are available in the Library. Many of the old ecclesiastical books from the college's early years were moved to the Bishop's House collection to make room for academic books. The College Library provides open access to its documents, and any member can browse through the collections, which are arranged according to the Dewey Decimal Classification (DDC). The Library provides an online catalogue to trace these books. Members can also access scholarly literature available online through the National Library and Information Services Infrastructure for Scholarly Content (N-LIST) of UGC-INFLIBNET. Through N-LIST, members can access more than 6,000 journals and more than 199,500 e-books, and a further 600,000 e-books through the National Digital Library of India. Different sections of the College Library house 40 computer terminals for the use of students and faculty. The Library provides reference services, scholarly literature search services and similarity checking for plagiarism. Department of Information Studies The academic Department of Information Studies offers the Postgraduate Certificate Programme in Information Studies. The programme is delivered entirely online using a virtual learning platform and consists of three modules: information literacy, academic writing and communication, and intellectual property and its management. The programme is the first of its kind in India. The Department also imparts training on electronic reference management and awareness of web research profiles, citation metrics, intellectual property rights, ethical methods of research and publishing, pseudo-journals and plagiarism. In the first quarter of 2020, the department started to offer the UGC-approved two-credit course Research and Publication Ethics online through the St Thomas E-learning Platform (STEP), the Moodle instance of the college. Notable alumni
Alphons Joseph – Malayalam film music director
Ashok Menon – Former Judge, Kerala High Court
Anil Akkara – Former Member of the Kerala Legislative Assembly
Arun G Raghavan – Actor
Biju Menon – Actor
C. Achutha Menon – Indian communist leader and Chief Minister of Kerala state
C. Janardhanan – Communist leader, former Member of Parliament
Chinmayananda Saraswati – Chinmaya Mission
Devan – Malayalam film actor
E. Santhosh Kumar – Winner of the Kerala Sahitya Akademi Award for the best novel in 2012
Eknath Easwaran – Spiritual teacher
Eluvathingal Devassy Jemmis – Padma Shri awardee, Professor at Indian Institutes of Science Education and Research
EMS Namboodiripad – Indian communist leader and first Chief Minister of Kerala state
George Alapatt – Fourth Bishop of the Syro-Malabar Catholic Archdiocese of Thrissur
George Menachery – Historian
Hariharan (director)
Johnson – Malayalam music director
Joseph Mundassery – Literary critic and Education Minister of Kerala
Kochouseph Chittilappilly – MD of V-Guard Industries Ltd
Mar Aprem Mooken – Metropolitan of the Church of the East in India
Mathai Manjooran – Indian independence activist
Narain (actor) – Actor
Ouseppachan – Malayalam film music director
P. T. Kunju Muhammed – Malayalam film director and producer
Panampilly Govinda Menon – First Prime Minister of Cochin, last Chief Minister of Travancore-Cochin and Union Cabinet Minister for Law and Railways
Paulose II (Indian Orthodox Church) – Supreme primate of the Malankara Orthodox Syrian Church
Ranjith Sankar – Director
Shine Tom Chacko – Actor
T. J. Saneesh Kumar Joseph – Member of the Kerala Legislative Assembly
T. Pradeep – Professor of Chemistry, Chennai
T. G. Ravi – Malayalam film actor
V. M. Sudheeran – Political leader
Victor Manjila – Former Indian football player
Notable faculty
Joseph Mundassery
George Menachery
C. Raveendranath
K. A. Jayaseelan
See also
Christ College, Irinjalakuda
Sree Kerala Varma College
Vimala College
University of Calicut
References
External links Official Website of St. Thomas' College Thrissur Official website of Department of Computer Science, St. Thomas' College, Thrissur Thrissur Colleges Catholic universities and colleges in India Archdiocese of Thrissur Arts and Science colleges in Kerala Colleges affiliated with the University of Calicut Colleges in Thrissur Educational institutions established in 1889 1889 establishments in British India Academic institutions formerly affiliated with the University of Madras
44363434
https://en.wikipedia.org/wiki/Robert%20B.%20Grady
Robert B. Grady
Robert Bruce Grady (12 January 1943 – 18 October 2014) was an engineer who was an expert in software development. He attended the Massachusetts Institute of Technology (MIT), graduating in 1965 with a bachelor's degree in electrical engineering. He was a member of Sigma Phi Epsilon fraternity, and he was a star player on the MIT basketball team. In 1965 he received the Howard W. Johnson Award for Male Senior Athlete of the Year, which is given to only one athlete per year across all athletics at MIT. Grady wrote three books on software metrics and project management: Software Metrics: Establishing a Company-Wide Program (with Deborah L. Caswell), Practical Software Metrics for Project Management and Process Improvement, and Successful Software Process Improvement. Grady and Caswell's first book, Software Metrics, has been cited 661 times in other scholarly articles and publications. References 1943 births 2014 deaths American software engineers American electrical engineers MIT Engineers men's basketball players American men's basketball players
7004765
https://en.wikipedia.org/wiki/Wii%20Menu
Wii Menu
The Wii Menu is the graphical shell of the Wii game console, as part of the Wii system software. It has four pages, each with a 4:3 grid, and each displaying the current time and date. Available applications, known as "channels", are displayed and can be navigated using the pointer capability of the Wii Remote. The grid is customizable; users can move channels (except for the Disc Channel) among the menu's 48 customizable slots by pressing and holding the A and B button while hovering over the channel the user wanted to move. By pressing the plus and minus buttons on the Wii Remote users can scroll across accessing empty slots. Pre-installed channels Disc Channel The Disc Channel is the primary way to play Wii and GameCube titles from supported Nintendo optical discs inserted into the console. If no disc is inserted, the message "Please insert a disc." will be displayed along with images of a template Wii and GameCube disc (the latter is not visible on the Wii Family Edition units, the Wii Mini, and the Wii U due to lack of GameCube support). The "Start" button will also remain deactivated until a playable disc is inserted. When a disc is inserted, the channel preview and banner on the menu will change to the one supplied by the title and the "Start" button will become available. If it is a GameCube disc, the banner and preview will change to the GameCube logo with the GameCube startup theme playing. Each Wii game disc includes a system update partition, which includes the latest Wii software from the time the game was released. If a disc that is inserted contains newer software than the one installed on the console, installing the new software will be required to play the game. This allows users without an internet connection to still receive system updates. When loaded into the disc slot, an icon on the Disc Channel that says "Wii System Update" appears. After selecting the channel, the Wii will automatically update. If these updates are not installed, the game will remain unplayable until the update is installed, as each time the channel is loaded with the game inserted, the update prompt will appear, and declining the update will return the player to the Wii Menu instead of starting the game. Games requiring a system update can still be played without updating using homebrew software, such as Gecko OS or a USB loader. Mii Channel The Mii Channel is an avatar creator, where users can design 3D caricatures of people called Miis by selecting from a group of facial and bodily features. At the Game Developers Conference 2007, Shigeru Miyamoto explained that the look and design of the Mii characters are based on Kokeshi, a form of Japanese doll used as souvenir gifts. A Wired interview of Katsuya Eguchi (producer of Animal Crossing and Wii Sports) held in 2006 confirmed that the custom player avatar feature shown at Nintendo's E3 Media Briefing would be included in the hardware. The feature was described as part of a "profile" system that contains the Mii and other pertinent player information. This application was officially unveiled by Nintendo in September 2006. It is incorporated into Wii's operating system interface as the "Mii Channel". Users can select from pre-made Miis or create their own by choosing custom facial shapes, colors, and positioning. 
In certain games like Wii Sports, Wii Play, Wii Fit, Wii Sports Resort, Wii Party, Wii Fit Plus, Mario & Sonic at the Olympic Games, WarioWare: Smooth Moves, Mario Kart Wii, Mario Party 8, My Pokémon Ranch, Animal Crossing: City Folk, Mario Strikers Charged, and Guitar Hero 5, each player's Mii serves as the character the player controls in some or all forms of gameplay. Miis can interact with other Wii users by showing up on their Wii consoles through the WiiConnect24 feature or by talking with other Miis created by Wii owners all over the world. This feature is called Mii Parade. Early-created Miis as well as those encountered in Mii Parades may show up as spectators in some games. Miis can be stored on Wii Remotes and taken to other Wii consoles. The Wii Remote can hold a maximum of 10 Miis. In addition, Mii characters can be transferred from a user's Wii to Nintendo 3DS consoles, as well as to supported Nintendo DS games, via the Mii Channel. While in the channel, pressing A, followed by B, then 1, and holding 2 on the Wii Remote allows the user to unlock this transfer feature. The Mii Channel was succeeded by the Mii Maker app for both Nintendo 3DS and Wii U, and by the Mii options in Settings for Nintendo Switch. According to Nintendo president Satoru Iwata, over 160 million Mii characters had been created using the Mii Channel as of May 2010. Photo Channel If a user inserts an SD card into the console, or receives photos (JPEG) or videos (MJPEG) via email, they can be viewed using the Photo Channel. The user can create a slideshow simply by inserting an SD card with photos and, optionally, MP3 or AAC files (see note regarding the December 10, 2007 update to version 1.1). The Wii will automatically add Ken Burns Effect transitions between the photos and play either the music on the SD card or built-in music in the background. A built-in editor allows users to add markings and effects to their photos or videos (the edits float statically above the videos). Mosaics can also be created with this feature. Puzzles can be created from photos or videos with varying degrees of difficulty (the first puzzle is always six pieces), with 6-, 12-, 24- and 48-piece puzzles available and a 192-piece option selectable while holding down 1 on the Wii Remote. Edited photos can be saved to the Wii and sent to other Wiis via the message board. According to the system's manual, the following file extensions (i.e. formats) are supported: Photos (jpeg/jpg), Movies (mov/avi), and Music (mp3/aac). JPEG files can be up to 8192x8192 in resolution and must be in baseline format. Video data contained within the .mov or .avi files must be in an OpenDML-compliant Motion JPEG format; many digital cameras use some variant of this format for their videos. Photos, even high resolution ones, are compressed and decreased in resolution. Photo Channel 1.1 Photo Channel 1.1 is an optional update to the Photo Channel that became available on the Wii Shop Channel on December 10, 2007. It allows users to customize the Photo Channel icon on the Wii Menu with photos from an SD Card or the Wii Message Board. It also allows playback of songs in random order. The update replaced MP3 support with support for MPEG-4 audio files encoded with AAC in the .m4a extension. Wii owners who updated to version 1.1 can revert to version 1.0 by deleting it from the channels menu in the data management setup. Consoles released after December 10, 2007 come with the version 1.1 update pre-installed, and cannot be downgraded to version 1.0. Owners of Japanese systems can download a "Revert to Photo Channel 1.0" Channel from the Wii Shop Channel if they wish to do so.
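The Photo Channel's documented JPEG limits (baseline encoding, at most 8192x8192 pixels) can be checked with a small, standard-library-only Python sketch like the one below. The limits are taken from the description above; the parsing and the check are illustrative and are not Nintendo's actual validation code.

```python
import struct

SOF_MARKERS = set(range(0xC0, 0xD0)) - {0xC4, 0xC8, 0xCC}  # start-of-frame markers

def jpeg_frame_info(data: bytes):
    """Return (sof_marker, width, height) from the first start-of-frame segment."""
    if data[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG file")
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            raise ValueError("corrupt JPEG structure")
        marker = data[i + 1]
        if marker == 0xFF:                        # padding byte before the real marker
            i += 1
            continue
        if marker in (0x01, 0xD8) or 0xD0 <= marker <= 0xD7:
            i += 2                                # markers that carry no payload
            continue
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        if marker in SOF_MARKERS:                 # SOF0 = baseline, SOF2 = progressive
            height, width = struct.unpack(">HH", data[i + 5:i + 9])
            return marker, width, height
        if marker == 0xDA:                        # start of scan reached without a SOF
            break
        i += 2 + length
    raise ValueError("no start-of-frame segment found")

def photo_channel_compatible(path: str) -> bool:
    """Apply the documented limits: baseline JPEG (SOF0), at most 8192x8192 pixels."""
    with open(path, "rb") as f:
        marker, width, height = jpeg_frame_info(f.read())
    return marker == 0xC0 and width <= 8192 and height <= 8192
```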
Wii Shop Channel The Wii Shop Channel allowed users to download games and other software by redeeming Wii Points, which could be obtained by purchasing Nintendo Points cards from retail outlets or directly through the Wii Shop Channel using MasterCard or Visa credit cards online. Users could browse the Virtual Console, WiiWare, or Wii Channels sections for downloads. A feature to purchase downloaded software as gifts for others became available worldwide on December 10, 2007. Additional channels that were not released at the console's launch were available for download in the Wii Shop Channel. These included the Internet Channel, Everybody Votes Channel, Check Mii Out Channel, Nintendo Channel, Netflix Channel, and the Japan-only Television Friend Channel. Until the channel's shutdown on January 30, 2019, all downloadable channels were free of charge. The channel was originally going to be called the Shopping Channel. Nintendo discontinued the Wii Shop Channel on January 30, 2019 (having announced that it planned to do so on September 29, 2017), with the purchase of Wii Points ending on March 26, 2018. The ability to redownload previously purchased content and to transfer Wii data from the Wii to the Wii U still remains available. Forecast Channel The Forecast Channel allowed weather reports and forecasts to be shown on the console from the Internet via the WiiConnect24 service. The Forecast Channel displayed a view of the Earth as a globe (courtesy of NASA), with which users could view weather in other regions. The user could also spin the globe. When fully zoomed out, an accurate star map was visible in the background. (The Big Dipper and the constellation Orion were easily recognizable, for example.) The Forecast Channel's features included the current forecast, the UV index, today's overall forecast, tomorrow's forecast, a 5-day forecast (only for the selected country in which the user lives), a laundry check (Japan only) and a pollen count (Japan only). The Forecast Channel first became available on December 19, 2006. Certain games like Madden NFL 07, Nights: Journey of Dreams, and Mario & Sonic at the Olympic Winter Games could use the Forecast Channel to simulate weather conditions depending on the player's region. There were slight variations of the Forecast Channel in different regions. When viewing weather conditions in Japan, a different set of weather icons was used. Additionally, the laundry index was only featured in the Japanese version. After the August 6, 2007 update, the Forecast Channel showed the icon for the current weather on the Wii Menu. If the channel was neglected for a long period, the icon would stop appearing, although the allowed interval was longer than that of the News Channel. The Forecast Channel (along with the News Channel) was not available in South Korea. Like the four other Wii channels (News Channel, Everybody Votes Channel, Check Mii Out Channel/Mii Contest Channel, Nintendo Channel), the Forecast Channel ended its seven-year support on June 27, 2013. News Channel The News Channel allowed users to access news headlines and current news events obtained from the Internet. News articles were available on a globe view, allowing users to view news from certain areas of the world (similar to the Forecast Channel), and as a slide show. The content was automatically updated and viewable via WiiConnect24, with clickable news images supported.
The channel contained seven categories: National News, International News, Sports, Arts/Entertainment, Business, Technology and Oddities. The News Channel became available in North America, Europe, and Australia on January 26, 2007. Content was provided in a variety of languages by the Associated Press, which had a two-year contract to provide news and photos to Nintendo. Canadian news was submitted by the Canadian Press for publication. Japanese news was provided by Goo. European news was provided by Agence France-Presse. Starting with the August 6, 2007 update, the News Channel showed a news ticker on the Wii Menu and when the channel was selected. However, not visiting the channel for a period of time resulted in the ticker not appearing; instead, "You must use the News Channel regularly for news to be displayed on this screen." was shown on the preview screen until the channel was opened again. A December 20, 2007 PAL region update increased the number of news feeds in the channel, sourced from a larger number of news agencies, making more news available per country. The News Channel (along with the Forecast Channel) was not available in South Korea. Like the four other Wii channels (Forecast Channel, Everybody Votes Channel, Check Mii Out Channel/Mii Contest Channel, Nintendo Channel), the News Channel ended its seven-year support on June 27, 2013. Get Connected Video Channel The Get Connected Video Channel or Wii & The Internet Channel (also known as the Wii + Internet Channel or Wii: See What You Can Do On the Internet) is installed on Wii console units manufactured in October 2008 or later. It contains an informational video describing the benefits of connecting the Wii console to the Internet, such as downloading extra channels, new software and Virtual Console titles, and playing games over Nintendo Wi-Fi Connection. The Get Connected Video Channel is the only pre-installed channel that takes up spare internal memory, and the only channel that can be manually deleted or moved to an SD Card by the user. The channel takes up 1,180 blocks of memory, which is over half the Wii's internal memory space. The large size of this channel is likely due to the fact that it is available in multiple languages: three videos in the U.S. version and six videos in the PAL version. Upon connecting to the Internet and running the channel, the user will be asked if they would like to delete it. It cannot be re-downloaded or restored upon deletion. The same video presentation contained in the channel can also be viewed on an archived version of Nintendo's official website. Furthermore, several gaming stores such as GameStop had this channel in their Wii stations. Unlike the other channels, the video in the channel is not translated digitally but is presented as multiple dubs, which means there are multiple copies of the same video in a single channel. The video is presented in the language corresponding to the Wii's language setting. Available languages are English, French, and Spanish in the U.S. version, and English, French, Spanish, German, Italian, and Dutch in the PAL version. The availability of multiple dubs is a likely factor contributing to the large size of the channel. Internet Channel The Internet Channel is a version of the Opera web browser developed for use on the Wii by Opera Software and Nintendo.
On December 22, 2006 a free demo version (promoted as "Internet Channel: Trial Version") of the browser was released. The final version (promoted as "Internet Channel: Final Version") of the browser was released on April 11, 2007 and was free to download until June 30, 2007. After this deadline had passed, the Internet Channel cost 500 Wii Points to download until September 1, 2009, though users who downloaded the browser before June 30, 2007, could continue to use it at no cost for the lifetime of the Wii system. An update (promoted as the "Internet Channel") on October 10, 2007 added USB keyboard compatibility. On September 1, 2009 the Internet Channel was made available to Wii owners for no cost of Wii Points and updated to include improved Adobe Flash Player support. A refund was issued to those who paid for the channel in the form of one free NES game download worth 500 Wii Points. The Internet Channel uses whichever connection is chosen in the Wii settings, and utilizes the user's internet connection directly; there is no third party network that traffic is being routed through. It receives a connection from a router/modem and uses a web browser to pull up HTTP and HTTPS (secure and encrypted) web pages. Opera, the Wii's web browser, is capable of rendering most web sites in the same manner as its desktop counterpart by using Opera's Medium Screen Rendering technology. For most Internet users, the Wii offers all of the functionality they need to perform the most common Internet tasks. The software is saved to the Wii's 512 MB internal flash memory (it can be copied to an SD card after it has been downloaded). The temporary Internet files (maximum of 5MB for the trial version) can only be saved to the Wii's internal memory. The application launches within a few seconds, after connecting to the Internet through a wireless LAN using the built-in interface or a wired LAN by using the USB to the Ethernet adapter. The Opera-based Wii browser allows users full access to the Internet and supports all the same web standards that are included in the desktop versions of Opera, including CSS and JavaScript. It is also possible for the browser to use technologies such as Ajax, SVG, RSS, and Adobe Flash Player 8 and limited support for Adobe Flash Player 9. Opera Software has indicated that the functionality will allow for third parties to create web applications specifically designed for the use on the Wii Browser, and it will support widgets, standalone web-based applications using Opera as an application platform. Third party APIs and SDKs have been released that allow developers to read the values of the Wii Remote buttons in both Flash and JavaScript. This allows for software that previously required keyboard controls to be converted for use with the Wii Remote. The browser was also used to stream BBC iPlayer videos from April 9, 2008 after an exclusive deal was made with Nintendo UK and the BBC to offer their catch-up service for the Wii. However, the September 2009 update caused the iPlayer to no longer operate. The BBC acknowledged the issue and created a dedicated channel instead. In June 2009, YouTube released YouTube XL, a TV-friendly version of the popular video-sharing website. The regular YouTube page would redirect the browser to YouTube XL, if the website detects that the Internet Channel or the PlayStation 3 browser is being used. 
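Server-side detection of the Wii's browser, as described for YouTube XL above, typically works by inspecting the User-Agent header of the incoming request. The sketch below shows the general idea in Python; the exact User-Agent substrings and the sample string are illustrative assumptions rather than an official specification.

```python
# Substrings that console browsers are assumed to include in their User-Agent
# headers; treat these as illustrative values, not authoritative ones.
CONSOLE_MARKERS = ("Nintendo Wii", "PLAYSTATION 3")

def wants_tv_layout(user_agent: str) -> bool:
    """Return True when the request appears to come from a TV console browser."""
    return any(marker in user_agent for marker in CONSOLE_MARKERS)

if __name__ == "__main__":
    sample = "Opera/9.30 (Nintendo Wii; U; ; 3642; en)"   # example string for illustration
    print(wants_tv_layout(sample))            # True  -> serve the TV-friendly page
    print(wants_tv_layout("Mozilla/5.0"))     # False -> serve the regular page
```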
Everybody Votes Channel Everybody Votes Channel allowed users to vote in simple opinion polls and compare and contrast opinions with those of friends, family, and people across the globe. Everybody Votes Channel was launched on February 13, 2007, and was available in the Wii Channels section of the Wii Shop Channel. The application allowed Wii owners to vote on various questions using their Mii as a registered voter. Additionally, voters were also able to make predictions for the choice that will be the most popular overall after their own vote has been cast. Each Mii's voting and prediction record is tracked and voters can also view how their opinions compare to others. Whether the Mii is correct in its predictions or not is displayed on a statistics page along with a counter of how many times that Mii has voted. Up to six Miis would be registered to vote on the console. The channel was free to download. Each player would make a suggestion for a poll a day. Like the other four Wii channels (Forecast Channel, News Channel, Nintendo Channel, Check Mii Out Channel/Mii Contest Channel), the Everybody Votes Channel ended its seven-year support on June 27, 2013 due to Nintendo shifting its resources to its next generation projects. Unlike the other discontinued channels, Everybody Votes Channel remains accessible with users able to view the latest poll data posted, albeit the channel will never be updated again. Check Mii Out Channel The Check Mii Out Channel (also known as the Mii Contest Channel in Australia and Europe) was a channel that allowed players to share their Miis and enter them into popularity contests. It was first available on November 11, 2007. It was available free to download from the Wii Channels section of the Wii Shop Channel. Users would post their own Miis in the Posting Plaza, or import other user-submitted Miis to their own personal Mii Parade. Each submitted Mii was assigned a 12-digit entry number to aid in searching. Submitted Miis were given 2 initials by their creator and a notable skill/talent to aid in sorting. In the Contests section, players submitted their own Miis to compete in contests to best fit a certain description (e.g. Mario without his cap). After the time period for sending a Mii had expired, the user had the choice of voting for three Miis featured on the judging panel, with ten random Miis being shown at a time. Once the judging period is over, the results of the contest may be viewed. Their selection and/or submission's popularity in comparison to others was displayed, as well as the winning Mii and user. The Check Mii Out Channel sent messages to the Wii Message Board concerning recent contests. Participants in certain contests would add their user and submitted Mii to a photo with a background related to the contest theme. This picture would then be sent to the Wii Message Board. This channel ended its seven-year support on June 27, 2013 like the four other channels (Forecast Channel, News Channel, Everybody Votes Channel, Nintendo Channel). Nintendo Channel The Nintendo Channel (also known as the Everybody's Nintendo Channel in Japan) allowed Wii users to watch videos such as interviews, trailers, commercials, and even download demos for the Nintendo DS. The Nintendo Channel has the ability to support Nintendo Entertainment System games, Super NES games, Nintendo 64 games, and GameCube games. Later the channel was used for the Wii U, and the Nintendo Switch under the name of the Nintendo E-Shop. 
In this capacity the channel worked in a similar way to the DS Download Station. The channel provided games, info, pages and users could rate games that they have played. A search feature was also available to assist users in finding new games to try or buy. The channel had the ability to take the user directly into the Wii Shop Channel for buying the wanted game immediately. The Nintendo Channel was launched in Japan on November 27, 2007, in North America on May 7, 2008, and in Europe and Australia on May 30, 2008. The Nintendo Channel was updated with different Nintendo DS demos and new videos every week; the actual day of the week varies across different international regions. An updated version of the Nintendo Channel was released in Japan on July 15, 2009, North America on September 14, 2009, and in Europe on December 15, 2009. The update introduced a new interface and additional features, options, and statistics for users to view. However, the European version was missing some of these new additional features, such as options for choosing video quality. In addition, a weekly show known as Nintendo Week began airing exclusively on the North American edition of the channel, while another show, Nintendo TV, was available on the UK version of the channel. The Nintendo Channel and the other 4 channels (Forecast Channel, News Channel, Everybody Votes Channel, Check Mii Out Channel/Mii Contest Channel) ended their seven-year support on June 27, 2013. A few shows appeared on Nintendo Channel which were no more than 20 minutes long: Nintendo Week Nintendo Week was a show on the Nintendo Channel. The hosts were Gary and Allison, but other co-hosts appeared as well like Dark Gary, Daniel, and others. Ultimate Wii Challenge/New Super Mario Bros. Wii Challenge The hosts were David and Ben. They tried to beat each other's time in Nintendo Games like New Super Mario Bros. Wii, Donkey Kong Country Returns, Super Mario Galaxy 2, and Kirby's Epic Yarn. In a few episodes, Ben and David worked together in levels of a few games. Disconnection It was announced on April 12, 2013 that the Forecast Channel, the News Channel, the Everybody Votes Channel, the Check Mii Out Channel/Mii Contest Channel, and the Nintendo Channel would close permanently on June 27, 2013, as Nintendo terminated the WiiConnect24 service which these channels required, and shifted their resources to their next-generation projects. Additional channels These channels are those that can be acquired through the usage of various games and accessories. Wii Fit/Wii Fit Plus Channel Wii Fit allows users to install the Wii Fit Channel to the Wii Menu. The channel allows them to view and compare their results, and those of others, as well as their progress in the game, without requiring the game disc to be inserted. The channel is essentially a stripped down version of Wii Fit. It allows users to view statistics from the game including users' BMI measurements and balance test scores in the form of a line graph, as well as keep track of the various activities they have undertaken with a calendar. Users are also able to weigh themselves and do a BMI and balance test with the channel once per day. However, if the player wishes to do any exercises or play any of the aerobics games and/or balance games, the game prompts the user to insert the Wii Fit game disc. Mario Kart Channel Mario Kart Wii allows players to install the Mario Kart Channel on their Wii console. 
The channel can work without inserting the Mario Kart Wii disc into the console, but to compete in races and time trials the disc is required. The use of the Mario Kart Channel allows for a number of options. A ranking option lets players see their best Time Trial scores for each track and compare their results to those of their friends and other players worldwide, represented by their Miis. Players will have the option of racing against the random or selective ghosts, or improving their results gradually by taking on the ghosts of rivals, those with similar race times. Users have the option to submit these times for others around the world to view. Players can also manage and register friends using the channel and see if any of them are currently online. Another feature of the channel are Tournaments, where Nintendo invited players to challenges similar to the missions on Mario Kart DS. Players were also able to compare their competition rankings with other players. As of May 20, 2014, most features of the channel have been discontinued. Jam with the Band Live Channel The Nintendo DS game Jam with the Band supports the Jam with the Band Live Channel (known as the Speaker Channel in Japan) that allows players to connect their game to a Wii console and let the game's audio be played through the channel. The channel supports multiple players. Wii Speak Channel Users with the Wii Speak peripheral are able to access the Wii Speak Channel. Users can join one of four rooms (with no limit to the number of people in each room) to chat with others online. Each user is represented by their own Mii, which lip-syncs to their words. In addition, users can also leave audio messages for other users by sending a message to their Wii Message Board. Users can also photo slideshows and comment on them. The Wii Speak Channel became available in North America and Europe on December 5, 2008, and was discontinued on May 20, 2014. The Wii Speak Channel is succeeded by Wii U Chat, which is standardized for the Wii U console. Rabbids Channel A channel created by Rabbids Go Home. When the game is started up for the first time or when the player goes to the player profile screen, the player may install the Rabbids Channel, which will appear on the Wii Menu after downloaded. Players can use the channel to view other people's Rabbids and enter contests. Downloadable channels Downloadable Channels are Channels that can be bought from the Wii Shop Channel. Virtual Console Channels Virtual Console channels are channels that allow users to play their downloaded Virtual Console games obtained from the Wii Shop Channel. The Virtual Console portion of the Wii Shop Channel specializes in older software originally designed and released for home entertainment platforms that are now defunct. These games are played on the Wii through the emulation of the older hardware. The prices are generally the same in almost every region and are determined primarily by the software's original platform. Originally, there was going to be one whole Virtual Console channel where you can launch your Virtual Console games sorted by game console, but this idea was dropped. WiiWare Channels Functioning similarly to the Virtual Console channels, WiiWare channels allow users to use their WiiWare games obtained from the Wii Shop Channel. The WiiWare section specializes in downloadable software specifically designed for the Wii. The first WiiWare games were made available on March 25, 2008 in Japan. 
WiiWare games launched in North America on May 12, 2008, and launched in Europe and Australia on May 20, 2008. The WiiWare section is being touted as a forum to provide developers with small budgets to release smaller-scale games without the investment and risk of creating a title to be sold at retail (somewhat similar to the Xbox Live Arcade and the PlayStation Store). While actual games have been planned to appear in this section since its inception, there had been no official word on when any would be appearing until June 27, 2007, when Nintendo made an official confirmation in a press release which revealed the first titles would surface sometime in 2008. According to Nintendo, "The remarkable motion controls will give birth to fresh takes on established genres, as well as original ideas that currently exist only in developers' minds." Like Virtual Console games, WiiWare games are purchased using Wii Points. Nintendo handles all pricing options for the downloadable games. Television Friend Channel (Japan only) The Television Friend Channel allowed Wii users to check what programs are on the television. Content was provided by Guide Plus. It was developed by HAL Laboratory. The channel had been said to be "very fun and Nintendo-esque". A "stamp" feature allowed users to mark programs of interest with a Mii-themed stamp. If an e-mail address or mobile phone number would have been registered in the address book, the channel could send out an alert 30 minutes prior to the start of the selected program. The channel tracked the stamps of all Wii users and allowed users to rate programs on a five-star scale. Additionally, when the channel was active the Wii Remote could be used to change the TV's volume and channel so that users can tune into their shows by way of the channel. The Television Friend Channel launched in Japan on March 4, 2008, and was discontinued on July 24, 2011 due to the shutdown of analog television broadcasts in Japan. It was never launched outside Japan, as most countries, unlike Japan, have a guide built into set-top boxes and/or TVs. The Television Friend Channel was succeeded by the now-defunct Nintendo TVii, which was standardized for the Wii U console. It also had the Kirby 1-UP sound, since it was made by HAL Laboratory. This was later removed before the release of the channel. Digicam Print Channel The Digicam Print Channel was a channel developed in collaboration with Fujifilm that allowed users to import their digital photos from an SD card and place them into templates for printable photo books and business cards through a software wizard. The user was also able to place their Mii on a business card. The completed design would then be sent online to Fujifilm who printed and delivered the completed product to the user. The processing of individual photos was also available. The Digicam Print Channel became available from July 23, 2008 in Japan, and ceased operation on June 26, 2013. Today and Tomorrow Channel The Today and Tomorrow Channel became available in Japan on December 2, 2008, and in Europe, Australia, and South Korea on September 9, 2009. The channel was developed in collaboration with Media Kobo and allows users to view fortunes for up to six Miis across five categories: love, work, study, communications, and money. The channel also features a compatibility test that compares two Miis, and also gives out "lucky words" that must be interpreted by the user. 
The channel uses Mii birthdate data, but users must input a birth year when they are loaded onto the channel. This channel was never released in North America, and although it was discontinued on January 30, 2019 with the Wii Shop Channel discontinuation, it can still be redownloaded if obtained before the Wii Shop Channel's closure. Wii no Ma (Japan only) A video on-demand service channel was released in Japan on May 1, 2009. The channel is a joint venture between Nintendo and Japanese advertising agency Dentsu. The channel's interface is built around a virtual living room, where up to 8 Miis can be registered and interact with each other. The virtual living room contains a TV which takes the viewer to the video list. Celebrity "concierge" Miis occasionally introduce special programming. Nintendo ceased operations of Wii no Ma on April 30, 2012. Demae Channel (Japan only) A food delivery service channel was released in Japan on May 26, 2009. The channel was a joint venture between Nintendo and Japanese on-line food delivery portal service Demae-can, and was developed by Denyu-sha. The channel offered a wide range of foods provided by different food delivery companies which can be ordered directly through the Wii channel. A note was posted to the Wii Message Board containing what had been ordered and the total price. The food was then delivered to the address the Wii user has registered on the channel. On February 22, 2017, Demae Channel was delisted from the Wii Shop Channel, it was later discontinued alongside the Wii U version on March 31, 2017. BBC iPlayer Channel (UK only) Wii access to the BBC iPlayer was interrupted on April 9, 2008, when an update to the Opera Browser turned out to be incompatible with the BBC iPlayer. The BBC chose not to make the BBC iPlayer compatible with the upgrade. This was resolved on November 18, 2009 when they released the BBC iPlayer Channel, allowing easier access to the BBC iPlayer. The BBC had since offered a free, dedicated Wii channel version of their BBC iPlayer application which is only available in the UK. By February 10, 2015, however, the channel was retired and consequently removed from Wii Shop Channel since newer versions are not compatible, and as per BBC's policy to retire older versions as a resource management. The channel had since been succeeded by the BBC iPlayer app on the UK edition of the Wii U eShop, which was released in May 2015. Netflix Channel A channel released in the United States and Canada on October 18, 2010 and in the UK and Ireland on January 9, 2012. This channel allowed Netflix subscribers to use that service's "Watch Instantly" movie streaming service over the Wii with their regular Netflix subscription fee, and replaced the previous Wii "streaming disc" mailed to Netflix customers with Wii consoles from February to October 2010 due to contractual limitations involving Xbox 360 exclusivity. The channel was free to download in the Wii Channels section of the Wii Shop Channel. The channel displayed roughly 12 unique categories of videos with exactly 75 video titles in each category. The TV category had many seasons of videos (i.e. 15–100 episodes) associated with each title. There were also categories for videos just watched, new releases, and videos recommended (based on the user's Netflix subscription history). On July 31, 2018, the channel was delisted from the Wii Shop Channel; Netflix would drop support for the Wii on January 30, 2019. 
LoveFilm Instant Channel (UK only) On 4 December 2012, the LoveFilm Instant channel was available to download on Wii consoles in the UK; the channel was discontinued on October 31, 2017. Kirby TV Channel (PAL regions only) The Kirby TV Channel launched on June 23, 2011 in Europe, Australia and New Zealand, and has since been discontinued. The channel allowed users to view episodes of the animated series Kirby: Right Back at Ya! for free. This channel was succeeded by the Nintendo Anime Channel, a Nintendo 3DS video-on-demand app, available in Japan and Europe, which streamed curated anime or anime-inspired shows, such as Kirby: Right Back at Ya! Hulu Plus Channel (USA only) Hulu Plus Channel was a channel for Wii, also as announced in Nintendo Updates on Nintendo Channel. Hulu Plus Channel included classic shows and other Hulu included shows. The channel launched in 2012, and was only available in the United States. On January 30, 2019, Hulu dropped support for the Wii. The Legend of Zelda: Skyward Sword Save Data Update Channel The Legend of Zelda: Skyward Sword Save Data Update Channel fixes an issue in the game The Legend of Zelda: Skyward Sword. This title is the only Wii game to ever receive a downloadable, self-patching service, wherein previous titles with technical issues, such as Metroid: Other M, required the game's owners experiencing said issues to send their Wii consoles to customer service where Nintendo had to manually fix such issues. YouTube Channel The YouTube channel allowed the user to view YouTube videos on the television screen and had the ability to sign into an existing YouTube account. The YouTube channel, which became available without warning, is currently only available in the North American, UK, Japanese, and Australian versions of the Wii system, with the North American release on November 15, 2012, only three days before the Wii U was released in North America. Google planned to gradually make the channel available on Wii in other countries besides the aforementioned regions. The YouTube channel was initially categorized on the Wii Shop Channel as a "WiiWare" title by mistake, but this was later fixed when the Wii U Transfer Tool channel became available. On June 26, 2017, YouTube terminated legacy support for all devices that continue using the Flash-based YouTube app (typically found in most TV devices released before 2012), which includes the Wii. Wii U Transfer Tool Channel This application became available on the Wii Shop Channel the day the Wii U was released per respective region. The only purpose of this channel is to assist transferring all eligible content out from a Wii console to a Wii U console, where the said content would be available via Wii Mode on the target Wii U. The application can transfer all available listed WiiWare titles (initially with the sole exemption of LostWinds for unknown reasons, but the game had since become available for both transfer to and purchase on Wii U since May 2014), all available listed Virtual Console titles, game save data, DLC data, Mii Channel data, Wii Shop Channel data (including Wii Points, conditional that accumulated total does not exceed 10,000 Wii Points on target Wii U), and Nintendo Wi-Fi Connection ID data to a target Wii U (albeit now moot since the service was discontinued in May 2014), but it cannot transfer Wii settings data, pre-installed WiiWare/Virtual Console titles (such as Donkey Kong: Original Edition that came pre-installed in the PAL version of the Super Mario Bros. 
25th Anniversary Wii bundle), any game or application software that had been since delisted from the Wii Shop Channel prior to the release of Wii U (such as the Donkey Kong Country trilogy), software that is already available on the target Wii U's Wii Mode, WiiConnect24-supported software and save data (which includes the 16-digit Wii console Friend Code), and Nintendo GameCube save data since the Wii U does not support the latter two. It is possible to move content from multiple Wii consoles to a single target Wii U console, as well as multiple transfers from a single Wii console if required, albeit the last Wii console's content will overwrite any similar Wii data transferred to target Wii U earlier. Due to technical limitations, the channel cannot directly transfer any eligible background data which has been saved on the console's SD card. The Wii U Transfer Tool Channel features an animation based on the Pikmin series, wherein a visual transfer display of various Pikmin drones would automatically carry the eligible data and software to a Hocotate-based space ship bound for the Wii U. While context dynamic, this animation is not interactive, and only exists for entertainment purposes. The ability to transfer content from the Wii to the Wii U is still available for the foreseeable future after the Wii Shop Channel's shutdown on January 30, 2019. Amazon Video Amazon Video, a video on demand service provided by Amazon.com, was released as a downloadable Wii channel in the United States in January 2013; the service was discontinued on January 30, 2019. Crunchyroll In late 2014, Crunchyroll released their video app for the Wii's successor, Wii U, in North America. However, believing there are still many actively connected Wii consoles in its twilight years, Crunchyroll had surprised users with dedicated a Crunchyroll channel for Wii as well, launching the app categorized under "WiiWare" on October 15, 2015 in North America and the PAL regions. The Crunchyroll Wii channel currently only permits access to Premium account holders to the majority of the prime content. On May 5, 2017, less than 20 months after its launch, Crunchyroll ceased support for the Wii due to technical limitations after the service updated with new technology. Wii Message Board The Message Board allows users to leave messages for friends, family members, or other users on a calendar-based message board. Users could also use WiiConnect24 to trade messages and pictures with other Wii owners, conventional email accounts (email pictures to console, but not pictures to email), and mobile phones (through text messages). Each Wii has an individual wii.com email account containing the Wii Number. Prior to trading messages it is necessary to add and approve contacts in the address book, although the person added will not get an automatic notification of the request, and must be notified by other means. The service also alerts all users of incoming game-related information. Message Board was available for users to post messages that are available to other Wii users by usage of Wii Numbers with WiiConnect24. In addition to writing text, players can also include images from an SD card in the body of messages, as well as attaching a Mii to the message. Announcements of software updates and video game news are posted by Nintendo. The Message Board can be used for posting memos for oneself or for family members without going online. These messages could then be put on any day of the calendar. 
The Wii Message Board could also be updated automatically by a real-time game like Animal Crossing. Wii Sports, Wii Play, Mario Kart Wii, Wii Speak Channel, Wii Sports Resort, Super Mario Galaxy & Super Mario Galaxy 2 use the Message Board to update the player on any new high scores or gameplay advancements, such as medal placements in the former two titles, completions of races including a photo, audio messages, and letters from the Mailtoad via the Wii Message Board. Metroid Prime 3: Corruption, Super Mario Galaxy, Super Smash Bros Brawl, Elebits, Animal Crossing: City Folk, Dewy's Adventure and the Virtual Console game Pokémon Snap allow players to take screenshots and post them to the Message Board to edit later or send to friends via messages. Except for Nintendo GameCube games, the Message Board also records the play history in the form of "Today's Accomplishments". This feature automatically records details of what games or applications were played and for how long. It cannot be deleted or hidden without formatting the console itself. Prior to its closure, the Nintendo Channel was able to automatically tally all Wii game play data from the Message Board and display them in an ordered list within the channel. Subsequent system updates added a number of minor features to the Message Board, including minor aesthetic changes, USB keyboard support and the ability to receive Internet links from friends, which can be launched in the Internet Channel. An exploit in the Wii Message Board can be used to homebrew a Wii via a tool called LetterBomb. Discontinuation The WiiConnect24 service has been terminated as of June 27, 2013, completely ceasing the data exchange functionality of the Wii Message Board for all Wii consoles, whether as messages or game data. However, Nintendo is still able to continue sending some notification messages after that date to any continuously connected Wii consoles. SD Card Menu The SD Card Menu is a feature made available with the release of Wii Menu version 4.0. This menu allows the user to run Virtual Console games, WiiWare games, and Wii Channels directly from the SD card, which makes it possible to free up the Wii's internal memory. Applications can be downloaded to the SD card directly from the Wii Shop Channel as well. When running an application from the SD Card Menu, it is temporarily copied to the internal memory of the Wii, meaning the internal memory still must contain an amount of free blocks equal to the application's size. If the internal memory does not have enough space, the Channel will run an "Automanager" program, which clears up space for the user in one of many ways (selectable by the user). The manager can place the largest channels on the user's Wii in the SD card, put smaller channels on the SD card until enough space remains to run the channel, clear channels from the left side of the Wii menu to the right side, or from the right side to the left until there are enough blocks to run the channel. System updates and Parental Controls The Wii is capable of downloading updates to its core operating software. These updates may include additional features, patches/fixes, or support for newly released channels. When an update becomes available, Nintendo notifies users by sending a message to their console. Updates are included with certain Wii games, both requiring one to be fully updated in order to play and providing the update should one lack the necessary internet connection. 
The Wii Menu also featured Parental Controls to restrict access to certain operations. See also Wii system software Xbox 360 Dashboard/New Xbox Experience (NXE) XrossMediaBar References External links Wii Menu from Nintendo.com Introducing Wii Menu from Wii.com Menu Graphical user interface elements Video games scored by Kazumi Totaka
1607329
https://en.wikipedia.org/wiki/Raph%20Levien
Raph Levien
Raphael Linus Levien (also known as Raph Levien; born July 16, 1961) is a Dutch software developer, a member of the free software developer community, through his creation of the Advogato virtual community and his work with the free software branch of Ghostscript. From 2007 until 2018, and from 2021 onwards, he was employed at Google. He holds a PhD in Computer Science from UC Berkeley. He also made a computer-assisted proof system similar to Metamath: Ghilbert. In April 2016, Levien announced a text editor made as a "20% Project" (Google allows some employees to spend 20% of their working hours developing their own projects): Xi. Imaging and typography The primary focus of Levien's work and research is in the varied areas regarding the theory of imaging—that is, rendering pictures and fonts for electronic display, which in addition to being aesthetically and mathematically important also contribute to the accessibility and search-openness of the web. Levien has written several papers documenting his research in halftoning technology, which have been implemented in the Gimp-Print free software package, as well as by several commercial implementations. He also created Gill, the GNOME desktop illustration application which aimed at supporting the W3C SVG standard for Vector Graphics. He states it was named after Eric Gill, the English type designer responsible for the Gill Sans, Perpetua and Joanna fonts. Direct development on Gill ceased around the year 2000, but a fork of its codebase has evolved to Sodipodi, and through it to Inkscape. In 2009, Levien completed a PhD thesis entitled 'From Spiral to Spline: Optimal Techniques in Interactive Curve Design' and published a standalone essay on the mathematical history of Elastica. He calls the Elastica "A beautiful family of curves based on beautiful mathematics and a rich and fascinating history." Beginning in 2010, his work with Google largely focused on introducing high-quality, open licensed, well organized webfonts to the internet through Google's webfont API. Here, his experience with typographical technology, history and industry helped to shape the development of this growing resource, though he has since moved on from the project to work on Android fonts and text layout. One of his own fonts, Inconsolata (named in 2009 as one of the ten best programming fonts by Hivelogic, and generally known for its clean lines and elegant design) is now available within the Google library. Regarding this font and his curves work in general, Levien had to say, "And, in fact, I don't just use the Euler spirals, I use a mixture of curves (my package is called Spiro, which is kind of an abbreviation for polynomial spirals). Most of Inconsolata (the monospaced font mentioned above) is drawn using G4-continuous splines, which are a very close approximation to the Minimum Variation Curve of Henry Moreton. I now think that's overkill, and G2-continuous splines (the Euler spiral ones) are plenty, and could be done with fewer points." Advogato In November 1999, Levien founded Advogato, a social website for the free software community, to test his ideas of attack-resistant trust metrics and to provide a development-focused forum for the free software community that was free of the kind of commercial motivations of such sites as SourceForge. The site has been successful from the point of view of the first criterion, surviving many attacks aimed at subverting the attack metric, made both by developers trying out attacks, and by spammers. 
The site has needed only relatively minor changes to cope with these. The site's trust metric provides, alongside Epinions, one of the two most important datasets used in empirical analysis of trust metrics and reputation systems. Levien observed that Google's PageRank algorithm can be understood to be an attack resistant trust metric rather similar to that behind Advogato. The site has had a more rocky road as a forum for free software developers, and currently hosts less discussion than at its peak as developers have moved from forums to weblogs. Due to this, Advogato has added a syndication feature which includes the weblogs of its current certified developer base. It remains one of the earlier networking sites, and is still a place for active discussion on development of free software. Activism in GPL-licensed software and encryption legislation Levien played a small part in precipitating the relaxation of the US crypto export legislation, by filing for a Commodities Jurisdiction Request for a T-shirt containing an implementation of the RSA encryption algorithm, in four lines of Perl. At the time (1995), the code on the T-shirt would have been regarded as a munition by the United States and other NATO governments. ZD-Net's Interactive week summarised the issue that patents pose to the free software community: As a resolution to this conflict, in March 2000, Levien made a patent grant of his patent portfolio to the GPL community. Personal life He is divorced, with two sons: Alan and Max, and a stepdaughter. He is a member of the Berkeley Monthly Meeting of the Religious Society of Friends (Quakers). In the book TeX People: Interviews from the world of TeX, Levien notes, "I was born in Enkhuizen, the Netherlands, and moved to Virginia when I was three, so I don't really speak Dutch or anything but I do find myself with a liking for herring." Bibliography Raph Levien (2004). Attack Resistant Trust Metrics. Early draft of abandoned PhD manuscript. Raph Levien (2007). Lessons From Advogato (video) (abstract). Google Tech Talks, June 25, 2007. References External links Profile raph at Advogato Text of Levien's patent grant 1961 births Living people Free software programmers Reputation management American Quakers University of California, Berkeley alumni American typographers and type designers Google employees People from Enkhuizen Dutch emigrants to the United States
507928
https://en.wikipedia.org/wiki/Inferno%20%28operating%20system%29
Inferno (operating system)
Inferno is a distributed operating system started at Bell Labs and now developed and maintained by Vita Nuova Holdings as free software under the MIT license. Inferno was based on the experience gained with Plan 9 from Bell Labs, and the further research of Bell Labs into operating systems, languages, on-the-fly compilers, graphics, security, networking and portability. The name of the operating system and many of its associated programs, as well as that of the current company, were inspired by Dante Alighieri's Divine Comedy. In Italian, Inferno means "hell" — of which there are nine circles in Dante's Divine Comedy. Design principles Inferno was created in 1995 by members of Bell Labs' Computer Science Research division to bring ideas of Plan 9 from Bell Labs to a wider range of devices and networks. Inferno is a distributed operating system based on three basic principles drawn from Plan 9: Resources as files: all resources are represented as files within a hierarchical file system Namespaces: a program's view of the network is a single, coherent namespace that appears as a hierarchical file system but may represent physically separated (locally or remotely) resources Standard communication protocol: a standard protocol, called Styx, is used to access all resources, both local and remote To handle the diversity of network environments it was intended to be used in, the designers decided a virtual machine was a necessary component of the system. This is the same conclusion of the Oak project that became Java, but arrived at independently. The Dis virtual machine is a register machine intended to closely match the architecture it runs on, as opposed to the stack machine of the Java Virtual Machine. An advantage of this approach is the relative simplicity of creating a just-in-time compiler for new architectures. The virtual machine provides memory management designed to be efficient on devices with as little as 1 MiB of memory and without memory-mapping hardware. Its garbage collector is a hybrid of reference counting and a real-time coloring collector that gathers cyclic data. The Inferno kernel contains the virtual machine, on-the-fly compiler, scheduler, devices, protocol stacks, and the name space evaluator for each process' file name space, and the root of the file system hierarchy. The kernel also includes some built-in modules that provide interfaces of the virtual operating system, such as system calls, graphics, security, and math modules. The Bell Labs Technical Journal paper introducing Inferno listed several dimensions of portability and versatility provided by the OS: Portability across processors: it currently runs on ARM, SGI MIPS, HP PA-RISC, IBM PowerPC, Sun SPARC, and Intel x86 architectures and is readily portable to others. Portability across environments: it runs as a stand-alone operating system on small terminals, and also as a user application under Bell Plan 9, MS Windows NT, Windows 95, and Unix (SGI Irix, Sun Solaris, FreeBSD, Apple Mac OS X, Linux, IBM AIX, HP-UX, Digital Tru64). In all of these environments, Inferno programs see an identical interface. Distributed design: the identical environment is established at the user's terminal and at the server, and each may import the resources (for example, the attached I/O devices or networks) of the other. Aided by the communications facilities of the run-time system, programs may be split easily (and even dynamically) between client and server. 
Minimal hardware requirements: it runs useful applications stand-alone on machines with as little as 1 MiB of memory, and does not require memory-mapping hardware. Portable programs: Inferno programs are written in the type-safe language Limbo and compiled to Dis bytecode, which can be run without modifications on all Inferno platforms. Dynamic adaptability: programs may, depending on the hardware or other resources available, load different program modules to perform a specific function. For example, a video player might use any of several different decoder modules. These design choices were directed to provide standard interfaces that free content and service providers from concern of the details of diverse hardware, software, and networks over which their content is delivered. Features Inferno programs are portable across a broad mix of hardware, networks, and environments. It defines a virtual machine, known as Dis, that can be implemented on any real machine, provides Limbo, a type-safe language that is compiled to portable byte code, and, more significantly, it includes a virtual operating system that supplies the same interfaces whether Inferno runs natively on hardware or runs as a user program on top of another operating system. A communications protocol called Styx is applied uniformly to access both local and remote resources, which programs use by calling standard file operations, open, read, write, and close. As of the fourth edition of Inferno, Styx is identical to Plan 9's newer version of its hallmark 9P protocol, 9P2000. Most of the Inferno commands are very similar to Unix commands with the same name. History Inferno is a descendant of Plan 9 from Bell Labs, and shares many design concepts and even source code in the kernel, particularly around devices and the Styx/9P2000 protocol. Inferno shares with Plan 9 the Unix heritage from Bell Labs and the Unix philosophy. Many of the command line tools in Inferno were Plan 9 tools that were translated to Limbo. In the mid-1990s, Plan 9 development was set aside in favor of Inferno. The new system's existence was leaked by Dennis Ritchie in early 1996, after less than a year of development on the system, and publicly presented later that year as a competitor to Java. At the same time, Bell Labs' parent company AT&T licensed Java technology from Sun Microsystems. In March–April 1997 IEEE Internet Computing included an advertisement for Inferno networking software. It claimed that various devices could communicate over "any network" including the Internet, telecommunications and LANs. The advertisement stated that video games could talk to computers,–a PlayStation was pictured–cell phones could access email and voice mail was available via TV. Lucent used Inferno in at least two internal products: the Lucent VPN Firewall Brick, and the Lucent Pathstar phone switch. They initially tried to sell source code licenses of Inferno but found few buyers. Lucent did little marketing and missed the importance of the Internet and Inferno's relation to it. During the same time Sun Microsystems was heavily marketing its own Java programming language, which was targeting a similar market, with analogous technology, that worked in web browsers and also filled the demand for object-oriented languages popular at that time. Lucent licensed Java from Sun, claiming that all Inferno devices would be made to run Java. A Java byte code to Dis byte code translator was written to facilitate that. However, Inferno still did not find customers. 
The Inferno Business Unit closed after three years, and was sold to Vita Nuova. Vita Nuova continued development and offered commercial licenses to the complete system, and free downloads and licenses (not GPL compatible) for all of the system except the kernel and VM. They ported the software to new hardware and focused on distributed applications. Eventually, Vita Nuova released the 4th edition under more common free software licenses, and in 2021 they relicensed all editions under mainly the MIT license. Ports Inferno runs directly on native hardware and also as an application providing a virtual operating system which runs on other platforms. Programs can be developed and run on all Inferno platforms without modification or recompilation. Native ports include these architectures: x86, MIPS, ARM, PowerPC, SPARC. Hosted or virtual OS ports include: Microsoft Windows, Linux, FreeBSD, Plan 9, Mac OS X, Solaris, IRIX, UnixWare. Inferno can also be hosted by a plugin to Internet Explorer. Vita Nuova said that plugins for other browsers were under development, but they were never released. Inferno has also been ported to Openmoko, Nintendo DS, SheevaPlug, and Android. Distribution Inferno 4th edition was released in early 2005 as free software. Specifically, it was dual-licensed under two structures. Users could either obtain it under a set of free software licenses, or they could obtain it under a proprietary license. In the case of the free software license scheme, different parts of the system were covered by different licenses, including the GNU General Public License, the GNU Lesser General Public License, the Lucent Public License, and the MIT License, excluding the fonts, which are sub-licensed from Bigelow and Holmes. In March 2021, all editions were relicensed under mainly the MIT license. See also Language-based system Singularity (operating system) Notes References Further reading describes the 3rd edition of the Inferno operating system, though it focuses more on the Limbo language and its interfaces to the Inferno system, than on the Inferno system itself. For example, it provides little information on Inferno's versatile command shell, which is understandable since it is a programming language textbook. , uses Inferno for examples of operating system design. was intended to provide an operating-system-centric point of view, but was never completed. External links Documentation papers for the latest inferno release. Inferno Fourth Edition Download, including source code. Mailing list and other resources. Ninetimes: News and articles about Inferno, Plan 9 and related technologies. Inferno programmer's notebook - A journal made by an Inferno developer. Try Inferno: free, in-browser access to a live Inferno system. Inferno OS to Raspberry Pi Labs: Porting Inferno OS to Raspberry Pi 1996 software ARM operating systems Distributed operating systems Embedded operating systems Real-time operating systems X86 operating systems PowerPC operating systems MIPS operating systems
3246514
https://en.wikipedia.org/wiki/ECW%20%28file%20format%29
ECW (file format)
ECW (Enhanced Compression Wavelet) is a proprietary wavelet compression image format optimized for aerial photography and satellite imagery. It was developed by Earth Resource Mapping and is now owned by Intergraph, part of Hexagon AB. The lossy compression format efficiently compresses very large images with fine alternating contrast while retaining their visual quality. In 1998, at Earth Resource Mapping Ltd in Perth, Western Australia, company founder Stuart Nixon and two software developers, Simon Cope and Mark Sheridan, were researching rapid delivery of terabyte-sized images over the internet using inexpensive server technology. The outcome of that research was two products, Image Web Server (IWS) and ECW. ECW represented a fundamental mathematical breakthrough enabling Discrete Wavelet Transforms (DWT) and inverse-DWT operations to be performed on very large images very quickly, while using only a tiny amount of RAM. For ECW patents, see and . For IWS patent, see . These patents were obtained by ERDAS Inc. through its acquisition of Earth Resource Mapping on May 21, 2007. Hexagon AB indirectly owns these patents because it acquired Leica Geosystems in 2005, which had acquired ERDAS Inc. in 2001. After JPEG2000 became an image standard, ER Mapper added tools to read and write JPEG2000 data to the ECW SDK to form the ECW JPEG2000 SDK. After the subsequent purchase by ERDAS (itself subsequently merged into Intergraph), the software development kit was renamed the ERDAS ECW/JP2 SDK. Version 5 of the SDK was released on 2 July 2013. Properties Map projection information can be embedded into the ECW file format to support geospatial applications. Image data of up to 65,535 bands (layers or colors) can be compressed into the ECW v2 or v3 file format at a rate of over 25 MB per second on an i7 740QM (4-core) 1.731 GHz processor using v4.2 of the ECW/JP2 SDK. Data flow compression allows large images to be compressed with small RAM requirements. The file format can achieve typical compression ratios from 1:2 to 1:100. The ECW Protocol (ECWP) is an efficient streaming protocol used to transmit ECW and JPEG2000 images over networks, such as the Internet. ECWP supports ECWPS for private, encrypted streaming of image data over public networks such as the Internet. A very fast read-only SDK supporting ECW and JPEG2000 is available free of charge for desktop implementations on Windows, Linux and Mac OS X. A read/write SDK can be purchased for desktop and server implementations on Windows, Linux and Mac OS X. A fully functioning server implementation (using ECW, JPEG2000, ECWP and JPIP) is available within the PROVIDER SUITE of the Power Portfolio (formerly IWS) license. A previous version of the SDK (3.3) is available as open source and can be used for non-Microsoft operating systems, such as Linux, macOS or Android. References External links Ueffing - Wavelet based ECW image compression (PDF) ERDAS ECW/JP2 SDK V5 FAQ (PDF) GIS raster file formats
175004
https://en.wikipedia.org/wiki/Light-on-dark%20color%20scheme
Light-on-dark color scheme
Light-on-dark color scheme—also called black mode, dark mode, dark theme or night mode—is a color scheme that uses light-colored text, icons, and graphical user interface elements on a dark background, and is often discussed in terms of computer user interface design and web design. Many modern websites and operating systems offer the user an optional light-on-dark display mode. Some users find dark mode displays more visually appealing, and they can reduce power consumption. Displaying white at full brightness uses roughly six times as much power as pure black on a 2016 Google Pixel, which has an OLED display. Most modern operating systems support an optional light-on-dark color scheme. History Predecessors of modern computer screens, such as cathode-ray oscillographs and oscilloscopes, tended to plot graphs and other content as glowing traces on a black background. With the introduction of computer screens, user interfaces were originally formed on CRTs like those used for oscillographs or oscilloscopes. The phosphor was normally a very dark color, and lit up brightly when the electron beam hit it, appearing white, green, blue, or amber on a black background, depending on the phosphors applied to a monochrome screen. RGB screens continued to operate similarly, using all the beams set to "on" to form white. With the advent of teletext, research was done into which primary and secondary light colors and combinations worked best for this new medium. Cyan or yellow on black was typically found to be optimal from a palette of black, red, green, yellow, blue, magenta, cyan and white. The opposite color set, a dark-on-light color scheme, was originally introduced in WYSIWYG word processors to simulate ink on paper, and became the norm. In early 2018, designer Sylvain Boyer extended the dark mode concept to the core interface of smartphones with OLED screens to reduce power consumption. Firefox and Chromium have an optional dark theme for all internal screens, and in 2019, Apple announced that a light-on-dark mode would be available across all native applications in iOS 13 and iPadOS, and that third-party developers would be able to implement their own dark themes. Wikipedia has a light-on-dark mode, but it is only available in its Android and iOS applications; as a result, certain Chromium/Firefox extensions have been made to provide a dark mode on Wikipedia. In 2019, the "prefers-color-scheme" option was created for front-end web developers: a CSS media feature that signals a user's choice for their system to use a light or dark color theme. Energy usage Light-on-dark color schemes require less energy to display on OLED displays. This positively impacts battery life and energy consumption. While an OLED will consume around 40% of the power of an LCD displaying an image that is primarily black, it can use more than three times as much power to display an image with a white background, such as a document or web site. This can lead to reduced battery life and higher energy usage unless a light-on-dark color scheme is used. The long-term reduced power usage may also prolong the useful life of the display and battery. The energy savings that can be achieved using a light-on-dark color scheme result from how OLED screens work: in an OLED screen, each subpixel generates its own light and only consumes power when generating light. This is in contrast to how an LCD works: in an LCD, subpixels either block or allow light from an always-on (lit) LED backlight to pass through. 
"AMOLED Black" color schemes (those that use pure black instead of dark gray) do not necessarily save more energy than other light-on-dark color schemes that use dark gray instead of black, as the power consumption of an AMOLED screen decreases proportionately with the average brightness of the displayed pixels. Although AMOLED black does save more energy than dark gray, the additional savings are often negligible; AMOLED black will only give an additional energy saving of less than 1%, for instance, over the dark gray used in the dark theme of Google's official Android apps. In November 2018, Google confirmed that dark mode on Android saved battery life. Issues with the web Some argue that a color scheme with light text on a dark background is easier to read on the screen, because the lower overall brightness causes less eyestrain. Others argue to the contrary. The caveat is that most pages on the web are designed for white backgrounds; GIF and PNG images with a transparency bit instead of alpha channels tend to show up with choppy outlines, as well as causing problems with other graphical elements. CSS provides the prefers-color-scheme media feature to detect whether the user has requested a light or dark color scheme and to serve the requested scheme. The preference can be indicated by the user's operating system setting or by the user agent.

CSS example:

@media (prefers-color-scheme: dark) {
  body {
    color: #ccc;
    background: #222;
  }
}

JavaScript example:

if (window.matchMedia('(prefers-color-scheme: dark)').matches) {
  dark();
}

See also AMOLED Blackle OLED Solarized (color scheme) References User interfaces Display technology Color schemes Computer graphics
9002006
https://en.wikipedia.org/wiki/Conference%20room%20pilot
Conference room pilot
Conference room pilot (CRP) is a term used in software procurement and software acceptance testing. A CRP may be used during the selection and implementation of a software application in an organisation or company. The purpose of the conference room pilot is to validate a software application against the business processes of its end-users, by allowing end-users to carry out typical or key business processes using the new software. A commercial advantage of a conference room pilot is that it may allow the customer to prove that the new software will do the job (that it meets business requirements and expectations) before committing to buying it, thus avoiding the purchase of an inappropriate application. The term is most commonly used in the context of 'out of the box' (OOTB) or 'commercial off-the-shelf' (COTS) software. Compared to user acceptance testing Although a conference room pilot shares some features of user acceptance testing (UAT), it should not be considered a testing process – it validates that a design or solution is fit for purpose at a higher level than functional testing. Shared features of CRP and UAT include: End-to-end business processes are used as a "business input" for both Functionality demonstrations Non-functional validation (e.g. performance testing) Differences between a conference room pilot and a formal UAT: It attempts to identify how well the application meets business needs, and to identify gaps, whilst still in the design phase of the project There is an expectation that changes will be required before acceptance of the solution The software is 'on trial' and may be rejected completely in favour of another solution. References https://web.archive.org/web/20120306082951/http://www.ensync-corp.com/consulting/conference_room_pilot.cfm?section=consulting https://web.archive.org/web/20100410184030/http://www-archive.ui-integrate.uillinois.edu/news_art_crp.asp https://web.archive.org/web/20120306082951/http://www.bourkeconsulting.com/documents/POCCRPBCAWebsite020903.pdf https://web.archive.org/web/20120101095056/http://www.smthacker.co.uk/conference_room_pilot.htm Software testing
1185056
https://en.wikipedia.org/wiki/Segmentation%20and%20reassembly
Segmentation and reassembly
Segmentation and reassembly (SAR) is the process used to fragment variable-length packets into fixed-length cells and reassemble them, so as to allow them to be transported across asynchronous transfer mode (ATM) networks or other cell-based infrastructures. Since ATM's payload is only 48 bytes, nearly every packet from any other protocol has to be processed in this way. Thus, it is an essential process for any ATM node. It is usually handled by a dedicated chip, called the SAR. The process is conceptually simple: an incoming packet from another protocol to be transmitted across the ATM network is chopped up into segments that fit into 48-byte chunks carried as ATM cell payloads. At the far end, these chunks are fitted back together to reconstitute the original packet. A simplified sketch of this chunking appears at the end of this entry. The process is analogous to the fragmentation of IP packets on reaching an interface with a maximum transmission unit (MTU) smaller than the packet size, and the subsequent reassembly of the original packet once the fragments have reached its destination. Since different types of data are encapsulated in different ways, the details of the segmentation process vary according to the type of data being handled. There are several different schemes, referred to as ATM adaptation layers (AAL). The schemes are:
AAL0 – Raw cells with no special format
AAL1 – Constant bitrate, circuit emulation (T1, E1, etc.)
AAL2 – Variable bitrate synchronous traffic
AAL3/4 – Variable bitrate asynchronous traffic, e.g. Frame Relay transport
AAL5 – Used for most data traffic, such as IP
See also Packet segmentation Packet aggregation Network protocols Networking standards
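To make the fixed-cell idea concrete, here is a minimal TypeScript sketch of segmenting a packet into 48-byte cell payloads and reassembling it, as described above. It is only an illustration of the chunking: real adaptation layers such as AAL5 add their own framing (padding conventions, a length field and a CRC-32 trailer), which is not modelled here; the originalLength parameter merely stands in for that bookkeeping, and in practice the work is done by SAR hardware with a 5-byte ATM cell header wrapped around each 48-byte payload.

const CELL_PAYLOAD = 48;  // bytes of payload carried by each ATM cell

function segment(packet: Uint8Array): Uint8Array[] {
  const cells: Uint8Array[] = [];
  for (let offset = 0; offset < packet.length; offset += CELL_PAYLOAD) {
    const cell = new Uint8Array(CELL_PAYLOAD);  // zero-padded if the last chunk is short
    cell.set(packet.subarray(offset, offset + CELL_PAYLOAD));
    cells.push(cell);
  }
  return cells;
}

function reassemble(cells: Uint8Array[], originalLength: number): Uint8Array {
  const buffer = new Uint8Array(cells.length * CELL_PAYLOAD);
  cells.forEach((cell, index) => buffer.set(cell, index * CELL_PAYLOAD));
  return buffer.subarray(0, originalLength);  // strip the padding added during segmentation
}

// A 120-byte packet becomes three 48-byte cell payloads and is restored intact.
const packet = new Uint8Array(120).map((_, i) => i & 0xff);
const restored = reassemble(segment(packet), packet.length);
console.log(restored.length === packet.length);  // true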
9561751
https://en.wikipedia.org/wiki/Microsoft%20Office%202008%20for%20Mac
Microsoft Office 2008 for Mac
Microsoft Office 2008 for Mac is a version of the Microsoft Office productivity suite for Mac OS X. It supersedes Office 2004 for Mac (which did not have Intel native code) and is the Mac OS X equivalent of Office 2007. Office 2008 was developed by Microsoft's Macintosh Business Unit and released on January 15, 2008. Office 2008 was followed by Microsoft Office for Mac 2011 released on October 26, 2010, requiring a Mac with an Intel processor and Mac OS version 10.5 or better. Office 2008 is also the last version to feature Entourage, which was replaced by Outlook in Office 2011. Microsoft stopped supporting Office 2008 on April 9, 2013. Release Office 2008 was originally slated for release in the second half of 2007; however, it was delayed until January 2008, purportedly to allow time to fix lingering bugs. Office 2008 is the only version of Office for Mac supplied as a Universal Binary. Unlike Office 2007 for Windows, Office 2008 was not offered as a public beta before its scheduled release date. Features Office 2008 for Mac includes the same core programs currently included with Office 2004 for Mac: Entourage, Excel, PowerPoint and Word. Mac-only features included are a publishing layout view, which offers functionality similar to Microsoft Publisher for Windows, a "Ledger Sheet mode" in Excel to ease financial tasks, and a "My Day" application offering a quick way to view the day's events. Office 2008 supports the new Office Open XML format, and defaults to saving all files in this format. On February 21, 2008 Geoff Price revealed that the format conversion update for Office 2004 would be delayed until June 2008 in order to provide the first update to Office 2008. Microsoft Visual Basic for Applications is not supported in this version. As a result, such Excel add-ins dependent on VBA, such as Solver, have not been bundled in the current release. In June 2008, Microsoft announced that it is exploring the idea of bringing some of the functionality of Solver back to Excel. In late August 2008, Microsoft announced that a new Solver for Excel 2008 was available as a free download from Frontline Systems, original developers of the Excel Solver. However, Excel 2008 also lacks other functionality, such as Pivot Chart functionality, which has long been a feature in the Windows version. In May 2008, Microsoft announced that VBA will be making a return in the next version of Microsoft Office for Mac. AppleScript and the Open Scripting Architecture will still be supported. Limitations Office 2008 for Mac lacks feature parity with the Windows version. The lack of Visual Basic for Applications (VBA) support in Excel makes it impossible to use macros programmed in VBA. Microsoft's response is that adding VBA support in Xcode would have resulted in an additional two years added to the development cycle of Office 2008. Other unsupported features include: OMML equations generated in Word 2007 for Windows, Office "Ribbon", Mini Toolbar, Live Preview, and an extensive list of features are unsupported such as equivalent SharePoint integration with the Windows version. Some features are missing on Excel 2008 for Mac, including: data filters (Data Bars, Top 10, Color-based, Icon-based), structured references, Excel tables, Table styles, a sort feature allowing more than three columns at once and more than one filter on a sort. Benchmarks suggest that the original release of Office 2008 runs slower on Macs with PowerPC processors, and does not provide a significant speed bump for Macs with Intel processors. 
A data-compatibility problem has also been noted with CambridgeSoft's chemical structure drawing program, ChemDraw. Word 2008 does not retain the structural information when a chemical structure is copied from ChemDraw and pasted into a document. If a structure is recopied from a Word 2008 document, and is pasted back into ChemDraw, it appears as a non-editable image rather than a recognized chemical structure. There is no such problem in Word 2004 or X. This issue has not been fixed in the SP2 (version 12.2.0, released in July 2009). On May 13, 2008, Microsoft released Office 2008 Service Pack 1 as a free update. However, there have been many reports of the updater failing to install, resulting in a message saying that an updatable version of Office 2008 was not found. This appears to be related to users modifying the contents of the Microsoft Office folder in ways which do not cause problems with most other software (such as "localizing" using a program to remove application support files in unwanted languages), and which do not affect Office's operations, but which cause the updaters' installers to believe that the application is not valid for update. A small modification to the installer has been found an effective work-around (see reference). Another widespread problem reported after SP1 is that Office files will no longer open in Office applications when opened (double-clicked) from the Mac OS X Finder or launched from other applications such as an email attachment. The trigger for this problem is that Microsoft in SP1 unilaterally and without warning deprecated certain older Mac OS 'Type' codes such as "WDBN" that some files may have, either because they are simply very old, or because some applications assign the older Type code when saving them to the disk. Users have seen the problem affect even relatively new Type codes, however, such as 'W6BN'. Microsoft is apparently looking into the problem, but it is unclear if they will reinstate the older Type codes, citing security concerns. Another problem with cross-platform compatibility is that images inserted into any Office application by using either cut and paste or drag and drop result in a file that does not display the inserted graphic when viewed on a Windows machine. Instead, the Windows user is told "QuickTime and a TIFF (LZW) decompressor are needed to see this picture". A user presented one solution as far back as December 2004. A further example of the lack of feature parity is the track changes function. Whereas users of Word 2003 or 2007 for Windows are able to choose freely between showing their changes in-line or as balloons in the right-hand margin, choosing the former option in Word 2004 or Word 2008 for Mac OS also turns off all comment balloons; comments in this case are visible only in the Reviewing Pane or as popup boxes (i.e. upon mouseover). This issue has not been resolved to date and is present in the latest version of Word for the Mac, namely Word 2011. The toolbox found in Office 2008 also has problems when the OS X feature Spaces is used: switching from one Space to another will cause elements of the Toolbox to get trapped on one Space until the Toolbox is closed and reopened. The only remedy for this problem is to currently disable Spaces, or at least refrain from using it whilst working in Office 2008. Microsoft has acknowledged this problem and states that it is an architectural problem with the implementation of Spaces. Apple has been informed of the problem, according to Microsoft. 
The problem appears to be caused by the fact that the Toolbox is Carbon-based. Using Microsoft Office with Mac OS X 10.6 Snow Leopard solves some of the problems. In addition, there is no support for right to left and bidirectional languages (such as Arabic, Hebrew, Persian, etc.) in Office 2008, making it impossible to read or edit a right to left document in Word 2008 or PowerPoint 2008. Languages such as Thai are similarly not supported, although installing fonts can sometimes allow documents written in these languages to be displayed. Moreover, Office 2008 proofing tools support only a limited number of languages (Danish, Dutch, English, Finnish, French, German, Italian, Japanese, Norwegian, Portuguese, Spanish, Swedish, and Swiss German). Proofing tools for other languages failed to find their way to the installation pack, and are not offered by Microsoft commercially in the form of separately sold language packs. At the same time, Office applications are not integrated with the proofing tools native to Mac OS X 10.6 Snow Leopard. Microsoft Visio is not available for OS X. This means that any embedded Visio diagrams in other Office documents (e.g. Word) cannot be edited in Office on the Mac. Embedded Visio diagrams appear as a low-quality bitmap both in the WYSIWYG editor and upon printing the document on the Mac. Office for Mac 2008 also has a shorter lifecycle than Office 2007. Support for Office for Mac 2008 ended on April 9, 2013. As 32-bit software, it will not run on macOS Catalina or later versions of macOS. It is also not officially supported from OS X Mavericks to macOS Mojave. Editions See also Office suite Office Open XML software Comparison of Office Open XML and OpenDocument References External links MacBU interview: Office 2008 Exchange Server support Office 2008: lush "Escher" graphics engine First look: Office 2008 MacOS-only software Office 2008 for Mac 2008 software
22280755
https://en.wikipedia.org/wiki/St.%20Lawrence%20University%20%28Uganda%29
St. Lawrence University (Uganda)
St. Lawrence University (Uganda) (SLAU) is a private university in Uganda. Location The University campus is located in Mengo, Rubaga Division, in Kampala, Uganda's largest city and capital. The university campus sits on of land and is located near Kabaka's Lake, in Mengo, close to the main Lubiri (Palace of the Kabaka of Buganda), west of the central business district of Kampala. The coordinates of the main campus of SLAU are 0° 18' 6.00"N, 32° 33' 43.00"E (Latitude:0.301667; Longitude: 32.561945). History The University was founded in 2006. The University took in the first class of students in September 2007. Academic departments SLAU is composed of five faculties: Faculty of Business Studies Faculty of Education Faculty of Humanities Faculty of Computer Science & Information Technology Faculty of Industrial Art & Design Courses offered The following courses are offered at the University: Undergraduate degree courses Bachelor of Information Technology Bachelor of Computer Science Bachelor of Business Administration Bachelor of Arts with education Bachelor of Development Studies Bachelor of Economics Bachelor of Environment Management Bachelor of Guidance & Counseling Bachelor of Public Administration Bachelor of Mass Communication Bachelor of Social Work & Social Administration Bachelor of Industrial Art and Design Bachelor of International Relations and Diplomatic Studies Bachelor of Tourism and Hospitality Management Diploma courses Diploma in Business Administration Diploma in Information Technology Diploma in Industrial Art and Design Certificate courses Certificate in Information Technology Certificate of Industrial Art and Design See also Education in Uganda List of universities in Uganda List of Ugandan university leaders Universities Offering Business Courses in Uganda Lubaga Division References External links About SLAU Universities and colleges in Uganda Educational institutions established in 2006 Kampala District Lubaga Division 2006 establishments in Uganda
4249746
https://en.wikipedia.org/wiki/Oxygene%20%28programming%20language%29
Oxygene (programming language)
Oxygene (formerly known as Chrome) is a programming language developed by RemObjects Software for Microsoft's Common Language Infrastructure, the Java Platform and Cocoa. Oxygene is based on Delphi's Object Pascal, but also has influences from C#, Eiffel, Java, F# and other languages. Compared to the now deprecated Delphi.NET, Oxygene does not emphasize total backward compatibility, but is designed to be a "reinvention" of the language, be a good citizen on the managed development platforms, and leverage all the features and technologies provided by the .NET and Java runtimes. Oxygene is a commercial product and offers full integration into Microsoft's Visual Studio IDE on Windows, as well as its own IDE called Fire for use on macOS. The command-line compiler is available for free. Oxygene is one of six languages supported by the underlying Elements Compiler toolchain, next to C#, Swift, Java, Go and Mercury (based on Visual Basic.NET). From 2008 to 2012, RemObjects Software licensed its compiler and IDE technology to Embarcadero to be used in their Embarcadero Prism product. Starting in the Fall of 2011, Oxygene became available in two separate editions, with the second edition adding support for the Java and Android runtimes. Starting with the release of XE4, Embarcadero Prism is no longer part of the RAD Studio SKU. Numerous support and upgrade paths for Prism customers exist to migrate to Oxygene. As of 2016, there is only one edition of Oxygene, which allows development on Windows or macOS, and which can create executables for Windows, Linux, WebAssembly .NET, iOS, Android, Java and macOS. The language The Oxygene language has its origins in Object Pascal in general and Delphi in particular, but was designed to reflect the guidelines of .NET programming and to create fully CLR-compliant assemblies. Therefore, some minor language features known from Object Pascal / Delphi have been dropped or revised, while a slew of new and more modern features, such as Generics or Sequences and Queries have been added to the language. Oxygene is an object-oriented language, which means it uses classes, which can hold data and execute code, to design programs. Classes are "prototypes" for objects, like the idea of an apple is the prototype for the apple one can actually buy in a shop. It is known that an apple has a colour, and that it can be peeled: those are the data and executable "code" for the apple class. Oxygene provides language-level support for some features of parallel programming. The goal is to use all cores or processors of a computer to improve performance. To reach this goal, tasks have to be distributed among several threads. The .NET Framework's ThreadPool class offered a way to efficiently work with several threads. The Task Parallel Library (TPL) was introduced in .NET 4.0 to provide more features for parallel programming. Operators can be overloaded in Oxygene using the class operator syntax: class operator implicit(i : Integer) : MyClass; Note, that for operator overloading each operator has a name, that has to be used in the operator overloading syntax, because for example "+" would not be a valid method name in Oxygene. Program structure Oxygene does not use "Units" like Delphi does, but uses .NET namespaces to organize and group types. A namespace can span multiple files (and assemblies), but one file can only contain types of one namespace. 
This namespace is defined at the very top of the file: namespace ConsoleApplication1; Oxygene files are separated into an interface and an implementation section, which is the structure known from Delphi. The interface section follows the declaration of the namespace. It contains the uses clause, which in Oxygene imports types from other namespaces: uses System.Linq; Imported namespaces have to be in the project itself or in referenced assemblies. Unlike in C#, in Oxygene alias names cannot be defined for namespaces, only for single type names (see below). Following the uses clause a file contains type declarations, like they are known from Delphi: interface type ConsoleApp = class public class method Main; end; As in C#, the Main method is the entry point for every program. It can have a parameter args : Array of String for passing command line arguments to the program. More types can be declared without repeating the type keyword. The implementation of the declared methods is placed in the implementation section: implementation class method ConsoleApp.Main; begin // add your own code here Console.WriteLine('Hello World.'); end; end. Files are always ended with end. Types As a .NET language, Oxygene uses the .NET type system: There are value types (like structs) and reference types (like arrays or classes). Although it does not introduce own "pre-defined" types, Oxygene offers more "pascalish" generic names for some of them, so that for example the System.Int32 can be used as Integer and Boolean (System.Boolean), Char (System.Char), Real (System.Double) join the family of pascal-typenames, too. The struct character of these types, which is part of .NET, is fully preserved. As in all .NET languages types in Oxygene have a visibility. In Oxygene the default visibility is assembly, which is equivalent to the internal visibility in C#. The other possible type visibility is public. type MyClass = public class end; The visibility can be set for every type defined (classes, interfaces, records, ...). An alias name can be defined for types, which can be used locally or in other Oxygene assemblies. type IntList = public List<Integer>; //visible in other Oxygene-assemblies SecretEnumerable = IEnumerable<String>; //not visible in other assemblies Public type aliases won't be visible for other languages. Records Records are what .NET structs are called in Oxygene. They are declared just like classes, but with the record keyword: type MyRecord = record method Foo; end; As they're just .NET structs, records can have fields, methods and properties, but do not have inheritance and cannot implement interfaces. Interfaces Interfaces are a very important concept in the .NET world, the framework itself makes heavy use of them. Interfaces are the specification of a small set of methods, properties and events a class has to implement when implementing the interface. For example, the interface IEnumerable<T> specifies the GetEnumerator method which is used to iterate over sequences. Interfaces are declared just like classes: type MyInterface = public interface method MakeItSo : IEnumerable; property Bar : String read write; end; Please notice, that for properties the getter and setter are not explicitly specified. Delegates Delegates define signatures for methods, so that these methods can be passed in parameters (e.g. callbacks) or stored in variables, etc. They're the type-safe NET equivalent to function pointers. They're also used in events. 
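As a rough sketch only (the delegate, class and member names are hypothetical, and the event declaration shown is an assumption based on Oxygene's general event support rather than a quoted official example), a delegate type can be used both as the type of a field that stores a method reference and as the type of an event:

namespace DelegateSketch;

interface

type
  delegate NotifyHandler(sender : Object; message : String);

  Worker = public class
  private
    fCallback : NotifyHandler;        // hypothetical field holding a reference to a matching method
  public
    event Finished : NotifyHandler;   // events are declared in terms of a delegate type
  end;

implementation

end.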
When assigning a method to a delegate, one has to use the @ operator, so the compiler knows, that one doesn't want to call the method but just assign it. Oxygene can create anonymous delegates; for example methods can be passed to the Invoke method of a control without declaring the delegate: method MainForm.MainForm_Load(sender: System.Object; e: System.EventArgs); begin Invoke(@DoSomething); end; An anonymous delegate with the signature of the method DoSomething will be created by the compiler. Oxygene supports polymorphic delegates, which means, that delegates which have parameters of descending types are assignment compatible. Assume two classes MyClass and MyClassEx = class(MyClass), then in the following code BlubbEx is assignment compatible to Blubb. type delegate Blubb(sender : Object; m : MyClass); delegate BlubbEx(sender : Object; mx : MyClassEx); Fields can be used to delegate the implementation of an interface, if the type they're of implements this interface: Implementor = public class(IMyInterface) // ... implement interface ... end; MyClass = public class(IMyInterface) fSomeImplementor : Implementor; public implements IMyInterface; //takes care of implementing the interface end; In this example the compiler will create public methods and properties in MyClass, which call the methods / properties of fSomeImplementor, to implement the members of IMyInterface. This can be used to provide mixin-like functionality. Anonymous methods Anonymous methods are implemented inside other methods. They are not accessible outside of the method unless stored inside a delegate field. Anonymous methods can use the local variables of the method they're implemented in and the fields of the class they belong to. Anonymous methods are especially useful when working with code that is supposed to be executed in a GUI thread, which is done in .NET by passing a method do the Invoke method (Control.Invoke in WinForms, Dispatcher.Invoke in WPF): method Window1.PredictNearFuture; //declared as async in the interface begin // ... Calculate result here, store in variable "theFuture" Dispatcher.Invoke(DispatcherPriority.ApplicationIdle, method; begin theFutureTextBox.Text := theFuture; end); end; Anonymous methods can have parameters, too: method Window1.PredictNearFuture; //declared as async in the interface begin // ... Calculate result here, store in variable "theFuture" Dispatcher.Invoke(DispatcherPriority.ApplicationIdle, method(aFuture : String); begin theFutureTextBox.Text := aFuture ; end, theFuture); end; Both source codes use anonymous delegates. Property notification Property notification is used mainly for data binding, when the GUI has to know when the value of a property changes. The .NET framework provides the interfaces INotifyPropertyChanged and INotifyPropertyChanging (in .NET 3.5) for this purpose. These interfaces define events which have to be fired when a property is changed / was changed. Oxygene provides the notify modifier, which can be used on properties. If this modifier is used, the compiler will add the interfaces to the class, implement them and create code to raise the events when the property changes / was changed. property Foo : String read fFoo write SetFoo; notify; property Bar : String; notify 'Blubb'; //will notify that property "Blubb" was changed instead of "Bar" The modifier can be used on properties which have a setter method. The code to raise the events will then be added to this method during compile time. 
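For illustration, a minimal hypothetical sketch (the class, field and property names are invented) of a class using the notify modifier might look as follows; the setter body stays ordinary, since the notification code is injected by the compiler as described above:

namespace NotifySketch;

interface

type
  Customer = public class
  private
    fName : String;
    method SetName(value : String);
  public
    property Name : String read fName write SetName; notify;  // compiler adds the property-changed plumbing for 'Name'
  end;

implementation

method Customer.SetName(value : String);
begin
  fName := value;  // notification calls are inserted into this setter at compile time
end;

end.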
Code examples Hello World namespace HelloWorld; interface type HelloClass = class public class method Main; end; implementation class method HelloClass.Main; begin writeLn('Hello World!'); end; end. Generic container namespace GenericContainer; interface type TestApp = class public class method Main; end; Person = class public property FirstName: String; property LastName: String; end; implementation uses System.Collections.Generic; class method TestApp.Main; begin var myList := new List<Person>; //type inference myList.Add(new Person(FirstName := 'John', LastName := 'Doe')); myList.Add(new Person(FirstName := 'Jane', LastName := 'Doe')); myList.Add(new Person(FirstName := 'James', LastName := 'Doe')); Console.WriteLine(myList[1].FirstName); //No casting needed Console.ReadLine; end; end. Generic method namespace GenericMethodTest; interface type GenericMethodTest = static class public class method Main; private class method Swap<T>(var left, right : T); class method DoSwap<T>(left, right : T); end; implementation class method GenericMethodTest.DoSwap<T>(left, right : T); begin var a := left; var b := right; Console.WriteLine('Type: {0}', typeof(T)); Console.WriteLine('-> a = {0}, b = {1}', a , b); Swap<T>(var a, var b); Console.WriteLine('-> a = {0}, b = {1}', a , b); end; class method GenericMethodTest.Main; begin var a := 23;// type inference var b := 15; DoSwap<Integer>(a, b); // no downcasting to Object in this method. var aa := 'abc';// type inference var bb := 'def'; DoSwap<String>(aa, bb); // no downcasting to Object in this method. DoSwap(1.1, 1.2); // type inference for generic parameters Console.ReadLine(); end; class method GenericMethodTest.Swap<T>(var left, right : T); begin var temp := left; left:= right; right := temp; end; end. Program output: Type: System.Int32 -> a = 23, b = 15 -> a = 15, b = 23 Type: System.String -> a = abc, b = def -> a = def, b = abc Type: System.Double -> a = 1,1, b = 1,2 -> a = 1,2, b = 1,1 Differences between Delphi and Oxygene unit: Replaced with the namespace keyword. Since Oxygene doesn't compile per-file but per-project, it does not depend on the name of the file. Instead the unit or namespace keyword is used to denote the default namespace that all types are defined in for that file. procedure and function: method is the preferred keyword, though procedure and function still work. overload: In Oxygene all methods are overloaded by default, so no special keyword is needed for this. .Create(): This constructor call has been replaced by the new keyword. It can still be enabled in the project options for legacy reasons. Strings: Characters in strings are zero-based and read-only. Strings can have nil values, so testing against the empty string is not always sufficient. Criticism Some people would like to port their Win32 Delphi code to Oxygene without making major changes. This is not possible because while Oxygene looks like Delphi, there are enough changes to make it incompatible with a simple recompile. While the name gives it the appearance of another version of Delphi, that is not completely true. On top of the language difference, the Visual Component Library framework is not available in Oxygene. This makes porting even more difficult because classic Delphi code relies heavily on the VCL. See also C# Object Pascal Embarcadero Delphi Free Pascal Eiffel Java References External links .NET programming languages Class-based programming languages Mono (software) Object-oriented programming languages Pascal (programming language) compilers Pascal programming language family
45470892
https://en.wikipedia.org/wiki/Windows%20Insider
Windows Insider
Windows Insider is an open software testing program by Microsoft that allows users who own a valid license of Windows 11, Windows 10, or Windows Server to register for pre-release builds of the operating system previously only accessible to software developers. Microsoft launched Windows Insider for developers, enterprise testers and the "technically able" to test new developer features on pre-release software and builds to gather low level diagnostics feedback in order to identify, investigate, mitigate and improve the Windows 10 OS, with the help, support and guidance of the Insider program Participants, in direct communication with Microsoft Engineers via a proprietary communication and diagnostic channel. It was announced on September 30, 2014 along with Windows 10. By September 2015, over 7 million people took part in the Windows Insider program. On February 12, 2015, Microsoft started to test out previews of Windows 10 Mobile. Microsoft announced that the Windows Insider program would continue beyond the official release of Windows 10 for future updates. Gabriel Aul and Dona Sarkar were both previously the head of the Windows Insider Program. The current head of the Windows Insider program is Amanda Langowski. Similar to the Windows Insider program, the Microsoft Office, Microsoft Edge, Skype, Bing, Xbox and Visual Studio Code teams have set up their own Insider programs. History Microsoft originally launched Windows Insider for enterprise testers and the "technically able" to test out new developer features and to gather feedback to improve the features built into Windows 10. By the time of the official launch of Windows 10 for PCs, a total of 5 million volunteers were registered on both Windows 10 and Windows 10 Mobile. They were also among the first people to receive the official update to Windows 10. With the release of Windows 10, the Windows Insider app was merged with the Settings app. This made the ability to install Windows Insider preview builds an optional feature which could be accessed directly from within Windows 10. In May 2017, Microsoft announced that the program would extend to Windows Server 2016. The first Insider build for this operating system was released on 13 July 2017. On June 24, 2021, Microsoft announced that the program would extend to Windows 11, with the Dev and Beta channels transitioning to the new operating system. The first Insider build for Windows 11 was released on June 28, 2021 for the Dev Channel. Channels Windows Insider Preview updates are delivered to testers in different channels (previously "rings") or logical categories: Windows Insiders in Dev Channel (previously Fast Ring) receive updates prior to Windows Insiders in Beta Channel (previously Slow Ring) but might experience more bugs and other issues. Release Preview Channel (previously Release Preview Ring) was introduced in February 2016. As of November 5, 2019, Microsoft has completely abandoned the Skip Ahead ring from the Windows Insider Program, stating "Our goal is to provide everyone in the Fast ring the freshest builds at the same time". As of June 15, 2020, Microsoft has introduced "channels" model to its Windows Insider Program, succeeding its "rings" model. Supported devices Supported processors On July 17, 2017, reports began to come that Windows 10 Creators Update refused to install on PCs and tablets sporting Intel Atom "Clover Trail" processors. 
At first, it appeared as though this might have been a temporary block as Microsoft and hardware partners work to fix the issues preventing the operating system to run well. However, Microsoft later confirmed that devices running the "Clover Trail" Intel Atom processors would not be receiving the Creators Update, as the processors are no longer supported by Intel and does not have the appropriate firmware to properly run versions of Windows 10 newer than the Anniversary Update. The following processors are no longer supported and will remain on Windows 10 Anniversary Update: Atom Z2760 Atom Z2520 Atom Z2560 Atom Z2580 Because PCs with unsupported processors could not receive new features updates, Microsoft agreed to extend support for these PCs with the bug fixes and security updates for the latest compatible version of Windows 10. Versions of Windows 10 that were released before a microprocessor was released is also not supported and installations for those operating systems may be actively blocked. For example, Windows 10 Version 1507 LTSB will not install on Kaby Lake processors. Due to security concerns such as the zero day exploit, Windows 11 now requires an 8th generation or later Intel CPU or a 2nd generation AMD Ryzen or later CPU, with a Trusted Platform Module 2.0 security chip and Secure Boot enabled. Testing is being conducted on 7th generation Intel and 1st generation Ryzen CPUs. Older CPUs and systems without TPM or Secure Boot may be supported, but will require changes to be made to the system registry to be able to upgrade to the operating system. Supported smartphones Microsoft initially launched Windows 10 Technical Preview for certain third-generation (x30 series) phones from their Lumia family and subsequently released it to second-generation (x20 series) devices throughout the testing phase. Some hacked their non-Lumia phones (which were not supported at the time) to download the preview builds. Microsoft responded by blocking all unsupported models. To roll back the installed technical preview back to Windows Phone 8.1, Microsoft launched Windows Device Recovery Tool that removes Windows 10 and recovers the latest officially released software and firmware. Preview build 10080, released on May 14, 2015, was the first to support a non-Lumia device, the HTC One M8 for Windows. This was followed up by Xiaomi who, in partnership with Microsoft, released a ROM port of Windows 10 to its flagship Mi 4 handset on June 1, 2015. At that time, it was limited to certain registered users in China. Build 10080 and its follow-up build 10166 also added support for fourth-generation Lumia (x40 series) devices. As a result, all compatible Windows Phone 8 or later Lumia phones now support the preview. In August 2015, Microsoft stated that while all Windows Phone devices, including those from Microsoft's new hardware partners announced the previous year, would receive the final version of Windows 10 Mobile, not all would receive preview builds through the Insider program. However, the company did not provide any information at the time on whether new devices would be added to the preview program. Microsoft instead focused on promoting new devices that come with Windows 10 Mobile, including their Lumia 950 and Lumia 950 XL flagships, and the low-cost Lumia 550 and Lumia 650. Since their release, these new Windows 10 devices became eligible to receive future updates in advance via the Insider program, beginning with build 10586 on December 4, 2015. 
The Windows-based LG Lancet also received this version but has not been upgraded since. On February 19, 2016, Microsoft released the first Windows 10 Mobile "Redstone" preview, build 14267. Starting with this build, future preview versions became exclusively available for devices that were already running a non-Insider preview of the OS, except for the Mi4 ROM version. This was followed by build 14291, released for existing Windows 10 devices on March 17, 2016 in conjunction with the official RTM release of Windows 10 Mobile to third and fourth-generation Lumias. The following week, it became available to the newly upgraded older Lumias in addition to several other devices already on Windows 10 Mobile at the time. All supported devices subsequently received Insider preview builds as far as build 15063, the "Creators Update", released on March 20, 2017. This included the official release of build 14393, the "Anniversary Update", on August 2, 2016. However, it was announced in April 2017 that many devices, including all third-generation Lumias, would not receive the RTM version of the Creators Update and further "Redstone" development builds, following feedback from users. Of the devices that remain supported, nearly all, except the Lumia 640 and its XL variant, had originally come with Windows 10 Mobile instead of Windows Phone 8.1. See also Microsoft Developer Network Microsoft Garage References External links Microsoft articles needing attention Microsoft development tools Microsoft software Insider Windows 10 Windows 11 Windows Phone Windows Phone software
577098
https://en.wikipedia.org/wiki/FSF%20Free%20Software%20Awards
FSF Free Software Awards
Free Software Foundation (FSF) grants two annual awards. Since 1998, FSF has granted the award for Advancement of Free Software and since 2005, also the Free Software Award for Projects of Social Benefit. Presentation ceremonies In 1999 it was presented in the Jacob Javits Center in New York City. The 2000 Award Ceremony was held at the Museum of Jewish Art and History in Paris. From 2001 to 2005, the award has been presented in Brussels at the Free and Open source Software Developers' European Meeting (FOSDEM). Since 2006, the awards have been presented at the FSF's annual members meeting in Cambridge, Massachusetts. Advancement of Free Software award This is annually presented by the Free Software Foundation (FSF) to a person whom it deems to have made a great contribution to the progress and development of free software, through activities that accord with the spirit of free software. Winners Source: Award for the Advancement of Free Software 1998 Larry Wall for numerous contributions to Free Software, notably Perl. The other finalists were the Apache Project, Tim Berners-Lee, Jordan Hubbard, Ted Lemon, Eric S. Raymond, and Henry Spencer. 1999 Miguel de Icaza for his leadership and work on the GNOME Project. The other finalists were Donald Knuth for TeX and METAFONT and John Gilmore for work done at Cygnus Solutions and his contributions to the Free Software Foundation. 2000 Brian Paul for his work on the Mesa 3D Graphics Library. The other finalists were Donald Becker for his work on Linux drivers and Patrick Lenz for the open source site Freshmeat. 2001 Guido van Rossum for Python. The other finalists were L. Peter Deutsch for GNU Ghostscript and Andrew Tridgell for Samba. 2002 Lawrence Lessig for promoting understanding of the political dimension of free software, including the idea that "code is law". The other finalists were Bruno Haible for CLISP and Theo de Raadt for OpenBSD. 2003 Alan Cox for his work advocating the importance of software freedom, his outspoken opposition to the USA's DMCA as well as other technology control measures, and his development work on the Linux kernel. The other finalists were Theo de Raadt for OpenBSD and Werner Koch for GnuPG. 2004 Theo de Raadt for his campaigning against binary blobs, and the opening of drivers, documentation and firmware of wireless networking cards for the good of everyone. The other finalists were Andrew Tridgell for Samba and Cesar Brod for advocacy in Brazil. 2005 Andrew Tridgell for his work on Samba and his BitKeeper client which led to the withdrawal of gratis BitKeeper licenses, spurring the development of git, a free software distributed revision control system for the Linux kernel. The other finalists were Hartmut Pilch founder of the Foundation for a Free Information Infrastructure for his combatting of the Software Patent Directive in Europe and Theodore Ts'o for his Linux kernel filesystem development. 2006 Theodore Ts'o for his work on the Linux kernel and his roles as a project leader in the development of Kerberos and ONC RPC. The other finalists were Wietse Venema for his creation of the Postfix mailserver and his work on security tools, and Yukihiro Matsumoto for his work in designing the Ruby programming language. 2007 Harald Welte for his work on GPL enforcement (Gpl-violations.org) and Openmoko 2008 Wietse Venema For his "significant and wide-ranging technical contributions to network security, and his creation of the Postfix email server." 
2009 John Gilmore For his "many contributions and long term commitment to the free software movement." 2010 Rob Savoye For his work on Gnash Additionally, a special mention was made to honor the memory and contribution of Adrian Hands, who used a morse input device to code and successfully submit a GNOME patch, three days before he died from ALS. 2011 Yukihiro Matsumoto the creator of Ruby, for his work on GNU, Ruby, and other free software for over 20 years. 2012 Fernando Pérez for his work on IPython, and his role in the scientific Python community. 2013 Matthew Garrett for his work to support software freedom in relation to Secure Boot, UEFI, and the Linux kernel 2014 Sébastien Jodogne for his work on easing the exchange of medical images and developing Orthanc. 2015 Werner Koch the founder and driving force behind GnuPG. GnuPG is the de facto tool for encrypted communication. Society needs more than ever to advance free encryption technology. 2016 Alexandre Oliva for his work in promoting Free Software and the involvement in projects like the maintenance of linux-libre and the reverse engineer of the proprietary software used by Brazilian citizens to submit their taxes to the government. 2017 Karen Sandler for her dedication to Free Software as the former Executive Director of GNOME Foundation, current Executive Director of Software Freedom Conservancy, co-organizer of Outreachy, and through years of pro bono legal advice. 2018 Deborah Nicholson Deborah was the director of community operations at the Software Freedom Conservancy, Stallman praised her body of work and her unremitting and widespread contributions to the free software community. "Deborah continuously reaches out to, and engages, new audiences with her message on the need for free software in any version of the future. " 2019 Jim Meyering a prolific free software programmer, maintainer and writer, having contributed significantly to the GNU Core Utilities, GNU Autotools and Gnulib. 2020 Bradley M. Kuhn for his work in enforcing the GNU General Public License (GPL) and promoting copyleft through his position at Software Freedom Conservancy. Social benefit award Source: The Award for Projects of Social Benefit The Free Software Award for Projects of Social Benefit is an annual award granted by the Free Software Foundation (FSF). In announcing the award, the FSF explained that: According to Richard Stallman, former President of FSF, the award was inspired by the Sahana project which was developed, and was used, for organising the transfer of aid to tsunami victims in Sri Lanka after the 2004 Indian Ocean earthquake. The developers indicated that they hope to adapt it to aid for other future disasters. This is the second annual award created by the FSF. The first was the Award for the Advancement of Free Software (AAFS). Winners The award was first awarded in 2005, and the recipients have been: 2005 Wikipedia The Free Encyclopedia 2006 The Sahana FOSS Disaster Management System "An entirely volunteer effort to create technology for managing large-scale relief efforts" 2007 Groklaw "An invaluable source of legal and technical information for software developers, lawyers, law professors, and historians" 2008 Creative Commons "[For] foster[ing] a growing body of creative, educational and scientific works that can be shared and built upon by others [and] work[ing] to raise awareness of the harm inflicted by increasingly restrictive copyright regimes." 
2009 Internet Archive For collecting freely available information, archiving the web, collaborating with libraries, and creating free software to make information available to the public. 2010 Tor For writing software to help privacy online. 2011 GNU Health For their work with health professionals around the world to improve the lives of the underprivileged. 2012 OpenMRS "A free software medical record system for developing countries. OpenMRS is now in use around the world, including South Africa, Kenya, Rwanda, Lesotho, Zimbabwe, Mozambique, Uganda, Tanzania, Haiti, India, China, United States, Pakistan, the Philippines, and many other places." 2013 GNOME Foundation's Outreach Program for Women OPW's work benefits society, "addressing gender discrimination by empowering women to develop leadership and development skills in a society which runs on technology". 2014 Reglue which donates refurbished Linux computers to underprivileged children in Austin, TX. 2015 Library Freedom Project a partnership among librarians, technologists, attorneys, and privacy advocates which aims to make real the promise of intellectual freedom in libraries. By teaching librarians about surveillance threats, privacy rights and responsibilities, and digital tools to stop surveillance, the project hopes to create a privacy-centric paradigm shift in libraries and the local communities they serve. 2016 SecureDrop an open-source software platform for secure communication between journalists and sources (whistleblowers) 2017 Public Lab a non-profit organization that facilitates collaborative, open source environmental research in a model known as Community Science 2018 OpenStreetMap a collaborative project to create a free editable map of the world. Founded by Steve Coast in the UK in 2004, OpenStreetMap is built by a community of over one million community members and has found its application on thousands of Web sites, mobile apps, and hardware devices. OpenStreetMap is the only truly global service without restrictions on use or availability of map information. 2019 Let's Encrypt a Certificate Authority (CA) that provides an easy way to obtain and install free TLS/SSL certificates. 2020 CiviCRM free program that nonprofit organizations around the world use to manage their mailings and contact databases Award for outstanding new Free Software contributor The third annual award created by the FSF, the award is presented to an exceptional newcomer to the free software community. Winners The award was first awarded for 2019 at LibrePlanet 2020, and the recipients have been: 2019 Clarissa Lima Borges Outreachy internship work focused on usability testing for various GNOME applications. 2020 Alyssa Rosenzweig Leads the Panfrost project, a project to reverse engineer and implement a free driver for the Mali series of graphics processing units (GPUs) used on a wide variety of single-board computers and mobile phones. Award Committee 1998: Peter H. Salus, Scott Christley, Rich Morin, Adam Richter, Richard Stallman, and Vernor Vinge 1999: Peter H. Salus, no further details found 2000: no details found 2001 The selection committee included: Miguel de Icaza, Ian Murdock, Eric S. Raymond, Peter H. Salus, Vernor Vinge, and Larry Wall 2002 The selection committee included: Enrique A. Chaparro, Frederic Couchet, Hong Feng, Miguel de Icaza, Raj Mathur, Frederick Noronha, Jonas Öberg, Eric S. Raymond, Guido van Rossum, Peter H. Salus, Suresh Ramasubramanian, and Larry Wall 2003 The selection committee included: Enrique A. 
Chaparro, Frederic Couchet, Miguel de Icaza, Raj Mathur, Frederick Noronha, Jonas Öberg, Bruce Perens, Peter H. Salus, Suresh Ramasubramanian, Richard Stallman, and Vernor Vinge 2004: Suresh Ramasubramanian, Raj Mathur, Frederick Noronha, Hong Feng, Frederic Couchet, Enrique A. Chaparro, Vernor Vinge, Larry Wall, Alan Cox, Peter H Salus, Richard Stallman 2005: Peter H. Salus (chair), Richard Stallman, Alan Cox, Lawrence Lessig, Guido van Rossum, Frederic Couchet, Jonas Öberg, Hong Feng, Bruce Perens, Raj Mathur, Suresh Ramasubramanian, Enrique A. Chaparro, Ian Murdock 2006: Peter H. Salus (chair), Richard Stallman, Andrew Tridgell, Alan Cox, Lawrence Lessig, Vernor Vinge, Frederic Couchet, Jonas Öberg, Hong Feng, Raj Mathur, Suresh Ramasubramanian 2008: Suresh Ramasubramanian (Chair), Peter H. Salus, Raj Mathur, Hong Feng, Andrew Tridgell, Jonas Öberg, Vernor Vinge, Richard Stallman, and Fernanda G. Weiden. 2009: Suresh Ramasubramanian (Chair), Peter H. Salus, Lawrence Lessig, Raj Mathur, Wietse Venema, Hong Feng, Andrew Tridgell, Jonas Öberg, Vernor Vinge, Richard Stallman, Fernanda G. Weiden and Harald Welte. 2010: Suresh Ramasubramanian (Chair), Peter H. Salus, Raj Mathur, Wietse Venema, Hong Feng, Andrew Tridgell, Jonas Öberg, Vernor Vinge, Richard Stallman, Fernanda G. Weiden and Harald Welte. 2011: Suresh Ramasubramanian (Chair), Peter H. Salus, Raj Mathur, Wietse Venema, Hong Feng, Andrew Tridgell, Jonas Öberg, Vernor Vinge, Richard Stallman, Fernanda G. Weiden and Harald Welte. 2012: Suresh Ramasubramanian (Chair), Peter H. Salus, Raj Mathur, Wietse Venema, Hong Feng, Andrew Tridgell, Jonas Öberg, Vernor Vinge, Richard Stallman, Fernanda G. Weiden and Harald Welte. 2013: Suresh Ramasubramanian (Chair), Wietse Venema, Hong Feng, Andrew Tridgell, Jonas Öberg, Vernor Vinge, Richard Stallman, Fernanda G. Weiden, Rob Savoye and Harald Welte. 2014: Suresh Ramasubramanian (Chair), Marina Zhurakhinskaya, Matthew Garrett, Rob Savoye, Wietse Venema, Richard Stallman, Vernor Vinge, Hong Feng, Fernanda G. Weiden, Harald Welte, Jonas Öberg, and Yukihiro Matsumoto. See also List of computer-related awards References External links Official Advancement of Free Software Award site Official Free Software Award for Projects of Social Benefit site Awards established in 1998 Free Software Foundation Free-software awards Lists of award winners
27440132
https://en.wikipedia.org/wiki/Black%20Mesa%20%28video%20game%29
Black Mesa (video game)
Black Mesa is a first-person shooter game developed and published by Crowbar Collective. It is a third-party remake of Half-Life (1998) made in the Source game engine. Originally published as a free mod in September 2012, Black Mesa was approved by Half-Life developers Valve for a commercial release; the first commercial version was published as an early-access version in May 2015, followed by a full release in March 2020 for Windows and Linux. Black Mesa was developed in response to Half-Life: Source (2005), Valve's port of Half-Life to the Source engine, which lacked new features or improvements. Two teams wanted to improve on the Source remake and eventually merged to become Crowbar Collective. While they had originally targeted a release by 2009, the team realized they had rushed to this point and reevaluated their efforts to improve the quality of the remake. Since then, attention to details, adapting the game to an improved version of the Source engine, and completely reworking the oft-derided final chapters of Half-Life (known as Xen) had lengthened the development efforts of the remake. Due to its long development time, the modification became notable for its delays on the status of its completion. Major changes include reskinned collection of textures, models and NPCs, a longer runtime, improved level and puzzle design along with challenging enemy artificial intelligence, and additional dialogue and story elements. The early-access version of Black Mesa received positive reviews and gained more positive reviews as it was updated and improved. Reviewers praised the gameplay and attention to detail, comparing it to that of an official Valve release, and the improvements to the Xen chapters. Gameplay Black Mesa is a first-person shooter that requires the player to perform combat tasks and solve various puzzles to advance through the game. From a design standpoint, the core gameplay remains largely unchanged from the original base Half-Life game; the player can carry a number of weapons that they find through the course of the game, though they must also locate and monitor ammunition for most weapons. The player's character is protected by a hazard suit that monitors the player's health and can be charged as a shield, absorbing a limited amount of damage. Health and battery packs can be found scattered through the game, as well as stations that can recharge either health or suit charge. However, unlike Half-Life: Source, which merely featured the original game's assets and geometry ported to the Source engine, Black Mesa has been purpose-built from the ground up to take full advantage of the newest versions of Source, not just for its graphical capabilities, but for its myriad updates to the game's physics engine, puzzle complexity, and platforming capability. The artificial intelligence of the enemy characters has also been improved over Half-Life to provide more of a challenge, with some of the combat spaces redefined to provide more options to the player. In addition, several narrative and design changes have been made to account for the numerous story threads presented via retcon in Half-Life 2. While most of the general design and progress through the game levels remains the same as Half-Life, the largest change in Black Mesa is the reworking of the game's final chapter, Xen, which was generally considered the weakest part of the original game. Black Mesa also includes support for the individual and team deathmatch multiplayer modes from Half-Life on similarly-updated maps. 
Plot The plot of Black Mesa is almost identical to Half-Lifes storyline. Like in the original game, the player controls Gordon Freeman, a theoretical physicist working at the Black Mesa Research Facility. He is tasked to place a sample of anomalous material into an Anti-Mass Spectrometer for analysis, using the Mark IV Hazardous Environment Suit to do so safely. However, the sample causes a "resonance cascade", devastating the facility and creating an interdimensional rift to an alien dimension called Xen, bringing its alien creatures to Earth. Freeman survives the incident, finds other survivors, and is tasked to make his way to the surface to call for help. Upon reaching the surface, however, he finds that the facility is being cleansed of any living thing - human or alien - by the military. Freeman learns from the surviving scientists the only way to stop the alien invasion is to cross over to Xen and destroy the entity keeping the rift open. Development Initial efforts (2005–2012) With the release of Half-Life 2 in 2004, Valve re-released several of its previous titles, ported to their new Source game engine, including the critically acclaimed 1998 game Half-Life as Half-Life: Source. The Source engine is graphically more advanced than the GoldSrc engine used for the original games. Half-Life: Source features the Havok physics engine and improved effects for water and lighting. The level architecture, textures, and models of the game, however, remained unchanged. Half-Life: Source was met with mixed reviews. IGN liked the new user interface and other technical features, but noted that it did not receive as many improvements as Valve's other Source engine ports. GameSpy said that while it was a "fun little bonus", it was "certainly not the major graphical upgrade some people thought it might be". Valve's managing director Gabe Newell is quoted as saying that a complete Source remake of Half-Life by its fans was "not only possible…but inevitable". Black Mesa began as the combination of two independent volunteer projects, each aiming to completely recreate Half-Life using Source. The Leakfree modification was announced in September 2004. The Half-Life: Source Overhaul Project was announced one month later. After realizing their similar goals, project leaders for both teams decided to combine their efforts; they formed a new 13-person team under the name Black Mesa: Source. The "Source" in the project's title was later dropped when Valve asked the team to remove it in order to "stem confusion over whether or not [it was] an endorsed or official product", which at the time it was not. Eventually, the team rebranded itself as the Crowbar Collective. Most of the team was distributed across the world and used online collaboration to work remotely, with some limited in-person meetings. Originally based on the version of Source released with Counter-Strike: Source in 2004, the project switched to a more recent version released with Valve's The Orange Box in 2007. This new version included more advanced particle effects, hardware-accelerated facial animation, and support for multi-core processor rendering, amongst other improvements. The team had expected this to be a relatively fast project, with trailers released in 2005 and 2008, and an initial release estimate of late 2009, but by mid-2009, had backed off that date, and changed their expected release date to "when it's done". Wired included the game on their "Vaporware of the Year" lists in 2009 and 2010. 
In the lead-up to the 2012 release, team member Carlos Montero said that in 2009 that they thought they were going to be able to make that date, but "ended up busting our asses to make that a reality, and we went against a lot of our core values in the process. We found ourselves rushing things, cutting things, making quality sacrifices we did not want to make." Montero said then they decided to re-evaluate the state of the project, set higher bars for the quality of work they wanted to produce, and started to back through what they had already done to improve upon that, at which point they were not sure when the project would be completed. The first standalone version of Black Mesa was released as a free download on September 14, 2012. This contained remakes of all Half-Life chapters except the final chapter set on the alien world Xen, which the team intended to rework for inclusion in a future release, as Xen in the original Half-Life was often considered its weakest part. The development team estimated that the initial release of Black Mesa gave players eight to ten hours of content to complete. Black Mesa initial release coincided with the launch of Valve's Steam Greenlight program which allowed users to vote for games to be put onto the Steam storefront. Black Mesa became one of the first ten titles to be voted on by fans and approved by Valve to be included on Steam through Greenlight. Transitioning to commercial release (2013–2014) A new version of the Source engine had been introduced by 2013 that, in addition to new engine features, included support for OS X and Linux platforms, however, developers had to pay to gain access to the full feature set of this engine. According to Adam Engels, the project lead at the completion of Black Mesa, Valve had actually approached their team around this time and suggested to them about making Black Mesa a commercial release and thus getting a license to the Source engine. The team considered this option, and since access to the full Source engine would help make Black Mesa the best game they could, they opted to go the commercial route as to be able to pay for that license, not having intended to profit off the game from the start. The team affirmed they had gotten Valve's permission to sell by November 2013. Some of the team were later invited to Valve's offices in Bellevue, Washington in 2015. At this point in 2013, the team cautioned that a final version was still some distance away as they were still dealing with the updated Source engine, and they had not yet done much with Xen. Crowbar Collective continued to offer the free version of Black Mesa based on the earlier Source engine off their website. With the new Source engine, the team started to look closer at how Valve had used Source in Half-Life 2 compared to what they had done in the original Half-Life, and developed changes for Black Mesa that reflected what they believed were Valve's design principles in Half-Life 2. One of those was the idea that when introducing a new mechanic, the level was designed to teach the new mechanic without potential harm to the player-character, followed by then testing that mechanic in a more harmful situation to the character. The team also included a brief mention to the long-fall boots from Aperture Science from the Portal series; Portal had come out after Half-Life 2 but loosely tied narratively to the Half-Life universe, and the team felt it appropriate to show the competing lab's technology within the Black Mesa facility from this connection. 
Once the team had gotten all but the Xen levels completed in the new Source engine they were content with, they released these on Steam's early access on May 5, 2015 to get feedback and bug testing, stating that the Xen sections were still a work in progress. This version also included the deathmatch multiplayer modes with some of Half-Life remade maps. Early access also brought Crowbar Collective additional support from developers and artists to help with finalizing the project. Xen and final release (2015–2020) The release of the Xen part of the game had been the most difficult, since the team wanted to redesign the levels to overcome the poor perception that they had in Half-Lifes original release. The team said, "We want our version of Xen to feel like it really belongs with the rest of the game in terms of mechanics, cohesion and progression," while at the same time, they wanted "to push the boundaries and explore this unique and varied setting; to build an experience that feels both fresh and familiar to players from all walks of Half-Life veterancy." Developing their new version of Xen was a chicken-or-the-egg dilemma, as without level design it was difficult to develop art assets, and without art assets it was hard to come up with cohesive level designs. They also wanted to give more story elements there, such as why human scientists were studying the world of Xen in the first place, trying to capture the same type of world-building by level design that Valve had been able to with the first parts of Half-Life. They also significantly reworked the boss battles to be more challenging and representative of the area they had in mind. Ultimately, the team expanded out Xen from about a one-hour experience in the original Half-Life to four hours in Black Mesa. In addition to reworking Xen from scratch, as the team members got closer to release, they recognized they saw the game more as an entry for new players into the Half-Life series, and worked to introduce designs and features that would be more appropriate a decade since Half-Lifes release. They made combat more interesting by improving the enemy's artificial intelligence while creating combat areas with more cover and options for the player. Because of their expansion of Xen, they also wanted to make sure players were not slowed down in the earlier parts of the game, and make redesigns in some of these levels. The release of Xen in the early access version of Black Mesa had been pushed off a few times; initially planned for a December 2017 release, a beta version of a segment of the remade Xen was released in June 2019 for stress-testing by players The full beta was released on December 6, 2019. Additional Xen levels were added over time, and by December 24, 2019, the full Xen chapter was released as part of the game's early access. The finished Black Mesa was released for Windows on March 6, 2020. By chance, this release was about two weeks before Valve's official return to the Half-Life universe after 13 years with the virtual reality game Half-Life: Alyx. Black Mesa project lead Adam Engels said this was not intentional as they had planned to have Black Mesa out earlier, but the attention to Alyx had helped to boost interest in Black Mesa. In addition to ongoing support for the game before moving onto other projects, the Crowbar Collective has stated they have been contacted by other teams, such as the Sven Co-op team, to help integrate their work into the final Black Mesa product. 
The team also wants to incorporate support for the Steam Workshop so that other players can add their own mods to the game. In addition to the modification itself, the game's thematic score, produced by sound designer Joel Nielsen, was independently released as a soundtrack in 2012. Nielsen released the score for the Xen levels in 2019. Definitive Edition Following the first release of Black Mesa, the team announced work on a Definitive Edition, or Black Mesa 1.5, revamping non-Xen levels from the original Valve design to make them more challenging, as well as to take advantage of new lighting features available in the custom Source engine branch the game uses. This free update was released on November 25, 2020. Modifications As Black Mesa is built on the Source engine, it itself is also moddable with support for Steam Workshop, and several projects have been started to create versions of Half-Life mods and expansions within Black Mesa. Xen Museum The Crowbar Collective released an expansion in April 2021 called Xen Museum that presents a virtual museum that documents the team's past five years of effort in creating Black Mesa and mostly their work in recreating Xen from the original Half-Life. Uplink and Uplink Redux In 2012, mapper Michael "Hezus" Jansen created the mod Black Mesa: Uplink, a remake of Half-Life'''s demo level, Uplink. Jansen worked on the mod for three years before release, saying "I've recreated something people played 13 years ago, that means it's intertwined with nostalgic feelings." With the transition from mod to game in 2015, Jansen returned to the idea and started work on recreating it for the Steam version that featured new content and updated graphics called Black Mesa: Uplink Redux. However, in 2019, Jansen halted the production of the mod due to health issues. Surface Tension Uncut and On a Rail Uncut In Surface Tension Uncut, the "Surface Tension" chapter was expanded to include certain areas of the original game that were not released along with the remake, as the developer had left before his work was finished. The developer, Chon Kemp, known on the Black Mesa: Community Forums by the pseudonym TextFAMGUY1, also modified the "On a Rail" chapter to include the areas cut from Black Mesa to make gameplay less tedious. Kemp was later hired by Crowbar Collective to remake Surface Tension Uncut for the Steam release, while the uncut version of "On a Rail" was published on Steam Workshop. Hazard Course On December 29, 2015, PSR Digital released Black Mesa: Hazard Course, a remake of Half-Lifes tutorial level of the same name. The mod had been in development from 2012 to 2015 for the original mod version of Black Mesa as Crowbar Collective had not implemented a training level in the game, citing its obsolete use due to the tutorial HUD. The mod includes an intro tram ride and brief meeting with scientists reminiscent of the PlayStation 2 version of the level. In 2016, PSR Digital released an announcement that the mod had become broken due to differences between the mod version and the Steam version of Black Mesa. With fixes through the next years, the team re-released the mod for the Steam version of Black Mesa on December 29, 2020, the 5th anniversary of the mod's release. Azure Sheep In 2018, the HECU Collective announced that they would be remaking the Half-Life 1 mod Azure Sheep, originally released in 2001. A demo of the mod was made available for download on November 18, 2018, with Part One being released in 2019. 
In 2021, the mod's next parts were postponed as the HECU Collective focused on Black Mesa: Blue Shift. Blue Shift On February 16, 2021, the HECU Collective announced that they would be taking a break from their mod Black Mesa: Azure Sheep and were now focusing on a remake of Half-Life: Blue Shift. Unlike another remake in progress, Guard Duty by Tripmine Studios, the mod utilizes assets from Black Mesa instead of creating them from scratch, and is released in chapters. The mod began development after a previous attempt at a Blue Shift remake, Insecurity, was abandoned. Half of the members worked on the mod while the other half worked on Azure Sheep. The first of the game's eight chapters was released on March 16, 2021, and as of November 2021, the first three chapters have been released. Reception During its development, Black Mesa received attention from several video game publications. It has been featured in articles from Computer Gaming World, PC PowerPlay, and PC Gamer UK magazines. Valve published a news update about the modification on their Steam digital distribution platform in 2007, saying that "We're as eager to play [Black Mesa] here as everyone else." The project was awarded Top Unreleased Mod by video game modification website Mod DB in 2005 and 2006. Mod DB gave the project an honorable mention in their choice of Top Unreleased Mod in 2007. After receiving a development version of Black Mesa in December 2009, PC PowerPlay magazine said that the game's setting "looks, sounds, [and] plays better than ever before". The "subtle" changes from the original Half-Life were said to have a "substantial" overall impact. They also noted the project's "frustrating" then-five-year development time and current lack of a release date, but added that the developers were making progress. After the first major release in 2012, early impressions of the game were very positive, and it received a score of 86/100 on Metacritic based on nine reviews. The game was praised for its high polish, with many critics comparing its quality to that of an official Valve game. Destructoid praised the game for the improvements it made over the original Half-Life, saying it was "something that felt very familiar, [but also] very fresh." Black Mesa won ModDB's Mod of the Year Award for 2012. In 2014, Black Mesa was named by PC Gamer among the "Ten top fan remade classics you can play for free right now". The remake's final release in 2020 was similarly praised by reviewers. On the review aggregator OpenCritic, Black Mesa had an average review score of 86/100 with a 100% approval rating, based on 14 reviews. PC Gamer said the overall project felt like professional work, and that while the original structure of Half-Life hampered some of Black Mesa, the redesigns made to the original levels for the Source engine were well done, particularly the new Xen sections, which helped Black Mesa's Xen feel more like a proper closure to the game than the original version. Eurogamer said that Black Mesa felt more like an evolution than a remake of Half-Life, and that the Crowbar Collective's decisions to trim down certain levels while adding other features helped to enhance the overall product, making the title more of a survival horror than a first-person shooter.
Dario Casali, a designer at Valve who has worked on all Half-Life games, remarked in an interview that during development on Half-Life: Alyx he attempted to play the entirety of the original Half-Life again for research, but after five hours decided to play Black Mesa'' instead, reasoning it was a much more enjoyable product. References External links 2020 video games Early access video games Fangames First-person shooters Half-Life (series) Linux games Multiplayer and single-player video games Science fiction video games Source (game engine) mods Steam Greenlight games Vaporware video games Video game remakes Video games set in New Mexico Video games set in the 2000s Video games with Steam Workshop support War video games set in the United States Windows games
45304274
https://en.wikipedia.org/wiki/Government%20Engineering%20College%2C%20Dahod
Government Engineering College, Dahod
The Government Engineering College, Dahod (GECD or GEC Dahod) is one of the 18 Government Engineering Colleges in Gujarat. It was established in 2004. It specializes in the fields of engineering and technology. The institute is recognized by the All India Council for Technical Education (AICTE), New Delhi. The college is administered by the Directorate of Technical Education in Gandhinagar, Gujarat, India, and is affiliated with Gujarat Technological University (GTU), Ahmedabad. Organisation and administration The college offers several undergraduate courses leading to Bachelor of Engineering (B.E.) degrees (number of seats in brackets): Electronics and Communication Engineering Computer Engineering Mechanical Engineering Electrical Engineering Civil Engineering The college also offers postgraduate courses leading to Master of Engineering (M.E.) degrees: Applied Mechanics (CASAD) Mechanical Engineering (Specialization in CAD/CAM) Department of Electronics & Communication Engineering Electronics and Communication Department was established in 2004 situated at academic building block no. 4. It has laboratories in thrust and on developing path in areas of Audio-Visual, Analog and Digital Communication, Microwave, VLSI Design and Signal Processing. Department has Computer Centre with all latest computer facilities. Every year, modern equipment is being purchased to remain in tune with the advancement in the field of Electronics & Communication, Microwave and Information Technology etc. The Computer laboratory of this department is equipped with computer systems latest processor with multimedia, software and LAN systems. Areas of Electronics & Communication – Analog & Digital Electronics, Analog & Digital Communication, Antenna Systems, Microwave Engineering, Fiber Optic Communication, Computer Hardware and Networking, Micro Processor and Micro Controller, Digital Signal Processing, VLSI Design. Department of Computer Engineering Computer Engineering was introduced in GEC Dahod from year 2009. Its first batch was graduated in the year in 2013–14. The department is located on first floor of Block No. 4 in college campus. With the intake of 60 and adequate staff, the department has ample amount of class rooms and labs on its own floor. Teaching and Lab work is carried out with intention to help students overcome practical difficulties. All the subject lectures and laboratories are engaged by departmental staff and even visiting faculties are appointed as and when needed. Facilities The department has 5 labs equipped with total 110 desktop computers. Computers with advanced Quad-Core AMD (3.6 GHz) and Intel i5 processors are used by students and faculties to perform basic programming to parallel processing. All the computers are equipped with software tools / IDEs to support program development. Final year students work on projects in groups or individuals according to syllabus of GTU. The project work is divided in two phases which are undertaken by students in 7th and 8th semester. Current projects undertaken by students are API development in php, using Facebook API, PHP programming and Android application development. Creative workshops like are organized every year to expose students to real world applications. Students of final year are encouraged to undertake good industry standard projects. Computer Engineering department maintains domain name and web-site for college since year 2012. 
Department of Mechanical Engineering The Mechanical Engineering Department was established in 2004 with an intake capacity of 60 students; the intake was increased to 120 students in 2009. The department also started a postgraduate M.E. course with a specialization in CAD/CAM in 2010, and is equipped with all essential laboratories and equipment according to the curriculum of Gujarat Technological University. Department of Electrical Engineering The department is housed in the Engineering Block 3 building, sharing the first floor with the General Department. At its start in 2004 the department had an intake of 60 students; in 2009 the intake capacity was doubled to 120 students per year. Department of Civil Engineering The Civil Engineering Department has operated since the inception of the institute, starting with an intake capacity of 60 students; this capacity was later doubled and today stands at 120 students per year. The department also looks after the civil construction of the institute and maintains it in liaison with the R&B Department of the Government of Gujarat. Academics GEC Dahod started with an intake of 60 students each for the civil, mechanical, electrical, and electronics and communication departments. The intake for the civil, mechanical, and electrical branches has since doubled to 120 students each, while the intake remains 60 students each for the EC and Computer Engineering departments. The Civil and Mechanical departments also run postgraduate programmes affiliated to GTU, each with an intake of 18 students. GEC Dahod has two boys' hostels located within the campus and one girls' hostel located on the campus of the Technical High School, Dahod, near the Circuit House. The college runs five four-year full-time Bachelor of Engineering programmes in the above-mentioned disciplines. Admission GEC Dahod is a fully government-funded institute. Candidates who have passed the HSC examination (science stream) of the Gujarat Higher-Secondary/Central Board of Higher Secondary (within Gujarat state only) with Physics, Chemistry, and Mathematics as group A subjects are eligible for admission. Students from other states may apply according to the quota allocated to their states; such applications are not handled at the college, and students must contact the ACPC, Ahmedabad. The ACPC (Admission Committee for Professional Courses) is a centralized body run by the Government of Gujarat that fills 100% of the seats on the basis of a normalized merit rank combining percentage-wise weights of the state-level examination and/or the JEE Main examination. Examination system The examination system follows the Gujarat Technological University scheme. Internal examination The internal examination, submission of term work, and the university external examination together determine whether a student passes each subject. All students have to appear for the internal examination, and a minimum of 40% marks is required to pass. University examination The university examination is conducted at the end of every semester/year for each subject. The minimum passing standard is 40% in each head; a student securing 30% marks in theory may be declared passed if the total marks scored in the subject are 45%. Institute Internet Facility GEC Dahod is a node of the NKN (National Knowledge Network), a national initiative to connect technical campuses all over India. 
The campus is equipped with a high-speed Wi-Fi facility (NaMo WiFi) provided by the Government of Gujarat. The Computer department manages the internet facility and the campus-wide LAN; there are over 400 LAN nodes spread across the campus, including the amenities building, the workshop, and the library. The library uses SOUL software to manage more than 14,000 volumes. Student life Hostel Government Engineering College, Dahod has two hostels, one for boys and one for girls. The boys' hostel is located inside the college campus and has a capacity of 120 students; admission of the first batch commenced in the academic year 2013–14, and admissions are made every year based on the number of vacant seats at that time. The girls' hostel is located within Dahod city, inside the campus of the Technical High School and near the Circuit House and the Police Headquarters, Dahod; it has a capacity of 54 students. Events and Festivals Several annual events are held at GECD. "Ignite" is the annual technical festival of GECD, held during January or February; various events such as hackathons and robotics competitions are organised by each department. A sports week is also organised, including outdoor and indoor games and sports such as cricket, carrom, chess, badminton, and table tennis. References http://www.gecdahod.ac.in https://web.archive.org/web/20120304113553/http://www.gujacpc.nic.in/Documents2009/Fee/Tuition%20Fee%20link%20for%20BE%20BTech.pdf http://www.gtu.ac.in/affiliation/Affiliated_Colleges_BE_2009-10.pdf http://gujarateducation.gswan.gov.in/education/e-citizen/notification/download/Notification-For-GTU_Affiliation_revised_8_7_08.pdf https://web.archive.org/web/20100331053105/http://gujaratuniversity.org.in/web/NWD/0100_Gujarat%20University%20Affiliated%20Colleges/0910_Gujarat_University_Affiliated_Colleges_List%20%282009-2010%29.pdf https://web.archive.org/web/20120818084526/http://www.jacpcldce.ac.in/Adm09/Inst_Engg_09.htm https://web.archive.org/web/20120226020748/http://www.jacpcldce.ac.in/Adm09/InstProfile/115.mht 2004 establishments in Gujarat Educational institutions established in 2004 Universities and colleges in Gujarat Dahod district
43143801
https://en.wikipedia.org/wiki/Android%20Lollipop
Android Lollipop
Android Lollipop (codenamed Android L during development) is the fifth major version of the Android mobile operating system developed by Google and the 12th version of Android, spanning versions between 5.0 and 5.1.1. Unveiled on June 25, 2014, at the Google I/O 2014 conference, it became available through official over-the-air (OTA) updates on November 12, 2014, for select devices that run distributions of Android serviced by Google (such as Nexus and Google Play edition devices). Its source code was made available on November 3, 2014. One of the most prominent changes in the Lollipop release is a redesigned user interface built around a design language known as Material Design, intended to give the interface a paper-like feel. Other changes include improvements to notifications, which can be accessed from the lockscreen and displayed within applications as top-of-the-screen banners. Google also made internal changes to the platform, with the Android Runtime (ART) officially replacing Dalvik for improved application performance, and with changes intended to improve and optimize battery usage. Android Lollipop was succeeded by Android Marshmallow, which was released in October 2015. 1.21% of Android devices run Lollipop 5.0 (API 21) and 2.71% run Lollipop 5.1 (API 22), for a combined usage share of 3.92%. Development The release was internally codenamed "Lemon Meringue Pie". Android 5.0 was first unveiled under the codename "Android L" on June 25, 2014 during a keynote presentation at the Google I/O developers' conference. Alongside Lollipop, the presentation focused on a number of new Android-oriented platforms and technologies, including Android TV, the in-car platform Android Auto, the wearable computing platform Android Wear, and the health tracking platform Google Fit. Part of the presentation was dedicated to a new cross-platform design language referred to as "material design". Expanding upon the "card" motifs first seen in Google Now, it is a design with increased use of grid-based layouts, responsive animations and transitions, padding, and depth effects such as lighting and shadows. Designer Matías Duarte explained that "unlike real paper, our digital material can expand and reform intelligently. Material has physical surfaces and edges. Seams and shadows provide meaning about what you can touch." The material design language would not only be used on Android, but across Google's suite of web software as well, providing a consistent experience across all platforms. Features Android 5.0 introduces a refreshed notification system. Individual notifications are now displayed on cards to adhere to the material design language, and batches of notifications can be grouped by the app that produced them. Notifications are now displayed on the lock screen as cards, and "heads up" notifications can also be displayed as large banners across the top of the screen, along with their respective action buttons. A do-not-disturb feature was also added for notifications. The recent apps menu was redesigned to use a three-dimensional stack of cards to represent open apps, and individual apps can display multiple cards in the recents menu, such as for a web browser's open tabs. With this release, on most Android devices the navigation buttons were changed from a left arrow, a house, and two squares to a left-pointing triangle, a circle, and a square. 
Lollipop also contains major new platform features for developers, with over 5,000 new APIs added for use by applications. For example, it is possible to save photos in a raw image format. Additionally, the Dalvik virtual machine was officially replaced by the Android Runtime (ART), a new runtime environment that had been introduced as a technology preview in KitKat. ART is a cross-platform runtime which supports the x86, ARM, and MIPS architectures in both 32-bit and 64-bit environments. Unlike Dalvik, which uses just-in-time compilation (JIT), ART compiles apps upon installation, and they are then run exclusively from the compiled version. This technique removes the processing overhead associated with the JIT process, improving system performance. Lollipop also aimed to improve battery consumption through a series of optimizations known as "Project Volta". Among its changes are a new battery saver mode, job-scheduling APIs which can restrict certain tasks to occur only over Wi-Fi (a brief sketch of such a job appears below), and batching of tasks to reduce the overall amount of time that internal radios are active. A new developer tool called "Battery Historian" can be used for tracking battery consumption by apps while in use. The Android Extension Pack APIs also provide graphics functions such as new shaders, aiming to provide PC-level graphics for 3D games on Android devices. A number of system-level, enterprise-oriented features were also introduced under the banner "Android for Work". The Samsung Knox security framework was initially planned to be used as a foundation for "Android for Work", but instead Google opted to use its own technology for segregating personal and work-oriented data on a device, along with the accompanying APIs for managing the environment. With the "Smart Lock" feature, devices can also be configured so users do not have to unlock the device with a PIN or pattern when in a trusted location, or in proximity of a designated Bluetooth device or NFC tag. Lollipop was additionally intended to have device encryption enabled by default on all capable devices; however, due to performance issues, this change was held over to its successor, Android Marshmallow. Release A developer preview of Android L, build LPV79, was released for the Nexus 5 and 2013 Nexus 7 on June 26, 2014, in the form of flashable images. Source code for GPL-licensed components of the developer preview was released via the Android Open Source Project (AOSP) in July 2014. A second developer preview build, LPV81C, was released on August 7, 2014, alongside the beta version of the Google Fit platform and SDK. As with the previous build, the second developer preview build was available only for the Nexus 5 and 2013 Nexus 7. On October 15, 2014, Google officially announced that Android L would be known as Android 5.0 "Lollipop". The company also unveiled launch devices for Android 5.0, including Motorola's Nexus 6 and HTC's Nexus 9, for release on November 3, 2014. Google stated that Nexus devices (including the Nexus 4, 5, 7, and 10) and Google Play edition devices would receive updates to Lollipop "in the coming weeks"; one more developer preview build for Nexus devices and a new SDK revision for application developers would be released on October 17, 2014. Update schedules for third-party Android devices may vary by manufacturer. The full source code of Android 5.0 was pushed to AOSP on November 3, 2014, allowing developers and OEMs to begin producing their own builds of the operating system. 
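As a brief illustration of the Project Volta job-scheduling APIs mentioned above, the following sketch schedules a deferrable background job that runs only on an unmetered (Wi-Fi) connection and while the device is charging. It is a minimal example, not code from any particular application: the class name and job ID are hypothetical, while JobInfo, JobScheduler, and JobService belong to the android.app.job package introduced with Lollipop.

import android.app.job.JobInfo;
import android.app.job.JobParameters;
import android.app.job.JobScheduler;
import android.app.job.JobService;
import android.content.ComponentName;
import android.content.Context;

// Hypothetical JobService; a real app would also declare it in its manifest.
public class SyncJobService extends JobService {
    @Override
    public boolean onStartJob(JobParameters params) {
        // Perform the (short) background work here.
        return false; // false = all work finished on this thread, nothing to reschedule
    }

    @Override
    public boolean onStopJob(JobParameters params) {
        return false; // do not reschedule if the system stops the job early
    }

    // Schedules the job with constraints that let the system defer and batch it.
    public static void scheduleSync(Context context) {
        JobInfo job = new JobInfo.Builder(42, new ComponentName(context, SyncJobService.class))
                .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED) // restrict to Wi-Fi/unmetered networks
                .setRequiresCharging(true)                              // defer until the device is charging
                .setPeriodic(6 * 60 * 60 * 1000L)                       // run roughly every six hours
                .build();
        JobScheduler scheduler =
                (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        scheduler.schedule(job);
    }
}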
On December 2, 2014, factory images for Nexus smartphones and tablets were updated to version 5.0.1, which introduced a few bug fixes as well as a serious bug that affected Nexus 4 devices and prevented audio from working during phone calls. A device-specific Lollipop 5.0.2 (LRX22G) version was released for the first-generation Nexus 7 on December 19, 2014. Android 5.1, an updated version of Lollipop, was unveiled in February 2015 as part of the Indonesian launch of Android One, and is preloaded on Android One devices sold in Indonesia and the Philippines. Google officially announced 5.1 by releasing updates for existing devices on March 9, 2015. In 2015, Amazon.com forked Lollipop to produce Fire OS 5 "Bellini" for Amazon's Fire HD-series devices. See also Android version history iOS 8 Material Design OS X Yosemite Windows 8.1 Windows Phone 8.1 References External links Android (operating system) 2014 software
7809456
https://en.wikipedia.org/wiki/Dale%20W.%20Jorgenson
Dale W. Jorgenson
Dale Weldeau Jorgenson (born May 7, 1933, in Bozeman, Montana) is the Samuel W. Morris University Professor at Harvard University, teaching in the Department of Economics and John F. Kennedy School of Government. He served as Chairman of the Department of Economics from 1994 to 1997. Awards and memberships Jorgenson has been honored with membership in the American Philosophical Society (1998), the Royal Swedish Academy of Sciences (1989), the U.S. National Academy of Sciences (1978), and the American Academy of Arts and Sciences (1969). He was elected to Fellowship in the American Association for the Advancement of Science (1982), the American Statistical Association (1965), and the Econometric Society (1964). He was awarded honorary doctorates by the Faculty of Social Sciences at Uppsala University (1991), the University of Oslo (1991), Keio University (2003), the University of Mannheim (2004), the University of Rome (2006), the Stockholm School of Economics (2007), the Chinese University of Hong Kong (2007), and Kansai University (2009). Jorgenson served as President of the American Economic Association in 2000 and was named a Distinguished Fellow of the Association in 2001. He was a Founding Member of the Board on Science, Technology, and Economic Policy of the National Research Council in 1991 and served as Chairman of the Board from 1998 to 2006. He also served as Chairman of Section 54, Economic Sciences, of the National Academy of Sciences from 2000 to 2003 and was President of the Econometric Society in 1987. Currently he is a Vice President of the Society for Economic Measurement (SEM). Jorgenson received the prestigious John Bates Clark Medal of the American Economic Association in 1971. The medal is awarded annually (biennially at the time Jorgenson received it) to an economist under forty who has made outstanding contributions to the field. According to the citation for the award: Dale Jorgenson has left his mark with great distinction on pure economic theory (with, for example, his work on the growth of a dual economy); and equally on statistical method (with, for example, his development of estimation methods for rational distributed lags). But he is preeminently a master of the territory between economics and statistics, where both have to be applied to the study of concrete problems. His prolonged exploration of the determinants of investment spending, whatever its ultimate lessons, will certainly long stand as one of the finest examples in the marriage of theory and practice in economics. He was also an advocate for a carbon tax on greenhouse gas emissions as a means of reducing global warming when he testified before Congress in 1997. His research has also been used to advocate for the FairTax, a tax reform proposal in the United States to replace all federal payroll and income taxes (both corporate and personal) with a national retail sales tax and monthly tax rebate to households of citizens and legal resident aliens. However, Jorgenson supports a tax plan of his own design, which he calls Efficient Taxation of Income, described in his book Investment, Vol. 3: Lifting the Burden: Tax Reform, the Cost of Capital, and U.S. Economic Growth. The approach would introduce different tax rates for property-type income and earned income from work. Research Jorgenson’s 1963 paper, “Capital Theory and Investment Behavior,” introduced all the important features of the cost of capital employed in the subsequent literature. 
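In its simplest textbook form, abstracting from taxes and from changes in asset prices (a simplified sketch rather than the full specification of the 1963 paper), the user cost of capital at the heart of this work can be written as

c = q\,(r + \delta)

where q is the acquisition price of the capital good, r the required rate of return, and \delta the rate of depreciation; in the full model, the tax treatment of income from capital enters through additional terms.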
His principal innovations were the derivation of investment demand from a model of capital as a factor of production, the incorporation of the tax treatment of income from capital into the price of capital input, and econometric modeling of gestation lags in the investment process. In 1971 Jorgenson surveyed empirical research on investment in the Journal of Economic Literature. In the same year he was awarded the John Bates Clark Medal of the American Economic Association for his research on investment behavior. In 2011 Jorgenson’s paper was chosen as one of the Top 20 papers published in the first 100 years of the American Economic Review. The predominant role of investment. In 2005 Jorgenson traced the American growth resurgence to its sources in individual industries in his book, Information Technology and the American Growth Resurgence, co-authored with Mun S. Ho and Kevin J. Stiroh. This book employed the framework originated by Jorgenson, Frank M. Gollop, and Barbara M. Fraumeni, but added detailed information about investments in information technology equipment and software. Jorgenson and his co-authors demonstrated that input growth, due to investments in human and non-human capital, was the source of more than 80 percent of U.S. economic growth over the past half century, while growth in total factor productivity accounted for only 20 percent. Jorgenson and Khuong Vu (2005) established similar results for the world economy. New architecture for the national accounts. Jorgenson and Steven Landefeld, Director of the U.S. Bureau of Economic Analysis, have proposed a new system of national accounts that incorporates the cost of capital for all assets, including information technology equipment and software. The new system is presented in their book with William Nordhaus, published in 2006. In March 2007 Jorgenson's cost of capital was recommended by the United Nations Statistical Commission for incorporation into the United Nations’ 2008 System of National Accounts. Paul Schreyer (2009) has published an OECD Manual, Measuring Capital, to serve as a guide to practitioners. The “new architecture” was endorsed by the Advisory Committee on Measuring Innovation in the 21st Century to the Secretary of Commerce in 2008. Jorgenson (2009) has presented an updated version of the “new architecture” in his Richard and Nancy Ruggles Memorial Lecture to the International Association for Research in Income and Wealth. The World KLEMS Initiative was established at Harvard University on August 19–20, 2010. The initiative will ultimately include industry-level production accounts, incorporating capital (K), labor (L), energy (E), materials (M) and services (S) inputs, for more than forty countries. Accounts for 25 of the 27 EU members, assembled by 18 EU-based research teams, were completed on June 30, 2008, and are presented by Marcel P. Timmer, Robert Inklaar, Mary O’Mahony, and Bart van Ark (2010). This landmark study also provides industry-level accounts for Australia, Canada, Japan, and Korea, as well as the U.S., based on the methodology of Jorgenson, Ho, and Stiroh (2005). These industry-level production accounts are now included in the official national accounts for Australia, Canada, and five European countries. The World KLEMS initiative will extend these efforts to important emerging and transition economies, including Argentina, Brazil, Chile, China, India, Indonesia, Mexico, Russia, Turkey, and Taiwan. Welfare measurement. 
In 1990 Jorgenson presented econometric methods for welfare measurement in his Presidential Address to the Econometric Society. These methods have generated a new approach to cost of living measurement and new measures of the standard of living, inequality, and poverty. This has required dispensing with ordinal measures of individual welfare that are not comparable among individuals, as persuasively argued by Amartya Sen in 1977. Jorgenson and Daniel T. Slesnick have met this requirement by substituting cardinal measures of individual welfare that are fully comparable among individuals. In 1989 Arthur Lewbel showed how the household equivalence scales proposed by Jorgenson and Slesnick can be used for this purpose. Evaluation of alternative policies. In 1993 Jorgenson and Peter J. Wilcoxen surveyed this evaluation of energy, environmental, trade, and tax policies, based on the econometric general equilibrium models Jorgenson developed with Ho and Wilcoxen. The concept of an intertemporal price system provides the unifying framework. This system balances current demands and supplies for products and factors of production. Asset prices are linked to the present values of future capital services through rational expectations equilibrium. The long-run dynamics of economic growth are captured through linkages among capital services, capital stocks, and past investments. Alternative policies are compared in terms of the impact of changes in policy on individual and social welfare. This approach was incorporated into the official guidelines for preparing economic analyses by the U.S. Environmental Protection Agency in 2000. Notes External links Harvard biography Senate article SAST REPORT: a candid conversation with economist Dale W. Jorgenson about his tax reform proposal 1933 births People from Bozeman, Montana Mathematicians from Montana Reed College alumni Harvard University alumni Kansai University alumni Living people Fellows of the American Statistical Association Fellows of the Econometric Society Presidents of the Econometric Society Harvard University faculty Harvard Kennedy School faculty Members of the Royal Swedish Academy of Sciences Members of the United States National Academy of Sciences Fellows of the American Academy of Arts and Sciences Presidents of the American Economic Association 20th-century American economists 21st-century American economists Distinguished Fellows of the American Economic Association Economists from Montana Members of the American Philosophical Society
534794
https://en.wikipedia.org/wiki/Association%20for%20the%20Advancement%20of%20Artificial%20Intelligence
Association for the Advancement of Artificial Intelligence
The Association for the Advancement of Artificial Intelligence (AAAI) is an international scientific society devoted to promoting research in, and responsible use of, artificial intelligence. AAAI also aims to increase public understanding of artificial intelligence (AI), improve the teaching and training of AI practitioners, and provide guidance for research planners and funders concerning the importance and potential of current AI developments and future directions. History The organization was founded in 1979 under the name "American Association for Artificial Intelligence" and changed its name in 2007 to "Association for the Advancement of Artificial Intelligence". It has more than 4,000 members worldwide. In its early history, the organization was presided over by notable figures in computer science such as Allen Newell, Edward Feigenbaum, Marvin Minsky and John McCarthy. The current president is Yolanda Gil, and the president-elect is Bart Selman. Conferences and publications The AAAI provides many services to the artificial intelligence community. It sponsors many conferences and symposia each year and provides support to 14 journals in the field of artificial intelligence. AAAI produces a quarterly publication, AI Magazine, which seeks to publish significant new research and literature across the entire field of artificial intelligence and to help members keep abreast of research outside their immediate specialties. The magazine has been published continuously since 1980. AAAI organises the "AAAI Conference on Artificial Intelligence", which is considered one of the top conferences in the field of artificial intelligence. Awards In addition to AAAI Fellowship, the AAAI grants several other awards: ACM-AAAI Allen Newell Award The ACM-AAAI Allen Newell Award is presented to an individual selected for career contributions that have breadth within computer science, or that bridge computer science and other disciplines. This endowed award is accompanied by a prize of $10,000, and is supported by the Association for the Advancement of Artificial Intelligence (AAAI), the Association for Computing Machinery (ACM), and individual contributions. Past recipients: Lydia Kavraki (2019) Daphne Koller (2019) Henry Kautz (2018) Margaret A. Boden (2017) Jitendra Malik (2016) Eric Horvitz (2015) Jon Kleinberg (2014) Moshe Tennenholtz and Yoav Shoham (2012) Stephanie Forrest (2011) Takeo Kanade (2010) Michael I. Jordan (2009) Barbara J. Grosz and Joseph Halpern (2008) Leonidas Guibas (2007) Karen Spärck Jones (2006) Jack Minker (2005) Richard P. Gabriel (2004) David Haussler and Judea Pearl (2003) Peter Chen (2002) Ruzena Bajcsy (2001) Lotfi A. Zadeh (2000) Nancy Leveson (1999) Saul Amarel (1998) Carver Mead (1997) Joshua Lederberg (1995) Fred Brooks (1994) AAAI/EAAI Outstanding Educator Award The annual AAAI/EAAI Outstanding Educator Award was created in 2016 to honor a person (or group of people) who has made major contributions to AI education that provide long-lasting benefits to the AI community. Past recipients: AI4K12.org team: David S. Touretzky, Christina Gardner-McCune, Fred G. Martin, and Deborah Seehorn (2022) Michael Wooldridge (2021) Marie desJardins (2020) Ashok Goel (2019) Todd W. Neller (2018) Sebastian Thrun (2017) Peter Norvig and Stuart J. 
Russell (2016) AAAI Squirrel AI Award for Artificial Intelligence for the Benefit of Humanity The AAAI Squirrel AI Award for Artificial Intelligence for the Benefit of Humanity is a $1 million award that recognizes positive impacts of AI that meaningfully improve, protect, or enhance human life. Membership Grades AAAI Senior Members Senior Member status is designed to recognize AAAI members who have achieved significant accomplishments within the field of artificial intelligence. To be eligible for nomination as a Senior Member, candidates must have been members of AAAI for at least five consecutive years and have been active in the professional arena for at least ten years. Applications should include information that details the candidate's scholarship, leadership, and/or professional service. See also List of computer science awards References Artificial intelligence associations Computer science organizations Organizations established in 1979 Palo Alto, California Computer science-related professional associations
82921
https://en.wikipedia.org/wiki/Disk%20image
Disk image
A disk image, in computing, is a computer file containing the contents and structure of a disk volume or of an entire data storage device, such as a hard disk drive, tape drive, floppy disk, optical disc, or USB flash drive. A disk image is usually made by creating a sector-by-sector copy of the source medium, thereby perfectly replicating the structure and contents of a storage device independent of the file system. Depending on the disk image format, a disk image may span one or more computer files. The file format may be an open standard, such as the ISO image format for optical disc images, or a disk image may be unique to a particular software application. The size of a disk image can be large because it contains the contents of an entire disk. To reduce storage requirements, if an imaging utility is filesystem-aware it can omit copying unused space, and it can compress the used space. History Disk images were originally (in the late 1960s) used for backup and disk cloning of mainframe disk media. The early ones were as small as 5 megabytes and as large as 330 megabytes, and the copy medium was magnetic tape, which ran as large as 200 megabytes per reel. Disk images became much more popular when floppy disk media became popular, where replication or storage of an exact structure was necessary and efficient, especially in the case of copy protected floppy disks. Uses Disk images are used for duplication of optical media including DVDs, Blu-ray discs, etc. It is also used to make perfect clones of hard disks. A virtual disk may emulate any type of physical drive, such as a hard disk drive, tape drive, key drive, floppy drive, CD/DVD/BD/HD DVD, or a network share among others; and of course, since it is not physical, requires a virtual reader device matched to it (see below). An emulated drive is typically created either in RAM for fast read/write access (known as a RAM disk), or on a hard drive. Typical uses of virtual drives include the mounting of disk images of CDs and DVDs, and the mounting of virtual hard disks for the purpose of on-the-fly disk encryption ("OTFE"). Some operating systems such as Linux and macOS have virtual drive functionality built-in (such as the loop device), while others such as older versions of Microsoft Windows require additional software. Starting from Windows 8, Windows includes native virtual drive functionality. Virtual drives are typically read-only, being used to mount existing disk images which are not modifiable by the drive. However some software provides virtual CD/DVD drives which can produce new disk images; this type of virtual drive goes by a variety of names, including "virtual burner". Enhancement Using disk images in a virtual drive allows users to shift data between technologies, for example from CD optical drive to hard disk drive. This may provide advantages such as speed and noise (hard disk drives are typically four or five times faster than optical drives, are quieter, suffer from less wear and tear, and in the case of solid-state drives, are immune to some physical trauma). In addition it may reduce power consumption, since it may allow just one device (a hard disk) to be used instead of two (hard disk plus optical drive). Virtual drives may also be used as part of emulation of an entire machine (a virtual machine). Software distribution Since the spread of broadband, CD and DVD images have become a common medium for Linux distributions. 
Applications for macOS are often delivered online as an Apple Disk Image containing a file system that includes the application, documentation for the application, and so on. Online data and bootable recovery CD images are provided for customers of certain commercial software companies. Disk images may also be used to distribute software across a company network, or for portability (many CD/DVD images can be stored on a hard disk drive). Several types of software allow software to be distributed to large numbers of networked machines with little or no disruption to the user; some can even be scheduled to update only at night so that machines are not disturbed during business hours. These technologies reduce end-user impact and greatly reduce the time and manpower needed to ensure a secure corporate environment. Efficiency is also increased because there is much less opportunity for human error. Disk images may also be needed to transfer software to machines without a compatible physical disk drive. For computers running macOS, disk images are the most common file type used for software downloads, typically downloaded with a web browser. The images are typically compressed Apple Disk Image (.dmg suffix) files. They are usually opened by directly mounting them without using a real disk. The advantage compared with some other technologies, such as Zip and RAR archives, is that they do not need redundant drive space for the unarchived data. Software packages for Windows are also sometimes distributed as disk images, including ISO images. While Windows versions prior to Windows 7 do not natively support mounting disk images to the file system, several software options are available to do this; see Comparison of disc image software. Security Virtual hard disks are often used in on-the-fly disk encryption ("OTFE") software such as FreeOTFE and TrueCrypt, where an encrypted "image" of a disk is stored on the computer. When the disk's password is entered, the disk image is "mounted", and made available as a new volume on the computer. Files written to this virtual drive are written to the encrypted image, and never stored in cleartext. The process of making a computer disk available for use is called "mounting", and the process of removing it is called "dismounting" or "unmounting"; the same terms are used for making an encrypted disk available or unavailable. Virtualization A hard disk image is interpreted by a virtual machine monitor as a system hard disk drive; in terms of naming, a hard disk image for a particular virtual machine monitor typically has its own specific file format (see the file formats listed below). Hard drive imaging is used in several major application areas: Forensic imaging is the process of copying the contents of an entire drive and recording them (imaging) into a single file (or a very small number of files). A component of forensic imaging involves verification of the values imaged to ensure the integrity of the file(s) imaged. Forensic images are created using dedicated software tools, some of which add the forensic verification functionality mentioned previously; plain drive imaging, by contrast, is typically used to replicate the contents of a hard drive for use in another system and can generally be done by ordinary software, since the only structures copied are the files themselves. Data recovery imaging is the process of imaging each sector, systematically, on the source drive to another destination storage medium from which required files can then be retrieved. 
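As a rough illustration of the sector-by-sector copying described above, the sketch below reads a source device or file in fixed-size chunks and writes the bytes unchanged to an image file. It is a minimal example under assumed paths, not a forensic or data recovery tool; real imaging software would add verification hashes and handling of unreadable sectors.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public final class RawImager {
    public static void main(String[] args) throws IOException {
        String source = "/dev/sdb";           // hypothetical source device (any readable file works)
        String target = "backup.img";         // raw image file to create
        byte[] buffer = new byte[512 * 1024]; // copy in 512 KiB chunks of raw bytes

        try (FileInputStream in = new FileInputStream(source);
             FileOutputStream out = new FileOutputStream(target)) {
            long total = 0;
            int read;
            // Copy every byte regardless of file system or used space, so the image
            // preserves boot records, metadata and the fragmentation state of files.
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
                total += read;
            }
            System.out.println("Imaged " + total + " bytes to " + target);
        }
    }
}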
In data recovery situations, one cannot always rely on the integrity of the file structure, and therefore a complete sector-by-sector copy is mandatory; the similarity to forensic imaging ends there, though. Forensic images are typically acquired using software tools compatible with the system being imaged. Note that some forensic imaging software tools may have limitations in their ability to communicate with, diagnose, or repair storage media that are (oftentimes) experiencing errors or even the failure of some internal component. System backup Some backup programs only back up user files; boot information and files locked by the operating system, such as those in use at the time of the backup, may not be saved on some operating systems. A disk image contains all files, faithfully replicating all data, including file attributes and the file fragmentation state. For this reason, it is also used for backing up optical media (CDs and DVDs, etc.), and allows exact and efficient recovery, in one go, after experimenting with modifications to a system or virtual machine. There are benefits and drawbacks to both "file-based" and "bit-identical" image backup methods. Files that don't belong to installed programs can usually be backed up with file-based backup software, and this is often preferred because file-based backup usually saves time and space: it never copies unused space (as a bit-identical image does), it is usually capable of incremental backups, and it generally offers more flexibility. But for files of installed programs, file-based backup solutions may fail to reproduce all necessary characteristics, particularly with Windows systems. For example, in Windows certain registry keys use short filenames, which are sometimes not reproduced by file-based backup, some commercial software uses copy protection that will cause problems if a file is moved to a different disk sector, and file-based backups do not always reproduce metadata such as security attributes. Creating a bit-identical disk image is one way to ensure the system backup will be exactly like the original. Bit-identical images can be made in Linux with dd, available on nearly all live CDs. Most commercial imaging software is "user-friendly" and "automatic" but may not create bit-identical images. These programs have most of the same advantages, except that they may allow restoring to partitions of a different size or file-allocation size, and thus may not put files on the same exact sector. Additionally, if they do not support Windows Vista, they may slightly move or realign partitions and thus make Vista unbootable (see Windows Vista startup process). Rapid deployment of clone systems Large enterprises often need to buy or replace computer systems in large numbers. Installing the operating system and programs on each of them one by one requires a lot of time and effort and has a significant possibility of human error. Therefore, system administrators use disk imaging to quickly clone the fully prepared software environment of a reference system. This method saves time and effort and allows administrators to focus on the unique idiosyncrasies of each system. There are several types of disk imaging software available that use single instancing technology to reduce the time, bandwidth, and storage required to capture and archive disk images. 
This makes it possible to rebuild and transfer information-rich disk images at very high speed, a significant improvement over the days when programmers spent hours configuring each machine within an organization. Legacy hardware emulation Emulators frequently use disk images to simulate the floppy drive of the computer being emulated. This is usually simpler to program than accessing a real floppy drive (particularly if the disks are in a format not supported by the host operating system), and allows a large library of software to be managed. Copy protection circumvention A mini image is an optical disc image file in a format that fakes the disc's content to bypass CD/DVD copy protection. Because a full image would be the size of the original disc, a mini image is stored instead. Mini images are small, on the order of kilobytes, and contain just the information necessary to bypass CD-checks. The mini image is thus a form of no-CD crack, used both for unlicensed games and for legally backed-up games. Mini images do not contain the real data from an image file, just the code that is needed to satisfy the CD-check; they cannot provide the CD's or DVD's actual content to the program, such as image or video files stored on the disc. Creation Creating a disk image is achieved with a suitable program. Different disk imaging programs have varying capabilities, and may focus on hard drive imaging (including hard drive backup, restore and rollout), or optical media imaging (CD/DVD images). A virtual disk writer or virtual burner is a computer program that emulates an actual disc authoring device such as a CD writer or DVD writer. Instead of writing data to an actual disc, it creates a virtual disk image. A virtual burner, by definition, appears as a disc drive in the system with writing capabilities (as opposed to conventional disc authoring programs that can create virtual disk images), thus allowing software that can burn discs to create virtual discs. File formats Apple Disk Image IMG (file format) VHD (file format) VDI (file format) VMDK QCOW Utilities RawWrite and WinImage are examples of floppy disk image writers/creators for MS-DOS and Microsoft Windows. They can be used to create raw image files from a floppy disk, and write such image files to a floppy. In Unix or similar systems the dd program can be used to create disk images, or to write them to a particular disk. It is also possible to mount and access them at block level using a loop device. Apple Disk Copy can be used on Classic Mac OS and macOS systems to create and write disk image files. Authoring software for CDs/DVDs such as Nero Burning ROM can generate and load disk images for optical media. See also Boot image Card image Comparison of disc image software Disk cloning El Torito (CD-ROM standard) ISO image, an archive file of an optical media volume Loop device Mtools no-CD crack Protected Area Run Time Interface Extension Services (PARTIES) ROM image Software cracking References External links Software repository including RAWRITE2 Archive formats Compact Disc and DVD copy protection Computer file formats Disk image emulators Hacker culture Hardware virtualization Optical disc authoring Warez
58420830
https://en.wikipedia.org/wiki/Wandersong
Wandersong
Wandersong is a puzzle adventure video game developed by American-Canadian indie developer Greg Lobanov. A music-themed game, it follows The Bard, a wandering singer, as they go on a quest to gather pieces of a song that will save their world from destruction. In the game, the player uses The Bard's singing to affect the environment, solve puzzles, and defend against enemies. Wandersong was released on macOS, Microsoft Windows, and Nintendo Switch in September 2018, PlayStation 4 in January 2019, and Xbox One in December 2019. Gameplay Wandersong is a side-scrolling puzzle and adventure game that uses music as a puzzle-solving mechanic. The player character can sing to cause events to occur in the environment around them, using a coloured "song wheel" with eight directions, each representing musical notes spanning an octave, which is controlled by the mouse on a computer setup, or the right thumbstick on a controller. Certain challenges in the game involve the player matching notes or tunes with a non-player character's song, similar to a typical rhythm game. Functions that allow the player to perform dances also exist, but do not actually serve a purpose in gameplay. The player character's goal is to gather the pieces of the "Earthsong", a composition that, when sung, will allow the universe to be preserved. The main game is estimated to be 9 hours in length. Plot The protagonist, a bard, meets a messenger of the goddess Eya in a dream. The messenger tells them that the world is ending, and that the Bard is not "The Hero" of prophecy. However, they may be able to save the world by learning the "Earthsong". Each of the world's seven "Overseers" knows a piece of the Earthsong, and so the Bard is tasked with learning the songs corresponding to the Overseers. Each Overseer's song allows the Bard to travel to their domain in the spirit world and ask for their piece of the Earthsong. The Bard travels around the world with a witch named Miriam, and they make friends in each location they visit as they search for the Overseers' songs. However, the third Overseer they meet is abruptly killed by a stranger before the Bard can learn the piece of the Earthsong. Eya's messenger reveals that the stranger, Audrey Redheart, is the prophesied Hero, chosen to bring about the end of the world by killing the Overseers. The Bard is able to learn the piece of the Earthsong from the dead Overseer, having gained the ability to speak to ghosts from the first Overseer. The Bard and Miriam continue travelling in hopes that the completed Earthsong may still be able to save the world. Miriam, who was initially cynical and distrusting of the Bard, gradually grows to value their friendship. They meet Audrey several more times as she continues her quest to kill the Overseers. The Bard tries to convince Audrey to give up on her quest, saying that the title of "Hero" is not what makes her special. Audrey rejects this and kills the last Overseer before the Bard can learn the Earthsong. The world fades to greyscale as it begins to die. The Bard sings, and all the people they met on their journey respond in harmony around the world. Color returns to the world, and its destruction is stopped. The messenger of Eya tells the Bard that their song was not the Earthsong, but something new that harmonized with Eya's song. Development Wandersong was developed in GameMaker Studio by Greg Lobanov, who had also built a custom level editor and audio editor to create the game. 
It marks Lobanov's return to story-driven games, a milestone that had been a goal of his after years of learning game design skills developing games such as Perfection and Coin Crypt. Development on the game started in October 2015, and was inspired by a cross-country bike trip Lobanov took across the United States from October 2014 to March 2015. Lobanov wanted to translate the "feel of that journey", where he had met various people and felt overwhelmed by their perceived kindness and generosity. Lobanov originally went for a more literal interpretation of the cross-country bike trip that inspired him, developing games about biking. He was ultimately dissatisfied with these early concepts, feeling that they were "missing the spirit of the trip", and opted to create a game with a "really hippy, dopey, joyful rainbow message" that resembled his experiences on the trip. Lobanov's tenure in the "Indie House" in Vancouver, British Columbia early in development had an influence on the game's direction, as Lobanov shared his ideas with, and received feedback from, fellow game developers. Wandersong was funded through a Kickstarter campaign that successfully raised US$21,936 from 989 contributors in February 2016. Lobanov invented the game's "song wheel" mechanic as a way to encourage the player to experiment with the environment, as opposed to the specificity of other music-related mechanics in games such as The Legend of Zelda: Ocarina of Time. The wheel was designed with accessibility in mind; Lobanov designed Wandersong so that the solutions to puzzles would always be hinted at through direction, rather than sound and colour alone, allowing players with color blindness or hearing loss to solve the game's puzzles. The introduction of new uses of the song wheel in each of the game's puzzle areas complements the absence of a consistently rising difficulty curve, allowing the game to also be accessible to players with a lower skill set in puzzle solving. Many puzzles in Wandersong were reworked from their initial, more complex iterations, after game testers and patrons at conventions, where the game was showcased during development, found them difficult or hard to understand. The game's "paper cutout" art style is primarily inspired by Kirby's Epic Yarn and its "physical feeling", the simplicity of which was also easier for Lobanov to work with as the game's sole programmer and artist. Characters flip over like paper when changing direction in the game's two-dimensional art style. The game's visuals have drawn comparisons by observers to The Legend of Zelda: The Wind Waker and the Paper Mario series. Its story, which Lobanov had planned the beginning and ending of from the start of development, is mostly constructed out of spontaneous ideas Lobanov had during development based on the game's core mechanic of singing, which were made easier to implement through the custom level editor and the minimalist art style. The apocalyptic setting of Wandersong is juxtaposed with an optimistic and positive atmosphere. Lobanov felt that a completely positive atmosphere would result in a "really empty experience", and decided to add tension through "really dark, scary, big things" to the game's story and opposing it with a "carefree, happy response" from The Bard. The game's non-intimidating tone and pace was informed by animated cartoons that incorporated music, such as Over the Garden Wall and Steven Universe. The cartoons also served as a reference for the game's animations. 
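The direction-based song wheel described above can be illustrated with a small sketch that maps an analogue-stick position to one of eight notes. This is a hypothetical reconstruction for illustration only, not code from the game; the note names and the mapping of directions to notes are assumptions.

public final class SongWheel {
    // Eight notes around the wheel; the scale actually used by the game may differ.
    private static final String[] NOTES = {"C", "D", "E", "F", "G", "A", "B", "C'"};

    // Maps a thumbstick position (x and y in the range -1..1) to one of eight wheel sectors.
    public static String noteFor(double x, double y) {
        double angle = Math.toDegrees(Math.atan2(y, x)); // -180..180 degrees, 0 = stick pushed right
        if (angle < 0) {
            angle += 360.0;                              // normalise to 0..360
        }
        // Offset by half a sector (22.5 degrees) so each note is centred on its direction.
        int sector = (int) (((angle + 22.5) % 360.0) / 45.0);
        return NOTES[sector];
    }

    public static void main(String[] args) {
        System.out.println(noteFor(1.0, 0.0)); // stick pushed right
        System.out.println(noteFor(0.0, 1.0)); // stick pushed up
    }
}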
Audio The game's soundtrack was created by Lobanov and Gordon McGladdery, the director of A Shell in the Pit, a video game sound design and music production company known for its work on indie games such as Duelyst, Parkitect, and Rogue Legacy. Lobanov, who had not been traditionally trained in music, taught himself to compose and perform music for the game. He enlisted McGladdery's help early in development after talking with and considering various composers; he was drawn to McGladdery by his admiration for McGladdery's musical style and by the convenience of both of them being located in Vancouver. Early in development, the composition of music for particular levels and scenes in the game would take place after the level or scene was completed, with changes to the level or scene also being implemented to better fit with the music composed. The game's sound design was produced in REAPER by A Shell in the Pit's Em Halberstadt, who previously worked on Night in the Woods as a sound designer. The game was a departure from Halberstadt's ambient style and foley production for Night in the Woods and employed a more musical direction that also involved character voices. For each sound effect in Wandersong, an entire scale of variations was created in Kontakt to fit the scale of the background music that it would appear with. An eight-track extended play featuring McGladdery's music for the game, entitled Wandersong: Dreams & Wonder, was released on November 8, 2017, and a 7" single limited to 750 copies featuring the tracks "Sailing with the Coffee Pirates" and "Dreamscape" was released by Yetee Records on May 17, 2018. Release Wandersong was showcased at GDC 2016, PAX West 2016, PAX West 2017, Day of the Devs 2017, and PAX West 2018. The game was published by Humble Bundle's publishing imprint "Presented by Humble Bundle", and released on September 27, 2018, on macOS and Microsoft Windows through the Humble Bundle Store and Steam, and on Nintendo Switch through the Nintendo eShop. It was part of the first wave of games to be released with the GameMaker Studio 2 Nintendo Switch license, with Hyper Light Drifter and Undertale also being released for the Switch in September. The game was later released on PlayStation 4 via the PlayStation Store on January 22, 2019, Microsoft Windows via the Microsoft Store as a launch title for Xbox Game Pass for PC on June 9, 2019, and Xbox One via the Microsoft Store on December 6, 2019. On Xbox One, the game was also made available through Xbox Game Pass for Console. Steam controversy Wandersong saw a troubled release on Steam, as the game was erroneously tagged for review under Valve's policies against games that intentionally inflate users' statistics, such as achievements and trading card progression. Trading cards and achievements for Wandersong were inaccessible to players, who also could not add the game to their "favorite games" lists as a result of the policy. The restriction lasted until January 9, 2019, when Valve, the developer of Steam, contacted Lobanov via email and described the error as a glitch in which Steam had not updated the game's store page correctly. Sales of the game on Steam were not impacted by the error. Reception Wandersong received "generally favorable reviews" according to review aggregator Metacritic. Reviewers praised the game's story, music, visuals, and attention to detail, but found the gameplay lackluster by comparison. 
Writing for Destructoid, CJ Andriessen gave the game a score of 9/10, calling it "a rollercoaster ride through the spectrum of feelings, all wrapped up in a lovingly crafted construction paper world". Giada Zavarise of Rock, Paper, Shotgun wrote that Wandersong "wants to make you smile" and noted its accessibility for players who are colorblind or deaf, but remarked that the puzzles and platforming make it "less instantly accessible than, say, [...] Night in the Woods". Phil Savage of PC Gamer praised the characters, saying that "it's the smaller character details—the people you meet along your journey—that ultimately resonate". Javy Gwaltney's review for Game Informer criticized Wandersong's platforming and rhythm segments, stating that the gameplay "isn't much fun" and lacks depth or challenge, though the characters are "charming enough to keep things engaging". Accolades Wandersong was named the 9th best Nintendo Switch game of 2018 and the 13th best PC game of 2018 by the magazine Paste. At the 2019 Independent Games Festival, Wandersong was nominated for Excellence in Narrative, and received honorable mentions for the Seumas McNally Grand Prize, Excellence in Visual Art, and Excellence in Audio. It was also nominated for the "Game, Original Family" award at the 2019 NAVGTR Awards, and for Most Fulfilling Community-Funded Game at the 2019 SXSW Gaming Awards. At the 2019 G.A.N.G. Awards, Wandersong was nominated for Best Interactive Score and Best Sound Design for an Indie Game. See also List of GameMaker Studio games List of puzzle video games References Sources Citations External links 2018 video games GameMaker Studio games Indie video games Kickstarter-funded video games LGBT-related video games MacOS games Music video games Nintendo Switch games PlayStation 4 games Puzzle video games Single-player video games Video games developed in Canada Windows games Humble Games games Xbox One games
1267921
https://en.wikipedia.org/wiki/Diversi-Dial
Diversi-Dial
Diversi-Dial, or DDial, was an online chat server that was popular during the mid-1980s. It was a specialized type of bulletin board system that allowed all callers to send lines of text to each other in real time, often operating at 300 baud. In some ways, it was a sociological forerunner to IRC, and was a cheap, local alternative to CompuServe chat, which was expensive and billed by the minute. At its peak, at least 35 major DDial systems existed across the United States, many of them in large cities. During the evening, when telephone rates were low, the biggest DDial systems would link together using Telenet or PC Pursuit connections, forming regional chat networks. History Diversi-Dial was written by Bill Basham, a computer hobbyist who ran a company known as Diversified Software Research in Farmington Hills, Michigan. The software was written while he was a resident of Rockford, Illinois, where Basham ran a copy of the system himself at the time. Kim Kirkpatrick ("Hubcap"), also in Rockford, ran DDial #2, and a lot of early testing was done between Bill and Kim. Another early Rockford site, named "Heaven and Hell - The World that GOD Rulz Over!", was owned by Dale Wishop (GOD). The phone company had to run miles of new cable just to accommodate the additional phone lines. When "Heaven and Hell" shut down, Dale sold some of his 1200 baud Applecat modems to Scott and Terri Johnson (Megabucks and Spender), and DDial #12, "Spenderz Never Inn", started up in its place, running on two Apple //e computers and 12 phone lines. All of Bill's software followed the same "Diversi-[something]" naming scheme. Organization Customers typically paid the local DDial owner a flat rate of about $5 to $20 per month. Open access to anonymous visitors (called nons, r0s, JAMFs, or m0es) was an effective hook to draw in paid registrations. Nons typically had a five-minute connect time limit unless they were "validated" by an assistant sysop, and were shut out of the system during peak usage hours. A typical DDial system ran on a small cluster of Apple II computers, with seven connections per computer. In 1989, a DDial-like clone, the Synergy Teleconferencing System (STS), was developed for the IBM PC, but by this time it was outpaced by alternatives like GEnie. By the mid-1990s, DDials had been bypassed by the Internet and IRC, although Chicago's God's Country kept an incredibly loyal following between 1985–87 and 1989–1998, and many of its users are still close to this day. Many client software programs existed for BBS connections at the time, but one in particular for the Commodore 64 was optimized just for DDial, called EagleTerm 6a. Written by Jungle Jim (Jj), aka Jim Sanders, it was released as freeware and saw widespread use among DDialers. EagleTerm 6a took full advantage of the Commodore 64's pulse-dial modem capability: it was heavily optimized to find the maximum pulse speed of the user's local phone connection and to rapid-fire redial back in to beat other callers when reconnecting, easily beating the newer tone-dial modems just coming to market. During peak times, DDial systems were packed with callers far exceeding the number of available lines, so a very fast dialer was a plus. Later versions of EagleTerm 6a were protected against reverse engineering (not decompilable using unBlitz). Major DDials One DDial owner went on to become the founder and CEO of Honesty.com, the first web-based third-party Internet application corporation, focused on e-commerce sites such as eBay, Amazon.com, and Yahoo! 
Auctions, by utilizing the knowledge gleaned from having run a social and community-based computer system for a decade prior to the initial popularity of the Web. Point Zer0 was the other long-term Chicago-area DDial, along with Jokertown. Other Chicago-area DDials of note included Kaleidoscope, General Modem (DDial #13), Tangled Web, The Bunker (DDial #4), Cloud Nine (DDial #38), Black Magic and others. At one point, the Chicago area hosted over 10 DDial or clone systems, possibly due to its relative proximity to the Rockford origins of Basham's DD #1. ENTchat, an Internet-based DDial look-alike, was somewhat active in the mid- to late 1990s but also went offline. In 2006, The Late Night BBS went online, utilizing the original DDial software running on an Apple //e but accessible from the Internet via telnet. The system provided an authentic 1980s DDial experience, including the traditional 300 bit/s connection speed. Late Night BBS has since gone offline. As of 2012, there are only two known DDial stations in operation. The Savage Frontier, DDial Station #28, has been modified to run under emulation and is therefore Internet accessible; this system served the Philadelphia metropolitan area in the 1980s and 1990s, at times under other names. RMAC (aka Rover's Multiuser Active Conference), DDial Station #34, runs on original Apple IIe hardware with modems and has been constructed in an Internet-accessible manner. This system served the Dallas / Fort Worth metroplex in the late 1980s. Today, the system uses authentic DDial software with TASC/Paradise mods, and can be reached via telnet at rmac.d-dial.com. Retro-Dial, a Linux-based chat server with the look and feel of DDial, currently has multiple stations in operation, which are usually linked with each other as well as with The Savage Frontier. The home station for Retro-Dial can be reached via telnet at carriersync.com. In late June 2013, several members of Rockford, Illinois-area DDials held a spontaneous reunion online, connected to each other via RMAC DDial #34. References Diversi-Dial instructions and software images DDial Highlights Searchable database of old DDials and other bulletin board systems Discussion group for DDial users The Savage Frontier RMAC Diversi-Dial #34 Project Page for Mouse's DDial Emulator XxSwitchBladexX's Digital Dial homepage CarrierSync - Retro-Dial Home Page Bulletin board system software
62534641
https://en.wikipedia.org/wiki/RGraph
RGraph
RGraph is an HTML5 charting library written in native JavaScript. It was created in 2008. RGraph started as an easy-to-use commercial tool based solely on HTML5 canvas. It later became freely available under the open-source MIT license and supports more than 50 chart types rendered in both SVG and canvas (see the sketch at the end of this article). License RGraph is published under the open-source MIT license. Mentions In July 2014, Salesforce made RGraph available to be plugged into the reporting and dashboard tools on its mobile platform. RGraph is among six third-party visualization tools available inside the dashboards, together with Google Charts, D3.js, CanvasJS, Chart.js, and Highcharts. In the book "Android Cookbook: Problems and Solutions for Android Developers", RGraph is recommended as an alternative to creating Android charts in pure Java. See also JavaScript framework JavaScript library References External links Data visualization software JavaScript libraries JavaScript visualization toolkits JavaScript Visualization API Charts Infographics Free software programmed in JavaScript Software using the MIT license Free data analysis software Formerly proprietary software
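The following is a minimal, illustrative sketch of how a chart might be drawn with RGraph on an HTML5 canvas. It assumes the object-based constructor pattern (new RGraph.Bar({id, data, options}).draw()) used by recent versions of the library; the canvas id, the sample data and the option names are placeholders chosen for illustration rather than details taken from this article, and exact option names vary between RGraph releases.

```javascript
// Minimal RGraph canvas bar chart (illustrative sketch only).
// Assumes the RGraph core and bar-chart scripts are loaded on the page,
// along with a <canvas id="cvs" width="500" height="250"></canvas> element.
window.onload = function () {
    new RGraph.Bar({
        id: 'cvs',                  // id of the target canvas element (placeholder)
        data: [12, 18, 10, 24, 16], // sample values to plot (placeholder data)
        options: {
            title: 'Sample bar chart' // option names differ between RGraph versions
        }
    }).draw();
};
```

An equivalent SVG chart would use the corresponding constructor from the library's SVG namespace; in either case the library handles the drawing, so only the data and a target element need to be supplied.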
62381917
https://en.wikipedia.org/wiki/Software%20bot
Software bot
A software bot is a type of software agent in the service of software project management and software engineering. A software bot has an identity and potentially personified aspects in order to serve its stakeholders. Software bots often compose software services and provide an alternative user interface, which is sometimes, but not necessarily, conversational. Software bots are typically used to execute tasks, suggest actions, engage in dialogue, and promote social and cultural aspects of a software project. The term bot is derived from robot. However, robots act in the physical world while software bots act only in digital spaces. Some software bots are designed and behave as chatbots, but not all chatbots are software bots. Erlenhov et al. discuss the past and future of software bots and show that software bots have been adopted for many years. Usage Software bots are used to support development activities, such as communication among software developers and automation of repetitive tasks. Software bots have been adopted by several communities related to software development, such as open-source communities on GitHub and Stack Overflow. GitHub bots have user accounts and can open, close, or comment on pull requests and issues. GitHub bots have been used to assign reviewers, ask contributors to sign the Contributor License Agreement, report continuous integration failures, review code and pull requests, welcome newcomers, run automated tests, merge pull requests, fix bugs and vulnerabilities, etc. The Slack tool includes an API for developing software bots. There are Slack bots for keeping track of to-do lists, coordinating standup meetings, and managing support tickets. Commercial chatbot products further simplify the process of creating a custom Slack bot. On Wikipedia, Wikipedia bots automate a variety of tasks, such as creating stub articles, consistently updating the format of multiple articles, and so on. Bots like ClueBot NG are capable of recognizing vandalism and automatically removing disruptive content. Taxonomies and Classification Frameworks Lebeuf et al. provide a faceted taxonomy to characterize bots based on a literature review. It is composed of three main facets: (i) properties of the environment that the bot was created in; (ii) intrinsic properties of the bot itself; and (iii) the bot's interactions within its environment. They further detail the facets into sets of sub-facets under each of the main facets. Paikari and van der Hoek defined a set of dimensions to enable comparing software bots, applied specifically to chatbots. It resulted in six dimensions: Type: the main purpose of the bot (information, collaboration, or automation) Direction of the "conversation" (input, output, or bi-directional) Guidance (human-mediated, or autonomous) Predictability (deterministic, or evolving) Interaction style (dull, alternate vocabulary, relationship-builder, human-like) Communication channel (text, voice, or both) Erlenhov et al. raised the question of the difference between a bot and simple automation, since much research carried out under the name of software bots uses the term bot to describe a wide variety of tools, some of which are "just" plain old development tools. After interviewing and surveying over 100 developers, the authors found that not one but three definitions dominated the community.
They created three personas based on these definitions; the difference between what the three personas regard as a bot lies mainly in which set of human-like traits they associate with it. The chat bot persona (Charlie) primarily thinks of bots as tools that communicate with the developer through a natural language interface (typically voice or chat), caring little about what tasks the bot is used for or how it actually implements those tasks. The autonomous bot persona (Alex) thinks of bots as tools that work on their own (without requiring much input from a developer) on a task that would normally be done by a human. The smart bot persona (Sam) separates bots from plain old development tools by how smart (technically sophisticated) a tool is. Sam cares less about how the tool communicates and more about whether it is unusually good or adaptive at executing a task. The authors recommend that people doing research or writing about bots place their work in the context of one of the personas, since the personas have different expectations of, and problems with, the tools. Examples of notable bots Dependabot and Renovatebot update software dependencies and detect vulnerabilities. (https://dependabot.com/) Probot is an organization that creates and maintains bots for GitHub; example bots built with Probot include the following (a minimal sketch of a Probot app appears at the end of this article). Auto Assign (https://probot.github.io/apps/auto-assign/) license bot (https://probot.github.io/) Sentiment bot (https://probot.github.io/apps/sentiment-bot/) Untrivializer bot (https://probot.github.io/apps/untrivializer/) Refactoring-Bot: provides refactorings based on static code analysis Looks good to me bot (LGTM) is a Semmle product that inspects pull requests on GitHub for code style and unsafe code practices. Issues and threats Software bots may not be well accepted by humans. A study from the University of Antwerp compared how developers active on Stack Overflow perceive answers generated by software bots. It found that developers perceive the quality of bot-generated answers to be significantly worse if the identity of the software bot is made apparent. By contrast, answers from software bots with a human-like identity were better received. In practice, when software bots are used on platforms like GitHub or Wikipedia, their username makes it clear that they are bots, e.g., DependaBot, RenovateBot, DatBot, SineBot. Bots may be subject to special rules. For instance, the GitHub terms of service do not allow `bot` accounts but accept `machine accounts`, where a `machine account` has two properties: 1) a human takes full responsibility for the bot's actions, and 2) it cannot create other accounts. See also Chatbot Daemon Internet bot Software agent References Software engineering
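To make the GitHub bot examples above more concrete, the following is a minimal sketch of a Probot app that welcomes the author of every newly opened issue, similar in spirit to the "welcome newcomers" bots described earlier. It follows Probot's documented app pattern (an exported function receiving app, event handlers registered with app.on, and an authenticated client on context.octokit); the comment text and file layout are illustrative only, and details such as context.octokit versus the older context.github differ between Probot versions.

```javascript
// index.js — minimal Probot app (illustrative sketch, not one of the bots listed above).
// Run with the Probot CLI and a configured GitHub App, e.g. `probot run ./index.js`.
module.exports = (app) => {
  // React whenever a new issue is opened in a repository the app is installed on.
  app.on('issues.opened', async (context) => {
    // context.issue() fills in the owner, repo and issue number from the event payload.
    const comment = context.issue({
      body: 'Thanks for opening this issue! A maintainer will take a look soon.',
    });
    // Post the comment through the authenticated Octokit client.
    return context.octokit.issues.createComment(comment);
  });
};
```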
84636
https://en.wikipedia.org/wiki/OK%20Computer
OK Computer
OK Computer is the third studio album by English rock band Radiohead, released on 21 May 1997 on EMI subsidiaries Parlophone and Capitol Records. Radiohead self-produced the album with Nigel Godrich, an arrangement they have used for their subsequent albums. Other than the song "Lucky", recorded in 1995, Radiohead recorded OK Computer in Oxfordshire and Bath in 1996 and early 1997, mostly in the historic mansion St Catherine's Court. The band distanced themselves from the guitar-centred, lyrically introspective style of their previous album, The Bends. OK Computer's abstract lyrics, densely layered sound and eclectic influences laid the groundwork for Radiohead's later, more experimental work. The album's lyrics depict a world fraught with rampant consumerism, social alienation, emotional isolation and political malaise; in this capacity, OK Computer has been said to have prescient insight into the mood of 21st-century life. The band used unconventional production techniques, including natural reverberation through recording on a staircase, and no audio separation. Strings were recorded at Abbey Road Studios in London. Guitarist Ed O'Brien estimated that 80 per cent of the album was recorded live. Despite lowered sales estimates by EMI, who deemed the record uncommercial and difficult to market, OK Computer reached number one on the UK Albums Chart and debuted at number 21 on the Billboard 200, Radiohead's highest album entry on the US charts at the time, and was soon certified 5× platinum in the UK and double platinum in the US. The songs "Paranoid Android", "Karma Police", "Lucky" and "No Surprises" were released as singles. The album expanded Radiohead's international popularity and has sold at least 7.8 million units worldwide. A remastered version with additional tracks, OKNOTOK 1997 2017, was released in 2017, marking the album's twentieth anniversary. In 2019, in response to an internet leak, Radiohead released MiniDiscs [Hacked], comprising hours of demos, rehearsals, live performances and other material. OK Computer received critical acclaim and has been cited by listeners, critics and musicians as one of the greatest albums of all time. It was nominated for Album of the Year and won Best Alternative Music Album at the 1998 Grammy Awards. It was also nominated for Best British Album at the 1998 Brit Awards. The album initiated a stylistic shift in British rock away from Britpop toward melancholic, atmospheric alternative rock that became more prevalent in the next decade. In 2014, it was included by the Library of Congress in the National Recording Registry as "culturally, historically, or aesthetically significant". Background In 1995, Radiohead toured in support of their second album, The Bends (1995). Midway through the tour, Brian Eno commissioned them to contribute a song to The Help Album, a charity compilation organised by War Child; the album was to be recorded over the course of a single day, 4 September 1995, and rush-released that week. Radiohead recorded "Lucky" in five hours with engineer Nigel Godrich, who had engineered on The Bends and produced several Radiohead B-sides. Godrich said of the session: "Those things are the most inspiring, when you do stuff really fast and there's nothing to lose. We left feeling fairly euphoric. So after establishing a bit of a rapport work-wise, I was sort of hoping I would be involved with the next album." 
Singer Thom Yorke said "Lucky" shaped the nascent sound and mood of their upcoming record: "'Lucky' was indicative of what we wanted to do. It was like the first mark on the wall." Radiohead found touring stressful and took a break in January 1996. They sought to distance their new material from the introspective style of The Bends. Drummer Philip Selway said: "There was an awful lot of soul-searching [on The Bends]. To do that again on another album would be excruciatingly boring." Yorke said he did not want to do "another miserable, morbid and negative record", and was "writing down all the positive things that I hear or see. I'm not able to put them into music yet and I don't want to just force it." The critical and commercial success of The Bends gave Radiohead the confidence to self-produce their third album. Their label Parlophone gave them a £100,000 budget for recording equipment. Guitarist Jonny Greenwood said "the only concept that we had for this album was that we wanted to record it away from the city and that we wanted to record it ourselves". According to guitarist Ed O'Brien, "Everyone said, 'You'll sell six or seven million if you bring out The Bends Pt 2,' and we're like, 'We'll kick against that and do the opposite'." A number of producers, including major figures such as R.E.M. producer Scott Litt, were suggested, but the band were encouraged by their sessions with Godrich. They consulted him for advice on what equipment to use, and prepared for the sessions by buying their own equipment, including a plate reverberator purchased from songwriter Jona Lewie. Although Godrich had sought to focus his work on electronic dance music, he outgrew his role as advisor and became the album's co-producer. Recording In early 1996, Radiohead recorded demos for their third album at Chipping Norton Recording Studios, Oxford. In July, they began rehearsing and recording in their Canned Applause studio, a converted shed near Didcot, Oxfordshire. Even without the deadline that contributed to the stress of The Bends, the band had difficulties, which Selway blamed on their choice to self-produce: "We're jumping from song to song, and when we started to run out of ideas, we'd move on to a new song ... The stupid thing was that we were nearly finished when we'd move on, because so much work had gone into them." The members worked with nearly equal roles in the production and formation of the music, though Yorke was still firmly "the loudest voice", according to O'Brien. Selway said "we give each other an awful lot of space to develop our parts, but at the same time we are all very critical about what the other person is doing." Godrich's role as co-producer was part collaborator, part managerial outsider. He said that Radiohead "need to have another person outside their unit, especially when they're all playing together, to say when the take goes well ... I take up slack when people aren't taking responsibility—the term producing a record means taking responsibility for the record ... It's my job to ensure that they get the ideas across." Godrich has produced every Radiohead album since, and has been characterised as Radiohead's "sixth member", an allusion to George Martin's nickname as the "fifth Beatle". Radiohead decided that Canned Applause was an unsatisfactory recording location, which Yorke attributed to its proximity to the band members' homes, and Jonny Greenwood attributed to its lack of dining and bathroom facilities. 
The group had nearly completed four songs: "Electioneering", "No Surprises", "Subterranean Homesick Alien" and "The Tourist". They took a break from recording to embark on an American tour in 1996, opening for Alanis Morissette, performing early versions of several new songs. During the tour, filmmaker Baz Luhrmann commissioned Radiohead to write a song for his upcoming film Romeo + Juliet and gave them the final 30 minutes of the film. Yorke said: "When we saw the scene in which Claire Danes holds the Colt .45 against her head, we started working on the song immediately." Soon afterwards, the band wrote and recorded "Exit Music (For a Film)"; the track plays over the film's end credits but was excluded from the soundtrack album at the band's request. The song helped shape the direction of the rest of the album; Yorke said it "was the first performance we'd ever recorded where every note of it made my head spin—something I was proud of, something I could turn up really, really loud and not wince at any moment." Radiohead resumed recording in September 1996 at St Catherine's Court, a historic mansion near Bath owned by actress Jane Seymour. The mansion was unoccupied but sometimes used for corporate functions. The change of setting marked an important transition in the recording process. Greenwood, comparing the mansion to previous studio settings, said it "was less like a laboratory experiment, which is what being in a studio is usually like, and more about a group of people making their first record together." The band made extensive use of the different rooms and acoustics in the house. The vocals on "Exit Music (For a Film)" feature natural reverberation achieved by recording on a stone staircase, and "Let Down" was recorded in a ballroom at 3 a.m. Isolation allowed the band to work at a different pace, with more flexible and spontaneous working hours. O'Brien said that "the biggest pressure was actually completing [the recording]. We weren't given any deadlines and we had complete freedom to do what we wanted. We were delaying it because we were a bit frightened of actually finishing stuff." Yorke was satisfied with the recordings made at the location, and enjoyed working without audio separation, meaning that instruments were not overdubbed separately. O'Brien estimated that 80 per cent of the album was recorded live, and said: "I hate doing overdubs, because it just doesn't feel natural. ... Something special happens when you're playing live; a lot of it is just looking at one another and knowing there are four other people making it happen." Many of Yorke's vocals were first takes; he felt that if he made other attempts he would "start to think about it and it would sound really lame". Radiohead returned to Canned Applause in October for rehearsals, and completed most of OK Computer in further sessions at St. Catherine's Court. By Christmas, they had narrowed the track listing to 14 songs. The strings were recorded at Abbey Road Studios in London in January 1997. The album was mixed over the next two months at various London studios, then mastered by Chris Blair at Abbey Road. Godrich preferred a quick and "hands-off" approach to mixing, and said: "I feel like I get too into it. I start fiddling with things and I fuck it up ... I generally take about half a day to do a mix. If it's any longer than that, you lose it. The hardest thing is trying to stay fresh, to stay objective." 
Music and lyrics Style and influences Yorke said that the starting point for the record was the "incredibly dense and terrifying sound" of Bitches Brew, the 1970 avant-garde jazz fusion album by Miles Davis. He described the sound of Bitches Brew to Q: "It was building something up and watching it fall apart, that's the beauty of it. It was at the core of what we were trying to do with OK Computer." Yorke identified "I'll Wear It Proudly" by Elvis Costello, "Fall on Me" by R.E.M., "Dress" by PJ Harvey and "A Day in the Life" by the Beatles as particularly influential on his songwriting. Radiohead drew further inspiration from the recording style of film soundtrack composer Ennio Morricone and the krautrock band Can, musicians Yorke described as "abusing the recording process". Jonny Greenwood described OK Computer as a product of being "in love with all these brilliant records ... trying to recreate them, and missing". According to Yorke, Radiohead hoped to achieve an "atmosphere that's perhaps a bit shocking when you first hear it, but only as shocking as the atmosphere on the Beach Boys' Pet Sounds". They expanded their instrumentation to include electric piano, Mellotron, cello and other strings, glockenspiel and electronic effects. Jonny Greenwood summarised the exploratory approach as "when we've got what we suspect to be an amazing song, but nobody knows what they're gonna play on it." Spin characterised OK Computer as sounding like "a DIY electronica album made with guitars". Critics suggested a stylistic debt to 1970s progressive rock, an influence that Radiohead have disavowed. According to Andy Greene in Rolling Stone, Radiohead "were collectively hostile to seventies progressive rock ... but that didn't stop them from reinventing prog from scratch on OK Computer, particularly on the six-and-a-half-minute 'Paranoid Android'." Tom Hull believed the album was "still prog, but may just be because rock has so thoroughly enveloped musical storytelling that this sort of thing has become inevitable." Writing in 2017, The New Yorkers Kelefa Sanneh said OK Computer "was profoundly prog: grand and dystopian, with a lead single that was more than six minutes long". Lyrics The album's lyrics, written by Yorke, are more abstract compared to his personal, emotional lyrics for The Bends. Critic Alex Ross said the lyrics "seemed a mixture of overheard conversations, techno-speak, and fragments of a harsh diary" with "images of riot police at political rallies, anguished lives in tidy suburbs, yuppies freaking out, sympathetic aliens gliding overhead." Recurring themes include transport, technology, insanity, death, modern British life, globalisation and anti-capitalism. Yorke said: "On this album, the outside world became all there was ... I'm just taking Polaroids of things around me moving too fast." He told Q: "It was like there's a secret camera in a room and it's watching the character who walks in—a different character for each song. The camera's not quite me. It's neutral, emotionless. But not emotionless at all. In fact, the very opposite." Yorke also drew inspiration from books, including Noam Chomsky's political writing, Eric Hobsbawm's The Age of Extremes, Will Hutton's The State We're In, Jonathan Coe's What a Carve Up! and Philip K. Dick's VALIS. The songs of OK Computer do not have a coherent narrative, and the album's lyrics are generally considered abstract or oblique. 
Nonetheless, many musical critics, journalists, and scholars consider the album to be a concept album or song cycle, or have analysed it as a concept album, noting its strong thematic cohesion, aesthetic unity, and the structural logic of the song sequencing. Although the songs share common themes, Radiohead have said they do not consider OK Computer a concept album and did not intend to link the songs through a narrative or unifying concept while it was being written. Jonny Greenwood said: "I think one album title and one computer voice do not make a concept album. That's a bit of a red herring." However, the band intended the album to be heard as a whole, and spent two weeks ordering the track list. O'Brien said: "The context of each song is really important ... It's not a concept album but there is a continuity there." Composition Tracks 1–6 The opening track, "Airbag", is underpinned by a beat built from a seconds-long recording of Selway's drumming. The band sampled the drum track with a sampler and edited it with a Macintosh computer, inspired by the music of DJ Shadow, but admitted to making approximations in emulating Shadow's style due to their programming inexperience. The bassline stops and starts unexpectedly, achieving an effect similar to 1970s dub. The song's references to automobile accidents and reincarnation were inspired by a magazine article titled "An Airbag Saved My Life" and The Tibetan Book of the Dead. Yorke wrote "Airbag" about the illusion of safety offered by modern transit, and "the idea that whenever you go out on the road you could be killed". The BBC wrote of the influence of J. G. Ballard, especially his 1973 novel Crash, on the lyrics. Music journalist Tim Footman noted that the song's technical innovations and lyrical concerns demonstrated the "key paradox" of the album: "The musicians and producer are delighting in the sonic possibilities of modern technology; the singer, meanwhile, is railing against its social, moral, and psychological impact ... It's a contradiction mirrored in the culture clash of the music, with the 'real' guitars negotiating an uneasy stand-off with the hacked-up, processed drums." Split into four sections with an overall running time of 6:23, "Paranoid Android" is among the band's longest songs. The unconventional structure was inspired by the Beatles' "Happiness Is a Warm Gun" and Queen's "Bohemian Rhapsody", which also eschew a traditional verse-chorus-verse structure. Its musical style was also inspired by the music of the Pixies. The song was written by Yorke after an unpleasant night at a Los Angeles bar, where he saw a woman react violently after someone spilled a drink on her. Its title and lyrics are a reference to Marvin the Paranoid Android from Douglas Adams's The Hitchhiker's Guide to the Galaxy series. The use of electric keyboards in "Subterranean Homesick Alien" is an example of the band's attempts to emulate the atmosphere of Bitches Brew. Its title references the Bob Dylan song "Subterranean Homesick Blues", and the lyrics describe an isolated narrator who fantasises about being abducted by extraterrestrials. The narrator speculates that, upon returning to Earth, his friends would not believe his story and he would remain a misfit. The lyrics were inspired by an assignment from Yorke's time at Abingdon School to write a piece of "Martian poetry", a British literary movement that humorously recontextualises mundane aspects of human life from an alien perspective. 
William Shakespeare's Romeo and Juliet inspired the lyrics for "Exit Music (For a Film)". Initially Yorke wanted to work lines from the play into the song, but the final draft of the lyrics became a broad summary of the narrative. He said: "I saw the Zeffirelli version when I was 13 and I cried my eyes out, because I couldn't understand why, the morning after they shagged, they didn't just run away. It's a song for two people who should run away before all the bad stuff starts." Yorke compared the opening of the song, which mostly features his singing paired with acoustic guitar, to Johnny Cash's At Folsom Prison. Mellotron choir and other electronic voices are used throughout the track. The song climaxes with the entrance of drums and distorted bass run through a fuzz pedal. The climactic portion of the song is an attempt to emulate the sound of trip hop group Portishead, but in a style that bass player Colin Greenwood called more "stilted and leaden and mechanical". The song concludes by fading back to Yorke's voice, acoustic guitar and Mellotron. "Let Down" contains multilayered arpeggiated guitars and electric piano. Jonny Greenwood plays his guitar part in a different time signature to the other instruments. O'Brien said the song was influenced by Phil Spector, a producer and songwriter best known for his reverberating "Wall of Sound" recording techniques. The lyrics, Yorke said, are about a fear of being trapped, and "about that feeling that you get when you're in transit but you're not in control of it—you just go past thousands of places and thousands of people and you're completely removed from it". Of the line "Don't get sentimental / It always ends up drivel", Yorke said: "Sentimentality is being emotional for the sake of it. We're bombarded with sentiment, people emoting. That's the Let Down. Feeling every emotion is fake. Or rather every emotion is on the same plane whether it's a car advert or a pop song." Yorke felt that scepticism of emotion was characteristic of Generation X and that it had informed the band's approach to the album. "Karma Police" has two main verses that alternate with a subdued break, followed by a different ending section. The verses centre around acoustic guitar and piano, with a chord progression indebted to the Beatles' "Sexy Sadie". Starting at 2:34, the song transitions into an orchestrated section with the repeated line "For a minute there, I lost myself". It ends with guitarist Ed O'Brien generating feedback using a delay effect. The title and lyrics to "Karma Police" originate from an in-joke during The Bends tour; Jonny Greenwood said "whenever someone was behaving in a particularly shitty way, we'd say 'The karma police will catch up with him sooner or later.'" Tracks 7–12 "Fitter Happier" is a short musique concrète track that consists of sampled musical and background sound and spoken-word lyrics recited by "Fred", a synthesised voice from the Macintosh SimpleText application. Yorke wrote the lyrics "in ten minutes" after a period of writer's block while the rest of the band were playing. He described the words as a checklist of slogans for the 1990s, and he considered the lyrics "the most upsetting thing I've ever written", and said it was "liberating" to give the words to a neutral-sounding computer voice. Among the samples in the background is an audio loop from the 1975 film Three Days of the Condor. The band considered using "Fitter Happier" as the album's opening track, but decided the effect was off-putting. 
Steve Lowe called the song "penetrating surgery on pseudo-meaningful corporations' lifestyles" with "a repugnance for prevailing yuppified social values". Among the loosely connected imagery of the lyrics, Footman identified the song's subject as "the materially comfortable, morally empty embodiment of modern, Western humanity, half-salaryman, half-Stepford Wife, destined for the metaphorical farrowing crate, propped up on Prozac, Viagra and anything else his insurance plan can cover." Sam Steele called the lyrics "a stream of received imagery: scraps of media information, interspersed with lifestyle ad slogans and private prayers for a healthier existence. It is the hum of a world buzzing with words, one of the messages seeming to be that we live in such a synthetic universe we have grown unable to detect reality from artifice." "Electioneering", featuring a cowbell and a distorted guitar solo, is the album's most rock-oriented track and one of the heaviest songs Radiohead has recorded. It has been compared to Radiohead's earlier style on Pablo Honey. The cynical "Electioneering" is the album's most directly political song, with lyrics inspired by the Poll Tax Riots. The song was also inspired by Chomsky's Manufacturing Consent, a book analysing contemporary mass media under the propaganda model. Yorke likened its lyrics, which focus on political and artistic compromise, to "a preacher ranting in front of a bank of microphones". Regarding its oblique political references, Yorke said, "What can you say about the IMF, or politicians? Or people selling arms to African countries, employing slave labour or whatever. What can you say? You just write down 'Cattle prods and the IMF' and people who know, know." O'Brien said the song was about the promotional cycle of touring: "After a while you feel like a politician who has to kiss babies and shake hands all day long." "Climbing Up the Walls" – described by Melody Maker as "monumental chaos" – is layered with a string section, ambient noise and repetitive, metallic percussion. The string section, composed by Jonny Greenwood and written for 16 instruments, was inspired by modern classical composer Krzysztof Penderecki's Threnody to the Victims of Hiroshima. Greenwood said, "I got very excited at the prospect of doing string parts that didn't sound like 'Eleanor Rigby', which is what all string parts have sounded like for the past 30 years." Select described Yorke's distraught vocals and the atonal strings as "Thom's voice dissolving into a fearful, blood-clotted scream as Jonny whips the sound of a million dying elephants into a crescendo". For the lyrics, Yorke drew from his time as an orderly in a mental hospital during the Care in the Community policy of deinstitutionalising mental health patients, and a New York Times article about serial killers. "No Surprises", recorded in a single take, is arranged with electric guitar (inspired by the Beach Boys' "Wouldn't It Be Nice"), acoustic guitar, glockenspiel and vocal harmonies. The band strove to replicate the mood of Louis Armstrong's 1968 recording of "What a Wonderful World" and the soul music of Marvin Gaye. Yorke identified the subject of the song as "someone who's trying hard to keep it together but can't". The lyrics seem to portray a suicide or an unfulfilling life, and dissatisfaction with contemporary social and political order. Some lines refer to rural or suburban imagery. 
One of the key metaphors in the song is the opening line, "a heart that's full up like a landfill"; according to Yorke, the song is a "fucked-up nursery rhyme" that "stems from my unhealthy obsession of what to do with plastic boxes and plastic bottles ... All this stuff is getting buried, the debris of our lives. It doesn't rot, it just stays there. That's how we deal, that's how I deal with stuff, I bury it." The song's gentle mood contrasts sharply with its harsh lyrics; Steele said, "even when the subject is suicide ... O'Brien's guitar is as soothing as balm on a red-raw psyche, the song rendered like a bittersweet child's prayer." "Lucky" was inspired by the Bosnian War. Sam Taylor said it was "the one track on [The Help Album] to capture the sombre terror of the conflict", and that its serious subject matter and dark tone made the band "too 'real' to be allowed on the Britpop gravy train". The lyrics were pared down from many pages of notes, and were originally more politically explicit. The lyrics depict a man surviving an aeroplane crash and are drawn from Yorke's anxiety about transportation. The musical centerpiece of "Lucky" is its three-piece guitar arrangement, which grew out of the high-pitched chiming sound played by O'Brien in the song's introduction, achieved by strumming above the guitar nut. Critics have compared its lead guitar to Pink Floyd and, more broadly, arena rock. The album ends with "The Tourist", which Jonny Greenwood wrote as an unusually staid piece where something "doesn't have to happen ... every three seconds". He said, "'The Tourist' doesn't sound like Radiohead at all. It has become a song with space." The lyrics, written by Yorke, were inspired by his experience of watching American tourists in France frantically trying to see as many tourist attractions as possible. He said it was chosen as the closing track because "a lot of the album was about background noise and everything moving too fast and not being able to keep up. It was really obvious to have 'Tourist' as the last song. That song was written to me from me, saying, 'Idiot, slow down.' Because at that point, I needed to. So that was the only resolution there could be: to slow down." The "unexpectedly bluesy waltz" draws to a close as the guitars drop out, leaving only drums and bass, and concludes with the sound of a small bell. Title The title OK Computer is taken from the 1978 Hitchhiker's Guide to the Galaxy radio series, in which the character Zaphod Beeblebrox speaks the phrase "Okay, computer, I want full manual control now." The members of Radiohead listened to the series on the bus during their 1996 tour and Yorke made a note of the phrase. "OK Computer" became a working title for the B-side "Palo Alto", which had been considered for inclusion on the album. The title stuck with the band; according to Jonny Greenwood, it "started attaching itself and creating all these weird resonances with what we were trying to do". Yorke said the title "refers to embracing the future, it refers to being terrified of the future, of our future, of everyone else's. It's to do with standing in a room where all these appliances are going off and all these machines and computers and so on ... and the sound it makes." He described the title as "a really resigned, terrified phrase", to him similar to the Coca-Cola advertisement "I'd Like to Teach the World to Sing". 
Wired writer Leander Kahney suggests that it is an homage to Macintosh computers, as the Mac's speech recognition software responds to the command "OK computer" as an alternative to clicking the "OK" button. Other titles considered were Ones and Zeroes—a reference to the binary numeral system—and Your Home May Be at Risk If You Do Not Keep Up Payments. Artwork The OK Computer artwork is a computer-generated collage of images and text created by Yorke, credited under the pseudonym the White Chocolate Farm, and Stanley Donwood. Yorke commissioned Donwood to work on a visual diary alongside the recording sessions. Yorke explained, "If I'm shown some kind of visual representation of the music, only then do I feel confident. Up until that point, I'm a bit of a whirlwind." The blue-and-white palette was, according to Donwood, the result of "trying to make something the colour of bleached bone". The image of two stick figures shaking hands appears in the liner notes and on the disc label in CD and LP releases. Yorke explained the image as emblematic of exploitation: "Someone's being sold something they don't really want, and someone's being friendly because they're trying to sell something. That's what it means to me." The image would later be used as the artwork for Radiohead's first compilation album, Radiohead: The Best Of. Explaining the artwork's themes, Yorke said, "It's quite sad, and quite funny as well. All the artwork and so on ... It was all the things that I hadn't said in the songs." Visual motifs in the artwork include motorways, aeroplanes, families with children, corporate logos and cityscapes. The photograph of a motorway on the cover was likely taken in Hartford, Connecticut, where Radiohead performed in 1996. The words "Lost Child" feature prominently on the cover, and the booklet artwork contains phrases in the constructed language Esperanto and health-related instructions in both English and Greek. Uncut critic David Cavanagh said the use of non-sequiturs created an effect "akin to being lifestyle-coached by a lunatic". White scribbles, Donwood's method of correcting mistakes rather than using the computer function undo, are present everywhere in the collages. The liner notes contain the full lyrics, rendered with atypical syntax, alternate spelling and small annotations. The lyrics are also arranged and spaced in shapes that resemble hidden images. In keeping with the band's then-emerging anti-corporate stance, the production credits contain the ironic copyright notice "Lyrics reproduced by kind permission even though we wrote them." Release and promotion Commercial expectations According to Selway, Radiohead's American label Capitol saw the album as "'commercial suicide'. They weren't really into it. At that point, we got the fear. How is this going to be received?" Yorke recalled: "when we first gave it to Capitol, they were taken aback. I don't really know why it's so important now, but I'm excited about it." Capitol lowered its sales forecast from two million to half a million. In O'Brien's view, only Parlophone, the band's British label, remained optimistic while global distributors dramatically reduced their sales estimates. Label representatives were reportedly disappointed with the lack of marketable singles, especially the absence of anything resembling Radiohead's hit "Creep". "OK Computer isn't the album we're going to rule the world with", Colin Greenwood predicted at the time. 
"It's not as hitting-everything-loudly-whilst-waggling-the-tongue-in-and-out, like The Bends. There's less of the Van Halen factor." Marketing Parlophone launched an unorthodox advertising campaign, taking full-page advertisements in high-profile British newspapers and tube stations with lyrics for "Fitter Happier" in large black letters against white backgrounds. The same lyrics, and artwork adapted from the album, were repurposed for shirt designs. Yorke said they chose the "Fitter Happier" lyrics to link what a critic called "a coherent set of concerns" between the album artwork and its promotional material. Other unconventional merchandise included a floppy disk containing Radiohead screensavers and an FM radio in the shape of a desktop computer. In America, Capitol sent 1,000 cassette players to prominent members of the press and music industry, each with a copy of the album permanently glued inside. Capitol president Gary Gersh said, "Our job is just to take them as a left-of-centre band and bring the centre to them. That's our focus, and we won't let up until they're the biggest band in the world." Radiohead planned to produce a video for every song on the album, but the project was abandoned due to financial and time constraints. According to "No Surprises" video director Grant Gee, the plan was scrapped when the videos for "Paranoid Android" and "Karma Police" went over budget. Also scrapped were plans for trip hop group Massive Attack to remix the album. Radiohead's website was created to promote the album, which went live at the time of its release, making the band the first to manage an online presence. Their first fansite, "atease", was made shortly following the album's release, with its title taken from "Fitter Happier". In 2017, during OK Computers 20th anniversary, Radiohead's website was temporarily restored to its 1997 state. Singles Radiohead chose "Paranoid Android" as the lead single, despite its unusually long running time and lack of a catchy chorus. Colin Greenwood said the song was "hardly the radio-friendly, breakthrough, buzz bin unit shifter [radio stations] can have been expecting", but that Capitol supported the choice. The song premiered on the Radio 1 programme The Evening Session in April 1997 and was released as a single in May 1997. On the strength of frequent radio play on Radio 1 and rotation of the song's music video on MTV, "Paranoid Android" reached number three in the UK, giving Radiohead their highest chart position. "Karma Police" was released in August 1997 and "No Surprises" in January 1998. Both singles charted in the UK top ten, and "Karma Police" peaked at number 14 on the Billboard Modern Rock Tracks chart. "Lucky" was released as a single in France, but did not chart. "Let Down", considered for release as the lead single, was issued as a promotional single in September 1997 and charted on the Modern Rock Tracks chart at number 29. Meeting People Is Easy, Grant Gee's rockumentary following the band on their OK Computer world tour, premiered in November 1998. Tour Radiohead embarked on a world tour in promotion of OK Computer, the "Against Demons" tour, commencing at the album launch in Barcelona on 22 May 1997. The tour took the band across the UK and Ireland, continental Europe, North America, Japan and Australasia, concluding on 18 April 1998 in New York. It was taxing for the band, particularly Yorke, who said: "That tour was a year too long. I was the first person to tire of it, then six months later everyone in the band was saying it. 
Then six months after that, nobody was talking any more." The tour included Radiohead's first headline Glastonbury Festival performance on 28 June 1997; despite technical problems that almost caused Yorke to abandon the stage, the performance was acclaimed and cemented Radiohead as a major live act. Rolling Stone described it as "an absolute triumph", and in 2004 Q called it the greatest concert of all time. Sales OK Computer was released in Japan on 21 May, in the UK on 16 June, in Canada on 17 June and in the US on 1 July. It was released on CD, double-LP vinyl record, cassette and MiniDisc. It debuted at number one in the UK with sales of 136,000 copies in its first week. In the US, it debuted at number 21 on the Billboard 200. It held the number-one spot in the UK for two weeks and stayed in the top ten for several more, ultimately becoming the year's bestselling record there. By February 1998, OK Computer had sold at least half a million copies in the UK and 2 million worldwide. By September 2000, Billboard reported that it had sold 4.5 million copies worldwide. The Los Angeles Times reported that by June 2001 it had sold 1.4 million copies in the US, and in April 2006 the IFPI announced it had sold 3 million copies across Europe. It has been certified triple platinum in the UK and double platinum in the US, in addition to certifications in other markets. By May 2016, Nielsen SoundScan figures showed OK Computer had sold 2.5 million digital album units in the US, plus an additional 900,000 sales measured in album-equivalent units. Twenty years to the week after its initial release, the Official Charts Company recorded total UK sales of 1.5 million, including album-equivalent units. Tallying American and European sales, OK Computer has sold at least 6.9 million copies worldwide (or 7.8 million with album-equivalent units). Critical reception OK Computer was almost uniformly praised on release. Critics described it as a landmark release of far-reaching impact and importance, but noted that its experimentalism made it a challenging listen. According to Tim Footman, "Not since 1967, with the release of Sgt. Pepper's Lonely Hearts Club Band, had so many major critics agreed immediately, not only on an album's merits, but on its long-term significance, and its ability to encapsulate a particular point in history." In the British press, the album garnered favourable reviews in NME, Melody Maker, The Guardian and Q. Nick Kent wrote in Mojo that "Others may end up selling more, but in 20 years' time I'm betting OK Computer will be seen as the key record of 1997, the one to take rock forward instead of artfully revamping images and song-structures from an earlier era." John Harris wrote in Select: "Every word sounds achingly sincere, every note spewed from the heart, and yet it roots itself firmly in a world of steel, glass, random-access memory and prickly-skinned paranoia." The album was well received by critics in North America. Rolling Stone, Spin, the Los Angeles Times, the Pittsburgh Post-Gazette, Pitchfork and the Daily Herald published positive reviews. In The New Yorker, Alex Ross praised its progressiveness, and contrasted Radiohead's risk-taking with the musically conservative "dadrock" of their contemporaries Oasis. Ross wrote: "Throughout the album, contrasts of mood and style are extreme ... This band has pulled off one of the great art-pop balancing acts in the history of rock." 
Ryan Schreiber of Pitchfork lauded the record's emotional appeal, writing that it "is brimming with genuine emotion, beautiful and complex imagery and music, and lyrics that are at once passive and fire-breathing". Reviews for Entertainment Weekly, the Chicago Tribune, and Time were mixed. Robert Christgau from The Village Voice said Radiohead immersed Yorke's vocals in "enough electronic marginal distinction to feed a coal town for a month" to compensate for the "soulless" songs, resulting in "arid" art rock. In an otherwise positive review, Andy Gill wrote for The Independent: "For all its ambition and determination to break new ground, OK Computer is not, finally, as impressive as The Bends, which covered much the same sort of emotional knots, but with better tunes. It is easy to be impressed by, but ultimately hard to love, an album that luxuriates so readily in its own despondency." Accolades OK Computer was nominated for Grammy Awards as Album of the Year and Best Alternative Music Album at the 40th Annual Grammy Awards in 1998, winning the latter. It was also nominated for Best British Album at the 1998 Brit Awards. The album was shortlisted for the 1997 Mercury Prize, a prestigious award recognising the best British or Irish album of the year. The day before the winner was announced, oddsmakers had given OK Computer the best chance to win among ten nominees, but it lost to New Forms by Roni Size/Reprazent. The album appeared in many 1997 critics' lists and listener polls for best album of the year. It topped the year-end polls of Mojo, Vox, Entertainment Weekly, Hot Press, Muziekkrant OOR, HUMO, Eye Weekly and Inpress, and tied for first place with Daft Punk's Homework in The Face. The album came second in NME, Melody Maker, Rolling Stone, Village Voice, Spin and Uncut. Q and Les Inrockuptibles both listed the album in their unranked year-end polls. Praise for the album overwhelmed the band; Greenwood felt the praise had been exaggerated because The Bends had been "under-reviewed possibly and under-received." They rejected links to progressive rock and art rock, despite frequent comparisons made to Pink Floyd's 1973 album The Dark Side of the Moon. Yorke responded: "We write pop songs ... there was no intention of it being 'art'. It's a reflection of all the disparate things we were listening to when we recorded it." He was nevertheless pleased that listeners identified the album's influences: "What really blew my head off was the fact that people got all the things, all the textures and the sounds and the atmospheres we were trying to create." Legacy Retrospective appraisal OK Computer has appeared frequently in professional lists of the greatest albums of all time. A number of publications, including NME, Melody Maker, Alternative Press, Spin, Pitchfork, Time, Metro Weekly and Slant Magazine placed OK Computer prominently in lists of best albums of the 1990s or of all time. It was voted number 4 in Colin Larkin's All Time Top 1000 Albums 3rd Edition (2000). Rolling Stone ranked it 42 on its list of The 500 Greatest Albums of All Time in 2020. It was previously ranked at 162 in 2003 and 2012. Retrospective reviews from BBC Music, The A.V. Club and Slant received the album favourably. 
Rolling Stone gave the album five stars in the 2004 edition of The Rolling Stone Album Guide, with critic Rob Sheffield saying, "Radiohead was claiming the high ground abandoned by Nirvana, Pearl Jam, U2, R.E.M., everybody; and fans around the world loved them for trying too hard at a time when nobody else was even bothering." "Most would rate OK Computer the apogee of pomo texture", Christgau said in retrospect. According to Acclaimed Music, a site which uses statistics to numerically represent reception among critics, OK Computer is the 8th most celebrated album of all time. In 2014, the United States National Recording Preservation Board selected the album for preservation in the National Recording Registry of the Library of Congress, which designates it as a sound recording that has had significant cultural, historical or aesthetic impact in American life. The album has been cited by some as undeserving of its acclaim. In a poll surveying thousands conducted by BBC Radio 6 Music, OK Computer was named the sixth most overrated album "in the world". David H. Green of The Daily Telegraph called the album "self-indulgent whingeing" and maintains that the positive critical consensus towards OK Computer is an indication of "a 20th-century delusion that rock is the bastion of serious commentary on popular music" to the detriment of electronic and dance music. The album was selected as an entry in "Sacred Cows", an NME column questioning the critical status of "revered albums", in which Henry Yates said "there's no defiance, gallows humour or chink of light beneath the curtain, just a sense of meek, resigned despondency" and criticised the record as "the moment when Radiohead stopped being 'good' [compared to The Bends] and started being 'important'". In a Spin article on the "myth" that "Radiohead Can Do No Wrong", Chris Norris argues that the acclaim for OK Computer inflated expectations for subsequent Radiohead releases. Christgau felt "the reason the readers of the British magazine Q absurdly voted OK Computer the greatest album of the 20th century is that it integrated what was briefly called electronica into rock". Having deemed it "self-regarding" and overrated, he later warmed to the record and found it indicative of Radiohead's cerebral sensibility and "rife with discrete pleasures and surprises". Commentary, interpretation and analysis OK Computer was recorded in the lead up to the 1997 general election and released a month after the victory of Tony Blair's New Labour government. The album was perceived by critics as an expression of dissent and scepticism toward the new government and a reaction against the national mood of optimism. Dorian Lynskey wrote, "On May 1, 1997, Labour supporters toasted their landslide victory to the sound of 'Things Can Only Get Better.' A few weeks later, OK Computer appeared like Banquo's ghost to warn: No, things can only get worse." According to Amy Britton, the album "showed not everyone was ready to join the party, instead tapping into another feeling felt throughout the UK—pre-millennial angst. ... huge corporations were impossible to fight against—this was the world OK Computer soundtracked, not the wave of British optimism." In an interview, Yorke doubted that Blair's policies would differ from the preceding two decades of Conservative government. 
He said the public reaction to the death of Princess Diana was more significant, as a moment when the British public realised "the royals had had us by the balls for the last hundred years, as had the media and the state." The band's distaste for the commercialised promotion of OK Computer reinforced their anti-capitalist politics, which would be further explored on their subsequent releases. Critics have compared Radiohead's statements of political dissatisfaction to those of earlier rock bands. David Stubbs said that, where punk rock had been a rebellion against a time of deficit and poverty, OK Computer protested the "mechanistic convenience" of contemporary surplus and excess. Alex Ross said the album "pictured the onslaught of the Information Age and a young person's panicky embrace of it" and made the band into "the poster boys for a certain kind of knowing alienation—as Talking Heads and R.E.M. had been before." Jon Pareles of The New York Times found precedents in the work of Pink Floyd and Madness for Radiohead's concerns "about a culture of numbness, building docile workers and enforced by self-help regimes and anti-depressants." The album's tone has been described as millennial or futuristic, anticipating cultural and political trends. According to The A.V. Club writer Steven Hyden in the feature "Whatever Happened to Alternative Nation", "Radiohead appeared to be ahead of the curve, forecasting the paranoia, media-driven insanity, and omnipresent sense of impending doom that's subsequently come to characterise everyday life in the 21st century." In 1000 Recordings to Hear Before You Die, Tom Moon described OK Computer as a "prescient ... dystopian essay on the darker implications of technology ... oozing [with] a vague sense of dread, and a touch of Big Brother foreboding that bears strong resemblance to the constant disquiet of life on Security Level Orange, post-9/11." Chris Martin of Coldplay remarked that, "It would be interesting to see how the world would be different if Dick Cheney really listened to Radiohead's OK Computer. I think the world would probably improve. That album is fucking brilliant. It changed my life, so why wouldn't it change his?" The album inspired a radio play, also titled OK Computer, which was first broadcast on BBC Radio 4 in 2007. The play, written by Joel Horwood, Chris Perkins, Al Smith and Chris Thorpe, interprets the album's 12 tracks as a story about a man who awakens in a Berlin hospital with memory loss and returns to England with doubts that the life he's returned to is his own. Influence The release of OK Computer coincided with the decline of Britpop. Through OK Computer's influence, the dominant UK guitar pop shifted toward an approximation of "Radiohead's paranoid but confessional, slurry but catchy" approach. Many newer British acts adopted similarly complex, atmospheric arrangements; for example, the post-Britpop band Travis worked with Godrich to create the languid pop texture of The Man Who, which became the fourth best-selling album of 1999 in the UK. Some in the British press accused Travis of appropriating Radiohead's sound. Steven Hyden of The A.V. Club said that by 1999, starting with The Man Who, "what Radiohead had created in OK Computer had already grown much bigger than the band," and that the album went on to influence "a wave of British-rock balladeers that reached its zenith in the '00s". 
OK Computer's popularity influenced the next generation of British alternative rock bands, and established musicians in a variety of genres have praised it. Bloc Party and TV on the Radio listened to or were influenced by OK Computer; TV on the Radio's debut album was titled OK Calculator as a lighthearted tribute. Radiohead described the pervasiveness of bands that "sound like us" as one reason to break with the style of OK Computer for their next album, Kid A. Although OK Computer's influence on rock musicians is widely acknowledged, several critics believe that its experimental inclination was not authentically embraced on a wide scale. Footman said the "Radiohead Lite" bands that followed were "missing [OK Computer's] sonic inventiveness, not to mention the lyrical substance." David Cavanagh said that most of OK Computer's purported mainstream influence more likely stemmed from the ballads on The Bends. According to Cavanagh, "The populist albums of the post-OK Computer era—the Verve's Urban Hymns, Travis's Good Feeling, Stereophonics' Word Gets Around, Robbie Williams' Life thru a Lens—effectively closed the door that OK Computer's boffin-esque inventiveness had opened." John Harris believed that OK Computer was one of the "fleeting signs that British rock music might [have been] returning to its inventive traditions" in the wake of Britpop's demise. While Harris concludes that British rock ultimately developed an "altogether more conservative tendency", he said that with OK Computer and their subsequent material, Radiohead provided a "clarion call" to fill the void left by Britpop. OK Computer triggered a minor revival of progressive rock and ambitious concept albums, with a new wave of prog-influenced bands crediting OK Computer for enabling their scene to thrive. Brandon Curtis of Secret Machines said, "Songs like 'Paranoid Android' made it OK to write music differently, to be more experimental ... OK Computer was important because it reintroduced unconventional writing and song structures." Steven Wilson of Porcupine Tree said, "I don't think ambition is a dirty word any more. Radiohead were the Trojan Horse in that respect. Here's a band that came from the indie rock tradition that snuck in under the radar when the journalists weren't looking and started making these absurdly ambitious and pretentious—and all the better for it—records." However, Pitchfork journalist Marc Hogan argued that OK Computer marked an "ending point" for the rock-oriented album era, as its dual level of mainstream and critical success went unmatched by any guitar-based album in subsequent decades. In 2005, Q named OK Computer the tenth best progressive rock album. Reissues and compilations Radiohead left EMI, parent company of Parlophone, in 2007 after failed contract negotiations. EMI retained the copyright to Radiohead's back catalogue of material recorded while signed to the label. After a period of being out of print on vinyl, EMI reissued a double LP of OK Computer on 19 August 2008, along with later albums Kid A, Amnesiac and Hail to the Thief, as part of the "From the Capitol Vaults" series. OK Computer became the year's tenth bestselling vinyl record, selling almost 10,000 units. The reissue was connected in the press to a general climb in vinyl sales and cultural appreciation of records as a format. "Collector's Edition" reissue EMI reissued OK Computer again on 24 March 2009, alongside Pablo Honey and The Bends, without Radiohead's involvement. 
The reissue came in two editions: a 2-CD "Collector's Edition" and a 2-CD, 1-DVD "Special Collector's Edition". The first disc contains the original studio album, the second disc contains B-sides collected from OK Computer singles and live recording sessions, and the DVD contains a collection of music videos and a live television performance. All the material on the reissue had been previously released. Press reaction to the reissue expressed concern that EMI was exploiting Radiohead's back catalogue. Larry Fitzmaurice of Spin accused EMI of planning to "issue and reissue [Radiohead's] discography until the cash stops rolling in". Pitchfork's Ryan Dombal said it was "hard to look at these reissues as anything other than a cash-grab for EMI/Capitol—an old media company that got dumped by their most forward-thinking band." Daniel Kreps of Rolling Stone defended the release, saying: "While it's easy to accuse Capitol of milking the cash cow once again, these sets are pretty comprehensive." The reissue was critically well received, although reception was mixed about the supplemental material. Reviews in AllMusic, Uncut, Q, Rolling Stone, Paste and PopMatters praised the supplemental material, but with reservations. A review written by Scott Plagenhoef for Pitchfork awarded the reissue a perfect score, arguing that it was worth buying for fans who did not already own the rare material. Plagenhoef said, "That the band had nothing to do with these is beside the point: this is the final word on these records, if for no other reason that the Beatles' September 9 remaster campaign is, arguably, the end of the CD era." The A.V. Club writer Josh Modell praised the bonus disc and DVD, and said the album was "the perfect synthesis of Radiohead's seemingly conflicted impulses". In April 2016, XL Recordings acquired Radiohead's back catalogue. The "collector's editions" of Radiohead albums, issued without Radiohead's approval, were removed from streaming services. In May 2016, XL reissued Radiohead's back catalogue on vinyl, including OK Computer. OKNOTOK 1997 2017 On 23 June 2017, Radiohead released a 20th-anniversary OK Computer reissue, OKNOTOK 1997 2017, on XL. The reissue includes a remastered version of the album, plus eight B-sides and three previously unreleased tracks: "I Promise", "Man of War" and "Lift". The special edition includes books of artwork and notes and an audio cassette of demos and session recordings, including previously unreleased songs. OKNOTOK debuted at number two on the UK Albums Chart, boosted by Radiohead's third headline performance at Glastonbury Festival. It was the bestselling album in independent UK record shops for a year. MiniDiscs [Hacked] In early June 2019, nearly 18 hours of demos, outtakes and other material recorded during the OK Computer period leaked online. On 11 June, Radiohead made the archive available to stream or purchase from the music sharing site Bandcamp for 18 days, with proceeds going to the environmental advocacy group Extinction Rebellion. Track listing All tracks are written by Thom Yorke, Jonny Greenwood, Philip Selway, Ed O'Brien and Colin Greenwood. 
"Airbag" – 4:44 "Paranoid Android" – 6:26 Subterranean Homesick Alien – 4:28 Exit Music (For a Film) – 4:28 Let Down – 4:59 Karma Police – 4:25 Fitter Happier – 1:58 Electioneering – 3:50 Climbing Up the Walls – 4:44 No Surprises – 3:51 Lucky – 4:21 The Tourist – 5:27 Personnel Nigel Godrich – committing to tape, audio level balancing Radiohead – committing to tape, music Thom Yorke Jonny Greenwood Philip Selway Ed O'Brien Colin Greenwood Stanley Donwood – pictures The White Chocolate Farm – pictures Gerard Navarro – studio assistance Jon Bailey – studio assistance Chris Scard – studio assistance Chris "King Fader" Blair – mastering Nick Ingman – string conducting Matt Bale – additional artwork Charts Weekly charts Year-end charts Certifications and sales Notes Footnotes Citations Bibliography External links 1997 albums Radiohead albums Grammy Award for Best Alternative Music Album United States National Recording Registry recordings Albums produced by Nigel Godrich Capitol Records albums Parlophone albums Alternative rock albums by British artists Art rock albums by British artists United States National Recording Registry albums
3129914
https://en.wikipedia.org/wiki/Scala%20%28software%29
Scala (software)
Scala is a freeware software application with versions supporting Windows, OS X, and Linux. It allows users to create and archive musical scales, analyze and transform them with built-in theoretical tools, play them with an on-screen keyboard or from an external MIDI keyboard, and export them to hardware and software synthesizers. Scala can retune MIDI streams and files using pitch bend. It also supports MIDI sysex and file-based tunings. Originally a command-line program, Scala now uses the GTK+ GUI toolkit. Scala is written in the Ada programming language, and is the work of Manuel Op de Coul of the Netherlands. Scala can also be used as a MIDI sequencer, by way of its ASCII-based sequencing format, seq. Its flexible handling of tuning formats makes it a powerful tool for composing and sequencing microtonal music. Scala's motto is a Latin phrase meaning 'it finds and perfects' or 'it discovers and accomplishes'. Its logo is a Renaissance-style relief print of a cherub holding a compass and a globe inscribed with a diatonic musical scale and a circle of fourths. File formats Scala can open, transform, and save standard MIDI files. It can also export MIDI tuning tables in .tun format. It provides a native, human-readable sequencing language (its seq format). But it is best known for its use of human-readable text files to store musical scales. The Scala scale file format has become a standard for representing microtonal scales in a way that can be used by other software. The Scala site lists over thirty applications that support the format, including several major commercial packages like Apple Logic 7, Celemony Melodyne 3, and Cakewalk Rapture. Scala's developer also makes freely available an archive of over 4,000 Scala scale files, containing many musical scales of historical, cultural, and theoretical interest. See also Microtuner Microtonal music References External links Scala's home page Freeware MIDI Scorewriters Windows multimedia software MacOS multimedia software Linux audio video-related software Audio software that uses GTK
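To give a concrete sense of the scale file format described above, the following is a rough Python sketch of a .scl reader. It assumes the commonly documented layout (comment lines beginning with "!", then a description line, a note count, and one pitch per line given either in cents or as a ratio); the format specification on the Scala site remains the authoritative reference, and the example scale here is invented for illustration.

import math
from fractions import Fraction

def parse_scl(text):
    """Parse the text of a Scala .scl file into (description, list of pitches in cents)."""
    # Lines beginning with '!' are comments; the remaining lines carry the data.
    lines = [ln.strip() for ln in text.splitlines() if not ln.lstrip().startswith("!")]
    description = lines[0]
    count = int(lines[1])
    cents = []
    for entry in lines[2:2 + count]:
        value = entry.split()[0]          # ignore any trailing comment on the line
        if "." in value:                  # a value containing '.' is already in cents
            cents.append(float(value))
        else:                             # otherwise it is a frequency ratio such as 3/2
            cents.append(1200 * math.log2(Fraction(value)))
    return description, cents

# A hypothetical scale file: a just-intonation major triad plus the octave.
example = """! triad.scl
Just major triad
3
 5/4
 3/2
 2/1
"""
print(parse_scl(example))   # ('Just major triad', [386.31..., 701.95..., 1200.0])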
4640825
https://en.wikipedia.org/wiki/Oracle%20Communications%20Messaging%20Server
Oracle Communications Messaging Server
Oracle Communications Messaging Server is Oracle's messaging (email) server software. The software was obtained by Oracle as part of the company's acquisition of Sun in 2010. Oracle claims it is among the most widely deployed commercial email servers, with 150 million mailboxes deployed worldwide (mostly by ISPs, telcos, universities, governments, and cable TV broadband providers). History of development Oracle Communications Messaging Server has a long history, drawing technology from Sun Internet Mail Server (SIMS), Netscape Messaging Server (NMS), and PMDF from Innosoft. In addition to the Messaging Server's three parents, the software has undergone multiple brand name changes: iPlanet Messaging Server, Sun ONE Messaging Server, Sun Java System Messaging Server, Oracle Communications Messaging Exchange Server, and finally Oracle Communications Messaging Server. The code base has been carried on throughout these minor brand changes with only feature enhancements and bug fixes. The Messaging Server was part of Sun's Java Enterprise System bundle of Internet/Intranet server software from 2003 to 2006. In 2006, the Messaging Server was packaged as part of a smaller bundle called the Sun Java System Communications Suite, which includes Sun Java System Calendar Server, Sun Java System Instant Messaging Server, and Sun Java System Communications Express. This suite is now known as Oracle Communications Unified Communications Suite. Supporting server software for the Messaging Server includes Sun Java System Directory Server, Sun Java System Access Manager, and Oracle iPlanet Web Server. The supporting software is included in the Communications Suite bundle with limited-use license rights. Messaging Server also includes a web application called Convergence, which provides webmail as well as web-based access to other Communications Suite functionality such as calendaring and instant messaging. The Messaging Server is supported on multiple operating systems including Solaris and Red Hat Enterprise Linux. Versions 5.2, 6.1 and 6.2 were also available for HP-UX and Microsoft Windows. References and footnotes External links Oracle Communications Messaging Server Overview Oracle Communications Unified Communications Suite Overview MsgServerDocWiki: contains documentation, FAQ and tips for installation, configuration, operation, and troubleshooting of Sun Java System Messaging Server Factotum: a blog written by a Messaging Server tech writer. Provides some inside information and a sneak preview of what's to come in future releases. Sun Java System Messaging Server 6.3 Product Library Documentation Messaging Server 8.0.1 Documentation Sun Microsystems software Message transfer agents Oracle software
3028164
https://en.wikipedia.org/wiki/Joe%20Beimel
Joe Beimel
Joseph Ronald Beimel (pronounced "BUY mul") (born April 19, 1977) is an American professional baseball pitcher who is a free agent. He played in Major League Baseball (MLB) for the Pittsburgh Pirates, Minnesota Twins, Tampa Bay Devil Rays, Los Angeles Dodgers, Washington Nationals, Seattle Mariners, and Colorado Rockies. Beimel was known for his exceptional slider pitch and high change which kept hitters guessing at the plate. He is also the only MLB player to ever wear the number 97. Amateur career Beimel attended St. Marys Area High School and was a letterman in football, wrestling, basketball, and baseball. He played two seasons of junior college baseball at Allegany College of Maryland in Cumberland, Maryland and one season at Duquesne University in Pittsburgh. Six former Allegany College players have made it to Major League Baseball. Beimel was the fifth Major League Baseball player to come out of the Allegany College of Maryland program. The five other Allegany Trojans to make the big leagues were John Kruk, Stan Belinda, Steve Kline, Scott Seabol and Scott Patterson. At Allegany, Beimel played for Junior College Hall of Fame Coach Steve Bazarnic. During Beimel's years at Allegany the Trojans advanced to the Junior College World Series both seasons. At Duquesne University he was the team leader in wins and complete games and was second on the staff in strikeouts and ERA. Professional career Pittsburgh Pirates He was drafted by the Texas Rangers in the 26th round of the 1996 Draft after his freshman year in college but chose to remain in school. He was later selected in the 18th round of the 1998 draft by the Pittsburgh Pirates, after his junior year, and signed with the Pirates on June 5, 1998. He is the first pitcher drafted by the Pirates out of the Pittsburgh-based Duquesne University; he is the only such pitcher to have made it to the Major Leagues. His minor league stops in the Pirates organization included their development level team in Erie (1998, 1–4, 6.32, 6 starts), their "A" ball team in Hickory (1999, 5–11, 4.43, 22 starts), their "A+" team in Lynchburg (2000, 10–6, 3.36, 18 starts, 2 CGs), and their "AA" team in Altoona (1-6, 4.16, 10 starts, 1 CG). After a strong spring, he made the Pirates Major League roster at the start of the 2001 season. He made his major league debut as the starting pitcher on April 8, 2001, against the Houston Astros, pitching 5 innings, allowing 2 runs, and recording his first career victory. He appeared as both a starter and a reliever that season, finishing with a record of 7–11, ERA of 5.23 in 42 appearances, 15 of them as a starter. He made another 8 starts on the 2002 squad but has been primarily used as a relief pitcher ever since. After finishing both the 2002 & 2003 seasons in the Pirates bullpen as an average middle reliever, the Pirates released him right before the start of the 2004 season. Minnesota Twins He was subsequently signed as a free agent by the Minnesota Twins on April 11, 2004. He spent the bulk of the season with Minnesota's Class-AAA Minor League affiliate in Rochester, where he had a mediocre season (2-4, 6.97, in 49 appearances). He made just three relief appearances for the Twins as a September call-up and then was released after the season. Tampa Bay Devil Rays Beimel was signed as a free agent by the Tampa Bay Devil Rays on November 5, 2004, and spent most of the 2005 season with their AAA team the Durham Bulls, going 1–2 with a 3.93 era in 48 games. 
He made a few trips to the big leagues to pitch for the Devil Rays during the season, making 7 appearances with an era of 3.27. Los Angeles Dodgers In 2006, he was signed by the Los Angeles Dodgers and became a valuable member of their relief corps, with a 2.96 era in 62 appearances, primarily as a late inning left-handed specialist. However, he has also been effective enough against right-handed batters to be used as both a set-up man and emergency closer, and proved remarkably effective when put into games to work the Dodgers out of jams. He wore #97 for the Dodgers, which at the time was the highest number ever used by a Dodger. The number represents the year of his first child's birth. His successful season ended on a down note; right before 2006 divisional series between the Dodgers and New York Mets began, Beimel cut his hand on glass at a bar in New York. Due to his injury Beimel was left off the series roster. At first he claimed that it happened in his hotel room before divulging the truth after the Dodgers lost to the Mets three games to none. Beimel was completely sober for 15 months following the incident and now drinks only occasionally. During the 2007 season, Beimel set a record for the Dodgers by making 83 appearances, the most by a left-handed pitcher in the Dodgers history. During his first two years with the Dodgers, Beimel became known for his ability to get Barry Bonds out. Beimel held Bonds to 1–16 at the plate, with the one hit being a solo home run. He also walked Bonds only three times. After the arrival of new manager Joe Torre, Beimel was forced to cut his hair, a situation similar to one Stump Merrill had with Torre's successor as Dodger manager, Don Mattingly, when Merrill managed Mattingly with the New York Yankees. Relationship with fans Joe gained a cult following in 2008 in a series of fan-made YouTube videos. When the Dodgers conducted their second annual online fan vote during Spring Training to determine what player should be immortalized as part of the team's bobblehead promotions, Beimel took home the honors for 2008 after a strong Internet turnout, including a campaign that was orchestrated by his parents, Ron and Marge Beimel. Washington Nationals On March 18, 2009, Beimel and the Washington Nationals agreed to a one-year $2 million deal; he became their eighth-inning set up man. Colorado Rockies On July 31, 2009, Beimel was traded by the Nationals to the Colorado Rockies for Ryan Mattheus and Robinson Fabian. He signed a minor league contract on March 22, 2010, and was brought up to the majors on April 15. His entrance song was "God's Gonna Cut You Down" by Johnny Cash. Return to Pittsburgh On January 27, 2011, the Pittsburgh Pirates signed Beimel to a minor league contract. He began the season on the disabled list due to soreness in his forearm and elbow, which he initially experienced during spring training. He spent the first weeks of season on rehab assignments with the Advance-A Bradenton Marauders and Triple-A Indianapolis Indians. The Pirates activated Beimel from the disabled list on April 15, 2011. On May 28, 2011, Beimel was placed on the 15-day disabled list, due to the same injury which had put him on the shelf to begin the season. Daniel Moskos was recalled to take his place. He was designated for assignment on August 23, and released a week later. Texas Rangers On February 6, 2012, Beimel signed a minor league deal with the Texas Rangers. He was released on March 26. Beimel underwent Tommy John surgery on May 1, 2012. 
Seattle Mariners Beimel signed a minor league deal with the Seattle Mariners on January 24, 2014. In his first appearance with Seattle, Beimel recorded an out without throwing a pitch, picking David Freese off of first base. He enjoyed a successful 2014 campaign, posting a 2.20 ERA in 56 relief appearances. Beimel signed a $600,000 deal with the Texas Rangers on March 6, 2015. He struggled through spring training, allowing 11 earned runs in three innings pitched. On March 23, Beimel was released. Had he been promoted to the major league level, Beimel's salary would have risen to $1.5 million. Instead, he was paid $147,541. Beimel signed a minor league contract with the Seattle Mariners on April 2, 2015. He re-joined the Major League team a month later. Miami Marlins During the month of May, Beimel signed a minor league deal with the Marlins. Beimel's deal with the Marlins fell through on May 17, making him a free agent. Kansas City Royals Beimel signed a minor league contract with the Kansas City Royals on June 6, 2016. He was released on July 15. New Britain Bees On March 22, 2017, Beimel signed with the New Britain Bees of the Atlantic League of Professional Baseball. Beimel announced his retirement from professional baseball on June 25, 2017, after appearing in 22 games for the Bees. San Diego Padres On June 11, 2021, Beimel made his comeback to organized baseball at the age of 44 after signing a minor league contract with the San Diego Padres organization. Personal life Beimel is a Democrat. He has two children with his first wife Emily, and one child with his second wife Carley. He wears #97 to represent the year of his first child Drew's birth. References External links 1977 births Living people People from St. Marys, Pennsylvania Baseball players from Pennsylvania Major League Baseball pitchers Pittsburgh Pirates players Minnesota Twins players Tampa Bay Devil Rays players Los Angeles Dodgers players Washington Nationals players Colorado Rockies players Seattle Mariners players Allegany Trojans baseball players Duquesne Dukes baseball players Bradenton Marauders players Erie SeaWolves players Hickory Crawdads players Altoona Curve players Lynchburg Hillcats players Rochester Red Wings players Durham Bulls players Las Vegas 51s players Potomac Nationals players Colorado Springs Sky Sox players Indianapolis Indians players Gwinnett Braves players Tacoma Rainiers players Pennsylvania Democrats Omaha Storm Chasers players New Britain Bees players San Antonio Missions players Allegany College of Maryland alumni
253828
https://en.wikipedia.org/wiki/The%20Mummy%20%281999%20film%29
The Mummy (1999 film)
The Mummy is a 1999 American action-adventure film written and directed by Stephen Sommers. It is a remake of the 1932 film of the same name and stars Brendan Fraser, Rachel Weisz, John Hannah, Kevin J. O'Connor, and Arnold Vosloo in the title role as the reanimated mummy. The film follows adventurer Rick O'Connell as he travels to Hamunaptra, the City of the Dead, with a librarian and her older brother, where they accidentally awaken Imhotep, a cursed high priest with supernatural powers. Development of the film took years, with multiple screenplays and directors attached. In 1997, Stephen Sommers successfully pitched his version of a more adventurous and romantic take on the source material. Principal photography took place in Morocco and the United Kingdom; the crew endured dehydration, sandstorms, and snakes while shooting on location in the Sahara desert. Industrial Light & Magic provided many of the visual effects, blending live-action footage and computer-generated imagery to create the titular monster. Jerry Goldsmith provided the orchestral score. The Mummy was theatrically released on May 7, 1999. Despite mixed reviews from critics, it was a commercial success and grossed over $416.4 million worldwide against a production budget of $80 million. The film's success spawned two direct sequels, The Mummy Returns (2001) and The Mummy: Tomb of the Dragon Emperor (2008). It also led to spinoffs such as an animated series and the prequel The Scorpion King (2002), which generated its own sequels. Attempts to reboot the property and kickstart a new media franchise led to a 2017 film. Plot In Thebes, Egypt, 1290 BC, high priest Imhotep has an affair with Anck-su-namun, the mistress of Pharaoh Seti I. Imhotep and Anck-su-namun kill the Pharaoh after he discovers their relationship. Imhotep flees, while Anck-su-namun kills herself, believing that Imhotep can resurrect her. Imhotep and his priests steal her corpse and travel to Hamunaptra, the city of the dead. The resurrection ritual is stopped by Pharaoh's bodyguards, the Medjai. Imhotep is buried alive with flesh-eating scarab beetles and locked in a sarcophagus at the feet of a statue of the Egyptian god Anubis. The Medjai are sworn to prevent Imhotep's return. In 1926 AD, Jonathan Carnahan presents his sister Evelyn—a librarian and aspiring Egyptologist—with an intricate box and map that lead to Hamunaptra. Jonathan reveals he stole the box from an American adventurer, Rick O'Connell, who discovered the city while in the French Foreign Legion. Evelyn and Jonathan find Rick and make a deal with him to lead them to the city. Rick guides Evelyn and her party to the city, encountering a band of American treasure hunters led by Rick's cowardly acquaintance Beni Gabor. Despite being warned to leave by Ardeth Bay, leader of the Medjai, the two expeditions continue their excavations. Evelyn searches for the Book of Amun-Ra, made of pure gold. Instead of finding the book, she stumbles upon Imhotep's remains. The team of Americans, meanwhile, discover the black Book of the Dead, accompanied by canopic jars carrying Anck-su-namun's preserved organs. At night, Evelyn reads from the Book of the Dead aloud, accidentally awakening Imhotep. The expeditions return to Cairo, and Imhotep follows them with the help of Beni, who has agreed to serve him. Imhotep regenerates his full strength by killing the members of the American expedition and brings the ten plagues back to Egypt. Rick, Evelyn, and Jonathan meet Ardeth at a museum. 
Ardeth hypothesizes that Imhotep wants to resurrect Anck-su-namun by sacrificing Evelyn. Evelyn believes that if the Book of the Dead brought Imhotep back to life, the Book of Amun-Ra can kill him again and deduces the book's whereabouts in Hamunaptra. Imhotep corners the group with an army of slaves. Evelyn agrees to accompany Imhotep if he spares the rest of the group. Although Imhotep does not honor his word, Rick and the others fight their way to safety. Imhotep, Evelyn, and Beni return to Hamunaptra, pursued by Rick, Jonathan, and Ardeth, who are able to locate the Book of Amun-Ra. Imhotep prepares to sacrifice Evelyn, but she is rescued after a battle with Imhotep's mummified priests. Evelyn reads from the Book of Amun-Ra, making Imhotep mortal, and he is fatally wounded by Rick. Beni accidentally sets off a booby trap while looting the city of its riches, and is killed by a swarm of flesh-eating scarabs as Hamunaptra collapses into the sand. Ardeth bids Rick, Evelyn, and Jonathan goodbye, and the trio rides away on a pair of camels laden with Beni's treasure. Cast Brendan Fraser as Rick O'Connell Rachel Weisz as Evelyn Carnahan John Hannah as Jonathan Carnahan Arnold Vosloo as Imhotep Kevin J. O'Connor as Beni Gabor Jonathan Hyde as Dr. Allen Chamberlain Oded Fehr as Ardeth Bay Erick Avari as Dr. Terence Bey Stephen Dunham as Isaac Henderson Corey Johnson as David Daniels Tuc Watkins as Bernard Burns Omid Djalili as Warden Gad Hassan Aharon Ipalé as Pharaoh Seti I Bernard Fox as Captain Winston Havelock Patricia Velásquez as Anck-su-namun Production Origins In the late 1980s, producers James Jacks and Sean Daniel decided to update the original 1932 Mummy film for the modern era. Universal gave them the go-ahead, but only if they kept the budget around $10 million. Jacks remembers that the studio "essentially wanted a low-budget horror franchise". In 1987, George A. Romero wrote a film treatment, and was attached to direct. Screenwriter Abbie Bernstein recalled that Universal Studios wanted an unstoppable Mummy akin to the Terminator. Bernstein's story took place in the present; scientists inadvertently bring a mummy to life, who wants to use an ancient device to destroy all life on earth. "[The Mummy] had no more social interaction than the T-Rex did in Jurassic Park," Bernstein recalled. Romero drifted away from the project and the script was abandoned. Next, Jacks and Daniel recruited horror filmmaker/writer Clive Barker to direct. Barker's 1990 treatment and a successive 1991 screenplay by Mick Garris were dark and violent, with the story revolving around an art museum that rebuilds an entire Egyptian tomb in Beverly Hills. Jacks recalls that Barker's take was "dark, sexual and filled with mysticism", and that, "it would have been a great low-budget movie". Barker recalled the concept was too weird for the studio, and that his vision treated the Mummy as a jumping-off point for the film instead of the central character. Alan Ormsby, unaware of the previous efforts to launch the film, was brought on next. Ormsby pitched a more straightforward update to the 1932 film, again focusing on the Mummy as a relentless Terminator-like character. Joe Dante was attached as director, increasing the budget for his idea of Daniel Day-Lewis as a brooding Mummy. This version's draft was later re-written by John Sayles. It was set in contemporary times and focused on reincarnation with elements of a love story. 
It came close to being made—with some elements like the flesh-eating scarabs making it to the final product—but Universal balked at the higher price tag. George A. Romero returned to the project in 1994 with a vision of a zombie-style horror film similar to Night of the Living Dead, but which also relied heavily upon elements of tragic romance and ambivalence of identity. Romero completed a draft in October 1994, co-written with Ormsby and Sayles, that revolved around female archaeologist Helen Grover and her discovery of the tomb of Imhotep, an Egyptian general who lived in the time of Ramesses II. Unfolding in a nameless American city in modern times, events are set into motion when Imhotep inadvertently awakens as a result of his body having been exposed to rays from an MRI scan in a high-tech forensic archaeology lab. Helen finds herself drawn into a tentative relationship with Imhotep while also experiencing clairvoyant flashbacks to a previous life in the Nineteenth Dynasty of Egypt as a priestess of Isis. Summoning mystical powers through incantation, Imhotep later resurrects the mummy of Karis, a loyal slave. Karis embarks on a vengeful rampage against the grave robbers of his tomb. Romero's script was considered too dark and violent by Jacks and the studio, who wanted a more accessible picture. Romero was unable to extricate himself from another contract he had in negotiation with MGM, and so his involvement with the film was severed and the development of an entirely new script was commissioned. Mick Garris returned in 1995, developing a script that combined elements of the 1932 film and 1942's The Mummy's Tomb. This draft was a period piece awash in Egyptian art-inspired Art Deco, but the vision once again proved too expensive for the studio and was discarded for a modern setting. While the project came close to entering production, Universal was sold to Seagram. Sid Sheinberg chose to produce The Mummy through his independent company and write a new script. Unable to find a suitable high-profile writer and director (Wes Craven was offered the film but turned it down), the project unraveled again, and Garris left the project for the second time. Still determined to create a new Mummy film, in 1996 Universal hired Kevin Jarre to write a new screenplay. According to Jacks, the executives were now convinced the film should be a larger-budget period piece. Stephen Sommers called Jacks and Daniel in 1997 with his vision of The Mummy "as a kind of Indiana Jones or Jason and the Argonauts with the mummy as the creature giving the hero a hard time". Sommers had seen the original film when he was eight, and wanted to recreate the things he liked about it on a bigger scale. Discussing other classic horror characters, Sommers recalled, "Frankenstein made me sad—I always felt sorry for him. Dracula was kind of cool and sexy. But The Mummy just plain scared me." He had wanted to make a Mummy film, but other writers or directors were always attached. After the box office disappointment of Babe: Pig in the City and other films, Universal needed a hit film. At the time, Universal's management had changed in response to the box office failures, and the losses led the studio to revisit its successful franchises from the 1930s. New chair Stacey Snider distributed packets detailing the studio's holdings—including nearly 5,000 old scripts and films. Sommers received his window of opportunity and pitched his Mummy to Universal with an 18-page treatment. 
Sommers did not want to remake the original film; instead of making another straight horror movie, he wanted to turn it into a romantic adventure with horror elements. Conscious of how the shambling, bandaged Mummy of the old films had become something of a punchline, he wanted a faster, meaner, and scarier monster. Sommers incorporated his own research and the services of a UCLA archaeology professor to make the ancient Egyptian language accurate. Snider recalled that Sommers' treatment was unique in that it took place in the 1920s, rather than a contemporary setting. Universal liked the treatment so much that they approved the concept and increased the budget, and Sommers spent a year working on the screenplay. Casting Producer James Jacks offered the role of Rick O'Connell to Tom Cruise (who was later cast in the reboot film), Brad Pitt, Matt Damon and Ben Affleck, but the actors were not interested or could not fit the role into their respective schedules. Jacks and director Stephen Sommers were impressed with the money that George of the Jungle was making at the box office and cast Brendan Fraser as a result; Sommers also commented that he felt Fraser fit the Errol Flynn swashbuckling character he had envisioned for Rick perfectly. Fraser's turn in George bolstered his perceived star power, yet he remained far cheaper than the biggest actors working. The actor understood that his character "doesn't take himself too seriously, otherwise the audience can't go on that journey with him." Evelyn Carnahan was named in tribute to Lady Evelyn Carnarvon, the daughter of amateur Egyptologist Lord Carnarvon, both of whom were present at the opening of the tomb of Tutankhamun in 1922. The studio originally considered American actresses, and Rachel Weisz auditioned multiple times before getting the part. Weisz was not a big fan of horror films, but saw the movie as more of a "hokum" comic book. John Hannah was picked for the role of Jonathan Carnahan, despite the fact that Hannah felt he was not a comedic actor. Said Sommers, "He had no idea why we cast him." Jacks had previously produced Hard Target with Arnold Vosloo. The South African actor liked the Mummy script, but told Sommers he wanted to play the role "absolutely straight. From Imhotep's point of view, this is a skewed version of Romeo and Juliet." He was offered the role after a single audition. Carrying some extra weight and conscious of it because of Imhotep's skimpy costume, Vosloo lost 10 to 15 pounds for the role by eschewing alcohol and sugar. Filming Filming began on May 4, 1998, and lasted 17 weeks. The crew could not shoot in Egypt because of unstable political conditions, so principal photography began in Marrakech, Morocco. Marrakech had the extra advantage of being a much less modern city than Cairo, making it easier to dress the city to look like the 1920s; the production set up two weeks before filming, taking down telephone wires and cables and shipping in period cars and camels. Locals served as extras for crowd scenes. After shooting in Marrakech, filming moved to the Sahara desert outside the small town of Erfoud. Production designer Allan Cameron found a dormant volcano, Gara Medouar, where the exteriors for Hamunaptra could be constructed. A concrete ramp was built to allow access into the horseshoe-shaped formation, where the city was built from prefabricated parts shipped from England. 
A survey of the volcano was conducted so that an accurate model and scale models of the columns and statues could be replicated back at Shepperton Studios, where all of the scenes involving the underground passageways of the City of the Dead were shot. These sets took 16 weeks to build, and included fiberglass columns rigged with special effects for the movie's final scenes. To avoid dehydration in the scorching heat of the Sahara, the production's medical team created a drink that the cast and crew had to consume every two hours. Sandstorms were daily inconveniences, and wildlife was a major problem, with many crew members having to be airlifted to medical care after being bitten or stung. Brendan Fraser nearly died during a scene where his character is hanged. The production had the official support of the Royal Moroccan Army, and the cast members had kidnapping insurance taken out on them. After shooting in North Africa, production moved back to the United Kingdom before completion of shooting on August 29, 1998. Here, the dockyards at Chatham doubled for the Giza Port on the Nile River. This set was in length and featured "a steam train, an Ajax traction engine, three cranes, an open two-horse carriage, four horse-drawn carts, five dressing horses and grooms, nine pack donkeys and mules, as well as market stalls, Arab-clad vendors and room for 300 costumed extras". Special effects The filmmakers reportedly spent $15 million of the budget on the special effects alone. The Mummy features hundreds of shots that required optical or digital special effects in post-production. Effects house Industrial Light & Magic (ILM) contributed more than 140 shots to the film, with additional work done by Cinesite (60 shots) and Pacific Title/Mirage (45 shots, and the film's title sequences). Sommers engaged ILM while still developing the script, having previously worked with ILM effects supervisor John Andrew Berton, Jr. on Deep Rising. ILM was eager for the challenge the film provided, and produced a proof of concept for The Mummy's effects in late 1997 to demonstrate the feasibility of Sommers' vision to executives. The filmmakers sought to make something faster and scarier for the title creature, using cutting-edge techniques to create something never before seen. ILM started developing the look of the Mummy three months before filming started. "We wanted to create a photorealistic corpse that was obviously not a man in a suit, obviously not an animatronic, and obviously alive," he recalled. Over a months-long period, the designers worked on developing four distinct stages for the Mummy. Stage one was the mummy at its most decayed, with tattered bits of clothing, skin, and sinew hanging over a skeleton. Stage two added areas of regenerated skin, with stages three and four having the Mummy almost fully regenerated with only small areas of its innards showing through. The artists developed black-and-white sketches, then moved on to color treatments before building the creature in the computer; models were also made to use as reference for the digital artists. The initial states of the Mummy were created entirely by computer, while later stages combined live-action performance. To supplement prosthetics and makeups applied on set, LED lights and pieces of tape served as tracking points so that digital "cutouts" could be applied to Vosloo's face and body in postproduction; Vosloo remarked that walking around the set he felt like a Christmas tree. 
The final creature combined live-action performance in prosthetics with digital imagery. A digital representation of the Mummy was created in Alias, featuring simulated muscles for much of the body attached to a skeleton. The animators controlled parts of the Mummy via procedural animation; animating the underlying bones in turn controlled the stretch and movement of the overlaid muscles. Finally, layered on top of the procedural animation and motion capture was additional animation to tweak the performance; given the limitations of the technology, subtle movements like facial or hand animations had to be done by hand. Shots that featured Vosloo with overlaid computer-generated prosthetics were the most difficult for the effects team, requiring careful match-moving. Rather than using a stunt performer, Vosloo performed the motion capture for the character himself. Scenes were blocked out and performed on set during principal photography (first with Vosloo in the scene, then without). The shots were then replicated in the motion capture studio, with Vosloo's performance recorded by eight cameras from different angles. In addition to the Mummy, the script called for numerous effects shots to magnify the sweeping adventure of the film. Vistas like a flashback shot of the ancient city of Thebes combined location footage shot in the desert with composited actors shot on green screen, model miniatures, matte paintings, and computer-generated effects. The plagues Imhotep unleashes were accomplished using particle-based computer graphics, with ILM designers swapping out models of different qualities depending on how far from the camera the swarming "insects" were. While the film made extensive use of computer-generated imagery, many scenes, including ones where Rachel Weisz's character is covered with rats and locusts, were shot using live animals. Another close-up shot used footage of anesthetized locusts attached to a stunt performer combined with extra computer-generated pests. Sandstorms used procedural graphics based on programs used to create tornados in Twister, while the masses of flesh-eating scarabs used techniques developed for Star Wars: Episode I – The Phantom Menace. A shot of a firestorm engulfing Cairo combined real palm trees, physical models, and matte paintings with computer-generated hail, fire, and rubble. Pacific Title/Mirage also enhanced shots with digital camera shake. The finale involves an army of mummies coming to Imhotep's defense. Many of these mummies were created by Make-Up Effects Supervisor Nick Dudman, who produced makeup, prosthetics, and animatronic effects in the film. Each suit came with variations for stunt moves or pyrotechnics. After principal photography, the suits were sent to ILM to be scanned and modeled in the computer. Using parts of the Imhotep mummy to save time, ILM recreated the underlings digitally to add into the scenes, and used motion capture to animate them. The animators credited Fraser's ability to consistently re-enact his movements in multiple takes as saving time when it came to matching the motion-captured digital mummies to the live-action fight scenes. Music The score for The Mummy was composed and conducted by Jerry Goldsmith, with orchestrations provided by Alexander Courage. Goldsmith had previously scored Deep Rising for Sommers. As he reached the final few years of his career, Goldsmith was coming off a number of action and adventure films in the 1990s, from multiple Star Trek films to Air Force One. 
Goldsmith provided The Mummy with suitably bombastic music, with the traditional European orchestra supplemented with regional instruments such as the bouzouki. The opening of the film contains nearly all of Goldsmith's major themes for the score, with what music critic Jeff Bond calls an "Egyptian theme" reused in different configurations throughout to establish the epic settings and sense of place for Hamunaptra; a theme for Imhotep/the Mummy that is performed in an understated manner early in the film, before repeating in more forceful, brassy renditions after the Mummy has regenerated; a love theme used for both Imhotep/Anck-su-namun and Rick/Evelyn; and a heroic theme for Rick. In addition to the extensive brass and percussion elements, the score uses sparing amounts of vocals, unusual for much of Goldsmith's work. The soundtrack was released by Decca Records on May 4, 1999. Overall, Goldsmith's score was well received. AllMusic described it as a "grand, melodramatic score" which delivered the expected highlights. Other reviews positively noted the dark, percussive sound meshed well with the plot, as well as the raw power of the music. The limited but masterful use of the chorus was also lauded, and most critics found the final track on the CD to be the best overall. On the other hand, some critics found the score lacked cohesion, and that the constant heavy action lent itself to annoying repetition. Roderick Scott of CineMusic.net summed up the score as "representative of both Goldsmith's absolute best and his most mediocre. Thankfully […] his favourable work on this release wins out." Release Test audiences reacted poorly to the film's title, which conjured up negative impressions of an old horror film, but domestic marketing president Marc Shmuger recalled that they decided "we would redefine the myth with the film" rather than change the title. Enthusiasm for the film was low, but Universal took out a television spot for the Super Bowl (reportedly costing $1.6 million) that Sommers recalled immediately reversed the discussion of the film's prospects. The producers were concerned that the imminent release of The Phantom Menace would sink the film's box office fortunes, prompting them to move the release date from May 21 to May 7. The Mummy grossed $43 million in 3,210 theaters in the United States and Canada on its opening weekend. Its weekend take was the highest non-holiday May opening, and the ninth-biggest opening of all time. The film later fell to second place behind The Phantom Menace. The Mummy grossed over $155.4 million in the United States and Canada and $261 million internationally, grossing over $416.4 million worldwide. The Mummy was released on home video in VHS and DVD formats in September 1999. The title was a tremendous success for Universal on home video, selling 7 million units on VHS and 1 million on DVD, making it the year's best-selling live-action VHS and second-best-selling DVD (behind The Matrix). The Mummy's performance helped Universal gross over $1 billion in home video sales. The film was later digitally remastered and received a Blu-ray release in 2008. Critical response The Mummy received mixed reviews from critics. On Rotten Tomatoes, the film holds an approval rating of 61%, based on 101 reviews, with an average rating of 5.9 out of 10. On Metacritic, the film has a score of 48 out of 100, based on 34 critics, indicating mixed or average reviews. Audiences polled by CinemaScore gave the film an average grade of "B" on an A+ to F scale. 
Roger Ebert of the Chicago Sun-Times gave the film a positive review, writing, "There is hardly a thing I can say in its favor, except that I was cheered by nearly every minute of it. I cannot argue for the script, the direction, the acting or even the mummy, but I can say that I was not bored and sometimes I was unreasonably pleased." Critics such as Entertainment Weekly's Owen Gleiberman and The New York Times' Stephen Holden concurred with the assessment of the film as a breezy crowd-pleaser. Less positively, Keith Phipps of The A.V. Club wrote that the film's attempt to create a big, Indiana Jones-inspired action film felt "forced" and the end result was unsatisfying. Other reviews complained of an overstuffed plot or recycled elements from better movies. Reviewers comparing the film to the 1932 original sometimes favored the original's focus on atmosphere and dread, though others welcomed the change to a more energetic Indiana Jones-type film. The effects were generally praised, especially the title creature. Ernest Larson's review for Jump Cut argued that the effects were too similar to ILM's other work, and that the effects alone could not support the weight of the rest of the movie. Bob Graham of the San Francisco Chronicle and Hal Hinson from the Dallas Observer agreed that the effects never overshadowed the human aspects of the film. Gleiberman noted that the horrors of the effects were undercut by the lightheartedness of the film, while the BBC's Almar Haflidason felt that the effects were occasionally unconvincing, and the heavy reliance on cutting-edge computer-generated imagery would likely date the film heavily as time passed. Critics generally praised the acting, with Haflidason writing that the efforts of the cast sold material that would otherwise have been cheesy. David Hunter of The Hollywood Reporter wrote that all the actors managed to hold their own amid the special effects, although he felt Vosloo was largely wasted after Imhotep regenerates and the screenplay gives him little to do. Reviews from USA Today, the British Film Institute, and The A.V. Club noted the film featured questionable casting of ethnic roles and occasionally traded in stereotypes of Arabs. Writing to commemorate the twentieth anniversary of the film's release, writer Maria Lewis noted that on paper, The Mummy should not have been a success, as yet another period adventure film coming after a decade of failed period adventure films. Its connection with audiences, if not critics, was down to its successful blend of "heart, humour, [heroics], and horror." She declared it the "pivotal blockbuster of the nineties." Emma Stefansky, writing for Thrillist, noted it was "the beginning of the end" for action-adventure films, as superhero films would soon supplant it in the coming years. Rotten Tomatoes called the film "Indiana Jones for a new generation." Reviewers noted Fraser's portrayal of Rick set a new mold for action heroes that more films would follow in the years after, while also considering Evelyn a character allowed to break free from a traditional damsel in distress role. Adaptations The Mummy's box office performance led to numerous sequels and spinoffs. The film sequel The Mummy Returns (2001) features most of the surviving principal characters. As a married couple, Rick and Evelyn confront Imhotep and the Scorpion King. The film also introduces the heroes' son, Alex. 
A second sequel, The Mummy: Tomb of the Dragon Emperor (2008), takes place in China with the Terracotta Emperor inspiring the villain, and Rachel Weisz replaced with Maria Bello. The films inspired both an animated TV series titled The Mummy, which lasted two seasons, and a spin-off prequel, The Scorpion King (2002). Universal announced plans in 2012 to reboot the franchise; a new film, also titled The Mummy, was released in June 2017. Developer Konami Nagoya published two video game adaptations of The Mummy under license from Universal Interactive Studios, in 2000: an action-adventure game for the PlayStation and Microsoft Windows developed by Rebellion Developments, as well as a Game Boy Color puzzle game. There was also a 2002 video game based on the 2001 animated series, published by Ubisoft for Game Boy Advance. The film also inspired a roller coaster, Revenge of the Mummy, found in three Universal Studios Theme Parks: Hollywood, California; Orlando, Florida; and Sentosa, Singapore. Notes References External links 1990s action films 1990s fantasy adventure films 1999 films 1999 horror films Remakes of American films American films Ancient Egypt in fiction Egyptian-language films English-language films Films based on Egyptian mythology Films directed by Stephen Sommers Films produced by James Jacks Films produced by Sean Daniel Films scored by Jerry Goldsmith Films set in 1923 Films set in 1926 Films set in ancient Egypt Films set in deserts Films shot at Shepperton Studios Films set in Egypt Films set in the 13th century BC Films shot in Egypt Films shot in Morocco Films with screenplays by Stephen Sommers Films using motion capture Mummy films Reboot films Seti I The Mummy (franchise) Treasure hunt films Universal Pictures films
7077
https://en.wikipedia.org/wiki/Computer%20file
Computer file
A computer file is a computer resource for recording data in a computer storage device, primarily identified by its file name. Just as words can be written to paper, so can data be written to a computer file. Files can be shared with and transferred between computers and mobile devices via removable media, networks, or the Internet. Different types of computer files are designed for different purposes. A file may be designed to store an image, a written message, a video, a computer program, or a wide variety of other kinds of data. Certain files can store multiple data types at once. By using computer programs, a person can open, read, change, save, and close a computer file. Computer files may be reopened, modified, and copied an arbitrary number of times. Files are typically organized in a file system, which tracks file locations on the disk and enables user access. Etymology The word "file" derives from the Latin filum ("a thread"). "File" was used in the context of computer storage as early as January 1940. In Punched Card Methods in Scientific Computation, W. J. Eckert stated, "The first extensive use of the early Hollerith Tabulator in astronomy was made by Comrie. He used it for building a table from successive differences, and for adding large numbers of harmonic terms". "Tables of functions are constructed from their differences with great efficiency, either as printed tables or as a file of punched cards." In February 1950, in a Radio Corporation of America (RCA) advertisement in Popular Science magazine describing a new "memory" vacuum tube it had developed, RCA stated: "the results of countless computations can be kept 'on file' and taken out again. Such a 'file' now exists in a 'memory' tube developed at RCA Laboratories. Electronically it retains figures fed into calculating machines, holds them in storage while it memorizes new ones – speeds intelligent solutions through mazes of mathematics." In 1952, "file" denoted, among other things, information stored on punched cards. In early use, the underlying hardware, rather than the contents stored on it, was denominated a "file". For example, the IBM 350 disk drives were denominated "disk files". The introduction, circa 1961, by the Burroughs MCP and the MIT Compatible Time-Sharing System of the concept of a "file system" that managed several virtual "files" on one storage device is the origin of the contemporary denotation of the word. Although the contemporary "register file" demonstrates the early concept of files, its use has greatly decreased. File contents On most modern operating systems, files are organized into one-dimensional arrays of bytes. The format of a file is defined by its content since a file is solely a container for data. On some platforms the format is indicated by its filename extension, specifying the rules for how the bytes must be organized and interpreted meaningfully. For example, the bytes of a plain text file (.txt in Windows) are associated with either ASCII or UTF-8 characters, while the bytes of image, video, and audio files are interpreted otherwise. Most file types also allocate a few bytes for metadata, which allows a file to carry some basic information about itself. Some file systems can store arbitrary (not interpreted by the file system) file-specific data outside of the file format, but linked to the file, for example extended attributes or forks. On other file systems this can be done via sidecar files or software-specific databases. 
All those methods, however, are more susceptible to loss of metadata than container and archive file formats. File size At any instant in time, a file has a size, normally expressed as a number of bytes, that indicates how much storage is associated with the file. In most modern operating systems the size can be any non-negative whole number of bytes up to a system limit. Many older operating systems kept track only of the number of blocks or tracks occupied by a file on a physical storage device. In such systems, software employed other methods to track the exact byte count (e.g., CP/M used a special control character, Ctrl-Z, to signal the end of text files). The general definition of a file does not require that its size have any real meaning, however, unless the data within the file happens to correspond to data within a pool of persistent storage. A special case is a zero byte file; these files can be newly created files that have not yet had any data written to them, or may serve as some kind of flag in the file system, or are accidents (the results of aborted disk operations). For example, the file to which the link points in a typical Unix-like system probably has a defined size that seldom changes. Compare this with which is also a file, but as a character special file, its size is not meaningful. Organization of data in a file Information in a computer file can consist of smaller packets of information (often called "records" or "lines") that are individually different but share some common traits. For example, a payroll file might contain information concerning all the employees in a company and their payroll details; each record in the payroll file concerns just one employee, and all the records have the common trait of being related to payroll—this is very similar to placing all payroll information into a specific filing cabinet in an office that does not have a computer. A text file may contain lines of text, corresponding to printed lines on a piece of paper. Alternatively, a file may contain an arbitrary binary image (a blob) or it may contain an executable. The way information is grouped into a file is entirely up to how it is designed. This has led to a plethora of more or less standardized file structures for all imaginable purposes, from the simplest to the most complex. Most computer files are used by computer programs which create, modify or delete the files for their own use on an as-needed basis. The programmers who create the programs decide what files are needed, how they are to be used and (often) their names. In some cases, computer programs manipulate files that are made visible to the computer user. For example, in a word-processing program, the user manipulates document files that the user personally names. Although the content of the document file is arranged in a format that the word-processing program understands, the user is able to choose the name and location of the file and provide the bulk of the information (such as words and text) that will be stored in the file. Many applications pack all their data files into a single file called an archive file, using internal markers to discern the different types of information contained within. The benefits of the archive file are to lower the number of files for easier transfer, to reduce storage usage, or just to organize outdated files. An archive file must often be unpacked before its contents can be used. 
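As a small illustration of the ideas above (a file is ultimately a sequence of bytes whose interpretation depends on its format, and its size is simply a byte count tracked by the file system), the following Python sketch creates a zero-byte file and a short text file and inspects both. The file names are invented for the example, and the PNG signature is shown only as one well-known example of a format's "magic number".

import os

# Create an empty (zero-byte) file, then a small text file, and inspect both.
open("empty.dat", "wb").close()

with open("notes.txt", "w", encoding="utf-8") as f:
    f.write("hello, file\n")

for name in ("empty.dat", "notes.txt"):
    size = os.stat(name).st_size          # size in bytes, as tracked by the file system
    with open(name, "rb") as f:
        head = f.read(8)                  # a file is just a sequence of bytes
    print(f"{name}: {size} bytes, first bytes: {head!r}")

# The same bytes mean different things under different formats; a reader typically
# decides how to interpret them from the filename extension or from a signature at
# the start of the file. PNG images, for example, begin with these eight bytes:
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"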
Operations The most basic operations that programs can perform on a file are:
Create a new file
Change the access permissions and attributes of a file
Open a file, which makes the file contents available to the program
Read data from a file
Write data to a file
Delete a file
Close a file, terminating the association between it and the program
Truncate a file, shortening it to a specified size within the file system without rewriting any content
Files on a computer can be created, moved, modified, grown, shrunk (truncated), and deleted. In most cases, computer programs that are executed on the computer handle these operations, but the user of a computer can also manipulate files if necessary. For instance, Microsoft Word files are normally created and modified by the Microsoft Word program in response to user commands, but the user can also move, rename, or delete these files directly by using a file manager program such as Windows Explorer (on Windows computers) or from a command-line interface (CLI). In Unix-like systems, user space programs do not operate directly, at a low level, on a file. Only the kernel deals with files, and it handles all user-space interaction with files in a manner that is transparent to the user-space programs. The operating system provides a level of abstraction, which means that interaction with a file from user-space is simply through its filename (instead of its inode). For example, rm filename will not delete the file itself, but only a link to the file. There can be many links to a file, but when they are all removed, the kernel considers that file's memory space free to be reallocated. This free space is commonly considered a security risk (due to the existence of file recovery software). Any secure-deletion program uses kernel-space (system) functions to wipe the file's data. File moves within a file system complete almost immediately because the data content does not need to be rewritten. Only the paths need to be changed. Moving methods There are two distinct implementations of file moves. When moving files between devices or partitions, some file managing software deletes each selected file from the source directory individually after being transferred, while other software deletes all files at once only after every file has been transferred. With the mv command for instance, the former method is used when selecting files individually, possibly with the use of wildcards (example: mv -n sourcePath/* targetPath), while the latter method is used when selecting entire directories (example: mv -n sourcePath targetPath). Microsoft Windows Explorer uses the former method for mass storage file moves, but the latter method when using the Media Transfer Protocol. The former method (individual deletion from source) has the benefit that space is released from the source device or partition soon after the transfer has begun, meaning after the first file is finished. With the latter method, space is only freed after the transfer of the entire selection has finished. If an incomplete file transfer with the latter method is aborted unexpectedly, perhaps due to an unexpected power-off, system halt or disconnection of a device, no space will have been freed up on the source device or partition. The user would need to merge the remaining files from the source, including the incompletely written (truncated) last file.
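The two cross-device move strategies just described can be sketched in a few lines; this is a simplified illustration with hypothetical directory paths and no error handling, not a description of how any particular file manager is implemented.

```python
import os
import shutil

def move_delete_each(src_dir, dst_dir):
    """Former method: delete each source file as soon as it is transferred,
    so space on the source is freed progressively during the move."""
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        shutil.copy2(src, os.path.join(dst_dir, name))   # copy one file
        os.remove(src)                                    # free its space now

def move_delete_after(src_dir, dst_dir):
    """Latter method: transfer everything first and delete the sources only
    at the end; if the transfer is aborted, no source space has been freed."""
    names = os.listdir(src_dir)
    for name in names:
        shutil.copy2(os.path.join(src_dir, name), os.path.join(dst_dir, name))
    for name in names:                                    # final deletion pass
        os.remove(os.path.join(src_dir, name))
```

Real file managers add error handling, progress reporting, and recursion into subdirectories on top of this basic difference.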
With the individual deletion method, the file moving software also does not need to cumulatively keep track of all files finished transferring in case a user manually aborts the file transfer. A file manager using the latter (afterwards deletion) method will have to delete only the files from the source directory that have already finished transferring. Identifying and organizing In modern computer systems, files are typically accessed using names (filenames). In some operating systems, the name is associated with the file itself. In others, the file is anonymous, and is pointed to by links that have names. In the latter case, a user can identify the name of the link with the file itself, but this is a false analogue, especially where there exists more than one link to the same file. Files (or links to files) can be located in directories. However, more generally, a directory can contain either a list of files or a list of links to files. Within this definition, it is of paramount importance that the term "file" includes directories. This permits the existence of directory hierarchies, i.e., directories containing sub-directories. A name that refers to a file within a directory must typically be unique. In other words, there must be no identical names within a directory. However, in some operating systems, a name may include a specification of type that means a directory can contain an identical name for more than one type of object such as a directory and a file. In environments in which a file is named, a file's name and the path to the file's directory must uniquely identify it among all other files in the computer system—no two files can have the same name and path. Where a file is anonymous, named references to it will exist within a namespace. In most cases, any name within the namespace will refer to exactly zero or one file. However, any file may be represented within any namespace by zero, one or more names. Any string of characters may be a well-formed name for a file or a link depending upon the context of application. Whether or not a name is well-formed depends on the type of computer system being used. Early computers permitted only a few letters or digits in the name of a file, but modern computers allow long names (some up to 255 characters) containing almost any combination of Unicode letters or Unicode digits, making it easier to understand the purpose of a file at a glance. Some computer systems allow file names to contain spaces; others do not. Case-sensitivity of file names is determined by the file system. Unix file systems are usually case sensitive and allow user-level applications to create files whose names differ only in the case of characters. Microsoft Windows supports multiple file systems, each with different policies regarding case-sensitivity. The common FAT file system can have multiple files whose names differ only in case if the user uses a disk editor to edit the file names in the directory entries. User applications, however, will usually not allow the user to create multiple files with the same name but differing in case. Most computers organize files into hierarchies using folders, directories, or catalogs. The concept is the same irrespective of the terminology used. Each folder can contain an arbitrary number of files, and it can also contain other folders. These other folders are referred to as subfolders.
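The distinction drawn in this section between a file and the names (links) that refer to it can be observed directly on file systems that support hard links; a minimal Python sketch with hypothetical file names:

```python
import os

# One file, initially reachable through a single name.
with open("report.txt", "w") as f:
    f.write("quarterly numbers\n")

os.link("report.txt", "report-alias.txt")      # a second name for the same file
print(os.stat("report.txt").st_nlink)          # -> 2 links pointing at one file

os.remove("report.txt")                        # removes a name, not the data
with open("report-alias.txt") as f:
    print(f.read(), end="")                    # the content is still there

os.remove("report-alias.txt")                  # last link gone: the space can
                                               # now be reclaimed by the system
```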
Subfolders can contain still more files and folders and so on, thus building a tree-like structure in which one "master folder" (or "root folder" — the name varies from one operating system to another) can contain any number of levels of other folders and files. Folders can be named just as files can (except for the root folder, which often does not have a name). The use of folders makes it easier to organize files in a logical way. When a computer allows the use of folders, each file and folder has not only a name of its own, but also a path, which identifies the folder or folders in which a file or folder resides. In the path, some sort of special character—such as a slash—is used to separate the file and folder names. For example, in the illustration shown in this article, the path uniquely identifies a file called in a folder called , which in turn is contained in a folder called . The folder and file names are separated by slashes in this example; the topmost or root folder has no name, and so the path begins with a slash (if the root folder had a name, it would precede this first slash). Many computer systems use extensions in file names to help identify what they contain, also known as the file type. On Windows computers, extensions consist of a dot (period) at the end of a file name, followed by a few letters to identify the type of file. An extension of identifies a text file; a extension identifies any type of document or documentation, commonly in the Microsoft Word file format; and so on. Even when extensions are used in a computer system, the degree to which the computer system recognizes and heeds them can vary; in some systems, they are required, while in other systems, they are completely ignored if they are present. Protection Many modern computer systems provide methods for protecting files against accidental and deliberate damage. Computers that allow for multiple users implement file permissions to control who may or may not modify, delete, or create files and folders. For example, a given user may be granted only permission to read a file or folder, but not to modify or delete it; or a user may be given permission to read and modify files or folders, but not to execute them. Permissions may also be used to allow only certain users to see the contents of a file or folder. Permissions protect against unauthorized tampering or destruction of information in files, and keep private information confidential from unauthorized users. Another protection mechanism implemented in many computers is a read-only flag. When this flag is turned on for a file (which can be accomplished by a computer program or by a human user), the file can be examined, but it cannot be modified. This flag is useful for critical information that must not be modified or erased, such as special files that are used only by internal parts of the computer system. Some systems also include a hidden flag to make certain files invisible; this flag is used by the computer system to hide essential system files that users should not alter. Storage Any file that has any useful purpose must have some physical manifestation. That is, a file (an abstract concept) in a real computer system must have a real physical analogue if it is to exist at all. In physical terms, most computer files are stored on some type of data storage device. For example, most operating systems store files on a hard disk. Hard disks have been the ubiquitous form of non-volatile storage since the early 1960s.
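As a small illustration of the read-only protection discussed above, most operating systems expose this flag through a file's permission bits. A minimal Python sketch, assuming an ordinary user account (an administrator or root account may be able to bypass the restriction) and a hypothetical file name:

```python
import os
import stat

# Create a file, then turn on the read-only flag via its permission bits.
with open("settings.cfg", "w") as f:
    f.write("important configuration\n")

os.chmod("settings.cfg", stat.S_IREAD)        # owner may read; no write bits set

try:
    with open("settings.cfg", "w") as f:       # attempts to modify now fail
        f.write("overwrite attempt\n")
except PermissionError:
    print("file is read-only; contents were protected")

# The flag can be cleared again before the file is edited or deleted.
os.chmod("settings.cfg", stat.S_IREAD | stat.S_IWRITE)
```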
Where files contain only temporary information, they may be stored in RAM. Computer files can also be stored on other media in some cases, such as magnetic tapes, compact discs, Digital Versatile Discs, Zip drives, USB flash drives, etc. The use of solid state drives is also beginning to rival the hard disk drive. In Unix-like operating systems, many files have no associated physical storage device. Examples are and most files under directories , and . These are virtual files: they exist as objects within the operating system kernel. As seen by a running user program, files are usually represented either by a file control block or by a file handle. A file control block (FCB) is an area of memory which is manipulated to establish a filename etc. and then passed to the operating system as a parameter; it was used by older IBM operating systems and early PC operating systems including CP/M and early versions of MS-DOS. A file handle is generally either an opaque data type or an integer; it was introduced around 1961 by the ALGOL-based Burroughs MCP running on the Burroughs B5000 but is now ubiquitous. File corruption When a file is said to be corrupted, it is because its contents have been saved to the computer in such a way that they cannot be properly read, either by a human or by software. Depending on the extent of the damage, the original file can sometimes be recovered, or at least partially understood. A file may be created corrupt, or it may be corrupted at a later point through overwriting. There are many ways by which a file can become corrupted. Most commonly, the issue happens in the process of writing the file to a disk. For example, if an image-editing program unexpectedly crashes while saving an image, that file may be corrupted because the program could not save it in its entirety. The program itself might warn the user that there was an error, allowing for another attempt at saving the file. Some other examples of reasons for which files become corrupted include: The computer itself shutting down unexpectedly (for example, due to a power loss) with open files, or files in the process of being saved; A download being interrupted before it was completed; A bad sector on the hard drive; The user removing a flash drive (such as a USB stick) without properly unmounting (commonly referred to as "safely removing"); Malicious software, such as a computer virus; A flash drive becoming too old. Although file corruption usually happens accidentally, it may also be done on purpose, for example to fool someone else into thinking an assignment was ready at an earlier date, potentially gaining time to finish said assignment. There are services that provide on-demand file corruption, which essentially fill a given file with random data so that it cannot be opened or read, yet still seems legitimate. One of the most effective countermeasures for unintentional file corruption is backing up important files. In the event of an important file becoming corrupted, the user can simply replace it with the backed up version. Backup When computer files contain information that is extremely important, a back-up process is used to protect against disasters that might destroy the files. Backing up files simply means making copies of the files in a separate location so that they can be restored if something happens to the computer, or if they are deleted accidentally. There are many ways to back up files.
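One common complementary practice, not specific to any particular tool and not mentioned in the text above, is to record a checksum of an important file when it is backed up, so that silent corruption can be detected before the copy is relied upon. A minimal Python sketch with hypothetical file names:

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Hash a file in chunks so that even very large files fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Create a stand-in file so the sketch runs on its own.
with open("thesis.doc", "wb") as f:
    f.write(b"important draft\n")

# At backup time: remember the checksum alongside the copy.
original_checksum = sha256_of("thesis.doc")

# Later: if the file no longer hashes to the same value, its bytes have
# changed, which may indicate corruption, and a backup should be used.
if sha256_of("thesis.doc") != original_checksum:
    print("file changed or corrupted; restore from backup")
else:
    print("checksum matches; file appears intact")
```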
Most computer systems provide utility programs to assist in the back-up process, which can become very time-consuming if there are many files to safeguard. Files are often copied to removable media such as writable CDs or cartridge tapes. Copying files to another hard disk in the same computer protects against failure of one disk, but if it is necessary to protect against failure or destruction of the entire computer, then copies of the files must be made on other media that can be taken away from the computer and stored in a safe, distant location. The grandfather-father-son backup method automatically makes three back-ups; the grandfather file is the oldest copy of the file and the son is the current copy. File systems and file managers The way a computer organizes, names, stores and manipulates files is globally referred to as its file system. Most computers have at least one file system. Some computers allow the use of several different file systems. For instance, on newer MS Windows computers, the older FAT-type file systems of MS-DOS and old versions of Windows are supported, in addition to the NTFS file system that is the normal file system for recent versions of Windows. Each system has its own advantages and disadvantages. Standard FAT allows only eight-character file names (plus a three-character extension) with no spaces, for example, whereas NTFS allows much longer names that can contain spaces. You can call a file "" in NTFS, but in FAT you would be restricted to something like (unless you were using VFAT, a FAT extension allowing long file names). File manager programs are utility programs that allow users to manipulate files directly. They allow you to move, create, delete and rename files and folders, although they do not actually allow you to read the contents of a file or store information in it. Every computer system provides at least one file-manager program for its native file system. For example, File Explorer (formerly Windows Explorer) is commonly used in Microsoft Windows operating systems, and Nautilus is common under several distributions of Linux. See also Block (data storage) Computer file management Data hierarchy File camouflage File copying File conversion File deletion File directory File manager File system Filename Flat-file database Object composition Soft copy References External links File Inter-process communication
37127072
https://en.wikipedia.org/wiki/Nook%20HD
Nook HD
The Nook HD and Nook HD+ are the third generation of Nook's line of color tablet e-reader/media players by Barnes & Noble for using their copy restricted (DRM) proprietary files, or other files. They are the successors to the Nook Tablet and both were released on November 8, 2012. The 7-inch version, the Nook HD (also styled NOOK HD), is available in two internal memory sizes - 8 GB (US$129) with approximately 5 GB available for user content, and 16 GB (US$149) with about 13 GB available for user content. Memory is expandable via a microSD card (up to 64 GB). The Nook HD is available in two colors: Snow (white) and Smoke (black-grey). A 9-inch version, the Nook HD+ (also styled NOOK HD+), is available with 32 GB ($179) of internal memory. Its memory is also expandable via a microSD card (up to 64 GB). The Nook HD+ is only available in one color, Slate (black-grey). When the devices were first introduced, purchasers of the Nook HD or Nook HD+ received an incentive of a $30 gift card to the Barnes & Noble shop. This expired in February 2013. In May 2013, B&N updated the Nook HD and HD+ to provide full access to the Google Play Store, which allowed users to install apps that were unavailable in the Nook Store. In June 2013, B&N announced they would stop making Nook tablets in-house. Later, B&N changed its mind and said a new Nook tablet would be released. In June 2014, Barnes & Noble announced it would be teaming up with Samsung to develop a co-branded color tablet, the Samsung Galaxy Tab 4 Nook, featuring Samsung's hardware with 7-inch and 10.1-inch displays and customized Nook software from Barnes & Noble. The Galaxy Tab 4 Nook began to be sold in the US in August 2014. History On June 25, 2013, Barnes & Noble announced it: "is abandoning its Nook tablet hardware business and will instead rely on a 'partnership model for manufacturing in the competitive color tablet market' that will seek third-party manufacturers to build eReaders that run Nook software." "The company plans to significantly reduce losses in the NOOK segment by limiting risks associated with manufacturing," Barnes & Noble said in a press release. "Going forward, the company intends to continue to design eReading devices and reading platforms, while creating a partnership model for manufacturing in the competitive color tablet market. Thus, the widely popular lines of Simple Touch and Glowlight products will continue to be developed in house, and the company's tablet line will be co-branded with yet to be announced third party manufacturers of consumer electronics products. At the same time, the company intends to continue to build its digital catalog, adding thousands of eBooks every week, and launching new NOOK Apps." On August 20, 2013, CNET reported that B&N was reversing the decision to eliminate the color Nook devices: "The bookseller will continue to design and make Nook color devices, with at least one new Nook set for the holiday season, as its chairman shelves a bid to buy the retail side." Barnes & Noble's June 5, 2014 announcement confirmed that the bookseller would be teaming up with Samsung to develop a co-branded tablet, the Samsung Galaxy Tab 4 Nook, which was released in August 2014. Modifying the Nook tablet Rooting Developers have found means to root the device, which provides access to hidden files and settings, making it possible to run apps that require deep access to the file system or make dramatic changes to the device.
Alternate operating systems, Android variants and more While the Nook's software is a variant of Android (it runs the same programs) with a different user interface and bundled software, a more standard variant of Android (CyanogenMod) is available for the Nook, as is the smartphone/tablet version of the Ubuntu operating system, which can run applications incompatible with Android. On February 1, 2014, official CyanogenMod 10.2.1 ("Android 4.3 Jelly Bean") was released for the Nook HD and HD+. CyanogenMod versions for the Nook HD and Nook HD+ are released for download under the hummingbird and ovation codenames respectively. CyanogenMod releases monthly M-builds ("rolling release") and no versions marked "stable" are to be expected after the version 11.0 M6 release ("Android 4.4.2 KitKat"). The latest version (4.4.4) is available for the Nook HD/Nook HD+ as "SNAPSHOT" and "NIGHTLIES" builds. Since August 2013, a developer preview of Ubuntu Touch 13.10 is also available, based on the ovation codename mentioned above. Ubuntu Touch can be installed alongside Android, allowing dual booting. File transfer Transferring a user's files to another computer is possible, provided that the files are not copy restricted by DRM, using the Media Transfer Protocol (MTP) in supported operating systems. See also Comparison of: E-book readers Tablet computers References Barnes & Noble Android (operating system) devices Tablet computers introduced in 2012 Tablet computers Touchscreen portable media players
68018813
https://en.wikipedia.org/wiki/Yang%20Fuqing%20%28scientist%29
Yang Fuqing (scientist)
Yang Fuqing (born 6 November 1932) is a Chinese computer software expert who is a professor at the School of Information Science and Technology, Peking University, a member of the Chinese Academy of Sciences, and currently chairwoman of the university's School of Software and Microelectronics and director of the National Engineering Research Center of Software Engineering. Biography Yang was born in Wuxi, Jiangsu, on 6 November 1932, to Yang Jiechen, a businessman, and Li Wenying. Her given name, "Fuqing", means lotus in Chinese. In 1945, she attended the Wuxi No.1 Girls' Middle School, where she was fascinated by mathematics. In 1949, Wuxi was liberated. One day, Yang and eight other girls took part in a charity performance for poor students organized by the local communist government in the people's theater. After the performance, Yang and the girls became popular amateur dance stars in Wuxi overnight. Since then, dance became her hobby. In 1951, she was admitted to the Department of Mathematics, Tsinghua University with the highest marks in her school. In 1952, when the Communist Party of China regrouped China's higher education institutions, she moved to Peking University with her department. Under the supervision of , she became the first graduate student majoring in computational mathematics in China. In 1957, China sent a computer delegation to the Soviet Union. As a member of the delegation, Yang first came into contact with the vacuum tube computer in the computing center of the Academy of Sciences of the Soviet Union and began to learn how to write programs. In 1958, she transferred to the Department of Mathematics and Mechanics, Moscow University and studied programming automation under the guidance of Mikhail Romonovic Shulabola. Yang returned to China in October 1959 and taught at her alma mater. In 1962, she went to the Soviet Union again and joined the Computing Center of the Dubner Institute of Nuclear Physics. In December 1969, she participated in the development of China's first integrated circuit computer, the DJS11, and was responsible for the design of its instruction system and operating system. In 1973, Peking University was invited to participate in the overall design of the DJS200/XT2 series computer, and she was appointed as a member of the overall design group of the 200 Series Software and the leader of the 240 Computer Software. She became deputy director of the Department of Computer Science and Technology in 1981, and was promoted to director in 1983. In November 1994, was officially registered, and Yang was made its chairwoman. Personal life Yang met Wang Yangyuan at Peking University. They got married in the autumn of 1960. The couple has a son and a daughter. Honours and awards 1991 Member of the Chinese Academy of Sciences (CAS) 1997 Science and Technology Progress Award of the Ho Leung Ho Lee Foundation 1998 State Science and Technology Progress Award (Second Class) 2003 Fellow of the Institute of Electrical and Electronics Engineers (IEEE Fellow) 2007 State Science and Technology Progress Award (Second Class) 2011 Lifetime Achievement Award of the China Computer Federation References External links Development of Software Engineering: Co-operative efforts from academia, government and industry, Yang Fuqing on uci.edu 1932 births Living people People from Wuxi Scientists from Jiangsu Peking University alumni Peking University faculty Members of the Chinese Academy of Sciences Fellow Members of the IEEE
185177
https://en.wikipedia.org/wiki/Don%20Norman
Don Norman
Donald Arthur Norman (born December 25, 1935) is an American researcher, professor, and author. Norman is the director of The Design Lab at University of California, San Diego. He is best known for his books on design, especially The Design of Everyday Things. He is widely regarded for his expertise in the fields of design, usability engineering, and cognitive science. He is a co-founder and consultant with the Nielsen Norman Group. He is also an IDEO fellow and a member of the Board of Trustees of IIT Institute of Design in Chicago. He also holds the title of Professor Emeritus of Cognitive Science at the University of California, San Diego. Norman is an active Distinguished Visiting Professor at the Korea Advanced Institute of Science and Technology (KAIST), where he spends two months a year teaching. Much of Norman's work involves the advocacy of user-centered design. His books all have the underlying purpose of furthering the field of design, from doors to computers. Norman has taken a controversial stance in saying that the design research community has had little impact in the innovation of products, and that while academics can help in refining existing products, it is technologists that accomplish the breakthroughs. To this end, Norman named his website with the initialism JND (just-noticeable difference) to signify his endeavors to make a difference. Early academics In 1957, Norman received a B.S. degree in electrical engineering from Massachusetts Institute of Technology (MIT). Norman received an M.S. degree in electrical engineering from the University of Pennsylvania. He received a PhD in psychology from the University of Pennsylvania. He was one of the earliest graduates from the Mathematical Psychology group at University of Pennsylvania and his advisor was Duncan Luce. After graduating, Norman took up a postdoctoral fellowship at the Center for Cognitive Studies at Harvard University and within a year became a lecturer. After four years with the Center, Norman took a position as an associate professor in the Psychology Department at University of California, San Diego (UCSD). Norman applied his training as an engineer and computer scientist, and as an experimental and mathematical psychologist, to the emerging discipline of cognitive science. Norman eventually became founding chair of the Department of Cognitive Science and chair of the Department of Psychology. At UCSD, Norman was a founder of the Institute for Cognitive Science and one of the organizers of the Cognitive Science Society (along with Roger Schank, Allan Collins, and others), which held its first meeting at the UCSD campus in 1979. Together with psychologist Tim Shallice, Norman proposed a framework of attentional control of executive functioning. One of the components of the Norman-Shallice model is the supervisory attentional system. Cognitive engineering career Norman made the transition from cognitive science to cognitive engineering by entering the field as a consultant and writer. His article "The truth about Unix: The user interface is horrid" in Datamation (1981) catapulted him to a position of prominence in the computer world. Soon after, his career took off outside of academia, although he still remained active at UCSD until 1993. Norman continued his work to further human-centered design by serving on numerous university and government advisory boards such as the Defense Advanced Research Projects Agency (DARPA). 
He currently serves on numerous committees and advisory boards, including those at Motorola, the Toyota National College of Technology, the TED Conference, Panasonic, Encyclopædia Britannica and many more. Norman was also part of a select team flown in to investigate the 1979 Three Mile Island nuclear accident. In 1993, Norman left UCSD to join Apple Computer, initially as an Apple Fellow serving as a User Experience Architect (the first use of the phrase "User Experience" in a job title), and then as the Vice President of the Advanced Technology Group. He later worked for Hewlett-Packard before joining with Jakob Nielsen to form the Nielsen Norman Group in 1998. He returned to academia as a professor of computer science at Northwestern University, where he was co-director of the Segal Design Institute until 2010. In 2014, he returned to UCSD to become director of the newly established The Design Lab housed at the California Institute for Telecommunications and Information Technology. Awards and honors Norman has received many awards for his work. He received two honorary degrees, one "S. V. della laurea ad honorem" in Psychology from the University of Padua in 1995 and one doctorate in Industrial Design and Engineering from Delft University of Technology. In 2001, he was inducted as a Fellow of the Association for Computing Machinery (ACM) and won the Rigo Award from SIGDOC, the Association for Computing Machinery's Special Interest Group (SIG) on the Design of Communication (DOC). In 2006, he received the Benjamin Franklin Medal in Computer and Cognitive Science. In 2009, Norman was elected an Honorary Fellow of the Design Research Society. In 2011, Norman was elected a member of the National Academy of Engineering for the development of design principles based on human cognition that enhance the interaction between people and technology. Nielsen Norman Group Norman, alongside colleague Jakob Nielsen, formed the Nielsen Norman Group (NN/g) in 1998. The company's vision is to help designers and other companies move toward more human-centered products and internet interactions, and the company is a pioneer in the field of usability. User-centered design In 1986, Norman introduced the term "user-centered design" in User Centered System Design: New Perspectives on Human-computer Interaction, a book edited by him and by Stephen W. Draper. In the introduction of the book, the idea that designers should aim their efforts at the people who will use the system is introduced: "People are so adaptable that they are capable of shouldering the entire burden of accommodation to an artifact, but skillful designers make large parts of this burden vanish by adapting the artifact to the users." In his book The Design of Everyday Things, Norman uses the term "user-centered design" to describe design based on the needs of the user, leaving aside what he deems secondary considerations, such as aesthetics. User-centered design involves simplifying the structure of tasks, making things visible, getting the mapping right, exploiting the powers of constraint, designing for error, explaining affordances and the seven stages of action. In his book The Things that Make Us Smart: Defending the Human Attribute in the Age of the Machine, Norman uses the term "cognitive artifacts" to describe "those artificial devices that maintain, display, or operate upon information in order to serve a representational function and that affect human cognitive performance".
Similar to his book The Design of Everyday Things, Norman argues for the development of machines that fit our minds, rather than having our minds conform to the machine. In the revised edition of The Design of Everyday Things, Norman backtracks on his previous claims about aesthetics and removes the term user-centered design altogether. In the preface of the book, he says: "The first edition of the book focused upon making products understandable and usable. The total experience of a product covers much more than its usability: aesthetics, pleasure, and fun play critically important roles. There was no discussion of pleasure, enjoyment and emotion. Emotion is so important that I wrote an entire book, Emotional Design, about the role it plays in design." He instead currently uses the term human-centered design and defines it as: "an approach that puts human needs, capabilities, and behavior first, then designs to accommodate those needs, capabilities, and ways of behaving." Bibliography He is on numerous educational, private, and public sector advisory boards, including the editorial board of Encyclopædia Britannica. Norman published several important books during his time at UCSD, one of which, User Centered System Design, obliquely referred to the university in the initials of its title. This is a list of select publications. Psychology books Usability books Other publications Direct manipulation interfaces (1985) about direct manipulation interfaces in collaboration with E. L. Hutchins (first author) and J.D. Hollan User Centered System Design: New Perspectives on Human-Computer Interaction (1986) (editor in collaboration with Stephen Draper) Combining his books, Design of Everyday Things, Turn Signals Are the Facial Expressions of Automobiles, Things That Make Us Smart, with various technical reports. See also Cognitive engineering Executive system Human action cycle Human-computer interaction Human-centered design User-centered design Interaction design References External links Publications by Donald Norman from Interaction-Design.org Donald Norman at Userati Video: Franklin Institute Award on Donald Norman from April 2006 by the Franklin Institute Video: Living With Complexity, April 2011 talk at Stanford University "An evening of UX Hacking with Don Norman at Stanford" (Stanford University, December 17, 2013) 1935 births Living people Apple Inc. employees Apple Fellows Cognitive scientists MIT School of Engineering alumni University of Pennsylvania School of Engineering and Applied Science alumni Harvard University faculty University of California, San Diego faculty Northwestern University faculty American computer scientists Fellows of the Association for Computing Machinery Human–computer interaction researchers Members of the United States National Academy of Engineering Design researchers Fellows of the Cognitive Science Society Center for Advanced Study in the Behavioral Sciences fellows
1432156
https://en.wikipedia.org/wiki/Eumel
Eumel
EUMEL (pronounced oimel for Extendable Multi User Microprocessor ELAN System and also known as L2 for Liedtke 2) is an operating system (OS) which began as a runtime system (environment) for the programming language ELAN. It was created in 1979 by Jochen Liedtke at the Bielefeld University. EUMEL initially ran on the 8-bit Zilog Z80 processor. It was later ported to many different computer architectures. More than 2000 Eumel systems shipped, mostly to schools and also to legal practices as a text processing platform. EUMEL is based on a virtual machine using a bitcode and achieves remarkable performance and function. Z80-based EUMEL systems provide full multi-user multi-tasking operation with virtual memory management and complete isolation of one process against all others. These systems usually execute ELAN programs faster than equivalent programs written in languages such as COBOL, BASIC, or Pascal, and compiled into Z80 machine code on other operating systems. One of the main features of EUMEL is that it is persistent, using a fixpoint/restart logic. This means that if the OS crashes, or the power fails, a user loses only a few minutes of work: on restart they continue working from the prior fixpoint with all program state fully intact. This is also termed orthogonal persistence. EUMEL was followed by the L3 microkernel, and later the L4 microkernel family. References Discontinued operating systems Microkernels
4132710
https://en.wikipedia.org/wiki/Wind%20controller
Wind controller
A wind controller, sometimes referred to as a wind synthesizer, is an electronic wind instrument. It is usually a MIDI controller associated with one or more music synthesizers. Wind controllers are most commonly played and fingered like a woodwind instrument, usually the saxophone, with the next most common being brass fingering, particularly the trumpet. Models have been produced that play and finger like other acoustic instruments such as the recorder or the tin whistle. The most common form of wind controller uses electronic sensors to convert fingering, breath pressure, bite pressure, finger pressure, and other gesture or action information into control signals that affect musical sounds. The control signals or MIDI messages generated by the wind controller are used to control internal or external devices such as analog synthesizers or MIDI-compatible synthesizers, synth modules, softsynths, sequencers, or even non-instruments such as lighting systems. Simpler breath controllers are also available. Unlike wind controllers, they do not trigger notes and are intended for use in conjunction with a keyboard or synthesizer. A breath controller can be used with a keyboard MIDI controller to add articulation and expression to notes sounded on the keyboard. For example, a performer who has pressed a long held note on the keyboard with a sustained sound, such as a string pad, could blow harder into the breath controller set to control volume to make this note crescendo, or gradually blow more and more gently, to make the volume die away. Some wind controllers contain a built-in sound generator and can be connected directly to an amplifier or a set of headphones. Some even include small built-in speakers, such as the Roland Aerophone series and the Akai EWI SOLO; however, their small speaker systems cannot reproduce bass notes correctly or provide adequate sound levels for serious live performance, so these built-in sound systems are strictly for home practice at modest playback levels. Some wind controllers such as the EWI USB, Berglund NuEVI and NuRAD are strictly "controllers" and do not make a sound on their own, and thus must be connected via MIDI or USB to a sound generating device (or a soft synth). For this reason, a wind controller can sound like almost anything (depending on the capabilities of its sound generator). Wind controller models such as the Akai EWI5000, EWI SOLO, and Roland Aerophones have built-in onboard sample sounds, as well as the MIDI and/or USB outputs. The now discontinued EWI 4000s had a DSP subtractive synthesizer built in rather than sampled instruments and so remains popular on the second hand market. The fingering and shape of the wind controller put no acoustic limitations on how the wind controller actually sounds. For example, a wind controller can be made to sound like a trumpet, saxophone, violin, piano, pipe organ, choir, synthesizers or even a barnyard rooster. Whether designed primarily to appeal to woodwind, brass or harmonica players, controllers can produce any virtual instrument sound. Some virtual instruments and hardware synthesizers are better suited to adaptation for wind controller performance than others. A hardware or software synthesizer's suitability is largely dependent on the control options available. MIDI CC mapping options allow the player to control elements like the filter cutoff via breath control for expressive dynamics.
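The MIDI messages mentioned above are small, well-defined byte sequences, so mapping breath pressure onto them takes only a few lines. The sketch below constructs raw MIDI bytes using only the Python standard library; actually sending them to a synthesizer would additionally require a MIDI output device or library, which is outside the scope of this illustration, and the breath-pressure readings shown are invented.

```python
# Breath pressure (0.0 .. 1.0 from a sensor) mapped to MIDI messages.
# Controller #2 is the standard "breath controller" continuous controller (CC).

def breath_cc(pressure, channel=0):
    """Return a 3-byte Control Change message for breath controller (CC 2)."""
    value = max(0, min(127, round(pressure * 127)))    # clamp to 7-bit range
    return bytes([0xB0 | channel, 2, value])           # status, controller, value

def note_on(note, velocity, channel=0):
    """Return a 3-byte Note On message (e.g. triggered by tonguing)."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

# A short phrase: start a note, then shape its dynamics with the breath stream.
messages = [note_on(60, 100)]                          # middle C
for pressure in (0.2, 0.5, 0.8, 1.0, 0.6):             # invented sensor readings
    messages.append(breath_cc(pressure))

for m in messages:
    print(m.hex(" "))
```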
Custom patches (or presets) are required for optimal expressivity, to take advantage of the considerable benefits of wind control. History Predecessors Already in the 1930s Benjamin F. Miessner was working on various electroacoustic instruments. Among these was an electroacoustic clarinet that featured an electromagnetic pickup for the reed vibration and was connected to a variety of electronic filters. Miessner's patent from 1938 marks the birth of the electronic wind instrument family. Early experiments with fully electronic instruments started in the 1940s. Leo F. J. Arnold invented an electronic clarinet that featured an on/off-switch controlled by the human breath. This instrument is documented in Arnold's patent from 1942. The Frenchman Georges Jenny and the German engineer Ernst Zacharias played an essential role in the development of the first analog wind controllers in the 1950s. Jenny received his patent for an electronic wind instrument in 1954. It features a breath transducer for variable volume control that works with a piezo element. The prototypes of Zacharias, who started to work on electronic wind instruments in 1956, led to the first commercially produced wind synthesizer – the Hohner Electra-Melodica, released in 1967. Analog wind controllers The first widely played wind controller was the Lyricon from Computone, which came about in the 1970s era of analog synthesizers. The Lyricon was based on the fingerings of the saxophone and used a similar mouthpiece. It set the standard for hardware-based wind controllers with a number of features that have been preserved in today's MIDI wind controllers, including the ability to correctly interpret the expressive use of reed articulation, breath-controlled dynamics, and embouchure-controlled pitch variation. The Lyricon also expanded the playing range several octaves beyond the accustomed range for woodwind players. Tone generation on the Lyricon was limited to a dedicated analog synthesizer designed specifically to interpret various wired analog outputs from the instrument. Notable early recording artists on the Lyricon include Roland Kirk and Tom Scott. Third-party adaptations would later bring the Lyricon into the MIDI era. The next wind controller of note was the brass style Steiner EVI invented by wind controller pioneer Nyle Steiner. Steiner was the inventor of the brass style EVI (electronic valve instrument) wind controller designed for brass players, as well as the EWI (electronic woodwind instrument) designed for woodwind players. Steiner made many very important contributions to the development of wind controllers. His research started in the late 1960s and his first wind controller was the Steiner Parker EVI released in 1975. Originally this EVI was only a "controller" which sent control voltages only for pitch and gate and was to be connected to commercial analog synthesizers. The breath sensor on this early original model EVI was very crude, consisting of a simple on/off switch activated by the player's breath pressure. Steiner went on to refine and develop new expressive methods of sensing the player's gestures which have since become standard wind controller features, such as an expressive proportional type breath sensor (as compared to earlier switch on/off type breath sensing), tonguing velocity sensing, a vibrato lever for the right hand thumb, pitch bend up and down thumb sensors, glide sensing for portamento effects, bite sensing, lip sensing, and others.
Steiner's analog wind controller systems eventually included his own analog synthesizer design bundled into a complete self-contained system (Steinerphone). Steiner was also a studio musician and he played his EVI on the soundtrack of the film "Apocalypse Now". Shortly after the release of the Steiner EVI, woodwind musicians asked Steiner to make a woodwind version of the EVI, and Steiner designed the EWI. The EWI was made famous in the mid 1980s by jazz musician Michael Brecker with the group Steps Ahead when he played the Steinerphone EWI with dazzling bravura. Around 1985 Steiner developed a sophisticated MIDI interface for his EVI and EWI by modifying the JL Cooper Wind Driver box. In 1987, Akai licensed Steiner's EVI and EWI designs and released the Akai EVI1000 brass style and woodwind style EWI1000 wind controllers along with a companion EWV2000 sound module. The EWV2000 featured a MIDI output jack which allowed it to connect to additional MIDI synthesizers, opening up a universe of possibilities and leading to numerous appearances in movie and television soundtracks as well as pop music recordings. The EVI1000 or EWI1000 controllers combined with the EWV2000 sound generator were actually a hybrid digital/analog system. Analog signals were derived from the various sensors (e.g., key, bite, bend, glide, etc.) on the EVI1000/EWI1000 controller unit, then converted to digital signals by a front-end microprocessor in the EWV2000. These digital signals were then altered by the microprocessor and D/A converted to internal analog control voltages appropriate for the analog synthesizer ICs within the EWV2000. The D/A conversion within the EWV2000 used a very high resolution and conversion rate, such that the responsiveness to the player felt immediate, i.e. "analog". The subsequent EWI3000, EWI3020, and EWI3030m systems also used this A/D/A scheme within their dedicated tone modules, though these later models of the EWI would support MIDI in and out. MIDI controller revolution With the advent of MIDI and computer-based digital samplers in the early 1980s, the new music technology ushered in a variety of "alternative" MIDI controllers. In the 1960s and 1970s, the main way for a musician to play synthesizers was with a keyboard. With MIDI, it became possible for non-keyboardists to play MIDI synthesizers and samplers for the first time. These new controllers included, most notably: MIDI drums, MIDI guitar synthesizers, and MIDI wind controllers. Leading the way to demonstrate the virtuosic potential of this new arsenal of MIDI technology on the world stage through extensive touring and big-label recordings were guitarist Pat Metheny playing the guitar synthesizer and saxophonist Michael Brecker playing the wind controller, each leading their own bands. Digital wind controllers and MIDI The most widely played purely digital wind controllers include the Yamaha WX series and the Akai EWI series. These instruments are capable of generating a standard MIDI data stream, thereby eliminating the need for dedicated synthesizers and opening up the possibility of controlling any MIDI-compatible synthesizer or other device. These instruments, while usually shaped something like a clarinet with a saxophone-like key layout, offer the option to recognize fingerings for an assortment of woodwinds and brass. The major distinction between the approach taken by the two companies is in the action of their keys.
Yamaha WX series instruments have moving keys like a saxophone or flute that actuate small switches when pressed. Akai EWI series instruments have immovable, touch-sensitive keys that signal when the player is merely making contact with the keys. In the hands of skilled players each of these instruments has proved its ability to perform at a high level of artistry. The now defunct Casio DH series were toy-like wind controllers introduced in the mid-1980s and had a built-in speaker (with limited sound sources) as well as being usable as MIDI controllers. A recent addition to the wind controller category is the Synthophone, an entirely electronic wind controller embedded in the shell of an alto saxophone. Since the electronic components take up the open space of the saxophone, it is not playable as an acoustic instrument; however, since the exterior matches that of the acoustic instrument, it is significantly more familiar to play. Additionally, keyboard-based breath controllers are also available. These modulate standard keyboards, computers and other MIDI devices, meaning they are not played like a woodwind, but like a keyboard with a breath controller added (similar to a pump organ). Yamaha's BC series can be used to control DX and EX units. Midi Solutions makes a converter box that allows any MIDI device to be controlled by the Yamaha BC controllers. TEControl also makes a USB device that is simply a jump drive with a breath tube attached that can be plugged into any standard computer. Acoustic wind instrument conversion to software MIDI as wind control Through the 1990s the major hardware-based wind controllers improved through successive models and a number of minor, and less commercially successful, controllers were introduced. For a time, these hardware controllers were the only viable bridge between the woodwind or brass player and the synthesizer. But dating back to the 1980s a lesser known software-based alternative began to emerge. With a software-based conversion program the musician plays an ordinary wind instrument into a microphone, at which point a software program (sometimes with dedicated computer hardware) interprets the pitch, dynamics, and expression of this acoustic sound and generates a standard MIDI data stream just in time to play along with the performer through a synthesizer. While the first commercial product attempting this approach dates back to the Fairlight Voicetracker VT-5 of 1985, a more successful modern approach using software on personal computers (combined with a digital audio workstation and softsynths) is relatively new. Two more recent examples of this unusual approach were Thing-1 from ThingTone Software, and Digital Ear Realtime from Epinoisis Software.
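Once a pitch detector has produced a frequency and an amplitude, the software-based approach described above reduces to two conversions: frequency to the nearest MIDI note number and amplitude to a velocity or controller value. A minimal sketch of those conversions, with invented detector readings standing in for real audio analysis:

```python
import math

def freq_to_midi_note(freq_hz):
    """Nearest MIDI note number for a frequency (A4 = 440 Hz = note 69)."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def amplitude_to_velocity(amplitude, full_scale=1.0):
    """Map a 0..full_scale amplitude estimate onto MIDI velocity 1..127."""
    return max(1, min(127, round(127 * amplitude / full_scale)))

# Invented output of a pitch detector: (frequency in Hz, amplitude 0..1).
detected = [(261.6, 0.40), (329.6, 0.55), (392.0, 0.70)]   # roughly C4, E4, G4

for freq, amp in detected:
    note = freq_to_midi_note(freq)
    velocity = amplitude_to_velocity(amp)
    # A real converter would emit these as MIDI Note On messages in real time.
    print(f"{freq:6.1f} Hz -> note {note:3d}, velocity {velocity:3d}")
```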
Virtually all current synthesizers and their sound libraries are designed to be played primarily with a keyboard controller, whereby the player often reserves one hand to manipulate the many real-time controls to determine how the instrument sounds, and perhaps using a foot to manipulate an expression pedal. Wind controller players do not have access to as many of these controls and thus are often limited in exploiting all of the potential voicings and articulation changes of their synthesizers, but the technologies of physical modeling (Yamaha VL70-m), sample modeling and hybrid technologies (SWAM engine) promise more expression control for wind controller players. Furthermore, sound designers are paying more attention to the different playing idioms in which their sounds will be used. For example, certain percussion sounds do not work well with a wind controller simply because playing a struck instrument it is not idiomatic to the woodwind, whereas synthesized instruments that model the acoustic properties of a woodwind will seem fitting and natural to a wind controller player. A few of the many hardware (Yamaha, Roland, Akai, Kurzweill, Aodyo) and software (Native Instruments, Garritan, SampleModeling, Sample Logic, LinPlug, Audio Modeling) synthesizers provide specific support for wind controllers, and they vary widely with respect to how well they emulate acoustic wind, brass, and string instruments. The SWAM technology, devised by Audio Modeling, has specific settings for Yamaha, EWI, Sylphyo and Aerophone wind controllers and has succeeded in producing very rapid natural responsiveness with their woodwinds and bowed strings virtual instruments. Also Samplemodeling has specific settings for wind controllers on their Kontakt-based brass. That said, virtually all current synthesizers respond to MIDI continuous controllers and the data provided by wind controller breath and lip input can usually be routed to them in an expressive way. An example of a hardware synthesizer with wind controller support is the Yamaha VL70-m which uses physical modeling synthesis. Physical modeling allows for a unique level of responsiveness to the control signals sent from a wind controller. The emulation of acoustic instrument sounds varies in quality. The VL70-m is able to connect directly to the Yamaha WX series of controllers and via MIDI to the Akai and other wind controllers. Similarly, an example of a software synthesizer with support for wind controller playing is the Zebra synthesizer from Urs Heckmann, Apple's ES2 softsynth, Korg's Mono/Poly softsynth, Audio Modeling's SWAM instruments, and many others. It is important to note that whatever synth is used, it will need to be set up with specially designed breath responsive patches for optimal response to a wind controller. Manufacturers The major manufacturers of wind controllers are Akai, Roland, and Yamaha. As of the beginning of 2022 the available mass production wind controllers include the Akai EWI SOLO, EWI5000, EWI USB, Roland Aerophone models AE-01, AE-05, AE-10, AE-20 and AE-30. Less commonly available models include the AODYO SYLPHYO, Synthophone and the handmade Berglund NuRad and NuEVI. Older models out of production included the Akai EWI 4000s, EWI 3020, EWI 3000, Yamaha WX11, Yamaha WX7, Yamaha WX5, and offerings from Casio including the DH-100, DH-200, DH-500 and DH-800. Wind controllers with saxophone fingerings Synthophone The Synthophone is a Wind Controller synthesizer. 
It is a MIDI sax offering real sax fingerings and a standard sax embouchure. The MIDI hardware allows the key action as well as breath and lip pressure to be read as MIDI data. Since it is a saxophone, the fingerings are the same, with some additions: several combinations allow real-time editing of patches and harmony. The instrument has made several appearances at NAMM, including in 1997. "The design of the Synthophone goes back to the 'pre-MIDI times' of 1981, where the first prototype (a wood-stick with Boehm-like keys) was designed by Martin Hurni. It was connected to a dedicated analog synthesizer system. This first stage of Synthophone was followed by a REAL alto sax with keys connected to a switching system to give a more realistic playing feel." "At the ARS ELECTRONICA 1984 contest, the first prize was given to the design of the Synthophone for its 'most original and future-oriented development in the field of electronic sound production'." Afterwards, the MIDI-capable prototype was developed to extend its functionality into a full wind controller. The Synthophone is an evolution of the acoustic saxophone into the information age. The Synthophone is made by Softwind Instruments in Bern, Switzerland. Others After the Synthophone, several other MIDI saxes have been released that offer real sax fingerings: in 2019 the Travel Sax by Odisei Music, and in 2020 the YDS-150 digital saxophone by Yamaha. These MIDI saxes have sensors for breath pressure to adjust the volume, but they do not read lip pressure and thus do not allow the pitch to be controlled by the embouchure or by the manner of breathing. With the YDS-150, pitch bend can be achieved using a separate input on the instrument. Both the Travel Sax and the YDS-150 provide for settings customisation using a Bluetooth-connected mobile app. Distinguishing features The Synthophone requires different maintenance than a saxophone. It differs from other wind controllers by not having onboard presets; it must be used with a computer or MIDI synthesizer. The reed is glued to a machined metal piece (lip sensor). The additional finger combinations allow the instrument to produce polyphonic effects to make it a chordal instrument, or it can be played as a homophonic instrument. Some other distinguishing features are selectable diatonic tonality, six chord variations (inversions, subs, number of voices, unison/chords) adjusted with lips, freeze harmony, sustain, and obligato or portamento. It is programmable to change to the keys of Bb, C, and Eb. The electronics are within a Yamaha YAS-275 saxophone. See also Akai EWI Casio digital horn Eigenharp Lyricon Yamaha WX5 Variophon References External links Wind Controller FAQ TEControl USB MIDI Breath Controller Breath Controller Demonstration Video Breath Controllers Softwind Instruments Synthesizers MIDI controllers
37429859
https://en.wikipedia.org/wiki/United%20States%20v.%20Clark
United States v. Clark
United States of America v. Clark (U.S vs. Clark, 11-2270) (United States Court of Appeals, Eighth Circuit 2012) is the name of a lawsuit against Jason Elliott Clark by the U.S. government based on identity theft, bank fraud and conspiracy. This was an appeal from the United States District Court for the District of Minnesota. Clark appealed his conviction for aggravated identity theft based on the sufficiency of the evidence and the court's admission of certain prior acts of evidence. Case summary Jason Elliott Clark, along with Marcus Benson, Jason Richard Hansen, and Nou Thao, was indicted for stealing identities and subsequently stealing more than $150,000 from victims. Clark was charged with two counts of bank fraud and one count of aggravated identity theft. Clark and his partners in crime conspired to obtain funds from the accounts of others by creating counterfeit checks including stolen bank routing and account numbers. Clark used personally identifiable information with the intent to commit bank fraud and access device fraud. The bank fraud charges were related to the unlawful withdrawals and transfer of money from the accounts of the victims. Full conviction would have resulted in a maximum penalty of 30 years for conspiracy, 30 years for each count of bank fraud, 15 years for identity theft, 10 years for each count of access device fraud, and two years on each aggravated identity theft count. Ultimately Clark was sentenced to 48 months in prison based on bank fraud conspiracy under , bank fraud (two counts) under , identity theft under and aggravated identity theft under . Crime details The conspiracy was led by Benson, while Clark, Hansen and Thao were co-conspirators. Benson provided fraudulent checks which the co-conspirators deposited into his personal bank account. The money was then withdrawn, the bulk of it sent to Benson and a small portion kept by the co-conspirators as payment for the services provided. Benson, Clark and Hansen originally became friends when working together at an electronics store. Hansen left this job and began working as an analyst for a mortgage broker. During his employment, he decided to use confidential personally identifiable information including social security numbers, dates of birth, addresses, and account numbers of individuals applying for mortgage loans. Benson at the same time also started his own mortgage loan brokerage. Hansen contacted Benson through Clark to offer him this personal information, claiming they were mortgage leads. Ultimately Benson used the confidential personally identifiable information to obtain the fraudulent checks. Hansen also approached Thao for assistance with the check-cashing scheme. In September 2007, Clark deposited a check from the victim (D.R.O.) for $10,250 and then again for $145,000, supposedly as payment for a property that Clark owned and was now selling. Wells Fargo initiated an investigation where the initial suspicion was that Clark was a fraud victim. It turned out quite the opposite later when they received testimony from D.R.O. that D.R.O. had in fact never sent any money to Clark. Clark subsequently admitted to getting the checks from Benson, withdrawing the funds, and sending him the vast majority of the funds. Officers arrested Benson and executed a consent search of his home where they found fraudulent ID documents, credit cards, and skimmers as well as photographs of Clark and the other co-defendants, Hansen and Thao.
Trial and appeal Based on , the government must prove that the defendant knew that the identity was associated with a real person rather than being fabricated. Clark argued that the evidence was insufficient for a reasonable juror to find beyond a reasonable doubt that Clark knew that the identity on the check belonged to an actual person. The court concluded that a reasonable juror could infer that Clark (as a bank account holder and prior identity thief) knew that banks only open accounts and give credit to real people. Double jeopardy A defendant may move for a judgment of acquittal after the government closes its evidence, or after the close of all the evidence, under Rule 29 of the Federal Rules of Criminal Procedure. Clark attempted to gain acquittal on identity theft and aggravated identity theft. Clark argued that sections and of the U.S. Code proscribe the same offense. If that were so, the Double Jeopardy Clause of the Fifth Amendment to the United States Constitution would prohibit sentencing for both identity theft and aggravated identity theft. In United States v. Felix, 503 U.S. 378 (1992), it was ruled that an offense and a conspiracy to commit that offense are not the same offense; fundamentally, one cannot be tried twice for the same offense. In this case, however, the court decided that section allows punishment for both identity theft and aggravated identity theft, and that the intent of the United States Congress was to impose multiple punishments under the two separate statutes. In Blockburger v. United States, , it was confirmed that multiple punishments for convictions that fall under separate statutes do not violate the double jeopardy clause. The motion for judgment of acquittal was denied, with the court citing Chase and applying the same standard of review as the district court. Character evidence During the trial, evidence was introduced of a 2001 case in which Clark had pleaded guilty to identity theft. Clark argued that this reference to prior bad acts was not admissible as character evidence. The court concluded that evidence of prior bad acts was in fact admissible under the exceptions of Rule 404(b) of the Federal Rules of Evidence for limited purposes such as intent, knowledge or absence of mistake, as long as it was relevant to a material issue. Since Clark presented a defense that he acted in good faith when depositing these checks, his knowledge and intent were considered at issue. Rule 404(b) also permits the inclusion of past bad acts when the prior act is similar and not too remote in time from the current crime. Clark's challenge to the admission of the prior-acts evidence was also rejected. Citing Ruiz-Estrada, the court stated that it would reverse only when such evidence had no bearing on the case and was being used primarily to prove that the defendant had a propensity to commit criminal acts. Citing Balanga, the court concluded that a reasonable juror could have found the defendant guilty of the charged conduct beyond a reasonable doubt. The court added that a conviction can be based on both circumstantial and direct evidence, citing Erdman. Clark ultimately received a total sentence of 48 months in prison: 24 months for bank fraud and 24 months for aggravated identity theft. Legal details: identity theft and bank fraud Identity theft is usually not committed as an end in itself but rather as a means of facilitating some other crime, such as financial or real property theft. 
Identity theft was not a federal crime in the US until 1998, when the Identity Theft and Assumption Deterrence Act became effective. Previously, only credit-granting agencies that suffered monetary losses were considered victims. With the passing of this act, the person whose identity was stolen was for the first time viewed as the actual victim. Under , producing an unlawful identification document, authentication feature, or false identification document, or possessing such a document, is a crime punishable by the penalties stated in the statute. One of the largest and most sophisticated identity theft cases in the US involved 111 people who used skimming devices to steal consumer credit card information at retail and food establishments. It is estimated that identity theft cost the US Internal Revenue Service about $5.2 billion in 2011. Check fraud in the US cost American consumers and banks about $20 billion in 2010, and some argue that checks should have been eliminated long ago. Eliminating checks would not only reduce the probability of fraud but also reduce the costs of processing. Bank fraud in the US is covered under and refers to any attempt to defraud a financial institution for the purpose of obtaining money, funds, credits, assets or other property owned by the financial institution. As most financial systems today are electronic, bank fraud is often executed through computer systems and networks. Under , knowingly exceeding authorized access to a computer system with the intent to defraud and obtain anything of value is also illegal. Damaging a bank computer is also an offense under . Fraud prevention - data mining and pattern identification Companies are taking advantage of security information and event management systems, in conjunction with large-scale data mining and pattern identification techniques, to reduce the risk of financial fraud. Patterns of fraud and inappropriate transactions must be discerned from normal or acceptable user activity. Although the above case focuses on check fraud, credit card fraud has typically been the most prevalent type of identity-theft-facilitated fraud. Banks have employed real-time monitoring based on rule-based engines; however, because fraud is dynamic, the monitoring of the bank's own fraud trends, the customer's patterns and the exchange of data between financial institutions has to be just as flexible. Summary of laws applied : Fraud and related activity in connection with identification documents, authentication features, and information : Fraud and related activity in connection with identification documents, authentication features, and information : Fraud and related activity in connection with identification documents, authentication features, and information : Aggravated identity theft : Bank fraud : Attempt and conspiracy See also Bank fraud Check fraud Check washing Civil Identity Program of the Americas Credit card fraud Fair and Accurate Credit Transactions Act Fair Credit Billing Act Fair Credit Reporting Act Ghosting (identity theft) Hacking Identity document forgery Identity fraud Wireless identity theft References External links United States v. 
Clark - Actual Case at Findlaw Stop Fraud Website - Three Indicted Identity Theft State Statutes Identity Theft Resource Center - USA Identity Theft and Fraud – United States Department of Justice United States Court of Appeals for the Eighth Circuit cases 2012 in United States case law United States property case law United States banking case law United States computer case law United States identity theft case law
9485394
https://en.wikipedia.org/wiki/Paramount%20Defenses
Paramount Defenses
Paramount Defenses is a privately held American Cyber Security company that develops cyber security solutions to "help organizations secure their foundational Microsoft Active Directory deployments". The company was founded in 2006 by Sanjay Tandon, former Microsoft Program Manager for Active Directory Security. The company is headquartered in Newport Beach, California and led by its founder. It operates under the guidance of an Advisory Board that amongst others includes Karen Worstell, former Microsoft Chief Information Security Officer (CISO) and Donald Codling, former Unit Chief Liaison from the Federal Bureau of Investigation (FBI) Cyber Division to the Department of Homeland Security National Cyber Security Division. The company produced The Paramount Brief, an executive summary, declassified in 2016, that describes a serious cyber security risk that potentially impacts the foundational cyber security of 85% of organizations worldwide. History Paramount Defenses was founded in 2006. In 2013, the United States Patent Office awarded its founder a cyber security patent governing the accurate determination of effective access in information systems. References Computer security software companies Computer security software Active Directory Software companies established in 2006 Companies based in Newport Beach, California Companies based in Orange County, California
53027714
https://en.wikipedia.org/wiki/Fritz%20Menzer
Fritz Menzer
Ostwin Fritz Menzer (born 6 April 1908 in Herrndorf near Niederschöna in Saxony, between Chemnitz and Dresden; died 25 October 2005 in Bad Homburg vor der Höhe) was a German cryptologist who, before and during World War II, worked in In 7/VI, the Wehrmacht signals intelligence agency, later in OKW/Chi, the cipher bureau of the supreme command of the Wehrmacht, and later in the Abwehr, the military intelligence service of the Wehrmacht. He was involved in the development and production of cryptographic devices and procedures, as well as the security control of Germany's own methods. Life At the age of 18, he joined the Reichswehr as a mechanic and was assigned to a motorized battalion stationed in Leipzig. Menzer had already developed an interest in cryptography and was granted a patent for a "combined measuring apparatus for angles and lengths, the data [from which was] expressed in an enciphered form in a four-place combination of letters". After 12 years in the Signal Corps, where he had risen to the rank of Oberfunkmeister (a senior NCO rank) and where his duties included sealing hundreds of envelopes daily, he was eventually sent to OKW/Chi for testing. After his inclination and aptitude for cryptanalysis work had been recognized at the Army Signal School, he was transferred to the Cipher Department of the High Command of the Wehrmacht (OKW/Chi) in May 1933, where he was taught cryptanalysis techniques, among others by Wilhelm Fenner, the head of main group B. In 1936, in a team led by Otto Buggisch, he developed cryptanalytic methods to break the C-36, a cipher machine designed by the Swedish engineer Boris Hagelin. In addition, he developed a cryptanalytic method for breaking the Wehrmacht's own machine, the Enigma machine. Subsequently, he was commissioned to lead his own unit within OKW/Chi, which dealt with the cryptanalysis of foreign cryptographic methods, as well as the development and security checking of Germany's own procedures and the construction of new cryptanalytic aids. Thus, at the age of 28, Fritz Menzer became the Chief of Communications Security for the German Army. Menzer stated at the time: Since the troops and their command, because of their ignorance of the scientific status of cryptanalytic methods, regarded encipherment as a drag on modern communications technique, I often had to overcome great difficulties to put through my ideas. His service as a soldier ended on 31 May 1938 with the rank of Senior Radio Technician. He stayed with OKW/Chi as a civilian. Two years later, in 1940, he was promoted to government () inspector and was entrusted with the management of Unit IIc of OKW/Chi, dealing with the development and manufacture of special encryption devices for government agencies such as the Reich Security Main Office and the Abwehr, as well as for German industry. On 1 April 1940, he was promoted to the rank of Superior Government Inspector. With the increased emphasis on cryptographic security and long range communications, in early 1942, Menzer's section was broken up into three functional subsections. Later in 1942, Admiral Canaris gave Menzer the responsibility of testing the security of the Abwehr's cryptographic systems. To what extent Fritz Menzer can actually be regarded as the inventor of various, partly innovative cipher machines, such as the Schlüsselgerät 39 and the Schlüsselgerät 41 (also known as the Hitlermühle), is controversial, since responsibility for their production lay with the army's ordnance authorities rather than with OKW/Chi. 
However, he was probably at least involved in the design of the machines and entrusted with their technical review. Later in 1942, Canaris commissioned Menzer to carry out security checks on the Abwehr's own cryptographic procedures. Menzer recognized blatant cryptographic weaknesses in the methods used and, in the summer of 1943, reworked all the manual methods used by the Abwehr. He introduced the ABC Schlüssel, Procedure 62 and Procedure 40, which were all double transposition (Transposition cipher) and substitution systems, as well as the Schlüsselrad or cipher wheel, a hand-cranked autoclave (autokey) device. He remained an advisory cryptologist with the Abwehr until the end of the war, which he experienced not in Berlin but, together with part of OKW/Chi under the direction of Wilhelm Fenner, in the south of the Reich, in Werfen. On 23 April 1945, OKW/Chi was officially disbanded and the staff was assigned to the General der Nachrichtenaufklärung (GdNA). Just before the American army reached their location about 40 km south of Salzburg, the staff burned their documents or threw them into the Salzach. With the capitulation of the Wehrmacht on 8 May 1945, the service was terminated for all former members of the OKW. Menzer was captured and interned at the US camp Neufeld near Munich. On 17 June, he was released and travelled to the Soviet Zone of Occupation (SBZ), first to the city of Leipzig, and on 22 September to Zschopau, where he worked as a teacher from January 1946. Shortly afterwards, he was dismissed because of his past in the Wehrmacht. In the turmoil of the beginning of the Cold War he again came into contact with the Americans in Berlin on 8 September 1947 and was taken to Camp King in Oberursel near Frankfurt. Menzer was released and returned to Zschopau on 12 September. He was arrested on 20 September by the Soviets, imprisoned in Dresden, and interrogated about his contacts with the Americans. Finally, on 13 March 1948, he was released after he had consented to spy for the Soviets. In April 1949, he decided to flee from the Soviet Zone and travelled via West Berlin to the western occupation zones (Allied-occupied Germany). His name last appeared in documents in 1951. A death notice from the Frankfurt area records his death at the age of 97. He was buried on 5 November 2005 in Bad Homburg. Menzer's inventions During Menzer's service with the OKW/Chi and the Abwehr between 1935 and 1945, he was responsible for a number of advances in machine cryptography. His technique was to adapt Hagelin pin wheels to provide irregular wheel motion in cryptographic machinery. Before World War II, two types of cipher machinery were used by Germany: the Enigma cipher machine and Hagelin-type cryptographic machinery. In the latter, all wheels stepped once with each encipherment, with the cycle extended by the use of wheels of different lengths. For the Enigma, motion was of the odometer type, with the only variation being the starting point of the cycle on each rotor. Fritz Menzer's inventions were designed to make such motions unpredictable. {| class="wikitable" |- ! colspan=5 style="background:LightSteelBlue"|Device types and description |- ! style="text-align: center;background:#ccc" |German Device Name ! style="text-align: center;background:#ccc" |Translation ! style="text-align: center;background:#ccc" |Year Invented ! 
style="text-align: center;background:#ccc" |Notes on Device |- | Lückenfüllerwalze | Gap-filling wheel | February 1943 | In a normal Enigma rotor, on the left side, it had a movable (with respect the rotor) ring with a single drive notch, and on the right a fixed 26 notched blocking wheel that regulated the drive. When the drive notch on one rotor reached the reading position, the next subordinate rotor would advance one position. For the Lückenfüllerwalze, the notch ring was fixed on the rotor and had 26 drive notches, any of which could be filled in to make them inactive, thus providing for irregular stepping of the subordinate rotor. Dr Walter Fricke was responsible for initial design of the device. |- | Schlüsselgerät 39 (SG-39) | Key device 39 | 1939 | |- | M40 | Device 40 | February 1940 | This was a machine designed by Menzer in 1940 and never put into use. The machine was mechanical in operation. It was a cylinder, with about 30 slots for cipher alphabets. These slots were rotated by a hand crank, and might move from 0 to 3 slots after each letter. The plain text alphabet was mixed and was in a fixed horizontal slot. The plain text was enciphered by reading from this plain text alphabet to the cipher alphabet which had been brought next to it. Otto Buggisch described the principles as that of the Trittheim Table, a historic cryptographic principle from the 17th century. It was a form of Polyalphabetic cipher. The motion was governed by 3 (or possibly 4) with positive and negative lug settings as with the Hagelin machines. The motion was the sum of the positive settings, subject to an overlap principle, similar in principle to the M-209 device. Otto Buggisch did not know the cycle of the motion of the wheels, or the details of the construction by which they acted to vary the motion when the crank was turned. Additional security was provided by using only 36 strips at one time, leaving about 4 slots blank. When these slots reached the enciphering position, a random letter was chosen and inserted in the cipher text, and the plain text letter was enciphered by the next strip that came to the enciphering position. No ideas were ever formulated on the total number of strips to be used, or the frequency of settings changes. Preliminary tests by Dr Doering and Otto Buggisch, gave the machine a high security rating. However it was just as bulky as the Enigma cipher device, and could not print letters, which was then the chief improvement desired. For these reasons it was rejected, and only a lab model was ever built. |- | Schlüsselgerät 41 (SG-41) | Key Device 41 | 1941 | This cipher machine was based on Hagelin encipherment, but included a mechanism for variable stepping the Hagelin wheels. The device had six pin wheels which were mutually prime. The first five of these wheels had kicks of 1,2,4,8 and 10 respectively. The sixth wheel made these kicks positive and negative. The enciphering cycle of one letter, consisted of three elements: This took place if, and only if, the sixth wheel had an active pin the motion index position. If this was the case, then all the following occurred: Wheel 1 moved one step. Each of the remaining four wheels moved one step unless the wheel to its left had an active pin in the motion index position in which case it would move two steps. A key kick was generated which was the sum of all the kicks of wheels which had active pins in the kick index position. 
However, if the sixth wheel had an active pin in the kick index position, the key kick would be 25 minus the sum of all the other kicks. Under such a circumstance, the key would complement itself. The second element was identical to the first, except that it occurred whether or not Wheel 6 had an active pin in the motion index position. In this step, Wheel 6 also stepped one or two positions, depending on the state of Wheel 5. The original specifications called for a lightweight, durable machine to be used by units forward of the division. Menzer designed it to produce a cipher tape and to be keyboard operated in order to improve encryption speed. As a result of the keyboard operation, he was able to redesign the arrangement of letters on the print wheels to flatten the cipher frequency count. During the war there were shortages of aluminum and magnesium, resulting in the machine weighing between 12 and 15 kilograms, which was too heavy for field use. Removal of the keyboard would have made the machine lighter, but the design of the print wheels prevented their being directly used for encipherment. Production stopped because no solution to this problem was found. About 1000 machines were built and these were distributed to the Abwehr, which began using them in 1944. The Luftwaffe supposedly used these for 10-figure traffic, possibly weather reports. |- | Schlüsselkasten | Key case, Cipher Box | | The Cipher Box was a mechanical cipher device making use of the principle of sliding strips. Basically, it was a 3/4 pound aluminum box containing three Hagelin pin wheels and a coil spring which determined the stepping of a sliding strip or "slide rule" on the top of the box. Two alphabets were written on the slide rule, 13 characters of each on the fixed base, and 13 characters of each on the top and the bottom of the sliding strip. The latter were so written that only one alphabet at a time was in phase. Alphabets could be changed as often as required. In use, the slide was pulled to the right until it stopped, winding the spring that drove the mechanism. Pressing a button released the slide to move left. When, at either or both of the reading positions, the pins were all inactive, the slide stopped and the encipherment took place. If the step came from alone, or and together, the slide took an additional step. When the slide stopped, either the top or the bottom alphabet would be in phase and the cipher value could be read off. Pressing the button again would allow the strip to slide left to its next stop. Many Enigma devices were planned to be replaced with the Schlüsselkasten. It had a fairly high level of security. Given the alphabets on the slide rule, it was possible to recover the pin patterns with a crib of about 30 letters. Without the crib, computer assistance would have been necessary and large quantities of cipher would have been required to recover the alphabets. A modification was considered in which two 26-character alphabets were slid against one another, rather than the 13-character segments. This would have increased the device's security considerably, since more text would have been required to recover the alphabets. It would, however, have simplified recovery of the pin pattern after alphabet recovery. The solution of a single message was most unlikely. |- | Schlüsselscheibe | Cipher disk | | This was designed by Menzer for use by agents. The principle of operation was similar to the Schlüsselkasten. Three resettable but permanently notched wheels were used. 
For encryption, the inner disk was rotated to wind the spring. Pressing the key would release the inner disk and allow it to rotate until stopped by the notched rings. If the inner disk stopped in a position where its letters were in phase with those of the outer disk, the cipher value would be read directly. If the stop was in an intermediate position, the number of the line opposite the plain value would be read, and the cipher value taken from the cell with that number. |- | Schlüsselrad | Key Wheel | | The Schlüsselrad was a hand-operated cipher device also designed for agent use. It was made up of two disks. The lower disk had 52 notches or holes around its edge, into which a pencil or stylus could be inserted to turn the disk. On the face of the disk were 52 cells into which a keyboard-mixed alphabet could be inscribed twice, clockwise. The upper disk had a direct standard alphabet inscribed clockwise on one half of its periphery, next to a semicircular window that, when the two disks were assembled concentrically, revealed 26 characters of the mixed sequence on the lower disk. The upper disk also had a notch cut into its edge which exposed ten of the holes on the lower disk. This notch had the digits 0 to 9 inscribed next to it in a counterclockwise direction, so that when the exposed holes were lined up with the numbers, the letters on the lower disk were lined up with the letters on the upper disk. Various methods of key generation were used. On Chilean links, an 11-letter key word was numbered as for a transposition key, with the first digit of any two-digit number dropped. This key was extended by appending a two-digit group count and a four-digit time group: A N T O F O G A S T A 1 6 0 7 4 8 5 2 9 1 3 1 2 1 4 4 0 On other links, a Fibonacci sequence of 100-125 digits would be generated through various manipulations of the date, time and a secret number. If a message was longer than the key, the key would be reversed and reused as many times as necessary. Key generation tables were also used. In use, the key constituted the input to an autoclave (autokey). After aligning the alphabets according to a prearranged system or according to an indicator in the message, a stylus was inserted into the hole corresponding to the first key digit, and the lower disk was rotated clockwise until the stylus was stopped by the end of the notch. The plain text was then found on the upper disk and its cipher value read off the lower disk. The stylus was then placed in the hole corresponding to the second digit of the key, and the same procedure was repeated for the second letter of the text. Thus, the true key at any point in the cipher was equal to the sum of all previous key inputs (mod 26). |} References 20th-century German inventors German cryptographers 1908 births 2005 deaths German military personnel of World War II History of telecommunications in Germany Telecommunications in World War II Reichswehr personnel
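The stepping and kick rules given above for the Schlüsselgerät 41 can be illustrated with a small simulation. The following sketch is only a rough model of the description in the table: the wheel lengths and pin patterns are invented for the example, the motion index and kick index positions are collapsed into a single reading position, and the machine's three-element cycle is reduced to one always-executed stepping element plus the kick calculation.
<syntaxhighlight lang="python">
import random

# Illustrative model of the SG-41 stepping and kick rules as summarised
# above.  Wheel sizes and pin patterns are invented for the example (the
# text only says the six wheel lengths were mutually prime).

WHEEL_SIZES = [25, 23, 21, 19, 17, 16]   # assumed, pairwise coprime lengths
KICKS = [1, 2, 4, 8, 10]                 # kicks of wheels 1-5 per the text

random.seed(0)
pins = [[random.choice([True, False]) for _ in range(n)] for n in WHEEL_SIZES]
pos = [0] * 6                            # current position of each wheel

def active(w):
    """Pin state of wheel w at its current reading position."""
    return pins[w][pos[w] % WHEEL_SIZES[w]]

def step_and_kick():
    """One stepping element: advance the wheels, then compute the key kick."""
    # Wheel 1 always steps once; wheels 2-5 step once, or twice if the wheel
    # to their left shows an active pin.
    moves = [1] + [2 if active(w - 1) else 1 for w in range(1, 5)]
    # Wheel 6 steps one or two positions depending on wheel 5.
    moves.append(2 if active(4) else 1)
    for w, m in enumerate(moves):
        pos[w] = (pos[w] + m) % WHEEL_SIZES[w]
    # Key kick: sum of the kicks of wheels 1-5 with an active pin,
    # complemented to 25 minus that sum when wheel 6 is active.
    kick = sum(k for w, k in enumerate(KICKS) if active(w))
    if active(5):
        kick = 25 - kick
    return kick

if __name__ == "__main__":
    print([step_and_kick() for _ in range(10)])
</syntaxhighlight>
Even this simplified model shows how the mutually prime wheel lengths and the complementing sixth wheel combine to make the kick sequence irregular, which was the point of Menzer's design.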
33461978
https://en.wikipedia.org/wiki/Skyglobe
Skyglobe
Skyglobe is an astronomy program for MS-DOS and Microsoft Windows first developed in the late 1980s and early 1990s, and originally sold as Shareware but now available as closed-source freeware. It plots the positions of stars, Messier objects, planets, sun and moon. Skyglobe was designed by Mark A. Haney and his company KlassM Software Inc. in Ann Arbor, Michigan, and first released in 1989, after Mark graduated in computer science from Michigan State University. Use of Skyglobe is still suggested to students at Villanova University. Registered users of the MS-DOS version received additional software, such as a Skyglobe screensaver, and Crystal Sphere, an application simulating the 3,800 stars nearest the solar system. Crystal Sphere may have been the progenitor of CircumSpace, KlassM Software's subsequent stellar neighborhood simulator, displaying the nearest 7,780 stars. At least two versions of Skyglobe were later released for Microsoft Windows, being variously listed as versions 1.0, 2.0, 2.02 and 4.0, as well as other possible iterations between. Skyglobe for Windows (also known as SG4WIN) is freely available online. Accuracy Skyglobe accounts for the earth's precession in its calculations and should therefore be accurate to tens of thousands of years in the past and the future, but its manual does warn that the positions of planets might not be accurate throughout this range (it says "their coordinates are approximately correct for as far back and forward as we have data"). Skyglobe's dates use the Julian calendar until October 4, 1582 and the Gregorian calendar thereafter. It does not have a zero year. User interface Commands in the DOS version of Skyglobe are mostly keystroke-based, and by default the available keys are listed on the screen for reference. Keys exist to adjust the viewer's location, viewing direction, and time (all of which are shown on the display by default), and to control how many objects are rendered. There is also a zoom control, and a function to search for particular objects. If the object being searched for is not currently above the horizon but will be in the next 24 hours, Skyglobe will adjust the time appropriately. A mouse can be used to change the viewing direction or to point at objects; the object under the mouse will be named in the lower-left corner of the screen, along with its horizontal and equatorial coordinates. Clicking with the mouse will re-center the display at that point, and right-clicking will "lock" that position or object to the display's center so you can more easily follow it over time changes. (It is also possible to lock objects using keyboard commands.) Skyglobe can be made to animate the changes of any of its parameters with its "turbo" function, most commonly used to speed up time. The turbo function can animate the changes of centuries or millennia to demonstrate precession. See also Space flight simulation game List of space flight simulation games Planetarium software List of observatory software References External links SkyGlobe download CircumSpace 1.0 download site Astronomy software DOS software
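The calendar behaviour described in the Accuracy section (Julian dates up to October 4, 1582, Gregorian dates afterwards, and no year zero) can be summarised in a short sketch. The function and variable names below are illustrative only and are not taken from Skyglobe's source code.
<syntaxhighlight lang="python">
# Minimal sketch of the calendar conventions described above: dates up to
# 4 October 1582 are treated as Julian, later dates as Gregorian, and there
# is no year zero (1 BC is followed directly by AD 1).

def calendar_for(year, month, day):
    """Return which calendar a given civil date falls under."""
    if (year, month, day) <= (1582, 10, 4):
        return "Julian"
    return "Gregorian"

def display_year(astronomical_year):
    """Convert an astronomical year (which has a year 0) to a no-year-zero
    display, as a program without a zero year must do."""
    if astronomical_year <= 0:
        return f"{1 - astronomical_year} BC"
    return f"AD {astronomical_year}"

if __name__ == "__main__":
    print(calendar_for(1582, 10, 4))    # Julian
    print(calendar_for(1582, 10, 15))   # Gregorian
    print(display_year(0))              # 1 BC
    print(display_year(-44))            # 45 BC
</syntaxhighlight>
The tuple comparison places October 4, 1582 itself in the Julian regime, matching the cutover described above.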
26145730
https://en.wikipedia.org/wiki/Diamond%20Cut%20Audio%20Restoration%20Tools
Diamond Cut Audio Restoration Tools
Diamond Cut Audio Restoration Tools (also known as DC-Art and Diamond Cut Audio Lab) is a set of digital audio editor tools from Diamond Cut Productions used for audio restoration, record restoration, sound restoration of gramophone records and other audio containing media. Origins Diamond Cut Audio Restoration Tools (DC-Art) was originally a private venture by R&D engineer Craig Maier and software engineer Rick Carlson. Developed in the early 1990s, the original concept was conceived in an attempt to preserve the extensive Edison Lateral collection of test pressing recordings held at the Edison National Historic Site in West Orange, New Jersey. DC-Art was developed so that the many test pressings could be transferred to digital tape for preservation and archival purposes. The total number of songs which were recorded with the original software numbered over 1200 in anywhere from two to five takes and included many recordings that had not been played since the late 1920s. In 1995, the Diamond Cut Audio Restoration Tools (DC-Art) program was first formally introduced into the commercial marketplace. Since then DC-Art (or DC for short) has been used throughout the world for not only musical audio restoration applications, but for others such as 911 call restoration, clarification of police surveillance recordings, cleanup of radio broadcasts for release on CD, restoration of historic spoken word recordings, cockpit voice recording restoration among others. The full version is highly useful and flexible for any number of audio related operations. Versions Unlike other programs in the field of record and audio restoration, DC-Art has undergone regular and significant modifications on an almost annual basis. Many of the changes appear to stem from the frequent discussions and interest in the process of audio restoration by addressing them using new and novel algorithms in order to both simplify and improve the outcomes of the audio restoration process. As such, many of the releases include alterations aimed at both novice and experienced/expert users or engineers in the field: DC-Art DC-Art was the first of the officially released audio restoration programs from Diamond Cut Productions in April 1995 and featured a 16-bit processing architecture for the various audio processing algorithms. Initially it was released as a beta version to potential customers and identified as QA 1.1. During the initial release it took several months of de-bugging to optimise the various algorithms. DC-Art V1.0 was made available to the public by July 1996 and later through Tracer Technologies, Inc. in order to facilitate the marketing and distribution of the software. Version 2.0 of DC-Art was released later in December 1997 and included innovative real-time filter previews. If you have an original copy of QA 1.1, consider it a collectable antique! DC-Art-32 DC-Art 32 was released in 1998 and also referred to as "DC-Art Version 3.0". In contrast to the earlier release, DC-Art 32 used a 32-bit processing architecture to improve the accuracy of the various audio processing algorithms. Unlike other audio-restoration software, DC-Art 32 also introduced a novel enhancement processes termed the Virtual Valve Amplifier (aka. VVA). This novel enhancement algorithm provided the opportunity to re-create or enhance various harmonic frequencies that are otherwise lost during the audio restoration process and to "color" sound to match that of the era from which the recording was made. 
DC-Art Millennium DC-Art Millennium/Live or "DC-Art Version 4.0" was publicly released in August 1999. Unlike previous versions, DC-Art Millennium brought with it a new level of performance and features in the audio restoration and enhancement software market. Unlike other software of its type it included features like live feed-through mode whereby a user could effectively restore a recording on the fly. At the time, it was also one of a few programs available to the public that supported 24-bit/96 kHz sound files which was later found to facilitate impulse noise detection and FFT-based noise removal algorithms. In addition, DC-Art Millennium boasted over 15 real-time tools that could be used individually or in combination (called a multi-filter) and thus setting the restoration software benchmark for its time. In August 2001, DC-Art Live and Millennium were up-dated to version 4.8 with various bug fixes and the addition of a digital high resolution VU meter. Also during this period, changes were made to improve the frequency resolution of the spectrum analyzer. DC-Enhance/MP3 DC-Enhance/MP3 was released in February 2001 as a low cost product for improving the fidelity of MP3 audio files. Unlike other members of the DC-Art family DC-Enhance/MP3 did not include any tools for audio restoration, rather focusing on enhancement of bass and harmonic attributes of already digitised audio. Due to the limited function of this program it was rapidly superseded by other DC-Art versions which included MP3 enhancement as a part of the multifilter component. DC-Audio Mentor As the name suggests, Audio Mentor was designed to take novice users "by the hand" and guide them through the restoration process of removing clicks, pops, hiss and surface noise from any recording format. In a rather innovative method, Audio Mentor took a user through a series of informative steps from setting the recording levels, noise reduction, sound enhancement to making a CD in minutes. It included facilities to automatically split a file into its individual tracks such as when transcribed from a cassette or LP record. In addition, it simplified the process of tracking audio files undergoing or having undergone the restoration process by automatically transferring them between various folders and included a novel mechanism for keeping track of what had been done on each individual file (e.g. noise reduction, Enhancement, Final Touches). DC-Audio Mentor underwent several months of beta testing before being released in mid July 2006. DC-5 DC-5 beta began development in early 2000. As the result of a substantial number of improvements, testing did not commence until June 2002 before being released in September 2002. Previous versions of DC-Art software used a source vs destination file setup termed "Classic edit mode" whereby each time a filter was applied to a source file it would generate a new, altered destination file tiled within the same window. Whilst this was commendable for archival purposes, for the home user or intermediate expert it resulted in too many semi-complete restoration files. DC-5 addressed this issue by introducing "Fast Edit" mode and a new "Fast Edit" history whereby each change to the original file could be reversed. In this mode, both the source and destination files became the one file on screen, whilst information regarding changes made were saved in the background to enable them to be selectively un-done. 
Other improvements included VOX recording facilities, timer recording, dithering, 192 kHz sampling rate support and most importantly a "Live Feed-through mode" was added which enabled a multifilter to perform a record restoration on the fly often required for radio broadcasting. Changes were also made to the functionality of the continuous noise filter whereby new inflexion points could be added to better match the FFT filter curve to that of the sampled noise in order to facilitate its removal without producing digital artifacts. Other improvements were aimed at novice users or reducing the time required to restore recordings made from old shellac records. These included new and simplified EZ DeClick and EZ DeCrackle filters for repairing unwanted impulse noises. Similarly, a weighting function was added to the median filter which had advantages for both de-crackling a recording and facilitating speech clarification by changing the "timbre" of the sound often required with forensic audio recordings. Other additions included a de-clipping filter and acoustical analysis facilities for spectrograph-voice prints. DC-6 DC-6 Beta testing began May 2004 and was released in late August 2004. This period saw a significant change to how the various software algorithms were being performed and resulted in increased speed of filter processing (reportedly anywhere from 25 to 100% faster). During this period, there were also a number of simplified filter designs added to aid novice users with the basics of record restoration. This included the introduction of an EZClean filter that de-clicked, de-hissed and removed residual hum from a record in one step. Similarly, the Live/Forensic edition saw a number of additional improvements relating to forensics work. This included dual logging of incoming un-processed and out-going processed audio streams, time domain and frequency domain adaptive filter algorithms for separating human voice from significant background noise. It is also possible that DC-6 was the first to boast a new “Auto-continuous noise” FFT algorithm that would automatically calculate the noise fingerprint of a recording “on the fly” and update it on a continuous basis. Whilst this method was deemed not as effective for removing noise compared to the other traditional techniques, the convenience it provided to novice or time-restricted users and the additional ability to adapt to changing noise environments outweighed the noise reduction performance limitations. AFDF/VVA VST plugin In 2006 several of the audio enhancement tools previously only available in the DC-Art line were trialed and released as VST plugin devices for use in other VST plugin compatible audio software. These plugins included a suite of powerful digital signal processing (DSP) audio tools including the Automatic Frequency Domain Filter (AFDF) as well as the Virtual Valve Amplifier (VVA). These tools were readily welcomed especially the AFDF due to its ability to automatically adapt itself to noisy files and remove large amounts of noise with little intervention from the user. The AFDF filter found its application in situations where voice recordings were obscured by noise and needed to separated from cacophonous environmental sounds in order to make them audible and understandable. This was primarily because it was designed as an “Adaptive” filter that constantly changed during operation, adjusting itself to the noisy environment. 
Unlike other filters offered in the DC-Art range, the AFDF is optimized for Forensics recordings and not “High Fidelity” files. As such, it was programmed to exhibit a faster response time and a narrower effective bandwidth while producing higher levels of noise reduction at the expense of potentially producing higher levels of digital artifacts. Applications included recordings collected by law enforcement officials and others responsible for cleaning up poorly recorded evidence in a short time for accurate presentation at trials. DC-7 DC-7 began beta testing in July 2007 and was later released in October 2007. It boasted a significantly different algorithm included into the FFT-based continuous noise filter called "Artifact Suppression" mode. The new algorithm reportedly reduced the aliasing and artifacting that often accompanies noise reduction with such filters. Of equal significance was the introduction of a simple Virtual Phono Preamp for accurate and simplified equalization of almost any record format. Another milestone was the development and inclusion of an "EZ Enhancer" filter which simplified the otherwise difficult process of sound enhancement that often follows the noise reduction process. DC-7 also introduced a simple "Tune Library" which allowed a user to catalogue the progression and restoration of audio files in a database style view. Somewhere along the line it was also decided to include the Spectrogram View previously only found in the DC-Forensics range with the basic audio restoration/Live software version to aid in manual impulse noise removal and other challenging operations. Together, these changes provided very successful and popular in the field of audio restoration tools and was likened to the transition "from regular television to High Definition". DC7.5 DC-7 was further evolved to DC-7.5 which was released for beta testing in November 2008. It included a number of improvements namely the addition of a narrow crackle filter for smoothing the sound of recordings made on old shellac records. Other modifications included better discrimination of impulse noise filters in the presence of Brass/voice solos and the FFT-based continuous noise filter which was also modified to include a larger range of FFT filter sizes (from 32 to 1600+ frequency bins). Similarly, a number of other restorations were updated and improved before being made available to the public in January 2009. DC-8 DC-8 is the latest of the Diamond Cut Audio Restoration Software releases. DC8 was released for private beta testing in late November 2009. Due to a number of significant changes to the core of the program, rigorous beta testing took several months delaying the final release to March 2010. DC8 included a number of novel and innovative advances including a "Big Click Filter" for automatically repairing extremely large clicks created by a cracked or badly gouged record. DC8 also introduced the concept of a Direct Spectral Editor tool to provide users with the ability to manually attenuate or interpolate very long-lived noise events on recordings like coughs, whistling, chair movement and other unwanted spurious signals. In addition, the spectral editing tool has other uses in audio restoration including the selective removal of record “swoosh” sounds that are otherwise unable to be removed without significant audio fidelity degradation. The DSE tool has been discontinued in the United States versions, but is available elsewhere throughout the world. 
Other improvements included changes to the performance of the manual impulse noise interpolator through the use of a combination of time and frequency domain techniques. DC8 also included file support for Broadcast Wave Formats (BWF), Flac, and Ogg Vorbis. The Virtual Phono Preamplifier was also expanded to include 49 additional LP EQ recording curves usable with either RIAA or Flat Preamp front-end hardware. Of the novel additions was the inclusion of the “Sub-Harmonic Synthesizer” and the “Overtone Synthesizer”. These two filters generate harmonic frequencies to re-create the lost lower and upper octaves of a recording improving the fidelity of a restoration. See also List of music software References External links Acoustics software
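Several of the versions described above rely on an FFT-based continuous noise filter driven by a sampled noise fingerprint. The sketch below illustrates the general spectral-subtraction idea behind such filters; it is a generic textbook approach, not Diamond Cut's actual algorithm, and the frame size, overlap and subtraction factor are arbitrary example choices.
<syntaxhighlight lang="python">
import numpy as np

# Generic illustration of FFT-based noise reduction from a noise
# "fingerprint", in the spirit of the continuous noise filters described
# above.  Plain spectral subtraction with arbitrary parameters.

FRAME = 1024          # FFT size (the filters above offered a range of sizes)
HOP = FRAME // 2      # 50% overlap
ALPHA = 1.0           # subtraction factor

def noise_fingerprint(noise, frame=FRAME):
    """Average magnitude spectrum of a noise-only excerpt."""
    frames = [noise[i:i + frame] for i in range(0, len(noise) - frame, frame)]
    return np.mean([np.abs(np.fft.rfft(f * np.hanning(frame))) for f in frames], axis=0)

def spectral_subtract(signal, fingerprint):
    """Subtract the noise fingerprint from each overlapping frame."""
    out = np.zeros(len(signal))
    win = np.hanning(FRAME)
    for start in range(0, len(signal) - FRAME, HOP):
        spec = np.fft.rfft(signal[start:start + FRAME] * win)
        mag = np.maximum(np.abs(spec) - ALPHA * fingerprint, 0.0)   # floor at zero
        cleaned = mag * np.exp(1j * np.angle(spec))                 # keep original phase
        out[start:start + FRAME] += np.fft.irfft(cleaned) * win
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(48000) / 48000.0
    noise = 0.2 * rng.standard_normal(t.size)
    noisy = np.sin(2 * np.pi * 440 * t) + noise
    fp = noise_fingerprint(noise[:8192])
    restored = spectral_subtract(noisy, fp)
    print(restored.shape)
</syntaxhighlight>
A production restoration tool adds refinements such as adjustable filter curves, artifact suppression and automatic fingerprint updating, which the article describes but which are beyond this sketch.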
39648
https://en.wikipedia.org/wiki/New%20England%20Digital
New England Digital
New England Digital Corporation (1976–1993) was founded in Norwich, Vermont, and relocated to White River Junction, Vermont. It was best known for its signature product, the Synclavier Synthesizer System, which evolved into the Synclavier Digital Audio System or "Tapeless Studio." The company sold an FM digital synthesizer/16-bit polyphonic synthesizer and magnetic disk-based non-linear 16-bit digital recording product, referred to as the "Post-Pro." The Synclavier was developed as the "Dartmouth Digital Synthesizer" by Dartmouth College Professors Jon Appleton and Frederick J. Hooven, in association with NED co-founders Sydney A. Alonso and Cameron W. Jones. The Synclavier would become the pioneering prototype hardware and software system for all digital non-linear synthesis, polyphonic sampling, magnetic (hard-disk) recording and sequencing systems technology that is commonplace in all music and sound effects/design today. The instrument's development picked up speed in late 1978/early 1979, when master synthesist, sound designer, and musical arranger, Denny Jaeger, began working with NED to help create system upgrades, advanced capabilities, and unique sounds that were tailored to fit the needs of the product for the commercial music industry. The second generation's user interface panel and overall music design features of the original Synclavier (that would become Synclavier II) were substantially driven and designed by Denny Jaeger. His relentless attention to detail and unparalleled understanding of synthesis, audio recording, and technology provided tremendous product/market insight to the original founding hardware and software engineering team of Alonso and Jones. In November 1979, immediately following the arrival of Denny Jaeger, Alonso hired Brad Naples as the company's Business Manager. Working in tandem, Jaeger and Naples were the main drivers of the marketing and sales/business development efforts of the company. However, all four individuals—Alonso, Jones, Jaeger, and Naples—worked as a collaborative team, which was quite unique and unparalleled at the time. NED unveiled the newly improved Synclavier II at the AES show in May 1980, where it became an instant hit. In 1981 New England Digital pioneered the recording of digital audio to hard disk with the introduction of their Sample-To-Disk option. Their software module known as SFM (Signal File Manager) was popular among the academic world for research and analysis of audio. The SFM also found use in the US Military for the analysis of submarine sounds. The company continued to refine the Synclavier II, with Jaeger leading more musician-friendly, technological improvements, and Naples evolving to become the company's President/CEO (1983–1993) to assist Alonso and Jones, who were substantially expanding the hardware and software team. Musicians such as New York City-based multi-instrumentalist Kashif were involved in the creative development of Synclavier. It became one of the most advanced electronic synthesis and recording tools of the day. Early adopters included: John McLaughlin Pat Metheny Michael Jackson, particularly on his 1982 album Thriller. Denny Jaeger and Michel Rubini, the first to use the Synclavier to score a major motion picture (The Hunger, with David Bowie, released through MGM in April, 1983) and to score the first network TV series (The Powers of Matthew Starr, from Paramount Television, released September, 1982). 
Laurie Anderson, whose 1984 album "Mister Heartbreak" includes visual depictions of Synclavier sound waves in the liner notes Frank Zappa, who composed his 1986 Grammy-winning album Jazz from Hell on the instrument. He continued to use it on his studio albums until his death in 1993, culminating in the posthumous release of his magnum opus Civilization, Phaze III (by Zappa's estimation, 70% of this two-hour work is exclusively Synclavier.) Producer Mike Thorne, who used the Synclavier to shape the sound of the 80s producing bands such as Siouxsie and The Banshees, Soft Cell, Marc Almond, and Bronski Beat Record label founder Daniel Miller (Mute Records). It found use on most Depeche Mode albums in which band member Alan Wilder was involved. Sting Genesis The Cars Herbie Hancock Sean Callery Eddie Jobson The system was nearly as famous for where it was not used, as it was for the list of premier studios in which it was: the extremely sophisticated synthesizer enjoyed the distinction of being banned from many famous concert halls, out of fear that it would make the musicians themselves obsolete. A notable exception being the massive, 55 minute Dialogue for Synclavier and Orchestra by American Composer Frank Proto, commissioned and performed by the Cincinnati Symphony Orchestra in 1986. Bringing together the full forces of a contemporary orchestra with a fully decked out Synclavier in a live performance, it displayed what can be achieved, combining both seemingly incompatible disciplines, by a composer with intimate knowledge of not only the available orchestral and electronic forces, but with the compositional skills to take advantage of both, without resorting to gimmicky devices frequently found in attempts to wed the two. The mature Synclavier was a modular, component-based system that included facilities for FM-based synthesis, digital sampling, hard-disk recording, and sophisticated computer-based sound editing. By the late 1980s, complete Synclavier systems were selling for upwards of $200,000, to famous musicians such as Sting, Michael Jackson and Stevie Wonder, and to major studios the world over. The Synclavier was also employed by experimental musicians, such as John McLaughlin, Kraftwerk, Laurie Anderson, Frank Zappa, Kashif and Peter Buffett who used it extensively in their music. It is still used to this day in major movies for sound design, along with TV, Commercials and Music composition and production. The Synclavier became a victim of the early 1990s economic downturn, the high prices (due in part to high specs, failure to diversify, and high executive salaries) and the rapidly increasing capabilities of personal computers, MIDI-enabled synthesizers and low-cost digital samplers. In the span of two years, the company saw enormous sales evaporate, and in 1992 they closed their doors forever. Parts of the company were purchased by Fostex, which used the technical knowledge base of staff to build several hard-disk recording systems in the 1990s (like Fostex Foundation 2000 and 2000re), and AirWorks Media, a Canadian company who used portions of code in their TuneBuilder product line. Simultaneously, a group of ex-employees and product owners collaborated to form The Synclavier Company, primarily as a maintenance organization for existing customers, but with an eye to adapting Synclavier software for stand-alone personal computer use, while in Europe the previously profitable, but now motherless, NED Europe is run by ex-head of European operations, Steve Hills. 
, it was still trading in London, England, as Synclavier Europe. In 1998, under the company Demas, NED co-founder Cameron W. Jones (original and current owner of the Synclavier trademark and software) collaborated with ex-employee Brian S. George (owner of Demas, the company that purchased all of NED's hardware and technical assets) and original co-founding partner Sydney Alonso to develop an emulator designed to run Synclavier software for Apple Computer's Macintosh computer systems and hardware designed to share the core processing with the later generation of Apple G3 computers giving enhanced features and greater speed to the system. References External links Dialogue for Synclavier and Orchestra Synclavier Web Site Rauner Library at Dartmouth College Synthesizer manufacturing companies of the United States Companies based in Vermont White River Junction, Vermont Manufacturing companies established in 1976 de:New England Digital
936113
https://en.wikipedia.org/wiki/JAWS%20%28screen%20reader%29
JAWS (screen reader)
JAWS ("Job Access With Speech") is a computer screen reader program for Microsoft Windows that allows blind and visually impaired users to read the screen either with a text-to-speech output or by a refreshable Braille display. JAWS is produced by the Blind and Low Vision Group of Freedom Scientific. A May–June 2021 screen reader user survey by WebAIM, a web accessibility company, found JAWS to be the most popular screen reader worldwide; 53.7% of survey participants used it as a primary screen reader, while 70.0% of participants used it often. JAWS supports Windows 10 and Windows 11 along with all versions of Windows Server released since Windows Server 2008. There are two versions of the program: the Home edition for non-commercial use and the Professional edition for commercial environments. Before JAWS 16, the Home edition was called Standard, and only worked on home Windows operating systems. A DOS version, sometimes also known as JDOS, is free. The JAWS Scripting Language allows the user to use programs without standard Windows controls, and programs that were not designed for accessibility. History JAWS was originally released in 1989 by Ted Henter, a former motorcycle racer who lost his sight in a 1978 automobile accident. In 1985, Henter, along with a investment from Bill Joyce, founded the Henter-Joyce Corporation in St. Petersburg, Florida. Joyce sold his interest in the company back to Henter in 1990. In April 2000, Henter-Joyce, Blazie Engineering, and Arkenstone, Inc. merged to form Freedom Scientific. JAWS was originally created for the MS-DOS operating system. It was one of several screen readers giving blind users access to text-mode MS-DOS applications. A feature unique to JAWS at the time was its use of cascading menus, in the style of the popular Lotus 1-2-3 application. What set JAWS apart from other screen readers of the era was its use of macros that allowed users to customize the user interface and work better with various applications. Ted Henter and Rex Skipper wrote the original JAWS code in the mid-1980s, releasing version 2.0 in mid-1990. Skipper left the company after the release of version 2.0, and following his departure, Charles Oppermann was hired to maintain and improve the product. Oppermann and Henter regularly added minor and major features and frequently released new versions. Freedom Scientific now offers JAWS for MS-DOS as a freeware download from their web site. In 1993, Henter-Joyce released a highly modified version of JAWS for people with learning disabilities. This product, called WordScholar, is no longer available. JAWS for Windows In 1992, as Microsoft Windows became more popular, Oppermann began work on a new version of JAWS. A principal design goal was not to interfere with the natural user interface of Windows and to continue to provide a strong macro facility. Test and beta versions of JAWS for Windows (JFW) were shown at conferences throughout 1993 and 1994. During this time, developer Glen Gordon started working on the code, ultimately taking over its development when Oppermann was hired by Microsoft in November 1994. Shortly afterwards, in January 1995, JAWS for Windows 1.0 was released. A new revision of JAWS for Windows is released about once a year, with minor updates in between. Features JAWS allows all major functions of the Microsoft Windows operating system to be controlled with keyboard shortcuts and spoken feedback. 
These shortcuts are kept as consistent as possible throughout most programs, but the very high number of functions needed to fluidly use modern computer software effectively requires the end user to memorize many specific keystrokes. Virtually every aspect of JAWS can be customized by the user, including all keystrokes and factors such as reading speed, granularity used when reading punctuation, and hints. JAWS also includes a scripting language to automate tasks and make more complex modifications to the program's behavior. The software includes a distinct mode designed specifically for web browsers, activated when a browser is in the foreground. When browsing web pages, JAWS first declares the title and number of links. Speech can be stopped with the control key, lines are navigated with the up/down arrow keys, and the tab key moves between links and controls. Specific letter keys on the keyboard can be pressed to navigate to the next or previous element of a specific type, such as text boxes or check boxes. JAWS can access headings in Word and PDF documents in a similar fashion. The JAWS feature set and its configurability have been described as "complex," with training recommended for users such as web designers performing accessibility testing, to avoid drawing the wrong conclusions from such testing. References External links 1995 software Screen readers Windows-only software Proprietary software
21045346
https://en.wikipedia.org/wiki/DIYbio%20%28organization%29
DIYbio (organization)
DIYbio is an informal umbrella organization for individuals and local groups active in do-it-yourself biology, encompassing both a website and an email list. It serves as a network of individuals from around the globe that aims to help make biology a worthwhile pursuit for citizen scientists, biohackers, amateur biologists, and do-it-yourself biological engineers who value openness and safety. It was founded by Jason Bobe and Mackenzie Cowell in 2008. The website provides resources for those in the do-it-yourself biology community. It maintains a directory of local groups encompassing both meetup groups and organizations maintaining community laboratory space, and a weekly blog listing events hosted by these organizations. The website also hosts safety information including ethics codes developed by the community and an "ask a biosafety professional" feature, as well as DIY instructions for making several types of laboratory equipment. Community The blending of biology expertise gained from experimentation, and software development, quality control, awareness of open source principles, and security expertise transferred from the professional work of many DIYbio enthusiasts, has led to a unique subculture among this community, with some members referring to themselves as biopunks in reference to the cypherpunks of the turn of the century. The work 'A Biopunk Manifesto' delivered by Patterson at the UCLA conference lays down the principles of the biopunk movement, in an homage to the prior work of cypherpunk Eric Hughes. That a significant proportion of the DIYbio mailing list membership are openly in support of outsourcing DNA synthesis and sequencing makes it difficult to determine whether this definition truly applies; in general, the two hobbies are impossible to distinguish and share a common community. Both are forms of citizen science. As DIYbio has grown, tools and materials have become available including instructions on how to build lab equipment and DIYbio stores like The ODIN that provide inexpensive materials. Some participants call themselves ‘biohackers’, not hackers in the sense of infiltrating protected places and stealing information, but hackers in the original sense of taking things apart and putting them back together in a new, better way. These biohackers often pursue these interests outside of their jobs, companies or institutional labs. Activities Beginning in 2009 the FBI engaged active members of the DIYbio Google Groups mailing list much like they engage scientific boards at universities and businesses. The dialogue focused on safety issues and aimed to instill a sense of self-policing in the ad-hoc online community. Because DIYbio and biohacking takes place on an international level, the FBI is limited in its ability to monitor and investigate all activity. However, in 2012 the FBI held a DIYbio conference in Walnut Creek, California where they paid to fly in biohackers from all over the world in an attempt to forge a connection to the DIYbio community. DIYbio was featured at a table in Newcastle Maker Faire in March, 2010, with DNA extraction experiments and projects involving isolation of luminescent bacteria being demonstrated or given away. The "Dremelfuge", an open-source, 3D printed Dremel-powered centrifuge, was presented as an example of how Biotech can be made more accessible. A presentation on the potential of DIYbio and synthetic biology gathered a sizeable attendance. 
Internal discussions and proposed projects for DIYbio members often include discussion of risk mitigation and public perception. An oft-discussed topic is the search for a convenient and safe "model organism" for DIYbio which would evoke less suspicion than E.coli. Suggestions include Janthinobacterium lividum, Bacillus subtilis, Acetobacteria or Gluconacetobacter spp., and baker's yeast. A list of potential biosafe organisms was drawn up by the National Center for Biotechnology Education. See also Synthetic biology Genetic engineering References External links diybio.org (main site) diybio.eu (European amateur biology network) openpcr.org Open-source DIYbio PCR thermocycler project OpenWetWare (wiki) DIYbio on the News Hour (PBS) Biopunk Biology societies Do it yourself Transhumanist organizations
25153973
https://en.wikipedia.org/wiki/Honey%20Boy%20%28singer%29
Honey Boy (singer)
Keith Williams (born c. 1955), better known as Honey Boy, is a Jamaican reggae singer best known for his recordings in the 1970s, and is regarded as one of the pioneers of lovers rock. Biography Williams was born in Saint Elizabeth Parish c. 1955. He moved to the United Kingdom in the late 1960s, living in Oxford before settling in London. He found work as a backing vocalist with Laurel Aitken before recording his debut single, "Jamaica", for Trojan in 1971; he also recorded for Junior Lincoln's Banana label, beginning with the "Homeward Bound" single. Several singles followed, credited to Honey Boy and to other pseudonyms such as Happy Junior and Boy Wonder. He contributed "Jamaica" to the 1971 live album Trojan Reggae Party, and his first album, This is Honey Boy, was released in 1973. In the mid-1970s he worked with former Studio One musician Winston Curtis, who had relocated to the UK and moved into production. He also recorded for Count Shelley. With the advent of lovers rock in the mid-1970s, Honey Boy became a major figure in the scene, having several hits on the reggae charts in 1977. In 1980 he recorded the Arise album with members of Aswad. In 2002, Honey Boy was the featured vocalist on "Always There" on UB40's album UB40 Present the Fathers of Reggae. Album discography This is Honey Boy (1973), Count Shelley Sweet Cherries Impossible Love (1974), Cactus Taste of Honey (1975), Cactus Strange Thoughts (1976), Trojan Lovers (1976), Third World Dark End of the Street (1978), Diamond Arise (1980), Diamond Roxy (19??), Cougar Love You Tonight (1995) Master Piece (2000), Cactus The Gospel and I, Pt.2 (2007), Jet Star References 1950s births Living people People from Saint Elizabeth Parish Jamaican reggae musicians Jamaican male singers
1581476
https://en.wikipedia.org/wiki/King%20Teucer
King Teucer
In Greek mythology, King Teucer (; Ancient Greek: Τεῦκρος Teûkros) was said to have been the son of the river-god Scamander and the nymph Idaea. Mythology Before the arrival of Dardanus, the land that would eventually be called Dardania (and later still the Troad) was known as Teucria and the inhabitants as Teucrians, after Teucer. According to Virgil, Teucer was originally from Crete but left the island during a great famine with a third of its inhabitants. They settled near the Scamander river, named after Teucer's father, not far from the Rhaetean promontory. However, Dionysius of Halicarnassus states that Teucer had come to the Troad from Attica where he was a chief of the Xypetȇ region. In both cases he ended up in the region which would be known as the Troad. His company was said to have been greatly annoyed by a vast number of mice during their first night in the region. Teucer had previously been directed by an oracle before leaving Crete to build a settlement in the place where he should be attacked in the night-time by an enemy sprung from the earth or "where the earth-born should attack them"; since mice had attacked them during the night he resolved to settle there. He probably founded the city of Hamaxitus and established it as his capital. Teucer is said to have had a felicitous reign as he was successful in all of his undertakings. He was said to have been the first to build a temple to Apollo Sminthius or Apollo the "destroyer of mice" since Apollo was said to have destroyed mice infesting that area during Teucer's reign. Batea (also known as Batia or Arisba), King Teucer's daughter and only child, was given in marriage to Dardanus. In Lycophron's Alexandra, Dardanus was said to wed Arisba from "Crete's royal house". Dardanus received land on Mount Ida from his father-in-law when Teucer died since he did not have a biological son. There Dardanus founded the city of Dardania. After Teucer's death, his kingdom was incorporated into that of Dardanus and the entire region came to be known as Dardania. Yet in later times, the people of Troy often referred to themselves as "Teucrians". For example, Aeneas is called the "great captain of the Teucrians". In most myths mentioning King Teucer, he is described as being a distant ancestor of the Trojans. Diodorus states that Teucer was "the first to rule as king over the land of Troy" while in the Aeneid, Anchises recalls him being the Trojans' "first forefather". This suggests that King Teucer was considered the first figure to bear the bloodline of the Trojans as his father Scamander did not have such acclamations. Family tree See also Teucer Tjeker Notes References Apollodorus, The Library with an English Translation by Sir James George Frazer, F.B.A., F.R.S. in 2 Volumes, Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1921. ISBN 0-674-99135-4. Online version at the Perseus Digital Library. Greek text available from the same website. Diodorus Siculus, The Library of History translated by Charles Henry Oldfather. Twelve volumes. Loeb Classical Library. Cambridge, Massachusetts: Harvard University Press; London: William Heinemann, Ltd. 1989. Vol. 3. Books 4.59–8. Online version at Bill Thayer's Web Site Diodorus Siculus, Bibliotheca Historica. Vol 1-2. Immanel Bekker. Ludwig Dindorf. Friedrich Vogel. in aedibus B. G. Teubneri. Leipzig. 1888-1890. Greek text available at the Perseus Digital Library. Dionysus of Halicarnassus, Roman Antiquities. 
English translation by Earnest Cary in the Loeb Classical Library, 7 volumes. Harvard University Press, 1937-1950. Online version at Bill Thayer's Web Site Dionysius of Halicarnassus, Antiquitatum Romanarum quae supersunt, Vol I-IV. . Karl Jacoby. In Aedibus B.G. Teubneri. Leipzig. 1885. Greek text available at the Perseus Digital Library. Publius Vergilius Maro, Aeneid. Theodore C. Williams. trans. Boston. Houghton Mifflin Co. 1910. Online version at the Perseus Digital Library. Publius Vergilius Maro, Bucolics, Aeneid, and Georgics. J. B. Greenough. Boston. Ginn & Co. 1900. Latin text available at the Perseus Digital Library. Strabo, The Geography of Strabo. Edition by H.L. Jones. Cambridge, Mass.: Harvard University Press; London: William Heinemann, Ltd. 1924. Online version at the Perseus Digital Library. Strabo, Geographica edited by A. Meineke. Leipzig: Teubner. 1877. Greek text available at the Perseus Digital Library. Children of Potamoi Mythological kings of Troy Kings in Greek mythology Characters in Greek mythology
10807068
https://en.wikipedia.org/wiki/Journal%20of%20Systems%20and%20Software
Journal of Systems and Software
The Journal of Systems and Software is a computer science journal in the area of software systems, established in 1979 and published by Elsevier. The journal publishes research papers, state-of-the-art surveys, and practical experience reports. It includes papers covering issues of programming methodology, software engineering, and hardware/software systems. Topics include: "software systems, prototyping issues, high-level specification techniques, procedural and functional programming techniques, data-flow concepts, multiprocessing, real-time, distributed, concurrent, and telecommunications systems, software metrics, reliability models for software, performance issues, and management concerns." Impact factor and h5-index According to the 2020 Journal Citation Reports, the Journal of Systems and Software has an impact factor of 2.829. According to Google Scholar, the journal has an h5-index of 61, which ranks no. 2 among international publication venues in software systems. Past Editors in Chief John Manley and Alan Salisbury (1979-1983) Richard E. Fairley (1984-1985) Robert L. Glass (1986-2001) David N. Card (2002-2008) Hans van Vliet (2009-2017) Paris Avgeriou & David Shepherd (2018-current) Notable articles A few of the most notable (downloaded) articles are: Software defect prediction based on enhanced metaheuristic feature selection optimization and a hybrid deep neural network A software engineering perspective on engineering machine learning systems: State of the art and challenges MeTeaM: A method for characterizing mature software metrics teams References External links Online access Publications established in 1979 Computer science journals Software engineering publications Systems engineering Elsevier academic journals English-language journals Monthly journals
24275843
https://en.wikipedia.org/wiki/Amiya%20Pujari
Amiya Pujari
Amiya Kumar Pujari (19 June 1948 – 4 March 2003) was an Indian computer scientist and information technology pioneer and leader. Early life Pujari was born into an Oriya Brahmin family in Sambalpur, India. He took an interest in science, mathematics and engineering from very early on in his life. He derived inspiration from scientists, poets, writers and freedom fighters of India. Career Pujari was a leading information technology pioneer in the states of Kerala and Odisha where he made significant contributions in e-governance and other computer science research. He dedicated most of his life to spreading computer and technology education in Indian society and government. List of positions held : Accomplishments As Director Computer Centre, University of Kerala (August 1990 - 1997) Co-ordination of computer related activities in computer applications in all departments of the university including computerization of research projects, administrative applications and office automation, Chairman of Board of Studies in Computer Science in the University of Kerala for the period 1992-Jan 1995. Chairman/member of Doctoral Committee in Computer Science in the University of Kerala, Member of the Board of studies in Electronics, University of Kerala - 1994, Member of the Board of studies in Computer Science, Bharathiar University, Coimbatore - 1993, Member of the Board of studies in Computer Applications, Institute of Management in Government, Government of Kerala, Trivandrum, Guest Faculty member in the Institute of Management in Kerala. Department of Future Studies and Academic Staff College in the University of Kerala. Organized and taught self-financed courses in the Computer Centre, Member/Chairman of Board of examiners for examinations in Computer Science, Future Studies and other engineering subjects in the University of Kerala and University of Calicut, University of Jodhpur and Utkal University, Served as PhD and MTech, theses examiner of the Jawaharlal Nehru University, New Delhi, Guided six PhD students (in Applied Science faculty and Engineering faculty) and two students for MSc (Eng) by research, Served as expert member in the inspection commission of the Kerala University for granting affiliation to new course in college, Served in expert committees for inspection of academic institutions in the country formed by the UGC and AICTE. As General Manager, OCAC (August 1986 - July 1990) Led the technical activities OCAC, in all aspects of computerization, including consultancy, hardware installation, software development, training in many government departments and public sector undertakings in Odisha, Associated with several universities, educational institutions and academic centers including: Member of Board of Studies in Electronics and Computer Applications, Sambalpur University, Member of the advisory board for Computer Science in the College of Engineering and Technology, OUAT, Odisha Member of Committee for vocational courses (Computer Technology) SCERT, Odisha, Member of sectoral panel (Physical sciences) of the State Council on Science and Technology, Government of Odisha, Guest faculty member in the Odisha University of Agricultural and Technology, Gopabandhu Academy of Administration, Bhubaneswar, Computer Counsellor in the Indira Gandhi National Open University, Bhubaneswar Centre, External guide for two MTech theses submitted at IIT, Kharagpur. 
As Head, Computer Centre and Director, Department of Computer Science, University of Kerala (March 1977 - July 1986) Responsible for the establishment and management of a new department of computer centre and undertook consultancy, system design and program development for the users belonging to the university, government departments and industries in Kerala, Taught computer related subjects in university departments, Guest faculty member in institutions including Institute of Management in Government, Institute of Engineers, LBS Centre, Computer Society of India etc., Organised a new teaching department of Computer Science and taught the PGDCA students (as Director of the Computer Science Department (1985–86)). As Technical Officer in Computer Division of ECIL (March 1972 - March 1977) Design and development of the first indigenously developed 3rd generation computer TDC - 316 and 4th generation computer MICRO 78 in the country as a member of the design team for these computers, Real time software development of a micro computer based data logger, In-charge of a dual real time computer system integration and testing for the Fast Breeder Test Reactor project of BARC, Bombay, In-house customer training on computer applications, Feasibility studies and project implementation of computer based projects for many user organisations. Other positions held Item writer as well as an expert member to select items for inclusion in the Question Bank in Computer Science, of the Association of Indian Universities, New Delhi, Roster of specialists in the EXPERTBASE project of the Technology Information forecasting and Assessment Council (TIFAC) of the Department of Science and Technology, New Delhi, Proctor of the DoE-ACC accreditation scheme of the Department of Electronics, Government of India, Member of Advisory Committee for the software training and Development Centre of ER&DC, Thiruvananthapuram, Member of the expert committee for computer training, Shramik Vidyapeeth, Thiruvananthapuram, External expert for job and promotions interviews at organisations including VSSC, IHRD, Directorate of Tech. Education, LBS Centre, University of Kerala, University of Cochin, Centre for Development Studies, Trivandrum etc. Honours Dr. A. K. Pujari Award at CIT. The Amiya K. Pujari Award is given for the best paper of the CIT conference held each year. The Conference on Information Technology is an international conference and forum for research in the area of Information Technology. Dr. A. K. Pujari Scholarship for best student in Science at DAV College Titilagarh Education Research and publications Pujari A.K.:Successful computerization - A few guidelines and an appreciation; International Seminar on Computer Application in Industry and Management, University of Patras, Greece, 1979 Pujari A.K., S.L. Mehndiratta:An evolutionary and adaptive database management system based on an integrated knowledge based data dictionary approach, 3rd joint BCS/ACM symposium, July 1984, University of Cambridge, U. K. (Presented in poster session) Pujari A.K., S.L. Mehndiratta:Architecture of an expert database management system, Proceeding of International Conference R I A O - 85, March 1985, Grenoble, France. Pujari A.K., S.L. Mehndiratta: A unified approach to database system specification and representation, Third International workshop on Software Specification and design, August 1985, U.K. Pujari A.K., S.L. 
Mehndiratta:Knowledge Engineering in the context of a large personnel database system, INFORMATICS - 85, International seminar, November 1985, Trivandrum. Pujari A.K., S.L. Mehndiratta:A unified approach to database system specification and design, Fourth International Workshop on software Specification and Design, April 1987, California, U.S.A., (Position paper selected for participation). Nayak M.R., Pujari A.K.:An expert Agricultural Planning And Decision System Proc. Of the Int. Conf. On Expert Systems for Development, March 1989, Kathmandu, Nepal. Pujari A.K., B. Murali Mohan, Dilip Kumar:A micro computer based general purpose data logger, Proceedings of the All India Seminar on Data Loggers and Signal Conditioners, February 1977, National Aeronautical Laboratory, Bangalore. Pujari A.K.:A report on the computer utilization in scientific field at the University of Kerala; Proceeding of seminar on Scientific Computer System, September 1977, Kurukshetra University Pujari A.K.:Computers and its applications, University Herald, July 1987, Kerala University, Trivandrum. Pujari A.K.:Trends in computer applications, CSI, Trivandrum Chapter Convention, April 1979, Trivandrum. Pujari A.K.:Directions of Research in computer Technology, (Radio talk broadcast on 20.11.82 from AIR, Trivandrum); Published in the CSI - Newsletter, 1983, Trivandrum Chapter. Pujari A.K.:Data Semantics and conceptual schema in Database Management System, Proc. Of the annual convention of the Computer Society of India, CSI-84, March, 1984, Hyderabad. Pujari A.K.:Prospects of growth in Computer software Industry, National Seminar on Scope for Industrial Growth in Odisha, November 1986, Bhubaneswar. Pujari A.K.:The KDDEN Expert System Shell, Proc. Of the seminar on AI and Expert Systems, conducted by M/s M. N. Dastur & Co. Ltd., 15–18 November 1989, Calcutta. Pujari A.K.:Computer and Education - a stimulus paper and call for participation in COMED 91, CSI Communication, February 1991, Bombay. Pujari A.K.:Computer and Education - an Overview, National Seminar on Computer and Education (COMED 91), Computer Society of India, May 1991, Trivandrum. Pujari A.K.:An introduction to Multimedia Technology, National Seminar on Multimedia '93. April 1993, Trivandrum. Pujari A.K., MIS Education - IT Perspective, Fifth Annual Management Education Convention, August 1993, Trivandrum. Pujari A.K., Ajith Kumar N.K. - Object Oriented LAN Design, National Conf. In software Engg. SOFTEN - December 93, IEEE Kerala Section, Trivandrum. N. K. Ajith Kumar, A.K. Pujari - Design Issues in communication Gateways, National Conference on Computer and Communication, May 1994, IETE Kerala Section, Trivandrum. V.N. Neelakandan, A.K. Pujari - A knowledge based Flood Control Information System, National Symposium on Geographic Information System, Jointly organized by the University of Madras and University of Waterloo, February 1995, Madras. Pujari A.K., Trends in Intelligent Information Systems, Seminar on Computer Aided Management, Society of R&D Managers, June 1995, Trivandrum. Pujari A.K., Tripathy P. K.:Oriya Design Guide, GiswaBharat@tdil (5), Department of Information Technology, Govt. of India. References External links Computer Society of India Dr. A. K. Pujari Symposium on IT and Education Scientists from Odisha University of Kerala faculty 1948 births 2003 deaths People from Sambalpur Indian computer scientists Indian Institute of Science alumni
48738733
https://en.wikipedia.org/wiki/Crouton%20%28computing%29
Crouton (computing)
Crouton (Chromium OS Universal Chroot Environment) is a set of scripts which allows Ubuntu, Debian, and Kali Linux systems to run in parallel with a Chrome OS system. Crouton works by using a chroot instead of dual-booting, allowing a user to run two desktop environments at the same time: Chrome OS and another environment of the user's choice. At Google I/O 2019, Google announced that all Chromebooks shipped from that year onward would be Linux-compatible out of the box. Usage Crouton requires the Chrome OS device to be switched to Developer Mode, which involves a full Powerwash (factory reset) of the device. Once installed, Crouton is operated through commands entered in the Crosh terminal. Although many Linux distributions can be installed this way, none are officially supported by their developers. While Crostini has become an officially supported way to run Linux applications, many users still prefer Crouton because it allows a full desktop environment to be installed. References External links Crouton on GitHub Crouton on reddit Crouton Central on Google forums Crouton Users on Google+ Communities Linux software
19287205
https://en.wikipedia.org/wiki/George%20Berkeley%20Ross
George Berkeley Ross
George Berkeley Ross (January 24, 1918 – September 1, 2006) was an early pioneer of information technology in the American petroleum industry who spearheaded the digitization of the exploration for petroleum. Biography Early Life at Humble In 1936, Ross started his career working as a mail boy for Humble Oil in Midland, Texas, the epicenter of the oil boom. Ross married Virginia Louise Elkin of the well-established Midland family in 1941. Fifty years later after serving on the team that had developed the software to computerize the geological exploration for oil, Ross ended his long and successful career as a Senior Analyst for Exploration Systems in Computer Geology at Exxon. Ross’s mathematical aptitude took him up the corporate ladder. After short stints as a gas station attendant in Odessa and the mailroom in Midland, he was tapped for the assignment of Field Lab Assistant plotting well logs in the Geologic Lease and Scouting Department, a post that permitted him to work alongside the Scouts in West Texas. World War II World War II intervened, and Ross served as a first lieutenant in the artillery corps of the Ozark Division of the 102nd Infantry in the Allied invasion of Europe. A member of the US Army Reserve since his high school days in San Antonio, Ross’s mathematical background made him a natural for field artillery. He served with distinction in the European theatre. After landing at Normandy, Ross saw action as the Ozark Division advanced across the border between the Netherlands and Germany. They crossed the Wurm and Roer Rivers and occupied the sector from Homburg to Düsseldorf. In April 1945, Ross and the Ozark Division crossed the Rhine where they encountered stiff opposition in the Wesergebirge, then pushed on to the Elbe, where they halted on orders and established an outpost from Berlin, occupying their position until V-E Day. Wildcatting in Texas and New Mexico During the post-war period, Ross remained in the US Army Reserve. Shortly after his return to Texas, Humble’s Chief Scout, Dave Fransen, urged Ross to accept the position of Junior Scout in West Texas. In his new capacity, Ross worked with wildcats drilling in the Kelly-Snyder Field and the Sprayberry Trend Area Field, an area fifty-five miles long and twenty miles wide that now contains two thousand oil wells. In 1953, the Rosses moved their growing family to Roswell, New Mexico, which had become a hive of wildcatting that kept Ross busy monitoring leases, expiration dates and drilling schedules. The Korean War intervened, and Ross was recalled to active military duty. Serving as a major in Field Artillery Operations with the 1st Cavalry Division in Korea and Japan, Ross supervised supply chain management for forces engaged on the peninsula. After he completed two years of service, Ross came home to Roswell now as a Senior Scout, but soon he and his family returned to corporate headquarters in Midland where he continued to contribute to Humble’s drilling success. The Pentagon years and the Short Range Committee At the same time, Ross maintained his military commission and served annual tours at the Pentagon under a succession of Secretaries of Defence: Charles E. Wilson; Neil H. McElroy and Thomas S. Gates. With twenty years of military service behind him, Ross retired from the Army, but he remained an active veteran attending annual reunions into this century. 
It was at the Pentagon in the late fifties where Ross first learned about the potential of computerization and information technology and became privy to the work of the Short Range Committee that led to the origin of COBOL (Common Business-Oriented Language). Writing code in COBOL and FORTRAN The need to control and manage the torrent of information and geological data flooding into the petroleum corporations inspired Ross and a small group of perceptive industry savants to pursue the adaptation of computing technology and harness it to the geological exploration for oil. In order to contribute to the design of a viable corporate information technology strategy, Ross taught himself COBOL and the IBM Mathematical Formula Translating System (FORTRAN). Very esoteric subjects in 1962, computer programming and software development would have mystified the average person, but for Ross, a naturally gifted mathematician, they were a piece of cake. He was to become a self-taught expert in many fields. In addition to his studies of computer programming, Ross studied law, geology, economics, mathematics and government. Computerizing geological exploration for petroleum In the early 1960s, the major companies set up the Permian Basin Well Data System, and Ross was selected for work on three of its key committees. In addition to developing computer codes, Ross helped design formats for reporting data relevant to geological exploration. In 1966, Humble began to organize its own internal Information Systems project, and Ross was appointed a member of the first group in Midland. By this time, Ross had been named Scouting Supervisor, but with the pace of technological change his duties increasingly focused on the analysis of records that could best be done by computerization. In 1970, Ross was promoted to his new post as a Senior Analyst in Computer Geology for the Exploration Information Systems based at the Humble corporate headquarters in Houston. In his new capacity, Ross managed the computerization of an important portion of the oil industry, geological exploration. Ross had started working in geological analysis when recordkeeping was completely manual, and he had been an integral player in the digitization of the most crucial part of the industry, exploration for new sources of petroleum. Later life In 1986 on his fiftieth anniversary with Humble-Exxon, Ross was the subject of a lengthy article in Exploration Update that highlighted his pivotal contribution to the industry. He retired later that year to enter an active retirement filled with travel, cultural pursuits and charitable activities. A lifelong enthusiast of museums, libraries, the theatre and cultural institutions, Ross launched into a whirlwind of activities to support worthy causes. The Alley Theater, the Commemorative Air Force, the Pioneers of the Permian Basin, American Indian College, Lindenwood University and Mount Holyoke College all benefited from his patronage. Ross was keenly interested in the conservation of natural resources, especially petroleum, and he was an authority on public transportation. A great advocate of rail travel, he supported many railroad organizations in Texas and in several other states. Ross firmly believed that government could and should do much more to invest in rebuilding America’s railroads and adapting new light rail systems for urban transport. In Europe, Ross admired the much more advanced public transportation systems. 
While travelling on Eurostar, a high speed train that runs through the Channel Tunnel between Paris and London, Ross experienced a vision of the future of swift, clean, efficient and ecologically sound mass transit that inspired him to carry his message to the powers that be back home in America. Many public officials benefited from Ross’s analyses of the need for a more serious approach to mass transportation in America in general and Texas in particular. In addition to his many public and official pursuits, Ross was a highly accomplished ballroom dancer who cut a dashing figure at the Petroleum Club and Lechner’s, a favourite restaurant in Houston. Ross performed elegant versions of the Waltz and the Polka, and he did a masterful and energetic version of the Texas Two-Step. External links 1. "Ross," Houston Daily Chronicle, September 10, 2006. 2. "George Berkeley Ross," Infinityplus This article incorporates text from George Berkeley Ross / Infinityplus, released under GFDL. 1918 births 2006 deaths People in information technology ExxonMobil people United States Army personnel of World War II United States Army personnel of the Korean War People from San Antonio
22183674
https://en.wikipedia.org/wiki/Melionica
Melionica
Melionica is a genus of moths of the family Noctuidae. Species Melionica albipuncta (Gaede, 1916) Melionica bertha (Schaus & Clements, 1893) Melionica citrinea Berio, 1970 Melionica fletcheri Berio, 1973 Melionica rubella Berio, 1973 Melionica mulanjensis Hacker & Legrain, 2006 Melionica pallescens Hacker & Legrain, 2006 Melionica subrubella Hacker & Legrain, 2006 References afromoths Hadeninae
337124
https://en.wikipedia.org/wiki/SIGCOMM
SIGCOMM
SIGCOMM is the Association for Computing Machinery's Special Interest Group on Data Communications, which specializes in the field of communication and computer networks. It is also the name of an annual 'flagship' conference, organized by SIGCOMM, which is considered to be the leading conference in data communications and networking in the world. The conference has an extremely low acceptance rate (roughly 10%), and many of the landmark works in networking and communications have been published through it. In recent years, a number of workshops related to networking have also been co-located with the SIGCOMM conference. These include the Workshop on Challenged Networks (CHANTS), Internet Network Management (INM), Large Scale Attack Defense (LSAD) and Mining Network Data (MineNet). SIGCOMM also produces a quarterly magazine, Computer Communication Review, with both peer-reviewed and editorial (non-peer reviewed) content, and a bi-monthly refereed journal, IEEE/ACM Transactions on Networking, co-sponsored with IEEE. SIGCOMM hands out the following awards on an annual basis: The SIGCOMM Award, for outstanding lifetime technical achievement in the fields of data and computer communications The Rising Star Award, for a young researcher under the age of 35 who has made outstanding contributions during the early part of his or her career. The Test of Time Award recognizes papers published 10 to 12 years in the past in a SIGCOMM sponsored or co-sponsored venue whose contents still represent a vibrant, useful contribution. The Best Paper Award and the Best Student Paper Award at that year's conference. The SIGCOMM Doctoral Dissertation Award recognizes excellent thesis research by doctoral candidates in the field of computer networking and data communication. The SIGCOMM Networking Systems Award recognizes the development of a networking system that has had a significant impact on the world of computer networking. References Association for Computing Machinery Special Interest Groups
30797574
https://en.wikipedia.org/wiki/History%20of%20computer%20animation
History of computer animation
The history of computer animation began as early as the 1940s and 1950s, when people began to experiment with computer graphics – most notably by John Whitney. It was only by the early 1960s when digital computers had become widely established, that new avenues for innovative computer graphics blossomed. Initially, uses were mainly for scientific, engineering and other research purposes, but artistic experimentation began to make its appearance by the mid-1960s – most notably by Dr Thomas Calvert. By the mid-1970s, many such efforts were beginning to enter into public media. Much computer graphics at this time involved 2-dimensional imagery, though increasingly as computer power improved, efforts to achieve 3-dimensional realism became the emphasis. By the late 1980s, photo-realistic 3D was beginning to appear in film movies, and by mid-1990s had developed to the point where 3D animation could be used for entire feature film production. The earliest pioneers: 1940s to mid-1960s John Whitney John Whitney, Sr (1917–1995) was an American animator, composer and inventor, widely considered to be one of the fathers of computer animation. In the 1940s and 1950s, he and his brother James created a series of experimental films made with a custom-built device based on old anti-aircraft analog computers (Kerrison Predictors) connected by servos to control the motion of lights and lit objects – the first example of motion control photography. One of Whitney's best known works from this early period was the animated title sequence from Alfred Hitchcock's 1958 film Vertigo, which he collaborated on with graphic designer Saul Bass. In 1960, Whitney established his company Motion Graphics Inc, which largely focused on producing titles for film and television, while continuing further experimental works. In 1968, his pioneering motion control model photography was used on Stanley Kubrick's movie 2001: A Space Odyssey, and also for the slit-scan photography technique used in the film's "Star Gate" finale. The first digital image One of the first programmable digital computers was SEAC (the Standards Eastern Automatic Computer), which entered service in 1950 at the National Bureau of Standards (NBS) in Maryland, USA. In 1957, computer pioneer Russell Kirsch and his team unveiled a drum scanner for SEAC, to "trace variations of intensity over the surfaces of photographs", and so doing made the first digital image by scanning a photograph. The image, picturing Kirsch's three-month-old son, consisted of just 176×176 pixels. They used the computer to extract line drawings, count objects, recognize types of characters and display digital images on an oscilloscope screen. This breakthrough can be seen as the forerunner of all subsequent computer imaging, and recognising the importance of this first digital photograph, Life magazine in 2003 credited this image as one of the "100 Photographs That Changed the World". From the late 1950s and early 1960s, mainframe digital computers were becoming commonplace within large organisations and universities, and increasingly these would be equipped with graphic plotting and graphics screen devices. Consequently, a new field of experimentation began to open up. The first computer-drawn film In 1960, a 49-second vector animation of a car traveling down a planned highway was created at the Swedish Royal Institute of Technology on the BESK computer. 
The consulting firm Nordisk ADB, which was a provider of software for the Royal Swedish Road and Water Construction Agency, realized that it had all the coordinates needed to draw the perspective seen from the driver's seat along a motorway from Stockholm towards Nacka. A 35 mm camera with an extended magazine was mounted on a specially made stand in front of a specially designed digital oscilloscope with a resolution of about 1 megapixel. The camera was automatically controlled by the computer, which sent a signal to the camera each time a new image was fed to the oscilloscope. It took an image for every twenty meters (about 22 yards) of the virtual path. The result was a fictional journey along the virtual highway at a speed of 110 km/h (70 mph). The short animation was broadcast on November 9, 1961, at primetime in the national television newscast Aktuellt. Bell Labs Bell Labs in Murray Hill, New Jersey, was a leading research contributor in computer graphics, computer animation and electronic music from its beginnings in the early 1960s. Initially, researchers were interested in what the computer could be made to do, but the results of the visual work produced by the computer during this period established people like Edward Zajac, Michael Noll and Ken Knowlton as pioneering computer artists. Edward Zajac produced one of the first computer-generated films at Bell Labs in 1963, titled A Two Gyro Gravity Gradient Attitude Control System, which demonstrated that a satellite could be stabilized to always have a side facing the Earth as it orbited. Ken Knowlton developed the Beflix (Bell Flicks) animation system in 1963, which was used to produce dozens of artistic films by artists Stan VanDerBeek, Knowlton and Lillian Schwartz. Instead of raw programming, Beflix worked using simple "graphic primitives", like draw a line, copy a region, fill an area, zoom an area, and the like. In 1965, Michael Noll created computer-generated stereographic 3D movies, including a ballet of stick figures moving on a stage. Some movies also showed four-dimensional hyper-objects projected to three dimensions. Around 1967, Noll used the 4D animation technique to produce computer-animated title sequences for the commercial film short Incredible Machine (produced by Bell Labs) and the TV special The Unexplained (produced by Walt DeFaria). Many projects in other fields were also undertaken at this time. Boeing-Wichita In the 1960s, William Fetter was a graphic designer for Boeing at Wichita, and was credited with coining the phrase "Computer Graphics" to describe what he was doing at Boeing at the time (though Fetter himself credited this to colleague Verne Hudson). Fetter's work included the 1964 development of ergonomic descriptions of the human body that are both accurate and adaptable to different environments, and this resulted in the first 3D animated "wire-frame" figures. Such human figures became one of the most iconic images of the early history of computer graphics, and often were referred to as the "Boeing Man". Fetter died in 2002. Ivan Sutherland Ivan Sutherland is considered by many to be the creator of Interactive Computer Graphics, and an internet pioneer. He worked at the Lincoln Laboratory at MIT (Massachusetts Institute of Technology) in 1962, where he developed a program called Sketchpad I, which allowed the user to interact directly with the image on the screen. This was the first Graphical User Interface, and is considered one of the most influential computer programs ever written by an individual. 
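The primitive-driven approach described for Beflix above (building pictures from commands such as "draw a line" or "fill an area" rather than from raw code) can be illustrated with a small present-day sketch. This is purely a hypothetical illustration in Python: the function names and the character-cell canvas are inventions for clarity and bear no relation to Beflix's actual command set or to the hardware of the period.

```python
# Illustrative sketch only: a toy character-cell raster driven by a few
# high-level primitives (plot, line, fill), in the spirit of command-based
# systems such as Beflix, but not modelled on any real system.

def make_canvas(width, height, background=" "):
    """Create a character-cell raster, one symbol per 'pixel'."""
    return [[background] * width for _ in range(height)]

def plot(canvas, x, y, symbol="#"):
    """Set a single cell, ignoring out-of-range coordinates."""
    if 0 <= y < len(canvas) and 0 <= x < len(canvas[0]):
        canvas[y][x] = symbol

def line(canvas, x0, y0, x1, y1, symbol="#"):
    """Draw a straight line with a simple integer (Bresenham-style) walk."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx + dy
    while True:
        plot(canvas, x0, y0, symbol)
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

def fill(canvas, x0, y0, x1, y1, symbol="."):
    """Fill a rectangular region of the canvas."""
    for y in range(y0, y1 + 1):
        for x in range(x0, x1 + 1):
            plot(canvas, x, y, symbol)

if __name__ == "__main__":
    frame = make_canvas(32, 12)
    fill(frame, 2, 2, 29, 9)        # background panel
    line(frame, 2, 2, 29, 9, "#")   # one diagonal stroke
    line(frame, 2, 9, 29, 2, "#")   # a second diagonal
    print("\n".join("".join(row) for row in frame))
```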
Mid-1960s to mid-1970s The University of Utah Utah was a major center for computer animation in this period. The computer science faculty was founded by David Evans in 1965, and many of the basic techniques of 3D computer graphics were developed here in the early 1970s with funding from ARPA (the Advanced Research Projects Agency). Research results included Gouraud, Phong, and Blinn shading, texture mapping, hidden surface algorithms, curved surface subdivision, real-time line-drawing and raster image display hardware, and early virtual reality work. In the words of Robert Rivlin in his 1986 book The Algorithmic Image: Graphic Visions of the Computer Age, "almost every influential person in the modern computer-graphics community either passed through the University of Utah or came into contact with it in some way". Evans and Sutherland In 1968, Ivan Sutherland teamed up with David Evans to found the company Evans & Sutherland—both were professors in the Computer Science Department at the University of Utah, and the company was formed to produce new hardware designed to run the systems being developed in the University. Many of these algorithms later resulted in significant hardware implementations, including the Geometry Engine, head-mounted displays, frame buffers, and flight simulators. Most of the employees were active or former students, and included Jim Clark, who started Silicon Graphics in 1981, Ed Catmull, co-founder of Pixar in 1986, and John Warnock of Adobe Systems in 1982. First computer animated character, Nikolai Konstantinov In 1968, a group of Soviet physicists and mathematicians headed by N. Konstantinov created a mathematical model for the motion of a cat. On a BESM-4 computer they devised a programme for solving the ordinary differential equations for this model. The computer printed hundreds of frames on paper using alphabet symbols, which were later filmed in sequence, thus creating the first computer animation of a character: a walking cat. Ohio State Charles Csuri, an artist at The Ohio State University (OSU), started experimenting with the application of computer graphics to art in 1963. His efforts resulted in a prominent CG research laboratory that received funding from the National Science Foundation and other government and private agencies. The work at OSU revolved around animation languages, complex modeling environments, user-centric interfaces, human and creature motion descriptions, and other areas of interest to the discipline. Cybernetic Serendipity In July 1968, the arts journal Studio International published a special issue titled Cybernetic Serendipity – The Computer and the Arts, which catalogued a comprehensive collection of items and examples of work being done in the field of computer art in organisations all over the world, and shown in exhibitions in London, UK, San Francisco, CA, and Washington, DC. This marked a milestone in the development of the medium, and was considered by many to be of widespread influence and inspiration. Apart from all the examples mentioned above, two other particularly well-known iconic images from this period are Chaos to Order by Charles Csuri (often referred to as the Hummingbird), created at Ohio State University in 1967, and Running Cola is Africa by Masao Komura and Koji Fujino, created at the Computer Technique Group, Japan, also in 1967. 
Scanimate The first machine to achieve widespread public attention in the media was Scanimate, an analog computer animation system designed and built by Lee Harrison of the Computer Image Corporation in Denver. From around 1969 onward, Scanimate systems were used to produce much of the video-based animation seen on television in commercials, show titles, and other graphics. It could create animations in real time, a great advantage over digital systems at the time. National Film Board of Canada The National Film Board of Canada, already a world center for animation art, also began experimentation with computer techniques in 1969. The best-known of these early pioneers was artist Peter Foldes, who completed Metadata in 1971. This film comprised drawings animated by gradually changing from one image to the next, a technique known as "interpolating" (also known as "inbetweening" or "morphing"), which also featured in a number of earlier art examples during the 1960s. In 1974, Foldes completed Hunger / La Faim, which was one of the first films to show solid filled (raster scanned) rendering, and was awarded the Jury Prize in the short film category at the 1974 Cannes Film Festival, as well as an Academy Award nomination. Atlas Computer Laboratory and Antics The Atlas Computer Laboratory near Oxford was for many years a major facility for computer animation in Britain. The first entertainment cartoon made there was The Flexipede, by Tony Pritchett, which was first shown publicly at the Cybernetic Serendipity exhibition in 1968. Artist Colin Emmett and animator Alan Kitching first developed solid filled colour rendering in 1972, notably for the title animation for the BBC's The Burke Special TV program. In 1973, Kitching went on to develop a software package called "Antics", which allowed users to create animation without needing any programming. The package was broadly based on conventional "cel" (celluloid) techniques, but with a wide range of tools including camera and graphics effects, interpolation ("inbetweening"/"morphing"), use of skeleton figures and grid overlays. Any number of drawings or cels could be animated at once by "choreographing" them in limitless ways using various types of "movements". At the time, only black & white plotter output was available, but Antics was able to produce full-color output by using the Technicolor Three-strip Process. Hence the name Antics was coined as an acronym for ANimated Technicolor-Image Computer System. Antics was used for many animation works, including the first complete documentary movie, Finite Elements, made for the Atlas Lab itself in 1975. From around the early 1970s, much of the emphasis in computer animation development was towards ever-increasing realism in 3D imagery, and on effects designed for use in feature movies. First digital animation in a feature film The first feature film to use digital image processing was the 1973 movie Westworld, a science-fiction film written and directed by novelist Michael Crichton, in which humanoid robots live amongst the humans. John Whitney, Jr, and Gary Demos at Information International, Inc. digitally processed motion picture photography to appear pixelized in order to portray the Gunslinger android's point of view. The cinegraphic block portraiture was accomplished using the Technicolor Three-strip Process to color-separate each frame of the source images, then scanning them to convert into rectangular blocks according to their tone values, and finally outputting the result back to film. 
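As a rough modern illustration of the block-portraiture idea just described (a sketch of the general effect only, not the actual Information International pipeline, which worked with film scanning and photochemical colour separation), the pixelization amounts to replacing each rectangular block of a frame with its average tone:

```python
# Illustrative sketch: "block portraiture" pixelization of a grayscale frame.
# Each block of the source image is replaced by its average tone, which is the
# essence of the effect seen in Westworld's android point-of-view shots.

def pixelize(frame, block):
    """frame: 2-D list of grayscale values (0-255); block: block size in pixels."""
    height, width = len(frame), len(frame[0])
    out = [[0] * width for _ in range(height)]
    for by in range(0, height, block):
        for bx in range(0, width, block):
            # Average the tones inside this block...
            ys = range(by, min(by + block, height))
            xs = range(bx, min(bx + block, width))
            total = sum(frame[y][x] for y in ys for x in xs)
            mean = total // (len(ys) * len(xs))
            # ...and write the single averaged value back over the whole block.
            for y in ys:
                for x in xs:
                    out[y][x] = mean
    return out

if __name__ == "__main__":
    # A tiny 8x8 gradient stands in for one film frame.
    source = [[(x * 32 + y * 4) % 256 for x in range(8)] for y in range(8)]
    blocky = pixelize(source, block=4)
    for row in blocky:
        print(row)
```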
The process was covered in the American Cinematographer article "Behind the scenes of Westworld". SIGGRAPH Sam Matsa, whose background in graphics started with the APT project at MIT with Doug Ross and Andy Van Dam, petitioned the Association for Computing Machinery (ACM) to form SICGRAPH (Special Interest Committee on Computer Graphics), the forerunner of ACM SIGGRAPH, in 1967. In 1974, the first SIGGRAPH conference on computer graphics opened. This annual conference soon became the dominant venue for presenting innovations in the field. Towards 3D: mid-1970s into the 1980s Early 3D animation in the cinema The first use of 3D wireframe imagery in mainstream cinema was in the sequel to Westworld, Futureworld (1976), directed by Richard T. Heffron. This featured a computer-generated hand and face created by then University of Utah graduate students Edwin Catmull and Fred Parke, which had initially appeared in their 1972 experimental short A Computer Animated Hand. The same film also featured snippets from the 1974 experimental short Faces and Body Parts. The Oscar-winning 1975 short animated film Great, about the life of the Victorian engineer Isambard Kingdom Brunel, contains a brief sequence of a rotating wireframe model of Brunel's final project, the iron steam ship SS Great Eastern. The third movie to use this technology was Star Wars (1977), written and directed by George Lucas, with wireframe imagery in the scenes with the Death Star plans, the targeting computers in the X-wing fighters, and the Millennium Falcon spacecraft. The Walt Disney film The Black Hole (1979, directed by Gary Nelson) used wireframe rendering to depict the titular black hole, using equipment from Disney's engineers. In the same year, the science-fiction horror film Alien, directed by Ridley Scott, also used wireframe model graphics, in this case to render the navigation monitors in the spaceship. The footage was produced by Colin Emmett at the Atlas Computer Laboratory. Nelson Max Although Lawrence Livermore Labs in California is mainly known as a centre for high-level research in science, it continued producing significant advances in computer animation throughout this period. Notable here was Nelson Max, who joined the Lab in 1971, and whose 1976 film Turning a sphere inside out is regarded as one of the classic early films in the medium (International Film Bureau, Chicago, 1976). He also produced a series of "realistic-looking" molecular model animations that served to demonstrate the future role of CGI in scientific visualization ("CGI" = computer-generated imagery). His research interests focused on realism in nature images, molecular graphics, computer animation, and 3D scientific visualization. He later served as computer graphics director for the Fujitsu pavilions at Expo 85 and 90 in Japan. NYIT In 1974, Alex Schure, a wealthy New York entrepreneur, established the Computer Graphics Laboratory (CGL) at the New York Institute of Technology (NYIT). He put together the most sophisticated studio of the time, with state-of-the-art computers, film and graphic equipment, and hired top technology experts and artists to run it – Ed Catmull, Malcolm Blanchard, Fred Parke and others all from Utah, plus others from around the country including Ralph Guggenheim, Alvy Ray Smith and Ed Emshwiller. During the late 1970s, the staff made numerous innovative contributions to image rendering techniques, and produced many influential software packages, including the animation program Tween, the paint program Paint, and the animation program SoftCel. 
Several videos from NYIT became quite famous: Sunstone, by Ed Emshwiller, Inside a Quark, by Ned Greene, and The Works. The latter, written by Lance Williams, was begun in 1978, and was intended to be the first full-length CGI film, but it was never completed, though a trailer for it was shown at SIGGRAPH 1982. In these years, many people regarded the NYIT CG Lab as the top computer animation research and development group in the world. The quality of NYIT's work attracted the attention of George Lucas, who was interested in developing a CGI special effects facility at his company Lucasfilm. In 1979, he recruited the top talent from NYIT, including Catmull, Smith and Guggenheim, to start his division, which later spun off as Pixar, founded in 1986 with funding by Apple Inc. co-founder Steve Jobs. Framebuffer The framebuffer or framestore is a graphics screen configured with a memory buffer that contains data for a complete screen image. Typically, it is a rectangular array (raster) of pixels, and the number of pixels in the width and the height is its "resolution". Color values stored in the pixels can range from 1 bit (monochrome) to 24 bits (true color, 8 bits each for RGB—red, green, and blue), or 32 bits, with an extra 8 bits used as a transparency mask (alpha channel). Before the framebuffer, graphics displays were all vector-based, tracing straight lines from one co-ordinate to another. In 1948, the Manchester Baby computer used a Williams tube, where the 1-bit display was also the memory. An early (perhaps first known) example of a framebuffer was designed in 1969 by A. Michael Noll at Bell Labs. This early system had just 2 bits, giving it 4 levels of gray scale. A later design had color, using more bits. Laurie Spiegel implemented a simple paint program at Bell Labs to allow users to "paint" directly on the framebuffer. The development of MOS memory (metal–oxide–semiconductor memory) integrated-circuit chips, particularly high-density DRAM (dynamic random-access memory) chips with at least 1 kb of memory, made it practical to create a digital memory system with framebuffers capable of holding a standard definition (SD) video image. This led to the development of the SuperPaint system by Richard Shoup at Xerox PARC during 1972–1973. It used a framebuffer displaying 640×480 pixels (standard NTSC video resolution) with eight-bit depth (256 colors). The SuperPaint software contained all the essential elements of later paint packages, allowing the user to paint and modify pixels, using a palette of tools and effects, and thereby making it the first complete computer hardware and software solution for painting and editing images. Shoup also experimented with modifying the output signal using color tables, to allow the system to produce a wider variety of colors than the limited 8-bit range it contained. This scheme would later become commonplace in computer framebuffers. The SuperPaint framebuffer could also be used to capture input images from video. The first commercial framebuffer was produced in 1974 by Evans & Sutherland. It cost about $15,000, with a resolution of 512 by 512 pixels in 8-bit grayscale color, and sold well to graphics researchers without the resources to build their own framebuffer. A little later, NYIT created the first full-color 24-bit RGB framebuffer by using three of the Evans & Sutherland framebuffers linked together as one device by a minicomputer. Many of the "firsts" that happened at NYIT were based on the development of this first raster graphics system. 
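As an illustration of the data structure being described (a sketch of the concept only, not of any particular historical device), a 32-bit RGBA framebuffer can be modelled as a flat array of packed words indexed by raster position:

```python
# Illustrative sketch of a framebuffer: a flat array of packed 32-bit RGBA
# words, addressed as a width x height raster. This models the data structure
# only, not the memory hardware of any historical system.

class Framebuffer:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.pixels = [0] * (width * height)   # one 32-bit word per pixel

    def put_pixel(self, x, y, r, g, b, a=255):
        """Pack 8-bit red, green, blue and alpha into one word at (x, y)."""
        if 0 <= x < self.width and 0 <= y < self.height:
            self.pixels[y * self.width + x] = (a << 24) | (r << 16) | (g << 8) | b

    def get_pixel(self, x, y):
        """Unpack the stored word back into (r, g, b, a)."""
        word = self.pixels[y * self.width + x]
        return (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF, (word >> 24) & 0xFF

if __name__ == "__main__":
    fb = Framebuffer(640, 480)            # an SD-like raster, as in SuperPaint
    fb.put_pixel(10, 20, 255, 128, 0)     # store an orange pixel
    print(fb.get_pixel(10, 20))           # -> (255, 128, 0, 255)
```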
In 1975, the UK company Quantel, founded in 1973 by Peter Michael, produced the first commercial full-color broadcast framebuffer, the Quantel DFS 3000. It was first used in TV coverage of the 1976 Montreal Olympics to generate a picture-in-picture inset of the Olympic flaming torch while the rest of the picture featured the runner entering the stadium. Framebuffer technology provided the cornerstone for the future development of digital television products. By the late 1970s, it became possible for personal computers (such as the Apple II) to contain low-color framebuffers. However, it was not until the 1980s that a real revolution in the field was seen, and framebuffers capable of holding a standard video image were incorporated into standalone workstations. By the 1990s, framebuffers eventually became the standard for all personal computers. Fractals At this time, a major step forward toward the goal of increased realism in 3D animation came with the development of "fractals". The term was coined in 1975 by mathematician Benoit Mandelbrot, who used it to extend the theoretical concept of fractional dimensions to geometric patterns in nature, and published it in the English translation of his book, Fractals: Form, Chance and Dimension, in 1977. In 1979–80, the first film using fractals to generate the graphics was made by Loren Carpenter of Boeing. Titled Vol Libre, it showed a flight over a fractal landscape, and was presented at SIGGRAPH 1980. Carpenter was subsequently hired by Lucasfilm's computer graphics division (which later became Pixar) to create the fractal planet in the Genesis Effect sequence of Star Trek II: The Wrath of Khan in June 1982. JPL and Jim Blinn Bob Holzman of NASA's Jet Propulsion Laboratory in California established JPL's Computer Graphics Lab in 1977 as a group with technology expertise in visualizing data being returned from NASA missions. On the advice of Ivan Sutherland, Holzman hired a graduate student from Utah named Jim Blinn. Blinn had worked with imaging techniques at Utah, and developed them into a system for NASA's visualization tasks. He produced a series of widely seen "fly-by" simulations, including the Voyager, Pioneer and Galileo spacecraft fly-bys of Jupiter, Saturn and their moons. He also worked with Carl Sagan, creating animations for his Cosmos: A Personal Voyage TV series. Blinn developed many influential new modelling techniques, and wrote papers on them for the IEEE (Institute of Electrical and Electronics Engineers), in their journal Computer Graphics and Applications. Some of these included environment mapping, improved highlight modelling, "blobby" modelling, simulation of wrinkled surfaces, and simulation of clouds and dusty surfaces. Later in the 1980s, Blinn developed CG animations for an Annenberg/CPB TV series, The Mechanical Universe, which consisted of over 500 scenes for 52 half-hour programs describing physics and mathematics concepts for college students. This he followed with the production of another series devoted to mathematical concepts, called Project Mathematics!. Motion control photography Motion control photography is a technique that uses a computer to record (or specify) the exact motion of a film camera during a shot, so that the motion can be precisely duplicated again, or alternatively reproduced on another computer, and combined with the movement of other sources, such as CGI elements. 
Early forms of motion control go back to John Whitney's 1968 work on 2001: A Space Odyssey, and the effects on the 1977 movie Star Wars Episode IV: A New Hope, by George Lucas' newly created company Industrial Light & Magic in California (ILM). ILM created a digitally controlled camera known as the Dykstraflex, which performed complex and repeatable motions around stationary spaceship models, enabling separately filmed elements (spaceships, backgrounds, etc.) to be coordinated more accurately with one another. However, neither of these was actually computer-based—Dykstraflex was essentially a custom-built hard-wired collection of knobs and switches. The first commercial computer-based motion control and CGI system was developed in 1981 in the UK by Moving Picture Company designer Bill Mather. 3D computer graphics software 3D computer graphics software began appearing for home computers in the late 1970s. The earliest known example is 3D Art Graphics, a set of 3D computer graphics effects, written by Kazumasa Mitazawa and released in June 1978 for the Apple II. The 1980s The '80s saw a great expansion of radical new developments in commercial hardware, especially the incorporation of framebuffer technologies into graphic workstations, allied with continuing advances in computer power and affordability. Silicon Graphics, Inc (SGI) Silicon Graphics, Inc (SGI) was a manufacturer of high-performance computer hardware and software, founded in 1981 by Jim Clark. His idea, called the Geometry Engine, was to create a series of components in a VLSI processor that would accomplish the main operations required in image synthesis—the matrix transforms, clipping, and the scaling operations that provided the transformation to view space. Clark attempted to shop his design around to computer companies, and finding no takers, he and colleagues at Stanford University, California, started their own company, Silicon Graphics. SGI's first product (1984) was the IRIS (Integrated Raster Imaging System). It used the 8 MHz M68000 processor with up to 2 MB memory, a custom 1024×1024 frame buffer, and the Geometry Engine to give the workstation its impressive image generation power. Its initial market was 3D graphics display terminals, but SGI's products, strategies and market positions evolved significantly over time, and for many years were a favoured choice for CGI companies in film, TV, and other fields. Quantel In 1981, Quantel released the "Paintbox", the first broadcast-quality turnkey system designed for creation and composition of television video and graphics. Its design emphasized the studio workflow efficiency required for live news production. Essentially, it was a framebuffer packaged with innovative user software, and it rapidly found applications in news, weather, station promos, commercials, and the like. Although it was essentially a design tool for still images, it was also sometimes used for frame-by-frame animations. Following its initial launch, it revolutionised the production of television graphics, and some Paintboxes are still in use today due to their image quality, and versatility. This was followed in 1982 by the Quantel Mirage, or DVM8000/1 "Digital Video Manipulator", a digital real-time video effects processor. This was based on Quantel's own hardware, plus a Hewlett-Packard computer for custom program effects. It was capable of warping a live video stream by texture mapping it onto an arbitrary three-dimensional shape, around which the viewer could freely rotate or zoom in real-time. 
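The operations attributed to the Geometry Engine above (matrix transforms, clipping, and the scaling that maps results into view space) are the standard steps of a vertex pipeline. The following is a much-simplified software sketch of those steps in Python, written for illustration only; it does not reflect SGI's actual hardware design or instruction set:

```python
# Illustrative software sketch of a vertex pipeline: matrix transform,
# clip test, perspective divide, and viewport scaling. A simplified picture
# of the operations the Geometry Engine accelerated, not SGI's hardware.

def transform(matrix, vertex):
    """Multiply a 4x4 matrix by a homogeneous vertex (x, y, z, w)."""
    return tuple(sum(matrix[row][col] * vertex[col] for col in range(4))
                 for row in range(4))

def inside_clip_volume(v):
    """Keep only vertices inside the canonical view volume |x|,|y|,|z| <= w."""
    x, y, z, w = v
    return w > 0 and all(-w <= c <= w for c in (x, y, z))

def to_viewport(v, width, height):
    """Perspective divide, then scale from [-1, 1] into pixel coordinates."""
    x, y, z, w = v
    return ((x / w * 0.5 + 0.5) * width, (0.5 - y / w * 0.5) * height)

if __name__ == "__main__":
    # A simple perspective projection (90-degree field of view, near=1, far=100).
    near, far = 1.0, 100.0
    projection = [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]
    vertex = (0.5, 0.25, -2.0, 1.0)        # a point in front of the camera
    clip = transform(projection, vertex)
    if inside_clip_volume(clip):
        print(to_viewport(clip, 1024, 1024))   # -> (640.0, 448.0)
```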
It could also interpolate, or morph, between two different shapes. It was considered the first real-time 3D video effects processor, and the progenitor of subsequent DVE (Digital video effect) machines. In 1985, Quantel went on to produce "Harry", the first all-digital non-linear editing and effects compositing system. Osaka University In 1982, Japan's Osaka University developed the LINKS-1 Computer Graphics System, a supercomputer that used up to 257 Zilog Z8001 microprocessors and was used for rendering realistic 3D computer graphics. According to the Information Processing Society of Japan: "The core of 3D image rendering is calculating the luminance of each pixel making up a rendered surface from the given viewpoint, light source, and object position. The LINKS-1 system was developed to realize an image rendering methodology in which each pixel could be parallel processed independently using ray tracing. By developing a new software methodology specifically for high-speed image rendering, LINKS-1 was able to rapidly render highly realistic images." It was "used to create the world's first 3D planetarium-like video of the entire heavens that was made completely with computer graphics. The video was presented at the Fujitsu pavilion at the 1985 International Exposition in Tsukuba." The LINKS-1 was the world's most powerful computer, as of 1984. 3D Fictional Animated Films at the University of Montreal In the '80s, the University of Montreal was at the forefront of computer animation, producing three successful short 3D animated films with 3D characters. In 1983, Philippe Bergeron, Nadia Magnenat Thalmann, and Daniel Thalmann directed Dream Flight, considered the first 3D-generated film to tell a story. The film was completely programmed using the MIRA graphical language, an extension of the Pascal programming language based on Abstract Graphical Data Types. The film won several awards and was shown at the SIGGRAPH '83 Film Show. In 1985, Pierre Lachapelle, Philippe Bergeron, Pierre Robidoux and Daniel Langlois directed Tony de Peltrie, which featured the first animated human character to express emotion through facial expressions and body movements, touching the feelings of the audience. Tony de Peltrie premiered as the closing film of SIGGRAPH '85. In 1987, the Engineering Institute of Canada celebrated its 100th anniversary. A major event, sponsored by Bell Canada and Northern Telecom (now Nortel), was planned for the Place des Arts in Montreal. For this event, Nadia Magnenat Thalmann and Daniel Thalmann simulated Marilyn Monroe and Humphrey Bogart meeting in a café in the old town section of Montreal. The short movie, called Rendez-vous in Montreal, was shown at numerous festivals and on TV channels all over the world. Sun Microsystems, Inc The Sun Microsystems company was founded in 1982 by Andy Bechtolsheim with fellow graduate students at Stanford University. Bechtolsheim originally designed the SUN computer as a personal CAD workstation for the Stanford University Network (hence the acronym "SUN"). It was designed around the Motorola 68000 processor with the Unix operating system and virtual memory, and, like SGI, had an embedded frame buffer. Later developments included computer servers and workstations built on its own RISC-based processor architecture and a suite of software products such as the Solaris operating system and the Java platform. 
By the '90s, Sun workstations were popular for rendering in 3D CGI filmmaking—for example, Disney-Pixar's 1995 movie Toy Story used a render farm of 117 Sun workstations. Sun was a proponent of open systems in general and Unix in particular, and a major contributor to open source software. National Film Board of Canada The NFB's French-language animation studio founded its Centre d'animatique in 1980, at a cost of $1 million CAD, with a team of six computer graphics specialists. The unit was initially tasked with creating stereoscopic CGI sequences for the NFB's 3-D IMAX film Transitions for Expo 86. Staff at the Centre d'animatique included Daniel Langlois, who left in 1986 to form Softimage. First turnkey broadcast animation system Also in 1982, the first complete turnkey system designed specifically for creating broadcast-standard animation was produced by the Japanese company Nippon Univac Kaisha ("NUK", later merged with Burroughs), and incorporated the Antics 2-D computer animation software developed by Alan Kitching from his earlier versions. The configuration was based on the VAX 11/780 computer, linked to a Bosch 1-inch VTR, via NUK's own framebuffer. This framebuffer also showed realtime instant replays of animated vector sequences ("line test"), though finished full-color recording would take many seconds per frame. The full system was successfully sold to broadcasters and animation production companies across Japan. Later in the '80s, Kitching developed versions of Antics for SGI and Apple Mac platforms, and these achieved a wider global distribution. First solid 3D CGI in the movies The first cinema feature movie to make extensive use of solid 3D CGI was Walt Disney's Tron, directed by Steven Lisberger, in 1982. The film is celebrated as a milestone in the industry, though less than twenty minutes of this animation were actually used—mainly the scenes that show digital "terrain", or include vehicles such as Light Cycles, tanks and ships. To create the CGI scenes, Disney turned to the four leading computer graphics firms of the day: Information International Inc, Robert Abel and Associates (both in California), MAGI, and Digital Effects (both in New York). Each worked on a separate aspect of the movie, without any particular collaboration. Tron was a box office success, grossing $33 million on a budget of $17 million. In 1984, Tron was followed by The Last Starfighter, a Universal Pictures / Lorimar production, directed by Nick Castle, and was one of cinema's earliest films to use extensive CGI to depict its many starships, environments and battle scenes. This was a great step forward compared with other films of the day, such as Return of the Jedi, which still used conventional physical models. The computer graphics for the film were designed by artist Ron Cobb, and rendered by Digital Productions on a Cray X-MP supercomputer. A total of 27 minutes of finished CGI footage was produced—considered an enormous quantity at the time. The company estimated that using computer animation required only half the time, and one half to one third the cost of traditional special effects. The movie was a financial success, earning over $28 million on an estimated budget of $15 million. Inbetweening and morphing The terms inbetweening and morphing are often used interchangeably, and signify the creating of a sequence of images where one image transforms gradually into another image smoothly by small steps. 
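In computational terms, the simplest form of inbetweening is linear interpolation between corresponding points of two key frames. The short Python sketch below is purely illustrative (the shapes, frame count and function names are assumptions made for this example, not taken from any historical system); it generates the in-between vertex positions for a vector outline whose start and end shapes have matching point orders.

    def lerp(a, b, t):
        # Linear interpolation between values a and b, with 0.0 <= t <= 1.0.
        return a + (b - a) * t

    def inbetween(start_points, end_points, num_frames):
        # start_points and end_points are lists of (x, y) tuples in matching order;
        # returns num_frames outlines morphing the first shape into the second.
        # num_frames is assumed to be at least 2.
        frames = []
        for f in range(num_frames):
            t = f / (num_frames - 1)  # 0.0 on the first frame, 1.0 on the last
            frame = [(lerp(x0, x1, t), lerp(y0, y1, t))
                     for (x0, y0), (x1, y1) in zip(start_points, end_points)]
            frames.append(frame)
        return frames

    # Example: a triangle gradually flattens over five frames.
    triangle = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
    flattened = [(0.0, 0.2), (1.0, 0.2), (0.5, 0.4)]
    for outline in inbetween(triangle, flattened, 5):
        print(outline)

Photographic morphing, described below, adds a further step: an interpolated mesh of this kind is used to warp the pixels of both images, which are then cross-dissolved, so that shape and colour change together.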
Graphically, an early example would be Charles Philipon's famous 1831 caricature of French King Louis Philippe turning into a pear (metamorphosis). "Inbetweening" (AKA "tweening") is a term specifically coined for traditional animation technique, an early example being in E.G.Lutz's 1920 book Animated Cartoons. In computer animation, inbetweening was used from the beginning (e.g., John Whitney in the '50s, Charles Csuri and Masao Komura in the '60s). These pioneering examples were vector-based, comprising only outline drawings (as was also usual in conventional animation technique), and would often be described mathematically as "interpolation". Inbetweening with solid-filled colors appeared in the early '70s, (e.g., Alan Kitching's Antics at Atlas Lab, 1973, and Peter Foldes' La Faim at NFBC, 1974), but these were still entirely vector-based. The term "morphing" did not become current until the late '80s, when it specifically applied to computer inbetweening with photographic images—for example, to make one face transform smoothly into another. The technique uses grids (or "meshes") overlaid on the images, to delineate the shape of key features (eyes, nose, mouth, etc.). Morphing then inbetweens one mesh to the next, and uses the resulting mesh to distort the image and simultaneously dissolve one to another, thereby preserving a coherent internal structure throughout. Thus, several different digital techniques come together in morphing. Computer distortion of photographic images was first done by NASA, in the mid-1960s, to align Landsat and Skylab satellite images with each other. Texture mapping, which applies a photographic image to a 3D surface in another image, was first defined by Jim Blinn and Martin Newell in 1976. A 1980 paper by Ed Catmull and Alvy Ray Smith on geometric transformations, introduced a mesh-warping algorithm. The earliest full demonstration of morphing was at the 1982 SIGGRAPH conference, where Tom Brigham of NYIT presented a short film sequence in which a woman transformed, or "morphed", into a lynx. The first cinema movie to use morphing was Ron Howard's 1988 fantasy film Willow, where the main character, Willow, uses a magic wand to transform animal to animal to animal and finally, to a sorceress. 3D inbetweening With 3D CGI, the inbetweening of photo-realistic computer models can also produce results similar to morphing, though technically, it is an entirely different process (but is nevertheless often also referred to as "morphing"). An early example is Nelson Max's 1977 film Turning a sphere inside out. The first cinema feature film to use this technique was the 1986 Star Trek IV: The Voyage Home, directed by Leonard Nimoy, with visual effects by George Lucas's company Industrial Light & Magic (ILM). The movie includes a dream sequence where the crew travel back in time, and images of their faces transform into one another. To create it, ILM employed a new 3D scanning technology developed by Cyberware to digitize the cast members' heads, and used the resulting data for the computer models. Because each head model had the same number of key points, transforming one character into another was a relatively simple inbetweening. The Abyss In 1989 James Cameron's underwater action movie The Abyss was released. This was the first cinema movie to include photo-realistic CGI integrated seamlessly into live-action scenes. 
A five-minute sequence featuring an animated tentacle or "pseudopod" was created by ILM, who designed a program to produce surface waves of differing sizes and kinetic properties for the pseudopod, including reflection, refraction and a morphing sequence. Although short, this successful blend of CGI and live action is widely considered a milestone in setting the direction for further future development in the field. Walt Disney and CAPS The Great Mouse Detective (1986) was the first Disney film to extensively use computer animation, a fact that Disney used to promote the film during marketing. CGI was used during a two-minute climax scene on the Big Ben, inspired by a similar climax scene in Hayao Miyazaki's The Castle of Cagliostro (1979). The Great Mouse Detective, in turn, paved the way for the Disney Renaissance. The late 1980s saw another milestone in computer animation, this time in 2D: the development of Disney's "Computer Animation Production System", known as "CAPS/ink & paint". This was a custom collection of software, scanners and networked workstations developed by The Walt Disney Company in collaboration with Pixar. Its purpose was to computerize the ink-and-paint and post-production processes of traditionally animated films, to allow more efficient and sophisticated post-production by making the practice of hand-painting cels obsolete. The animators' drawings and background paintings are scanned into the computer, and animation drawings are inked and painted by digital artists. The drawings and backgrounds are then combined, using software that allows for camera movements, multiplane effects, and other techniques—including compositing with 3D image material. The system's first feature film use was in The Little Mermaid (1989), for the "farewell rainbow" scene near the end, but the first full-scale use was for The Rescuers Down Under (1990), which therefore became the first traditionally animated film to be entirely produced on computer—or indeed, the first 100% digital feature film of any kind ever produced. 3D animation software in the 1980s The 1980s saw the appearance of many notable new commercial software products: 1982: Autodesk Inc was founded in California by John Walker, with a focus on design software for the PC, with their flagship CAD package AutoCAD. In 1986, Autodesk's first animation package was AutoFlix, for use with AutoCAD. Their first full 3D animation software was 3D Studio for DOS in 1990, which was developed under license by Gary Yost of The Yost Group. 1983: Alias Research was founded in Toronto, Canada, by Stephen Bingham and others, with a focus on industrial and entertainment software for SGI workstations. Their first product was Alias-1 and shipped in 1985. In 1989, Alias was chosen to animate the pseudopod in James Cameron's The Abyss, which gave the software high-profile recognition in movie animation. In 1990 this developed into PowerAnimator, often known just as Alias. 1984: Wavefront was founded by Bill Kovacs and others, in California, to produce computer graphics for movies and television, and also to develop and market their own software based on SGI hardware. Wavefront developed their first product, Preview, during the first year of business. The company's production department helped tune the software by using it on commercial projects, creating opening graphics for television programs. In 1988, the company introduced the Personal Visualiser. 
1984: TDI (Thomson Digital Image) was created in France as a subsidiary of aircraft simulator company Thomson-CSF, to develop and commercialise their own 3D system, Explore, first released in 1986. 1984: Sogitec Audiovisuel was a division of Sogitec avionics in France, founded by Xavier Nicolas for the production of computer animation films, using their own 3D software developed from 1981 by Claude Mechoulam and others at Sogitec. 1986: Softimage was founded by National Film Board of Canada filmmaker Daniel Langlois in Montreal. Its first product was called the Softimage Creative Environment, and was launched at SIGGRAPH '88. For the first time, all 3D processes (modelling, animation, and rendering) were integrated. Creative Environment (eventually to be known as Softimage 3D in 1988) became a standard animation solution in the industry. 1987: Side Effects Software was established by Kim Davidson and Greg Hermanovic in Toronto, Canada, as a production/software company based on a 3D animation package called PRISMS, which they had acquired from their former employer Omnibus. Side Effects Software developed this procedural modelling and motion product into a high-end, tightly integrated 2D/3D animation package which incorporated a number of technological breakthroughs. 1989: the companies TDI and Sogitec were merged to create the new company ExMachina. CGI in the 1990s Computer animation expands in film and TV The decade saw some of the first computer-animated television series. For example, Quarxs, created by media artist Maurice Benayoun and comic book artist François Schuiten, was an early example of a CGI series based on a real screenplay and not animated solely for demonstrative purposes. The 1990s began with much of CGI technology now sufficiently developed to allow a major expansion into film and TV production. 1991 is widely considered the "breakout year", with two major box-office successes, both making heavy use of CGI. The first of these was James Cameron's movie Terminator 2: Judgment Day, the film that first brought CGI to widespread public attention. The technique was used to animate the two "Terminator" robots. The "T-1000" robot was given a "mimetic poly-alloy" (liquid metal) structure, which enabled this shapeshifting character to morph into almost anything it touched. Most of the key Terminator effects were provided by Industrial Light & Magic, and this film was the most ambitious CGI project since the 1982 film Tron. The other was Disney's Beauty and the Beast, the second traditional 2D animated film to be entirely made using CAPS. The system also allowed easier combination of hand-drawn art with 3D CGI material, notably in the "waltz sequence", where Belle and Beast dance through a computer-generated ballroom as the camera "dollies" around them in simulated 3D space. Notably, Beauty and the Beast was the first animated film ever to be nominated for a Best Picture Academy Award. Another significant step came in 1993, with Steven Spielberg's Jurassic Park, where 3D CGI dinosaurs were integrated with life-sized animatronic counterparts. The CGI animals were created by ILM, and in a test scene to make a direct comparison of both techniques, Spielberg chose the CGI. Also watching was George Lucas, who remarked "a major gap had been crossed, and things were never going to be the same." Warner Bros' 1999 The Iron Giant was the first traditionally-animated feature to have a major character, the title character, that was fully computer-generated. 
Flocking Flocking is the behavior exhibited when a group of birds (or other animals) move together in a flock. A mathematical model of flocking behavior was first simulated on a computer in 1986 by Craig Reynolds, and soon found use in animation (a minimal illustrative sketch of the model appears a few paragraphs below). Jurassic Park notably featured flocking, and brought it to widespread attention by mentioning it in the actual script. Other early uses were the flocking bats in Tim Burton's Batman Returns (1992), and the wildebeest stampede in Disney's The Lion King (1994). With improving hardware, lower costs, and an ever-increasing range of software tools, CGI techniques were soon rapidly taken up in both film and television production. In 1993, J. Michael Straczynski's Babylon 5 became the first major television series to use CGI as the primary method for its visual effects (rather than using hand-built models), followed later the same year by Rockne S. O'Bannon's SeaQuest DSV. Also the same year, the French company Studio Fantome produced the first full-length completely computer-animated TV series, Insektors (26×13'), though it had also produced an even earlier all-3D short series, Geometric Fables (50 x 5'), in 1991. A little later, in 1994, the Canadian TV CGI series ReBoot (48×23') was aired, produced by Mainframe Entertainment and Alliance Atlantis Communications, two companies that also created Beast Wars: Transformers, which was released two years after ReBoot. In 1995 came the first fully computer-animated feature film, Disney-Pixar's Toy Story, which was a huge commercial success. This film was directed by John Lasseter, a co-founder of Pixar and former Disney animator, who had started at Pixar with short movies such as Luxo Jr. (1986), Red's Dream (1987), and Tin Toy (1988), the last of which was also the first computer-generated animated short film to win an Academy Award. Then, after some long negotiations between Disney and Pixar, a partnership deal was agreed in 1991 with the aim of producing a full feature movie, and Toy Story was the result. The following years saw a greatly increased uptake of digital animation techniques, with many new studios going into production, and existing companies making a transition from traditional techniques to CGI. Between 1995 and 2005 in the US, the average effects budget for a wide-release feature film leapt from $5 million to $40 million. According to Hutch Parker, President of Production at 20th Century Fox, "50 percent of feature films have significant effects. They're a character in the movie." However, CGI films have made up for the expenditure by grossing over 20% more than their live-action counterparts, and by the early 2000s, computer-generated imagery had become the dominant form of special effects. Motion capture Motion capture, or "Mocap", records the movement of external objects or people, and has applications for medicine, sports, robotics, and the military, as well as for animation in film, TV and games. The earliest example would be in 1878, with the pioneering photographic work of Eadweard Muybridge on human and animal locomotion, which is still a source for animators today. Before computer graphics, capturing movements to use in animation would be done using rotoscoping, where the motion of an actor was filmed, then the film used as a guide for the frame-by-frame motion of a hand-drawn animated character. The first example of this was Max Fleischer's Out of the Inkwell series in 1915, and a more recent notable example is the 1978 Ralph Bakshi 2D animated movie The Lord of the Rings. 
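Returning to the flocking sketch promised above: Reynolds' 1986 model is usually summarised as three local steering rules, separation, alignment and cohesion, each computed for every animal from its nearby neighbours. The Python fragment below is a deliberately simplified illustration of those three rules (2D positions, unweighted averages, and invented parameter values), not a reproduction of Reynolds' code or of any production system.

    import random

    NEIGHBOUR_RADIUS = 5.0    # assumed interaction radius
    SEPARATION_RADIUS = 1.0   # assumed "too close" distance
    STEP = 0.1                # assumed rule weight / time step

    def neighbours(boid, flock, radius):
        bx, by = boid["pos"]
        return [o for o in flock if o is not boid and
                (o["pos"][0] - bx) ** 2 + (o["pos"][1] - by) ** 2 < radius ** 2]

    def update(boid, flock):
        px, py = boid["pos"]
        vx, vy = boid["vel"]
        near = neighbours(boid, flock, NEIGHBOUR_RADIUS)
        if near:
            # Cohesion: steer toward the average position of neighbours.
            cx = sum(o["pos"][0] for o in near) / len(near)
            cy = sum(o["pos"][1] for o in near) / len(near)
            # Alignment: steer toward the average velocity of neighbours.
            ax = sum(o["vel"][0] for o in near) / len(near)
            ay = sum(o["vel"][1] for o in near) / len(near)
            # Separation: steer away from neighbours that are too close.
            sx = sy = 0.0
            for o in neighbours(boid, flock, SEPARATION_RADIUS):
                sx += px - o["pos"][0]
                sy += py - o["pos"][1]
            vx += STEP * ((cx - px) + (ax - vx) + sx)
            vy += STEP * ((cy - py) + (ay - vy) + sy)
        boid["vel"] = (vx, vy)
        boid["pos"] = (px + vx, py + vy)

    flock = [{"pos": (random.uniform(0, 10), random.uniform(0, 10)),
              "vel": (random.uniform(-1, 1), random.uniform(-1, 1))}
             for _ in range(20)]
    for _ in range(100):      # advance the simulation 100 frames
        for b in flock:
            update(b, flock)

Production crowd and flocking systems add many further terms (obstacle avoidance, goal seeking, per-rule weights and speed limits), but recognisable grouping behaviour already emerges from these three rules.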
Computer-based motion capture started as a photogrammetric analysis tool in biomechanics research in the 1970s and 1980s. A performer wears markers near each joint to identify the motion by the positions or angles between the markers. Many different types of markers can be used—lights, reflective markers, LEDs, infra-red, inertial, mechanical, or wireless RF—and may be worn as a form of suit, or attached directly to a performer's body. Some systems include details of face and fingers to capture subtle expressions, and such systems are often referred to as "performance capture". The computer records the data from the markers, and uses it to animate digital character models in 2D or 3D computer animation, and in some cases this can include camera movement as well. In the 1990s, these techniques became widely used for visual effects. Video games also began to use motion capture to animate in-game characters. As early as 1988, a rudimentary form of motion capture was used to animate the 2D main character of the Martech video game Vixen, which was performed by model Corinne Russell. Motion capture was later notably used to animate the 3D character models in the Sega Model 2 arcade game Virtua Fighter 2 in 1994. In 1995, examples included the Atari Jaguar CD-based game Highlander: The Last of the MacLeods, and the arcade fighting game Soul Edge, which was the first video game to use passive optical motion-capture technology. Another cinema breakthrough came in 1997, when motion capture was used to create hundreds of digital characters for the film Titanic. The technique was used extensively in 1999 to create Jar-Jar Binks and other digital characters in Star Wars: Episode I – The Phantom Menace. Match moving Match moving (also known as motion tracking or camera tracking), although related to motion capture, is a completely different technique. Instead of using special cameras and sensors to record the motion of subjects, match moving works with pre-existing live-action footage, and uses computer software alone to track specific points in the scene through multiple frames, and thereby allow the insertion of CGI elements into the shot with correct position, scale, orientation, and motion relative to the existing material. The terms are used loosely to describe several different methods of extracting subject or camera motion information from a motion picture. The technique can be 2D or 3D, and can also include matching for camera movements. The earliest commercial software examples were 3D-Equalizer from Science.D.Visions and rastrack from Hammerhead Productions, both starting in the mid-'90s. The first step is identifying suitable features that the software tracking algorithm can lock onto and follow. Typically, features are chosen because they are bright or dark spots, edges or corners, or a facial feature—depending on the particular tracking algorithm being used. When a feature is tracked, it becomes a series of 2D coordinates that represent the position of the feature across the series of frames. Such tracks can be used immediately for 2D motion tracking, or then be used to calculate 3D information. In 3D tracking, a process known as "calibration" derives the motion of the camera from the inverse-projection of the 2D paths, and from this a "reconstruction" process is used to recreate the photographed subject from the tracked data, and also any camera movement. 
This then allows an identical virtual camera to be moved in a 3D animation program, so that new animated elements can be composited back into the original live-action shot in perfectly matched perspective. In the 1990s, the technology progressed to the point that it became possible to include virtual stunt doubles. Camera tracking software was refined to allow increasingly complex visual effects developments that were previously impossible. Computer-generated extras also came to be used extensively in crowd scenes with advanced flocking and crowd simulation software. Being mainly software-based, match moving has become increasingly affordable as computers become cheaper and more powerful. It has become an essential visual effects tool and is even used to provide effects in live television broadcasts. Virtual studio In television, a virtual studio, or virtual set, is a studio that allows the real-time combination of people or other real objects and computer-generated environments and objects in a seamless manner. It requires that the 3D CGI environment is automatically locked to follow any movements of the live camera and lens precisely. The essence of such a system is that it uses some form of camera tracking to create a live stream of data describing the exact camera movement, plus some realtime CGI rendering software that uses the camera tracking data and generates a synthetic image of the virtual set exactly linked to the camera motion. Both streams are then combined with a video mixer, typically using chroma key. Such virtual sets became common in TV programs in the 1990s, with the first practical system of this kind being the Synthevision virtual studio developed by the Japanese broadcasting corporation NHK (Nippon Hoso Kyokai) in 1991, and first used in their science special, Nano-space. Virtual studio techniques are also used in filmmaking, but this medium does not have the same requirement to operate entirely in realtime. Motion control or camera tracking can be used separately to generate the CGI elements later, which are then combined with the live action as a post-production process. However, by the 2000s, computer power had improved sufficiently to allow many virtual film sets to be generated in realtime, as in TV, so it was unnecessary to composite anything in post-production. Machinima Machinima uses realtime 3D computer graphics rendering engines to create a cinematic production. Most often, video game machines are used for this. The Academy of Machinima Arts & Sciences (AMAS), a non-profit organization formed in 2002 and dedicated to promoting machinima, defines machinima as "animated filmmaking within a real-time virtual 3-D environment". AMAS recognizes exemplary productions through awards given at its annual Machinima Film Festival. The practice of using graphics engines from video games arose from the animated software introductions of the '80s "demoscene", Disney Interactive Studios' 1992 video game Stunt Island, and '90s recordings of gameplay in first-person shooter video games, such as id Software's Doom and Quake. Machinima-based artists are sometimes called machinimists or machinimators. 3D animation software in the 1990s There were many developments, mergers and deals in the 3D software industry in the '90s and later. Wavefront followed the success of Personal Visualiser with the release of Dynamation in 1992, a powerful tool for interactively creating and modifying realistic, natural images of dynamic events. 
In 1993, Wavefront acquired Thomson Digital Images (TDI), with their innovative product Explore, a tool suite that included 3Design for modelling, Anim for animation, and Interactive Photorealistic Renderer (IPR) for rendering. In 1995, Wavefront was bought by Silicon Graphics, and merged with Alias. Alias Research continued the success of PowerAnimator with movies like Terminator 2: Judgment Day, Batman Returns and Jurassic Park, and in 1993 started the development of a new entertainment software, which was later to be named Maya. Alias found customers in animated film, TV series, visual effects, and video games, and included many prominent studios, such as Industrial Light & Magic, Pixar, Sony Pictures Imageworks, Walt Disney, and Warner Brothers. Other Alias products were developed for applications in architecture and engineering. In 1995, SGI purchased both Alias Research and Wavefront in a 3-way deal, and the merged company Alias Wavefront was launched. Alias Wavefront's new mission was to focus on developing the world's most advanced tools for the creation of digital content. PowerAnimator continued to be used for visual effects and movies (such as Toy Story, Casper, and Batman Forever), and also for video games. Further development of the Maya software went ahead, adding new features such as motion capture, facial animation, motion blur, and "time warp" technology. CAD industrial design products like AliasStudio and Alias Designer became standardized on Alias|Wavefront software. In 1998, Alias|Wavefront launched Maya as its new 3D flagship product, and this soon became the industry's most important animation tool. Maya was the merger of three packages—Wavefront's Advanced Visualizer, Alias's Power Animator, and TDI's Explore. In 2003 the company was renamed simply "Alias". In 2004, SGI sold the business to a private investment firm, and it was later renamed to Alias Systems Corporation. In 2006, the company was bought by Autodesk. Softimage developed further features for Creative Environment, including the Actor Module (1991) and Eddie (1992), including tools such as inverse kinematics, enveloping, metaclay, flock animation, and many others. Softimage customers include many prominent production companies, and Softimage has been used to create animation for hundreds of major feature films and games. In 1994, Microsoft acquired Softimage, and renamed the package Softimage 3D, releasing a Windows NT port two years later. In 1998, after helping to port the products to Windows and financing the development of Softimage and Softimage|DS, Microsoft sold the Softimage unit to Avid Technology, who was looking to expand its visual effect capabilities. Then, in 2008, Autodesk acquired the brand and the animation assets of Softimage from Avid, thereby ending Softimage Co. as a distinct entity. The video-related assets of Softimage, including Softimage|DS (now Avid|DS) continue to be owned by Avid. Autodesk Inc's PC DOS-based 3D Studio was eventually superseded in 1996 when The Yost Group developed 3D Studio Max for Windows NT. Priced much lower than most competitors, 3D Studio Max was quickly seen as an affordable solution for many professionals. Of all animation software, 3D Studio Max serves the widest range of users. It is used in film and broadcast, game development, corporate and industrial design, education, medical, and web design. 
In 2006, Autodesk acquired Alias, bringing the StudioTools and Maya software products under the Autodesk banner, with 3D Studio Max rebranded as Autodesk 3ds Max, and Maya as Autodesk Maya. Now one of the largest software companies in the world, Autodesk serves more than 4 million customers in over 150 countries. Side Effects Software's PRISMS was used extensively to create visual effects for broadcast and feature films into the '90s, with projects like Twister, Independence Day, and Titanic. In 1996, Side Effects Software introduced Houdini, a next-generation 3D package that proved to be more sophisticated and artist-friendly than its predecessor. Houdini is used around the world to develop cutting-edge 3D animation in the film, broadcast and gaming industries, and Side Effects Software has consistently proved itself to be an industry innovator. CGI in the 2000s 2000 breakthrough capture of the reflectance field over the human face In 2000, a team led by Paul Debevec managed to adequately capture (and simulate) the reflectance field over the human face using the simplest of light stages, which was the last missing piece of the puzzle needed to make digital look-alikes of known actors. Motion capture, photorealism, and uncanny valley The first mainstream cinema film fully made with motion capture was the 2001 Japanese-American Final Fantasy: The Spirits Within, directed by Hironobu Sakaguchi, which was also the first to use photorealistic CGI characters. The film was not a box-office success. Some commentators have suggested this may be partly because the lead CGI characters had facial features which fell into the "uncanny valley". In 2002, Peter Jackson's The Lord of the Rings: The Two Towers was the first feature film to use a real-time motion capture system, which allowed the actions of actor Andy Serkis to be fed directly into the 3D CGI model of Gollum as it was being performed. Motion capture is seen by many as replacing the skills of the animator, and lacking the animator's ability to create exaggerated movements that are impossible to perform live. The end credits of Pixar's film Ratatouille (2007) carry a stamp certifying it as "100% Pure Animation — No Motion Capture!" However, proponents point out that the technique usually includes a good deal of adjustment work by animators as well. Nevertheless, in 2010, the US Film Academy (AMPAS) announced that motion-capture films would no longer be considered eligible for "Best Animated Feature Film" Oscars, stating "Motion capture by itself is not an animation technique." Virtual cinematography The early 2000s saw the advent of fully virtual cinematography, with its audience debut generally considered to be in the 2003 films The Matrix Reloaded and The Matrix Revolutions, whose digital look-alikes were so convincing that it is often impossible to know whether an image is a human filmed with a camera or a digital look-alike shot with a simulated camera. The scenes built and imaged within virtual cinematography are the "Burly Brawl" and the end showdown between Neo and Agent Smith. With conventional cinematographic methods, the Burly Brawl would have been prohibitively time-consuming to make, requiring years of compositing for a scene lasting only a few minutes. A human actor could also not have been used for the end showdown in The Matrix Revolutions: Agent Smith's cheekbone is punched in by Neo, leaving the digital look-alike naturally unhurt. 
3D animation software in the 2000s Blender is a free, open-source virtual cinematography package, used by professionals and enthusiasts alike. Poser is another DIY 3D graphics program, aimed especially at user-friendly animation of soft objects. Pointstream Software is a professional optical flow program that uses a pixel as its basic primitive, usually tracked over a multi-camera setup; it comes from Arius3D, makers of the XYZ RGB scanner used in the production process of the Matrix sequels. CGI in the 2010s At SIGGRAPH 2013, Activision and USC presented a real-time digital face look-alike of "Ira", using the USC light stage X by Ghosh et al. for both reflectance field and motion capture. The end result, known as Digital Ira, both precomputed and rendered in real time on a state-of-the-art graphics processing unit, looks fairly realistic. Techniques previously confined to high-end virtual cinematography systems are rapidly moving into video games and leisure applications. Further developments New developments in computer animation technologies are reported each year in the United States at SIGGRAPH, the largest annual conference on computer graphics and interactive techniques, and also at Eurographics, and at other conferences around the world. References Audiovisual introductions in 1960 Computer-related introductions in 1960 History of animation History of computing New media Multimedia
6524142
https://en.wikipedia.org/wiki/Jlime
Jlime
Jornada Linux Mobility Edition or JLime is a Linux distribution originally aimed at the HP Jornada platform. It was created in late 2003 by Kristoffer Ericson and Henk Brunstin. It is developed using the OpenEmbedded build system. History and name The work on JLime began in late 2003 due to the need for a working Linux distribution on the HP 6xx Jornada platform. The idea behind JLime is a distribution that brings speed and portability to the Jornada. The Jornada had been unsupported in the 2.6 kernel (due to a lack of developers and test machines), and the first year was focused on enabling support; 2.6.9 was the first kernel version able to boot. In early February 2006, the JLime site was renovated by the JLime forum user "chazco". Later development added the NEC Mobilepro 900 and Ben NanoNote to the supported devices. JLime installer JLime developers "Chazco" and "B_Lizzard" created an initrd-based installation tool which can install JLime onto the Jornada 6xx without the need for a Linux machine; however, development of this method has halted and it has not been applied to any handheld PC other than the Jornada 6xx. Most PDA systems use flash memory, but the Jornada handheld computers lack this facility. Therefore, JLime is installed onto a (partitioned) compact flash card. The installer uses a text-based, dialog-oriented interface. Package management JLime uses ipkg, a lightweight package management tool similar in spirit to dpkg/APT, to handle packages (see package management system); it can install, remove and update packages over any existing internet connection or locally. Packages are downloaded from so-called feed repositories and dependencies are handled automatically. IceWM on Jornada JLime is a fully functioning Linux distro and currently uses the IceWM window manager as a GUI 'environment'. JLime includes the following applications with IceWM: Minimo, XChat, dillo, Rox-filer, Abiword, Leafpad, Torsmo and a few other useful applications. Developer List Here is the list of the currently active developers involved in the JLime project. Kernel maintainers Kristoffer Ericson (kristoffer) - kernel hp6xx/hp7xx Rafael Ignacio Zurita (rafa) - kernel hp6xx Michael Petchkovsky (cosmo0) - kernel mp900 Package maintainers Alex Palestras (B_lizzard) - packages hp6xx (OE) Matt Oudenhoven (wicked) - packages hp7xx/mp900 (OE) Site maintainers chazco - General maintenance - webmaster Past maintainers Jan Misiak (fijam) - Documentation maintainer & Kernel Tester (2006 - Oct 2007) Releases JLime developed ports See also Familiar Linux Jornada (PDA) HP Jornada X25 MobilePro OpenEmbedded References External links jLime Home page Review of JLime Donkey 1.0.2 by Charles Hague, HPC:Factor (22 January 2007) Pocket PC software Platform-specific Linux distributions Embedded Linux distributions Linux distributions
462914
https://en.wikipedia.org/wiki/The%20Sims%202
The Sims 2
The Sims 2 is a 2004 strategic life simulation video game developed at the Redwood City, California, studio of Maxis and published by Electronic Arts. It is the sequel to the first game in the franchise, The Sims. The game allows the player to create their own Sims, neighborhoods, houses, and families. Players manage their Sims so they can experience rewards and form relationships in a manner similar to real life. The Sims 2, like its predecessor, does not have a defined final goal as the gameplay is open-ended. Sims have life goals, wants and fears, the fulfillment of which can produce either good or bad outcomes. All Sims age, and can live for around 90 Sim days or more. The Sims 2 builds on its predecessor by allowing Sims to age through six stages of life and incorporating a 3D graphics engine that allows the player to get 360° views of the game, as opposed to the fixed 2D isometric view of The Sims. Genetics are also a new game mechanic, as previously children in The Sims did not always look like their parents. Although gameplay is not linear, storylines and scripted events exist in the game's pre-built neighborhoods. The Sims 2 was released on September 14, 2004, for Microsoft Windows. A port to macOS was released on June 17, 2005. Eight expansion packs and nine "stuff packs" were subsequently released. In addition, several console versions have been released. The Sims 2 is also offered on mobile platforms, with manufacturers such as Nokia offering it from the Ovi Store. A sequel, The Sims 3, was released in June 2009. The Sims 2 was critically acclaimed, gaining a 90% score from aggregators Metacritic and GameRankings. It was also a commercial success, selling one million copies in its first ten days, a record at the time. During April 2008, The Sims 2 website announced that 100 million copies of The Sims series had been sold. By March 2012, the game had sold 13 million copies across all platforms, with over six million PC copies, making it one of the best-selling PC games of all time. Gameplay From the neighborhood view, the player selects one lot to play, as in The Sims. There are both residential and community lots, but Sims can only live in residential lots. Sims can travel to community lots in order to purchase things like clothing and magazines, and to interact with NPCs and townies. The player can choose between playing a pre-made inhabited lot, moving a household into an unoccupied pre-built lot, or constructing a building on an empty lot. One new feature compared with The Sims is building foundations. The player switches among the "live" mode (default) to control Sims, the "buy" mode to add, move or delete furniture, and the "build" mode to rebuild the house and make structural changes. Buy and build mode cannot be accessed when on a community lot, but the lots can be built on by using the neighborhood view. It is also possible to import neighborhood terrains from SimCity 4. The game contains some time-bound social challenges that provide a reward if successful. Sims can throw parties to gain aspiration points or invite the headmaster over for dinner in order to enroll their children in private school. Some expansion packs have new mini-games, like running a Greek house in University or dating in Nightlife. In Nightlife, each date is a challenge to keep both Sims as happy as possible while accumulating aspiration points. Various other expansion packs introduce supernatural characters which Sims can be turned into, such as Zombies, Vampires, Werewolves, PlantSims, and Witches. 
Sims The main aim of the game is to lead a Sim from the start of life to death. A Sim will be born after a female Sim and a male Sim "try for a baby", which may take several attempts. The mother will spend 3 Sim days (each day lasts 24 minutes, though time can be sped up) pregnant before giving birth to a baby. During pregnancy, the belly does not expand gradually. Instead, every day, it "pops" to a bigger size. Players can name the new Sim upon birth. The baby's appearance and personality will be based on the genetics of its parents (though the baby's appearance is hidden until it becomes a toddler). Babies can also be adopted by calling the adoption service on the phone, even by single parents, elderly Sims or same-gender couples. The baby will change into a toddler in 3 days, and after 4 more days the toddler will change into a child. After 8 days, the child grows into a teenager, and will live 15 days before changing into an adult. After 29 days, the Sim will become an elder. An elder will eventually die; the length of this final stage depends on the aspiration bar when they become an elder. Babies, toddlers, children, teens, and adults can be advanced to their next life stage at any time during the 24 Sim hours before they will grow up automatically. For babies, this requires using the birthday cake. Toddlers, children, teens, and adults can use the "Grow Up" self-interaction. If the University expansion pack is installed, teens have the option to go to college, where they will be young adults for approximately 24 days. Aging can be disabled via cheats. Players will need to build up talent badges, skills and relationships with other Sims, so that they can be successful in their careers. A player will also need to make sure a Sim is happy and well by fulfilling wants (including lifetime wants), avoiding fears, and keeping motives satisfied. Pregnancy, toddlers, teens, and elders are new stages of life. Young adult is a unique age added with the University expansion. Teen Sims will become young adults once they are moved to a university, and will be adults once they leave campus, regardless of the reason. Create-a-Sim In The Sims 2, Create a Family is entered by clicking the "Families" button in the lower left-hand corner of the neighborhood view, then clicking the large "Create New Family" button. Clicking the button labeled "Create A Sim" will expand a tab which has the "Create a Sim" and "Make a Child" icons. "Make a Child" will be grayed out unless the family contains an adult male and adult female. Clicking the "Create a Sim" icon will generate a random adult Sim, who may be male or female and who can then be edited by the player. Unlike in The Sims, a Sim of any age besides baby or young adult (the latter must be made in the University Create a Student tool) may be created. Instead of having to choose from already finished faces which include hair, it is now possible to alter the facial structure (e.g. widening the nose, thinning the lips, elongating the chin, etc.) and choose any hairstyle to go with it. Different eye colors and an additional skin tone are available for the Sims as well. If Sims are older than a child, their aspiration and turn-ons/offs (Nightlife or later) may be determined. There are ten personality traits (sloppy, neat, shy, outgoing, lazy, active, serious, playful, grouchy, and nice) but only 25 personality points that can be assigned to those traits. However, in The Sims 2, all personality points must be assigned. Additionally, there are twelve pre-set personalities, one for each zodiac sign. 
A zodiac sign will be set which matches the personality the player has selected for the Sim. A Sim also has one of six Aspirations (Grow Up, Romance, Family, Knowledge, Popularity, and Fortune), a lifetime goal that strongly influences their Wants and Fears. The Sims 2 also comes with The Sims 2 Body Shop, which enables users to create custom genetics, make-up, clothes, and Sims, with the help of third-party tools, such as Microsoft Paint, Paint.NET, Adobe Photoshop, GIMP, and SimPE. Social interactions There are several new social interactions introduced in The Sims 2. These new social interactions can create memories and can be related to certain age groups. Social interactions can come up in the Wants and Fears panel and can be dependent on the Sim's personality and aspiration. Sims with certain personalities may not want to complete certain social interactions. Influence The ability to influence social interactions is introduced in the University expansion pack. A Sim is able to influence another Sim to complete a social interaction or a chore. Sims gain influence points by completing Wants and can lose influence points by completing Fears. The size of the influence bar depends on the number of friends the Sim has. It can also grow in size with business perks from the Open for Business expansion pack. Influence was also present in the Nightlife expansion but added nothing new. Chemistry The Nightlife expansion pack introduced a new feature, Turn-Ons and Turn-Offs. Teenage and older Sims are able to choose their turn-ons and turn-offs. These, along with other factors such as aspiration and personality, determine the chemistry that one Sim has with another in the form of lightning bolts. Sims can have up to 3 lightning bolts with another Sim. The higher the chemistry between two Sims, the greater the chance that social interactions will be accepted. New turn-ons and turn-offs are introduced with the Bon Voyage expansion pack. Fury Fury is introduced in the Nightlife expansion pack and occurs when one Sim gets angry at another. During this time, relationships with the Sim who is furious are harder to build. Also, the Sim who is furious may pick a fight or vandalize the home lot of the Sim they are furious with. Fury can be caused by another Sim burgling the Sim's house, by being fined after calling emergency services when there was no emergency, by fighting, by being cheated on (with the fury directed at the cheater or the Sim they cheated with, often both), and more. Reputation Reputation, which is found in the previous Sims game The Urbz: Sims in the City, is reintroduced in the Apartment Life expansion pack. A Sim gains reputation by interacting with other Sims on community lots. Sims with higher reputations are more likely to gain perks such as free objects and job promotions. Careers The careers that come with the game require skills and a certain number of friends in order for a Sim to be promoted. Success in these careers unlocks career rewards and higher salaries plus bonuses. Sims also receive chance cards. Correct answers to these chance cards create rewards for Sims, while incorrect answers could cause a Sim to lose their job. Nightlife and Apartment Life allow Sims to gain promotions through social interactions with other Sims. One specific career option is "Jazzercise instructor". Neighborhoods The Sims 2 ships with three pre-made worlds, known as neighborhoods, for the player to explore, each with a specific theme and storylines. 
These worlds are Pleasantview, a continuation of the playable neighborhood from The Sims, featuring many of the same families, such as the Goths and the Pleasants, and Strangetown, a small desert town themed around the supernatural, with aliens, mad scientists and haunted graveyards. The final neighborhood, Veronaville, is a European-themed town based on the works of William Shakespeare, with its central plot being a loose, modern retelling of Romeo and Juliet. Aside from these pre-made neighborhoods, players can create and populate towns of their own, using built-in presets, or create terrains entirely from scratch using SimCity 4, since SimCity 4 maps are compatible with The Sims 2. Expansion packs also add several new neighborhoods, such as University towns, a shopping district, a Downtown area and several vacation destinations. Seasons adds a fully-fledged neighborhood, a rural small town called Riverblossom Hills, Free Time adds a hobby-themed town named Desiderata Valley, and Apartment Life adds Belladonna Cove, a bigger, more metropolitan area featuring apartments and high-rises. Comparison to The Sims Graphically, The Sims 2 is more detailed than The Sims and lets players view its world in full 3D. This is a change from earlier Sim games, such as SimCity 2000, which used dimetric projection and fixed resolutions, as was the camera in The Sims. In The Sims, Sims are 3D meshes, but The Sims 2 introduces far more detail in mesh quality, texture quality, and animation capability. A Sim's facial features are customizable and unique, and Sims can smile, frown, and blink. The player can adjust a Sim's features in the in-game Create-a-Sim tool; for example, noses can be made to be very large or very small. Texturing is achieved through the use of raster images, and appears more lifelike than in The Sims. The Sims 2 characters pass through seven life stages: babies, toddlers, children, teenagers, young adults (only with University), adults, and elders, with eventual death from old age; babies in The Sims, by contrast, only become children before ceasing to age further. The aspiration system (described above) is also new to The Sims 2. Sims can become pregnant and produce babies that take on genetic characteristics of their parents, such as eye color, hair color, facial structure, and personality traits, as opposed to The Sims, in which the baby would take on a random appearance and personality. Genetics play a major role in the game, and as such, dominant and recessive genes play a larger role than they did in the original game (a toy illustration of such a dominant/recessive scheme appears a few paragraphs below). A player can also aspire to have a Sim abducted by aliens. Males then have the chance to become pregnant, producing a half-alien child after three Sim days. Some of the other additions to gameplay are career rewards, a week cycle, the cleaning skill (which was a hidden skill in The Sims), a variety of meals (depending on time of day), exercise clothing, body shape affected by diet and exercise, and houses built on foundations. Cutscenes were another new feature in The Sims 2. There are cutscenes featuring a Sim's first kiss, WooHoo, childbirth and alien abduction, as well as going to college and graduating in The Sims 2: University. Development Preliminary development on The Sims 2 began in late 2000 following the release of The Sims. EA Games announced on May 5, 2003, that the Maxis studio had begun development on The Sims 2. A teaser trailer was provided on The Sims: Makin' Magic CD, released October 2003, which was later uploaded to websites all over the Internet. 
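As the toy illustration of the dominant/recessive idea referred to above: the Python sketch below shows a generic Mendelian-style scheme in which a child receives one allele from each parent and expresses the most dominant one it carries. It is not The Sims 2's actual (proprietary) genetics system; the allele names, the dominance ordering and the random choices are assumptions made purely for illustration.

    import random

    # Assumed dominance ordering for the example: earlier entries dominate later ones.
    EYE_DOMINANCE = ["brown", "dark_blue", "green", "light_blue", "grey"]

    def child_genotype(mother, father):
        # Each parent passes on one of their two alleles at random.
        return (random.choice(mother), random.choice(father))

    def expressed(genotype):
        # The visible trait is the most dominant allele the child carries.
        return min(genotype, key=EYE_DOMINANCE.index)

    mother = ("brown", "light_blue")   # carries a hidden recessive allele
    father = ("green", "light_blue")

    for _ in range(5):
        g = child_genotype(mother, father)
        print(g, "->", expressed(g))

In a scheme like this a child can display a recessive trait (here, light blue eyes) only by inheriting the recessive allele from both parents, which is broadly the kind of behaviour the game's dominant/recessive inheritance of traits such as eye colour is meant to evoke.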
The game was first shown at E3 in Los Angeles, California, on May 13, 2004. Will Wright admitted that while most of the content of The Sims 2 is original, inspiration for its expansions and components came from the successes of the first game. Community interest in the earlier The Sims: Unleashed and The Sims: Hot Date expansions ensured the creation of The Sims 2: Pets and The Sims 2: Nightlife, respectively. After development concluded, designers from Maxis described The Sims 2 as having been very capricious during creation: bugs would appear, and Sims would be "tweaked", or show anomalies not present in a previous run. On December 15, 2012, Electronic Arts announced that the official website would be shut down on January 14, 2013. It is now no longer possible to download content from the official site, create exchanges, or participate in the official forum communities. On July 16, 2014, Electronic Arts announced the end of support for The Sims 2. As a response, The Sims 2: Ultimate Collection was released at the same time as a limited-time offer. The game became available for free download exclusively from Origin following the announcement by EA that it would no longer be supporting the game. This offer ended at 10:00 PDT on July 31, 2014. EA stated that it was planning a retail release of the compilation. However, no further information on a retail release date has since been released or confirmed, and the game was removed entirely from Origin at the end of 2017. On August 7, 2014, Aspyr Media released The Sims 2: Super Collection as a digital download exclusively available on the Mac App Store; the game was updated for OS X Mavericks, 4K and Retina. This compilation only includes the first six expansion packs and the first three stuff packs. Aspyr stated they were unable to include the remaining packs for the game due to licensing conflicts with EA. As with the Ultimate Collection, no new updates have emerged on whether the remaining packs will be released separately or as a single add-on to the Super Collection. Music Mark Mothersbaugh composed the build mode, buy mode, Create a Sim and neighborhood music, as well as the main theme, of The Sims 2. The game also features original "Simlish"-language songs on the radio, provided by Jerry Martin, The Humble Brothers, Kirk Casey, and others. In later expansion packs and stuff packs, well-known recording artists provided "Simlish" versions of their songs for the in-game radio stations, including Depeche Mode, Kajagoogoo, Lily Allen, Datarock, Plain White T's, and Katy Perry, among others. "Pressure" by Paramore, "Don't Cha" by The Pussycat Dolls, "Good Day" by Tally Hall, and "Like Light to the Flies" by Trivium were among the songs re-recorded by their original artists in Simlish for the console version of The Sims 2. Reception and legacy The Sims 2 received widespread critical acclaim. On Metacritic, which assigns a normalized rating out of 100 to reviews from mainstream critics, The Sims 2 has an average score of 90 based on 61 reviews, indicating "universal acclaim". The game also received the Editor's Choice Award from IGN and GameSpy upon final review of the finished product. The Sims 2 had a successful E3. From 71 online reviews, the average score was 90 out of 100. Seven of those sources awarded the game a 100-out-of-100 score. X-Play gave the game a 4/5. 
Computer Gaming World named the game its 2004 "Strategy Game of the Year (General)", beating out RollerCoaster Tycoon 3, The Political Machine, and Silent Storm. However, critics noted some serious bugs in the game. The Sims creator, Will Wright, was nominated at the Billboard Digital Entertainment Awards in the Visionary and Game Developer categories. The game was also nominated for two international awards in 2005. The Mac version of the game won an Apple Design Award in 2006. Computer Games Magazine named The Sims 2 the sixth-best computer game of 2004. The editors wrote that it is "more of a game and less of a dollhouse [than The Sims], but it remains a celebration of the beauty of the mundane." It also won the magazine's "Best Voice Acting" award. The Sims 2 was an instant commercial success, selling a then-record one million copies in its first ten days. The game sold 4.5 million units within its first year, and 7 million by October 2006. It received a "Double Platinum" sales award from the Entertainment and Leisure Software Publishers Association (ELSPA), indicating sales of at least 600,000 copies in the United Kingdom. It received a "Double Platinum" award from the Asociación Española de Distribuidores y Editores de Software de Entretenimiento (aDeSe), for more than 160,000 sales in Spain during its first 12 months. As of March 2012, The Sims 2 had sold 13 million units across all platforms, with at least six million units on PC, making it one of the best-selling PC games of all time. During April 2008, The Sims 2 website announced that 100 million copies of The Sims series had been sold. Even after subsequent Sims installments, The Sims 2 still has an active fanbase. To this day, the game has a large modding community, with new user-created content being continually uploaded to fansites such as Mod The Sims and Sim-themed blogs hosted on Tumblr (nicknamed "Simblrs"). Controversy The Sims 2's malleable content and open-ended customization have led to controversy on the subject of pay sites. Custom content is distributed through independent websites, some of which charge for downloading materials. Charging money for custom content is considered a violation of the game's EULA, which prohibits the commercial use of Electronic Arts' intellectual property. On July 22, 2005, Florida attorney Jack Thompson alleged that Electronic Arts and The Sims 2 promoted nudity through the use of a mod or a cheat code. The claim was made that pubic hair, labia and other genital details were visible once the "blur" (the pixelation that occurs when a Sim is using the toilet or is naked in the game) was removed. Electronic Arts executive Jeff Brown responded to the allegations in an interview with GameSpot. Prior to Thompson's statement, there was a code, enterable from the console menu, which allowed players to modify the size of the pixelation (including reducing it to zero). Shortly after the statement, subsequent patches and expansion packs removed the "intProp censorGridSize" code; this code had been left over from the beta testing stage of the original game and had not been intended for a public audience. Editions, compilations, and add-ons Many Sims games have been ported to macOS by Aspyr. The Sims 2 has also been ported to a number of video game consoles including the PlayStation 2, the Xbox, Nintendo DS and the Nintendo GameCube. macOS version macOS ports of the base game, the first six expansion packs, and the first three Stuff Packs have been released by Aspyr Media. 
The port for the base game was announced on October 19, 2004. The game reached beta status on March 1, 2005, and was released on June 17 of the same year. At release it was compatible with Mac OS X Panther and above on PowerPC Macintosh systems. The Sims 2 Body Shop was also available for macOS. Aspyr Media released The Sims 2 with all ported expansions and stuff packs as The Sims 2: Super Collection for Intel Macs in 2014. The game is available for purchase on the Mac App Store for OS X 10.9 Mavericks and above.

Console versions

The console versions of The Sims 2 feature local split-screen multiplayer, a story mode, and an option to control game characters directly, as opposed to queuing actions as in traditional Sims gameplay. In these versions, Sims cannot have children or age, but they can get married. The player must earn aspiration points by fulfilling goals, which unlock rewards and are also needed to complete story mode. Story mode is a sequence of levels with developed storylines in which each character asks the player to fulfill wants that pertain to their story. There is also a sandbox mode in which the player can play a preset family or build their own.

Handheld versions

The three handheld versions of the game differ considerably from one another, unlike the home console versions, which are virtually identical to each other. All three handheld versions follow a more linear storyline.

Game Boy Advance version

The Game Boy Advance version of The Sims 2 takes place in Strangetown and shares a similar GUI with its predecessors (The Sims Bustin' Out and The Urbz). Players are guided through a goal-oriented game based on the reality-television concept, in which portions of the game are divided into "episodes". Characters from the previous handheld Sims games also appear.

Nintendo DS version

The Nintendo DS version of The Sims 2 (commonly referred to as "The Sims 2 Hotel") begins with the player's car breaking down in Strangetown. Upon arriving, an anonymous donor grants the player the deed to a hotel, which can be operated and customized at the player's discretion. The player's job is to bring life back into Strangetown by encouraging people to come to the hotel, which can be done by upgrading it and keeping the guests happy. There are several ways a player can make Strangetown a nicer place, but it is up to the player to find them. Unlike most games in the Sims series, this one takes place in real time.

PlayStation Portable version

The PlayStation Portable version of the game is played in third person, much like the Nintendo DS version. The game contains elements of role-playing games and has a more structured storyline that the player must progress through in order to unlock most of the things available in the other versions. The option to build a home is replaced by a pre-built home in which the player can customize the furniture and decor. Conversations and jobs are carried out via a mini-game function. The player's character does not age, nor can they marry or have children, but they can have a significant other and "WooHoo". Relationships are mainly used for solving goals, though a close friend may move in with the player after progressing in the game. When the player completes a goal, their sanity meter, represented as a Plumbob, fills up slightly; if the player fails to complete their goals, the sanity meter rapidly depletes until the player is hospitalised or abducted by aliens.
The player can also earn "Sanity Points" by completing goals, which they can use to unlock special perks. Another feature unique to this version and the Nintendo DS version is "Secrets", which the player can find scattered around Strangetown or obtain by socialising with characters.

The game begins with the player's character driving through the Strangetown desert, presumably on the "Road to Nowhere", when a flying green diamond (also known as the Plumbob, the marker and logo of the Sims games) flies past and causes them to lose control of, and damage, their car. The player finds a gas station and takes the car into the garage, at which point the player takes control. The player is introduced to a vehicle mechanic named Oscar who, after a brief tutorial on how to talk to NPC Sims, informs the player that their car will take only a short while to fix. The player is then free to roam around the gas station, meeting more NPCs, including Bella Goth, who claims to have been abducted by aliens, completing tasks, and being taught the basic objective of the game, "Secret Hunting", by the store clerk. The player then exits the shop only to find that the garage around the back has completely disappeared along with Oscar and the car, with only the garage's foundation remaining. The only thing left behind is a cell phone, which the player answers; a man named Doctor Dominic Newlow offers the player a job, requiring him or her to get a ride into town and find a place to stay. The player informs Police Deputy Duncan about the situation, who replies that he can do nothing about it and suggests the player find a place to stay. After buying Bella's house for pocket change and fetching donuts for Deputy Duncan (which happen to have been found in the trash), the player finally gets a lift into Strangetown's Paradise Place, only to find more tasks and mysteries.

Expansion packs

The Sims 2 expansion packs provide additional game features and items. Eight expansion packs were released throughout the game's lifecycle. The Sims 2: Apartment Life is the final expansion pack for The Sims 2.

Stuff packs

Stuff packs are add-ons intended to add only new items (usually about 60) to the base game, although some releases include certain gameplay elements introduced in previous expansion packs. There are ten stuff packs in total. The Sims 2: Holiday Party Pack served as the pilot release for this line of products, which were initially called "booster packs". After the success of the pilot release, EA renamed the releases "stuff packs" and launched the line with The Sims 2: Family Fun Stuff. The Sims 2: Mansion & Garden Stuff is the final stuff pack for The Sims 2.

Core game editions

Expansion-only compilations

Compilations of expansion packs and stuff packs without the core game have also been released.

Downloadable content

Pre-order content

Most expansion packs and stuff packs were released with pre-order items. This content was redeemable at the official site using a code supplied by the retailer from which the player purchased the game; each retailer was often associated with an exclusive download. A total of 60 pre-order items were released.

The Sims 2 Store

The Sims 2 Store was an online store where players of The Sims 2 for PC could purchase and download additional content for their game for a fee.
It offered objects, clothing, skins, and hairstyles, some exclusive to the store and others drawn from earlier expansion and stuff packs. It also featured seven exclusive item collections that could only be found in the store. The store used a point system in which players purchased points and spent them on content. It operated from July 2008 to March 31, 2011, as a beta version limited to the United States and Canada. To download content, players had to install The Sims 2 Store Edition and the EA Download Manager. The exclusive collections were "Cubic", "Art Deco", "Spooky", "Castle", "Asian Fusion", "Art Nouveaulicious", and "Oh Baby", comprising a total of 471 items. Since the closure of The Sims 2 Store on March 31, 2011, The Sims 2: Store Edition and its save games cannot be used with The Sims 2: Ultimate Collection.

Third-party tools

SimPE is an open-source utility for The Sims 2 that allows editing of Sims' characteristics, relationships, and careers. It also allows the creation of objects. As the tool is intended for use by experienced modders, the SimPE interface is not considered intuitive, and users risk corrupting the game files. TS2 Enhancer, developed by Rick Halle, is a commercial utility for editing characters and neighborhoods, but it has since fallen into disuse.

Notes

References

External links

Archived official website
The Sims 2 at MobyGames
The Sims 2 on The Sims Wiki

2004 video games
Electronic Arts games
Game Boy Advance games
Interactive Achievement Award winners
Life simulation games
GameCube games
Nintendo DS games
MacOS games
PlayStation 2 games
PlayStation Portable games
EyeToy games
Social simulation video games
The Sims
Video games scored by Jerry Martin
Video games scored by Kevin Manthei
Video games scored by Silas Hite
Video games developed in the United States
Video games featuring protagonists of selectable gender
Video games with expansion packs
Windows games
Xbox games
Split-screen multiplayer games
Aspyr games
Alien abduction in video games
Video games about ghosts
Obscenity controversies in video games
Video games with alternative versions
Video games with custom soundtrack support
Video games scored by Mark Mothersbaugh